I am new to MemSQL and am trying to restore data into a MemSQL db using a .csv file, but I get the below error:
ERROR 1017 ER_FILE_NOT_FOUND: Can't find file: '\home\vagrant\filename.csv' (errno: 2)
The CSV data was exported from another server.
I have MemSQL on a virtual machine.
I have copied the table dump CSV to the \home\vagrant\ location.
I am running the below command to restore the data:
LOAD DATA INFILE '\home\vagrant\filename.csv' INTO TABLE "tableName" FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
Thanks in advance
Just to clarify: is the filename.csv file located inside the virtual machine running MemSQL, or is it located on the same machine you're running the LOAD DATA from?
If the file is on the same machine you're running LOAD DATA from, you need to add the LOCAL keyword (LOAD DATA LOCAL INFILE ...)
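If it does turn out the file is on the client machine, the LOCAL variant would look roughly like this (the forward-slash path is an assumption about where the file actually lives on a Linux box, and the backtick-quoted table name is just the usual MySQL-style quoting):
LOAD DATA LOCAL INFILE '/home/vagrant/filename.csv'
INTO TABLE `tableName`
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';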
I wanted to know if there is some way to transfer data from an ASE database to an external file using the ASE command line.
In SQLite there is the following command: ".output data.csv", where data.csv is an external file into which the dataset from the database we are currently in is written. I wanted to know if there is some command in ASE to do a similar task.
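For reference, the SQLite behaviour being described looks roughly like this in an interactive sqlite3 session (my_table is just a placeholder table name):
sqlite> .mode csv
sqlite> .output data.csv
sqlite> SELECT * FROM my_table;
sqlite> .output stdout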
My question is about the following: I need to move a database from Oracle Database 10g Express Edition to 11g. I was given the backup on a pendrive; it is a file with the .dmp extension (a dump file).
I installed 11g Express Edition on a new server, but that also installed the database that comes with it (XE).
I want to restore the 10g database to a drive other than C, which is where the Oracle 11g database is installed. I also want this new database to "replace" the XE one (I do not know if that is the correct way to say it).
I have only found settings and location changes, but only within the same drive.
Any guidance would be very useful.
Thank you.
Judging from the comments, it sounds like you have been given a Database Dump file (.dmp) from a database on a pendrive, and you need to figure out how to get that file into a database, correct?
First, let's go over some background. What is a dump file (.dmp)? From Oracle:
The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set.
At a high level, that .dmp file is a collection of DDL and DML statements that will recreate whatever data and objects were exported. .dmp files make it easier to transport and move large amounts of data between databases using Data Pump. But what is Data Pump? Again, from Oracle:
Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.
Basically, Data Pump is a set of utilities (EXPDP & IMPDP) that are used to move data between databases. The .dmp file you have was likely created using EXPDP. You will need to use IMPDP to import that .dmp file into a database.
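For context, a dump file like yours is typically produced with an EXPDP command along these lines (the schema SCOTT, the connection details, and the file names are only illustrative placeholders, not details from your case):
expdp scott/tiger@XE schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_exp.log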
Here's where it gets interesting - you say that you already have an 11g database, correct? If you want to, you should be able to import the 10g dump file directly into your 11g database without any issues. The reason is that Oracle tends to be backwards compatible, and typically speaking, anything that you do with one version of Oracle will be compatible with the version that immediately succeeds it. Jumping from something like Oracle 8i to 11g won't work, but you can always go from 8i to 9i, from 9i to 10g, and so on.
If you want to import that dump file into your 11g database, here's what you'll need to do:
Create a DBA account, or have an account that has been granted Data Pump privileges explicitly.
Move the .dmp file to the server where your 11g database lives. If you want to make it even easier for yourself, you can move the .dmp file to your database's datapump directory. If you don't know where that is, execute the following query on your database: select * from all_directories where directory_name = 'DATA_PUMP_DIR'; This query will return a directory. You don't have to use this directory, it will just make it easier.
Once you have the dump file in place and you have all of the necessary database and operating system privileges, you are ready to import the dmp file. Open a new command line window, set your Oracle home if it is not already set, and then navigate to the directory where you placed the .dmp file. Your import command will look something like this:
impdp [USERNAME]/[PASSWORD]@[DATABASE] directory=[DIRECTORY] dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Where [USERNAME]/[PASSWORD] are your credentials, [DATABASE] is the name of the database you're importing the dump file into, [DIRECTORY] is whatever directory you placed the dump file in, [FILENAME] is the name of the .dmp file, and [LOGFILE] is whatever name you chose for your log file.
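As a purely illustrative sketch of step 1 and the final command (the user SCOTT, its password, the connect identifier XE, and the file names are all placeholders; IMP_FULL_DATABASE is the role that carries full import privileges):
-- run as a privileged user such as SYSTEM
GRANT IMP_FULL_DATABASE TO scott;
Then, from the command line on the database server:
impdp scott/tiger@XE directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_imp.log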
Assuming your database has everything necessary for the .dmp file, the import should begin and you will start seeing status updates that look similar to this:
Starting [USERNAME]."SYS_IMPORT_FULL_01": [USERNAME]/******** directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Note that this is just an example; your results may look different. Assuming all goes well, you will see a message like this at the end:
Job [USERNAME]."SYS_IMPORT_FULL_01" completed
If you don't want to import it into your existing 11g database, you could always spin up a new database and import the .dmp file to that one using these same guidelines.
That should be enough to get you started down the right path, hope this helps and good luck!
P.S. Here is a great FAQ on the Data Pump utilities as well: http://www.orafaq.com/wiki/Import_Export_FAQ
I have a database called 'AdvWorks'.
I have a backup called 'BackupDB.bak'.
When I restore BackupDB.bak over AdvWorks, SQL tries to overwrite the data files of BackupDB rather than the AdvWorks data files. I have to change all of this manually, and to me this makes no sense.
Can someone please tell me why it behaves like this?
And secondly, can this default location be changed to point at the files of the database being restored over, rather than the files recorded in the backup?
Thanks.
"SQL tries to overwrite the data files of the BackupDB rather than the AdvWorks data files"
This is because the original file locations are also stored in the backup file.
To overcome this, use RESTORE ... WITH MOVE:
RESTORE DATABASE MyNwind
FROM MyNwind_1
WITH NORECOVERY,
MOVE 'MyNwind_data_1' TO 'D:\MyData\MyNwind_data_1.mdf',
MOVE 'MyNwind_data_2' TO 'D:\MyData\MyNwind_data_2.ndf';
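The logical names used in the MOVE clauses ('MyNwind_data_1' and so on) must match what is recorded in the backup. If you are not sure what they are, you can list them first; the disk path below is just an example:
RESTORE FILELISTONLY FROM DISK = 'C:\backups\BackupDB.bak';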
I am trying to read the content of my backup, e.g. testdb.bak, in a text format, e.g. in Notepad or another text editor.
This command retrieves only the backup header information:
Restore HEADERONLY From disk = 'C:\backups\testdb.bak'
Is there any way, file extension, file type or format where I can see the whole content of a backup without restoring it?
Is there any way, file extension, file type or format where I can see the whole content of a backup without restoring it?
No, there is no command in SQL Server that would let you see the content of a .bak file. If you open it in Notepad you will see some readable information and lots of garbage characters, and doing so is not supported anyway.
The best way to see its content is to restore it to a SQL Server database and run as many SELECT statements as you like on the tables.
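For example, you could restore the backup into a throwaway database and query that. The logical file names ('testdb' and 'testdb_log'), paths, and table name below are only placeholders; RESTORE FILELISTONLY against the .bak file will tell you the real logical names:
RESTORE DATABASE testdb_copy
FROM DISK = 'C:\backups\testdb.bak'
WITH MOVE 'testdb' TO 'C:\data\testdb_copy.mdf',
     MOVE 'testdb_log' TO 'C:\data\testdb_copy_log.ldf',
     RECOVERY;
SELECT TOP (10) * FROM testdb_copy.dbo.SomeTable;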