I'm trying to do a data-only export using Oracle Data Pump. I just wanted to make a test run on a single schema, but the result is not correct. I call expdp as follows:
./expdp user/password@x.x.x.x:1524/orcl1 DIRECTORY=EXPORTDIR DUMPFILE=orcl1_export.dmp LOGFILE=orcl1_export.log CONTENT=DATA_ONLY SCHEMAS=pfl_adm
Even though I provided the username and password on the command line, I still get a prompt to log in as SYSDBA. Once I log in as SYSDBA, the export runs; however, it seems that the tool is exporting another Oracle instance (we have several copies with the same schemas). The number of exported rows in the log file does not match the number of rows in the database. Do you have any ideas?
I am trying to import data from a DB dump.
I have created a user 'user' and granted it the following permissions:
CONNECT, RESOURCE, UNLIMITED TABLESPACE, DBA, ALL PRIVILEGES, IMP_FULL_DATABASE
I am running the following command:
imp <user>/<password>@<server> touser=<user> FILE=C:\App\<path>\admin\orcl\dpdump\EXPDAT.DMP full=y log=imp.log
While running it, I get the following error message:
IMP-00401: dump file "C:\App\<path>\admin\orcl\dpdump\EXPDAT.DMP" may be an Data Pump export dump file
IMP-00000: Import terminated unsuccessfully
There are two sets of import and export utilities.
One is client based, and is also deprecated. That would be Import and Export, shortened as IMP and EXP.
Data Pump is the 'new,' server-based set of utilities - much more powerful and efficient at getting data in and out of your database. The IMP-00401 message is telling you that your file was created with Data Pump, so you need the Data Pump import utility (impdp), not imp.
You'll need to place your DMP file in a database DIRECTORY - these are OS directories known to the database; you can see them in the data dictionary via
SELECT * FROM ALL_DIRECTORIES;
It's likely you already have a directory defined just for Data Pump; look for something like 'DATA_PUMP_DIR'.
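If no such directory exists, you (or a DBA) can define one. Here's a minimal sketch, assuming the CREATE ANY DIRECTORY privilege; the directory name, the OS path, and the grantee are all hypothetical:
-- Point a database directory object at an existing OS path on the server
CREATE DIRECTORY export_dir AS '/u01/app/oracle/dumps';
-- Let the importing user read and write files through it
GRANT READ, WRITE ON DIRECTORY export_dir TO import_user;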
Data Pump has a utility you can run from the OS, a PL/SQL API, and a wizard in SQL Developer.
View > DBA menu.
Add a connection (not SYS), right-click on the Data Pump category, select Import Wizard...then walk through the dialog.
We'll create and kick off the job for you; you can also watch the job's progress and check for any errors.
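If you'd rather stay on the command line, a sketch of the Data Pump equivalent of your imp call might look like this, assuming the dump file has been copied into DATA_PUMP_DIR (the credentials and connect string are placeholders, as in your original command):
impdp <user>/<password>@<server> DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXPDAT.DMP FULL=Y LOGFILE=imp.log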
My question is the following: I need to move a database from Oracle Database 10g Express Edition to 11g. I was given the backup on a pendrive; it is a file with the extension .dmp (dump file).
I installed 11g Express Edition on a new server, which also installed the database that comes with it (XE).
I want to restore the 10g database to a drive other than C:, which is where the Oracle 11g database is installed. I also want this new database to "replace" the XE one (I do not know if that is the correct way to say it).
I have only found instructions for adjustments and location changes within the same drive.
Any guidance would be very useful.
Thank you.
Judging from the comments, it sounds like you have been given a Database Dump file (.dmp) from a database on a pendrive, and you need to figure out how to get that file into a database, correct?
First, let's go over some background. What is a dump file (.dmp)? From Oracle:
The dump file set is made up of one or more disk files that contain
table data, database object metadata, and control information. The
files are written in a proprietary, binary format. During an import
operation, the Data Pump Import utility uses these files to locate
each database object in the dump file set.
At a high level, that .dmp file is a collection of DDL and DML statements that will recreate whatever data and objects were exported. .dmp files make it easier to transport and move large amounts of data between databases using Data Pump. But what is Data Pump? Again, from Oracle:
Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.
Basically, Data Pump is a set of utilities (EXPDP & IMPDP) that are used to move data between databases. The .dmp file you have was likely created using EXPDP. You will need to use IMPDP to import that .dmp file into a database.
Here's where it gets interesting - you say that you already have an 11g database, correct? If you want to, you should be able to import the 10g dump file directly into your 11g database without any issues. The reason is that Oracle tends to be backwards compatible; generally speaking, anything that you do with one version of Oracle will be compatible with the version that immediately succeeds it. Jumping from something like Oracle 8i to 11g won't work, but you can always go from 8i to 9i, from 9i to 10g, and so on.
If you want to import that dump file into your 11g database, here's what you'll need to do:
Create a DBA account, or have an account that has been granted Data Pump privileges explicitly.
Move the .dmp file to the server where your 11g database lives. If you want to make it even easier for yourself, you can move the .dmp file to your database's Data Pump directory. If you don't know where that is, execute the following query on your database:
select * from all_directories where directory_name = 'DATA_PUMP_DIR';
This query will return the directory's location on disk. You don't have to use this directory, it will just make things easier.
Once you have the dump file in place and you have all of the necessary database and operating system privileges, you are ready to import the .dmp file. Open a new command line window, set your Oracle home if it is not already set, and then navigate to the directory where you placed the .dmp file. Your import command will look something like this:
impdp [USERNAME]/[PASSWORD]@[DATABASE] directory=[DIRECTORY] dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Where [USERNAME]/[PASSWORD] are your credentials, [DATABASE] is the name of the database you're importing the dump file into, [DIRECTORY] is whatever directory you placed the dump file in, [FILENAME] is the name of the .dmp file, and [LOGFILE] is whatever name you chose for your log file.
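For instance, with purely hypothetical values - user scott, service orcl, the default Data Pump directory, and a file named EXPDAT.DMP - the call would be:
impdp scott/tiger@orcl directory=DATA_PUMP_DIR dumpfile=EXPDAT.DMP logfile=import.log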
Assuming your database has everything necessary for the .dmp file, the import should begin and you will start seeing status updates that look similar to this:
Starting [USERNAME]."SYS_IMPORT_FULL_01": [USERNAME]/******** directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Note that this is just an example; your results may look different. Assuming all goes well, you will see a message like this at the end:
Job [USERNAME]."SYS_IMPORT_FULL_01" completed
If you don't want to import it into your existing 11g database, you could always spin up a new database and import the .dmp file to that one using these same guidelines.
That should be enough to get you started down the right path, hope this helps and good luck!
P.S. Here is a great FAQ on the Data Pump utilities as well: http://www.orafaq.com/wiki/Import_Export_FAQ
How can I import a Data Pump file into a table using Oracle Data Integrator?
I have the file in a local directory.
I need to read the Data Pump file from this directory and import it into a table in Oracle.
How is this possible with Oracle Data Integrator?
To import a dumpfile you must run the "impdp" command.
There are a few ways you can make ODI do that for you:
You can create a shell script file that calls impdp with all the necessary parameters and create an ODI package (using OdiOSCommand) that simply runs the shell script; a sketch of such a script appears after this list. For this to work, your ODI agent must have access to the script and also to the database client (or the database home) so it can run impdp. (You can also use OdiOSCommand to run impdp directly.)
The same idea from step 1 can be done using an ODI Procedure (for example, if the import is just part of a bigger integration flow).
ODI also has an LKM that uses Data Pump, but it is used to export a source table into a dumpfile and import it into a target database... If you have access to the source table metadata inside ODI Studio, you can create a simple mapping between the source and target tables, choose the Data Pump LKM, and simulate the execution. ODI will generate all the code needed to import the dumpfile.
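Here is a minimal sketch of the kind of wrapper script option 1 describes; every name in it (the credentials, connect string, directory, dump file, and table) is a placeholder to replace with your own:
#!/bin/sh
# import_dump.sh - hypothetical wrapper an OdiOSCommand step could run.
# Assumes ORACLE_HOME is set on the ODI agent's host so impdp is available.
export PATH=$ORACLE_HOME/bin:$PATH
impdp system/password@orcl \
  DIRECTORY=DATA_PUMP_DIR \
  DUMPFILE=my_table.dmp \
  TABLES=my_schema.my_table \
  LOGFILE=my_table_imp.log
# Propagate impdp's exit code so ODI can flag a failed import
exit $?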
I am looking for a command-line utility to export and import an entire SQL Server database. I want to automate the process of moving data from a source database to a destination database, given the credentials.
This would be similar to the exp and imp commands for Oracle.
I have already looked at bcp and the SQL Server Import and Export Wizard.
Could someone point me to such a utility?
I haven't found one, if it exists. I typically script up a PowerShell function like this one to serve the purpose:
export with PowerShell
You can then call the script from the command prompt and even add parameters to export by table, database, etc.
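To illustrate that last point, an invocation from the command prompt might look like this; the script name and its parameters are hypothetical and depend entirely on how you write the function:
powershell -ExecutionPolicy Bypass -File Export-Database.ps1 -Server SRC01 -Database Sales -OutDir C:\exports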
I have a .tbl file with data and I'm trying to import this data into a table. I'm using SQL Developer for this, with this command:
load data infile "C:\path\users.tbl"
insert into table users fields terminated by "|" lines terminated by "\r\n"
But nothing is working: the data is not loaded and no errors are shown... Do you see why it's not working?
That looks like SQL*Loader syntax.
For that to work, you'd have to run SQL*Loader, which is a separate command-line program available in your ORACLE_HOME/bin directory.
If you don't have an ORACLE_HOME, you'll need to install the client. Then open a shell/cmd window, and run your command there.
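For reference, here's a minimal sketch of the SQL*Loader route. The control file name (users.ctl) and the column list are assumptions, since the question doesn't show the table definition:
-- users.ctl (hypothetical; adjust the column list to match your table)
LOAD DATA
INFILE 'C:\path\users.tbl'
INSERT INTO TABLE users
FIELDS TERMINATED BY '|'
(username, email)
Then, from your shell/cmd window:
sqlldr <user>/<password>@<database> control=users.ctl log=users.log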
OR, if you want to use SQL Developer, you can use our wizard to read the file and insert the data, row-by-row.