I am trying to import data from a DB dump.
I have created a user 'user' and granted the following permissions:
CONNECT, RESOURCE, UNLIMITED TABLESPACE, DBA, ALL PRIVILEGES, IMP_FULL_DATABASE
I am running the following command:
imp <user>/<password>@<server> touser=<user> FILE=C:\App\<path>\admin\orcl\dpdump\EXPDAT.DMP full=y log=imp.log;
While running it, I get the following error message:
IMP-00401: dump file "C:\App\<path>\admin\orcl\dpdump\EXPDAT.DMP" may be an Data Pump export dump file
IMP-00000: Import terminated unsuccessfully
There are two sets of import and export utilities.
One is client based, and is also deprecated. That would be Import and Export, shortened as IMP and EXP.
Data Pump is the 'new,' server-based set of utilities, and it is much more powerful and efficient at getting data in and out of your database.
You'll need to place your DMP file in a database 'DIRECTORY' - these are OS directories known to the database, and you can see them in the data dictionary via
SELECT * FROM ALL_DIRECTORIES
It's likely you already have a directory defined just for Data Pump; look for something like 'DATA_PUMP_DIR'.
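If none of the listed directories points where you need it to, a privileged user can define one and grant access to it. A minimal sketch, assuming the directory name is arbitrary and the path is the folder that already holds the dump file on the database server:
-- my_dp_dir is an arbitrary name; the path must exist on the database server
CREATE OR REPLACE DIRECTORY my_dp_dir AS 'C:\App\<path>\admin\orcl\dpdump';
GRANT READ, WRITE ON DIRECTORY my_dp_dir TO <user>;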
Data Pump has a utility you can run from the OS, a PL/SQL API, and there is a Wizard in SQL Developer.
View > DBA menu.
Add a connection (not SYS), right-click on the Data Pump category, select Import Wizard, then walk through the dialog.
We'll create and kick off the job for you; you can also watch the progress of the job and check for any errors.
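If you would rather stay on the command line, the Data Pump equivalent of your failed imp call would look roughly like this. This is only a sketch: it assumes the dump file has been copied into the default DATA_PUMP_DIR directory, and the log file name is arbitrary:
impdp <user>/<password>@<server> directory=DATA_PUMP_DIR dumpfile=EXPDAT.DMP full=y logfile=imp.log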
Related
I'm currently trying to import an Oracle DB .dmp (dump) file into my Oracle DB using DBeaver but have trouble doing so.
The Oracle DB in question is running in a docker container. I successfully connected to this Oracle database with DBeaver, and can thus browse the database using DBeaver.
Currently however, the DB is empty. That's where the .dmp file comes in.
I want to import this .dmp file into my database under a certain schema, but I cannot seem to do this. The dump file is named something like 'export.dmp' and is around 16 MB.
I'd like to import the data from the .dmp file to be able to browse the data to get familiar with it, as similar data will be stored in our own database.
I looked online but was unable to get an answer that works for me.
I tried using DBeaver, but I don't seem to have the option to import or restore a DB via a .dmp file. At best, DBeaver offers to import data from a .CSV file. I also downloaded the Oracle tool SQL Developer, but I can't manage to connect to my database in the Docker container.
Online there is also talk of an import/export tool that can supposedly create these .dmp files and import them, but I'm unsure how to get this tool and whether that is the way to do it.
If so, I still don't understand how I can then browse the data in DBeaver.
How can I import and browse the data from the .dmp file in my Oracle DB using DBeaver?
How to find the Oracle Data Pump directory
presumably set to /u01/app/oracle/admin/<mydatabase>/dpdump on your system
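Rather than presuming, you can check the actual path from inside the database by querying the data dictionary (run this as a suitably privileged user):
SELECT directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';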
How to copy files from the host to a Docker container
docker cp export.dmp container_id:/u01/app/oracle/admin/<mydatabase>/dpdump/export.dmp
How do I get into a Docker container's shell
docker exec -it <mycontainer> bash
How to import an Oracle database from a dmp file
If it was exported using expdp, then start the import with impdp:
impdp <username>/<password> dumpfile=export.dmp full=y
It will write the log file to the same default DATA_PUMP_DIR directory in the container.
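If the dump was taken from a different schema than the one you want the data to land in, impdp can remap it with its standard REMAP_SCHEMA parameter. The schema names below are placeholders:
impdp <username>/<password> dumpfile=export.dmp remap_schema=<source_schema>:<target_schema>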
Oracle has two utilities for importing dumps, the original Import utility (IMP) and Data Pump Import (IMPDP). With IMP you can't use database directories; you have to specify the file location yourself. IMPDP, on the other hand, requires a database directory.
Having said that, you can't import Oracle export dumps using DBeaver; you have to use the IMP or IMPDP utility from the OS.
My question is about the following: I need to move a database from Oracle Database 10g Express Edition to 11g. I was given the backup on a pendrive; it is a file with the extension .dmp (dump file).
I installed 11g Express Edition on a new server, which also installed the database that comes with it (XE).
I want to restore the 10g database to a drive other than C:, which is where the Oracle 11g database is installed. I also want this new database to "replace" the XE one (I do not know if that is the correct way to say it).
I have only found ways to adjust and change file locations, but only within the same drive.
Any guidance would be very useful.
Thank you.
Judging from the comments, it sounds like you have been given a Database Dump file (.dmp) from a database on a pendrive, and you need to figure out how to get that file into a database, correct?
First, let's go over some background. What is a dump file (.dmp)? From Oracle:
The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set.
At a high level, that .dmp file is a packaged copy of the object definitions (DDL) and data needed to recreate whatever was exported. .dmp files make it easier to transport and move large amounts of data between databases using Data Pump. But what is Data Pump? Again, from Oracle:
Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.
Basically, Data Pump is a set of utilities (EXPDP & IMPDP) that are used to move data between databases. The .dmp file you have was likely created using EXPDP. You will need to use IMPDP to import that .dmp file into a database.
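For reference, a dump like this is typically produced on the source side with a command along these lines. This is only a sketch; the schema, directory, and file names are placeholders:
expdp <username>/<password> schemas=<schema_name> directory=DATA_PUMP_DIR dumpfile=export.dmp logfile=export.log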
Here's where it gets interesting - you say that you already have an 11g database, correct? If you want to, you should be able to import the 10g dump file directly into your 11g database without any issues. The reason is that Oracle tends to be backwards compatible: typically speaking, anything that you do with one version of Oracle will be compatible with the version that immediately succeeds it. Jumping from something like Oracle 8i to 11g won't work, but you can always go from 8i to 9i, from 9i to 10g, and so on.
If you want to import that dump file into your 11g database, here's what you'll need to do:
Create a DBA account, or have an account that has been granted Data Pump privileges explicitly.
Move the .dmp file to the server where your 11g database lives. If you want to make it even easier for yourself, you can move the .dmp file to your database's Data Pump directory. If you don't know where that is, execute the following query on your database: select * from all_directories where directory_name = 'DATA_PUMP_DIR'; This query will return a directory path. You don't have to use this directory; it just makes things easier.
Once you have the dump file in place and you have all of the necessary database and operating system privileges, you are ready to import the dmp file. Open a new command line window, set your Oracle home if it is not already set, and then navigate to the directory where you placed the .dmp file. Your import command will look something like this:
impdp [USERNAME]/[PASSWORD]@[DATABASE] directory=[DIRECTORY] dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Where [USERNAME]/[PASSWORD] are your credentials, [DATABASE] is the name of the database you're importing the dump file into, [DIRECTORY] is whatever directory you placed the dump file in, [FILENAME] is the name of the .dmp file, and [LOGFILE] is whatever name you chose for your log file.
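For instance, with the dump file sitting in DATA_PUMP_DIR and a hypothetical user scott importing into the XE service, the filled-in command might look like this (every name here is a placeholder, and the account needs the Data Pump privileges mentioned in the first step):
impdp scott/tiger@XE directory=DATA_PUMP_DIR dumpfile=export.dmp logfile=import.log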
Assuming your database has everything necessary for the .dmp file, the import should begin and you will start seeing status updates that look similar to this:
Starting [USERNAME]."SYS_IMPORT_FULL_01": [USERNAME]/******** directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Note that this is just an example, your results may look different. Assuming all goes well, you will see a message like this at the end:
Job [USERNAME]."SYS_IMPORT_FULL_01" completed
If you don't want to import it into your existing 11g database, you could always spin up a new database and import the .dmp file to that one using these same guidelines.
That should be enough to get you started down the right path, hope this helps and good luck!
P.S. Here is a great FAQ on the Data Pump utilities as well: http://www.orafaq.com/wiki/Import_Export_FAQ
How can I import a Data Pump file into a table using Oracle Data Integrator?
I have the file in a local directory; I need to read this Data Pump file from the directory and import it into a table in Oracle.
How is this possible with Oracle Data Integrator?
To import a dumpfile you must run the "impdp" command.
There are a few ways you can make ODI do that for you:
You can create a shell script file that calls impdp with all the necessary parameters, and create an ODI package (using OdiOSCommand) that simply runs the shell script; a minimal example script is sketched after this list. For this to work, your ODI agent must have access to the script and also to the database client (or the database home) so it can run impdp. (You can also use OdiOSCommand to run impdp directly.)
The same idea from the first option can be done using an ODI Procedure (if, for example, the import is just part of a bigger integration flow).
ODI also has an LKM that uses Data Pump, but it is meant to export a source table into a dumpfile and import it into a target database. If you have access to the source table metadata inside ODI Studio, you can create a simple mapping between the source and target tables, choose the Data Pump LKM, and simulate the execution; ODI will generate all the code needed to import the dumpfile.
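As mentioned in the first option, a minimal shell script for the OdiOSCommand step to invoke could look like the sketch below; the credentials, service name, directory, and file names are all placeholders:
#!/bin/sh
# Runs a Data Pump import; assumes ORACLE_HOME points to an Oracle home visible to the ODI agent
$ORACLE_HOME/bin/impdp <username>/<password>@<service> directory=DATA_PUMP_DIR dumpfile=export.dmp logfile=import.log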
I am looking for a command line utility to export and import an entire SQL Server database. I am looking to automate the process of moving data from source database to destination database given the credentials.
This would be similar to exp and imp command for Oracle.
I have already looked at bcp and the SQL Server Import and Export Wizard.
Could someone point me to such a utility?
I haven't found one, if it even exists. I typically script up a PowerShell function like this one to serve the purpose:
Export with PowerShell
You can then call the script from the command prompt and even add parameters to export by table, database, etc.
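For example, a call from the command prompt might look like the line below; the script name and its parameters are hypothetical and depend on how you wrote the function:
powershell.exe -File Export-Database.ps1 -ServerInstance <server> -Database <database> -OutFile C:\exports\<database>.sql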
I'm trying to do a data-only export using Oracle Data Pump. I just wanted to do a test run on a single schema; however, the result is not correct. I call expdp as follows:
./expdp user/password#x.x.x.x:1524/orcl1 DIRECTORY=EXPORTDIR DUMPFILE=orcl1_export.dmp LOGFILE=orcl1_export.log CONTENT=DATA_ONLY SCHEMAS=pfl_adm
Even though I provided the username and password on the command line, I still get a prompt to log in as SYSDBA. Once I log in as SYSDBA, the export runs, but the tool seems to be exporting another Oracle instance (we have several copies with the same schemas). The number of exported rows in the log file does not match the number of rows in the database. Do you have any ideas?