Importing a dump (.dmp) file into a MySQL database

Can I import a .dmp file created by Oracle (8i) into a MySQL database? I tried importing the dump file directly using MySQL Workbench, but it shows the error below.
Creating schema newschema
10:49:15 Restoring G:\dmp\pass.dmp
Running: mysql.exe --defaults-extra-file="c:\users\acer\appdata\local\temp\tmpliqb6y.cnf" --host=localhost --user=root --port=3306 --default-character-set=utf8 --comments --database=newschema < "G:\\dmp\\xyz.dmp"
ERROR 1064 (42000) at line 1: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '
How to solve this?

No, you probably can't.
An Oracle export is a proprietary, undocumented file format. Realistically, it can only be read by the Oracle import utility. There may be third-party tools that have reverse-engineered the file format, but I'm not aware of any that would import into MySQL.
If you import the data into a local Oracle database, it becomes much easier to transfer it into a MySQL database. Any ETL tool can do so, you can write an application to move the data, you can connect from Oracle to MySQL to push the data, you can extract the DDL and DML from the Oracle database, etc.
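For example, once the data is in a local Oracle database, one low-tech route is to spool each table to a CSV file from SQL*Plus and load that file into MySQL. A minimal sketch, assuming a simple emp table with three columns (all names and paths here are placeholders):
-- In SQL*Plus, connected to the local Oracle database:
SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON
SPOOL /tmp/emp.csv
SELECT empno || ',' || ename || ',' || sal FROM emp;
SPOOL OFF
-- In MySQL, after creating a matching emp table:
LOAD DATA LOCAL INFILE '/tmp/emp.csv' INTO TABLE emp
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';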

Related

How to import an Oracle DB .dmp file using DBeaver?

I'm currently trying to import an Oracle DB .dmp (dump) file into my Oracle DB using DBeaver but have trouble doing so.
The Oracle DB in question is running in a docker container. I successfully connected to this Oracle database with DBeaver, and can thus browse the database using DBeaver.
Currently however, the DB is empty. That's where the .dmp file comes in.
I want to import this .dmp file into my database under a certain schema, but I cannot seem to do this. The dump file is named something like 'export.dmp' and is around 16 MB.
I'd like to import the data from the .dmp file to be able to browse the data to get familiar with it, as similar data will be stored in our own database.
I looked online but was unable to get an answer that works for me.
I tried using DBeaver, but I don't seem to have the option to import or restore a DB via a .dmp file. At best, DBeaver proposes to import data from a .CSV file. I also downloaded Oracle SQL Developer, but I can't manage to connect to my database in the Docker container.
Online there is also talk of an import / export tool that supposedly can create these .dmp files and import them, but I'm unsure how to get this tool and whether that is the way to do it.
If so, I still don't understand how I can get to browse the data in DBeaver.
How can I import and browse the data from the .dmp file in my Oracle DB using DBeaver?
How to find Oracle datapump dir
presumably set to /u01/app/oracle/admin/<mydatabase>/dpdump on your system
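You can confirm the actual path by querying the data dictionary from a privileged account:
SELECT directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';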
How to copy files from host to docker container
docker cp export.dmp container_id:/u01/app/oracle/admin/<mydatabase>/dpdump/export.dmp
How do I get into a Docker container's shell
docker exec -it <mycontainer> bash
How to import an Oracle database from dmp file
If it was exported using expdp, then start the import with impdp:
impdp <username>/<password> dumpfile=export.dmp full=y
It will output the log file in the same default DATA_PUMP_DIR directory in the container.
Oracle has two utilities to import dumps, IMP and IMPDP. With IMP you can't use database directories; you have to specify the file location directly. IMPDP, on the other hand, requires a database directory object.
Having said that, you can't import Oracle export dumps using DBeaver; you have to use the IMP or IMPDP utility from the OS.
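For example, the same import with each utility might look like this (placeholder credentials and paths; note that IMP takes the file path directly, while IMPDP resolves dumpfile against a directory object):
imp system/password@XE file=/tmp/export.dmp full=y
impdp system/password@XE directory=DATA_PUMP_DIR dumpfile=export.dmp full=y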

Restore an Oracle database to another disk drive in Windows Server 2012

My question is about the following: I need to move a database from Oracle Database 10g Express Edition to 11g. I was given the backup on a pendrive; it is a file with the extension .dmp (dump file).
I installed 11g Express Edition on a new server, which also installed the default database that comes with it (XE).
I want to restore the 10g database to a drive other than C:, which is where the Oracle 11g database is installed. I also want this new database to "replace" the XE one (I do not know if that is the correct way to say it).
I have only found ways to adjust and change locations within the same drive.
Any guidance would be very useful.
Thank you.
Judging from the comments, it sounds like you have been given a Database Dump file (.dmp) from a database on a pendrive, and you need to figure out how to get that file into a database, correct?
First, let's go over some background. What is a dump file (.dmp)? From Oracle:
The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set.
At a high level, that .dmp file is a collection of the DDL and DML needed to recreate whatever data and objects were exported. .dmp files make it easier to transport and move large amounts of data between databases using Data Pump. But what is Data Pump? Again, from Oracle:
Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.
Basically, Data Pump is a set of utilities (EXPDP & IMPDP) that are used to move data between databases. The .dmp file you have was likely created using EXPDP. You will need to use IMPDP to import that .dmp file into a database.
Here's where it gets interesting: you say that you already have an 11g database, correct? If you want to, you should be able to import the 10g dump file directly into your 11g database without any issues. The reason is that Oracle tends to be backwards compatible; typically speaking, anything that you do with one version of Oracle will be compatible with the version that immediately succeeds it. Jumping from something like Oracle 8i to 11g won't work, but you can always go from 8i to 9i, from 9i to 10g, and so on.
If you want to import that dump file into your 11g database, here's what you'll need to do:
Create a DBA account, or have an account that has been granted Data Pump privileges explicitly.
Move the .dmp file to the server where your 11g database lives. If you want to make it even easier for yourself, you can move the .dmp file to your database's Data Pump directory. If you don't know where that is, execute the following query on your database:
select * from all_directories where directory_name = 'DATA_PUMP_DIR';
This query will return a directory. You don't have to use this directory; it just makes things easier.
Once you have the dump file in place and you have all of the necessary database and operating system privileges, you are ready to import the dmp file. Open a new command line window, set your Oracle home if it is not already set, and then navigate to the directory where you placed the .dmp file. Your import command will look something like this:
impdp [USERNAME]/[PASSWORD]@[DATABASE] directory=[DIRECTORY] dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Where [USERNAME]/[PASSWORD] are your credentials, [DATABASE] is the name of the database you're importing the dump file into, [DIRECTORY] is whatever directory you placed the dump file in, [FILENAME] is the name of the .dmp file, and [LOGFILE] is whatever name you chose for your log file.
Assuming your database has everything necessary for the .dmp file, the import should begin and you will start seeing status updates that look similar to this:
Starting [USERNAME]."SYS_IMPORT_FULL_01": [USERNAME]/******** directory=DATA_PUMP_DIR dumpfile=[FILENAME].dmp logfile=[LOGFILE].log
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Note that this is just an example, your results may look different. Assuming all goes well, you will see a message like this at the end:
Job [USERNAME]."SYS_IMPORT_FULL_01" completed
If you don't want to import it into your existing 11g database, you could always spin up a new database and import the .dmp file to that one using these same guidelines.
That should be enough to get you started down the right path, hope this helps and good luck!
P.S. Here is a great FAQ on the Data Pump utilities as well: http://www.orafaq.com/wiki/Import_Export_FAQ

Import DB2 files to SQL Server

Given the DAT file and the DDL file for each table in a DB2 database, can I import this data to SQL Server? I have no access to the original server or any copy of a DB2 server so connecting to a live instance isn't an option.
Can I do this without a live instance of DB2 or should I go back to the client and ask for CSV files? Is there a procedure or tool that makes this process smoother? I've tried to find a file-based connection string to use to connect to a set of DB2 files with no luck. I've also tried SwissSQLDB2ToSQLServer and SqlLinesData to see if they have a file-based option built in.
OK, given the comment above: you can't import DB2's container files (DAT, LRG, or anything else) directly. You need a CSV or equivalent. Yes, one way to get this is to run the EXPORT utility on a live DB2 database. HTH!
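If the client can run it against a live DB2 instance, the EXPORT utility writes a delimited file per table, something along these lines (a sketch with placeholder database, schema, and table names):
db2 connect to MYDB
db2 "EXPORT TO /tmp/mytable.del OF DEL SELECT * FROM myschema.mytable"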

Can I import dump on existing & running oracle database using impdp?

Can I import a dump into an existing/running Oracle database using impdp? Would that clear the existing data before importing the new data, or would it just update it?
Note: No change in schema in new dump. The dump is taken using expdp.
Database: Oracle 10g
Appreciate your response!
Muzaffar
Yes, impdp works only on running databases. Also set the impdp parameter table_exists_action to skip, append, truncate, or replace (whichever you prefer) to control what happens to tables that already exist in the target; the default, skip, leaves existing tables untouched.
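For example, to overwrite any tables that already exist (placeholder credentials and file names; replace drops and recreates each existing table before loading its rows):
impdp system/password@ORCL directory=DATA_PUMP_DIR dumpfile=export.dmp full=y table_exists_action=replace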

Migrating from Postgres to SQL Server 2008

I need to migrate a database from Postgres 7 to SQL Server 2008. I am familiar with the SSIS Import and Export Wizard, but I am stumped about how to define the data source and data provider.
What is the best way to migrate Postgres to SQL Server, and how do I define data sources/drivers for postgres?
I was having problems using the Import Wizard in SQL Server 2008 R2 to import tables from PostgreSQL. I had the PostgreSQL ODBC driver installed, so for the Data Source in the Import Wizard I chose ".Net Framework Data Provider for Odbc" and supplied the DSN name for my PostgreSQL database. The wizard found the tables okay, but when I went to perform the import I got the error
Column information for the source and destination data could not be retrieved.
“Billing” -> [dbo].[Billing]:
– Cannot find column -1.
I found the solution in the Microsoft blog post here. Apparently the problem is that various ODBC drivers use different attribute names when reporting column metadata. To get the import to work I had to edit the "ProviderDescriptors.xml" file, which was located at
C:\Program Files\Microsoft SQL Server\100\DTS\ProviderDescriptors\ProviderDescriptors.xml
In the ...
<dtm:ProviderDescriptor SourceType="System.Data.Odbc.OdbcConnection">
... element I had to change the attributes from ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "COLUMN_SIZE"
NumericPrecisionColumnName = "COLUMN_SIZE"
NumericScaleColumnName = "DECIMAL_DIGITS"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
... to ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "LENGTH"
NumericPrecisionColumnName = "PRECISION"
NumericScaleColumnName = "SCALE"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
That is, I had to tweak the MaximumLengthColumnName, NumericPrecisionColumnName, and NumericScaleColumnName attribute values to "LENGTH", "PRECISION", and "SCALE", respectively.
Once that change was made the import from PostgreSQL to SQL Server ran successfully.
I wish you the best of luck in trying to import from PostgreSQL into SQL Server using SQL Server Import and Export Wizard. However, I have read numerous message board threads with people having trouble getting it to work. For example:
Import Data from Postgresql to SQL Server 08 Error
Here is the most helpful thread that I have found on the topic:
Import data from postgreSQL into SQL server 2005
To help someone who might be trying to achieve a similar goal to mine: instead of selecting the “PostgreSQL OLE DB Provider” in the data source drop-down menu of the SQL Server Import and Export Wizard, select “.Net Framework Data Provider for Odbc”.
Then you have to make a DSN and provide a ConnectionString. The following ConnectionString worked for me:
Driver={PostgreSQL};Server=localhost;Port=5432;Database=TestMasterMap;Uid=postgres;Pwd=;
To make a DSN you have to go into Administrative Tools → Data Sources (ODBC) and create a user DSN. Once this is done you can supply the DSN name in the DSN text box of the SQL Server Import and Export Wizard.
One commenter claimed that it worked, but that he got "Out of memory while reading tuples" errors on big tables. So for tables with more than 3 million rows, he had to break the import up into 3 million row chunks.
Also, there's a link to the native .NET provider for PostgreSQL in that thread.
Personally, if this is something that I only had to do once, and if I understood the schema and the data fairly well, I would try:
export the data from PostgreSQL as flat files
create the schema in SQL Server (without PKs or constraints)
use the SSIS Import/Export Wizard to import the flat files
then create PKs and necessary constraints
It might take you less time to do the above than messing with SSIS Import/Export Wizard and PostgreSQL for days (but it would be nice if those tools worked!)
As I finished commenting on the answer above, I thought of trying SQL Workbench/J; it has a data pump feature that worked pretty well for me. I managed to export data from my PostgreSQL database to a SQL Server instance.
For those who'd like to run this in batch mode (via shell), here's how to do it: Google Groups Thread. The WbCopy command mentioned in the discussion isn't really documented anywhere I could find, but you can generate one through the data pump interface and then change whatever you need.
To give a more practical example of how you can achieve what's described in the marked answer: you can export from PostgreSQL to flat files, then use the bcp utility to import them into SQL Server.
e.g. in a .bat file, for a single table (and you need to have the table already created in the destination SQL DB):
@echo off
set DbName=YOUR_POSTGRES_DB_NAME
set csvpath=C:\PATH_TO_CSV\CSV_NAME.csv
set username=YOUR_POSTGRES_DB_USERNAME
:: Export to CSV, note we're using a ~ delimiter to avoid issues with commas in fields
psql -U %username% -d %DbName% -c "COPY (select * from SOURCE_TABLE_NAME) TO STDOUT (FORMAT CSV, HEADER TRUE, DELIMITER '~', ENCODING 'UTF8');" > %csvpath%
:: Import CSV to SQL Server
set logpath=C:\bcplog.txt
set errorlogpath=C:\bcperrors.txt
set sqlserver=YOUR_SQL_SERVER
set sqldb=YOUR_DB_NAME
:: code page 65001 = UTF-8; -F2 skips the CSV header row exported above
bcp DESTINATION_TABLE_NAME IN %csvpath% -t~ -F2 -c -C65001 -S %sqlserver% -d %sqldb% -T -o %logpath% -e %errorlogpath%
