Migrating from Postgres to SQL Server 2008

I need to migrate a database from Postgres 7 to SQL Server 2008. I am familiar with the SSIS Import and Export Wizard, but I am stumped about how to define the data source or data provider.
What is the best way to migrate Postgres to SQL Server, and how do I define data sources/drivers for Postgres?

I was having problems using the Import Wizard in SQL Server 2008 R2 to import tables from PostgreSQL. I had the PostgreSQL ODBC driver installed, so for the Data Source in the Import Wizard I chose ".Net Framework Data Provider for Odbc" and supplied the DSN name for my PostgreSQL database. The wizard found the tables okay, but when I went to perform the import I got the error:
Column information for the source and destination data could not be retrieved.
“Billing” -> [dbo].[Billing]:
– Cannot find column -1.
I found the solution in the Microsoft blog post here. Apparently the problem is that various ODBC drivers use different attribute names when reporting column metadata. To get the import to work I had to edit the "ProviderDescriptors.xml" file, which was located at
C:\Program Files\Microsoft SQL Server\100\DTS\ProviderDescriptors\ProviderDescriptors.xml
In the ...
<dtm:ProviderDescriptor SourceType="System.Data.Odbc.OdbcConnection">
... element I had to change the attributes from ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "COLUMN_SIZE"
NumericPrecisionColumnName = "COLUMN_SIZE"
NumericScaleColumnName = "DECIMAL_DIGITS"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
... to ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "LENGTH"
NumericPrecisionColumnName = "PRECISION"
NumericScaleColumnName = "SCALE"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
That is, I had to tweak the MaximumLengthColumnName, NumericPrecisionColumnName, and NumericScaleColumnName attribute values to "LENGTH", "PRECISION", and "SCALE", respectively.
Once that change was made the import from PostgreSQL to SQL Server ran successfully.
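If you have to apply the same tweak on several machines, the attribute swap can be scripted. A minimal sketch using plain-text substitution (brittle by design: it assumes the exact spelling and spacing shown in the snippet above; back up the file first):

```python
# Patch ProviderDescriptors.xml so the ODBC provider descriptor uses the
# column-metadata attribute names that the PostgreSQL ODBC driver reports.
REPLACEMENTS = {
    'MaximumLengthColumnName = "COLUMN_SIZE"': 'MaximumLengthColumnName = "LENGTH"',
    'NumericPrecisionColumnName = "COLUMN_SIZE"': 'NumericPrecisionColumnName = "PRECISION"',
    'NumericScaleColumnName = "DECIMAL_DIGITS"': 'NumericScaleColumnName = "SCALE"',
}

def patch(xml_text):
    """Apply the three attribute renames as plain-text substitutions."""
    for old, new in REPLACEMENTS.items():
        xml_text = xml_text.replace(old, new)
    return xml_text

# Uncomment to patch the real file (path from above; run as Administrator):
# path = r"C:\Program Files\Microsoft SQL Server\100\DTS\ProviderDescriptors\ProviderDescriptors.xml"
# text = open(path, encoding="utf-8").read()
# open(path, "w", encoding="utf-8").write(patch(text))
```

Plain text substitution is used instead of an XML parser deliberately, so the rest of the file (comments, formatting, other providers) is left byte-for-byte intact.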

I wish you the best of luck in trying to import from PostgreSQL into SQL Server using SQL Server Import and Export Wizard. However, I have read numerous message board threads with people having trouble getting it to work. For example:
Import Data from Postgresql to SQL Server 08 Error
Here is the most helpful thread that I have found on the topic:
Import data from postgreSQL into SQL server 2005
To help anyone trying to achieve a similar goal to mine: instead of selecting the "PostgreSQL OLE DB Provider" in the data source drop-down menu of the SQL Server Import and Export Wizard, select ".Net Framework Data Provider for Odbc".
Then you have to create a DSN and provide a ConnectionString. The following ConnectionString worked for me:
Driver={PostgreSQL};Server=localhost;Port=5432;Database=TestMasterMap;Uid=postgres;Pwd=;
To make a DSN, go to Administrative Tools > Data Sources (ODBC) and create a user DSN. Once this is done, you can supply the DSN name in the DSN text box of the SQL Server Import and Export Wizard.
One commenter claimed that it worked, but that he got "Out of memory while reading tuples" errors on big tables. So for tables with more than 3 million rows, he had to break the import up into 3 million row chunks.
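The chunking workaround can be scripted. A sketch that builds one COPY statement per 3-million-row slice (the table name and the `id` ordering column are placeholders; note that OFFSET pagination gets slower on later chunks, so a keyset range on an indexed column is sturdier for very large tables, and the COPY option syntax shown is the modern form, newer than Postgres 7):

```python
CHUNK = 3_000_000  # rows per slice, per the commenter's workaround

def chunked_copy_statements(table, total_rows, key="id", chunk=CHUNK):
    """Build one COPY ... TO STDOUT statement per slice of the table."""
    stmts = []
    for offset in range(0, total_rows, chunk):
        stmts.append(
            f"COPY (SELECT * FROM {table} ORDER BY {key} "
            f"LIMIT {chunk} OFFSET {offset}) TO STDOUT (FORMAT CSV);"
        )
    return stmts
```

Each statement can then be fed to psql in turn, giving the driver a bounded result set instead of one giant tuple stream.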
Also, there's a link to the native .NET provider for PostgreSQL in that thread.
Personally, if this is something that I only had to do once, and if I understood the schema and the data fairly well, I would try:
export the data from PostgreSQL as flat files
create the schema in SQL Server (without PKs or constraints)
use the SSIS Import/Export Wizard to import the flat files
then create PKs and necessary constraints
It might take you less time to do the above than messing with SSIS Import/Export Wizard and PostgreSQL for days (but it would be nice if those tools worked!)
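The flat-file route can also skip the wizard entirely: T-SQL's BULK INSERT can load the exported files. A sketch that pairs a psql \copy export with a BULK INSERT load for each table (paths, delimiter, and table names are illustrative; FIRSTROW = 1 assumes no header row, and you would still create the schema first and add PKs afterwards, as in the steps above):

```python
def migration_statements(tables, csv_dir=r"C:\migration"):
    """For each table, pair a PostgreSQL \\copy export with a
    SQL Server BULK INSERT load (options illustrative, not tuned)."""
    pairs = []
    for t in tables:
        csv_path = rf"{csv_dir}\{t}.csv"
        # Run in psql against the source database:
        export = f"\\copy {t} TO '{csv_path}' WITH CSV"
        # Run in SSMS/sqlcmd against the destination database:
        load = (
            f"BULK INSERT dbo.{t} FROM '{csv_path}' "
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 1);"
        )
        pairs.append((export, load))
    return pairs
```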

As I finished commenting on the answer above, I thought of trying SQL Workbench/J; it has a Data Pumper feature that worked pretty well for me. I managed to export data from my PostgreSQL database to a SQL Server instance.
For those who'd like to run this in batch mode (via a shell), here's how: Google Groups Thread. The WbCopy command mentioned in the discussion isn't really documented anywhere I could find, but you can generate one through the Data Pumper interface and then change whatever you need.

To give a more practical example of how to achieve what's described in the accepted answer: you can export from PostgreSQL to flat files and then use the bcp utility to import them into SQL Server.
For example, in a .bat file, for a single table (the table must already exist in the destination SQL Server database):
@echo off
set DbName=YOUR_POSTGRES_DB_NAME
set csvpath=C:\PATH_TO_CSV\CSV_NAME.csv
set username=YOUR_POSTGRES_DB_USERNAME
:: Export to CSV, note we're using a ~ delimiter to avoid issues with commas in fields
psql -U %username% -d %DbName% -c "COPY (select * from SOURCE_TABLE_NAME) TO STDOUT (FORMAT CSV, HEADER TRUE, DELIMITER '~', ENCODING 'UTF8');" > %csvpath%
:: Import CSV to SQL Server
set logpath=C:\bcplog.txt
set errorlogpath=C:\bcperrors.txt
set sqlserver=YOUR_SQL_SERVER
set sqldb=YOUR_DB_NAME
:: code page 65001 = UTF-8
bcp DESTINATION_TABLE_NAME IN %csvpath% -t~ -F1 -c -C65001 -S %sqlserver% -d %sqldb% -T -o %logpath% -e %errorlogpath%

Related

Exporting sql scripts of IBM DB2 objects

Is there any way to export a file with SQL scripts of IBM DB2 objects.
I have done this in MS-SQL by following the below link. I am not sure how to do this in DB2.
Exporting data In SQL Server as INSERT INTO
I've never used iSeries Navigator to do this; however, a fairly thorough look just now didn't turn up anything similar to what you're looking for from SSMS.
What I recommend is DBeaver Community Edition or Toad, as others have mentioned. DBeaver CE is free, which leans me toward that option.
Create a connection to your DB2 instance (the software will prompt you and download the drivers for you), and then use the program to create your SQL insert statements.
Once you add your connection, expand the library/schema, go to Tables or Views, right-click the object you'd like to export, and click Export Data.
Select SQL and click Next.
There will be a few more options to select, but this should get you where you want to go. Let me know if you need any more help.
For some reasons, we are not supposed to use third-party tools on our old server.
To generate the DML statements, I used this SQL query:
SELECT 'Insert into Table (DIV,SA,KEY1,KEY2,VALUE1,VALUE) values('''||DIV||''','''||SA||''','''|| KEY1 ||''','''|| KEY2 ||''','''|| VALUE1 ||''','''|| VALUE ||''');' from Table
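The same idea as a small script, which also doubles embedded single quotes, something the concatenation above doesn't handle, so values containing apostrophes don't break the generated SQL (table and column names are just the ones from the query above):

```python
def insert_statement(table, columns, row):
    """Build one INSERT statement for a row, escaping single quotes
    by doubling them (standard SQL escaping)."""
    def quote(value):
        if value is None:
            return "NULL"
        return "'" + str(value).replace("'", "''") + "'"
    cols = ",".join(columns)
    vals = ",".join(quote(v) for v in row)
    return f"Insert into {table} ({cols}) values({vals});"
```

Fetch the rows with any DB2 client and run each row through this function to get a portable DML script.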
To generate the DDL statements, I used the command below:
db2look -i DBUser -w DBpassword -d DBSchema -a -e -x -o FILE_OUT.txt

Import DB2 files to SQL Server

Given the DAT file and the DDL file for each table in a DB2 database, can I import this data to SQL Server? I have no access to the original server or any copy of a DB2 server so connecting to a live instance isn't an option.
Can I do this without a live instance of DB2 or should I go back to the client and ask for CSV files? Is there a procedure or tool that makes this process smoother? I've tried to find a file-based connection string to use to connect to a set of DB2 files with no luck. I've also tried SwissSQLDB2ToSQLServer and SqlLinesData to see if they have a file-based option built in.
OK, given the comment above, you can't import DB2's container files (DAT, LRG, or anything else) directly. You need a CSV or equivalent. Yes, one way to get this is run the EXPORT utility on a live DB2 database. HTH!

Can I import dump on existing & running oracle database using impdp?

Can I import a dump into an existing/running Oracle database using impdp? Would that clear the existing data before importing the new data, or would it just be an update?
Note: No change in schema in new dump. The dump is taken using expdp.
Database: Oracle 10g
Yes, impdp works only on running databases. Also set the impdp parameter table_exists_action to skip, append, truncate, or replace (whichever you prefer) to control what happens to tables that already exist.

Importing Dump (.dmp) file into mysql

Can I import a .dmp file created from Oracle (8i) into a MySQL database? I tried importing the dump file directly using MySQL Workbench, but it shows an error, which I mention below.
Creating schema newschema
10:49:15 Restoring G:\dmp\pass.dmp
Running: mysql.exe --defaults-extra-file="c:\users\acer\appdata\local\temp\tmpliqb6y.cnf" --host=localhost --user=root --port=3306 --default-character-set=utf8 --comments --database=newschema < "G:\\dmp\\xyz.dmp"
ERROR 1064 (42000) at line 1: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '
How to solve this?
No, you probably can't.
An Oracle export is a proprietary, undocumented file format. It can, realistically, only be read by the Oracle import utility. There may be some third-party tools that have reverse-engineered the file format, but I'm not aware of any that would import into MySQL.
If you import the data into a local Oracle database, it becomes much easier to transfer it into a MySQL database. Any ETL tool can do so, you can write an application to move the data, you can connect from Oracle to MySQL to push the data, you can extract the DDL and DML from the Oracle database, etc.
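Writing an application to move the data is straightforward with Python's DB-API, since Oracle and MySQL drivers (e.g. cx_Oracle and mysql-connector) expose the same cursor interface. A runnable sketch using sqlite3 on both ends as a stand-in for the real drivers (the destination table must already exist, and the placeholder style is an assumption: sqlite uses `?`, MySQL drivers use `%s`, Oracle uses `:1`):

```python
import sqlite3  # stand-in; swap for cx_Oracle / mysql-connector in real use

def copy_table(src_conn, dst_conn, table, batch=1000):
    """Stream all rows of `table` from src to dst in batches via the DB-API."""
    src = src_conn.cursor()
    src.execute(f"SELECT * FROM {table}")
    # One placeholder per column; use "%s" for MySQL drivers, ":1" style for Oracle.
    placeholders = ",".join("?" for _ in src.description)
    dst = dst_conn.cursor()
    while True:
        rows = src.fetchmany(batch)
        if not rows:
            break
        dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
    dst_conn.commit()
```

Batching with fetchmany keeps memory bounded on large tables, which matters for exactly the kind of multi-million-row imports discussed earlier.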

SQL Server import of file created through bcp

I'm currently reviewing how to import a file created by bcp in SQL Server on one computer into my local SQL Server. This is a data file I received from a third party, so I have no idea of the structure of the information. My SQL Server skills are quite new, so please bear with me :) I've reviewed the bcp documentation and attempted the following:
bcp TestDatabase in File.bcp -T
Results: Invalid Object name 'TestDatabase'
I created the test database TestDatabase and tried the command again, but got the same response. I then added -Slocal and got a login timeout; seems like progress!
I removed the -T flag and tried varying combinations of usernames and passwords without any luck.
So I guess to start, is there an underlying issue I'm missing, syntax I'm not following etc. or should I just play around with the creds for my local SQL Server?
You need to specify the server, username, and table. Try this:
bcp TestDatabase..SomeTableName in File.bcp -S Server -U Username -P Password
If you look at the bcp utility docs:
-T means to use integrated security (your logged-in account).
-S is the server name parameter.
These two parameters are not interchangeable.
You can only use -U and -P in place of -T if you have SQL authentication turned on (and you shouldn't if you can avoid it).
Finally, Chris Shain is correct: you need to specify the schema and table (or view name), not just a database name, as well as the server (-S).
Also from the documentation (and the point E.J. Brennan was making):
To import data into a table, you must either use a format file created
for that table or understand the structure of the table and the types
of data that are valid for its columns.
So you can't expect to just take a file and have bcp magically make a table for you. You need SSIS or some other tool to do that.
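Short of a format file, one pragmatic move is to stage everything as text and sort out types later. Since -c produces tab-delimited rows with newline row terminators by default, you can sniff one line of the file and generate a wide staging table. A rough sketch (the table name, column names, and NVARCHAR width are all made up for illustration, and it assumes default character-mode terminators):

```python
def staging_table_sql(sample_line, table="dbo.Staging", width=4000):
    """Guess a CREATE TABLE from one row of a character-mode bcp file
    (tab-delimited with newline row terminator, the -c defaults)."""
    n = len(sample_line.rstrip("\r\n").split("\t"))
    cols = ",\n  ".join(f"col{i} NVARCHAR({width})" for i in range(1, n + 1))
    return f"CREATE TABLE {table} (\n  {cols}\n);"
```

Run the generated CREATE TABLE, bcp the file into the staging table, and then inspect the data to work out real column names and types.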
