Exporting SQL scripts of IBM DB2 database objects

Is there any way to export a file with SQL scripts of IBM DB2 objects?
I have done this in MS SQL Server by following the link below; I am not sure how to do the same in DB2.
Exporting data In SQL Server as INSERT INTO

I've never used iSeries Navigator to do this; however, a fairly thorough once-over just now didn't yield anything similar to what you're looking for from SSMS.
What I recommend is DBeaver Community Edition or Toad, as others have mentioned. DBeaver CE is free, which leans me towards that option.
Create a connection to your DB2 instance (the software will prompt you and download the drivers for you), and then use the program to create your SQL INSERT statements.
Once you have added your connection, expand the library/schema, go to Tables or Views, right-click the object you'd like to export, and click Export Data.
Select SQL and click Next.
There will be a few more options to select, but this should get you where you want to go. Let me know if you need any more help.
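For reference, the JDBC URL DBeaver builds for such a connection typically has this shape (host, port, and database name below are placeholders; 50000 is the conventional DB2 port):
jdbc:db2://dbhost:50000/MYDB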

For certain reasons, we are not supposed to use third-party tools on our old server.
For generating the DML statements, I used the SQL query:
SELECT 'Insert into Table (DIV,SA,KEY1,KEY2,VALUE1,VALUE) values('''||DIV||''','''||SA||''','''|| KEY1 ||''','''|| KEY2 ||''','''|| VALUE1 ||''','''|| VALUE ||''');' from Table
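One caveat with string concatenation in DB2: if any column is NULL, the entire concatenated expression (and therefore that row's INSERT) becomes NULL. A hedged variant that substitutes empty strings, assuming the same table and columns as above (note this loads NULLs as empty strings; wrap the target columns in NULLIF, or use CASE expressions, if you need to preserve NULLs):
SELECT 'Insert into Table (DIV,SA,KEY1,KEY2,VALUE1,VALUE) values(''' || COALESCE(DIV,'') || ''',''' || COALESCE(SA,'') || ''',''' || COALESCE(KEY1,'') || ''',''' || COALESCE(KEY2,'') || ''',''' || COALESCE(VALUE1,'') || ''',''' || COALESCE(VALUE,'') || ''');' from Table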
For generating the DDL statements, I used the command below (-d names the database, -i/-w supply the user ID and password, -a includes objects for all creators, -e extracts the DDL, -x generates authorization statements such as GRANTs, and -o names the output file):
db2look -i DBUser -w DBpassword -d DBSchema -a -e -x -o FILE_OUT.txt

Related

Is there any possible way to convert PostgreSQL database to MS SQL Server?

I am using a Postgres DB and need to migrate it to a Microsoft SQL Server database. Is there any possible way?
Use this command:
pg_dump -U username --column-inserts your_db_name > db_backup.sql
The --column-inserts option generates an INSERT statement with explicit column names for each row of data, which should be largely compatible with SQL Server, though you may have to adjust the syntax a little. (Note the caveat in the pg_dump documentation that with per-row INSERTs, an error reloading a row loses only that row rather than failing the whole table, which can silently drop data: https://www.postgresql.org/docs/current/app-pgdump.html)
Then simply open the .sql file and run it inside SQL Server.
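If the file is too large to open comfortably in SSMS, a hedged alternative is to run it from the command line with sqlcmd (server and database names below are placeholders):
sqlcmd -S your_server -d your_target_db -i db_backup.sql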

Copy one table's data to another database's table on the same Azure server

I have an Azure SQL Database server with several databases. By mistake, one database's table data was deleted. Now I want to restore the data from a table in another database on the same server.
I know Azure does not allow this directly, but is there any possibility or a better solution?
ServeDB
-> db1 (users table data deleted)
-> db2 (I want to recover the users table data from this database, which exists on the same server)
You can use the bcp utility to export the table to a local drive on your computer.
bcp NLayerApp.dbo.Customer out "C:\MyFolderPath\Customer.txt" -T -c -S WIN7VS2010RC1\SQLEXPRESS
Then you can import it into the other database using the same utility.
bcp TestDB.dbo.Customer in "C:\MyFolderPath\Customer.txt" -c -U mysqlazureuser@mysqlazureservername -S tcp:mysqlazureservername.database.windows.net -P mypassword
You can learn more about bcp in the bcp utility documentation.
You can use the regular Import/Export wizard in SSMS, the same as you would with an on-premises database.
https://azure.microsoft.com/en-us/blog/exporting-data-from-sql-azure-importexport-wizard/

Dropping a postgres database in cmdline, still seeing the database when \list

I'm trying to drop my database and create a new one through the command line.
I log in using psql postgres and then do a \list, and see the two databases I created, which I now want to delete. So I tried DROP DATABASE databasename;
I don't see any error while executing that statement, but when I \list again to check whether those DBs are deleted, I see that they still exist. Can someone please tell me why this happens, and how to reliably delete those DBs?
There are a couple of caveats to DROP DATABASE:
It can only be executed by the database owner.
It cannot be executed while you or anyone else are connected to the target database.
I generally use the dropdb command-line tool to do this, since it's a wrapper around DROP DATABASE which doesn't require you to explicitly connect first. It still has the caveat that there can't be any users currently connected to the database, but it's generally quicker/easier to use.
I would recommend you try issuing a command like this:
dropdb -h <host> -U <user> -p <port> <name of db to drop>
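If the drop is still blocked because other sessions are connected, one option (on Postgres 9.2 and later, assuming your role has permission to terminate those backends; 'mydb' below is a placeholder) is to kick them off first:
SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'mydb' AND pid <> pg_backend_pid();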
Similarly, you can use the createdb command-line tool to create a database.
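For completeness, a matching createdb invocation with the same placeholder flags:
createdb -h <host> -U <user> -p <port> <name of db to create>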
More info on DROP DATABASE: http://www.postgresql.org/docs/current/static/sql-dropdatabase.html
Edit:
Also, it is worth looking in the Postgres log (likely under /var/log/postgresql by default) to see if anything was logged there that wasn't surfaced in the psql output.
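A quick way to peek at the most recent entries (the exact file name varies by version and distribution; the path below is an assumption based on a typical Debian/Ubuntu layout):
tail -n 100 /var/log/postgresql/postgresql-15-main.log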

SQL Server import of file created through bcp

I'm currently trying to import a file created with bcp in SQL Server on one computer into my local SQL Server. This is a data file I received from a third party, so I have no idea of the data structure etc. of its contents. My SQL Server skills are quite new, so please bear with me :) I've reviewed the bcp documentation and attempted the following:
bcp TestDatabase in File.bcp -T
Results: Invalid Object name 'TestDatabase'
I created the test database TestDatabase and tried the command again, but got the same response. I then added -Slocal and got a login timeout; seems like progress!
I removed the -T flag and tried varying combinations of usernames and passwords without any luck.
So I guess to start: is there an underlying issue I'm missing, some syntax I'm not following, or should I just play around with the credentials for my local SQL Server?
You need to specify the server, username, and table. Try this:
bcp TestDatabase..SomeTableName in File.bcp -S Server -U Username -P Password
If you look at the bcp utility docs:
-T means to use integrated security (your logged-in account).
-S is the server name parameter.
These two parameters are not interchangeable.
You can only use -U and -P in place of -T if you have SQL authentication turned on (and you shouldn't if you can avoid it).
Finally, Chris Shain is correct: you need to specify the schema and table or view name, not just a database name, as well as the server (-S).
Also from the documentation (and the point E.J. Brennan was making)
To import data into a table, you must either use a format file created
for that table or understand the structure of the table and the types
of data that are valid for its columns.
So you can't expect to just take a file and have bcp magically make a table for you. You need to use SSIS to help you or some other tool to do that.
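If you can get (or recreate) the table definition, bcp can also generate a format file from an existing table, which you can then use with the in direction; a minimal sketch, with database, table, and server names as placeholders:
bcp TestDatabase.dbo.SomeTableName format nul -c -f SomeTableName.fmt -T -S Server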

Migrating from Postgres to SQL Server 2008

I need to migrate a database from Postgres 7 to SQL Server 2008. I am familiar with the SSIS Import and Export wizard, but I am stumped about how to define the data source and data provider.
What is the best way to migrate Postgres to SQL Server, and how do I define data sources/drivers for postgres?
I was having problems using the Import Wizard in SQL Server 2008 R2 to import tables from PostgreSQL. I had the PostgreSQL ODBC driver installed, so for the Data Source in the Import Wizard I chose ".Net Framework Data Provider for Odbc" and supplied the DSN name for my PostgreSQL database. The wizard found the tables okay, but when I went to perform the import I got the error
Column information for the source and destination data could not be retrieved.
“Billing” -> [dbo].[Billing]:
– Cannot find column -1.
I found the solution in a Microsoft blog post. Apparently the problem is that various ODBC drivers use different attribute names when reporting column metadata. To get the import to work I had to edit the "ProviderDescriptors.xml" file, which was located at
C:\Program Files\Microsoft SQL Server\100\DTS\ProviderDescriptors\ProviderDescriptors.xml
In the ...
<dtm:ProviderDescriptor SourceType="System.Data.Odbc.OdbcConnection">
... element I had to change the attributes from ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "COLUMN_SIZE"
NumericPrecisionColumnName = "COLUMN_SIZE"
NumericScaleColumnName = "DECIMAL_DIGITS"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
... to ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "LENGTH"
NumericPrecisionColumnName = "PRECISION"
NumericScaleColumnName = "SCALE"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
That is, I had to tweak the MaximumLengthColumnName, NumericPrecisionColumnName, and NumericScaleColumnName attribute values to "LENGTH", "PRECISION", and "SCALE", respectively.
Once that change was made the import from PostgreSQL to SQL Server ran successfully.
I wish you the best of luck in trying to import from PostgreSQL into SQL Server using SQL Server Import and Export Wizard. However, I have read numerous message board threads with people having trouble getting it to work. For example:
Import Data from Postgresql to SQL Server 08 Error
Here is the most helpful thread that I have found on the topic:
Import data from postgreSQL into SQL server 2005
To help someone who might be trying to achieve a similar goal to mine: instead of selecting the "PostgreSQL OLE DB Provider" in the data source drop-down menu of the SQL Server Import and Export Wizard, select ".Net Framework Data Provider for Odbc".
Then you have to make a DSN and provide a ConnectionString. The following ConnectionString worked for me:
Driver={PostgreSQL};Server=localhost;Port=5432;Database=TestMasterMap;Uid=postgres;Pwd=;
To make a DSN you have to go into Administrative Tools > Data Sources (ODBC) and create a user DSN. Once this is done you can supply the DSN name in the DSN text box of the SQL Server Import and Export Wizard.
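One common gotcha worth noting: on 64-bit Windows there are separate 32-bit and 64-bit ODBC administrators, and the DSN must match the bitness of both the driver and the wizard. The standard locations are:
C:\Windows\System32\odbcad32.exe (64-bit)
C:\Windows\SysWOW64\odbcad32.exe (32-bit)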
One commenter claimed that it worked, but that he got "Out of memory while reading tuples" errors on big tables. So for tables with more than 3 million rows, he had to break the import up into 3 million row chunks.
Also, there's a link to the native .NET provider for PostgreSQL in that thread.
Personally, if this is something that I only had to do once, and if I understood the schema and the data fairly well, I would try:
export the data from PostgreSQL as flat files
create the schema in SQL Server (without PKs or constraints)
use the SSIS Import/Export Wizard to import the flat files
then create PKs and necessary constraints
It might take you less time to do the above than messing with SSIS Import/Export Wizard and PostgreSQL for days (but it would be nice if those tools worked!)
As I finished commenting on the answer above, I thought of trying SQL Workbench/J; it has a data pump feature that worked pretty well for me. I managed to export data from my PostgreSQL database to a SQL Server instance.
For those who'd like to run this in batch mode (via shell), here's how to do it: Google Groups Thread. The WbCopy command mentioned in the discussion isn't really documented anywhere I could find, but you can generate one through the data pump interface and then change whatever you need.
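For orientation, a WbCopy call generated that way has roughly this shape (the connection profile and table names below are hypothetical; check what the data pump interface generates for your setup):
WbCopy -sourceProfile="PostgresProfile" -targetProfile="SqlServerProfile" -sourceTable=public.billing -targetTable=dbo.Billing;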
To give a slightly more practical example of how you can achieve what's described in the accepted answer: you can export from PostgreSQL to flat files and then use the bcp utility to import them into SQL Server.
For example, in a .bat file, for a single table (you need to have the table already created in the destination SQL database):
@echo off
set DbName=YOUR_POSTGRES_DB_NAME
set csvpath=C:\PATH_TO_CSV\CSV_NAME.csv
set username=YOUR_POSTGRES_DB_USERNAME
:: Export to CSV; note we're using a ~ delimiter to avoid issues with commas in fields
psql -U %username% -d %DbName% -c "COPY (select * from SOURCE_TABLE_NAME) TO STDOUT (FORMAT CSV, HEADER TRUE, DELIMITER '~', ENCODING 'UTF8');" > %csvpath%
:: Import CSV into SQL Server
set logpath=C:\bcplog.txt
set errorlogpath=C:\bcperrors.txt
set sqlserver=YOUR_SQL_SERVER
set sqldb=YOUR_DB_NAME
:: code page 65001 = UTF-8; -F2 skips the CSV header row written above
bcp DESTINATION_TABLE_NAME IN %csvpath% -t~ -F2 -c -C65001 -S %sqlserver% -d %sqldb% -T -o %logpath% -e %errorlogpath%
