I'm currently reviewing how to import a file created with bcp on one computer's SQL Server into my local SQL Server. This is a data file I received from a third party, so I have no idea of the structure of the data. My SQL Server skills are quite new, so please bear with me :) I've reviewed the bcp documentation and attempted the following:
bcp TestDatabase in File.bcp -T
Results: Invalid Object name 'TestDatabase'
I created the test database TestDatabase and tried the command again, but got the same response. I then added -Slocal and got a login timeout; seems like progress!
I removed the -T flag and tried varying combinations of usernames and passwords without any luck.
So I guess to start: is there an underlying issue I'm missing, syntax I'm not following, etc., or should I just play around with the credentials for my local SQL Server?
You need to specify the server, username, and table. Try this:
bcp TestDatabase..SomeTableName in File.bcp -S Server -U Username -P Password
If you look at the bcp Utility docs
-T means use integrated security (your logged-in Windows account)
-S is the server name parameter.
These two parameters are not interchangeable.
You can only use -U and -P in place of -T if SQL Server authentication is enabled (and you shouldn't enable it if you can avoid it)
Finally, Chris Shain is correct: you need to specify the schema and table (or view name), not just a database name, as well as the server (-S)
Also, from the documentation (and the point E.J. Brennan was making):
To import data into a table, you must either use a format file created
for that table or understand the structure of the table and the types
of data that are valid for its columns.
So you can't expect to just hand bcp a file and have it magically create a table for you. You need SSIS or some other tool to do that.
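For completeness, once a matching table does exist, bcp itself can generate a format file describing it via its "format nul" mode. A sketch, with hypothetical server, database, and table names, that only builds and prints the commands rather than running them:

```shell
# Sketch only: generate a format file for an existing table, then use it
# for the import. Server, database, and table names are hypothetical.
DB=TestDatabase
TABLE=dbo.SomeTableName
SERVER=localhost

# "format nul" asks bcp to emit a format file instead of copying data:
FMT_CMD="bcp ${DB}.${TABLE} format nul -c -f SomeTableName.fmt -T -S ${SERVER}"

# The import would then reuse that format file:
IN_CMD="bcp ${DB}.${TABLE} in File.bcp -f SomeTableName.fmt -T -S ${SERVER}"

echo "$FMT_CMD"
echo "$IN_CMD"
```

Run the printed commands by hand once you've confirmed the names match your environment.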
Related
My customer runs a very old (it seems to me) Sybase 12.5.2 database. I want/need to export all tables from a database to multiple flat (text) files, one per table. I have access to the isql command-line prompt with the admin user. I haven't ever worked with a Sybase database before.
Sybase Adaptive Server Enterprise (ASE) allows multiple databases to be hosted. You don't specify whether only one of the databases on the server needs to be exported, or all of them.
For each database, the following query will list the names of the user tables:
select name from sysobjects where type = 'U'
Sybase ASE also comes with a tool called "bcp" which stands for "Bulk Copy". It is an easy way of creating a flat file of a table's contents.
bcp database.schema.table out file_name -c -U username -S server_name
It has more options that may be of interest, especially around field and row terminators. Documentation for the most relevant version (12.5.1) can be found here:
http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.dc30191_1251/html/utility/BABGCCIC.htm
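Combining the sysobjects query with bcp, a small shell sketch can generate one export command per table. All names here (mydb, sa, myserver) are hypothetical, and the isql step is shown only as a comment since it needs a live server:

```shell
#!/bin/sh
# Sketch: build one "bcp ... out" line per user table.
# tables.txt would normally come from isql, e.g.:
#   isql -U sa -S myserver -b -o tables.txt
#   > select name from mydb..sysobjects where type = 'U'
#   > go
# Here we fake it with two sample table names.
printf 'customers\norders\n' > tables.txt

DB=mydb          # hypothetical database
USER=sa          # hypothetical login
SERVER=myserver  # hypothetical server

make_bcp_cmd() {
    # One character-mode export command per table.
    echo "bcp ${DB}..$1 out $1.txt -c -U ${USER} -S ${SERVER}"
}

while read -r t; do
    make_bcp_cmd "$t"
done < tables.txt > export_all.sh
```

You can then review export_all.sh and run it in one hit, which is the "batch file of bcp commands" approach the other answer suggests.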
I have been using bcp commands to export data from Sybase environments. bcp is a command-line utility that you can use to export data from multiple types of databases.
Below is a very basic example you can try:
bcp TABLE_NAME out OUTPUT_FILE_PATH\FILENAME.dat -S SERVER_NAME -U USERNAME -P PASSWORD -f format_file -r row_terminator -e ERROR_FILE_PATH
You can create a batch file with such commands and do multiple exports on one hit.
If you have access to any ETL tool, you can export the data using that as well.
I'm trying to drop my database and create a new one through the command line.
I log in using psql postgres and then do a \list, and see the two databases I created which I now want to delete. So I tried DROP DATABASE databasename;
I don't see any error while executing that statement, but when I \list again to check whether the DBs are deleted, I see that they still exist. Can someone please tell me why this happens, and how to reliably delete those DBs?
There are a couple of caveats to DROP DATABASE:
It can only be executed by the database owner.
It cannot be executed while you or anyone else is connected to the target database.
I generally use the dropdb command-line tool to do this, since it's a wrapper around DROP DATABASE which doesn't require you to explicitly connect first. It still has the caveat that there can't be any users currently connected to the database, but it's generally quicker/easier to use.
I would recommend you try issuing a command like this:
dropdb -h <host> -U <user> -p <port> <name of db to drop>
Similarly, you can use the createdb command-line tool to create a database.
More info on DROP DATABASE: http://www.postgresql.org/docs/current/static/sql-dropdatabase.html
Edit:
Also, it is worth looking in the Postgres log (likely in /var/log/postgresql by default) to see if perhaps there is anything in there that wasn't surfaced in the results.
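If the drop keeps failing because of the connected-users caveat, you can terminate the other sessions first with pg_terminate_backend. A sketch assuming a hypothetical database mydb and PostgreSQL 9.2+ (where the pg_stat_activity column is pid; older versions call it procpid); the actual psql/dropdb calls are left commented out:

```shell
# Sketch: kick other sessions off the target database, then drop it.
# This only builds and prints the SQL; uncomment the calls to run it.
DB=mydb
KILL_SQL="SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = '$DB' AND pid <> pg_backend_pid();"
# psql -U postgres -d postgres -c "$KILL_SQL"
# dropdb -U postgres "$DB"
echo "$KILL_SQL"
```

Note that you must run the terminate query while connected to a different database (e.g. postgres), for the same reason DROP DATABASE itself requires it.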
I have a question about saving XML from SQL Server. I am using FOR XML PATH to generate XML from a table in my database. SQL Server shows me the generated XML on screen, but does not save it to a file (well, I guess not, lol). How do I save it to a given directory?
I have the following query to generate XML:
select TableName,
operation,
UserName,
DataAcesso,
CamposTabela,
ValoresCampos,
CamposPKs,
ValoresCamposPKs
FROM TabelaLog
FOR XML PATH ('Log')
Does anyone know how to save it to a file?
Thank you!
You can try the bcp utility:
bcp "select TableName, operation, UserName, DataAcesso, CamposTabela, ValoresCampos, CamposPKs, ValoresCamposPKs FROM TabelaLog FOR XML PATH ('Log')" queryout "D:\MyTable.csv" -c -t , -S SERVERNAME -T
However, the headers must be added explicitly if you want them. You can use a UNION ALL for that purpose.
Check out this post. It has some good instructions on exporting info from a query/table to a file:
How do I send a database query to a text file?
Alternatively, you could use SQL Server Integration Services (SSIS) and set up a scheduled job to export the information periodically. That would be my option. Another alternative would be to pull it into your application and then save/export it from server-side code rather than letting the database do it for you.
I need to migrate a database from Postgres 7 to SQL Server 2008. I am familiar with the SSIS Import and Export wizard but I am stumped about how to define the data source or define the data provider.
What is the best way to migrate Postgres to SQL Server, and how do I define data sources/drivers for postgres?
I was having problems using the Import Wizard in SQL Server 2008 R2 to import tables from PostgreSQL. I had the PostgreSQL ODBC driver installed, so for the Data Source in the Import Wizard I chose ".Net Framework Data Provider for Odbc" and supplied the DSN name for my PostgreSQL database. The wizard found the tables okay, but when I went to perform the import I got the error
Column information for the source and destination data could not be retrieved.
“Billing” -> [dbo].[Billing]:
– Cannot find column -1.
I found the solution in the Microsoft blog post here. Apparently the problem is that various ODBC drivers use different attribute names when reporting column metadata. To get the import to work I had to edit the "ProviderDescriptors.xml" file, which was located at
C:\Program Files\Microsoft SQL Server\100\DTS\ProviderDescriptors\ProviderDescriptors.xml
In the ...
<dtm:ProviderDescriptor SourceType="System.Data.Odbc.OdbcConnection">
... element I had to change the attributes from ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "COLUMN_SIZE"
NumericPrecisionColumnName = "COLUMN_SIZE"
NumericScaleColumnName = "DECIMAL_DIGITS"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
... to ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "LENGTH"
NumericPrecisionColumnName = "PRECISION"
NumericScaleColumnName = "SCALE"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
That is, I had to tweak the MaximumLengthColumnName, NumericPrecisionColumnName, and NumericScaleColumnName attribute values to "LENGTH", "PRECISION", and "SCALE", respectively.
Once that change was made the import from PostgreSQL to SQL Server ran successfully.
I wish you the best of luck in trying to import from PostgreSQL into SQL Server using SQL Server Import and Export Wizard. However, I have read numerous message board threads with people having trouble getting it to work. For example:
Import Data from Postgresql to SQL Server 08 Error
Here is the most helpful thread that I have found on the topic:
Import data from postgreSQL into SQL server 2005
To help someone who might be trying to achieve a similar goal to mine: instead of selecting the “PostgreSQL OLE DB Provider” in the data source drop-down menu of the SQL Server Import and Export Wizard, select “.Net Framework Data Provider for Odbc”.
Then you have to make a DSN and provide a ConnectionString. The following ConnectionString worked for me:
Driver={PostgreSQL};Server=localhost;Port=5432;Database=TestMasterMap;Uid=postgres;Pwd=;
To make a DSN, you have to go into Administrative Tools > Data Sources (ODBC) and create a user DSN. Once this is done, you can supply the DSN name in the DSN text box of the SQL Server Import and Export Wizard.
One commenter claimed that it worked, but that he got "Out of memory while reading tuples" errors on big tables. So for tables with more than 3 million rows, he had to break the import up into 3 million row chunks.
Also, there's a link to the native .NET provider for PostgreSQL in that thread.
Personally, if this is something that I only had to do once, and if I understood the schema and the data fairly well, I would try:
export the data from PostgreSQL as flat files
create the schema in SQL Server (without PKs or constraints)
use the SSIS Import/Export Wizard to import the flat files
then create PKs and necessary constraints
It might take you less time to do the above than messing with SSIS Import/Export Wizard and PostgreSQL for days (but it would be nice if those tools worked!)
As I finished commenting on the answer above, I thought of trying SQL Workbench/J; it has a data pump feature that worked pretty well for me. I managed to export data from my PostgreSQL database to a SQL Server instance.
For those who'd like to run this in batch mode (via a shell), here's how to do it: Google Groups Thread. The WbCopy command mentioned in the discussion isn't really documented anywhere I could find, but you can generate one through the data pump interface and then change whatever you need.
To give a slightly more practical example of how you can achieve what's described in the accepted answer: you can export from PostgreSQL to flat files, then use the bcp utility to import them into SQL Server.
e.g. in a .bat file, for a single table (the table must already exist in the destination SQL database):
@echo off
set DbName=YOUR_POSTGRES_DB_NAME
set csvpath=C:\PATH_TO_CSV\CSV_NAME.csv
set username=YOUR_POSTGRES_DB_USERNAME
:: Export to CSV, note we're using a ~ delimiter to avoid issues with commas in fields
psql -U %username% -d %DbName% -c "COPY (select * from SOURCE_TABLE_NAME) TO STDOUT (FORMAT CSV, HEADER TRUE, DELIMITER '~', ENCODING 'UTF8');" > %csvpath%
:: Import CSV to SQL Server
set logpath=C:\bcplog.txt
set errorlogpath=C:\bcperrors.txt
set sqlserver=YOUR_SQL_SERVER
set sqldb=YOUR_DB_NAME
:: code page 65001 = UTF-8
bcp DESTINATION_TABLE_NAME IN %csvpath% -t~ -F1 -c -C65001 -S %sqlserver% -d %sqldb% -T -o %logpath% -e %errorlogpath%
I want to run a query over a table in SQL Server to save out the data as files.
The table has one column with a filename in it and one column that is an image column with binary file contents data in it.
I'm sure I saw some syntax that would let me do this, but I cannot for the life of me find it anymore.
Is this possible?
You can do this with bcp.exe from the command line, which you could call through xp_cmdshell:
bcp "select MyBlobField from myTable where a=b" queryout "c:\MyImage.jpg" -T -n
You can probably do it through OLE Automation natively in SQL Server, but it's not something I have tried.
An easy alternative (if you have 2005/8) is a CLR procedure in the DB to do the job. There are lots of code examples on the web showing how to do that.
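To extend that idea to the original question (one file per row, named from the filename column), you can generate a bcp queryout command per row and run the resulting batch file. A sketch with hypothetical names (myDb.dbo.myTable, FileName, MyBlobField); filenames.txt stands in for the output of SELECT FileName FROM myTable:

```shell
# Sketch: one "bcp ... queryout" command per row, written to a batch file.
# Hypothetical names throughout (myDb.dbo.myTable, FileName, MyBlobField).
# filenames.txt stands in for the output of: SELECT FileName FROM myTable
printf 'logo.jpg\nphoto.png\n' > filenames.txt

SERVER=localhost     # hypothetical server
OUTDIR='c:\export'   # single quotes keep the backslash literal

while read -r f; do
  cmd="bcp \"select MyBlobField from myDb.dbo.myTable where FileName = '$f'\" queryout \"$OUTDIR\\$f\" -S $SERVER -T -n"
  printf '%s\n' "$cmd"
done < filenames.txt > export_blobs.bat
```

Each generated line exports a single row's blob with -n (native mode), so the bytes land on disk unmodified.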