SQL Server OPENROWSET error reading bcp file

I'm trying to transfer table data from one SQL Server to another and want to use the bcp utility for it. This is purely to transfer data between two identical schemas, but I'm not able to use something like SSDT; I need something scriptable and portable so it can be run by others with just SQL Server and SSMS access.
I am generating a native output file and format file like so:
$> bcp database.TableName OUT c:\data\bcp\TableName.bcp -T -N -S SQLINSTANCE
$> bcp database.TableName format nul -f c:\data\bcp\TableName.fmt -T -N
Then in Management Studio I am trying to in turn read the files like this:
SELECT *
FROM OPENROWSET (BULK 'c:\data\bcp\TableName.bcp',
    FORMATFILE = 'c:\data\bcp\TableName.fmt') AS t1;
But I am getting this error:
The bulk load failed. The column is too long in the data file for row 6, column 19. Verify that the field terminator and row terminator are specified correctly.
I have followed this process successfully before, and it works for other tables, but I'm running into an issue with this table. The column mentioned is of datatype nvarchar(max). I can inspect what I think is the "problem" record in the source data, and it's just a very long string; I don't see anything else special about it.
Is there something else I should be doing when generating the format file or what else am I missing?

If you are only exporting for the purpose of importing to another SQL Server, native format is the way to go, and in this case you don't need to use format files. Just do a native export and import.
Note you are specifying a capital -N, which is Unicode native format, not plain native. Plain native is lowercase -n.
You should export using something like:
bcp database.Schema.TableName OUT c:\data\bcp\TableName.bcp -T -n -S SQLINSTANCE
Then on the importing side I suggest using BULK INSERT, which doesn't need a format file for native data at all:
BULK INSERT TargetDB.dbo.TargetTable
FROM 'c:\data\bcp\TableName.bcp'
WITH (DATAFILETYPE = 'native');
If you can't use BULK INSERT and absolutely must go with OPENROWSET, you need a format file. bcp can generate that for you, but again, use lowercase -n:
bcp database.Schema.TableName format nul -f c:\data\bcp\TableName.fmt -T -n -S SQLINSTANCE
Now your OPENROWSET should work.
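For example, a minimal sketch using the paths from the question and a hypothetical target table name:
-- Load the native export straight into the identical target table
-- (TargetDB.dbo.TableName is a hypothetical name).
INSERT INTO TargetDB.dbo.TableName WITH (TABLOCK)
SELECT t1.*
FROM OPENROWSET (BULK 'c:\data\bcp\TableName.bcp',
    FORMATFILE = 'c:\data\bcp\TableName.fmt') AS t1;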

Related

SQL Server BLOB image column - extracting with BCP queryout - corrupted files AND bug

I need to export pdf and image files from a column in SQL Server 2012. Jonas's 3-step process in "How to export an image column to files in sql server" is the clearest set of instructions I've found. I did everything exactly as he stated. Two notes:
Where he says "your_db" in the BCP statement, you need database_name.schema_name.table_name.
No line breaks are allowed.
I was able to export files one at a time after this, but they were corrupt. The files were slightly smaller (by 1-15 KB) than the actual working PDFs that I can access through the UI.
This turned out to be a format issue: if you're not exporting XML files, you have to create a special format file. There's a great solution by Conor in "SQL Server BCP export corrupted file" that explains how to create a format file and reference it in your BCP query. First I created the format file from the command line:
C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn>bcp CentricityPM.dbo.PatientProfileAttachment format nul -T -n -f C:\bcpdir\bcpfile.fmt
Then I edited and re-saved the format file as per Conor's post. I ran my BCP query:
EXEC master..xp_cmdshell 'BCP "SELECT data FROM CentricityPM.dbo.PatientProfileAttachment WHERE PatientProfileid = ''11568'' AND type = ''pdf'' " queryout "C:\exportdir\testfile.pdf" -T -N'
The error:
Starting copy...
SQLState = S1000, NativeError = 0
Error = [Microsoft][SQL Server Native Client 11.0]Host-file columns may be skipped only when copying into the Server
SQLState = S1000, NativeError = 0
Error = [Microsoft][SQL Server Native Client 11.0]Unable to resolve column level collations
NULL
BCP copy out failed
Jon of All Trades noted in "BCP Error: columns may be skipped only when copying into the Server" that this is a Microsoft bug reported on 8/6/2010. He suggested creating a table with the right number of columns. I created a table with one column and one row of my data (?!), which Conor had actually referenced in his post, but I didn't really get it until this point.
Please note, this is NOT useful to me, because I need not only the data but a way to identify it (I have the name I want for each file stored in another column). But I gave it a try anyway. I re-ran the bcp format command:
C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn>bcp CentricityPM.dbo.PatientProfileAttachment format nul -T -n -f C:\bcpdir\bcpfile.fmt
Here's the format file it gave me:
11.0
1
1 SQLIMAGE 4 0 "" 1 data ""
And here are the edits I made - I changed the data type as TT suggested below, and changed the 4 to a 0:
11.0
1
1 SQLBINARY 0 0 "" 1 data ""
I ran my query:
EXEC master..xp_cmdshell 'BCP "SELECT data FROM CentricityPM.dbo.TempImageFour" queryout "C:\exportdir\testfile.pdf" -T -fC "C:\bcpdir\bcpfile.fmt" '
It ran with no errors... but the file is still corrupted.
Can anyone see anything I've done wrong? Does anyone know a workaround for the one-column-only bug? Or if you know of a working tool that will do this for me, that'd be great too. I tried https://sqlblobextractor.codeplex.com/ early on, with no success.
You are using the parameter -f "C:\bcpdir\bcpfile.fmt", but from my experience that should be -fC "C:\bcpdir\bcpfile.fmt". To be honest, I don't remember anymore why... I once made something similar to export files (.zip) from a database, and my command had the -fC parameter for the export file. I wish I could give you a proper explanation. Anyway, HTH.
Try the following command:
EXEC master..xp_cmdshell 'BCP "SELECT data FROM CentricityPM.dbo.TempImageFour" QUERYOUT "C:\exportdir\testfile.pdf" -T -fC "C:\bcpdir\bcpfile.fmt"'
An alternative is to specify the -C RAW option. This specifies that no conversion is done from one code page to another.
EXEC master..xp_cmdshell 'BCP "SELECT data FROM CentricityPM.dbo.TempImageFour" QUERYOUT "C:\exportdir\testfile.pdf" -T -f "C:\bcpdir\bcpfile.fmt" -C RAW'
Also, make sure that your format file has SQLBINARY as data type for your column.

SQL - Automatic results to CSV or Text File

I was wondering if anyone can help.
I have a number of queries in SQL (all in separate *.sql files). Is there a way to run these queries automatically, or mass-run them and save the results to either a CSV or TXT file?
Also, I have some variables within these queries which will need to be amended on a weekly basis before the queries are run.
Thanks.
KJ
Could you please provide some additional help in relation to the variables? Previously I would declare and set variables as:
DECLARE @TW_FROM DATETIME
DECLARE @TW_TO DATETIME
SET @TW_FROM = '2015-11-16 00:00:00';
SET @TW_TO = '2015-11-22 23:00:00';
How do I do this using sqlcmd?
Yes, you can use sqlcmd to do this.
First of all - variables. You can refer to your variables in the .sql files using $(variablename) wherever you want to substitute the variable. For example,
use $(dbname);
select $(columnname) from table1 where column1 = '$(var1)'
You then call sqlcmd with the following command (note the -v argument, which supplies the variables):
sqlcmd -S servername -d database -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred"
To output this to a file, tack > filename.txt onto the end:
sqlcmd -S servername -d database -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" > filename.txt
If you want to output to a CSV, you can also specify the delimiter using the argument -s (note the difference from the capital -S for the server). So now we have:
sqlcmd -S servername -d database -s "," -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" > filename.csv
If you want to output several commands to the same CSV or TXT file, use >> instead of >, as it appends to the bottom of the file rather than replacing it.
sqlcmd -S servername -d database -s "," -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" >> filename.csv
To run this for several scripts, you can put the statements in a batch file, and then change the variables every week.
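For instance, here is a minimal batch sketch; the script names and variable names are hypothetical, and each .sql file would reference the variables as $(TW_FROM) and $(TW_TO):
REM weekly_run.bat - edit the dates here each week (file names are hypothetical)
set TW_FROM=2015-11-16 00:00:00
set TW_TO=2015-11-22 23:00:00
sqlcmd -S servername -d database -s "," -i "query1.sql" -v TW_FROM="%TW_FROM%" TW_TO="%TW_TO%" > query1.csv
sqlcmd -S servername -d database -s "," -i "query2.sql" -v TW_FROM="%TW_FROM%" TW_TO="%TW_TO%" > query2.csv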
You could write a batch file that uses sqlcmd:
MSDN sqlcmd
That will allow you to call script files in a loop and output the results to a file.
Convert your current scripts to stored procedures.
You can then pass your variables to that and run the query.
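For example, a sketch with hypothetical procedure, table, and column names, wrapping a weekly query so the date window is passed in rather than edited by hand:
-- Hypothetical names; adapt to your own queries.
CREATE PROCEDURE dbo.usp_WeeklyReport
    @TW_FROM datetime,
    @TW_TO   datetime
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, CustomerId, Total
    FROM dbo.Orders
    WHERE OrderDate >= @TW_FROM
      AND OrderDate < @TW_TO;
END
GO
-- Run it for the week in question:
EXEC dbo.usp_WeeklyReport @TW_FROM = '2015-11-16 00:00:00', @TW_TO = '2015-11-22 23:00:00';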
If you have SQL Server Agent available (SQL Server Standard Edition or better), you can use it to automate the running of the stored procedures.
Otherwise the same can be achieved with Task Scheduler in Windows.
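As a sketch, assuming a hypothetical batch file that runs the procedures via sqlcmd, a weekly job can be registered from the command line:
REM Register a weekly task (Mondays at 06:00); the .bat path is hypothetical
schtasks /create /tn "WeeklySqlReports" /tr "C:\scripts\weekly_run.bat" /sc weekly /d MON /st 06:00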
As for exporting to CSV, this will be useful.
It depends on where your SQL Server is actually running; it might be quite tricky to write anything to the location you want.
You could read about BCP.
My suggestion is:
Create a UDF (best is an inline UDF!) from each of your queries within your database. Then call them from Excel or any other fitting product. You might want to set up an Excel workbook where all your queries are filled in automatically, one on each sheet, as sketched below.
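A sketch of that idea with hypothetical names; each Excel sheet would then query one function through Excel's external data connection features:
-- Hypothetical names: one inline table-valued function per weekly query.
CREATE FUNCTION dbo.fn_WeeklyOrders (@from datetime, @to datetime)
RETURNS TABLE
AS
RETURN
    SELECT OrderId, CustomerId, Total
    FROM dbo.Orders
    WHERE OrderDate >= @from AND OrderDate < @to;
GO
-- Each sheet's connection then runs something like:
SELECT * FROM dbo.fn_WeeklyOrders('2015-11-16', '2015-11-23');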

SQL Server 2014: Not able to import any data into LocalDB using bcp and format file (zero rows, no errors)

I'm trying to use a non-XML bcp format file to import data into LocalDB on Win7 64 bit. Simplest possible use case.
OS Name: Microsoft Windows 7 Home Premium
OS Version: 6.1.7601 Service Pack 1 Build 7601
LocalDB version: Microsoft SQL Server 2014 (12.0.2000.8)
BCP version: 12.0.2000.8
Basically, the latest version of everything, downloaded from the Microsoft SQL Server 2014 site a few days ago.
I'm able to connect to the LocalDB instance via bcp to create a format file, but the generated format file doesn't work when trying to re-import the simplest possible data using it. No matter what I try, bcp loads zero rows, fails silently and prints no error information to the specified error file.
/* create the table */
use try_db;
create table try(num integer);
/* create the format file based on the table. */
bcp try_db.dbo.TRY format nul -n -T -f TRY.fmt -S (localdb)\default_db
/* above command creates a file TRY.fmt with the following contents */
12.0
1
1 SQLINT 1 4 "" 1 num ""
/* then I create a file data.txt, with just the number 99 in it, followed by a Windows line terminator (\r\n) */
/* then try importing the file into the table */
bcp try_db.dbo.TRY in data.txt -f TRY.fmt -T -S (localdb)\default_db -e errors.txt
Result:
Starting copy...
0 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1
Nothing is written to errors.txt. I am just not able to get bcp to import anything at all using a format file!
I haven't tried it with SQL Server itself (only with LocalDB) but it shouldn't matter for stuff as simple as this.
I tried editing the TRY.fmt file line as follows:
1 SQLINT 1 4 "\r\n" 1 num ""
But that didn't help either.
I am able to get it to successfully import using -c instead of -f:
bcp try_db.dbo.TRY in data.txt -c -T -S (localdb)\default_db -e errors.txt
Any thoughts on (a) why bcp won't import using the format file, and (b) why it prints no errors to the specified error file? There must be something really simple I'm getting wrong here.
Please, no recommendations to use BULK INSERT or SSIS (etc) instead. bcp should just work as documented!
The format file describes the source data, not the destination. When you use -c or DATAFILETYPE = 'char', your input datatypes must be SQLCHAR. Native datatypes are only valid when using -n or DATAFILETYPE = 'native'. A source file in native format is always binary, so bcp needs to know the data type of each field in order to read the correct number of bytes and interpret them correctly. A format file generated from the table with -n therefore only matches a file that was exported with -n; it will not describe a hand-made text file, as shown below.
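For the data.txt in the question (the characters 9, 9, \r, \n), a character-mode format file along these lines should work; the field length of 12 is just an illustrative maximum:
12.0
1
1       SQLCHAR       0       12      "\r\n"     1     num     ""
With that file, bcp try_db.dbo.TRY in data.txt -f TRY.fmt -T -S (localdb)\default_db reads the field as text up to the \r\n terminator and converts it to the int column.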
I think I found the answer. The generated format file doesn't work for text input! It seems that even for numeric or datetime import fields, you have to specify "SQLCHAR" as the datatype in the .fmt file when the data file is text. Any attempt to use the actual .fmt file generated by "bcp format" is hopeless here: if it gives you SQLINT or SQLDATE lines back, you have to replace those with SQLCHAR for the thing to work, even if the db columns are in fact numeric or date/datetime types.
What a crock!

SQL Server BCP export corrupted file?

I have a small problem with BCP functionality in SQL Server 2012.
The thing is:
I'm loading a .jpg image (167 KB in size) using the command below:
INSERT [tabela_testowa] ( Data )
SELECT * FROM OPENROWSET (BULK N'C:\foty\ch6_MagicShop.jpg', SINGLE_BLOB) a;
and then I'm trying to export it back to disk using:
BCP "SELECT data FROM tabela_testowa WHERE ID = 1" queryout "C:\test\file.jpg" -T -n -d test
The file gets saved to disk no problem, and the size is also 167 KB, but... it can't be opened like the original copy.
I don't know whether some parameter is wrong in the BCP export, or maybe it gets corrupted at the import stage?
Anyone had similar problems?
Thank God. Thanks to @user_0's answer, @user3494351's cryptic answer and comment, and this ancient forum post, I finally figured this out after several hours of banging my head against the wall.
The issue is that BCP likes to add an extra 8 bytes (a length prefix) to the file by default. This corrupts the file and makes it unable to be opened if you just use the native -n flag.
However, BCP allows you to specify a format file as output that lets you tell it not to add the extra 8 bytes. So I created a table in SQL Server (to be used in a cursor) that has only ONE ROW and ONE COLUMN containing my binary data. The table must exist when you run the first command.
At the command line, first you need to do this:
bcp MyDatabase.MySchema.MyTempTable format nul -T -n -f formatfile.fmt
This creates formatfile.fmt in the directory you are in. I did it on the E:\ drive. Here's what it looks like:
10.0
1
1 SQLBINARY 8 0 "" 1 MyColumn ""
That 8 right there is the prefix length: the number of bytes bcp prepends to your data. It is the bastard that is corrupting your files. Change that sucker to a 0:
10.0
1
1 SQLBINARY 0 0 "" 1 MyColumn ""
Now just run your BCP script, drop the -n flag and include the -f flag:
bcp "SELECT MyColumn FROM MyDatabase.MySchema.MyTempTable" queryout "E:\MyOutputpath" -T -f E:\formatfile.fmt
BCP is adding information to its file. It's only a few bytes, but you are not exporting just a jpg file.
You say 167 KB, but check the real byte count, not the rounded size. There will be a difference.
You cannot export the image via plain BCP.
OK, so I solved the issue.
A format file has to be supplied using -f and the path to the file. It can be created by running bcp without any format switch, which makes it prompt for each field's details and then offer to save the answers as a format file. With that format file supplied, it's no longer necessary to answer those questions, and the exported file itself has no additional data and can be opened without problems.

BCP to bulk insert into a single row/field

I'm trying to get BCP to insert the contents of a text file into a single field.
Example file content
Field1,field2,Field3
1,test,,
2,,test
3,test,test
The following command imports each line above as a new row into my temp table.
bcp mydb..tempTable in c:\testFile.txt -T -c
I think the solution is to use the -r switch to specify the row terminator as the end of the file but I'm unsure how to do this.
EDIT
I found the solution. The text file I am importing is itself first created using bcp; in my example, all of the file contents come from a single nvarchar(max) field and row. If I set the row terminator via -r during the export, then this also becomes the end of my file. I can then import using bcp mydb..tempTable in c:\testFile.txt -T -c -r {eof}, as sketched below.
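A sketch of the full round trip, with hypothetical source table and column names:
REM Export: the single nvarchar(max) value becomes the whole file,
REM with {eof} appended as the row terminator.
bcp "SELECT FileContents FROM mydb.dbo.SourceTable WHERE Id = 1" queryout c:\testFile.txt -T -c -r {eof}
REM Import into the temp table using the same row terminator:
bcp mydb..tempTable in c:\testFile.txt -T -c -r {eof}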
The only issue I have now is that the output from the bcp command states "Error = [Microsoft][SQL Server Native Client 10.0]Unexpected EOF encountered in BCP data-file". However, the data still imports as I want, so presumably I can ignore this?
