BCP is not working if first column is empty - sql-server

I have a source file in which the first field is empty in 90% of the rows. I want to load this file into a SQL Server table with the BCP utility. When I run the BCP command, BCP is not able to recognize or distinguish the records.
My source file has data as below:
|100168|27238800000|14750505|1|273
|100168|27238800000|14750505|1|273
|100681|88392930052|37080101|1|252
|101014|6810000088|90421505|12|799
|101595|22023000000|21050510|8|780
I am using
**bcp [DBNAME].[dbo].[TABLE1] in \\filelocation\filename -e \\filelocation\filename_Error.txt -c -t | -S ServerName -T -h TABLOCK -m 1**
I am getting this error message in error.txt:
## Row 1, Column 28: String data, right truncation ## 100168 27238800000 14750505 1 273
100168|27238800000|14750505|1|273
Here BCP is not able to recognize the record boundaries. Because of this, BCP tries to load the next record's data into the last field, which causes the data truncation.
The table schema is:
CREATE TABLE [DBO].[TABLE1](
FLD1 VARCHAR(10)
,FLD2 VARCHAR(10)
,FLD3 VARCHAR(22)
,FLD4 VARCHAR(15)
,FLD5 VARCHAR(10)
,FLD6 VARCHAR(12) )

You need to quote the pipe. The pipe character (|) is used by the command shell for redirecting standard output, so an unquoted -t | never reaches bcp as a field terminator.
The following simplified line works with your sample:
bcp.exe [db].dbo.[table1] in "path\Data.dat" -S".\instance" -T -c -t"|"
I omitted the error limit (-m), the error log (-e) and the table lock hint (-h); those should not affect the import, but if you still have an issue, try quoting the other parameters, such as the server name and the file names, as well.
I used a text file with standard \r\n row terminators, as expected by -c.
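For completeness, here is a sketch of the command from the question with the delimiter quoted and the remaining switches restored (paths and names are kept exactly as in the question):
bcp [DBNAME].[dbo].[TABLE1] in "\\filelocation\filename" -e "\\filelocation\filename_Error.txt" -c -t"|" -S ServerName -T -h "TABLOCK" -m 1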

Related

UTF-8 error when bulk loading data into Snowflake that was exported from SQL Server using BCP

I have a CSV file I'm creating by exporting a table in SQL Server 2016 SP2 using the bulk copy utility (bcp.exe). I'm setting the code page to 65001 (which Microsoft's documentation states is UTF-8). However, when I stage the file in Snowflake and then try to use the COPY command to move it into a table, I get an error that says, "Invalid UTF8 detected in string '0xFF0xFE00x0000x0010x0010x0060x0000x0000x0000x0000x0010x00n0x0040x00M0x00M0x00c0x00A0x00A0x00M0x00'."
If I use the IGNORE_UTF8_ERRORS flag, I get data in my table that is unintelligible. Any suggestions about how to fix the issue would be gratefully received.
Here's my BCP call:
BCP "SELECT Id, Name FROM database_name.owner_name.table_name WHERE Id = '0011602001r4ddgAZA'" queryout C:\temp\test.csv "-t|" -w -T -S. -C 65001
Here's the code in Snowflake:
--Create a file format
create or replace file format SFCI_Account
type = 'CSV'
field_delimiter = '|'
validate_utf8 = True
;
-- Create a Stage object
create or replace stage SFCI_Account_stage
file_format = SFCI_Account;
-- Check my file is there
list @SFCI_Account_stage;
-- Copy the file into the table
copy into Test
from @SFCI_Account_stage
file_format = (format_name = SFCI_Account)
pattern='.*.csv.gz'
on_error = 'skip_file';
Apparently, all I needed to do was change the -w to -c in my BCP call (-w writes UTF-16LE, which is where the 0xFF0xFE byte-order mark in the error comes from, while -c lets the -C 65001 code page take effect) and add the following:
-r "\r\n"
So, my final BCP call looks like this:
BCP "SELECT Id, Name FROM database_name.owner_name.table_name WHERE Id = '0011602001r4ddgAZA'" queryout C:\temp\test.csv "-t|" -c -T -S. -C 65001 -r "\r\n"
Now, that fixed the UTF-8 error, but I still have to figure out how to deal with carriage returns in the data.
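One possible way to deal with the embedded carriage returns, sketched here as an assumption rather than taken from the thread, is to strip them inside the export query itself with T-SQL REPLACE and CHAR (object and column names as in the question):
BCP "SELECT Id, REPLACE(REPLACE(Name, CHAR(13), ' '), CHAR(10), ' ') AS Name FROM database_name.owner_name.table_name WHERE Id = '0011602001r4ddgAZA'" queryout C:\temp\test.csv "-t|" -c -T -S. -C 65001 -r "\r\n"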

bcp with a format file using a DOS command in Windows

I'm trying to import data into a SQL Server table from a file using a format file.
I have two databases: a production database and a local database.
I want to insert some rows of the shipper table from the production database into the local one. The shipper table has neither the same columns nor the same column order in the two databases.
That's why I used a format file to do my bcp.
I generated a file containing the rows I want to insert into my local database with the following command:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -c -T
It works!
I then generated the format file from the schema of my local table with the following command:
bcp LocalDatabase.dbo.shipper nul -T -n -f shipper-n.fmt
It works!
Unfortunately, when I tried to insert the file data into my local table with the following command:
bcp LocalDatabase.dbo.shipper in shipper.txt -T -f shipper-n.fmt
it generated the following error (translated from French):
unexpected end of file encountered in the bcp data file
Does anyone know what the problem is and how I can get around it?
Thanks in advance
Your format file does not match the data. You are exporting as text using -c:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -c -T
But your format file is made for native (binary) data using -n:
bcp LocalDatabase.dbo.shipper nul -T -n -f shipper-n.fmt
Either export both as native (my recommendation) or both as text. To prevent this error, generate the data file and the format file at the same time: simply add -f shipper.fmt to your export.
Text version:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -c -T -f shipper.fmt
or
Native Version:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -n -T -f shipper.fmt
PS: since you can run into scenarios where your field or row delimiters exist in the data, you should pick a character sequence that does not occur in your data as a separator, for instance -t"\t|\t" (tab-pipe-tab) for fields and -r"\t|\n" (tab-pipe-newline) for rows. If you combine the format-file option with the export, the data file and the format file will match, and you keep the freedom to change the separators on a single command line.
Specify the separators after the -n or -c on the command line.
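Putting the answer together, the round trip might look like this sketch (native mode; it assumes the export really does write shipper.fmt alongside the data, as the answer describes):
:: export the rows and write the matching format file in one step
bcp "SELECT shipper_id, Shipper_name FROM ProductionDatabase.dbo.shipper WHERE shipper_id > 5" queryout shipper.txt -n -T -f shipper.fmt
:: import using the very same format file, so data and description cannot diverge
bcp LocalDatabase.dbo.shipper in shipper.txt -T -f shipper.fmt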

BCP: Importing data while keeping identity values causes "Invalid character value for cast specification"

I've got a problem while copying data from SQL Server 2012 to Azure DB.
Here are the steps I took.
Created the .dat data file and the .xml format file as follows:
bcp.exe dbo.user_sayti out "c:\\dbo.user_sayti.dat" -w -k -Slocalhost -dsource_db -Uuser -Ppwd
bcp.exe dbo.user_sayti format nul -f "c:\\dbo.user_sayti.xml" -w -x -Slocalhost -dsource_db -Uuser -Ppwd
Made an attempt to copy them to Azure DB with keeping identity values:
bcp.exe dbo.user_sayti in "c:\\dbo.user_sayti.dat" -E -f "c:\\dbo.user_sayti.xml" -Sserver.database.windows.net -dtarget_db -Uuser -Ppwd
And got the "Invalid character value for cast specification" error.
I don't understand why, because the .dat file contains the value 127 for the identity column id (PK, int, NOT NULL) and the number of values matches the number of rows.
I then tried the same command without -E and the process finished successfully (there is only one row in this particular table, and it appeared in the target_db with an identity column value of 1).
Replacing the -f parameter (schema definition) with -w solved that stupid problem :)
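For reference, the working import command would then look something like this sketch, assembled from the commands above rather than quoted from the thread:
bcp.exe dbo.user_sayti in "c:\\dbo.user_sayti.dat" -E -w -Sserver.database.windows.net -dtarget_db -Uuser -Ppwd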

Sybase bcp error

What I want to do is copy a table into a file, truncate the table and copy the data back into the table.
For this, I am using the following two commands:
Out: bcp TABLE out file.csv -S SERVER -U user -P password -r '\n' -t '^|' -c
In: bcp TABLE in file.csv -S SERVER -U user -P password -r '\n' -t '^|' -c -J iso_1 -b 5000
This is the error i get:
CSLIB Message: - L0/O0/S0/N36/1/0:
cs_convert: cslib user api layer: common library error: The result is truncated because the conversion/operation resulted in overflow.
The interesting part (for me, at least) is that I get the error only for rows whose first column is an ODD number. From the first 3 million rows, it cuts half of them, all having an odd number in the first column (the PK).
I tried different options, but none seem to work: there is no problem with the charset as far as I can tell, there are no columns so large that they would be truncated, and it is not a missing carriage return.
Any help would be greatly appreciated.
UPDATE: After creating a format-file there are no more errors, but it only copies half of the data back into the table.
UPDATE: I managed to create a format file which works and loads all the data, but I cannot use it on another server (it works in the testing environment, but it needs to run in the production environment), since it says "Attempt to read an unknown version of bcp format-file". I know what this means, but is there any way of finding the correct values for the version?
SOLVED: After digging back into the database, it seems that the problem was indeed data inconsistency, due to the fact that the VIEW used in production to copy the table only copied 25 columns, while the table has 26 columns (somebody had altered the table and I hadn't noticed). Fixed the view and now it works.
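A quick sanity check for this kind of mismatch, sketched under the assumption of Sybase ASE system catalogs (the view name here is hypothetical, since the thread does not name it), is to compare the column counts of the view and the table through isql:
isql -S SERVER -U user -P password
-- count the columns the table actually has
select count(*) from syscolumns where id = object_id('TABLE')
go
-- count the columns the copy view exposes (COPY_VIEW is a placeholder name)
select count(*) from syscolumns where id = object_id('COPY_VIEW')
go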
Since you are going out of/into the same server, I recommend you use bcp with the native flag.
bcp DBNAME..TABLE out file.bcp -SSERVER -Uuser -Ppassword -n
bcp DBNAME..TABLE in file.bcp -SSERVER -Uuser -Ppassword -n -b5000
Character mode can get weird, and I only use it when it is required.

BCP to bulk insert into a single row/field

I'm trying to get BCP to insert the contents of a text file into a single field.
Example file content:
Field1,field2,Field3
1,test,,
2,,test
3,test,test
The following command imports each line above as a new row into my temp table.
bcp mydb..tempTable in c:\testFile.txt -T -c
I think the solution is to use the -r switch to specify the row terminator as the end of the file, but I'm unsure how to do this.
EDIT
I found the solution. The text file I am importing was itself created using BCP; in my example, all of the file content comes from a single nvarchar(max) field and row. If I set the row terminator via -r during the export, then this terminator also marks the end of my file. I can then import using bcp mydb..tempTable in c:\testFile.txt -T -c -r {eof}.
The only issue I have now is that the output from the BCP command states "Error = [Microsoft][SQL Server Native Client 10.0]Unexpected EOF encountered in BCP data-file". However, the data still imports as I want, so presumably I can ignore this?
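Based on that edit, the export/import pair might look like the following sketch; the source table and column names are hypothetical, and {eof} is treated as the literal terminator string the author chose:
:: hypothetical export of a single nvarchar(max) value, using {eof} as the row terminator
bcp "SELECT someMaxField FROM mydb..sourceTable" queryout c:\testFile.txt -T -c -r {eof}
:: import the whole file into a single row/field with the same terminator (quoted from the edit)
bcp mydb..tempTable in c:\testFile.txt -T -c -r {eof}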
