BCP import only half of csv file - sql-server

I have a problem with BCP. I am trying to import a CSV file into a table through a view, as below:
bcp vImport in "CSV\data.csv" -U[user] -P[password] -S[server] -ddb -c -t; -F2 -e"CSV\error.err"
The file contains about 900 rows, but BCP loaded only 400 of them. I looked for a parameter or database setting that would explain this, without success.
When I cut the file down to 400 rows, BCP loaded only 200.

The reason was that the target table has one more column than the CSV file, so the last table column was being filled with the first field of the next CSV row. BCP was consuming two file rows per table row, which is why exactly half the rows were loaded.
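One way to guard against this is an explicit column mapping via a format file, so any extra table column is simply left out of the load. A non-XML format-file sketch, assuming the CSV has three semicolon-separated fields (the field widths, column names, and the 14.0 version number are illustrative, not from the question):

```
14.0
3
1  SQLCHAR  0  100  ";"     1  Col1  ""
2  SQLCHAR  0  100  ";"     2  Col2  ""
3  SQLCHAR  0  100  "\r\n"  3  Col3  ""
```

It replaces -c and -t on the command line:

bcp vImport in "CSV\data.csv" -U[user] -P[password] -S[server] -ddb -f data.fmt -F2 -e"CSV\error.err"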

Related

Import data into SQL Server using BCP utility (export the log file with the error records and continue inserting with the normal records)

I have a data set and want to import it into my database with two conditions:
1. Any record that cannot be imported is extracted into a log file.
2. A record that cannot be imported does not stop the process; all other valid records are still imported.
Currently I use the BCP utility to import data into the table from the CSV file with:
bcp table_name IN C:\Users\09204086121\Desktop\data.csv -T -c -o C:\Users\09204086121\Desktop\logOut.log -e C:\Users\09204086121\Desktop\errOut.log
This only satisfies condition 1 above.
I need that when a record has an error (duplicate primary key, ...), it is written to the log (1), and the remaining valid records are still inserted into the table (2).
I came up with the idea of combining a trigger with bcp: after creating a trigger and adding the -h "FIRE_TRIGGERS" hint to the bcp statement, the insert ignores records with a duplicate key, but it does not write them to the log.
This is my trigger.
ALTER TRIGGER [PKGORDERCOMMON].[T_ImportData] ON [PKGORDERCOMMON].[IF_R_BUNRUI1]
INSTEAD OF INSERT
AS
BEGIN
    -- Insert non-duplicate records (schema-qualified so the insert targets the right table)
    INSERT INTO [PKGORDERCOMMON].[IF_R_BUNRUI1]
    (
        SYSTEM_KB,
        BUNRUI1_CD,
        BUNRUI1_KANJI_NA,
        BUNRUI1_KANA_NA,
        CREATE_TS
    )
    SELECT SYSTEM_KB,
           BUNRUI1_CD,
           BUNRUI1_KANJI_NA,
           BUNRUI1_KANA_NA,
           CREATE_TS
    FROM inserted i
    WHERE NOT EXISTS
    (
        SELECT *
        FROM [PKGORDERCOMMON].[IF_R_BUNRUI1] c
        WHERE c.BUNRUI1_CD = i.BUNRUI1_CD
          AND c.SYSTEM_KB = i.SYSTEM_KB
    );
END;
Can anyone help me?
BCP is not meant for what you are asking it to do (separating good and bad records). For instance, the bcp -e option has a limit on how many error records it will write. I'm not sure whether this limit is tied to the "max errors" option, but regardless, there is a limit.
Your best option is to load all the records and address the bad data in T-SQL.
Load all records in a way that ignores conversion errors. Either:
load each entire line from the file into a single large varchar column, then parse out the columns and QC the data as needed;
or
load all columns from the source file into generic varchar columns with a size large enough to accommodate your source data.
Either way, when done, use T-SQL to inspect your data and split it into good and bad records.
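A sketch of the second approach, reusing the column names from the trigger in the question; the staging and error table names are illustrative:

```sql
-- Staging table: generous varchar columns so every row loads without conversion errors
CREATE TABLE stg_IF_R_BUNRUI1 (
    SYSTEM_KB        varchar(400),
    BUNRUI1_CD       varchar(400),
    BUNRUI1_KANJI_NA varchar(400),
    BUNRUI1_KANA_NA  varchar(400),
    CREATE_TS        varchar(400)
);

-- bcp loads stg_IF_R_BUNRUI1 instead of the real table, then T-SQL does the split.

-- Bad rows (here: keys already present in the target) go to an error table / log.
SELECT s.*
INTO   err_IF_R_BUNRUI1
FROM   stg_IF_R_BUNRUI1 s
WHERE  EXISTS (SELECT 1 FROM PKGORDERCOMMON.IF_R_BUNRUI1 c
               WHERE c.BUNRUI1_CD = s.BUNRUI1_CD
                 AND c.SYSTEM_KB  = s.SYSTEM_KB);

-- Good rows are inserted normally.
INSERT INTO PKGORDERCOMMON.IF_R_BUNRUI1
       (SYSTEM_KB, BUNRUI1_CD, BUNRUI1_KANJI_NA, BUNRUI1_KANA_NA, CREATE_TS)
SELECT SYSTEM_KB, BUNRUI1_CD, BUNRUI1_KANJI_NA, BUNRUI1_KANA_NA, CREATE_TS
FROM   stg_IF_R_BUNRUI1 s
WHERE  NOT EXISTS (SELECT 1 FROM PKGORDERCOMMON.IF_R_BUNRUI1 c
                   WHERE c.BUNRUI1_CD = s.BUNRUI1_CD
                     AND c.SYSTEM_KB  = s.SYSTEM_KB);
```

This handles duplicates against the target table; duplicates within the file itself would need an extra de-duplication step (e.g. ROW_NUMBER over the key) before the insert.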

How to use bcp for columns that are Identity?

I want to restore my table with BCP using the command below.
BCP framework.att.attendance in "D:\test\mhd.txt" -T -c
But the id column in this table is an identity column.
When data is restored with BCP, I want the id values to be unchanged.
In other words, if the id of the first row was '7' before the export, after the import the id of the first row should still be '7'.
What should I do?
Use -E on import.
-E specifies that the identity value or values in the imported data file are to be used for the identity column.
If -E is not given, the identity values in the data file are ignored and SQL Server assigns new values automatically.
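Applied to the command from the question, the import keeps the ids from the file (paths and table name are the asker's):

```
BCP framework.att.attendance in "D:\test\mhd.txt" -T -c -E
```

Unlike a plain INSERT, no SET IDENTITY_INSERT ON is needed; bcp handles that itself when -E is specified.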

Issue extracting individual image from SQL Server using BCP

I have a table of data; one column contains images. I am trying to extract an individual record's image column and export it to an image file using BCP. My code is as follows:
bcp "SELECT PictureData FROM BigTable WHERE RecordId='ASDF-QWER'" queryout "C:\Path\File.jpg" -n -SCONNECTION\STRING -T
The command appears to run successfully; it says 1 rows copied, but when I double-click the file it created, it says "cannot open this file". If I modify the statement to select different columns and copy them to a text file using something like:
bcp "SELECT Name,Address FROM BigTable WHERE RecordId='ASDF-QWER'" queryout "C:\Path\File.txt" -n -SCONNECTION\STRING -T
then the text file contains the data I expect it to contain.
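A hedged note on the likely cause (the thread does not include an answer): with -n (native format), bcp writes a length prefix in front of the binary value, so the output file is not a raw image. Exporting the bytes verbatim needs a format file that sets the prefix length to 0; a sketch, assuming PictureData is varbinary(max) and the format-file version (14.0 here) matches the installed bcp:

```
14.0
1
1  SQLBINARY  0  0  ""  1  PictureData  ""
```

It replaces -n on the command line:

bcp "SELECT PictureData FROM BigTable WHERE RecordId='ASDF-QWER'" queryout "C:\Path\File.jpg" -f picture.fmt -SCONNECTION\STRING -T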

How to import flat file by skipping header and footer using Sql Server query

I want to import a flat file using a SQL Server query. The flat file is pipe-delimited with one HEADER row and one FOOTER row.
A sample of the flat file is as follows:
H|201501204|01
1|abc|123
2|efg|456
4|hij|789
T|03
In the file above, 'H' and 'T' are fixed markers for the HEADER and FOOTER respectively; '201501204' is the date, '01' is the flat file number, and '03' is the count of data rows in the flat file.
I have tried using BULK INSERT, but I am losing one data row while importing the flat file.
I am using FIRSTROW=2 and LASTROW=2651; the number of data rows in my flat file is 2650, but after importing I get only 2649 rows in the table.
Use the BULK INSERT statement (see https://msdn.microsoft.com/en-us/library/ms188365.aspx); the option you want is FIRSTROW, set to 2 (the second row in the file). You can also use LASTROW.
Alternatively, load the data into a staging table first as a single column, then split it in SQL Server, excluding the header and footer with WHERE LEFT(input_line, 1) NOT IN ('H', 'T').
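The staging-table variant can be sketched as follows. Table, column, and file names are illustrative, and the PARSENAME trick assumes each data row has at most four pipe-delimited fields and contains no dots:

```sql
-- One wide column per line of the file
CREATE TABLE stg_lines (input_line varchar(4000));

BULK INSERT stg_lines
FROM 'C:\data\flatfile.txt'
WITH (ROWTERMINATOR = '\n');

-- Split the pipe-delimited fields, excluding the header and footer rows
SELECT PARSENAME(REPLACE(input_line, '|', '.'), 3) AS id,
       PARSENAME(REPLACE(input_line, '|', '.'), 2) AS code,
       PARSENAME(REPLACE(input_line, '|', '.'), 1) AS val
FROM   stg_lines
WHERE  LEFT(input_line, 1) NOT IN ('H', 'T');
```

Because the header and footer are filtered out in T-SQL, no FIRSTROW/LASTROW arithmetic is needed, which sidesteps the off-by-one problem.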

Copying a dynamic number of tables into flat files using SSIS

Problem
I would like to copy a number of tables (around 100) into separate flat files. The tables do not have the same number of columns.
So far
I have extracted the list of tables I wish to copy using a sys query, and I loop through each table using a Foreach container. The table name is stored in a variable. The Foreach container has a Data Flow task with an OLE DB Source component, which extracts all fields using an expression-based query (to insert the table name).
Now what?
I am stuck on how to save the data into a flat file. Each flat file should have the table name as its filename (which can be done by declaring the connection string as an expression), but I am clueless about how to handle and bind the dynamic number of columns.
To be honest, SSIS doesn't play particularly nicely when you're looping through sources with differing columns. You'd probably be better off using BCP in this case. It can output CSV files directly and, if you can run xp_cmdshell, you can loop through each of your tables in T-SQL. For example, something like the following...
declare @bcpsql varchar(8000)
select @bcpsql = 'bcp <tablename> out c:\<tablename>.txt -c -t, -T -S <yourserver>'
exec master..xp_cmdshell @bcpsql
Link to BOL on BCP...http://msdn.microsoft.com/en-us/library/ms162802.aspx
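The per-table loop that answer describes can be sketched with a cursor over sys.tables; the output path and the <yourserver> placeholder are illustrative:

```sql
DECLARE @name sysname, @bcpsql varchar(8000);

DECLARE tbl_cur CURSOR FOR
    SELECT name FROM sys.tables;  -- or filter to the ~100 tables you need

OPEN tbl_cur;
FETCH NEXT FROM tbl_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- QUOTENAME guards against unusual table names in the generated command
    SET @bcpsql = 'bcp ' + QUOTENAME(DB_NAME()) + '.dbo.' + QUOTENAME(@name)
                + ' out c:\export\' + @name + '.txt -c -t, -T -S <yourserver>';
    EXEC master..xp_cmdshell @bcpsql;
    FETCH NEXT FROM tbl_cur INTO @name;
END
CLOSE tbl_cur;
DEALLOCATE tbl_cur;
```

Each table lands in its own file named after the table, which is exactly the per-table filename requirement from the question.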
