I have a stored procedure that uses bcp queryout and populates the file with the results of
SELECT * FROM ##TempTable
Occasionally, the file created is empty. I know it cannot be a permissions issue on the output location, because the file is created and saved, so the SELECT must be returning zero rows. This is a production environment and I am not permitted to add any debugging to see what count the SELECT statement returns prior to the bcp line, but I know the table is populated because it is referenced later on in the sp and that section of the code never fails.
Has anyone seen bcp act this way?
The switches I am using against bcp are
-t -T -c -S
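For reference, here is a sketch of the full command shape (the query, field terminator, output path, and server name below are placeholders, not my real values):
bcp "SELECT * FROM ##TempTable" queryout C:\output\results.txt -c -t"," -T -S MYSERVER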
Thanks
Have you checked to ensure that none of your fields contains a NULL? Use ISNULL() to replace NULL with ''.
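For example, a sketch (the column names are placeholders for your actual columns):
SELECT ISNULL(Col1, '') AS Col1, ISNULL(Col2, '') AS Col2 FROM ##TempTable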
I have a script with a dynamic query. I want to execute the query and output its result to a file. I can't seem to figure out how to output the result of an "execute" statement.
Sample code below.
declare @sql_text varchar(300)
select @sql_text = 'select 1'
exec (@sql_text) > output.txt
To give more context: my actual script would loop through the dynamic query and output to different files (with dynamic filenames as well).
You set the output file via the -o parameter of the isql client that executes the SQL. This will send the output to a file for any SQL, be that normal or dynamic SQL.
So put the SQL in an input file and then run
isql -U user -P password -S server -i input_filename -o output.txt
You can't write directly to an operating system file from within ASE itself without enabling xp_cmdshell, which is a potential security issue (it allows O/S commands to be run as the user running the Sybase dataserver) and is therefore prohibited at most sites.
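For the dynamic-filename case, a minimal sketch (assuming a Unix shell; the query file names are hypothetical) is to keep one input file per query and vary the -o argument:
for f in query1.sql query2.sql; do
    isql -U user -P password -S server -i "$f" -o "${f%.sql}.out"
done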
I'm trying to transfer table data from one SQL Server to another and want to use the bcp utility for it. This is purely to transfer data between two identical schemas, but I'm not able to use something like SSDT; I need something scriptable and portable that can be run by others with just SQL Server and SSMS access.
I am generating a native output file and format file like so:
$> bcp database.TableName OUT c:\data\bcp\TableName.bcp -T -N -S SQLINSTANCE
$> bcp database.TableName format nul -f c:\data\bcp\TableName.fmt -T -N
Then in Management Studio I am trying to in turn read the files like this:
SELECT *
FROM OPENROWSET(BULK 'c:\data\bcp\TableName.bcp',
                FORMATFILE = 'c:\data\bcp\TableName.fmt') AS t1
But am getting this error:
The bulk load failed. The column is too long in the data file for row 6, column 19. Verify that the field terminator and row terminator are specified correctly.
I have followed this process successfully before, and it works for other tables, but I'm running into an issue with this table. The column mentioned is of datatype nvarchar(max). I can inspect what I think is the "problem" record in the source data; it's just a very long string, but I don't see anything else special about it.
Is there something else I should be doing when generating the format file or what else am I missing?
If you are only exporting for the purpose of importing into another SQL Server, native format is the way to go. And in this case you don't need to use format files. Just do a native export and import.
Note that you are specifying a capital -N, and that's not plain native: -N is Unicode native format (character data is written as Unicode). Plain native is lowercase -n.
You should export using something like:
bcp database.Schema.TableName OUT c:\data\bcp\TableName.bcp -T -n -S SQLINSTANCE
Then on the importing side I suggest using BULK INSERT, which doesn't need a format file for native at all:
BULK INSERT TargetDB.dbo.TargetTable
FROM 'c:\data\bcp\TableName.bcp'
WITH (DATAFILETYPE = 'native');
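Equivalently, if you'd rather stay with bcp on the importing side, a sketch (the target server and table names are placeholders):
bcp TargetDB.dbo.TargetTable in c:\data\bcp\TableName.bcp -T -n -S TARGETINSTANCE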
If you can't use BULK INSERT and must absolutely go for OPENROWSET, you need a format file. bcp can generate that for you, but again, lower case -n:
bcp database.Schema.TableName format nul -f c:\data\bcp\TableName.fmt -T -n -S SQLINSTANCE
Now your OPENROWSET should work.
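For completeness, once the format file is regenerated with -n, a sketch of loading the rows straight into a target table this way (target database and table names are placeholders):
INSERT INTO TargetDB.dbo.TargetTable
SELECT *
FROM OPENROWSET(BULK 'c:\data\bcp\TableName.bcp',
                FORMATFILE = 'c:\data\bcp\TableName.fmt') AS t1;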
I have a source file in which the first field is empty in 90% of the rows. I want to load this file into a SQL Server table with the BCP utility. When I run the BCP command, BCP is not able to recognize or distinguish the records.
My Source file has data as below.
|100168|27238800000|14750505|1|273
|100168|27238800000|14750505|1|273
|100681|88392930052|37080101|1|252
|101014|6810000088|90421505|12|799
|101595|22023000000|21050510|8|780
I am using
bcp [DBNAME].[dbo].[TABLE1] in \\filelocation\filename -e \\filelocation\filename_Error.txt -c -t | -S ServerName -T -h TABLOCK -m 1
I am getting this error message in error.txt:
## Row 1, Column 28: String data, right truncation ##
100168 27238800000 14750505 1 273 100168|27238800000|14750505|1|273
Here BCP is not able to recognize the record boundaries. Because of this, BCP tries to load the next record's data into the last field, which causes the data truncation.
Table schema is
CREATE TABLE [DBO].[TABLE1](
FLD1 VARCHAR(10)
,FLD2 VARCHAR(10)
,FLD3 VARCHAR(22)
,FLD4 VARCHAR(15)
,FLD5 VARCHAR(10)
,FLD6 VARCHAR(12) )
You need to quote the pipe. The pipe character (|) is used for redirecting standard output on the command line.
The following simplified line works with your sample
bcp.exe [db].dbo.[table1] in "path\Data.dat" -S".\instance" -T -c -t"|"
I omitted the error limit (-m), error log (-e), and table lock hint (-h); those should not affect the import. If you still have an issue, try quoting other parameters such as the server name and the file names as well.
I used a text file with standard \r\n row terminators as expected by -c
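Applied to your original command, that would be (a sketch with your same switches, only the terminator and paths quoted):
bcp [DBNAME].[dbo].[TABLE1] in "\\filelocation\filename" -e "\\filelocation\filename_Error.txt" -c -t"|" -S ServerName -T -h "TABLOCK" -m 1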
What I want to do is copy a table out to a file, truncate the table, and copy the data back into the table.
For this, I am using the following two commands:
Out: bcp TABLE out file.csv -S SERVER -U user -P password -r '\n' -t '^|' -c
In: bcp TABLE in file.csv -S SERVER -U user -P password -r '\n' -t '^|' -c -J iso_1 -b 5000
This is the error I get:
CSLIB Message: - L0/O0/S0/N36/1/0:
cs_convert: cslib user api layer: common library error: The result is truncated because the conversion/operation resulted in overflow.
The interesting part (for me, at least) is that I get the error only for rows whose first column is an ODD number. Of the first 3 million rows, it cuts half of them, all having an odd number in the first column (the PK).
I tried different options, but none seem to work: there is no problem with the charset as far as I can tell, there are no columns so large that they would be truncated, and it is not a missing carriage return.
Any help would be greatly appreciated.
UPDATE: After creating a format file there are no more errors, but it only copies half of the data back into the table.
UPDATE: I managed to create a format file which works and loads all the data, but I cannot use it on another server (it works in the testing environment, but it needs to run in the production environment), since it fails with "Attempt to read an unknown version of bcp format-file". I know what this means, but is there any way of finding the correct value for the version?
SOLVED: After digging back into the database, it turned out the problem was indeed a data inconsistency: the VIEW used in production to copy the table only exposed 25 columns, but the table has 26 (somebody had altered the table and I didn't know or notice that it had happened). Fixed the view and now it works.
Since you are going out of/into the same server, I recommend you use bcp with the native flag.
bcp DBNAME..TABLE out file.bcp -SSERVER -Uuser -Ppassword -n
bcp DBNAME..TABLE in file.bcp -SSERVER -Uuser -Ppassword -n -b5000
Character mode can get weird, and I only use it when it is required.
I'm trying to get BCP to insert the contents of a text file into a single field.
Example file content
Field1,field2,Field3
1,test,,
2,,test
3,test,test
The following command imports each line above as a new row into my temp table.
bcp mydb..tempTable in c:\testFile.txt -T -c
I think the solution is to use the -r switch to specify the row terminator as the end of the file, but I'm unsure how to do this.
EDIT
I found the solution. The text file I am importing is itself first created using BCP; in my example, all of the file contents come from a single nvarchar(max) field and row. If I set the row terminator via -r during the export, then that terminator also becomes the end of my file. I can then import using bcp mydb..tempTable in c:\testFile.txt -T -c -r {eof}.
The only issue I have now is that the output from the BCP command states "Error = [Microsoft][SQL Server Native Client 10.0]Unexpected EOF encountered in BCP data-file"; however, the data still imports as I want, so presumably I can ignore this?
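As an aside, if running T-SQL directly is an option instead of bcp, OPENROWSET with SINGLE_CLOB reads an entire file into a single value, which avoids the row-terminator question altogether. A sketch (the target column name here is hypothetical):
INSERT INTO mydb..tempTable (FileContents)
SELECT BulkColumn
FROM OPENROWSET(BULK 'c:\testFile.txt', SINGLE_CLOB) AS f;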