Export data with bcp utility with double quotes - sql-server

I'm trying to export data from MS SQL Server using the bcp command-line utility. The problem is that the exported output is missing the first double quote on the first line, and I cannot explain why.
Below is the command I'm using for the export:
/opt/mssql-tools/bin/bcp db_schema out dump.csv -c -t"\",\"" -r"\"\n\"" -S my_host -U my_user
But the output is missing the first double quote on the first line (and only the first line) of the exported CSV file:
801","40116","Hazelnut MT -L","Thursday Promo","Large","","5.9000","","801","1.0000","","3.6500","2.2500",".0000","default","","","","","Chatime","02/06/2014","09125a9cfffd4143a00e73e3b62f15f2","CB01","",".0000","5.9000","6.9000",".0000",".0000",".0000",".0000",".0000",".0000","0","","0","0","0","","","","","","","","","Modern Milk Tea","","","0","","","1","0","","","","","","","","0","Hau Chan","","","","","","","","","","0","","","","","","","-1","","","","","","","","","","","","0","00000000420714AA","2014-06-02","1900-01-01","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","",""
Am I missing something?

If you know the field names, you can try the following:
Create Table [names]
(
Id Int,
fname VarChar(15),
lname VarChar(25)
)
Insert Into names Values
(1, 'Jim', 'Morrison'),
(2,'Robert', 'Plant'),
(3,'Janis', 'Joplin')
BCP command: Using quotename() with char(34)
(This BCP command uses a Trusted connection)
bcp "SELECT quotename(Cast(Id As VarChar(15)),char(34)), quotename(fname,char(34)), quotename(lname,char(34)) FROM names" queryout dump.csv -c -t"," -S SERVER -d DBNAME -T
Result:
"1","Jim","Morrison"
"2","Robert","Plant"
"3","Janis","Joplin"

I'll bet that you also have a line at the end of the file that is just a single double-quote. Why? With the command line switches you've provided, you're saying "end every field with "," and every row with "\n"". And that's what it's doing.
So, the second and subsequent lines start with a double-quote because the previous line ends with one.
Clippy DBA says "it looks like you're trying to produce a CSV". Without knowing why, it's hard for me... er... Clippy to suggest an alternative. Specifically, what is going to be reading this file? Will you be reading this with Excel or something else that's expecting a specific format? Will you be importing it into another database table?
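That said, if a quote-wrapped CSV is the goal, one way to patch up the original approach is to emit the missing leading and trailing quotes from the query itself and switch to a plain row terminator. This is a sketch only, reusing the names table from the answer above (server, login and DBNAME are placeholders):
bcp "SELECT char(34) + Cast(Id As VarChar(15)), fname, lname + char(34) FROM DBNAME.dbo.names" queryout dump.csv -c -t"\",\"" -r"\n" -S my_host -U my_user
The fields are joined with "," and only the first and last fields supply the outer quotes, so each row comes out as "1","Jim","Morrison" with no stray quote at the end of the file.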

The quotename answer provided by level3looper will work in your case.
For completeness, I'm providing a solution I've given in the past for the same purpose.
I like this one a little better because it keeps the formatting definition in the format file, which is where I prefer to look for that information. quotename is a good solution for quick, ad hoc work, but for automation and business processes I would recommend the approach in the link below.
Essentially, you add a "dummy" column to the beginning of the definition of the extract file and delimit that column with a single double quote. Then you also note in the format file to skip that first column. This gives you just the double quote at the start of the line.
sql server bcp bulk insert pipe delimited with text qualifier format file
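To illustrate the dummy-column idea from that link on the import side (a hedged sketch, not tested here): a non-XML format file for the names table above, where field 1 carries no data, is terminated by the opening quote, and maps to server column 0, meaning it is skipped; the terminators of the remaining fields consume the "," separators and the closing quote plus newline.
13.0
4
1  SQLCHAR  0  0   "\""      0  dummy  ""
2  SQLCHAR  0  12  "\",\""   1  Id     ""
3  SQLCHAR  0  15  "\",\""   2  fname  ""
4  SQLCHAR  0  25  "\"\r\n"  3  lname  ""
It would then be used with something like BULK INSERT names FROM 'C:\dump.csv' WITH (FORMATFILE = 'C:\names.fmt').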

Related

BCP Command to Export Query Results to File with Pipe Delimiter

I have a requirement to create a SQL Job that exports a query to a pipe / vertical bar delimited file (|) and saves it on a network drive in either *.txt or *.csv format. Right now, I am just trying to get this to work inside SSMS, calling the BCP command and exporting the stored procedure's output in the proper format to the network location, but I am not able to get it working.
I have been researching this and there are two methods:
Use the Export Data wizard to create a package and schedule it as a job. But with this method, if we need to make changes, I believe we cannot modify the SSIS package that is created, so we lose flexibility.
Use the BCP command to export the file.
I greatly prefer option #2, the BCP command, but I am having problems. I just cannot seem to get the syntax correct and am hoping someone can show me what I am missing:
This is my command:
Exec master..xp_cmdshell 'bcp EXEC [dbo].[usp_Report_1123] ''7786'' -t| out \\networkDrive\Reports\REPORT_1123\report1123.csv -T'
But I get the following messages:
output
'out' is not recognized as an internal or external command,
operable program or batch file.
NULL
The stored procedure does work and returns data, and the network path resolves if I enter it on my machine. But I am not sure what I am missing and hope someone can help.
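The error itself points at quoting: cmd.exe treats the unescaped | as a pipe, so everything after it, beginning with out, is executed as a separate command, which is exactly why "'out' is not recognized" appears. A hedged rework (SERVERNAME and DBNAME are placeholders): wrap the query in double quotes, use queryout instead of out since this is query output rather than a table export, and quote the delimiter as -t"|":
Exec master..xp_cmdshell 'bcp "EXEC [dbo].[usp_Report_1123] ''7786''" queryout "\\networkDrive\Reports\REPORT_1123\report1123.csv" -c -t"|" -T -S SERVERNAME -d DBNAME'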

I generate an Excel sheet with a SQL query, but it only generates a 93-version file. How can I make it produce a 2013-version file?

exec master..xp_cmdshell 'bcp "Select ''column'' union all Select cast(column As nvarchar(max)) from [NEWDATABASE].[dbo].[TempPower] WHERE BarCode = ''batman''" queryout D:\TempPower.xls -o "D:\querycommanddetails.txt" -T -c -C RAW'
bcp does not, from the documentation I can find, export to any Excel binary format. What you're doing is producing a delimited text file and giving it an Excel extension. But file extensions don't dictate how the data is stored/represented in the file; they're just a hint to tell Windows what application to open them in and the application a hint about how to process it.
If you open the file in a text editor, you'll see the raw data in plain text there. Excel may even give you a warning when you attempt to open a delimited text file that has the xls extension because it's not getting what it expected.
If you need to output directly to an xlsx file, you'll need to produce it through another method.
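One commonly cited alternative is writing into an existing workbook via the ACE OLE DB provider. This is only a hedged sketch: it assumes the provider is installed on the SQL Server machine, 'Ad Hoc Distributed Queries' is enabled, D:\TempPower.xlsx already exists with a header named PowerValue on Sheet1, and PowerValue stands in for the real column name from the question.
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0 Xml;Database=D:\TempPower.xlsx;', 'SELECT PowerValue FROM [Sheet1$]')
SELECT cast(PowerValue As nvarchar(max)) FROM [NEWDATABASE].[dbo].[TempPower] WHERE BarCode = 'batman'
Otherwise, a small script using a library that writes real xlsx files is usually the simpler route.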

How to import file contents into table column as data

A project I'm working on at work involves modifying one of the subsystems to store/pull data that is currently stored in files into the database. Each of the files is a single, sometimes-large, chunk of custom (xml-based) script generated by another custom tool.
Conceptually, I'm looking for an easy way to do something like:
For Each file in folder_and_subfolders
INSERT INTO table
(script_name, version_num, script )
VALUES
({file_name}, 1, {file_contents})
;
Next
Preferably on an entire directory tree at once.
If there's no easy way to do this via T-SQL, I can write a utility to do the job, but I'd prefer something that didn't require having to write another custom tool that will only be used once.
So, I don't have SQL Server installed and therefore can't test this, but if you are looking for a simple batch file that could do what you're after, I'd suggest something like the following might well help:
@echo off
SET xmldir=.\myxmlfiles\live\here\
echo --- Processing files
for %%f in ("%xmldir%*.xml") do (echo Running %%f.... && sqlcmd -I -U %1 -P %2 -S %3 -d %4 -v filename="%%f" -i ProcessFile.sql)
I'm not sure how much you know about sqlcmd, but it is a command line tool that is generally provided by SQL Server. It will allow you to run SQL commands, or in the case above, run a script which is indicated by the -i parameter. I am assuming that you'd place your SQL statement in there to perform your additions to the table.
The other parameters to sqlcmd are described below;
-I sets QUOTED_IDENTIFIER on (you may or may not need this. I did for an earlier issue I faced with sqlcmd and QUOTED_IDENTIFIER)
-U sets the database username
-P sets the database password
-S sets the database server
-d sets the database to connect to
-v is the interesting one here as it lets you pass parameters to your script. Note that on the MSDN page describing this, it states that if your path or filename contains spaces, then you'll need to enclose it in quotes, so check that out. Basically though, you'd be able to refer to the parameter inside your sql script (ProcessFile.sql) like INSERT INTO mytable (file_name) VALUES ('$(filename)')
You'd have to use the logic described in the answer from my previous comment to ensure the file contents themselves, not just the file name, end up in the column.
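For instance, ProcessFile.sql could pull each whole file in as a single value with OPENROWSET ... BULK. A minimal sketch, where scripts is a stand-in for the destination table with the columns from the question, assuming the files are readable from the SQL Server machine (SINGLE_NCLOB would be needed instead for Unicode files):
INSERT INTO scripts (script_name, version_num, script)
SELECT '$(filename)', 1, f.BulkColumn
FROM OPENROWSET(BULK '$(filename)', SINGLE_CLOB) AS f;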

xp_cmdshell command not executing last command when run as job

First off, before everybody shouts at me - I'm bug fixing in legacy code and a re-write is off the cards for now - I have to try to find a fix using the xp_cmdshell command.
I have a proc which is executed via a scheduled job. The proc is full of TSQL like the below to dump data to a log file.
SELECT *
INTO Temp
FROM MyView

DECLARE @cmd1 varchar(1000), @cmd2 varchar(1000)
SET @cmd1 = 'bcp "SELECT * FROM [myDatabase].dbo.Temp" queryout "C:\temp.txt" -T -c -t" "'
SET @cmd2 = 'type "C:\temp.txt" >> "C:\output.txt"'
EXEC master..xp_cmdshell @cmd1
EXEC master..xp_cmdshell @cmd2
DROP TABLE Temp
The problem is that the last of these commands in the proc doesn't appear to run. I can see the result in the temp.txt file, but not the output.txt file. All of the preceding commands work fine, though, and the whole thing works fine when I run it on its own.
Can anyone suggest why this might happen or suggest an alternative way to achieve this?
Thanks
I think that bcp, as an external process, runs asynchronously. So it could be that your file is not yet fully written at the moment you try to copy its contents.
Suggestion 1: Include an appropriate wait time
Suggestion 2: Call your first command a second time with changed target file name
Suggestion 3: Use copy rather than type
You might create a file C:\temp.txt with just hello world in it. Try to type it into one file before the BCP and type it into another file after the BCP.
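Suggestion 1 is the cheapest to try first. A sketch, with the caveat that the 5-second delay is an arbitrary guess:
EXEC master..xp_cmdshell @cmd1
WAITFOR DELAY '00:00:05' -- give the external bcp process time to finish writing temp.txt
EXEC master..xp_cmdshell @cmd2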

sqlcmd: reconciling -W and -Y

I have an automated batch process that uses sqlcmd to pull data from a database and dump it into a text file. Many fields in that database are of type varchar(max); sqlcmd limits these fields to 256 characters unless I add something like -y 0 to the flags in the sqlcmd call.
This gives me the full text for fields larger than 256 characters, but it also adds a great deal of whitespace; the fields are padded to make each field as big as it could possibly be according to its data type, essentially giving me huge files with lots of padding and wasted space.
I could fix this by adding -W to my sqlcmd flags, but this gives me an error saying that -W and -y are incompatible.
Has anyone had this problem before? Thoughts on how to solve it?
From this thread, there is the suggestion that specifying the column separator using -s can trim the data, since if it is not specified the data comes out fixed-width.
If that does not work have you tried RTRIM(LTRIM(ColumnName)) in your SELECT query?
I had this issue while creating a CSV file with SQLCMD and solved it by tricking SQLCMD. Instead of returning several fields and letting SQLCMD insert the comma separator, I just concatenate all the fields, putting the commas in myself. I know it's an ugly workaround, but it solved my problem. Hope it can help someone else.
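A sketch of that trick against the names table from the first answer (the flag set is one plausible combination: -h -1 drops the header row, -y 0 keeps long values intact, and with a single concatenated column there is no inter-column padding left to fight):
sqlcmd -S SERVER -d DBNAME -E -h -1 -y 0 -Q "SELECT CONCAT(Id, ',', fname, ',', lname) FROM names" -o dump.csv
CONCAT is available from SQL Server 2012 onward; on older versions, casting each field and concatenating with + would do the same job.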
