How to add text qualifier (") while exporting data using SQLCMD command? - sql-server

I am exporting data from a SQL Server table to CSV using SQLCMD:
sqlcmd -S serverdetails -s"^" -d dbname -U username -P password -W -Q "SET NOCOUNT on; Select * from table with (nolock)" > c:\USERS\a\b\export_file.csv
I am getting data like this:
id^column1^column2^column3^column4
1^abc^cde^www.google.com^8776565
2^abc^cde^www.google.com^8776565
3^abc^cde^www.google.com^8776565
I want output like this:
"id"^"column1"^"column2"^"column3"^"column4"
"1"^"abc"^"cde"^"www.google.com"^"8776565"
"2"^"abc"^"cde"^"www.google.com"^"8776565"
"3"^"abc"^"cde"^"www.google.com"^"8776565"
Please suggest how I can do this while keeping select * from table; I don't want to list every column and concatenate " around each one.

The double quote character has to be escaped with a backslash, so try "\"^\"" as the column separator:
sqlcmd -S serverdetails -s"\"^\"" -d dbname -U username -P password -W -Q "SET NOCOUNT on; Select * from table with (nolock)" > c:\USERS\a\b\export_file.csv
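Note that -s only controls what is printed between columns, so this puts "^" between fields but does not add the very first and very last quote on each line; the output should look roughly like this (an illustration, not verified against a server):
id"^"column1"^"column2"^"column3"^"column4
1"^"abc"^"cde"^"www.google.com"^"8776565
To get the outer quotes as well, you would still have to add them in the query or post-process the file.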

Related

SQL Server table swap

I have a table that is truncated and loaded with data every day. The problem is that truncating the table takes a while and users are noticing this. What I am wondering is: is there a way to have two copies of the same table, truncate one, load the new data into it, and then have the users use that new table, switching between the two tables each time?
If you're clearing out the old table as well as populating the new one, you could use the OUTPUT clause. Be mindful of the potential for log growth; consider a loop/batch approach (a sketch follows the example below) if this may be a problem.
DELETE
    OldDatabase.dbo.MyTable
OUTPUT
    DELETED.col1
    , DELETED.col2
    , DELETED.col3
INTO
    NewDatabase.dbo.MyTable
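A minimal sketch of the batched variant mentioned above (the batch size of 10000 is an arbitrary placeholder; table and column names are taken from the example):
DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    -- move (at most) one batch from the old table to the new one
    DELETE TOP (10000)
        OldDatabase.dbo.MyTable
    OUTPUT
        DELETED.col1
        , DELETED.col2
        , DELETED.col3
    INTO
        NewDatabase.dbo.MyTable;
    SET @rows = @@ROWCOUNT;   -- drops to 0 once the old table is empty
END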
Or you can use BCP, which is a handy alternative to be aware of. Note this is using SQLCMD syntax.
:setvar SourceServer OldServer
:setvar SourceDatabase OldDatabase
:setvar DestinationServer NewServer
:setvar DestinationDatabase NewDatabase
:setvar BCPFilePath "C:\"
!!bcp "$(SourceDatabase).dbo.MyTable" FORMAT nul -S "$(SourceServer)" -T -n -q -f "$(BCPFilePath)MyTable.fmt"
!!bcp "SELECT * FROM $(SourceDatabase).dbo.MyTable WHERE col1=x AND col2=y" queryout "$(BCPFilePath)MyTable.dat" -S "$(SourceServer)" -T -q -f "$(BCPFilePath)MyTable.fmt" -o "$(BCPFilePath)MyTable.txt"
!!bcp "$(DestinationDatabase).dbo.MyTable" in $(BCPFilePath)MyTable.dat -S $(DestinationServer) -T -E -q -b 2500 -h "TABLOCK" -f $(BCPFilePath)MyTable.fmt

How to create a new local table from a select query on remote db in PostgreSQL?

I can use the following command to do so, as long as I create the table and the appropriate columns first. I would like the command to be able to create the table for me based on the results of my query.
psql -h remote.host -U myuser -p 5432 -d remotedb -c "copy (SELECT view.column FROM schema.view LIMIT 10) to stdout" | psql -h localhost -U localuser -d localdb -c "copy localtable from stdin"
Again, it populates the data properly if I create the table and columns ahead of time, but it would be much easier if I could automate that with a command that creates the table according to the results of my query.
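One possible workaround, sketched under the assumption that you are allowed to create a scratch table on the remote database (scratch_localtable is a placeholder name): materialise an empty table from the query on the remote side, replay its DDL locally with pg_dump --schema-only, then run the copy pipeline above.
# create an empty table with the query's structure on the remote side
psql -h remote.host -U myuser -p 5432 -d remotedb -c "CREATE TABLE scratch_localtable AS SELECT view.column FROM schema.view LIMIT 0"
# replay just the DDL on the local database
pg_dump -h remote.host -U myuser -p 5432 -d remotedb --schema-only -t scratch_localtable | psql -h localhost -U localuser -d localdb
# rename it locally (and drop the scratch table on the remote side when done)
psql -h localhost -U localuser -d localdb -c "ALTER TABLE scratch_localtable RENAME TO localtable"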

Using variables in SQLCMD for Linux

I'm running the Microsoft SQLCMD tool for Linux (CTP 11.0.1720.0) on a Linux box (Red Hat Enterprise Linux Server 5.3, Tikanga) with the Korn shell. The tool is properly configured and works in all cases except when using scripting variables.
I have a SQL script that looks like this:
SELECT COLUMN1 FROM TABLE WHERE COLUMN2 = '$(param1)';
And I'm running the sqlcmd command like this.
sqlcmd -S server -d database -U user -P pass -i input.sql -v param1="DUMMYVALUE"
When I execute the above command, I get the following error.
Sqlcmd: 'param1=DUMMYVALUE': Invalid argument. Enter '-?' for help.
The help lists the following syntax:
[-v var = "value"...]
Am I missing something here?
You don't need to pass variables to sqlcmd; it picks them up automatically from your shell environment variables:
e.g.
export param1=DUMMYVALUE
sqlcmd -S $host -U $user -P $pwd -d $db -i input.sql
In the RTM version (11.0.1790.0), the -v switch does not appear in the list of parameters when executing sqlcmd -?. Apparently this option isn't supported under the Linux version of the tool.
As far as I can tell, importing parameter values from environment variables doesn't work either.
If you need a workaround, one way would be to concatenate one or more :setvar statements with the text file containing the commands you want to run into a new file, then execute the new file. Based on your example:
echo :setvar param1 DUMMYVALUE > param_input.sql
cat input.sql >> param_input.sql
sqlcmd -S server -d database -U user -P pass -i param_input.sql
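After those two commands, param_input.sql is simply the :setvar line followed by the original script; based on the example above it would contain:
:setvar param1 DUMMYVALUE
SELECT COLUMN1 FROM TABLE WHERE COLUMN2 = '$(param1)';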
You can export the variable in Linux; after that you won't need to pass the variable to sqlcmd. However, I did notice you will need to change your SQL script and remove the :setvar command if it doesn't have a default value.
export dbName=xyz
sqlcmd -Uusername -Sservername -Ppassword -i script.sql
:setvar dbName --remove this line
USE [$(dbName)]
GO
I think you're just not quoting the input variables correctly. I created this bash script...
#!/bin/bash
# Create a sql file with a parameterized test script
echo "
set nocount on
select k = '-db', v = '\$(db)' union all
select k = '-schema', v = '\$(schema)' union all
select '-', 'static'
go" > ./test.sql
# capture input variables
DB=$1
SCHEMA="${2:-dbo}"
# Exec sqlcmd
sqlcmd -S 'localhost\lemur' -E -i ./test.sql -v "db=${DB}" -v "schema=${SCHEMA}"
... and tested it like so:
$ ./test.sh master
k       v
------- ------
-db     master
-schema dbo
-       static

SQLCMD passing in double quote to scripting variable

I am trying to pass in a double quote to a scripting variable in SQLCMD. Is there a way to do this?
sqlcmd -S %serverName% -E -d MSDB -i MyScript.sql -m 1 -v Parameter="\""MyValueInDoubleQuote\"""
And my SQL script is as follows:
--This Parameter variable below is commented out since we will get it from the batch file through sqlcmd
--:SETVAR Parameter "\""MyValueInDoubleQuote\"""
INSERT INTO [MyTable]
([AccountTypeID]
,[Description])
VALUES
(1
,$(Parameter))
GO
If you have your sql script set up in this fashion:
DECLARE @myValue VARCHAR(30)
SET @myValue = $(MyParameter)
SELECT @myValue
Then you can get a value surrounded by double quotes into @myValue by just enclosing your parameter in single quotes:
sqlcmd -S MyDb -i myscript.sql -v MyParameter='"123"'
This works because -v is going to replace the $(MyParameter) string with the text '"123"'. The resulting script will look like this before it is executed:
DECLARE @myValue VARCHAR(30)
SET @myValue = '"123"'
SELECT @myValue
Hope that helps.
EDIT
This sample works for me (tested on SQL Server 2008, Windows Server 2K3). It inserts a record into the table variable @MyTable, and the value in the Description field is enclosed in double quotes:
MyScript.sql (no need for setvar):
DECLARE @MyTable AS TABLE([AccountTypeID] INT, [Description] VARCHAR(50))
INSERT INTO @MyTable ([AccountTypeID], [Description])
VALUES(1, $(Parameter))
SELECT * FROM @MyTable
SQLCMD:
sqlcmd -S %serverName% -E -d MSDB -i MyScript.sql -m 1 -v Parameter='"MyValue"'
If you run that script, you should get the following output, which I think is what you're looking for:
(1 rows affected)
AccountTypeID Description
------------- --------------------------------------------------
1             "MyValue"
Based on your example, you don't need to include the quotes in the variable, as they can be in the sql command, like so:
sqlcmd -S %serverName% -E -d MSDB -i MyScript.sql -m 1 -v Parameter="MyValueNoQuotes"
and
INSERT INTO [MyTable]
([AccountTypeID]
,[Description])
VALUES
(1
,"$(Parameter)")
(Though I am more accustomed to using single quotes, as in ,'$(Parameter)'.)
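A note on why the double-quoted form works at all (my understanding of the tool's defaults, not something stated in the answer above): sqlcmd connects with QUOTED_IDENTIFIER OFF unless you pass -I, so "$(Parameter)" is treated as a string literal rather than an identifier. After substitution the statement effectively becomes:
INSERT INTO [MyTable] ([AccountTypeID], [Description])
VALUES (1, "MyValueNoQuotes")
With -I (or QUOTED_IDENTIFIER ON), you would need the single-quoted form instead.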

SQL Server BCP: How to put quotes around all fields?

I have this BCP command:
'bcp DBName..vieter out c:\test003.txt -c -T /t"\",\"" -S SERVER'
The output CSV I get does not put quotes around the field values; instead it puts them around the commas! How can I get the /t"\",\"" to put quotes around all fields?
Thanks all
Setting the row terminator in addition to the field terminator should do the trick
'bcp DBName..vieter out c:\test003.txt -c -T -t"\",\"" -r"\"\n\"" -S SERVER'
This will likely work, but it will miss off the leading " for the first field of the first line, and perhaps misbehave on the last field of the last line. I'm not sure, just guessing really, no server here!
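To spell out what that terminator combination produces (derived on paper, not verified against a server): with "," between fields and "\n" terminating rows, the file comes out like
field1","field2","field3"
"field1","field2","field3"
"
so only the first field of the first line is missing its leading quote, and the final row terminator leaves a lone " on the last line that you may need to trim.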
or try using QUOTENAME to wrap text fields (you could also wrap numbers, but that isn't normally required):
'bcp "SELECT id, age, QUOTENAME(name,'\"') FROM DBName..vieter" queryout c:\test003.txt -c -T -t"," -S SERVER'
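For reference, QUOTENAME with " as the second argument wraps the value in double quotes; note it is meant for identifiers, so it only accepts input up to 128 characters and returns NULL for anything longer:
SELECT QUOTENAME('www.google.com', '"')   -- returns "www.google.com"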
You need to use CHAR(34) for the quote. This page has more details: http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=153000
Alternatively, if you are fine with a PowerShell-based script, you can try the code below, which does the quoting automatically.
Invoke-Sqlcmd -ConnectionString "Server=SERVERNAME,3180;Database=DATABASENAME;Trusted_Connection=True;" `
    -Query "SET NOCOUNT ON; SELECT * FROM TABLENAME" -MaxCharLength 700 |
    Export-Csv -NoTypeInformation -Path C:\temp\FileName.csv -Encoding UTF8
bcp "SELECT char(34) + * + char(34) FROM atable" queryout "C:\temp\out.csv" -T -c /t"\",\""
This will put quotes before and after each field (including the first and the last).
Here is the list of commands I used:
BCP "DECLARE #colnames VARCHAR(max);SELECT #colnames = COALESCE(#colnames + ',', '') + column_name from databaseName.INFORMATION_SCHEMA.COLUMNS where TABLE_NAME='tableName'; select #colnames;" queryout "C:\HeadersOnly.csv" -r"\n\"" -c -T -Uusername -Ppassword -SserverName
bcp databaseName.schema.tableName out "C:\EmployeeDatawithoutheaders.csv" -T -t"\",\"" -r"\"\n\"" -c -Uusername -Ppassword -SserverName
copy /b C:\HeadersOnly.csv+C:\EmployeeDatawithoutheaders.csv C:\EmployeeData.csv
del C:\HeadersOnly.csv
del C:\EmployeeDatawithoutheaders.csv
I guess your goal was to clearly separate field values by using a unique delimiter so that the import procedure doesn't run into issues.
I had the same issue and found this workaround useful: using an unusual field terminator, for example | or even a string like /#/, is very unlikely to collide with your string content. You can also use hex values (limited; see https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-2017).
export
bcp DB.dbo.Table out /tmp/output2.csv -c -t "/#/" -U sa -P secret -S localhost
import
bcp TargetTable in /tmp/output2.csv -t "/#/" -k -U sa -P secret -S localhost -d DBNAME -c -b 50000
The actual workable answer, which also takes care of the missing leading quote, is to:
A) generate a format file with bcp:
bcp db.schema.tabel format nul -c -x -f file.xml -t"\",\"" -r"\"\r\n" -T -k
B) edit that file: copy field 1 and insert the copy above it as field 0 (the new first field), set MAX_LENGTH="1", and strip its terminator down to just the quote (removing the separator and the other quote that field 1 had):
<FIELD ID="0" xsi:type="CharTerm" TERMINATOR="\"" MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
The trick works because you are adding a field (on the file side of the interface) that absorbs the first separator and always yields a null value, without adding a corresponding column on the row side (the interface to the query output).
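As a rough sketch of what the edited file ends up looking like (the MAX_LENGTH, COLLATION and column type values shown as ... come from whatever bcp generated; only the extra FIELD 0, which has no matching COLUMN, is hand-added):
<RECORD>
  <FIELD ID="0" xsi:type="CharTerm" TERMINATOR="\"" MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="\",\"" MAX_LENGTH="..." COLLATION="..."/>
  <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="\"\r\n" MAX_LENGTH="..." COLLATION="..."/>
</RECORD>
<ROW>
  <COLUMN SOURCE="1" NAME="col1" xsi:type="..."/>
  <COLUMN SOURCE="2" NAME="col2" xsi:type="..."/>
</ROW>
Because field 0 is not referenced in the ROW element, the leading quote it consumes is simply discarded.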
