delete from database using batch file and sql script - sql-server

BATCH FILE:
@ECHO OFF
SETLOCAL ENABLEDELAYEDEXPANSION
SET /P colu= Enter column name:
SET /P sn= Enter IDs (separated by commas):
rem Collect the entered IDs in an array
set n=0
for %%a in (%sn%) do (
set /A n+=1
set "id[!n!]=%%~a"
)
rem For example, to list the IDs:
for /L %%i in (1,1,%n%) do (
echo %%i- !id[%%i]!
)
sqlcmd -U USER -P PASSW -S SERVER -d DB -i sqlFILE.sql -o LOGS.txt -v colu="%colu%" sn="%sn%"
I want to delete several IDs from the database. This batch file asks the user what they want deleted, and it is supposed to put the entered values into an array.
These entered values are entered into this SQL script:
declare @columnName nvarchar(255)
declare @intValue int
set @columnName = '$(colu)' --COLUMN NAME WHERE VALUE IS LOCATED
set @intValue = '$(sn)' --VALUE TO BE DELETED
declare @DeleteValue varchar(10)
set @DeleteValue = convert(varchar(10), @intValue)
declare @sql nvarchar(max) = ''
select @sql = @sql + 'delete ' + object_name(c.object_id) + ' where ' + @columnName + ' IN ' + @DeleteValue + ';'
from sys.columns c
where c.name = @columnName
select @sql
exec sp_executesql @sql
The problem is that the program is still looking at the values entered as one whole value and not separate values.
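For reference, a minimal sketch of one way to make the IN clause work with a list, not the asker's exact script: keep the whole comma-separated string in a single sqlcmd scripting variable and wrap it in parentheses when the dynamic SQL is built (the column name d_id and the example values below are only illustrative):

-- Sketch only; invoked e.g. with: sqlcmd -S SERVER -d DB -i sqlFILE.sql -v colu="d_id" sn="101,102,103"
declare @columnName sysname       = '$(colu)'   -- column to filter on
declare @idList     nvarchar(max) = '$(sn)'     -- comma-separated list, e.g. '101,102,103'
declare @sql        nvarchar(max) = ''
select @sql = @sql + 'delete ' + quotename(object_name(c.object_id))
            + ' where ' + quotename(@columnName) + ' in (' + @idList + ');'
from sys.columns c
where c.name = @columnName
select @sql            -- inspect the generated statement before running it
exec sp_executesql @sql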

Related

Move file in SQL server based on the beginning of the name

I need to copy certain files from folder A to folder B, but I need to copy only files based on a condition: the name of the file must start with the same value as a variable I have in my stored procedure.
Here is what I've got at the moment:
DECLARE @SQLFile VARCHAR(1024)
DECLARE @MessageId INT = 3 --copy all the files from the source folder that start with this variable
DECLARE @SourceFolderPath VARCHAR(1024)
DECLARE @DestinationFolderPath VARCHAR(1024)
SET @DestinationFolderPath = '\\mydestination'
SET @SourceFolderPath = '\\mysource'
SET @SQLFile = ' COPY /Y ' + @SourceFolderPath + ' /B ' + @DestinationFolderPath
EXEC master..xp_cmdshell @SQLFile
With this code I copy all the files, but I don't know if there is a way to add the condition on the beginning of the file name.
Thanks
I'm guessing a simple file glob will do what you want. But, as people said in the comments, using xp_cmdshell can create all kinds of security issues if it is used where untrusted data is kicking about. For this reason it is also often blocked by SQL Server security policy.
DECLARE @SQLFile VARCHAR(1024)
DECLARE @MessageId INT = 3 --copy all the files from the source folder that start with this variable
DECLARE @SourceFolderPath VARCHAR(1024)
DECLARE @DestinationFolderPath VARCHAR(1024)
SET @DestinationFolderPath = '\\mydestination'
SET @SourceFolderPath = '\\mysource'
SET @SQLFile = ' COPY /Y ' + @SourceFolderPath + '\' + CAST(@MessageId AS varchar(max)) + '* /B ' + @DestinationFolderPath
EXEC master..xp_cmdshell @SQLFile

Command Prompt echo date returns nothing

I have a batch/SQL script pair used to back databases up. I recently went into the folder to get a database backup and I saw that it wasn't working. I checked out my batch script and it's not giving me the day or month, and the year is showing up as 'on'! My SQL script is working fine, and I want to add that my batch script used to work.
Here's my batch script
FOR /F "TOKENS=1* DELIMS= " %%A IN ('DATE/T') DO SET CDATE=%%B
FOR /F "TOKENS=1,2 eol=/ DELIMS=/ " %%A IN ('DATE/T') DO SET mm=%%B
FOR /F "TOKENS=1,2 DELIMS=/ eol=/" %%A IN ('echo %CDATE%') DO SET dd=%%B
FOR /F "TOKENS=2,3 DELIMS=/ " %%A IN ('echo %CDATE%') DO SET yyyy=%%B
SET date=%dd%-%mm%-%yyyy%
rem day month year check
echo %dd%
echo %mm%
echo %yyyy%
echo %date%
mkdir %date%
sqlcmd -S .\SQLEXPRESS -i backupQuery.sql
and here's 'backupQuery.sql'
DECLARE @BackupLocation AS NVARCHAR(255) = 'C:\SQL\backups\'
/* Set @CurrentDate to a timestamp in dd-mm-yyyy form */
DECLARE @CurrentDate AS NVARCHAR(255)
SET @CurrentDate = REPLICATE('0', 2 - LEN(CAST(DAY(GETDATE()) AS NVARCHAR))) /*day, leading zero*/
+
CAST(DAY(GETDATE()) AS NVARCHAR) /*day number*/
+ '-' +
REPLICATE('0', 2 - LEN(CAST(MONTH(GETDATE()) AS NVARCHAR))) /*month, leading zero*/
+
CAST(MONTH(GETDATE()) AS NVARCHAR) /*month number*/
+ '-' +
CAST(YEAR(GETDATE()) AS NVARCHAR) /*year*/
DECLARE @DatabaseName AS NVARCHAR(255)
/* Get all databases except for the system databases */
DECLARE Databases CURSOR FOR
SELECT name
FROM sys.databases
WHERE name
NOT IN
('tempdb',
'master',
'model',
'msdb')
OPEN Databases
FETCH NEXT FROM Databases
INTO @DatabaseName
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @Command AS NVARCHAR(MAX)
SET @Command = 'BACKUP DATABASE ' + QUOTENAME(@DatabaseName) + ' TO DISK = ''' +
@BackupLocation + @CurrentDate + '\' + @DatabaseName + '.BAK'''
EXEC(@Command)
FETCH NEXT FROM Databases
INTO @DatabaseName
END
CLOSE Databases
DEALLOCATE Databases
They're meant to back up to a directory named after the current date, but it's coming through as --on?!
@aschipfl and I managed to solve it; I believe it was breaking because I was reassigning a system variable. All I had to do was use CONVERT date style 6 in my SQL and the built-in %date% variable in cmd. That solved it and I now have a format of dd-MMM-yy.
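For reference, a minimal sketch of the SQL side of that fix, reusing the @CurrentDate variable from above: CONVERT style 6 returns dd Mon yy, so replacing the spaces with dashes gives the dd-MMM-yy folder name.

-- Sketch only: build a dd-MMM-yy folder name with CONVERT style 6
DECLARE @CurrentDate AS NVARCHAR(255)
SET @CurrentDate = REPLACE(CONVERT(NVARCHAR(9), GETDATE(), 6), ' ', '-')  -- e.g. 05-Jan-24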

SQL BCP invalid object name ##Labels

I'm having trouble with BCP. It keeps saying invalid object name ##Labels despite me creating a global temp table. What am I doing wrong, please?
The code is:
DECLARE @SQL varchar(max)
DECLARE @BatchNo varchar(50)
SET @BatchNo = 'abc123'
DECLARE @test TABLE(A varchar(max),B varchar(max),C varchar(max),D varchar(max),E varchar(max),F varchar(max),G varchar(max),H varchar(max),I varchar(max),J varchar(max))
insert into @test values ('1','2','3','4','5','6','7','8','9','10')
SELECT * INTO ##Labels FROM @test
SET @SQL = 'SELECT * FROM ##Labels'
DECLARE @TMPfile varchar(25)
DECLARE @folder varchar(128)
DECLARE @LabelDir varchar(128)
DECLARE @template varchar(25)
DECLARE @FinalFile varchar(40)
DECLARE @cmdstr varchar(300)
SET @TMPfile = @BatchNo + '.tmp'
--Trigger folder
SET @folder = '\\WIN-0H\LABELLING\XFER\'
--Print Directive Folder
SET @LabelDir = '\\WIN-0H\DIR\'
--Label Data Template
SET @template = 'cl.csv'
--Final output file
SET @FinalFile = @BatchNo + '.CHLABEL'
--Bulk Copy Query to csv temp file
SET @cmdstr = 'bcp "' + @SQL + '" QUERYout ' + @folder + @TMPfile + ' -c -t "," -T'
SELECT * FROM ##Labels
EXEC master..xp_cmdshell @cmdstr
PRINT @cmdstr
--join the label csv template to the actual data
SET @cmdstr = 'copy /Y /B ' + @LabelDir + @template + ' + ' + @folder + @TMPfile + ' ' + @folder + @FinalFile
EXEC master..xp_cmdshell @cmdstr
PRINT @cmdstr
--Remove all temporary files
SET @cmdstr = 'del ' + @folder + @TMPfile
EXEC master..xp_cmdshell @cmdstr
PRINT @cmdstr
PRINT 'Im Printing'
DROP TABLE ##Labels
The error message is:
Error = [Microsoft][SQL Server Native Client 11.0][SQL Server]Invalid
object name '##Labels'.
The BCP command-line utility runs independently of the T-SQL script, even when invoked via xp_cmdshell. It connects to the default instance on the server it runs on unless the bcp -S parameter specifies otherwise.
In this case, the global temp table was created on the named instance where the script ran. However, the BCP command connected to the default instance, and the global temp table did not exist there.
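A minimal sketch of that fix, reusing the script's own variables: append -S with the current server and instance name (taken here from @@SERVERNAME) so BCP connects to the instance that actually owns ##Labels.

--Bulk Copy Query to csv temp file, pointed at the instance that created ##Labels
SET @cmdstr = 'bcp "' + @SQL + '" QUERYout ' + @folder + @TMPfile + ' -c -t "," -T -S ' + @@SERVERNAME
EXEC master..xp_cmdshell @cmdstr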

How to use a SQL Server Parameter Input from Command Line?

I'm a SQL newbie and I'm trying to figure out how to use a parameter with SQL through the command line.
For my boss/staff I have written batch files that run SQL code for them, to export data and the like. In Access all I have to do is [Parameter] and it prompts for data to be entered.
The @State variable is what I'd like to set dynamically. I'd like the batch file to ask for the state and have the query use that information. I have no idea how to do it.
Batch File
sqlcmd -E -S ServerName -i C:\Lead$\SQL\MakeSTPhoeLists.sql
pause
The SQL File
Use LeadsDb
Go
Declare @State VarChar(2)
Set @State = 'DE'
DELETE FROM tblzExportPhone
INSERT INTO tblzExportPhone ( Phone )
SELECT tblLeads.Phone
FROM tblLeads
WHERE tblLeads.ST = @State
Declare @FileName VarChar(100)
Set @FileName = 'H:\Leads\Storage\STLists\' + @State +'StatePhoneList.csv'
DECLARE @bcp_cmd4 VARCHAR(400) = ' BCP.EXE LeadsDb..tblzExportPhone out '
SET @bcp_cmd4 = @bcp_cmd4 + 'H:\Leads\SQL\Formats\PhoneTmp.csv' + ' -T -f H:\Leads\SQL\Formats\tblzExportPhone.fmt'
SET @bcp_cmd4 = @bcp_cmd4 + ' & Copy /b H:\Leads\SQL\Formats\ExPhone.csv+H:\Leads\SQL\Formats\PhoneTmp.csv ' + @FileName + ' /y'
Set @bcp_cmd4 = @bcp_cmd4 + ' & Del H:\Leads\SQL\Formats\PhoneTmp.csv'
Thank You.
In your .sql file, use the $(statename) notation.
Add -v statename=%1 to the sqlcmd call in your command file.
Then execute it passing the parameter: mycommandfile.cmd DE
Also read this for a full example.
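A minimal sketch of how the two pieces fit together (the file name MakeSTPhoneLists.sql is illustrative): the command file runs something like sqlcmd -E -S ServerName -i MakeSTPhoneLists.sql -v statename=%1, and the script reads the variable like this:

-- MakeSTPhoneLists.sql (sketch): $(statename) is substituted by sqlcmd before execution
DECLARE @State varchar(2) = '$(statename)'
DELETE FROM tblzExportPhone
INSERT INTO tblzExportPhone (Phone)
SELECT Phone FROM tblLeads WHERE ST = @State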
Essentially what I found out is that you have to make a stored procedure and call it from the query you run. I'm going to leave an example. BTW, the SQL query creates a CSV file from the table using bcp and then zips it up with a RAR DOS command.
BTW, a stored procedure is used because every other way I tried without it failed, but once I added it I was golden. You can use the DOS prompt instead of a bat file, but I'm setting up bat files for 'semi-computer-smart people'.
Hope this helps a newbie like me out^^
BAT File (cmd prompt)
Echo phone numbers in the State to send to Paramount.
Echo Then after the file is made converts it to Zip.
Echo '
Set /p State=Enter Initials of the State? :
sqlcmd -E -S Titania -i H:\Lead$\SQL\STPhones.sql -v StateName=%State%
SQL Command #1
Use dbNameHere
go
Declare @State NVarChar(2)
EXECUTE [dbo].[STPhoneListB] @State = '$(StateName)'
SQL Command #2
Use dbNameHere
Go
CREATE PROCEDURE [dbo].[STPhoneListB]
@State NVarChar(2) = 'DE',
@Folder VarChar(100) = 'H:\Lead$'
AS
BEGIN
Declare @FileName VarChar(150)
Set @FileName = @Folder + '\STPhones_'+ @State +'.csv'
Declare @DosCMD VarChar(150) = 'Del ' + @Folder +'\STPhones'+ @State
+'.Zip /q'
EXEC master..xp_cmdshell @DosCMD
DECLARE @bcp_cmd4 VARCHAR(400)
DELETE FROM tblzExportPhone
INSERT INTO tblzExportPhone ( Phone )
SELECT tblLeads.Phone
FROM tblLeads
WHERE tblLeads.ST = @State
Set @bcp_cmd4 = 'BCP.EXE LeadsDb..tblzExportPhone out '
SET @bcp_cmd4 = @bcp_cmd4 + 'H:\SQL\PhoneTmp.csv' +
' -T -f H:\SQL\tblzExportPhone.fmt'
SET @bcp_cmd4 = @bcp_cmd4
+ ' & Copy /b H:\SQL\ExPhone.csv+H:\SQL\PhoneTmp.csv '
+ @FileName + ' /y'
Set @bcp_cmd4 = @bcp_cmd4 + ' & Del H:\SQL\PhoneTmp.csv'
EXEC master..xp_cmdshell @bcp_cmd4
Set @bcp_cmd4 = 'cd '+ @Folder +
' & "C:\Program Files\WinRAR\rar.exe" m STPhones_'+ @State +'.Zip ' +
'STPhones_'+ @State +'.csv'
EXEC master..xp_cmdshell @bcp_cmd4
DELETE FROM tblzExportPhone
END

Export table to file with column headers (column names) using the bcp utility and SQL Server 2008

I have seen a number of hacks to try to get the bcp utility to export column names along with the data. If all I am doing is dumping a table to a text file what is the most straightforward method to have bcp add the column headers?
Here's the bcp command I am currently using:
bcp myschema.dbo.myTable out myTable.csv /SmyServer01 /c /t, -T
This method automatically outputs column names with your row data using BCP.
The script writes one file for the column headers (read from INFORMATION_SCHEMA.COLUMNS table) then appends another file with the table data.
The final output is combined into TableData.csv which has the headers and row data. Just replace the environment variables at the top to specify the Server, Database and Table name.
set BCP_EXPORT_SERVER=put_my_server_name_here
set BCP_EXPORT_DB=put_my_db_name_here
set BCP_EXPORT_TABLE=put_my_table_name_here
BCP "DECLARE #colnames VARCHAR(max);SELECT #colnames = COALESCE(#colnames + ',', '') + column_name from %BCP_EXPORT_DB%.INFORMATION_SCHEMA.COLUMNS where TABLE_NAME='%BCP_EXPORT_TABLE%'; select #colnames;" queryout HeadersOnly.csv -c -T -S%BCP_EXPORT_SERVER%
BCP %BCP_EXPORT_DB%.dbo.%BCP_EXPORT_TABLE% out TableDataWithoutHeaders.csv -c -t, -T -S%BCP_EXPORT_SERVER%
set BCP_EXPORT_SERVER=
set BCP_EXPORT_DB=
set BCP_EXPORT_TABLE=
copy /b HeadersOnly.csv+TableDataWithoutHeaders.csv TableData.csv
del HeadersOnly.csv
del TableDataWithoutHeaders.csv
Note that if you need to supply credentials, replace the -T option with -U my_username -P my_password
This method has the advantage of always having the column names in sync with the table by using INFORMATION_SCHEMA.COLUMNS. The downside is that it creates temporary files. Microsoft should really fix the bcp utility to support this.
This solution uses the SQL row concatenation trick from here combined with bcp ideas from here
The easiest way is to use the queryout option and a union all that prepends the column list to the actual table content:
bcp "select 'col1', 'col2',... union all select * from myschema.dbo.myTable" queryout myTable.csv /SmyServer01 /c /t, -T
An example:
create table Question1355876
(id int, name varchar(10), someinfo numeric)
insert into Question1355876
values (1, 'a', 123.12)
, (2, 'b', 456.78)
, (3, 'c', 901.12)
, (4, 'd', 353.76)
This query will return the information with the headers as first row (note the casts of the numeric values):
select 'col1', 'col2', 'col3'
union all
select cast(id as varchar(10)), name, cast(someinfo as varchar(28))
from Question1355876
The bcp command will be:
bcp "select 'col1', 'col2', 'col3' union all select cast(id as varchar(10)), name, cast(someinfo as varchar(28)) from Question1355876" queryout myTable.csv /SmyServer01 /c /t, -T
For:
Windows, 64 bit
SQL Server (tested with SQL Server 2017 and it should work for all versions):
Option 1: Command Prompt
sqlcmd -s, -W -Q "set nocount on; select * from [DATABASE].[dbo].[TABLENAME]" | findstr /v /c:"-" /b > "c:\dirname\file.csv"
Where:
[DATABASE].[dbo].[TABLENAME] is table to write.
c:\dirname\file.csv is file to write to (surrounded in quotes to handle a path with spaces).
Output .csv file includes headers.
Note: I tend to avoid bcp: it is legacy, it predates sqlcmd by a decade, and it never seems to work without causing a whole raft of headaches.
Option 2: Within SQL Script
-- Export table [DATABASE].[dbo].[TABLENAME] to .csv file c:\dirname\file.csv
exec master..xp_cmdshell 'sqlcmd -s, -W -Q "set nocount on; select * from [DATABASE].[dbo].[TABLENAME]" | findstr /v /c:"-" /b > "c:\dirname\file.csv"'
Troubleshooting: xp_cmdshell must be enabled within SQL Server.
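For reference, a short sketch of enabling it (requires sysadmin or ALTER SETTINGS permission):

-- Enable xp_cmdshell; 'show advanced options' must be turned on first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;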
Sample Output
File: file.csv:
ID,Name,Height
1,Bob,192
2,Jane,184
3,Harry,186
Speed
As fast as theoretically possible: same speed as bcp, and many times faster than manually exporting from SSMS.
Parameter Explanation (optional - can ignore)
In sqlcmd:
-s, puts a comma between each column.
-W eliminates padding either side of the values.
set nocount on eliminates a garbage line at the end of the query.
For findstr:
All this does is remove the underline printed on the second line beneath the header, e.g. --- ----- ---- ---- ----- --.
/c:"-" /b matches any line that starts with "-".
/v inverts the match, so those lines are removed and all other lines are kept.
Importing into other programs
In Excel:
Can directly open the file in Excel.
In Python:
import pandas as pd
df_raw = pd.read_csv(r"c:\dirname\file.csv")  # raw string so the backslashes are not treated as escapes
A good alternative is SqlCmd, since it does include headers, but it has the downside of adding space padding around the data for human readability. You can combine SqlCmd with the GnuWin32 sed (stream editing) utility to cleanup the results. Here's an example that worked for me, though I can't guarantee that it's bulletproof.
First, export the data:
sqlcmd -S Server -i C:\Temp\Query.sql -o C:\Temp\Results.txt -s" "
The -s" " is a tab character in double quotes. I found that you have to run this command via a batch file, otherwise the Windows command prompt will treat the tab as an automatic completion command and will substitute a filename in place of the tab.
If Query.sql contains:
SELECT name, object_id, type_desc, create_date
FROM MSDB.sys.views
WHERE name LIKE 'sysmail%'
then you'll see something like this in Results.txt
name object_id type_desc create_date
------------------------------------------- ----------- ------------------- -----------------------
sysmail_allitems 2001442204 VIEW 2012-07-20 17:38:27.820
sysmail_sentitems 2017442261 VIEW 2012-07-20 17:38:27.837
sysmail_unsentitems 2033442318 VIEW 2012-07-20 17:38:27.850
sysmail_faileditems 2049442375 VIEW 2012-07-20 17:38:27.860
sysmail_mailattachments 2097442546 VIEW 2012-07-20 17:38:27.933
sysmail_event_log 2129442660 VIEW 2012-07-20 17:38:28.040
(6 rows affected)
Next, parse the text using sed:
sed -r "s/ +\t/\t/g" C:\Temp\Results.txt | sed -r "s/\t +/\t/g" | sed -r "s/(^ +| +$)//g" | sed 2d | sed $d | sed "/^$/d" > C:\Temp\Results_New.txt
Note that the 2d command means to delete the second line, the $d command means to delete the last line, and "/^$/d" deletes any blank lines.
The cleaned up file looks like this (though I replaced the tabs with | so they could be visualized here):
name|object_id|type_desc|create_date
sysmail_allitems|2001442204|VIEW|2012-07-20 17:38:27.820
sysmail_sentitems|2017442261|VIEW|2012-07-20 17:38:27.837
sysmail_unsentitems|2033442318|VIEW|2012-07-20 17:38:27.850
sysmail_faileditems|2049442375|VIEW|2012-07-20 17:38:27.860
sysmail_mailattachments|2097442546|VIEW|2012-07-20 17:38:27.933
sysmail_event_log|2129442660|VIEW|2012-07-20 17:38:28.040
I was trying to figure out how to do this recently, and while I like the most popular solution at the top, it simply would not work for me because I needed the column names to be the aliases I entered in the script, so I used some batch files (with some help from a colleague) to get custom header names.
The batch file that initiates the bcp has a line at the bottom that executes another script, which merges a template file containing the header names with the file that was just exported by bcp, using the code below. Hope this helps someone else in my situation.
echo Add headers from template file to exported sql files....
Echo School 0031
copy e:\genin\templates\TEMPLATE_Courses.csv + e:\genin\0031\courses0031.csv e:\genin\finished\courses0031.csv /b
I was having the same issue. I needed to export the column headers using the SQL Server bcp utility. This way I exported the table headers and the data into the same file in one go.
DECLARE @table_name VARCHAR(50) ='mytable'
DECLARE @columnHeader VARCHAR(8000)
DECLARE @raw_sql VARCHAR(8000)
SELECT @columnHeader = COALESCE(@columnHeader+',' ,'')+ ''''+column_name +'''' FROM Nal2013.INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME=@table_name
SELECT @raw_sql = 'bcp "SELECT '+ @columnHeader +' UNION ALL SELECT * FROM mytable" queryout c:\datafile.csv -c -t, -T -S '+ @@servername
EXEC xp_cmdshell @raw_sql
Happy coding :)
Here is a pretty simple stored procedure that does the trick as well...
CREATE PROCEDURE GetBCPTable
@table_name varchar(200)
AS
BEGIN
DECLARE @raw_sql nvarchar(3000)
DECLARE @columnHeader VARCHAR(8000)
SELECT @columnHeader = COALESCE(@columnHeader+',' ,'')+ ''''+column_name +'''' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @table_name
DECLARE @ColumnList VARCHAR(8000)
SELECT @ColumnList = COALESCE(@ColumnList+',' ,'')+ 'CAST('+column_name +' AS VARCHAR)' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @table_name
SELECT @raw_sql = 'SELECT '+ @columnHeader +' UNION ALL SELECT ' + @ColumnList + ' FROM ' + @table_name
--PRINT @raw_SQL
EXECUTE sp_executesql @raw_sql
END
GO
Some of the solutions here are overly complex. Here's one with 4 lines of code, no batch files, no external apps and all self-contained in the SQL server.
In this example, my table is named "MyTable" and it has two columns named Column1 and Column2. Column2 is an integer, so we need to CAST it to varchar for the export:
DECLARE @FileName varchar(100)
DECLARE @BCPCommand varchar(8000)
DECLARE @ColumnHeader varchar(8000)
SET @FileName = 'C:\Temp\OutputFile.csv'
SELECT @ColumnHeader = COALESCE(@ColumnHeader+',' ,'')+ ''''+column_name +'''' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME='MyTable'
SET @BCPCommand = 'bcp "SELECT '+ @ColumnHeader +' UNION ALL SELECT Column1, CAST(Column2 AS varchar(100)) AS Column2 FROM MyTable" queryout "' + @FileName + '" -c -t , -r \n -S . -T'
EXEC master..xp_cmdshell @BCPCommand
You could add this to a stored procedure to fully automate your .CSV file (with header row) creation.
Everyone's version does things a little differently. This is the version that I have developed over the years, and it seems to account for all of the issues I have encountered. Simply populate a data set into a table, then pass the table name to this stored procedure.
I call this stored procedure like this:
EXEC @return_value = *DB_You_Create_The_SP_In*.[dbo].[Export_CSVFile]
@DB = N'*YourDB*',
@TABLE_NAME = N'*YourTable*',
@Dir = N'*YourOutputDirectory*',
@File = N'*YourOutputFileName*'
There are also two other variables:
@NULLBLANKS -- This will take any field that doesn't have a value and
NULL it. This is useful because, in the true sense of the CSV
specification, each data point should have quotes around it. If you
have a large data set this will save a fair amount of space by
not having "" (two double quotes) in those fields. If you don't find this useful, set it to 0.
@IncludeHeader -- I have one stored procedure for outputting CSV
files, so I do have that flag in the event I don't want headers.
This will create the stored procedure:
CREATE PROCEDURE [dbo].[Export_CSVFile]
(@DB varchar(128),@TABLE_NAME varchar(128), @Dir varchar(255), @File varchar(250),@NULLBLANKS bit=1,@IncludeHeader bit=1)
AS
DECLARE @CSVHeader varchar(max)='' --CSV Header
, @CmdExc varchar(8000)='' --EXEC commands
, @SQL varchar(max)='' --SQL Statements
, @COLUMN_NAME varchar(128)='' --Column Names
, @DATA_TYPE varchar(15)='' --Data Types
DECLARE @T table (COLUMN_NAME varchar(128),DATA_TYPE varchar(15))
--BEGIN Ensure Dir variable has a backslash as the final character
IF NOT RIGHT(@Dir,1) = '\' BEGIN SET @Dir=@Dir+'\' END
--END
--BEGIN Drop TEMP Table IF Exists
SET @SQL='IF (EXISTS (SELECT * FROM '+@DB+'.INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''TEMP_'+@TABLE_NAME+''')) BEGIN EXEC(''DROP TABLE ['+@DB+'].[dbo].[TEMP_'+@TABLE_NAME+']'') END'
EXEC(@SQL)
--END
SET @SQL='SELECT COLUMN_NAME,DATA_TYPE FROM '+@DB+'.INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME ='''+@TABLE_NAME+''' ORDER BY ORDINAL_POSITION'
INSERT INTO @T
EXEC (@SQL)
SET @SQL=''
WHILE exists(SELECT * FROM @T)
BEGIN
SELECT top(1) @DATA_TYPE=DATA_TYPE,@COLUMN_NAME=COLUMN_NAME FROM @T
IF @DATA_TYPE LIKE '%char%' OR @DATA_TYPE LIKE '%text'
BEGIN
IF @NULLBLANKS = 1
BEGIN
SET @SQL+='CASE PATINDEX(''%[0-9,a-z]%'','+@COLUMN_NAME+') WHEN ''0'' THEN NULL ELSE ''"''+RTRIM(LTRIM('+@COLUMN_NAME+'))+''"'' END AS ['+@COLUMN_NAME+'],'
END
ELSE
BEGIN
SET @SQL+='''"''+RTRIM(LTRIM('+@COLUMN_NAME+'))+''"'' AS ['+@COLUMN_NAME+'],'
END
END
ELSE
BEGIN SET @SQL+=@COLUMN_NAME+',' END
SET @CSVHeader+='"'+@COLUMN_NAME+'",'
DELETE top(1) @T
END
IF LEN(@CSVHeader)>1 BEGIN SET @CSVHeader=RTRIM(LTRIM(LEFT(@CSVHeader,LEN(@CSVHeader)-1))) END
IF LEN(@SQL)>1 BEGIN SET @SQL= 'SELECT '+ LEFT(@SQL,LEN(@SQL)-1) + ' INTO ['+@DB+'].[dbo].[TEMP_'+@TABLE_NAME+'] FROM ['+@DB+'].[dbo].['+@TABLE_NAME+']' END
EXEC(@SQL)
IF @IncludeHeader=0
BEGIN
--BEGIN Create Data file
SET @CmdExc ='BCP "'+@DB+'.dbo.TEMP_'+@TABLE_NAME+'" out "'+@Dir+'Data_'+@TABLE_NAME+'.csv" /c /t, -T'
EXEC master..xp_cmdshell @CmdExc
--END
SET @CmdExc ='del '+@Dir+@File EXEC master..xp_cmdshell @CmdExc
SET @CmdExc ='ren '+@Dir+'Data_'+@TABLE_NAME+'.csv '+@File EXEC master..xp_cmdshell @CmdExc
END
else
BEGIN
--BEGIN Create Header and main file
SET @CmdExc ='echo '+@CSVHeader+'> '+@Dir+@File EXEC master..xp_cmdshell @CmdExc
--END
--BEGIN Create Data file
SET @CmdExc ='BCP "'+@DB+'.dbo.TEMP_'+@TABLE_NAME+'" out "'+@Dir+'Data_'+@TABLE_NAME+'.csv" /c /t, -T'
EXEC master..xp_cmdshell @CmdExc
--END
--BEGIN Merge Data File With Header File
SET @CmdExc = 'TYPE '+@Dir+'Data_'+@TABLE_NAME+'.csv >> '+@Dir+@File EXEC master..xp_cmdshell @CmdExc
--END
--BEGIN Delete Data File
SET @CmdExc = 'DEL /q '+@Dir+'Data_'+@TABLE_NAME+'.csv' EXEC master..xp_cmdshell @CmdExc
--END
END
--BEGIN Drop TEMP Table IF Exists
SET @SQL='IF (EXISTS (SELECT * FROM '+@DB+'.INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''TEMP_'+@TABLE_NAME+''')) BEGIN EXEC(''DROP TABLE ['+@DB+'].[dbo].[TEMP_'+@TABLE_NAME+']'') END'
EXEC(@SQL)
From all I know, BCP only exports the data - I don't think there's any way to make it export the header row with column names, too.
One common technique seen to solve this is to use a view over your actual data for export, which basically does a UNION ALL over two statements:
the first statement to give back one row with the column headers
the actual data to be exported
and then use bcp on that view, instead of your underlying data table directly.
Marc
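For illustration, a minimal sketch of that view-based approach, reusing the Question1355876 example table from above (every data column is cast to a string so the header row unions cleanly; the view and column names are only illustrative):

-- Sketch only: a view that prepends a header row via UNION ALL
CREATE VIEW dbo.vQuestion1355876_Export
AS
SELECT 1 AS sort_order, 'id' AS col1, 'name' AS col2, 'someinfo' AS col3
UNION ALL
SELECT 2, CAST(id AS varchar(10)), name, CAST(someinfo AS varchar(28))
FROM dbo.Question1355876

The bcp call would then look something like bcp "SELECT col1, col2, col3 FROM dbo.vQuestion1355876_Export ORDER BY sort_order" queryout myTable.csv /SmyServer01 /c /t, -T (the ORDER BY keeps the header row first).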
As well as the solution from marc_s, you can also use osql or sqlcmd.
Both include headers, and they can act like bcp using -Q and -o. However, they don't support format files like bcp does.
You should be able to solve this problem with one CTE view and one batch file containing the bcp code. First, create the view. Since it's relatively straightforward, I did not create a temporary table (normally I would).
CREATE VIEW [dbo].[vwxMySAMPLE_EXTRACT_COLUMNS]
AS
WITH MYBCP_CTE (COLUMN_NM, ORD_POS, TXT)
AS
( SELECT COLUMN_NAME
, ORDINAL_POSITION
, CAST(COLUMN_NAME AS VARCHAR(MAX))
FROM [INFORMATION_SCHEMA].[COLUMNS]
WHERE TABLE_NAME = 'xMySAMPLE_EXTRACT_NEW'
AND ORDINAL_POSITION = 1
UNION ALL
SELECT V.COLUMN_NAME
, V.ORDINAL_POSITION
, CAST(C.TXT + '|' + V.COLUMN_NAME AS VARCHAR(MAX))
FROM [INFORMATION_SCHEMA].[COLUMNS] V INNER JOIN MYBCP_CTE C
ON V.ORDINAL_POSITION = C.ORD_POS+1
AND V.ORDINAL_POSITION > 1
WHERE TABLE_NAME = 'xMySAMPLE_EXTRACT_NEW'
)
SELECT CC.TXT
FROM MYBCP_CTE CC INNER JOIN ( SELECT MAX(ORD_POS) AS MX_CNT
FROM MYBCP_CTE C
) SC
ON CC.ORD_POS = SC.MX_CNT
Now, create the batch file. I created this in my Temp directory, but I'm lazy.
cd\
CD "C:\Program Files\Microsoft SQL Server\110\Tools\Binn"
set buildhour=%time: =0%
set buildDate=%DATE:~4,10%
set backupfiledate=%buildDate:~6,4%%buildDate:~0,2%%buildDate:~3,2%%time:~0,2%%time:~3,2%%time:~6,2%
echo %backupfiledate%
pause
The above code just creates a date to append to the end of your file names. Next come the two bcp statements: the first selects from the view over the recursive CTE to concatenate the column names, and the second exports the data.
bcp "SELECT * FROM [dbo].[vwxMYSAMPLE_EXTRACT_COLUMNS] OPTION (MAXRECURSION 300)" queryout C:\Temp\Col_NM%backupfiledate%.txt -c -t"|" -S MYSERVERTOLOGINTO -T -q
bcp "SELECT * FROM [myDBName].[dbo].[vwxMYSAMPLE_EXTRACT_NEW] " queryout C:\Temp\3316_PHYSDATA_ALL%backupfiledate%.txt -c -t"|" -S MYSERVERTOLOGINTO -T -q
Now merge them together using the copy command:
copy C:\Temp\Col_NM%backupfiledate%.txt + C:\Temp\3316_PHYSDATA_ALL%backupfiledate%.txt C:\Temp\3316_PHYSDATA_ALL%backupfiledate%.csv
All set
I got a version based on what I saw previously. It helped me a lot to create export files as CSV or TXT. I'm storing a table into a ##Temp table:
IF OBJECT_ID('tempdb..##TmpExportFile') IS NOT NULL
DROP TABLE ##TmpExportFile;
DECLARE @columnHeader VARCHAR(8000)
DECLARE @raw_sql nvarchar(3000)
SELECT
* INTO ##TmpExportFile
------ FROM THE TABLE RESULTS YOU WANT TO GET
------ COULD BE A TABLE OR A TEMP TABLE BASED ON INNER JOINS
------ ,ETC.
FROM TableName -- optional WHERE ....
DECLARE @table_name VARCHAR(50) = '##TmpExportFile'
SELECT
@columnHeader = COALESCE(@columnHeader + ',', '') + '''[' + c.name + ']''' + ' as ' + '' + c.name + ''
FROM tempdb.sys.columns c
INNER JOIN tempdb.sys.tables t
ON c.object_id = t.object_id
WHERE t.NAME = @table_name
print @columnHeader
DECLARE @ColumnList VARCHAR(max)
SELECT
@ColumnList = COALESCE(@ColumnList + ',', '') + 'CAST([' + c.name + '] AS CHAR(' + LTRIM(STR(max_length)) + '))'
FROM tempdb.sys.columns c
INNER JOIN tempdb.sys.tables t
ON c.object_id = t.object_id
WHERE t.name = @table_name
print @ColumnList
--- CSV FORMAT
SELECT
@raw_sql = 'bcp "SELECT ' + @columnHeader + ' UNION all SELECT ' + @ColumnList + ' FROM ' + @table_name + ' " queryout \\networkdrive\networkfolder\datafile.csv -c -t, /S' + ' SQLserverName /T'
--PRINT @raw_sql
EXEC xp_cmdshell @raw_sql
--- TXT FORMAT
SET @raw_sql = 'bcp "SELECT ' + @columnHeader + ' UNION all SELECT ' + @ColumnList + ' FROM ' + @table_name + ' " queryout \\networkdrive\networkfolder\MISD\datafile.txt /c /S'+ ' SQLserverName /T'
EXEC xp_cmdshell @raw_sql
DROP TABLE ##TmpExportFile
The latest version of sqlcmd adds the -W option to remove extra space after the field value; however, it does not put quotes around strings, which can be a problem with CSV files when importing a field value that contains a comma.
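One way to work around the missing quotes is to add them in the SELECT itself; a sketch with hypothetical table and column names:

-- Quote text columns so embedded commas survive comma-separated sqlcmd output
SELECT '"' + REPLACE(CustomerName, '"', '""') + '"' AS CustomerName,  -- escape embedded quotes by doubling them
       CAST(OrderTotal AS varchar(20)) AS OrderTotal
FROM dbo.Orders;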
DECLARE @table_name varchar(max)='tableName'--which needs to be exported
DECLARE @fileName varchar(max)='file Name'--What would be file name
DECLARE @query varchar(8000)
DECLARE @columnHeader VARCHAR(max)
SELECT @columnHeader = COALESCE(@columnHeader+',' ,'')+ ''''
+column_name +''''
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @table_name
DECLARE @ColumnList VARCHAR(max)
SELECT @ColumnList = COALESCE(@ColumnList+',' ,'')
+ 'CAST('+column_name +' AS VARCHAR)' +column_name
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @table_name
DECLARE @tempRaw_sql nvarchar(max)
SELECT @tempRaw_sql = 'SELECT '
+ @ColumnList + ' into ##temp11 FROM '
+ @table_name
PRINT @tempRaw_sql
EXECUTE sp_executesql @tempRaw_sql
DECLARE @raw_sql nvarchar(max)
SELECT @raw_sql = 'SELECT '+ @columnHeader
+' UNION ALL SELECT * FROM ##temp11'
PRINT @raw_sql
SET @query='bcp "'+@raw_sql+'" queryout "C:\data\'+@fileName
+'.txt" -T -c -t,'
EXEC xp_cmdshell @query
DROP TABLE ##temp11
Please find below another way to do the same thing.
This procedure also takes in a schema name as a parameter in case you need it to access your table.
CREATE PROCEDURE Export_Data_NBA
@TableName nchar(50),
@TableSchema nvarchar(50) = ''
AS
DECLARE @TableToBeExported as nvarchar(50);
DECLARE @OUTPUT TABLE (col1 nvarchar(max));
DECLARE @colnamestable VARCHAR(max);
select @colnamestable = COALESCE(@colnamestable, '')
+COLUMN_NAME+ ','
from INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME = @TableName
order BY ORDINAL_POSITION
SELECT @colnamestable = LEFT(@colnamestable,DATALENGTH(@colnamestable)-1)
INSERT INTO @OUTPUT
select @colnamestable
DECLARE @selectstatement VARCHAR(max);
select @selectstatement = COALESCE(@selectstatement, '')
+ 'Convert(nvarchar(100),'+COLUMN_NAME+')+'',''+'
from INFORMATION_SCHEMA.COLUMNS where TABLE_NAME = @TableName
order BY ORDINAL_POSITION
SELECT @selectstatement = LEFT(@selectstatement,DATALENGTH(@selectstatement)-1)
DECLARE @sqlstatment as nvarchar(max);
SET @TableToBeExported = @TableSchema+'.'+@TableName
SELECT @sqlstatment = N'Select '+@selectstatement+N'
from '+@TableToBeExported
INSERT INTO @OUTPUT
exec sp_executesql @stmt = @sqlstatment
SELECT * from #OUTPUT
I have successfully achieved this with the below code.
Put the below code in a new SQL Server query window and try it:
CREATE TABLE tempDBTableDetails ( TableName VARCHAR(500), [RowCount] VARCHAR(500), TotalSpaceKB VARCHAR(500),
UsedSpaceKB VARCHAR(500), UnusedSpaceKB VARCHAR(500) )
-- STEP 1 ::
DECLARE @cmd VARCHAR(4000)
INSERT INTO tempDBTableDetails
SELECT 'TableName', 'RowCount', 'TotalSpaceKB', 'UsedSpaceKB', 'UnusedSpaceKB'
INSERT INTO tempDBTableDetails
SELECT
S.name +'.'+ T.name as TableName,
Convert(varchar,Cast(SUM(P.rows) as Money),1) as [RowCount],
Convert(varchar,Cast(SUM(a.total_pages) * 8 as Money),1) AS TotalSpaceKB,
Convert(varchar,Cast(SUM(a.used_pages) * 8 as Money),1) AS UsedSpaceKB,
(SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB
FROM sys.tables T
INNER JOIN sys.partitions P ON P.OBJECT_ID = T.OBJECT_ID
INNER JOIN sys.schemas S ON T.schema_id = S.schema_id
INNER JOIN sys.allocation_units A ON p.partition_id = a.container_id
WHERE T.is_ms_shipped = 0 AND P.index_id IN (1,0)
GROUP BY S.name, T.name
ORDER BY SUM(P.rows) DESC
-- SELECT * FROM [FIINFRA-DB-SIT].dbo.tempDBTableDetails ORDER BY LEN([RowCount]) DESC
SET #cmd = 'bcp "SELECT * FROM [FIINFRA-DB-SIT].dbo.tempDBTableDetails ORDER BY LEN([RowCount]) DESC" queryout "D:\Milind\export.xls" -U sa -P dbowner -c'
Exec xp_cmdshell #cmd
--DECLARE #HeaderCmd VARCHAR(4000)
--SET #HeaderCmd = 'SELECT ''TableName'', ''RowCount'', ''TotalSpaceKB'', ''UsedSpaceKB'', ''UnusedSpaceKB'''
exec master..xp_cmdshell 'BCP "SELECT ''TableName'', ''RowCount'', ''TotalSpaceKB'', ''UsedSpaceKB'', ''UnusedSpaceKB''" queryout "d:\milind\header.xls" -U sa -P dbowner -c'
exec master..xp_cmdshell 'copy /b "d:\Milind\header.xls"+"d:\Milind\export.xls" "d:/Milind/result.xls"'
exec master..xp_cmdshell 'del "d:\Milind\header.xls"'
exec master..xp_cmdshell 'del "d:\Milind\export.xls"'
DROP TABLE tempDBTableDetails
With a little PowerShell script:
sqlcmd -Q "set nocount on select top 0 * from [DB].[schema].[table]" -o c:\temp\header.txt
bcp [DB].[schema].[table] out c:\temp\query.txt -c -T -S BRIZA
Get-Content c:\temp\*.txt | Set-Content c:\temp\result.txt
Remove-Item c:\temp\header.txt
Remove-Item c:\temp\query.txt
Warning: Get-Content concatenates the .txt files in alphabetical order of their names, which is why header.txt comes before query.txt here.
Just used this for a DB Migration Activity.
It helped a great deal, given that it's a single line.
I simply put this in SQL Server Management Studio:
SELECT 'sqlcmd -s, -W -Q "set nocount on; select * from [dbname].[dbo].['+ st.NAME + ']" | findstr /v /c:"-" /b >' + st.NAME + '.csv' FROM sys.tables st
Copied the resultant set into a .bat file and I can now export the entire DB with each table into a CSV.
