I'm trying to export to XML format using BCP. The XML file is generated, but its content seems wrong. Can anyone help, please?
When I try to open the XML in a browser I get the following error message:
This page contains the following errors: error on line 1 at column 62:
Extra content at the end of the document
The SQL select that I'm using is:
DECLARE @fileName VARCHAR(50)
DECLARE @sqlStr VARCHAR(1000)
DECLARE @sqlCmd VARCHAR(1000)
SET @fileName = 'c:\fund_lib\test.xml'
USE PORT_APP_SQL
DROP TABLE ##temp;
WITH cte1
AS (SELECT LTRIM(RTRIM(codigo)) AS code,
CONVERT(VARCHAR(10), fecha, 120) AS date,
precio AS NAV
FROM mpr_price_history
WHERE codigo IN( 'LU0038743380', 'LU0086913042', 'LU0265291665', 'LU0098860363',
'LU0128525689', 'LU0121204944', 'CZ0008474780', 'LU0363630376',
'LU0372180066', 'LU0271663857', 'LU0271663774', 'LU0363630707', 'LU0313643024' ))
SELECT *
INTO ##temp
FROM cte1
SET @sqlStr = 'select * from ##temp order by code, date desc FOR XML RAW (''code'');'
-- Save XML records to a file:
SET @sqlCmd = 'bcp "' + @sqlStr + '" queryout ' + @fileName
+ ' -S "MPR01\SQLEXPRESS" -T -w'
EXEC xp_cmdshell @sqlCmd
And this is the error message I get if I open it in Firefox (sorry, it is in Spanish).
The error seems to be reported at the end of each line.
I think that your whole query can be simplified... No need for a CTE or a temp table...
The solution for your problem is - probably! - the missing root node, as mentioned by @TT.. If adding a root node solves your problem, please do not accept my answer (although you might vote it up, if you like it :-) ).
Your problem might also be related to a mix of encodings. If your output includes special characters, there can be problems when you mix an 8-bit encoding (VARCHAR) with the -w output option. Therefore I put this all in NVARCHAR(MAX).
My suggestion to get things slim:
USE PORT_APP_SQL;
DECLARE @fileName NVARCHAR(50) = 'c:\fund_lib\test.xml';
DECLARE @cmd NVARCHAR(MAX);
SET @cmd =
N'SELECT LTRIM(RTRIM(codigo)) AS code
,CONVERT(VARCHAR(10), fecha, 120) AS [date]
,precio AS NAV
FROM mpr_price_history
WHERE codigo IN( ''LU0038743380'', ''LU0086913042'', ''LU0265291665'', ''LU0098860363'',
''LU0128525689'', ''LU0121204944'', ''CZ0008474780'', ''LU0363630376'',
''LU0372180066'', ''LU0271663857'', ''LU0271663774'', ''LU0363630707'', ''LU0313643024'' )
ORDER BY code,[date] DESC
FOR XML RAW(''code''),ROOT(''root'');'
-- Save XML records to a file:
SET @cmd = N'bcp "' + @cmd + N'" queryout ' + @fileName
+ N' -S "MPR01\SQLEXPRESS" -T -w'
EXEC xp_cmdshell @cmd;
The reason is that the XML doesn't have a root element. This example, based on your script, should produce XML that the browser doesn't complain about:
DECLARE @fileName VARCHAR(50);
DECLARE @sqlStr VARCHAR(1000);
DECLARE @sqlCmd VARCHAR(1000);
SET @fileName = 'c:\temp\test.xml';
SELECT *
INTO ##temp
FROM (VALUES('LU0038743380',GETDATE(),1),
('LU0086913042',GETDATE(),2),
('LU0265291665',GETDATE(),3),
('LU0098860363',GETDATE(),4)) AS cte1(fecha,[date],nav);
SET @sqlStr = 'select (select * from ##temp FOR XML RAW(''code''),TYPE) FOR XML PATH(''data'');'
-- Save XML records to a file:
SET @sqlCmd = 'bcp "' + @sqlStr + '" queryout ' + @fileName
+ ' -S '+@@SERVERNAME+' -T -w';
EXEC xp_cmdshell @sqlCmd;
DROP TABLE ##temp;
I have run your code with small alterations and got no errors.
It seems the issue is with your data.
Look for special XML characters.
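A minimal sketch of such a check, assuming the text you export lives in the codigo column of mpr_price_history (adjust to your actual columns):
-- rows whose code contains characters that are special in XML (&, <, >, ", ')
SELECT codigo, fecha, precio
FROM mpr_price_history
WHERE codigo LIKE '%[&<>"'']%'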
I am stuck at a problem for which I cannot find any reason or solution.
I am running a SQL script to export some data to an Excel sheet. There is an application running on the other end which reads and processes the Excel sheet.
Problem: The column headers are being displayed at the bottom and the application is expecting them to be on the top row. I cannot change the functioning of the application.
This was working fine in SQL 2005, but we recently updated to SQL 2012 and this started happening.
I have not found anything over the internet to solve this issue.
This is the SQL script that I am executing
SELECT
@columnNames = COALESCE( @columnNames + ',', '') + '['+ column_name + ']',
@columnConvert = COALESCE( @columnConvert + ',', '') + 'convert(nvarchar(4000),'
+ '['+ column_name + ']' +
case
when data_type in ('datetime', 'smalldatetime') then ',121'
when data_type in ('numeric', 'decimal') then ',128'
when data_type in ('float', 'real', 'money', 'smallmoney') then ',2'
when data_type in ('datetime', 'smalldatetime') then ',120'
else ''
end + ') as ' + '['+ column_name + ']'
FROM tempdb.INFORMATION_SCHEMA.Columns
WHERE table_name = '##TempExportData'
-- execute select query to insert data and column names into new temp table
SELECT @sql = 'select ' + @columnNames + ' into ##TempExportData2 from (select ' + @columnConvert + ', ''2'' as [temp##SortID] from ##TempExportData union all select ''' + replace(replace(replace(@columnNames, ',', ''', '''),'[',''),']','') + ''', ''1'') t order by [temp##SortID]'
exec (@sql)
-- build full BCP query
DECLARE @bcpCommand VARCHAR(8000)
SET @bcpCommand = 'bcp " SELECT * from ##TempExportData2" queryout'
SET @bcpCommand = @bcpCommand + ' ' + @fullFileName + ' -T -w -S' + @serverInstance
EXEC master..xp_cmdshell @bcpCommand
where ##TempExportData2 holds the data along with the column headers.
I think I understand the problem: You are using the order by in the select into instead of in the final select statement.
You should know that data inside tables is considered unordered, and SQL Server (like any other RDBMS I know of, actually) does not guarantee the order of rows returned if the select statement does not contain an ORDER BY clause.
Therefore, you should add the [temp##SortID] column to your ##TempExportData2 table and use it to sort the last select statement:
SET @bcpCommand = 'bcp " SELECT * from ##TempExportData2 ORDER BY [temp##SortID]" queryout'
Since you don't need that column in the output, you might want to specify the column names explicitly in that select statement. However, if it doesn't cause any harm to the application that reads the Excel file or to the data it produces, I would suggest keeping the select * to make the query more readable.
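If you do decide to leave the sort column out, the command might look like this (col1 and col2 being placeholders for your actual column names):
SET @bcpCommand = 'bcp " SELECT col1, col2 FROM ##TempExportData2 ORDER BY [temp##SortID]" queryout'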
I have seen a number of hacks to try to get the bcp utility to export column names along with the data. If all I am doing is dumping a table to a text file what is the most straightforward method to have bcp add the column headers?
Here's the bcp command I am currently using:
bcp myschema.dbo.myTable out myTable.csv /SmyServer01 /c /t, -T
This method automatically outputs column names with your row data using BCP.
The script writes one file for the column headers (read from INFORMATION_SCHEMA.COLUMNS table) then appends another file with the table data.
The final output is combined into TableData.csv which has the headers and row data. Just replace the environment variables at the top to specify the Server, Database and Table name.
set BCP_EXPORT_SERVER=put_my_server_name_here
set BCP_EXPORT_DB=put_my_db_name_here
set BCP_EXPORT_TABLE=put_my_table_name_here
BCP "DECLARE #colnames VARCHAR(max);SELECT #colnames = COALESCE(#colnames + ',', '') + column_name from %BCP_EXPORT_DB%.INFORMATION_SCHEMA.COLUMNS where TABLE_NAME='%BCP_EXPORT_TABLE%'; select #colnames;" queryout HeadersOnly.csv -c -T -S%BCP_EXPORT_SERVER%
BCP %BCP_EXPORT_DB%.dbo.%BCP_EXPORT_TABLE% out TableDataWithoutHeaders.csv -c -t, -T -S%BCP_EXPORT_SERVER%
set BCP_EXPORT_SERVER=
set BCP_EXPORT_DB=
set BCP_EXPORT_TABLE=
copy /b HeadersOnly.csv+TableDataWithoutHeaders.csv TableData.csv
del HeadersOnly.csv
del TableDataWithoutHeaders.csv
Note that if you need to supply credentials, replace the -T option with -U my_username -P my_password
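For example, the data-export line would then look something like this (my_username and my_password being placeholders):
BCP %BCP_EXPORT_DB%.dbo.%BCP_EXPORT_TABLE% out TableDataWithoutHeaders.csv -c -t, -U my_username -P my_password -S%BCP_EXPORT_SERVER%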
This method has the advantage of always having the column names in sync with the table by using INFORMATION_SCHEMA.COLUMNS. The downside is that it creates temporary files. Microsoft should really fix the bcp utility to support this.
This solution uses the SQL row concatenation trick from here combined with bcp ideas from here
The easiest is to use the queryout option and use union all to link a column list with the actual table content
bcp "select 'col1', 'col2',... union all select * from myschema.dbo.myTableout" queryout myTable.csv /SmyServer01 /c /t, -T
An example:
create table Question1355876
(id int, name varchar(10), someinfo numeric)
insert into Question1355876
values (1, 'a', 123.12)
, (2, 'b', 456.78)
, (3, 'c', 901.12)
, (4, 'd', 353.76)
This query will return the information with the headers as first row (note the casts of the numeric values):
select 'col1', 'col2', 'col3'
union all
select cast(id as varchar(10)), name, cast(someinfo as varchar(28))
from Question1355876
The bcp command will be:
bcp "select 'col1', 'col2', 'col3' union all select cast(id as varchar(10)), name, cast(someinfo as varchar(28)) from Question1355876" queryout myTable.csv /SmyServer01 /c /t, -T
For:
Windows, 64 bit
SQL Server (tested with SQL Server 2017 and it should work for all versions):
Option 1: Command Prompt
sqlcmd -s, -W -Q "set nocount on; select * from [DATABASE].[dbo].[TABLENAME]" | findstr /v /c:"-" /b > "c:\dirname\file.csv"
Where:
[DATABASE].[dbo].[TABLENAME] is table to write.
c:\dirname\file.csv is file to write to (surrounded in quotes to handle a path with spaces).
Output .csv file includes headers.
Note: I tend to avoid bcp: it is legacy, it predates sqlcmd by a decade, and it never seems to work without causing a whole raft of headaches.
Option 2: Within SQL Script
-- Export table [DATABASE].[dbo].[TABLENAME] to .csv file c:\dirname\file.csv
exec master..xp_cmdshell 'sqlcmd -s, -W -Q "set nocount on; select * from [DATABASE].[dbo].[TABLENAME]" | findstr /v /c:"-" /b > "c:\dirname\file.csv"'
Troubleshooting: xp_cmdshell must be enabled within MSSQL.
Sample Output
File: file.csv:
ID,Name,Height
1,Bob,192
2,Jane,184
3,Harry,186
Speed
As fast as theoretically possible: same speed as bcp, and many times faster than manually exporting from SSMS.
Parameter Explanation (optional - can ignore)
In sqlcmd:
-s, puts a comma between each column.
-W eliminates padding either side of the values.
set nocount on eliminates a garbage line at the end of the query.
For findstr:
All this does is remove the row of dashes underneath the header, e.g. --- ----- ---- ---- ----- --.
/c:"-" /b matches any line that starts with "-".
/v returns all the lines that do not match, i.e. everything except that underline row.
Importing into other programs
In Excel:
Can directly open the file in Excel.
In Python:
import pandas as pd
df_raw = pd.read_csv(r"c:\dirname\file.csv")  # raw string keeps the backslashes literal
A good alternative is SqlCmd, since it does include headers, but it has the downside of adding space padding around the data for human readability. You can combine SqlCmd with the GnuWin32 sed (stream editing) utility to cleanup the results. Here's an example that worked for me, though I can't guarantee that it's bulletproof.
First, export the data:
sqlcmd -S Server -i C:\Temp\Query.sql -o C:\Temp\Results.txt -s" "
The -s" " is a tab character in double quotes. I found that you have to run this command via a batch file, otherwise the Windows command prompt will treat the tab as an automatic completion command and will substitute a filename in place of the tab.
If Query.sql contains:
SELECT name, object_id, type_desc, create_date
FROM MSDB.sys.views
WHERE name LIKE 'sysmail%'
then you'll see something like this in Results.txt
name object_id type_desc create_date
------------------------------------------- ----------- ------------------- -----------------------
sysmail_allitems 2001442204 VIEW 2012-07-20 17:38:27.820
sysmail_sentitems 2017442261 VIEW 2012-07-20 17:38:27.837
sysmail_unsentitems 2033442318 VIEW 2012-07-20 17:38:27.850
sysmail_faileditems 2049442375 VIEW 2012-07-20 17:38:27.860
sysmail_mailattachments 2097442546 VIEW 2012-07-20 17:38:27.933
sysmail_event_log 2129442660 VIEW 2012-07-20 17:38:28.040
(6 rows affected)
Next, parse the text using sed:
sed -r "s/ +\t/\t/g" C:\Temp\Results.txt | sed -r "s/\t +/\t/g" | sed -r "s/(^ +| +$)//g" | sed 2d | sed $d | sed "/^$/d" > C:\Temp\Results_New.txt
Note that the 2d command means to delete the second line, the $d command means to delete the last line, and "/^$/d" deletes any blank lines.
The cleaned up file looks like this (though I replaced the tabs with | so they could be visualized here):
name|object_id|type_desc|create_date
sysmail_allitems|2001442204|VIEW|2012-07-20 17:38:27.820
sysmail_sentitems|2017442261|VIEW|2012-07-20 17:38:27.837
sysmail_unsentitems|2033442318|VIEW|2012-07-20 17:38:27.850
sysmail_faileditems|2049442375|VIEW|2012-07-20 17:38:27.860
sysmail_mailattachments|2097442546|VIEW|2012-07-20 17:38:27.933
sysmail_event_log|2129442660|VIEW|2012-07-20 17:38:28.040
I was trying to figure out how to do this recently. While I like the most popular solution at the top, it simply would not work for me, as I needed the column names to be the aliases that I entered in the script, so I used some batch files (with some help from a colleague) to get custom header names.
The batch file that initiates the bcp export has a line at the bottom that executes another script, which merges a template file containing the header names with the file that was just exported by bcp, using the code below. Hope this helps someone else in my situation.
echo Add headers from template file to exported sql files....
Echo School 0031
copy e:\genin\templates\TEMPLATE_Courses.csv + e:\genin\0031\courses0031.csv e:\genin\finished\courses0031.csv /b
I was having the same issue. I needed to export the column headers using the SQL Server bcp utility. This way I exported the table "headers" together with the data into the same file in one go.
DECLARE @table_name VARCHAR(50) ='mytable'
DECLARE @columnHeader VARCHAR(8000)
DECLARE @raw_sql VARCHAR(8000)
SELECT @columnHeader = COALESCE(@columnHeader+',' ,'')+ ''''+column_name +'''' FROM Nal2013.INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME=@table_name
SELECT @raw_sql = 'bcp "SELECT '+ @columnHeader +' UNION ALL SELECT * FROM mytable" queryout c:\datafile.csv -c -t, -T -S '+ @@servername
EXEC xp_cmdshell @raw_sql
Happy coding :)
Here is a pretty simple stored procedure that does the trick as well...
CREATE PROCEDURE GetBCPTable
@table_name varchar(200)
AS
BEGIN
DECLARE @raw_sql nvarchar(3000)
DECLARE @columnHeader VARCHAR(8000)
SELECT @columnHeader = COALESCE(@columnHeader+',' ,'')+ ''''+column_name +'''' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @table_name
DECLARE @ColumnList VARCHAR(8000)
SELECT @ColumnList = COALESCE(@ColumnList+',' ,'')+ 'CAST('+column_name +' AS VARCHAR)' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @table_name
SELECT @raw_sql = 'SELECT '+ @columnHeader +' UNION ALL SELECT ' + @ColumnList + ' FROM ' + @table_name
--PRINT @raw_SQL
EXECUTE sp_executesql @raw_sql
END
GO
Some of the solutions here are overly complex. Here's one with 4 lines of code, no batch files, no external apps and all self-contained in the SQL server.
In this example, my table is named "MyTable" and it has two columns named Column1 and Column2. Column2 is an integer, so we need to CAST it to varchar for the export:
DECLARE @FileName varchar(100)
DECLARE @BCPCommand varchar(8000)
DECLARE @ColumnHeader varchar(8000)
SET @FileName = 'C:\Temp\OutputFile.csv'
SELECT @ColumnHeader = COALESCE(@ColumnHeader+',' ,'')+ ''''+column_name +'''' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME='MyTable'
SET @BCPCommand = 'bcp "SELECT '+ @ColumnHeader +' UNION ALL SELECT Column1, CAST(Column2 AS varchar(100)) AS Column2 FROM MyTable" queryout "' + @FileName + '" -c -t , -r \n -S . -T'
EXEC master..xp_cmdshell @BCPCommand
You could add this to a stored procedure to fully automate your .CSV file (with header row) creation.
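A rough sketch of such a wrapper, reusing the same MyTable/Column1/Column2 example (the procedure name and the @FileName parameter are just illustrative):
CREATE PROCEDURE ExportMyTableToCsv
    @FileName varchar(100)
AS
BEGIN
    DECLARE @BCPCommand varchar(8000)
    DECLARE @ColumnHeader varchar(8000)
    -- build the quoted header row from the column names
    SELECT @ColumnHeader = COALESCE(@ColumnHeader+',' ,'')+ ''''+column_name +'''' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME='MyTable'
    -- header row UNION ALL data, exported via bcp
    SET @BCPCommand = 'bcp "SELECT '+ @ColumnHeader +' UNION ALL SELECT Column1, CAST(Column2 AS varchar(100)) AS Column2 FROM MyTable" queryout "' + @FileName + '" -c -t , -r \n -S . -T'
    EXEC master..xp_cmdshell @BCPCommand
END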
Everyone's version does things a little differently. This is the version that I have developed over the years, and it seems to account for all of the issues I have encountered. Simply populate a data set into a table, then pass the table name to this stored procedure.
I call this stored procedure like this:
EXEC @return_value = *DB_You_Create_The_SP_In*.[dbo].[Export_CSVFile]
@DB = N'*YourDB*',
@TABLE_NAME = N'*YourTable*',
@Dir = N'*YourOutputDirectory*',
@File = N'*YourOutputFileName*'
There are also two other variables:
@NullBlanks -- This will take any field that doesn't have a value and
null it. This is useful because in the true sense of the CSV
specification each data point should have quotes around them. If you
have a large data set this will save you a fair amount of space by
not having "" (two double quotes) in those fields. If you don't find this useful then set it to 0.
@IncludeHeaders -- I have one stored procedure for outputting CSV
files, so I do have that flag in the event I don't want headers.
This will create the stored procedure:
CREATE PROCEDURE [dbo].[Export_CSVFile]
(@DB varchar(128),@TABLE_NAME varchar(128), @Dir varchar(255), @File varchar(250),@NULLBLANKS bit=1,@IncludeHeader bit=1)
AS
DECLARE @CSVHeader varchar(max)='' --CSV Header
, @CmdExc varchar(8000)='' --EXEC commands
, @SQL varchar(max)='' --SQL Statements
, @COLUMN_NAME varchar(128)='' --Column Names
, @DATA_TYPE varchar(15)='' --Data Types
DECLARE @T table (COLUMN_NAME varchar(128),DATA_TYPE varchar(15))
--BEGIN Ensure Dir variable has a backslash as the final character
IF NOT RIGHT(@Dir,1) = '\' BEGIN SET @Dir=@Dir+'\' END
--END
--BEGIN Drop TEMP Table IF Exists
SET @SQL='IF (EXISTS (SELECT * FROM '+@DB+'.INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''TEMP_'+@TABLE_NAME+''')) BEGIN EXEC(''DROP TABLE ['+@DB+'].[dbo].[TEMP_'+@TABLE_NAME+']'') END'
EXEC(@SQL)
--END
SET @SQL='SELECT COLUMN_NAME,DATA_TYPE FROM '+@DB+'.INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME ='''+@TABLE_NAME+''' ORDER BY ORDINAL_POSITION'
INSERT INTO @T
EXEC (@SQL)
SET @SQL=''
WHILE exists(SELECT * FROM @T)
BEGIN
SELECT top(1) @DATA_TYPE=DATA_TYPE,@COLUMN_NAME=COLUMN_NAME FROM @T
IF @DATA_TYPE LIKE '%char%' OR @DATA_TYPE LIKE '%text'
BEGIN
IF @NULLBLANKS = 1
BEGIN
SET @SQL+='CASE PATINDEX(''%[0-9,a-z]%'','+@COLUMN_NAME+') WHEN ''0'' THEN NULL ELSE ''"''+RTRIM(LTRIM('+@COLUMN_NAME+'))+''"'' END AS ['+@COLUMN_NAME+'],'
END
ELSE
BEGIN
SET @SQL+='''"''+RTRIM(LTRIM('+@COLUMN_NAME+'))+''"'' AS ['+@COLUMN_NAME+'],'
END
END
ELSE
BEGIN SET @SQL+=@COLUMN_NAME+',' END
SET @CSVHeader+='"'+@COLUMN_NAME+'",'
DELETE top(1) @T
END
IF LEN(@CSVHeader)>1 BEGIN SET @CSVHeader=RTRIM(LTRIM(LEFT(@CSVHeader,LEN(@CSVHeader)-1))) END
IF LEN(@SQL)>1 BEGIN SET @SQL= 'SELECT '+ LEFT(@SQL,LEN(@SQL)-1) + ' INTO ['+@DB+'].[dbo].[TEMP_'+@TABLE_NAME+'] FROM ['+@DB+'].[dbo].['+@TABLE_NAME+']' END
EXEC(@SQL)
IF @IncludeHeader=0
BEGIN
--BEGIN Create Data file
SET @CmdExc ='BCP "'+@DB+'.dbo.TEMP_'+@TABLE_NAME+'" out "'+@Dir+'Data_'+@TABLE_NAME+'.csv" /c /t, -T'
EXEC master..xp_cmdshell @CmdExc
--END
SET @CmdExc ='del '+@Dir+@File EXEC master..xp_cmdshell @CmdExc
SET @CmdExc ='ren '+@Dir+'Data_'+@TABLE_NAME+'.csv '+@File EXEC master..xp_cmdshell @CmdExc
END
else
BEGIN
--BEGIN Create Header and main file
SET @CmdExc ='echo '+@CSVHeader+'> '+@Dir+@File EXEC master..xp_cmdshell @CmdExc
--END
--BEGIN Create Data file
SET @CmdExc ='BCP "'+@DB+'.dbo.TEMP_'+@TABLE_NAME+'" out "'+@Dir+'Data_'+@TABLE_NAME+'.csv" /c /t, -T'
EXEC master..xp_cmdshell @CmdExc
--END
--BEGIN Merge Data File With Header File
SET @CmdExc = 'TYPE '+@Dir+'Data_'+@TABLE_NAME+'.csv >> '+@Dir+@File EXEC master..xp_cmdshell @CmdExc
--END
--BEGIN Delete Data File
SET @CmdExc = 'DEL /q '+@Dir+'Data_'+@TABLE_NAME+'.csv' EXEC master..xp_cmdshell @CmdExc
--END
END
--BEGIN Drop TEMP Table IF Exists
SET @SQL='IF (EXISTS (SELECT * FROM '+@DB+'.INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''TEMP_'+@TABLE_NAME+''')) BEGIN EXEC(''DROP TABLE ['+@DB+'].[dbo].[TEMP_'+@TABLE_NAME+']'') END'
EXEC(@SQL)
As far as I know, BCP only exports the data - I don't think there's any way to make it export a header row with the column names, too.
One common technique to solve this is to create a view over your actual data for export, which basically does a UNION ALL over two statements:
the first statement to give back one row with the column headers
the actual data to be exported
and then use bcp on that view, instead of your underlying data table directly.
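A minimal sketch of that idea, assuming a hypothetical table dbo.myTable(id int, name varchar(10)) and an extra sortOrder column so the header row reliably comes out first:
CREATE VIEW dbo.vwMyTableExport
AS
SELECT 'id' AS id, 'name' AS name, 0 AS sortOrder
UNION ALL
SELECT CAST(id AS varchar(10)), name, 1
FROM dbo.myTable;
GO
and then export the view instead of the table, ordering by the helper column (views cannot contain ORDER BY themselves):
bcp "SELECT id, name FROM dbo.vwMyTableExport ORDER BY sortOrder" queryout myTable.csv -c -t, -T -S myServer01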
Marc
As well as the solution from marc_s, you can also use osql or sqlcmd
These include headers, and they can act like bcp using -Q and -o. However, they don't support format files like bcp does.
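For example, something along these lines (server, database, and table names are placeholders) should produce a comma-separated file with a header row:
sqlcmd -S myServer01 -d myDatabase -E -s"," -W -Q "SET NOCOUNT ON; SELECT * FROM dbo.myTable" -o myTable.csv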
You should be able to solve this problem with one CTE view and one batch file containing the bcp code. First create the view. Since it's relatively straightforward, I did not create a temporary table, although normally I would.
CREATE VIEW [dbo].[vwxMySAMPLE_EXTRACT_COLUMNS]
AS
WITH MYBCP_CTE (COLUMN_NM, ORD_POS, TXT)
AS
( SELECT COLUMN_NAME
, ORDINAL_POSITION
, CAST(COLUMN_NAME AS VARCHAR(MAX))
FROM [INFORMATION_SCHEMA].[COLUMNS]
WHERE TABLE_NAME = 'xMySAMPLE_EXTRACT_NEW'
AND ORDINAL_POSITION = 1
UNION ALL
SELECT V.COLUMN_NAME
, V.ORDINAL_POSITION
, CAST(C.TXT + '|' + V.COLUMN_NAME AS VARCHAR(MAX))
FROM [INFORMATION_SCHEMA].[COLUMNS] V INNER JOIN MYBCP_CTE C
ON V.ORDINAL_POSITION = C.ORD_POS+1
AND V.ORDINAL_POSITION > 1
WHERE TABLE_NAME = 'xMySAMPLE_EXTRACT_NEW'
)
SELECT CC.TXT
FROM MYBCP_CTE CC INNER JOIN ( SELECT MAX(ORD_POS) AS MX_CNT
FROM MYBCP_CTE C
) SC
ON CC.ORD_POS = SC.MX_CNT
Now, create the batch file. I created this in my Temp directory, but I'm lazy.
cd\
CD "C:\Program Files\Microsoft SQL Server\110\Tools\Binn"
set buildhour=%time: =0%
set buildDate=%DATE:~4,10%
set backupfiledate=%buildDate:~6,4%%buildDate:~0,2%%buildDate:~3,2%%time:~0,2%%time:~3,2%%time:~6,2%
echo %backupfiledate%
pause
The above code just builds a date stamp to append to the end of your file names. Next come the bcp statements: the first one queries the view over the recursive CTE to concatenate the column names together, and the second exports the data.
bcp "SELECT * FROM [dbo].[vwxMYSAMPLE_EXTRACT_COLUMNS] OPTION (MAXRECURSION 300)" queryout C:\Temp\Col_NM%backupfiledate%.txt -c -t"|" -S MYSERVERTOLOGINTO -T -q
bcp "SELECT * FROM [myDBName].[dbo].[vwxMYSAMPLE_EXTRACT_NEW] " queryout C:\Temp\3316_PHYSDATA_ALL%backupfiledate%.txt -c -t"|" -S MYSERVERTOLOGINTO -T -q
Now merge them together using the copy command:
copy C:\Temp\Col_NM%backupfiledate%.txt + C:\Temp\3316_PHYSDATA_ALL%backupfiledate%.txt C:\Temp\3316_PHYSDATA_ALL%backupfiledate%.csv
All set
I put together a version based on what I saw previously. It helped me a lot to create export files as CSV or TXT. I'm storing a table into a ##temp table:
IF OBJECT_ID('tempdb..##TmpExportFile') IS NOT NULL
DROP TABLE ##TmpExportFile;
DECLARE @columnHeader VARCHAR(8000)
DECLARE @raw_sql nvarchar(3000)
SELECT
* INTO ##TmpExportFile
------ FROM THE TABLE RESULTS YOU WANT TO GET
------ COULD BE A TABLE OR A TEMP TABLE BASED ON INNER JOINS
------ ,ETC.
FROM TableName -- optional WHERE ....
DECLARE @table_name VARCHAR(50) = '##TmpExportFile'
SELECT
@columnHeader = COALESCE(@columnHeader + ',', '') + '''[' + c.name + ']''' + ' as ' + '' + c.name + ''
FROM tempdb.sys.columns c
INNER JOIN tempdb.sys.tables t
ON c.object_id = t.object_id
WHERE t.NAME = @table_name
print @columnheader
DECLARE @ColumnList VARCHAR(max)
SELECT
@ColumnList = COALESCE(@ColumnList + ',', '') + 'CAST([' + c.name + '] AS CHAR(' + LTRIM(STR(max_length)) + '))'
FROM tempdb.sys.columns c
INNER JOIN tempdb.sys.tables t
ON c.object_id = t.object_id
WHERE t.name = @table_name
print @ColumnList
--- CSV FORMAT
SELECT
@raw_sql = 'bcp "SELECT ' + @columnHeader + ' UNION all SELECT ' + @ColumnList + ' FROM ' + @table_name + ' " queryout \\networkdrive\networkfolder\datafile.csv -c -t, /S' + ' SQLserverName /T'
--PRINT @raw_sql
EXEC xp_cmdshell @raw_sql
--- TXT FORMAT
SET @raw_sql = 'bcp "SELECT ' + @columnHeader + ' UNION all SELECT ' + @ColumnList + ' FROM ' + @table_name + ' " queryout \\networkdrive\networkfolder\MISD\datafile.txt /c /S'+ ' SQLserverName /T'
EXEC xp_cmdshell @raw_sql
DROP TABLE ##TmpExportFile
The latest version of sqlcmd adds the -W option to remove extra space after the field value; however, it does not put quotes around strings, which can be a problem with CSV files when importing a field value that contains a comma.
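One workaround, sketched here with a hypothetical dbo.myTable that has a text column called comments, is to add the quoting yourself inside the SELECT that you hand to sqlcmd:
-- double up embedded quotes and wrap the value in quotes, CSV-style
SELECT id,
       '"' + REPLACE(comments, '"', '""') + '"' AS comments
FROM dbo.myTable;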
DECLARE @table_name varchar(max)='tableName'--which needs to be exported
DECLARE @fileName varchar(max)='file Name'--What would be file name
DECLARE @query varchar(8000)
DECLARE @columnHeader VARCHAR(max)
SELECT @columnHeader = COALESCE(@columnHeader+',' ,'')+ ''''
+column_name +''''
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @table_name
DECLARE @ColumnList VARCHAR(max)
SELECT @ColumnList = COALESCE(@ColumnList+',' ,'')
+ 'CAST('+column_name +' AS VARCHAR)' +column_name
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @table_name
DECLARE @tempRaw_sql nvarchar(max)
SELECT @tempRaw_sql = 'SELECT '
+ @ColumnList + ' into ##temp11 FROM '
+ @table_name
PRINT @tempRaw_sql
EXECUTE sp_executesql @tempRaw_sql
DECLARE @raw_sql nvarchar(max)
SELECT @raw_sql = 'SELECT '+ @columnHeader
+' UNION ALL SELECT * FROM ##temp11'
PRINT @raw_SQL
SET @query='bcp "'+@raw_SQL+'" queryout "C:\data\'+@fileName
+'.txt" -T -c -t,'
EXEC xp_cmdshell @query
DROP TABLE ##temp11
Please find below another way to do the same thing.
This procedure also takes in a schema name as a parameter in case you need it to access your table.
CREATE PROCEDURE Export_Data_NBA
@TableName nchar(50),
@TableSchema nvarchar(50) = ''
AS
DECLARE @TableToBeExported as nvarchar(50);
DECLARE @OUTPUT TABLE (col1 nvarchar(max));
DECLARE @colnamestable VARCHAR(max);
select @colnamestable = COALESCE(@colnamestable, '')
+COLUMN_NAME+ ','
from INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME = @TableName
order BY ORDINAL_POSITION
SELECT @colnamestable = LEFT(@colnamestable,DATALENGTH(@colnamestable)-1)
INSERT INTO @OUTPUT
select @colnamestable
DECLARE @selectstatement VARCHAR(max);
select @selectstatement = COALESCE(@selectstatement, '')
+ 'Convert(nvarchar(100),'+COLUMN_NAME+')+'',''+'
from INFORMATION_SCHEMA.COLUMNS where TABLE_NAME = @TableName
order BY ORDINAL_POSITION
SELECT @selectstatement = LEFT(@selectstatement,DATALENGTH(@selectstatement)-1)
DECLARE @sqlstatment as nvarchar(max);
SET @TableToBeExported = @TableSchema+'.'+RTRIM(@TableName) -- build the schema-qualified name from the input parameters
SELECT @sqlstatment = N'Select '+@selectstatement+N'
from '+@TableToBeExported
INSERT INTO @OUTPUT
exec sp_executesql @stmt = @sqlstatment
SELECT * from @OUTPUT
I have successfully achieved this with the below code.
Put the code below in a new SQL Server query window and try it:
CREATE TABLE tempDBTableDetails ( TableName VARCHAR(500), [RowCount] VARCHAR(500), TotalSpaceKB VARCHAR(500),
UsedSpaceKB VARCHAR(500), UnusedSpaceKB VARCHAR(500) )
-- STEP 1 ::
DECLARE @cmd VARCHAR(4000)
INSERT INTO tempDBTableDetails
SELECT 'TableName', 'RowCount', 'TotalSpaceKB', 'UsedSpaceKB', 'UnusedSpaceKB'
INSERT INTO tempDBTableDetails
SELECT
S.name +'.'+ T.name as TableName,
Convert(varchar,Cast(SUM(P.rows) as Money),1) as [RowCount],
Convert(varchar,Cast(SUM(a.total_pages) * 8 as Money),1) AS TotalSpaceKB,
Convert(varchar,Cast(SUM(a.used_pages) * 8 as Money),1) AS UsedSpaceKB,
(SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB
FROM sys.tables T
INNER JOIN sys.partitions P ON P.OBJECT_ID = T.OBJECT_ID
INNER JOIN sys.schemas S ON T.schema_id = S.schema_id
INNER JOIN sys.allocation_units A ON p.partition_id = a.container_id
WHERE T.is_ms_shipped = 0 AND P.index_id IN (1,0)
GROUP BY S.name, T.name
ORDER BY SUM(P.rows) DESC
-- SELECT * FROM [FIINFRA-DB-SIT].dbo.tempDBTableDetails ORDER BY LEN([RowCount]) DESC
SET @cmd = 'bcp "SELECT * FROM [FIINFRA-DB-SIT].dbo.tempDBTableDetails ORDER BY LEN([RowCount]) DESC" queryout "D:\Milind\export.xls" -U sa -P dbowner -c'
Exec xp_cmdshell @cmd
--DECLARE @HeaderCmd VARCHAR(4000)
--SET @HeaderCmd = 'SELECT ''TableName'', ''RowCount'', ''TotalSpaceKB'', ''UsedSpaceKB'', ''UnusedSpaceKB'''
exec master..xp_cmdshell 'BCP "SELECT ''TableName'', ''RowCount'', ''TotalSpaceKB'', ''UsedSpaceKB'', ''UnusedSpaceKB''" queryout "d:\milind\header.xls" -U sa -P dbowner -c'
exec master..xp_cmdshell 'copy /b "d:\Milind\header.xls"+"d:\Milind\export.xls" "d:\Milind\result.xls"'
exec master..xp_cmdshell 'del "d:\Milind\header.xls"'
exec master..xp_cmdshell 'del "d:\Milind\export.xls"'
DROP TABLE tempDBTableDetails
With a little PowerShell script:
sqlcmd -Q "set nocount on select top 0 * from [DB].[schema].[table]" -o c:\temp\header.txt
bcp [DB].[schema].[table] out c:\temp\query.txt -c -T -S BRIZA
Get-Content c:\temp\*.txt | Set-Content c:\temp\result.txt
Remove-Item c:\temp\header.txt
Remove-Item c:\temp\query.txt
Warning: the concatenation follows the .txt file names in alphabetical order, so header.txt ends up before query.txt.
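If you would rather not rely on alphabetical order, you can pass the files to Get-Content explicitly in the order you want (a small variation on the script above):
Get-Content c:\temp\header.txt, c:\temp\query.txt | Set-Content c:\temp\result.txt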
Just used this for a DB Migration Activity.
Helped a great deal - given that it's a single line.
I simply put this in SQL Server Management Studio:
SELECT 'sqlcmd -s, -W -Q "set nocount on; select * from [dbname].[dbo].['+ st.NAME + ']" | findstr /v /c:"-" /b >' + st.NAME + '.csv' FROM sys.tables st
Copied the resultant set into a .bat file and I can now export the entire DB with each table into a CSV.
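Each row of the result set looks roughly like this (Customers is just a placeholder table name here):
sqlcmd -s, -W -Q "set nocount on; select * from [dbname].[dbo].[Customers]" | findstr /v /c:"-" /b >Customers.csv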