Quicker way to export SQL tables into separate pipe-delimited files? - sql-server

I need to export the tables in the database, each into a separate file. Instead of running the Export wizard in SQL Server Management Studio for each table, is there a quicker way to accomplish this? The data needs to be in pipe-delimited form. I found a solution, but it doesn't pull the data, just the table definitions.

Here's one way to do it. First, enable xp_cmdshell:
-- To allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1;
GO
-- To update the currently configured value for advanced options.
RECONFIGURE;
GO
-- To enable the feature.
EXEC sp_configure 'xp_cmdshell', 1;
GO
-- To update the currently configured value for this feature.
RECONFIGURE;
GO
Then use the undocumented sp_MSforeachtable together with bcp to export each table to a pipe-delimited file:
EXECUTE sp_msForEachTable
'EXECUTE master.dbo.xp_cmdshell ''bcp "SELECT * FROM ?" queryout D:\Data\?.txt -t "|" -c -T -S ServerName\InstanceName'''
This works, but make sure you disable xp_cmdshell afterwards if it wasn't already enabled, as it can be exploited (there's plenty to read on this). Also make sure you have permission to write the files wherever they need to go.

You can try using bcp (see https://msdn.microsoft.com/en-us/library/ms162802.aspx), which lets you specify a pipe as the field delimiter. The only drawback is that you have to specify the table name each time, so you will probably need to build a list of table names and then use a PowerShell script to loop over the list and execute bcp for each one.
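That loop can also be sketched in Python rather than PowerShell. This is a minimal sketch, not a tested export script; the table names, server, and output folder below are hypothetical placeholders, not values from the question:

```python
import subprocess

# Sketch: build one bcp command per table for a pipe-delimited export.
# Table names, server, and output folder are hypothetical placeholders.
tables = ["dbo.Customers", "dbo.Orders"]
server = r"ServerName\InstanceName"
out_dir = r"D:\Data"

def bcp_command(table: str) -> str:
    """Return a bcp queryout command that exports `table` pipe-delimited."""
    file_name = table.split(".")[-1] + ".txt"
    return (f'bcp "SELECT * FROM {table}" queryout "{out_dir}\\{file_name}" '
            f'-c -t "|" -T -S {server}')

for cmd in (bcp_command(t) for t in tables):
    print(cmd)
    # subprocess.run(cmd, shell=True)  # uncomment to actually run bcp
```

The commands are only printed here; uncommenting the subprocess.run line would execute them on a machine where bcp is on the PATH.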

I also found the following, which helped, assuming sp_configure has been set for xp_cmdshell as @SQLChao pointed out in his solution:
Execute sp_MSforeachtable
'Execute master.dbo.xp_cmdshell ''sqlcmd -S ServerName -E -d mydb -Q "SET NOCOUNT ON SELECT * FROM ?" -W -o C:\TEMP\?.bak -s "|"'''
To remove the dashes under column names in the output files:
Execute sp_MSforeachtable
'Execute master.dbo.xp_cmdshell ''findstr /R /C:"^[^-]*$" c:\temp\?.bak > c:\temp\?.txt'''
Finally, remove the .bak files created:
Execute master.dbo.xp_cmdshell 'del c:\temp\*.bak'
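Note that the findstr pattern keeps only lines containing no dash at all, which would also drop any data row that happens to contain a dash. A rough Python equivalent (with hypothetical sample rows) makes that behavior easy to see:

```python
import re

# Rough equivalent of findstr /R /C:"^[^-]*$": keep only lines with no dash.
# Caveat: this also drops any *data* row that happens to contain a dash.
def drop_dash_lines(lines):
    return [ln for ln in lines if re.fullmatch(r"[^-]*", ln)]

sample = ["EmpID|Name", "-----|-----", "01|Zaraath", "02|John Wick"]
print(drop_dash_lines(sample))  # the underline row is removed
```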

Related

How to export SQL data records to CSV format using stored procedure?

Is it possible to export records to csv format using SQL script in stored procedure?
I am trying to make a scheduled job that will export records into a .csv file. I am using SQL Server 2012.
I don't want to build a small application just for exporting; that's why I am trying to write a script and add it to the schedule. For example, I have records like
EmpID  EmployeeName  TotalClasses
---------------------------------
01     Zaraath       55
02     John Wick     97
File destination location is D:/ExportRecords/file.csv
In SSMS you can save the query result in CSV format.
Try the query below:
-- To allow advanced options to be changed.
EXECUTE sp_configure 'show advanced options', 1;
GO
-- To update the currently configured value for advanced options.
RECONFIGURE;
GO
-- To enable the feature.
EXECUTE sp_configure 'xp_cmdshell', 1;
GO
-- To update the currently configured value for this feature.
RECONFIGURE;
GO
DECLARE @sql varchar(8000)
SELECT @sql = 'bcp "SELECT * FROM DatabaseName..TableName" queryout D:\FileName.csv -c -t, -T -S' + @@SERVERNAME
EXEC master..xp_cmdshell @sql
bcp will create FileName.csv in D:\; just make sure the directory exists and is writable.
You could use BCP (Bulk Copy Program), the built-in command-line utility for exporting data from SQL Server. It's low overhead, executes fast, and has lots of switches, like -t (which sets the field terminator, e.g. a comma for CSV), which make it good for scripting.
Something like this (the Docs are useful as well):
bcp YourDatabase.dbo.YourTable out D:\ExportRecords\file.csv -c -t, -T
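If a small script turns out to be acceptable after all, the same export can be done outside SQL Server. This sketch writes the question's sample records with Python's standard csv module; in practice the rows would come from a database query (e.g. via a driver such as pyodbc), and the output path would be the D:/ExportRecords/file.csv from the question:

```python
import csv

# Sketch: write the question's sample records to a CSV file.
# In practice the rows would come from a database query, and the path
# would be the question's D:/ExportRecords/file.csv.
header = ["EmpID", "EmployeeName", "TotalClasses"]
rows = [("01", "Zaraath", 55), ("02", "John Wick", 97)]

with open("file.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(rows)
```

A script like this can be scheduled with Windows Task Scheduler instead of a SQL Agent job.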

Export the content of a SQL Server table to a CSV without using xp_cmdshell

I need to export the content of a table into a CSV file.
I tried to execute xp_cmdshell from a stored procedure, but it doesn't work because that component is turned off as part of the security configuration for this server.
Do you know another way to write a file from a stored procedure?
Here are a couple of methods you can try:
1. Using BCP
Syntax:
bcp "SELECT * FROM Database.dbo.input" queryout C:\output.csv -c -t',' -T -S .\SQLEXPRESS
microsoft document : BCP Utility
2. Using OPENROWSET (intended for Excel, but should work for CSV too):
Insert into OPENROWSET
('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=D:\testing.xls;',
'SELECT * FROM [SheetName$]') select * from SQLServerTable
For this, you need to enable ad hoc distributed queries with the following commands:
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'ad hoc distributed queries', 1
RECONFIGURE
GO
You might encounter a linked-server error; in that case, refer to this Stack Overflow solution.

How to save SQL query result to XML file on disk

I would like to export a table from SQL Server 2012 to an XML file. I have found a nice answer here on how to produce an XML result from a SQL Server query, but I am still missing how to save this result physically to a file.
SQL query is:
SELECT [Created], [Text]
FROM [db304].[dbo].[SearchHistory]
FOR XML PATH('Record'), ROOT('SearchHistory')
I use Microsoft SQL Server Management Studio to execute this query. I see the XML in the results window, but I cannot save it.
There is a "Save Results As..." option in the context menu, but with 98,900 rows I run out of my 8 GB of memory with this option.
Is there a way to save this query result directly to an XML file on disk?
You can also use SQL Server's extended stored procedures to export it to an XML file, but you need to configure the SQL Server before you can use them:
EXEC master.dbo.sp_configure 'show advanced options', 1
RECONFIGURE
EXEC master.dbo.sp_configure 'xp_cmdshell', 1
RECONFIGURE
Once xp_cmdshell is enabled in SQL Server, you can use the following command to export the data to an XML file:
EXEC xp_cmdshell 'bcp "SELECT [Created], [Text] FROM [db304].[dbo].[SearchHistory] FOR XML PATH(''Record''), ROOT(''SearchHistory'')" queryout "C:\bcptest.xml" -T -c -t,'
You can always use the "Results to File" option in SSMS:
That should output the results of the query execution directly into a file on disk.
Doing this job in SQL Server 2012 is a pain. I finally ended up upgrading to SQL Server 2014, since sqlcmd there already supports UTF-8 files.
Create an SQL query and save it to a file, then run the following (substitute your server name):
sqlcmd -S ServerName -U sa -P sapassword -i inputquery_file_name -f 65001 -o outputfile_name
This example works for me for result sets up to 2GB in size.
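The out-of-memory problem in SSMS comes from materializing the whole result in the grid; writing the stream to disk chunk by chunk avoids it. A minimal Python sketch of the idea, where the generator stands in for fragments a real database driver would yield for the FOR XML result:

```python
# Sketch: stream a large XML result to disk instead of holding it in memory.
# The generator below simulates fragments read from a database driver;
# element names follow the question's FOR XML PATH('Record'), ROOT('SearchHistory').
def xml_chunks(n):
    yield "<SearchHistory>"
    for i in range(n):
        yield f"<Record><Created>2012-01-{i + 1:02d}</Created></Record>"
    yield "</SearchHistory>"

with open("SearchHistory.xml", "w", encoding="utf-8") as f:
    for chunk in xml_chunks(3):
        f.write(chunk)
```

Because each fragment is written and discarded, memory use stays flat no matter how many rows the query returns.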
EXEC master.dbo.sp_configure 'show advanced options', 1
RECONFIGURE
EXEC master.dbo.sp_configure 'xp_cmdshell', 1
RECONFIGURE
DROP TABLE IF EXISTS ##AuditLogTempTable
SELECT A.MyXML
INTO ##AuditLogTempTable
FROM
(SELECT CONVERT(nvarchar(max),
(
SELECT
A.*
FROM
[dbo].[AuditLog] A
JOIN ImportProviderProcesses IPP ON IPP.ImportType = 'Z'
AND A.OperatorID = IPP.OperatorID
AND A.AuditTypeID in ( '400','424','425' )
WHERE
A.[PostTime] >= IPP.StartTime
AND A.[PostTime] <= dateadd(second, 90, IPP.StartTime)
FOR XML PATH('Record'), ROOT('AuditLog')
)
, 0
) AS MyXML
) A
EXEC xp_cmdshell 'bcp "SELECT MyXML FROM ##AuditLogTempTable" queryout "D:\bcptest1.xml" -T -c -t,'

How to name the filenames of a database and set its location in Visual Studio 2015 Database project?

By selecting "Publish" in the context menu of a VS 2015 Database project, I can create a script, which contains all the necessary commands to deploy the database to the SQL Server ("xyz.publish.sql").
The database name and its paths in this script are declared as variables:
:setvar DatabaseName "myDatabase"
:setvar DefaultFilePrefix "myDatabase"
:setvar DefaultDataPath "D:\Databases\"
:setvar DefaultLogPath "D:\Databases\"
Also, the filenames seem to be automatically generated:
PRIMARY(NAME = [$(DatabaseName)], FILENAME = N'$(DefaultDataPath)$(DefaultFilePrefix)_Primary.mdf')
LOG ON (NAME = [$(DatabaseName)_log], FILENAME = N'$(DefaultLogPath)$(DefaultFilePrefix)_Primary.ldf')...
Where can I set the paths and the filenames? I don't want "_Primary" appended to the filenames, and the paths need an additional sub-folder.
If I change the publish script, my changes will probably be overwritten the next time the script is generated by Visual Studio.
You can add a pre-deployment script to the project which detaches the database, moves/renames the files, then reattaches the database using the new files.
-- detach db before moving physical files
USE [master]
GO
exec sp_detach_db @dbname = N'$(DatabaseName)'
GO
-- enable xp_cmdshell
exec sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
exec sp_configure 'xp_cmdshell', 1 -- 0 = Disable , 1 = Enable
GO
RECONFIGURE
GO
-- move physical files
EXEC xp_cmdshell 'MOVE "$(DefaultDataPath)$(DefaultFilePrefix)_Primary.mdf" "C:\$(DatabaseName)\$(DatabaseName).mdf"'
EXEC xp_cmdshell 'MOVE "$(DefaultLogPath)$(DefaultFilePrefix)_Primary.ldf" "C:\$(DatabaseName)\$(DatabaseName)_log.ldf"'
GO
-- reattach db with new filepath
CREATE DATABASE [$(DatabaseName)] ON
(NAME = [$(DatabaseName)], FILENAME = 'C:\$(DatabaseName)\$(DatabaseName).mdf'),
(NAME = [$(DatabaseName)_log], FILENAME = 'C:\$(DatabaseName)\$(DatabaseName)_log.ldf')
FOR ATTACH
GO
-- disable xp_cmdshell
exec sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
exec sp_configure 'xp_cmdshell', 0 -- 0 = Disable , 1 = Enable
GO
RECONFIGURE
GO
USE [$(DatabaseName)];
GO
Some notes on this:
I hardcoded C:\ as the new location for the files for simplicity. You're better off creating a SQLCMD variable to store this path.
If the xp_cmdshell 'MOVE ... command fails, it will do so silently. To keep my answer simple I did not include any error checking, but you can roll your own by inserting the results of xp_cmdshell into a temp table. See How to capture the error output from xp_cmdshell in SQL Server.
You may run into permissions problems with the xp_cmdshell 'MOVE ... command. In that case, you may need to adjust the permissions of the source and target paths in the MOVE statement. You may also need to run the command as a different user -- see here (Permissions section) or here for starters.
Yes, they will be overwritten, but that's the way it works: you have to change the script. The other thing you can do is not specify the locations at all, in which case SQL Server will use the instance defaults, but that too involves changing the script.

stored procedure to export to CSV with BCP

I need to create an on-demand export of user data on our website. The user clicks an export button, classic ASP code executes a stored procedure that generates a file via BCP, and the user is prompted to download it.
I've created the sproc, and it's working flawlessly when executed from SSMS. The catch is getting it to work from the site with the limited privileges granted to the account connecting to SQL Server from the website. Here is a snippet:
-- INSERT TEMP DATA
INSERT INTO t_users_tempExport
SELECT * FROM #tempExport
-- show advanced options
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- enable xp_cmdshell
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
-- hide advanced options
EXEC sp_configure 'show advanced options', 0
RECONFIGURE
-- EXPORT TO CSV
DECLARE @sql varchar(8000)
SELECT @sql = 'bcp "SELECT * FROM DBNAME.dbo.tempExport WHERE scopeID='''+@randomString+'''" '
+ 'queryout C:\temp\exportResidents_'+CONVERT(varchar(max),@userID)+'.csv -c -t, -T -S'
+ @@SERVERNAME
EXEC master..xp_cmdshell @sql
-- RETURN FILE NAME
SELECT 'C:\temp\export_'+CONVERT(varchar(max),@userID)+'.csv' AS fileName
The issue is that I cannot enable xp_cmdshell with the privileges granted to the account that is connecting to SQL Server from the website. I'm kind of at a loss as to how to proceed.
Is it possible to include sysadmin credentials in the call to BCP? Is there some easier option or workaround?
I ended up going a completely different route. I created the CSV file from pure ASP code, only using the sproc to return the data.
