I am using the code below to export data from SQL Server to Excel:
UPDATE OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0;Database=E:\..\.xlsx;',
    'SELECT Column1, Column2, Column3 FROM [Sheet1$]')
SET Column1 = NULL, Column2 = NULL, Column3 = NULL

INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0;Database=E:\..\.xlsx;',
    'SELECT * FROM [Sheet1$]')
SELECT Column1, Column2, Column3 FROM table_Name
I am using a 64-bit OS with 64-bit Excel 2010.
This code works fine on a 32-bit system with 32-bit Excel. On the 64-bit system it also works as expected the first time, but on subsequent runs, when I want to blank out all the records and insert new ones, only the UPDATE works: after it, when I execute the INSERT, SQL Server reports how many rows were affected, yet when I open the file it is completely blank.
Only SQL Server Integration Services (SSIS) supports export to a Microsoft Excel workbook.
Microsoft Excel users can open a CSV file the same way as a native Excel file.
So exporting to CSV files is suitable for most cases, and you can use a simple command-line utility instead of SQL Server Integration Services (SSIS).
The disadvantage of exporting to an Excel workbook or CSV file is that the user receives a new file every time and loses their changes.
If you want to use SSIS to export to an .xlsx file, follow this link.
If you switch to using a CSV file:
Pay attention to the following tips:
Datetime fields should follow the formats shown above to be understandable by Microsoft Excel.
Text fields should be quoted; otherwise a comma inside the data will be treated as a field separator.
The first two characters of the CSV file should not be "ID"; otherwise Microsoft Excel shows the "SYLK: File format is not valid" error message when users open the file.
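As a rough illustration of these tips (the table and column names below are hypothetical, not taken from the question), a query shaped for CSV output might format the date explicitly and quote the text field:

SELECT CONVERT(varchar(19), OrderDate, 120) AS OrderDate,             -- style 120: yyyy-mm-dd hh:mi:ss
       '"' + REPLACE(CustomerName, '"', '""') + '"' AS CustomerName,  -- quote text, escape embedded quotes
       Amount
FROM dbo.Orders;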
The final command is built from these parts:
-S .                   => Specifies the server (here, the local default instance)
-d AzureDemo           => Specifies the database (e.g. AzureDemo)
-E                     => Uses a trusted connection. Alternatively, supply credentials: -U Username -P Password
-s,                    => Defines the comma as the column separator. Use -s; for a semicolon.
-W                     => Removes trailing spaces.
-Q "SELECT * FROM ..." => Runs the query given on the command line and exits (the command below uses -i ExcelTest.sql to read the query from a file instead)
> ExcelTest.csv        => Redirects the output to the file ExcelTest.csv
findstr /V /C:"-" /B   => Removes separator lines like "--,-----,--------,--------"
sqlcmd -S . -d AzureDemo -E -s, -W -i ExcelTest.sql | findstr /V /C:"-" /B > ExcelTest.csv
Note:
The sqlcmd utility has no switch to change NULL to an empty value!
Use the ISNULL function in the SQL query to change NULL to ''.
Or add | replace-null.exe > ExcelTest.csv at the end of the command text.
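For example, a minimal sketch of such a query against the question's table (assuming the columns are character types; non-character columns would need a CONVERT first):

SELECT ISNULL(Column1, '') AS Column1,   -- NULL becomes an empty string in the CSV
       ISNULL(Column2, '') AS Column2,
       ISNULL(Column3, '') AS Column3
FROM table_Name;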
Limitations of some software I'm using require me to think outside the box, so: can a query automatically return a "file" with the results in it? I imagine it would come back as a BLOB or a base64-encoded string or something similar.
Using Microsoft SQL Server
You can use two methods to achieve this:
1. Use sqlcmd
SQLCMD -S SERVERNAME -E -Q "SELECT Col1, Col2, Col3 FROM MyDatabase.dbo.MyTable" -s "," -o "D:\MyData.csv"
Run the above command in a command prompt to get the expected result.
2. Use OPENROWSET
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Text;Database=D:\;HDR=YES;FMT=Delimited', 'SELECT * FROM [FileName.csv]')
SELECT Field1, Field2, Field3 FROM dbo.TableName
You need to have the Microsoft.ACE.OLEDB.12.0 provider available. The Jet 4.0 provider will work, too, but it's ancient, so I used this one instead.
Limitations:
The .CSV file will have to exist already. If you're using headers (HDR=YES), make sure the first line of the .CSV file is a delimited list of all the fields.
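If the server rejects the OPENROWSET call outright, ad hoc distributed queries may need to be enabled first; a sketch, assuming you have rights to change the server configuration:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;   -- required for ad hoc OPENROWSET/OPENDATASOURCE
RECONFIGURE;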
If you can create objects in the target database or another database on the same server, you could use the stored procedures and functions from my answer to this question: SSMS: Automatically save multiple result sets from same SQL script into separate tabs in Excel?
You would just need to exclude the @OutputFileName parameter to force output as a single-column, single-row varbinary(max).
select * into #systables from sys.tables;
select * into #syscolumns from sys.columns;
select * into #systypes from sys.types;
exec dbo.GetExcelSpreadsheetData
@Worksheets = 'sys.tables/#systables/autofilter|sys.columns/#syscolumns/autofilter|sys.types/#systypes'
drop table #systables;
drop table #syscolumns;
drop table #systypes;
Output (truncated):
ExcelSpreadsheetData
-------------------------------------
0x504B0304140000000800DB78295479400A1
In PowerShell I would like to use bulk copy (bcp) to export and then import data from one table to another in the same database. The DB server is not local to my dev machine, so I am passing the database instance name with /S. The error that I am getting is "A valid table name is required for in, out, or format options" and I am not sure what I have wrong in the command. I am using Windows Authentication, as in Management Studio.
PS C:\Users\dev> $psCommand = "bcp $($db).$($schema).$($table) out $path /S$($server) /t /c, -T"
PS C:\Users\dev> Invoke-Expression $psCommand
A valid table name is required for in, out, or format options.
Currently I am using the SQLCMD utility to load CSV data into SQL Server. Below is the command, executed at a command prompt, to load the data:
sqlcmd -Usa -Pxxx -S192.168.1.223,49546 -dlocal -i"/test.sql" -o"/test.log"
I have also copied my test.sql file contents for your reference:
SET NOCOUNT ON
BULK INSERT test FROM
"\\192.168.1.223\test.csv"
WITH
(
MAXERRORS = 1000000,
CODEPAGE = 1251,
FIELDTERMINATOR = '~%',
ROWTERMINATOR = '0x0a'
)
GO
SELECT CONVERT(varchar, @@ROWCOUNT) + ' rows affected'
GO
The insert operation works fine with the above process. But my concern is that, in case of any error due to data type or data length, the row is rejected and I am unable to trace that particular row.
Each time I have to look at the log file for the rejected row number and then check the corresponding row in the data file.
Is there any option to write the error/rejected rows to another file, like the bad file that the Oracle SQL*Plus utility can generate?
I think the option you are looking for is not in sqlcmd, but in BULK INSERT:
ERRORFILE = 'file_name'
Specifies the file used to collect rows that have formatting errors and cannot be converted to an OLE DB rowset. These rows are copied into this error file from the data file "as is."
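Applied to the BULK INSERT from the question, it might look like the sketch below (the error-file path is hypothetical; note that the file must not already exist when the statement runs):

BULK INSERT test FROM
"\\192.168.1.223\test.csv"
WITH
(
    MAXERRORS = 1000000,
    CODEPAGE = 1251,
    FIELDTERMINATOR = '~%',
    ROWTERMINATOR = '0x0a',
    ERRORFILE = '\\192.168.1.223\test_rejects.log'   -- rejected rows are copied here "as is"
)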
I'm trying to import a PostgreSQL dump of data into SQL Server using bcp. I've written a Python script that switches the delimiters to '^' and eliminates other bad formatting, but I cannot find the correct switches to preserve the Unicode strings when importing into SQL Server.
In Python, if I print out the lines that are causing me trouble, the row looks like this with the csv module:
['12', '\xe4\xb8\x89\xe5\x8e\x9f \xe3\x81\x95\xe3\x81\xa8\xe5\xbf\x97']
The database table only has 2 columns: one integer, one varchar.
My statement (simplified) for creating the table is only:
CREATE TABLE [dbo].[example](
[ID] [int] NOT NULL,
[Comment] [nvarchar](max)
)
And to run bcp, I'm using this line:
c:\>bcp dbo.example in fileinput -S servername -T -t^^ -c
It successfully imports about a million rows, but all of my accented characters are broken.
For example, "Böhm, Rüdiger" is turned into "B+¦hm, R++diger". Does anyone have experience with how to properly set switches or other hints with bcp?
Edit: I switched varchar to nvarchar, but this does not fix the issue. This output in Python (read with the csv module):
['62', 'B\xc3\xb6hm, R\xc3\xbcdiger']
is displayed as this in SSMS from the destination DB (delimiters matched for consistency):
select * from dbo.example where id = 62
62;"B├╢hm, R├╝diger"
where in pgAdmin, using the original DB, I have this:
62;"Böhm, Rüdiger"
You may need to modify your bcp command to support wide character sets (note the use of the -w switch instead of -c):
bcp dbo.example in fileinput -S servername -T -t^^ -w
BCP documentation reference
See also http://msdn.microsoft.com/en-us/library/ms188289.aspx
If you need to preserve Unicode, change varchar to nvarchar...
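If the destination column were still plain varchar, a minimal in-place change would look like the sketch below (this assumes no index or constraint depends on the column; the example table's Comment column is already nvarchar(max)):

ALTER TABLE [dbo].[example] ALTER COLUMN [Comment] nvarchar(max) NULL;   -- widen to a Unicode type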
How to export a table to a text file?
I need to get the INSERT script (structure and data) for an already existing table.
In SQL Server 2000, try reading about bulk copy; the command should be bcp, I think.
The examples from the MS help file include something like the following.
Exporting data from a table to a text file:
bcp "SELECT au_fname, au_lname FROM pubs..authors ORDER BY au_lname" queryout Authors.txt -c -Sservername -Usa -Ppassword
Importing data from a text file into a table:
The command to bulk copy data from Newpubs.dat into publishers2 is:
bcp pubs..publishers2 in newpubs.dat -c -t , -r \n -Sservername -Usa -Ppassword
Alternatively, you can use the BULK INSERT statement from a query tool, such as SQL Query Analyzer, to bulk copy data:
BULK INSERT pubs..publishers2 FROM 'c:\newpubs.dat'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
Gath
In SQL Server 2005 Management Studio, you can use the Import/Export Wizard (not sure if you specifically needed a script, or simply a way to export the structure/data to a file, but this suggestion will do it without an actual script):
right-click on the database containing the table
select Tasks->Export Data
Choose a Data Source screen (after the welcome screen): leave the default values, click Next
Choose a Destination: select "Flat File Destination" for the Destination field, then fill in the file name/path and the other options as you wish, click Next
Select Copy data..., click Next
Select the table to export, click Next
On the Save and Execute Package screen, you can just leave Execute Immediately selected, or if you'd like to save the resulting "script" as an SSIS package you can select that option too. Click Next, then Finish, to execute the export.
Your resulting file will have the contents of the table. If you then need to "insert" this data into a different database, you can use the "Import Data" option of the wizard to import the data from the text file into the other database/table.
try this:
http://vyaskn.tripod.com/code.htm#inserts
You can build the INSERT statement programmatically by fetching the column info from the information_schema, where each returned row describes one column:
SELECT table_name,
ordinal_position,
column_name,
data_type,
is_nullable,
character_maximum_length
FROM information_schema.columns
WHERE table_name LIKE '%TableName%'
ORDER BY ordinal_position
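As a rough sketch of that idea (not a complete generator; the table name below is a placeholder), you can concatenate the column metadata into the column list of an INSERT statement:

DECLARE @TableName sysname = 'TableName';   -- placeholder table name
DECLARE @Cols nvarchar(max);

-- Build a comma-separated, ordered column list from the metadata
SELECT @Cols = STUFF((
    SELECT ', ' + QUOTENAME(column_name)
    FROM information_schema.columns
    WHERE table_name = @TableName
    ORDER BY ordinal_position
    FOR XML PATH('')), 1, 2, '');

PRINT 'INSERT INTO ' + QUOTENAME(@TableName) + ' (' + @Cols + ') VALUES (...);';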
For exporting data, bcp is the tool, and Books Online (BOL) has several decent examples:
bcp AdventureWorks.Sales.Currency out Currency.dat -T -c
You can run an INSERT script generator like this one
or a desktop tool like this link