I'm trying to figure out a way to copy the result of a SQL Server query to Excel. I know that with PostgreSQL, you can execute the following command:
COPY (SELECT id, summary, description FROM table) TO 'C:/test/table.xls';
to achieve the desired result. What is the equivalent method in SQL Server?
I want to run it as a query statement, since I would like to automate the process by running the query from a batch file as a scheduled task.
Try this:
mysql -b -e "$MY_QUERY" > my_data.csv
and see this question for reference and more detail:
Convert mysql query results to CSV (with copy/paste)
Try this:
INSERT INTO
OPENROWSET (
'Microsoft.ACE.OLEDB.12.0',
'Excel 8.0;HDR=YES;Database=C:\test\table.xls;',
[Sheet1$]
)
SELECT id, summary, description FROM [table]
Some limitations:
You must create the empty Excel file first.
You must add column names in the first row, matching the inserted data.
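Also note that ad hoc OPENROWSET queries are disabled by default on many SQL Server installations. A minimal sketch of enabling them (requires sysadmin rights):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;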
I figured it out, just use BCP (Bulk Copy Program) to do the job like this:
bcp "select * from [databasename].[dbo].[tablename]" queryout "c:\test\table.csv" -c -t"," -r"\n" -S servername -T
Limitations in some software I'm using require me to think outside the box: can you have a query that automatically returns a "file" with the results in it? I imagine it would be returned as a BLOB or a base64-encoded string or something similar.
Using Microsoft SQL Server
You can use one of two methods to achieve this:
1. Use sqlcmd
SQLCMD -S SERVERNAME -E -Q "SELECT Col1,Col2,Col3 FROM MyDatabase.dbo.MyTable" -s "," -o "D:\MyData.csv"
Run the above command in a cmd window to produce the expected result.
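By default sqlcmd includes a header row and pads every column with trailing spaces to its full display width, so for a cleaner CSV you may want to add -W (remove trailing spaces) and -h -1 (suppress headers); a hedged variant:
SQLCMD -S SERVERNAME -E -Q "SELECT Col1,Col2,Col3 FROM MyDatabase.dbo.MyTable" -s "," -W -h -1 -o "D:\MyData.csv"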
2. Use OPENROWSET
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0','Text;Database=D:\;HDR=YES;FMT=Delimited','SELECT * FROM [FileName.csv]')
SELECT Field1, Field2, Field3 FROM DatabaseName.dbo.TableName
You need to have the Microsoft.ACE.OLEDB.12.0 provider available. The Jet 4.0 provider will work, too, but it's ancient, so I used this one instead.
Limitations:
The .CSV file will have to exist already. If you're using headers (HDR=YES), make sure the first line of the .CSV file is a delimited list of all the fields.
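For example, for the three fields above, the first line of FileName.csv would need to be:
Field1,Field2,Field3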
If you can create objects in the target database or another database on the same server, you could use the stored procedures and functions from my answer to this question: SSMS: Automatically save multiple result sets from same SQL script into separate tabs in Excel?
You would just need to exclude the @OutputFileName parameter to force output as a single-column, single-row varbinary(max).
select * into #systables from sys.tables;
select * into #syscolumns from sys.columns;
select * into #systypes from sys.types;
exec dbo.GetExcelSpreadsheetData
@Worksheets = 'sys.tables/#systables/autofilter|sys.columns/#syscolumns/autofilter|sys.types/#systypes';
drop table #systables;
drop table #syscolumns;
drop table #systypes;
Output (truncated):
ExcelSpreadsheetData
-------------------------------------
0x504B0304140000000800DB78295479400A1
I have an extremely large database I need to send to the developer; the table has over 120 million rows. The developer says he only needs about 10,000 or so rows, so I was going to use sqlcmd -S -d -Q "select top 10000 * from table" -s "," -o "C:\temp\filename.csv"
I decided that rather than truncate immediately, I would script out the table, rename it, and test bulk inserting. I tried using
bulk insert tablename from 'c:\temp\filename.csv'
with (
fieldterminator = ',',
rowterminator = '\n'
)
this ends in "Bulk load data conversion error (truncation) for row 1..." error. I also tried in import/export wizard and it fails for the same problem (truncation). increasing the size of the field lengths, solves the problem but I really am having problems understanding why I need to do this. Its the same data from the same table, it should bulk insert right back in?!?
also the problem is happening on every column in the table and by varying lengths, so there is no column with the same number of chars I have to add. all the columns are varchar data type. could the sqlcmd be inserting some kind of corruption in the file? I have tried to look for a problem, I also tried rtrim(ltrim(columname)) to make sure there is no whitespace but I'm not sure this is how it works. I'm using sql server 2012 if this helps.
thanks
You should look into the BCP queryout and BULK INSERT options. Use native format if you're going from SQL Server to SQL Server.
(BCP is command-line):
bcp "select top(10000) * from table" queryout "OUTPUTFILENAME.DAT" -S serverInstanceName -d databaseName -T -n
The Bulk Insert command is SQL (not command line):
bulk insert table from 'path\and\OUTPUTFILENAME.DAT' with (keepidentity,datafiletype = 'native');
(If the table doesn't have an identity column, you can omit keepidentity.)
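A hedged note on the original error: sqlcmd output is display-formatted (it includes a column-header row, a dashed separator line, a "(N rows affected)" footer, and space-padding of each field to its display width), so it does not round-trip cleanly through BULK INSERT; that is the likely source of truncation errors on every column. If character format is needed instead of native, bcp queryout with -c avoids that formatting:
bcp "select top(10000) * from table" queryout "c:\temp\filename.csv" -S serverInstanceName -d databaseName -T -c -t","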
I have an Excel VBA file with about 15,000 rows and 4 columns. I need to insert 2 of the columns into a table through Microsoft SQL Server Management Studio. I have been looking at YouTube tutorials for how to do this, but I am very new, so I got lost.
I have tried to convert my file into a .text file so I could insert it into the SQL server. I also tried the linked server method, but it did not seem to work -- if you have advice on a way to make that work instead, that is welcome as well!
Code in SSMS:
CREATE TABLE ExportTool(TOOLING_SHEET VARCHAR, TOOLS VARCHAR)
INSERT INTO ExportTool VALUES /* insert some range here... */
SELECT * FROM ExportTool
EDITED CODE IN SSMS:
CREATE TABLE ImportTool(TOOLING_SHEET VARCHAR, TOOLS VARCHAR)
BULK INSERT dbo.ImportTool
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'J:\test-TDS\tools.text')
WITH
(
/* insert some range here */
)
Select * from ImportTool
Using cmd:
bcp DEV.dbo.ExportTool out J:\test-TDS\TOOLS.txt -T -c
Pressing Enter gave the output: 0 rows copied
bcp DEV.dbo.ExportTool out J:\test-TDS\TOOLS.text -T -c
Pressing Enter gave the output:
Error = unable to open BCP host data-file
Am I typing something into cmd incorrectly, so that it is not copying the data? And how do I alter my code in SSMS to get the information from that file? I cannot find where I am going wrong.
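For reference, a minimal working pattern for loading a delimited text file into a two-column table (the column lengths and the tab delimiter are assumptions; note that BULK INSERT takes a plain file path, not OPENROWSET):
CREATE TABLE dbo.ImportTool (TOOLING_SHEET VARCHAR(255), TOOLS VARCHAR(255));
BULK INSERT dbo.ImportTool
FROM 'J:\test-TDS\tools.text'
WITH (
    FIELDTERMINATOR = '\t',  -- assumed tab-delimited; adjust to match the file
    ROWTERMINATOR = '\n'
);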
I am trying to make a normal (meaning restorable) backup of a MySQL database. My problem is that I only need to back up a single table, the one that was last created or edited. Is it possible to set up mysqldump to do that? MySQL can find the last inserted table, but how can I include that in the mysqldump command? I need to do this without locking the table, and the DB has partitioning enabled. Thanks for any help.
You can use this SQL to get the last inserted / updated table:
select table_schema, table_name
from information_schema.tables
where table_schema not in ("mysql", "information_schema", "performance_schema")
order by greatest(create_time, update_time) desc limit 1;
Once you have the result of this query, you can incorporate it into any other language (for example bash) to produce a dump of that exact table, as sketched below.
./mysqldump -uroot -proot mysql user > mysql_user.sql
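A minimal bash sketch tying the two together (assumes the root credentials from the example above; coalesce is added because update_time is NULL for tables that were never updated):
# Find the schema and name of the most recently created/updated table
TABLE=$(mysql -uroot -proot -N -e "select concat(table_schema, ' ', table_name) from information_schema.tables where table_schema not in ('mysql','information_schema','performance_schema') order by greatest(create_time, coalesce(update_time, create_time)) desc limit 1")
# $TABLE is intentionally unquoted so 'schema table' splits into two arguments;
# --single-transaction gives a consistent dump without locking InnoDB tables
mysqldump -uroot -proot --single-transaction $TABLE > last_table.sql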
For dumping a single table, use the command below.
Open a cmd prompt and go to the mysql bin path, e.g. c:\program files\mysql\bin.
Now type the command:
mysqldump -u username -ppassword databasename tablename > C:\backup\filename.sql
Here username - your mysql username
password - your mysql password (note there is no space between -p and the password)
databasename - your database name
tablename - your table name
C:\backup\filename.sql - the path where the file should be saved, plus the filename.
If you want to load the backed-up table into another database, you can do it by following these steps:
open a cmd prompt
type the below command
mysql -u username -ppassword databasename < C:\backup\filename.sql
How to export a table to a text file?
I need to get the INSERT script (structure and data) for an already existing table.
In SQL 2000, try reading about bulk copy; the command should be bcp, I think.
The examples from the MS help file look something like this:
Exporting data from a table to a text file:
bcp "SELECT au_fname, au_lname FROM pubs..authors ORDER BY au_lname" queryout Authors.txt -c -Sservername -Usa -Ppassword
Importing data from a text file to a table:
The command to bulk copy data from Newpubs.dat into publishers2 is:
bcp pubs..publishers2 in newpubs.dat -c -t , -r \n -Sservername -Usa -Ppassword
Alternatively, you can use the BULK INSERT statement from a query tool, such as SQL Query Analyzer, to bulk copy data:
BULK INSERT pubs..publishers2 FROM 'c:\newpubs.dat'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
Gath
In SQL Server 2005 Management Studio, you can use the Import/Export Wizard (not sure if you specifically needed a script, or simply a way to export the structure/data to a file, but this suggestion will do it without an actual script):
Right-click on the database containing the table
Select Tasks -> Export Data
Choose a Data Source screen: (after the welcome screen) leave defaulted values, click Next
Choose a Destination: "Flat File Destination" for the Destination field. Then fill in the file name/path and the other options as you wish, click Next
Select Copy data..., click Next
Select the table to export, click Next
On the Save and Execute Package screen, you can just leave Execute Immediately selected, or if you'd like to save the resulting "script" as a SSIS package you can select that option also. Click Next, then Finish, to execute the export
Your resulting file will have the contents of the table. If you then need to "insert" this data into a different db you can use the "Import Data" option of the Wizard to import the data from the text file into the other database/table.
Try this:
http://vyaskn.tripod.com/code.htm#inserts
You can build the INSERT statement programmatically by fetching the column info from information_schema, where each row describes a column:
SELECT table_name,
ordinal_position,
column_name,
data_type,
is_nullable,
character_maximum_length
FROM information_schema.columns
WHERE table_name LIKE '%TableName%'
ORDER BY ordinal_position
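As a rough sketch of the idea (STRING_AGG requires SQL Server 2017+; 'MyTable' is a placeholder), the column list for the generated INSERT could be assembled like this:
DECLARE @cols nvarchar(max);
SELECT @cols = STRING_AGG(QUOTENAME(column_name), ', ')
       WITHIN GROUP (ORDER BY ordinal_position)
FROM information_schema.columns
WHERE table_name = 'MyTable';
-- Produces a template; the VALUES list is then built per row from the data
SELECT 'INSERT INTO MyTable (' + @cols + ') VALUES (...);';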
For exporting data, BCP is the tool, and Books Online has several decent examples:
bcp AdventureWorks.Sales.Currency out Currency.dat -T -c
You can run an INSERT script generator like this one
or a desktop tool like this link