How to import an Excel document with multiline rows into SQL Server? - sql-server

I want to import my Excel workbook (a single worksheet) into SQL Server, but after trying the Import and Export Data wizard I found a problem with my source document. My Excel data is multiline: each record's ID appears only on its first row, and the rows that follow leave the ID column blank. The wizard wants to import each row as-is, but I want every row to carry its ID so that all the rows belonging to one ID stay together.
How can I do this?
Please look at my sample picture; I hope it helps explain what I want to do.
[image: excel to sql problem]

I would manipulate the Excel data to make it database friendly. To do this, I would add an extra worksheet and copy the data from the first sheet to the new sheet.
Then, since you want IDs in every row in column A, I would change the value in column A (of the new sheet) to a formula that copies the value from the same cell in the first sheet, unless it is blank, in which case it copies the value from the cell above the cell with the formula.
Something like: =IF(ISBLANK(Sheet1!A2), A1, Sheet1!A2)
This will give you a column of IDs with every row having a value. The Import and Export Data wizard should work happily now.
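If you would rather do the fill-down outside Excel (for example after exporting the sheet to CSV), the same logic as the formula is a few lines of Python. This is just a sketch; the column layout (ID in the first column, blank cells for continuation rows) is assumed from the question:

```python
# Fill blank IDs in the first column with the most recent non-blank ID above,
# mirroring the =IF(ISBLANK(...)) formula in the answer.
def fill_down_ids(rows):
    filled = []
    last_id = None
    for row in rows:
        id_cell = row[0]
        if id_cell != "" and id_cell is not None:
            last_id = id_cell          # remember the most recent ID seen
        filled.append([last_id] + list(row[1:]))
    return filled

# Sample data: three rows share ID 1, the next two share ID 2.
data = [
    [1, "first line"],
    ["", "second line"],
    ["", "third line"],
    [2, "another record"],
    ["", "its second line"],
]
print(fill_down_ids(data))
```

After this step, every row has an ID and the wizard (or BULK INSERT) can load the data without losing the grouping.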

With bcp we can export and import table data using the following queries:
exec xp_cmdshell 'bcp MyTasks.dbo.emp out d:\f\yes.xls -T -c -U nagababu -P Test123 '
exec xp_cmdshell 'bcp "SELECT * FROM MyTasks.dbo.emp" queryout d:\f\PersonData_n.xls -c -S (local) -T -L 1'
exec xp_cmdshell 'bcp MyTasks.dbo.emp format nul -c -t, -f d:\f\EmployeeData_n.xls -S (local) -T'
We have to use in instead of out when importing data from a sheet into a table.
Use this to import data into the table:
exec xp_cmdshell 'bcp MyTasks.dbo.emp in d:\f\yes.xls -T -c -U nagababu -P Test123 '

Related

How to use bcp for columns that are Identity?

I want to restore my table with BCP using the command below.
BCP framework.att.attendance in "D:\test\mhd.txt" -T -c
But the column (id) is identity in this table.
When the data is restored with BCP, I want the id column to be unchanged.
In other words, if the id of the first row is '7' before BCP, I want to import the data and have the id of the first row still be '7'.
What should I do?
Use the -E flag on the BCP import, e.g.:
BCP framework.att.attendance in "D:\test\mhd.txt" -T -c -E
-E specifies that the identity value or values in the imported data file are to be used for the identity column.
If -E is not given, the identity values for this column in the data file being imported are ignored.

Issue extracting individual image from SQL Server using BCP

I have a table of data; one column contains images. I am trying to extract an individual record's image column and export it to an image file using BCP. My code is as follows:
bcp "SELECT PictureData FROM BigTable WHERE RecordId='ASDF-QWER'" queryout "C:\Path\File.jpg" -n -SCONNECTION\STRING -T
The output after running the command appears successful (it says "1 rows copied"), but when I double-click the file created it says "cannot open this file". If I modify the statement to select different columns and copy them to a text file using something like:
bcp "SELECT Name,Address FROM BigTable WHERE RecordId='ASDF-QWER'" queryout "C:\Path\File.txt" -n -SCONNECTION\STRING -T
then the text file contains the data I expect it to contain.

using sqlcmd to save a table query to CSV, cannot re-import back into the same table definition?

I have an extremely large database I need to send to the developer; the table has over 120 million rows. The developer says he only needs about 10,000 or so rows, so I was going to use:
sqlcmd -S -d -Q "select top 10000 * from table" -s "," -o "C:\temp\filename.csv"
I decided that rather than truncate immediately I would script out the table, rename it, and test bulk inserting. I tried using:
bulk insert tablename from 'c:\temp\filename.csv'
with (
fieldterminator = ',',
rowterminator = '\n'
)
This ends in a "Bulk load data conversion error (truncation) for row 1..." error. I also tried the Import/Export wizard and it fails with the same truncation problem. Increasing the size of the field lengths solves the problem, but I am really having trouble understanding why I need to do this. It's the same data from the same table; it should bulk insert right back in?!
Also, the problem is happening on every column in the table and by varying lengths, so there is no single number of chars I could add to every column. All the columns are varchar. Could sqlcmd be inserting some kind of corruption into the file? I have looked for a problem, and I also tried rtrim(ltrim(columnname)) to make sure there is no whitespace, but I'm not sure that is how it works. I'm using SQL Server 2012 if this helps.
thanks
You should look into BCP queryout and BULK INSERT. Use native format if you're going from SQL Server to SQL Server.
BCP (this is command line):
bcp "select top(10000) * from table" queryout "OUTPUTFILENAME.DAT" -S serverInstanceName -d databaseName -T -n
The BULK INSERT command is SQL (not command line):
bulk insert table from 'path\and\OUTPUTFILENAME.DAT' with (keepidentity, datafiletype = 'native');
(If the table doesn't have an identity column, you can omit keepidentity.)
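A likely reason the CSV round-trip truncates, for what it's worth: output produced with a bare -s "," separator is not quoted, so any comma inside a varchar value shifts the remaining fields over and overflows their column lengths on re-import. A small Python sketch of the effect (the sample values are made up):

```python
import csv
import io

# A row whose varchar value happens to contain a comma.
row = ["42", "Smith, John", "NY"]

# Unquoted comma-separated output: 3 fields come back as 4, shifted.
unquoted = ",".join(row)
print(unquoted.split(","))   # ['42', 'Smith', ' John', 'NY']

# Proper CSV quoting keeps the field intact on the round trip.
buf = io.StringIO()
csv.writer(buf).writerow(row)
print(next(csv.reader(io.StringIO(buf.getvalue()))))  # ['42', 'Smith, John', 'NY']
```

Native format sidesteps the problem entirely because field boundaries are length-prefixed rather than delimited.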

Copy result of SQL Server query to Excel

I'm trying to figure out a way to copy the result of a SQL Server query to Excel. I know that with PostgreSQL, you can execute the following command:
COPY (SELECT id, summary, description FROM table) TO 'C:/test/table.xls';
to achieve the desired result. What is the equivalent method in SQL Server?
And I want to run it as a query statement since I would like to automate this process by running the query with a batch file as a scheduled task.
Try this:
mysql -b -e "$MY_QUERY" > my_data.csv
See this question for reference and more detail:
Convert mysql query results to CSV (with copy/paste)
Try this:
INSERT INTO
OPENROWSET (
'Microsoft.ACE.OLEDB.12.0',
'Excel 8.0;HDR=NO;Database=C:\test\table.xls;',
[Sheet1$]
)
SELECT id, summary, description FROM table
Some limitations:
You must create the empty Excel file first.
You must add column names in the first row, matching the inserted data.
I figured it out; just use BCP (Bulk Copy Program) to do the job, like this:
bcp "select * from [databasename].[dbo].[tablename]" queryout "c:\test\table.csv" -c -t"," -r"\n" -S servername -T
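Note that -t"," has the same limitation as the sqlcmd approach: embedded commas are not quoted. If you need real CSV quoting, one option is a short script that writes the rows out with a CSV library. A sketch, assuming the rows have already been fetched from SQL Server with a driver such as pyodbc (the sample rows here are made up):

```python
import csv

# Rows as they would come back from your SQL driver; sample values only.
rows = [
    (1, "network outage", "router, switch and cabling replaced"),
    (2, "disk failure", "RAID rebuilt"),
]

with open("table.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "summary", "description"])  # header row
    writer.writerows(rows)  # csv quotes the fields containing commas
```

The resulting table.csv opens cleanly in Excel, with the embedded comma kept inside its field, and can be scheduled from a batch file like any other script.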

How to dump a table to a text file in SQL Server 2005?

How to export a table to a text file?
I need to get the INSERT script (structure and data) for an already existing table.
In SQL 2000, try reading about bulk copy; the command should be bcp, I think.
The examples from the MS help file include something like:
Exporting data from a table to a text file:
bcp "SELECT au_fname, au_lname FROM pubs..authors ORDER BY au_lname" queryout Authors.txt -c -Sservername -Usa -Ppassword
Importing data from text file to table:
The command to bulk copy data from Newpubs.dat into publishers2 is:
bcp pubs..publishers2 in newpubs.dat -c -t , -r \n -Sservername -Usa -Ppassword
Alternatively, you can use the BULK INSERT statement from a query tool, such as SQL Query Analyzer, to bulk copy data:
BULK INSERT pubs..publishers2 FROM 'c:\newpubs.dat'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
Gath
In SQL Server 2005 Management Studio, you can use the Import/Export Wizard (not sure if you specifically needed a script, or simply a way to export the structure/data to a file, but this suggestion will do it without an actual script):
right-click on the database containing the table
select Tasks->Export Data
Choose a Data Source screen (after the welcome screen): leave the default values, click Next
Choose a Destination: "Flat File Destination" for the Destination field. Then fill in the file name/path and the other options as you wish, click Next
Select Copy data..., click Next
Select the table to export, click Next
On the Save and Execute Package screen, you can just leave Execute Immediately selected, or if you'd like to save the resulting "script" as a SSIS package you can select that option also. Click Next, then Finish, to execute the export
Your resulting file will have the contents of the table. If you then need to "insert" this data into a different db you can use the "Import Data" option of the Wizard to import the data from the text file into the other database/table.
try this:
http://vyaskn.tripod.com/code.htm#inserts
You can build the INSERT statement programmatically by fetching the column info from information_schema, where each result row describes a column:
SELECT table_name,
ordinal_position,
column_name,
data_type,
is_nullable,
character_maximum_length
FROM information_schema.columns
WHERE table_name LIKE '%TableName%'
ORDER BY ordinal_position
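To sketch how those metadata rows turn into an INSERT script: quote values for character and date types, leave numeric types bare, and double any embedded single quotes. The table, columns, and type list below are hypothetical, and the quoting rule is deliberately simplified:

```python
# Metadata as information_schema.columns would return it:
# (column_name, data_type), already ordered by ordinal_position.
columns = [("id", "int"), ("name", "varchar"), ("created", "datetime")]

# Simplified: types whose values need quoting in the generated SQL.
CHAR_TYPES = {"char", "varchar", "nchar", "nvarchar", "datetime", "date"}

def make_insert(table, columns, row):
    col_list = ", ".join(name for name, _ in columns)
    values = []
    for (name, data_type), value in zip(columns, row):
        if data_type in CHAR_TYPES:
            # Escape embedded single quotes, then wrap in quotes.
            values.append("'" + str(value).replace("'", "''") + "'")
        else:
            values.append(str(value))
    return f"INSERT INTO {table} ({col_list}) VALUES ({', '.join(values)})"

print(make_insert("MyTable", columns, (1, "O'Brien", "2012-01-01")))
# INSERT INTO MyTable (id, name, created) VALUES (1, 'O''Brien', '2012-01-01')
```

Run one make_insert per data row of the table to produce the full script.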
For exporting data, BCP is the tool, and BOL (Books Online) has several decent examples:
bcp AdventureWorks.Sales.Currency out Currency.dat -T -c
You can run an INSERT script generator like this one,
or a desktop tool like this link.
