Export data from SQL database with BCP to CSV - sql-server

I have written this code to export CSV file from my SQL Server database:
BCP [dbo].[testtable] out Z:\MyData3.csv -S testserver.com -U Userid -P password -c -t;
Unfortunately, the two columns in the resulting CSV end up separated by a wide gap instead of a semicolon.
The result I would like to have is a CSV with header
column1;column2
a1;a2
a11;a21
a122;a222
a12;a23
Can someone please help?
Thank you
create table dbo.testtable (column1 nvarchar(500),column2 nvarchar(1000))
insert into dbo.testtable (column1 ,column2 )
values
('a1','a2'),
('a11','a21'),
('a122','a222'),
('a12','a23')
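Two things are likely at play here: an unquoted `;` after `-t` can be swallowed by the shell (so bcp falls back to its default tab terminator, which would explain the wide gap), and bcp never emits a header row on its own. A sketch of both fixes, run from a command prompt; the database name `mydb` is an assumption, since the question never names it:

```shell
:: Quote the semicolon so the shell does not treat it as a command separator:
bcp mydb.dbo.testtable out Z:\MyData3.csv -S testserver.com -U Userid -P password -c -t";"

:: bcp has no header option; a common workaround is queryout with a UNION ALL
:: that prepends the header line as a row:
bcp "SELECT 'column1;column2' UNION ALL SELECT column1 + ';' + column2 FROM dbo.testtable" queryout Z:\MyData3.csv -S testserver.com -U Userid -P password -d mydb -c
```

In the queryout variant the delimiter is concatenated into the single output column, so no `-t` is needed at all.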

Related

obtain the ";" in liquibase output sql when using mssql

Is there any way to control the output of the Liquibase command updateSql when running on Microsoft SQL Server, so that the resulting SQL text is terminated with the ; instead of the GO, similarly to what is done by default for PostgreSQL?
Current output for SQL Server:
CREATE TABLE T0000 ([field1] varchar(255), field2 varchar(255))
GO
ALTER TABLE T0000 ALTER COLUMN [field1] varchar(255) NOT NULL
GO
Output for PostgreSQL:
CREATE TABLE public."T0000" (field1 TEXT, field2 TEXT);
ALTER TABLE public."T0000" ALTER COLUMN field1 SET NOT NULL;
Thanks.
I had some success doing:
cat original-sql-file.sql | sed -z 's/\nGO\n/;\n/g'
but this is done as Liquibase output post-processing.
I would like to know if it is possible to obtain this "natively" with some Liquibase setting/command or whatever.

Import CSV rows with 2000 records from node into SQL Server

I am using the exceljs (https://www.npmjs.com/package/exceljs) node package to read the Excel file.
I am looking for a fast way to import a CSV with 2000 records from node into SQL Server.
Table:
User [Id PK]
Role [UserId FK]
Unit [UserId FK]
Excel file:
UserName Role Unit
Jack 1 Unit1
@Furqan, did you try executing the BULK INSERT command to import the CSV file contents into a SQL Server database table?
Here is a sample target table:
create table UploadTable (
[User] varchar(100),
[Role] varchar(100),
Unit varchar(100)
)
After you create your table, try the following SQL command:
BULK INSERT UploadTable FROM 'c:\kodyaz\upload.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0A'
)
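Once the rows are staged in UploadTable, they can be split out into the normalized tables. This is only a sketch: the question shows just the keys (User.Id PK, Role.UserId FK, Unit.UserId FK), so the non-key column names (UserName, RoleName, UnitName) and the IDENTITY assumption on User.Id are made up for illustration:

```sql
-- Hypothetical sketch: move staged rows into the normalized tables.
-- Assumes dbo.[User] is (Id INT IDENTITY PRIMARY KEY, UserName VARCHAR(100))
-- and Role/Unit carry a UserId FK plus one name column each.
INSERT INTO dbo.[User] (UserName)
SELECT DISTINCT u.[User]
FROM dbo.UploadTable AS u;

INSERT INTO dbo.[Role] (UserId, RoleName)
SELECT usr.Id, u.[Role]
FROM dbo.UploadTable AS u
JOIN dbo.[User] AS usr ON usr.UserName = u.[User];

INSERT INTO dbo.Unit (UserId, UnitName)
SELECT usr.Id, u.Unit
FROM dbo.UploadTable AS u
JOIN dbo.[User] AS usr ON usr.UserName = u.[User];
```

Doing the split in set-based T-SQL like this is typically much faster for 2000 rows than issuing per-row INSERTs from node.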

BCP use produces incorrect syntax error

I am new to the bcp tool. Below is sample code copied from the Microsoft site which I adapted for our server. This is for SQL server.
It works fine until the last line, which is the bcp call.
There I get the error message
"Msg 102, Level 15, State 1, Line 3 Incorrect syntax near '-'."
Would this be some kind of issue where bcp is not installed on our server?
Thanks
USE CH_Reports;
GO
if OBJECT_ID('myTestCharData','U') is not null -- 'U' = user table; 'Table' is not a valid type code
drop table myTestCharData
CREATE TABLE myTestCharData (
Col1 smallint,
Col2 nvarchar(50),
Col3 nvarchar(50)
);
INSERT INTO myTestCharData(Col1,Col2,Col3)
VALUES(1,'DataField2','DataField3');
INSERT INTO myTestCharData(Col1,Col2,Col3)
VALUES(2,'DataField2','DataField3');
GO
SELECT Col1,Col2,Col3 FROM myTestCharData
-bcp myTestCharData out C:\myTestCharData-c.Dat -c -t, -T
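The `Incorrect syntax near '-'` error is consistent with running that last line inside the SSMS query window: bcp is a command-line utility, not a T-SQL statement, so the parser rejects it (it does not mean bcp is missing from the server). A sketch of the two usual ways to invoke it; the server name `yourserver` is an assumption, and xp_cmdshell must be enabled for the second form:

```sql
-- From a regular command prompt (not SSMS), with the table fully qualified:
-- bcp CH_Reports.dbo.myTestCharData out C:\myTestCharData-c.Dat -c -t, -T -S yourserver

-- Or from within T-SQL via xp_cmdshell, if it is enabled on the instance:
EXEC master..xp_cmdshell
    'bcp CH_Reports.dbo.myTestCharData out C:\myTestCharData-c.Dat -c -t, -T -S yourserver';
```

Note also that the leading `-` in `-bcp` is itself a typo; even at a command prompt the call must start with `bcp`.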

How to select specific columns to import using BCP without a configuration file?

I'm using the BCP command to import into SQL Server 2005 through a configuration (format) file. So:
EXEC master..xp_cmdshell 'BCP database.dbo.table in d:\folder\foo.csv -f d:\folder\configuration.xml -c -t, -T -F 2 '
I want to import certain columns without having to use the configuration file 'configuration.xml'.
Why don't you try it with a query? It returns the data, and you can pass that into your table:
INSERT INTO tablename
SELECT * FROM
OPENROWSET ('MSDASQL', 'Driver={Microsoft Text Driver (*.txt; *.csv)};DBQ=D:\sam;', 'SELECT name,id from sam.csv');
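Another option that avoids the format file entirely, assuming the CSV's columns line up with a subset of the table's columns: create a view exposing only the columns you want and point BCP at the view instead of the table (the view name and column list below are made up for illustration):

```sql
-- Hypothetical: the table has more columns than the CSV supplies,
-- so expose only the imported columns through an updatable view.
CREATE VIEW dbo.table_import AS
SELECT name, id FROM database.dbo.[table];
GO

-- BCP can load into the view; unlisted columns get their defaults/NULL:
EXEC master..xp_cmdshell 'BCP database.dbo.table_import in d:\folder\foo.csv -c -t, -T -F 2';
```

This works because BCP (like BULK INSERT) accepts an updatable view as the target, which effectively does the column selection the format file would otherwise do.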

How can I insert into to a SQL Server database data from an online CSV?

I need to perform a dataload every day from a csv available online e.g. http://www.supplier.com/products.csv
Once I've dumped the CSV into a SQL table, I can do the processing I need to update/insert etc. The problem is that I don't know how to automate the data load.
I was hoping I could use a SQL job / task, scheduled to run each day at 06:00, give it a uri and that it could then access the data in the csv...
How can I do that?
You can schedule a SQL Agent job to download the file locally and use BULK INSERT:
CREATE TABLE StagingCSV
(
col1 VARCHAR(60),
col2 VARCHAR(60),
col3 VARCHAR(60),
col4 VARCHAR(60),
-- ...
)
GO
(error rows, up to the limit set by the MAXERRORS option, will be skipped)
BULK
INSERT StagingCSV
FROM 'c:\mycsvfile.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
Other methods:
About Bulk Import and Bulk Export Operations
Importing Bulk Data by Using BULK INSERT or OPENROWSET
You can use Powershell to download a file:
$clnt = new-object System.Net.WebClient
$url = "http://www.supplier.com/products.csv"
$file = "c:\temp\Mycsv.txt"
$clnt.DownloadFile($url, $file)
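To get the 06:00 daily schedule the question asks for, the download step and the BULK INSERT can be wrapped in a SQL Agent job. A rough sketch; the job, step, and schedule names are made up, and the parameters are worth double-checking against the msdb documentation for your version:

```sql
USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'DailyProductsLoad';

-- Step 1: download the CSV (PowerShell subsystem, SQL Server 2008+):
EXEC dbo.sp_add_jobstep @job_name = N'DailyProductsLoad',
     @step_name = N'Download CSV', @subsystem = N'PowerShell',
     @command = N'(New-Object System.Net.WebClient).DownloadFile("http://www.supplier.com/products.csv", "c:\temp\Mycsv.txt")';

-- Step 2: load the downloaded file into the staging table:
EXEC dbo.sp_add_jobstep @job_name = N'DailyProductsLoad',
     @step_name = N'Bulk insert', @subsystem = N'TSQL',
     @command = N'BULK INSERT StagingCSV FROM ''c:\temp\Mycsv.txt'' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';

-- Daily at 06:00 (freq_type 4 = daily; active_start_time is HHMMSS):
EXEC dbo.sp_add_schedule @schedule_name = N'Daily0600',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 60000;
EXEC dbo.sp_attach_schedule @job_name = N'DailyProductsLoad', @schedule_name = N'Daily0600';
EXEC dbo.sp_add_jobserver @job_name = N'DailyProductsLoad';
```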
Another simple (although not free, but still rather cheap) solution is to use the SQL# library which would allow you to do this in just a few lines of T-SQL. This would make it really easy to automate via a SQL Agent Job.
You could emulate the Powershell method (suggested by Mitch) with a single command to grab the CSV file and then read it into the table with another command:
DECLARE @Dummy VARBINARY(1)
SELECT @Dummy = SQL#.INET_DownloadFile('http://www.location.tld/file.csv',
'C:\file.csv')
INSERT INTO dbo.RealTable (Column1, Column2, ...)
EXEC SQL#.File_SplitIntoFields 'C:\file.csv', ',', 0, NULL, NULL
OR, you could bypass going to the file system by reading the CSV file straight into a local variable, splitting that on the carriage-returns into a Temp Table, and then split that into your table:
CREATE TABLE #CSVRows (CSV VARCHAR(MAX))
DECLARE @Contents VARBINARY(MAX)
SELECT @Contents = SQL#.INET_DownloadFile('http://www.location.tld/file.csv',
NULL)
INSERT INTO #CSVRows (CSV)
SELECT SplitVal
FROM SQL#.String_Split(CONVERT(VARCHAR(MAX), @Contents),
CHAR(13) + CHAR(10), 1)
INSERT INTO dbo.RealTable (Column1, Column2, ...)
EXEC SQL#.String_SplitIntoFields 'SELECT CSV FROM #CSVRows', ',', NULL
You can find SQL# at: http://www.SQLsharp.com/
I am the author of the SQL# library, but this seems like a valid solution to the question.
I have not seen an example where you can bulk insert directly from a URL.
So, for the remainder, use a SQL job and BULK INSERT.
Bulk inserts made easy: http://www.mssqltips.com/tip.asp?tip=1207
Here's a quick excerpt:
BULK INSERT dbo.ImportTest FROM
'C:\ImportData.txt' WITH (
FIELDTERMINATOR =',', FIRSTROW = 2 )
You can also perform the file download by using an Integration Services Task:
http://www.sqlis.com/post/Downloading-a-file-over-HTTP-the-SSIS-way.aspx