Azure Synapse has the Bulk Insert option in its GUI for inserting tables.
But what is the underlying code that it is running? I would like to run it as T-SQL rather than as a pipeline.
The documentation is unclear about whether that is even supported, and variations of the statement below all fail.
Running the following yields errors:
INSERT INTO [schema].[table]
SELECT * FROM OPENROWSET(
BULK 'filename.parquet',
FORMAT = 'PARQUET'
)
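For what it's worth, the GUI's bulk load does not necessarily correspond to a single T-SQL statement (it may use the bulk-load API, as bcp does), but the closest pure T-SQL equivalent for loading a Parquet file into a dedicated SQL pool is COPY INTO. A minimal sketch, assuming a dedicated pool and a hypothetical storage account and container (swap in your real URL, plus a CREDENTIAL clause if the storage is not publicly readable):
COPY INTO [schema].[table]
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/filename.parquet'
WITH (
    FILE_TYPE = 'PARQUET'
);
On a serverless pool, OPENROWSET is supported, but the BULK argument must be a full URL or be paired with a DATA_SOURCE, and the rowset needs an alias - which is likely why the relative 'filename.parquet' above fails.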
Related
I want to bulk insert records from a SELECT query result, but I think in SQL Server we can only bulk insert from a file.
Is there a way to perform the bulk insert from a SELECT query, ideally one that is as fast as or faster than a file-based bulk insert?
Many thanks in advance
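For reference, loading a query result into an existing table is plain INSERT ... SELECT; BULK INSERT only applies to files. A minimal sketch with hypothetical table and column names; the TABLOCK hint can make the insert minimally logged under the simple or bulk-logged recovery model, which gets close to bulk-load speed:
INSERT INTO dbo.TargetTable WITH (TABLOCK) (Col1, Col2)
SELECT Col1, Col2
FROM dbo.SourceTable;
If the target table does not exist yet, SELECT ... INTO dbo.NewTable achieves the same thing and can likewise be minimally logged.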
I need to modify an MS SQL "job" and add a step. I am creating the step in SSMS to test what I am doing. I am on a DEV server.
I need to do a SELECT INTO to create or populate a table. The only complication is that the FROM clause references a "Linked Server" that is Oracle. The basic query is:
SELECT *
INTO MyDatabase.MySchema.MyTable
FROM LinkedServer..RemoteSchema.RemoteTable
I get two errors reported in SSMS:
- No matter what I call the "new" local table, SSMS reports that it is an invalid object.
- I am told that there is a syntax error near FROM.
In the existing DB job there are several examples of this sort of usage. I am just not sure why it is failing here.
What have I tried? I have tried the following, both in SSMS on my desktop and RDP'd into the DEV server as an 'admin' user using SSMS there.
SELECT *
INTO MyDatabase.MySchema.MyTable
FROM LinkedServer..RemoteSchema.RemoteTable
--
USE MyDatabase;
SELECT *
INTO MySchema.MyTable
FROM LinkedServer..RemoteSchema.RemoteTable
--
SELECT *
INSERT INTO MyDatabase.MySchema.MyTable
FROM OPENQUERY(LinkedServer, '
select * from RemoteSchema.RemoteTable
');
--
SELECT *
INTO MyDatabase.MySchema.foo
FROM MyDatabase.MySchema.ExistingTable
In the last instance above I am making sure that the source table exists and that the target table does not. I think I am following the rules from HERE
What am I missing?
EDIT
What I was missing was a giant typo. I was actually using incorrect syntax like the third example above: select * INSERT into.... I was blind to the word "INSERT" in my SSMS query window and managed to edit it out of most of the examples above.
You should create an empty table and then insert rows from the linked server into the table.
CREATE TABLE #MyTable (
    col1 INT,          -- placeholder types: match the remote columns
    col2 VARCHAR(50)   -- ...
);

INSERT INTO #MyTable (col1, col2, ...)
SELECT col1, col2
FROM LinkedServer..RemoteSchema.RemoteTable;
I've got 2 firebird databases
c:\db1.gdb
c:\db2.gdb
both databases' schemas are the same, so both contain a table
MyTable
-------
Id int
Name varchar(50)
...etc
It's guaranteed that the data is different in the two databases; I need to copy the rows from db2.MyTable into db1.MyTable.
A requirement is that I do this using the firebird isql tool.
How would I, using isql:
- connect to both DBs in one isql command window
- run a SQL statement that selects all rows from the table in db2 and inserts them into the same table in db1
I'm using firebird 1.5
This is not possible with FB 1.5. You can do this with Firebird 2.5 using the new EXECUTE STATEMENT ... ON EXTERNAL feature, which makes it possible to access another Firebird database from inside triggers, procedures and code blocks.
If you really must do it using isql, then you could write a select statement which produces insert statements; run the select in db2, save the result into a file, and then execute the statements in db1. The select statement would be something like:
select 'insert into MyTable(id, name) values ('
       || cast(id as varchar(10)) || ',''' || name || ''');'
from MyTable;
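A sketch of that round trip inside isql, assuming hypothetical file paths and credentials:
/* connected to db2: spool the generated INSERT statements to a file */
OUTPUT c:\inserts.sql;
select 'insert into MyTable(id, name) values ('
       || cast(id as varchar(10)) || ',''' || name || ''');'
from MyTable;
OUTPUT;

/* switch the connection to db1 and replay the script */
CONNECT 'c:\db1.gdb' USER 'SYSDBA' PASSWORD 'masterkey';
INPUT c:\inserts.sql;
COMMIT;
Note that isql also writes column headers and separator lines into the spooled file, so it may need a quick clean-up before being replayed.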
However the job is much easier to do using something like Clever Components' Interbase DataPump.
I am migrating several hundred stored procedures from one server to another, so I wanted to write a stored procedure to execute an SP on each server and compare the output for differences.
In order to do this, I would normally use this syntax to get the results into tables:
select * into #tmp1 from OpenQuery(LocalServer,'exec usp_MyStoredProcedure')
select * into #tmp2 from OpenQuery(RemoteServer,'exec usp_MyStoredProcedure')
I then would union them and do a count, to get how many rows differ in the results:
select * into #tmp3
from (select * from #tmp1 union select * from #tmp2) as combined
select count(*) from #tmp1
select count(*) from #tmp3
However, in this case, my stored procedure contains an OpenQuery, so when I try to put the exec into an OpenQuery, the query fails with the error:
The operation could not be performed because OLE DB provider "SQLNCLI"
for linked server "RemoteServer" was unable to begin a distributed transaction.
Are there any good workarounds to this? Or does anybody have any clever ideas for things I could do to make this process go more quickly? Because right now, it seems that I would have to run the SP on each server, script the results into tmp tables, then do the compare. That seems like a poor solution!
Thank you for taking the time to read this, and any help would be appreciated greatly!
I think your method would work - you just need to start the MSDTC. This behavior occurs if the Distributed Transaction Coordinator (MSDTC) service is disabled or if network DTC access is disabled. By default, network DTC access is disabled in Windows. When running and configured properly, the OLE DB provider would be able to start the distributed transaction.
Check out this for instructions; it applies to Windows Server 2003 and 2008.
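If reconfiguring DTC is not an option, another workaround (assuming SQL Server 2008 or later) is to stop calls through that linked server from being escalated to a distributed transaction in the first place:
-- keep remote queries through this linked server in a local
-- transaction instead of promoting them to MSDTC
EXEC master.dbo.sp_serveroption
    @server = N'RemoteServer',
    @optname = N'remote proc transaction promotion',
    @optvalue = N'false';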
Similar to your question.
Insert results of a stored procedure into a temporary table
I need to perform a dataload every day from a csv available online e.g. http://www.supplier.com/products.csv
Once I've dumped the CSV into a SQL table I can do the processing I need (updates, inserts, etc.). The problem is that I don't know how to automate the data load.
I was hoping I could use a SQL job / task, scheduled to run each day at 06:00, give it a uri and that it could then access the data in the csv...
How can I do that?
You can schedule a SQL Agent job to download the file locally and use BULK INSERT:
CREATE TABLE StagingCSV
(
col1 VARCHAR(60),
col2 VARCHAR(60),
col3 VARCHAR(60),
col4 VARCHAR(60),
-- ...
)
GO
(error rows will be ignored, up to BULK INSERT's default MAXERRORS limit of 10)
BULK INSERT StagingCSV
FROM 'c:\mycsvfile.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
Other methods:
About Bulk Import and Bulk Export Operations
Importing Bulk Data by Using BULK INSERT or OPENROWSET
You can use Powershell to download a file:
$clnt = new-object System.Net.WebClient
$url = "http://www.supplier.com/products.csv"
$file = "c:\temp\Mycsv.txt"
$clnt.DownloadFile($url, $file)
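A sketch of wiring the download and the load into one T-SQL job step, assuming the PowerShell above is saved as a script at a hypothetical path and that xp_cmdshell is enabled on the instance:
-- step 1: download the CSV (hypothetical script path)
EXEC master..xp_cmdshell 'powershell.exe -ExecutionPolicy Bypass -File "c:\jobs\download.ps1"';

-- step 2: load the downloaded file into the staging table
BULK INSERT StagingCSV
FROM 'c:\temp\Mycsv.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
Alternatively, make the download its own Agent job step of type PowerShell, which avoids xp_cmdshell entirely.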
Another simple (although not free, but still rather cheap) solution is to use the SQL# library which would allow you to do this in just a few lines of T-SQL. This would make it really easy to automate via a SQL Agent Job.
You could emulate the Powershell method (suggested by Mitch) with a single command to grab the CSV file and then read it into the table with another command:
DECLARE @Dummy VARBINARY(1)
SELECT @Dummy = SQL#.INET_DownloadFile('http://www.location.tld/file.csv',
                                       'C:\file.csv')
INSERT INTO dbo.RealTable (Column1, Column2, ...)
EXEC SQL#.File_SplitIntoFields 'C:\file.csv', ',', 0, NULL, NULL
Or, you could bypass going to the file system by reading the CSV file straight into a local variable, splitting it on the carriage returns into a temp table, and then splitting that into your table:
CREATE TABLE #CSVRows (CSV VARCHAR(MAX))
DECLARE @Contents VARBINARY(MAX)
SELECT @Contents = SQL#.INET_DownloadFile('http://www.location.tld/file.csv',
                                          NULL)
INSERT INTO #CSVRows (CSV)
SELECT SplitVal
FROM SQL#.String_Split(CONVERT(VARCHAR(MAX), @Contents),
                       CHAR(13) + CHAR(10), 1)
INSERT INTO dbo.RealTable (Column1, Column2, ...)
EXEC SQL#.String_SplitIntoFields 'SELECT CSV FROM #CSVRows', ',', NULL
You can find SQL# at: http://www.SQLsharp.com/
I am the author of the SQL# library, but this seems like a valid solution to the question.
I have not seen an example where you can bulk insert directly from a URL, so for the remainder, use a SQL job and BULK INSERT.
Bulk inserts made easy: http://www.mssqltips.com/tip.asp?tip=1207
Here's a quick excerpt:
BULK INSERT dbo.ImportTest
FROM 'C:\ImportData.txt'
WITH (FIELDTERMINATOR = ',', FIRSTROW = 2)
You can also perform the file download by using an Integration Services Task:
http://www.sqlis.com/post/Downloading-a-file-over-HTTP-the-SSIS-way.aspx