Insert into linked server from local table - sql-server

I am trying to insert some data from my local table into a linked server table via SQL Server. This is what I am doing, but it keeps throwing a syntax error.
TRY 1
EXEC(
'INSERT into test.testschema.testoperation
(viasatsubscriptionID, subscriptionrowdate, phonenumberday, viasatcustomerid)
SELECT * FROM rdata.dbo.testoperation'
) AT REDSHIFT64
TRY 2
EXEC(
'INSERT into test.testschema.testoperation
(viasatsubscriptionID, subscriptionrowdate, phonenumberday, viasatcustomerid)'
) AT REDSHIFT64
SELECT * FROM rdata.dbo.testoperation
Both fail.
Any thoughts on where I am going wrong?

rdata.dbo.testoperation is your local table, and since your query runs on the remote server, that table does not exist there.
Why don't you try inserting directly to the remote table:
INSERT into REDSHIFT64.test.testschema.testoperation
(viasatsubscriptionID, subscriptionrowdate, phonenumberday, viasatcustomerid)
SELECT * FROM rdata.dbo.testoperation
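If you do need EXEC ... AT REDSHIFT64 (for example, to send Redshift-specific SQL), the statement can only reference objects that exist on the remote side. A minimal sketch with placeholder literal values, not the OP's data:
EXEC(
'INSERT into test.testschema.testoperation
(viasatsubscriptionID, subscriptionrowdate, phonenumberday, viasatcustomerid)
VALUES (1, ''2015-01-01'', 1, 1)'
) AT REDSHIFT64
-- the values are placeholders; match them to the remote column types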

Related

MS SQL "SELECT INTO" table creation issues

I need to modify an MS SQL "job" and add a step. I am creating the step in SSMS to test what I am doing. I am on a DEV server.
I need to do a SELECT INTO to create or populate a table. The only complication is that the FROM clause references a "Linked Server" that is Oracle. The basic query is:
SELECT *
INTO MyDatabase.MySchema.MyTable
FROM LinkedServer..RemoteSchema.RemoteTable
I get two errors reported in SSMS:
No matter what I call the "new" local table SSMS reports that it is an invalid object.
I am told that there is a syntax error near FROM
In the existing DB job there are several examples of this sort of usage. I am just not sure why it is failing here.
What have I tried? I have tried the following in SSMS on my desktop, and also RDP'd into the DEV server as an 'admin' user to use SSMS there.
SELECT *
INTO MyDatabase.MySchema.MyTable
FROM LinkedServer..RemoteSchema.RemoteTable
--
USE MyDatabase;
SELECT *
INTO MySchema.MyTable
FROM LinkedServer..RemoteSchema.RemoteTable
--
SELECT *
INSERT INTO MyDatabase.MySchema.MyTable
FROM OPENQUERY(LinkedServer, '
select * from RemoteSchema.RemoteTable
');
--
SELECT *
INTO MyDatabase.MySchema.foo
FROM MyDatabase.MySchema.ExistingTable
In the last instance above I am making sure that the source table exists and that the target table does not. I think I am following the rules from HERE
What am I missing?
EDIT
What I was missing was a giant typo. I was actually using incorrect syntax like the third example above: select * INSERT into.... I was blind to the word "INSERT" in my SSMS query window and managed to edit it out of most of the examples above.
You should create an empty table and then insert rows from the linked server into the table.
Create table #MyTable (
col1 int -- placeholder types; match them to the remote table's columns
, col2 varchar(100) -- ...
);
INSERT INTO #MyTable (col1, col2 ...)
SELECT col1, col2
FROM LinkedServer..RemoteSchema.RemoteTable
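Alternatively, once the stray INSERT from the typo is removed, SELECT ... INTO works directly against OPENQUERY too, which can behave better with Oracle providers; a sketch using the question's placeholder names:
SELECT *
INTO MyDatabase.MySchema.MyTable
FROM OPENQUERY(LinkedServer, 'select * from RemoteSchema.RemoteTable');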

SQL Server 2012 - Insert into linked server table using openquery

I have a linked server Remoteserver containing a table that holds file and folder names from a directory listing.
When I am on the remote server I can run a built-in procedure (xp_dirtree) and populate the 'files' table, but what I need to do is run a query from the local SQL server that does this:
Delete all records from the [Files] table on Remoteserver
Insert data that comes from the stored procedure:
INSERT [Remoteserver].[dbo].[files] (subdirectory,depth,isfile)
EXEC master.sys.xp_dirtree '\\Fileserver\DBBackup',1,1;
Select the 'subdirectory' column
I tried some things using OPENQUERY and I am able to select existing records but unable to do the insert.
Any help is appreciated.
Try this
INSERT INTO OPENQUERY([Remoteserver]
,'SELECT subdirectory, depth, [file] FROM [RemoteDB].[dbo].[files]')
EXEC master.sys.xp_dirtree '\\fileserver\DBBackup', 1, 1;
OR
INSERT INTO OPENQUERY([Remoteserver]
,'SELECT subdirectory,depth, [file] FROM [RemoteDB].[dbo].[files]')
select * from OPENQUERY([another_server_name], 'master.sys.xp_dirtree ''\\fileserver\DBBackup\temp'', 1, 1');
But in general you do not need to use OPENQUERY at all if Fileserver and Remoteserver are accessible from the local machine.
INSERT INTO [Remoteserver].[RemoteDB].[dbo].[files] (subdirectory, depth, isfile)
EXEC master.sys.xp_dirtree '\\Fileserver\DBBackup',1,1;
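The question's first step (clearing the remote table) works the same way with four-part names, assuming the insert above works in your environment; a sketch:
DELETE FROM [Remoteserver].[RemoteDB].[dbo].[files];
INSERT INTO [Remoteserver].[RemoteDB].[dbo].[files] (subdirectory, depth, isfile)
EXEC master.sys.xp_dirtree '\\Fileserver\DBBackup', 1, 1;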

Linked Server Insert-Select Performance

Assume that I have a table on my local server, Local_Table, and I have another server with another DB and table, Remote_Table (the table structures are the same).
Local_Table has data, Remote_Table doesn't. I want to transfer data from Local_Table to Remote_Table with this query:
Insert into RemoteServer.RemoteDb..Remote_Table
select * from Local_Table (nolock)
But the performance is quite slow.
However, when I use SQL Server import-export wizard, transfer is really fast.
What am I doing wrong? Why is it fast with Import-Export wizard and slow with insert-select statement? Any ideas?
The fastest way is to pull the data rather than push it. When the data is pushed, every row requires a connection, an insert, and a disconnect.
If you can't pull the data because you have a one-way trust relationship between the servers, the workaround is to construct the entire insert as one giant T-SQL statement and run it all at once.
DECLARE @xml XML
SET @xml = (
SELECT 'insert Remote_Table values (' + '''' + isnull(first_col, 'NULL') + ''',' +
-- repeat for each col
'''' + isnull(last_col, 'NULL') + '''' + ');'
FROM Local_Table
FOR XML path('')
) --This concatenates all the rows into a single XML object; the empty path keeps it from having <colname></colname> wrapped around each value
DECLARE @sql AS VARCHAR(max)
SET @sql = 'set nocount on;' + cast(@xml AS VARCHAR(max)) + 'set nocount off;' --Converts the XML back to one long string
EXEC ('use RemoteDb;' + @sql) AT RemoteServer
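One caveat with the sketch above: a value containing a single quote will break the generated statement. A hedged refinement for a character column (first_col is the answer's placeholder name):
SELECT 'insert Remote_Table values (''' +
REPLACE(ISNULL(first_col, 'NULL'), '''', '''''') + ''');'
FROM Local_Table
FOR XML path('')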
It seems like it's much faster to pull data from a linked server than to push data to a linked server: Which one is more efficient: select from linked server or insert into linked server?
Update: My own, recent experience confirms this. Pull if possible -- it will be much, much faster.
Try this on the other server (that is, run the insert on the destination so it pulls across the linked server):
INSERT INTO Local_Table
SELECT * FROM RemoteServer.RemoteDb..Remote_Table
The Import/Export wizard will essentially be doing this as a bulk insert, whereas your code is not.
Assuming that you have a clustered index on the remote table, make sure that you have the same clustered index on the local table, set trace flag 610 globally on your remote server, and make sure the remote database is in simple or bulk-logged recovery mode.
If your remote table is a heap (which will speed things up anyway), make sure your remote database is in simple or bulk-logged mode, and change your code to read as follows:
INSERT INTO RemoteServer.RemoteDb..Remote_Table WITH(TABLOCK)
SELECT * FROM Local_Table WITH (nolock)
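For reference, the trace flag and recovery-model changes mentioned above look like this; run on the remote server, requires sysadmin (trace flag 610 applies to SQL Server 2008-2014; its behavior is on by default from 2016):
DBCC TRACEON (610, -1); -- -1 enables the flag globally
ALTER DATABASE RemoteDb SET RECOVERY BULK_LOGGED; -- or SIMPLE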
The reason it's so slow to insert into the remote table from the local table is that it inserts a row, checks that it inserted, then inserts the next row, checks that it inserted, and so on.
Don't know if you figured this out or not, but here's how I solved this problem using linked servers.
First, I have a LocalDB.dbo.Table with several columns:
IDColumn (int, PK, Auto Increment)
TextColumn (varchar(30))
IntColumn (int)
And I have a RemoteDB.dbo.Table that is almost the same:
IDColumn (int)
TextColumn (varchar(30))
IntColumn (int)
The main difference is that the remote IDColumn isn't set up as an ID column, so that I can do inserts into it.
Then I set up a trigger on the remote table that fires on delete:
Create Trigger Table_Del
On [Table]
After Delete
AS
Begin
Set NOCOUNT ON;
Insert Into [Table] (IDColumn, TextColumn, IntColumn)
Select IDColumn, TextColumn, IntColumn from MainServer.LocalDB.dbo.[table] L
Where not exists (Select * from [Table] R Where L.IDColumn = R.IDColumn)
END
Then when I want to do an insert, I do it like this from the local server:
Insert Into LocalDB.dbo.Table (TextColumn, IntColumn) Values ('textvalue', 123);
Delete From RemoteServer.RemoteDB.dbo.Table Where IDColumn = 0;
--And if I want to clean the table out and make sure it has all the most up to date data:
Delete From RemoteServer.RemoteDB.dbo.Table
By triggering the remote server to pull the data from the local server and then do the insert, I was able to turn a job that took 30 minutes to insert 1258 lines into a job that took 8 seconds to do the same insert.
This does require a linked server connection on both sides, but after that's set up it works pretty well.
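A minimal sketch of the two-way linked-server setup this relies on (server names are placeholders; run the equivalent on each machine, pointing at the other):
EXEC sp_addlinkedserver @server = N'MainServer', @srvproduct = N'',
@provider = N'SQLNCLI', @datasrc = N'MainServerHostName';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'MainServer', @useself = 'true';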
Update:
So in the last few years I've made some changes, and have moved away from the delete trigger as a way to sync the remote table.
Instead I have a stored procedure on the remote server that has all the steps to pull the data from the local server:
CREATE PROCEDURE [dbo].[UpdateTable]
-- Add the parameters for the stored procedure here
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
--Fill Temp table
Insert Into WebFileNamesTemp Select * From MAINSERVER.LocalDB.dbo.WebFileNames
--Fill normal table from temp table
Delete From WebFileNames
Insert Into WebFileNames Select * From WebFileNamesTemp
--empty temp table
Delete From WebFileNamesTemp
END
And on the local server I have a scheduled job that does some processing on the local tables, and then triggers the update through the stored procedure:
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc', @optvalue='true'
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc out', @optvalue='true'
EXEC REMOTESERVER.RemoteDB.dbo.UpdateTable
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc', @optvalue='false'
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc out', @optvalue='false'
If you must push data from the source to the target (e.g., for firewall or other permissions reasons), you can do the following:
In the source database, convert the recordset to a single XML string (i.e., multiple rows and columns combined into a single XML string).
Then push that XML over as a single row (as a varchar(max), since XML isn't allowed over linked databases in SQL Server).
DECLARE @xml XML
SET @xml = (select * from SourceTable FOR XML path('row'))
Insert into TempTargetTable values (cast(@xml AS VARCHAR(max)))
In the target database, cast the varchar(max) as XML and then use XML parsing to turn that single row and column back into a normal recordset.
DECLARE @X XML = (select '<toplevel>' + ImportString + '</toplevel>' from TempTargetTable)
DECLARE @iX INT
EXEC sp_xml_preparedocument @iX output, @X
insert into TargetTable
SELECT [col1],
[col2]
FROM OPENXML(@iX, '//row', 2)
WITH ([col1] [int],
[col2] [varchar](128)
)
EXEC sp_xml_removedocument @iX
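For what it's worth, the same shredding can be done with the xml nodes() method instead of the sp_xml_preparedocument/OPENXML pair (a sketch with the same hypothetical columns; no <toplevel> wrapper is needed because the xml type accepts a fragment):
DECLARE @X XML = (select cast(ImportString AS XML) from TempTargetTable)
insert into TargetTable (col1, col2)
SELECT x.value('(col1)[1]', 'int'),
x.value('(col2)[1]', 'varchar(128)')
FROM @X.nodes('/row') AS t(x)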
I've found a workaround. Since I'm not a big fan of GUI tools like SSIS, I've reused a bcp script to unload the table to CSV and load it back. Yeah, it's odd that there's bulk-operation support for files but not for tables. Feel free to edit the following script to fit your needs:
exec xp_cmdshell 'bcp "select * from YourLocalTable" queryout C:\CSVFolder\Load.csv -w -T -S .'
exec xp_cmdshell 'bcp YourAzureDBName.dbo.YourAzureTable in C:\CSVFolder\Load.csv -S yourdb.database.windows.net -U youruser@yourdb.database.windows.net -P yourpass -q -w'
Pros:
No need to define table structures every time.
I've tested it and it worked way faster than inserting directly through the linked server.
It's easier to manage than XML (which is limited to varchar(max) length anyway).
No need for an extra layer of abstraction (tools like SSIS).
Cons:
Uses the external tool bcp through the xp_cmdshell interface.
Table properties are lost after exporting/importing CSV (datatypes, nullability, lengths, separators within values, etc.).
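Note that xp_cmdshell is disabled by default; if you take this route it has to be enabled first (requires sysadmin):
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;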

How to run remote sproc via linked server and store results in temp table on a clustered server

I need to be able to run a remote sproc and store its results in a temp table so that further processing can be done against the data. I can run the below exec statement on its own just fine and get the data back; however, when trying to insert into the temp table, I get the following error message:
OLE DB provider "SQLNCLI" for linked server "LinkedServerName" returned message "No transaction is active.".
Msg 7391, Level 16, State 2, Line 8
The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "LinkedServerName" was unable to begin a distributed transaction.
I don't want to use a join because it is being extremely slow, so I thought I'd try selecting the data I need by calling a remote sproc into a temp table, then work with it that way.
I've tried following instructions here with no luck:
http://sql-articles.com/blogs/linked-server-problem-windows-2003-sp1-setting-msdtc-security-configuration/
I believe the main problem is that the source server (where I'm running the below SQL) is a clustered server, and that I'm missing some setting for DTC.
Any ideas?
--drop table #tmp
CREATE TABLE #tmp
(
col1 int,
col2 int
);
insert into #tmp (col1, col2)
exec [LinkedServerName].[RemoteDBName].dbo.remote_sproc '04/01/2011', '04/06/2011'
select * from #tmp
While I didn't find a way to use distributed transactions on a clustered server setup, I did find an alternative way to grab the data remotely using OPENROWSET. Performance-wise, it seemed very similar to using a linked server, and it is working well in our production environment.
/*
-- run the following once to configure SQL server to use OPENROWSET...
sp_configure 'Show Advanced Options', 1
GO
RECONFIGURE
GO
sp_configure 'Ad Hoc Distributed Queries', 1
GO
RECONFIGURE
GO
*/
-- still need a table to store the result set in to work
-- with the data after we grab it...
declare @table table
(
col1 int,
col2 int
);
-- use openrowset instead of a linked server
insert into @table
select *
FROM OPENROWSET('SQLNCLI', 'Server=HOSTNAME;Uid=USERNAME;Pwd=PASSWORD',
'EXEC DBName.dbo.sprocName ''Param1'', ''Param2''')
select * from @table
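As a side note, on SQL Server 2008 and later the "unable to begin a distributed transaction" error from INSERT ... EXEC over a linked server can often be avoided by turning off transaction promotion for that linked server; a sketch (test before relying on it):
EXEC sp_serveroption @server = 'LinkedServerName',
@optname = 'remote proc transaction promotion',
@optvalue = 'false';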

Using temp tables in SSIS

I've created an ADO.NET connection manager, and a DataReader source with the following SQL Command:
select
'test' as testcol
INTO
#tmp
select * from #tmp
If I click the refresh button in the DataReader component, I get a SqlException "Invalid object name '#tmp'". The SQL statement itself is clearly valid and executes properly in SQL Server Management Studio. I've also tried setting DelayValidation on the connection manager, to no avail.
Is the error on the INSERT or the SELECT?
If you are issuing only one command that contains both the INSERT and the SELECT, try putting a semicolon before the SELECT.
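For example, the OP's command with the separating semicolon added:
select
'test' as testcol
INTO
#tmp;
select * from #tmp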
EDIT after OP comment
Encapsulate all the logic within a stored procedure:
CREATE PROCEDURE YourProcedureName
AS
select
'test' as testcol
INTO
#tmp
select * from #tmp
GO
then have your application run this single SQL command:
exec YourProcedureName
EDIT after next OP comment
The OP doesn't say which SQL Server version they are using; if 2005 or later, try a CTE:
;with CTEtemp as
(
select
'test' as testcol
)
select * from CTEtemp
Why couldn't this be replaced with a "SELECT 'test' as testcol"? The SSIS query parser may be having trouble with it because there's a temp table involved and it expects a single statement, not an actual SQL script. Or, if what you're sharing above is only an example for illustration, maybe something like this:
SELECT *
FROM (SELECT 'test' AS testcol) AS t -- a derived table requires an alias
Can you elaborate on what you're trying to accomplish here and why the temp table is required?
Use sp_executesql
Your command would become
exec sp_executesql @statement=N'
select
''test'' as testcol
INTO
#tmp
select * from #tmp'
You must use an nvarchar string (hence the N) and escape single quotes by doubling them.
I had the same problem as you and this is how I just fixed it.
