I am trying to push binary data from SQL Server to an Oracle LONG RAW column. I have a linked server created on SQL Server that connects to the Oracle server. I have a stored procedure on the Oracle side that I am trying to call from SQL Server. I can't seem to get the binary to pass into the stored procedure. I've tried changing the from and to types; however, the data ultimately needs to end up in a LONG RAW column. I have control of the Oracle stored procedure and the SQL Server code, but I do not have control of the predefined Oracle table structure.
varbinary(max) -> LONG RAW gives ORA-01460: unimplemented or unreasonable conversion requested
varbinary(max) -> BLOB gives PLS-00306: wrong number or types of arguments in call to 'ADDDOC'
varbinary -> LONG RAW gives no errors, but the data arrives truncated or corrupted
The varbinary(max) call does work if I set @doc = null.
Below are the Oracle procedure and the SQL Server code.
Oracle:
CREATE OR REPLACE
PROCEDURE ADDDOC (param1 IN LONG RAW)
AS
BEGIN
-- insert param1 into a LONG RAW column
DBMS_OUTPUT.PUT_LINE('TEST');
END ADDDOC;
SQL Server:
declare @doc varbinary(max)
select top 1 @doc = Document from Attachments
execute ('begin ADDDOC(?); end;', @doc) at ORACLE_DEV
-- tried this too, same error
--execute ('begin ADDDOC(utl_raw.cast_to_raw(?)); end;', @doc) at ORACLE_DEV
I've also tried creating the record in the Oracle Documents table then updating the LONG RAW field from SQL Server without invoking a stored procedure, but the query just seems to run and run and run and run...
--already created record and got the Id of the record I want to put the data in
--hard coding for this example
declare @attachmentId int, @documentId int
set @attachmentId = 1
set @documentId = 1
update ORACLE_DEV..MYDB.Documents
set Document = (select Document from Attachments where Id = @attachmentId)
where DocumentId = @documentId
As noted in the comments, LONG RAW is very difficult to work with; unfortunately, our vendor uses the datatype in their product and I have no choice but to work with it. I found that I could not pass binary data from SQL Server to an Oracle stored procedure parameter. Instead, I create a new record with a NULL value in the LONG RAW field and then use an OPENQUERY update to set that field from the VARBINARY(MAX) column. I did try an update with the four-part identifier, as noted in my code sample, but it took over 11 minutes for a single update; this new approach completes in less than 3 seconds. I still use an Oracle stored procedure because, in my real-world scenario, I create multiple records in multiple tables (plus business logic that is not relevant here) and tie them together with the docId.
This feels more like a workaround than a solution, but it actually works with acceptable performance.
Oracle:
create or replace procedure ADDDOC(docId OUT Number)
as
begin
select docseq.nextval into docId from dual;
-- insert new row, but leave Document LONG RAW field null for now
insert into DOC (Id) values(docId);
end ADDDOC;
SQL Server:
declare @DocId float, @AttachmentID int, @Qry nvarchar(max)
set @AttachmentID = 123 -- hardcoded for example
execute('begin ADDDOC(?); end;', @DocId output) at ORACLE_DEV
-- write openquery sql that will update Oracle LONG RAW field from a SQL Server varbinary(max) field
set @Qry = '
update openquery(ORACLE_DEV, ''select Document from Documents where Id=' + cast(@DocId as varchar) + ''')
set Document = (select Document from Attachments where Id = ' + cast(@AttachmentID as varchar) + ')
'
execute sp_executesql @Qry
My desired end result is to simply be able to SELECT from a Stored Procedure. I've searched the Internet and unfortunately the Internet said this can't be done and that you first need to create a Temp Table to store the data. My problem is that you must first define the columns in the Temp Table before Executing the STORED Procedure. This is just time consuming. I simply want to take the data from the stored procedure and just stick it into a Temp Table.
What is the FASTEST route to achieve this from a coding perspective? To put it simply, it's time-consuming to first have to look up the returned fields from a Stored Procedure and then write them all out.
Is there some sort of tool that can just build the CREATE Table Statement based on the Stored Procedure? See screenshot for clarification.
Most of the Stored Procedures I'm dealing with have 50+ fields. I don't look forward to defining each of these fields manually.
Here is a good SO post that got me this far, but it's not what I was hoping for. It still takes too much time. What are experienced SQL Server guys doing? I've only just recently made the jump from Oracle to SQL Server, and from what I can tell Temp Tables are a big deal in SQL Server.
You have several options to ease your task, though none of them is fully automatic. Be aware that they won't work if there's dynamic SQL in the procedure's code. You might be able to format the output of these functions to increase the automation, so the column list can simply be copied and pasted (see the sketch after the examples below).
SELECT * FROM sys.dm_exec_describe_first_result_set_for_object(OBJECT_ID('report.MyStoredProcedureWithAnyColumns'), 0) ;
SELECT * FROM sys.dm_exec_describe_first_result_set(N'EXEC report.MyStoredProcedureWithAnyColumns', null, 0) ;
EXEC sp_describe_first_result_set @tsql = N'EXEC report.MyStoredProcedureWithAnyColumns';
GO
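For example, a minimal sketch (the procedure name and temp table name are placeholders) that turns the first function's output into a pasteable CREATE TABLE statement:
-- Build a CREATE TABLE statement from the procedure's first result set.
-- Unnamed columns or dynamic SQL inside the procedure will not be described correctly.
DECLARE @cols nvarchar(max);
SELECT @cols = STUFF((
    SELECT ', ' + QUOTENAME(name) + ' ' + system_type_name
    FROM sys.dm_exec_describe_first_result_set(N'EXEC report.MyStoredProcedureWithAnyColumns', NULL, 0)
    ORDER BY column_ordinal
    FOR XML PATH('')), 1, 2, '');
SELECT 'CREATE TABLE #MyResults (' + @cols + ');' AS create_table_statement;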
If you don't mind a ##temp table and some dynamic SQL
NOTE: As Luis Cazares correctly pointed out... the ##temp runs the risk of collision due to concurrency concerns
Example
Declare @SQL varchar(max) = 'Exec [dbo].[prc-App-Lottery-Search] ''8117'''
Declare @temp varchar(500) = '##myTempTable'
Set @SQL = '
If Object_ID(''tempdb..'+@temp+''') Is Not NULL Drop Table '+@temp+';
Create Table '+@temp+' ('+stuff((Select concat(',',quotename(Name),' ',system_type_name)
From sys.dm_exec_describe_first_result_set(@SQL,null,null ) A
Order By column_ordinal
For XML Path ('')),1,1,'') +')
Insert '+@temp+' '+@SQL+'
'
Exec(@SQL)
Select * from ##myTempTable
I have a database on server X containing my source data.
I also have a database on server Y that contains data I need to augment with data on server X.
Currently we have a nightly job on server Y that calls a stored procedure on server X, inserts the results into a table variable, then sends the data as XML to a stored procedure in the database on server Y.
Below is basically what the code looks like:
--Get data from source
DECLARE @MySourceData TABLE
(
[ColumnX] VARCHAR(50),
[ColumnY] VARCHAR(50)
);
INSERT INTO @MySourceData EXECUTE [ServerX].SourceDatabase.dbo.[pListData];
DECLARE @XmlData XML;
SELECT
@XmlData =
(
SELECT
[ColumnX]
,[ColumnY]
FROM
@MySourceData
FOR XML RAW ('Item'), ROOT('Items'), ELEMENTS, TYPE
)
--Send data to target
EXEC TargetDatabase.dbo.pImportData @XmlData;
This approach keeps any server names or database names within the SQL of the job step (which we think of as part of configuration), and allows us to abide by our in-house development standards of using stored procedures for data access. While this particular solution only processes a few thousand records and the XML won't get that big, I wonder how poorly it might scale if we applied it in scenarios where the dataset was larger. I'm curious whether others have better suggestions.
Assume that I have a table on my local server called Local_Table, and I have another server with another db and table, which is Remote_Table (the table structures are the same).
Local_Table has data, Remote_Table doesn't. I want to transfer data from Local_Table to Remote_Table with this query:
Insert into RemoteServer.RemoteDb..Remote_Table
select * from Local_Table (nolock)
But the performance is quite slow.
However, when I use SQL Server import-export wizard, transfer is really fast.
What am I doing wrong? Why is it fast with Import-Export wizard and slow with insert-select statement? Any ideas?
The fastest way is to pull the data rather than push it. When the tables are pushed, every row requires a connection, an insert, and a disconnect.
If you can't pull the data, because you have a one-way trust relationship between the servers, the workaround is to construct the entire table as a giant T-SQL statement and run it all at once.
DECLARE @xml XML
SET @xml = (
SELECT 'insert Remote_Table values (' + '''' + isnull(first_col, 'NULL') + ''',' +
-- repeat for each col
'''' + isnull(last_col, 'NULL') + '''' + ');'
FROM Local_Table
FOR XML path('')
) --This concatenates all the rows into a single xml object, the empty path keeps it from having <colname> </colname> wrapped around each value
DECLARE @sql AS VARCHAR(max)
SET @sql = 'set nocount on;' + cast(@xml AS VARCHAR(max)) + 'set nocount off;' --Converts XML back to a long string
EXEC ('use RemoteDb;' + @sql) AT RemoteServer
It seems like it's much faster to pull data from a linked server than to push data to a linked server: Which one is more efficient: select from linked server or insert into linked server?
Update: My own, recent experience confirms this. Pull if possible -- it will be much, much faster.
Try this on the other server:
INSERT INTO Local_Table
SELECT * FROM RemoteServer.RemoteDb..Remote_Table
The Import/Export wizard will essentially be doing this as a bulk insert, whereas your code is not.
Assuming that you have a clustered index on the remote table, make sure that you have the same clustered index on the local table, set trace flag 610 globally on your remote server, and make sure the remote database is in simple or bulk-logged recovery mode (a sketch of those commands follows this answer's code).
If your remote table is a heap (which will speed things up anyway), make sure your remote database is in simple or bulk-logged mode, and change your code to read as follows:
INSERT INTO RemoteServer.RemoteDb..Remote_Table WITH(TABLOCK)
SELECT * FROM Local_Table WITH (nolock)
The reason why it's so slow to insert into the remote table from the local table is because it inserts a row, checks that it inserted, and then inserts the next row, checks that it inserted, etc.
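The trace flag and recovery model changes mentioned above can be applied on the remote server along these lines (a minimal sketch; RemoteDb stands in for your actual database name):
-- Enable minimal logging for bulk loads into indexed tables (instance-wide).
DBCC TRACEON (610, -1);
-- Switch the target database to bulk-logged recovery for the duration of the load.
ALTER DATABASE RemoteDb SET RECOVERY BULK_LOGGED;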
Don't know if you figured this out or not, but here's how I solved this problem using linked servers.
First, I have a LocalDB.dbo.Table with several columns:
IDColumn (int, PK, Auto Increment)
TextColumn (varchar(30))
IntColumn (int)
And I have a RemoteDB.dbo.Table that is almost the same:
IDColumn (int)
TextColumn (varchar(30))
IntColumn (int)
The main difference is that the remote IDColumn isn't set up as an identity column, so that I can do inserts into it.
Then I set up a trigger on the remote table that fires on delete:
Create Trigger Table_Del
On Table
After Delete
AS
Begin
Set NOCOUNT ON;
Insert Into Table (IDColumn, TextColumn, IntColumn)
Select IDColumn, TextColumn, IntColumn from MainServer.LocalDB.dbo.table L
Where not exists (Select * from Table R Where L.IDColumn = R.IDColumn)
END
Then when I want to do an insert, I do it like this from the local server:
Insert Into LocalDB.dbo.Table (TextColumn, IntColumn) Values ('textvalue', 123);
Delete From RemoteServer.RemoteDB.dbo.Table Where IDColumn = 0;
--And if I want to clean the table out and make sure it has all the most up to date data:
Delete From RemoteServer.RemoteDB.dbo.Table
By triggering the remote server to pull the data from the local server and then do the insert, I was able to turn a job that took 30 minutes to insert 1258 lines into a job that took 8 seconds to do the same insert.
This does require a linked server connection on both sides, but after that's set up it works pretty well; a sketch of the setup follows.
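Creating the linked server on each side can be done along these lines (a sketch only; the server name and remote login are placeholders, and your security mapping may need to differ):
-- Run on the local server, and a mirror-image version on the remote server.
EXEC sp_addlinkedserver @server = N'REMOTESERVER', @srvproduct = N'SQL Server';
-- Map local logins to a remote login; adjust to your own security requirements.
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'REMOTESERVER', @useself = N'FALSE',
    @locallogin = NULL, @rmtuser = N'linked_login', @rmtpassword = N'linked_password';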
Update:
So in the last few years I've made some changes, and have moved away from the delete trigger as a way to sync the remote table.
Instead I have a stored procedure on the remote server that has all the steps to pull the data from the local server:
CREATE PROCEDURE [dbo].[UpdateTable]
-- Add the parameters for the stored procedure here
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
--Fill Temp table
Insert Into WebFileNamesTemp Select * From MAINSERVER.LocalDB.dbo.WebFileNames
--Fill normal table from temp table
Delete From WebFileNames
Insert Into WebFileNames Select * From WebFileNamesTemp
--empty temp table
Delete From WebFileNamesTemp
END
And on the local server I have a scheduled job that does some processing on the local tables, and then triggers the update through the stored procedure:
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc', @optvalue='true'
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc out', @optvalue='true'
EXEC REMOTESERVER.RemoteDB.dbo.UpdateTable
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc', @optvalue='false'
EXEC sp_serveroption @server='REMOTESERVER', @optname='rpc out', @optvalue='false'
If you must push data from the source to the target (e.g., for firewall or other permissions reasons), you can do the following:
In the source database, convert the recordset to a single XML string (i.e., multiple rows and columns combined into a single XML string).
Then push that XML over as a single row (as a varchar(max), since XML isn't allowed over linked databases in SQL Server).
DECLARE @xml XML
SET @xml = (select * from SourceTable FOR XML path('row'))
Insert into TempTargetTable values (cast(@xml AS VARCHAR(max)))
In the target database, cast the varchar(max) as XML and then use XML parsing to turn that single row and column back into a normal recordset.
DECLARE @X XML = (select '<toplevel>' + ImportString + '</toplevel>' from TempTargetTable)
DECLARE @iX INT
EXEC sp_xml_preparedocument @iX output, @X
insert into TargetTable
SELECT [col1],
[col2]
FROM OPENXML(@iX, '//row', 2)
WITH ([col1] [int],
[col2] [varchar](128)
)
EXEC sp_xml_removedocument @iX
I've found a workaround. Since I'm not a big fan of GUI tools like SSIS, I reused a bcp script to dump the table into a CSV file and load it back in. Yeah, it's odd that bulk operations are supported for files but not for tables. Feel free to edit the following script to fit your needs:
exec xp_cmdshell 'bcp "select * from YourLocalTable" queryout C:\CSVFolder\Load.csv -w -T -S .'
exec xp_cmdshell 'bcp YourAzureDBName.dbo.YourAzureTable in C:\CSVFolder\Load.csv -S yourdb.database.windows.net -U youruser@yourdb.database.windows.net -P yourpass -q -w'
Pros:
No need to define table structures every time.
I've tested it and it works way faster than inserting directly through the linked server.
It's easier to manage than XML (which is limited to varchar(max) length anyway).
No need for an extra layer of abstraction (tools like SSIS).
Cons:
Using the external tool bcp through the xp_cmdshell interface.
Table properties will be lost after exporting/importing CSV (i.e. datatype, nullability, length, separators within values, etc.).
I have a big table in SQL Server 2008 R2. It contains billions of rows. I need to load the whole data set in our application. Querying the whole table is very slow. I want to use bcp to dump it into a file and load it from there. But the problem is that the string columns contain all kinds of special characters like '\t', '\0', commas, and '\n', so I can't find a good field/row terminator, and a long string terminator slows down data file loading for my application. The questions are:
Is there any API that loads data faster than a SQL query? I found there is a native importing API, IRowsetFastLoad, but no luck on exporting.
Is there any API for the bcp native format? I can't find any documentation about the format of a native bcp file.
From BOL:
-n
Performs the bulk copy operation using the native (database) data types of the data. This option does not prompt for each field; it uses the native values.
Billions of rows? Then you will also want to use:
-b batch_size
Specifies the number of rows per batch of data copied. Each batch is copied to the server as one transaction. SQL Server commits or rolls back, in the case of failure, the transaction for every batch.
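Putting the two together, a minimal sketch of a native-format export and a batched import (table, path, and server names are placeholders); the commands can be run from a command prompt or via xp_cmdshell:
-- Export in native format; -T uses a trusted connection, -S names the source server.
exec xp_cmdshell 'bcp SourceDB.dbo.BigTable out C:\Dump\BigTable.dat -n -T -S SOURCESERVER'
-- Import in native format, committing a batch every 100000 rows.
exec xp_cmdshell 'bcp TargetDB.dbo.BigTable in C:\Dump\BigTable.dat -n -b 100000 -T -S TARGETSERVER'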
Can't you access the two databases at once, perhaps through a linked server? It would make things easier.
DECLARE @StartId BIGINT
DECLARE @NmbrOfRecords BIGINT
DECLARE @RowCount BIGINT
SET @StartId = 0
SET @NmbrOfRecords = 9999
SET @RowCount = 1
WHILE @RowCount > 0
BEGIN
BEGIN TRANSACTION
INSERT INTO DestinationDatabase.dbo.Mytable
SELECT * FROM SourceDatabase.dbo.Mytable
WHERE ID BETWEEN @StartId AND @StartId + @NmbrOfRecords
SET @RowCount = @@ROWCOUNT
SET @StartId = @StartId + @NmbrOfRecords + 1
COMMIT TRANSACTION
END
The Bulk Insert API is exposed to programmers by the SqlBulkCopy class:
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
Import and Export Bulk Data by Using the bcp Utility (SQL Server)
http://technet.microsoft.com/en-us/library/aa337544.aspx
Bulk Copy Functions reference:
http://technet.microsoft.com/en-us/library/ms130922.aspx
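If you prefer to stay in T-SQL on the importing side, the same engine facility is also exposed through the BULK INSERT statement; a minimal sketch (table name, file path, and batch size are placeholders) that loads a file produced by bcp with the -n flag:
-- Load a native-format bcp file in batches, taking a table lock for minimal logging.
BULK INSERT TargetDB.dbo.BigTable
FROM 'C:\Dump\BigTable.dat'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 100000, TABLOCK);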
Is there a way to cause the result set of a SQL Server stored procedure (or any result set, after the fact) to be encoded in XML format?
I want the result set to be encoded in XML as if the FOR XML RAW clause was used during selection.
However the complex stored procedure logic and its internal SELECT statements should not be modified to return XML because the procedure is used for its standard/non-XML result set most of the time.
Update: Emphasis on the fact that I'm looking for an answer in the SQL Server environment - the results should be returned as if SQL Server had directly encoded them itself as XML, just like it does when using built-in XML features like the FOR XML clause.
You would insert the data from the SP into a temp table, then select from that FOR XML
This won't work if the SP itself already does an INSERT .. EXEC, because you cannot nest them.
Working examples
use tempdb;
create proc giveme
as
select a = 1, b = GETDATE()
union all
select 2, b = '20100101'
Using INSERT.. EXEC
declare @t table (a int, b datetime)
insert @t
exec giveme
select * from @t for xml raw
Using OPENQUERY
exec sp_addlinkedserver 'localhost'
exec sp_serveroption @server = 'localhost'
,@optname = 'DATA ACCESS'
,@optvalue = 'TRUE'
select *
from openquery(localhost, 'exec tempdb..giveme')
for xml raw
You could try using OPENROWSET in cooperation with FOR XML to do the transformation.
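For example, a minimal sketch (it assumes ad hoc distributed queries are enabled on the instance; the connection string and the procedure name, reused from the example above, are placeholders):
select *
from openrowset('SQLNCLI', 'Server=(local);Trusted_Connection=yes;', 'exec tempdb..giveme') as t
for xml raw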
By 'after the fact', do you mean still within the SQL Server environment? Or are you talking about a client program?
Within SQL, you could probably write a sproc that acts as a wrapper for your other sprocs, along the lines sketched below. The wrapper sproc would handle the FOR XML work.
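A minimal sketch of such a wrapper (the procedure names and the column list are hypothetical; the table variable has to mirror the wrapped procedure's result set):
create procedure giveme_xml
as
begin
    set nocount on;
    declare @t table (a int, b datetime) -- must match the inner procedure's columns
    insert @t
    exec giveme
    select * from @t for xml raw
end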
In .NET, there are a number of ways to do this.
You can try inserting the result set from the stored procedure into a table variable (or temporary table) and selecting the table rows with the FOR XML clause.
Here is an example:
DECLARE @MyDataTable AS TABLE ( col1 int,...., colN int)
Make sure that @MyDataTable has the same columns as the stored procedure result set(s).
INSERT INTO @MyDataTable
EXECUTE mysp_GetData @param1=value,....,@paramN;
SELECT * FROM @MyDataTable
FOR XML AUTO