We have N tables on an Oracle server and we want to load all of them from Oracle into SQL Server. We are creating dynamic SSIS packages for this, which will take the Oracle server name, database name, schema name, table list, etc. and load all of these tables into SQL Server. We have added a linked server for Oracle on the SQL Server side (via SSMS).
But we have not found an efficient way to do this. How can we achieve it in a single SSIS package, and how can we handle the metadata of the Oracle tables and create the same tables on SQL Server? The SSIS package should also create the tables dynamically on SQL Server; for this we tried using a temp table in the SSIS package.
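For reference, the Oracle column metadata is reachable through the linked server along these lines (the linked-server, schema, and table names below are just placeholders), but we are not sure how best to turn this into dynamic table creation and loading in one package:
SELECT *
FROM OPENQUERY(ORACLE_LINK,
     'SELECT COLUMN_NAME, DATA_TYPE, DATA_LENGTH, DATA_PRECISION, DATA_SCALE, NULLABLE
        FROM ALL_TAB_COLUMNS
       WHERE OWNER = ''MYSCHEMA''
         AND TABLE_NAME = ''MYTABLE''
       ORDER BY COLUMN_ID')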
Since you have to do this for a large number of tables, I'd write a PL/SQL procedure built around something like the following, assuming a database link (here called mssql) from Oracle to the SQL Server side:
declare
    v_sql varchar2(1024);
begin
    for x in (select owner, table_name from dba_tables where .....)
    loop
        v_sql := 'create table ' ||
                 x.table_name ||
                 '@mssql as select * from ' ||
                 x.owner || '.' || x.table_name;
        execute immediate v_sql;
    end loop;
end;
/
Or, if you want to look it over before launching, use SQL to write SQL. In SQL*Plus:
set echo off feedback off verify off trimspool on pages 0
spool doit.sql
select 'create table '||
       table_name ||
       '@mssql as select * from '||
       owner || '.' || table_name || ';'
from dba_tables
where .....
;
spool off
Then check the spooled SQL file for any issues before running it.
All code above is off the top of my head. There may be minor syntax issues.
We have two databases, one in SQL Server and one in DB2. We have a scenario where we do inserts, updates, and deletes in SQL Server, and at the same time we also do inserts, updates, and deletes in DB2.
We sync data back and forth using some processes: whenever there is a change in SQL Server we sync the insert, update, or delete to DB2, and whenever there is a change in DB2 we sync it to SQL Server. We use IBM MQ messages, dequeuing them to sync the changes back and forth.
Everything was good until we had an issue syncing data from DB2 to SQL Server: one of our processes that syncs from DB2 to SQL Server was down. There is an on-demand job that runs every night and does a full data refresh from DB2 to SQL Server, but it only does a merge update and insert; it does not delete, because data that is yet to be synced to DB2 is also present in SQL Server, so we cannot simply delete when the two databases have more or fewer records than each other. As a result, some rows on SQL Server are left orphaned. We do have scoping, so data that is updated in SQL Server cannot be changed in DB2 and vice versa.
My question is: when syncing from DB2 to SQL Server, how can we identify the records that were deleted from DB2 only, so that we can delete those from SQL Server? We don't want to delete records that were created in SQL Server but have not yet been sent to DB2. We have 114 tables, so maintaining a flag on each of them to differentiate is not an option.
When you say you are synchronizing data back and forth between MS SQL Server and DB2, how are you capturing the changes? If you are using a CDC tool (IDR, GoldenGate, Informatica), these tools let you detect conflicts so you can decide which records to keep or delete.
If you are capturing your changes with in-house development (triggers or your own log scraper), you should keep at least the operation type and a timestamp in your temporary change data set, so that you can recognize the operation.
If you are just comparing the tables and dealing with the differences, you won't be able to tell whether rows missing on the DB2 side were deleted on the DB2 side or added on the SQL Server side. But you can fix that by building a proper change data capture mechanism.
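For example, here is a minimal sketch of such a change-log table on the SQL Server side (all names are illustrative, not from your system):
CREATE TABLE dbo.ChangeLog (
    TableName   sysname       NOT NULL,
    KeyValue    nvarchar(200) NOT NULL,
    Operation   char(1)       NOT NULL,   -- 'I' = insert, 'U' = update, 'D' = delete
    ChangedAt   datetime2     NOT NULL DEFAULT SYSUTCDATETIME(),
    SyncedToDb2 bit           NOT NULL DEFAULT 0
);
With the operation type and timestamp recorded, a row that is missing from the DB2 refresh can be classified as either deleted on DB2 or not yet sent from SQL Server.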
Change tracking on the SQL Server side might be a viable option (as long as all the tables you would like to sync / delete from have a primary key).
With CT you could track which rows, for each table, were created on the SQL Server side since the last sync from SQL Server to DB2. Those rows should not be deleted yet:
-- Sketch: PK, @last_sync_version and DB2_staging are placeholders (the table's key column(s),
-- the change-tracking version captured at the last SQL Server -> DB2 sync, and the DB2 refresh copy).
DELETE T
FROM SQL_SERVER_TABLE AS T
WHERE NOT EXISTS (SELECT * FROM CHANGETABLE(CHANGES SQL_SERVER_TABLE, @last_sync_version) AS CT
                  WHERE CT.PK = T.PK)
  AND NOT EXISTS (SELECT * FROM DB2_staging AS S
                  WHERE S.PK = T.PK);
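For completeness, change tracking first has to be switched on for the database and for each table; a minimal sketch (the database name and retention period are placeholders):
ALTER DATABASE YourSqlServerDb
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.SQL_SERVER_TABLE
    ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = OFF);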
I would connect SQL Server to DB2 via a linked server (more here: https://learn.microsoft.com/fr-fr/sql/relational-databases/system-stored-procedures/sp-addlinkedserver-transact-sql?view=sql-server-ver15) and then run queries to find out which records are missing on each side.
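For illustration, the linked server itself might be registered along these lines (the provider and connection values are assumptions; use whatever matches your DB2 client software):
EXEC sp_addlinkedserver
    @server     = N'YOURDB2SERVER',
    @srvproduct = N'DB2',
    @provider   = N'IBMDADB2',        -- assumed: IBM OLE DB provider for DB2
    @datasrc    = N'YourDb2Database';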
This can be accomplished with OPENQUERY. You can do something like this:
SELECT * FROM YourSqlTable
EXCEPT
SELECT * FROM OPENQUERY(YOURDB2SERVER, 'SELECT * FROM YourDB2Table')
And then the same thing inverted:
SELECT * FROM OPENQUERY(YOURDB2SERVER, 'SELECT * FROM YourDB2Table')
EXCEPT
SELECT * FROM YourSqlTable
You can then send the records to the right server.
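For example, to copy the rows that exist in DB2 but not yet in SQL Server (assuming both tables have the same column layout):
INSERT INTO YourSqlTable
SELECT * FROM OPENQUERY(YOURDB2SERVER, 'SELECT * FROM YourDB2Table')
EXCEPT
SELECT * FROM YourSqlTable;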
If you have a lot of tables to compare, you can write these queries with dynamic SQL:
DECLARE @TABLENAME nvarchar(200);
DECLARE @Query nvarchar(MAX);

DECLARE TABLE_CUR CURSOR FOR
SELECT TABLE_NAME FROM YourDatabaseName.INFORMATION_SCHEMA.TABLES;

OPEN TABLE_CUR;
FETCH NEXT FROM TABLE_CUR INTO @TABLENAME;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Rows present on the DB2 side but missing on the SQL Server side
    SET @Query = 'SELECT * FROM OPENQUERY(YOURDB2SERVER, ''SELECT *
    FROM ' + @TABLENAME + ' '')
    EXCEPT
    SELECT * FROM ' + @TABLENAME;
    -- Don't forget the doubled '' for OPENQUERY
    EXEC sp_executesql @Query;

    -- Rows present on the SQL Server side but missing on the DB2 side
    SET @Query = 'SELECT * FROM ' + @TABLENAME + '
    EXCEPT
    SELECT * FROM OPENQUERY(YOURDB2SERVER, ''SELECT *
    FROM ' + @TABLENAME + ' '')';
    -- Don't forget the doubled '' for OPENQUERY
    EXEC sp_executesql @Query;

    FETCH NEXT FROM TABLE_CUR INTO @TABLENAME;
END

CLOSE TABLE_CUR;
DEALLOCATE TABLE_CUR;
Thanks for the suggestions. I am not using CDC, but I maintain the changes that are yet to be synced to DB2 in a LOG table.
DELETE TGT
FROM [IGP].[LocationType] AS TGT
INNER JOIN #locationType SRC ON
TGT.[LocationTypeCode] = SRC.[LocationTypeCode];
I first insert the log-table rows that are yet to be synced to DB2 into the #locationType temp table and then delete them from IGP (the staging copy of the DB2 master data), so that local updates and deletes won't be overridden by the IGP staging data.
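For context, staging the pending rows looks roughly like this (the log-table name and columns are illustrative, not the real schema):
-- #locationType holds the keys that are still waiting to be synced to DB2
SELECT l.[LocationTypeCode]
INTO #locationType
FROM dbo.SyncLog AS l           -- assumed name of the local change log
WHERE l.TableName = 'LocationType'
  AND l.SyncedToDb2 = 0;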
Now I need to take care of inserts that don't exist in DB2 but do exist in SQL Server only because they have not yet been synced from the log table; I shouldn't delete those, as that would be data loss. So I use the merge query below:
MERGE INTO [dbo].[LocationType] AS TGT
USING [IGP].[LocationType] AS SRC
ON TGT.[LocationTypeCode] = SRC.[LocationTypeCode]
WHEN MATCHED AND (EXISTS
(SELECT TGT.[Description] EXCEPT SELECT SRC.[Description]))
THEN
UPDATE SET TGT.[LocationTypeCode] = SRC.[LocationTypeCode],
TGT.[Description] = SRC.[Description]
WHEN NOT MATCHED THEN
INSERT([LocationTypeCode], [Description])
VALUES(SRC.[LocationTypeCode], SRC.[Description])
WHEN NOT MATCHED BY SOURCE
AND (EXISTS (SELECT TGT.[LocationTypeCode]
EXCEPT SELECT [LocationTypeCode] FROM #locationType)) THEN DELETE;
How can I insert from SQL Server into a SQLBase database using a linked server?
Here are various examples of the correct syntax for SQL Server to SQLBase, assuming your linked server is called 'ISLANDLINK'. Note the two dots.
Or go here for the full script and explanation: SQLServer to SQLBase via LinkedServer
Or go here for another example: SQLServer to SQLBase via LinkedServer (more)
--Select:
SELECT * FROM OPENQUERY( ISLANDLINK, 'Select * from SYSADM.BUDGET where DEPT_ID = ''MIS'' ')
--Update:
UPDATE [ISLANDLINK]..[SYSADM].[BUDGET]
SET [BGT_YEAR] = 2016
WHERE DEPT_ID = 'MIS'
GO
--Insert:
INSERT INTO [ISLANDLINK]..[SYSADM].[EMPLOYEE]
([EMPLOYEE_ID]
,[LAST_NAME]) VALUES (99 ,'PLUMAS' )
GO
--Delete:
DELETE FROM [ISLANDLINK]..[SYSADM].[EMPLOYEE]
WHERE [LAST_NAME] = 'PLUMAS'
I tried searching but could not find exactly what I'm looking for. I'm new to SQL Server and am involved in a SQL Server to Oracle conversion, and it is a manual conversion. All I have is the SQL Server files.
I see two types of SQL Server triggers: FOR UPDATE and FOR INSERT. They look to me like BEFORE UPDATE and BEFORE INSERT triggers in Oracle. I'd like to confirm this, please, and if you can provide examples that would be great.
Also, what is the equivalent of master.dbo.sysprocesses in Oracle, please? Is it v$session? I can select the user from dual in Oracle; is that what nt_username is in the code below?
Here is a typical code example I need to convert to Oracle. Is this a before-insert trigger?
CREATE TRIGGER trigger_name ON dbo.table_name
FOR Insert AS
declare @InsertUser varchar(32)
BEGIN
SELECT @InsertUser = nt_username from master.dbo.sysprocesses where spid = @@spid
Update table_name
SET dCreateDate = GETDATE(), cCreateUser = @InsertUser
FROM table1 a ,table2 i WHERE a.tab_id = i.tab_id
END
GO
And the update trigger: is this a before-update?
CREATE TRIGGER trigger_name ON dbo.table_name
FOR UPDATE AS
declare @UpdateUser varchar(32)
if not update(CreateUser) and not update(CreateDate)
BEGIN
SELECT @UpdateUser = nt_username from master.dbo.sysprocesses where spid = @@spid
Update table_name
SET UpdateDate = GETDATE(), UpdateUser = @UpdateUser
FROM table1 a ,table2 i WHERE a.tab_id = i.tab_id
END
GO
Should I combine these two into IF INSERTING ... ELSIF UPDATING in Oracle?
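Something like this is what I have in mind on the Oracle side (just a sketch, untested, with SYS_CONTEXT standing in for nt_username):
CREATE OR REPLACE TRIGGER trigger_name
BEFORE INSERT OR UPDATE ON table_name
FOR EACH ROW
BEGIN
  IF INSERTING THEN
    :NEW.dCreateDate := SYSDATE;
    :NEW.cCreateUser := SYS_CONTEXT('USERENV', 'OS_USER');
  ELSIF UPDATING THEN
    :NEW.UpdateDate  := SYSDATE;
    :NEW.UpdateUser  := SYS_CONTEXT('USERENV', 'OS_USER');
  END IF;
END;
/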
Thank you very much to all.
I added a trigger to the table to copy the inserted data to an audit table.
I got all the column names of the table from INFORMATION_SCHEMA.
I used "SELECT * INTO #INSERTED FROM INSERTED" to copy inserted data to a temporary table.
Then I used the following dynamic query to get the data from the temporary table for each column:
SET @sqlText = N'SELECT ' + @ColName + ' FROM #INSERTED'
where @ColName is the column name.
It was working fine on SQL Server 2008.
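In outline, the trigger looked something like this (a sketch; the trigger, table, and column names are placeholders):
CREATE TRIGGER trg_Audit ON dbo.MyTable
AFTER INSERT AS
BEGIN
    -- copy the inserted rows into a temp table so dynamic SQL can see them
    SELECT * INTO #INSERTED FROM INSERTED;

    DECLARE @ColName sysname = N'SomeColumn';
    DECLARE @sqlText nvarchar(max) = N'SELECT ' + QUOTENAME(@ColName) + N' FROM #INSERTED';
    EXEC sp_executesql @sqlText;    -- #INSERTED is visible inside the dynamic batch
END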
Now we have moved to SQL Azure, and SELECT INTO is not supported there. I cannot create the temporary table explicitly and then INSERT into it, as my table contains over 70 columns, and I also cannot reference the INSERTED table from a dynamic query.
So, please suggest a solution or workaround for this.
SQL Azure V11 doesn't support SELECT INTO. Upgrade your server to SQL Database V12 and you should be able to do this.