SQL Query Task in SSIS - sql-server

I have added an Execute SQL Task to my project and put the following SQL query in it:
Insert into M1
select * from M4
The problem is that the M1 table is in the AAA database and the M4 table is in the DDD database, so the task fails with an error.

If both databases are on the same server then fully qualify the table names:
insert into AAA.dbo.M1 (col1, col2, ...)
select col1, col2, ...
from DDD.dbo.M4
Of course, if your objects are not in the dbo schema, substitute the correct one. As an aside, you should never use SELECT *: it can lead to problems if you (or someone else) ever change the table structure. Instead, always specify the column names.
An alternative would be to use a data flow to copy the data, but that's probably unnecessary here.
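For example, a minimal sketch with hypothetical columns EmployeeID and EmployeeName (substitute your actual column list) would look like this in the Execute SQL Task:
-- Sketch only: EmployeeID and EmployeeName are placeholder column names,
-- assuming both databases live on the same SQL Server instance.
INSERT INTO AAA.dbo.M1 (EmployeeID, EmployeeName)
SELECT EmployeeID, EmployeeName
FROM DDD.dbo.M4;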

You can use a Data Flow Task: add an OLE DB Source and an OLE DB Destination, then configure the source and destination as required.
Take a look here.

Related

SSIS OLEDB Command transformation (Insert if not exists)

OK, so according to the Microsoft docs, the OLE DB Command transformation in SSIS does this:
The OLE DB Command transformation runs an SQL statement for each row in a data flow. For example, you can run an SQL statement that inserts, updates, or deletes rows in a database table.
So I want to write some SQL that inserts rows into one of my tables only IF the record doesn't already exist.
So I tried this, but the control keeps complaining about bad syntax:
IF NOT EXISTS
(SELECT * FROM M_Employee_Login WHERE
Column1=?
AND Column2=?
AND Column3=?)
INSERT INTO [M_Employee_Login]
([Column1]
,[Column2]
,[Column3])
VALUES
(?,?,?)
However, if I remove the IF NOT EXISTS section (leaving only the INSERT), the control says my code is OK. What am I doing wrong?
Is there an easier solution?
Update: BTW, my source is a flat file (CSV).
Update since answer: Just to let people know: I ended up using the OLE DB Command transformation as planned, because it is better suited than the OLE DB Destination for this operation. The difference is that I used the Lookup component to filter out all the records that already exist (as the answer suggested), and then used the OLE DB Command transformation with the INSERT SQL from the question. It worked as expected. Hope it helps.
The OLE DB Command object is not the same as the OLE DB Destination.
Rather than doing it as you describe, use a Lookup component. Your data flow becomes Flat File Source -> Lookup Component -> OLE DB Destination.
In the Lookup, write the query SELECT Column1, Column2, Column3 FROM M_Employee_Login and configure it to redirect non-matching rows to the no-match output instead of failing the component (depending on your version, 2005 vs. later, this may already be the default).
After the Lookup, the No Match output will contain the rows that have no corresponding record in the target table.
Finally, configure your OLE DB Destination to use the fast load option.
Using the Lookup component in SSIS to avoid the duplicates is the best possible approach, but if you are looking for a query-based way to avoid them, you can simply insert all the data into a temp/staging table in your database and run the following query.
INSERT INTO M_Employee_Login(Column1, Column2, Column3)
SELECT vAL1, vAL2, vAL3 FROM Staging_Table
EXCEPT
SELECT Column1, Column2, Column3 FROM M_Employee_Login
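A minimal end-to-end sketch of that staging approach, assuming a three-column NVARCHAR staging table (the table definition and data types here are assumptions, adjust them to match your flat file):
-- Hypothetical staging table; column names and types are assumptions.
CREATE TABLE Staging_Table (
    vAL1 NVARCHAR(50),
    vAL2 NVARCHAR(50),
    vAL3 NVARCHAR(50)
);
-- Bulk-load the CSV into Staging_Table first (e.g. with a plain OLE DB Destination),
-- then insert only the rows that are not already in the target table:
INSERT INTO M_Employee_Login (Column1, Column2, Column3)
SELECT vAL1, vAL2, vAL3 FROM Staging_Table
EXCEPT
SELECT Column1, Column2, Column3 FROM M_Employee_Login;
-- Clear the staging table for the next run.
TRUNCATE TABLE Staging_Table;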

Netezza incremental load from SQL Server using SSIS

I am trying to do an incremental load from SQL Server 2008 to Netezza (NPS6) using SSIS.
I am using the Netezza 5.x OLE DB driver and the Table or View - Fast Load option with Maximum insert commit size = 0.
I am trying to insert a few thousand records into a Netezza table that already contains millions of records. The data flow task was taking hours to complete. When I looked at the active queries in Netezza Administrator, I could see that a query like the one below was the problem:
SELECT * FROM Destination_Table;
The next step is an external table load like the one below:
insert into "destination_table"(col1, col2, col3)
select c0, c1, c2 from external '/dev/null' (c0, c1, c2) using (
remotesource 'odbc' delimiter ' ' escapechar '\' ctrlchars 'yes' crinstring 'yes' timeroundnanos 'yes' encoding 'internal' maxerrors 1
) ;
Can anyone help me understand why a SELECT * FROM the destination table is required for the load, or how the Netezza OLE DB driver works with SSIS?
Appreciate your help.
Without looking at the details of your package, the behavior you describe occurs if you have not selected the Table or view - fast load option as the Data access mode in your OLE DB Destination component. The fast load option internally uses a BULK INSERT to upload data into the destination table.
The plain Table or view mode behaves like a SELECT * and pulls all the columns. That access mode should be used only if you need every column of the table or view moved from the source to the destination.
The problem for you is that the fast load option might not appear by default, since you are using Netezza.
See issue discussed here along with possible workarounds:
http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/965b6d83-cf5e-405b-8784-7981e4386adc
Official bug report raised here:
https://connect.microsoft.com/SQLServer/feedback/details/569087
After installing the 6.x version of the OLE DB driver, this "SELECT * FROM DESTINATION TABLE" issue no longer occurs, and I saw a good performance improvement with the 6.x driver. If you are stuck on the 5.x OLE DB driver, I believe it is better to load into a stage table and then load from there into the destination table.

Create a copy of a table within the same database with SSIS

I want to create a copy of a table, say TestTable, with a new name, say TestTableNew, in the same database using an SSIS package. I've created a "Transfer SQL Server Objects Task" for this, with the source database specified as both the SourceDatabase and the DestinationDatabase. When I run the task, the original TestTable is overwritten with a new, empty TestTable.
This might well be something really obvious that I've overlooked, but can I somehow specify another name for the destination table somewhere in this transfer task? Or should I solve this in another way?
You can't use the "Transfer SQL Server Objects Task" to copy a table to the same database because there isn't an option to specify the new table name. You would be copying table "TestTable" to table "TestTable", which will fail because they both have the same name.
You can set the "DropObjectsFirst" property to true, but that will make you lose your original table and its data, which I think you did on your test, otherwise you would have received a failure message.
The best option here is to use an "Execute SQL Task" to create the structure of your TestTableNew based on your TestTable and then do a simple OleDBSource -> OleDBDestination transformation to load all the data from one table to another.
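A minimal sketch of that Execute SQL Task statement, assuming you only need the column structure (note that SELECT INTO does not copy indexes, constraints, defaults, or triggers, so script those separately if you need them):
-- Creates an empty TestTableNew with the same columns as TestTable.
SELECT TOP 0 *
INTO TestTableNew
FROM TestTable;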
My knowledge of SSIS is very limited, but I assume you can run SQL commands, passing in parameters, and therefore generate something like the following dynamically:
SELECT *
INTO TestTableNew
FROM TestTable

How to merge a table from Access to SQL Express?

I have a table named "Staff" in Access and a table with the same name in SQL Server 2008.
Both tables have thousands of records. I want to merge the records from the Access table into the SQL Server table without affecting the existing records in SQL Server. Normally I just export using the ODBC driver, which works fine if the table doesn't already exist in SQL Server. Please advise. Thanks.
A simple append query from the local Access table to the linked SQL Server table should work just fine in this case.
So, just drop the first (from) table into the query builder, then change the query type to append, and you are prompted for the append table name.
From that point on, just drop in the columns you want (do not drop in the PK column, as it need not be used nor transferred in this case).
You can also type the SQL directly into the query builder. Either way, you will wind up with something like:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT ADMINID, Amount, Notes, Status
FROM custsql1;
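If you also need to guard against rows that already exist on the SQL Server side, one hedged variant is to filter with NOT EXISTS. This sketch assumes ADMINID identifies a record; swap in your table's real key column(s):
-- Assumption: ADMINID is the matching key; adjust to your real key column(s).
INSERT INTO dbo_custsql ( ADMINID, Amount, Notes, Status )
SELECT c.ADMINID, c.Amount, c.Notes, c.Status
FROM custsql1 AS c
WHERE NOT EXISTS (
    SELECT 1 FROM dbo_custsql AS s WHERE s.ADMINID = c.ADMINID
);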
This may help: http://www.red-gate.com/products/sql-development/sql-compare/
Or you could write a simple program to read from each data set and do the comparison, adding, updating, and deleting, etc.

SQL Server: Copying table contents from one database to another

I want to update a static table on my local development database with current values from our server (accessed on a different network/domain via VPN). Using the Data Import/Export wizard would be my method of choice, however I typically run into one of two issues:
I get primary key violation errors and the whole thing quits. This is because it's trying to insert rows that I already have.
If I set the "delete from target" option in the wizard, I get foreign key violation errors because there are rows in other tables that are referencing the values.
What I want is the set of options that makes the Import/Export wizard update rows that exist and insert rows that do not (based on the primary key, or by asking me which columns to use as the key).
How can I make this work? This is on SQL Server 2005 and 2008 (I'm sure it used to work okay on the SQL Server 2000 DTS wizard, too).
I'm not sure you can do this in Management Studio. I have had some good experiences with Red Gate SQL Data Compare for synchronising databases, but you do have to pay for it.
The SQL Server Database Publishing Wizard can export a set of sql insert scripts for the table that you are interested in. Just tell it to export just data and not schema. It'll also create the necessary drop statements.
One option is to download the data to a new table, then use commands similar to the following to update the target:
update t set
    col1 = d.col1,
    col2 = d.col2
from downloaded d
inner join target t on d.pk = t.pk

insert into target (col1, col2, ...)
select d.col1, d.col2, ...
from downloaded d
where d.pk not in (select pk from target)
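On SQL Server 2008 (though not 2005) the same upsert can also be written as a single MERGE statement. A minimal sketch, reusing the placeholder names pk, col1, and col2 from above:
-- Sketch only: pk, col1, col2 stand in for your real key and column names.
MERGE target AS t
USING downloaded AS d
    ON t.pk = d.pk
WHEN MATCHED THEN
    UPDATE SET t.col1 = d.col1,
               t.col2 = d.col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (pk, col1, col2)
    VALUES (d.pk, d.col1, d.col2);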
If you disable the FK constraints during the second option and re-enable them after the load finishes, it will work.
But if you are using an identity column to generate the PK values that are involved in the FKs, it will cause a problem, so this works only if the PK values remain the same.
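A minimal sketch of disabling and re-enabling the foreign keys on a hypothetical referencing table (ChildTable is a placeholder; repeat for each table that references the one you are reloading):
-- Disable FK checks on the referencing table before the delete/reload.
ALTER TABLE ChildTable NOCHECK CONSTRAINT ALL;
-- ...reload the target table here (wizard run or the UPDATE/INSERT above)...
-- Re-enable the constraints and re-validate the existing rows.
ALTER TABLE ChildTable WITH CHECK CHECK CONSTRAINT ALL;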
