I have spent a few days trying to fix this problem. I have an SSIS package with two Execute SQL Tasks inside a sequence container: a simple delete from a table, followed by a simple insert. The delete works fine, so the connection etc. is OK.
The insert is failing with the following vague and unhelpful message:
failed with the following error: "Syntax error, permission violation, or other nonspecific error". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
The insert has one input parameter, a date, which is bound to a datetime variable set to 01/01/2011. When I replace the ? in the SQL task's query with a hard-coded date, the task works. I have also looked at the locals at a pre-execute breakpoint on the insert task, and the variable is OK.
I also fired up a SQL Profiler session: I see the delete query hitting the DB, but nothing for the insert (when it uses the input parameter).
I am using Visual Studio 2005 Pro SP1 (Not my choice) and SQL Server 2005 SP3.
Regards
Mark
I know that you have already found an answer to your question. However, I would like to clarify that the following query, executed over an OLE DB connection, is valid and does work within the Execute SQL Task in an SSIS package.
INSERT INTO dbo.table1 (DateCol, OtherCol, OtherCol1)
SELECT ?, SourceCol1, SourceCol2 FROM dbo.SourceTable
The following example shows a successful implementation of the above query using SSIS 2005 (Business Intelligence Development Studio (BIDS) 2005).
Step-by-step process:
Create two tables named dbo.Source and dbo.Destination using the scripts provided under the Scripts section. Populate the table dbo.Source with data as shown in screenshot #1. Table dbo.Destination will initially be empty and will be populated with source data using an Execute SQL Task.
On the SSIS package, create an OLE DB connection named SQLServer in the Connection Managers, pointing to a SQL Server instance of your preference. Refer to screenshot #2.
On the SSIS package, create a variable named RecordDate as shown in screenshot #3.
On the SSIS package, place an Execute SQL Task as shown in screenshot #4. Configure the task as shown in screenshots #5 and #6.
Screenshot #7 shows sample package execution.
Screenshot #8 shows data in the tables dbo.Source and dbo.Destination after package execution.
Hope that helps.
Scripts:
CREATE TABLE [dbo].[Destination](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [DateValue] [datetime] NOT NULL,
    [ItemNumber] [varchar](50) NOT NULL,
    [Qty] [int] NOT NULL,
    CONSTRAINT [PK_Destination] PRIMARY KEY CLUSTERED ([Id] ASC)) ON [PRIMARY]
GO
CREATE TABLE [dbo].[Source](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [ItemNumber] [varchar](50) NOT NULL,
    [Qty] [int] NOT NULL,
    CONSTRAINT [PK_Source] PRIMARY KEY CLUSTERED ([Id] ASC)) ON [PRIMARY]
GO
Screenshot #1:
Screenshot #2:
Screenshot #3:
Screenshot #4:
Screenshot #5:
Screenshot #6:
Screenshot #7:
Screenshot #8:
You need to make sure your SQL Statement is of the correct type to be parameterized according to your connection manager.
If you're using OLE DB, your insert statement needs to look something like INSERT INTO Tbl(col) VALUES (?).
ResultSet should be "None" (as there's nothing to return from your INSERT), and the Parameter Mapping tab should have one parameter per ? in the statement. For OLE DB the parameter names start at 0, then go 1, 2, ..., n. If you were using an ADO connection, you would instead have to name the parameters Param1, Param2, ..., ParamN.
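As a concrete sketch (the table, columns, and variable names here are hypothetical, not from the question), an OLE DB parameterized Execute SQL Task might be set up like this:

```sql
-- Execute SQL Task, OLE DB connection manager
-- SQLStatement:
INSERT INTO dbo.OrderArchive (OrderDate, StatusCode)
VALUES (?, ?);

-- Parameter Mapping tab (OLE DB parameter names are ordinal, starting at 0):
--   Variable Name      Direction   Data Type   Parameter Name
--   User::RunDate      Input       DATE        0
--   User::StatusCode   Input       VARCHAR     1
```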
You can see the reference for passing variables to the respective connection managers here: http://technet.microsoft.com/en-us/library/cc280502.aspx
Proper answer as per your comment:
You cannot use a parameter mapping in a query of that kind, but there is an easy way around it.
Create a new variable sqlCommand (type String). Select it in the list of variables and press F4 to see its properties. Change EvaluateAsExpression to True, then click the expression box to open the expression window. In there, build your query in a format like this:
"INSERT INTO tbl(dateCol,intCol,charCol) SELECT CONVERT(DATETIME,'" + (DT_STR,20,1252)#[User::dateVar] + "',104) AS dateCol, intCol, charCol from anotherTbl"
When you click Evaluate Expression you'll see a valid SQL statement being formed that you can use.
Now go back to the Execute SQL Task and remove the parameter mapping. Close the dialog box, select the Execute SQL Task, press F4, find the Expressions line, click the ..., and add an expression on the property SqlStatementSource with the expression #[User::sqlCommand] (or whatever you named your variable).
Now run the task and it should work without a problem.
You can find the expression here:
Objective:
To set a datetime field (date + time) through a parameter in the Execute SQL Task editor of an SSIS package.
Description:
Step 01: Create a variable with the data type String.
e.g:
Variable name: ReconcileStartDateTime
Data Type: String
Step 02: Assign a value to the variable: create an 'Execute SQL Task' and configure it as follows.
General:
SQL Command: select cast(getdate() as nvarchar(100)) as StartDateTime
ResultSet: Single Row
BypassPrepare: True
Result set tab:
Result Name: StartDateTime
Variable Name: User::ReconcileStartDateTime
Step 03: Create another Execute SQL Task and use a query as below:
SqlStatement: Update OrderDetail set StartDate = cast(? as datetime) where ID= 101;
Parameter Mapping: click the Add button; set Variable Name to User::ReconcileStartDateTime, Data Type to NVARCHAR, and Parameter Name to 0.
Result: when the SSIS package executes, the datetime is set accordingly. SQL Profiler helps to see the output.
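Putting the steps above together, the two statements involved are (table and column names as given in Step 03):

```sql
-- Step 02 task: capture the current datetime as a string
-- (ResultSet = Single Row; StartDateTime mapped to User::ReconcileStartDateTime)
SELECT CAST(GETDATE() AS NVARCHAR(100)) AS StartDateTime;

-- Step 03 task: cast the string parameter back to datetime
-- (? mapped to User::ReconcileStartDateTime, NVARCHAR, parameter name 0)
UPDATE OrderDetail SET StartDate = CAST(? AS DATETIME) WHERE ID = 101;
```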
Related
I was running a SQL migration that first checked some conditions to determine whether it should execute, and I was surprised to discover that SQL Server failed the script even when the guarded code was not executed. What was even more surprising: only broken INSERTs into existing tables caused the failure, while broken SELECTs and INSERTs into non-existing tables did not cause any failures.
Here is a minimalistic example.
The test table creation script:
CREATE TABLE [dbo].[TableWithManyColumnsTheFirstIsGuid](
[Id] [uniqueidentifier] NOT NULL,
[Value] [int] NOT NULL
)
The test script:
IF 1 = 2 -- to never execute
BEGIN
INSERT INTO NoSuchTable VALUES ('failure');
SELECT * FROM NoSuchTable;
INSERT INTO TableWithManyColumnsTheFirstIsGuid VALUES ('failure');
END
You will receive the error "Column name or number of supplied values does not match table definition", which is correct for the third command, because there is no way for SQL Server to insert a single string value into a table with GUID and int columns.
Why does it fail at all if that faulty INSERT statement would be never executed because of the IF condition?
If the answer is "because SQL Server validates commands in scripts even when not executing them", then the question is: why do the other two broken commands not cause a failure?
This behavior can bite you hard when you have a migration that removes a column and a disabled old migration that inserts data into that column: the disabled migration will cause your entire migration script to fail. Of course, you can clean up your migrations, but that is manual work and not automation-friendly when the migration scripts are generated by EntityFramework.
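One possible workaround (a sketch only; I have not verified it against EntityFramework-generated scripts) is to wrap the offending INSERT in dynamic SQL, since the string is not compiled until EXEC actually runs:

```sql
IF 1 = 2 -- to never execute
BEGIN
    -- The statement inside the string is only parsed and bound
    -- when sp_executesql actually runs, so the column-count check is deferred.
    EXEC sp_executesql
        N'INSERT INTO TableWithManyColumnsTheFirstIsGuid VALUES (''failure'');';
END
```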
Is this behavior by design? Is it specified in ANSI SQL or is it Microsoft-specific?
I have an SSIS Package, which contains multiple flows.
Each flow is responsible for creating a "staging" table, which gets filled up after creation.
These tables are global temporary tables.
I added one extra flow (I did not build the package) which does exactly as stated above, for another table. However, for some reason the package fails intermittently on this flow, even though it is exactly the same as the others apart from some table names.
The error that keeps popping up:
Update - Insert Data Flow:Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Unspecified error". An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "The metadata could not be determined because statement 'select * from '##TmpMcsConfigurationDeviceHistory86B34BFD041A430E84CCACE78DA336A1'' uses a temp table.".
Creation expression:
"CREATE TABLE " + #[User::TmpMcsConfigurationDeviceHistory] + " ([RecId] [bigint] NULL,[DataAreaID] [nvarchar](4) COLLATE database_default NULL,[Asset] [bigint] NULL,[Code] [nvarchar](255) COLLATE database_default NULL,[Configuration] [bigint],[StartdateTime] [datetime] NULL,[EndDateTime] [datetime] NULL)
"
Parsed expression (=evaluated):
CREATE TABLE ##TmpMcsConfigurationDeviceHistory764E56F088DC475C9CC747CC82B9E388 ([RecId] [bigint] NULL,[DataAreaID] [nvarchar](4) COLLATE database_default NULL,[Asset] [bigint] NULL,[Code] [nvarchar](255) COLLATE database_default NULL,[Configuration] [bigint],[StartdateTime] [datetime] NULL,[EndDateTime] [datetime] NULL)
Using WITH RESULT SETS to explicitly define the metadata allows SSIS to skip the sp_describe_first_result_set step and use the metadata that you define. The upside is that you can get SSIS to execute SQL that contains a temporary table (for me, the performance helped a lot); the downside is that you have to maintain and update the definition manually if anything changes.
Query sample (stored procedure):
EXEC ('dbo.MyStoredProcedure')
WITH RESULT SETS
(
(
MyIntegerColumn INT NOT NULL,
MyTextColumn VARCHAR(50) NULL,
MyOtherColumn BIT NULL
)
)
Query sample (simple SQL):
EXEC ('
CREATE TABLE #a
(
MyIntegerColumn INT NOT NULL,
MyTextColumn VARCHAR(50) NULL,
MyOtherColumn BIT NULL
)
INSERT INTO #a
(
MyIntegerColumn,
MyTextColumn,
MyOtherColumn
)
SELECT
1 AS MyIntegerColumn,
''x'' AS MyTextColumn,
0 AS MyOtherColumn
SELECT MyIntegerColumn, MyTextColumn, MyOtherColumn
FROM #a')
WITH RESULT SETS
(
(
MyIntegerColumn INT NOT NULL
,MyTextColumn VARCHAR(50) NULL
,MyOtherColumn BIT NULL
)
)
Another option (kind of a hack, but it works and doesn't require you to change your use of global temp tables) is to put a SET FMTONLY ON command in front of your actual query, sending a fake "first result set" to SSIS with your correct column structure. So you can do something like:
SET FMTONLY ON
select 0 as a, 1 as b, 'test' as C, GETDATE() as D
SET FMTONLY OFF
select a, b, c, d from ##TempTable
When SSIS runs sp_describe_first_result_set, it will return the metadata and column names of your FMTONLY command, and won't complain about not being able to determine the metadata of your temp table because it won't even try.
If you are working on SSIS 2012, it uses the system stored procedure sp_describe_first_result_set to fetch the metadata of the tables, and that procedure does not support temporary tables. But you can go for other options like table variables and CTEs, which work fine. https://connect.microsoft.com/SQLServer/feedback/details/629077/denali-engine-metadata-discovery-shuns-temp-tables
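As a sketch of the table-variable option (the column names here are made up), the staging pattern that fails with a #temp table resolves fine when written like this:

```sql
DECLARE @Staging TABLE
(
    MyIntegerColumn INT NOT NULL,
    MyTextColumn    VARCHAR(50) NULL
);

INSERT INTO @Staging (MyIntegerColumn, MyTextColumn)
SELECT 1, 'x';

-- sp_describe_first_result_set can determine this result set's metadata,
-- unlike one that selects from a #temp or ##temp table.
SELECT MyIntegerColumn, MyTextColumn
FROM @Staging;
```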
I faced a similar issue when SSIS packages were migrated from 2008 to 2016. The latest version uses sp_describe_first_result_set to fetch the metadata, and it does not work with temporary tables. As a workaround, I used the query below in the OLE DB Source editor. I did not change the SQL stored procedure, and it still uses a temporary table. Do be sure to use the Parse Query and Preview options to ensure it works fine. See the image below.
Query:
EXEC [dbo].[spGetNames]
WITH RESULT SETS((
FirstName varchar(50),
LastName varchar(50)
));
I had the same issue, as we use temp tables for staging. After spending some time, I found a workaround.
In the OLE DB/ADO destination of the Data Flow Task where you specify the name of the staging table, change the AccessMode property to SQL Command instead of OpenRowset, and set the SQL Command property to select * from #temp.
Hurray, it's working as expected.
The catch here is that when you specify an access mode other than SQL Command, SSIS expects it to be a table/view, and it calls sp_describe_first_result_set to get the metadata. But when you specify SQL Command, SSIS expects a query or a stored-procedure call, so luckily it still uses the old way of getting the metadata.
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/cfe1c7c1-910a-4f52-9718-c3406263b177/usage-of-temp-tables-in-ssis-2012?forum=sqlintegrationservices#cfe1c7c1-910a-4f52-9718-c3406263b177
I found that the problem lay in a duplicate-GUID issue: I had copied elements (like the one that creates the temp tables), and they all kept the same GUID after copying. I used a tool to reset all these GUIDs in my package, and this solved my problem.
Thanks!
I receive the following error (taken from replication monitor):
The option 'FILETABLE_STREAMID_UNIQUE_CONSTRAINT_NAME' is only valid when used on a FileTable. Remove the option from the statement. (Source: MSSQLServer, Error number: 33411)
The command attempted is:
CREATE TABLE [dbo].[WP_CashCenter_StreamLocationLink](
[id] [bigint] NOT NULL,
[Stream_id] [int] NOT NULL,
[Location_id] [numeric](15, 0) NOT NULL,
[UID] [uniqueidentifier] NOT NULL
)
WITH
(
FILETABLE_STREAMID_UNIQUE_CONSTRAINT_NAME=[UC_StreamLocation]
)
Now, for me there's two things unclear here.
The table already existed on the subscriber, and I've set @pre_creation_cmd = N'delete' for the article, so I don't expect the table to be dropped and re-created. In fact, the table still exists on the subscriber side, although the CREATE TABLE command failed to complete. What am I missing? Where does this CREATE TABLE command come from, and why?
I don't understand why this FILETABLE_STREAMID_UNIQUE_CONSTRAINT_NAME option appears in the creation script. I tried generating a CREATE TABLE script from the table in SSMS and indeed, it's there. But what's weird is that I can't drop and re-create the table this way either: I get the very same error message.
EDIT: Ok, I guess now I know why the table is still there - I noticed begin tran in sql server profiler.
If your table on the publisher is truly not defined as a FileTable, then the issue has to do with the column named "Stream_id". I believe there is a known issue in SQL Server 2012 where, if you have a column named "Stream_id" (which is more or less reserved for FileTable/FILESTREAM), it will automatically add that constraint and, unfortunately, break replication. The workaround here is to rename the column to something other than "Stream_id".
Another workaround is to set the schema option to not replicate constraints (guessing this will work). If you require constraints on the subscriber, you can then try to apply them manually on the subscriber after the fact (or script them out and use @post_snapshot_script).
I have created a table called DimInternationalFunction.
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[DimInternationalFunction]') AND type in (N'U'))
DROP TABLE [DimInternationalFunction]
Go
Create Table DimInternationalFunction
(IntFunctionKey int NOT NULL identity primary key,
SubSubFunctionString char(10),
FunctionCode char(3),
SubFunctionCode char(6),
SubSubFunctionCode char(10),
SubSubFunctionName nvarchar(60),
SubFunctionName nvarchar(60),
FunctionName nvarchar(60))
I have initially inserted records in this table in SSMS.
After inserting the initial records manually in SSMS, my manager now wants me to insert "new records only" using SSIS.
I have tried the query below in SSMS and it worked: as a result it inserts either 0 records or, sometimes, 5 records. My manager wants me to do this in SSIS.
I tried using this script inside the OLE DB source, with Data Access Mode set to SQL Command and this SQL command text:
insert into DWResourceTask.dbo.DimInternationalFunction
select f.SubSubFunctionString,
f.FunctionCode,
f.SubFunctionCode,
f.SubSubFunctionCode,
f.SubSubFunctionName,
f.SubFunctionName,
f.FunctionName
from ODS_Function F
where FunctionViewCode = 'INT'
and not exists (select * from DWResourceTask.dbo.DimInternationalFunction I
where (f.SubSubFunctionString=i.SubSubFunctionString
and f.FunctionCode=i.FunctionCode
and f.SubFunctionCode=i.SubFunctionCode
and f.SubSubFunctionCode=i.SubSubFunctionCode
and f.SubSubFunctionName=i.SubSubFunctionName
and f.SubFunctionName=i.SubFunctionName
and f.FunctionName=i.FunctionName)
)
The error message that I got after clicking Preview is:
The component reported the following warnings:
Error at Int Function [International Function Table [33]]: No column information was returned by the SQL command.
Choose OK if you want to continue with the operation.
Choose Cancel if you want to stop the operation.
Is there another component in SSIS that can do this? Or can I just use either the Execute SQL Task component or the OLE DB source?
I am thinking of using an Execute SQL Task connected to a Data Flow Task; inside the Data Flow Task I would put an OLE DB source containing a staging table and do a delete on that. Or is there another way to do it? Please help. Thanks in advance.
You could do it with an Execute SQL task.
If you want to do it "the pure SSIS way", you could use a Lookup component. Set the handling of rows with no matching entries to "Redirect rows to no match output", and configure the target table as the lookup connection. Then use the "No Match Output" only, ignoring the "Match Output", and send the records from the "No Match Output" to the target.
In spite of its name, the "Lookup" component can be used to filter data in many cases.
But I would assume the Execute SQL Task is more efficient for large data sets, since it keeps all the data within the database engine.
We are using SSIS to transfer the contents of 3 tables from ServerA to ServerB
ServerA is located inhouse, ServerB is in a datacenter.
There is VPN connectivity from ServerA to ServerB, and vice-versa.
For security reasons, we are looking to remove the ability for ServerB to “see” ServerA
The current SQL select statement goes something like this:
SELECT * FROM ServerB.OrderTable WHERE NOT IN ServerA.OrderTable
(I appreciate the syntax is off.)
These records are then inserted into ServerA.OrderTable (the table is identical).
This works great- only transferring records that are not in ServerA
However, this requires ServerB to be aware of ServerA
What I propose to do is put a “Transferred” bit column on each table, and loop through each record, setting Transferred to true.
That way, the above mentioned SQL statement could be changed to something like:
SELECT * FROM ServerB.OrderTable WHERE Transferred = 0
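A sketch of how the proposed flag could be driven from plain SQL (assuming the Transferred column has been added to OrderTable on ServerB):

```sql
-- On ServerB: pick up rows that have not been transferred yet
SELECT * FROM OrderTable WHERE Transferred = 0;

-- After those rows have been inserted on ServerA,
-- mark them as transferred on ServerB:
UPDATE OrderTable SET Transferred = 1 WHERE Transferred = 0;
```

In practice you would key the UPDATE on the specific IDs you actually copied, so rows that arrive between the SELECT and the UPDATE are not marked prematurely.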
My question is, how to go about doing this?
I have been advised that a Foreach Loop container could do this, but I can't find how anywhere...
Could someone point me in the right direction
Probably you have found an answer to your question by now; this answer is to help others who might stumble upon the same question. Here is one possible way to handle the data transfer using SSIS. I have assumed that you can still create connection strings pointing to both your servers A and B from the SSIS package; if that assumption is wrong, please let me know so I can delete this answer. This example uses SQL Server 2008 R2 as the back-end. Since I don't have two servers, I have created two identical tables in different schemas, ServerA and ServerB.
Step-by-step process:
In the Connection Managers section of the SSIS package, create two OLE DB connections named ServerA and ServerB. This example points to the same server, but in your scenario the connections will point to your two different servers. Refer to screenshot #1.
Create two schemas, ServerA and ServerB, and create the table ItemInfo in both schemas. Creation scripts for these tables are given under the Scripts section. Again, these objects are for this example only.
I have populated both tables with some sample data. Table ServerA.ItemInfo contains 2,222 rows and table ServerB.ItemInfo contains 10,000 rows. As per the question, the missing 7,778 rows should be transferred from ServerB to ServerA. Refer to screenshot #2.
On the SSIS package's control flow tab, place a data flow task as shown in screenshot #3.
Double-click on the data flow task to navigate to the data flow tab, and configure it as described below. Server B is an OLE DB source; Find record in Server A is a Lookup transformation; Server A is an OLE DB destination.
Configure OLE DB Source Server B as shown in screenshots #4 and #5.
Configure the Lookup transformation Find record in Server A as shown in screenshots #6 - #8. In this example, ItemId is the unique key, so that is the column used to search for missing records between the two tables. Since we need only the rows that do not exist in Server A, select the option Redirect rows to no match output.
Place an OLE DB destination on the data flow. When you connect the Lookup transformation to the OLE DB destination, you will be prompted with the Input Output Selection dialog; select Lookup No Match Output as shown in screenshot #9. Configure the OLE DB destination Server A as shown in screenshots #10 and #11.
Once the data flow task is configured, it should look as shown in screenshot #12.
A sample execution of the package is shown in screenshot #13. As you can notice, the missing 7,778 rows have been transferred from Server B to Server A. Refer to screenshot #14 to view the table record counts after the package execution.
Since the requirement was just to insert the missing records, this approach has been used. If you would like to update existing records and delete records that are no longer valid, please refer to the example I have provided in this link: SQL Integration Services to load tab delimited file? The example in the link shows how to transfer a flat file to SQL Server, but it also updates existing records and deletes invalid ones. The example is also tuned to handle a large number of rows.
Hope that helps.
Scripts
CREATE SCHEMA [ServerA] AUTHORIZATION [dbo]
GO
CREATE SCHEMA [ServerB] AUTHORIZATION [dbo]
GO
CREATE TABLE [ServerA].[ItemInfo](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [ItemId] [varchar](255) NOT NULL,
    [ItemName] [varchar](255) NOT NULL,
    [ItemType] [varchar](255) NOT NULL,
    CONSTRAINT [PK_ItemInfo] PRIMARY KEY CLUSTERED ([Id] ASC),
    CONSTRAINT [UK_ItemInfo_ItemId] UNIQUE NONCLUSTERED ([ItemId] ASC)
) ON [PRIMARY]
GO
CREATE TABLE [ServerB].[ItemInfo](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [ItemId] [varchar](255) NOT NULL,
    [ItemName] [varchar](255) NOT NULL,
    [ItemType] [varchar](255) NOT NULL,
    CONSTRAINT [PK_ItemInfo] PRIMARY KEY CLUSTERED ([Id] ASC),
    CONSTRAINT [UK_ItemInfo_ItemId] UNIQUE NONCLUSTERED ([ItemId] ASC)
) ON [PRIMARY]
GO
Screenshot #1:
Screenshot #2:
Screenshot #3:
Screenshot #4:
Screenshot #5:
Screenshot #6:
Screenshot #7:
Screenshot #8:
Screenshot #9:
Screenshot #10:
Screenshot #11:
Screenshot #12:
Screenshot #13:
Screenshot #14: