Temp Table Dropped Immediately in SSIS - sql-server

I have successfully deployed many SSIS packages (in SQL Server 2008 R2).
I have discovered an issue that is stumping me. Examine the following basic flow.
In 'Download Files from FTP' I pick up the new files I want to operate on and download them to my local drive, saving the picked up files so they can be iterated upon in the Process Each File foreach container.
In 'Create Temp Table' I am creating a temp table as such:
IF not exists (SELECT * FROM tempdb.dbo.sysobjects WHERE name='##tempProcessFiles' and xtype='U')
CREATE TABLE ##tempProcessFiles
(
--my columns
)
GO
In 'Truncate Temp Table' I am doing:
TRUNCATE TABLE ##tempProcessFiles
Basically, I pick up some files, create a temp table and then loop through each file, loading the necessary junk to my database. In order to make sure everything runs smoothly, I truncate the temp table on each iteration so I have a fresh table to work with. On the very last step of this package, I drop the temp table. I also drop the temp table OnError.
The problem is, when I schedule this guy, create temp table executes fine but once it reaches 'truncate temp table', it throws an exception saying the temp table does not exist. Specifically:
Executing the query "TRUNCATE TABLE ##tempProcessFiles" failed with
the following error: "Cannot find the object "##tempProcessFiles"
because it does not exist or you do not have permissions.". Possible
failure reasons: Problems with the query, "ResultSet" property not set
correctly, parameters not set correctly, or connection not established
correctly.
Running this package in debug mode does not recreate this scenario. Everything works fine.
I discovered (after many days of frustration, chasing ideas that sent me in circles) that by removing the GO statement in 'Create Temp Table' my package would execute normally.
I verified that there were no errors and no accidental dropping of my temp table. Plus, I had Persist Security Info set to false like always, and DelayValidation set to true like always; but my drop temp table task wasn't firing - I even tried deploying the package with that task disabled. None of my other packages used this global temp table, and the database had no scheduled jobs or triggers dropping it or anything ridiculous like that.
All I can determine is that the GO separator here was causing my session to the database to terminate, which caused my temp table to be dropped immediately. Is this how the GO operator works in SSIS? If I were to run a script on SQL Server using the exact same syntax as I have in my package, I definitely would not experience this kind of thing, so it's thrown me for a loop. But I'm not a DBA, so it's probably something fundamental or subtle to that area of work.
Can anyone explain what really happened here? Or would I need to provide more detail about my package to get a sufficient answer? The only change I made was removing GO from 'Create Temp Table', so that was definitely the fix.

It seems that the connection is being dropped after all, despite RetainSameConnection.
To gather more information, try the following:
Run a trace while executing the package to confirm the above (add Security Audit > Audit Login/Logout and Sessions > Existing Connections). Pay attention to StartTime and EndTime; maybe you'll see something interesting like 30 s (the default command timeout)?
Then try running the package with the Create and Truncate tasks only (without the loop) to rule out interference from other components (Data Sources using the same Connection Manager?). If that works, try it with the loop but only the Truncate task inside, and keep adding the other components back.
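If the GO separator is indeed the culprit, one workaround (a sketch, consistent with the asker's eventual fix of removing GO) is to fold the create and truncate logic into a single batch with no GO, so one Execute SQL Task both creates the table on first use and empties it on every later iteration. The Id column below is a placeholder for the asker's real columns.

```sql
-- Create the global temp table if missing; otherwise empty it.
-- Single batch: no GO separator, so it all runs in one statement
-- block over the retained connection.
IF OBJECT_ID('tempdb..##tempProcessFiles', 'U') IS NULL
    CREATE TABLE ##tempProcessFiles
    (
        -- my columns
        Id int
    );
ELSE
    TRUNCATE TABLE ##tempProcessFiles;
```

This also removes the need for a separate 'Truncate Temp Table' task inside the loop, since the same task is safe to run on every iteration.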

Related

SSIS returns an incorrect error

I created an SSIS package which creates a table MyTable in SQL Server with a column BaseVariantVersionID. The program first inserts data into this table.
At the end of the package, I drop the column BaseVariantVersionID from the table.
The first debug run is OK. But on the second attempt, SSIS returns a validation error. It won't let me recreate the table with BaseVariantVersionID, because it decides the next step cannot insert into a column that is no longer present.
Maybe you know some property to disable validation against the current database?
Update
I drop the column after all the steps, and in the first step I recreate the table with the column.
But the system still returns the error - it looks like it uses the existing table for validation.
There are several possible causes I can think of.
First, make sure you drop the table if it already exists before creating it at the beginning of the package.
Example Execute SQL Task:
IF OBJECT_ID('MyTable', 'U') IS NOT NULL
    DROP TABLE MyTable
GO
CREATE TABLE MyTable (etc...)
GO
ALTER TABLE MyTable ADD (etc...)
GO
Second, you can set DelayValidation = True in the Data Flow Task Properties Window (this is on the bottom right usually after clicking a Dataflow Task in the design area). That delays validation until run time.
If at the moment, you have a missing field error, you can add the column manually in SQL Server Management Studio, then double-click the task with the error and any missing field error should disappear (now that the column exists). After that you can save the package and exit.
For an Execute SQL Task, you can set BypassPrepare to True. Sometimes that will allow you to design and build a package which doesn't validate at design time, but will validate okay at run time.
But I have to question the need to create columns and tables at run time. Are you sure you need to do this? It is a more typical use case to have SSIS move data around in existing table structures, not create them at run time.
If I read your description correctly, you're dropping the column on the first pass, then getting an error trying to recreate the table on the second pass? You will get an error on run #2 if you try to create a table that already exists, even if it no longer has BaseVariantVersionID in it.
You'll need to drop the table if you're going to want to recreate it or change your code to add the column back if the table exists.
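The "add the column back if the table exists" route could be sketched like this, using the table and column names from the question (the Id column and int type are assumptions for illustration):

```sql
-- Recreate the table only when it is missing; otherwise just
-- re-add the previously dropped column if it is absent.
IF OBJECT_ID('MyTable', 'U') IS NULL
    CREATE TABLE MyTable
    (
        Id int,
        BaseVariantVersionID int NULL
    );
ELSE IF COL_LENGTH('MyTable', 'BaseVariantVersionID') IS NULL
    ALTER TABLE MyTable ADD BaseVariantVersionID int NULL;
```

COL_LENGTH returns NULL when the column does not exist, which makes it a convenient existence check for this kind of idempotent script.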

Conditional SQL block evaluated even when it won't be executed

I'm working on writing a migration script for a database, and am hoping to make it idempotent, so we can safely run it any number of times without fear of it altering the database (/ migrating data) beyond the first attempt.
Part of this migration involves removing columns from a table, but inserting that data into another table first. To do so, I have something along these lines.
IF EXISTS
    (SELECT * FROM sys.columns
     WHERE object_id = OBJECT_ID('TableToBeModified')
     AND name = 'ColumnToBeDropped')
BEGIN
    CREATE TABLE MigrationTable (
        Id int,
        ColumnToBeDropped varchar
    );

    INSERT INTO MigrationTable
        (Id, ColumnToBeDropped)
    SELECT Id, ColumnToBeDropped
    FROM TableToBeModified;
END
The first time through, this works fine, since the column still exists. However, on subsequent attempts, it fails because the column no longer exists. I understand that the entire script is evaluated, and I could instead put the inner contents into an EXEC statement, but is that really the best solution to this problem, or is there another, still potentially "validity enforced" option?
I understand that the entire script is evaluated, and I could instead put the inner contents into an EXEC statement, but is that really the best solution to this problem
Yes. There are several scenarios in which you would want to defer the parsing validation due to dependencies elsewhere in the script. I will even sometimes put things into an EXEC when there are no current problems, to ensure that there won't be any later as either the rest of the script or the environment changes after the current rollout script was developed. As a minor benefit, it also helps break things up visually.
While there can be permissions issues related to breaking ownership chaining when using Dynamic SQL, that is rarely a concern for a rollout script, and not a problem I have ever run into.
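For completeness, the asker's block wrapped in EXEC would look roughly like this. The references to ColumnToBeDropped inside the dynamic string are only compiled when EXEC runs, after the outer IF has confirmed the column exists (the varchar(100) length is an assumption; the original declared a bare varchar):

```sql
IF EXISTS
    (SELECT * FROM sys.columns
     WHERE object_id = OBJECT_ID('TableToBeModified')
     AND name = 'ColumnToBeDropped')
BEGIN
    -- Dynamic SQL is not parsed until EXEC runs, so this batch
    -- still validates after ColumnToBeDropped has been removed.
    EXEC('
        CREATE TABLE MigrationTable (
            Id int,
            ColumnToBeDropped varchar(100)
        );
        INSERT INTO MigrationTable (Id, ColumnToBeDropped)
        SELECT Id, ColumnToBeDropped
        FROM TableToBeModified;
    ');
END
```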
If you are not sure whether the script will work, especially when migrating a database, wrap the data changes in a transaction. Execute the script with BEGIN TRAN, check that the result is as expected, and then COMMIT TRAN; otherwise ROLLBACK TRAN, which discards the transaction.
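That check-then-commit pattern, sketched out (the table, column, and update here are hypothetical stand-ins):

```sql
BEGIN TRAN;

-- the data change under test (hypothetical example)
UPDATE TableToBeModified
SET SomeColumn = 'new value'
WHERE Id = 1;

-- inspect the result before deciding
SELECT Id, SomeColumn
FROM TableToBeModified
WHERE Id = 1;

-- if the result is as expected:
COMMIT TRAN;
-- otherwise, discard everything since BEGIN TRAN:
-- ROLLBACK TRAN;
```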

SSIS transaction with MSDTC service

So I have been struggling with handling transactions in SSIS. My requirement is to achieve a transaction without enabling the MSDTC service, and I have partially achieved that, but I just got another error which I feel is one of the many bugs in SSIS. I used an Execute SQL Task and explicitly issued BEGIN TRAN and COMMIT/ROLLBACK TRAN in my package. My package is working fine. All the tables are enclosed in a sequence container. I have a condition where the output from one table goes into 2 different tables, and that's where the problem is. The funny part is that even if the package fails, I will still see inserts in just these two tables. The SSIS layout is shown in the attached image; I have disabled two tables. These two tables take input from Frholdsum, and even if the package fails and there is no data in the FDR holdssum tables, those inserts remain. Microsoft never ceases to amaze me :(
Set RetainSameConnection on your ConnectionManager to true.
https://munishbansal.wordpress.com/2009/04/01/how-to-retain-same-data-connection-across-multiple-tasks-in-ssis/
It's working fine if I explicitly write delete statements after ROLLBACK TRAN, like this:
rollback tran;
delete from dbo.UCOP_ENDOW_INVEST;
delete from dbo.ucop_fdr_attrib;
I shouldn't have to do this though :(

SQL 2008 All records in column in table updated to NULL

About 5 times a year one of our most critical tables has a specific column where all the values are replaced with NULL. We have run log explorers against this and we cannot see any login/hostname populated with the update, we can just see that the records were changed. We have searched all of our sprocs, functions, etc. for any update statement that touches this table on all databases on our server. The table does have a foreign key constraint on this column. It is an integer value that is established during an update, but the update is identity key specific. There is also an index on this field. Any suggestions on what could be causing this outside of a t-sql update statement?
I would start by denying any client side dynamic SQL if at all possible. It is much easier to audit stored procedures to make sure they execute the correct sql, including a proper where clause. Unless your sql server is terribly broken, the only way data is updated is because of the sql you are running against it.
All stored procs, scripts, etc. should be audited before being allowed to run.
If you don't have the mojo to enforce no dynamic client sql, add application logging that captures each client sql statement before it is executed. Personally, I would have the logging routine throw an exception (after logging it) when a where clause is missing, but at a minimum, you should be able to figure out where data gets blown out next time by reviewing the log. Make sure your log captures enough information that you can trace it back to the exact source. Assign a unique "name" to each possible dynamic sql statement executed, e.g., assign a 3-char code to each program and then number each possible call 1..nn within the program, so you can tell which call blew up your data at "abc123" as well as the exact sql that was defective.
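A minimal sketch of that logging gate (the table and procedure names here are hypothetical): route every client statement through a procedure that logs it with its per-program tag, then refuses an UPDATE with no WHERE clause before executing it.

```sql
CREATE TABLE dbo.ClientSqlLog
(
    LogId     int IDENTITY(1,1) PRIMARY KEY,
    SourceTag char(6) NOT NULL,           -- e.g. 'abc123': program code + call number
    SqlText   nvarchar(max) NOT NULL,
    LoggedAt  datetime2 NOT NULL DEFAULT SYSDATETIME()
);
GO
CREATE PROCEDURE dbo.LogAndExec
    @SourceTag char(6),
    @Sql       nvarchar(max)
AS
BEGIN
    -- log first, so even a rejected statement leaves a trace
    INSERT INTO dbo.ClientSqlLog (SourceTag, SqlText)
    VALUES (@SourceTag, @Sql);

    -- crude guard: refuse an UPDATE that carries no WHERE clause
    IF @Sql LIKE 'UPDATE%' AND @Sql NOT LIKE '%WHERE%'
        RAISERROR('UPDATE without WHERE clause rejected (tag %s)', 16, 1, @SourceTag);
    ELSE
        EXEC sp_executesql @Sql;
END
```

The LIKE check is deliberately crude (it would miss, say, a leading comment), but it is enough to catch the accidental whole-table UPDATE the poster describes, and the log row pins down which client call issued it.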
ADDED COMMENT
Thought of this later. You might be able to add / modify the update trigger on the sql table to look at the number of rows updated and prevent the update if that number exceeds a threshold that makes sense for you. I did a little searching and found that someone has already written an article on this, as in this snippet:
CREATE TRIGGER [Purchasing].[uPreventWholeUpdate]
ON [Purchasing].[VendorContact]
FOR UPDATE AS
BEGIN
DECLARE @Count int
SET @Count = @@ROWCOUNT;
IF @Count >= (SELECT SUM(row_count)
              FROM sys.dm_db_partition_stats
              WHERE OBJECT_ID = OBJECT_ID('Purchasing.VendorContact')
              AND index_id = 1)
BEGIN
RAISERROR('Cannot update all rows',16,1)
ROLLBACK TRANSACTION
RETURN;
END
END
Though this is not really the right fix, if you log this appropriately, I bet you can figure out what tried to screw up your data and fix it.
Best of luck
A transaction log explorer should be able to show who executed the command, when, and exactly what the command looked like.
Which log explorer do you use? If you are using ApexSQL Log, you need to enable the connection monitor feature in order to capture additional login details.
This might be like using a sledgehammer to drive in a thumb tack, but have you considered using SQL Server Auditing (provided you are using SQL Server Enterprise 2008 or greater)?

How did my trigger get deleted?

If you can figure out this one you are a true SQL guru! It's one of the weirdest things I've ever seen.
I've added a trigger to a table in our database. The server is SQL 2008. The trigger doesn't do anything particularly tricky. Just changes a LastUpdated field in the table when certain fields are changed. It's a "After Update" trigger.
There is a large C++ legacy app that runs all kind of huge queries against this database. Somehow (I've got absolutely no idea how) it is deleting this trigger. It doesn't delete any other triggers and I'm certain that it's not explicitly dropping the trigger or table. The developers of this app don't even know anything about my triggers.
How is this possible??
I've tried running a trace using SQL Server Profiler and I've gone through each command that it's sending and run them using SQL Management Studio but my trigger is not affected. It only seems to happen when I run the app. WTF :(
UPDATE:
Sorry I don't want to waste your time. I just realised that if I change the name of the trigger then it doesn't get deleted. Furthermore if I modify the trigger so it doesn't do anything at all then it still gets deleted. From this I can only guess that the other devs are explicitly deleting it but I've searched the trace for the trigger name and it's not there. I'll hassle them and see what they say. Thanks for the suggestions.
UPDATE 2:
The other devs reckon that they are not deleting it explicitly. It doesn't exist in sys.objects or sys.triggers so it's not a glitch with SSMS. So confused :( Guess I'll just rename it and hope for the best? Can't think of anything else to try. A few comments below have asked if the trigger is being deleted, or just disabled or not working. As I stated, it's being deleted completely. Also, the problem is not related to the actual contents of the trigger. As I stated, if I remove the contents and replace them with some extremely simple code that doesn't do anything, it still gets deleted.
Cheers
Mark
Thoughts:
To delete a trigger requires ALTER permission = shouldn't be used by an app
Triggers can be disabled with ALTER TABLE
Triggers can be confused by testing for @@ROWCOUNT at the beginning to trap dummy updates etc
Is the trigger coded for single rows only, so it appears not to run?
Does the trigger exist in sys.objects/sys.triggers? Don't rely on Object Explorer in SSMS
A trigger will be deleted if the table is dropped and re-created
A trigger won't fire for TRUNCATE TABLE
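A quick check for a couple of the points above (does the trigger still exist in the catalog, and is it merely disabled?); the trigger name is a placeholder:

```sql
-- Look the trigger up directly in the catalog views rather than
-- trusting Object Explorer's cached tree in SSMS.
SELECT t.name,
       t.is_disabled,
       OBJECT_NAME(t.parent_id) AS parent_table
FROM sys.triggers AS t
WHERE t.name = 'MyTrigger';   -- hypothetical trigger name
```

No row back means the trigger really is gone, not just disabled or hidden by a stale Object Explorer view.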
I had an identical issue which I tracked down to a creation script missing a final GO statement.
Script 1
IF EXISTS (....)
DROP PROC MyProc
GO
CREATE PROC MyProc
.....
/* GO statement is missing */
Script 2
IF EXISTS (....)
DROP TRIGGER MyDisappearingTrigger
GO
CREATE TRIGGER MyDisappearingTrigger
.....
GO
When I inspected MyProc in the object explorer it looked like this:
CREATE PROC MyProc
AS
...
IF EXISTS (....)
DROP TRIGGER MyDisappearingTrigger
GO
So this meant that every time the stored proc was called the trigger was also deleted.
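The fix, for reference, is simply to close Script 1 with its own batch terminator so Script 2's DROP is not absorbed into the procedure body:

```sql
IF EXISTS (....)
DROP PROC MyProc
GO
CREATE PROC MyProc
.....
GO   -- this terminator was missing
```

With the GO in place, the CREATE PROC batch ends where intended, and the DROP TRIGGER in the next script runs once at deployment time instead of on every call to MyProc.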
Also check whether a MERGE command is used against the table; it can cause triggers to error.
