DTS - Manual step execution gives different results than just hitting "Run"

I have a DTS (not SSIS) package that hasn't been touched in years, in which I had to update a query. When I run the package by manually executing each step in the editor, everything works fine and generates a file of a couple thousand records, as expected. When I hit the "Execute" button at the top of the editor to run the whole package, it doesn't error, but the file is generated with only 1 record.
All tasks inside the package are either transformation steps or SQL tasks; there are no ActiveX Script tasks. When I watch the package run the steps on its own, the execution follows the mapping correctly.
I'm at a loss on this one. Has anyone seen this issue before or have any idea where to start?

I just ran into a similar issue recently. While working with the senior DBA, we found that the server where the package ran did not have the right permissions to a directory on the network. The package ran fine on my box, but died on the production server. We needed to grant the SQL service account on the production box permission to write to the directory on the network.
You might also want to check any ActiveX Script step that changes the connection string or destination of Data Pump steps. I've had cases where these were different on the destination server where the DTS packages run.

After going through all of the lines of all of the stored procedures and straight SQL tasks used in the package, I located a SET ROWCOUNT 1 that was never reset. While I was manually executing each step separately, the row count limit was effectively reset, presumably because each manual execution gets its own connection; when it ran as a complete package, the ROWCOUNT was never reset. Adding SET ROWCOUNT 0 at the end of the particular script resolved the issue.
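For anyone who hits the same thing, a minimal sketch of the behaviour (the table name is just an example):

    SET ROWCOUNT 1;
    SELECT * FROM dbo.ExportStaging;  -- returns at most 1 row

    -- SET ROWCOUNT is scoped to the connection, so without this reset every
    -- later statement on the same connection is silently capped at 1 row.
    SET ROWCOUNT 0;
    SELECT * FROM dbo.ExportStaging;  -- returns all rows again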

Related

SQL Server SSIS package suddenly throws conversion error (csv) ONLY when running in job - works fine running the package manually and on test server

This is a new problem that began about three weeks ago on several jobs that have been running for years without problems, affecting both SSIS Flat File Sources and Script Tasks. The error occurs on READ and never makes it to the write.
The files are semicolon separated .csv files, and the columns that are throwing the error are decimal values. Since the locale is Danish, that means the decimal character is a comma.
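To illustrate the decimal-comma problem (plain T-SQL rather than what SSIS does internally, but the culture sensitivity is the same idea):

    SELECT TRY_PARSE('1,5' AS numeric(9,4) USING 'da-DK');  -- 1.5000: comma read as the decimal separator
    SELECT TRY_PARSE('1,5' AS numeric(9,4) USING 'en-US');  -- 15.0000: comma read as a grouping separator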
The job in SSMS only points to the SSIS package, which consists of only a data flow task, which contains a Flat File Source and an OLE DB Destination. The job fails before it can get to the destination, throwing the error: "Data conversion failed. The data conversion for column 'Pris' returned status value 2 and status text "The value could not be converted because of a potential loss of data""
The package runs without error manually from within VS 2019 using the latest SSIS extension. The errors come when running the package from a SQL Server Agent job.
My first instinct was to change the column properties on the flat file connection manager. It was DT_NUMERIC with a precision of 9 and scale of 4 (the destination is numeric(9,4)). I changed it to DT_R4 (float) instead and... well, that fixed it.
It's not THE fix though. I mean, it works for this particular file, but what about all the others? We have a LOT of them, and even though I could modify the packages for all of them, something bothers me about the fact that these jobs were running just fine for years...
I haven't found anything that might have triggered the change. On the day the problems started, the Service Account user was added to the local administrators group on another of our machines in order to be able to run a PS script on that machine. SSMS 18 was also installed on the server that day. Other than that, pretty uneventful.
The job runs as the SQL Server Agent Service Account. It runs 99% of our jobs; this is the only type I can see it acting strangely with.
I've checked the locale on the server, the file, the package, everything seems to be matching.
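(On the SQL side that was things like

    SELECT @@LANGUAGE AS SessionLanguage, SERVERPROPERTY('LCID') AS ServerLCID;

though as I understand it, the Flat File Source parses according to the connection manager's LocaleID and the regional settings of the executing account, rather than the server's own settings.)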
We have a parallel test server. Putting .csv files in the corresponding folders and running the jobs works fine. It's something either with the SQL Server or the Service Account.
I've just been fighting with this so much I can't see straight. Can anyone help me figure out what's going on and point me in the right direction so I can get this fixed?

SSIS package to load XLSX to SQL fails when run in Agent with Unexpected Termination

Let me start by saying I've read just about everything on 'Unexpected Termination' errors the last few days and am none the wiser!
I have a number of XLSX files in a network folder, and 2 packages in a project to read them and load them to SQL Server (2017) using the Excel connector (ACE.16) on dev and prod.
One package loops through about 35 workbooks and appends them all to a single SQL table.
The second package simply reads a single (~5 MB) file and writes to a SQL table (with a single data transformation along the way; it's not that, I removed it and it still failed).
Both packages use same source folder.
Both packages write to same destination DB.
Both packages run fine on my local machine.
Both packages can be executed correctly from the SSISDB catalog on the server (Right-click-->Execute).
When scheduled in Agent (we have a proxy account with the correct folder permissions), the looping package runs just fine, whilst the simpler package fails with 'Unexpected Termination'.
Verbose logging reveals nothing, as does Event Viewer on the SQL box.
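In case it sparks any ideas, this is roughly how I've been pulling messages out of the catalog after a failed run (the package name is illustrative):

    SELECT TOP (50) message_time, message_source_name, message
    FROM SSISDB.catalog.event_messages
    WHERE package_name = 'SimplePackage.dtsx'
    ORDER BY message_time DESC;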
I started to look into other options, such as converting to CSV first using a Script Task and Excel Interop, but we are a 365 site now, so we'd need Office client installs on the dev and production servers (and not knowing much C# won't help!).
My next route is to see if I can get it to write to CSV destination successfully. If so then I might be able to go XLSX-->CSV-->SQL without having to use Interop or external libraries.
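Another option I may test first, since ACE.16 is already on the servers, is hitting the provider directly from T-SQL to see whether it misbehaves outside SSIS as well; something like this (path and sheet name made up, and it needs 'Ad Hoc Distributed Queries' enabled):

    SELECT *
    FROM OPENROWSET('Microsoft.ACE.OLEDB.16.0',
                    'Excel 12.0 Xml;HDR=YES;Database=\\server\share\sourcefile.xlsx',
                    'SELECT * FROM [Sheet1$]');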
Unfortunately I've not been able to turn up anything further in my searches, so wondered if someone more enlightened than me might be able to suggest where to look next.

SSIS file exists check works in SSDT but not when running from SSISDB

This is one that has me stumped, and I've been doing this a long while.
Migrating to SQL Server 2016, with a large number of ETL packages. Easy enough.
One of the ETL packages has a simple Script Task that takes a table of files and runs a file-exists check in a Foreach Loop.
It uses a project parameter to build the UNC path (\\servername\share) and then binds that to the file name in the Script Task.
It uses an environment configuration set up in SSISDB.
Executing in SSDT works fine; deploy it to the catalog and it can't see the file. I know you'll say permissions, but I've granted the Everyone group access to the share and the drive in case it's that. SSISDB execution means it should be running under my security context, and I'm a domain admin, a local admin, and creator owner of the share.
Even stranger, I created a simple package to grab the contents of one of the files and import it into a dump table, in case the permissions or the path were duff (even though they work in SSDT, it might have been the environment config in SSISDB). This works fine, so it can't be the environment setup in SSISDB being referenced.
Please note this is not running from an Agent job yet, so it won't be due to an Agent service account issue. I need to get it running from SSISDB first, then I'll create the Agent job.
So: the Script Task can't see the UNC share, built from two variables, even though it works in SSDT and it's running under the same credentials...
For what it's worth, the Script Task code is:
' File.Exists needs Imports System.IO at the top of the script class
Dts.Variables("BolFileExists").Value = File.Exists(Dts.Variables("StrLoadFileLocation").Value.ToString & Dts.Variables("StrCurrentFile").Value.ToString)
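One more data point you can get from the catalog is which account each execution actually ran as:

    SELECT TOP (10) execution_id, package_name, executed_as_name, status
    FROM SSISDB.catalog.executions
    ORDER BY execution_id DESC;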
This is a slightly different answer, as it shows a different approach and removes the Script Task: I use a Foreach Loop to check whether the file exists, using the GUI tools provided by SSIS.
Well I found the answer and I deserve to punch myself in the face.
Tried everything. It was a file variable and a path variable being pulled together in the Script Task, so I tried concatenating them before the Script Task and pumped the result into a table to make sure the path being built was right.
Literally everything was fine and still didn’t work.
The issue....
Building it as a 2017 package and deploying onto a 2016 SQL Server.
I've not found exactly what was missing DLL-wise, but it must have been one of those that meant the Script Task couldn't find the files. Weird that it didn't break and just said the files weren't there! (For what it's worth, setting the project's TargetServerVersion property to SQL Server 2016 before deploying is the usual way to avoid the mismatch.)
Thanks all for the input, I'm going to go put my head in the door and slam it.

SQL Job completes successfully but does not execute packages

I have taken a look at several articles including this unanswered question: SQL Server Job runs successfully but doesn't execute packages
I have the exact same problem in SQL Server 2012 using the Integration Services MSDB package store. I can execute the SSIS packages manually from that store, but the Agent job doesn't do anything except state that it completed successfully. I have also executed my SSIS packages from within Visual Studio and they worked just fine. Here's the situation; I'm wondering if it may be permissions:
SSIS packages look for Excel files matching criteria in a network location.
Once found, the SSIS packages write the data into the database and archive the file to another folder in that same network location.
Emails are sent upon any failure of import of data into the database or migration into the archive folders.
I have the SQL Agent job running the SSIS packages from the package store (MSDB) under the SQL Server Agent service account. Currently we are not doing any sort of project deployment to these servers, so I am sticking with package deployment. Here are some steps I've taken:
Run packages manually from Visual Studio 2010 (fully successful).
Run packages manually from SQL Server MSDB catalog (fully successful).
Run job manually from SQL Server Agent using parent package as a step that will execute child packages as an external reference (success but nothing happens).
Run job manually from SQL Server Agent using each package as its own step excluding the parent package (success but nothing happens).
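In every case the job and step history report success; I've been checking with (job name illustrative):

    EXEC msdb.dbo.sp_help_jobhistory @job_name = N'Excel Import Job', @mode = N'FULL';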
Any ideas? Permissions to the network location or need a proxy? Again, I am running Microsoft SQL Server 2012 Enterprise Edition 64-bit. Many thanks for any help you can provide.
Found the problem. My SSIS package has a Foreach Loop container and, while the tasks inside the loop container couldn't access the destination, the loop container technically completed successfully. We had to grant the account the steps run under read and write permissions on the network location. Additionally, my Excel connection was 64-bit, so we set the step to use the 32-bit runtime, which allowed that portion of the process to complete successfully. I re-enabled any disabled tasks and it looks good to go now. Thanks!
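For reference, the 32-bit setting lives on the Agent job step (Execution options > "Use 32 bit runtime"), which corresponds to the /X86 switch in the step's command text; something like this should set it without the GUI (job name and command are illustrative):

    EXEC msdb.dbo.sp_update_jobstep
        @job_name = N'Excel Import Job',
        @step_id = 1,
        @subsystem = N'SSIS',
        @command = N'/SQL "\ParentPackage" /SERVER "(local)" /X86';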
I have also faced this scenario many times. When I ran the package manually it completed successfully, because I was using a Foreach Loop container and a Sequence container as well; in both cases the Foreach Loop and the Sequence completed without validating the other components. So I checked the precedence constraints and changed them, and now everything works and all the components run successfully.
Sometimes we forget to choose the appropriate precedence constraint. There are several options: Success, Failure, or Completion, and the evaluation operation can be Constraint, Expression, Expression AND Constraint, or Expression OR Constraint.
Initially I was using Expression OR Constraint for success; after changing it to Expression AND Constraint, it's working fine for me.
You may also need to do this; it should work. Please try it and let me know.

SQL Server Agent - Retrying/resuming failed steps of job

I created a SQL Server Agent job that consists of steps that call SSIS packages.
Let's say the job is executing and one of the steps (an SSIS package) has a File System Task/SQL Task etc. that fails. Is there any way I can retry/rerun that particular package step (File System Task/SQL Task etc.) and then carry on executing the rest of the job steps after I fix my error? I know that you can retry certain SQL Server Agent jobs and steps, but I can't find any way to retry this step and resume execution in case something inside the package fails.
And I also want to know if there is any way to disable certain package steps from the SQL Server Agent "level", without having to open Data Tools.
Thank you.
Go into the job properties, and into the step that contains the SSIS package. Go to the Advanced tab, and you can modify the number of Retry Attempts for that step.
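The same setting can be scripted against msdb if you prefer (the job and step are illustrative; @retry_interval is in minutes):

    EXEC msdb.dbo.sp_update_jobstep
        @job_name = N'Nightly ETL',
        @step_id = 2,
        @retry_attempts = 3,
        @retry_interval = 5;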
You could always put some error handling into the package in question. If an exception occurs, it will be handled accordingly allowing the other packages to continue. I always handle exceptions within the packages and log exceptions to a table.
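A minimal sketch of the kind of logging table I mean (the structure is just an example):

    CREATE TABLE dbo.PackageErrorLog (
        LogId        int IDENTITY(1,1) PRIMARY KEY,
        PackageName  sysname       NOT NULL,
        SourceName   sysname       NULL,
        ErrorMessage nvarchar(max) NULL,
        LoggedAt     datetime2     NOT NULL DEFAULT SYSUTCDATETIME()
    );

An Execute SQL Task in the package's OnError event handler can then run a parameterised INSERT against it, with the parameters mapped to System::PackageName, System::SourceName, and System::ErrorDescription.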
In relation to your other question:
"And I also want to know if there is any way to disable certain package steps from the sql server agent "level" - without having to open Data tools. "
You could always go down the PowerShell route: create a script that takes a parameter specifying whether or not you want a specific package to run.
If you want to skip certain steps within the package, then you would have to create some sort of precedence constraint that acts on a variable you have set.
