First the basics: I'm using VS2015 SSDT to create an SSIS package that picks up two files on a DFSR share (Windows 2012 R2) and imports them into a SQL Server 2016 database. The package is run via SQL Server Agent, which runs under a domain account that has permissions to the DFSR folder and the files within it.
The Problem: I'm getting a weird issue that I've narrowed down to an SSIS script task being unable to change the value of an SSIS variable.
I've got a script task that uses VB2015 to check whether two files exist. The script builds the full paths from variables and then checks that both files are present. If they are, it sets a variable to "True" and then exits. This runs just fine in VS2015 SSDT while testing, but the variable doesn't change when running under SQL Server Agent.
Here's the code:
' Note: File.Exists requires Imports System.IO at the top of the script class,
' and User::varFilesExist must be listed in the task's ReadWriteVariables.
Public Sub Main()
    ' Build the full file paths from the package variables.
    Dim filepath As String = CStr(Dts.Variables("User::varFileDirectory").Value)
    Dim emplfile As String = filepath & CStr(Dts.Variables("User::varEmployeeCSVFile").Value)
    Dim histfile As String = filepath & CStr(Dts.Variables("User::varHistoryCSVFile").Value)

    ' Flag whether both files are present.
    If File.Exists(emplfile) AndAlso File.Exists(histfile) Then
        Dts.Variables("User::varFilesExist").Value = True
    Else
        Dts.Variables("User::varFilesExist").Value = False
    End If

    Dts.TaskResult = ScriptResults.Success
End Sub
If the files are found in the folder, the script sets User::varFilesExist to "True", which is then used in the subsequent precedence constraint as an expression:
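A constraint expression of this shape would do it (a sketch, assuming the variable is Boolean and the constraint is set to evaluate an expression):

@[User::varFilesExist] == TRUE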
I pass this constraint while testing from SSDT 2015, but once the package is deployed to the SSIS catalog on SQL Server 2016, it no longer gets past this constraint even when the files are there.
I've verified the rest of the package runs by simply setting the default value of User::varFilesExist to "True". The package then runs flawlessly from SQL Server Agent. On the first run it deletes the files it has loaded, so a second run fails because the files are no longer present.
I feel like I'm missing something simple, but it appears the script is not changing/updating the value of User::varFilesExist. I believe my VB2015 code is correct because it tests just fine when run within VS2015 SSDT.
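For what it's worth, one way to see what the script actually resolves when it runs under the Agent is to fire an information event, which then shows up in the execution logs. A sketch (FireInformation is part of the script task's standard Dts.Events API; the "CheckFiles" subcomponent name is just a label made up for illustration):

Dim fireAgain As Boolean = True
Dts.Events.FireInformation(0, "CheckFiles",
    "emplfile=" & emplfile & " exists=" & File.Exists(emplfile).ToString(),
    String.Empty, 0, fireAgain)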
UPDATE: Today I installed SSDT 2017 and brought my packages over. I tested, then deployed, and everything is working as expected. Not sure what was going on.
I have recently installed SSMS v18 on my D: drive to conserve some space on the C: drive.
When I try to execute the .dtsx file, it says there is no program available to execute it.
Is there some way to make this work, or do I have to reinstall SSMS on the default C: drive?
UPDATE 10/27/22: I loaded SQL Server 2019 Developer edition and included Integration Services. Now I can import my .dtsx files and execute them. So far so good. What I would like to do is run the .dtsx packages using DTExec.exe. I followed the usual example of running dtexec /f "d:\documents\sql\hits.dtsx", and it complains about the version numbers.
Package migration from version 8 to version 3 failed with error 0xC001700A "The version number in the package is not valid. The version number cannot be greater than current version number.".
The .dtsx file was created while running SQL Server 2019 with SSMS v18.12.1. Should I be using the dtexec that ships with SQL Server 2019?
Any other info would be great.
Thanks
SSIS packages are executed by:
• DTEXEC.exe for command-line runs (requires installation of the SQL Server Integration Services service to pass a licensing check)
• Custom code referencing the SQL Server assemblies
• The managed object model in the SSISDB: https://techcommunity.microsoft.com/t5/sql-server-integration-services/a-glimpse-of-the-ssis-catalog-managed-object-model/ba-p/387892
That's it, those are the three "runners" of SSIS packages. Using SSMS to run a package depends on "how" you were doing it versus what you're attempting now. Ultimately, you're connecting to a database and asking it to run a package.
If the package is in the SSISDB, it's using MOM.
If you were running a job, it's dtexec.exe.
If you connected to the Integration Services node in SSMS, that's also using dtexec, the 64-bit version I think?
If that's the case, switching SSMS installations likely also switched which version gets used; from what I can tell, that connection option is version-specific.
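One note on the version error above: "Package migration from version 8 to version 3" generally means the dtexec picked up from the PATH is an older one; package format 8 is SQL Server 2014 and later, while format 3 dates from the 2008 era. Invoking the newer copy by full path should avoid the mismatch. A sketch, assuming a default SQL Server 2019 layout (adjust the drive and path to wherever your instance actually installed):

"C:\Program Files\Microsoft SQL Server\150\DTS\Binn\DTExec.exe" /f "d:\documents\sql\hits.dtsx"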
OK, so I came up with a solution for executing .dtsx packages from another program.
The program I want to run is DTExecUI.exe. In my case that's found in
"D:\Program Files\Common7\IDE\CommonExtensions\Microsoft\SSIS\150\Binn\DTExecUI.exe"
This program takes a /f parameter to indicate that it should load the package from a file, followed by the file location, for example: /f "D:\Documents\SQL\ipdata.dtsx".
Having figured all that out, I wrote a little console app in VB that takes the file name as an input parameter and launches DTExecUI.exe to execute the file.
Imports System
Imports System.Diagnostics ' needed for Process

Module Program
    Sub Main(args As String())
        ' Path to DTExecUI.exe; adjust for your installation.
        Dim app As String = "D:\Program Files\Common7\IDE\CommonExtensions\Microsoft\SSIS\150\Binn\DTExecUI.exe"
        Console.WriteLine(args(0))
        ' Wrap the file path in quotes so paths containing spaces survive.
        Dim Arguments As String = "/f """ & args(0) & """"
        LaunchProc(app, Arguments)
    End Sub

    Sub LaunchProc(fileNam As String, Optional arg As String = "", Optional wt As Boolean = True)
        Dim p As New Process
        p.StartInfo.FileName = fileNam
        If arg <> "" Then p.StartInfo.Arguments = arg
        p.Start()
        ' Optionally block until DTExecUI exits.
        If wt Then p.WaitForExit()
    End Sub
End Module
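Assuming the compiled console app is named RunDtsx.exe (a placeholder name), launching a package from a command prompt would look like:

RunDtsx.exe "D:\Documents\SQL\ipdata.dtsx"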
To make it so I can run a .dtsx file directly by double-clicking it, I associated the .dtsx file extension with my little program. Works like a charm.
On our servers (Windows 2016, SQL Server Reporting Services 2016, Microsoft Access Database Engine 2016) we run two SSIS packages: one imports data from an Excel file into the database and one exports data from the database to an Excel file. Both are .xlsx files.
We run this exact package on TST, ACC, RES and PRD (same server and access setup). We didn't have any issues until a week ago, when the packages on PRD just kept getting stuck in the "beginning validation phase" of the Data Flow Task. The other environments are fine.
We've determined that it is not a problem in the application, since a simple read-only package we created for this issue showed the same behavior. It doesn't seem to be an access issue either: the account that runs the package is sysadmin in SQL and a local admin on the file server.
We also tried:
• Using only one import flow instead of two in the Data Flow Task: no change (https://social.msdn.microsoft.com/Forums/sqlserver/en-US/781c855f-833e-4578-a43a-1729482bbabd/dtspipeline-validation-phase-is-beginning-but-never-stop?forum=sqlintegrationservices)
• Setting DelayValidation to True on all OLE DB connection managers: no change (see "SSIS pre-evaluation phase taking long")
• Setting ValidateExternalMetadata to False on the Excel sources: no change (see "SSIS pre-evaluation phase taking long")
• Reinstalling Microsoft Access Database Engine on the server: no change
• Reading a flat file (txt) instead, which worked without issue
We're fresh out of ideas so any help would be greatly appreciated.
UPDATE:
When manually trying to run the Import/Export Wizard (and selecting an Excel file), I get "The operating system is not presently configured to run this application". Investigating this message as well.
If you had no problem reading a text file, that points me to the Excel driver (32- or 64-bit), though I would expect a connection error if that were the issue. Do this as a test:
Go to the console and open Excel on the server. This will tell you whether a licensing issue or something else is preventing Excel from opening there.
Import a small amount of data from Excel into SQL Server using the Import/Export Wizard into a test database (or just make a test table). Be sure to use the same driver you are using in the SSIS package.
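For reference when checking the driver, a typical Excel 2007+ (.xlsx) connection string looks like the sketch below (the file path is a placeholder); whether it resolves to the 32-bit or 64-bit ACE provider depends on the bitness of the process running the package:

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=D:\Files\import.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES";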
I am deploying SSIS packages to DEV (on SQL Server). The packages are executed via job steps using the Integration Services package job step type (select the type, then select the SSIS server and browse to the package in the catalog).
When this is done, the project and package parameters show up, and we can set certain ones; others are set via script on the SSIS server, so the connection managers IN the package are OK. But finally, my question is:
Does anyone have clever ways to script/deploy the job without having to go change the Server: value from, for instance, DEVSSIS to PRODSSIS?
Previously, we were running these packages via a PowerShell script that would figure out the local server. We wanted to use the more standard SSIS step so that parameters could be changed in the job step if needed, but now it looks like we will have to change the SSIS server by hand.
Any ideas are appreciated!
Thanks
At a previous shop, we would script out the job, and then at the top we would add a variable (like @SSISServerName) and populate it based on the value of @@SERVERNAME (WHEN @@SERVERNAME = 'MyDevServer' THEN 'MyDevSSISServer', etc.).
Then, obviously, go through the rest of the script and replace hard-coded SSIS server names with the new variable.
It worked back in SQL 2005 for sure. We could use that job script to deploy the same job to any environment, as long as we had accounted for it as a possible @@SERVERNAME value. Dunno if anything has changed in SQL 2014 to prevent this strategy from working.
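A minimal sketch of that pattern in T-SQL, with placeholder server, folder, and job names (the /ISSERVER command string mimics the shape SSMS generates for catalog-based job steps):

DECLARE @SSISServerName sysname =
    CASE @@SERVERNAME
        WHEN N'MyDevServer'  THEN N'DEVSSIS'
        WHEN N'MyProdServer' THEN N'PRODSSIS'
        ELSE @@SERVERNAME
    END;

-- Build the job step command with the variable instead of a hard-coded server.
DECLARE @command nvarchar(4000) =
      N'/ISSERVER "\"\SSISDB\MyFolder\MyProject\MyPackage.dtsx\"" '
    + N'/SERVER "\"' + @SSISServerName + N'\"" /CALLERINFO SQLAGENT /REPORTING E';

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'MyJob',
    @step_name = N'Run package',
    @subsystem = N'SSIS',
    @command   = @command;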
I have a simple SSIS package and I'm trying to export the same set of data from a table to both a flat file and an Excel destination. The package works fine when I run it locally, and it creates both the text file and the Excel file with data.
But when deployed to a different server, the SQL Agent job runs fine, and the log inside the Integration Services catalog says the package wrote some 9,000 rows to Excel; a new Excel file is also created, but no data is written to it (blank with just headers). The text file works fine and has all the data I need.
SSIS package flow:
I'm working with SQL Server 2014 and Visual Studio 2013 with SSDT, and used Excel 2007 in the Excel destination.
We had the same issue.
The solution is that the user that runs the SSIS package must have full access to C:\Users\Default.
You can check this by running Sysinternals' Process Monitor on the machine that executes the SSIS job.
You can find more information here:
Empty Excel File permissions issue: SSIS Excel Destination buffers large record sets through C:\Users\Default - this post led me to the solution
https://www.csopro.de/biblog/2018/04/ssis-fehlerbehebung-bei-excel-destination-schreibt-keine-zeilen/ - my blog, where I describe the issue (unfortunately in German)
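A hedged example of granting that access from an elevated prompt (the domain and service account names are placeholders):

icacls "C:\Users\Default" /grant "MYDOMAIN\SqlAgentSvc:(OI)(CI)F"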
I had the same problem writing to several worksheets in an Excel file from a scheduled SQL Agent job. It worked fine for about 4 months. Then suddenly with no changes to the package, one of the 5 worksheets was no longer populated with data. No error message generated and it worked fine on every test from Visual Studio and Data Tools (the old "BIDS" tools as we used to call it.)
I never did find a solution, and it continues to write no data to that single worksheet of the 5 in the Excel file. (So the answer above, that the account the SQL Agent job runs under lacks the appropriate permissions, is NOT correct for this issue.)
Plus, a new package I built today is having the same issue, except this one has only a single worksheet. Again, it works fine in the development environment, but no data appears in the destination file and no errors are raised. Not only that, but the timestamp on the file is the same as the template file's; it seems it never even TRIES to write to the file.
Checking each run log for the package in the Integration Services catalog shows an entry with 9K+ records "written" by the data flow task.
Lastly, if I change the destination file name, the SQL Agent job generates the expected error, which rules out answers guessing that the path is wrong.
This is bizarre. And exasperating.
I have encountered odd behaviour when using scheduled SSIS packages which use the Excel object.
The fix for me was to edit the Agent job step properties: on the Execution options tab, try enabling the "Use 32 bit runtime" option to force the SSIS package to run in 32-bit mode instead of 64-bit mode.
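For completeness: when testing outside the Agent, the command-line equivalent is to invoke the 32-bit DTExec directly, which lives under the x86 program files directory. A sketch assuming a default SQL Server 2014 install (the package path is a placeholder):

"C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\DTExec.exe" /f "D:\Packages\ExportToExcel.dtsx"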
This sounds similar to SSIS package SQL job not using new environment variable configuration, but when I did a test on my dev SQL Server box, I didn't have to restart the SQL Server agent. And honestly, if that is what it takes to prevent someone from 'hurting' themselves with this product, then. . . that is absolutely unacceptable.
UPDATE: And now, to follow up on that comment about SSIS package SQL job not using new environment variable configuration possibly being a solution.
It is not. I restarted the agent. No dice. The job still runs successfully. I restarted the SQL Server Agent and the SQL Server service. Again, no failure.
This also sounds peripherally related to SSIS 2012 SQL Agent Job ConnectionString vs InitialCatalog, but I've been working with SSIS since 2005 and have never seen a connection string configured from its individual components. We have ALWAYS put the connection string in the .config files and had success.
I've got a problem that I haven't been able to resolve. Appears to be a bug, honestly, but I don't want to be so presumptuous.
I'm running SQL Server 2012 SP3 on Windows Server 2012 R2 in my test environment.
On my dev box, I'm running VS2012 + SSDT-BI.
I built a solution with a single SSIS 2012 project.
I deployed the SSIS Project to my Test SQL Server box.
I configured a new Environment (let's call it Fred).
I add variables to the Fred environment.
Some of these variables are connection strings.
I go back to the SSIS Project in the SSISDB tree. Right-click and configure.
I set all of my project's connection manager's connection string properties (only) to configure from one of Fred's connection string variables.
Now I have everything, connection-wise at least, wired to configure from my Fred Environment.
I created a SQL Server Agent job (of type SSIS) to execute a package.
I check the box to say Configure from environment .\Fred
I complete job configuration.
I right-click the job and choose 'Start Job at Step. . .'
The job executes. Yay! Everything works. Packages all execute. (Everyone is doing the dance of joy!)
Now here comes the part I can't resolve.
As part of my system testing (I'm new to, and cautious about, the 2012+ Project Deployment model), I change a connection string value stored in one of my Fred environment variables. This connection string is one that my SSIS project's connection manager (I triple-checked this) was configured to use.
The connection string is now invalid/incomplete, so if I were to execute the same SQL Agent job, the expectation is that the executed package should fail (validation), right?
Well, guess what: the job succeeds. (Dance of joy stops.) The packages all execute successfully. I check my test database environment, and rows have been added to a table in a DB whose connection string is completely wrong (in the Fred environment).
So now I think, hmm, maybe I missed something. I go back in and delete the environment variable that contains the invalid/incomplete connection string.
I execute the job again. The job succeeds. Uh, what?
I launch my dev instance of SQL Server 2012 (RTM, not SP3, incidentally)
I deploy a test project with the same project connection manager/environment configuration mapping
I go through the same steps. . .
BUT now this time, the job fails as expected. Note: No SQL Server agent restart made/required.
I go back to my Test box
I right click the SQL Agent (SSIS) job
I go to Properties > Steps (page) > Edit (first step; there's only one), and now I'm seeing the warning:
The parameter "SomeOtherSSISProjectParameter" is configured to use an environment variable, but no environment variable has been selected. Check the "Environment" checkbox and specify the environment to use, or specify a literal value for the parameter. (Microsoft.DataTransformationServices.DTSExecUI.Controls)
Uh, what!?
First off, the Environment check box in this job step is already checked.
Secondly, this parameter is not related or mapped to the Fred environment variable that I changed and then removed.
Thirdly, the environment variable's value is not null or empty. It's still there, untouched from when I first added it to the environment.
(At this point, I'm thinking the SSISDB is corrupt.)
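(One way to see what the catalog itself believes is wired up is to query the parameter metadata in SSISDB. A sketch; value_type 'R' marks parameters that reference an environment variable, and connection manager properties show up with names like CM.<name>.ConnectionString:

SELECT object_name,
       parameter_name,
       value_type,               -- 'V' = literal value, 'R' = environment reference
       referenced_variable_name
FROM SSISDB.catalog.object_parameters
WHERE value_type = 'R';)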
I click past the warning. Now that I am here in the SSIS package SQL Server Agent config page, when I try to click OK (without making any changes) I NOW get an expected error message related to the orphaned environment config.
Property "ConnectionString" of connection manager
"MySSISProjConnManagerName" is configured to receive a value from an
environment variable named "MyFredEnviroVariableName" but there is not
environment variable named "MyFredEnviroVariableName" in environment
".\Fred". Select a different environment, or use a literal value for
the property.
(Microsoft.DataTransformationServices.DTSExecUI.Controls)
So the good news is that I can't click OK to save this corrupt job/step, but the bad news is that I CAN click Cancel, and I can right-click the job, choose 'Start Job at Step. . .', and have it complete successfully when it absolutely shouldn't.
Yes, I restarted the agent. Yes, I restarted the server.
I tested the scenario on my dev instance (RTM) of SQL Server 2012, and I can't reproduce it. Can anyone else? Better yet, is there a fix? SP4?
Any help with this would be MUCH appreciated. As far as I'm concerned this is a show-stopper for using the Project Deployment model with 2012.