I have been trying to solve an SSIS project-related problem for a week now.
The SSIS solution/project has been working fine for two years, but it does not work at all for a second user who was added two weeks ago.
The user can open the solution and execute packages after connecting to the source code via VSO/TFS or by using a local copy.
But after making changes to the project, the second user gets error message (1) below when trying to build the project or execute a package.
Trying to import the project from the server results in another error message (2).
Most posts I have found refer to SQL Server/Visual Studio version mismatches (a project created on a version that differs from the one currently in use), which is not applicable to my case.
Any help or feedback is highly appreciated.
My conditions are:
SQL Server 2014 (including SSIS) on Windows Server 2012 R2
Dev machine with Windows 10 and SQL Server Data Tools 2013 (latest version)
Visual Studio Online for source code/versioning (GIT)
SSIS project deployment model
ProtectionLevel = EncryptSensitiveWithPassword
What I have tried so far:
Tried all combinations of user and dev machine
Tried different versions of Data tools
Tried to find differences in project/solution files as well as in packages to identify user related code etc.
Tried the SSIS project with and without source control
Tried to change the Run64BitRunTime property
(1)
Error 1 Microsoft.SqlServer.Dts.Runtime.DtsRuntimeException: The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails.
---> System.Runtime.InteropServices.COMException: The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails.
   at Microsoft.SqlServer.Dts.Runtime.Wrapper.ApplicationClass.LoadPackage(String FileName, Boolean loadNeutral, IDTSEvents100 pEvents)
   at Microsoft.SqlServer.Dts.Runtime.Application.LoadPackage(String fileName, IDTSEvents events, Boolean loadNeutral)
--- End of inner exception stack trace ---
   at Microsoft.SqlServer.Dts.Runtime.Application.LoadPackage(String fileName, IDTSEvents events, Boolean loadNeutral)
   at Microsoft.SqlServer.Dts.Runtime.Application.LoadPackage(String fileName, IDTSEvents events)
   at Microsoft.DataTransformationServices.Project.ProjectBuildItemInfo.Update(DateTime lastWriteTime, PackageItem packageItem, Project project, String projectDirectory)
   at Microsoft.DataTransformationServices.Project.ProjectBuildItemInfo..ctor(String name, DateTime lastWriteTime, PackageItem packageItem, Project project, String projectDirectory)
   at Microsoft.DataTransformationServices.Project.ProjectBuildValidator.RefreshCache(PackageItem item)
   at Microsoft.DataTransformationServices.Project.ProjectBuildValidator.CheckBuildItem(PackageItem item)
   at Microsoft.DataTransformationServices.Project.ProjectBuildValidator.CheckConsistency(String& errors, String buildLogFullName)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.IncrementalBuildThroughObj(IOutputWindow outputWindow)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.BuildIncremental(IOutputWindow outputWindow)
(2)
The package failed to load due to error 0xC0011008 "Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.". This occurs when CPackage::LoadFromXML fails.
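One way to narrow down error (1) is to reproduce the LoadPackage call outside SSDT. Below is a minimal console sketch, assuming the SQL Server 2014 SSIS managed runtime assembly (Microsoft.SqlServer.ManagedDTS) is referenced; the path and password are placeholders, not values from the original project.

using System;
using Microsoft.SqlServer.Dts.Runtime;   // SQL Server 2014 SSIS managed runtime

class LoadCheck
{
    static void Main()
    {
        // Placeholder path and password - replace with the real project values.
        var app = new Application();
        app.PackagePassword = "project-password";   // required for EncryptSensitiveWithPassword

        try
        {
            Package pkg = app.LoadPackage(@"C:\src\MyProject\MyPackage.dtsx", null);
            Console.WriteLine("Loaded OK, package version {0}.{1}", pkg.VersionMajor, pkg.VersionMinor);
        }
        catch (DtsRuntimeException ex)
        {
            // The same 0xC0010014 "CPackage::LoadFromXML" failure should surface here
            // if the XML or the encryption password is the problem rather than SSDT itself.
            Console.WriteLine(ex);
        }
    }
}

If the package loads under the second user's account here but not in SSDT, the problem is more on the tooling side; if the same 0xC0010014 LoadFromXML error appears, the package XML or the encryption password is the more likely culprit.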
A bit late to the party I'm afraid, but I managed to resolve the second error (in my case anyway) by importing the project using the .ispac file rather than connecting to the catalogue in the database.
Related
We've recently been trying to migrate from SQL Server 2016 to SQL Server 2019 on our servers, which includes upgrading all the SSIS packages we have in our catalog.
The migration wizard had no issues and migrated all packages without errors, and on the surface everything seemed OK. We even tried a test run in Visual Studio, and everything worked. But once we deployed all the packages to the catalog and tried a run from there, we started getting the following errors:
Error: 0xC0014020 at Load ODI_PaymentDevice, ODBC Source [14]: SQLSTATE: HY010, Message: [Microsoft][ODBC Driver Manager] Function sequence error;
Error: 0xC0209029 at Load ODI_PaymentDevice, ODBC Source [14]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "ODBC Source.Outputs[ODBC Source Output]" failed because error code 0xC020F450 occurred, and the error row disposition on "ODBC Source" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047038 at Load ODI_PaymentDevice, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on ODBC Source returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
This happens only in the packages that have a connection to Hive, using the ODBC connection manager in SSIS and then an ODBC Source in a Data Flow.
Based on the error code, it could be narrowed down to the ODBC connection we have to our Hadoop/Hive cluster. The connection itself definitely works, as we tested it in the Windows ODBC Data Sources tool and it also works in Visual Studio. We researched this error and its different solutions extensively, so we tried a lot of different things:
Deleting the data source and then creating a new one (To update metadata)
Running in 32 Bit mode
Updating Microsoft's Hive ODBC driver
Switching to a different vendor's driver (CDATA)
Switching to an ADO.NET connection instead
Played around with the driver's configuration, almost all combinations possible
After trying all of this to no avail, we tried again in Visual Studio, and to our surprise, it started to fail there too.
After trying a few different things, we could again reproduce the conditions in which the package worked, and it is the strangest thing; we could not find anyone with a similar issue on the internet so far.
So, as stated before, the connection works, and so does the package itself. BUT we have a For Each Loop Container that iterates through dates to load data for the last X dates we have, and if any kind of loop container (a For Each Loop, for example) contains a query against our ODBC source, it fails on the second iteration 100% of the time.
That is why it worked in Visual Studio: it only ran once (there was only one date to process as a test), but when deployed, it had to fetch real data with a bunch of different dates.
To confirm that this is indeed the issue, we deployed the package and updated the table of dates to load so that only one day was available, and the package ran through. This also rules out any parameter issue in the deployment/server/catalog.
After this discovery we tried a few different things:
Passing NULL in every column to see if there are issues with metadata between loops
Activating LOG_TRACE on the Hive ODBC driver to get a very detailed log of what is happening. In the log we see the query going out for the second loop, and it also appears in TEZ (our Hive execution engine), but only very briefly, for fractions of a second, and then it cancels itself. So the query is reaching the cluster, but somehow SSIS drops the connection by itself.
As mentioned, we couldn't find anything like this anywhere, and we cannot think of any other option to solve the issue short of directly changing the packages or not upgrading to 2019 at all, which is not ideal given that it is already outside of the mainstream support cycle.
Does anyone have an idea how this might be solved or what may be causing this issue?
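One thing that might help isolate it further is a bare console repro outside SSIS that runs the same statement more than once over a single ODBC connection, which is essentially what the For Each Loop does. A minimal sketch, assuming a DSN named HiveProd (a placeholder) and borrowing the table name from the error above purely as an example query:

using System;
using System.Data.Odbc;

class HiveLoopRepro
{
    static void Main()
    {
        // Placeholder DSN and query; the point is only to execute the same
        // statement several times on one connection, like the loop container does.
        using (var conn = new OdbcConnection("DSN=HiveProd"))
        {
            conn.Open();
            for (int i = 0; i < 3; i++)
            {
                using (var cmd = new OdbcCommand("SELECT COUNT(*) FROM odi_paymentdevice", conn))
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("Iteration {0}: {1}", i, reader.GetValue(0));
                }
            }
        }
    }
}

If the HY010 function sequence error shows up here on the second pass as well, the driver or its connection handling is the problem; if it does not, the issue is specific to how the SSIS ODBC Source drives the driver between iterations.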
I have faced a very similar issue (if not the same) with the SSIS ODBC Source component inside a For Loop, transferring records in batches from a remote PostgreSQL server to a database on MS SQL Server 2019. My Visual Studio is 2019 and the MS SQL Server is 2019 as well. The very weird thing was that the package ran as expected in VS (both Debugging and Without Debugging), and it also worked quite well through the SQL Server Agent of the SQL Server installed on my machine, but when deployed to the production SQL Server (same version, with the same psqlodbc driver installed there) the package ran successfully for the first iteration of the For Loop and then crashed unexpectedly, showing in the logs the same errors you have posted above: SQLSTATE: HY010, Message: [Microsoft][ODBC Driver Manager] Function sequence error;.....etc. After many hours spent on this without any success, I finally fixed it and now it works like a charm, so I decided to share how I figured it out, in the hope that it may help you or anyone else facing this challenge.
What I found out is that for some reason the problem was happening inside the ODBC Source component, but I could not do much about it as it is like a black box. I fixed the problem by switching to a Script Component source, so that I could take control over the connection in the C# code. I share the code below:
#region Namespaces
using System;
using System.Data;
using System.Data.Odbc;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
#endregion

...........

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    ...........

    public override void CreateNewOutputRows()
    {
        // Take the connection string from the ODBC connection manager added to the
        // Script Component's connections collection (named PostgreSQLODBCConn here).
        string connectionString = this.Connections.PostgreSQLODBCConn.ConnectionString;

        using (OdbcConnection conn = new OdbcConnection(connectionString))
        {
            using (OdbcCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = "SELECT * FROM fn_transfer_records(500000);";
                cmd.CommandType = CommandType.Text;

                DataTable dt = new DataTable();
                conn.Open();

                using (OdbcDataAdapter adapter = new OdbcDataAdapter(cmd))
                {
                    // Pull the batch into a DataTable, then push the rows
                    // into the Script Component's output buffer.
                    adapter.Fill(dt);
                    foreach (DataRow row in dt.Rows)
                    {
                        Output0Buffer.AddRow();
                        Output0Buffer.col1 = (Int32)row["col1"];
                        Output0Buffer.col2 = (double)row["col2"];
                    }
                }
            }
        }
    }
}
The PostgreSQLODBCConn used in the code above is the name of the connection added to the Script Component's connections collection, which you do through the component's visual editor when you double-click it.
Hope this would be of help...
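As a follow-up, if the batches are large, a streaming variant of the same idea may be worth considering: read with an OdbcDataReader instead of filling a DataTable, so rows are pushed into the output buffer as they arrive. This is only a sketch, reusing the connection name, query, and columns from the code above:

public override void CreateNewOutputRows()
{
    // Same approach as above, but streams rows instead of buffering
    // the whole result set in a DataTable.
    string connectionString = this.Connections.PostgreSQLODBCConn.ConnectionString;

    using (OdbcConnection conn = new OdbcConnection(connectionString))
    using (OdbcCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = "SELECT * FROM fn_transfer_records(500000);";
        cmd.CommandType = CommandType.Text;

        conn.Open();
        using (OdbcDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                Output0Buffer.AddRow();
                Output0Buffer.col1 = reader.GetInt32(reader.GetOrdinal("col1"));
                Output0Buffer.col2 = reader.GetDouble(reader.GetOrdinal("col2"));
            }
        }
    }
}

Either way, the connection is created, used, and disposed entirely inside CreateNewOutputRows, which is what sidesteps the ODBC Source behaviour between loop iterations.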
Ensure your SQL Server target version is set to SQL Server 2019. This can be found in the project properties. This error is typical of a mismatched target server version, as the issue is only present after deployment and not during development.
I'm trying to use SSIS to load some data from an Oracle database to an MSSQL database.
I created the project and used the ADO.Net source and was able to create a connection to Oracle and run queries and view results.
However when I actually run the package I get the following error:
Error: 0xC0208449 at Data Flow Task, ADO NET Source 2: ADO NET Source has failed to acquire the connection {EECB236A-59EA-475E-AE82-52871D15952D} with the following error message: "Could not create a managed connection manager.".
It seems similar to the issue here
And I did find that I have two Oracle client versions installed, "11.1" and "12.2".
One is used by PL/SQL and the other by another Entity Framework project.
If this is the issue, I just want a way to tell SSIS to pick up the correct one.
I tried adding an entry in machine.config to the "oracle.manageddataaccess.client" section with the desired version.
I also tried using other types of data sources but couldn't even create a successful connection.
I tried changing the Run64bitRuntime property in the project to False
Note: I don't have SSIS installed on my machine.
Eventually, I just had to remove the entries related to 11.1 from the PATH variable and then restart my machine.
I also switched to "dotConnectForOracle" for the connection, and now it seems to be working fine.
I'm expecting issues related to other applications that might still be using the 11.1 version, but that will be a problem for another day.
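For anyone in a similar spot, it can help to see which Oracle client directories are visible to the process, and in what order, before deleting anything from PATH. A small diagnostic sketch (it only reads the variable, nothing is modified):

using System;
using System.Linq;

class OraclePathCheck
{
    static void Main()
    {
        // List every PATH entry that mentions Oracle, in search order, to see
        // which client homes (e.g. 11.1 vs 12.2) the process can resolve.
        string path = Environment.GetEnvironmentVariable("PATH") ?? string.Empty;
        var oracleEntries = path
            .Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries)
            .Where(p => p.IndexOf("oracle", StringComparison.OrdinalIgnoreCase) >= 0);

        foreach (string entry in oracleEntries)
            Console.WriteLine(entry);
    }
}

The home that appears first is typically the one an unmanaged client will pick up.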
Always make sure to write the user (Oracle schema) in uppercase, and some special characters in the password [in my case it was $] need an escape character, even if you're using the wizard and not the command line.
I still don't understand the whole issue but I hope this helps someone some day.
I have created an SSIS package and I am trying to run it locally. We use package configurations that point to SQL tables and an XML config file. The package ran successfully for about a week, even when deployed to a SQL Server Agent job in our STAGE environment.
Now, the only way I can get the package to run is by not using the package configurations and choosing EncryptSensitiveWithPassword. If I change the package to DontSaveSensitive, I continuously get the error below:
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E4D Description: "Login failed for user 'Test_User'.".
Error: 0xC020801C at AgentCompany, Lookup [37]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Test" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
It is so strange that about a week ago, this package ran fine with the configurations and the DontSaveSensitive Option.
I have updated the config file to ensure that it establishes the connection string to the appropriate database. I also tested the connectivity of the connection managers, and they all test successfully.
I also double-checked the SQL database the user is trying to connect to, to ensure it has permissions there, and it does.
I am very confused. Please Help!
What I have tried so far:
Updating the dtsconfig file
Re-creating the connection managers
Setting DelayValidation to true on some DFT tasks
Changing the runtime to 32-bit
EncryptSensitiveWithPassword with package configs removed -- this works, but it is not the standard at my company and it is not how I developed and tested the package before
When you open/run a package, an OnInformation event is fired that says something like
The package is attempting to configure from the XML file "c:\ssisdata\so_56776576.dtsconfig".
When Visual Studio/SSDT opens/runs a package that says it uses configurations but, for whatever reason, cannot get them, you should then see messages like
Warning loading so_56776576.dtsx: Failure importing configuration file: "c:\ssisdata\so_56776576.dtsconfig"
and
Warning loading so_56776576.dtsx: The configuration file "c:\ssisdata\so_56776576.dtsconfig" cannot be found. Check the directory and file name.
and
Warning loading so_56776576.dtsx: Failed to load at least one of the configuration entries for the package. Check configuration entries for "Configuration 1" and previous warnings to see descriptions of which configuration failed.
If someone has manually edited the config file and broken the XML, you'd see a warning like
Cannot load the XML configuration file. The XML configuration file may be malformed or not valid
The important thing to note with regard to configuration: if a configuration cannot be found, SSIS will continue along with the design-time values. That is why it is crucial to check the warnings emitted when your package runs. If you are running manually with dtexec, ensure that you have /Rep EW specified so that Errors and Warnings are reported.
Guesses as to root cause
The package has the protection level EncryptSensitiveWithUserKey, which means the AD credentials of the package creator are used to encrypt anything that might contain sensitive information. I could be using AD authentication in my connection string and specify that the connection should be trusted, but that entire block is still going to get encrypted against my Active Directory account. When you come along and attempt to maintain the package, it's not going to be able to decrypt the sensitive data because you are not me.
There are two ways around that. One is to use a shared key (EncryptSensitiveWithPassword/EncryptPackageWithPassword), which is cumbersome to deal with and goes against the whole spirit of secrecy, since everyone knows the secret. The other approach, as you've identified, is DontSaveSensitive, and that's my go-to for all of this.
The problem to overcome with DontSaveSensitive is that every time you save, SSIS is going to wipe out any knowledge of user name and password from the places that might be holding on to them, like a connection manager. The 2005/2008 strategy for hedging against this was to use configurations, or explicit overrides at run time, to supply the user name and password. My typical approach was to use configuration based on a table instead of XML, as I was better at securing sensitive data in a table than I was at mucking with ACLs on the file system. The other challenge we had with multiple developers and file-based configuration was that either everyone had to set their file systems up the same way (and we developers are unique rainbow snowflakes, so that's unlikely), or we had to use a network share, which is great until someone adds their own values to it, breaks it, removes your changes, or any of a host of other challenges.
I have Integration Services packages that need to work with Excel files (shudder) and also need to execute on x64. I have a handle on the whole SSIS x64 vs. 32-bit issue, so I am scheduling these problem-child packages using the MS-suggested technique of a SQL Agent job step of type Operating System (CmdExec), with a command-line string that explicitly calls the 32-bit dtexec. So far so good.
Here's the issue: the packages still fail to load, and complain about failing to load not the Excel bits, but instead my handy Log Provider that logs to SQL Server. This is the error message, edited for object names:
Started: 2:33:01 PM
Error: 2009-07-24 14:33:06.51
Code: 0xC0010018
Source:
Description:
Error loading value "<DTS:ConnectionManager xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="DelayValidation">0</DTS:Property><DTS:Property DTS:Name="ObjectName">My_LogCon</DTS:Property><DTS:Property DTS:Name="DTSID">{86320FE6-AEFD-4A58-9277-84685B9B9" from node "DTS:ConnectionManager".
End Error Could not load package "c:\folder\mypkg.dtsx" because of error 0xC0010014.
Description: The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails. ... Process Exit Code 5. The step failed.
The packages run perfectly in 32-bit Visual Studio/debug. Anyone seen this type of thing?
From the 70-488 book:
Dtexec, dtutil, and the SQL Server Import and Export Wizard have both a 64-bit and a 32-bit application. Be sure to note that if you develop a package in a 32-bit environment and want to run the package in a 64-bit environment, the connection managers need to be 64-bit compliant. Some connection managers such as Excel work in a 32-bit environment only.
I do not think this is Excel related. Your error message clearly states:
Could not load package "c:\folder\mypkg.dtsx"
A more detailed error message might help. Also curious to know: are there a lot of processes after the Excel file loads? I am faced with the same scenario, and it is a shame having to run the whole process in 32-bit.
I might have a workaround that will allow you to use Excel files as a source but still configure it to run as an SSIS job, not CmdExec. I will share details once I test this.
Check the following Microsoft knowledge base article. There seems to be a different reason for your issue.
You receive an error message when you try to load an SSIS package that contains a DateTime type variable in SQL Server 2005
I have an SSIS package which reads an Excel file (Data Flow source) and transfers the data to SQL Server using an OLE DB Destination Data Flow item. The OLE DB connection manager used for the destination is configured to use Windows Authentication. The package works fine on my development machine. But when I open the same package on another machine and try to execute it, it gives the following error in the validation phase:
Error: 0xC020801C at DFT_NSOffers, Source - 'Subscription Offers$' 1 [347]: The AcquireConnection method call to the connection manager "ExcelConnection_NSOffers" failed with error code 0xC0202009.
Error: 0xC0047017 at DFT_NSOffers, DTS.Pipeline: component "Source - 'Subscription Offers$' 1" (347) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at DFT_NSOffers, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at DFT_NSOffers: There were errors during task validation
I'm using SQL Server 2005 (Version - 9.0.1399)
How do I fix this? Do I need to install any other component or service pack?
I don't think it's a 64/32-bit error. My dev machine and DB server are 32-bit, but I could make it work. I had to set the DelayValidation property of the Data Flow tasks to TRUE.
Hi, this can be solved by changing the properties of the project in Solution Explorer and setting the 64-bit runtime option to False.
A 64/32-bit error? I found this to be a problem, as my dev machine was 32-bit and the production server 64-bit. If so, you may need to call the 32-bit runtime directly from the command line.
This link says it better (no 64-bit JET driver): http://social.msdn.microsoft.com/forums/en-US/sqlintegrationservices/thread/da076e51-8149-4948-add1-6192d8966ead/
I was finally able to resolve the "Excel connection issue". In my case it was not a 64-bit issue like some others had encountered. I noticed the package worked fine when I didn't enable the package configuration, but I wanted my package to run with the configuration file. Digging further into it, I noticed I had selected all the properties that were available; I unchecked them all and checked only the ones that I needed to store in the package configuration file, and ta-da, it works :)
For me, I was accessing my XLS file from a network share. Moving the file for my connection manager to a local folder fixed the issue.
If you can preview the data in the Excel source but receive an AcquireConnection error while executing the data flow task, move the file to the local system, change the file path in the Excel connection manager, and try executing again.
In my case the problem was the 32/64-bit driver, which I solved by configuring the properties of the SQL Server job.
I had a similar issue, except that Excel was the destination in my case instead of the source as in the original question. I spent hours trying to resolve this, but it looks like Soniya Parmar finally saved the day for me. I have set up the job and let it run for a few iterations already, and all is good now. As per her suggestion, I set DelayValidation on the Excel connection manager to True. Thanks Soniya.
Setting the RetainSameConnection property to True for the Excel connection manager worked for me.
I had a similar issue trying to load data from an Excel spreadsheet and was running on Win x64. So I went to the BI project's properties in Visual Studio, Configuration Properties \ Debugging, and switched Run64BitRuntime from True to False.
It worked.
I was also getting the same error, and it was resolved simply by installing the MS Office driver and executing the job with the 32-bit DTEXEC. Now it works fine.
You can get the setup from the link below.
https://www.microsoft.com/en-in/download/confirmation.aspx?id=23734
In my case, the password I set in an expression was wrong, causing this error. After assigning the correct password to the connection manager's expression, the issue was resolved.
In my case, none of the previous solutions worked. Apparently Visual Studio, upon creating the Excel Source component, opens the Excel file and does not release it. Trying to then execute the SSIS package within Visual Studio leads to an AcquireConnection error with code 0xC0202009. Closing Visual Studio completely (not just the solution), reopening the solution, and then running the package again without any further changes works. I found this out when I tried to replace the Excel file and Windows Explorer said it couldn't because the file was open.
In order to resolve this issue, put all your data flow tasks in one sequence; that is, they should not execute in parallel. Each sequence should contain only one data flow task, with the next data flow task in another sequence.
Ex:-