SSIS Package not running from server

I have an SSIS package that does the following:
Detects whether a file (an Excel workbook) exists
Deletes the file
Copies a file from a directory into the required one
When run locally for testing, the package functions correctly with no issues.
When it runs on the server or as a job, I get the error:
Exception has been thrown by the target of an invocation.
I believe the error comes from the copy line, since commenting it out allows the package to run.
My version of Visual Studio is 2013 (I am unable to upgrade) and the SQL Server machine is running the latest version of 2016.
The account used to run the job is a network administrator, yet the error persists.
Imports System.IO (etc)

Public Sub Main()
    Dim sourcePath As String = "\\server\File\Template.xlsx"
    Dim destPath As String = "\\server\File\NewFile.xlsx"

    If File.Exists(destPath) Then
        File.Delete(destPath) ' deletes the current file
    End If

    File.Copy(sourcePath, destPath)

    Dts.TaskResult = ScriptResults.Success
End Sub
I wouldn't expect any issues doing this, as other packages (with different functions, etc.) work as they should.

Exception has been thrown by the target of an invocation.
This is the generic Script Task message. A pretty standard suggestion: capture the real exception text for better analysis via FireError:
Imports System.IO (etc)

Public Sub Main()
    Try
        Dim sourcePath As String = "\\server\File\Template.xlsx"
        Dim destPath As String = "\\server\File\NewFile.xlsx"

        If File.Exists(destPath) Then
            File.Delete(destPath) ' deletes the current file
        End If

        File.Copy(sourcePath, destPath)
        Dts.TaskResult = ScriptResults.Success
    Catch ex As Exception
        ' Surface the real exception text in the execution log
        Dts.Events.FireError(0, "Script Task Example", ex.Message, String.Empty, 0)
        Dts.TaskResult = ScriptResults.Failure
    End Try
End Sub
There are plenty of reasons for such code to fail; one of them is an in-place package upgrade that did not upgrade the script task to VSTA 2015 correctly.
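Another frequent culprit when a package works locally but fails as a job is that the job actually executes under a different account than assumed (the Agent service account or a proxy), and that account cannot reach the share. As a quick diagnostic, my own sketch rather than part of the original answer, you can log the executing identity from inside the task and compare it against the permissions on the share:

' Hypothetical diagnostic: place at the top of Main() in the Script Task to
' log which Windows account the task really runs under when the job fires.
Dim fireAgain As Boolean = True
Dts.Events.FireInformation(0, "IdentityCheck",
    "Running as: " & System.Security.Principal.WindowsIdentity.GetCurrent().Name,
    String.Empty, 0, fireAgain)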

The error is due to versioning differences between the SSIS package, written in Visual Studio 2013, and SQL Server 2016.
Upgrade errors can be found within the message portion of the error report.

Related

ODBC Connection on SSIS package fails only when it goes through a loop

We've recently been trying to migrate from SQL Server 2016 to SQL Server 2019 on our servers, which includes upgrading all the SSIS packages we have in our catalog.
The migration wizard had no issues and migrated all packages without errors, and on the surface everything seemed OK. We even tried a test run in Visual Studio, and everything worked. But once we deployed the packages to the catalog and tried a run from there, we started getting the following error:
Error: 0xC0014020 at Load ODI_PaymentDevice, ODBC Source [14]: SQLSTATE: HY010, Message: [Microsoft][ODBC Driver Manager] Function sequence error;
Error: 0xC0209029 at Load ODI_PaymentDevice, ODBC Source [14]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "ODBC Source.Outputs[ODBC Source Output]" failed because error code 0xC020F450 occurred, and the error row disposition on "ODBC Source" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047038 at Load ODI_PaymentDevice, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on ODBC Source returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
This happens only in the packages that have a connection to Hive, using the ODBC Connector in SSIS and then an ODBC Data Source in a Data Flow.
Based on the error code, it could be narrowed down to the ODBC connection to our Hadoop/Hive cluster. The connection definitely works: we tested it in Windows' ODBC Data Sources tool, and it works in Visual Studio as well. We researched this error and the different solutions to it extensively, so we tried a lot of different things:
Deleting the data source and then creating a new one (To update metadata)
Running in 32 Bit mode
Updating Microsoft's Hive ODBC driver
Switching to a different vendor's driver (CDATA)
Switching to an ADO.NET connection instead
Playing around with the driver's configuration, trying almost every combination possible
After trying all of this to no avail, we tried again in Visual Studio, and to our surprise it started to fail there too.
After trying a few more things, we managed to reproduce the conditions under which the package worked, and it is the strangest thing; we could not find anyone with a similar issue on the internet so far.
So, as stated before, the connection works, and the package itself does too. BUT we have a For Each Loop container that iterates through dates to load data for the last X dates we have, and if any kind of loop container (a For Each loop, for example) contains a query against our ODBC source, it fails on the second iteration 100% of the time.
That is the reason it worked in Visual Studio: it only ran once (there was only one date to process as a test), but when deployed it had to fetch real data, with a bunch of different dates.
To confirm that this is indeed the issue, we deployed the package and updated the table of dates to load so that only one day was available, and the package ran through. This also rules out any parameter issue in the deployment/server/catalog.
After this discovery we tried a few different things:
Passing NULL in every column, to see if there were issues with the metadata between loops
Activating LOG_TRACE on the Hive ODBC driver, to get a very detailed log of what is happening; we see the query going out for the second loop in the log, and it also appears in TEZ (our Hive execution engine), but only very briefly, for fractions of a second, before it cancels itself. So the query is reaching the cluster, but somehow SSIS drops the connection by itself.
As mentioned before, we couldn't find anything like this, and we cannot think of any other option that would not mean directly changing the packages, or not upgrading to 2019 at all, which is not ideal given that 2016 is already outside the mainstream support cycle.
Does anyone have an idea of how this might be solved, or of what may be causing this issue?
I have faced a very similar issue (if not the same) with the SSIS ODBC Source component inside a For Loop, transferring records in batches from a remote PostgreSQL server to a database on MS SQL Server 2019. My Visual Studio is 2019 and the MS SQL Server is 2019 as well.
The very weird thing was that the package ran as expected in VS (debugging and without debugging), and it also worked quite well through the SQL Server Agent of the SQL Server installed on my machine. But when deployed on the production SQL Server (the same version, with the same psqlodbc driver installed there), the package ran successfully for the first iteration of the For Loop component and then unexpectedly crashed, showing in the logs the same errors you posted above: SQLSTATE: HY010, Message: [Microsoft][ODBC Driver Manager] Function sequence error;.....etc.
After many hours spent on this without any success, I finally fixed it, and now it is working like a charm; hence I decided to share how I figured it out, hoping it may be of help to you or anyone else facing this challenge.
What I found out is that for some reason the problem was happening inside the ODBC Source component, but I could not do much about it, since that component is like a black box. I fixed the problem by switching to a Script Component source, so that I took control over the connection in the C# code. I share the code below:
#region Namespaces
using System;
using System.Data;
using System.Data.Odbc;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
#endregion

...........

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    ...........

    public override void CreateNewOutputRows()
    {
        string connectionString = this.Connections.PostgreSQLODBCConn.ConnectionString;

        using (OdbcConnection conn = new OdbcConnection(connectionString))
        {
            using (OdbcCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = "SELECT * FROM fn_transfer_records(500000);";
                cmd.CommandType = CommandType.Text;

                DataTable dt = new DataTable();
                conn.Open();

                using (OdbcDataAdapter adapter = new OdbcDataAdapter(cmd))
                {
                    adapter.Fill(dt);

                    foreach (DataRow row in dt.Rows)
                    {
                        Output0Buffer.AddRow();
                        Output0Buffer.col1 = (Int32)row["col1"];
                        Output0Buffer.col2 = (double)row["col2"];
                    }
                }
            }
        }
    }
}
The PostgreSQLODBCConn used in the code above is the name of the connection added to the Script Component's connections collection; you add it through the component's visual editor when you double-click on it.
Hope this would be of help...
Ensure your project's Target Server Version is set to SQL Server 2019; it can be found in the project properties (Configuration Properties > General > TargetServerVersion). This error is typical of a mismatched target server, since the issue only appears after deployment, not during development.

Crystal Report does not run in my VB.NET 2010 project after sharing it to another PC

I create bills, invoices, etc. using Crystal Reports in a VB.NET 2010 and SQL Server project, but I face a problem. The project runs properly on my PC, but when I share it to another PC it does not run: the Crystal Report viewer keeps loading and then shows an error.
I think this error is related to the OLE DB connection, because when I changed the OLE DB server name and chose the server name of that other PC, it ran properly. I need to share my project anywhere and have it run properly without changing this server name.
How can I run my project anywhere without changing the SQL Server name?
Here is my code that loads SQL data into the Crystal Report:
If comboType.SelectedIndex = 0 Then ' detail type crystal report
    Dim dt As New DataTable
    With dt
        .Columns.Add("series_no")
        .Columns.Add("bill_no")
        .Columns.Add("sale_date")
        .Columns.Add("gross_amount")
        .Columns.Add("dis_rs")
        .Columns.Add("Sale_Return_Amo")
        .Columns.Add("net_amount")
        .Columns.Add("less")
        .Columns.Add("cash_pay")
        .Columns.Add("card_pay")
        .Columns.Add("mo_pay")
        .Columns.Add("balance")
    End With

    ' Copy every grid row into the DataTable that feeds the report
    For Each dgr As DataGridViewRow In Me.DataGried_Sales_Report.Rows
        dt.Rows.Add(dgr.Cells(0).Value, dgr.Cells(1).Value, dgr.Cells(2).Value,
                    dgr.Cells(3).Value, dgr.Cells(4).Value, dgr.Cells(5).Value,
                    dgr.Cells(6).Value, dgr.Cells(7).Value, dgr.Cells(8).Value,
                    dgr.Cells(9).Value, dgr.Cells(10).Value, dgr.Cells(11).Value)
    Next

    Dim f As New Show_Reports
    Dim rdoc As CrystalDecisions.CrystalReports.Engine.ReportDocument
    rdoc = New CrystalReport_Detail_Sale
    rdoc.SetDataSource(dt)
    rdoc.SetParameterValue("FromD", txtPR_DateTimePicker1.Text)
    rdoc.SetParameterValue("ToD", txtPR_DateTimePicker2.Text)
    f.CrystalReportViewer1.ReportSource = rdoc
    f.ShowDialog()
    f.Dispose()
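A common way to avoid hardcoding the server name, shown here as a minimal sketch (the connection string name "BillingDb" and its contents are placeholders, not from the question), is to keep the server out of the compiled code and read the connection string from App.config, so the same build runs on any PC after editing one file:

' App.config (hypothetical entry):
'   <connectionStrings>
'     <add name="BillingDb"
'          connectionString="Data Source=SERVER_NAME;Initial Catalog=MyDb;Integrated Security=True"
'          providerName="System.Data.SqlClient" />
'   </connectionStrings>

' Requires a project reference to System.Configuration.
Imports System.Configuration

Module ConnectionFactory
    ' Read the connection string at runtime instead of hardcoding the server.
    Function GetConnectionString() As String
        Return ConfigurationManager.ConnectionStrings("BillingDb").ConnectionString
    End Function
End Module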

Deployed package execution in SSISDB shows success but write action never happened

I simplified the issue down to a simple write action in a C# script task: writing the current timestamp into a text file located on the same machine in C:\test\, with Full Control granted to Everyone. The environment is SQL Server 2016, Visual Studio 2015, and Windows Server 2016.
If I create a SQL Server Agent job from the file system, it updates the text file as expected. If I deploy the package into SSISDB and execute it there, the execution report shows all green and successful, while the write action never happens. And I was not able to find any clue in the system logs.
I would be very grateful for input on:
1) What might be wrong in this simple-write test; and
2) How to get more useful log information about this error and possibly other issues.
FYI, Related issues resolved before this post:
SSIS deployed package fails to map drive tag to network shared folder
Code in the simple-write script task.
public void Main()
{
    Dts.TaskResult = (int)ScriptResults.Failure;
    try
    {
        // Append a timestamp line to the test file
        using (var writer = new StreamWriter("C:\\test\\simple-write.txt", true, Encoding.Unicode))
        {
            writer.WriteLine(DateTime.Now.ToString("yyyy-MM-ddTHH:mm:ss.fffffffK"));
            writer.Flush();
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        Dts.Events.FireError(0, "Simple-Write-Test", ex.ToString(), String.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
I got the answer from the following link:
https://social.msdn.microsoft.com/Forums/en-US/6c34874c-03a2-4926-8989-b977c1ec1866/deployed-package-execution-in-ssisdb-shows-success-but-write-action-never-happened?forum=sqlintegrationservices
It seems to be a compatibility issue between the latest version of Management Studio (the 2017 suite) and a SQL Server 2016 database: the script task in the deployed package was silently ignored.
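On the second part of the original question (getting more useful log information): the graphical execution report can roll everything up as success while the SSISDB catalog views still hold the individual messages. Below is a minimal sketch, assuming you can query the SSISDB database directly; the server name is a placeholder.

Imports System.Data.SqlClient

Module SsisdbMessages
    ' Sketch: dump the most recent SSISDB event messages, which often carry
    ' detail that the summary execution report does not surface.
    Sub DumpRecentMessages()
        Dim sql As String =
            "SELECT TOP (50) message_time, message_type, message " &
            "FROM catalog.event_messages ORDER BY event_message_id DESC;"

        Using conn As New SqlConnection("Data Source=.;Initial Catalog=SSISDB;Integrated Security=True"),
              cmd As New SqlCommand(sql, conn)
            conn.Open()
            Using rdr As SqlDataReader = cmd.ExecuteReader()
                While rdr.Read()
                    Console.WriteLine("{0}  [{1}]  {2}", rdr(0), rdr(1), rdr(2))
                End While
            End Using
        End Using
    End Sub
End Module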

SSIS package gives error after deployment SQL Server 2012

I created a package in Visual Studio 2015. It works fine.
Basically I am using a script task that generates Excel spreadsheet and sends it to different users.
After I deploy the package to SQL Server 2012 and then try to execute it from there, I get an error without any further details.
I also ran select * from internal.packages in SSISDB to make sure package_format_version is 6, which is what it should be for SQL Server 2012.
What could be the problem?
This isn't necessarily an answer on how to fix the issue, but it is an answer on how you can modify your script task to get a better error message than "Script Task Failure: Exception has been thrown..."
We'll always wrap our script tasks in a try-catch and then raise the exception message back out of the script task:
public void Main()
{
    try
    {
        //Your code here
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        // Bubble the real exception message back out to Integration Services
        Dts.Events.FireError(-1, "", ex.Message, String.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
It's always a challenge, especially with a deployed SSIS package: when it errors in a script task you don't necessarily get a clear indication of why it failed, just a cryptic error message. The code above will capture what threw the exception and bubble it back out to Integration Services. (Passing ex.ToString() instead of ex.Message would also include the stack trace and any inner exceptions.)
You may want to make sure that the "Microsoft Access Database Engine 2010 Redistributable" driver is installed on the SSIS server. You can get it here.

Data upload from Visual Studio to SQL Server Timeout

I have a big problem sending a SQL dump, made with SQL Server Management Studio, via VB.NET to another SQL Server.
I made the SQL files like this: Database / Tasks / Generate Scripts ...
After deleting every "GO" command in the dump I managed to upload to my local SQL Server 2012, but this only worked if I made separate scripts for each table.
If I use one big file, I get a timeout.
Now I want to transfer the data (50 MB) to another SQL Server via an Internet connection.
Only 3 of 5 tables are generated on this server :-(
I get the following timeout error (translated from the original German):
"A first chance exception of type 'System.Data.SqlClient.SqlException' occurred in System.Data.dll.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
Any idea which side causes the timeout (SQL Server or VB.NET)?
Is there a safer method for uploading lots of data to a SQL Server?
This is the code I use (the SQL script is in My.Resources.SQLDump):
'#### SQLDump #######
Dim SQLscript As String = My.Resources.SQLDump
SQLscript = "USE [" + TextBoxDB.Text + "] " + vbCrLf + SQLscript

Using connection As New SqlConnection(Verbind.ConStr)
    Using command As New SqlCommand()
        ' Set the connection
        command.Connection = connection
        ' Not necessary, but good practice
        command.CommandType = CommandType.Text
        ' The entire dump is sent as a single batch
        command.CommandText = SQLscript
        Try
            connection.Open()
            rowsAffected = command.ExecuteNonQuery()
        Catch ex As Exception
            Debug.Print(ex.Message)
        End Try
        ' No explicit Dispose needed; the Using blocks handle it
    End Using ' Dispose Command
End Using ' Dispose (and hence Close) Connection

MsgBox("DB-Upload 1/5 fertig!")
Is there a safer method uploading lots of data to a SQL-Server?
Yes, there are a couple of ways to load data into SQL Server:
bcp Utility
BULK INSERT statement
OPENROWSET
SQL Server Import and Export Wizard
Important note: On a 64-bit computer, Integration Services installs the 64-bit version of the SQL Server Import and Export Wizard (DTSWizard.exe). However, some data sources, such as Access or Excel, only have a 32-bit provider available. To work with these data sources, you might have to install and run the 32-bit version of the wizard. To install the 32-bit version of the wizard, select either Client Tools or SQL Server Data Tools (SSDT) during setup.
SSIS
Tutorial to create an SSIS package
Being an SSIS developer I could go into depth and demonstrate possible ways to import data using SSIS, like:
Bulk import
Script Task
Flat File source / Excel source
Etc., etc. But that would be different from the question you have asked (see the SqlBulkCopy sketch below for one programmatic route).
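Since you are already pushing the data from VB.NET, one more programmatic route worth sketching is SqlBulkCopy, which streams rows to the server much faster than executing a large INSERT script. The table name and settings below are placeholders, not from the question:

Imports System.Data
Imports System.Data.SqlClient

Module BulkLoadSketch
    ' Sketch: bulk-load a DataTable into a destination table in one streamed pass.
    Sub LoadTable(sourceTable As DataTable, connStr As String)
        Using bulk As New SqlBulkCopy(connStr)
            bulk.DestinationTableName = "dbo.TargetTable" ' placeholder name
            bulk.BulkCopyTimeout = 0    ' seconds; 0 = no time limit
            bulk.BatchSize = 5000       ' commit rows in batches
            bulk.WriteToServer(sourceTable)
        End Using
    End Sub
End Module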
Regarding resolving your error: you can set the command timeout property in your code so that the command runs within a specific time limit, or with no limit at all. The value is in seconds, and 0 means no limit:
command.CommandTimeout = 0
Note: The CommandTimeout property will be ignored during asynchronous
method calls such as BeginExecuteReader.
Edited
Here is the sample code
Using command As New SqlCommand()
    command.Connection = connection
    command.CommandType = CommandType.Text
    command.CommandText = SQLscript
    command.CommandTimeout = 0 ' seconds; 0 = no time limit
    Try
        connection.Open()
        rowsAffected = command.ExecuteNonQuery()
    Catch ex As Exception
        Debug.Print(ex.Message)
    End Try
End Using
The other way is to extend the connection timeout in your connection string. It accepts an integer number of seconds (in the example below the connection timeout is set to 30 seconds; 0 means wait indefinitely). Note that this setting governs how long the client waits while establishing the connection; it does not limit how long a command may run.
Connection Timeout=30;
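Because the two settings are easy to mix up, here is a minimal sketch, assuming a base connection string such as the question's Verbind.ConStr, that sets the connect timeout with SqlConnectionStringBuilder instead of hand-editing the string:

Imports System.Data.SqlClient

Module TimeoutExample
    ' Sketch: adjust the connect timeout programmatically.
    Sub OpenWithTimeout(baseConnStr As String)
        Dim builder As New SqlConnectionStringBuilder(baseConnStr)
        builder.ConnectTimeout = 30 ' seconds allowed for establishing the connection

        Using connection As New SqlConnection(builder.ConnectionString)
            connection.Open()
            ' Run commands here; CommandTimeout (not ConnectTimeout) governs
            ' how long each command may execute.
        End Using
    End Sub
End Module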
Import / Export Wizard for a one-time move
Since you were able to generate scripts, you have access to the source. Since you're able to execute them, you have access to the destination, and you have Management Studio installed. Your fastest option, IMO, is the Import/Export utility provided by SQL Server.
In SQL Server Management Studio, right-click on the source database name, click "Tasks", and then click "Export Data". This opens a wizard that lets you pick the source, the destination, and the objects you want to export.
While 50 MB of scripts is large, a 50 MB data move should finish within a matter of minutes.
Moving data over and over again
You have several options; I would go with OPENROWSET, especially for the size of data you're moving. Here's a reference: https://msdn.microsoft.com/en-us/library/ms190312.aspx
If you get into moving gigabytes' worth of data regularly, then SSIS packages are your best option. Here's a tutorial: https://msdn.microsoft.com/en-us/library/ms169917.aspx
