I am facing two issues with my SSIS package. The package has a Script Task that formats an Excel sheet as per my requirements and saves it as a different file, modified.xlsx. This file is then used in my Data Flow Task to process and upload data to database tables.
This package works perfectly on my local system, but when I created a scheduled job on SQL Server to run the package, it fails with the generic error message:

Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 64-bit Copyright (C) Microsoft Corporation. All rights reserved. Started: 12:06:55 PM
Error: 2016-04-01 12:06:57.06 Code: 0x00000001 Source: Script Task Description: Exception has been thrown by the target of an invocation. End Error
DTExec: The package execution returned DTSER_FAILURE (1). Started: 12:06:55 PM Finished: 12:06:57 PM Elapsed: 1.563 seconds. The package execution failed. The step failed.
To get a more detailed error message, I tried to set up logging for the Script Task.
I configured logging to write log entries for events to a CSV file. (Screenshot of the logging configuration omitted.)
I enabled logging for the package and checked the individual tasks as well. In the Script Task, I added Dts.Log("", 0, bytes); calls to log each step and to keep track of any exceptions.
using System;
using System.IO;
using Microsoft.SqlServer.Dts.Runtime;
using Excel = Microsoft.Office.Interop.Excel;

public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
    byte[] bytes = new byte[0];

    // Local copies of the package configuration values.
    string FileName;
    string SourceFileLocation;
    string SourceFileName;
    string saveLoc;

    public void Main()
    {
        LogMessages("");
        LogMessages("Update Bug package execution started at :: " + DateTime.Now.ToLongTimeString());
        LogMessages("Loading package configuration values to local variables.");

        FileName = Convert.ToString(Dts.Variables["User::ExcelFileName"].Value);
        SourceFileLocation = Convert.ToString(Dts.Variables["User::SourceFileLoc"].Value);
        SourceFileName = Path.Combine(SourceFileLocation, FileName);
        saveLoc = Path.Combine(SourceFileLocation, "ModifiedExcel.xlsx");

        var excel = new Excel.Application();
        var workbook = excel.Workbooks.Open(SourceFileName);
        try
        {
            foreach (Excel.Worksheet tempSheet in workbook.Worksheets)
            {
                LogMessages("For loop to check sheet names");
                if (tempSheet.Name.Contains("Test"))
                {
                    // Replace any previously generated copy of the file.
                    if (File.Exists(saveLoc))
                    {
                        File.Delete(saveLoc);
                    }
                    //File.Create(saveLoc);
                    tempSheet.Select();
                    workbook.SaveAs(saveLoc);
                }
                System.Runtime.InteropServices.Marshal.ReleaseComObject(tempSheet);
            }
            workbook.Save();
            workbook.Close();
            excel.Quit();
            LogMessages("Quit Excel sheet");
            System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook);
            System.Runtime.InteropServices.Marshal.ReleaseComObject(excel);
            LogMessages("Release excel objects");
        }
        catch (Exception ex)
        {
            // Log the full exception; InnerException alone is often null here.
            LogMessages("Exception: " + ex);
            System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook);
            System.Runtime.InteropServices.Marshal.ReleaseComObject(excel);
            // Report failure so the task does not succeed silently.
            Dts.TaskResult = (int)ScriptResults.Failure;
            return;
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }

    #region ScriptResults declaration
    /// <summary>
    /// This enum provides a convenient shorthand within the scope of this class for setting the
    /// result of the script.
    ///
    /// This code was generated automatically.
    /// </summary>
    enum ScriptResults
    {
        Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
        Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
    };
    #endregion

    #region Log messages to package log files/table.
    public void LogMessages(string strLogMsg)
    {
        Dts.Log(strLogMsg, 0, bytes);
    }
    #endregion
}
But when I run the package, the log file is not updated. It only contains the following:
Fields: event,computer,operator,source,sourceid,executionid,starttime,endtime,datacode,databytes,message
Can someone please help me understand what I am missing here for logging? Also, what could be causing the job to fail on SQL Server?
Why is it not logging?
Here's the fun part, as best as I've been able to divine over the years of dealing with SSIS. Dts.Log is pretty useless, at least if you want it to show up in the Logging facility built into SSIS.
Instead, change your Dts.Log calls to the Dts.Events.Fire* methods, e.g.
bool fireAgain = false;
Dts.Events.FireInformation(0, "This gets logged", "My long description here", string.Empty, 0, ref fireAgain);
Then, in the Details tab of the logging configuration shown above, ensure that you have checked the OnInformation event (this also presumes you've configured the package to track it all).
Finally, if you don't actually check the box for the log provider in the Providers and Logs tab, nothing gets logged.
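Concretely, here is a minimal sketch of the LogMessages helper from the question rewritten to raise events instead; it keeps the same signature, so the existing call sites don't need to change:

// Sketch: route messages through the SSIS event system so the
// configured log providers (CSV file, SQL table, etc.) pick them up.
public void LogMessages(string strLogMsg)
{
    bool fireAgain = false;
    Dts.Events.FireInformation(0, "Script Task", strLogMsg, string.Empty, 0, ref fireAgain);
}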
Why is it not working?
The package isn't working because you're dealing with Excel and the error message specifies you are running in 64 bit mode.
Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 64-bit
Unless you've done something to explicitly make 64-bit Excel work on this server, it's not going to work. Instead, in the SQL Agent job step, you need to specify that the package runs in 32-bit mode (the "Use 32 bit runtime" checkbox under the step's Execution options).
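The same applies if you invoke the package yourself from the command line: call the 32-bit DTExec from the x86 folder. A sketch, assuming a default SQL Server 2012 installation path and the package path from your post:

"C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe" /F "D:\Path\To\Package.dtsx"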
See also
Why does my ODBC connection fail when running an SSIS load in Visual Studio but not when running the same package using Execute Package Utility
I have a task to send a query result in an email using SSIS 2017.
I have referred to previous posts related to this, but I was not able to find the answer I need.
Here's what I have done so far:
1. Created an Execute SQL Task with my query, with the result set mapped to a variable of Object type
2. Created a ForEach Loop container
3. Placed a Script Task inside the ForEach Loop (I think my issue is with this task) and added the variable as a ReadOnlyVariable
4. Added a Send Mail Task with the body text taken from the variable
My result set is just one column from the table.
I have referred to this URL: How to send the records from a table in an e-mail body using SSIS package?
Please find the Script Task code below.
It would be great if you could help me out.
/* Microsoft SQL Server Integration Services Script Task
   Write scripts using Microsoft Visual C# 2008.
   The ScriptMain is the entry point class of the script. */
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace ST_8015f41e93944f0e944089c73b520312
{
    [Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {
        #region VSTA generated code
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };
        #endregion

        public void Main()
        {
            Variables varCollection = null;
            string header = string.Empty;
            string message = string.Empty;

            // EmailMessage is written to; Result is only read.
            Dts.VariableDispenser.LockForWrite("User::EmailMessage");
            Dts.VariableDispenser.LockForRead("User::Result");
            Dts.VariableDispenser.GetVariables(ref varCollection);

            // Set the header message for the query result on the first iteration.
            // Compare the string value, not the boxed object: == on object operands
            // is a reference comparison and would always be false here.
            if (varCollection["User::EmailMessage"].Value.ToString() == string.Empty)
            {
                header = "Execute SQL task output sent using Send Email Task in SSIS:\n\n";
                header += string.Format("{0}\n", "Result");
                varCollection["User::EmailMessage"].Value = header;
            }

            // Append the current row's value, one row per line.
            message = string.Format("{0}\n", varCollection["User::Result"].Value);
            varCollection["User::EmailMessage"].Value =
                varCollection["User::EmailMessage"].Value + message;

            // Release the variable locks taken above.
            varCollection.Unlock();

            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}
I was able to accomplish this without using a Script Task or DB Mail. I needed to email myself notices of any reports that failed to refresh across multiple reporting systems, but I didn't want to save the results to a file and attach the file; I just wanted the results in the email body.
I wrote my query to output one concatenated column, added the query results to a recordset Object variable, and used a ForEach Loop container to append each row to a string variable with "\n" to start a new line.
This created a string that I could use as the body of the email.
(Screenshots of the Data Flow and Control Flow omitted.)
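The append step itself can stay script-free; a minimal sketch using an Expression Task inside the ForEach Loop (the variable names User::Row and User::EmailBody are hypothetical):

@[User::EmailBody] = @[User::EmailBody] + @[User::Row] + "\n"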
I have the following C# code in a console application.
Whenever I debug the application and run query1 (which inserts a new value into the database) and then run query2 (which displays all the entries in the database), I can see the new entry clearly. However, when I close the application and check the table in the database (in Visual Studio), it is gone. I have no idea why it is not saving.
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Data.SqlServerCe;
using System.Data;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                string fileName = "FlowerShop.sdf";
                string fileLocation = "|DataDirectory|\\";
                DatabaseAccess dbAccess = new DatabaseAccess();
                dbAccess.Connect(fileName, fileLocation);
                Console.WriteLine("Connected to the following database:\n" + fileLocation + fileName + "\n");

                string query = "Insert into Products(Name, UnitPrice, UnitsInStock) values('NewItem', 500, 90)";
                string res = dbAccess.ExecuteQuery(query);
                Console.WriteLine(res);

                string query2 = "Select * from Products";
                string res2 = dbAccess.QueryData(query2);
                Console.WriteLine(res2);
                Console.ReadLine();
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                Console.ReadLine();
            }
        }
    }

    class DatabaseAccess
    {
        private SqlCeConnection _connection;

        public void Connect(string fileName, string fileLocation)
        {
            Connect(@"Data Source=" + fileLocation + fileName);
        }

        public void Connect(string connectionString)
        {
            _connection = new SqlCeConnection(connectionString);
        }

        public string QueryData(string query)
        {
            _connection.Open();
            using (SqlCeDataAdapter da = new SqlCeDataAdapter(query, _connection))
            using (DataSet ds = new DataSet("Data Set"))
            {
                da.Fill(ds);
                _connection.Close();
                return ds.Tables[0].ToReadableString(); // an extension method I created
            }
        }

        public string ExecuteQuery(string query)
        {
            _connection.Open();
            using (SqlCeCommand c = new SqlCeCommand(query, _connection))
            {
                int r = c.ExecuteNonQuery();
                _connection.Close();
                return r.ToString();
            }
        }
    }
}
EDIT: Forgot to mention that I am using SQL Server Compact Edition 4 and VS2012 Express.
This is a quite common problem. You use the |DataDirectory| substitution string. This means that, while debugging your app in the Visual Studio environment, the database used by your application is located in the subfolder BIN\DEBUG (or the x86 variant) of your project. And this works well: you don't get any kind of error connecting to the database and performing update operations.
But then, you exit the debug session and you look at your database through the Visual Studio Server Explorer (or any other suitable tool). This window has a different connection string (probably pointing to the copy of your database in the project folder). You search your tables and you don't see the changes.
Then the problem gets worse. You restart VS to go hunting for the bug in your app, but you have your database file listed among your project files with the property Copy to Output Directory set to Copy Always. At this point Visual Studio obliges and copies the original database file from the project folder to the output folder (BIN\DEBUG), and thus your previous changes are lost.
Now your application inserts/updates the target table again, you again can't find any error in your code, and you restart the loop until you decide to post or search on Stack Overflow.
You can stop this problem by clicking on the database file listed in your Solution Explorer and changing the property Copy to Output Directory to Copy If Newer or Never Copy. You can also update your connection string in the Server Explorer to look at the working copy of your database, or create a second connection: the first one still points to the database in the project folder, while the second one points to the database in the BIN\DEBUG folder. In this way you keep the original database ready for deployment purposes and schema changes, while with the second connection you can look at the effective results of your coding efforts.
EDIT: A special warning for MS Access database users. The simple act of looking at your table changes the modified date of your database EVEN if you don't write or change anything. So the flag Copy If Newer kicks in and the database file is copied to the output directory. With Access, better to use Copy Never.
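To see which file your app is actually touching, you can resolve the |DataDirectory| substitution yourself. A minimal sketch (the override path is only an example):

// When DataDirectory has not been set explicitly, |DataDirectory| resolves
// to the application's base directory (bin\Debug while debugging).
string dataDir = AppDomain.CurrentDomain.GetData("DataDirectory") as string
                 ?? AppDomain.CurrentDomain.BaseDirectory;
Console.WriteLine("Working copy: " + System.IO.Path.Combine(dataDir, "FlowerShop.sdf"));

// To point |DataDirectory| at the copy in the project folder instead
// (two levels above bin\Debug), set it before opening any connection:
AppDomain.CurrentDomain.SetData("DataDirectory",
    System.IO.Path.GetFullPath(System.IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"..\..")));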
Committing changes / saving changes across debug sessions is a familiar topic in SQL CE forums. It is something that trips up quite a few people. I'll post links to source articles below, but I wanted to paste the answer that seems to get the best results to the most people:
You have several options to change this behavior. If your sdf file is part of the content of your project, this will affect how data is persisted. Remember that when you debug, all output of your project (including the sdf) is in the bin/debug folder.
You can decide not to include the sdf file as part of your project and manage the file location at runtime.
If you are using "Copy if newer", any project changes you make to the database will overwrite any runtime/debug changes.
If you are using "Do not copy", you will have to specify the location in code (two levels above where your program is running).
If you have "Copy always", any changes made during runtime will always be overwritten.
Answer Source
Here is a link to some further discussion and how to documentation.
I'm trying to insert some rows into FileMaker using a Script Component. I followed this article: Creating an ODBC Destination with the Script Component
When I edit the script, this is where I set the connection string:
// Assumption: odbcConn is declared at class level, as in the linked article.
OdbcConnection odbcConn;

public override void AcquireConnections(object Transaction)
{
    string connectionString;
    //connectionString = this.Connections.FmConnection.ConnectionString;
    odbcConn = new OdbcConnection("uid=someUID;Dsn=FM;pwd=somepassword");
    odbcConn.Open();
}
I get the connection string from the this.Connections object and set it on a new OdbcConnection object. This did not work, as I kept getting the exception, so I tried setting the connection string manually, as you can see above. I still get the exception.
In other parts of my package I use the connection manager to FileMaker and it works; just not in this Script Component.
The exception I'm getting is:
Error at MyPackageName: [Filemaker] Password Incorrect
What is wrong?
I am executing an SSIS package on the command line with dtexec.exe. The package calls MessageBox.Show() and MsgBox() (C# and VB.NET, respectively) to show errors. The boxes hang the automated job processor, since it waits for the box to close.
I was under the impression that the boxes would be hidden when not in Interactive Mode. Can the message boxes be suppressed without modifying the SSIS package? The package is from a software vendor, and I really shouldn't be changing it.
EDIT: The full command I am using (edited to remove details)
C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe /F D:\Path\To\Package.dtsx /Conn DBConnection;"Provider=SQLOLEDB;Data Source=SERVER;Initial Catalog=DB;Integrated Security=SSPI;Connect Timeout=60" /Set \Package.Variables[User::InputFileName].Properties[Value];C:\Path\To\Input.csv
EDIT2: Example of message box code. This is in a script task.
Public Sub Main()
    MsgBox("Failed to read/parse input file or generic database failure while running " + Dts.Variables("System::PackageName").Value.ToString() + ". Please check the layout of the feed and database connectivity.")
    Dts.TaskResult = ScriptResults.Failure
End Sub
EDIT3: In case anyone is interested in the code to test for InteractiveMode in the future: the variable must also be passed in via the Script Task's ReadOnlyVariables. I ended up modifying the package, since there were other issues.
VB.NET
If CBool(Dts.Variables("System::InteractiveMode").Value) = True Then
    ....
End If
C#
if ((bool)Dts.Variables["System::InteractiveMode"].Value == true)
{
    ....
}
I am trying to deploy an instance which is getting the following build error on AppHarbor:
Build FAILED.
"D:\temp\gcp22bmp.ggi\input\src\ShareBill.sln" (default target) (1) ->
"D:\temp\gcp22bmp.ggi\input\src\Sharebill.Database\Sharebill.Database.dbproj" (default target) (5) ->
D:\temp\gcp22bmp.ggi\input\src\packages\TeamData\Microsoft.Data.Schema.SqlTasks.targets(5,3): error MSB4019: The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\TeamData\Microsoft.Data.Schema.SqlTasks.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk. [D:\temp\gcp22bmp.ggi\input\src\Sharebill.Database\Sharebill.Database.dbproj]
0 Warning(s)
1 Error(s)
I know this is because the SQL targets that are normally present at C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\TeamData are not present on the build server.
What should I do to deploy the database now? Is there another way to deploy it? I have not used Code First in EF, but I still want DB migrations applied automatically.
I am assuming that if I get the DB project running, it would automatically find the schema differences and apply the changes to the database.
I host a project on AppHarbor and use FluentMigrator to deploy my database changes.
It is really easy to implement: just follow the documentation. You can use the fluent interface, or even just use it to execute raw SQL.
As an example, I run the following from Application_Start in my Global.asax.cs; just put your migrations in the same assembly as the no-op MigrationMarker class:
// Namespaces below assume FluentMigrator 1.x.
using System.Reflection;
using FluentMigrator.Runner;
using FluentMigrator.Runner.Announcers;
using FluentMigrator.Runner.Initialization;
using FluentMigrator.Runner.Processors;
using FluentMigrator.Runner.Processors.SqlServer;

const string connectionString = @"Data Source=localhost, 1433;Initial Catalog=testdb;Integrated Security=SSPI;";

// The announcer echoes the SQL that the runner executes to the debug output.
Announcer announcer = new TextWriterAnnouncer(s => System.Diagnostics.Debug.WriteLine(s));
announcer.ShowSql = true;

var assembly = Assembly.GetAssembly(typeof(MigrationMarker));
var migrationContext = new RunnerContext(announcer);
var options = new ProcessorOptions
{
    PreviewOnly = false, // set to true to see the SQL without executing it
    Timeout = 60
};

var factory = new SqlServer2008ProcessorFactory();
var processor = factory.Create(connectionString, announcer, options);
var runner = new MigrationRunner(assembly, migrationContext, processor);
runner.MigrateUp(true);
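For completeness, a minimal sketch of what a migration in that assembly might look like (the class, table, and column names are hypothetical):

using FluentMigrator;

// The version number in the attribute determines the order in which migrations run.
[Migration(1)]
public class AddCustomerTable : Migration
{
    public override void Up()
    {
        Create.Table("Customer")
            .WithColumn("Id").AsInt32().PrimaryKey().Identity()
            .WithColumn("Name").AsString(100).NotNullable();
    }

    public override void Down()
    {
        Delete.Table("Customer");
    }
}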