I've literally no experience in VB script or C#. I created this SSIS package using an online tutorial, and it serves my purpose, but I have to fine-tune it to fit my requirements.
Current Scenario:
I'm trying to run an SSIS package which has a for-each loop container that imports the files with the *.txt extension from a directory, as the file names are not constant. This for-each loop container is followed by some other SQL tasks.
The package executes successfully even when there are no files in the directory (maybe I did something wrong while creating the container, data flow tasks, or file system tasks). This causes the SQL script at the end of the for-each loop container to execute, which results in wrong data.
Requirement:
The package should fail if there is no file in the directory. I have to implement a script before the for-each loop container but am not sure how to do it. Any leads would be appreciated!
I did something like this, but I'm not sure how to search by extension rather than by file name:
Public Sub Main()
    ' Check whether a specific file exists and store the result in a package variable.
    Dim fileName As String
    fileName = "filename.txt"
    If System.IO.File.Exists(fileName) Then
        Dts.Variables("User::bolFileExists").Value = True
    Else
        Dts.Variables("User::bolFileExists").Value = False
    End If
    Dts.TaskResult = ScriptResults.Success
End Sub
You should use the System.IO.Directory.GetFiles() function.
If System.IO.Directory.GetFiles(<your path goes here>, "*.txt", SearchOption.AllDirectories).Length = 0 Then
    Dts.Variables("User::bolFileExists").Value = False
Else
    Dts.Variables("User::bolFileExists").Value = True
End If
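If you also want the package itself to fail when the folder is empty (rather than just setting a flag for a precedence constraint), a minimal sketch of the whole Main() is below. It is shown in C# for illustration (the logic is the same in VB), and it assumes a hypothetical User::SourceFolder variable holding the directory path plus your existing User::bolFileExists variable, both added to the Script Task's variable lists; adjust the names to match your package.

public void Main()
{
    // Hypothetical variable holding the folder to scan; replace with your own variable name.
    string folder = Dts.Variables["User::SourceFolder"].Value.ToString();

    // Any *.txt file in the folder (top level only)?
    bool anyTxt = System.IO.Directory.GetFiles(folder, "*.txt").Length > 0;
    Dts.Variables["User::bolFileExists"].Value = anyTxt;

    if (anyTxt)
    {
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    else
    {
        // Raise an error and fail the task so the package stops before the for-each loop.
        Dts.Events.FireError(0, "Check for files", "No *.txt files found in " + folder, string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}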
Below is my suggestion. I did the same in one of my requirements using the event-handling section: if the underlying DFT does not run, a script in the event handler raises an error. The point to note is that the DFT runs at least once if there is any file in the directory, so raising an error when it does not run is simpler than writing a complex script.
Thanks,
Srinivas
I have the following C# code in a console application.
Whenever I debug the application and run query1 (which inserts a new value into the database) and then run query2 (which displays all the entries in the database), I can see the new entry I inserted clearly. However, when I close the application and check the table in the database (in Visual Studio), it is gone. I have no idea why it is not saving.
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Data.SqlServerCe;
using System.Data;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                string fileName = "FlowerShop.sdf";
                string fileLocation = "|DataDirectory|\\";
                DatabaseAccess dbAccess = new DatabaseAccess();
                dbAccess.Connect(fileName, fileLocation);
                Console.WriteLine("Connected to the following database:\n" + fileLocation + fileName + "\n");

                string query = "Insert into Products(Name, UnitPrice, UnitsInStock) values('NewItem', 500, 90)";
                string res = dbAccess.ExecuteQuery(query);
                Console.WriteLine(res);

                string query2 = "Select * from Products";
                string res2 = dbAccess.QueryData(query2);
                Console.WriteLine(res2);

                Console.ReadLine();
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                Console.ReadLine();
            }
        }
    }

    class DatabaseAccess
    {
        private SqlCeConnection _connection;

        public void Connect(string fileName, string fileLocation)
        {
            Connect("Data Source=" + fileLocation + fileName);
        }

        public void Connect(string connectionString)
        {
            _connection = new SqlCeConnection(connectionString);
        }

        public string QueryData(string query)
        {
            _connection.Open();
            using (SqlCeDataAdapter da = new SqlCeDataAdapter(query, _connection))
            using (DataSet ds = new DataSet("Data Set"))
            {
                da.Fill(ds);
                _connection.Close();
                return ds.Tables[0].ToReadableString(); // an extension method I created
            }
        }

        public string ExecuteQuery(string query)
        {
            _connection.Open();
            using (SqlCeCommand c = new SqlCeCommand(query, _connection))
            {
                int r = c.ExecuteNonQuery();
                _connection.Close();
                return r.ToString();
            }
        }
    }
}
EDIT: Forgot to mention that I am using SQL Server Compact Edition 4 and VS2012 Express.
It is quite a common problem. You use the |DataDirectory| substitution string. This means that, while debugging your app in the Visual Studio environment, the database used by your application is the one located in the BIN\DEBUG subfolder (or x86 variant) of your project. And this works well: you don't get any error connecting to the database or performing update operations.
But then, you exit the debug session and you look at your database through the Visual Studio Server Explorer (or any other suitable tool). This window has a different connection string (probably pointing to the copy of your database in the project folder). You search your tables and you don't see the changes.
Then the problem gets worse. You restart VS to go hunting for the bug in your app, but your database file is listed among your project files with the property Copy to Output Directory set to Copy Always. At this point Visual Studio obliges and copies the original database file from the project folder to the output folder (BIN\DEBUG), and thus your previous changes are lost.
Now your application inserts/updates the target table again, you again can't find any error in your code, and you repeat the loop until you decide to post or search on StackOverflow.
You can stop this problem by clicking on the database file listed in your Solution Explorer and changing the property Copy To Output Directory to Copy If Newer or Never Copy. You could also update your connection string in the Server Explorer to look at the working copy of your database, or create a second connection. The first one still points to the database in the project folder, while the second one points to the database in the BIN\DEBUG folder. In this way you can keep the original database ready for deployment purposes and schema changes, while with the second connection you can look at the effective results of your coding efforts.
EDIT Special warning for MS Access database users. The simple act of looking at your tables changes the modified date of your database, even if you don't write or change anything. So the Copy if Newer flag kicks in and the database file is copied to the output directory. With Access, it is better to use Copy Never.
Committing changes / saving changes across debug sessions is a familiar topic in SQL CE forums. It is something that trips up quite a few people. I'll post links to source articles below, but I wanted to paste the answer that seems to get the best results to the most people:
You have several options to change this behavior. If your sdf file is part of the content of your project, this will affect how data is persisted. Remember that when you debug, all output of your project (including the sdf) is in the bin/debug folder.
You can decide not to include the sdf file as part of your project and manage the file location runtime.
If you are using "Copy if newer", any project changes you make to the database will overwrite any runtime/debug changes.
If you are using "Do not copy", you will have to specify the location in code (e.g. two levels above where your program is running); see the sketch after this list.
If you have "Copy always", any changes made during runtime will always be overwritten.
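For the "Do not copy" option, a minimal sketch of what "specify the location in code" can look like is below; it points |DataDirectory| two levels above the executable (i.e. back at the project folder). The relative path is an assumption, so adjust it to wherever your .sdf file actually lives.

using System;
using System.IO;

class DataDirectorySetup
{
    static void Main()
    {
        // Assumption: the .sdf sits two levels above bin\Debug, i.e. in the project folder.
        string projectFolder = Path.GetFullPath(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"..\.."));

        // |DataDirectory| in the connection string will now resolve to the project folder.
        AppDomain.CurrentDomain.SetData("DataDirectory", projectFolder);

        Console.WriteLine("DataDirectory = " + AppDomain.CurrentDomain.GetData("DataDirectory"));
    }
}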
Answer Source
Here is a link to some further discussion and how to documentation.
I am facing two issues in my SSIS package. The package has a Script Task that formats an Excel sheet as per my requirement and saves it as a different file, modified.xlsx. This file is then used in my data flow task to process and upload data to database tables.
This package works perfectly in my local system, but when I created a scheduled job on my SQL server to run this package it fails with the generic error message "Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 64-bit Copyright (C) Microsoft Corporation. All rights reserved. Started: 12:06:55 PM Error: 2016-04-01 12:06:57.06 Code: 0x00000001 Source: Script Task Description: Exception has been thrown by the target of an invocation. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 12:06:55 PM Finished: 12:06:57 PM Elapsed: 1.563 seconds. The package execution failed. The step failed."
To get a more detailed error message, I tried to set up logging for the Script Task.
I configured logging to write log entries for events to a CSV file as in the below screen shot.
I enabled logging for the package and checked individual tasks as well. In the Script Task, I added Dts.Log("", 0, bytes); to keep track of any exceptions and also to log each step.
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
    byte[] bytes = new byte[0];

    public void Main()
    {
        LogMessages("");
        LogMessages("Update Bug package execution started at :: " + DateTime.Now.ToLongTimeString());
        LogMessages("Loading package configuration values to local variables.");

        string FileName = Convert.ToString(Dts.Variables["User::ExcelFileName"].Value);
        string SourceFileLocation = Convert.ToString(Dts.Variables["User::SourceFileLoc"].Value);
        string SourceFileName = Path.Combine(SourceFileLocation, FileName);
        string saveLoc = Path.Combine(SourceFileLocation, "ModifiedExcel.xlsx");

        var excel = new Excel.Application();
        var workbook = excel.Workbooks.Open(SourceFileName);
        try
        {
            foreach (Excel.Worksheet tempSheet in workbook.Worksheets)
            {
                LogMessages("For loop to check sheet names");
                if (((Excel.Worksheet)(tempSheet)).Name.Contains("Test"))
                {
                    if (File.Exists(saveLoc))
                    {
                        File.Delete(saveLoc);
                    }
                    //File.Create(saveLoc);
                    tempSheet.Select();
                    workbook.SaveAs(saveLoc);
                }
                System.Runtime.InteropServices.Marshal.ReleaseComObject(tempSheet);
            }
            workbook.Save();
            workbook.Close();
            excel.Quit();
            LogMessages("Quit Excel sheet");
            System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook);
            System.Runtime.InteropServices.Marshal.ReleaseComObject(excel);
            LogMessages("Release excel objects");
        }
        catch (Exception ex)
        {
            LogMessages("Exception: " + ex.InnerException);
            System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook);
            System.Runtime.InteropServices.Marshal.ReleaseComObject(excel);
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }

    #region ScriptResults declaration
    /// <summary>
    /// This enum provides a convenient shorthand within the scope of this class for setting the
    /// result of the script.
    ///
    /// This code was generated automatically.
    /// </summary>
    enum ScriptResults
    {
        Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
        Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
    };
    #endregion

    #region Log messages to package log files/table.
    public void LogMessages(string strLogMsg)
    {
        Dts.Log(strLogMsg, 0, bytes);
    }
    #endregion
}
But when I run the package, the log file is not updated. The log file just contains the following:
Fields: event,computer,operator,source,sourceid,executionid,starttime,endtime,datacode,databytes,message
Can someone please help me understand what I am missing here for logging? Also, what could be the issue for the job to fail in SQL server?
Why is it not logging?
Here's the fun part, as best as I've been able to divine over the years of dealing with SSIS. Dts.Log is pretty useless, at least if you want it to show up in the Logging facility built into SSIS.
Instead, change your Dts.Log calls to Dts.Events.Fire* calls, e.g.
bool fireAgain = false;
Dts.Events.FireInformation(0, "This gets logged", "My long description here", string.Empty, 0, ref fireAgain);
Then, on the Details tab above, ensure that you have checked the OnInformation event (this also presumes you've configured the package to track it all).
Finally, if you don't actually check your log provider on the Providers and Logs tab, it won't log to the table.
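Applied to the LogMessages helper in the question, the change is a one-line swap. This is only a sketch, and the subComponent string ("Update Bug Script Task") is an arbitrary label:

public void LogMessages(string strLogMsg)
{
    bool fireAgain = false;
    // FireInformation surfaces the message to the SSIS logging framework (OnInformation event).
    Dts.Events.FireInformation(0, "Update Bug Script Task", strLogMsg, string.Empty, 0, ref fireAgain);
}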
Why is it not working?
The package isn't working because you're dealing with Excel, and the error message specifies that you are running in 64-bit mode.
Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 64-bit
Unless you've done something to explicitly make 64-bit Excel work on this server, it's not going to work. Instead, in the SQL Agent job, you need to specify that this package runs in 32-bit mode.
See also
Why does my ODBC connection fail when running an SSIS load in Visual Studio but not when running the same package using Execute Package Utility
I had a connection string to a MS Access DB file, Foo.accdb, defined and used in my project. It was defined as a connection string Setting in the Settings section of my project properties. The program referenced the connection string setting and everything worked fine.
Then I decided to replace Foo.accdb with two different DB files, A.accdb and B.accdb each of which would be used under different circumstances. I added connection strings for them in Settings and removed the Setting definition for Foo.accdb connection string.
The name of my application is Foo and the name of the Foo.accdb connection string was FooConnectionString.
But now when I build the program both in debugger and for release I get the following error message:
'FooConnectionString' is not a member of 'Foo.My.MySettings'.
The offending reference, in file FooDataSet.Designer.vb, is:
<Global.System.Diagnostics.DebuggerNonUserCodeAttribute()> _
Private Sub InitConnection()
Me._connection = New Global.System.Data.OleDb.OleDbConnection
Me._connection.ConnectionString = Global.Foo.My.MySettings.Default.FooConnectionString
End Sub
What is going on here? FooConnectionString is not in any other file in the project directory nor in the My Project subdir. I completely got rid of it in my code and in my project properties yet it persists in FooDataSet.Designer.vb (whatever that is).
While researching this on the web I saw a recommendation to select the FooDataSet.xsd file, right click it and execute the "Run Custom Tool" option. I did this and it appears to rebuild FooDataSet.Designer.vb (the time stamp changes) but the problem persists.
I also tried removing the offending reference by manually editing FooDataSet.Designer.vb but that gave me some other error message.
Why is this old reference staying around and what can I do about it?
This is a standalone app. I'm using VS2008 Standard Ed., VB.Net 3.5
Thanks.
Open the FooDataSet XSD file in a text editor: right-click the dataset in the Solution Explorer, select "Open With...", and then select XML (Text) Editor, or open it outside the solution.
Look for the <Connections> tag near the top of the file. Remove the line that looks like this
<Connection AppSettingsObjectName="Settings" AppSettingsPropertyName="FooConnectionString" ConnectionStringObject="" IsAppSettingsProperty="true" Modifier="Assembly" Name="FooConnectionString(Settings)" ...
I'm using SQL Server 2005, and creating ftp tasks within SSIS.
Sometimes there will be files to ftp over, sometimes not. If there are no files, I don't want the task nor the package to fail. I've changed the arrow going from the ftp task to the next to "completion", so the package runs through. I've changed the allowed number of errors to 4 (because there are 4 ftp tasks, and any of the 4 directories may or may not have files).
But when I run the package from a job in Agent, it marks the job as failed. Since this will be running every 15 minutes, I don't want a bunch of red X's in my job history, which would cause us to miss a real problem when it does occur.
How do I set the properties in the ftp task so that not finding files to ftp is not a failure? The operation I am using is "Send files".
Here is some more information: the files are on a server that I don't have any access to except through FTP. And I don't know the file names ahead of time; the user can call them whatever they want. So I can't check for specific files, nor, I think, can I check at all, except through the FTP connection and tasks based upon that connection. The files are on a remote server, and I want to copy them over to my server.
I can shell out to command-line ftp in a script task. Perhaps that is what I need to use instead of an FTP task. (I have changed to use the ftp command line, with a parameter file, called from a script task. It gives no errors when there are no files to get. I think this solution is going to work for me. I'm creating the parameter file dynamically, which means the connection information doesn't need to sit in a plain text file, but can instead be stored in my configuration file, which is in a more secure location.)
I understand that you have found an answer to your question; this is for other users who might stumble upon it. Here is one possible way of achieving this: a Script Task can be used to find the list of files present in an FTP folder path for a given pattern (say *.txt). The example below shows how this can be done.
Step-by-step process:
On the SSIS package, create an FTP connection named FTP and also create 5 variables as shown in screenshot #1. Variable RemotePath contains the FTP folder path; LocalPath contains the folder the files will be downloaded to; FilePattern contains the file pattern used to find the list of files to download from the FTP server; ListOfFiles will hold the collection of matching file names; FileName will be populated by the Foreach Loop container, but to avoid an FTP Task design-time error it can be initialized with / or the DelayValidation property on the FTP Task can be set to True.
On the SSIS package, place a Script Task and a Foreach Loop container, with an FTP Task inside the Foreach Loop container, as shown in screenshot #2.
Replace the Main() method within the Script Task with the code under the Script Task Code section. The Script Task will populate the variable ListOfFiles with the collection of files matching a given pattern. This example will first use the pattern *.txt, which yields no results, and then the pattern *.xls, which matches a few files on the FTP server.
Configure the Foreach Loop container as shown in screenshots #3 and #4. This task will loop through the variable ListOfFiles. If there are no files, the FTP Task inside the loop container will not execute. If there are files, the FTP Task will execute once for each file found on the FTP server.
Configure the FTP Task as shown in screenshots #5 and #6.
Screenshot #7 shows sample package execution when no matching files are found for the pattern *.txt.
Screenshot #8 shows the contents of the folder C:\temp\ before execution of the package.
Screenshot #9 shows sample package execution when matching files are found for the pattern *.xls.
Screenshot #10 shows the contents of the FTP remote path /Practice/Directory_New.
Screenshot #11 shows the contents of the folder C:\temp\ after execution of the package.
Screenshot #12 shows the package failure when provided with incorrect Remote path.
Screenshot #13 shows the error message related to the package failure.
Hope that helps.
Script Task Code:
C# code that can be used in SSIS 2008 and above.
Include the using statement using System.Text.RegularExpressions;
public void Main()
{
    Variables varCollection = null;
    ConnectionManager ftpManager = null;
    FtpClientConnection ftpConnection = null;
    string[] fileNames = null;
    string[] folderNames = null;
    System.Collections.ArrayList listOfFiles = null;
    string remotePath = string.Empty;
    string filePattern = string.Empty;
    Regex regexp;
    int counter;

    Dts.VariableDispenser.LockForRead("User::RemotePath");
    Dts.VariableDispenser.LockForRead("User::FilePattern");
    Dts.VariableDispenser.LockForWrite("User::ListOfFiles");
    Dts.VariableDispenser.GetVariables(ref varCollection);

    try
    {
        remotePath = varCollection["User::RemotePath"].Value.ToString();
        filePattern = varCollection["User::FilePattern"].Value.ToString();

        ftpManager = Dts.Connections["FTP"];
        ftpConnection = new FtpClientConnection(ftpManager.AcquireConnection(null));
        ftpConnection.Connect();
        ftpConnection.SetWorkingDirectory(remotePath);
        ftpConnection.GetListing(out folderNames, out fileNames);
        ftpConnection.Close();

        listOfFiles = new System.Collections.ArrayList();
        if (fileNames != null)
        {
            regexp = new Regex("^" + filePattern + "$");
            for (counter = 0; counter <= fileNames.GetUpperBound(0); counter++)
            {
                if (regexp.IsMatch(fileNames[counter]))
                {
                    listOfFiles.Add(remotePath + fileNames[counter]);
                }
            }
        }
        varCollection["User::ListOfFiles"].Value = listOfFiles;
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        Dts.Events.FireError(-1, string.Empty, ex.ToString(), string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
    finally
    {
        varCollection.Unlock();
        ftpConnection = null;
        ftpManager = null;
    }
}
VB code that can be used in SSIS 2005 and above.
Include the Imports statement Imports System.Text.RegularExpressions
Public Sub Main()

    Dim varCollection As Variables = Nothing
    Dim ftpManager As ConnectionManager = Nothing
    Dim ftpConnection As FtpClientConnection = Nothing
    Dim fileNames() As String = Nothing
    Dim folderNames() As String = Nothing
    Dim listOfFiles As Collections.ArrayList
    Dim remotePath As String = String.Empty
    Dim filePattern As String = String.Empty
    Dim regexp As Regex
    Dim counter As Integer

    Dts.VariableDispenser.LockForRead("User::RemotePath")
    Dts.VariableDispenser.LockForRead("User::FilePattern")
    Dts.VariableDispenser.LockForWrite("User::ListOfFiles")
    Dts.VariableDispenser.GetVariables(varCollection)

    Try
        remotePath = varCollection("User::RemotePath").Value.ToString()
        filePattern = varCollection("User::FilePattern").Value.ToString()

        ftpManager = Dts.Connections("FTP")
        ftpConnection = New FtpClientConnection(ftpManager.AcquireConnection(Nothing))
        ftpConnection.Connect()
        ftpConnection.SetWorkingDirectory(remotePath)
        ftpConnection.GetListing(folderNames, fileNames)
        ftpConnection.Close()

        listOfFiles = New Collections.ArrayList()
        If fileNames IsNot Nothing Then
            regexp = New Regex("^" & filePattern & "$")
            For counter = 0 To fileNames.GetUpperBound(0)
                If regexp.IsMatch(fileNames(counter)) Then
                    listOfFiles.Add(remotePath & fileNames(counter))
                End If
            Next counter
        End If
        varCollection("User::ListOfFiles").Value = listOfFiles

        Dts.TaskResult = ScriptResults.Success
    Catch ex As Exception
        Dts.Events.FireError(-1, String.Empty, ex.ToString(), String.Empty, 0)
        Dts.TaskResult = ScriptResults.Failure
    Finally
        varCollection.Unlock()
        ftpConnection = Nothing
        ftpManager = Nothing
    End Try

End Sub
Screenshot #1:
Screenshot #2:
Screenshot #3:
Screenshot #4:
Screenshot #5:
Screenshot #6:
Screenshot #7:
Screenshot #8:
Screenshot #9:
Screenshot #10:
Screenshot #11:
Screenshot #12:
Screenshot #13:
Check this link, which describes how to gracefully handle task errors in an SSIS package.
I had almost the same problem, but with retrieving files: I wanted the package NOT to fail when no files were found on the FTP server. The above link stops the error bubbling up and causing the package to fail; something you would have thought FailPackageOnError=false should have done? :-S
Hope this solves it for you too!
I just had this issue. After reading some of the replies here, nothing really sorted out my problem, and the solutions here seemed insanely complex.
My FTP task was failing because I did not allow overwriting files. Let's say the job was kicked off twice in a row: the first pass is fine, because the files are transferred over, but the second pass fails because the local files already exist.
My solution was simple:
Right click task - Properties
Set ForceExecutionResult = "Success"
(I can't accept my own answer, but this was the solution that worked for me.)
It may not be the best solution, but this works.
I use a script task and have a bunch of variables for the FTP connection information and the source and destination directories. (Because we'll be changing the server this runs on, it will be easier to change in a package configuration.)
I create a text file on the fly, and write the ftp commands to it:
Dim ftpStream As StreamWriter = ftpFile.CreateText()
ftpStream.WriteLine(ftpUser)
ftpStream.WriteLine(ftpPassword)
ftpStream.WriteLine("prompt off")
ftpStream.WriteLine("binary")
ftpStream.WriteLine("cd " & ftpDestDir)
ftpStream.WriteLine("mput " & ftpSourceDir)
ftpStream.WriteLine("quit 130")
ftpStream.Close()
Then, after giving it enough time to really close, I start a process to do the ftp command:
ftpParameters = "-s:" & ftpParameterLoc & ftpParameterFile & " " & ftpServer
proc = System.Diagnostics.Process.Start("ftp", ftpParameters)
Then, after giving it some more time for the ftp process to run, I delete the temporary ftp file (that has connection information in it!).
If files don't exist in the source directory (the variable has the \\drive\dir\*.* mapping), then there is no error. If some other error happens, the task still fails, as it should.
I'm new to SSIS, and this may be a kludge. But it works for now. I guess I asked for the best way, and I'll certainly not claim that this is it.
As I pointed out, I have no way of knowing what the files are named, or even if there are any files there at all. If they are there, I want to get them.
I don't have a packaged answer for you, but since no one else has posted anything yet...
You should be able to set a variable in an ActiveX script task and then use that to decide whether or not the FTP task should run. There is an example here that works with local paths. Hopefully you can adapt the concept (or if possible, map the FTP drive and do it that way).
1) Set the FTP Task property ForceExecutionResult = Success
2) Add this code to FTP Task OnError event handler.
public void Main()
{
    // Check whether the error is the "no files found" FTP error and stop it from propagating.
    int errorCode = (int)Dts.Variables["System::ErrorCode"].Value;
    if (errorCode.ToString().Equals("-1073573501"))
    {
        Dts.Variables["System::Propagate"].Value = false;
    }
    else
    {
        Dts.Variables["System::Propagate"].Value = true;
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
Put it in a ForEach container, which iterates over the files to upload. No files, no FTP, no failure.
You can redirect on failure to another task that does nothing, i.e. a script that just returns true.
To do this, add the new script task and highlight your FTP task; a second green connector will appear. Drag this to the script task and then double-click it, and select Failure in the Value drop down. Obviously, you'll then need to handle real failures in this script task so they still display correctly in the job history.
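For reference, the do-nothing script mentioned above can be as small as this (a C# sketch; in a generated Script Task you only need this body of Main):

public void Main()
{
    // Intentionally do nothing; report success so the package carries on after the FTP failure.
    Dts.TaskResult = (int)ScriptResults.Success;
}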
Aha, OK - thanks for the clarification. As the FTP task cannot return a folder listing, it will not be possible to use the ForEach as I initially said - that only works if you're uploading X number of files to a remote source.
To download X number of files, you can go two ways: either do it entirely in .NET in a script task, or populate an ArrayList with the file names from within a .NET script task, then ForEach over the ArrayList, passing each file name to a variable and downloading that variable name in a standard FTP task.
Code example to suit: http://forums.microsoft.com/msdn/ShowPost.aspx?PostID=2472491&SiteID=1
So, in the above, you'd get the FileNames() and populate the ArrayList from that, then assign the ArrayList to an Object type variable in Dts.Variables, then ForEach over that Object (ArrayList) variable using code something like: http://www.sqlservercentral.com/articles/SSIS/64014/
You can use the free SSIS FTP Task++ from eaSkills. It doesn't throw an error if the file or files don't exist, it supports wild cards, and it gives you the option to download and delete if you need to do so.
Here's the link to the feature page:
http://www.easkills.com/ssis/ftptask
This is another solution that works for me, using built-in functionality and without manually rewriting the FTP logic:
1) Create a variable in your package called FTP_Error
2) Click your FTP Task, then click "Event Handlers" tab
3) Click within the page to create an event handler for "FTP Task/OnError" - this will fire whenever there is trouble with the FTP
4) From the toolbox, drag in a Script Task item, and double-click to open that up
5) In the first pop-up, ReadOnlyVariables - add System::ErrorCode, System::ErrorDescription
6) In the first pop-up, ReadWriteVariables - add your User::FTP_Error variable
7) Edit Script
8) In the script set your FTP_Error variable to hold the ReadOnlyVariables we had above:
Dts.Variables["FTP_Error"].Value = "ErrorCode:" + Dts.Variables["ErrorCode"].Value.ToString() + ", ErrorDescription=" + Dts.Variables["ErrorDescription"].Value.ToString();
9) Save and close script
10) Hit "OK" to script task
11) Go back to "Control Flow" tab
12) From the FTP task, OnError go to a new Script task, and edit that
13) ReadOnlyVariables: User::FTP_Error from before
14) Now, when there are no files found on the FTP, the error code is -1073573501
(you can find the error code reference list here: http://msdn.microsoft.com/en-us/library/ms345164.aspx)
15) In your script, put in the logic to do what you want - if you find a "no files found" code, then maybe you say task successful. If not, then task failed. And your normal flow can handle this as you wish:
if (Dts.Variables["FTP_Error"].Value.ToString().Contains("-1073573501"))
{
    // file not found - not a problem
    Dts.TaskResult = (int)ScriptResults.Success;
}
else
{
    // some other error - raise alarm!
    Dts.TaskResult = (int)ScriptResults.Failure;
}
And from there your Succeeded/Failed flow will do what you want to do with it.
An alternative is to use this FTP File Enumerator