Wait for file to delete, then copy a folder - loops

I have a couple of scripts that sync a folder from the network server to the local terminal server, and finally into %LOCALAPPDATA%. I need to first check whether the folder is still being synced (indicated by a temporary COPYING.TXT on the server), wait until that file is removed, and THEN copy to %LOCALAPPDATA%.
Something like this:
Server-side script executes, which syncs my folder to all of my terminal servers.
It creates a COPYING.TXT temporary file, which indicates the sync is in progress. Once the sync is finished, the script removes the COPYING.TXT.
If someone logs on during the sync, I need a script that waits until the COPYING.TXT is deleted, i.e. the sync is finished, and then resumes the local sync into their %LOCALAPPDATA%.
do {
    cp c:\folder\program $env:LOCALAPPDATA\
} while ( !(test-path c:\folder\COPYING.txt) )
(So that copies the folder while the file DOESN'T exist, but I don't think that exits cleanly)
Or:
while ( !(test-path c:\folder\COPYING.txt) ) {
    cp c:\folder\program $env:LOCALAPPDATA\ -recurse -force
    if ( !(test-path c:\folder\program) ) { return }
}
But that script quits if the COPYING.TXT exists. I think I need to create a function and call it from within itself, or use a nested while loop, but that is starting to make my head hurt.
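In other words, the shape wanted here is "wait, then copy": block while COPYING.TXT exists, and run the copy exactly once after it disappears. A minimal polling sketch (the 2-second interval is an arbitrary choice):
while (Test-Path c:\folder\COPYING.txt) {
    # Sync still in progress; poll until the marker file is removed
    Start-Sleep -Seconds 2
}
# Marker is gone, so the sync is finished; copy once and exit
Copy-Item c:\folder\program $env:LOCALAPPDATA\ -Recurse -Force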

As Mosser Lee said, try using the FileSystemWatcher class. Here is a working sample.
#Create the Copying.txt file
"test" | Out-File Copying.txt
#Create a FileSystemWatcher
$watcher = New-Object System.IO.FileSystemWatcher
#Give it the root path to monitor
$watcher.Path = $pwd
#The matching pattern
$watcher.Filter = "Copying.txt"
#Monitor subfolders or not
$watcher.IncludeSubdirectories = $true
#Set up the event and save a reference to it
$evt = Register-ObjectEvent $watcher Deleted FileDeleted -Action {
    #Stuff it into the global scope just so you can inspect it
    $global:SomeVar = $Event
    Write-Host ("{0} deleted at {1}" -f $Event.SourceEventArgs.FullPath, $Event.TimeGenerated)
}
Remove-Item Copying.txt
Remove-Item Copying.txt
This gives output like "H:\Copying.txt deleted at 6/12/2014 3:01:48 PM" when the file is deleted. It also sets a global variable $global:SomeVar if you wanted to look at the properties in depth.
$global:SomeVar
ComputerName     :
RunspaceId       : 1ab5089e-1734-4b92-8bab-9de4df78ada2
EventIdentifier  : 2
Sender           : System.IO.FileSystemWatcher
SourceEventArgs  : System.IO.FileSystemEventArgs
SourceArgs       : {System.IO.FileSystemWatcher, Copying.txt}
SourceIdentifier : FileDeleted
TimeGenerated    : 6/12/2014 3:01:48 PM
MessageData      :
Don't forget to unregister the event, as it will continue to run until you close the session even if you set $watcher to $null.
Unregister-Event -SourceIdentifier FileDeleted
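If you want the logon script to block until the file disappears rather than react in a callback, the same watcher can be paired with Wait-Event instead of -Action. A minimal sketch, assuming the paths from the question:
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'c:\folder'
$watcher.Filter = 'COPYING.txt'
$watcher.EnableRaisingEvents = $true
# No -Action block: the event is queued for Wait-Event instead
Register-ObjectEvent $watcher Deleted -SourceIdentifier SyncDone
# Only wait if the marker file still exists; the event queues even if the
# deletion happens between these two lines
if (Test-Path 'c:\folder\COPYING.txt') {
    Wait-Event -SourceIdentifier SyncDone | Out-Null
}
Unregister-Event -SourceIdentifier SyncDone
Copy-Item c:\folder\program $env:LOCALAPPDATA\ -Recurse -Force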

Did you try using the C# FileSystemWatcher class to monitor the target folder? When a change event is raised, check for the target file; once it no longer exists, the sync is finished and you can do the copy.
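For reference, a minimal C# sketch of that idea (path and file name taken from the question; the copy itself is left as a placeholder):
using System.IO;
using System.Threading;

class WaitForSync
{
    static void Main()
    {
        using (var watcher = new FileSystemWatcher(@"c:\folder", "COPYING.TXT"))
        using (var done = new ManualResetEventSlim(false))
        {
            watcher.Deleted += (s, e) => done.Set();
            watcher.EnableRaisingEvents = true;
            // Only block if the marker file still exists
            if (File.Exists(@"c:\folder\COPYING.TXT"))
                done.Wait();
            // ...copy the folder to %LOCALAPPDATA% here...
        }
    }
}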

Related

Laravel queue stops randomly without exception

I have a Laravel queue set up with a database connection. Note this problem also occurs on Redis, but I am currently using the database connection for the failed_jobs table, to help me check any errors that occur during the queue process.
The problem I have is that the queue stops working after a few jobs without any message showing why. But when I restart the command (php artisan queue:work) it picks up the remaining jobs and continues. (But stops again later.)
The job is configured with these values
public $tries = 1;
public $timeout = 10;
The job code is (not the original code):
public function handle()
{
    try {
        $file = //function to create file;
        $zip = new ZipArchive();
        $zip->open(//zip_path);
        $zip->addFile(//file_path, //file_name);
        $zip->close();
        #unlink(//remove file);
    } catch (\Exception $e) {
        Log::error($e);
    }
}
And the failed function is setup like this:
public function failed(\Exception $exception)
{
    Log::error($exception);
    $this->fail($exception);
    $this->delete();
}
But there is no failed_jobs row, and my log is empty.
Edit: I added simple info logs after every line of code, and every time I start the queue, it stops after the last line. So the code runs correctly; Laravel just doesn't start the next job after that.
So what you need to do here to solve the issue is the following:
Go to bootstrap/cache/ and remove all the .php files.
Go to the project root and run php artisan queue:restart.
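The "snippet" referred to next is the supervisor program entry for the worker; it isn't shown here, but a typical one for a Laravel worker looks roughly like this (the file path, project path, user, and process count are all placeholders; --tries matches the job config from the question):
; /etc/supervisor/conf.d/laravel-worker.conf
[program:laravel-worker]
command=php /var/www/yourapp/artisan queue:work database --sleep=3 --tries=1
autostart=true
autorestart=true
user=www-data
numprocs=1
redirect_stderr=true
stdout_logfile=/var/www/yourapp/storage/logs/worker.log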
Now, after adding the snippet, we need to trigger the following commands respectively:
sudo supervisorctl reread (to check the file content and make sure that the snippet is correctly set)
sudo supervisorctl update (release the config changes under the supervisor)
sudo supervisorctl restart all (re-trigger the queues so that the newly created queue gets initialized and starts picking up messages)
Did you try queue:listen?
php artisan queue:listen
Also, I guess you need Supervisor to keep your worker alive.

Convert dacpac into folder structure of database objects with powershell

I'm working on integrating SQL Server databases into our in-house version control/deployment utility, which is built with PowerShell and uses GitHub as a repository.
Using the excellent sqlpackage.exe utility, I have been able to add a process whereby a developer can extract their current changes into a dacpac and store it in GitHub, then do the reverse when they want to get the latest version. However, because the .dacpac is a binary file, it's not possible to see differences in git. I have mitigated this somewhat by unzipping the dacpac before storing it in source control, so the contained XML files are added instead. However, even though these files are text-based, they are still not easy to look through to find differences.
What I would like to do is convert the dacpac into a folder structure similar to what would be seen in SSMS (with all the database objects such as triggers, sprocs etc. in their respective folders), store that in GitHub, and then convert it back into a dacpac when a client checks out the code. However, there doesn't seem to be any function in sqlpackage.exe for this, and I can't find any documentation. Is there any command line tool I can use to do this through PowerShell?
Using the public APIs for DacFx you can load the dacpac, iterate over all objects, and script each one out. If you're willing to write your own code, you could write each one to its own file based on the object type. The basic process is covered in the model filtering samples in the DacExtensions GitHub project. Specifically you'll want to do something like the ModelFilterer code that loads a dacpac, queries all objects, and scripts them out - see the CreateFilteredModel method. I've put a sample that should mostly work below. Once you have this, you can easily do a compare on a per-object basis.
using (TSqlModel model = new TSqlModel(dacpacPath))
{
    IEnumerable<TSqlObject> allObjects = model.GetObjects(DacQueryScopes.All);
    foreach (TSqlObject tsqlObject in allObjects)
    {
        string script;
        if (tsqlObject.TryGetScript(out script))
        {
            // Some objects such as the DatabaseOptions can't be scripted out.
            // Write to disk by object type
            string objectTypeName = tsqlObject.ObjectType.Name;
            // pseudo-code as I didn't bother writing it.
            // basically just create the folder and write a file
            this.MkdirIfNotExists(objectTypeName);
            this.WriteToFile(objectTypeName, tsqlObject.Name + ".sql", script);
        }
    }
}
This can be converted into a PowerShell cmdlet fairly easily. The DacFx libraries are on NuGet at https://www.nuget.org/packages/Microsoft.SqlServer.DacFx.x64/ so you should be able to install them in PS and then use the code without too much trouble.
Based on the other post I was able to get a script working. The caveat is you'll have to experiment with the types till you get what you want... The way it is now, it tries to put the full http or https value in for some of the objects.
param($dacpacPath = 'c:\somepath\Debug', $dacpac = 'your.dacpac')
Add-Type -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll'
Add-Type -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.Extensions.dll'
cd $dacpacPath
$model = [Microsoft.SqlServer.Dac.Model.TSqlModel]::new((Get-Item ".\$dacpac").FullName)
$returnObjects = $model.GetObjects([Microsoft.SqlServer.Dac.Model.DacQueryScopes]::All)
$s = ''
foreach ($r in $returnObjects)
{
    if ($r.TryGetScript([ref]$s))
    {
        $objectTypeName = $r.ObjectType.Name
        $d = "c:\temp\db\$objectTypeName"
        if (!(Test-Path $d))
        {
            New-Item $d -ItemType Directory
        }
        $filename = "$d\$($r.Name.Parts).sql"
        if (!(Test-Path $filename))
        {
            New-Item $filename -ItemType File
        }
        $s | Out-File $filename -Force
        Write-Output $filename
    }
}
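Assuming the script above is saved as, say, Export-DacpacScripts.ps1 (the name is made up), running it against a build output folder looks like:
.\Export-DacpacScripts.ps1 -dacpacPath 'C:\projects\MyDb\bin\Debug' -dacpac 'MyDb.dacpac'
Each scripted object then lands under c:\temp\db\<ObjectType>\<Name>.sql, which is what makes per-object diffs in git practical.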

Task Scheduler is not working even though it's running

I have Windows 10 installed. I have created a .bat file to run my PHP scripts.
I tested the .bat file in cmd and it works fine (the PHP script deletes non-activated users).
bat file:
"C:\xampp\php\php.exe" -f C:\xampp\htdocs\cron_jobs\once_daily.php
php file:
<?php
include("../cust/mysql_connect.php");
$mysqli = db_connect();
$sql = "DELETE FROM customer WHERE Activate=0";
error_log("You messed up!", 3, "my-errors.log");
if (!$mysqli->query($sql))
{
    die('Error : ' . $mysqli->error);
}
?>
But when I use Task Scheduler, it just runs and does nothing; it did not delete anything. I have only recently started using Task Scheduler.
UPDATED
Create Task:
This is where I set name and highest privileges as admin.
Set Action:
I set the .bat file location and the Trigger tab is empty as I didn't need to use a trigger for this event.
Conditions left as default:
I didn't change this; it's just shown for reference.
Settings left as default:
I didn't change this; it's just shown for reference.
Shown as running:
An overview of the task, created and running.
PhpMyAdmin:
It did not delete the row with Activate = 0.
History of the task scheduler event:
It seems to run but does nothing in my case.
Any ideas?
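One hedged guess: Task Scheduler runs the .bat with a different working directory (typically C:\Windows\System32 unless the task's "Start in" field is set), so the relative include ../cust/mysql_connect.php resolves differently than it did when the .bat was tested from its own folder in cmd. A variant of the batch file that pins the working directory first would rule that out:
rem Pin the working directory so relative paths inside the PHP script resolve
cd /d C:\xampp\htdocs\cron_jobs
"C:\xampp\php\php.exe" -f once_daily.php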

Laravel 4: File Upload says success not showing in folder

I'm trying to get a simple image file to upload, then move it to a designated folder, and though I get no error messages, and everything says it works, it won't show up in any folder. The code in question is:
$destination = public_path().'/auction_img/';
$filename = $auction->id;
if (Input::hasFile('image')) {
    $file = Input::file('image');
    try {
        $uploadSuccess = $file->move($destination, $filename.$file->getClientOriginalExtension());
    } catch (Exception $e) {
        error_log('Exception: '.$e->getMessage());
    }
    if ($uploadSuccess) {
        error_log("YAY!");
    }
}
I'm not getting an exception, and I am getting the "YAY" in the error_log. But it won't show up whatever I do. Any ideas? I also checked my php.ini and it has a max size of 20M, but the file is only about 0.5 MB.
Looks like it's not going where you think it is going.
First of all, try{} catch(){} doesn't work on Laravel. You have to set a handler:
App::error(function(Exception $exception)
{
    Log::error($exception);
});
Then do a full log of your move to see if it is really going where you need it to:
$destination = public_path().'/auction_img/';
$filename = $auction->id;
if (Input::hasFile('image')) {
    $file = Input::file('image');
    $uploadSuccess = $file->move($destination, "$filename.".$file->getClientOriginalExtension());
    if ($uploadSuccess)
    {
        error_log("Destination: $destination");
        error_log("Filename: $filename");
        error_log("Extension: ".$file->getClientOriginalExtension());
        error_log("Original name: ".$file->getClientOriginalName());
        error_log("Real path: ".$file->getRealPath());
    }
    else
    {
        error_log("Error moving file: ".$file->getClientOriginalName());
    }
}
Also check if the original file is where it is supposed to be.
According to the comments, your move should be:
$uploadSuccess = $file->move($destination, "$filename.".$file->getClientOriginalExtension());
Check if $destination points to your public folder, it must look something like:
/var/www/site/public/auction_img/
Check if public_path() returns /var/www/site/public/ and if it doesn't you can use this in place of it:
$destination = app()->make('path.public').'/auction_img/';
$uploadSuccess in Laravel 4.0 at least, is not a boolean. It returns the file object after it has been moved (it returns $file, so it can be chained). This object cannot reliably be cast to a boolean (Symfony is raising an exception for me if I try to use it in an if-statement like you have).
If the file fails to be moved, then an exception will be raised. If an exception is raised, then you know the file has not been moved and remains where it is. As such, that leaves $file as a valid Symfony file object, and so your processing will continue with the file where it first landed.
So whether the file successfully moves or not, your "YAY!" is always in the execution path. You catch the exception if the move fails, log it, then fail to take any further action on the exception.
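To make that concrete, a sketch of the pattern this answer implies — treat an exception as the failure signal and the absence of one as success (names as in the question):
try {
    // move() throws on failure and returns the moved File object on success
    $moved = $file->move($destination, "$filename.".$file->getClientOriginalExtension());
    error_log('Moved to: '.$moved->getRealPath());
} catch (\Exception $e) {
    // The file was NOT moved; stop here instead of carrying on
    error_log('Move failed: '.$e->getMessage());
    return;
}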

How to avoid SSIS FTP task from failing when there are no files to download?

I'm using SQL Server 2005, and creating ftp tasks within SSIS.
Sometimes there will be files to ftp over, sometimes not. If there are no files, I don't want the task or the package to fail. I've changed the arrow going from the ftp task to the next task to "Completion", so the package runs through. I've changed the allowed number of errors to 4 (because there are 4 ftp tasks, and any of the 4 directories may or may not have files).
But, when I run the package from a job in agent, it marks the job as failing. Since this will be running every 15 minutes, I don't want a bunch of red x's in my job history, which will cause us to not see a problem when it really does occur.
How do I set the properties in the ftp task so that not finding files to ftp is not a failure? The operation I am using is "Send files".
Here is some more information: the files are on a server that I don't have any access to except through ftp. And I don't know the filenames ahead of time; the user can call them whatever they want. So I can't check for specific files, nor, I think, can I check at all, except through using the ftp connection and tasks based upon that connection. The files are on a remote server, and I want to copy them over to my server.
I can shell out to command-line ftp in a script task. Perhaps that is what I need to use instead of an ftp task. (I have changed to use the ftp command line, with a parameter file, called from a script task. It gives no errors when there are no files to get. I think this solution is going to work for me. I'm creating the parameter file dynamically, which means I don't need to have connection information in a plain text file; rather, it can be stored in my configuration file, which is in a more secure location.)
I understand that you have found an answer to your question. This is for other users who might stumble upon this question. Here is one possible way of achieving this. Script Task can be used to find the list of files present in an FTP folder path for a given pattern (say *.txt). Below example shows how this can be done.
Step-by-step process:
On the SSIS package, create an FTP Connection named FTP and also create 5 variables as shown in screenshot #1. Variable RemotePath contains the FTP folder path; LocalPath contains the folder the files will be downloaded to; FilePattern contains the file pattern used to find the list of files to download from the FTP server; FileName will be populated by the Foreach Loop container, but to avoid an FTP Task design-time error, it can be populated with / or the DelayValidation property on the FTP Task can be set to True.
On the SSIS package, place a Script Task, Foreach Loop container and FTP Task within the Foreach Loop container as shown in screenshots #2.
Replace the Main() method within the Script Task with the code under the Script Task Code section. The Script Task will populate the variable ListOfFiles with the collection of files matching a given pattern. This example will first use the pattern *.txt, which yields no results, and then later the pattern *.xls, which will match a few files on the FTP server.
Configure the Foreach Loop container as shown in screenshots #3 and #4. This task will loop through the variable ListOfFiles. If there are no files, the FTP Task inside the loop container will not execute. If there are files, the FTP Task inside the loop container will execute once for each file found on the FTP server.
Configure the FTP Task as shown in screenshots #5 and #6.
Screenshot #7 shows sample package execution when no matching files are found for the pattern *.txt.
Screenshot #8 shows the contents of the folder C:\temp\ before execution of the package.
Screenshot #9 shows sample package execution when matching files are found for the pattern *.xls.
Screenshot #10 shows the contents of the FTP remote path /Practice/Directory_New.
Screenshot #11 shows the contents of the folder C:\temp\ after execution of the package.
Screenshot #12 shows the package failure when provided with incorrect Remote path.
Screenshot #13 shows the error message related to the package failure.
Hope that helps.
Script Task Code:
C# code that can be used in SSIS 2008 and above.
Include the using statement using System.Text.RegularExpressions;
public void Main()
{
    Variables varCollection = null;
    ConnectionManager ftpManager = null;
    FtpClientConnection ftpConnection = null;
    string[] fileNames = null;
    string[] folderNames = null;
    System.Collections.ArrayList listOfFiles = null;
    string remotePath = string.Empty;
    string filePattern = string.Empty;
    Regex regexp;
    int counter;

    Dts.VariableDispenser.LockForRead("User::RemotePath");
    Dts.VariableDispenser.LockForRead("User::FilePattern");
    Dts.VariableDispenser.LockForWrite("User::ListOfFiles");
    Dts.VariableDispenser.GetVariables(ref varCollection);

    try
    {
        remotePath = varCollection["User::RemotePath"].Value.ToString();
        filePattern = varCollection["User::FilePattern"].Value.ToString();
        ftpManager = Dts.Connections["FTP"];
        ftpConnection = new FtpClientConnection(ftpManager.AcquireConnection(null));
        ftpConnection.Connect();
        ftpConnection.SetWorkingDirectory(remotePath);
        ftpConnection.GetListing(out folderNames, out fileNames);
        ftpConnection.Close();
        listOfFiles = new System.Collections.ArrayList();
        if (fileNames != null)
        {
            regexp = new Regex("^" + filePattern + "$");
            for (counter = 0; counter <= fileNames.GetUpperBound(0); counter++)
            {
                if (regexp.IsMatch(fileNames[counter]))
                {
                    listOfFiles.Add(remotePath + fileNames[counter]);
                }
            }
        }
        varCollection["User::ListOfFiles"].Value = listOfFiles;
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        Dts.Events.FireError(-1, string.Empty, ex.ToString(), string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
    finally
    {
        varCollection.Unlock();
        ftpConnection = null;
        ftpManager = null;
    }
}
VB code that can be used in SSIS 2005 and above.
Include the Imports statement Imports System.Text.RegularExpressions
Public Sub Main()
    Dim varCollection As Variables = Nothing
    Dim ftpManager As ConnectionManager = Nothing
    Dim ftpConnection As FtpClientConnection = Nothing
    Dim fileNames() As String = Nothing
    Dim folderNames() As String = Nothing
    Dim listOfFiles As Collections.ArrayList
    Dim remotePath As String = String.Empty
    Dim filePattern As String = String.Empty
    Dim regexp As Regex
    Dim counter As Integer

    Dts.VariableDispenser.LockForRead("User::RemotePath")
    Dts.VariableDispenser.LockForRead("User::FilePattern")
    Dts.VariableDispenser.LockForWrite("User::ListOfFiles")
    Dts.VariableDispenser.GetVariables(varCollection)

    Try
        remotePath = varCollection("User::RemotePath").Value.ToString()
        filePattern = varCollection("User::FilePattern").Value.ToString()
        ftpManager = Dts.Connections("FTP")
        ftpConnection = New FtpClientConnection(ftpManager.AcquireConnection(Nothing))
        ftpConnection.Connect()
        ftpConnection.SetWorkingDirectory(remotePath)
        ftpConnection.GetListing(folderNames, fileNames)
        ftpConnection.Close()
        listOfFiles = New Collections.ArrayList()
        If fileNames IsNot Nothing Then
            regexp = New Regex("^" & filePattern & "$")
            For counter = 0 To fileNames.GetUpperBound(0)
                If regexp.IsMatch(fileNames(counter)) Then
                    listOfFiles.Add(remotePath & fileNames(counter))
                End If
            Next counter
        End If
        varCollection("User::ListOfFiles").Value = listOfFiles
        Dts.TaskResult = ScriptResults.Success
    Catch ex As Exception
        Dts.Events.FireError(-1, String.Empty, ex.ToString(), String.Empty, 0)
        Dts.TaskResult = ScriptResults.Failure
    Finally
        varCollection.Unlock()
        ftpConnection = Nothing
        ftpManager = Nothing
    End Try
End Sub
(Screenshots #1 through #13 referenced above are not reproduced here.)
Check this link, which describes how to gracefully handle task errors in an SSIS package.
I had almost the same problem, but with retrieving files. I wanted the package NOT to fail when no files were found on the FTP server. The above link stops the error bubbling up and causing the package to fail; something you would have thought FailPackageOnError=false should have done? :-S
Hope this solves it for you too!
I just had this issue, and after reading some of the replies here, nothing really sorted out my problem, and the solutions here seem insane in terms of complexity.
My FTP task was failing since I did not allow overwriting files. Let's say the job was kicked off twice in a row: the first pass would be fine, because some files are transferred over, but the second would fail because a local file already exists.
My solution was simple:
Right click task - Properties
Set ForceExecutionResult = "Success"
(I can't accept my own answer, but this was the solution that worked for me.)
It may not be the best solution, but this works.
I use a script task, and have a bunch of variables for the ftp connection information, and source and destination directories. (Because, we'll be changing the server this is run on, and it will be easier to change in a config package.)
I create a text file on the fly, and write the ftp commands to it:
Dim ftpStream As StreamWriter = ftpFile.CreateText()
ftpStream.WriteLine(ftpUser)
ftpStream.WriteLine(ftpPassword)
ftpStream.WriteLine("prompt off")
ftpStream.WriteLine("binary")
ftpStream.WriteLine("cd " & ftpDestDir)
ftpStream.WriteLine("mput " & ftpSourceDir)
ftpStream.WriteLine("quit 130")
ftpStream.Close()
Then, after giving it enough time to really close, I start a process to do the ftp command:
ftpParameters = "-s:" & ftpParameterLoc & ftpParameterFile & " " & ftpServer
proc = System.Diagnostics.Process.Start("ftp", ftpParameters)
Then, after giving it some more time for the ftp process to run, I delete the temporary ftp file (that has connection information in it!).
If files don't exist in the source directory (the variable has the \\drive\dir\*.* mapping), then there is no error. If some other error happens, the task still fails, as it should.
I'm new to SSIS, and this may be a kludge. But it works for now. I guess I asked for the best way, and I'll certainly not claim that this is it.
As I pointed out, I have no way of knowing what the files are named, or even if there are any files there at all. If they are there, I want to get them.
I don't have a packaged answer for you, but since no one else has posted anything yet...
You should be able to set a variable in an ActiveX script task and then use that to decide whether or not the FTP task should run. There is an example here that works with local paths. Hopefully you can adapt the concept (or if possible, map the FTP drive and do it that way).
1) Set the FTP Task property ForceExecutionResult = Success
2) Add this code to FTP Task OnError event handler.
public void Main()
{
    int errorCode = (int)Dts.Variables["System::ErrorCode"].Value;
    if (errorCode == -1073573501)
    {
        // "No files found" - swallow the error so it doesn't fail the package
        Dts.Variables["System::Propagate"].Value = false;
    }
    else
    {
        // Any other error should still bubble up
        Dts.Variables["System::Propagate"].Value = true;
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
Put it in a ForEach container, which iterates over the files to upload. No files, no FTP, no failure.
You can redirect on failure, to another task that does nothing, ie a script that just returns true.
To do this, add the new script task, highlight your FTP task, a second green connector will appear, drag this to the script task, and then double click it. Select Failure on the Value drop down. Obviously, you'll then need to handle real failures in this script task to still display right in the Job history.
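The "script that just returns true" is one line of body; a sketch of that do-nothing Script Task in C#:
public void Main()
{
    // Swallow the failed FTP branch: report success so the package carries on
    Dts.TaskResult = (int)ScriptResults.Success;
}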
Aha, OK - Thanks for clarification. As the FTP task cannot return a folder listing it will not be possible to use the ForEach as I initially said - That only works if you're uploading X amount of files to a remote source.
To download X amount of files, you can go two ways, either you can do it entirely in .Net in a script task, or you can populate an ArrayList with the file names from within a .Net script task, then ForEach over the ArrayList, passing the file name to a variable and downloading that variable name in a standard FTP task.
Code example to suit: http://forums.microsoft.com/msdn/ShowPost.aspx?PostID=2472491&SiteID=1
So, in the above, you'd get the FileNames() and populate the ArrayList from that, then assign the ArrayList to an Object type variable in Dts.Variables, then ForEach over that Object (ArrayList) variable using code something like: http://www.sqlservercentral.com/articles/SSIS/64014/
You can use the free SSIS FTP Task++ from eaSkills. It doesn't throw an error if the file or files don't exist, it supports wildcards, and it gives you the option to download and delete if you need to do so.
Here's the link to the feature page:
http://www.easkills.com/ssis/ftptask
This is another solution that is working for me, using built-in stuff and so without manually re-writing the FTP logic:
1) Create a variable in your package called FTP_Error
2) Click your FTP Task, then click "Event Handlers" tab
3) Click within the page to create an event handler for "FTP Task/OnError" - this will fire whenever there is trouble with the FTP
4) From the toolbox, drag in a Script Task item, and double-click to open that up
5) In the first pop-up, ReadOnlyVariables - add System::ErrorCode, System::ErrorDescription
6) In the first pop-up, ReadWriteVariables - add your User::FTP_Error variable
7) Edit Script
8) In the script set your FTP_Error variable to hold the ReadOnlyVariables we had above:
Dts.Variables["FTP_Error"].Value = "ErrorCode:" + Dts.Variables["ErrorCode"].Value.ToString() + ", ErrorDescription=" + Dts.Variables["ErrorDescription"].Value.ToString();
9) Save and close script
10) Hit "OK" to script task
11) Go back to "Control Flow" tab
12) From the FTP task, OnError go to a new Script task, and edit that
13) ReadOnlyVariables: User::FTP_Error from before
14) Now, when there are no files found on the FTP, the error code is -1073573501
(you can find the error code reference list here: http://msdn.microsoft.com/en-us/library/ms345164.aspx)
15) In your script, put in the logic to do what you want - if you find a "no files found" code, then maybe you say task successful. If not, then task failed. And your normal flow can handle this as you wish:
if (Dts.Variables["FTP_Error"].Value.ToString().Contains("-1073573501"))
{
    // file not found - not a problem
    Dts.TaskResult = (int)ScriptResults.Success;
}
else
{
    // some other error - raise alarm!
    Dts.TaskResult = (int)ScriptResults.Failure;
}
And from there your Succeeded/Failed flow will do what you want to do with it.
An alternative is to use this FTP File Enumerator
