A ForEach loop performs a data copy from a SQL database to a .txt file, then writes each file to a folder on the server via SFTP. This runs 10 times, and the files are all named differently. A connection to the server is opened and closed for each file. Is there any way to open the connection once, copy all 10 (or more) files, and close it only at the end?
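SSIS has no built-in SFTP task, so this is usually handled in a Script Task or a third-party component. As a sketch of the idea only: the function below takes an already-open client object exposing a `put(local, remote)` method (the shape of paramiko's `SFTPClient`, used here as an assumption), so the caller connects once, uploads every file, and disconnects once at the end. The host, credentials, and paths in the commented usage are placeholders.

```python
def upload_batch(sftp, local_files, remote_dir):
    """Upload many files over one already-open SFTP session.

    `sftp` is any object exposing put(localpath, remotepath),
    e.g. a paramiko SFTPClient. The connection is opened by the
    caller once and closed once, not per file.
    """
    uploaded = []
    for local in local_files:
        # Keep only the file name when building the remote path.
        remote = remote_dir + "/" + local.rsplit("/", 1)[-1]
        sftp.put(local, remote)  # reuses the same session each time
        uploaded.append(remote)
    return uploaded

# Hypothetical usage with paramiko (host/credentials are placeholders):
# import paramiko
# client = paramiko.SSHClient()
# client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# client.connect("sftp.example.com", username="user", password="secret")
# sftp = client.open_sftp()
# upload_batch(sftp, ["out/file1.txt", "out/file2.txt"], "/incoming")
# sftp.close()
# client.close()
```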
First off, I searched for this topic in previous questions and on the web, but couldn't find anything that gives or even suggests a solution.
I'm NOT new to Access, and this really drives me crazy.
I have an Access 2010 frontend accdb file that is used as follows:
The FE Analyzer.accdb is located on Company Server#1 in a shared folder.
On Company Server#2, a .bat file was created and scheduled to run every night; the .bat launches the Access FE file with the Access Runtime (no full MS Access on the server) using the usual command:
"C:\Program Files\Microsoft Office\Office14\MSACCESS.EXE" "\\192.168.1.5\SharedFolder\Analyzer.accdb"
The "C:" drive above belongs to Server#2; the IP address 192.168.1.5 is Server#1.
When the Access file runs, a startup form opens. In the form's Load event, a subroutine is called that does a bunch of calculations and writes to SQL Server. After the main subroutine returns, the Access application is closed, like this:
Private Sub Form_Load()
    Call Init
    Application.Quit
End Sub
The schedule works perfectly every night: the accdb does what it's expected to do, and no MSACCESS.EXE remains in Task Manager. So far so good, BUT the *.laccdb lock file remains in the shared folder!
When I access the shared folder from my client PC, I can delete the lock file manually without any problem.
As a workaround, I added a first line to the .BAT file that removes the lock file before running the .accdb again.
BUT now it's annoying again: I enabled Compact on Close, and it doesn't work, presumably because the accdb is not being closed properly (the leftover lock file being some evidence of that).
If I run the Access frontend file in the shared folder from my client PC (just double-click it and let the startup form run the main subroutine), it works and no lock file remains.
If I run the .BAT file that is on Server#2 from my client PC, everything also runs and no lock file remains.
Any clue?
I have an FTP server where people put some txt files that I need to process. I have a Foreach Loop with an FTP Task, and with that I receive the files into a local directory from the FTP server. Next, I need to process the files from that directory in the Foreach Loop, and my question is: can I somehow protect my loop from iterating over files from past runs?
Example: someone put files on the FTP server yesterday; I processed them, and they are in my local directory. Today someone puts a new file on the server, the FTP Task downloads it to the local directory, and my loop iterates over all the files, yesterday's and today's. How can I avoid this without deleting them manually?
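One common pattern (a sketch of the idea, not an SSIS-specific feature) is to keep a small "already processed" record next to the download folder and skip anything listed in it; in SSIS you would do the equivalent in a Script Task. The record-file name and function below are made up for illustration.

```python
import os

def new_files(directory, record_path):
    """Return files in `directory` not yet listed in the record file,
    then append them to the record so the next run skips them."""
    try:
        with open(record_path) as fh:
            seen = set(line.strip() for line in fh if line.strip())
    except FileNotFoundError:
        seen = set()  # first run: nothing processed yet
    current = sorted(
        name for name in os.listdir(directory)
        if os.path.isfile(os.path.join(directory, name))
    )
    fresh = [name for name in current if name not in seen]
    with open(record_path, "a") as fh:
        for name in fresh:
            fh.write(name + "\n")
    return fresh
```

Keep the record file outside the directory being scanned, otherwise it would be picked up as a file to process.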
In the BW Designer, the File Poller can look for files on the local server where the Designer is installed.
But if I want the File Poller to look at another server instead of the local one, I don't see any option to do that.
Is there any way to make the File Poller look at another server and poll files from there, instead of the server where the Designer is installed?
Thanks
The File Poller can only access files and directories visible to the system the engine is running on, so you need to mount a network drive to poll files sitting on a remote server.
Another option would be to build a custom polling mechanism using the FTP palette, with a Timer instead of a File Poller. On the first execution, the process would list the files in the remote folder using the FTP Dir activity, then store this list (file names, last modification dates, etc.) in a Shared Variable.
Every time the process is triggered, it would run FTP Dir again and compare the current list of files against the previous one to detect any changes (new files, modified files, etc.), then update the Shared Variable to keep the latest image of the remote folder. It could then run FTP Get to retrieve any new or modified file.
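The compare-two-listings step can be sketched as a pure function. Here each snapshot is reduced to a dict of file name to last-modified timestamp; the function and variable names are assumptions for illustration, not TIBCO API names.

```python
def diff_listing(previous, current):
    """Compare two {filename: mtime} snapshots of a remote folder.

    Returns (new, modified): files absent from the previous snapshot,
    and files whose modification time has changed since then.
    """
    new = sorted(name for name in current if name not in previous)
    modified = sorted(
        name for name in current
        if name in previous and current[name] != previous[name]
    )
    return new, modified
```

After each poll, the process would store `current` back into the Shared Variable so it becomes `previous` on the next timer tick.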
I have an SSIS package (using 2008 R2) whose output I would like sent to my FTP server. In the Flat File Connection Manager I've put the address \\stp-ftp\RMB in the connection string as the location where the file should be placed, but each time I close the connection manager it reverts back to my original location. Is there a way to correct this? The share I'm attempting to send the file to hasn't been mapped on the SQL Server where I'm working; is that what needs to happen?
Thanks -
It turned out I was using an expression to append the date to the end of the file name, and that expression was overwriting my connection string.
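For anyone hitting the same symptom: a property expression on the connection manager's ConnectionString is evaluated at validation/run time and silently replaces whatever path is typed into the dialog, so the fixed folder part must live in the expression itself. A hedged example of such an expression (the variable name and date format are assumptions) looks like:

@[User::OutputFolder] + "\\Export_"
  + (DT_WSTR, 4) YEAR(GETDATE())
  + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
  + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
  + ".txt"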
I have been running a batch file to copy files from one location (my local machine) to multiple servers. Recently we moved to a server for processing files, and now the same batch file does not work when I copy from one server to two or more other servers.
Do I have to change any statements in the batch file pertaining to the servers?
Here is my batch file:
@echo off
echo copying files to multiple servers
copy *.eps* \\server1\adman\in\displ
copy *.eps* \\server2\BasketsIn\TheHindu\AdImport\Ads_SAP
This will copy a static filemask of files from one server to two other servers.
You need read/write permissions over the LAN.
@echo off
echo copying files to multiple servers
set "source_server=\\server0\c\share"
copy "%source_server%\*.eps*" "\\server1\adman\in\displ"
copy "%source_server%\*.eps*" "\\server2\BasketsIn\TheHindu\AdImport\Ads_SAP"