SSIS flat file to FTP server

I have an SSIS package (2008 R2) whose output file I would like sent to my FTP server. In the Flat File Connection Manager, under the connection string, I've put the address \\stp-ftp\RMB as the location where the file should be placed, but each time I close the connection manager it reverts to my original location. Is there a way to correct this? The share that I'm attempting to send the file to hasn't been mapped on the SQL server where I'm working; is that what needs to happen?
Thanks -

I was using an expression to append the date to the end of the file name! It seems that the expression was overwriting my connection string.
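For anyone hitting the same thing: when the ConnectionString property carries an expression, the expression result replaces whatever you type into the connection manager every time the package validates, so the UNC path has to live inside the expression itself. A minimal sketch of such an expression, assuming a date-stamped file name (the Output_ prefix and .txt extension are placeholders; note that backslashes must be escaped inside SSIS string literals):

    "\\\\stp-ftp\\RMB\\Output_" + (DT_WSTR, 4)YEAR(GETDATE())
        + RIGHT("0" + (DT_WSTR, 2)MONTH(GETDATE()), 2)
        + RIGHT("0" + (DT_WSTR, 2)DAY(GETDATE()), 2) + ".txt"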

Related

SSIS Package stopped working with Error code: 0x80004005

I got this SSIS package working this past December. It only runs on Friday mornings. Last Friday it failed with this error message:
Package:Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft Access Database Engine" Hresult: 0x80004005 Description: "The Microsoft Access database engine cannot open or write to the file '\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx'. It is already opened exclusively by another user, or you need permission to view and write its data.".
I've checked out a couple of other questions that were similar, but they did not answer my question. I have checked to make sure that no one has that file open.
The file in question is a template that is copied over and then populated, so no one should have it open to begin with.
I've tried changing RetainSameConnection to True, but it made no difference. I have run it in debug mode, and it works fine.
Anyone got any ideas how to clear this up so it runs automatically again?
UPDATE
After some more testing, it appears that the file is getting the data but isn't being copied. Here is what I have set up:
- A File System Task copies a template from my template folder to my Export folder.
- Then a Data Flow Task:
  - begins with an OLE DB Source that runs a SQL script to pull the data;
  - runs a Data Conversion to update a couple of fields to the correct format;
  - ends in an Excel Destination used for the output (this is the template that was copied to the Export folder);
  - there is also a Flat File Destination in case there are any errors.
- Then, back in the Control Flow, another File System Task moves the file from the Export folder to its final destination on a shared drive.
When I run this from VS 2015 it works fine and creates the file. When I run it from the SQL Agent job, it fails with the above error message. The only thing I can think of is that the Excel Destination in the Data Flow Task isn't releasing the file before the final File System Task tries to copy it. But if that were the case, why did it only start happening now?
I think there is a problem in the file path; as you can see, the path starts with a single backslash:
\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx
Maybe it should start with a double backslash (\\) if it is on the network:
\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx
Or the drive letter is missing, leaving an incomplete path.
Check the way the file path is provided to the connection manager (if you are using expressions).
Also, based on this Microsoft article, there are two possible causes:
You must have permission to read data in the specified file in order to view its data. To change your permission assignments, see your system administrator or the table or query's creator.
You tried to open the indicated file exclusively, but another user already has the file open.
Looks like an access issue. Ensure that the Agent service account has full rights to the network share path. Maybe you can try with your own credentials on the Agent server.
Also, ensure that your Excel destination connection string supports .xlsx:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\path\xxx.xlsb;Extended Properties="Excel 12.0;HDR=YES";
Changing "Excel 12.0" to "Excel 12.0 Xml" will tell the provider to output in .xlsx format instead.
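For example, the same string adjusted for the .xlsx target named in the error above would look like this (a sketch; verify the path against your own environment):

    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES";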

SSIS File System task didn't copy files from the source server location when scheduled

I'm new to SSIS and stuck with a problem, and I hope someone here has already run into something like this.
Task:
Copying files from a remote server to a local machine folder using File System task and For each loop container.
Problem:
The job executes (i.e. files are copied successfully) when I run it from the SSIS designer, but after deploying the project to the SQL Server instance it doesn't copy any files; in fact, the target folder is completely empty.
I don't understand this strange behavior. Any inputs would be of great help!
Regards-
Santosh G.
- The For Each Loop will not error out if it doesn't find any files.
- The SQL Agent account may not have access to read the directory contents.
- Is your path a variable? Is it being set by a config or a /SET statement?
- Can you log the path before starting the loop?
- Can you copy in a dummy file and see whether SSIS sees it?
- How are you running the job? cmd_exec() can give spurious results with file I/O tasks.
The issue was related to the user authorization of the SQL Server Agent service.
When the job is executed from SQL Server it runs under the Agent service, and that service needs to run as a user who has access rights to the desired file path.
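If changing the Agent service account itself isn't an option, you can run just this job step under an account that does have rights to the share by using a credential and a proxy. A minimal T-SQL sketch; the credential, proxy, account, and password are all placeholders:

    USE master;
    -- Credential wrapping a Windows account that can reach the file path
    CREATE CREDENTIAL FileShareCred
        WITH IDENTITY = 'DOMAIN\svc_ssis', SECRET = 'StrongPasswordHere';

    USE msdb;
    -- Proxy that job steps can run under
    EXEC dbo.sp_add_proxy
        @proxy_name = N'SSISFileProxy',
        @credential_name = N'FileShareCred',
        @enabled = 1;

    -- Allow the proxy to run SSIS package steps (subsystem 11)
    EXEC dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'SSISFileProxy',
        @subsystem_id = 11;

In the job step properties, "Run as" can then be pointed at the proxy instead of the Agent service account.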

SSIS is overwriting ExcelFilePath expression with user variable on first pass through For Each Loop

I have searched everywhere for an answer to this so I am hoping someone out there can help.
I am trying to follow the steps laid out for importing data from multiple Excel files in SSIS. I have several Excel 2010 files in a directory and am trying to move them into a SQL Server 2008 R2 database. I have followed all of the directions for doing this with a For Each (File) Loop. I have set the collection information and am using the User::Filename variable to pull back the fully qualified filename. The problem is that when I open the properties of my Excel Connection Manager, click on Expressions, and set the ExcelFilePath property to @[User::Filename], SSIS immediately overwrites the property with the new value, which is blank (the first time). This then causes my package to fail during execution because there is no filename to go to in the connection.
I have set the DelayValidation property to True on the package, the data flow, and the Excel connection, and this does not fix the problem. I have also tried putting a fully qualified filename into the User::Filename variable during initialization; this does let me process a few files, but then gives me a locking error when SSIS tries to re-read the file I supplied during initialization.
I have not found anything like this on the net yet... Hopefully someone out there has seen this.
I would split this SSIS package into two. The master package would go as deep as the For Each File Loop, but would not include the Excel Connection Manager. The sub-package would open and process each Excel file. It would receive the Filename variable value via a Parent Package Variable configuration or similar.
This seems to have the effect of "closing" the Excel connection after each file is finished.
I also prefer the OLE DB driver from the Access Database Engine over the native SSIS Excel connection.
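For reference, an ACE OLE DB connection string of the kind meant here might look like this (the file path is a placeholder; "Excel 12.0 Xml" is the flavour for .xlsx files, as noted in the answer above):

    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Imports\Book1.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES";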

How to copy file to remote server in LotusScript

I want to create a Lotus Notes agent that will run on the server to generate a text file. Once the file is created, I need to send it to a remote server.
What is the best/easiest way to send the file to a remote server?
Thanks
If your "remote" server is on a local Windows network, you can simply copy the file from the server file system to a UNC path (\\myserver\folder\file.txt) using the FileCopy statement. If not, you may want to look at using a Java agent, which would make more file-transfer protocols easily accessible.
In either case, be sure to understand the security restrictions on Notes agents - for your agent to run on the server and create a file on the server's file system, the agent will need to be flagged with a runtime security level of 2 or 3, and signed by an appropriately authorized ID.
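A minimal LotusScript sketch of the copy itself, assuming the security flags above are in place (both paths are placeholders):

    ' Copy a file generated on the server's file system to a UNC path.
    ' Requires the agent to run at security level 2 or 3 as noted above.
    Dim source As String
    Dim target As String
    source = "C:\exports\report.txt"
    target = "\\myserver\folder\report.txt"
    FileCopy source, target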
Sending or copying files using OS-level commands to a remote server requires that the destination server also be mapped as a drive on your source server. As Ed rightly said, security needs to allow you to save files onto the server before you can try to copy them.
You can generate the file locally on the server and then use FTP commands in a script to send it. Or, if you're a Java guru, you can try using Java FTP to send the file as well. I had some trouble with it, but it should be possible provided an FTP account is set up on the destination server. FTP-related material by a well-known Notes developer can be found here and here.
I have done it using a script, and it's clumsy but effective at simply pushing files around. Ideally, if the server at the other end is a Domino server as well, you could attach the file to an email and send it to a mail-in account on the destination server. I have done that before, and it's great, as you can hand the whole problem of getting files off to the SMTP process.

SQL Server Bulk Insert fails (Network related)

I'm fairly new to SQL Server.
I'm trying to bulk insert into a table, using the command in SQL Server Management Studio (2005):
BULK INSERT Table1 FROM 'c:\text.txt' WITH (FIELDTERMINATOR = '|')
I get the error:
Msg 4860, Level 16, State 1, Line 1
Cannot bulk load. The file "c:\text.txt" does not exist.
I'm positive the file actually exists.
I get the feeling that it is looking for the file on the local hard drive of wherever the server is. Is that the case? If so, how do you generally solve this problem? (Note that I've tried specifying the network address of my PC when entering the location of the text file, but I get a permission error. Also, I know in advance that my company doesn't allow files to be placed on a server.)
SQL Server does not have an SQL statement that reads data from the client end (as the other posters have pointed out). Other RDBMS products do implement this (e.g. the Postgres COPY statement lets you specify a file on the server or a file on the client that is read by the db connectivity library on the client side).
You can move data from a file on the client to a table on SQL Server using the bcp command-line program.
bcp lets you copy data from a local file to a table on the server, or from a table (or select query) on the server into a local file. For example:
bcp dbname.dbo.tablename in c:\temp.txt -S servername -T -c
will copy a tab-delimited file (temp.txt) into the specified table (assuming the file contains the right number of columns).
I am not sure if this helps, but it is the only way to move data from a client file to a server table without giving the server some sort of access across the network to the data file on the client.
I'd agree that it's a problem with the file being on your C drive, and not the server's drive.
If it's a permissions issue, have you tried creating a file share on your workstation that the server does have permission to read from? Something like \\YourWorkstation\SQLFile, and then granting Everyone (or Guest, depending on how your network permissions are set up) read access on it?
If you can't create the share on your laptop, or you can't grant rights to it for some reason, is there a file share somewhere in the office that you do have rights to, and that SQL can also read from? Maybe a NAS or a "Common" network folder?
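As a sketch, a share like that can be created from an elevated command prompt on the workstation (share name, path, and grantee are placeholders, and the /GRANT switch requires a reasonably recent Windows version):

    net share SQLFile=C:\SQLFile /GRANT:Everyone,READ

The server could then read the file as \\YourWorkstation\SQLFile\text.txt in the BULK INSERT statement.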
Have you created a share drive on your machine that the server can see? If so then you just need to refer to the path including your machine name instead of C:
Yes, it will look for the file on the SQL server itself.
If you can map a network drive to the C drive of your SQL server, then you can just copy the file over before running the bulk insert.
If you absolutely can't get any access to the server's file system, then you can look at doing something like this:
- Write a program that reads your text file and inserts the contents into a single record in a temporary table that has a Text field, perhaps using a stored procedure.
- Have the program execute the bcp command to export the data from the temporary table into a text file on the SQL server's local file system, in a folder that the account under which the SQL service runs has permission to write to.
- Have the program run a bulk insert command specifying the path to the text file on the server.
- Delete the text file and the temporary table.
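A minimal T-SQL sketch of that round trip; the staging table, paths, and server name are all placeholders:

    -- 1. Staging table that holds the raw file contents in a single row
    --    (the client program inserts the text file's contents here).
    CREATE TABLE dbo.StagingFile (Contents varchar(max));

    -- 2. Export the staged text to the server's own disk, e.g. by running
    --    bcp somewhere that can write to that folder:
    --    bcp "SELECT Contents FROM mydb.dbo.StagingFile" queryout c:\temp\text.txt -S servername -T -c

    -- 3. The file is now local to the server, so BULK INSERT can read it:
    BULK INSERT Table1 FROM 'c:\temp\text.txt' WITH (FIELDTERMINATOR = '|');

    -- 4. Clean up.
    DROP TABLE dbo.StagingFile;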
