Download files from a remote server using a Pentaho transformation

Is there any way to retrieve a file from a server using the FTP protocol without using the "Get a file with FTP" job step? I can only use a transformation in Pentaho. Any ideas?

You can use the REST Client step to do that in a transformation. I am doing the same thing in some of my transformations. You can GET the file content into the stream using REST Client and use a Text File Output step to write the content to a file. The file format is your choice.
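For reference, the GET-and-save logic that the REST Client and Text File Output steps perform is small; here is a sketch of the same data flow in plain Python using only the standard library (the URL and destination path below are placeholders, not anything from the question):

```python
import urllib.request

def fetch_to_file(url: str, dest_path: str) -> int:
    """GET the resource at `url` and write its bytes to `dest_path`.

    Returns the number of bytes written. This mirrors what the
    REST Client -> Text File Output pair does inside a transformation.
    """
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    with open(dest_path, "wb") as fh:
        fh.write(data)
    return len(data)
```

Inside Kettle the REST Client step does the GET (the URL typically comes from a field on the incoming stream) and the Text File Output step does the write; the sketch just makes the data flow explicit.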

Even if you can only use a transformation, a transformation can call a job using the "Job Executor" step. You can do the FTP in the job you are calling.
A note of caution: you must explicitly check the output of the Job Executor step for errors.

Related

Sending generated XML file via SFTP

I've generated an XML file using T-SQL and I'm wondering if there's any way to send it to a remote server via SFTP. Can it be done without external software? What is the best approach to this problem?
Any tips will be greatly appreciated
edit:
I forgot to mention that I need a fresh copy of the file on the server every day, so I need to generate a new file each day and then replace the old file on the remote server.
I've tried setting up a job that runs an SSIS package, and it partly does the job, but the standard package doesn't support SFTP. :(
Managed to solve it using WinSCP and a command-line task in SSIS, as suggested by user Panagiotis Kanavos.
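For anyone automating the same daily replace: the WinSCP command-line approach boils down to generating a dated file name and a small batch script for the SSIS command-line task to run. A minimal sketch in Python, with hypothetical host, credentials, and paths (note that `-hostkey=*` disables host-key checking and is for illustration only, not production):

```python
from datetime import date

def winscp_script(host: str, user: str, password: str,
                  local_path: str, remote_dir: str) -> str:
    """Build a WinSCP batch script that uploads one file.

    `put` to an existing remote name overwrites the old copy,
    which gives the daily-refresh behaviour asked about above.
    """
    return "\n".join([
        # -hostkey=* accepts any host key: insecure, illustration only
        f"open sftp://{user}:{password}@{host}/ -hostkey=*",
        f"put -transfer=binary {local_path} {remote_dir}/",
        "exit",
    ])

def daily_name(prefix: str, day: date) -> str:
    """Name for today's export, e.g. report_2024-05-01.xml."""
    return f"{prefix}_{day.isoformat()}.xml"
```

The script text would be written to a file and passed to `winscp.com /script=...` from the SSIS Execute Process task.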

Informatica Cloud - Picking up files from SFTP and inserting records in Salesforce

Our objective is as follows:
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the remote SFTP location (the one which will contain "Test.csv"):
Step 1 (screenshot omitted)
Then I started to build a Data Synchronization Task:
Step 2 (screenshot omitted)
What we want is for Informatica Cloud to connect to the secure FTP location and load the contents of a .csv at that location into our Salesforce object.
But as you can see in Step 2, it does not allow me to choose a .csv from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (my machine, where the secure agent is running), and this is not what I want.
What should I do in this scenario ?
Can someone help ?
You can write a UNIX script to transfer the file to your secure agent and then use Informatica to read the file. Although I have never tried SFTP in the cloud product, I have used Informatica Cloud and I do know that all files are tied to the location of the secure agent (either a server or a local computer).
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Which OS is the secure agent running on, Windows or Linux?
For a Windows environment you will have to call the script using a WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in the script should work.
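The pull-to-the-secure-agent script described above can be as small as an OpenSSH `sftp` batch file. Here is a sketch in Python that builds the batch-file contents and the command line; the host, user, and directory names are placeholders, and key-based authentication is assumed:

```python
def sftp_batch(remote_dir: str, filename: str, local_dir: str) -> str:
    """Batch commands for OpenSSH's `sftp -b`, pulling one file
    down to the secure agent's local directory before the task runs."""
    return "\n".join([
        f"cd {remote_dir}",    # remote folder holding the file
        f"lcd {local_dir}",    # local folder on the secure agent
        f"get {filename}",
        "bye",
    ])

def sftp_command(user: str, host: str, batch_file: str) -> list[str]:
    """argv for subprocess.run; -b runs the batch file non-interactively."""
    return ["sftp", "-b", batch_file, f"{user}@{host}"]
```

A cron job (Linux) or Scheduled Task (Windows, via Cygwin/WinSCP instead) would run this just before the Data Synchronization Task fires, so the task reads a local copy.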

Using SQL Server Service Broker in Powershell script?

I have a PowerShell script that does the following tasks:
Loop through a big database table
Generate a text file
Zip the text file
Upload the zipped file via FTP
Write to the log table
The text-file generation step may take a short or long time depending on the data, and the FTP upload takes a while. So I want to make at least these two steps asynchronous. Is SQL Server Service Broker a viable choice? Is there any example? Any other options?
You can't make them async within a single PowerShell pipeline, but you could use the Start-Job cmdlet to run them on another thread and wait until they complete.
Using Service Broker will, by default, make them work asynchronously. The tricky part would be if you still want to run some of them sequentially, for which you would need to assign them a conversation-group ID.
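To make the shape of the Start-Job approach concrete, here is the same generate/zip/upload pipeline sketched in Python, with a thread pool standing in for background jobs; the upload step is a placeholder (a real script would call into FTP there), and all names are illustrative:

```python
import zipfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def make_text_file(path: Path, rows: list[str]) -> Path:
    """Stand-in for the slow 'loop the table, write a text file' step."""
    path.write_text("\n".join(rows))
    return path

def zip_file(path: Path) -> Path:
    """Compress one text file next to itself."""
    zip_path = path.with_suffix(".zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(path, arcname=path.name)
    return zip_path

def upload(zip_path: Path) -> str:
    # Placeholder: a real script would use ftplib.FTP(...).storbinary here.
    return f"uploaded {zip_path.name}"

def pipeline(workdir: Path, batches: dict[str, list[str]]) -> list[str]:
    """Zip and upload each batch on a worker thread while the main
    thread keeps generating the next text file -- the same shape as
    PowerShell's Start-Job / Wait-Job approach."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = []
        for name, rows in batches.items():
            txt = make_text_file(workdir / f"{name}.txt", rows)
            futures.append(pool.submit(lambda p=txt: upload(zip_file(p))))
        return [f.result() for f in futures]   # Wait-Job equivalent
```

In PowerShell the same split is `Start-Job { zip-and-upload $file }` per batch, then `Get-Job | Wait-Job` before writing the log table.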

Is SSIS the way to achieve this functionality?

I have a manual process that needs to be automated. The steps are outlined below. Is SSIS the correct way to achieve all of them, especially the results-to-CSV, zip, and email steps? Can this be done using the built-in SQL Server scheduler?
Connect to SQL Server: DARVIN,51401
Open SQL Query: o:\Status Report.sql
Chose database: AdventureWorks
In the menu bar, choose 'Tools', then 'Options'.
When the pop-up appears, choose the 'Results' tab and set the results output format to Comma Delimited (CSV). Click 'Apply', then 'OK'.
Execute the query
Choose where to save the file. You can save these in O:\Reports. The file name format is: day^_^Report TSP MM-DD-YY
Let the query run 15-25 minutes.
When the query is complete, open the folder you saved the report in, right-click the report title, and compress it to a zip file (right-click, Send To > Compressed (zipped) Folder).
Copy the saved file into O:\zippedFiles\
Email support@adventureworks.com to let them know that you have placed a zip file at O:\zippedFiles\
This is exactly what SSIS is for: automating data-transfer processes.
The only step that might cause you problems is the zipping part. You can use a third-party library and write a custom script task to achieve what you want; you will have to write some VB.NET (or C#) for that part.
The rest of what you want is pretty straightforward in SSIS.
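In an SSIS script task the zipping would be written in VB.NET or C# (with a third-party zip library, or `System.IO.Compression` on newer .NET); the logic itself is tiny, sketched here in Python for brevity, with the staging folder taken from the manual steps above:

```python
import shutil
import zipfile
from pathlib import Path

def zip_and_stage(csv_path: Path, staging_dir: Path) -> Path:
    """Compress the saved report and copy the archive into the
    staging folder (O:\\zippedFiles\\ in the manual procedure).

    Returns the path of the staged zip file."""
    zip_path = csv_path.with_suffix(".zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(csv_path, arcname=csv_path.name)
    staging_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(zip_path, staging_dir))
```

The notification email would then be a Send Mail task in the same package, pointed at the staged path.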

How to copy file to remote server in Lotusscript

I want to create a Lotus Notes agent that will run on the server to generate a text file. Once the file is created, I need to send it to a remote server.
What is the best/easiest way to send the file to a remote server?
Thanks
If your "remote" server is on a local Windows network, you can simply copy the file from the server's file system to a UNC path (\\myserver\folder\file.txt) using the FileCopy statement. If not, you may want to look at using a Java agent, which makes more file-transfer protocols easily accessible.
In either case, be sure to understand the security restrictions on Notes agents - for your agent to run on the server and create a file on the server's file system, the agent will need to be flagged with a runtime security level of 2 or 3, and signed by an appropriately authorized ID.
Sending or copying files to a remote server using OS-level commands requires that the destination server also be mapped as a drive on your source server. As Ed rightly said, security needs to allow you to save files onto the server before you can try to copy them.
You can generate the file locally on the server and then use FTP commands in a script to send it. Or, if you're a Java guru, you can try using Java.FTP to send the file as well. I had some trouble with it, but it should be possible provided an FTP account is set up on the destination server. FTP-related material by a well-known Notes developer can be found here and here.
I have done it using a script, and it's clumsy but effective at simply pushing files around. Ideally, if the server at the other end is also a Domino server, you could attach the file to an email and send it to a mail-in account on the destination server. I have done that before, and it's great: you can hand the whole problem of moving the file off to the SMTP process.
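If you do go the script/FTP route, the transfer itself is short. Here is a sketch using Python's standard `ftplib`, with hypothetical host and credentials; the `ftp_put` function needs a live FTP server, so only the pure helper is exercised in isolation:

```python
from ftplib import FTP
from pathlib import Path

def stor_command(local_path: str) -> str:
    """FTP command used for the upload: STOR plus the base file name."""
    return f"STOR {Path(local_path).name}"

def ftp_put(host: str, user: str, password: str,
            local_path: str, remote_dir: str = ".") -> str:
    """Upload one file over FTP; returns the server's reply to STOR.

    Network call -- requires a live FTP server and account.
    """
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        with open(local_path, "rb") as fh:
            reply = ftp.storbinary(stor_command(local_path), fh)
    return reply
```

From a Notes agent the equivalent would be shelling out to a script like this (or to the OS `ftp` client), subject to the agent security restrictions Ed describes above.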
