I have Pentaho-Spoon on my Windows machine and all the transformations/jobs are stored in a Database Repository.
Now, we want to set up a scheduler for the transformations and jobs.
Being a newbie, I just know that I need a batch file in Windows Scheduler that calls kitchen.bat/pan.bat with the path of the job/transformation to be scheduled.
Do I need to install the Pentaho Data Integration tool on the server on which the repository is located as well? And even if I do, how do I get the path of the job/transformation from the repository?
This can be done with the help of Windows Scheduler. You just need to trigger a batch file similar to the one shown in the example below:
C:\data-integration\kitchen.bat /rep:"PentahoRepository" /job:"TestJob" /dir:<directoryname> /user:<username> /pass:<password> /level:Basic
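As a minimal sketch (the repository name, directory, credentials, and paths below are placeholders, not values from your setup), you could wrap that call in a small batch file and register it with the Windows Task Scheduler from the command line:

rem run_testjob.bat - hypothetical wrapper around the kitchen.bat call above
cd /d C:\data-integration
call kitchen.bat /rep:"PentahoRepository" /job:"TestJob" /dir:"/" /user:admin /pass:secret /level:Basic >> C:\logs\TestJob.log 2>&1

rem register the wrapper to run every night at 02:00
schtasks /create /tn "PentahoTestJob" /tr "C:\scripts\run_testjob.bat" /sc daily /st 02:00

The redirection to a log file is optional, but it makes it much easier to see why a scheduled run failed.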
Related
I need to schedule a KNIME workflow to run daily, as I wasn't able to understand/put together the steps in knime.com/faq#q12 due to my business background. My environment details are:
Operating System: Windows Server 2012 R2.
Database: Reading from SQL Server 2017 and inserting the model output into the same database.
Knime Version: Analytical Platform 3.5.2
The Knime Analytical Platform is installed on D drive.
The Workflow is saved on E drive.
Could you share the needed process with me in detail, as I'm coming from a business background:
The needed batch file with exact commands.
Other steps needed to run it daily.
I use the following command on Windows systems to run an exported workflow as a scheduled task.
You must use quotes around the path of your workflow and the path to your KNIME executable.
"/path/to/knime.exe" -reset -nosave -nosplash -application org.knime.product.KNIME_BATCH_APPLICATION -workflowFile="/path/to/workflowFile"
We need to execute scripts locally, but how can we trigger them from a remote machine, so as to avoid going to each machine and starting the process manually? Preferably with VBScript.
Generally in your VBScript you will specify the machine you want to query with the line
strComputer = "."
This can be changed to a computer name or IP address to query a machine remotely. However, it is difficult to provide anything further as you've not posted your script or what you're trying to achieve with it.
Write your .vbs scripts and deploy them to a remote server - then on the remote server schedule a task to run whenever you'd like the script to run.
The script will run locally on that server.
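If you prefer not to log on to each server to set up that task, one sketch (the server name, credentials, script path, and schedule below are all hypothetical) is to create the scheduled task remotely with schtasks:

rem create a daily task on a remote server that runs the deployed .vbs
schtasks /create /s remoteserver01 /u DOMAIN\adminuser /p * /tn "RunMyScript" /tr "cscript.exe //B C:\scripts\myscript.vbs" /sc daily /st 01:00

The /p * switch prompts for the password instead of leaving it in the command history.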
Our objective is as follows:
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the Remote SFTP (the location which will contain "Test.csv")
Step 1
This is as shown below
Step 2
Then I started to build a Data Synchronization Task as below
What we want is for the Informatica Cloud to connect to the secure FTP location and extract the contents from a .csv from that location into our object in Salesforce.
But as you can see in Step 2, it does not allow me to choose .csv from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (which is my machine, where the secure agent is running), and this is not what I want.
What should I do in this scenario?
Can someone help?
You can write a UNIX script to transfer the file to your secure agent and then use Informatica to read the file. Although I have never tried using SFTP in the cloud, I have used Informatica Cloud and I do know that all files are tied to the location of the secure agent (either a server or a local computer).
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you clarify the secure agent OS, as in Windows or Linux?
For a Windows environment you will have to call the script using the WINSCP or CYGWIN utility; I recommend the former.
For Linux, the basic commands in the script should work.
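As a rough sketch of the WinSCP route (the host, credentials, host key fingerprint, and paths are placeholders you would replace with your own), the secure agent machine would run a small script before the Data Synchronization Task picks up the local copy:

rem fetch_test_csv.bat - download Test.csv to the secure agent machine
winscp.com /script=C:\scripts\get_test_csv.txt /log=C:\logs\winscp.log

The script file C:\scripts\get_test_csv.txt would contain something like:

open sftp://ftpuser:password@sftp.example.com/ -hostkey="ssh-rsa 2048 xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
get /remote/path/Test.csv C:\secureagent\files\
exit

You would then point the Data Synchronization Task at the downloaded local file and schedule the batch file to run shortly before the task.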
I designed an SSIS package that generates a .CSV file into a destination folder using a Script Task component. Everything is OK when I run it from the Visual Studio solution, but the problems start right after deployment to SQL Server. The Script Task shows success but no file is generated. Could someone please provide help?
Thanks a lot in advance.
Are you running it through a SQL job? This might be because the SQL Agent account (or whatever account you're running with) might not have Read/Write permissions on that particular directory.
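If that turns out to be the cause, one possible fix (the folder path and service account name below are placeholders) is to grant the account that runs the job modify rights on the destination folder:

rem grant modify rights on the output folder to the SQL Agent service account
icacls "D:\ExportFolder" /grant "DOMAIN\SqlAgentServiceAccount:(OI)(CI)M"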
What is the simplest way to schedule a batch file to run on a remote machine using Hudson (latest and greatest version)? I was exploring the master slave setup. I created a dumb slave but I am not sure what the parameters should be so that I can trigger the batch file in the remote slave machine.
Basically, I am trying to run 2 different batch files on two different remote machines sequentially, triggered from my machine (the master). The step-by-step guide on the Hudson website is a dead link. There are similar questions posted on SO, but it does not quite work for me when I use the parameters they mention.
If anyone has done something similar please suggest ways to make this work.
(I know how to set up jobs and add a step to run a batch file, etc. What I am having trouble configuring is doing this on a remote machine using Hudson's built-in features.)
UPDATE
Thank you all for the suggestions. Quick update on this:
What I wanted to get done is partially working; below are the steps I followed to get there:
Created new Node from Manage Nodes -> New Node -> set # of Executors as 1, Remote FS root set as '/var/hudson', set Launch method as using JNLP, set slavename and saved.
Once the slave was set up (from the master machine), I logged into the slave's physical machine, downloaded slave.jar from http://masterserver:port/jnlpJars/slave.jar, and ran the following from the command line at the download location: java -jar slave.jar -jnlpUrl http://masterserver:port/computer/slavename/slave-agent.jnlp. The connection was made successfully.
Checked 'Restrict where this project can be run' in the master job configuration, and set the parameter as slavename.
Clicked "Add Build Step" to add my batch job script.
What I am still missing now is a way to connect to 2 slaves from one job in sequence, is that possible?
It is fairly easy and straightforward. Let's assume you already have a slave running. Then you configure the job as if you were locally on the target box. The setting for 'Restrict where this project can be run' needs to be the node that you want to run on. This is all for the job configuration.
For the slave configuration read the following pages.
Installing Hudson as a Windows service
Distributed builds
On Windows I prefer to run the slave as a service and let the remote machine manage the start up and shut down of the slave. The only disadvantage with this is that you need to upgrade the client every time you update the server. Just get the new client.jar from the server after the upgrade and put it on the slave, then restart the slave and you are done.
I had trouble using the install-as-a-service option for the slave even though I did it as a local administrator. I then used srvany to wrap the jar into a service. Here is a blog about it. The command that you need to wrap you will get from the slave page of your Hudson server. For all of this to work, you should set up the slave management as JNLP.
If you have an SSH server on your target machine, you can use the SSH slave settings. These work for me like a charm; I use them with my Unix slaves. So far the SSH option with Unix has been less of a hassle than the Windows service clients.
I had some similar trouble with slave setup and wrote up this blog post - I was running on Linux rather than Windows, but hopefully this will help.
I don't know how to use built-in Hudson features for this job, but in one of my project builds I run a batch file that in turn uses PsTools to run the job on a remote server. I found PsTools extremely easy to use: download, unpack and run the command with the right parameters, hence I opted to use this.
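For illustration, a PsExec call of that kind might look like the following (the server name, credentials, and batch file path are placeholders):

rem run a batch file on a remote server from a Hudson build step
psexec \\remoteserver -u DOMAIN\builduser -p password C:\scripts\remote_job.bat

PsExec returns the remote command's exit code, so a failing batch file will also fail the Hudson build step.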