I have a simple website built using PHP. It is hosted on a Linux server.
I need to run a PHP script every night. How do I do this?
Will an 'open-source job scheduler in java' be able to run a PHP script?
There are several possibilities.
If you have shell access to the machine, you can set up a scheduled task (a cron job, see http://www.scrounge.org/linux/cron.html) to execute your script, either via the PHP command-line client or with a tool like curl or wget.
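For example, a crontab entry to run the script every night at 2:30 AM could look like this (the paths and URL are placeholders for your setup):

30 2 * * * /usr/bin/php /var/www/mysite/nightly.php
# or, if the script has to run through the web server, fetch it instead:
30 2 * * * wget -q -O /dev/null http://www.example.com/nightly.php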
If you don't have shell access to that machine, there are several websites that offer free cron jobs. Basically, you give them a link and a schedule for when they should access your link. Just google for "online cron job".
I have Pentaho-Spoon on my Windows machine and all the transformations/jobs are stored in a Database Repository.
Now, we want to set up a scheduler for the transformations and jobs.
Being a newbie, I just know that I need to use a batch file in Windows Scheduler with the path of kitchen.bat/pan.bat and the path of the job/transformation to be scheduled.
Do I need to install the Pentaho Data Integration tool on the server where the repository is located as well? And even if I do, how do I get the path from the repository?
This can be done with the help of Windows Scheduler. You just need to trigger a batch file similar to the one shown in the example below:
C:\data-integration\kitchen.bat /rep:"PentahoRepository" /job:"TestJob" /dir:<directoryname> /user:<username> /pass:<password> /level:Basic
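Scheduling a transformation works the same way with pan.bat, which takes a /trans parameter instead of /job; a sketch along the same lines, with placeholder names:

C:\data-integration\pan.bat /rep:"PentahoRepository" /trans:"TestTransformation" /dir:<directoryname> /user:<username> /pass:<password> /level:Basic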
I have automated the build and deploy process in TFS by following http://www.codeproject.com/Articles/790206/Deploying-Web-Applications-using-Team-Foundation.
After deployment, I validate the deployed application by running Selenium scripts via a batch file whose path I specify in "post-test script path". It executes the batch file and runs the automated tests.
Now I want to publish these Selenium results, so I have created a Jenkins job with email configured. How do I execute this job post-deployment? I have tried providing the Jenkins job trigger email in "post-test script path", but it actually expects a path there and throws an error. So how do I execute Jenkins jobs post-deployment?
Also, I am trying to set up a complete automation process that automatically builds, deploys, and runs some Selenium tests using TFS. If anybody has a better process, please let me know. Thanks.
You can use the Command Line task in the new TFS 2015/VSTS build system to easily execute Selenium tests.
You can then easily configure and pass variables.
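For example, the Command Line task could invoke the vstest.console.exe test runner against the compiled Selenium test assembly (the assembly name is a placeholder, and the runner's path varies with your Visual Studio version):

vstest.console.exe MySeleniumTests.dll /Logger:trx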
I would also recommend that you move to using release management tools for deployment. While CD makes sense for development, it is often not viable for a release pipeline, where you need more metadata and approvals.
You can do this with the release management tools that come with TFS 2013+.
I have a simple batch file which needs to be run weekly to upload some files via Core FTP.
I'm using the free version of Core FTP LE.
MySavedProfile is the Site Name of the saved profile I created using Core FTP's Site Manager. The profile contains the URL, credentials, etc. of the site to connect to.
Here are the contents of the batch file:
SET logf=confirm.log
echo test-start >> %logf%
"C:\Progra~1\CoreFTP\coreftp.exe" -B -s -pasv -O -site MySavedProfile -u "C:\Progra~2\PathToFiles\FileToUpload.txt"
echo test-finish >> %logf%
For the Windows Server 2012 r2 Task Scheduler, I have created a basic, weekly scheduled task on the Task Scheduler Library root which runs the batch file. For this scheduled task I have:
(Under the General tab)
"Run whether user is logged on or not" is selected
"Run with highest privileges" is checked
Configure for = Windows Server 2012 R2
(Under Actions)
Action = Start a program
Program / Script = "C:\Progra~2\PathToFiles\batch.bat"
Start in = C:\Progra~2\PathToFiles\
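(For reference, this is roughly equivalent to creating the task from the command line with schtasks; the task name, day, time, and account below are placeholders:)

schtasks /Create /TN "WeeklyCoreFtpUpload" /TR "C:\Progra~2\PathToFiles\batch.bat" /SC WEEKLY /D SUN /ST 02:00 /RU <serviceaccount> /RP <password> /RL HIGHEST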
Here is the weird behavior I am getting:
If I double click on the batch file directly, it works fine and uploads the text file via Core FTP just fine.
However, if I try to let the Windows Task Scheduler run it, it runs everything except the Core FTP line. That is, I get the usual:
test-start
test-finish
in the confirm.log file, but the FileToUpload.txt has not been uploaded to the remote server, and there are no errors from CoreFTP that I can detect.
I have tried this with a service account that has permissions to run batch files, as well as my own account for this scheduled task. I get the same result: it doesn't seem to run that CoreFTP line. At least not via Task Scheduler. I need this upload to be automated.
I've searched Core FTP's documentation, Google, etc. No one seems to have run into this exact issue. I've applied recommendations from distantly related issues, but none of them have worked.
Any help would be greatly appreciated. Thank you.
The only way to do this is to use the full version of Core FTP (that is, Core FTP Pro). If you use the LE version, you have to check the "Run only when user is logged on" option.
This happens because of the splash screen the LE version shows at startup.
If you can't stay logged on forever, you could create a user that will always be logged on just for these tasks.
Remember to use the -log option in Core FTP to check whether it is actually doing anything.
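For example (the log path is a placeholder; verify the exact flag spelling against your Core FTP version):

"C:\Progra~1\CoreFTP\coreftp.exe" -B -s -pasv -O -site MySavedProfile -u "C:\Progra~2\PathToFiles\FileToUpload.txt" -log "C:\Progra~2\PathToFiles\coreftp.log"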
I'm writing a Perl script in which I have to shut down my MSSQL server, do some operation, and then restart it. I know one way is to use net stop to stop the service, but I can't use that. So I tried installing the DBI and DBD::ODBC modules.
More info here: Shutdown MSSQL server from perl script DBI
But when I try to shut down my server using this command:
$dbh->prepare("SHUTDOWN WITH NOWAIT");
it's not working for me.
I got this response from the community
SHUTDOWN permissions are assigned to members of the sysadmin and serveradmin fixed server roles, and they are not transferable. I'd consider it unlikely (hopefully) that Perl is run with these rights.
So please tell me, is there a way to run the above command as one of these users? Or what else can I do? Note that I have the constraint that I can't simply stop it as a Windows service.
If the scripts are executed through a web browser then the user executing the scripts will be defined by the web server. It will probably not be a good idea to fiddle with this user. Just leave things as they are.
What you can do is create a Perl script that is run by a privileged user on a consistent basis with CRON.
The script run by CRON can check for specific content, such as a file written by a script whose executing user has lesser privileges.
So the way it could work is as follows:
You execute browser.cgi through a browser to do a specific task.
browser.cgi writes instructions to a file.
Every minute, priveleged.cgi executes via CRON. (The root user could execute priveleged.cgi.)
priveleged.cgi reads the file browser.cgi has written for instructions and starts and stops services according to the instructions, as sketched below.
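A minimal sketch of what priveleged.cgi could look like, assuming a file-based handshake and an ODBC DSN with sysadmin credentials (the file path, DSN, and login are all placeholders). Note that in the question's snippet prepare() alone never runs the statement; you also need execute():

#!/usr/bin/perl
# priveleged.cgi - run every minute by CRON as a privileged user.
use strict;
use warnings;
use DBI;

# File written by browser.cgi; the path is a placeholder for this sketch.
my $instruction_file = '/var/spool/myapp/shutdown.request';
exit 0 unless -e $instruction_file;
unlink $instruction_file;    # consume the request so it fires only once

# DSN and sysadmin credentials are placeholders.
my $dbh = DBI->connect('dbi:ODBC:MyMSSQLDSN', 'sa_login', 'sa_password',
                       { RaiseError => 0, PrintError => 1 })
    or die $DBI::errstr;

# prepare() only compiles the statement; execute() actually runs it.
my $sth = $dbh->prepare('SHUTDOWN WITH NOWAIT');
# The connection drops as the server shuts down, so don't treat errors as fatal.
eval { $sth->execute() };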
What is the simplest way to schedule a batch file to run on a remote machine using Hudson (latest and greatest version)? I was exploring the master slave setup. I created a dumb slave but I am not sure what the parameters should be so that I can trigger the batch file in the remote slave machine.
Basically, I am trying to run 2 different batch files on two different remote machines sequentially, triggered from my machine (the master). The Step by step guide on the Hudson website is a dead link. There are similar questions posted on SO but it does not quite work for me when I use the parameters they mention.
If anyone has done something similar please suggest ways to make this work.
(I know how to set up jobs and add a step to run a batch file, etc.; what I am having trouble configuring is doing this on a remote machine using Hudson's built-in features.)
UPDATE
Thank you all for the suggestions. Quick update on this:
What I wanted to get done is partially working; below are the steps I followed to get there:
Created a new Node from Manage Nodes -> New Node -> set # of executors to 1, set Remote FS root to '/var/hudson', set Launch method to JNLP, set the slave name, and saved.
Once the slave was set up (from the master machine), I logged into the slave's physical machine, downloaded the _slave.jar from http://masterserver:port/jnlpJars/slave.jar, and ran the following from the command line at the download location -> java -jar _slave.jar -jnlpUrl http://masterserver:port/computer/slavename/slave-agent.jnlp. The connection was made successfully.
Checked 'Restrict where this project can be run' in the master job configuration, and set the parameter to the slave name.
Checked "Add Build Step" for adding my batch job script
What I am still missing now is a way to connect to 2 slaves from one job in sequence, is that possible?
It is fairly easy and straightforward. Let's assume you already have a slave running. Then you configure the job as if you were working locally on the target box. The setting for Restrict where this project can be run needs to be the node that you want the job to run on. This is all for the job configuration.
For the slave configuration read the following pages.
Installing Hudson as a Windows service
Distributed builds
On Windows I prefer to run the slave as a service and let the remote machine manage the start up and shut down of the slave. The only disadvantage with this is that you need to upgrade the client every time you update the server. Just get the new client.jar from the server after the upgrade and put it on the slave. Then restart the slave and you are done.
I had trouble using the install-as-a-service option for the slave, even though I did it as a local administrator. I then used srvany to wrap the jar into a service. Here is a blog about it. The command that you need to wrap, you will get from your Hudson server on the slave page. For all of this to work, you should set up the slave management as JNLP.
If you have an SSH server on your target machine, you can use the SSH slave settings. These work for me like a charm. I use them with my Unix slaves. So far the SSH option with Unix is less of a hassle than the Windows service clients.
I had some similar trouble with slave setup and wrote up this blog post - I was running on Linux rather than Windows, but hopefully this will help.
I don't know how to use built-in Hudson features for this job, but in one of my project builds, I run a batch file that in turn uses PsTools to run the job on a remote server. I found PsTools extremely easy to use: download, unpack, and run the command with the right parameters, hence I opted to use this.
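For example, the batch file calls PsExec along these lines (server name, credentials, and paths are placeholders):

psexec \\remoteserver -u DOMAIN\user -p password "C:\jobs\remote-batch.bat"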