How to run cron on a module function in Drupal? - drupal-7

I have a function in a custom module that retrieves data from a SQL table. I need to run cron every X minutes to submit this data to a remote URL. Is there any way of doing this?

You should set up a cron job on your server.
Go to admin/config/system/cron on your site and copy the URL for running cron from outside the site.
Log in to your server as the user that owns the project (or as www-data) and type crontab -e
Now you can add a line like this:
*/15 * * * * curl -Lk "http://your.site.com/cron.php?cron_key=YOURCRONKEY"
This line will trigger cron on your.site.com every 15 minutes.
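On the Drupal side, the usual way to have a module function run during cron is to implement hook_cron() in your custom module; it is called every time cron is triggered, so the crontab interval above controls how often your code runs. A minimal sketch, assuming a module named mymodule, a table mymodule_data and a placeholder remote URL (all hypothetical names):

<?php
/**
 * Implements hook_cron().
 */
function mymodule_cron() {
  // Fetch the rows that still need to be sent to the remote service.
  $result = db_query('SELECT id, payload FROM {mymodule_data} WHERE sent = 0');
  foreach ($result as $row) {
    // Submit each row to the remote URL.
    $response = drupal_http_request('https://remote.example.com/endpoint', array(
      'method' => 'POST',
      'data' => drupal_http_build_query(array('payload' => $row->payload)),
      'headers' => array('Content-Type' => 'application/x-www-form-urlencoded'),
    ));
    // Mark the row as sent only if the remote end accepted it.
    if ($response->code == 200) {
      db_update('mymodule_data')
        ->fields(array('sent' => 1))
        ->condition('id', $row->id)
        ->execute();
    }
  }
}

If you need the interval to be independent of how often site-wide cron fires, store a timestamp with variable_set() and return early from mymodule_cron() until enough time has passed.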

Related

Execute process task not running from SQL Agent Job

I have an SSIS package that runs a backup in a T-SQL task and then uploads the file to Google Drive from an Execute Process Task. The package runs perfectly from the catalog. When I try to run it from a SQL Agent job, the backup runs OK but the upload to Google Drive does not. There are no error messages in the job history or in the package history.
The sql agent job is set to run from a proxy account with the necessary credentials.
The issue turned out to be that the command file was trying to find another file it needed in order to execute, but since the Windows user had changed, it was looking in a different user's folder. We solved it by passing the file path as a parameter so it doesn't default to the folder of the user account it is executed under.
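As a rough illustration of that fix (the file and tool names here are hypothetical, not from the original setup), the command file takes the path as an argument instead of assuming the file can be found relative to the executing user's profile folder:

rem upload.cmd -- invoked by the Execute Process Task with the config path passed in as %1
rem Before, the tool looked for settings.json relative to the current user's folder;
rem now the full path is supplied explicitly.
"C:\Tools\DriveUpload\driveupload.exe" --config %1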

Add seed data to a Docker based Microsoft SQL Server at image build time

NOTE: I believe that this question is different from the many others that look similar. Please read it before closing.
I am trying to build a Docker container image that has a "for testing" copy of my database. I have a script that will create this; it takes about 60 seconds to run.
I put this into a Docker container by following the steps outlined by Julie Lerman here. It worked just fine, except that it runs my script when the container instance is created. This means that I have to wait 60 seconds before the database is fully ready.
I want to incur the 60-second cost when the image is built, not when the container starts up (startup is when my automated tests need it to be ready fast!).
How can I create a SQL Server container image that has my database script pre-run?
NOTE: I need this to be a repeatable, automated build process. As such, I am hesitant to use docker commit.
This command ended up doing it:
RUN ( /opt/mssql/bin/sqlservr --accept-eula & ) | grep -q "Service Broker manager has started" && /createScript.sh
createScript.sh is a bash script that calls sqlcmd on the SQL script you want to run.
The key is that this is a RUN instruction, so it executes at image build time.
Source: https://github.com/microsoft/mssql-docker/issues/229
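For context, a minimal Dockerfile built around that RUN instruction might look like the following (the image tag, SA password and file names are assumptions for illustration):

FROM mcr.microsoft.com/mssql/server:2019-latest
ENV ACCEPT_EULA=Y
ENV SA_PASSWORD=YourStrong!Passw0rd
COPY seed.sql /seed.sql
COPY createScript.sh /createScript.sh
RUN chmod +x /createScript.sh
# Start SQL Server in the background, wait until it reports readiness,
# then run the seed script -- all at image build time.
RUN ( /opt/mssql/bin/sqlservr --accept-eula & ) | grep -q "Service Broker manager has started" && /createScript.sh

where createScript.sh simply runs sqlcmd against the seed script:

#!/bin/bash
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "$SA_PASSWORD" -i /seed.sql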

Manual schedule in PostgreSQL?

I'm starting to use PostgreSQL now, and I wonder if I can schedule tasks/work (in SQL) to be done by the database without having to use pgAgent.
I'm working on a system where administrators need to schedule promotions. For example, from day X to day Y there is a Z promotion. This must be done from the system interface (UI), on a page that will send the command to the database. All I need is to perform a SQL command when the proper time comes.
I have searched on the internet, and all I find is about pgAgent or how to configure it. I do not want it. From what I have seen, pgAgent only works through the pgAdmin interface, and the system administrators cannot lay a finger on pgAdmin... Or can they (I'm new to PostgreSQL)? :/
In pgAdmin, when creating a new job, I also clicked the help button, but it doesn't say much beyond how to set everything up through the pgAdmin interface.
Is there any way to achieve this? Are there alternatives?
Thank you for your attention.
pgAgent doesn't work only through pgAdmin; rather, pgAdmin is the only (current) GUI that interfaces with the pgAgent tables. pgAgent is a service that interacts exclusively with its own set of tables, which can be modified and reported on by any software, not just pgAdmin. pgAgent can be very useful since it implements multi-step jobs, and the results are stored in the database, where they can be queried or used to build custom reports.
There are many alternatives, from developing your own PgAgent-like tool, to using cron in Linux/Unix/Cygwin or Scheduled Tasks in Windows.
For example, in Linux, a daily table export can be implemented in cron by adding a script in /etc/cron.daily/ like
sudo -i -u postgres psql -c "copy foo.bar to '/var/lib/dbexports/foo-bar.csv' with csv header" foo_db
or in a file in /etc/cron.d/ to export that file specifically on Mondays at 5:30 like
30 5 * * 1 postgres psql -c "copy foo.bar to '/var/lib/dbexports/foo-bar.csv' with csv header" foo_db
or similar on any user's crontab.
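For the promotion use case in the question, one simple pattern along the same lines (a sketch only; the promotions table and its columns are assumptions) is to let the UI merely insert the promotion rows with their start and end dates, and have a frequent cron entry apply whatever is currently due, for example in /etc/cron.d/:

* * * * * postgres psql -d foo_db -c "update promotions set active = (now() between starts_at and ends_at)"

This way nothing has to be scheduled per promotion; the periodic job simply keeps the active flag in sync with the clock.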

Batch file's Core FTP line is not running during Scheduled Task. Works if started manually

I have a simple batch file which needs to be run weekly to upload some files via Core FTP.
I'm using the free version of Core FTP LE.
MySavedProfile is the Site Name of the saved profile I created using Core FTP's site Manager. The profile contains the URL / credentials / etc of the site to connect to.
Here are the contents of the batch file:
SET logf=confirm.log
echo test-start >> %logf%
"C:\Progra~1\CoreFTP\coreftp.exe" -B -s -pasv -O -site MySavedProfile -u "C:\Progra~2\PathToFiles\FileToUpload.txt"
echo test-finish >> %logf%
For the Windows Server 2012 r2 Task Scheduler, I have created a basic, weekly scheduled task on the Task Scheduler Library root which runs the batch file. For this scheduled task I have:
(Under the General tab)
"Run whether user is logged on or not" is selected
"Run with highest privileges" is checked
Configure for = Windows Server 2012 R2
(Under Actions)
Action = Start a program
Program / Script = "C:\Progra~2\PathToFiles\batch.bat"
Start in = C:\Progra~2\PathToFiles\
Here is the weird behavior I am getting:
If I double-click the batch file directly, it works and uploads the text file via Core FTP just fine.
However, if I let the Windows Task Scheduler run it, it runs everything except the Core FTP line. That is, I get the usual:
test-start
test-finish
in the confirm.log file, but FileToUpload.txt has not been uploaded to the remote server, and there are no errors from Core FTP that I can detect.
I have tried this with a service account that has permission to run batch files, as well as with my own account for this scheduled task. I get the same result: the Core FTP line doesn't seem to run, at least not via Task Scheduler. I need this upload to be automated.
I've searched Core FTP's documentation, Google, etc. No one seems to have run into this exact issue. I've applied recommendations from distantly related issues, but none of them have worked.
Any help would be greatly appreciated. Thank you.
The only way to do this is to use the full version of Core FTP (that is, Core FTP Pro). If you use the LE version, you have to select the "Run only when user is logged on" option.
This happens because of the splash screen Core FTP LE shows at startup.
If you can't stay logged on permanently, you could create a user that is always logged on just for these tasks.
Remember to use the -Log option in Core FTP to check whether it is actually doing anything.

Is it possible to execute a Perl script with admin rights or as a specific user?

I'm writing a Perl script in which I have to shut down my MSSQL server, do some operations, and then restart it. I know one way is to use net stop to stop the service, but I can't use that. So I tried installing the DBI and DBD::ODBC modules.
More info here: Shutdown MSSQL server from perl script DBI
But when I try to shut down my server using this command:
$dbh->prepare("SHUTDOWN WITH NOWAIT ");
it's not working for me.
I got this response from the community:
"SHUTDOWN permissions are assigned to members of the sysadmin and serveradmin fixed server roles, and they are not transferable. I'd consider it unlikely (hopefully) that Perl is run with these rights."
So please tell me, is there a way to run the above command as one of these users? Or what can I do other than this? Note that I have the constraint that I can't simply stop it as a Windows service.
If the scripts are executed through a web browser, then the user executing the scripts is defined by the web server. It is probably not a good idea to fiddle with this user; just leave things as they are.
What you can do is create a Perl script that is run by a privileged user on a regular basis with cron.
This script run by cron can check for specific content, such as a file written by another script whose executing user has lesser privileges.
So the way it could work is as follows:
You execute browser.cgi through a browser to do a specific task.
browser.cgi writes instructions to a file.
Every minute, privileged.cgi executes via cron (the root user could execute privileged.cgi).
privileged.cgi reads the file that browser.cgi has written and starts and stops services according to those instructions.
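A minimal sketch of the privileged side of this pattern (the file path, instruction format and service commands below are assumptions for illustration, not part of the original answer):

#!/usr/bin/perl
# privileged.pl -- run every minute from the root crontab, e.g.:
# * * * * * /usr/local/bin/privileged.pl
use strict;
use warnings;

# Instruction file written by the low-privilege browser.cgi script.
my $instruction_file = '/var/spool/myapp/instructions.txt';

exit 0 unless -e $instruction_file;

open my $fh, '<', $instruction_file or die "Cannot read $instruction_file: $!";
my $command = <$fh>;
close $fh;
chomp $command if defined $command;

# Only act on the instructions we expect; ignore anything else.
if (defined $command and $command eq 'stop') {
    system('systemctl', 'stop', 'mssql-server');    # replace with the service command for your platform
}
elsif (defined $command and $command eq 'start') {
    system('systemctl', 'start', 'mssql-server');
}

# Remove the file so the instruction is only acted on once.
unlink $instruction_file;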
