Automate backup command not working in job scheduling - batch-file

I am using PostgreSQL 8.4 on my Windows Server 2012 box and have configured pgAgent for job scheduling. I need to schedule an automatic daily backup on this system through that job scheduling. I searched the net, created a batch file, and pointed the scheduled job at the batch file's path.
If I run the batch file on the server directly, it works fine, but when I call it through PostgreSQL job scheduling it just sits in the running state and never produces any result.
Below is the command I use to take the backup:
"c:\Program Files (86)\path\to\bin" pg_dump.exe -i -h hostname -U username -F c -b -v -f "backup\file\path\filename.backup" databasename
This command works fine from the command prompt and when I call the batch file manually, but it produces no output when run from PostgreSQL job scheduling.
Does anyone have any idea about this kind of issue?
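For context, the batch file in question is essentially a thin wrapper around that pg_dump call. A minimal, scheduler-friendly sketch of such a wrapper is below; the paths, host, user and database are the same placeholders as above, and setting PGPASSWORD is one standard way to stop pg_dump from waiting on a password prompt that a scheduled job can never answer:
@echo off
REM Placeholders throughout -- adjust paths and credentials for the real environment.
set PGPASSWORD=yourpassword
"c:\Program Files (x86)\path\to\bin\pg_dump.exe" -i -h hostname -U username -F c -b -v -f "backup\file\path\filename.backup" databasename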

Related

Scheduled Task Runs .bat File with sqlcmd, but Doesn't Update the Output File

My scheduled task (on Windows Server 2008 R2) is "successfully" running each morning, but the output of sqlcmd does not get updated in the output file. The log file changes to show that the batch file ran, but the output file remains the same.
When I run a .bat file with the lines of code below, it works just fine. The .csv file gets updated with the most recent query's results.
sqlcmd -S SERVERNAME -i my_query.sql -s "," -o c:\scripts\My_query\query_results.csv -W -h-1
ftp.exe -s:"c:\scripts\My_query\file_upload.ftp"
Useful Information:
I am running the scheduled task as my user which is a server admin and has access to the server in question
I have tried giving explicit access to the user for full control of all files and folders involved
I have tried running C:\Windows\System32\cmd.exe as the action with the argument /c C:\scripts\My_query\my_process.bat to make sure the entire process runs
Task scheduler is working fine for other tasks, but it has this same problem with five similar processes
Are the tasks configured to run whether the user is logged on or not? If so, it will never find your SQL file because the default working directory is SYSTEM32 when you configure a scheduled task that way. You will need to provide the full path to the SQL file.
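A minimal sketch of that fix, assuming my_query.sql sits next to the other files in C:\scripts\My_query\: switch to the script's own folder first and/or use full paths, so the SYSTEM32 default working directory no longer matters.
@echo off
REM %~dp0 expands to the folder this .bat file lives in
cd /d "%~dp0"
sqlcmd -S SERVERNAME -i "C:\scripts\My_query\my_query.sql" -s "," -o "C:\scripts\My_query\query_results.csv" -W -h-1
ftp.exe -s:"c:\scripts\My_query\file_upload.ftp"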

Automatically run .bat file after EC2 reboot without having to remote connect

I am running an AWS Windows 2012 EC2 instance that has to run 24/7. On this instance I run a Python 3.6 scraper script, and to avoid having to regularly check whether the script is still running on the server, I have a .bat file in the instance's shell:startup folder that restarts it on a daily basis. The .bat file works: it runs the Python script and sets a timer to restart/reboot the instance after 86400 seconds. The .bat file runs on the EC2 instance itself.
However, what the file does not do is run automatically after the reboot. I now first have to remote connect to the server before the .bat file will run. What I want it to do is run without me having to first remote connect into the server. How can I achieve this?
I use the following code in my .bat file, located on my EC2 instance.
@ECHO OFF
REM Launch the scraper in its own console window
START CMD /K "CD /D C:\Users\Administrator\Documents && python scraper.py"
REM Schedule a forced reboot of the instance in 86400 seconds (24 hours)
START CMD /K SHUTDOWN -t 86400 -r -f
I have tried looking into using AWS' Automations and other schedule based methods but couldn't get that to work.
If you want to use something native to Windows Server 2012, look at Schtasks -- this is more or less the Windows equivalent of cron.
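For example, a task that fires at boot without anyone logged on could be created like this (a sketch: the task name is made up, the .bat path is the one from the answer below, and it runs as SYSTEM rather than the Administrator account used there):
schtasks /Create /TN "RunScraperAtBoot" /TR "C:\Users\Administrator\Desktop\file.bat" /SC ONSTART /RU SYSTEM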
I found the answer to my question by using Task Scheduler and looking at the following article: Run a batch file with Windows task scheduler
An important note here is that, for my batch file to run, I had to create a task that starts CMD and runs the batch file from there. Asking Task Scheduler to run the batch file directly doesn't work on Windows Server 2012. I set up the task with the following details:
Administrator account
"Run whether user is logged on or not"
"Run with the highest privileges"
"Start on system start-up"
Action: Start a program -> CMD
Add arguments (optional): /c start "" "C:\Users\Administrator\Desktop\file.bat"
More information on how to do this can be found in this answer: https://stackoverflow.com/a/27055435/7736676

auto run batch file containing db2 statements in windows

I want Task Scheduler in Windows to run DailyJob.bat daily at a certain time. The file has content like this:
@echo off
DB2 CONNECT TO dbName USER usrName USING password
DB2 .........
DB2 ..........
Task Scheduler will run this file with cmd.exe automatically, but cmd doesn't understand the DB2 commands. Please help me, thanks!
You need to call your db2 statements with the db2cmd command (http://www-01.ibm.com/support/knowledgecenter/SSEPGG_10.5.0/com.ibm.db2.luw.admin.cmd.doc/doc/r0002036.html?cp=SSEPGG_10.5.0%2F3-6-2-6-35)
From CMD.exe
db2cmd -i -c db2 list node directory
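Applied to the scheduled job, that means having Task Scheduler (or the wrapper batch file) launch db2cmd instead of running the .bat with plain cmd.exe; a rough sketch, using the DailyJob.bat name from the question: -c runs the command and then exits, -w waits for it to finish, and -i runs it in the current window.
db2cmd -c -w -i DailyJob.bat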
There are a lot of related questions on the Web about this problem.

Executing a stored procedure using Windows task Scheduler

I've been trying to set up a schedule to run a stored procedure every hour with Windows Task Scheduler (as I'm using SQL Express and can't install 3rd-party tools), but after trying various methods, such as running a .bat file from Task Scheduler, or opening the SqlCmd utility from Task Scheduler and passing either command-line syntax or a .sql script file, I'm having no luck.
I know this can be done, so I'm sure it's something I've missed, but if anyone can share their experience of this I'd very much appreciate it.
The following command is in the batch file...
sqlcmd -E -i"C:\Users\Administrator\Desktop\test.sql" -o"C:\Users\Administrator\Desktop\dump.txt"
Thanks a lot
If you are an admin on the SQL instance (since you are using SQL Express, I bet you are trying to do this on your own computer, so there is a high chance your user is an admin of the SQL instance), you don't need -E at all; just leave it out.
Second, specify the server even if you are working locally.
Start with a simple sql command like below:
sqlcmd.exe -S "." -d MY_DATABASE -Q "SELECT * FROM MY_TABLE"
Replace MY_DATABASE and MY_TABLE with your database name and table name. Make sure you can run it from the command line; it should return the data from your table. (Beware: command-line options are case-sensitive, so -s is not the same as -S.)
Last, do not try to feed parameters through Task Scheduler. Put the command with all its parameters in a .bat file and just run the batch file from Task Scheduler.
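Putting that together for the original goal of running a stored procedure every hour, the batch file can be as small as the sketch below; the instance name .\SQLEXPRESS and the procedure name are placeholders, not taken from the question:
@echo off
REM Call the stored procedure on the local SQL Express instance and log the output.
sqlcmd -S ".\SQLEXPRESS" -d MY_DATABASE -Q "EXEC dbo.MyHourlyProcedure" -o "C:\Users\Administrator\Desktop\dump.txt"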
I have recently had a similar issue and my experience may assist you. I was calling a small app (an EXE) from a batch file and scheduling the batch file to run from the Windows Task Scheduler. The app was accessing SQL data using Windows Authentication.
I could run the app directly, i.e. click on the EXE to run it.
I could run the app from the batch file.
But if I tried to run the scheduled task it seemed to start but did nothing and posted no errors that I could find.
I found if I changed the app to run with SQL Authentication it could be run from the Task Scheduler.
I suspect there is something about the context of the Windows Authentication when it is run from Task Scheduler that is not recognised by SQL.

How do I launch multiple sqlcmd windows from a batch file?

How do I launch multiple sqlcmd windows from a batch file that all point to the same database? For example, when I run the .bat file I want it to spawn N windows based on a parameter that I pass into it (e.g. 5). Each of these 5 windows should open on my desktop and all connect to the same database. That's what I want to do first. Once I have that working, I then want each of those 5 windows to run a distinct .sql script that performs inserts, queries, updates, deletes, calling stored procedures... essentially emulating a production environment to help us in debugging efforts (under a user load). I want to see the output of each .sql command flying by in the sqlcmd window while it is being executed.
I found:
http://hammerora.sourceforge.net/
which is a GUI tool focused on TPC-C load testing, but it is not exactly what I want. I bring it up because it is similar in concept to what I want to do, only driven by batch files on a smaller scale (e.g. 20 concurrent users max).
I created a system like this back in the late 90's for Oracle scalability testing but I've been out of the database business since then and can't remember how to do it and how different it would need to be to support SQL Server. So I know it is possible in Oracle, but just not sure about SQL Server given the command line tool and scripting capabilities.
Does anyone have any information about what it would take to make this work?
Ex. Create a launch3users.bat file that looks like:
sqlcmd -d MichaelTest -run this 1.sql file
Pause
sqlcmd -d MichaelTest -run this 2.sql file
Pause
sqlcmd -d MichaelTest -run this 3.sql file
Pause
where each of those would spawn a sqlcmd window and run the proper .sql script, which could do DML operations or call stored procedures.
Thanks,
Michael
You simply add "start" to the beginning of the commands.
start sqlcmd -d MichaelTest -i 1.sql
start sqlcmd -d MichaelTest -i 2.sql
start sqlcmd -d MichaelTest -i 3.sql
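If the number of windows should come from a parameter, as described in the question, a small loop over the same pattern works; a sketch that assumes the scripts are named 1.sql, 2.sql, ... and sit in the current directory:
@echo off
REM Usage: launchusers.bat 5  -> opens 5 sqlcmd windows, each running its own numbered script
for /l %%i in (1,1,%1) do (
    start "sqlcmd session %%i" sqlcmd -d MichaelTest -i %%i.sql
)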
