Task Scheduler isn't opening web page - batch-file

I was asked to write a script that updates our database every hour. It's hosted on our website and updates the database every time the page is visited. I used Windows Task Scheduler to control the automation part of this process. I know for a fact that the code involved in updating the database is correct, but now I'm beginning to question whether or not my .bat file is right. This all started happening once I started another script that does the same thing for another web page. This is the first .bat file:
taskkill.exe /f /im iexplore.exe
start http://website.com/scriptA.php
The second consists of this:
::taskkill.exe /f /im iexplore.exe
start http://website.com/scriptB.php
I'm aware of master-slave database replication, but since that's an automatic process that runs whenever the database is updated, we decided against it; we only want these scripts to run at set intervals. The first file is set to run every hour starting at 9:45 a.m., and before the second script was added I verified that the database showed a "last updated" timestamp of X:45 (where X is the current hour). The second file runs every four hours starting at noon, and I also verified that it shows a "last updated" timestamp of X:00.
What is causing this? I can't waste time constantly checking to see whether or not the database is getting updated properly, and some of our inventory information relies on these databases. If it's worth anything, the scripts are hosted on the same server, which is the same machine I'm using the scheduler on.

The first line of the second script does nothing. It is illegal code: cmd parses :: as a label (an invalid one at that), not as a comment, even though many people believe it is comment syntax. If you want a real comment, use REM:
REM taskkill.exe /f /im iexplore.exe
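As an aside, launching Internet Explorer just to visit a URL (and then having to taskkill it) is fragile. The scheduled task could request the page directly instead; here is a minimal sketch in Python, where the URL is a placeholder for your real endpoint:

```python
import urllib.request

def trigger_update(url, timeout=30):
    """Request the update URL once and return the HTTP status code."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status

# e.g. trigger_update("http://website.com/scriptA.php")
```

Scheduling a script like this (or an equivalent curl or PowerShell call) avoids leaving browser windows behind entirely.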


Move files that are being executed from batch file

First of all, it's my first time around here, so tell me if anything is wrong :D
This is the actual situation: I have some tasks running more or less constantly (scheduled via Windows Task Scheduler) in a live environment, while I develop in a dev environment. When I want to push updates to live, what I usually do is drag and drop the current files from the live environment into a backup folder. This way, the tasks that were running keep running from the new backup folder. At that point, I copy the updated files into the live folder, so the next time the tasks execute, they are already updated.
But it turns out to be kind of annoying every time I want to include new things, so I wanted a little batch script that does all of this automatically. The thing is that it seems impossible to reproduce the drag-and-drop behaviour from batch.
I have tried the usual "move" command, creating symbolic links, and other copying tools such as Robocopy, but none of them worked; I always get an error because my tasks are executing all the time.
Is there a way around this problem? Mainly, is it possible to reproduce the "drag and drop" functionality from a batch script or similar?
Thank you very much!
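For what it's worth, a same-volume drag and drop in Explorer is a rename of the directory entry rather than a file-by-file copy, which may be why it succeeds where copy-based tools fail. A script can attempt the same rename; here is a sketch in Python, assuming the live and backup folders sit on the same volume (all paths are placeholders):

```python
import os
import shutil

def deploy(live_dir, backup_dir, staged_dir):
    """Rename the live folder aside, then copy the staged build into place."""
    os.rename(live_dir, backup_dir)        # same-volume rename, like drag and drop
    shutil.copytree(staged_dir, live_dir)  # the fresh copy becomes the new live folder
```

If the rename still fails with a sharing violation, something is holding the folder itself (not just the files inside it) open, and no move strategy will succeed until that handle is released.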

Automated task using pymssql runs fine during the day, but not at night; why?

I have a web scraping job that needs to be executed each evening. Our company has a virtual machine with Windows Task Scheduler available, so I created a new scheduled task to run every evening at 3 a.m.
Initially, the process did exactly as expected: it fetched the data and inserted it into our database. A few nights later, the website we were scraping kept shutting down for maintenance, so I went into Task Scheduler, changed the start time to 10:30 p.m. instead of 3:00 a.m., and waited until the next morning.
The script executed completely with no exceptions, but nothing was entered into the database! Task Scheduler even reported that the script ran to completion, and it took the usual amount of time to run, but alas, no new rows.
One might posit that there was no new data to fetch. However, when I execute the script manually from the command line (keeping the start date/end date the same), it uploads the usual ~10,000 rows into the database. So there is data, but it only gets written to the database when we launch the script manually during the day, not when it runs on schedule in the evening.
Does anyone know a potential reason as to why this happens?
Thank you in advance.
Edited to add:
I understand that this question might sound a little ridiculous, especially since on the surface there doesn't seem to be a single obvious cause. If I can provide any further background information, feel free to ask.
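One thing worth checking: if the script derives its start/end dates from the current time, a 10:30 p.m. run and a 3:00 a.m. run fall on different calendar days, so the scheduled evening run may query a window that contains no data, while a daytime manual run with the dates pinned finds the usual rows. A sketch of the pitfall, using entirely hypothetical date logic:

```python
from datetime import datetime, timedelta

def scrape_window(now):
    """Hypothetical date logic: scrape 'yesterday' relative to the run time."""
    end = now.date()
    return end - timedelta(days=1), end

# The 3:00 a.m. run and the previous evening's 10:30 p.m. run disagree:
night = scrape_window(datetime(2013, 5, 14, 3, 0))
evening = scrape_window(datetime(2013, 5, 13, 22, 30))
print(night, evening)  # the two windows are one day apart
```

Also worth ruling out: pymssql connections do not autocommit by default, so a code path that exits before calling connection.commit() runs clean but writes nothing.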

Change Data Capture (CDC) Performance issues

I am using CDC in SQL Server 2008 and triggering it via SSIS (in the form of stored procedures and calls to commands like table-diff in EST). Recently I came across an odd issue and I simply do not understand what the reason might be. What happened is that CDC suddenly hung while executing: it stayed in a running state for almost two days, after which we had to kill the CDC process manually in order to run the next batch. When I ran the same CDC job the next time, it completed properly without any interruption, finishing in a few minutes, and no changes had been made to the package or anywhere else before that execution. While the process was stuck for those two days I checked the changed records, and except for 2-3 new inserts/updates there was nothing. After going through the log file details I found that my table-diff command took around 5 hours to execute, whereas it normally takes around 5 minutes for my package every day. I just want to know what the possible reason for such behaviour might be.
PS: Such behaviour has occurred a few times in the past as well.

Scheduled task runs but launches batch only after minor modification

I have a problem on Windows server 2008 R2.
I scheduled a task that runs daily at a fixed time using a domain account; it launches a .bat file that calls an executable, passing some parameters.
It worked for months, then stopped working without any apparent reason: the task history shows that the task actually starts and reports no errors, but the executable does nothing.
I noticed that:
If I am logged on remote desktop at the time of execution the task works.
If I make a minor modification to the task (change the start time by 1 second) and save it then the task resumes normal behavior even if I am not logged.
Other similar tasks on the same server running on the same account work properly.
This is the second time that a scheduled task on this server has given this problem.
It seems that something in the task gets corrupted, and forcing a save restores normal operation.
I already checked:
the task runs properly if launched manually;
permissions on the .bat and the .exe are full control for the task account;
the password for the account did not change;
the "Start in" field is the .bat path;
there are no mapped drives involved;
the task runs with a domain account which is also a machine admin, and it is configured with highest privileges and, of course, to run whether the user is logged on or not.
Anyway, the task worked properly for months and resumed working after simply forcing a save, so the configuration should be OK.
Do you have any advice to detect and prevent this issue?
Unfortunately, the only thing I can think of is to add a step to your first task that outputs a timestamp, then create a second scheduled task that checks the timestamp to make sure it's current.
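That timestamp check can be small; here is a sketch in Python, where the stamp path and staleness threshold are placeholders. The real task would call the writer at the end of a successful run, and the second scheduled task would alert when the stamp goes stale:

```python
import os
import time

def write_stamp(path):
    """Called at the end of the real task: record the time of a successful run."""
    with open(path, "w") as f:
        f.write(str(time.time()))

def is_stale(path, max_age_seconds=26 * 3600):
    """Called by the watchdog task: True if the last run is missing or too old."""
    if not os.path.exists(path):
        return True
    with open(path) as f:
        return time.time() - float(f.read()) > max_age_seconds
```

The 26-hour default gives a daily task a little slack; tighten it to fit the real schedule.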
Out of curiosity, when this happens, does the task still show as enabled when checked programmatically?
Dim service, rootFolder, task   ' VBScript variables are untyped, so plain Dim is used
' Create the TaskService object.
Set service = CreateObject("Schedule.Service")
' strHostname etc. are placeholders for your connection details.
service.Connect strHostname, strUsername, strDomain, strPassword
' Get the task folder that contains the tasks.
Set rootFolder = service.GetFolder("\")
Set task = rootFolder.GetTask(strNameOfTask)
WScript.Echo task.Name & " (Enabled: " & task.Enabled & ") Last Run: " & task.LastRunTime
Also, when you look at C:\Windows\System32\Tasks, is the XML file for the task still valid?
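That check can be scripted as well; a minimal sketch (the task file path is a placeholder) that simply confirms the task definition still parses as XML:

```python
import xml.etree.ElementTree as ET

def task_xml_is_valid(path):
    """True if the scheduled-task definition file is still well-formed XML."""
    try:
        ET.parse(path)
        return True
    except ET.ParseError:
        return False

# e.g. task_xml_is_valid(r"C:\Windows\System32\Tasks\MyNightlyJob")
```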
Without knowing exactly which part of the process broke, it's tough to tell where to look for a solution.

SSIS 2005 Process Task Timeout Issue

Here is what I am trying to do.
I have a perl script that browses a website, fills out a form, and submits the form. This process takes about 1 minute to complete and will initiate an asynchronous process on the website to create a report and drop it to an FTP site.
After the form is submitted I would like to kill the process and report success.
After a period of time (hours) I will go to the FTP site to pick up the reports that were generated from the website.
To accomplish this I have a batch file which calls the perl script with a parameter for the report type to run. This works fine, and when I call the batch script from the SSIS Process Task it works too. However, I want the Process Task to terminate with success after 5 minutes, so I set the timeout to 300 seconds, but it still terminates with a failure.
Does anyone know how to make the process task report success so it will continue on to the next task in the package?
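If the Process Task's own timeout insists on reporting failure, another option is to move the timeout into a small wrapper that the task invokes: the wrapper kills the child after the limit but still exits 0, since by then the form submission has long since completed. A sketch in Python (the perl command line is a placeholder):

```python
import subprocess
import sys

def run_and_ignore_timeout(cmd, seconds=300):
    """Run cmd; if it is still alive after `seconds`, kill it and report success."""
    try:
        subprocess.run(cmd, timeout=seconds)
    except subprocess.TimeoutExpired:
        pass  # the submission finished long ago; a lingering process is not a failure
    return 0

# e.g. sys.exit(run_and_ignore_timeout(["perl", "submit_report.pl", "daily"]))
```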
Try turning the connecting line into a 'Completion' precedence constraint. I'm guessing the line is currently coloured green (Success); right-click on it and change it to 'Completion', and it should turn blue.
The first task will still terminate with a failure, but you will progress to the next task no bother.
Let me know if this helps,
James
