Running an exe on multiple PCs in sync - batch-file

I'm trying to run an exe on multiple PCs in sync.
I'm using psexec; this is what I have so far:
I have a batch file with this:
start psexec \\pc01 -i -s -d c:\videos360\video360.exe
start psexec \\pc02 -i -s -d c:\videos360\video360.exe
With this I can start the exe on the two PCs, but they are never fully in sync.
Does anyone have an idea of how I can make them run more closely in sync?
Thanks in advance.
Sorry for my bad English...

First, sync the clocks on both machines. You can run a script on one of them to sync to the other, or have them both sync to a central time source. Then add a task to Task Scheduler on each machine to start the application at the same time. That's about as close as you're going to get without resorting to some sort of IPC mechanism between the processes (which would require source-code access to video360.exe).
See schtasks.exe and the Windows Time service tools (e.g. w32tm /resync).
You won't need psexec, because schtasks can be used to manage tasks on the remote machines. It would be up to your script to change the next time the task fires, or you could set up a repetitive task that fires every minute or two and just enable/disable it. I believe there's a one-shot option as well.
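For example, a batch file along these lines could create that one-shot task on both machines (the task name "Video360" and the 14:30 start time are placeholders, and you may need the /u and /p switches to pass credentials for the remote systems):

@echo off
rem Create a one-shot task on each machine that fires at the same wall-clock time.
rem The machines' clocks must already be in sync for this to be accurate.
schtasks /create /s pc01 /tn Video360 /tr c:\videos360\video360.exe /sc once /st 14:30
schtasks /create /s pc02 /tn Video360 /tr c:\videos360\video360.exe /sc once /st 14:30

Because each task fires on its own machine's local clock, the clock-sync step above is what actually determines how close together the two launches are.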

Related

Quick Test Pro (11) not running when computer is locked

I need to run QTP scripts when I'm not at work, so I'm scheduling batch files (which call .vbs files) with Windows Task Scheduler to call the QTP code.
They run fine if the computer is not locked (I have tried both with the scheduled task and by running the batch file directly).
Even when the computer has been locked for an hour, the QTP scripts run fine.
But if the computer is locked for several hours (for example, if I leave work at 5 pm and I need to run the scripts after 12 am), the QTP scripts don't run: no error message pops up, QTP doesn't get stuck in a cycle, nothing happens at all.
Does anybody have an idea of what needs to be done to work this out?
Directly from the UFT help file; the same applies to QTP:
When running UFT tests or components on a local machine, if the computer on which the application is being tested is locked, your test run may fail.
Workaround:
Install UFT on a virtual machine (without a screen saver or lock password), and start or schedule your run session on the virtual machine. Then you can lock your local computer without locking the virtual machine.
Another workaround (not recommended):
Play a video on loop in Windows Media Player. This will prevent your machine from getting locked automatically.
In this scenario you can also auto-schedule the script through external free software like Auto-Sys.
There you can create jobs to unlock the machine and then run the regression.
You can simply use the utility below to keep your system unlocked:
https://sumeetkushwah.com/2015/11/07/windows-lock-prevention-utility/
Use the code below and save it as a .vbs file (SomeName.vbs):
Set WshShell = WScript.CreateObject("WScript.Shell")
Do
    WshShell.SendKeys "{CAPSLOCK}"
    WScript.Sleep 1000  ' 1000 ms = one second; choose an interval of your liking
    WshShell.SendKeys "{NUMLOCK}"
Loop
Double-click the saved .vbs file. Your computer will not get locked unless you manually kill the WScript.exe task in Task Manager. Use the keys of your choice from here: http://www.pctools.com/guides/scripting/detail/149/?act=reference

Speeding up PsExec against a list of computers

I was hoping I could have some help with this. I have about 300 computers on which I have to clear all the temporary internet folders and, separately, log the user off.
Currently I have a file located on the C: drive that has the following commands:
Clear temporary internet folders:
RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 255
Log off:
shutdown /l /t 0 /f
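Combined into a single clearcache.cmd, that might simply look like:

@echo off
rem clearcache.cmd - clear the temporary internet files, then force a log-off
RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 255
shutdown /l /t 0 /f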
Here is the psexec command I use to remotely execute that file on every computer:
@echo off
psexec @C:\_\complist.txt -u Admin -p PASSWORD -i c:\clearcache.cmd
The problem I am having is that it takes really long, about 30-35 minutes, because it does one computer at a time. Is there any way I can speed this up? Maybe instead of doing one at a time I can do them all at once?
If anyone can help improve the speed or help me find a better way, I would be very grateful.
Thanks
The other way round: instead of doing them all at once, do them separately. Sounds counterproductive? Parse complist.txt with batch and start psexec for each single computer instead of for a list of computers:
for /f "delims=" %%i in (C:\_\complist.txt) do (
    start "%%i" psexec \\%%i -u Admin -p PASSWORD -i c:\clearcache.cmd
)
start creates a new process executing psexec for every single computer and doesn't wait for it to finish, so they all run (nearly) in parallel.
You could look at SCCM to run a script like this; it's a lot of overhead if it's not in place yet, but it could be worth looking at for larger deployments in the future.
I have used PDQ Deploy in the past and it works well; they have a few levels of licensing, including a free one, depending on your business. See PDQ Deploy's licensing modes.
Also, have you considered pushing out a scheduled task that clears the temporary internet files on a set date? That task could also run the log-off command, but that could get messy; with the scheduled task in place, you could just run the log-off command like you have in the past, and it wouldn't be stuck waiting for the clean command to finish.
If you have a domain, you can use Group Policy to push the scheduled task out to the PCs, saving you from writing an import script and deploying it with psexec.exe; a sketch is below.
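For instance, a one-line import of such a task might look like this (the ClearCache task name and the 03:00 start time are placeholders; the same line could be distributed via Group Policy Preferences or run once per machine):

schtasks /create /tn ClearCache /tr c:\clearcache.cmd /sc daily /st 03:00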
Just a few ideas for you.

Writing a simple two-command CMD batch file for Task Scheduler

I am having an issue with the new version of Tableau (9.0.1) where I need to run tabadmin cleanup daily, or else several gigabytes of data will accumulate on my server, slowing things down and often leading to crashes if not taken care of regularly.
Can someone help me write a batch file that runs the following commands, so it can be scheduled every morning with Task Scheduler to avoid future issues?
Tableau is said to be releasing a patch for this issue in 9.1.0, but it is an issue we are dealing with currently.
Scripts to run:
cd C:\Program Files\Tableau\Tableau Server\9.0\bin
tabadmin cleanup
I would appreciate anyone's help who is familiar with writing batch files.
Thank you,
Connor
What if you put a sleep in it and put it in the Startup folder?
When the server starts, it would run the file; in the file you can sleep for X seconds and then run the commands. If you put that in a loop, you can have it clean every, say, 12 or 24 hours. Then, if you lose power or restart, it will just run again. A sketch of that approach is below.
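A minimal sketch, assuming a 24-hour interval and the 9.0 install path from the question (adjust both as needed); for the plain Task Scheduler route, drop the :loop, timeout, and goto lines and just schedule the remaining two commands:

@echo off
rem tabcleanup.cmd - loop forever, running tabadmin cleanup once per day
:loop
rem wait 24 hours (86400 seconds) between cleanup passes
timeout /t 86400 /nobreak
cd /d "C:\Program Files\Tableau\Tableau Server\9.0\bin"
tabadmin cleanup
goto loop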

Batch Script to SSH and deploy cron job via VI

First, I would just like to point out that I know very little about this subject matter. Okay, now that that is out of the way.
I am trying to set up a batch script that SSHes to a list of IPs (about 50) and creates a simple cron job to reboot the box it connects to every 24 hours, at midnight local time.
I already created the cron job in vi, but I have no idea how to make this batch script work. I have tried to piece a batch file together, but have had zero luck. Lastly, I think I should mention that I am writing the batch file on a Windows box and SSHing to a Linux shell. If there is anything you need, let me know and I will try to supply it.
Thanks in advance!
Edit: for clarity
May this be useful?
HERE
Inside the "commands.txt" used in the script I linked to:
shutdown -r -t sec 0 Rebooting
exit
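On the Windows side, a minimal sketch using PuTTY's plink.exe (the iplist.txt filename, the root account, and PASSWORD are assumptions; plink.exe must be on the PATH):

@echo off
rem deploy-cron.cmd - run commands.txt on every host listed in iplist.txt
for /f "delims=" %%i in (iplist.txt) do (
    plink -ssh -batch root@%%i -pw PASSWORD -m commands.txt
)

Rather than editing the crontab in vi over SSH, commands.txt could also install the job non-interactively, e.g. with (crontab -l; echo "0 0 * * * /sbin/shutdown -r now") | crontab - before the exit line.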

What is a good pattern to synchronize files between computers in parallel (in CentOS)?

Trying to find a good way to copy code between one "deployment" computer and several "target" computers, hopefully in parallel. The idea is that the deployment computer holds a copy of the files as they are supposed to be copied to the target servers. We would like to have copying happen in parallel, as it might involve several tens of target servers.
Our current scheme involves using rsync to synchronize the directory containing the files, in order to keep the target servers up to date with the deployment server.
So, the questions are:
What is a good / better way to do this?
What sort of tools are used to do this?
Should this problem be faced from a different angle or perspective that I'm totally missing?
Thanks very much!
Another option is pdsh, a parallel, distributed shell. It's available from EPEL, and allows running remote commands (via ssh) on multiple nodes in parallel. For example:
pdsh -w node10,node11,node12 command
Runs "command" on all three nodes in parallel. It also has a handy hostname expression feature to do the same thing with a bit less typing:
pdsh -w node[10-12] command
It also includes the pdcp command, which copies files to multiple nodes in parallel. (The pdsh package needs to be installed on all nodes for pdcp to work.)
pdcp -w node[10-12] /local/file /remote/dir/
The local file is copied to the /remote/dir on all three nodes.
We use the lftp command to sync our remote web server to our local backup machine. We wrote a Bash script to automatically sync all backups on the server to the local box, and we set that script up on a cron job to run nightly.
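A sketch of what such a script might look like (the host, credentials, and paths are all placeholders):

#!/bin/bash
# nightly-sync.sh - mirror the remote web root down to the local backup box
lftp -u backup,PASSWORD -e "mirror --verbose /var/www /backups/www; quit" sftp://webserver.example.com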
rsync is a fine way of handling this, and I might recommend moving your current protocol into a cron setup if it isn't already.
Unison is also a tool available for setting up two-way sync, if you require that functionality.
Hope this helps!
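A minimal sketch of the rsync-over-cron pattern suggested above, run in parallel from the deployment box (the paths and the hosts.txt file are assumptions):

#!/bin/bash
# push-code.sh - rsync the deploy directory to every target host in parallel
SRC=/srv/deploy/code/   # trailing slash: copy the contents, not the directory itself
DEST=/srv/code/
while read -r host; do
    rsync -az --delete "$SRC" "$host:$DEST" &
done < hosts.txt
wait  # block until every background rsync has finished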
There is a program called clusterssh, available on Debian-based operating systems (though I was able to install it onto RHEL 6.3 using an RPM and resolving the other dependencies), that lets you open SSH terminals to multiple machines with a single input location (you type once and it is sent to as many machines as you have terminals open). Then you just have to use a simple scp. I have used this program to move a file from a development workstation to as many as 25 other workstations at the same time, but it is only really useful for what you stated in the question: copying files from one computer to several others.
It is not an effective syncing mechanism; if you really want the machines to stay in sync, the answers above would be better.
