I am having an issue with the new version of Tableau (9.0.1) where I need to run tabadmin cleanup daily, or else several gigabytes of data pile up on my server, slowing things down and often leading to crashes if not taken care of regularly.
Can someone help me write a batch file that runs the following commands every morning via Task Scheduler, to avoid future issues?
Tableau is said to be releasing a fix for this issue in 9.1.0, but it is something we are dealing with right now.
Scripts to run:
cd C:\Program Files\Tableau\Tableau Server\9.0\bin
tabadmin cleanup
I would appreciate help from anyone who is familiar with writing batch files.
Thank you,
Connor
What if you put a sleep in it and place it in the Startup folder?
When the server starts it would run the file, and in the file you can sleep for X seconds before running the commands. If you wrap that in a loop, you can have it clean up every 12 or 24 hours, say. Then if you lose power or the machine restarts, it will just start running again.
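A minimal sketch of that idea, assuming the 9.0 install path from the question (adjust the path and the interval as needed):
@echo off
rem Sketch only: run the cleanup once a day in an endless loop.
cd /d "C:\Program Files\Tableau\Tableau Server\9.0\bin"
:loop
tabadmin cleanup
rem timeout waits 86400 seconds (24 hours) before the next pass.
timeout /t 86400 /nobreak
goto loop
If you go the Task Scheduler route from the question instead, drop the :loop/timeout/goto lines and just point a daily task at the file.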
Related
I'm trying to run an exe on multiple PCs in sync.
I'm using psexec; this is what I have so far:
I have a batch file with this:
start psexec \\pc01 -i -s -d c:\videos360\video360.exe
start psexec \\pc02 -i -s -d c:\videos360\video360.exe
With this I can start the exe on the 2 PCs, but never fully in sync.
Does anyone have an idea how I can make them run more closely in sync?
Thanks in advance.
Sorry for my bad English...
First sync the clocks on both machines. You can run a script on one of them to sync to the other or have them both sync to a central time source. Then add a task to Task Scheduler on each machine to start the application at the same time. That's about as close as you're going to get without resorting to some sort of IPC mechanism between the processes (requires source code access to video360.exe).
See
schtasks.exe
Windows time service tools
You won't need psexec, because schtasks can be used to manage tasks on the remote machines. It would be up to your script to change the next time to fire the task, or you could set up a repetitive task that fires every minute or two and just enable/disable it. I believe there's a one-shot option as well.
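A rough sketch of that approach, assuming the account running it has admin rights on pc01/pc02 and picking an arbitrary 14:30 start time:
rem Sync both clocks against the same source first (see w32tm /? for more options).
w32tm /resync /computer:pc01
w32tm /resync /computer:pc02
rem Create a one-shot task on each machine that fires at the same wall-clock time.
schtasks /create /s pc01 /tn "Video360" /tr "c:\videos360\video360.exe" /sc once /st 14:30
schtasks /create /s pc02 /tn "Video360" /tr "c:\videos360\video360.exe" /sc once /st 14:30
If the exe needs to appear in the logged-on user's session (what psexec -i was doing), you may also need to supply that user's account with /ru.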
I need to run QTP scripts when I'm not at work, so I'm scheduling batch files (which call .vbs files) with Windows Task Scheduler to launch the QTP code.
They run fine if the computer is not locked (I have tried both the scheduled tasks and running the .bat directly).
Even when the computer has been locked for an hour, the QTP scripts run fine.
But if the computer is locked for several hours (for example, if I leave work at 5 pm and need to run the scripts after 12 am), the QTP scripts don't run (no error message pops up, QTP doesn't hang, nothing happens at all).
Does anybody have an idea what needs to be done to work this out?
Directly from UFT help file, same applies to QTP:
When running UFT tests or components on a local machine, if the computer on which the application is being tested is locked, your test run may fail.
Workaround:
Install UFT on a virtual machine (without a screen saver or lock password), and start or schedule your run session on the virtual machine. Then you can lock your local computer without locking the virtual machine.
Another workaround (not recommended):
Play any video on loop in Windows Media Player. This will prevent your machine from getting locked automatically.
In this scenario you can auto-schedule the script through an external free tool like Auto-Sys.
There you can create jobs to unlock the machine and then run the regression.
You can simply use the utility below to keep your system unlocked:
https://sumeetkushwah.com/2015/11/07/windows-lock-prevention-utility/
Or use the code below and save it as a .vbs file (SomeName.vbs):
Set WshShell = WScript.CreateObject("WScript.Shell")
Do
    ' Toggle Caps Lock and Num Lock so Windows registers keyboard activity
    WshShell.SendKeys "{CAPSLOCK}"
    WScript.Sleep 1000 ' 1000 means one second; choose the interval you like
    WshShell.SendKeys "{NUMLOCK}"
Loop
Double-click the saved .vbs file. Your computer will not get locked unless you manually kill the WScript.exe task in Task Manager. Use the keys of your choice from here: http://www.pctools.com/guides/scripting/detail/149/?act=reference
First, I would just like to point out that I know very little about this subject matter. Okay, now that that is out of the way:
I am trying to set up a batch script to SSH to a list of IPs (about 50) and create a simple cron job to reboot each box it connects to every 24 hours at midnight local time.
I already created the cron job in vi, but I have no idea how to make this batch script work. I have tried to piece a batch file together but have had zero luck. Lastly, I think I should mention that I am making the batch file on a Windows box and SSHing to a Linux shell. If there is anything you guys need, let me know and I will try to supply it.
Thanks in advance!
Edit: for clarity
Maybe this will be useful:
HERE
Inside the "commands.txt used in the script i linked to:
shutdown -r -t sec 0 Rebooting
exit
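On the Windows side, a rough sketch assuming plink.exe (from PuTTY) is on the PATH, ips.txt holds one IP per line, and the user name and password are placeholders:
@echo off
rem Run the commands in commands.txt on every box listed in ips.txt.
for /f %%i in (ips.txt) do (
    plink -ssh root@%%i -pw YourPassword -m commands.txt
)
Key-based authentication (plink -i yourkey.ppk) would avoid putting the password in the batch file.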
Long story short: we have multiple servers on which we run perfmon logging every night. My job is to convert these logs to .csv format and send them to my e-mail.
This bit is already automated via a .sh script an ex-employee wrote.
What I want to automate is a batch job that runs after the perfmon logging, looks at a specific folder, finds the latest .blg file, and runs the .sh script on it (the script is called upload), so that I don't have to log in to each server and do it manually.
e.g.
upload myInitials cd /cygdrive/someLocation/logs/$latestFile$.blg
myInitials and the location can be hard-coded... I just wouldn't know how to find the latest file in the folder and automate it all via a batch file.
Any pointers would be very helpful!
# Jeremy:
Sorry, I probably should have mentioned in my question that the servers are running 2003 and 2008.
I don't think it would be absolutely necessary to register a change notification on the folder. If the log runs from noon till 7 in the morning, the script will run immediately after (you can set a script to run after a perfmon log has finished, in the log properties), so the log will almost definitely be the latest file in the folder anyway.
Like I said, I already have a .sh file in place to convert to .csv and send to my e-mail. I just need to incorporate it into a batch file so that, instead of going to each of the servers, opening up Cygwin, and typing upload xx /cygdrive/location/logs/xyz.blg, it runs automatically straight after the log has finished, without me having to RDC into it.
Thanks for the input!
If you have a shell script and your job is to call it from a Windows batch file, then this will work. It assumes Cygwin is installed in C:\Cygwin.
Contents of start_cyg.bat
@echo off
set PATH=%PATH%;C:\Cygwin\bin
rem bash --login -i
bash "/cygdrive/d/cyg.sh"
Contents of cyg.sh
#!/bin/bash
TAIL=`ls -lrt | tail -1`
echo "TAIL:$TAIL"
If you call start_cyg.bat from a Windows command prompt, you will get the output of cyg.sh in the console.
For getting the newest file in a directory, ls -1tr | tail -1 should work.
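If you would rather do the newest-file lookup on the Windows side and just hand the result to the existing upload script, here is a hedged sketch (the drive letter, log path, and initials are placeholders from the question, and it assumes upload is on the PATH of a Cygwin login shell):
@echo off
set PATH=%PATH%;C:\Cygwin\bin
rem dir /b /o-d lists newest first, so grab the first .blg and stop.
for /f "delims=" %%f in ('dir /b /o-d E:\someLocation\logs\*.blg') do (
    set "latest=%%f"
    goto found
)
:found
rem Call the existing Cygwin upload script on the newest log.
bash --login -c "upload xx /cygdrive/e/someLocation/logs/%latest%"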
First, I don't know if it would meet your requirements, but the Windows Task Scheduler 2 in Vista+ is very robust and can even trigger tasks based on event log entries. However, extraction and parsing of that log entry may require some scripting, and might have concurrency issues, even if that log entry did indicate the last used process. Chances are none of this is helpful, but just throwing it out there.
Programmatically, it would be simple, as you can register a change notification on a folder. When a change occurs, you go find the latest file, then launch the batch file to launch your shell script, or whatever your desired sequence may be.
I think Cygwin may even support change notification events via scripting, though I'm unsure. I believe there are Linux extensions for this, but I may be wrong.
If it were me, I'd just write a little C++ app to do whatever I wanted... but for you, maybe any (or more likely none) of the above helps ;o.
I have a simple batch file as seen below that should extract a zip file to the root of E:. The zip file is valid and I can run the batch file from the command line just fine.
Instead of completing the task, it continues to inform me that the Status is "Running". The problem is, it is not running and the file never gets unzipped.
The task is running as a Domain Admin that has been specifically added as an Admin on the box.
Are there any known problems with using zip files in Scheduled Tasks? I actually have this same problem on 3 out of the 12 boxes this task runs on, but there is no rhyme or reason as to why some servers work and others don't.
Any ideas on how to debug what is going on, or a solution would be very helpful.
Here is the batch file I'm attempting to run.
SET RootPath=E:
SET WinzipLocation=E:\Program Files\WinZip
"%WinzipLocation%\winzip32" -e -o %CD%\TestZipFile.zip %RootPath%
Try to use the WinZip Command Line Support Add-on.
What if you use 7-Zip from the command line?
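For example, the WinZip line could become something like this, assuming 7-Zip is installed in its default location (x extracts with full paths, -o sets the output directory, -y answers yes to any prompts):
SET RootPath=E:
"C:\Program Files\7-Zip\7z.exe" x "%CD%\TestZipFile.zip" -o%RootPath%\ -y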
I realized after posting that the "bad" servers were all 64-bit, and I was running the 32-bit version of WinZip. Since the company I work for doesn't see the benefit in purchasing any software, I had no other option but to start using 7-Zip. I have not tested for any performance increases or hits, but I do note that it works, regardless of the environment.
Thanks for the answers, but it looks like without the 64-bit version of WinZip, I have no other options.