I've got this script in a batch file:
cmdkey /generic:"servername" /user:"user id" /pass:"password"
mstsc /v:"servername"
...which logs me into a remote server over Remote Desktop. It works great, but it doesn't log me in all the way: the server is configured with an interactive logon message, so a dialog appears when I first connect and I have to click OK before the sign-in actually completes.
We have a problem with our admins shutting this server down at night to apply updates, which logs us out and keeps our scheduled tasks from refreshing. I want to assign this batch file to a scheduled task on my local PC so that every morning, before my scripts run, I'm logged back into the server.
Is there a way to get this batch file to bypass or acknowledge this message so it finishes signing me in?
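For reference, here is roughly how I plan to register it on my PC (a sketch only - the task name, the C:\Scripts\reconnect.bat path, and the 06:00 start time are placeholders I made up):
:: Sketch: save the cmdkey/mstsc lines above as C:\Scripts\reconnect.bat, then register a morning task
schtasks /create /tn "MorningRDPReconnect" /tr "C:\Scripts\reconnect.bat" /sc daily /st 06:00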
Related
Afternoon everyone,
I've tried to research this topic in depth and I cannot come to a conclusion for my problem. I'm trying to automate a batch file in Task Scheduler to execute two SSIS packages. Currently, when I execute the scheduled task (either waiting for its set schedule or running it on demand), Task Scheduler shows that the task completed successfully and the "Status" continues to say "Running", but the destination files are never created or re-created. This is the script:
dtexec /f "D:\SSIS\Folder\Folder\Folder\Package.dtsx"
dtexec /f "D:\SSIS\Folder\Folder\Folder\Package.dtsx"
The SSIS packages are supposed to pull information from SQL Server and export it to a CSV, which they do wonderfully... if I execute this script in CMD or PowerShell, or run my batch file directly.
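In case it helps, here is a logging variant of the same two lines that should show what actually happens when Task Scheduler runs it (a sketch - the C:\Scripts\ssis_run.log path is just an example; the package paths are unchanged from above):
:: Sketch: capture dtexec output and exit codes so a failure under Task Scheduler leaves a trace
set log=C:\Scripts\ssis_run.log
echo %date% %time% starting >> "%log%"
dtexec /f "D:\SSIS\Folder\Folder\Folder\Package.dtsx" >> "%log%" 2>&1
echo first package exit code %errorlevel% >> "%log%"
dtexec /f "D:\SSIS\Folder\Folder\Folder\Package.dtsx" >> "%log%" 2>&1
echo second package exit code %errorlevel% >> "%log%"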
About my environment:
My script resides on a SQL Server (Windows Server 2016 Standard).
I have a domain admin account used for scripting permissions (that domain admin has been added with full permissions to all parent folders, the batch file itself, and the CSV destination).
The scheduled task is set to: "Run whether user is logged on or not", "Run with highest privileges", Configured for Windows Vista and Windows Server 2008. I know my credentials are correct for my domain admin account.
In "Actions", "Program/script:" is currently set to "C:\Scripts\file.bat", there is nothing in "Add Arguments (optional):" currently, "Start in (optional):" is set to "C:\Scripts".
So here is what I've tried:
I've set "Program/script:" to "cmd.exe" and added an argument of "file.bat" with a start in as "C:\Scripts", no dice.
I've set "Program/script:" to "Powershell.exe" and set an argument of "-ExecutionPolicy Bypass C:\Scripts\file.bat" with a start in as "C:\Scripts", no luck again.
I added my domain admin account to the local administrators group on the server as well.
I've changed the user/group from the scripting domain admin account to the domain admin account I'm actually logged into the server with and set it to "Run only when user is logged on". With that set, when I run the scheduled task, CMD flashes on screen and disappears before I can read anything (still far too fast for the script to actually run - it takes ~20 seconds), and the destination file isn't altered.
(This one really stumps me.) I've added the script to another scheduled task on the server; that task runs on schedule and completes every line in the batch file except these two. The other scheduled task uses the same domain admin account and the same settings across the board, and it even runs other similar SSIS packages using "dtexec". I don't get it.
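For the cmd.exe attempt in the first item above, the fully spelled-out form would be something like this (a sketch on my part - the /c switch, the full script path, and the task.log redirect are assumptions, not what I currently have configured):
Program/script:  cmd.exe
Add arguments:   /c "C:\Scripts\file.bat >> C:\Scripts\task.log 2>&1"
Start in:        C:\Scripts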
Thanks for any input anybody can give me, it's greatly appreciated.
I know this is a super old post, but I just had the same issue and wasn't successful with any other popular solutions around StackOverflow, so I want to put out an alternative solution for anyone still struggling!
When in doubt, double check the user account in the Security Options under Properties (right-click on Task > Properties > first page under "Security Options").
Even though the user it had selected by default should have had permission to execute the script, I had to change the user account to one with higher privileges (I'm on a work computer). For me, that meant selecting the Administrators group on my particular desktop environment.
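If you would rather script the change than click through the Properties dialog, the same switch can be made from an elevated command prompt - a sketch, where the task name, account, and password are placeholders for your own:
:: Sketch: re-point an existing task at a more privileged account
schtasks /change /tn "MyNightlyTask" /ru "MYDOMAIN\svc_admin" /rp PlaceholderPassword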
Our company has a remote server on which we use Task Scheduler to run .bat files.
It usually asks for user credentials, so I put in my own username/password for this remote server to run these .bat scripts.
However, this seems like poor practice. Yesterday, for the first time in 3 years, we had to change our Windows passwords. Lo and behold, the .bat tasks that still ran under the old username/password silently failed (they returned a logon error, but that only shows up in the scheduler).
Is there any way to get around this? Failure alerts from the scheduler? Not requiring a user at all, or a permanent service account whose password never changes?
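One option I have been looking at (a sketch only - the task name, path, and time are made up) is recreating the task to run under the built-in SYSTEM account, which has no password to expire; the trade-off is that SYSTEM cannot reach network shares:
:: Sketch: a task running as NT AUTHORITY\SYSTEM stores no password, so password rotations can't break it
schtasks /create /tn "NightlyBatch" /tr "C:\Scripts\job.bat" /sc daily /st 02:00 /ru "NT AUTHORITY\SYSTEM"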
I have a psftp script that gets files from a server every hour. The script works well when I manually run the .bat file from its location, and it also runs well from SQL Server Agent while I am logged in/monitoring it.
However, when I am not logged in, the script hangs and continues to run forever, creating an empty log file.
Since this is an hourly run, it also blocks all subsequent runs, creating delays in the data load.
Could anyone advise what could possibly be causing this rather bizarre issue?
The log files created for each run are independent and timestamped, so it's not a lock on the log files causing the hang.
Even though there is a prompt, a prompt file with the value Y in it is supplied, which lets the script run without explicitly entering a value at the prompt.
The script has been checked for pauses and other timeouts (not really applicable, since it works seamlessly when I am logged in).
I tried setting --trust-model=always, and the script still hangs when I am not logged in.
That said, I am not really sure whether it has anything to do with me being logged in or whether it's just a coincidence. Basically, I am never able to catch the issue happening while I am monitoring it!
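For reference, this is the general shape of the call, plus the hardening flags I am experimenting with (a sketch - the host, user, password, fingerprint, and file names are placeholders; -batch makes psftp fail fast instead of waiting on a hidden prompt, and -hostkey pre-approves the server key for an account that has never cached it):
:: Sketch: fail fast on any interactive prompt and pre-approve the host key
psftp myuser@ftp.example.com -pw PlaceholderPassword -batch -hostkey "SHA256:placeholder-fingerprint" -b get_files.txt >> C:\Jobs\psftp_run.log 2>&1
echo psftp exit code %errorlevel% >> C:\Jobs\psftp_run.log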
This is probably a permissions issue. First, check which user will be running the SQL Agent job when you are not logged in. Then ensure this user has full permissions/admin rights.
I think you might need admin rights to shell out, which I presume you are doing to run psftp.
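A quick way to see which account that is (assuming a default SQL Server instance - the service name is different for named instances):
:: Shows SERVICE_START_NAME, the account the SQL Server Agent service (and, by default, its job steps) runs under
sc qc SQLSERVERAGENT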
I have a simple batch file which needs to be run weekly to upload some files via Core FTP.
I'm using the free version of Core FTP LE.
MySavedProfile is the site name of the saved profile I created using Core FTP's Site Manager. The profile contains the URL, credentials, etc. of the site to connect to.
Here are the contents of the batch file:
SET logf=confirm.log
echo test-start >> %logf%
"C:\Progra~1\CoreFTP\coreftp.exe" -B -s -pasv -O -site MySavedProfile -u "C:\Progra~2\PathToFiles\FileToUpload.txt"
echo test-finish >> %logf%
For the Windows Server 2012 r2 Task Scheduler, I have created a basic, weekly scheduled task on the Task Scheduler Library root which runs the batch file. For this scheduled task I have:
(Under the General tab)
"Run whether user is logged on or not" is selected
"Run with highest privileges" is checked
Configure for = Windows Server 2012 R2
(Under Actions)
Action = Start a program
Program / Script = "C:\Progra~2\PathToFiles\batch.bat"
Start in = C:\Progra~2\PathToFiles\
Here is the weird behavior I am getting:
If I double click on the batch file directly, it works fine and uploads the text file via Core FTP just fine.
However, if I try to let the Windows Task Scheduler run it, it runs everything except the Core FTP line. That is, I get the usual:
test-start
test-finish
in the confirm.log file, but the FileToUpload.txt has not been uploaded to the remote server, and there are no errors from CoreFTP that I can detect.
I have tried this with a service account that has permissions to run batch files, as well as my own account for this scheduled task. I get the same result: it doesn't seem to run that CoreFTP line. At least not via Task Scheduler. I need this upload to be automated.
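In case it is useful, here is a debugging variant of the batch file - the same CoreFTP line, but with its exit code written to the log so a silent failure under Task Scheduler at least leaves a trace (a sketch; the absolute log path is an assumption on my part):
SET logf=C:\Progra~2\PathToFiles\confirm.log
echo test-start %date% %time% >> %logf%
"C:\Progra~1\CoreFTP\coreftp.exe" -B -s -pasv -O -site MySavedProfile -u "C:\Progra~2\PathToFiles\FileToUpload.txt"
echo coreftp exit code %errorlevel% >> %logf%
echo test-finish %date% %time% >> %logf%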
I've searched Core FTP's documentation, Google, etc. No one seems to have run into this exact issue. I've applied recommendations from distantly related issues, but none of them have worked.
Any help would be greatly appreciated. Thank you.
The only way to do this is to use the full version of Core FTP (that is, Core FTP Pro). If you use the LE version, you have to check the "Run only when user is logged on" option.
This happens because of the splash screen at the beginning.
If you can't stay logged on forever, you could create a user that is always logged on just for these tasks.
Remember to use the -Log option on CoreFTP to check if it is actually doing something.
I am getting this error message running a batch job with TeamCity. The batch job copies files from the TeamCity server to another server (Server2). I have checked multiple times: the folders have all the permissions needed, and this works fine (copies files between servers) when the batch job is run manually from the command prompt. I get this error for each file that needs to be copied:
error MSB3021: Unable to copy file "..\bin\Release\Boo.Lang.Compiler.dll" to "\\Server2\DestinationFolder\". Could not find a part of the path '\\Server2\DestinationFolder'.
[10:54:32]: Creating directory "\\Server2\DestinationFolder".
I tried a few things, but the issue remains unresolved. Thanks for your input.
The TeamCity build agent is running under the System user account, which has no access to network resources. You should change the service user to an account that has network permissions, like your Administrator account.
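If the agent runs as a Windows service, the account can be switched from an elevated prompt - a sketch, assuming the default TCBuildAgent service name and a placeholder domain account:
:: Sketch: re-point the build agent service at an account with network access, then restart it
sc config TCBuildAgent obj= "MYDOMAIN\buildagent" password= "PlaceholderPassword"
sc stop TCBuildAgent
sc start TCBuildAgent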
See also the related question.