Execute Process Task fails to execute Rust script in SQL Server Agent job

I have written a small Rust script for an import job. I added the script execution to the SSIS package through an Execute Process Task and set the correct working directory. Run through the debugger, the script works without a problem, as it does when executed normally, e.g. through cmd or PowerShell.
When I start the job in SSMS as an Agent Job, the package fails with
The process exit code was "-1073741515" while the expected was "0".
What I tried
Replacing the script with a very basic Rust script that just writes a single line to a file in the working directory (sketched below), to rule out the possibility that the script somehow panics, but it still fails.
The script is compiled with the i686-pc-windows-msvc toolchain; with a 64-bit build the script does not run on the server at all.
Permissions look okay for the executable's directory and the working directory: Full Control on both folders.
Since manual execution works on the server, I think this is most likely a permissions issue, but I can't for the life of me figure out what is wrong.
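A minimal sketch of the kind of test script described above (the file name is made up; the real import code is not shown):
// main.rs: write one line into the working directory and exit with code 0
use std::fs::File;
use std::io::Write;

fn main() -> std::io::Result<()> {
    // relative path, so it resolves against the working directory set on the task
    let mut file = File::create("canary.txt")?;
    writeln!(file, "the Execute Process Task reached this point")?;
    Ok(())
}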

Okay, the problem was indeed the default SQL Server Agent service account that ran the job. If I run the job step under a proxy account, it runs fine.
I'll have to look into the configuration of the Agent service account, but until then, proxy account it is.
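For anyone hitting the same thing, the proxy setup boils down to a credential plus an Agent proxy granted to the SSIS subsystem; roughly like this (the names and password are placeholders, and the subsystem id should be verified against msdb.dbo.syssubsystems on your instance):
-- store the Windows account the job step should run as
CREATE CREDENTIAL ImportJobCredential
    WITH IDENTITY = N'DOMAIN\ImportJobUser', SECRET = N'password-goes-here';

-- create an Agent proxy that uses the credential
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'ImportJobProxy',
    @credential_name = N'ImportJobCredential',
    @enabled = 1;

-- allow the proxy to run SSIS package steps (subsystem_id 11 here; check syssubsystems)
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'ImportJobProxy',
    @subsystem_id = 11;
The proxy then shows up in the job step's Run as drop-down.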

Related

WinSCP script works fine manually, but fails when executed by SQL Server Agent job

I have a WinSCP script that SFTPs a file from a remote server to a local directory. The script works fine when I execute it from the command line, but when I try to execute it as a command step in a SQL Server Agent job, the job fails. All the history tells me is:
Executed as user: NT Service\SQLSERVERAGENT. The step did not generate any output. Process Exit Code 1. The step failed.
which could mean anything. I've tried to break the problem down to its core by stripping everything out of the script except an exit statement. It still fails, so I know the issue is not in the script. Any thoughts?
Posting in case anyone else runs into this problem. The WinSCP FAQ has a little section on these kinds of problems, but it misses one ridiculously simple possibility: does the job have access to the script being executed?
Regardless of who created the script, the command will be executed by the SQLSERVERAGENT account. If you're like me, you keep your scripts under your user account's home directory, which SQLSERVERAGENT doesn't have access to. Move the script to a directory it can read and see if that fixes it for you. I used the Users\Public directory.
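If you want to confirm (or fix) the access directly from a command prompt, something like this works; the path is just an example:
REM show which accounts currently have access to the script
icacls "C:\Users\Public\getfile.txt"

REM grant the Agent service account read access explicitly
icacls "C:\Users\Public\getfile.txt" /grant "NT SERVICE\SQLSERVERAGENT":(R)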

SSIS in SQL 2014 running an Execute Process Task failing

I'm running cmd.exe inside an SSIS Execute Process Task and it's failing with exit code 1.
If I run the command outside of the package, as the same user the SQL Agent job runs under, it runs fine and gives exit code 0.
I've seen some DCOM errors in Event Viewer and I've taken steps to grant permissions to the user the package runs as through the SQL Agent job. However, it's still failing.
Are there other things I should check about running a CMD command across servers as a specific user?
Just to say, this ultimately turned out to be a permissions issue: the process needed to write a file inside the default folder and couldn't. It wasn't manifesting as such until I dumped out a bit more of the log. I had to make the user the Agent job runs as a member of the Administrators group on the SSIS server to allow the process to work.

PowerShell script run using Windows Task Scheduler is not running a SQL Server SSIS DTEXEC.EXE job

Hopefully this question is unique enough not to be a duplicate. I have a PowerShell script which does two things.
Inserts records into a SQL Server table
Writes text to a text file
For the purpose of this post, I have simplified the script. On my computer, the script is located at C:\Temp\ssis.ps1. Following is the contents of the script.
DTEXEC.EXE /F "C:\Temp\ssisjob.dtsx"
$date = Get-Date
Write-Output "This PowerShell script file was last run on $date" >> C:\Temp\test.txt
When I manually run this PowerShell script, records are inserted into the SQL Server table, and a line of text is written to the test.txt file. If I schedule this script to run using Windows Task Scheduler, a new line of text is written to the text file, but the records are not inserted into the SQL Server table. This tells me that Windows Task Scheduler is able to run the PowerShell script. However, for some unknown reason, Windows Task Scheduler seems to not want to run the SSIS job (DTEXEC.EXE) part of the script. Event Viewer confirms there is an issue with the SSIS job. I am running Microsoft SQL Server 2014, Developer Version.
In my task, on the Actions tab, the Add arguments field has the following reference: C:\Temp\ssis.ps1. Task Scheduler is configured to run with the highest privileges.
I have tried all of the following Execution Policies in PowerShell. Regardless of the Execution Policy I select, my experience does not change.
Bypass
Unrestricted
RemoteSigned
The History tab in Task Scheduler has information events, but no error events.
I do not have the permission to view the SQL Server logs (this is a production server).
I have been debugging this issue for a few weeks, and I have read numerous posts here on Stack Overflow, yet I still cannot seem to find the answer, so hopefully I have done my due diligence before making a new post here. I could add some additional observations, but I do not want my post here to get extensively long. If anyone has any hints or tips or insight that might lead me down the right path, it would be greatly appreciated.
Here is the solution I came up with. Instead of exporting the file to Excel, I exported to a flat file (txt file). Also, following Nick McDermaid's excellent recommendations, instead of using PowerShell in Task Scheduler, I started dtexec.exe directly from Task Scheduler.
Task Scheduler Actions tab
Keep the action as Start a program
In Program/script, type dtexec.exe
In Add arguments, type /f "C:\path\to\example.dtsx"
Leave the Start In box empty

Task in Task Scheduler failing when executing batch file

I have a task set up on a Windows Server 2008 R2 machine. The task is set up in Task Scheduler to execute a batch file that backs up a Mongo database every 4 hours. I have it set up the same way on 2 servers.
On 1 server it runs fine.
On the other, I get this error logged in the history and it doesn't execute.
Task Scheduler failed to start "\Backup MongoDb" task for user "*****". Additional Data: Error Value: 2147750687.
I'm at a loss as to what the issue may be. Anyone got any ideas?
My workaround is to call a .BAT file from the Task Scheduler.
This batch file then calls the PowerShell script file:
powershell c:\dir1\AutoPopulate.ps1
Seems to work.
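The .bat file itself can be as small as this (the path comes from the example above; the extra switches are optional but tend to avoid profile and execution-policy surprises under a scheduled task):
@echo off
REM wrapper so Task Scheduler only has to start a plain batch file
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "c:\dir1\AutoPopulate.ps1"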
I have run into this a few times. Error value 2147750687 is 0x8004131F, which generally means an instance of the task is already running. Check the last run result on the task; if it says another instance is already running, right-click the task and click "End". After it ends, the task should start correctly the next time. You can also right-click and choose "Run" to run it immediately.
Check the Security options of your task.
Make sure the option "Run whether user is logged on or not" is selected.

Executing a stored procedure using Windows Task Scheduler

I've been trying to set up a schedule in Windows Task Scheduler to run a stored procedure every hour (I'm using SQL Express and can't install third-party tools), but after trying various methods, such as running a .bat file from Task Scheduler, or opening the sqlcmd utility from Task Scheduler and passing either command-line syntax or a .sql script file, I'm having no luck.
I know this can be done and therefore I'm sure it's something I've missed but if anyone can share their experience of this I'd very much appreciate it.
The following command is in the batch file...
sqlcmd -E -i"C:\Users\Administrator\Desktop\test.sql" -o"C:\Users\Administrator\Desktop\dump.txt"
Thanks a lot
If you are an admin on the SQL instance (since you are using SQL Express, I bet you are trying to do this on your own computer, so there is a high chance your user is an admin of the instance), you do not need -E at all; just leave it out.
Second, specify the server even if you are working locally.
Start with a simple sql command like below:
sqlcmd.exe -S "." -d MY_DATABASE -Q "SELECT * FROM MY_TABLE"
Replace MY_DATABASE and MY_TABLE with your database name and table name. Make sure you can run it from the command line; it should return the data from your table. (Beware: command-line options are case sensitive, so -s is not the same as -S.)
Last, do not try to feed parameters through task scheduler. Put the command with all parameters in a .bat file and just run the batch from task scheduler.
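Putting that together for the hourly stored procedure, the batch file could look something like this (server, database, procedure, and log path are placeholders):
@echo off
REM run the stored procedure and keep the output for troubleshooting
sqlcmd.exe -S "." -d MY_DATABASE -Q "EXEC dbo.MyHourlyProcedure" -o "C:\Jobs\MyHourlyProcedure.log"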
I have recently had a similar issue and my experience may assist you. I was calling a small app (an EXE) from a batch file, and scheduling the batch file to run from the Windows Task Scheduler. The app was accessing the SQL data using Windows Authentication.
I could run the app directly i.e. click on the EXE to run it.
I could run the app from the batch file.
But if I tried to run the scheduled task it seemed to start but did nothing and posted no errors that I could find.
I found if I changed the app to run with SQL Authentication it could be run from the Task Scheduler.
I suspect there is something about the context of the Windows Authentication when it is run from Task Scheduler that is not recognised by SQL.
