I am trying to create a scheduled SQL Agent job that executes a .bat file with a gsutil rsync command to synchronize a local folder with a Google Cloud Storage bucket. The job runs without error, but never finishes and does not actually perform the rsync.
I can, however, manually run the .bat file successfully as the root user, since I have a key handshake with the GCP service account. I can also successfully execute other, non-gsutil .bat files from the SQL Agent job, since I have created a credential and proxy to run as the root user.
The .bat file looks like this:
gsutil rsync -r G:\MyBackupFolder gs://my-coldline-storage-bucket
and I am running the SQL Agent step as type Operating system (CmdExec)
Related
I am using Jenkins for CI, and in one of my jobs' Jenkinsfiles
I try to run a batch file that lives on a Jenkins slave agent (Windows 7), and the batch file is as below:
@echo on
"<path1>\eclipse.exe" -noSplash -application org.eclipse.cdt.managedbuilder.core.headlessbuild -data E:\jenkins\build_workspace -cleanBuild <project name>/Debug
exit
and in my Jenkinsfile, it is like:
pipeline {
    agent { label "buildserver" }
    stages {
        stage('Build') {
            steps {
                bat 'call <path2>\\abc.bat'
            }
        }
    }
}
and the total build time is about 16-18 minutes.
But when I log in to the build server and run the batch file directly in a command-prompt window, the build takes only about 4-5 minutes.
I do not know what is going on.
This issue has been solved. It was caused by the Jenkins agent being installed as a Windows service on the Windows 7 slave agent, probably a compatibility issue between Windows 7 and Jenkins.
Refer to Jenkins job windows batch execution 20 times slower than executing in cmd.exe.
I have installed Jenkins at IP address 1.1.1.01, and a .bat file exists on a remote file server at IP address 1.1.1.02 (that address may differ by user, because I will pass the IP address as a parameter).
Can I deploy that .bat file through a Jenkins pipeline?
You first need to check, independently of Jenkins, whether you can access 1.1.1.02 (or any other remote server IP) from 1.1.1.01, assuming 1.1.1.01 is the server executing your job.
If, from 1.1.1.01, you can SSH to 1.1.1.02, or scp to 1.1.1.02, then you can copy a file (such as your .bat file) from 1.1.1.01 to 1.1.1.02, or vice versa.
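As a sketch of that copy step from a Windows command line, assuming an scp client (such as the one bundled with OpenSSH or Git for Windows) is installed on the Jenkins server; the user name, paths, and parameter name below are all assumptions, not from the question:

```bat
REM %TARGET_IP% is a hypothetical job parameter holding the remote address.
REM deployuser and both paths are placeholders; adjust to your environment.
scp C:\jenkins\scripts\abc.bat deployuser@%TARGET_IP%:scripts/abc.bat
```

In a pipeline, the same line can be run from a bat step.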
I'm trying to use Scheduled Tasks to run a database backup once per day, using sqlcmd pointed at a SQL script.
If I open a command prompt and run the code in the batch file, everything is successful, but when Task Scheduler runs it, it gets a 0x2331 error.
A search reveals it's something to do with permissions; maybe I need to change the path to the .sql file?
Here is the script:
@ECHO OFF
ECHO This is running to backup a database
sqlcmd -S COMPNAME -i C:\backup.sql
ECHO Success!
EXIT
I'm new to batch files, so I tried to keep it as simple as possible.
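One thing worth checking: 0x2331 is hexadecimal for 9009, the exit code cmd.exe returns when a command is not recognized, which suggests the task's environment cannot resolve sqlcmd on its PATH. A sketch that sidesteps PATH entirely by fully qualifying the executable (the install path below is a guess at a typical SQL Server client location, not from the question):

```bat
@ECHO OFF
REM Fully qualify sqlcmd so the scheduled task does not depend on PATH.
REM The Tools\Binn path is hypothetical; check where sqlcmd.exe lives on your machine.
"C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\sqlcmd.exe" -S COMPNAME -i C:\backup.sql
```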
I can successfully run a gsutil command with a Windows domain account from the command line in Windows (after setting up the service account key, etc.). When I try to run the same command from a SQL Agent job using a CmdExec task, the job hangs and doesn't complete. I can't see any logging, so I have no clue what it's waiting for. I've set up the job to run with the same proxy user that I use to run the gsutil command manually.
Any ideas how I can get this to work or how to see more logging?
Are you using standalone gsutil? Or did you get it as part of installing the Cloud SDK (gcloud)?
If the job hangs for a long time, it could be stuck retrying multiple times. To test whether this is the case, you can set the num_retries option to a very small value above 0 (e.g. 1), either in your .boto file or via the command arguments with this option:
gsutil -o 'Boto:num_retries=1' <rest of command here...>
A second thing to note (at least for the version of gsutil that doesn't come with gcloud) is that gsutil looks for your boto config file (which specifies the credentials it should use) in your home directory by default. If you're running gsutil as a different user (maybe your SQL Agent Job runs as its own dedicated user?), it will look for a .boto file in that user's home directory. The same should apply for the gcloud version -- gcloud uses credentials based on the user executing it. You can avoid this by copying your .boto file to somewhere that the job has permission to read from, along with setting the BOTO_CONFIG environment variable to that path before running gsutil. From the cmd shell, this would look something like:
set BOTO_CONFIG=C:\some\path\.boto && gsutil <rest of command here...>
Note: If you're not sure which boto config file you're normally using, you can find out by running gsutil version -l and looking at the line that displays your config path(s).
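Putting these two suggestions together, a wrapper .bat for the SQL Agent step might look like the sketch below. The .boto location and log path are assumptions, and the output redirection is added so the otherwise-silent CmdExec step leaves a trace:

```bat
@ECHO OFF
REM Point gsutil at a .boto file the SQL Agent proxy user can read
REM (C:\gsutil\.boto is a hypothetical location).
set BOTO_CONFIG=C:\gsutil\.boto
REM Cap retries at 1 so a credential problem fails fast instead of hanging,
REM and capture all output for later inspection (log path is hypothetical).
gsutil -o "Boto:num_retries=1" rsync -r G:\MyBackupFolder gs://my-coldline-storage-bucket > C:\Logs\gsutil_rsync.log 2>&1
```

If the job still hangs, the tail of the log file should show the last operation attempted.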
I have a .bat file that runs sqlcmd to execute a .sql script. When I run it manually as the user the Jenkins service runs under, everything works fine. But when Jenkins runs the job to execute the same .bat file, I get a login failure.
I am running the .bat through Jenkins like this:
C:\MasterFiles\ListingsManagement\LH.bat > batResults.txt
The .bat looks like this:
sqlcmd -S <Server> -E -i C:\MasterFiles\ListingsManagement\test.sql ^
 -o C:\MasterFiles\ListingsManagement\results.txt
and when run through Jenkins the error message is:
Sqlcmd: Error: Microsoft SQL Server Native Client 11.0 : Login failed for user '<domain>\svc_jenkins'..
Server name and Domain name have purposefully been removed.
SQL Server is 2014 Enterprise, Jenkins is at 1.645, and all of this is running on a Windows 7 VM.
When you run the command manually, are you certain you are logged in as <domain>\svc_jenkins on the same machine Jenkins runs on?
Can you confirm that the contents of the environment variables USERDOMAIN and USERNAME match <domain> and svc_jenkins when you are logged in manually and run the script?
One of those tests should flag a problem.
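The identity check can be sketched as two lines added at the top of the .bat (whoami ships with Windows; the output file path is a hypothetical choice):

```bat
REM Record exactly which account the script runs under in each context.
ECHO Running as %USERDOMAIN%\%USERNAME% >> C:\MasterFiles\ListingsManagement\who.txt
whoami >> C:\MasterFiles\ListingsManagement\who.txt
```

Run the script once manually and once from Jenkins, then compare the two entries in the file.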