Microsoft Release Management timeout during deploy step

We're using Microsoft's Release Management to deploy our web application to our test environment (QA). It is a straightforward MVC.NET web application. Our build generates a Web Deploy package, and we have a command script that sets some parameters based on the target environment (QA is just the first step) and then runs the standard Web Deploy command-line tool. The command script works without errors when run from the command line outside of Release Management.
When we move this process into Release Management using the command-line tool, we encounter a timeout during the deploy step of the workflow. The error is:
The installation command "powershell -command ./RunCommandLine.ps1 -FilePath 'Deployment\Deploy.cmd' -Arguments '/T:QA /E:intranet' -UserDomain 'domain' -UserName 'username' -UserPassword '*****'" reached the configured timeout (2 minutes); the process was terminated.
We've checked the output log and there is no information from the script at all. We have echo commands in the beginning that should at least dump some output to the log before any action is taken.
The interesting thing is that when we click the "Retry failed deployment" button, the retry succeeds in about 15 seconds without any issues. This happens for each release - fails with timeout, retry succeeds in 15 seconds.
Any ideas from any Release Management gurus are greatly appreciated.

In Release Management, what deployment step have you chosen? I am assuming a PowerShell command? Could you do it using an xcopy instead (that works for me)? I would also suggest you follow this blog post on debugging the Release Management agent; opening the agent in debug mode has solved my problems, or at least told me what the problem was, most of the time: http://blogs.msdn.com/b/visualstudioalm/archive/2013/12/13/how-to-enable-detailed-logs-and-collect-traces-from-various-release-management-components.aspx
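If you do try the xcopy route while debugging, the component only needs a command along these lines (a rough sketch; the drop path, target share, and site folder are placeholders, not from your setup):

REM Sketch of an xcopy-based deploy step (paths are placeholders)
REM /E copies subfolders including empty ones, /Y overwrites without prompting, /I treats the target as a folder
xcopy "%BUILD_DROP%\_PublishedWebsites\MyWebApp" "\\qa-webserver\c$\inetpub\wwwroot\MyWebApp" /E /Y /I
if errorlevel 1 exit /b 1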

Related

SSIS-job that executes SAS-scripts suddenly stopped working (but appears to run successfully)

We have a daily SSIS-job that runs in 3 SQL Server environments. The job runs an SSIS-project that executes multiple SAS-scripts (stored on a SAS-server per SSIS-environment).
There have never been any problems with this SSIS-project until now. Although the SSIS-job appears to run well and appears to execute the SAS-scripts (they complete with successful status), the scripts don't actually get executed (on any of the SAS-servers).
According to the Execution Log in SSMS, each SAS-script is successfully executed. However, it says that each script finished in under 1 second (a successful execution normally takes many minutes), and on the SAS-server side we don't find any trace of the scripts having been run (no log files are generated). (Screenshot: SSMS log example.)
The SAS-scripts were successfully executed on Monday (and each day before that, for multiple years). We haven't changed the SSIS-project or the SAS-scripts in any way since then. And there apparently haven't been any changes to our infrastructure, network, drivers, etc. Not for SAS, nor for SQL Server. At least not according to anyone we've been in touch with so far...
It appears as though SSIS thinks it's communicating with SAS, while SAS ignores SSIS. And there are no errors or warnings.
If I try to execute a SAS-script manually by running an individual SSIS-package, the same thing happens: It appears to run well, but doesn't actually execute the SAS-program.
The SAS-scripts are stored on a SAS-server and run via Execute Process tasks in SSIS-packages. The task arguments invoke the following script (unchanged for 3 years):
#!/bin/sh
# $1 = SAS environment name, $2 = SAS program file to run
dtstamp=$(date +%Y%m%d_%H.%M.%S)
sas_dir=/opt/sas94/$1/sashome/SASFoundation/9.4
script_dir="/data/DVHANALYSE/scripts/projects/monaco/current/sas_code"
log_dir="/data/DVHANALYSE/scripts/logs/monaco"
suff=".sas"
echo "This is running Monaco $2"
echo "env: $1" >> $log_dir/plink.log
echo "pgm: $2" >> $log_dir/plink.log
pgmname=$2
logname=${pgmname%"$suff"}"_$dtstamp.log"
#
$sas_dir/sas $script_dir/$pgmname -log $log_dir/$logname
#exit 0
Do you have any suggestions on how to troubleshoot this?
EDIT: I've enabled full logging on one of the SSIS-packages and executed it. As you can see on row 19, the process simply exits now. (Screenshot: SSIS logging output.)
Best guess is that the SAS license is expired.
This solved the problem:
We switched to SHA-256 host keys (for the communication between SSIS and SAS).
The Microsoft server (which was running PuTTY/plink v0.70) was upgraded to v0.75 (which supports SHA-256).
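One plausible reading of the fix above: when plink runs non-interactively (as it does from an Execute Process task) and cannot verify the server's host key, it abandons the connection without running the remote command, which would match the sub-second "successful" executions. A minimal sketch of pinning the new host key explicitly so such a failure becomes visible (host name, user, wrapper path, and fingerprint are placeholders; -batch and -hostkey are standard plink switches):

REM Run the wrapper script via plink with the server's SHA-256 host key pinned
REM (placeholder fingerprint and host; -batch makes plink fail fast instead of waiting for a prompt)
plink.exe -batch -ssh -hostkey "SHA256:AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" ^
  svc_user@sas-server "/path/to/run_sas.sh env1 program.sas"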

SSIS "Execute Process" Exit Code 253

General issue (lots of questions on this already):
I have an SSIS package that works from Visual Studio, but it fails from SQL Agent.
Specific Failure Point
Task: Execute Process
Executable invoked: aws cli
Error message: The process exit code was "253" while the expected was "0".
Things I've tried
Checked permissions on the executable and destination folder. They have execute and read/write permissions (respectively) for the SQL Agent user.
Looked up exit code 253; I haven't found any documentation on it.
Stripped away everything but the "Execute Process" task invoking AWS CLI. Still get error 253.
Tried multiple AWS CLI commands (s3api get-object, s3 ls). Still get error 253.
Conclusion
My main question is: what is exit code 253?
Thanks to @Dan Guzman for pointing me to the documentation on exit code 253.
Exit code 253
Essentially, the AWS CLI configuration isn't right.
The system environment or configuration was invalid. While the command provided may be syntactically valid, missing configuration or credentials prevented the command from running.
Why this code was returned
Visual Studio executes as User A (my account), while SQL Agent executes as User B (whatever user it's configured with).
I set up the AWS CLI on my account, but that wasn't enough. I need to configure it for User B as well.
CLI config details are stored in each user's home directory (~/.aws/config, ~/.aws/credentials). I can navigate to User B's home directory and set up config details there.
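As a sketch of what that looks like on disk (profile name, region, and keys below are placeholders), the two files under User B's profile would be:

C:\Users\<sql-agent-user>\.aws\credentials
[default]
aws_access_key_id = <access-key-id>
aws_secret_access_key = <secret-access-key>

C:\Users\<sql-agent-user>\.aws\config
[default]
region = us-east-1
output = json

Alternatively, if the SQL Agent account is a service account without a normal profile, the AWS_SHARED_CREDENTIALS_FILE and AWS_CONFIG_FILE environment variables can point the CLI at files in another location.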

Definitive/Guaranteed Approach to Execute Batch Script Post MSI Installation using Visual Studio 2013

I've been cobbling together some post-installation steps but am having difficulties, which I will go into below. Please recommend a suitable, ideally native, process to run some custom steps post-installation.
Using:
Visual Studio 2013 Ultimate
Setup:
vb.NET project & Visual Studio Installation Project
Test OS:
Win 7 x64 with Admin
Currently, the installer successfully installs the main application and extracts MS SQL Express + SQL scripts to a subdirectory. I have a batch file ("InstallSQL.bat") in the same directory, which silently installs SQL Express and then executes the SQL scripts.
So how to best execute the "InstallSQL.bat" script, when Visual Studio doesn't support batch execution, from an installer Custom Action?
Methods I've tried:
Add cmd.exe (32-bit & 64-bit) + an installer Custom Action to launch the script, as per this post. For some reason, cmd.exe is executed with non-administrator credentials, and SQL setup fails.
Use a VBS script to launch the batch script. The VBS script does not run, and I get the error "A script required for this install to complete could not be run".
I am happy to consider an alternative approach to installing SQL Express and running the scripts that is not based on a batch file. However, my custom batch file works perfectly when run manually, i.e. not from the installer.
Please note, it needs to work on Windows XP and up, be location-insensitive (i.e. no static file locations) and, ideally, not use 3rd-party utilities. This last requirement is weak.
I do that with a config file
"YourSQLExeFile" /CONFIGURATIONFILE="FullPath\YourConfigFile"
You did not specify what version of SQL Server you are using (2008/2012/2014, etc.). You can find examples with a Google search... otherwise I may be able to share what I use if you reply with the SQL version.
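For illustration, a minimal unattended-install configuration file for SQL Express could look roughly like this (a sketch only; verify each parameter against the setup documentation for your SQL Server version):

; ConfigurationFile.ini - sketch for a silent SQL Express install (parameters vary by version)
[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="SQLEXPRESS"
SQLSYSADMINACCOUNTS="BUILTIN\Administrators"
SECURITYMODE="SQL"
QUIET="True"
; license acceptance (and the SA password when SECURITYMODE="SQL") typically has to be
; supplied on the command line, e.g. /IACCEPTSQLSERVERLICENSETERMS /SAPWD="..."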
This is all so error-prone that your best bet is to do it as part of the configuration of the app when it first runs.
Any custom action code that runs with user credentials (typically because it is a "just me" install) will not run elevated.
Any custom action code that runs in a per machine Everyone install will run elevated with the system account, which typically is not helpful because the system account doesn't have database credentials or network credentials (if the database is on a network share it doesn't work).
If your vbscript fails you'll need to do the install with a verbose log to get diagnostics. A common error is to use the WScript object, but that's available only if you initiate the script with WSH, and Windows Installer is not WSH.
This is why it's better to do it after the install, initiated by the user rather than by an msiexec process running with the system account or impersonating the installing user.
Your best hope of doing this from the install is to write code that reads that script and configures the database. The SqlCommand class, for example, as here:
SQL Script with C#
Run SQL script
Run a SQL Script
This executable needs a manifest requesting administrator privilege so when it runs it prompts for elevation. You fire off this program from your VBScript custom action. The point is that this elevation and the fact that it's shell-executed as a separate process gives some independence from the msiexec.exe service process that is calling your custom actions. Again, Everyone vs. Just me installs will make a difference, and you may find that your executable ends up running with the system account. The good news is that if this doesn't work from the install you just need to arrange for it to run the first time someone runs your app, when it will in a normal interactive user environment, and of course you can test and debug this in the normal way.
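For reference, the elevation request in such a manifest is just a few lines (a standard app.manifest fragment, shown here as a sketch; the rest of the manifest content is omitted):

<?xml version="1.0" encoding="utf-8"?>
<assembly manifestVersion="1.0" xmlns="urn:schemas-microsoft-com:asm.v1">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
    <security>
      <requestedPrivileges>
        <!-- prompts for UAC elevation when the helper exe is shell-executed -->
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>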

Batch file's Core FTP line is Not running during Scheduled Task. Works if started Manually

I have a simple batch file which needs to be run weekly to upload some files via Core FTP.
I'm using the free version of Core FTP LE.
MySavedProfile is the Site Name of the saved profile I created using Core FTP's site Manager. The profile contains the URL / credentials / etc of the site to connect to.
Here are the contents of the batch file:
SET logf=confirm.log
echo test-start >> %logf%
"C:\Progra~1\CoreFTP\coreftp.exe" -B -s -pasv -O -site MySavedProfile -u "C:\Progra~2\PathToFiles\FileToUpload.txt"
echo test-finish >> %logf%
For the Windows Server 2012 r2 Task Scheduler, I have created a basic, weekly scheduled task on the Task Scheduler Library root which runs the batch file. For this scheduled task I have:
(Under the General tab)
"Run whether user is logged on or not" is selected
"Run with highest privileges" is checked
Configure for = Windows Server 2012 R2
(Under Actions)
Action = Start a program
Program / Script = "C:\Progra~2\PathToFiles\batch.bat"
Start in = C:\Progra~2\PathToFiles\
Here is the weird behavior I am getting:
If I double click on the batch file directly, it works fine and uploads the text file via Core FTP just fine.
However, if I try to let the Windows Task Scheduler run it, it runs everything except the Core FTP line. That is, I get the usual:
test-start
test-finish
in the confirm.log file, but the FileToUpload.txt has not been uploaded to the remote server, and there are no errors from CoreFTP that I can detect.
I have tried this with a service account that has permissions to run batch files, as well as my own account for this scheduled task. I get the same result: it doesn't seem to run that CoreFTP line. At least not via Task Scheduler. I need this upload to be automated.
I've searched Core FTP's documentation, Google, etc. No one seems to have run into this exact issue. I've applied recommendations from distantly related issues, but none of them have worked.
Any help would be greatly appreciated. Thank you.
The only way to do this is to use the full version of Core FTP (that is, Core FTP Pro). If you use the LE version, you have to select the "Run only when user is logged on" option.
This happens because of the splash screen the LE version shows at startup.
If you can't stay logged on permanently, you could create a user that is always logged on just for these tasks.
Remember to use the -Log option on CoreFTP to check if it is actually doing something.
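For example, you could extend the upload line so the scheduled run leaves some evidence behind (the -log flag here follows the suggestion above; check Core FTP's command-line documentation for its exact syntax, and the log path is a placeholder):

"C:\Progra~1\CoreFTP\coreftp.exe" -B -s -pasv -O -site MySavedProfile -u "C:\Progra~2\PathToFiles\FileToUpload.txt" -log "C:\Progra~2\PathToFiles\coreftp.log" >> %logf% 2>&1
echo coreftp exit code %errorlevel% >> %logf%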

Is it possible to execute a Perl script with admin rights or as a specific user?

I'm writing a Perl script in which I have to shut down my MSSQL server, do some operations, and then restart it. I know one way is to use net stop to stop the service, but I can't use that. So I tried installing the DBI and DBD::ODBC modules.
More info here: Shutdown MSSQL server from perl script DBI
But when I try to shut down my server using this command:
$dbh->prepare("SHUTDOWN WITH NOWAIT ");
It's not working for me.
I got this response from the community:
SHUTDOWN permissions are assigned to members of the sysadmin and serveradmin fixed server roles, and they are not transferable. I'd consider it unlikely (hopefully) that Perl is run with these rights.
So please tell me, is there a way to run the above command as one of these users? Or what else can I do? Note that I have a constraint that I can't simply stop it as a Windows service.
If the scripts are executed through a web browser then the user executing the scripts will be defined by the web server. It will probably not be a good idea to fiddle with this user. Just leave things as they are.
What you can do is create a Perl script that is run by a privileged user on a regular basis with cron.
This cron-driven script can check for specific content, such as a file written by another script whose executing user has lesser privileges.
So the way it could work is as follows (a rough sketch follows the list):
You execute browser.cgi through a browser to do a specific task.
browser.cgi writes instructions to a file.
Every minute, privileged.cgi executes via cron. (The root user could execute privileged.cgi.)
privileged.cgi reads the instruction file browser.cgi has written and starts and stops services accordingly.
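A rough sketch of what the privileged side could look like (file names, paths, and the start/stop commands are placeholders; the answer frames this as Perl CGI scripts, but the polling logic is the same in any language):

#!/bin/sh
# privileged_dispatch.sh - hypothetical sketch; run from root's crontab every minute:
#   * * * * * /usr/local/bin/privileged_dispatch.sh
cmd_file=/var/spool/myapp/instructions.txt
[ -f "$cmd_file" ] || exit 0            # nothing requested
action=$(cat "$cmd_file")
rm -f "$cmd_file"                       # consume the instruction so it runs only once
case "$action" in
  stop)  /usr/local/bin/stop_database.sh ;;   # placeholder for the real stop command
  start) /usr/local/bin/start_database.sh ;;  # placeholder for the real start command
esac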
