I have automated the build and deploy process in TFS by following http://www.codeproject.com/Articles/790206/Deploying-Web-Applications-using-Team-Foundation.
After deployment, I validate the deployed application by running Selenium scripts from a batch file whose path I supply in the "post-test script path" setting. It executes the batch file and runs the automated tests.
Now I want to publish these Selenium results, so I have created a Jenkins job with email configured. How do I execute this job post-deployment? I tried providing the Jenkins job's trigger email address in "post-test script path", but the setting expects an actual file path and throws an error. So, how do I execute a Jenkins job post-deployment?
Also, I am trying to build a complete automation process that automatically builds, deploys and runs some Selenium tests using TFS. If anybody has a better process, please let me know. Thanks.
You can use the Command Line task in the new TFS 2015/VSTS build system to easily execute Selenium tests.
You can then easily configure the run and pass variables.
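If the goal is to kick off the Jenkins job from the pipeline rather than by email, one common approach (an assumption on my part, not something from your current setup) is to enable "Trigger builds remotely" on the Jenkins job and call its build URL from the post-test batch file or a Command Line task. The host, job name and tokens below are placeholders:

curl -X POST "http://your-jenkins-host/job/selenium-results/build?token=YOUR_TRIGGER_TOKEN" --user youruser:your-api-token

This hands the "publish results" work to Jenkins as soon as the deployment step finishes.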
I would also recommend that you move to using release management tools for deployment. While continuous deployment makes sense for development environments, it is often not viable for a release pipeline, where you need more metadata and approvals.
You can do this with the release management tools that come with TFS 2013+.
In an Azure DevOps release pipeline, how do I get our Script.PreDeployment.sql and Script.PostDeployment.sql files to execute during our SQL Server database deploy task?
In our release pipeline we have a Database Deployment phase, which has a SQL Server database deploy task. The task publishes our DACPAC file just fine. However, I cannot figure out how to get the pre- and post-deployment scripts to execute.
Both scripts are included in the project, and both have the appropriate Build Action set to PreDeploy and PostDeploy. Yet the logs of the DACPAC deployment give no indication that the files were run - I have a bunch of PRINT statements in there.
I am also working on post-deployment scripts using the SSDT approach, and for deployment I use the Azure SQL DacpacTask in my Azure Pipeline. You just need to create the post-deployment script and save it; after you run the Azure build, it is added to the pipeline artifact and automatically executed by the Azure task in the release pipeline. First the database deployment is executed, and after that the post-deployment script runs. It works for me.
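For illustration, a typical SSDT post-deployment script is a plain .sql file that may pull in further scripts via the SQLCMD :r directive (the :r target below is made up):

-- Script.PostDeployment.sql (Build Action = PostDeploy)
PRINT 'Running post-deployment script';
:r .\PopulateReferenceData.sql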
You can make use of the Command line task to run those pre- and post-deployment SQL scripts using the SQLCMD tool.
The arguments to its execution script are:
-S {database-server-name}.database.windows.net
-U {username}@{database-server-name}
-P {password}
-d {database-name}
-i {SQL file}
If you store the pre/post-deployment scripts in the artifact, you can pass -i a path like $(System.DefaultWorkingDirectory)/drop/Script.PreDeployment.sql.
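Put together, the script for the Command line task might look like this (server, credentials and database are placeholders to fill in):

sqlcmd -S {database-server-name}.database.windows.net -U {username}@{database-server-name} -P {password} -d {database-name} -i "$(System.DefaultWorkingDirectory)/drop/Script.PreDeployment.sql"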
Once the post-deployment script is added to the project, it is integrated into the DACPAC.
The .sqlproj should contain a PostDeploy item group like this:
<ItemGroup>
  <PostDeploy Include="PostDeploymentScript path" />
</ItemGroup>
Sometimes the .sqlproj does not get updated and the post-deployment script does not run.
The entry should be added by default when the post-deployment script is added; just verify it is there before you publish.
I have set up Jenkins to automatically build my Maven project. I would like to extend this to automatically create database tables if they don't exist. Can this be done via Jenkins? I.e., can the Jenkins job be modified to have an additional pre-build step that creates the database tables? If so, can anyone give me any pointers as to how to do this?
If this is not possible directly in Jenkins, can it be done via Maven? I.e., modify the Maven build to create the tables before compiling the code.
Any help will be useful. Thanks in advance!
You could use the sql-maven-plugin to execute SQL scripts, or define pre-build steps in the Jenkins job.
See this similar question for more tips...
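As a rough sketch of the sql-maven-plugin approach mentioned above (MySQL, the JDBC driver coordinates and the script path are assumptions to adapt):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>sql-maven-plugin</artifactId>
  <version>1.5</version>
  <dependencies>
    <!-- JDBC driver for your database; MySQL is assumed here -->
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>5.1.49</version>
    </dependency>
  </dependencies>
  <configuration>
    <driver>com.mysql.jdbc.Driver</driver>
    <url>jdbc:mysql://localhost:3306/mydb</url>
    <username>dbuser</username>
    <password>dbpass</password>
  </configuration>
  <executions>
    <execution>
      <!-- bind to an early phase so the tables exist before compilation/tests -->
      <id>create-tables</id>
      <phase>process-resources</phase>
      <goals>
        <goal>execute</goal>
      </goals>
      <configuration>
        <srcFiles>
          <srcFile>src/main/sql/create-tables.sql</srcFile>
        </srcFiles>
      </configuration>
    </execution>
  </executions>
</plugin>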
If you prefer not to use Maven but a Jenkins solution, I would add a 'pre-build step' or a 'build step' (depending on your job type) of the type 'Execute shell' or 'Execute Windows batch command' and call the corresponding command-line tool of your database (in this case the MySQL command-line tool).
The command-line tool must be installed on the Jenkins server or be included in the project checkout.
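For example, an 'Execute shell' or 'Execute Windows batch command' step could run something like this (host, credentials and script path are placeholders):

mysql -h localhost -u dbuser -pdbpass mydb < create-tables.sql

If the CREATE TABLE statements use IF NOT EXISTS, the step stays idempotent and can safely run before every build.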
As a solution, I used a separate Jenkins job to run the SQL script via the Jenkins command-line interface.
I've been cobbling together some post-installation steps but am having difficulties, which I will go into below. Please recommend a suitable, ideally native, process to run some custom steps post-installation.
Using:
Visual Studio 2013 Ultimate
Setup:
VB.NET project & Visual Studio Installation Project
Test OS:
Win 7 x64 with Admin
Currently, the installer successfully installs the main application and extracts MS SQL Express + SQL scripts to a subdirectory. I have a batch file ("InstallSQL.bat") in the same directory, which silently installs SQL Express and then executes the SQL scripts.
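For reference, a sketch of what such a batch file typically contains; the setup file name, instance name and script are illustrative, not taken from the actual InstallSQL.bat:

@echo off
rem Silently install SQL Server Express from the same directory as this batch file
"%~dp0SQLEXPR_x64_ENU.exe" /Q /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=SQLEXPRESS /IACCEPTSQLSERVERLICENSETERMS
rem Run the database scripts against the fresh instance
sqlcmd -S .\SQLEXPRESS -E -i "%~dp0CreateDatabase.sql"

Using %~dp0 keeps the script location-insensitive, per the requirement below.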
So how best to execute the "InstallSQL.bat" script from an installer custom action, when Visual Studio doesn't support batch execution?
Methods I've tried:
Adding cmd.exe (32-bit & 64-bit) plus an installer custom action to launch the script, as per this post. For some reason, cmd.exe is executed with non-administrator credentials, and SQL setup fails.
Using a VBS script to launch the batch script. The VBS script does not run, and I get the error "A script required for this install to complete could not be run".
I am happy to consider an alternative approach to installing SQL Express and running the scripts that is not based on a batch file. However, my custom batch file works perfectly when run manually, i.e. not from the installer.
Please note, it needs to work on Windows XP and up, be location-insensitive (i.e. no static file locations) and, ideally, not use 3rd-party utilities. This last requirement is weak.
I do that with a configuration file:
"YourSQLExeFile" /CONFIGURATIONFILE="FullPath\YourConfigFile"
You did not specify which version of SQL Server you are using (2008/2012/2014, etc.). You can find examples with a Google search; otherwise, I may be able to share what I use if you reply with the SQL version.
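For illustration, a minimal ConfigurationFile.ini sketch; the option names follow the SQL Server 2012-era setup, so adjust for your version:

[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="SQLEXPRESS"
QUIET="True"
IACCEPTSQLSERVERLICENSETERMS="True"

An easy way to generate a complete file is to run setup once interactively; on the Ready to Install page it shows the path of a ConfigurationFile.ini you can reuse.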
This is all so error-prone that your best bet is to do it as part of the configuration of the app when it first runs.
Any custom action code that runs with user credentials (typically because it is a "just me" install) will not run elevated.
Any custom action code that runs in a per machine Everyone install will run elevated with the system account, which typically is not helpful because the system account doesn't have database credentials or network credentials (if the database is on a network share it doesn't work).
If your VBScript fails, you'll need to run the install with a verbose log to get diagnostics. A common error is to use the WScript object, but that's available only if you initiate the script with WSH, and Windows Installer is not WSH.
This is why it's better to do it after the install, initiated by the user rather than by an msiexec process running with the system account or impersonating the installing user.
Your best hope of doing this from the install is to write code that reads that script and configures the database. The SqlCommand class, for example, as here:
SQL Script with C#
Run SQL script
Run a SQL Script
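Along the lines of those links, here is a minimal C# sketch; the connection string and script path are placeholders, and note that plain SqlCommand does not understand the GO batch separator, so either avoid it or split the script on it:

using System.Data.SqlClient;
using System.IO;

class RunSqlScript
{
    static void Main(string[] args)
    {
        // args[0] = connection string, args[1] = path to the .sql script
        string script = File.ReadAllText(args[1]);
        using (var conn = new SqlConnection(args[0]))
        using (var cmd = new SqlCommand(script, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery(); // runs the whole script as one batch
        }
    }
}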
This executable needs a manifest requesting administrator privilege so when it runs it prompts for elevation. You fire off this program from your VBScript custom action. The point is that this elevation and the fact that it's shell-executed as a separate process gives some independence from the msiexec.exe service process that is calling your custom actions. Again, Everyone vs. Just me installs will make a difference, and you may find that your executable ends up running with the system account. The good news is that if this doesn't work from the install you just need to arrange for it to run the first time someone runs your app, when it will in a normal interactive user environment, and of course you can test and debug this in the normal way.
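For the manifest mentioned above, the relevant fragment of a standard Windows application manifest embedded in the helper executable looks like this:

<?xml version="1.0" encoding="utf-8"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
    <security>
      <requestedPrivileges>
        <!-- prompts for elevation when the exe is shell-executed -->
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>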
I'm trying to automate the SonarQube installation.
One (small) issue I'm running into is that after installation, during first access, while SonarQube is initializing the DB schema, we run into timeouts.
I'd like to know if there's a script/command to initialize the DB (create tables and all) from bash?
I've been digging on the internet and couldn't find an answer to that.
Thanks!
I'd like to complete the answers from Seb and Jeroen.
The schema is indeed created programmatically by SonarQube during startup. This step can't be executed independently in a script; just start the server. Instead of parsing logs, I suggest calling the web service GET http://<server>/api/system/status (see documentation at http://nemo.sonarqube.org/api_documentation/api/system/status) to know when the database is fully initialized.
A database upgrade when installing a new release can also be triggered through the web service POST http://<server>/api/system/migrate_db (see http://nemo.sonarqube.org/api_documentation/api/system/migrate_db).
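For example, a small polling loop from bash; the host and port are assumptions (the default is localhost:9000):

# wait until SonarQube reports that the database is fully initialized
until curl -s http://localhost:9000/api/system/status | grep -q '"status":"UP"'; do
  sleep 5
done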
Database initialization is built into SonarQube and cannot be run independently of starting SonarQube.
As suggested by @jeroen, you can indeed analyze the sonar.log file and wait for the web[o.s.s.a.TomcatAccessLog] Web server is started line.
You can build in a wait loop and/or analyze the SonarQube log file during startup.
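For example, assuming the default log location under the SonarQube installation directory:

# wait for the startup marker in the log
until grep -q "Web server is started" logs/sonar.log; do
  sleep 5
done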
I have a simple website built using PHP. It is hosted on a Linux server.
I need to run a PHP script every night. How do I do this?
Will an 'open-source job scheduler in java' be able to run a PHP script?
There are several possibilities.
If you have shell access to the machine, you can set up a scheduled task (a cron job, http://www.scrounge.org/linux/cron.html) to execute your script, either via the PHP command-line client or via a tool like curl or wget.
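For example, a crontab entry that runs the script every night at 02:30 via the PHP CLI; the paths are placeholders:

30 2 * * * /usr/bin/php /path/to/script.php >> /var/log/nightly.log 2>&1

The same entry with wget instead would fetch the script's URL, which is useful when the script relies on the web server environment.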
If you don't have shell access on that machine, there are several websites that offer free cron jobs. Basically, you give them a link and a schedule for when they should access that link. Just google for "online cron job".