Automated database table creation via Jenkins

I have set up Jenkins to automatically build my Maven project. I would like to extend this to automatically create database tables if they don't exist. Can this be done via Jenkins? That is, can the Jenkins job be modified to have an additional pre-build step that creates the database tables? If so, can anyone give me pointers on how to do this?
If this is not possible directly in Jenkins, can it be done via Maven? That is, can the Maven build be modified to create the tables before compiling the code?
Any help will be useful. Thanks in advance!

You could use the sql-maven-plugin to execute SQL scripts, or define pre-build steps in the Jenkins job.
See this similar question for more tips...
If you prefer a Jenkins solution rather than Maven, I would add a 'pre-build step' or a 'build step' (depending on your job type) of the type 'Execute shell' or 'Execute Windows batch command' and call your database's command-line tool (in this case the MySQL command-line tool).
The command-line tool must either be installed on the Jenkins server or be included in the project checkout.
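As a sketch of such a pre-build step, a small Python wrapper could invoke the MySQL command-line tool against an idempotent DDL script (the host, user, database, and file names below are assumptions, not from the original answer):

```python
import subprocess

def mysql_command(host, user, database):
    # Build the mysql CLI invocation; supply the password via the
    # MYSQL_PWD environment variable so it never appears on the command line.
    return ["mysql", "--host", host, "--user", user, database]

def run_ddl_script(host, user, database, script_path):
    # The DDL script should use CREATE TABLE IF NOT EXISTS so this
    # pre-build step is safe to run on every build.
    with open(script_path) as ddl:
        subprocess.run(mysql_command(host, user, database),
                       stdin=ddl, check=True)

# In the Jenkins 'Execute shell' step this could be called as, e.g.:
# run_ddl_script("db.example.com", "jenkins", "myapp", "create_tables.sql")
```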

As a solution, I used a separate Jenkins Job to run the SQL script via the Jenkins command line tool interface.


Is there any way to change the SQL Server name in Python code based on the environment?

In my Python script I establish the SQL Server DEV connection, and I call this script from my SSIS package. Now I want to deploy the project to the production server.
Q: How should the SQL Server connection be changed from development to production automatically/dynamically, without editing the script manually? Is there a way for the script to detect the production environment?
Please help me out with this, thank you.
There are a few common options:
- Read it from sys.argv, passing the server name as a command-line parameter.
- Pull it from an environment variable via os.environ.
- Read it from a config file with configparser.
Without any sample code it's hard to say what the right approach should be, but I would favor a command-line parameter, as that allows you to provide the value from the SSIS package (instead of defining configurations in both SSIS space and Python space).
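A minimal sketch combining the three options, with the command-line parameter taking precedence (the variable and section names, such as SQL_SERVER, are assumptions for illustration):

```python
import configparser
import os
import sys

def resolve_server(argv=None, environ=None, config_path=None):
    """Pick the SQL Server name: command-line arg, then env var, then config file."""
    argv = sys.argv if argv is None else argv
    environ = os.environ if environ is None else environ
    if len(argv) > 1:            # e.g. python etl.py PRODSERVER01
        return argv[1]
    if "SQL_SERVER" in environ:  # e.g. set by the SSIS execution environment
        return environ["SQL_SERVER"]
    if config_path:
        cfg = configparser.ConfigParser()
        cfg.read(config_path)
        return cfg["database"]["server"]
    raise ValueError("no SQL Server name supplied")
```

From the SSIS package, the server name would then be passed as an argument on the Execute Process Task's command line.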

Definitive/Guaranteed Approach to Execute Batch Script Post MSI Installation using Visual Studio 2013

I've been cobbling some post installation steps but having difficulties, which I will go into. Please recommend a suitable, ideally native, process to run some custom steps post installation.
Using:
Visual Studio 2013 Ultimate
Setup:
VB.NET project & Visual Studio Installation Project
Test OS:
Win 7 x64 with Admin
Currently, the installer successfully installs the main application and extracts MS SQL Express + the SQL scripts to a subdirectory. I have a batch file ("InstallSQL.bat") in the same directory, which silently installs SQL Express and then executes the SQL scripts.
So, given that Visual Studio doesn't support batch execution from an installer custom action, how do I best execute the "InstallSQL.bat" script?
Methods I've tried:
Adding cmd.exe (32-bit & 64-bit) + an installer custom action to launch the script, as per this post. For some reason, cmd.exe is executed with non-administrator credentials, and the SQL setup fails.
Using a VBS script to launch the batch script. The VBS script does not run, and the error "A script required for this install to complete could not be run" is raised.
I am happy to consider an alternative approach to install SQL Express and run scripts, not based on a batch file. However, my custom batch file works perfectly when run manually i.e. not from the installer.
Please note, it needs to work on Windows XP and up and be location insensitive (i.e. no static file locations), ideally without using 3rd-party utilities. This last requirement is a soft one.
I do that with a configuration file:
"YourSQLExeFile" /CONFIGURATIONFILE="FullPath\YourConfigFile"
You did not specify which version of SQL Server you are using (2008/2012/2014, etc.). You can find examples with a Google search; otherwise I may be able to share what I use if you reply with the SQL Server version.
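For illustration, a minimal SQL Server Express configuration file for a silent install might look like the sketch below; the exact option set varies by SQL Server version, so treat these values as assumptions and check them against the setup documentation for your release:

```ini
; ConfigurationFile.ini - minimal silent-install sketch for SQL Server Express
[OPTIONS]
ACTION="Install"
QUIET="True"
IACCEPTSQLSERVERLICENSETERMS="True"
FEATURES=SQLENGINE
INSTANCENAME="SQLEXPRESS"
SQLSYSADMINACCOUNTS="BUILTIN\Administrators"
```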
This is all so error prone that your best bet is to do it as part of the configuration of the app when it first runs.
Any custom action code that runs with user credentials (typically because it is a "just me" install) will not run elevated.
Any custom action code that runs in a per machine Everyone install will run elevated with the system account, which typically is not helpful because the system account doesn't have database credentials or network credentials (if the database is on a network share it doesn't work).
If your vbscript fails you'll need to do the install with a verbose log to get diagnostics. A common error is to use the WScript object, but that's available only if you initiate the script with WSH, and Windows Installer is not WSH.
This is why it's better to do it after the install, initiated by the user rather than by an msiexec process running with the system account or impersonating the installing user.
Your best hope of doing this from the install is to write code that reads that script and configures the database, using the SqlCommand class, for example, as here:
SQL Script with C#
Run SQL script
Run a SQL Script
This executable needs a manifest requesting administrator privilege so when it runs it prompts for elevation. You fire off this program from your VBScript custom action. The point is that this elevation and the fact that it's shell-executed as a separate process gives some independence from the msiexec.exe service process that is calling your custom actions. Again, Everyone vs. Just me installs will make a difference, and you may find that your executable ends up running with the system account. The good news is that if this doesn't work from the install you just need to arrange for it to run the first time someone runs your app, when it will in a normal interactive user environment, and of course you can test and debug this in the normal way.
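The manifest entry that triggers the elevation prompt is the standard requestedExecutionLevel element; a minimal application manifest along those lines looks like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
    <security>
      <requestedPrivileges>
        <!-- Prompts for UAC elevation when the exe is shell-executed -->
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```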

How to launch jenkins jobs after Post deployment in TFS

I have automated build and deploy process in TFS by referring to http://www.codeproject.com/Articles/790206/Deploying-Web-Applications-using-Team-Foundation.
After deployment I validate the deployed application by running Selenium scripts via a batch file, specifying its path in "post-test script path". It executes the batch file and runs the automated tests.
Now I want to publish these Selenium results, so I have created a Jenkins job with email configured. How do I execute this job post-deployment? I have tried providing the Jenkins job's trigger email in "post-test script path", but it actually expects a path there and throws an error. So, how do I execute Jenkins jobs post-deployment?
Also, I am trying to automate the complete process, where TFS automatically builds, deploys, and runs some Selenium tests. If anybody has a better process, please let me know. Thanks
You can use the Command Line task in the new TFS 2015/VSTS build system to easily execute selenium tests.
You can then easily configure and pass variables.
I would also recommend that you move to using release management tools for deployment. While CD makes sense for development, it is often not viable for a release pipeline, where you need more metadata and approvals.
You can do this with the release management tools that come with TFS 2013+.

Command line to initialize SonarQube database?

I'm trying to automate SonarQube installation.
One (small) issue I'm running into: on first access after installation, while SonarQube is initializing the DB schema, we run into timeouts.
I'd like to know if there's a script/command to initialize the DB (create tables and all) from bash?
I've been digging on the internet and couldn't find an answer to that.
Thanks!
I'd like to complete the answers from Seb and Jeroen.
The schema is indeed created programmatically by SonarQube during startup. This step can't be executed independently in a script; just start the server. Instead of parsing logs, I suggest calling the web service GET http://<server>/api/system/status (see the documentation at http://nemo.sonarqube.org/api_documentation/api/system/status) to know when the database is fully initialized.
A database upgrade when installing a new release can also be triggered through the web service POST http://<server>/api/system/migrate_db (see http://nemo.sonarqube.org/api_documentation/api/system/migrate_db).
Database initialization is built into SonarQube and cannot be run independently of starting SonarQube.
As suggested by @jeroen, you can indeed analyze the sonar.log file and wait for the web[o.s.s.a.TomcatAccessLog] Web server is started line.
You can build in a wait loop and/or analyze the SonarQube log file during startup.
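A wait loop along those lines could poll the status web service until the database is ready; in this sketch the fetch function is injectable so the loop can be exercised without a running server, and the base URL, timeout, and status values such as "UP" are assumptions based on the web service described above:

```python
import json
import time
import urllib.request

def fetch_status(base_url):
    # GET /api/system/status returns JSON such as {"status": "UP", ...}
    with urllib.request.urlopen(base_url + "/api/system/status") as resp:
        return json.load(resp)["status"]

def wait_for_sonarqube(base_url, timeout=300, poll=5, fetch=fetch_status):
    """Block until SonarQube reports UP, or raise after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            if fetch(base_url) == "UP":
                return
        except OSError:
            pass  # server not accepting connections yet
        time.sleep(poll)
    raise TimeoutError("SonarQube did not come up within {}s".format(timeout))

# e.g. wait_for_sonarqube("http://localhost:9000") after starting the server
```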

How to configure Visual Studio 2012 Database Project to Build deployment .sql file

Does anyone know how to configure a Visual Studio 2012 database project to build a deployment .sql file package?
I'm attempting to not have to use the .dacpac as my only deployment option.
e.g. generate a .sql file of all database schema alters.
If you're trying to do this automatically, you'll want to use the SQLPackage command to generate a script. If you want to do this within the IDE, publish the database and choose the option to generate a script instead of publishing the changes.
I'll usually build the project first with msbuild. That will generate a dacpac against which I can run the SQLPackage command.
My batch file looks something like this:
msbuild .\MyDB\MyDB.sqlproj /t:build /p:Configuration="Local"
sqlpackage /a:DeployReport /tsn:FTPROD /sf:.\MyDB\sql\Local\MyDB.dacpac /pr:.\MyDB\Publish\Production.publish.xml /op:.\Release\MyDB-Production.xml
sqlpackage /a:script /tsn:DBServer /sf:.\MyDB\sql\Local\MyDB.dacpac /pr:.\MyDB\Publish\Production.publish.xml /op:.\Release\MyDB-Production.sql
The first line builds the project. The second creates a deploy report so I can easily see what's going to be changed. The last generates the script. Your paths may vary so you'll need to tweak as appropriate for your environment. You'll need to be able to access the database against which the script will be run in order to generate the script.
