My question is simple: I want to extract the version and name from package.json, but when I extract them, the values contain extra output (see the attached file). Why is that?
Jenkinsfile
pipeline {
agent any
environment {
CI = 'true'
//IMAGE = bat 'node -e "console.log(require(`./package.json`).name);"'
//VERSION = bat(script: 'npm run get-version')
//VERSION = bat '(npm run version --silent)'
//PACKAGE_VERSION = bat '(node -p -e "require(\'./package.json\').version")'
GIT_COMMIT_SHORT_HASH = GIT_COMMIT.take(7)
REPOSITORY = 'repo.dimiroma.com'
PORT = '8085'
LATEST = 'latest'
}
stages {
stage('Set Build Variables') {
steps {
script {
VERSION = bat(script: '''node -e "console.log(require('./package.json').version)"''', returnStdout: true).trim()
def getProjectName = { ->
return bat(
returnStdout: true,
script: 'node -e "console.log(require(\'./package.json\').name);"'
).trim()
}
//VERSION = getProjectVersion()
IMAGE = getProjectName()
}
}
}
stage('Information') {
steps {
script{
bat 'node -v'
bat 'git --version'
bat 'docker -v'
echo "JOB BASE NAME: ${JOB_BASE_NAME} BUILD-NUMBER: ${BUILD_NUMBER}"
echo "Version: ${VERSION}"
//echo "Version: ${PACKAGE_VERSION}"
echo "Name: ${IMAGE}"
echo "Branch_name: ${env.BRANCH_NAME}"
final scmVars = checkout(scm)
echo "scmVars: ${scmVars}"
echo "scmVars.GIT_COMMIT: ${scmVars.GIT_COMMIT}"
echo "scmVars.GIT_BRANCH: ${scmVars.GIT_BRANCH}"
}
}
}
stage('Install Dependencies') {
steps {
bat 'npm install'
}
}
stage('Test') {
steps {
bat 'npm test -- --coverage a'
}
}
stage('Create Docker Image'){
steps {
bat "docker images"
bat "docker build . -t ${IMAGE}:${VERSION}-${GIT_COMMIT_SHORT_HASH}"
}
}
}
}
Dockerfile
Please Help.
You could do it like this. I assume you have npm installed as a tool for this:
stage("Build and push docker non-release"){
steps {
script {
def version = sh(returnStdout: true, script: "npm version --json") // --json makes the output strict JSON for readJSON
echo "Version is ${version}"
def versionProps = readJSON text: version
echo "Project version is ${versionProps['project-name']}" // 'project-name' is your package's own name from package.json
}
}
}
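For context, `npm version` without flags prints a Groovy-style object literal, while `--json` prints a strict JSON map in which your package's own name keys its version. A minimal Node sketch of the lookup the answer performs (the payload below is a made-up sample; a real run also lists dependency runtimes):

```javascript
// Hypothetical trimmed sample of what `npm version --json` prints for a
// package named "my-app":
const raw = '{"my-app": "1.2.3", "npm": "8.5.0", "node": "16.14.0"}';

// readJSON in the answer does the equivalent of this parse:
const versionProps = JSON.parse(raw);

// The project's version is keyed by the project's own name:
console.log(versionProps['my-app']); // → 1.2.3
```

Note also that on Windows agents, `bat(returnStdout: true)` captures the echoed command line itself unless the script starts with `@` (e.g. `@node -e ...`), which is a common source of unexpected extra output like the one described in the question.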
Related
I'm a newbie to Jenkins DSL Groovy scripting. I have a parameterized Jenkins job which takes two inputs, $param1 and $param2.
I have two stages. The first stage generates an output.txt file in the workspace with contents like those below. The contents of output.txt change based on the shell script execution in stage 1, so the values are dynamic.
output.txt
svn-test
svn_prod
svn_dev
The second stage has to take its input from output.txt and iterate in a parallel loop, dynamically creating stages. I have the code below, but it doesn't take input from the output.txt file; I'm unable to override the array in the stage and iterate in parallel.
def jobs = []
def parallelStagesMap = jobs.collectEntries {
["${it}" : generateStage(it)]
}
def generateStage(job) {
return {
stage("stage: ${job}") {
script{
git credentialsId: 'github', url: "ssh://github.com/promp/${job}.git", branch: 'master'
echo "This is ${job}."
sh ''' make ${parameter1}#"${parameter2}" '''
}
}
}
}
pipeline {
agent any
parameters {
string(name: 'parameter1', defaultValue: 'UIAutomation', description: 'Please enter the value')
string(name: 'parameter2', defaultValue: 'UISite', description: 'Please enter the value')
}
stages {
stage('non-parallel stage') {
steps {
script {
echo 'This stage will be executed first.'
sh '''./rotate_script.sh output.txt'''
}
}
}
stage('parallel stage') {
failFast false
steps {
script {
def filePath = readFile('output.txt').trim()
def lines = filePath.readLines()
lines.each {
// I have tried to read the lines and pass the value; it didn't work out.
}
parallel parallelStagesMap
}
}
}
}
}
Ideally, this is how one of my second-stage parallel stages looks; multiple parallel stages should be created based on the output.txt file:
stage('svn-test'){
steps{
sh 'mkdir -p svn-test'
dir("svn-test"){
script{
git credentialsId: 'github', url: 'ssh://github.com/promp/svn-test.git', branch: 'master'
sh ''' make ${parameter1}#"${parameter2}"
'''
}
}
}
}
I finally got something like this to work. I had to move my Groovy list and the parallelStagesMap into my pipeline. The stage map was being built at the start of the script while the list was still empty, so it never produced any results. If you build it AFTER the list is populated, it will work.
def generateStage(job) {
return {
stage("stage: ${job}") {
script{
git credentialsId: 'github', url: "ssh://github.com/promp/${job}.git", branch: 'master'
echo "This is ${job}."
sh ''' make ${parameter1}#"${parameter2}" '''
}
}
}
}
pipeline {
agent any
parameters {
string(name: 'parameter1', defaultValue: 'UIAutomation', description: 'Please enter the value')
string(name: 'parameter2', defaultValue: 'UISite', description: 'Please enter the value')
}
stages {
stage('non-parallel stage') {
steps {
script {
echo 'This stage will be executed first.'
sh '''./rotate_script.sh output.txt'''
}
}
}
stage('parallel stage') {
failFast false
steps {
script {
def jobs=[]
def filePath = readFile('output.txt').trim()
def lines = filePath.readLines()
lines.each { job ->
jobs.add("${job}")
}
def parallelStagesMap = jobs.collectEntries {
["${it}" : generateStage(it)]
}
parallel parallelStagesMap
}
}
}
}
}
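The root cause generalizes beyond Jenkins: `collectEntries` over a list that is still empty yields an empty map, so the map has to be built after the list is populated. A minimal Node sketch of the same pattern, with the file contents inlined as a stand-in for `readFile('output.txt')`:

```javascript
// Stand-in for readFile('output.txt'); in the pipeline this file is
// produced by the non-parallel first stage.
const fileText = 'svn-test\nsvn_prod\nsvn_dev\n';

// 1. Populate the job list first.
const jobs = fileText.trim().split('\n');

// 2. Only now build the name -> stage-closure map (Groovy's collectEntries).
const parallelStagesMap = Object.fromEntries(
  jobs.map(job => [job, () => `This is ${job}.`])
);

// Each entry can be invoked later, as `parallel parallelStagesMap` does.
console.log(Object.keys(parallelStagesMap).join(', ')); // → svn-test, svn_prod, svn_dev
```

Building the map before the list is filled (as in the question's top-level code) freezes it as an empty map, which is why no parallel stages appeared.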
I am trying to convert our IdentityServer4 project to Docker. I added the proper Dockerfile and it builds; the project is .NET Core 3.1. But I am getting a startup error:
PlatformNotSupportedException: Unix LocalMachine X509Store is limited
to the Root and CertificateAuthority stores.
I looked at this exact error in another post, IdentityServer4: How to load Signing Credential from Cert Store when in Docker,
but the only answer there did not work for me.
This is my Dockerfile:
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-buster-slim AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
FROM mcr.microsoft.com/dotnet/core/sdk:3.1-buster AS build
WORKDIR /src
COPY ["src/Sentinel.Web/is4.Web.csproj", "src/is4.Web/"]
RUN dotnet restore "src/is4.Web/is4.Web.csproj"
COPY . .
WORKDIR "/src/src/is4.Web"
RUN dotnet build "is4.Web.csproj" -c Release -o /app/build
FROM build AS publish
RUN dotnet publish "is4.Web.csproj" -c Release -o /app/publish
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
COPY "IdentityServerCN.pfx" .
COPY "IdentityServerDataKeysCN.pfx" .
ENTRYPOINT ["dotnet", "Sentinel.Web.dll"]
I even tried changing it to use the developer cert, still with no luck:
.AddDeveloperSigningCredential(); // debug only
Startup class:
public class Startup
{
public IConfiguration Configuration { get; }
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public void ConfigureServices(IServiceCollection services)
{
services.AddControllersWithViews();
services.AddDbContext<ApplicationDbContext>(options =>
options.UseSqlServer(Configuration.GetConnectionString("IdentityServer")));
services
.AddDbContext<DataKeysContext>(o =>
{
o.UseSqlServer(Configuration.GetConnectionString("IdentityServer"));
o.UseQueryTrackingBehavior(QueryTrackingBehavior.NoTracking);
if (Configuration.GetValue<bool?>("EnableSensitiveDataLogging") ?? false)
{
o.EnableSensitiveDataLogging(); // debug only
}
})
.AddDataProtection()
.PersistKeysToDbContext<DataKeysContext>()
.ProtectKeysWithCertificate(GetCertficateFromKeystore(Configuration.GetValue<string>("Certificates:DataProtection")))
.SetDefaultKeyLifetime(TimeSpan.FromDays(7))
.SetApplicationName("Sentinel.Web");
services.AddIdentity<ApplicationUser, IdentityRole>()
.AddEntityFrameworkStores<ApplicationDbContext>()
.AddDefaultTokenProviders();
// If we're in dev allow showing PII for debugging identity server cert issues
IdentityModelEventSource.ShowPII = Configuration.GetValue<bool?>("ShowIdentityModelPII") ?? false;
// configures IIS out-of-proc settings (see https://github.com/aspnet/AspNetCore/issues/14882)
services.Configure<IISOptions>(iis =>
{
iis.AuthenticationDisplayName = "Windows";
iis.AutomaticAuthentication = false;
});
// configures IIS in-proc settings
services.Configure<IISServerOptions>(iis =>
{
iis.AuthenticationDisplayName = "Employee Account";
iis.AutomaticAuthentication = false;
});
var builder = services.AddIdentityServer(options =>
{
options.Events.RaiseErrorEvents = true;
options.Events.RaiseInformationEvents = true;
options.Events.RaiseFailureEvents = true;
options.Events.RaiseSuccessEvents = true;
})
.AddDefaultEndpoints()
//.AddSigningCredential(GetCertficateFromKeystore(Configuration.GetValue<string>("Certificates:IdentityServer")))
//.AddSigningCredential(new X509Certificate2("IdentityServerCN.pfx", "pass1234") )
//.AddInMemoryPersistedGrants()
.AddOperationalStore(options =>
{
options.DefaultSchema = "dbo";
options.ConfigureDbContext = x =>
x.UseSqlServer(Configuration.GetConnectionString("IdentityServer"));
})
//.AddTestUsers(TestUsers.Users)
.AddInMemoryIdentityResources(Configuration.GetSection("IdentityResources"))
.AddInMemoryApiResources(Configuration.GetSection("ApiResources"))
.AddInMemoryClients(Configuration.GetSection("Clients"))
.AddAspNetIdentity<ApplicationUser>()
.AddDeveloperSigningCredential(); // debug only
services.AddAuthentication()
.AddGoogle("Google", options =>
{
options.SignInScheme = IdentityServerConstants.ExternalCookieAuthenticationScheme;
options.ClientId = "xxx.apps.googleusercontent.com";
options.ClientSecret = "xxxxxxxxxxxx";
});
// If we did a custom grant store (i.e. instead of using EF we could use dapper)
//services.AddTransient<IPersistedGrantStore, PersistedGrantStore>();
services.AddTransient<ActiveDirectoryService>();
}
public void Configure(IApplicationBuilder app)
{
if (Configuration.GetValue<bool?>("ShowDeveloperExceptionPage") ?? false)
{
app.UseDeveloperExceptionPage();
//app.UseDatabaseErrorPage();
}
app.UseStaticFiles();
// CORS handled by identity server settings
app.UseRouting();
app.UseIdentityServer();
app.UseAuthorization();
app.UseEndpoints(endpoints =>
{
endpoints.MapDefaultControllerRoute();
});
}
private X509Certificate2 GetCertficateFromFileSystem(string keyFilePath, string keyFilePassword)
{
// using file
return new X509Certificate2(keyFilePath, keyFilePassword, X509KeyStorageFlags.MachineKeySet);
}
private X509Certificate2 GetCertficateFromKeystore(string keyIssuer)
{
// using keystore
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
var certificates = store.Certificates.Find(X509FindType.FindByIssuerName, keyIssuer, true);
if (certificates.Count == 0)
{
throw new InvalidOperationException("Could not locate certificate in keystore");
}
return certificates[0];
}
}
What should I change so I can install both the QA & Prod versions of the app on the same machine?
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
# Build definition variables to define:
# AgentPool = BUILD2_CD
# Env1 = Dev or UAT
# Env2 = QA or PROD
# BuildVersion = 1.0.0.1 , example
trigger:
- none
pool:
#name: BUILD2
name: $(AgentPool)
demands:
- npm
- msbuild
- visualstudio
- vstest
- DotNetFramework
variables:
BuildPlatform: 'x64'
BuildConfiguration: 'release'
major: 2
minor: 0
build: 0
revision: $[counter('rev', 0)]
BuildOutputFolder: 'Runtime'
isDevelop: $[eq(variables['Env1'], 'DEV')] # runtime expression
CertExportDir: '$(Build.ArtifactStagingDirectory)\AppxPackages\$(MsixPackageRootFolderName)'
CertFilePath: '$(Build.ArtifactStagingDirectory)\AppxPackages\$(MsixPackageRootFolderName)\PIE21stMortgage.pfx'
MsixPackageRootFolderName: 'DotnetCoreInstaller_$(BuildVersion)_$(BuildPlatform)_Test'
jobs:
- job: Phase1
displayName: "Client DotnetCore Installer "
timeoutInMinutes: 50
strategy:
maxParallel: 2
matrix:
ENV_1:
Multiplier: $(Env1)
ENV_2:
Multiplier: $(Env2)
steps:
- task: PowerShell@2
displayName: 'REST API : Update Build Version '
inputs:
targetType: filePath
filePath: ./Automation/RestApi/RestApiVersionCounter.ps1
continueOnError: true
env:
SYSTEM_ACCESSTOKEN: $(System.AccessToken)
- task: PowerShell@1
displayName: 'PowerShell Script - Update ApplicationRevision '
inputs:
scriptType: inlineScript
inlineScript: |
$fileVersion = $Env:BuildVersion.Split(".")
$last3Numbers = [int]$fileVersion[3].ToString()
$path = "$(Build.SourcesDirectory)/DotnetCore/DotnetCore.csproj"
$word = "<ApplicationRevision>.*$"
$replacement = "<ApplicationRevision>" + $last3Numbers + "</ApplicationRevision>"
$text = get-content $path
$newText = $text -replace $word,$replacement
$newText > $path
(Get-Content $path) | Out-File -encoding UTF8 $path
- task: PowerShell@1
displayName: 'PowerShell Script - Update ApplicationVersion'
inputs:
scriptType: inlineScript
inlineScript: |
$path = "$(Build.SourcesDirectory)/DotnetCore/DotnetCore.csproj"
$word = "<ApplicationVersion>.*$"
$replacement = "<ApplicationVersion>" + $Env:BuildVersion + "</ApplicationVersion>"
$text = get-content $path
$newText = $text -replace $word,$replacement
$newText > $path
(Get-Content $path) | Out-File -encoding UTF8 $path
- task: PowerShell@1
displayName: 'PowerShell Script - Set BuildOutputFolder variable'
inputs:
scriptType: inlineScript
inlineScript: |
switch ($env:Multiplier) {
"PROD" {
$folderName = "Release"
}
"UAT" {
$folderName = "UATRelease"
}
"DEV" {
$folderName = "Debug"
}
"QA" {
$folderName = "QARelease"
}
}
Write-Host "Setting 'BuildOutputFolder' variable to: $folderName" -Verbose
Write-Host ("##vso[task.setvariable variable=BuildOutputFolder;]$folderName") -Verbose
- task: FileTransform@1
displayName: 'File Transform: App.config'
inputs:
folderPath: DotnetCoreFolder
enableXmlTransform: true
xmlTransformationRules: -transform **\App.$(BuildOutputFolder).config -xml **\App.config
fileType: xml
- powershell: |
# Update appxmanifest. This must be done before the build.
[xml]$manifest= get-content ".\DotnetCoreInstaller\Package.appxmanifest"
# $manifest.Package.Identity.Version = "$(major).$(minor).$(build).$(revision)"
$manifest.Package.Identity.Version = "$(BuildVersion)"
$manifest.Package.Applications.Application.VisualElements.DisplayName = "DotnetCore.$(Multiplier)"
$manifest.save("DotnetCoreInstaller/Package.appxmanifest")
displayName: 'Version Package Manifest'
- task: DotNetCoreCLI@2
inputs:
command: 'build'
projects: '.\DotnetCore\DotnetCore.csproj'
- task: CopyFiles@1
displayName: 'Copy PIE21stMortgage.pfx File to: $(build.artifactstagingdirectory)'
inputs:
SourceFolder: D:\versions\CERT
Contents: PIE21stMortgage.pfx
TargetFolder: $(CertExportDir)
- task: MSBuild@1
inputs:
solution: DotnetCoreInstaller/DotnetCoreInstaller.wapproj
platform: $(buildPlatform)
configuration: $(buildConfiguration)
msbuildArguments: '/p:OutputPath=NonPackagedApp
/p:UapAppxPackageBuildMode=SideLoadOnly
/p:AppxBundle=Never
/p:GenerateAppInstallerFile=True
/p:AppInstallerUri=\\shares\Intranet\$(Multiplier)\DotnetCore
/p:AppInstallerCheckForUpdateFrequency=OnApplicationRun
/p:AppInstallerUpdateFrequency=1
/p:AppxPackageOutput=$(Build.ArtifactStagingDirectory)\DesktopApp.msix
/p:AppxPackageSigningEnabled=false
/p:PublishAssemblyName=$(Multiplier)
/p:ProductName=$(Multiplier)'
displayName: 'Package the App'
- script: '"C:\Program Files (x86)\Windows Kits\10\bin\10.0.18362.0\x64\signtool"
sign /fd SHA256 /f $(CertExportDir)/$(CertName) /p "$(CertPassword)" $(Build.ArtifactStagingDirectory)\DesktopApp.msix'
displayName: 'Sign MSIX Package'
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: MSIX Package'
inputs:
PathtoPublish: $(Build.ArtifactStagingDirectory)
ArtifactName: ClientDotnetCoreInstaller$(Multiplier)
- task: CopyFiles@1
inputs:
SourceFolder: '$(Build.ArtifactStagingDirectory)'
Contents: '**'
TargetFolder: '\\shares\Intranet\$(Multiplier)\DotnetCore'
OverWrite: true
- task: DeleteFiles@1
displayName: 'Delete PIE21stMortgage.pfx'
inputs:
SourceFolder: '\\21stmortgage\shares\Intranet\$(Multiplier)\DotnetCore\AppxPackages\$(MsixPackageRootFolderName)'
Contents: 'PIE21stMortgage.pfx'
- task: DeleteFiles@1
displayName: 'Delete files from $(Build.SourcesDirectory) '
condition: always()
continueOnError: True
enabled: False
inputs:
SourceFolder: $(Build.SourcesDirectory)
Contents: '\*'
In the above YAML pipeline there are two PowerShell tasks, PowerShell Script - Update ApplicationRevision and PowerShell Script - Update ApplicationVersion, that rewrite the .csproj, and the second task's rewrite overrides the first. You can use conditions on these PowerShell tasks so that only one of them runs in each strategy leg. See below:
- task: PowerShell@1
displayName: 'PowerShell Script - Update ApplicationRevision'
condition: eq(variables['Multiplier'], 'QA')
inputs:
scriptType: inlineScript
inlineScript: |
$fileVersion = $Env:BuildVersion.Split(".")
$last3Numbers = [int]$fileVersion[3].ToString()
...
- task: PowerShell@1
displayName: 'PowerShell Script - Update ApplicationVersion'
condition: eq(variables['Multiplier'], 'Prod')
inputs:
scriptType: inlineScript
inlineScript: |
$path = "$(Build.SourcesDirectory)/DotnetCore/DotnetCore.csproj"
$word = "<ApplicationVersion>.*$"
....
Add conditions to the PowerShell tasks so that each one runs only in its own strategy leg.
I have a Jenkinsfile. I need to pass parameters from the Build with Parameters plugin and also have variables defined within the script. I cannot get either to work; it may be a syntax issue.
#!/usr/bin/env groovy
pipeline {
agent any
stages {
stage('Config API (dev)') {
steps {
script {
apiName = "config_API"
taskDefinitionFamily = "mis-core-dev-config"
taskDefinition = "mis-core-dev-config"
if (params.apiName.contains('Stop Task')) {
build(job: 'Stop ECS Task (utility)',
parameters: [
string(name: 'region', value: params.region),
string(name: 'cluster', value: params.cluster),
string(name: 'family', value: params.taskDefinitionFamily)
])
}
else if (params."${apiName}".contains('Start Task')) {
build(job: 'Start ECS Task (utility)',
parameters: [
string(name: 'region', value: params."${region}"),
string(name: 'cluster', value: params."${cluster}"),
string(name: 'taskDefinition', value: params."${taskDefinition}"),
string(name: 'containerInstanceIds', value: params."${containerInstanceIdsToStartOn}")
])
}
else if (params."${apiName}" == null || params."${apiName}" == "") {
echo "Did you forget to check a box?"
}
}
}
}
}
}
My Build with Parameters variables are set in the GUI as string variables:
containerInstanceIdsToStartOn = "463b8b6f-9388-4fbd-8257-b056e28c0a43"
region = "eu-west-1"
cluster = "mis-core-dev"
Where am I going wrong?
Define your parameters in a parameters block:
pipeline {
agent any
parameters {
string(defaultValue: 'us-west-2', description: 'Provide your region', name: 'REGION')
}
stages {
stage('declarative'){
steps {
print params.REGION
sh "echo ${params.REGION}"
}
}
stage('scripted'){
steps {
script {
print params.REGION
}
}
}
}
}
Output:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (declarative)
[Pipeline] echo
us-west-2
[Pipeline] sh
[test] Running shell script
+ echo us-west-2
us-west-2
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (scripted)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
us-west-2
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Until Gradle 3, I used this to write Gradle's output to a logfile:
def fileLogger = [
onOutput : {
File logfile = new File( 'gradle.log' )
logfile << it
}
] as org.gradle.api.logging.StandardOutputListener
gradle.useLogger( fileLogger )
This does not work with Gradle 4.
Update for Gradle 5:
It works when using logging.addStandardOutputListener instead of gradle.useLogger and adding it to all tasks:
// logger
def fileLogger = [
onOutput: {
File logfile = new File('gradle.log')
logfile << it
}
] as org.gradle.api.logging.StandardOutputListener
// for configuration phase
logging.addStandardOutputListener(fileLogger)
// for execution phase
gradle.taskGraph.whenReady { taskGraph ->
taskGraph.allTasks.each { Task t ->
t.doFirst {
logging.addStandardOutputListener(fileLogger)
}
}
}