Artifactory Generic Download - VSTS Task Failing

I have set up a very basic task in a VSTS build definition with the following simple steps and objective:
Setup and successfully test an endpoint to our Artifactory repository.
Implement a VSTS "Artifactory Generic Download" Task to retrieve a single jar file from the Artifactory repository.
Drop the jar file in staging directory of the build agent.
The file spec source, based on an example from the JFrog website (www.jfrog.com) and set up in the task configuration, is very basic and is depicted below:
Unfortunately, triggering this build job fails horribly with the below error and I simply can't figure out why it is failing. Would appreciate some help on this.

It seems that no artifacts were found and the task fails due to the configured "Fail task if no dependencies were downloaded" flag. If you wish to change this behavior, you can uncheck the flag in your task configuration.
As for the artifacts not being downloaded, make sure a repository called "list" exists and that a jar file exists under the provided pattern.
More information about file-specs can be found here.
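For reference, a generic download file spec is a small JSON document of roughly this shape (the repository name "list" comes from the answer above; the path, jar pattern, and use of the staging-directory variable are placeholders, not the asker's actual spec):

{
  "files": [
    {
      "pattern": "list/some/path/*.jar",
      "target": "$(Build.ArtifactStagingDirectory)/"
    }
  ]
}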

Related

Flink logging - Using Log4j2

We are running a Flink (1.9.1) application on AWS EMR (5.29) using YARN. We use a common logging adaptor throughout all the components (including the Flink application) in our project, and it uses Log4j2.
From the documentation, I see that there are 3 configuration files.
log4j.properties
log4j-yarn-session.properties
log4j-cli.properties
I understand that I will have to modify log4j.properties for the job manager and task manager logs, and log4j-cli.properties for code that does not run on the cluster.
Now given this situation,
How do I pass my log4j2.properties?
Do we replace the logging jars in the lib folder with log4j2 jars?
Not a solid solution, but a workaround: if the log4j.properties file in the conf folder is deleted, the log4j2 configuration file inside the jar on the classpath is used instead. But be careful when you have multiple jars on the classpath that each carry a log4j2 properties file.
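For illustration, a minimal log4j2.properties of the kind you would bundle in the application jar might look like this (appender name, pattern, and levels are arbitrary, not from the original setup):

status = warn

appender.console.type = Console
appender.console.name = ConsoleAppender
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c - %m%n

rootLogger.level = INFO
rootLogger.appenderRef.console.ref = ConsoleAppender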

KNIME Command Line Execution - ClassNotFoundException

I'd like to schedule a KNIME workflow. The workflow does its job very well as long as I start it from the KNIME GUI application. When I execute the same workflow via the command line, Java complains that com.microsoft.sqlserver.jdbc.SQLServerDriver
could not be found (ClassNotFoundException).
I invoke it via:
"D:\Progamme\KNIME\knime.exe" -nosplash -application -consoleLog org.knime.product.KNIME_BATCH_APPLICATION -preferences="absolutepathto\preferences.epf" -workflowDir="absolutepathto\workflow"
Since the error message signals missing content on the Java CLASSPATH, I also tried to add the parameters
-vmargs -classpath .;"absolutepathto/sqljdbc42.jar"
But still I get a slap from Java, pointing to the same error...
I also tried to run the command from within knime.exe's directory, and I also tried to add the JAR file under Preferences -> Java -> Build Path -> Classpath Variable / User Libraries (referenced via the -preferences argument). But that had no effect.
Did anybody face the same problems? Maybe with other third party JARs?
It is all about a Database connector that is configured like this:
Does the integrated security maybe force a misleading error?
System spec: KNIME 3.2.2 on Windows Server 2008 R2
Update - extract from preferences file
/configuration/org.eclipse.core.net/org.eclipse.core.net.hasMigrated=true
/configuration/org.eclipse.ui.ide/MAX_RECENT_WORKSPACES=10
/configuration/org.eclipse.ui.ide/RECENT_WORKSPACES=<list of some workspaces>
/configuration/org.eclipse.ui.ide/RECENT_WORKSPACES_PROTOCOL=3
/configuration/org.eclipse.ui.ide/SHOW_RECENT_WORKSPACES=false
/configuration/org.eclipse.ui.ide/SHOW_WORKSPACE_SELECTION_DIALOG=true
Is there maybe a problem due to the fact that it is a shared KNIME instance among several users and the command line execution does not know which workspace has to be chosen? Is the workspace somehow needed and why?
Partial Solution:
I finally managed it, but I don't know exactly why it works now. What I did was load a fresh portable version of KNIME and run the same commands, only changing the executable path to the new portable version. Before that I started the portable version once to set the workspace directory and register the database driver in the preferences dialog and the .ini file, nothing else; the same configuration as the shared KNIME instance so far. What I am really wondering about is that from now on the commands also work with the shared KNIME instance. I really don't know what caused the change that lets KNIME find the driver class.
Info
Because I encountered a few more problems with KNIME's command line mode in a shared environment, which led to nondeterministic execution results, I wrote a little .NET library. It gives me more flexibility/control over the workflow execution (which return codes and error messages occurred, and so on). You can find it here if you're interested: KnimeNet
I took a very minimal approach:
cd "C:\Program Files\KNIME"
.\knime -nosplash -noexit -consoleLog -reset -application org.knime.product.KNIME_BATCH_APPLICATION -workflowFile="D:\Work\Knime Workflows\Output\CMD_Test.knwf" -preferences="D:\Work\Knime Workflows\Output\CMD_Test.epf"

Exporting Octopus Deploy project doesn't create json file

I'm using Octo.exe 4.0.4 and Octopus version 3.4.12, and I'm trying to export a project from an Octopus Deploy server but I'm not getting an output file.
I can connect to the server, and list the projects, but when I run the export process it seems to stop at the point it should find the project group.
Here's a screenshot of the output.
I've tried running this from the machine which hosts the Octopus Server, as well as a tentacle, and I get identical results.
I've also tried not putting quotes around the project name and output file, since they aren't actually required here, and again same results.
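For reference, the export invocation takes roughly this shape (server URL, API key, project name and paths below are placeholders standing in for my real values, following the documented usage of the era):

octo.exe export --server=http://your-octopus-server/ --apiKey=API-XXXXXXXXXXXX --type=project --name="Some Project" --outputFile="C:\tmp\SomeProject.json"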
According to the Octopus documentation, the results should be something like this:
Octopus Deploy Command Line Tool, version 1.0.0.0
Handshaking with Octopus server: http://localhost/Octopus
Handshake successful. Octopus version: 2.4.4.43; API version: 3.0.0
Finding exporter 'project'
Beginning the export
Finding project: OctoFX Rate Service
Finding project group for project
Finding variable set for project
Finding deployment process for project
Finding NuGet feed for deployment process...
Finding NuGet feed for step Database
Finding NuGet feed for step Rate Service
Export file C:\tmp\OctoFX_Rate_Service.json successfully created.
It turns out this was a bug in Octo.exe v4.0.4.
It has been fixed in v4.0.7, and I can now export my project.

How to upload an artifact to Artifactory / consume it in a build system (Gradle, Maven, Ant) where the artifact does not have an extension

I have the following files which I would like to upload to Artifactory as a 9.8.0 versioned artifact.
NOTE: The first two files DO NOT have an extension (they are executable files, i.e. if you open them or cat them, you'll see junk characters).
Folder/files of a given version 9.8.0 in CVS is like:
com.company.project/gigaproject/v9.8.0/linux/gigainstall
com.company.project/gigaproject/v9.8.0/solaris/gigainstall
com.company.project/gigaproject/v9.8.0/win32/gigainstall.exe
com.company.project/gigaproject/v9.8.0/gigafile.dtd
com.company.project/gigaproject/v9.8.0/gigaanotherfile.dtd
com.company.project/gigaproject/v9.8.0/giga.jar
com.company.project/gigaproject/v9.8.0/giga.war
Uploading the above files which have an extension is very easy: you log in to Artifactory as an administrator/user with permission to deploy artifacts, click on the "Deploy" tab, browse for the artifact file, and once you select the file, click on the "Upload" button.
Next you'll see a screen (like the one shown above). You'll tweak what you want in the fields on this page, and once you click on "Deploy Artifact", you are done. All you have to make sure is that you select the correct file.extension file while uploading, and that the file extension is shown correctly in the "Target Path" box (with the version -x.x.x, etc.).
My questions:
Question 1: How do I upload an artifact which doesn't have an extension? It seems that Artifactory by default treats an artifact as having a .jar extension. How can I upload the "gigainstall" artifact shown in the folder/file structure above for both Linux and Solaris? I see I can use the artifact names gigainstall-linux and gigainstall-solaris to differentiate them, but I am not sure how to tell Artifactory that this artifact doesn't have any extension.
I don't think the development team will start generating this artifact with a proper extension (this artifact's name may be hard-coded everywhere in other projects, which currently get it from CVS/SVN source control somewhere; storing an artifact in a source control tool is itself a bad practice).
Question 2: How would I tell a build system (for example, Gradle) to consume a non-extensioned artifact during, let's say, the 'compile' task? In build.gradle, under the dependencies { .. } section, I would add something like what is shown below, but I am not sure what to use for non-extensioned files (the first two in the folder/file structure I mentioned above).
dependencies {
    //compile 'com.company.project:gigainstall-linux:9.8.0@'
    //compile 'com.company.project:gigainstall-linux:9.8.0@??????'
    //compile 'com.company.project:gigainstall-linux:9.8.0@""'
    //compile 'com.company.project:gigainstall-linux:9.8.0@"none"'
    //compile 'com.company.project:gigainstall-linux:9.8.0@"NULL_or_something"'
    // The following will easily get giga.jar version giga-9.8.0.jar from the Artifactory repository
    compile 'com.company.project:giga:9.8.0'
    // The following will easily get giga.war
    compile 'com.company.project:giga:9.8.0@war'
    // Similarly, other extension-based artifacts can be fetched from Artifactory
    compile 'com.company.project:gigafile:9.8.0@dtd'
    compile 'com.company.project:gigaanotherfile:9.8.0@dtd'
}
Answer 1 (this will also cover question 2 in a different sense): the Artifactory "Artifact Bundle" feature under the "Deploy" tab can do the TRICK, at least for uploading the artifacts the way we want, by creating a zip file first (containing the structure and artifacts in it). Alternatively, you can upload the artifacts by calling the Artifactory REST API; a sketch of that follows.
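As a quick sketch of the REST API route: deploying a single file is one HTTP PUT to the desired target path (host, repository, credentials and target path below are placeholders):

curl -u admin:password -X PUT \
  "http://artifactoryserver:8081/artifactory/libs-release-local/com/company/project/gigainstall/9.8.0/gigainstall-9.8.0" \
  -T gigainstall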
High level idea:
Create a zip file called gigaproject.zip (or any name, as a .zip/.tar/compressed file that Artifactory can read). Inside the zip, create the structure reflecting how these artifacts should be laid out in Artifactory,
i.e.
gigaproject.zip will contain the following folders/structure/files.
Case 1:
com/company/project/gigaproject/9.8.0/linux/gigainstall
com/company/project/gigaproject/9.8.0/solaris/gigainstall
com/company/project/gigaproject/9.8.0/win32/gigainstall.exe
com/company/project/gigaproject/9.8.0/gigafile.dtd
com/company/project/gigaproject/9.8.0/gigaanotherfile.dtd
com/company/project/gigaproject/9.8.0/giga.jar
com/company/project/gigaproject/9.8.0/giga.war
NOTE: In the case 1 example, I didn't use any -x.x.x in the file names (i.e. I'm using plain and simple giga.jar instead of giga-9.8.0.jar).
The above upload/deploy will result in the files shown in the following snapshot:
So, have we achieved what we wanted? Visibly speaking, yes, but not in the way Artifactory usually stores these artifacts (they should have the -x.x.x version embedded in the file name, and the artifact id should match the artifact file name). Now, if you want to consume the following in a Gradle build file, you CANNOT: first, you haven't uploaded the file names with the -x.x.x version in them; second, the artifact id in our case 1 tree was "gigaproject" (the folder after com/company/project), so Gradle's way of specifying which artifact id and which artifact file name you want won't work.
compile 'com.company.project:gigaproject:CANNOTSAY_HOW_TO_GET_GIGA_JARorGIGAINSTALL_with_without_extension'
Conclusion: It's possible to upload any files (with/without an extension) to Artifactory in any structure, but it depends on how, and whether, your build system will be able to consume them.
I then deleted the structure I had just created with the case 1 .zip file from the Artifactory repository, and deleted the .zip file itself, to try case 2.
Case 2:
Let's create an individually versioned file name for each artifact, create the structure in the format Artifactory actually uses to store artifacts (as seen in a repository tree view), and create a .zip file containing that structure. Let's use the same "Artifact Bundle" feature to upload this .zip file, so that the individual artifacts we need land in Artifactory with the artifact id (the second value we mention while consuming it) matching the artifact file name in Artifactory.
Folder/file structure for the .zip file:
com/company/project/gigainstall/9.8.0/gigainstall-9.8.0.linux
com/company/project/gigainstall/9.8.0/gigainstall-9.8.0.solaris
com/company/project/gigainstall/9.8.0/gigainstall-9.8.0.exe
com/company/project/gigafile/9.8.0/gigafile-9.8.0.dtd
com/company/project/gigaanotherfile/9.8.0/gigaanotherfile-9.8.0.dtd
com/company/project/giga/9.8.0/giga-9.8.0.jar
com/company/project/giga/9.8.0/giga-9.8.0.war
NOTE: This time we'll use the same "Artifact Bundle" feature, and for the similar files (gigainstall under both the Linux and Solaris folders) I took the approach of creating a gigainstall folder (containing the gigainstall-9.8.0.linux and gigainstall-9.8.0.solaris file names), i.e. when we consume these artifacts in Gradle under the dependencies { ... } section for compile, we'll use the x.x.x@ way to fetch them from Artifactory.
OK, once "Artifact Bundle" Deploy/Upload was successfully complete, I got the following message.
Successfully deployed 7 artifacts from archive: gigaproject.zip (1 seconds).
Now, let's see how it looks in Artifactory when searching for one of the artifacts in tree view. You can see we now have the files in place, named filename-x.x.x.extension, so I can consume them easily in Gradle.
In Gradle build file (build.gradle), I'll mention:
dependencies {
    compile "com.company.project:gigainstall:9.8.0@linux"
    compile "com.company.project:gigainstall:9.8.0@solaris"
    compile "com.company.project:gigainstall:9.8.0@exe"
    compile "com.company.project:giga:9.8.0"
    compile "com.company.project:giga:9.8.0@war"
    compile "com.company.project:gigafile:9.8.0@dtd"
    compile "com.company.project:gigaanotherfile:9.8.0@dtd"
}
OH OH!! That didn't work; see the Gradle error below. Why? The Artifact Bundle upload/deploy feature uploads the zip file's contents exactly as they are in the .zip, but it DOES NOT create a .pom file per artifact it deploys, which makes the Gradle build fail. Maybe in Ant this might succeed. The error occurred for each individual .jar/.war/.dtd/etc. file; I'm just showing one example.
While doing gradle clean build
Could not resolve all dependencies for configuration ':compile'.
> Could not resolve com.company.project:gigafile:0.0.0.
Required by:
com.company.project:ABCProjectWhichConsumesGIGAProjectArtifacts:1.64.0
> Could not GET 'http://artifactoryserver:8081/artifactory/ext-snapshot-local/com/company/project/gigafile/0.0.0/gigafile-0.0.0.pom'. Received status code 409 from server: Conflict
Case 3: Let's take a simple approach (a workaround, but it will save a lot of pain).
Create a gigaproject.zip file with the following structure; this approach embeds no x.x.x version value in the individual artifact/file names inside the folder/file structure. We will use the "Single Artifact" deploy approach (which automatically creates the .pom for the gigaproject.zip file during the upload/deploy process provided by Artifactory). You'll still be able to get the gigainstall file without needing any extension on its name. During the upload/deploy step, as you have already seen, you upload gigaproject.zip and Artifactory uploads it to the given target repository as "gigaproject-x.x.x.zip", where x.x.x is 9.8.0 in our case. See the image snapshot below.
gigaproject/linux/gigainstall
gigaproject/solaris/gigainstall
gigaproject/win32/gigainstall.exe
gigaproject/gigafile.dtd
gigaproject/gigaanotherfile.dtd
gigaproject/gigaproject.zip
gigaproject/giga.jar
gigaproject/giga.war
Now, upload it to Artifactory using the "Single Artifact" feature. Click "Deploy Artifact" once you have tweaked the values for GroupId, ArtifactId, Version, etc.
Once this is uploaded, you'll see the zip artifact in the target repository (I took a bad example; usually this would be libs-snapshot-local or libs-release-local instead of ext-...), and you'll be able to consume the ZIP artifact directly in Gradle:
dependencies {
    // This is the only line we need now.
    compile "com.company.project:gigaproject:9.8.0@zip"
}
Once the .zip is available to the Gradle build system, you can tell Gradle to unpack it somewhere in your build/workspace area, where you can feed the actual (unpacked) files (gigainstall, .dtd, .jar, .war, etc.) to the build process/steps.
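A minimal sketch of that unpacking step, assuming the @zip notation above; the configuration name, task name, and output directory here are made up:

// Resolve the zip through a dedicated configuration instead of compile.
configurations { gigaZip }

dependencies {
    gigaZip "com.company.project:gigaproject:9.8.0@zip"
}

// Unpack the resolved zip into the build area so later steps can consume the files.
task unpackGiga(type: Copy) {
    from zipTree(configurations.gigaZip.singleFile)
    into "$buildDir/gigaproject"
}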
PS: Cases 1 and 2 would have worked for Ant, I guess.
Answer 2:
If you have uploaded a non-extensioned file either way, make sure you have manually created/uploaded its POM file as well. For example, if I uploaded gigainstall-9.8.0 as an artifact under com/company/project/gigainstall/9.8.0/gigainstall-9.8.0, then at the same level I have to create its POM file and upload both (for the contents, see a simple template .pom file for a custom jar artifact, or note what the POM Editor window shows you while uploading an extensioned file via "Single Artifact" deploy), so that Gradle won't error out with the POM conflict/error shown above. Ant might not need the POM (I didn't check that).
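For reference, a minimal companion POM might look like this (a sketch only; the coordinates mirror the example above, and pom packaging is an assumption for an extension-less binary):

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <!-- Deployed next to the artifact as
         com/company/project/gigainstall/9.8.0/gigainstall-9.8.0.pom -->
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.company.project</groupId>
    <artifactId>gigainstall</artifactId>
    <version>9.8.0</version>
    <packaging>pom</packaging>
</project>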
Once it's there in Artifactory, the following line should work (or please comment if you find another way).
dependencies {
    // Note: nothing is mentioned after x.x.x@
    compile "com.company.project:gigainstall:9.8.0@"
}

Jenkins commit a file after successful build

I am using Jenkins, Ant, Flex and Java for my web application.
Currently I update a build version file in Flex src and commit it before starting Jenkins build.
I want to avoid this manual process and let script do this for me.
Contents of file:
Build=01_01_2013_10:43
Release=2.01
Question 1:
I want to update this file's contents, compile my code, and then commit the file back to SVN, so that SVN has the latest build version number.
How do I commit this changed file to SVN? It would be great if the commit happened after a successful build.
Question 2: I want to send an email to all developers an hour before the build starts: "Please commit your changes. Build will start in 1 hr." Can I set up a delay between the email and the actual svn export + ant build?
or
Do I have to schedule 2 jobs an hour apart, one to send the email and one to do the build?
You can use the subclipse svn ant integration to commit changed files to SVN, including authentication:
<svnSetting
    svnkit="true"
    username="bingo"
    password="bongo"
    id="svn.settings"
/>
<svn refid="svn.settings">
    <commit file="your.file" />
</svn>
To get username and password to the build file you have different options. One would be to use a parametrized build, where you define user name and password as build parameters which can be evaluated in the build file.
username="${parameter.svn.username}"
password="${parameter.svn.password}"
A second option is using the Jenkins Config File Provider plugin. With this you can use the parameters just like in the parametrized build, but you import the credentials from the provided config file; e.g. a properties file can be imported via
<property file="config.file" />
Actually, you can also use Ant's exec task to execute your Subversion commit of the file; a sketch follows.
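A minimal sketch of that exec variant (the file path and commit message are made up; it assumes the svn command-line client is installed and credentials are cached for the user running the build):

<!-- Commit the updated version file via the svn command line. -->
<exec executable="svn" failonerror="true">
    <arg value="commit" />
    <arg value="-m" />
    <arg value="Update build version file" />
    <arg value="path/to/versionfile.properties" />
</exec>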
For sending an e-mail one hour before actually building, you should set up two jobs scheduled one hour apart. But I don't think it is good practice to notify before building; consider building more often, maybe even per commit to SVN.
You can also use the Post build Task plugin (https://wiki.jenkins-ci.org/display/JENKINS/Post+build+task) to execute svn as a shell script (svn must be installed and authenticated from the shell once for the user that runs Jenkins).
Then the svn commit runs as a post build action. The plugin has an option (checkbox) to run the script only if the previous build/steps were successful.
The plugin is also mentioned here: Execute Shell Script after post build in Jenkins
