Jenkins build-flow plugin "build" command - jenkins-plugins

I suspect there is more fundamental learning I need before I attempt to use the build-flow plugin.
Using the DSL plugin I created a Seed job which created sub-jobs:
FromTemplate-Job1 to FromTemplate-Job3
Ultimately I want to run these in parallel. However, for now, I just want to run FromTemplate-Job1.
I thought I could just use build("FromTemplate-Job1") to run the job; however, this generates the exception:
groovy.lang.MissingMethodException: No signature of method: script1438099035271418250533.build() is applicable for argument types: (java.lang.String) values: [FromTemplate-Job1]
There is obviously something fundamental that I still need to understand.

Mahi, your question was the right one to ask. Thanks!
I had chosen FreeStyle as the job type and not Build Flow - it is now working!

The DSL build command executes Freestyle jobs.
It won't execute a job that is configured as a Multijob or DSL job.

Related

How to clean database after scenario in Python behave

I'm pretty new to the world of Python/behave and API testing, and I'm trying to clean the database after a scenario runs by tagging it with #clean_database.
Can you please assist?
I guess I will need a database_context.py in my context_steps folder, but I'm not sure how to make the connection to the database...
Seems like you have 2 questions here:
(1) How do I connect to the database?
This question doesn't involve behave, so you should ask it elsewhere, perhaps in a MySQL-Python thread if you're using MySQL (which you haven't specified) or in a general Python thread.
(2) How do I use behave to call specific tags?
For the latter, check out the documentation on running tagged tests and on running behave from your own Python program.
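For the tag-driven cleanup itself, behave's environment hooks are the usual place. Below is a minimal sketch of an environment.py that runs a cleanup after any scenario carrying a clean_database tag; the sqlite3 driver, the test.db path, and the table names are illustrative assumptions rather than anything from the original question, so substitute your real database driver and schema.

```python
# environment.py (lives next to your features/ directory)
# Minimal sketch: clean the database after any scenario tagged with clean_database.
# The sqlite3 connection and the tables being cleared are illustrative assumptions;
# swap in your real driver (psycopg2, mysql-connector, ...) and your real schema.
import sqlite3

DB_PATH = "test.db"  # hypothetical path


def before_all(context):
    # Open one connection for the whole test run and stash it on the context.
    context.db = sqlite3.connect(DB_PATH)


def after_scenario(context, scenario):
    # behave exposes the scenario's tags as plain strings (no leading '@').
    if "clean_database" in scenario.tags:
        cur = context.db.cursor()
        cur.execute("DELETE FROM orders")  # hypothetical tables
        cur.execute("DELETE FROM users")
        context.db.commit()


def after_all(context):
    context.db.close()
```

To exercise only the tagged scenarios while verifying the hook, you can run behave --tags=clean_database.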

Output in a Maven plugin when loaded via `ServiceLoader`

In a vain attempt to solve "Produce tree output with Surefire like the JUnit 5 console launcher" for myself, I added a simple JUnit Platform TestExecutionListener to my project. It is registered by putting the class name into a file under META-INF/services, so I assume it is loaded via Java's ServiceLoader. In any case, it is instantiated with the default constructor.
Now the tricky integration part: when run under Surefire, every write to standard out triggers a warning, since that is not the way Maven plugins are supposed to produce output. However, in the default-constructed listener, how would I get access to the Maven logger?
Maybe it is easier to extend Surefire and instead somehow wire the listener into it?

How do I force an SSIS package step to fail to test error handling

I have an SSIS package with multiple steps of different types; most are not Script Tasks. I want to make sure my error handling and error logging are working properly, and I am wondering whether there is a way to selectively force the package to fail at specific points so I can confirm it behaves appropriately.
Thank you
Every task has the properties ForceExecutionValue and ForceExecutionResult. Set ForceExecutionResult to Failure on the task where you want to simulate an error.

Azkaban keeps changing executor id

I'm using Azkaban 3.0 on a server with two executors. I have a simple echo job, and I'm specifying the executor by setting setExecutor=id# in the flow parameters, but whenever I run this job the execution keeps alternating between the two executors, even though the job definition explicitly says to run on the second executor only.
Do I need to change something in the configuration?
I restarted Azkaban and the executors, but it didn't help.
Thanks in advance!
Check this out to learn how to configure Azkaban with multiple executors:
http://azkaban.github.io/azkaban/docs/latest/#executor-setup
I got help from a colleague who showed me how to solve this issue.
It was solved by deleting executor.port from the Azkaban web server properties file.

Execute one SSIS Package from Another, but using a DIFFERENT proxy user. Is it possible?

I have one SSIS Package that must run as Proxy A and another that must run as Proxy B. I would love to have the first package run, and, as one of its tasks, execute the second package. Is this possible?
Thanks a lot!
You could have the first package use sp_start_job to kick off a job that is set up to run the second package. If this is "fire-and-forget", that's all you need to do. If you need to wait until it has completed, things get messier: you'd have to loop, calling (and parsing the output of) sp_help_jobactivity with a WAITFOR DELAY between calls, until the run completes.
This is also more complex if you need to determine the actual outcome of running the second package.
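Inside the package this start-and-poll logic would sit in an Execute SQL Task, but the pattern is easier to read as a small script. Here is a sketch in Python with pyodbc rather than T-SQL, just to make the loop concrete; the connection string, the Agent job name, and the use of stop_execution_date as the completion signal are assumptions for illustration, not part of the original answer.

```python
# Sketch of the "start the job, then poll until it finishes" pattern described above,
# written with pyodbc instead of an Execute SQL Task. Connection string and job name
# are hypothetical placeholders.
import time

import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=myserver;DATABASE=msdb;Trusted_Connection=yes;")  # hypothetical
JOB_NAME = "Run second package"  # hypothetical Agent job wrapping the second package


def run_job_and_wait(poll_seconds=10):
    # autocommit avoids wrapping the msdb procedures in an open transaction
    conn = pyodbc.connect(CONN_STR, autocommit=True)
    try:
        cur = conn.cursor()
        cur.execute("EXEC msdb.dbo.sp_start_job @job_name = ?", JOB_NAME)

        while True:
            time.sleep(poll_seconds)  # the T-SQL equivalent is WAITFOR DELAY
            cur.execute("EXEC msdb.dbo.sp_help_jobactivity @job_name = ?", JOB_NAME)
            row = cur.fetchone()
            # A populated stop_execution_date means the run has finished; the actual
            # success/failure outcome still has to be read from the job history.
            if row is not None and row.stop_execution_date is not None:
                return row
    finally:
        conn.close()


if __name__ == "__main__":
    result = run_job_and_wait()
    print("Job finished at", result.stop_execution_date)
```

Reading the actual outcome (success versus failure) still means going to the job history, which is the extra complexity the answer mentions.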
