In Karaf's org.apache.karaf.features.cfg file I have added:
featuresRepositories=mvn:org.apache.cxf.karaf/apache-cxf/3.0.8/xml/features
featuresBoot = cxf-jaxws
The cxf feature is fetched and installed when Karaf starts with a network connection available.
But it fails without a connection. How can I pre-install the cxf feature?
This is probably far from the optimal solution (I would love to hear about better ones), but you could create an offline-repository project using the karaf-feature-archetype and configure the karaf-maven-plugin with something like the following:
<plugin>
  <groupId>org.apache.karaf.tooling</groupId>
  <artifactId>karaf-maven-plugin</artifactId>
  <configuration>
    <startLevel>50</startLevel>
    <aggregateFeatures>true</aggregateFeatures>
    <checkDependencyChange>true</checkDependencyChange>
    <failOnDependencyChange>false</failOnDependencyChange>
    <logDependencyChanges>true</logDependencyChanges>
    <overwriteChangedDependencies>true</overwriteChangedDependencies>
  </configuration>
  <executions>
    <execution>
      <id>features-add-to-repo</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>features-add-to-repository</goal>
      </goals>
      <configuration>
        <descriptors>
          <!-- Feature repository paths -->
          <descriptor>mvn:groupId/artifactId/version/xml/features</descriptor>
        </descriptors>
        <features>
          <!-- Features and their artifacts + dependencies to add to the offline repository -->
          <feature>featureName</feature>
          <feature>featureName/version</feature>
        </features>
        <repository>target/offline-repository</repository>
      </configuration>
    </execution>
  </executions>
</plugin>
When you package the project, e.g. with mvn clean install (in an environment with online access), it generates the offline repository under the target folder. Copy it to your offline environment and tell Karaf to use it by adding it to org.ops4j.pax.url.mvn.defaultRepositories in the file org.ops4j.pax.url.mvn.cfg, e.g. file:${user.home}/offline-repository@snapshots@id=local if it is located in the home directory.
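For example, the relevant entry in etc/org.ops4j.pax.url.mvn.cfg could end up looking roughly like this. This is only a sketch: the default entries and the exact option syntax can differ between Karaf versions, so check the file shipped with your distribution and just append your repository to the existing list.

# etc/org.ops4j.pax.url.mvn.cfg
# Repositories scanned before any remote repository is contacted;
# the last entry is the copied offline-repository.
org.ops4j.pax.url.mvn.defaultRepositories=\
    file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots, \
    file:${user.home}/offline-repository@snapshots@id=local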
The features.xml itself can be empty; the project exists only so the karaf-maven-plugin can run, not to create an actual feature repository.
Just be careful if you need to create a new version of the offline-repository to replace an old one: if the new version is missing any of the artifacts that are currently installed in Karaf, it can cause issues when you try to remove/uninstall them.
I updated my pom.xml to use the new Maven App Engine plugin:
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>1.2.0</version>
  <configuration>
    <project>{project_id}</project>
    <devserver.host>0.0.0.0</devserver.host>
    <devserver.port>1984</devserver.port>
  </configuration>
</plugin>
Now when I run mvn appengine:deploy, it converts my queue.xml to queue.yaml in the staging directory. However, this queue configuration is not deployed.
I have tried many ways to deploy it to Google Cloud, but nothing worked. This setup is for my Cloud Endpoints project. The documentation does not cover this.
This is the Maven plugin code I added after trying your suggestion:
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>1.2.0</version>
  <configuration>
    <project>{project_id}</project>
    <devserver.host>0.0.0.0</devserver.host>
    <devserver.port>1984</devserver.port>
  </configuration>
</plugin>
I opened a similar issue on the project board.
By default, only the app.yaml file is deployed (it represents the application itself).
If you want queue.yaml as well (or instead), or cron.yaml or index.yaml, you need to list those files explicitly inside the plugin configuration:
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>${appengine.maven.plugin.version}</version>
  <configuration>
    <deployables>
      <param>target/appengine-staging/app.yaml</param>
      <param>target/appengine-staging/cron.yaml</param>
      <param>target/appengine-staging/queue.yaml</param>
      <param>target/appengine-staging/index.yaml</param>
    </deployables>
  </configuration>
</plugin>
Please remember that if you specify certain files, app.yaml should be added as well; it is deployed by default only when the deployables parameter is missing.
By playing with this parameter you can choose exactly which files to deploy.
Since the IntelliJ IDEA GAE deployment plugin does not work, I have to use mvn appengine:update. It always deploys to version 1, ignoring the version in appengine-web.xml.
How can I set the version with an mvn appengine:update deployment?
Another way: don't add anything to the App Engine plugin configuration, since it is a hassle to edit the pom.xml every time. Instead, pass the version information on the command line, like this:
mvn clean package appengine:deploy -Dapp.deploy.version=your-version-here
reference document here
You can set it via a Maven property:
<properties>
  <appengine.appId>my-application-id</appengine.appId>
  <appengine.version>my-application-version</appengine.version>
</properties>
PS: I'm also setting the application ID here; you don't necessarily need that.
Add the following to the plugins section of the project's pom.xml file:
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>2.2.0</version>
  <configuration>
    <deploy.projectId>java</deploy.projectId>
    <deploy.version>1</deploy.version>
  </configuration>
</plugin>
Set the version in the plugin configuration:
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>1.3.2</version>
  <configuration>
    <version>2</version>
  </configuration>
</plugin>
We develop a web application that uses Java on the back-end and Angular for the UI, with Maven as the build system.
I've been trying to set up automated integration testing with Protractor, and after loads of Googling/StackOverflowing I still can't figure out how the end-to-end configuration can be achieved.
Node.js/NPM installation (failed)
I've tried using the frontend-maven-plugin to handle the Node.js and npm installation, but since we're behind a corporate firewall, it doesn't seem possible to download anything directly. It could download Node from our Artifactory, but it then failed on the npm download (I don't understand why it downloads npm at all, since npm is part of the Node package). Anyway, I gave up on this idea and decided to use a locally installed Node.
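For reference, pointing the downloads at an internal mirror goes through the nodeDownloadRoot / npmDownloadRoot parameters of the plugin's install-node-and-npm goal; roughly like the sketch below (the plugin version, the Node/npm versions and the Artifactory URLs are placeholders, not our actual setup):

<plugin>
  <groupId>com.github.eirslett</groupId>
  <artifactId>frontend-maven-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <id>install-node-and-npm</id>
      <goals>
        <goal>install-node-and-npm</goal>
      </goals>
      <configuration>
        <nodeVersion>v8.11.3</nodeVersion>
        <npmVersion>5.6.0</npmVersion>
        <!-- Point both downloads at the internal mirror instead of nodejs.org / registry.npmjs.org -->
        <nodeDownloadRoot>https://artifactory.example.com/nodejs-dist/</nodeDownloadRoot>
        <npmDownloadRoot>https://artifactory.example.com/npm-dist/</npmDownloadRoot>
      </configuration>
    </execution>
  </executions>
</plugin>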
Starting Tomcat
Starting/stopping a Tomcat instance for e2e testing is handled nicely by the tomcat7-maven-plugin:
<plugin>
  <groupId>org.apache.tomcat.maven</groupId>
  <artifactId>tomcat7-maven-plugin</artifactId>
  <version>2.2</version>
  <configuration>
    <url>${tomcat.manager.url}</url>
    <path>/</path>
    <server>Tomcat</server>
  </configuration>
  <executions>
    <!-- Starting Tomcat -->
    <execution>
      <id>start-tomcat</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <!-- Fork the process, otherwise the build will be blocked by the running Tomcat -->
        <fork>true</fork>
        <port>${tomcat.port}</port>
        <systemProperties>
          <!-- We want to use the 'e2e' profile for integration testing -->
          <spring.profiles.active>e2e</spring.profiles.active>
        </systemProperties>
      </configuration>
    </execution>
    <!-- Stopping Tomcat -->
    <execution>
      <id>stop-tomcat</id>
      <phase>post-integration-test</phase>
      <goals>
        <goal>shutdown</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Using WebDriver (failed)
I managed to start WebDriver, but the problem is that it blocks any further execution:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <!-- Start webdriver -->
    <execution>
      <id>start-webdriver</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>webdriver-manager</executable>
        <arguments>
          <argument>start</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>
Running Protractor
Given that Node.js is installed and WebDriver is running, this shouldn't be a problem. But since I failed to start WebDriver in a way that lets the build continue, this step is blocked.
Any advice on how WebDriver can be managed (started/stopped)?
Adding directConnect: true to the Protractor config file solves the issue of starting/stopping the WebDriver (as suggested by Nick). In that case any explicit control of the WebDriver has to be removed from the POM.
Available parameters are explained in detail in the reference configuration file.
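For illustration, a minimal sketch of the relevant part of the config; the file name protractor.conf.js, the spec pattern and the base URL below are assumptions, only the directConnect line is the actual fix:

// protractor.conf.js (sketch)
// With directConnect, Protractor talks to the Chrome/Firefox driver directly,
// so no separately managed webdriver-manager/Selenium server is needed.
exports.config = {
  directConnect: true,
  specs: ['e2e/**/*.spec.js'],        // placeholder spec pattern
  baseUrl: 'http://localhost:8080/'   // placeholder; point at the Tomcat instance started above
};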
I would like to aggregate/assemble multiple JS files into one, without minifying or obfuscating them, using a Maven plugin.
I am already using the YUI Compressor plugin to obfuscate some JS files into one:
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>yuicompressor-maven-plugin</artifactId>
  <version>1.2</version>
  <executions>
    <execution>
      <id>obfuscate</id>
      <phase>process-resources</phase>
      <goals>
        <goal>compress</goal>
      </goals>
      <configuration>
        <nosuffix>true</nosuffix>
        <linebreakpos>-1</linebreakpos>
        <aggregations>
          <aggregation>
            <removeIncluded>true</removeIncluded>
            <insertNewLine>false</insertNewLine>
            <output>${project.build.directory}/${project.build.finalName}/all.js</output>
            <includes>
              <include>**/*.js</include>
            </includes>
            <excludes>
              <exclude>**/include/*.js</exclude>
            </excludes>
          </aggregation>
        </aggregations>
      </configuration>
    </execution>
  </executions>
</plugin>
Now I want the same JS files aggregated, without minification or obfuscation, in a file allForDev.js. The goal is to have one file for development and one for production. It's going to be useful to see the full scripts when debugging in developer tools. If I don't find a way to do this, I'll be forced to add a lot of script tags to load all those scripts (which is not the end of the world :) but I would like to do it in a cleaner way).
I can see that the Assembly Plugin supports the following formats:
zip, tar.gz, tar.bz2, jar, dir, war, and any other format that the ArchiveManager has been configured for.
Is there a way I can use the Maven Assembly Plugin to do this? As much as I looked, I found plenty of examples for creating zips, jars and wars, but none matching what I want to do. Or did I miss something?
Is there another plugin I could use?
As a side note, I tried using a second execution of the YUI plugin to create a second JS file, but I had no luck creating two files. I also tried declaring the YUI plugin twice, again with no luck. I think that's not possible either.
Cheers,
Despot
The answer lies in the wro4j library. For a more precise setup, see:
Javascript and CSS files combining in Maven build WITHOUT compression, minification etc
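Roughly, the setup boils down to a wro4j-maven-plugin execution with minimize switched off plus a wro.xml group listing the scripts. The snippet below is only a sketch from memory; parameter names, default file locations and versions should be verified against the linked answer and the wro4j documentation.

<plugin>
  <groupId>ro.isdc.wro4j</groupId>
  <artifactId>wro4j-maven-plugin</artifactId>
  <version>1.7.9</version>
  <executions>
    <execution>
      <phase>process-resources</phase>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <!-- Concatenate only, no compression/obfuscation -->
    <minimize>false</minimize>
    <!-- Group defined in wro.xml below -->
    <targetGroups>allForDev</targetGroups>
    <destinationFolder>${project.build.directory}/${project.build.finalName}</destinationFolder>
  </configuration>
</plugin>

<!-- src/main/webapp/WEB-INF/wro.xml (default location) -- also a sketch -->
<groups xmlns="http://www.isdc.ro/wro">
  <group name="allForDev">
    <js>/js/**.js</js>
  </group>
</groups>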
If, like me, you don't want to hassle with the wro4j plugin (from despot's answer) and want to prototype quickly, you can actually use the old maven-antrun-plugin with a configuration similar to the following:
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <configuration>
        <target>
          <property name="root" location=""/>
          <property name="jsRoot" location="${root}/src/main/webapp/js"/>
          <property name="jsAggregated" location="${root}/src/main/webapp/all.js"/>
          <echo message="Aggregating js files from ${jsRoot} into ${jsAggregated}"/>
          <concat destfile="${jsAggregated}" encoding="UTF-8">
            <fileset dir="${jsRoot}" includes="*.*"/>
            <filelist dir="${jsRoot}/.." files="client.js"/>
          </concat>
        </target>
      </configuration>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>
This concatenates all files in <module>/src/main/webapp/js/*.* (but not files from subfolders). It then appends client.js at the end, to make sure everything it depends on is already present (AFAIK <fileset> has undefined order).
The resulting concatenated file ends up at <module>/src/main/webapp/all.js.
I know the Maven phase, paths and other details may not be "correct"; this is just a quick example to show an alternative, non-invasive way to do it.
I am using the maven-pmd-plugin on my project, and this is how I have configured it:
<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jxr-plugin</artifactId>
      <version>2.3</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-pmd-plugin</artifactId>
      <version>2.6</version>
      <configuration>
        <linkXref>true</linkXref>
        <sourceEncoding>UTF-8</sourceEncoding>
        <minimumTokens>100</minimumTokens>
        <targetJdk>${targetJdk}</targetJdk>
        <rulesets>
          <ruleset>${maven.pmd.rulesetfiles}</ruleset>
        </rulesets>
      </configuration>
    </plugin>
  </plugins>
</reporting>
Here are the properties used in the above configuration:
<properties>
  <spring.version>3.0.6.RELEASE</spring.version>
  <basedir>C:\Users\Q4\workspace\project</basedir>
  <maven.pmd.rulesetfiles>${basedir}\pmdRuleset.xml</maven.pmd.rulesetfiles>
  <targetJdk>1.5</targetJdk>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
The problem is that when I run mvn pmd:check, it gives me 8 violations, and only from the basic, unusedcode and imports rulesets. It simply doesn't use all the rules that I have listed in the custom ruleset file. I have even tried using logging-java.xml and strings.xml directly in the ruleset, without the custom ruleset file, and it still doesn't work.
When I run mvn pmd:pmd, I get BUILD SUCCESS, but the errors still show up in my target folder. Why do I get a build success here?
I solved this by simply adding the plugin to the build section along with the one in the reporting section.
Somehow it needed to be in the build section as well for all the rulesets to run. Earlier I was under the impression that plugins go in the build section only if we want to run them during the build and deploy phases.
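Roughly, the build-section entry mirrors the reporting one, with the check goal bound so the rulesets are actually enforced. This is only a sketch reusing the configuration shown above (the cpd-check goal is optional):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-pmd-plugin</artifactId>
      <version>2.6</version>
      <configuration>
        <linkXref>true</linkXref>
        <sourceEncoding>UTF-8</sourceEncoding>
        <minimumTokens>100</minimumTokens>
        <targetJdk>${targetJdk}</targetJdk>
        <rulesets>
          <ruleset>${maven.pmd.rulesetfiles}</ruleset>
        </rulesets>
      </configuration>
      <executions>
        <execution>
          <goals>
            <!-- check fails the build on PMD violations; cpd-check does the same for copy/paste detection -->
            <goal>check</goal>
            <goal>cpd-check</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>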