wro4j-maven-plugin - CSS changes are overwritten - maven-plugin

We are using wro4j-maven-plugin version 1.8.0. When a CSS file is updated and a Maven clean install is triggered, the changes to the CSS file are lost and the old content is back.
<plugin>
  <groupId>ro.isdc.wro4j</groupId>
  <artifactId>wro4j-maven-plugin</artifactId>
  <version>1.8.0</version>
  <executions>
    <execution>
      <phase>compile</phase>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <nosuffix>true</nosuffix>
    <minimize>true</minimize>
    <destinationFolder>${basedir}/src/main/webapp/wro/</destinationFolder>
    <contextFolder>${basedir}/src/main/webapp/</contextFolder>
    <wroFile>${basedir}/src/main/resources/wro.xml</wroFile>
    <wroManagerFactory>ro.isdc.wro.maven.plugin.manager.factory.ConfigurableWroManagerFactory</wroManagerFactory>
    <extraConfigFile>${basedir}/src/main/resources/wro.properties</extraConfigFile>
    <ignoreMissingResources>false</ignoreMissingResources>
  </configuration>
</plugin>
Please help.

Sorry, my bad.
We were editing a file that was generated by the wro4j-maven-plugin.
wro.xml holds a list of groups. Each group is generated as a single file (either CSS or JS), and each group contains the list of source files to be combined, configurable in wro.xml.
We were actually editing the generated, minimized file of a group, which is why the changes were overwritten on every clean install. Editing the source files was the correct way.
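For anyone hitting the same thing, a minimal wro.xml sketch (group name and file paths are made up) showing how a group clubs source files into one generated output file:
<groups xmlns="http://www.isdc.ro/wro">
  <!-- each group is rendered as one file in the destinationFolder, e.g. all.css -->
  <group name="all">
    <!-- these are the source files to edit; the generated all.css is overwritten on every build -->
    <css>/css/main.css</css>
    <css>/css/theme.css</css>
  </group>
</groups>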
Found the reason at last.
Thanks.

Related

Apache Karaf feature offline issue

In Karaf's org.apache.karaf.features.cfg file I have added:
featuresRepositories=mvn:org.apache.cxf.karaf/apache-cxf/3.0.8/xml/features
featuresBoot = cxf-jaxws
The cxf feature can be fetched and installed when Karaf starts with a network connection.
But it fails without a connection. How can I pre-install the cxf feature?
This is likely far from the most optimal solution (I would love to hear about better ones), but you could create an offline-repository project using the karaf-feature-archetype and configure the karaf-maven-plugin with something like the following:
<plugin>
  <groupId>org.apache.karaf.tooling</groupId>
  <artifactId>karaf-maven-plugin</artifactId>
  <configuration>
    <startLevel>50</startLevel>
    <aggregateFeatures>true</aggregateFeatures>
    <checkDependencyChange>true</checkDependencyChange>
    <failOnDependencyChange>false</failOnDependencyChange>
    <logDependencyChanges>true</logDependencyChanges>
    <overwriteChangedDependencies>true</overwriteChangedDependencies>
  </configuration>
  <executions>
    <execution>
      <id>features-add-to-repo</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>features-add-to-repository</goal>
      </goals>
      <configuration>
        <descriptors>
          <!-- Feature repository paths -->
          <descriptor>mvn:groupId/artifactId/version/xml/features</descriptor>
        </descriptors>
        <features>
          <!-- features and their artifacts + dependencies to add to offline repository -->
          <feature>featureName</feature>
          <feature>featureName/version</feature>
        </features>
        <repository>target/offline-repository</repository>
      </configuration>
    </execution>
  </executions>
</plugin>
When packaging the project, i.e. with mvn clean install (in an environment with online access), it will generate an offline-repository under the target folder. You can copy this to your offline environment and tell Karaf to use it by adding it to org.ops4j.pax.url.mvn.defaultRepositories in the file org.ops4j.pax.url.mvn.cfg, e.g. file:${user.home}/offline-repository@snapshots@id=local if it is located in the home directory.
The features.xml itself can be empty; it exists only so that the karaf-maven-plugin can run, not to create an actual feature repository.
Just be careful if you need to create a new version of the offline-repository to replace the old one. If the new version is missing any of the artifacts that are currently installed in Karaf, it can cause issues when trying to remove/uninstall them.
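For illustration, the corresponding entry in etc/org.ops4j.pax.url.mvn.cfg could look roughly like this (path and repository id are just examples; options are appended with @):
# etc/org.ops4j.pax.url.mvn.cfg
# @snapshots allows snapshot artifacts, @id names the repository
org.ops4j.pax.url.mvn.defaultRepositories = \
    file:${user.home}/offline-repository@snapshots@id=local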

maven-surefire-report-plugin is not called during the build

I defined the following in my pom.xml for generating reports during integration testing.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-report-plugin</artifactId>
  <version>2.19</version>
  <configuration>
    <aggregate>true</aggregate>
  </configuration>
  <executions>
    <execution>
      <phase>verify</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>
When mvn verify is executed, no reports are created. I have to run mvn surefire-report:report to generate them.
To be clear, the above-mentioned pom.xml is the parent of two child modules.
Does anyone know what is wrong?
Actually, the above configuration works only if it is defined in the child pom.xml; it does not work in the parent one. Is this a bug, or what is going on? I spent several hours before I figured it out!
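A common workaround for this kind of parent/child behaviour is to keep the version and configuration in the parent's pluginManagement and declare the plugin in each child's build section; a sketch (module structure assumed) could look like:
<!-- parent pom.xml -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-report-plugin</artifactId>
        <version>2.19</version>
        <configuration>
          <aggregate>true</aggregate>
        </configuration>
        <executions>
          <execution>
            <phase>verify</phase>
            <goals>
              <goal>report</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </pluginManagement>
</build>

<!-- child pom.xml: declaring the plugin here picks up the managed version and configuration -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-report-plugin</artifactId>
    </plugin>
  </plugins>
</build>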

Resolving Izpack artifacts using maven dependency

I have an IzPack installer which packs a pre-configured server and installs it in a target directory. This server is around 500 MB. Currently I have checked it into the src/main/resources folder of the installer Maven project, but having this big server in Git is making git pulls very slow. So I am planning to keep the server as a Maven artifact in Nexus and add it as a dependency to the installer Maven project. This way I can create a Maven profile that pulls the server from Nexus on demand. I have yet to figure out how to copy this dependency to the staging folder using a Maven plugin (any help would be greatly appreciated). My question: is this the right approach, or is there a better way to do it? Thanks in advance.
You can use the maven-dependency-plugin to copy a dependency to a specific folder.
You can use it to either copy all dependencies or even unpack them.
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>2.10</version>
    <executions>
      <execution>
        <id>unpack</id>
        <phase>package</phase>
        <goals>
          <goal>unpack</goal>
        </goals>
        <configuration>
          <outputDirectory>${izpack.staging}/content/ninjolibs</outputDirectory>
        </configuration>
      </execution>
    </executions>
  </plugin>
</plugins>
This is what I did: I uploaded wso2.zip to Nexus as a zip artifact and configured the pom.xml of my installer module to use this dependency.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.10</version>
  <executions>
    <execution>
      <id>copy-binaries</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.wso2</groupId>
            <artifactId>wso2is</artifactId>
            <version>5.0.0</version>
            <type>zip</type>
            <overWrite>true</overWrite>
            <outputDirectory>src/main/resources/wso2/binary</outputDirectory>
            <destFileName>wso2is-5.0.0.zip</destFileName>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>

What is the simplest way to aggregate/assemble multiple (js) files into one (js) file with a maven plugin WITHOUT compression?

I would like to aggregate/assemble multiple JS files into one, without minifying or obfuscating them, using a Maven plugin.
I am already using the YUI Compressor plugin to obfuscate some JS files into one:
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>yuicompressor-maven-plugin</artifactId>
  <version>1.2</version>
  <executions>
    <execution>
      <id>obfuscate</id>
      <phase>process-resources</phase>
      <goals>
        <goal>compress</goal>
      </goals>
      <configuration>
        <nosuffix>true</nosuffix>
        <linebreakpos>-1</linebreakpos>
        <aggregations>
          <aggregation>
            <removeIncluded>true</removeIncluded>
            <insertNewLine>false</insertNewLine>
            <output>${project.build.directory}/${project.build.finalName}/all.js</output>
            <includes>
              <include>**/*.js</include>
            </includes>
            <excludes>
              <exclude>**/include/*.js</exclude>
            </excludes>
          </aggregation>
        </aggregations>
      </configuration>
    </execution>
  </executions>
</plugin>
Now I want the same JS files aggregated, without minification or obfuscation, into a file allForDev.js. The goal is to have one file for development and one for production. It's going to be useful to see the whole scripts when debugging in developer tools. If I don't find a way to do this I'll be forced to place a lot of script tags to load all those scripts (which is not the end of the world :) but I would like to do it in a cleaner way).
I can see that the assembly plugin supports the following formats:
zip, tar.gz, tar.bz2, jar, dir, war, and any other format that the
ArchiveManager has been configured for
Is there a way I can use the Maven assembly plugin to do this? As much as I looked, there were plenty of examples for creating zips, jars and wars, but none matching what I want to do. Or did I miss something?
Is there another plugin I could use?
As a side note, I tried using a second execution of the YUI plugin to create a second JS file, but I had no luck creating two files. I also tried declaring the YUI plugin twice, with no luck again. I think that's not possible either.
Cheers,
Despot
The answer lies in the wro4j library. For a more precise setup see:
Javascript and CSS files combining in Maven build WITHOUT compression, minification etc
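If you go the wro4j route, a rough sketch (group name and path pattern are assumptions) is a wro.xml group listing the scripts; with the preProcessors/postProcessors lists in wro.properties left empty, the resources are only concatenated, not minified:
<!-- wro.xml -->
<groups xmlns="http://www.isdc.ro/wro">
  <!-- rendered as allForDev.js, a plain concatenation of the matched scripts -->
  <group name="allForDev">
    <js>/js/**.js</js>
  </group>
</groups>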
If you do not want to hassle with the wro4j plugin (from despot's answer), as I didn't, and want to prototype quickly, you can actually use the old maven-antrun-plugin with a configuration similar to the following:
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <configuration>
        <target>
          <property name="root" location=""/>
          <property name="jsRoot" location="${root}/src/main/webapp/js"/>
          <property name="jsAggregated" location="${root}/src/main/webapp/all.js"/>
          <echo message="Aggregating js files from ${jsRoot} into ${jsAggregated}"/>
          <concat destfile="${jsAggregated}" encoding="UTF-8">
            <fileset dir="${jsRoot}" includes="*.*"/>
            <filelist dir="${jsRoot}/.." files="client.js"/>
          </concat>
        </target>
      </configuration>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>
This one concatenates all files in the folder <module>/src/main/webapp/js/*.* (but not files from subfolders). Then it appends client.js at the end to make sure everything it depends on is already available (AFAIK <fileset> has undefined order).
The resulting concatenated file then resides at <module>/src/main/webapp/all.js.
I know the Maven phase, paths and other details may not be "correct" - this is just a quick example to show an alternative, non-invasive way to do it.

Maven ProGuard processing a library jar that other applications will depend on

Here is what my build plugin stanza looks like:
<plugin>
  <groupId>com.pyx4me</groupId>
  <artifactId>proguard-maven-plugin</artifactId>
  <version>2.0.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>proguard</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <options>
      <option>-dontshrink</option>
      <option>-dontnote</option>
      <option>-allowaccessmodification</option>
      <option>-dontskipnonpubliclibraryclasses</option>
      <option>-dontskipnonpubliclibraryclassmembers</option>
    </options>
    <libs>
      <lib>${java.home}/lib/rt.jar</lib>
      <lib>${java.home}/lib/jsse.jar</lib>
    </libs>
  </configuration>
</plugin>
Here is what I get from executing mvn clean package:
[proguard] Error: You have to specify '-keep' options for the shrinking step.
How do I specify the keep options for a library where I just want obfuscation?
You must define the entry points of your application with the -keep option, because they cannot be obfuscated. For example, if your main class is obfuscated it will be renamed and you won't be able to launch it. The same applies to the public interfaces of your APIs.
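For a library where you only want obfuscation, a typical -keep rule (a sketch; adjust it to your actual API) preserves the public classes and members so that dependent applications can still compile against them, added to the <options> above:
<options>
  <!-- existing options -->
  <option>-dontshrink</option>
  <option>-dontnote</option>
  <!-- entry points: keep the library's public API names intact, obfuscate the rest -->
  <option>-keep public class * { public protected *; }</option>
  <!-- remaining options as above -->
</options>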
