What is <classifier> in <configuration> in Maven - maven-plugin

What is the use of the <classifier> tag inside the <configuration> tag in Maven?
Example:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <version>2.2</version>
      <executions>
        <execution>
          <id>pre-process-classes</id>
          <phase>compile</phase>
          <goals>
            <goal>jar</goal>
          </goals>
          <configuration>
            <classifier>pre-process</classifier>
          </configuration>
        </execution>
      </executions>
    </plugin>
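With this configuration, the jar goal runs during the compile phase and attaches an additional artifact, named something like myapp-1.0-pre-process.jar (name illustrative), alongside the project's main jar.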

LMGTFY:
classifier: The classifier distinguishes artifacts that were built from the same POM but differ in their content. It is an optional and arbitrary string that, if present, is appended to the artifact name just after the version number.
As a motivation for this element, consider for example a project that
offers an artifact targeting JRE 1.5 but at the same time also an
artifact that still supports JRE 1.4. The first artifact could be
equipped with the classifier jdk15 and the second one with jdk14 such
that clients can choose which one to use.
Another common use case for classifiers is the need to attach
secondary artifacts to the project’s main artifact. If you browse the
Maven central repository, you will notice that the classifiers sources
and javadoc are used to deploy the project source code and API docs
along with the packaged class files.
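As a sketch of how a consumer picks one of those variants, a dependency can name the classifier explicitly (the coordinates below are illustrative):

    <dependency>
      <groupId>com.example</groupId>
      <artifactId>my-library</artifactId>
      <version>1.0</version>
      <!-- resolves my-library-1.0-jdk15.jar instead of my-library-1.0.jar -->
      <classifier>jdk15</classifier>
    </dependency>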

Related

[Apache Flink]: Where is the flink-s3-fs-hadoop plugin?

I would like to read and write some data with Apache Flink 1.11.2 from S3. The documentation recommends using the Presto plugin for checkpoints and the Hadoop plugin for pipeline data.
According to that section, you have to copy the plugins from /opt to /plugins. I can find flink-s3-fs-presto-1.11.2.jar under /opt, but there is no flink-s3-fs-hadoop-1.11.2.jar. Where can I find the s3-hadoop plugin for setting up my production environment?
And how can I use these plugins in the IDE? By simply adding them to pom.xml as provided dependencies? And then how can I pass the credentials in the IDE?
That is weird; I can see that they are both present in the official binaries under opt in 1.11.1. However, if you can't find them, you can simply get the jars from Maven here and copy them to the required place. Another thing that may work is adding the dependency to the project with compile scope.
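A minimal sketch of that dependency approach (version taken from the question; compile scope so the S3 filesystem classes are on the classpath when running from the IDE):

    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-s3-fs-hadoop</artifactId>
      <version>1.11.2</version>
      <scope>compile</scope>
    </dependency>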
Running the job locally is described here. There are various ways of configuring the credentials when running the job in the IDE; one might be adding a core-site.xml to the resources folder with the proper configuration.
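For instance, a minimal core-site.xml sketch (fs.s3a.access.key and fs.s3a.secret.key are the standard Hadoop S3A property names; the values are placeholders):

    <?xml version="1.0"?>
    <configuration>
      <property>
        <name>fs.s3a.access.key</name>
        <value>YOUR_ACCESS_KEY</value>
      </property>
      <property>
        <name>fs.s3a.secret.key</name>
        <value>YOUR_SECRET_KEY</value>
      </property>
    </configuration>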
EDIT:
As for local execution, it was explained here a little bit.

Not able to start the bundle in ServiceMix

I have a bundle up and running in ServiceMix. I went to my company's repository and downloaded the corresponding JAR to my local machine. I extracted that JAR and found that it contained only one folder, META-INF.
Inside this folder there is a MANIFEST.MF file and my resources, such as the Spring configuration file and the Camel context file.
There I got my first question: where are the source files of this JAR, i.e. the Java classes? The only things I saw were the manifest file, pom.xml, another pom.properties file, and a couple of other configuration files for Spring and Camel.
This led to my next step. I had a local copy of this project in my workspace as well. I built the project locally and found the JAR in the target directory of the project.
The following steps might seem silly, but I did a little experiment anyway. I extracted the JAR from target to see its content. I believed it was a bundle because I used the maven-bundle-plugin, and there is no way to tell just by looking at a JAR whether it is a plain JAR or an OSGi bundle. So I extracted the JAR, and guess what: this time it did contain the compiled Java classes.
This is not the end; I did something silly again. I removed the compiled classes from this JAR to make it exactly the same as the one I copied from my company's central repository, then used the JDK's jar utility to create a new JAR.
Now I have two JARs:
one which I downloaded from the company's central repo;
another one which I created myself, with exactly the same content as the first. I even used the same MANIFEST.MF while creating it (since I knew the manifest is the backbone of an OSGi bundle).
I secure-copied this bundle to my server's home directory and finally installed the bundle/JAR in ServiceMix using:
install file:path_to_JAR/JAR_FILE_NAME.
It installed successfully, but when I tried to start the bundle, it could not start. Using display-exception, I saw that it wasn't able to load the beans and could not initialize the application context, followed by a more specific ClassNotFoundException. I understand that it wasn't able to find the classes referenced in my application context. But why?
I did exactly the same steps, and I checked them multiple times. If mine could not start, why is the earlier one up and running?
This might sound silly to others who have worked in an OSGi environment, but now I am starting to have second thoughts, especially about ServiceMix.
Thanks for any suggestions.
This is not really about OSGi; it's about your application.
As I don't know your project, I can only make some assumptions.
First, the JAR you got from the company repository is most likely an "older" version that does not match your local sources. With ServiceMix it's quite possible to have only Blueprint or Spring XMLs in your bundle, because those are valid resources that the Camel Blueprint/Spring extenders are able to pick up. Those XMLs are interpreted, and if they only use standard Camel components, there is no reason to have a single class inside the bundle.
Now back to your newly created bundle: obviously you have some new "code" in your Camel XML which requires not only standard Camel classes but also some processors you created on your own, and those classes need to stay in the bundle!
Best just deploy the newly created bundle with all its classes; you should rather check what has changed in the Camel XML files.
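For reference, a minimal maven-bundle-plugin sketch that keeps your own classes packaged inside the bundle (the package name is illustrative):

    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <extensions>true</extensions>
      <configuration>
        <instructions>
          <!-- bundle your own processor classes without exporting them -->
          <Private-Package>com.example.myapp.*</Private-Package>
        </instructions>
      </configuration>
    </plugin>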

With Maven, do I still need to configure a JDBC driver and datasources?

I want to write a Java EE 6 app on JBoss 7.1, with JPA (Hibernate as the JPA provider) and SQL Server.
The build tool is, for better or worse, Maven 3.
What does Maven arrange/set up for me in terms of connecting JBoss 7 to a database and JPA?
Do I need to set up a JDBC driver as well as a datasource, or does Maven automagically set these up for me via transitive dependencies and its build cycle? In other words, do I need to go into standalone.xml, as well as into the module folder to modify a module.xml file, etc.?
I can't find a tutorial that has all of the steps or a sample pom.xml.
No, Maven will not automagically set up dependencies on JDBC providers. You must declare them explicitly in your own project POM.
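A sketch of such a declaration, assuming the jTDS driver for SQL Server (chosen because it is available in Maven Central; the version is illustrative). The provided scope assumes the driver is installed in the container as a module rather than packaged in your archive:

    <dependency>
      <groupId>net.sourceforge.jtds</groupId>
      <artifactId>jtds</artifactId>
      <version>1.3.1</version>
      <scope>provided</scope>
    </dependency>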
As far as tutorials go, the canonical references are on the Sonatype web site.
See: Maven by Example,
and specifically regarding project dependencies, Maven: The Complete Reference:
http://books.sonatype.com/mvnref-book/reference/pom-relationships-sect-project-dependencies.html
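As for the standalone.xml part of the question: the datasource is container configuration that Maven does not generate, so it still has to be defined there (or via the management console/CLI). A sketch of what such an entry in the datasources subsystem might look like (JNDI name, URL, driver name, and credentials are all illustrative):

    <datasource jndi-name="java:jboss/datasources/MyDS" pool-name="MyDS">
      <connection-url>jdbc:jtds:sqlserver://localhost:1433/mydb</connection-url>
      <driver>jtds</driver>
      <security>
        <user-name>dbuser</user-name>
        <password>secret</password>
      </security>
    </datasource>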

DataNucleus libraries and maven-gae-plugin

I'm using the maven-gae-plugin to manage a Google App Engine project, but I don't know how to include the libraries required to use JPA.
Google's documentation says:
The classpath must contain the JARs 'datanucleus-core-*.jar', 'datanucleus-jpa-*', 'datanucleus-enhancer-*.jar', 'asm-*.jar', and 'geronimo-jpa-*.jar' (where * is the appropriate version number of each JAR) from the 'appengine-java-sdk/lib/tools/' directory, as well as all of your data classes.
How can I tell the plugin to put all the jars in the classpath?
So far I have just edited the pom.xml file, setting gae.version to 1.7.3 (leaving datanucleus.version at 1.1.5), and ran mvn gae:unpack, but I cannot get it to work.
First, I have problems with javax.persistence not being found. Do I have to add it manually to pom.xml?
If I do, the development server starts, but I cannot work with the storage: I get the following error:
SEVERE: Found Meta-Data for class com.sharecost.entities.User but this class is not enhanced!! Please enhance the class before running DataNucleus.
I found a solution to the second part of my question. Looking at the pom.xml file, I discovered that all entities are supposed to be in a **/model package.
I still don't know if the manual inclusion of the javax.persistence dependency is actually required.
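If the manual inclusion does turn out to be necessary, a sketch of declaring the JPA API via the Geronimo spec jar from Google's list (coordinates and version are an assumption, matching what the GAE SDK shipped at the time):

    <dependency>
      <groupId>org.apache.geronimo.specs</groupId>
      <artifactId>geronimo-jpa_3.0_spec</artifactId>
      <version>1.1.1</version>
    </dependency>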

DataNucleus Enhancer doesn't work

I'm writing a web app using Google App Engine and Spring MVC. I carefully upgraded to v2 of the DataNucleus plugin by following these steps: http://code.google.com/p/datanucleus-appengine/wiki/UpgradingToVersionTwo (I use Eclipse).
When I try to run the Enhancer Tool, I get the following error:
Exception in thread "main" Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL
"file:/.../eclipse/plugins/com.google.appengine.eclipse.sdkbundle_1.6.4.v201203300216r37/appengine-java-sdk-1.6.4/lib/opt/user/datanucleus/v2/datanucleus-core-3.0.6.jar" is already registered, and you are trying to register an identical plugin located at URL
"file:/.../eclipse/plugins/com.google.appengine.eclipse.sdkbundle_1.6.4.v201203300216r37/appengine-java-sdk-1.6.4/lib/opt/tools/datanucleus/v2/datanucleus-core-3.0.6.jar."
I formatted the message so that you can see the tiny difference: one jar is loaded from the "user" directory, the other one from the "tools" directory. I don't understand why. In the project build path there is only the one from "user", and to the DataNucleus configuration I added the one from "tools", just as the howto above suggested.
In other cases I've seen, this message was mostly caused by conflicting versions of the DataNucleus plugin, but that doesn't apply to me. I guess it's just some stupid thing in my case... so what am I doing wrong?
So, after all, I didn't read the instructions as carefully as I thought. The problem was really that the jars were there twice: once in the project build path and once in the DataNucleus configuration. The jar shouldn't be in both places (it doesn't matter which one you remove it from). I had added it to the project build path when I copied the libs to the war directory, assuming it had to be done. But the instructions clearly say that only jdo-api needs to be in the project build path.
One thing I don't understand, though: in one step of the instructions I had to uncheck "use project classpath when running tools" in the DataNucleus configuration. So how is it possible that the plugin was still using the libs configured in the project build path?
