Deploying a war to tomcat server with maven 1.1 - tomcat6

Does anyone know a way to configure a Maven goal to deploy to a Tomcat server after a build is run? I know that this is possible using the maven-tomcat-plugin, but it looks as though this only works for Maven 2, whereas I'm using Maven 1.1.
I'm currently trying to set up Hudson, so this would be part of my continuous integration phase, which I hope will run like this:
Build all necessary components
Build war and deploy to (local) server
Run selenium tests
Any help with this would be much appreciated.
Thanks.

Honestly, I would refactor your project to use Maven 2; there are several guides that can help ease the migration pain (google "maven 2 migration"), and there is even the maven-one-plugin to convert your project.xml or package your Maven 1 plugins for Maven 2.
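If you do go the Maven 2 route, the one-plugin's convert goal can generate a pom.xml from your existing project.xml; run it from the project directory (the generated POM usually still needs some manual cleanup):
mvn one:convert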
If you can't do that, you could use the Maven 1 Ant plugin to copy the WAR to Tomcat's webapps directory after it has been packaged; Tomcat will detect the new WAR and should hot-deploy it.
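A rough, untested sketch of what that could look like in maven.xml, hooking in after the WAR is built (the Tomcat path is a placeholder):
<postGoal name="war:war">
  <copy file="${maven.build.dir}/${pom.artifactId}.war"
        todir="/path/to/tomcat/webapps" overwrite="true" />
</postGoal>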

I have to admit I don't know much about the Maven plugin, but I do everything in a simple script that also cleans the work directories (I don't know whether the Maven plugin cleans the work directories):
CALL mvn clean install
del C:\apps\tomcat\webapps\Foo.war
rmdir /s /q C:\apps\tomcat\webapps\foo
rmdir /s /q C:\apps\tomcat\work\Catalina
copy /y C:\webapps\workspace\Foo\target\Foo.war C:\apps\tomcat\webapps\Foo.war
(I know, -1 for MS scripting)
The point is you generally want to clean the work directory and the webapps directory, and the Maven 1 Ant plugin does not do this (as far as I know, and as far as I can tell from the link provided). Tomcat is "supposed" to recreate the class files in these directories when it explodes the WAR file, but anybody who has worked with it long enough knows this isn't always the case.
Therefore, if the plugin does not clean these directories, it's useless as far as I am concerned. Write yourself a cheap little script like the one provided. It takes 2 minutes.

I've figured out the best way to do this - it's actually pretty easy to write a Maven goal to transfer the WAR. The goal could be written as follows:
<goal name="deployWar" prereqs="buildWar">
  <echo message="+---------------------------------------------------+" />
  <echo message="installing war file to server" />
  <echo message="+---------------------------------------------------+" />
  <j:set var="deploy.dir" value="${server}/webapps" />
  <copy file="${maven.build.dir}/${pom.artifactId}.war"
        todir="${deploy.dir}" overwrite="true" />
</goal>
The server variable can be defined in your project.properties file. Also, be sure to specify the prerequisite so the WAR is built before you try to deploy it (the prereqs attribute above). Hope this helps someone!
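For example, project.properties might contain something like this (the path is illustrative):
server=/path/to/tomcat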

webappDirectory can be configured for the maven-war-plugin to deploy the exploded WAR. Nothing special is needed; just run mvn install.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.1.1</version>
  <configuration>
    <webappDirectory>path/to/server/deploy/dir</webappDirectory>
  </configuration>
</plugin>

Related

Changing version of cxf on Service Mix

I want to change the version of CXF on ServiceMix 7.0.1, where the CXF version is 3.1.9, because I am using Brave tracing, which was introduced in CXF 3.1.12.
So, is there a way to change the version of CXF on ServiceMix?
I have manually deleted everything in the system/apache/cxf folder that has version 3.1.9 and added a file with version 3.2.5, but it is still not working. When I run feature:list, all of the CXF dependencies are still at version 3.1.9...
Deleting files from the system folder won't work. It does not scan the system folder for files, but rather uses it as a cache when looking for specific versions. You don't need to add new versions to system either, because it will download them from the central Maven repo if they aren't in system.
If it starts up without a data folder, it will install the features & versions listed in org.apache.karaf.features.cfg.
One would expect to be able to delete the data folder, change the version in org.apache.karaf.features.cfg & start it up, but I tried that and Camel was broken. Unsure why.
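For reference, the entries involved live in etc/org.apache.karaf.features.cfg; an illustrative excerpt (the keys are real, the values are abbreviated):
# etc/org.apache.karaf.features.cfg
featuresRepositories = mvn:org.apache.cxf.karaf/apache-cxf/3.1.9/xml/features, ...
featuresBoot = ...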
I find it easier to deal with it using the management console.
Install the management console by dropping the following xml file into the deploy folder:
<features name="features-murray" xmlns="http://karaf.apache.org/xmlns/features/v1.2.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://karaf.apache.org/xmlns/features/v1.2.0 http://karaf.apache.org/xmlns/features/v1.2.0">
  <repository>mvn:io.hawt/hawtio-karaf/1.5.7/xml/features</repository>
  <feature name="murray" version="1" install="auto">
    <feature>hawtio-offline</feature>
  </feature>
</features>
Then point your browser at http://localhost:8181/hawtio and login with SMX/SMX.
From OSGi/Features, add your new feature version with the plus button:
mvn:org.apache.cxf.karaf/apache-cxf/3.1.12/xml/features
It may take some time to install, because it downloads it from the net. I found it bounced me out of the management console also, but after logging back in I could uninstall the old cxf 3.1.9. It again logged me out of the management console, but after logging back in I had Camel active and CXF on 3.1.12.
No testing though - goodness knows what else is broken.
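If you prefer the Karaf shell to hawtio, the rough (untested) equivalent, assuming the standard feature commands are available in ServiceMix 7, would be:
feature:repo-add mvn:org.apache.cxf.karaf/apache-cxf/3.1.12/xml/features
feature:install cxf/3.1.12
feature:uninstall cxf/3.1.9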

Unable to deploy profile to JBoss Fuse 6.1 using fabric8:deploy

I am trying to deploy a simple Camel route to my local instance of JBoss Fuse 6.1 (GA release). I am trying to use the fabric8-maven-plugin to do so, but every time I run fabric8:deploy, I receive the following error:
Failed to execute goal io.fabric8:fabric8-maven-plugin:1.0.0.redhat-379:deploy (default-cli) on project filemover: Error executing: IO-Error while contacting the server: org.apache.http.NoHttpResponseException: The target server failed to respond
Here is my current plugin definition from my POM file:
<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>fabric8-maven-plugin</artifactId>
  <version>1.0.0.redhat-379</version>
  <configuration>
    <profile>sample-filemover</profile>
    <parentProfiles>feature-camel</parentProfiles>
    <features>mq-fabric-camel</features>
  </configuration>
</plugin>
My ~/.m2/user/settings.xml file contains the following server definition
<server>
  <id>fabric8.upload.repo</id>
  <username>admin</username>
  <password>admin</password>
</server>
And I am executing the following mvn command
mvn fabric8:deploy -Dmaven.test.skip=true
(I realize I am skipping the tests, but I am trying to just deploy a profile at this time)
I can log onto the management console just fine and can see the root container no problem. Have I missed something in the configuration of Fuse to enable this?
I just spent some hours on the same problem.
Just change the version to 1.1.0.CR5 and you can deploy using mvn fabric8:deploy.
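For example, the plugin block from the question would become (only the version changes):
<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>fabric8-maven-plugin</artifactId>
  <version>1.1.0.CR5</version>
  <configuration>
    <profile>sample-filemover</profile>
    <parentProfiles>feature-camel</parentProfiles>
    <features>mq-fabric-camel</features>
  </configuration>
</plugin>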
Best Regards
Are you sure you are deploying to the correct server? Just set the fabric URL of the server where you want to deploy. I'm using that plugin version and it works correctly.
I know it may be too late, but just to let you know, you can deploy to any server with
mvn clean fabric8:deploy -Dfabric8.jolokiaUrl=http://localhost:8181/jolokia
Cheers

Running Solr with Jetty

I'm having a little trouble understanding how Solr fits in with Jetty, and why I can't seem to get the start.jar in the distribution package to work.
I can run all of the example configurations via java -jar start.jar. However, when I try to run something like the following --
java -Dsolr.solr.home=/Users/jwwest/solr -jar $(brew --prefix solr)/libexec/example/start.jar
-- the following error occurs:
java.io.FileNotFoundException: No XML configuration files specified in start.config or command line.
at org.eclipse.jetty.start.Main.start(Main.java:506)
at org.eclipse.jetty.start.Main.main(Main.java:95)
I opened up the start.jar file, and there is a start.config file located inside the jar which I'm assuming should handle this configuration for me. I don't understand why it works when run from inside the distribution's example directory, but not outside of it.
You also need to define the jetty.home property. Try:
java -Dsolr.solr.home=/Users/jwwest/solr -jar $(brew --prefix solr)/libexec/example/start.jar -Djetty.home=$(brew --prefix solr)/libexec/example
You can see the effective command line start.jar generates by using the --dry-run command line flag.
java -jar start.jar --dry-run
That will output everything with full path names so you can run it from outside the directory.
Source: http://www.eclipse.org/jetty/documentation/9.0.0.M3/advanced-jetty-start.html
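For example, you could capture that output from inside the example directory and reuse it elsewhere (a sketch using the Homebrew paths from the question; you may still need to add -Dsolr.solr.home to the resulting command):
(cd $(brew --prefix solr)/libexec/example && java -jar start.jar --dry-run) > /tmp/start-solr.sh
sh /tmp/start-solr.sh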
The start.jar is a Jetty-specific mechanism that builds out all the classpath requirements for starting up Jetty. It is generally only used in the scope of the Jetty distribution. Pulling the start.jar out of that layout and placing it somewhere else renders the default configuration in start.config rather moot.
My understanding of Solr is that it bundles itself with a distribution of Jetty, placing what it needs to run into the distribution and repackaging it as its own. They may have a custom start.config file that further adds its own locations for classpath resources and the like, or not.
The exception you are seeing stems from the start.config file expecting an etc/ directory containing jetty.xml-formatted XML files which are used to configure the Jetty process.
Jetty often being used in an embedded format has little to do with this issue; it is simply a common use case because Jetty is incredibly easy to embed into an application. Embedded instances of Jetty rarely (if ever) leverage a start.jar; instead it is up to the embedding application to manage its own classpath.
First, change to the folder where start.jar is located, then execute the same command.
Jetty is often used as an embedded container. If you want to use the bundled Jetty, a good start would be to copy the example directory and rename it to whatever you want (a rough sketch follows below); the solr directory inside it holds the basic configuration.
Otherwise, it is recommended to use Tomcat and the solr.war file.
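A sketch of the copy-the-example approach, reusing the Homebrew layout from the question (the target directory name is arbitrary; by default the copy's own solr directory is used as the Solr home):
cp -r $(brew --prefix solr)/libexec/example ~/mysolr
cd ~/mysolr
java -jar start.jar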

Using JSVC to daemonize a Java app packaged with the Maven One-Jar Plugin

Here is the problem:
I have packaged my Java application into a single jar using the Maven plugin One-Jar.
Now I want to run the application as a Unix Daemon using JSVC, i.e. Apache Commons Daemon.
I am using JSVC as follows (which works for Jars made with the Maven assembly plugin, etc):
jsvc -user $USER -home $HOME -pidfile $PID_PATH -cp $PATH_TO_ONE_JAR my.package.MyClass
The error is this:
jsvc.exec error: Cannot find daemon loader org/apache/commons/daemon/support/DaemonLoader
jsvc.exec error: Service exit with a return value of 1
Does anyone know if it is even possible to use JSVC and One-Jar together, since One-Jar uses a custom class loader? The jar runs just fine when I run java -jar my-one-jar.jar.
What can be done?
Thank you for any insight!
I had to add all the jar dependencies to the classpath option of jsvc. It seems jsvc doesn't use the jars inside another jar.
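In other words, something along these lines (the jar names and paths are placeholders; commons-daemon.jar itself also needs to be on the classpath):
jsvc -user $USER -home $HOME -pidfile $PID_PATH \
  -cp /path/to/commons-daemon.jar:/path/to/lib/dependency-one.jar:/path/to/lib/dependency-two.jar:/path/to/my-app.jar \
  my.package.MyClass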
If you use the (poorly documented) Maven Shade Plugin instead of One-Jar (they can achieve similar results), it should solve your problem. It unpacks the dependent jars and stores the class files directly in the fat jar (rather than having jars within the jar). I have used it to create an executable jar for running under JSVC with some success.
Of course, things are seldom as simple as they sound. With the Shade plugin, you may have to do some work to relocate classes when there are conflicts in your dependency tree, or use resource transformers to handle your non-Java resource files. But hopefully not.
(Of course Mkyong.com has a guide on this)
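For reference, a minimal shade configuration along those lines might look like this (the version and main class are placeholders):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>my.package.MyClass</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>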

Can I change the location of the deployment script for VS2010 database project from TFS CI Build

I'm having trouble directing the output of a TFS CI build to directory locations other than the default.
I have two database projects integrated with our large .NET project. Everything builds and does just what we expect on the local PC; files end up where you expect them under the database projects' sql/Debug or Release folder.
On our TFS build server the project builds and generates everything properly, but it dumps all of the database project output files (.sql, .schema, etc.) into the root of the TFS output directory. It's getting pretty messy there, since there are several projects that seem to cause that to happen.
At the moment I am only concerned with the database projects. Is there a way to specify, either in the deploy of the project or the build definition (or anywhere else I haven't thought to look), where these files will be output?
thanks
A simple way to do this is to create an MSBuild project that runs at the end of the compilation process.
Create a file called DropTidy.proj and then add something like the following.
<Project DefaultTargets="CopySQLReleaseFiles" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="3.5">
  <Target Name="CopySQLReleaseFiles">
    <ItemGroup>
      <SqlBuildOutput Include="$(OutDir)\*.sql" />
    </ItemGroup>
    <Copy SourceFiles="@(SqlBuildOutput)" DestinationFolder="$(OutDir)\SQL" />
  </Target>
</Project>
The example above will copy all files with an extension of .sql into a folder called "SQL". $(OutDir) is the working folder used by Team Build that relates to the "Binaries" folder in the build workspace on your build agent.
Check the file in to TFS and then add it to your "Items to Build" list in the Team Build process. Make sure that it's the last "solution" in the list so that it runs after the other solutions in your build. Also make sure that the folder you checked the proj file in to is part of your build's workspace.
Run your build and you should see a new folder called SQL in your drop location.
