How can I get the current Akka version in Play Framework 2.4.1? I've searched all the configuration files and I don't see any Akka dependency (although I can use Akka in Play).
Within activator, use show compile:dependencyClasspath and you'll get a list of all jars used at compile time.
It's ugly and unformatted, but it works.
If you want to have something more structured, you can install sbt-dependency-graph to get more control.
Shut down activator
Add the following to ~/.sbt/0.13/plugins/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")
Add the following to ~/.sbt/0.13/global.sbt
net.virtualvoid.sbt.graph.Plugin.graphSettings
You'll then have a bunch of different dependency-related tasks available in sbt/activator, such as dependency-graph.
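Once the plugin is in place, a quick check from the activator/sbt console looks roughly like this (task name as provided by sbt-dependency-graph; look for the com.typesafe.akka:akka-actor entry in the output to see which Akka version Play pulled in):
dependency-tree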
Is there a way to get any Apache Camel component's "metadata" using Java code, like the list of options and other parameters and their types? I think some automatic help builder was mentioned somewhere that might be of use for this task without using reflection.
I'd also like a way to get the registered components of all types (including data formats and languages) from Java code. Thanks
Yeah, take a look at the camel-catalog JAR, which includes all such details. This JAR is what the tooling uses, such as some of the Maven tooling itself, or the IDE plugins for IntelliJ or Eclipse etc. The JAR has both a Java API and embedded metadata files you can load.
At runtime you can also access this catalog via RuntimeCamelCatalog, which you can obtain from the CamelContext. The runtime catalog is a little more limited than CamelCatalog, as it only has a view of what is actually available at runtime in the current Camel application.
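As a rough sketch (class and method names from the CamelCatalog API in camel-catalog; verify them against the Camel version you're on), listing everything and dumping one component's metadata looks something like this:
import org.apache.camel.catalog.CamelCatalog;
import org.apache.camel.catalog.DefaultCamelCatalog;

public class CatalogDemo {
    public static void main(String[] args) {
        // Standalone catalog backed by the metadata files embedded in camel-catalog
        CamelCatalog catalog = new DefaultCamelCatalog();

        // Names of all registered components, data formats and languages
        System.out.println("Components:   " + catalog.findComponentNames());
        System.out.println("Data formats: " + catalog.findDataFormatNames());
        System.out.println("Languages:    " + catalog.findLanguageNames());

        // JSON metadata for a single component, including its options and their types
        System.out.println(catalog.componentJSonSchema("timer"));
    }
}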
Also I cover this in my book Camel in Action 2nd edition, where there is a full chapter devoted to Camel tooling and how to build custom tooling etc.
This is what I've found so far
http://camel.apache.org/componentconfiguration.html
How can I configure Solr logs to get sent to Azure Application Insights?
I see I can use a Log4J appender.
https://learn.microsoft.com/en-us/azure/application-insights/app-insights-java-trace-logs
Solr is an open source project, and I don't compile it myself, I just use the distribution.
How can I drop in the Application Insights Log4J appender without recompiling, having installed the SDK?
I just want to configure the logs to get sent to Application Insights, effectively for a third-party application.
And configure the instrumentation key.
I'm normally a C# dev, but familiar with Log4Net, so apologies if this is simple in Java Log4J. I haven't been able to find a post for this scenario, so I'm posting here.
Using Solr 6.6.
It takes a lot less configuration than you'd expect, and most of the info is hidden away in the link that you've already got: https://learn.microsoft.com/en-gb/azure/azure-monitor/app/java-trace-logs
First, go download the jar files from https://github.com/Microsoft/ApplicationInsights-Java/releases. You'll want applicationinsights-logging-log4j1_2-2.3.0 and applicationinsights-core-2.3.0. Put these in the server/lib folder and Solr will load them automatically for you.
Next you'll need to add a new appender for App Insights into your log4j.properties file:
# Appinsights
log4j.appender.aiAppender=com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender
log4j.appender.aiAppender.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.aiAppender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) [%X{collection} %X{shard} %X{replica} %X{core}] %c{1.} %m%n
You also need to add this aiAppender to the log4j.rootLogger list in the same file (it'll probably look something like this: log4j.rootLogger=INFO, file, CONSOLE, aiAppender)
Finally, you need an ApplicationInsights.xml file, which you can get an example of from here https://learn.microsoft.com/en-gb/azure/azure-monitor/app/java-get-started#2-add-the-application-insights-sdk-for-java-to-your-project
Drop this in the server/resources folder, set your instrumentation key and you're good to go!
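For reference, a minimal ApplicationInsights.xml along the lines of the linked example looks roughly like this (the key value is a placeholder for your own instrumentation key):
<?xml version="1.0" encoding="utf-8"?>
<ApplicationInsights xmlns="http://schemas.microsoft.com/ApplicationInsights/2013/Settings">
  <!-- Replace with the instrumentation key from your Application Insights resource -->
  <InstrumentationKey>00000000-0000-0000-0000-000000000000</InstrumentationKey>
</ApplicationInsights>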
I'm running a Camel application on Liberty Profile server. I'm taking a message from a queue, unmarshalling, mapping then marshalling. This was working fine but now I'm getting an error that JAXBDataBinding method getContextualNamespaceMap is not found.
I think this is because there is an older version of the jar in the server libs but I don't know why it started using it.
IBM Jar: com.ibm.ws.org.apache.cxf-rt-databinding-jaxb.2.6.2_1.0.12
The issue is resolved if I switch to parent-last class loading, but it's a very hacky way to fix it and is not a great option. Any other ideas? I'm thinking some feature or other dependency in my build may have pulled this jar in.
So it does look like getContextualNamespaceMap is only available in newer versions of the org.apache.cxf-rt-databinding-jaxb JAR than what is available in Liberty.
It might be that parentLast is the best option then. (You already know how to do this, but it's documented here.) If it leads to some other issues then do follow up with another question.
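For anyone else landing here, parent-last delegation is configured per application in Liberty's server.xml, roughly like this (the application location is just an example name):
<application location="myCamelApp.war">
    <classloader delegation="parentLast"/>
</application>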
I suppose it's conceivable that you might be able to look at whatever is packaged within your application, try removing a set of things, and pick them up from the Liberty runtime instead, to avoid running in parentLast mode. E.g. if you are only referencing getContextualNamespaceMap because of other code in your app, but there is some alternative path you could have gone down entirely within the Liberty-provided modules, then in theory you could be OK.
I'm not familiar enough with the code paths in the modules in the CXF or Camel "stack" to guess whether that's a real-world likelihood though.
The javaee7 feature contained a jaxws version that clashed with the server version. Removing the javaee7 feature has resolved this issue. It remains to be seen whether or not I will need to add it back in.
I'm using MSpec to drive some automated UI tests using Selenium WebDriver, much like the examples I found online. I'm having problems getting it to take a screenshot when a test fails.
I saw a comment on another issue saying it works because they have a ResultSupplementer in the sample web specs. However, ResultSupplementer does not seem to exist in the latest version of MSpec (0.9.1).
Is there a different way to do this in the latest version of mspec? Ultimately, I'm going to generate HTML reports as TeamCity artifacts and include the screenshot on any failing specs.
I've updated the samples for the latest version of MSpec (in short, you need to implement ISupplementSpecificationResults yourself).
I've also merged the solutions and converted the MVC project to Nancy. You'll find that there's a bit more infrastructure-related code that grew over the last couple of years and works around various things, like
status codes 4xx and 5xx logged by IIS Express
IIS and Chrome Driver ports bound by other processes
page objects access the web driver with a high-level API
I use Paket for dependency management because it's far more powerful than plain NuGet
All that said, you need to run msbuild.exe mspec-samples.sln and then All-Specs.cmd. I've also checked that a TeamCity build creates screenshots.
I have an OSGi-based, server side application that uses the filesystem to store scripts and configuration data.
In time, I'd like to move that application to 'the cloud', and that's not going to work well with its current dependency on file system access.
What I'd like to do is insert a JCR layer into this application, so it will still work in the current situation (regular files on the local filesystem), but will pave the way to a cloud situation.
I did find a file connector in ModeShape, but I ran into a pretty severe incompatibility with OSGi which hasn't been fixed. Besides, ModeShape pulls in LOTS of dependencies (about 6 MB, I think), which is a problem for me.
So I don't see any options besides starting to hack my own JCR implementation, which I am reluctant to do.
Any ideas?
Although you wouldn't be using JCR directly, using the Apache Sling ResourceProvider mechanism should allow you to move easily from filesystem to something else later, and it's OSGi-friendly as Sling is 100% based on OSGi.
You could start now by using Sling's Filesystem resource provider ( http://sling.apache.org/site/accessing-filesystem-resources-extensionsfsresource.html ) and later move to your own custom ResourceProvider, as needed.
The source code of the filesystem provider is at https://svn.apache.org/repos/asf/sling/trunk/bundles/extensions/fsresource - it's quite simple code that can be used as an example for creating your own ResourceProvider.
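As a very rough skeleton of what a custom provider can look like (this uses the older org.apache.sling.api.resource.ResourceProvider interface; the path and resource type are made-up example values, and the newer SPI in org.apache.sling.spi.resource.provider works differently):
import java.util.Collections;
import java.util.Iterator;
import javax.servlet.http.HttpServletRequest;
import org.apache.sling.api.resource.Resource;
import org.apache.sling.api.resource.ResourceProvider;
import org.apache.sling.api.resource.ResourceResolver;
import org.apache.sling.api.resource.SyntheticResource;

// Hypothetical provider that would sit in front of your script/config store.
public class ScriptStoreResourceProvider implements ResourceProvider {

    // Path this provider is mounted at, purely an example value
    private static final String ROOT = "/content/scripts";

    @Override
    public Resource getResource(ResourceResolver resolver, HttpServletRequest request, String path) {
        return getResource(resolver, path);
    }

    @Override
    public Resource getResource(ResourceResolver resolver, String path) {
        if (path == null || !path.startsWith(ROOT)) {
            return null; // not ours
        }
        // Real code would look the path up in its own storage and expose the content;
        // SyntheticResource is just a minimal placeholder Resource.
        return new SyntheticResource(resolver, path, "myapp/script");
    }

    @Override
    public Iterator<Resource> listChildren(Resource parent) {
        // Real code would enumerate child entries from the backing store
        return Collections.<Resource>emptyIterator();
    }
}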
For your custom system the question would be how many Sling bundles you need to get that working. I don't know off the top of my head, but I would suggest using the Sling Launchpad to find out: it launches a vanilla Sling system with lots of bundles that you won't need, but you could try reducing it to the minimum that still allows the ResourceProvider mechanism to work.
You can also use Apache Commons VFS2; there is, for example, a JCR connector, or you can use WebDAV or a JDBC table. I use this in a commercial project, on top of an atomic (git-like) tree stored in a shared JDBC table.
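In case it helps, basic VFS2 usage is quite small; a minimal sketch (the file URI is just an example, and the JCR/WebDAV providers need their own extra dependencies):
import org.apache.commons.vfs2.FileObject;
import org.apache.commons.vfs2.FileSystemManager;
import org.apache.commons.vfs2.VFS;

public class VfsDemo {
    public static void main(String[] args) throws Exception {
        FileSystemManager fsManager = VFS.getManager();

        // Local file today; later you can swap the URI for a WebDAV, JCR or
        // JDBC-backed provider without touching the calling code.
        FileObject scripts = fsManager.resolveFile("file:///opt/myapp/scripts");

        for (FileObject child : scripts.getChildren()) {
            System.out.println(child.getName().getBaseName());
        }
    }
}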