How can I configure Solr logs to get sent to Azure Application Insights?
I see I can use a Log4j appender:
https://learn.microsoft.com/en-us/azure/application-insights/app-insights-java-trace-logs
Solr is an open source project, and I don't compile it myself, I just use the distribution.
How can I drop in the Application Insights Log4j appender without recompiling Solr against the SDK?
I just want to configure the logs to get sent to Application Insights, and set the instrumentation key, for what is effectively a third-party application.
I'm normally a C# dev, but I'm familiar with log4net, so apologies if this is simple in Java's Log4j. I haven't been able to find a post covering this scenario, so I'm posting here.
Using Solr 6.6.
It takes a lot less configuration than you'd expect, and most of the info is hidden away in the link that you've already got: https://learn.microsoft.com/en-gb/azure/azure-monitor/app/java-trace-logs
First, go download the jar files from https://github.com/Microsoft/ApplicationInsights-Java/releases. You'll want applicationinsights-logging-log4j1_2-2.3.0 and applicationinsights-core-2.3.0. Put these in the server/lib folder and Solr will load them automatically for you.
Next, you'll need to add a new appender for App Insights to your log4j.properties file:
# Appinsights
log4j.appender.aiAppender=com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender
log4j.appender.aiAppender.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.aiAppender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) [%X{collection} %X{shard} %X{replica} %X{core}] %c{1.} %m%n
You also need to add this aiAppender to the log4j.rootLogger list in the same file (it'll probably look something like this: log4j.rootLogger=INFO, file, CONSOLE, aiAppender)
Finally, you need an ApplicationInsights.xml file, which you can get an example of from here https://learn.microsoft.com/en-gb/azure/azure-monitor/app/java-get-started#2-add-the-application-insights-sdk-for-java-to-your-project
Drop this in the server/resources folder, set your instrumentation key and you're good to go!
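For reference, a stripped-down ApplicationInsights.xml can be as small as this (the key below is a placeholder GUID; substitute your own from the Azure portal, and check the linked doc for the current schema):

```xml
<?xml version="1.0" encoding="utf-8"?>
<ApplicationInsights xmlns="http://schemas.microsoft.com/ApplicationInsights/2013/Settings">
  <!-- Replace with your own instrumentation key from the Azure portal -->
  <InstrumentationKey>00000000-0000-0000-0000-000000000000</InstrumentationKey>
</ApplicationInsights>
```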
Tesseract initializes fine until it needs to load the language files, at which point it just stops working. See the attached picture for reference on the error.
The npm package installs fine. I also downloaded the offline files (the worker and wasm files) and got them working, as I can see that it loads them correctly. At least, until it starts loading the language files and breaks my app.
The worker and wasm files are in the /public folder so they can be read by the JSX. I tried not using the offline files by removing these lines
workerPath: '/External/tesseractjs_data/js/worker.min.js',
corePath: '/External/tesseractjs_data/js/tesseract-core.wasm.js',
but I am still getting the same error. Almost all the solutions I have found for this problem are in Java, and one of them requires installing some kind of Tesseract software, which I want to avoid: I picked web programming precisely so installation would be minimal.
I don't think anyone else will need this, but here is how I fixed my issue.
It turns out my download manager (IDM) was capturing the language files (traineddata.gz) and then setting a key with a 0 value in IndexedDB for the domain. To fix it:
Clear the browser cache, or just delete the key/value pair in IndexedDB, which you can find under "Storage" in the browser's developer tools.
Disable the download manager, or remove ".gz" from the file types it captures.
It should now work.
I am making a Solr web-based application, and one of its features is that the user can create a core and schema in Solr. My friend implemented this with a child process: it changes to the Solr directory and then runs 'bin/solr create -c ...' to create the core. But I am considering another approach, using the HTTP API. I found this:
http://localhost:8983/solr/admin/cores?action=CREATE&name=mycore&instanceDir=path/to/instance&configSet=configset2
But apparently it cannot run properly, because you need to create the config for the core first. The error says:
Error CREATEing SolrCore 'mycore': Unable to create core [mycore] Caused by: Could not load configuration from directory /opt/solr/server/solr/configsets/configset2
So I am wondering what approach I should take, since it seems I can't create a core without setting up a config first. Should I build an input form for creating the core and schema, and only after the user clicks 'submit' process everything: generating the config, creating the schema, and finally creating the core? I wonder if that's the best approach.
I am looking forward to any help.
You always need to provide a configuration when creating a core.
When your friend ran the command, it actually used the default configuration data_driven_schema_configs, which you can confirm by reading the help for the create_core command (create is an alias for create_core in a non-Cloud setup):
bin/solr create_core -h
The solr script copied that configuration and then created the core with it.
The example you showed is only valid for SolrCloud. If you are not using SolrCloud, you need to use the Core Admin API directly and set up the directory with the configuration manually.
Note that configsets are a bit tricky: if you create several cores from the same configset, that configset is shared, and changes made to it through one core affect all of them. So you most likely don't want to use them; instead, copy the configuration as described above.
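A rough sketch of that copy-then-create flow from the command line might look like this (paths assume a default install under /opt/solr and the basic_configs configset; adjust both to your setup):

```
CORE=mycore

# Copy the stock configset so the new core gets its own private configuration
mkdir -p /opt/solr/server/solr/$CORE
cp -r /opt/solr/server/solr/configsets/basic_configs/conf /opt/solr/server/solr/$CORE/

# Then create the core via the Core Admin API; instanceDir is relative to solr home
curl "http://localhost:8983/solr/admin/cores?action=CREATE&name=$CORE&instanceDir=$CORE"
```

Because the configuration was copied rather than referenced, later edits to it affect only this core.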
I've run across only a few examples of how to do this, and they didn't work for me, mainly because I've only ever used an Ant script to auto-build jar files through Jenkins. Now I need to build those files in Jenkins and then upload them to a third-party file site like SourceForge. This is both to save hard drive space on the server (since I don't own it) and to allow external downloads. Any help is welcome, but please no comments about how little I know about Ant scripts.
Also, something related but a bit separate: the jar file I'm building depends on another jar file with its own version. I also want to create a new folder on each upload for each dependency version. That way, the users who download the file can easily tell which main jar version it goes with, while allowing me to upload 20+ sub-builds.
There are several ways to upload files, so there are several kinds of Ant tasks for the job.
For instance, if you want to upload to SourceForge, you can use the Ant scp task. It also seems possible to upload there via FTP, in which case there is the ftp task.
Maybe you'll find some other service that requires you to upload via HTTP: ant-contrib has the post task.
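As an illustration, an scp upload target might look like this (the host, paths, and key file are placeholders; Ant's scp task also needs jsch.jar on its classpath):

```xml
<!-- Requires the optional scp task, which depends on jsch.jar -->
<target name="publish">
  <scp file="dist/myapp.jar"
       todir="user@frs.sourceforge.net:/home/frs/project/myproject"
       keyfile="${user.home}/.ssh/id_rsa"
       trust="true"/>
</target>
```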
I used to do publication as part of my Ant build logic, creating a special "publish" target that issued the scp or ftp command. Now I'm more inclined to leverage one of the "publish over" plugins for Jenkins.
The main reason for this shift is the management of access credentials. With the Ant-based approach I was forced to run my build on a Jenkins slave that was pre-configured with the correct SSH key for the remote server. The Jenkins plugin manages private keys centrally and ensures all slaves are properly configured.
Finally, if your build depends on third-party jars, use a dependency manager like Ivy to download them and include them in your project. It then becomes trivial to include their upload as part of your publish step.
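For the dependency part, an Ivy declaration is just a small XML file alongside the build script (the organisation, module, and revision names below are illustrative):

```xml
<!-- ivy.xml: declare the third-party jar this build depends on -->
<ivy-module version="2.0">
  <info organisation="com.example" module="myapp"/>
  <dependencies>
    <dependency org="com.example" name="dependency-lib" rev="1.4.0"/>
  </dependencies>
</ivy-module>
```

The resolved revision number can then feed into the folder name you create on upload, which addresses the per-dependency-version folders mentioned in the question.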
I'm exploring Solr 4 and polygons/linestrings.
There is some info on it here, but not a howto/installation guide for a basic user like me:
http://wiki.apache.org/solr/SolrAdaptersForLuceneSpatial4
As far as I understand, you need to install the Spatial4j code into Solr (I'm a hack at best).
https://github.com/spatial4j/spatial4j/tree/master/src/main/java
Does anyone know where I upload this code to inside the Solr 4 installation? Keep in mind I'm using the /example/solr/collection1 directory.
"Due to a combination of things, JTS can't simply be referenced by a <lib/> entry in solrconfig.xml; it needs to be in WEB-INF/lib in Solr's war file, basically." Does anyone know what that means as an installation instruction? I'm after some guidance on what goes where. I use start.jar to start Solr on my Apache server.
After that, I understand that I simply need to add a field type and field to the schema, and as far as that goes it should be installed.
I'm trying to send it polygon and linestring queries to find all documents within a polygon or within a radius of a line.
Solr includes Spatial4j already; what it doesn't have is JTS, which is a Java library (a .jar file). Download JTS from https://sourceforge.net/projects/jts-topo-suite/ (the .jar is within the .zip distro).
WEB-INF/lib is a Java webapp reference within a WAR file, and example/webapps/solr.war is where that is. A .war file is really a zip, and can either be in its '.war' file form or be uncompressed into a plain directory layout. So if you rename the '.war' to '.zip', on OS X it's trivial to double-click it to expand it. Then rename the resulting directory to 'solr.war', and put the original war file aside somewhere else, as you won't be using it for now.
Take the JTS jar and put it in solr.war/WEB-INF/lib/. When you start Solr, it'll have access to JTS. If it doesn't have access for whatever reason, you'll get a ClassNotFoundException pertaining to a JTS-related Java class.
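Once JTS is in place, the schema addition is along these lines (a sketch based on the SolrAdaptersForLuceneSpatial4 wiki page; the field name and tuning values are illustrative):

```xml
<!-- schema.xml: an RPT spatial field backed by the JTS context factory -->
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
           spatialContextFactory="com.spatial4j.core.context.jts.JtsSpatialContextFactory"
           distErrPct="0.025"
           maxDistErr="0.000009"
           units="degrees" />
<field name="geo" type="location_rpt" indexed="true" stored="true" multiValued="true" />
```

A polygon filter query against such a field would then look something like `fq=geo:"Intersects(POLYGON((-10 30, -40 40, -10 -20, 40 20, -10 30)))"`.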
I'm using TeamCity 6.5.4 and I need to have 3 build configurations for the same deployment package. I'd like to persist the version number across all three build configurations and be able to use that number to version the assembly, tag vcs, version the nuspec file, etc.
Here are the configurations and desired version numbers:
Configuration | Version
-------------------|---------
CI/Nightly Build | 1.1.*
Minor Release | 1.*.0
Major Release | *.0.0
It seems that TeamCity uses a separate build incrementer for each configuration. This means every time we have a major or minor release, I'd have to manually update the persisted values (1) in all of the subsequent configurations. I'm a programmer and I'm lazy. I want a single button to do everything for me.
I've seen examples of persisting the build number through build steps of a configuration with dependent snapshots, but that only works in the same configuration.
The Autoincrementer plugin bumps up the number every time you reference the ID. This is fine for the changing numbers (*), but not so good for referencing the persisted values (1).
Is there a way for TeamCity, either natively or via plugin, to allow me to read and write that version to a file or variable that can be persisted across build configurations?
You can reference the build number of the dependent (artifact/snapshot) configuration using dep.btx.build.number, where btx is the bt id of that configuration. Once you have the build number, pass it to the script running in your configuration, parse it in the script, and send service messages from the script back to TeamCity to set the build number the way you want. Do this parsing and number-setting as the first of your build steps.
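As a sketch, the service message that sets the build number can be emitted from a small shell helper (the version scheme and parameter names here are illustrative; the dep value would come from %dep.btx.build.number%):

```shell
#!/bin/sh
# Compose a TeamCity buildNumber service message from the version pieces.
# TeamCity scans build output for ##teamcity[...] lines and applies them.
set_version() {
  major=$1
  minor=$2
  dep=$3   # e.g. the value of %dep.btx.build.number% passed in by TeamCity
  echo "##teamcity[buildNumber '${major}.${minor}.${dep}']"
}

set_version 1 2 345   # prints: ##teamcity[buildNumber '1.2.345']
```

Echoing this line from any build step is enough; no API call is needed.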
Thanks for the suggestions. I opted to write a set of custom targets for my MSBuild script that maintain assembly metadata in a remote XML "manifest" file. When a new TeamCity project is created, my build script calls an Init target, which creates a new manifest file from an unpopulated template:
<Copy SourceFiles="@(ManifestTemplate)" DestinationFiles="@(ManifestTemplate->'$(ManifestFile)')" Condition="!Exists('$(ManifestFile)')" />
I'm using the MSBuild Extension Pack to read attributes like version information from the manifest file:
<MSBuild.ExtensionPack.Xml.XmlFile TaskAction="ReadElementText" File="$(ManifestFile)" XPath="/Package/Version/Major">
<Output PropertyName="PackageVersionMajor" TaskParameter="Value"/>
</MSBuild.ExtensionPack.Xml.XmlFile>
I have my TeamCity build configurations separated into CI, Test, Minor Release, and Major Release, with different events triggering each. In the corresponding target in my project's build script, I add a custom target to the DependsOnTargets attribute that updates the appropriate version number and saves it to the manifest file:
<Target Name="Test" DependsOnTargets="IntializeBuildProject;Build-UpdateVersion-Build">
  <MSBuild Projects="$(SolutionFile)" Targets="Rebuild" Properties="Configuration=$(Configuration)" />
  <TeamCitySetBuildNumber BuildNumber="$(PackageVersion)" />
</Target>
The code in the custom target to handle the version update:
<MSBuild.ExtensionPack.Science.Maths TaskAction="Add" Numbers="$(PackageVersionBuild);1">
<Output PropertyName="PackageVersionBuild" TaskParameter="Result"/>
</MSBuild.ExtensionPack.Science.Maths>
<MSBuild.ExtensionPack.Xml.XmlFile TaskAction="UpdateElement" File="$(ManifestFile)" XPath="/Package/Version/Build" InnerText="$(PackageVersionBuild)"/>
This file handles persistence of the version and other metadata thus ignoring the TeamCity build number. Since the XML metadata file is centralized, I can use the values to populate my Nuspec, AssemblyInfo, and WiX Installer metadata as well as pass the version and other pertinent information back to TeamCity through service messages.
I added a simple MVC web interface to allow my team to edit the file contents remotely if package details change. Now we have one single place to update things like Copyright information and any other metadata for a given build project. I can also give non-dev folks access to the MVC site to update branding information without allowing them access to my TeamCity build configurations.
With the exception of the service messages used to relay version to TeamCity, there's very little here that's coupled with TeamCity. I like having the functionality in custom targets and build scripts removed from TeamCity on the off chance we move to another build management solution. For that reason, I don't envision taking time to build a TeamCity plugin, but there could be a blog series coming soon.
I'll be happy to provide more code and further explanation to anyone interested.
Yes, you can easily create a plugin to do this. You can take my auto-increment build number (across configurations) plugin and modify it to fit your needs. The build number is saved in a text file that is configurable from the admin screen in TeamCity.
http://github.com/ornatwork/tc_plugins/tree/master/unique
You can hit me up for input on how to change it if you need.