Can data-import-handler use HikariCP? - Solr

I'm using Solr 4.5.1 at work.
The trouble is that a lot of getConnection calls occur when I execute a data-import (full-import), so I wondered whether HikariCP could be used for the data import, but I haven't found a similar question.
Is it possible? If so, please advise.
Solr 4.5.1 with Tomcat
data-config.xml
<dataSource driver="oracle.jdbc.driver.OracleDriver"
name="jdbc"
url="jdbc:oracle:thin:#address/mydb"
user="user" password="pass"/>

Heavily borrowed from David H Nebinger's post: Tomcat and HikariCP.
Install HikariCP
To make use of JNDI, you need to declare the JNDI datasource with all its settings, password and cache options within the JNDI declaration. This has nothing to do with Solr at this point; it is a Tomcat mechanism. How to do this is described in this tutorial, which also makes use of HikariCP.
The first option is to download the .zip or .tar.gz file from http://brettwooldridge.github.io/HikariCP/. This is actually a source release that you'll need to build yourself.
The second option is to download the built jar from a source like Maven Central, https://mvnrepository.com/artifact/com.zaxxer/HikariCP
Once you have the jar, copy it to the Tomcat lib/ext directory. Note that HikariCP has a dependency on SLF4J, so you'll need to put that jar into lib/ext too.
Do not forget to place your datasource's JDBC driver in the lib/ext folder.
Configure the JNDI datasource
Location of your JNDI datasource <Resource /> definitions depends upon the scope for the connections. You can define them globally by specifying them in Tomcat's conf/server.xml and conf/context.xml, or you can scope them to individual applications by defining them in conf/Catalina/localhost/WebAppContext.xml (where WebAppContext is the web application context for the app, basically the directory name from Tomcat's webapps directory).
Create the file conf/Catalina/localhost/ROOT.xml if it doesn't already exist. Use the table from https://github.com/brettwooldridge/HikariCP#popular-datasource-class-names to find your data source class name; we'll need it when we define the element.
<Resource name="jdbc/SolrPool" auth="Container"
factory="com.zaxxer.hikari.HikariJNDIFactory"
type="javax.sql.DataSource"
minimumIdle="5"
maximumPoolSize="10"
connectionTimeout="300000"
dataSourceClassName="oracle.jdbc.pool.OracleDataSource"
dataSource.url="jdbc:oracle:thin:#address/mydb"
dataSource.implicitCachingEnabled="true"
dataSource.user="user"
dataSource.password="pass" />
Make use of the JNDI datasource in Solr
After you have followed this tutorial, you need to reference the declared JNDI datasource from your data-config.xml, as described in the Solr Wiki:
<dataSource
jndiName="java:jdbc/SolrPool"
type="JdbcDataSource"
user="" password=""/> <!-- leave out user/password here -->

Related

How to configure p6spy for MS SQL Server with Hibernate?

In our web application we are using Spring, Hibernate and SQL Server 2016 as the database. We use JNDI to connect to the database. To record all the queries executed by Hibernate, I am trying to set up p6spy.
Here are the changes I have made.
I changed the Resource definition from
<Resource name="jdbc/eportalcore" auth="Container"
type="javax.sql.DataSource"
driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver"
url="jdbc:sqlserver://localhost:1433;databaseName=eportal-core;"
username="eportaldbadmin"
password="P#ssw0rd"
maxTotal="100"
maxIdle="20"
minIdle="5"
maxWaitMillis="10000" />
to
<Resource name="jdbc/eportalcore" auth="Container"
type="javax.sql.DataSource"
driverClassName="com.p6spy.engine.spy.P6SpyDriver"
url="jdbc:p6spy:sqlserver://localhost:1433/eportal-core"
username="eportaldbadmin"
password="P#ssw0rd"
maxTotal="100"
maxIdle="20"
minIdle="5"
maxWaitMillis="10000" />
and added the spy.properties file under the lib folder of the Tomcat directory. I have also placed p6spy-3.0.0.jar inside the lib folder.
But after this my application cannot connect to the DB. What mistake am I making here? If I remove these changes, it works fine.
Any suggestions?
I had the same question and here is what I did to make it work. Note that my database connection properties are in a properties file (which shouldn't be an issue) and that I use Tomcat as the application server. You can also add P6Spy as a Maven dependency in your project.
Before:
db.properties
db.driver=net.sourceforge.jtds.jdbc.Driver
db.url=jdbc:jtds:sqlserver://${db.server}/${db.name};useNTLMv2=true;domain=XX
After:
Download the project (3.7.0 was the latest version when writing this answer). Put p6spy-3.7.0.jar and spy.properties in the tomcat/lib folder. Change spy.properties and the application properties as noted below. Restart the application and you should find a spy.log where your logs are normally printed.
db.properties
db.driver=com.p6spy.engine.spy.P6SpyDriver
db.url=jdbc:p6spy:jtds:sqlserver://${db.server}/${db.name};useNTLMv2=true;domain=XX
spy.properties
driverlist=net.sourceforge.jtds.jdbc.Driver
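If the application still cannot connect, one way to narrow the problem down is a small standalone JDBC smoke test against the p6spy URL, outside of Tomcat and Hibernate. This is only an illustrative sketch using the URL and credentials from the question; p6spy, the real SQL Server driver and a spy.properties with the matching driverlist must be on the classpath. Note that p6spy simply prefixes the real driver's URL, so the rest of the URL keeps the underlying driver's format.

import java.sql.Connection;
import java.sql.DriverManager;

public class P6SpySmokeTest {
    public static void main(String[] args) throws Exception {
        // p6spy delegates to the real driver named in spy.properties (driverlist=...)
        Class.forName("com.p6spy.engine.spy.P6SpyDriver");
        try (Connection c = DriverManager.getConnection(
                "jdbc:p6spy:sqlserver://localhost:1433;databaseName=eportal-core;",
                "eportaldbadmin", "P#ssw0rd")) {
            System.out.println("Connected via p6spy: " + c.getMetaData().getURL());
        }
    }
}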

Update jar file in Solr 4.4.0

I have a SolrCloud configuration which we run on 4 servers. We use Tomcat as the web server for Solr. I have 5 ZooKeeper nodes to maintain the data replication. I have added a jar file with a custom update processor. This is in the shared folder which is mentioned in solr.xml:
<solr persistent="true" sharedLib="/solr/lib">
While creating the first version of this jar I used updateProcessor.0.1.jar as the file name. Even though the folder is shared, the jar was added on all 4 servers.
But now I have to update the update processor. For this I created updateProcessor0.2.jar, deleted updateProcessor.0.1.jar from each server and added the new one, but the changes are not visible.
Any ideas what I am doing wrong? Should this be checked using zkcli?
Well, I found a workaround which may help someone in the future.
I changed the entry in solrconfig from
<processor class="org.apache.solr.update.processor.MyUpdateProcessorFactory">
to
<processor class="org.apache.solr.update.processor.MyUpdateProcessorFactory2">
and renamed the class in my custom jar from MyUpdateProcessorFactory to MyUpdateProcessorFactory2.
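For reference, the class behind that <processor> entry is an UpdateRequestProcessorFactory. The question doesn't show the actual code, so the following is only a minimal sketch of what such a factory could look like against the Solr 4.x API (the field name is made up; the class and package names simply match the solrconfig entry above):

package org.apache.solr.update.processor; // matches the class attribute used in solrconfig.xml above

import java.io.IOException;

import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.request.SolrQueryRequest;
import org.apache.solr.response.SolrQueryResponse;
import org.apache.solr.update.AddUpdateCommand;

public class MyUpdateProcessorFactory2 extends UpdateRequestProcessorFactory {
    @Override
    public UpdateRequestProcessor getInstance(SolrQueryRequest req,
                                              SolrQueryResponse rsp,
                                              UpdateRequestProcessor next) {
        return new UpdateRequestProcessor(next) {
            @Override
            public void processAdd(AddUpdateCommand cmd) throws IOException {
                // Illustrative only: stamp each incoming document, then pass it on.
                SolrInputDocument doc = cmd.getSolrInputDocument();
                doc.setField("processed_by_s", "MyUpdateProcessorFactory2");
                super.processAdd(cmd);
            }
        };
    }
}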

How to install solr-4.6.1 on ubuntu using tomcat7?

I want step-by-step instructions for installing the Solr search engine using tomcat7 on Ubuntu. I searched on Google but I am not finding a proper reference. Please help me install it.
1- Download Solr dist: here. Unzip it anywhere.
2- Find $EXTRACTEDDIR/solr/dist/solr-4.6.1.war and copy it to the location where you want to configure Solr.
3- Configure the solr.xml as explained here. You will need to change your hostport to 8080 (or whatever port tomcat7 is configured to use) and put it where you put solr-4.6.1.war.
4- Create /etc/tomcat7/Catalina/localhost/solr.xml and put:
<?xml version="1.0" encoding="UTF-8"?>
<Context docBase="$SOLRBASE/solr-4.6.1.war" debug="0" privileged="true" allowLinking="true" crossContext="true">
<Environment name="solr/home" type="java.lang.String" value="$SOLRBASE" override="true" />
</Context>
Replace SOLRBASE with the location where you put solr.war and solr.xml.
5- Copy $EXTRACTEDDIR/example/lib/ext/*.* to /usr/share/tomcat7/lib (for the logging libs).
6- Give the "tomcat7" user permissions on the folder where you put solr.war and solr.xml.
7- Restart Tomcat and you are done.
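As an aside, the <Environment name="solr/home"> entry from step 4 is what Solr reads at startup to locate its home directory; the lookup it performs is roughly equivalent to the sketch below (shown only to illustrate why the entry and its value matter, not actual Solr source):

import javax.naming.InitialContext;

public class SolrHomeLookup {
    public static String solrHome() throws Exception {
        // Tomcat publishes <Environment name="solr/home"> under java:comp/env/,
        // which is where Solr looks before falling back to the solr.solr.home
        // system property.
        return (String) new InitialContext().lookup("java:comp/env/solr/home");
    }
}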
Copy the Solr files to Ubuntu.
Copy the file "solr.war" into "%tomcat_home%/webapps/" and add the Solr context to "%tomcat_home%/conf/server.xml".
Set the environment value "solr/home" in server.xml or web.xml.
Visit here for details: https://wiki.apache.org/solr/SolrTomcat
It's a piece of cake; may you succeed!

Why tomcat7 can't read my context.xml?

I'm trying to integrate Apache Solr 4.5 with Tomcat 7.0.
Here is my problem: I made a directory named "solr" under $CATALINA_BASE/webapps and created a context.xml under that path, so the directory structure looks like this:
$CATALINA_BASE/webapps/solr/META-INF/context.xml
The context.xml looks as follows:
<?xml version="1.0" encoding="utf-8" ?>
<Context docBase="/solr/home/root/solr.war">
<Environment name="solr/home" type="java.lang.String" value="/solr/home/root" />
</Context>
I was expecting that when I visit "localhost:8080/solr/" the Solr webapp would work fine, but what I got was an error like this:
HTTP Status 404 - /solr/
type Status report
message /solr/
description The requested resource (/solr/) is not available.
Apache Tomcat/7.0.26
But according to the Apache Tomcat 7.0 documentation,
Individual Context elements may be explicitly defined:
In an individual file at /META-INF/context.xml inside the application files. Optionally (based on the Host's copyXML attribute) this may be copied to $CATALINA_BASE/conf/[enginename]/[hostname]/ and renamed to the application's base file name plus a ".xml" extension.
In individual files (with a ".xml" extension) in the $CATALINA_BASE/conf/[enginename]/[hostname]/ directory. The context path and version will be derived from the base name of the file (the file name less the .xml extension). This file will always take precedence over any context.xml file packaged in the web application's META-INF directory.
Inside a Host element in the main conf/server.xml.
So please, can anyone give me some clues? Thanks.
If you check the Catalina log file located in the logs/ directory, you will probably see some errors from Solr, one of which will be:
SEVERE: Error filterStart
This is due to a change in the way that logging is implemented within Solr as of version 4.3. Please refer to SolrLogging - Using the example logging setup in containers other than Jetty for the necessary steps to set up logging within Tomcat.
This should resolve your issue.
Try placing your context.xml (for all webapps) or solr.xml (Solr only) file in the Catalina/localhost folder instead.
For instance:
<Context docBase="webapps/solr.war" allowLinking="true" reloadable="true">
<Environment name="solr/home" type="java.lang.String" value="/opt/solr" override="true" />
</Context>
Troubleshooting:
check your logs, e.g.:
cd /var/log/tomcat? && tail -f *.log *.out *.txt

Reload Solr configuration without multicore

Is it possible to reload Solr configuration without setting up Multicore or restarting the servlet container?
I would like to tweak some <analyzer> chains with the analysis tab in the admin, and tweak the parameters to my <requestHandler>, but having to restart the servlet container after every small change to schema.xml or solrconfig.xml is a bit of a pain and time consuming.
There is always a core in Solr. By default, a Solr instance creates a core named collection1. If you have a single core and are not sure how to reload it at runtime, you can use this:
http://localhost:8080/solr/admin/cores?action=RELOAD&core=collection1
As best I can tell, online reloading requires a multicore configuration, which it turns out isn't too hard to set up:
Put this solr.xml into the solr home directory
<solr persistent="false" sharedLib="lib">
<cores adminPath="/admin/cores" defaultCoreName="core0">
<core name="core0" instanceDir="." />
</cores>
</solr>
Restart the servlet container.
Hit a URL like this to reload the configuration:
http://localhost:8983/solr/admin/cores?action=RELOAD&core=core0
To remove the rest of the friction, you can set it up to automatically reload the configuration by running the following script within the conf directory.
get_on_fsevent.rb "http://localhost:8983/solr/admin/cores?action=RELOAD&core=core0"
get_on_fsevent.rb:
#!/usr/bin/env ruby
# Watches the current directory (run it from conf/) and issues a GET to the
# RELOAD URL passed as the first argument whenever a file changes (macOS only,
# since rb-fsevent wraps the OS X FSEvents API).
require 'rubygems'
require 'rb-fsevent'
require 'net/http'
require 'uri'

uri = URI.parse(ARGV.first)

fsevent = FSEvent.new
fsevent.watch Dir.pwd do |directories|
  puts "Detected change. Requesting #{ARGV.first}"
  puts Net::HTTP.get_response(uri)
end
fsevent.run
