Enabling LTR in SolrCloud (Solr version 8.2)

I am trying to work with SolrCloud, and I have to use Learning to Rank (LTR) models and features for my project. But I am facing this SolrCore initialization failure:
techproducts_shard1_replica_n2: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Failed to create new ManagedResource /schema/model-store of type org.apache.solr.ltr.store.rest.ManagedModelStore due to: org.apache.solr.common.SolrException: org.apache.solr.ltr.model.ModelException: Model type does not exist org.apache.solr.ltr.model.LinearModel
techproducts_shard2_replica_n6: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Failed to create new ManagedResource /schema/model-store of type org.apache.solr.ltr.store.rest.ManagedModelStore due to: org.apache.solr.common.SolrException: org.apache.solr.ltr.model.ModelException: Model type does not exist org.apache.solr.ltr.model.LinearModel
Please check your logs for more information
These are the Solr logs:
2019-12-04 12:44:05.760 ERROR (searcherExecutor-15-thread-1-processing-n:192.168.137.1:8983_solr x:techproducts_shard1_replica_n2 c:techproducts s:shard1 r:core_node5) [c:techproducts s:shard1 r:core_node5 x:techproducts_shard1_replica_n2] o.a.s.h.RequestHandlerBase java.lang.NullPointerException
at org.apache.solr.handler.component.SearchHandler.initComponents(SearchHandler.java:183)
at org.apache.solr.handler.component.SearchHandler.getComponents(SearchHandler.java:203)
at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:260)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2578)
at org.apache.solr.core.QuerySenderListener.newSearcher(QuerySenderListener.java:74)
at org.apache.solr.core.SolrCore.lambda$getSearcher$18(SolrCore.java:2344)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2019-12-04 12:44:05.760 ERROR (searcherExecutor-14-thread-1-processing-n:192.168.137.1:8983_solr x:techproducts_shard2_replica_n6 c:techproducts s:shard2 r:core_node8) [c:techproducts s:shard2 r:core_node8 x:techproducts_shard2_replica_n6] o.a.s.h.RequestHandlerBase java.lang.NullPointerException
at org.apache.solr.handler.component.SearchHandler.initComponents(SearchHandler.java:183)
at org.apache.solr.handler.component.SearchHandler.getComponents(SearchHandler.java:203)
at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:260)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2578)
at org.apache.solr.core.QuerySenderListener.newSearcher(QuerySenderListener.java:74)
at org.apache.solr.core.SolrCore.lambda$getSearcher$18(SolrCore.java:2344)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
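For context, the initialization failure quoted at the top says the core could not load org.apache.solr.ltr.model.LinearModel while reading the model store from ZooKeeper, which generally points at the LTR module not being available to that collection. As a reference point only, the Solr Reference Guide enables LTR for the techproducts example with a startup flag, and a custom configset registers the module in solrconfig.xml roughly as sketched below; the lib directory paths are assumptions for a default install layout:
# Reference Guide command for running the techproducts example with the LTR module enabled
bin/solr start -e techproducts -Dsolr.ltr.enabled=true

<!-- solrconfig.xml (sketch): make the LTR contrib jars visible to the core
     and register the LTR query parser; the dir paths are assumptions -->
<lib dir="${solr.install.dir:../../../..}/contrib/ltr/lib/" regex=".*\.jar" />
<lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-ltr-\d.*\.jar" />
<queryParser name="ltr" class="org.apache.solr.ltr.search.LTRQParserPlugin"/>
The Reference Guide also registers a feature-vector cache and the [features] transformer next to this, which matter once feature logging is used.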

Match the compatibility of the Solr and ZooKeeper versions.
For Solr 8.2.0, use ZooKeeper 3.5.6.
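A minimal sketch of pairing those two versions with an external ensemble, assuming a single ZooKeeper node on localhost and default unpack directories:
# Assumed layout: ZooKeeper 3.5.6 and Solr 8.2.0 unpacked side by side
cd apache-zookeeper-3.5.6-bin
cp conf/zoo_sample.cfg conf/zoo.cfg      # stock standalone config
bin/zkServer.sh start

cd ../solr-8.2.0
bin/solr start -c -z localhost:2181      # SolrCloud mode against that ZooKeeper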

Related

Error while enabling camel-swagger-java at karaf startup

Problem
I'm trying to install the camel feature camel-swagger-java 2.22.2 in a karaf 4.2.2 environment.
I added the feature camel-swagger-java to the featuresBoot property in Karaf's etc/org.apache.karaf.features.cfg config file because I want this feature to be installed when Karaf is up.
When I launch bin/karaf I get this error:
org.apache.karaf.features.internal.util.MultiException: Error:
Error downloading wrap:file:/C:/Users/karaf/.m2/repository/io/swagger/swagger-parser/1.0.36/swagger-parser-1.0.36.jar
Error downloading wrap:file:/C:/Users/karaf/.m2/repository/io/swagger/swagger-parser/1.0.36/swagger-parser-1.0.36.jar
at org.apache.karaf.features.internal.download.impl.MavenDownloadManager$MavenDownloader.<init>(MavenDownloadManager.java:91)
at org.apache.karaf.features.internal.download.impl.MavenDownloadManager.createDownloader(MavenDownloadManager.java:72)
at org.apache.karaf.features.internal.region.Subsystem.downloadBundles(Subsystem.java:457)
at org.apache.karaf.features.internal.region.Subsystem.downloadBundles(Subsystem.java:452)
at org.apache.karaf.features.internal.region.SubsystemResolver.resolve(SubsystemResolver.java:224)
at org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:388)
at org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1025)
at org.apache.karaf.features.internal.service.FeaturesServiceImpl.lambda$doProvisionInThread$13(FeaturesServiceImpl.java:964)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Suppressed: java.io.IOException: Error downloading wrap:file:/C:/Users/karaf/.m2/repository/io/swagger/swagger-parser/1.0.36/swagger-parser-1.0.36.jar
at org.apache.karaf.features.internal.download.impl.AbstractRetryableDownloadTask.run(AbstractRetryableDownloadTask.java:77)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
... 3 more
Caused by: java.io.IOException: Could not download [wrap:file:/C:/Users/karaf/.m2/repository/io/swagger/swagger-parser/1.0.36/swagger-parser-1.0.36.jar]
at org.apache.karaf.features.internal.download.impl.SimpleDownloadTask.download(SimpleDownloadTask.java:91)
at org.apache.karaf.features.internal.download.impl.AbstractRetryableDownloadTask.run(AbstractRetryableDownloadTask.java:60)
... 7 more
Caused by: java.net.MalformedURLException: Unknown protocol: wrap
at java.net.URL.<init>(URL.java:627)
at java.net.URL.<init>(URL.java:490)
at java.net.URL.<init>(URL.java:439)
at org.apache.karaf.features.internal.download.impl.SimpleDownloadTask.download(SimpleDownloadTask.java:62)
... 8 more
Caused by: java.lang.IllegalStateException: Unknown protocol: wrap
at org.apache.felix.framework.URLHandlersStreamHandlerProxy.parseURL(URLHandlersStreamHandlerProxy.java:373)
at java.net.URL.<init>(URL.java:622)
... 11 more
[CIRCULAR REFERENCE:java.io.IOException: Error downloading wrap:file:/C:/Users/karaf/.m2/repository/io/swagger/swagger-parser/1.0.36/swagger-parser-1.0.36.jar
Code
These are the first lines of my etc/org.apache.karaf.features.cfg
featuresRepositories = mvn:org.apache.karaf.features/enterprise/4.2.2/xml/features, mvn:org.apache.karaf.features/spring/4.2.2/xml/features, mvn:org.apache.karaf.features/standard/4.2.2/xml/features, mvn:org.apache.karaf.features/framework/4.2.2/xml/features, mvn:org.apache.camel.karaf/apache-camel/2.22.2/xml/features, mvn:org.apache.karaf.decanter/apache-karaf-decanter/2.1.0/xml/features, mvn:org.ops4j.pax.jdbc/pax-jdbc-features/1.3.2/xml/features, mvn:org.apache.servicemix.features/servicemix-features/7.0.1/xml/features
featuresBoot = instance/4.2.2, package/4.2.2, log/4.2.2, ssh/4.2.2, framework/4.2.2, system/4.2.2, eventadmin/4.2.2, feature/4.2.2, shell/4.2.2, management/4.2.2, service/4.2.2, jaas/4.2.2, deployer/4.2.2, diagnostic/4.2.2, wrap/2.5.4, bundle/4.2.2, config/4.2.2, kar/4.2.2, webconsole/4.2.2, jdbc/4.2.2, http/4.2.2, jetty/4.2.2, camel/2.22.2, camel-core/2.22.2, camel-blueprint/2.22.2, camel-spring/2.22.2, camel-jetty/2.22.2, camel-netty/2.22.2, camel-netty-http/2.22.2, camel-netty4/2.22.2, camel-netty4-http/2.22.2, camel-http4/2.22.2, camel-jackson/2.22.2, camel-sql/2.22.2, camel-jdbc/2.22.2, camel-quartz/2.22.2, camel-jacksonxml/2.22.2, camel-swagger-java/2.22.2, decanter-appender-elasticsearch-rest/2.1.0, decanter-collector-log/2.1.0, pax-jdbc-mariadb/1.3.2, pax-jdbc-mysql/1.3.2, pax-jdbc-config/1.3.2
Question
Why do I get this error? Is it a bug in the camel-swagger-java feature, or am I missing something?
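One thing worth checking, sketched below as an assumption rather than a verified fix: Karaf 4.x allows featuresBoot to be split into stages with parentheses, so that features in the first stage (including wrap, which registers the wrap: URL handler that the swagger-parser bundle URL needs) are installed before the later entries are resolved. The feature list here is abbreviated for illustration:
# etc/org.apache.karaf.features.cfg (sketch, abbreviated feature list)
featuresBoot = \
    (instance/4.2.2, package/4.2.2, log/4.2.2, ssh/4.2.2, framework/4.2.2, \
     system/4.2.2, eventadmin/4.2.2, feature/4.2.2, shell/4.2.2, management/4.2.2, \
     service/4.2.2, jaas/4.2.2, deployer/4.2.2, wrap/2.5.4, bundle/4.2.2, config/4.2.2), \
    camel/2.22.2, camel-swagger-java/2.22.2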

Error integrating Solr 5.5.0 with Nutch 1.13: 'Connection pool shut down'

I had a problem when I tried to integrate Solr with Nutch:
Version of Nutch: 1.13
Version of Solr: 5.5.0 (as recommended by the official documentation: https://wiki.apache.org/nutch/NutchTutorial#Verify_your_Nutch_installation)
The error is:
Active IndexWriters :
SOLRIndexWriter
solr.server.url : URL of the SOLR instance
solr.zookeeper.hosts : URL of the Zookeeper quorum
solr.commit.size : buffer size when sending to SOLR (default 1000)
solr.mapping.file : name of the mapping file for fields (default solrindex-mapping.xml)
solr.auth : use authentication (default false)
solr.auth.username : username for authentication
solr.auth.password : password for authentication
Indexer: number of documents indexed, deleted, or skipped:
Indexer: finished at 2017-11-30 01:34:49, elapsed: 00:00:01
Cleaning up index if possible
apache-nutch-1.13/bin/nutch clean -Dsolr.server.url=http://localhost:8983/solr/nutch crawling_dir/crawldb
SolrIndexer: deleting 1/1 documents
ERROR CleaningJob: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:865)
at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:174)
at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:197)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:208)
Error running:
apache-nutch-1.13/bin/nutch clean -Dsolr.server.url=http://localhost:8983/solr/nutch crawling_dir/crawldb
Failed with exit value 255.
In the log file:
2017-11-30 01:34:50,851 WARN output.FileOutputCommitter - Output Path is null in cleanupJob()
2017-11-30 01:34:50,851 WARN mapred.LocalJobRunner - job_local531807742_0001
java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.IllegalStateException: Connection pool shut down
at org.apache.http.util.Asserts.check(Asserts.java:34)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202)
at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:149)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:482)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:463)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:191)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:179)
at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:117)
at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:122)
at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:244)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2017-11-30 01:34:51,458 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:865)
at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:174)
at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:197)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:208)
Please, does anyone have any idea?
I had the same problem, and yours is probably due to the same reason:
https://issues.apache.org/jira/browse/NUTCH-2269
Try patching it and the error should be gone.
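If you go the patching route, here is a rough sketch of applying the fix to a Nutch 1.13 source tree and rebuilding; the patch file name, its -p level, and the directory name are assumptions, with the actual attachment coming from the JIRA issue above:
# Assumed names: a Nutch 1.13 source checkout plus the patch downloaded from NUTCH-2269
cd apache-nutch-1.13-src
patch -p0 < NUTCH-2269.patch   # apply the SolrIndexWriter fix
ant runtime                    # rebuild runtime/local with the patched plugin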
From my findings, it appears to be a bug. Here's a blog post that explains it well: https://reformatcode.com/code/apache-configuration/apache-nutch-112-with-apache-solr-621-give-an-error

NUTCH 1.13 fetch of url failed with: org.apache.nutch.protocol.ProtocolNotFound: protocol not found for url=http

fetch of httpurl failed with: org.apache.nutch.protocol.ProtocolNotFound: protocol not found for url=http
at org.apache.nutch.protocol.ProtocolFactory.getProtocol(ProtocolFactory.java:85)
at org.apache.nutch.fetcher.FetcherThread.run(FetcherThread.java:285)
Using queue mode : byHost
fetch of httpsurl failed with: org.apache.nutch.protocol.ProtocolNotFound: protocol not found for url=https
at org.apache.nutch.protocol.ProtocolFactory.getProtocol(ProtocolFactory.java:85)
at org.apache.nutch.fetcher.FetcherThread.run(FetcherThread.java:285)
I get the above result while running Nutch 1.13 with Solr 6.6.0.
The command I used is:
bin/crawl -i -D solr.server.url=http://myip/solr/nutch/ urls/ crawl 2
Below is the plugin section of my nutch-site.xml:
<property>
  <name>plugin.includes</name>
  <value>
    protocol-(http|httpclient)|urlfilter-regex|parse-(html)|index-(basic|anchor)|indexer-solr|query-(basic|site|url)|response-(json|xml)|summary-basic|scoring-opic|urlnormalizer-(pass|regex|basic)
  </value>
</property>
Below are my file contents
[root@localhost apache-nutch-1.13]# ls plugins
creativecommons index-more nutch-extensionpoints protocol-file scoring-similarity urlnormalizer-ajax
feed index-replace parse-ext protocol-ftp subcollection urlnormalizer-basic
headings index-static parsefilter-naivebayes protocol-htmlunit tld urlnormalizer-host
index-anchor language-identifier parsefilter-regex protocol-http urlfilter-automaton urlnormalizer-pass
index-basic lib-htmlunit parse-html protocol-httpclient urlfilter-domain urlnormalizer-protocol
indexer-cloudsearch lib-http parse-js protocol-interactiveselenium urlfilter-domainblacklist urlnormalizer-querystring
indexer-dummy lib-nekohtml parse-metatags protocol-selenium urlfilter-ignoreexempt urlnormalizer-regex
indexer-elastic lib-regex-filter parse-replace publish-rabbitmq urlfilter-prefix urlnormalizer-slash
indexer-solr lib-selenium parse-swf publish-rabitmq urlfilter-regex
index-geoip lib-xml parse-tika scoring-depth urlfilter-suffix
index-links microformats-reltag parse-zip scoring-link urlfilter-validator
index-metadata mimetype-filter plugin scoring-opic urlmeta
I'm stuck with this issue. As you can see, I have included both protocol-(http|httpclient), but fetching the URL still fails. Thanks in advance.
NEWER ISSUE (hadoop.log):
2017-09-01 14:35:07,172 INFO solr.SolrIndexWriter - SolrIndexer: deleting 1/1 documents
2017-09-01 14:35:07,321 WARN output.FileOutputCommitter - Output Path is null in cleanupJob()
2017-09-01 14:35:07,323 WARN mapred.LocalJobRunner - job_local1176811933_0001
java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.IllegalStateException: Connection pool shut down
at org.apache.http.util.Asserts.check(Asserts.java:34)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169)
at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202)
at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:149)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:482)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:463)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:191)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:179)
at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:117)
at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:122)
at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:244)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2017-09-01 14:35:07,679 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:865)
at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:174)
at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:197)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:208)
I somehow solved the issue. I think the whitespace in nutch-site.xml was causing it. Here is the new plugin.includes section, for others coming here:
<property>
  <name>plugin.includes</name>
  <value>protocol-http|protocol-httpclient|urlfilter-regex|parse-(html)|index-(basic|anchor)|indexer-solr|query-(basic|site|url)|response-(json|xml)|summary-basic|scoring-opic|urlnormalizer-(pass|regex|basic)</value>
</property>

Camel Spring CXF RESTful web service issue

Hi, I am trying to consume a CXF Spring-based web service using Camel.
I am getting the error below on server startup:
SEVERE: Context initialization failed
org.apache.camel.RuntimeCamelException: org.apache.cxf.service.factory.ServiceConstructionException
at org.apache.camel.util.ObjectHelper.wrapRuntimeCamelException(ObjectHelper.java:1680)
at org.apache.camel.spring.SpringCamelContext.onApplicationEvent(SpringCamelContext.java:138)
at org.apache.camel.spring.CamelContextFactoryBean.onApplicationEvent(CamelContextFactoryBean.java:340)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:163)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:136)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:381)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:335)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:855)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:541)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:446)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:328)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:107)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4842)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1407)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1397)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.cxf.service.factory.ServiceConstructionException
at org.apache.cxf.jaxrs.JAXRSServerFactoryBean.create(JAXRSServerFactoryBean.java:219)
at org.apache.camel.component.cxf.jaxrs.CxfRsConsumer.<init>(CxfRsConsumer.java:44)
at org.apache.camel.component.cxf.jaxrs.CxfRsEndpoint.createConsumer(CxfRsEndpoint.java:176)
at org.apache.camel.impl.EventDrivenConsumerRoute.addServices(EventDrivenConsumerRoute.java:69)
at org.apache.camel.impl.DefaultRoute.onStartingServices(DefaultRoute.java:98)
at org.apache.camel.impl.RouteService.warmUp(RouteService.java:158)
at org.apache.camel.impl.DefaultCamelContext.doWarmUpRoutes(DefaultCamelContext.java:3490)
at org.apache.camel.impl.DefaultCamelContext.safelyStartRouteServices(DefaultCamelContext.java:3420)
at org.apache.camel.impl.DefaultCamelContext.doStartOrResumeRoutes(DefaultCamelContext.java:3197)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:3053)
at org.apache.camel.impl.DefaultCamelContext.access$000(DefaultCamelContext.java:175)
at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2848)
at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2844)
at org.apache.camel.impl.DefaultCamelContext.doWithDefinedClassLoader(DefaultCamelContext.java:2867)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:2844)
at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:61)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:2813)
at org.apache.camel.spring.SpringCamelContext.maybeStart(SpringCamelContext.java:270)
at org.apache.camel.spring.SpringCamelContext.onApplicationEvent(SpringCamelContext.java:136)
... 19 more
Caused by: org.apache.cxf.service.factory.ServiceConstructionException: No resource classes found
at org.apache.cxf.jaxrs.AbstractJAXRSFactoryBean.checkResources(AbstractJAXRSFactoryBean.java:317)
at org.apache.cxf.jaxrs.JAXRSServerFactoryBean.create(JAXRSServerFactoryBean.java:159)
... 37 more
Aug 24, 2016 11:19:05 AM org.apache.catalina.core.StandardContext listenerStart
SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
(same org.apache.camel.RuntimeCamelException / ServiceConstructionException stack trace as above, again caused by: No resource classes found)
Make sure you've configured resource classes for the CXF component to use, e.g.:
from("cxfrs:http://localhost:8080/?resourceClasses=your.fully.qualified.ClassName")
.to("mock:in")
The resource class you specify needs to be annotated with the JAX-RS annotations so CXF can understand it and use it.
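For illustration, here is a minimal sketch of such an annotated resource class; the package and class name are assumptions chosen to match the placeholder in the endpoint URI above:
package your.fully.qualified;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;

// The JAX-RS annotations describe the REST contract that CXF exposes for the Camel route.
@Path("/hello")
public class ClassName {

    @GET
    @Produces("text/plain")
    public String hello() {
        // With camel-cxfrs the annotated method is typically not invoked directly;
        // the request is handed to the Camel route, so a stub body is enough here.
        return null;
    }
}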

How to diagnose problems (Failed to route and batch data on XXX channel)?

I am using SymmetricDS, and I have stumbled across a problem on one of my clients.
Log says:
2015-11-11 09:10:57,688 ERROR [blagajna_XXX] [RouterService] [blagajna_XXX-job-17] Failed to route and batch data on 'cFpromet' channel
java.lang.NullPointerException
There is no further explanation for the NullPointerException, so I can't debug it myself. Replication itself works for some time, then this error appears and replication stops working. An identical system works without any problems.
select * from sym_outgoing_batch where error_flag=1; returns 0 rows, so how can I debug this problem?
GregaJ
EDIT:
java.lang.NullPointerException
at org.jumpmind.db.platform.AbstractJdbcDdlReader.getTableNamePattern(AbstractJdbcDdlReader.java:638)
at org.jumpmind.db.platform.AbstractJdbcDdlReader$3.execute(AbstractJdbcDdlReader.java:574)
at org.jumpmind.db.platform.AbstractJdbcDdlReader$3.execute(AbstractJdbcDdlReader.java:563)
at org.jumpmind.db.sql.JdbcSqlTemplate.execute(JdbcSqlTemplate.java:432)
at org.jumpmind.db.platform.AbstractJdbcDdlReader.readTable(AbstractJdbcDdlReader.java:563)
at org.jumpmind.db.platform.AbstractDatabasePlatform.readTableFromDatabase(AbstractDatabasePlatform.java:239)
at org.jumpmind.db.platform.AbstractDatabasePlatform.getTableFromCache(AbstractDatabasePlatform.java:314)
at org.jumpmind.symmetric.db.AbstractSymmetricDialect.getTable(AbstractSymmetricDialect.java:377)
at org.jumpmind.symmetric.service.impl.RouterService.routeData(RouterService.java:689)
at org.jumpmind.symmetric.service.impl.RouterService.selectDataAndRoute(RouterService.java:634)
at org.jumpmind.symmetric.service.impl.RouterService.routeDataForChannel(RouterService.java:436)
at org.jumpmind.symmetric.service.impl.RouterService.routeDataForEachChannel(RouterService.java:328)
at org.jumpmind.symmetric.service.impl.RouterService.routeData(RouterService.java:175)
at org.jumpmind.symmetric.job.RouterJob.doJob(RouterJob.java:40)
at org.jumpmind.symmetric.job.AbstractJob.invoke(AbstractJob.java:180)
at org.jumpmind.symmetric.job.AbstractJob.run(AbstractJob.java:224)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
