I am not able to use WhitespaceAnalyzer in schema.xml - Solr

When I add this analyzer in schema.xml:
<fieldType name="nametext" class="solr.TextField">
<analyzer class="org.apache.lucene.analysis.WhitespaceAnalyzer"/>
</fieldType>
and reload the core in Solr 5.0.0, I get the following error:
testcore: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException:
Could not load conf for core testcore: Plugin init failure for [schema.xml]
fieldType "nametext": Cannot load analyzer:
org.apache.lucene.analysis.WhitespaceAnalyzer. Schema file is E:\files\future\solr-5.0.0\server\solr\testcore\conf\schema.xml
What am I missing?

In Solr 5.0.0 this class lives in the org.apache.lucene.analysis.core package, so the class attribute should be org.apache.lucene.analysis.core.WhitespaceAnalyzer - see the documentation: http://lucene.apache.org/core/5_0_0/analyzers-common/index.html
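For example, keeping the fieldType name from the question, the declaration could use the fully qualified class name (a minimal sketch):
<fieldType name="nametext" class="solr.TextField">
  <analyzer class="org.apache.lucene.analysis.core.WhitespaceAnalyzer"/>
</fieldType>
An alternative that avoids hard-coding the Lucene class is to declare a tokenizer factory instead, e.g. <analyzer><tokenizer class="solr.WhitespaceTokenizerFactory"/></analyzer>, which Solr resolves on its own.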

Related

Solr Cloud in Kubernetes Indexing error - HttpSolrCall Unable to write response

I am trying to do indexing with Solr Cloud running in a Kubernetes cluster. I defined a Data Import Handler and I can see the configuration in the Solr UI.
The Data Import Handler will allow me to trigger a SQL query and fetch the polygon data for building the index.
<dataConfig>
<dataSource
type="JdbcDataSource" processor="XPathEntityProcessor"
driver="oracle.jdbc.driver.OracleDriver" ...... />
<document>
<entity name="pcode" pk="PC" transformer="ClobTransformer"
query="select PCA as PC, GEOM as WPOLYGON,
SBJ,PD,CD
from SCPCA
from SCPCA
where SBC is not null">
<field column="PC" name="pCode" />
<field column="WPOLYGON" name="wpolygon" clob="true"/>
<field column="SBJ" name="sbjcode" clob="true"/>
<field column="PD" name="portid"/>
<field column="CD" name="cancid"/>
</entity>
</document>
</dataConfig>
After triggering the indexing via the UI, it runs for around 1 minute and fails with the following errors in the console:
qtp1046545660-14) [c:sba s:shard1 r:core_node6 x:sba_shard1_replica_n4] o.a.s.u.p.LogUpdateProcessorFactory [sba_shard1_replica_n4] webapp=/solr path=/dataimport params={core=sba&debug=true&optimize=false&indent=on&commit=true&name=dataimport&clean=true&wt=json&command=full-import&_=164589234356779&verbose=true}{deleteByQuery=*:*,commit=} 0 70343
2022-02-26 16:30:38.092 INFO (qtp10465423460-14) [c:sba s:shard1 r:core_node6 x:sba_shard1_replica_n4] o.a.s.s.HttpSolrCall Unable to write response, client closed connection or we are shutting down => org.eclipse.jetty.io.EofException: Reset cancel_stream_error
at org.eclipse.jetty.http2.server.HTTP2ServerConnectionFactory$HTTPServerSessionListener.onReset(HTTP2ServerConnectionFactory.java:159)
org.eclipse.jetty.io.EofException: Reset cancel_stream_error
I am using Solr Cloud 8.9 with Solr Operator 0.5.0. I checked the Jetty config and it has an idle timeout of 120000.
Has anyone faced similar issues and fixed them?
Jetty's EofException almost always means one specific thing: the client closed the connection before Solr could respond, so when Solr finally finished processing and tried to have Jetty send the response, there was nowhere to send it -- the connection was gone.
In my case I was doing a full data import to Solr and it failed with this HttpSolrCall "Unable to write response" EofException. This was happening due to issues with my managed-schema / schema.xml: I had forgotten to declare all of the columns correctly in the schema, which caused the indexing to fail with the EofException. After correcting my schema.xml it worked fine.
It is a bit of a confusing error, since there is no obvious reason why a wrong schema should surface as an EofException. Still, when it happens in Solr, always check schema.xml / managed-schema for discrepancies.
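As an illustration, the DIH config above maps columns onto the fields pCode, wpolygon, sbjcode, portid and cancid, so the managed-schema needs matching field declarations. A minimal sketch (the field types here are assumptions, not taken from the original schema; use whatever matches your data):
<!-- Hypothetical declarations matching the DIH field names above -->
<field name="pCode"    type="string"       indexed="true" stored="true"/>
<field name="wpolygon" type="text_general" indexed="true" stored="true"/>
<field name="sbjcode"  type="text_general" indexed="true" stored="true"/>
<field name="portid"   type="string"       indexed="true" stored="true"/>
<field name="cancid"   type="string"       indexed="true" stored="true"/>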

Cannot apply patch LUCENE-2899.patch to Solr on Windows

I am trying to apply the patch LUCENE-2899.patch to Solr.
I have done this:
Cloned Solr from the official repo (I am on the master branch)
Downloaded and installed Ant and GNU patch (I got the latter here: http://gnuwin32.sourceforge.net/packages/patch.htm)
Added Ant and GNU patch to the PATH environment variable.
And I got this...
```
D:\utils\solr_master\lucene-solr>patch -p1 -i LUCENE-2899.patch --dry-run
patching file dev-tools/idea/.idea/ant.xml
Assertion failed: hunk, file ../patch-2.5.9-src/patch.c, line 354
This application has requested the Runtime to terminate it in an unusual way.
Please contact the application's support team for more information.
```
UPDATE 1
I am trying to compile, but the build failed.
D:\utils\solr_master\lucene-solr>ant compile
Buildfile: D:\utils\solr_master\lucene-solr\build.xml
BUILD FAILED
D:\utils\solr_master\lucene-solr\build.xml:21: The following error occurred while executing this line:
D:\utils\solr_master\lucene-solr\lucene\common-build.xml:623: java.lang.NullPointerException
at java.util.Arrays.stream(Arrays.java:5004)
at java.util.stream.Stream.of(Stream.java:1000)
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
at org.apache.tools.ant.util.ChainedMapper.lambda$mapFileName$1(ChainedMapper.java:36)
at java.util.stream.ReduceOps$1ReducingSink.accept(ReduceOps.java:80)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:484)
at org.apache.tools.ant.util.ChainedMapper.mapFileName(ChainedMapper.java:35)
at org.apache.tools.ant.util.CompositeMapper.lambda$mapFileName$0(CompositeMapper.java:32)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
at org.apache.tools.ant.util.CompositeMapper.mapFileName(CompositeMapper.java:33)
at org.apache.tools.ant.taskdefs.PathConvert.execute(PathConvert.java:363)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:346)
at org.apache.tools.ant.Target.execute(Target.java:448)
at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:172)
at org.apache.tools.ant.taskdefs.ImportTask.importResource(ImportTask.java:221)
at org.apache.tools.ant.taskdefs.ImportTask.execute(ImportTask.java:165)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:346)
at org.apache.tools.ant.Target.execute(Target.java:448)
at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:183)
at org.apache.tools.ant.ProjectHelper.configureProject(ProjectHelper.java:93)
at org.apache.tools.ant.Main.runBuild(Main.java:824)
at org.apache.tools.ant.Main.startAnt(Main.java:228)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:283)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:101)
Total time: 0 seconds
UPDATE 2
I have downloaded Solr from
https://builds.apache.org/job/Solr-Artifacts-7.3/lastSuccessfulBuild/artifact/solr/package/ and https://builds.apache.org/job/Solr-Artifacts-master/lastSuccessfulBuild/artifact/solr/package/
but I don't see an opennlp dir in the contrib dir for either the 7.3 version or the 8.0 (master) version. Where can I find it?
UPDATE 3
I have run the version from the master branch which I downloaded here https://builds.apache.org/job/Solr-Artifacts-master/lastSuccessfulBuild/artifact/solr/package/ and I tried to run OpenNLP like the gentleman in this post:
Exception while integrating openNLP with Solr
But I get the same error as he did.
numberplate_shard1_replica_n1:
org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Could not load conf for core numberplate_shard1_replica_n1: Can't load schema managed-schema: Plugin init failure for [schema.xml] fieldType "text_opennlp_nvf": Plugin init failure for [schema.xml] analyzer/tokenizer: Error instantiating class: 'org.apache.lucene.analysis.opennlp.OpenNLPTokenizerFactory'
If the LUCENE-2899 patch is merged into master, why do I get this error?
UPDATE 5
I have restarted Solr and the errors were gone. But...
I was trying to add fields (to managed-schema) from the example (https://wiki.apache.org/solr/OpenNLP):
<fieldType name="text_opennlp" class="solr.TextField">
<analyzer>
<tokenizer class="solr.OpenNLPTokenizerFactory"
sentenceModel="opennlp/en-sent.bin"
tokenizerModel="opennlp/en-token.bin"
/>
</analyzer>
</fieldType>
<field name="content" type="text_opennlp" indexed="true" termOffsets="true" stored="true" termPayloads="true" termPositions="true" docValues="false" termVectors="true" multiValued="true" required="true"/>
But when I try to run Solr in Cloud mode I get this:
D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>solr -e cloud
Welcome to the SolrCloud example!
This interactive session will help you launch a SolrCloud cluster on your local workstation.
To begin, how many Solr nodes would you like to run in your local cluster? (specify 1-4 nodes) [2]:
1
Ok, let's start up 1 Solr nodes for your example SolrCloud cluster.
Please enter the port for node1 [8983]:
Solr home directory D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\solr already exists.
Starting up Solr on port 8983 using command:
"D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin\solr.cmd" start -cloud -p 8983 -s "D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\solr"
Waiting up to 30 to see Solr running on port 8983
Started Solr server on port 8983. Happy searching!
INFO - 2018-03-26 14:42:26.961; org.apache.solr.client.solrj.impl.ZkClientClusterStateProvider; Cluster at localhost:9983 ready
Now let's create a new collection for indexing documents in your 1-node cluster.
Please provide a name for your new collection: [gettingstarted]
numberplate
Collection 'numberplate' already exists!
Do you want to re-use the existing collection or create a new one? Enter 1 to reuse, 2 to create new [1]:
1
Enabling auto soft-commits with maxTime 3 secs using the Config API
POSTing request to Config API: http://localhost:8983/solr/numberplate/config
{"set-property":{"updateHandler.autoSoftCommit.maxTime":"3000"}}
ERROR: Error from server at http://localhost:8983/solr: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/numberplate/config. Reason:
<pre> Not Found</pre></p>
</body>
</html>
SolrCloud example running, please visit: http://localhost:8983/solr
D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>
UPDATE 6
I have created a new collection and I get a more precise error:
test_collection_shard1_replica_n1: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Could not load conf for core test_collection_shard1_replica_n1: Can't load schema managed-schema: org.apache.solr.core.SolrResourceNotFoundException: Can't find resource 'opennlp/en-sent.bin' in classpath or '/configs/_default', cwd=D:\utils\solr-7.3.0-7\solr-7.3.0-7\server
Please check your logs for more information
Maybe I need to copy the OpenNLP models ( http://opennlp.sourceforge.net/models-1.5/ ) somewhere, but where should I put these models?
Can you help me? What am I doing wrong?
As you can see on LUCENE-2899, the patch is already applied to 8.0 (master), as well as 7.3.
You can find pre-built nightlies at Solr-Artifacts-master for (currently) 8.0 and at Solr-Artifacts-7.3 for 7.3.
The opennlp libraries are bundled inside the artifacts:
solr-8.0.0-3304 find . -name '*nlp*'
[...]
./contrib/langid/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lib/opennlp-maxent-3.0.3.jar
./contrib/analysis-extras/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lucene-libs/lucene-analyzers-opennlp-8.0.0-3304.jar
You then have to tell Solr to load these jars, which you can do through solrconfig.xml.
<lib dir="../../../contrib/analysis-extras/lib/" regex="opennlp-.*\.jar" />
<lib dir="../../../contrib/analysis-extras/lucene-libs/lucene-analyzers-opennlp-.*\.jar" regex=".*\.jar" />
Confirm that the jars are loaded as you expect in Solr's log file.
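As for the "Can't find resource 'opennlp/en-sent.bin'" error from UPDATE 6: in cloud mode Solr resolves such relative paths through the resource loader against the collection's configset, so one approach (a sketch, not verified against your setup) is to ship the model files inside the configset's conf directory so the relative paths in the tokenizer line up:
<!-- Sketch: model files placed inside the configset before the collection is created, e.g.
     <configset>/conf/opennlp/en-sent.bin
     <configset>/conf/opennlp/en-token.bin
     so the relative paths below resolve via the resource loader -->
<tokenizer class="solr.OpenNLPTokenizerFactory"
           sentenceModel="opennlp/en-sent.bin"
           tokenizerModel="opennlp/en-token.bin"/>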

Failed to install the JTS library on Solr 6.4.2

I just installed Solr 6.4.2 and tried to install the JTS library as explained here, by copying all JTS library files to the /solr-6.4.2/server/solr-webapp/WEB-INF/lib directory.
Then I configured the managed-schema by adding
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
spatialContextFactory="com.spatial4j.core.context.jts.JtsSpatialContextFactory"
distErrPct="0.025"
maxDistErr="0.000009"
units="degrees"
/>
<field name="geo" type="location_rpt" indexed="true" stored="true" multiValued="true" />
and started it in /bin with ./solr start (Jetty).
But when I visit the Solr interface it says:
> org.apache.solr.common.SolrException:org.apache.solr.common.SolrException:
> Could not load conf for core polygon: Can't load schema
> /home/spatial/solr-6.4.2/server/solr/polygon/conf/managed-schema:
> Plugin Initializing failure for [schema.xml] fieldType
It looks to me like the library is not found or not automatically loaded (as it should be according to the tutorials).
Can you help me?
Here is the log file:
2017-03-11 15:44:57.061 INFO (main) [ ] o.a.s.c.CorePropertiesLocator Cores are: [polygon]
2017-03-11 15:44:57.067 INFO (coreLoadExecutor-6-thread-1) [ x:polygon] o.a.s.c.SolrResourceLoader [null] Added 8 libs to classloader, from paths: [/home/spatial/solr-6.4.2/server/solr/polygon/lib]
2017-03-11 15:44:57.117 INFO (main) [ ] o.e.j.s.Server Started #777ms
2017-03-11 15:44:57.174 INFO (coreLoadExecutor-6-thread-1) [ x:polygon] o.a.s.c.SolrResourceLoader [polygon] Added 59 libs to classloader, from paths: [/home/spatial/solr-6.4.2/contrib/clustering/lib, /home/spatial/solr-6.4.2/contrib/extraction/lib, /home/spatial/solr-6.4.2/contrib/langid/lib, /home/spatial/solr-6.4.2/contrib/velocity/lib, /home/spatial/solr-6.4.2/dist]
2017-03-11 15:44:57.209 INFO (coreLoadExecutor-6-thread-1) [ x:polygon] o.a.s.c.SolrConfig Using Lucene MatchVersion: 6.4.2
2017-03-11 15:44:57.298 INFO (coreLoadExecutor-6-thread-1) [ x:polygon] o.a.s.s.IndexSchema [polygon] Schema name=example-data-driven-schema
2017-03-11 15:44:57.385 WARN (coreLoadExecutor-6-thread-1) [ x:polygon] o.a.s.c.SolrResourceLoader Solr loaded a deprecated plugin/analysis class [solr.SynonymFilterFactory]. Please consult documentation how to replace it accordingly.
2017-03-11 15:44:57.535 WARN (coreLoadExecutor-6-thread-1) [ x:polygon] o.a.s.s.AbstractSpatialFieldType Replace 'com.spatial4j.core' with 'org.locationtech.spatial4j' in your schema.
2017-03-11 15:44:57.556 ERROR (coreLoadExecutor-6-thread-1) [ x:polygon] o.a.s.c.CoreContainer Error creating core [polygon]: Could not load conf for core polygon: Can't load schema /home/spatial/solr-6.4.2/server/solr/polygon/conf/managed-schema: Plugin Initializing failure for [schema.xml] fieldType
org.apache.solr.common.SolrException: Could not load conf for core polygon: Can't load schema /home/spatial/solr-6.4.2/server/solr/polygon/conf/managed-schema: Plugin Initializing failure for [schema.xml] fieldType
at org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:84)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:888)
at org.apache.solr.core.CoreContainer.lambda$load$3(CoreContainer.java:542)
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedCallable.call(InstrumentedExecutorService.java:197)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:229)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.solr.common.SolrException: Can't load schema /home/spatial/solr-6.4.2/server/solr/polygon/conf/managed-schema: Plugin Initializing failure for [schema.xml] fieldType
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:598)
at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:183)
at org.apache.solr.schema.ManagedIndexSchema.<init>(ManagedIndexSchema.java:104)
at org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:173)
at org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:45)
at org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:75)
at org.apache.solr.core.ConfigSetService.createIndexSchema(ConfigSetService.java:106)
at org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:78)
... 8 more
Caused by: org.apache.solr.common.SolrException: Plugin Initializing failure for [schema.xml] fieldType
at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:194)
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:491)
... 15 more
Caused by: java.lang.RuntimeException: schema fieldtype location_rpt(org.apache.solr.schema.SpatialRecursivePrefixTreeFieldType) invalid arguments:{units=degrees}
at org.apache.solr.schema.FieldType.setArgs(FieldType.java:202)
at org.apache.solr.schema.FieldTypePluginLoader.init(FieldTypePluginLoader.java:165)
at org.apache.solr.schema.FieldTypePluginLoader.init(FieldTypePluginLoader.java:53)
at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:191)
... 16 more
2017-03-11 15:44:57.558 ERROR (coreContainerWorkExecutor-2-thread-1) [ ] o.a.s.c.CoreContainer Error waiting for SolrCore to be created
java.util.concurrent.ExecutionException: org.apache.solr.common.SolrException: Unable to create core [polygon]
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at org.apache.solr.core.CoreContainer.lambda$load$4(CoreContainer.java:570)
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:229)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.solr.common.SolrException: Unable to create core [polygon]
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:903)
at org.apache.solr.core.CoreContainer.lambda$load$3(CoreContainer.java:542)
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedCallable.call(InstrumentedExecutorService.java:197)
... 5 more
Caused by: org.apache.solr.common.SolrException: Could not load conf for core polygon: Can't load schema /home/spatial/solr-6.4.2/server/solr/polygon/conf/managed-schema: Plugin Initializing failure for [schema.xml] fieldType
at org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:84)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:888)
... 7 more
Caused by: org.apache.solr.common.SolrException: Can't load schema /home/spatial/solr-6.4.2/server/solr/polygon/conf/managed-schema: Plugin Initializing failure for [schema.xml] fieldType
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:598)
at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:183)
at org.apache.solr.schema.ManagedIndexSchema.<init>(ManagedIndexSchema.java:104)
at org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:173)
at org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:45)
at org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:75)
at org.apache.solr.core.ConfigSetService.createIndexSchema(ConfigSetService.java:106)
at org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:78)
... 8 more
Caused by: org.apache.solr.common.SolrException: Plugin Initializing failure for [schema.xml] fieldType
at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:194)
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:491)
... 15 more
Caused by: java.lang.RuntimeException: schema fieldtype location_rpt(org.apache.solr.schema.SpatialRecursivePrefixTreeFieldType) invalid arguments:{units=degrees}
at org.apache.solr.schema.FieldType.setArgs(FieldType.java:202)
at org.apache.solr.schema.FieldTypePluginLoader.init(FieldTypePluginLoader.java:165)
at org.apache.solr.schema.FieldTypePluginLoader.init(FieldTypePluginLoader.java:53)
at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:191)
It turned out the interface was recently changed, and older examples found on Stack Overflow, like the one linked here, do not work with the most current Solr version (6.4.2). The most current documentation is here.
A configuration example which will work:
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
spatialContextFactory="com.spatial4j.core.context.jts.JtsSpatialContextFactory"
autoIndex="true"
distErrPct="0.025"
maxDistErr="0.001"
distanceUnits="kilometers" />
I.e. distanceUnits is now used instead of units, and degrees as a value raises an error.
The initial configuration I used raised this error in the most current Solr version:
Caused by: java.lang.RuntimeException: schema fieldtype location_rpt(org.apache.solr.schema.SpatialRecursivePrefixTreeFieldType) invalid arguments:{units=degrees}
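Once the corrected field type loads, a polygon can be indexed into the geo field from the question as WKT. A minimal sketch (the document id and coordinates are invented for illustration, and an id uniqueKey field is assumed to exist in the schema):
<add>
  <doc>
    <field name="id">poly-1</field>
    <!-- WKT polygon in "lon lat" order; parsing it requires the JTS jar to be loadable -->
    <field name="geo">POLYGON((-10 -10, 10 -10, 10 10, -10 10, -10 -10))</field>
  </doc>
</add>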

SOLR Field not reflected in schema browser

I created a Solr core using bin/solr -c core1, then copied the schema.xml file from the basic config set to the core1/conf folder and added a field:
<field name="title" type="text" indexed="true" stored="true"/>
But this field is not reflected in the Schema Browser.
What configuration should I make so that new fields show up in the Schema Browser of the Solr Admin UI?
I am using Solr 5.3.1.
By default, when you create a Solr core it will use a managed schema. You will see the following configuration in solrconfig.xml after the core is created:
<schemaFactory class="ManagedIndexSchemaFactory">
<bool name="mutable">true</bool>
<str name="managedSchemaResourceName">managed-schema</str>
</schemaFactory>
Above this configuration you will find comments on how to use the managed schema. Comment this block out and uncomment the following to use schema.xml:
<schemaFactory class="ClassicIndexSchemaFactory"/>
Then you need to reload the core: go to http://yourhost:8983/solr/#/~cores/core1 and press the "Reload" button.
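Putting it together, a minimal sketch of the two relevant pieces after the switch (the text_general type is an assumption; use whichever field type is actually defined in your copied schema.xml):
<!-- solrconfig.xml: switch from the managed schema to the classic schema.xml -->
<schemaFactory class="ClassicIndexSchemaFactory"/>

<!-- schema.xml: the added field; the referenced type must be defined in the same file -->
<field name="title" type="text_general" indexed="true" stored="true"/>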

Can not use ICUTokenizerFactory in Solr

I am trying to use ICUTokenizerFactory in a Solr schema. This is how I have defined the field and fieldType:
<fieldType name="text_icu" class="solr.TextField" positionIncrementGap="100">
<analyzer>
<tokenizer class="solr.ICUTokenizerFactory"/>
</analyzer>
</fieldType>
<field name="fld_icu" type="text_icu" indexed="true" stored="true"/>
And when I start Solr, I get this error:
Plugin init failure for [schema.xml] fieldType "text_icu": Plugin init failure for [schema.xml] analyzer/tokenizer: Error loading class 'solr.ICUTokenizerFactory'
I have searched for this with no success. I don't know if I am missing something or if there is some problem in the schema.
If someone has tried ICUTokenizerFactory, please suggest what the problem could be.
Add this at the top of your solrconfig.xml:
<config>
<lib dir="${user.dir}/../contrib/analysis-extras/lucene-libs/" />
<lib dir="${user.dir}/../contrib/analysis-extras/lib/" />
This assumes that you are running from the example directory with solr.solr.home set to your instance. Otherwise, just use an absolute path to your Solr installation.
You can also copy all those jars into a lib directory (under your core, not under the Solr home). But the above is the easier way.
From the Wiki:
Lucene provides support for segmenting these languages into syllables with solr.ICUTokenizerFactory in the analysis-extras contrib module. To use this tokenizer, see solr/contrib/analysis-extras/README.txt for instructions on which jars you need to add to your SOLR_HOME/lib
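If you prefer to load only the ICU-related jars instead of everything in those directories, the lib directive's regex attribute can narrow it down. A sketch (the relative paths repeat the assumption above, and the patterns target the icu4j and lucene-analyzers-icu jars shipped in analysis-extras):
<!-- Load only the jars needed for the ICU tokenizer -->
<lib dir="${user.dir}/../contrib/analysis-extras/lib/" regex="icu4j-.*\.jar" />
<lib dir="${user.dir}/../contrib/analysis-extras/lucene-libs/" regex="lucene-analyzers-icu-.*\.jar" />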
