Cannot apply patch LUCENE-2899.patch to Solr on Windows - solr

I am trying to apply the patch LUCENE-2899.patch to Solr. I have done this:

- Cloned Solr from the official repo (I am on the master branch).
- Downloaded and installed Ant and GNU patch; I got the latter here: http://gnuwin32.sourceforge.net/packages/patch.htm
- Put Ant and GNU patch on the PATH env var.

And I got this:
```
D:\utils\solr_master\lucene-solr>patch -p1 -i LUCENE-2899.patch --dry-run
patching file dev-tools/idea/.idea/ant.xml
Assertion failed: hunk, file ../patch-2.5.9-src/patch.c, line 354
This application has requested the Runtime to terminate it in an unusual way.
Please contact the application's support team for more information.
```
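(Aside: since the checkout is a git clone, git's own patch applier is an alternative to GNU patch on Windows; a minimal sketch from the same repo root, untested here:)
```
D:\utils\solr_master\lucene-solr>git apply --check -p1 LUCENE-2899.patch
D:\utils\solr_master\lucene-solr>git apply -p1 LUCENE-2899.patch
```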
UPDATE 1
I tried to compile, but the build failed:
```
D:\utils\solr_master\lucene-solr>ant compile
Buildfile: D:\utils\solr_master\lucene-solr\build.xml
BUILD FAILED
D:\utils\solr_master\lucene-solr\build.xml:21: The following error occurred while executing this line:
D:\utils\solr_master\lucene-solr\lucene\common-build.xml:623: java.lang.NullPointerException
at java.util.Arrays.stream(Arrays.java:5004)
at java.util.stream.Stream.of(Stream.java:1000)
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
at org.apache.tools.ant.util.ChainedMapper.lambda$mapFileName$1(ChainedMapper.java:36)
at java.util.stream.ReduceOps$1ReducingSink.accept(ReduceOps.java:80)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:484)
at org.apache.tools.ant.util.ChainedMapper.mapFileName(ChainedMapper.java:35)
at org.apache.tools.ant.util.CompositeMapper.lambda$mapFileName$0(CompositeMapper.java:32)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
at org.apache.tools.ant.util.CompositeMapper.mapFileName(CompositeMapper.java:33)
at org.apache.tools.ant.taskdefs.PathConvert.execute(PathConvert.java:363)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:346)
at org.apache.tools.ant.Target.execute(Target.java:448)
at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:172)
at org.apache.tools.ant.taskdefs.ImportTask.importResource(ImportTask.java:221)
at org.apache.tools.ant.taskdefs.ImportTask.execute(ImportTask.java:165)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:346)
at org.apache.tools.ant.Target.execute(Target.java:448)
at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:183)
at org.apache.tools.ant.ProjectHelper.configureProject(ProjectHelper.java:93)
at org.apache.tools.ant.Main.runBuild(Main.java:824)
at org.apache.tools.ant.Main.startAnt(Main.java:228)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:283)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:101)
Total time: 0 seconds
```
UPDATE 2
I have downloaded Solr from
https://builds.apache.org/job/Solr-Artifacts-7.3/lastSuccessfulBuild/artifact/solr/package/ and https://builds.apache.org/job/Solr-Artifacts-master/lastSuccessfulBuild/artifact/solr/package/
but in neither the 7.3 version nor the 8.0 (master) version do I see an opennlp dir in the contrib dir. Where can I find it?
UPDATE 3
I have run the version from the master branch, which I downloaded here: https://builds.apache.org/job/Solr-Artifacts-master/lastSuccessfulBuild/artifact/solr/package/, and I tried to run OpenNLP like the gentleman in this post:
Exception while integrating openNLP with Solr
But I get the same error as he did:
```
numberplate_shard1_replica_n1: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Could not load conf for core numberplate_shard1_replica_n1: Can't load schema managed-schema: Plugin init failure for [schema.xml] fieldType "text_opennlp_nvf": Plugin init failure for [schema.xml] analyzer/tokenizer: Error instantiating class: 'org.apache.lucene.analysis.opennlp.OpenNLPTokenizerFactory'
```
If the LUCENE-2899 patch is already merged into master, why do I get this error?
UPDATE 5
I restarted Solr and the errors were gone. But...
I tried to add fields to managed-schema, following the example at https://wiki.apache.org/solr/OpenNLP:
```
<fieldType name="text_opennlp" class="solr.TextField">
  <analyzer>
    <tokenizer class="solr.OpenNLPTokenizerFactory"
               sentenceModel="opennlp/en-sent.bin"
               tokenizerModel="opennlp/en-token.bin"
    />
  </analyzer>
</fieldType>
<field name="content" type="text_opennlp" indexed="true" termOffsets="true" stored="true" termPayloads="true" termPositions="true" docValues="false" termVectors="true" multiValued="true" required="true"/>
```
But when I try to run Solr in cloud mode, I get this:
```
D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>solr -e cloud
Welcome to the SolrCloud example!
This interactive session will help you launch a SolrCloud cluster on your local workstation.
To begin, how many Solr nodes would you like to run in your local cluster? (specify 1-4 nodes) [2]:
1
Ok, let's start up 1 Solr nodes for your example SolrCloud cluster.
Please enter the port for node1 [8983]:
Solr home directory D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\solr already exists.
Starting up Solr on port 8983 using command:
"D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin\solr.cmd" start -cloud -p 8983 -s "D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\solr"
Waiting up to 30 to see Solr running on port 8983
Started Solr server on port 8983. Happy searching!
INFO - 2018-03-26 14:42:26.961; org.apache.solr.client.solrj.impl.ZkClientClusterStateProvider; Cluster at localhost:9983 ready
Now let's create a new collection for indexing documents in your 1-node cluster.
Please provide a name for your new collection: [gettingstarted]
numberplate
Collection 'numberplate' already exists!
Do you want to re-use the existing collection or create a new one? Enter 1 to reuse, 2 to create new [1]:
1
Enabling auto soft-commits with maxTime 3 secs using the Config API
POSTing request to Config API: http://localhost:8983/solr/numberplate/config
{"set-property":{"updateHandler.autoSoftCommit.maxTime":"3000"}}
ERROR: Error from server at http://localhost:8983/solr: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/numberplate/config. Reason:
<pre> Not Found</pre></p>
</body>
</html>
SolrCloud example running, please visit: http://localhost:8983/solr
D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>
```
UPDATE 6
I have created a new collection and I get a more precise error:
```
test_collection_shard1_replica_n1: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Could not load conf for core test_collection_shard1_replica_n1: Can't load schema managed-schema: org.apache.solr.core.SolrResourceNotFoundException: Can't find resource 'opennlp/en-sent.bin' in classpath or '/configs/_default', cwd=D:\utils\solr-7.3.0-7\solr-7.3.0-7\server
Please check your logs for more information
```
Maybe I need to copy the OpenNLP models from http://opennlp.sourceforge.net/models-1.5/ somewhere, but where can I put these models?
Can you help me? What am I doing wrong?

As you can see on LUCENE-2899, the patch is already applied to 8.0 (master), as well as 7.3.
You can find pre-built nightlies at Solr-Artifacts-master for (currently) 8.0 and at Solr-Artifacts-7.3 for 7.3.
The opennlp libraries are bundled inside the artifacts:
```
solr-8.0.0-3304 find . -name '*nlp*'
[...]
./contrib/langid/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lib/opennlp-maxent-3.0.3.jar
./contrib/analysis-extras/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lucene-libs/lucene-analyzers-opennlp-8.0.0-3304.jar
```
You then have to tell Solr to load these jars, which you can do through solrconfig.xml.
```
<lib dir="../../../contrib/analysis-extras/lib/" regex="opennlp-.*\.jar" />
<lib dir="../../../contrib/analysis-extras/lucene-libs/" regex="lucene-analyzers-opennlp-.*\.jar" />
```
Confirm that the jars are loaded as you expect in Solr's log file.
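On the model question from UPDATE 6: in SolrCloud, a resource path such as opennlp/en-sent.bin is resolved against the collection's configset in ZooKeeper. One approach (a sketch, assuming the stock _default configset and the embedded ZooKeeper on port 9983 that the -e cloud example starts) is to put the models in an opennlp/ subdirectory of the configset's conf dir and re-upload it:
```
REM Hypothetical local paths; model files from http://opennlp.sourceforge.net/models-1.5/
cd D:\utils\solr-7.3.0-7\solr-7.3.0-7
mkdir server\solr\configsets\_default\conf\opennlp
copy en-sent.bin server\solr\configsets\_default\conf\opennlp\
copy en-token.bin server\solr\configsets\_default\conf\opennlp\
REM Re-upload the configset to the embedded ZooKeeper started by "solr -e cloud"
bin\solr.cmd zk upconfig -z localhost:9983 -n _default -d server\solr\configsets\_default
REM Then reload the collection, e.g. from a browser:
REM http://localhost:8983/solr/admin/collections?action=RELOAD&name=numberplate
```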

Related

Solr Map reduce indexer tool not able to fetch aliases through zk

Hi, while working with MapReduceIndexerTool on Solr 4.10 cloud, the code successfully connects to ZooKeeper, but it fails while fetching aliases.json. Below are the command and stack trace:
command:
```
hadoop --config /etc/hadoop/conf jar target/search-mr-*-job.jar org.apache.solr.hadoop.MapReduceIndexerTool -D 'mapred.child.java.opts=-Xmx500m' --log4j src/test/resources/log4j.properties --morphline-file /home/impadmin/app_quotes_morphline.conf --output-dir hdfs://impetus-i0056.impetus.co.in:8020/user/impadmin/MapReduceIndexerTool/output2 --zk-host 172.26.45.69:9983/solr --collection app.quotes hdfs://impetus-i0056.impetus.co.in:8020/apps/hive/warehouse/kst
```
stack trace:
```
WARNING: Use "yarn jar" to launch YARN applications.
1 [main] INFO org.apache.solr.common.cloud.SolrZkClient - Using default ZkCredentialsProvider
87 [main] INFO org.apache.solr.common.cloud.ConnectionManager - Waiting for client to connect to ZooKeeper
114 [main-EventThread] INFO org.apache.solr.common.cloud.ConnectionManager - Watcher org.apache.solr.common.cloud.ConnectionManager#1568159 name:ZooKeeperConnection Watcher:172.26.45.69:9983/solr got event WatchedEvent state:SyncConnected type:None path:null path:null type:None
115 [main] INFO org.apache.solr.common.cloud.ConnectionManager - Client is connected to ZooKeeper
115 [main] INFO org.apache.solr.common.cloud.SolrZkClient - Using default ZkACLProvider
Exception in thread "main" net.sourceforge.argparse4j.inf.ArgumentParserException: java.lang.IllegalArgumentException: Cannot find expected information for SolrCloud in ZooKeeper: 172.26.45.69:9983/solr
at org.apache.solr.hadoop.MapReduceIndexerTool.verifyZKStructure(MapReduceIndexerTool.java:1418)
at org.apache.solr.hadoop.MapReduceIndexerTool.run(MapReduceIndexerTool.java:716)
at org.apache.solr.hadoop.MapReduceIndexerTool.run(MapReduceIndexerTool.java:681)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.solr.hadoop.MapReduceIndexerTool.main(MapReduceIndexerTool.java:668)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.IllegalArgumentException: Cannot find expected information for SolrCloud in ZooKeeper: 172.26.45.69:9983/solr
at org.apache.solr.hadoop.ZooKeeperInspector.extractDocCollection(ZooKeeperInspector.java:88)
at org.apache.solr.hadoop.ZooKeeperInspector.extractShardUrls(ZooKeeperInspector.java:56)
at org.apache.solr.hadoop.MapReduceIndexerTool.verifyZKStructure(MapReduceIndexerTool.java:1415)
... 10 more
Caused by: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /aliases.json
at org.apache.zookeeper.KeeperException.create(KeeperException.java:111)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.getData(ZooKeeper.java:1155)
at org.apache.solr.common.cloud.SolrZkClient$7.execute(SolrZkClient.java:351)
at org.apache.solr.common.cloud.SolrZkClient$7.execute(SolrZkClient.java:348)
at org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:61)
at org.apache.solr.common.cloud.SolrZkClient.getData(SolrZkClient.java:348)
at org.apache.solr.hadoop.ZooKeeperInspector.checkForAlias(ZooKeeperInspector.java:164)
at org.apache.solr.hadoop.ZooKeeperInspector.extractDocCollection(ZooKeeperInspector.java:85)
... 12 more
```
Please help me to identify the root cause.
The issue was with the URL used to access the Solr configs in ZooKeeper; correcting the URL solved the issue. With an embedded Solr instance there is no /solr application chroot; the data sits directly under the ZooKeeper root.
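Under that reading, a sketch of the corrected invocation (identical to the command above except that the /solr chroot is dropped from --zk-host):
```
hadoop --config /etc/hadoop/conf jar target/search-mr-*-job.jar org.apache.solr.hadoop.MapReduceIndexerTool \
  -D 'mapred.child.java.opts=-Xmx500m' \
  --log4j src/test/resources/log4j.properties \
  --morphline-file /home/impadmin/app_quotes_morphline.conf \
  --output-dir hdfs://impetus-i0056.impetus.co.in:8020/user/impadmin/MapReduceIndexerTool/output2 \
  --zk-host 172.26.45.69:9983 \
  --collection app.quotes \
  hdfs://impetus-i0056.impetus.co.in:8020/apps/hive/warehouse/kst
```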

Nutch 1.11 crawl Issue

I have followed the tutorial and configured Nutch to run on Windows 7 using Cygwin, and I'm using Solr 5.4.0 to index the data. But Nutch 1.11 has a problem executing a crawl.
Crawl command:
```
$ bin/crawl -i -D solr.server.url=http://127.0.0.1:8983/solr /urls /TestCrawl 2
```
Error/exception:
```
Injecting seed URLs /apache-nutch-1.11/bin/nutch inject /TestCrawl/crawldb /urls
Injector: starting at 2016-01-19 17:11:06
Injector: crawlDb: /TestCrawl/crawldb
Injector: urlDir: /urls
Injector: Converting injected urls to crawl db entries.
Injector: java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
at org.apache.hadoop.util.Shell.run(Shell.java:418)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:633)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:421)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:281)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:833)
at org.apache.nutch.crawl.Injector.inject(Injector.java:323)
at org.apache.nutch.crawl.Injector.run(Injector.java:379)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.crawl.Injector.main(Injector.java:369)
Error running:
/home/apache-nutch-1.11/bin/nutch inject /TestCrawl/crawldb /urls
Failed with exit value 127.
```
I can see there are multiple problems with your command; try this:
```
bin/crawl -i -Dsolr.server.url=http://127.0.0.1:8983/solr/core_name path_to_seed crawl 2
```
The first problem is that there is a space when you pass the solr parameter. The second problem is that the Solr URL should include the core name as well.
A hadoop-core jar file is needed when you are working with Nutch; with Nutch 1.11 the compatible hadoop-core jar is 0.20.0. You can download the jar from this link:
http://www.java2s.com/Code/Jar/h/Downloadhadoop0200corejar.htm
Paste that jar into the "C:\cygwin64\home\apache-nutch-1.11\lib" folder and it will run successfully.

I cannot create core in Solr 5.2.1

I have SolrCloud 5.2.1. I deployed Solr and ZooKeeper. When I try to create a core, these errors are thrown:
```
org.apache.solr.common.SolrException: Could not load conf for core contracts_shard1_replica1: Error loading solr config from solrconfig.xml
at org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:78)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:635)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:611)
at org.apache.solr.handler.admin.CoreAdminHandler.handleCreateAction(CoreAdminHandler.java:628)
at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestInternal(CoreAdminHandler.java:213)
at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:193)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)
at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:660)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:431)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:227)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:196)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
org.apache.solr.common.SolrException: Error CREATEing SolrCore 'contracts_shard1_replica1': Unable to create core [contracts_shard1_replica1] Caused by: Can't find resource 'solrconfig.xml' in classpath or '/configs/contracts', cwd=C:\CM_10.1.0\INDEXSERVER\searchserver-distribution\target\searchserver\solr\server
at org.apache.solr.handler.admin.CoreAdminHandler.handleCreateAction(CoreAdminHandler.java:661)
at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestInternal(CoreAdminHandler.java:213)
at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:193)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)
at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:660)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:431)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:227)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:196)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
]
```
I created contracts inside C:\CM_10.1.0\INDEXSERVER\searchserver-distribution\target\searchserver\solr\server and copied the "conf" folder from solr\configsets\basic_configs into contracts, but the problem wasn't solved.
I need help to solve this problem. Can anyone help me?
Thanks
Since you are using ZooKeeper, you must first send the config files to ZooKeeper. I'm not sure how it is in Windows :P, but in Linux it would be:
```
cd /searchserver/solr/server/scripts/cloud-scripts
./zkcli.sh -cmd upconfig -confdir /searchserver/solr/server/solr/corename/conf -confname myconfname -z zoo1:2181,zoo2:2181,zoo3:2181
```
In Windows, use zkcli.bat in the same directory.
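A minimal sketch of the Windows equivalent, assuming the same paths and ZooKeeper hosts as above:
```
cd \searchserver\solr\server\scripts\cloud-scripts
zkcli.bat -cmd upconfig -confdir \searchserver\solr\server\solr\corename\conf -confname myconfname -z zoo1:2181,zoo2:2181,zoo3:2181
```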
Another way to do this is by adding
```
SOLR_OPTS="$SOLR_OPTS -Dbootstrap_confdir=./solr/corename/conf/"
SOLR_OPTS="$SOLR_OPTS -Dcollection.configName=myconfname"
```
to the solr.in.sh file, then (re)starting solr. In Windows, the file is solr.in.cmd, and you add the following lines:
```
set SOLR_OPTS=%SOLR_OPTS% -Dbootstrap_confdir=./solr/corename/conf/
set SOLR_OPTS=%SOLR_OPTS% -Dcollection.configName=myconfname
```
The solr.in.sh/solr.in.cmd file is included in the solr (solr.cmd) command that you use to start the Solr server. myconfname above (in both methods) is an arbitrary name you give to identify the set of config files you've added to ZooKeeper. Then you can create the core using the Collections API:
```
http://localhost:8983/solr/admin/collections?action=CREATE&name=coreName&numShards=2&shards=shard1,shard2&collection.configName=myconfname&createNodeSet=localhost:8983_solr
```

Solr cloud error while creating collection no config file found

I am using an external ZooKeeper for testing on my local system. The steps I followed are below.
Step 1: Created 3 ZooKeeper servers, each with a data dir containing a myid file with the unique number 1, 2, or 3 respectively.
Step 2: Started all three ZooKeeper servers using the command
./zkServer.sh start
Step 3: Checked the status of each server; server 2 shows Mode: leader and the remaining 2 show Mode: follower.
Step 4: Tried to run the SolrCloud example as
/opt/solr$ bin/solr start -e cloud -z localhost:2181,localhost:2182,localhost:2183
It asks me for the number of shards, replicas, etc. The system asks me for a collection name; I entered test, but it throws an exception like:
```
basic_configs, data_driven_schema_configs, or sample_techproducts_configs [data_driven_schema_configs]
Exception in thread "main" org.apache.solr.client.solrj.SolrServerException: Error loading config name for collection phrases
at org.apache.solr.util.SolrCLI.getJson(SolrCLI.java:537)
at org.apache.solr.util.SolrCLI.getJson(SolrCLI.java:471)
at org.apache.solr.util.SolrCLI$StatusTool.getCloudStatus(SolrCLI.java:721)
at org.apache.solr.util.SolrCLI$StatusTool.reportStatus(SolrCLI.java:704)
at org.apache.solr.util.SolrCLI.getZkHost(SolrCLI.java:1160)
at org.apache.solr.util.SolrCLI$CreateCollectionTool.runTool(SolrCLI.java:1210)
at org.apache.solr.util.SolrCLI.main(SolrCLI.java:215)
Enabling auto soft-commits with maxTime 3 secs using the Config API
POSTing request to Config API: http://localhost:8990/solr/sai/config
{"set-property":{"updateHandler.autoSoftCommit.maxTime":"3000"}}
Exception in thread "main" org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://localhost:8990/solr: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/sai/config. Reason:
<pre> Not Found</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:529)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:235)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:227)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1220)
at org.apache.solr.util.SolrCLI.postJsonToSolr(SolrCLI.java:1882)
at org.apache.solr.util.SolrCLI$ConfigTool.runTool(SolrCLI.java:1856)
at org.apache.solr.util.SolrCLI.main(SolrCLI.java:215)
SolrCloud example running, please visit http://localhost:8990/solr
```
It shows the error Exception in thread "main" org.apache.solr.client.solrj.SolrServerException: Error loading config name for collection phrases even though I am trying to create the collection test.
If you are trying to create a collection on SolrCloud with your custom configuration, you have to upload it first onto ZooKeeper. Then you can create a collection using that config. You can also check which configurations are currently on SolrCloud through the Solr admin UI (http://localhost:8983/solr/#/~cloud?view=tree).
Uploading the configuration to ZooKeeper, using the ZooKeeper client in Solr (solr-6.5.1\server\scripts\cloud-scripts):
```
zkcli -zkhost <zookeeper host> -cmd upconfig -confname <configname> -solrhome <solr home directory> -confdir <config directory path>
```
For example:
```
zkcli -zkhost localhost:2181 -cmd upconfig -confname sampleconfig -solrhome ../solr -confdir ../../solr/configsets/sampleconfig/conf
```
Now that your configuration is uploaded, you can create a collection on SolrCloud.
Start SolrCloud with the external ZooKeeper:
```
solr start -c -z localhost:2181
```
Create the collection:
```
http://localhost:8983/solr/admin/collections?action=CREATE&name=<collectionname>&numShards=1&replicationFactor=1&collection.configName=<configname>
```
-e cloud is an example provided by Solr and it works with an implicit (embedded) ZooKeeper.
For an explicit ZooKeeper setup, refer to either of the below:
http://amn-solr.blogspot.in/
SolrCloud 5 and Zookeeper config upload

Error deploying configuration descriptor Solr

I have done the below steps for Solr integration with Tomcat on a Windows machine. Can you please clarify what I am doing wrong here?
1) Downloaded Solr and unzipped Solr 5.2.1 to the directory C:\downloads\solr-5.2.1\solr-5.2.1.
2) Downloaded the Tomcat 7 zipped version and unzipped it to C:\downloads\apache-tomcat-7.0.62\apache-tomcat-7.0.62.
3) Copied the jar files from the C:\downloads\solr-5.2.1\solr-5.2.1\dist\solrj-lib directory to the C:\downloads\apache-tomcat-7.0.62\apache-tomcat-7.0.62\lib directory.
4) Created a solr.xml in the C:\downloads\apache-tomcat-7.0.62\apache-tomcat-7.0.62\conf\Catalina\localhost folder:
```
<?xml version='1.0' encoding='UTF-8'?>
<context docBase="C:/downloads/apache-tomcat-7.0.62/apache-tomcat-7.0.62/webapps/solr.war" debug="0" crossContext="true" >
  <environment name="solr" type="java.lang.String" value="/apache-tomcat-7.0.62/webapps/" override="true"></environment>
</context>
```
5) Copied the solr.war file from C:\downloads\solr-5.2.1\solr-5.2.1\server\webapps to the C:\downloads\apache-tomcat-7.0.62\apache-tomcat-7.0.62\webapps folder.
6) Started Tomcat using the startup.bat command in the bin folder.
7) Edited web.xml to:
```
<env-entry>
  <env-entry-name>solr/home</env-entry-name>
  <env-entry-value>C:/downloads/solr-5.2.1/solr-5.2.1</env-entry-value>
  <env-entry-type>java.lang.String</env-entry-type>
</env-entry>
```
8) Restarted Tomcat and hit the URL http://localhost:8080/solr; I get a 404 Not Found error. The error in the console is:
```
SEVERE: Error deploying configuration descriptor C:\downloads\apache-tomcat-7.0.62\apache-tomcat-7.0.62\conf\Catalina\localhost\solr.xml
java.lang.NullPointerException
        at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:645)
```
The Solr wiki states that running 5.x versions on Tomcat is no longer supported:
> Internally, Solr is still implemented via Servlet APIs and is powered by Jetty -- but this is simply an implementation detail. Deployment as a "webapp" to other Servlet Containers (or other instances of Jetty) is not supported, and may not work in future 5.x versions of Solr when additional changes are likely to be made to Solr internally to leverage custom networking stack features.
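So instead of deploying the war to Tomcat, run Solr 5.x with its bundled Jetty via the start script; a minimal sketch using the paths from the question:
```
REM Run Solr with its bundled Jetty (the supported deployment) instead of Tomcat
cd C:\downloads\solr-5.2.1\solr-5.2.1
bin\solr.cmd start -p 8983
REM then visit http://localhost:8983/solr
```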
