I tried to run indexing via the HMC, but it all gets aborted with the error below.
ERROR [coreLoadExecutor-4-thread-1] [CoreContainer] Failed to load file /hybris_extensions/hybris/config/solr/embedded/collection1/solrconfig.xml
ERROR [coreLoadExecutor-4-thread-1] [CoreContainer] Unable to create core: collection1
org.apache.solr.common.SolrException: Could not load config file /hybris_extensions/hybris/config/solr/embedded/collection1/solrconfig.xml
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:530)
I started Solr from hybris 5.6 in standalone mode, on a Linux machine.
hybris can run Solr in standalone or embedded mode. When running in embedded mode, the configuration for Solr is created in the directory:
hybris/config/solr/embedded/
Therefore your hybris server needs write access to this directory.
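As a quick check, verify that the user running the hybris server can actually write there; a minimal sketch (the service user name 'hybris' is an assumption):

ls -ld hybris/config/solr/embedded/            # inspect owner and permissions
chown -R hybris: hybris/config/solr/embedded/  # give ownership to the assumed service user
chmod -R u+w hybris/config/solr/embedded/      # make sure the owner can write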
Further reading:
https://wiki.hybris.com/display/release5/SolrFacetSearch+-+Installation+Guide
I have run into a problem that I could not figure out for the past week.
After installing another Java JDK (the Eclipse JDK) and changing the JAVA_HOME value in the system environment variables, my Solr service failed to start.
I tried everything, including formatting my PC and doing a fresh installation the way we always do on a new PC, and I still run into the same problem.
This is the command the system runs to start the Solr service:
Starting java -Xms3000m -Xmx3000m -verbose:gc -XX:NewRatio=3 -XX:SurvivorRatio=4 -XX:TargetSurvivorRatio=90 -XX:MaxTenuringThreshold=8 -XX:ConcGCThreads=4 -XX:ParallelGCThreads=4 -XX:+CMSScavengeBeforeRemark -XX:PretenureSizeThreshold=64m -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=50 -XX:CMSMaxAbortablePrecleanTime=6000 -XX:+CMSParallelRemarkEnabled -XX:+ParallelRefProcEnabled -XX:-OmitStackTraceInFastThrow -DSTOP.PORT=7984 -DSTOP.KEY=mysecret "-Dsolr.install.dir=C:\Program Files\Morphisec Server Dev\solr-6.3.0-ssl\server.." -Djetty.host=0.0.0.0 -Djetty.port=8984 -Dsolr.jetty.https.port=8984 "-Djetty.home=C:\Program Files\Morphisec Server Dev\solr-6.3.0-ssl\server\solr" -Dsolr.autoSoftCommit.maxTime=10 "-Dsolr.log.dir=C:\Program Files\Morphisec Server Dev\solr-6.3.0-ssl\server\logs" -Dsolr.autoCommit.maxTime=60000 -Dsolr.ssl.checkPeerName=false "-Djavax.net.ssl.trustStore=C:\Program Files\Morphisec Server Dev\solr-6.3.0-ssl\server\etc\keystore.solr.jks" -Djavax.net.ssl.trustStorePassword=DEA7B39145F6478C -DzkClientTimeout=15000 -DzkRun -jar start.jar --module=https -DurlScheme=https
As mentioned, none of the schema files in the Solr folders contain any use of IntPointField, only TrieField…
We use Solr 6.3.0.
When I open the Solr UI and go to Cloud → Tree → /configs and choose one of my collections, I see a managed-schema file that does not appear in the directory on my PC, and it contains a use of IntPointField:
I don't know where it is getting this from. (As I mentioned, I even formatted the PC.)
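Since the start command includes -DzkRun, Solr runs with an embedded ZooKeeper, and the managed-schema shown under Cloud → Tree is read from ZooKeeper, not from a file on disk. As a sketch, you can list and download those configs with the bin\solr zk tool (the ZooKeeper port is an assumption based on the usual jetty.port + 1000 convention, so 9984 here; <configname> is a placeholder):

bin\solr zk ls /configs -z localhost:9984
bin\solr zk downconfig -n <configname> -d tmpconf -z localhost:9984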
This is the log I get for each failed collection creation:
Downloaded Apache Solr 8.8.1 [https://archive.apache.org/dist/lucene/solr/8.1.1/]
In the path \solr-8.8.1\bin, opened a command prompt and executed the following command:
solr start
[screenshot: command prompt after starting Solr]
Accessed http://localhost:8983/solr/#/ in the browser
Clicked Core Admin -> new core
Filled in the core name and instanceDir, with dataDir = data, config = solrconfig.xml (the default), and schema = schema.xml (the default)
When I click Add Core, I get the following error:
Error CREATEing SolrCore 'new_core': Unable to create core [new_core] Caused by: Can't find resource 'solrconfig.xml' in classpath or 'C:\Users\AnanyaStitipragyan\Desktop\CollabAI\solr-8.8.1\server\solr\new_core'
You can add a core using the following command:
solr-8.8.1\bin> solr create -c <core_name>
Check out this link https://www.tutorialspoint.com/apache_solr/apache_solr_core.htm
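If you want the Admin UI route to work instead: the Add Core button does not create any configuration files, so the instance directory must already contain a conf folder before you click it. A minimal sketch on Windows, assuming the directory layout from the question (note that the _default configset ships a managed-schema rather than a schema.xml):

cd solr-8.8.1
mkdir server\solr\new_core
xcopy /E /I server\solr\configsets\_default\conf server\solr\new_core\conf

After this, adding a core named new_core with instanceDir new_core should succeed.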
I'm trying to integrate Solr with Hybris, but they are running on Kubernetes as separate pods.
When I try to run Solr indexing from Hybris, it throws the error below:
ERROR [BackofficeLO-47] (000001JT) [SolrStandaloneSearchProvider] Error from server at http://10.10.100.181:34324/solr: Error CREATEing SolrCore 'master_backoffice_backoffice_product_flip': Unable to create core [master_backoffice_backoffice_product_flip] Caused by: de.hybris.platform.solr.search.MultiMaxScoreQParserPlugin
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://10.10.100.181:34324/solr: Error CREATEing SolrCore 'master_backoffice_backoffice_product_flip': Unable to create core [master_backoffice_backoffice_product_flip] Caused by: de.hybris.platform.solr.search.MultiMaxScoreQParserPlugin
I guess something is wrong with Solr's "default" indexing directory.
Solr is running as a process inside the pod, like this:
solr#solr-fsd33wdf-qteg:/opt/solr-8.5.2$ ps -ef | grep solr
solr 10 1 0 Jun23 ? 00:15:55 /usr/local/openjdk-11/bin/java -server -Xms512m -Xmx512m -XX:+UseG1GC -XX:+PerfDisableSharedMem -XX:+ParallelRefProcEnabled -XX:MaxGCPauseMillis=250 -XX:+UseLargePages -XX:+AlwaysPreTouch -Xlog:gc*:file=/var/solr/logs/solr_gc.log:time,uptime:filecount=9,filesize=20M -Dsolr.jetty.inetaccess.includes= -Dsolr.jetty.inetaccess.excludes= -Dsolr.log.dir=/var/solr/logs -Djetty.port=8983 -DSTOP.PORT=7983 -DSTOP.KEY=solrrocks -Duser.timezone=UTC -Djetty.home=/opt/solr/server -Dsolr.solr.home=/var/solr/data -Dsolr.data.home= -Dsolr.install.dir=/opt/solr -Dsolr.default.confdir=/opt/solr/server/solr/configsets/_default/conf -Dlog4j.configurationFile=/var/solr/log4j2.xml -Xss256k -Dsolr.jetty.https.port=8983 -jar start.jar --module=http
So the default confdir is /opt/solr/server/solr/configsets/_default/conf.
If I check the SOLR_HOME variable, it is a different directory:
solr#solr-f575dcfdf-qtnpg:/opt/solr-8.5.2$ echo $SOLR_HOME
/var/solr/data
So, how can I change the confdir to /var/solr/data? I guess this is the problem here?
Thanks!
This page provides information on how to use the standalone setup. The ant configureSolrServer task takes as an argument the path to the original Solr binary, which you should download from here; it overwrites the files in that directory with the SAP Commerce specific setup. The MultiMaxScoreQParserPlugin is part of the solr-hybris-components-<version_of_solr>.jar file, where <version_of_solr> corresponds to the Solr version your SAP Commerce is running on. Note that SAP Commerce supports multiple Solr versions, and which one is used depends on your configuration.
You may then extend the default Solr Docker image, as provided here, to get your setup running.
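As a quick way to test this without rebuilding the image, you could copy the plugin jar into the running pod; a rough sketch (the pod name matches the question, but the jar's source path and loading it from $SOLR_HOME/lib are assumptions to adapt, e.g. via sharedLib in solr.xml):

kubectl exec solr-f575dcfdf-qtnpg -- mkdir -p /var/solr/data/lib
kubectl cp solr-hybris-components-8.5.2.jar solr-f575dcfdf-qtnpg:/var/solr/data/lib/
kubectl delete pod solr-f575dcfdf-qtnpg   # assumes a Deployment recreates the pod, restarting Solr

For a durable setup, bake the jar into a custom Solr image instead, as suggested above.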
I've just started to learn Solr, and for the last 3 days I have been stuck: I cannot index rich documents on Solr 3.6 or 4.0. I am using Windows 7 64-bit.
What I tried:
First I installed Solr 3.6 with Tomcat/Jetty using the BitNami stack.
1. Tried the -Durl command; what I got: error #500, lazy loading error.
2. Downloaded curl for my Windows machine and tried curl; I got: error #500, lazy loading error.
3. Copied a program from the Solr tutorial to upload a file using SolrJ in the NetBeans IDE, and tried to index a PDF file via update/extract; then I got:
org.apache.solr.common.SolrException: Server at
"myServer:port/solr" returned non ok status:500, message:Internal
Server Error
4. Changed solrconfig.xml to remove startup="lazy" from the update/extract request handler, and got the same thing.
I re-installed Solr 3.6 but could not get it to work; 4.0 gives the same error.
The same problem also occurs with some other request handlers, like /browse, etc.
Should I switch to Linux?
It looks like the packager (BitNami) did not include that library, even though they left Solr configured to use it. You can ask them to resolve it, or you can deploy it yourself.
Here's how to deploy Solr on Tomcat. It's equally easy to install on Windows, and it starts as a Windows service. Once installed, to enable rich-document support, copy the contents of contrib/extraction/lib/ to a directory and point sharedLib in solr.xml at that directory. If you have read that guide, you will understand those terms :-)
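A minimal sketch of that last step on Windows, assuming the Solr 3.6 download is unpacked next to a Solr home at C:\solr (all paths are assumptions; adjust to your layout):

mkdir C:\solr\sharedlib
copy apache-solr-3.6.0\contrib\extraction\lib\*.jar C:\solr\sharedlib\
rem then edit solr.xml so the root element reads <solr persistent="true" sharedLib="C:\solr\sharedlib">
rem and restart Tomcat; update/extract should then load without the lazy loading error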
I am trying to run Apache Nutch on Windows for web crawling. I have installed Cygwin and set its Path, but I am getting the following exception:
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-cjindal\mapred\staging\cjindal-330065706\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:655)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
at org.apache.nutch.crawl.Injector.inject(Injector.java:217)
at org.apache.nutch.crawl.Crawl.run(Crawl.java:127)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.nutch.crawl.Crawl.main(Crawl.java:55)
I have not installed Hadoop. Please help.
It is better to run Nutch on Unix. But if you want to run it on Windows, you can probably download version 1.2 of Nutch, which comes with a Hadoop version that does not have this issue.
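For illustration, a sketch of kicking off a crawl with that older release from a Cygwin shell (the seed URL and crawl options are placeholders):

cd apache-nutch-1.2
mkdir urls
echo "http://example.com/" > urls/seed.txt
bin/nutch crawl urls -dir crawl -depth 3 -topN 50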