I am using Solr 4.6 to fetch records from SQL Server using the Data Import Handler. While fetching, I get an error, and the reason is that one of my fields is of LatLong type. When my SQL latlong field contains a bad value, e.g. 23.454,545454, the longitude value (545454) is out of range, so the Solr DIH throws an error. I want to know where Solr keeps these error logs. I am using the Jetty container for Solr.
JAVA_HOME=/usr/java/default
JAVA_OPTIONS="-Dsolr.solr.home=/opt/solr/solr $JAVA_OPTIONS"
JETTY_HOME=/opt/solr
JETTY_USER=solr
JETTY_LOGS=/opt/solr/logs
All of these settings are important. In particular, not setting JETTY_LOGS would lead Jetty to attempt (and fail) to place request logs in /home/solr/logs.
For details, go through this link:
https://wiki.apache.org/solr/SolrJetty
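Beyond the Jetty request logs, Solr 4.x itself (since 4.3) logs through SLF4J/log4j, so DIH stack traces such as the bad lat/long error end up wherever your log4j configuration points. A minimal log4j.properties sketch; the file path and rotation settings here are assumptions, adjust them to your layout:

```properties
# Minimal log4j.properties sketch -- paths and sizes are assumptions.
log4j.rootLogger=INFO, file

log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/opt/solr/logs/solr.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=9
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %-5p [%t] %c{1} - %m%n

# Turn up logging for the dataimport classes to see DIH errors in detail
log4j.logger.org.apache.solr.handler.dataimport=DEBUG
```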
Related
The full-indexing log file does not contain any error messages.
The cron job log shows success, but a few products are not coming through.
Can anyone tell me what the exact problem is?
Have you checked in the Solr server (http://localhost:8983/solr) whether your missing products are present or not?
Case 1: Products are showing in the Solr server
In that case, some search filters might be causing a few products not to show in the storefront. Try to debug what query is actually hitting the Solr server. Have you added any custom filter condition in the commerceSearchQueryPageableConverter chain? Debug SearchFiltersPopulator as well.
Case 2: Products are not showing in the Solr server
In that case, you need to check your full and update index queries.
I have an index on a schemaless Solr instance. To allow the application to query some of the fields in this index, I have to register these fields using the schema REST API (http://localhost:8983/solr/schema/fields).
All works fine in isolation. I can also replicate the index to slaves without problem. However, I am unable to query the replicated index using the fields that were registered via the schema REST API.
That means that if I register the field "button" using the API, I can query using this field on the master, but I cannot query it on the slave; I get the error message 400 undefined field button.
Now, I also tried to register this field on the slave in the same way I registered it on the master using the schema REST API. This fails with the message: 400 This IndexSchema is not mutable.
Any idea how this should be addressed?
I presume that when the schema is well defined, the schema.xml can be replicated. But what happens with fields created via the REST API?
I am using SOLR 4.10.3
I have not fully validated that this is the solution to the problem, but my gut feeling tells me it is. The Solr master was running Solr 4.8.0 and the slave was running Solr 4.10.3. It looks like the slave did not completely like the index replicated from 4.8.0, so I downgraded the slave to 4.8.0 and everything works fine.
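For completeness: whether a core accepts field additions through the schema REST API at all is controlled by the schemaFactory element in solrconfig.xml. A sketch of the standard Solr 4.x managed-schema configuration (the resource name shown is the default); if a core is instead created with ClassicIndexSchemaFactory, the schema API is read-only there, which produces exactly the "IndexSchema is not mutable" error:

```xml
<!-- solrconfig.xml: use a mutable managed schema so fields can be
     added via the schema REST API -->
<schemaFactory class="ManagedIndexSchemaFactory">
  <bool name="mutable">true</bool>
  <str name="managedSchemaResourceName">managed-schema</str>
</schemaFactory>
```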
I have been struggling with setting up Carrot2 for use with PHP on a local machine. The plan is to have Carrot2 retrieve clusters from a Solr index populated by Nutch. Currently, Solr and Nutch are correctly configured, and I have been able to access the information via the Carrot2 Workbench. Carrot2-dcs-3.10.0 has been set up and, as far as I can tell, correctly deployed through the Tomcat 6 manager, although the documentation on setting this up is horribly vague and incomplete. Changes to source-solr-attributes.xml were made according to https://sites.google.com/site/profileswapnilkulkarni/tech-talk/howtoconfigureandruncarrot2webapplicationwithsolrdocumentsource . Tomcat is set up on port 8080. The Carrot2 DCS PHP example (example.php) works and displays the test output correctly. However, when I try to perform clustering using localIPAddress:8080/carrot2-dcs/index.html, I run into a problem. When I set the document source to Solr and the query to *:*, then click cluster, I get the following error message.
HTTP Status 500 - Could not perform processing: org.apache.http.conn.HttpHostConnectException: Connection to localhost:8983 refused
type Status report
message Could not perform processing: org.apache.http.conn.HttpHostConnectException: Connection to localhost:8983 refused
description The server encountered an internal error that prevented it from fulfilling this request.
I have searched everywhere in the deployed webapp folder for carrot2 and can't find where it is getting localhost:8983 from.
Any assistance would be appreciated, thank you.
It turns out that the source-solr-attributes.xml file had an extra overridden-attributes element: one was before the default block comment with the example parameters, and a second one was added by me with the parameters needed for my configuration. Deleting one of them so that there was only one element corrected the problem. Apparently, with two of those, it ignores the server settings and uses default values instead.
I am writing an application in which I present search capabilities based on SOLR 4.
I am facing strange behaviour: during massive indexing, search requests don't always "see" newly indexed data. It seems like the index reader is not getting refreshed frequently, and only after I manually reload the core from the Solr Core Admin window do the expected results return...
I am indexing my data using JsonUpdateRequestHandler.
Is it a matter of configuration? Do I need to configure Solr to reopen its index reader more frequently somehow?
Changes to the index are not available until they are committed.
For SolrJ, do:
import org.apache.solr.client.solrj.impl.HttpSolrServer;

HttpSolrServer server = new HttpSolrServer(host);
server.commit();
For XML, either send <commit/> in the request body or add ?commit=true to the URL, e.g. http://localhost:8983/solr/update?commit=true
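Alternatively, instead of committing from the client after every batch, you can have Solr commit on its own. A solrconfig.xml sketch (the interval values are examples to tune for your load); the soft commit is the part that opens a new searcher so recently indexed documents become visible:

```xml
<!-- solrconfig.xml: commit automatically so new documents become
     searchable without an explicit client-side commit -->
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- hard commit: flushes to stable storage; does not open a searcher -->
  <autoCommit>
    <maxTime>15000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>
  <!-- soft commit: cheap; opens a new searcher so changes become visible -->
  <autoSoftCommit>
    <maxTime>1000</maxTime>
  </autoSoftCommit>
</updateHandler>
```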
I have set up Solr to index data from an Oracle DB through the DIH handler. Through the Solr admin I can see that the DB connection is successful and that data is retrieved from the DB, but it is not added to the index. The message is "0 documents added", even though I can see that 9 records are returned.
The schema and the fields in db-data-config.xml are one and the same.
Please suggest anything I should look into.
Did you do a full import by hitting http://HOST:PORT/solr/CORE/dataimport?command=full-import? The commit should then happen by default. You can also try committing explicitly on full import by hitting http://HOST:PORT/solr/CORE/dataimport?command=full-import&commit=true.
Hit http://HOST:PORT/solr/CORE/select?q=*:* and check if you get 9 docs back.
However, if you are running a delta import, then there is a possibility that no documents were changed and you may see 0 docs added/deleted.
If you want to delete the existing Solr index before starting, hit http://HOST:PORT/solr/CORE/update?stream.body=<delete><query>*:*</query></delete>&commit=true and then do a full import & verify.
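One practical note: many HTTP clients will choke on the raw < and > characters in that last URL, so the stream.body parameter usually needs URL-encoding. A small Python sketch (the host and core name are placeholders) that builds the encoded delete-and-commit URL:

```python
from urllib.parse import quote

# Placeholder host and core name -- adjust to your setup.
base = "http://localhost:8983/solr/mycore/update"
body = "<delete><query>*:*</query></delete>"

# Encode the XML so it survives as a query-string parameter.
url = base + "?stream.body=" + quote(body, safe="") + "&commit=true"
print(url)
```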