Adding Index via HTTP GET in Apache Solr

I have a problem using Solr when adding a document via the GET method:
http://localhost:8983/solr/update?stream.body=%3Cadd%3E%3Cdoc%3E%3Cfield%20name=%22office%22%3EBridgewater%3C/field%3E%3Cfield%20name=%22skills%22%3EPerl%3C/field%3E%3Cfield%20name=%22skills%22%3EJava%3C/field%3E%3C/doc%3E%3C/add%3E
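For readability, the URL-encoded stream.body parameter decodes to:
<add><doc><field name="office">Bridgewater</field><field name="skills">Perl</field><field name="skills">Java</field></doc></add>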
The error I got is
HTTP ERROR 400
Problem accessing /solr/update. Reason:
ERROR: [doc=null] unknown field 'office'
Is there any prerequisite that I've missed?

It looks like your schema.xml doesn't have a definition for the office field. If it is there, maybe you're referring to it with the wrong case?
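For example, declarations along these lines in schema.xml would define the missing fields (the type and attributes are only illustrative and assume your schema has a text_general field type, as the Solr examples do; note that skills needs multiValued="true" because the document sends it twice):
<field name="office" type="text_general" indexed="true" stored="true"/>
<field name="skills" type="text_general" indexed="true" stored="true" multiValued="true"/>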

Related

Error while implementing Solr using Solarium

I installed Solr using the Bitnami installer on Windows, and I ping the server using Solarium.
If I use var_dump($result->getResponse()); then everything works fine, but if I use var_dump($result->getData()); I get an error saying the JSON could not be encoded.
Similarly, when I try to index data into the Solr server I get this error:
Fatal error: Uncaught exception 'Solarium\Exception\HttpException' with message 'Solr HTTP error: OK (405) Error 405 HTTP method POST is not supported by this URL HTTP ERROR 405 Problem accessing /solr/admin.html. Reason: HTTP method POST is not supported by this URL Powered by Jetty' in C:\xampp\htdocs\trial\search\vendor\solarium\solarium\library\Solarium\Core\Query\Result\Result.php on line 104
What can be the possible issue?
Bitnami developer here
It seems to be an issue related to Solarium, and not with Bitnami Solr. You can see the issue here:
https://github.com/solariumphp/solarium/issues/101
You could ask there.
I hope it helps

Solr 4 Data Import Handler doesn't work

I am deploying Solr 4.3.0 in Tomcat 7.
Everything works fine except the DataImportHandler. I can go to the
http://localhost:8080/solr/#/collection1/dataimport//dataimport
screen and see the dataimport options load in the UI.
Still, I can't see any of my entities in the "entity" combo box. Inside the configuration box, on the right side, I can see the error below.
Apache Tomcat/7.0.41 - Error report
HTTP Status 500 - Filter execution threw an exception
type: Exception report
message: Filter execution threw an exception
description: The server encountered an internal error that prevented it from fulfilling this request.
exception:
javax.servlet.ServletException: Filter execution threw an exception
root cause:
java.lang.NoClassDefFoundError: org/apache/log4j/spi/LoggingEvent
org.apache.solr.logging.log4j.EventAppender.append(EventAppender.java:35)
org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
org.apache.log4j.Category.callAppenders(Category.java:206)
org.apache.log4j.Category.forcedLog(Category.java:391)
org.apache.log4j.Category.log(Category.java:856)
org.slf4j.impl.Log4jLoggerAdapter.error(Log4jLoggerAdapter.java:498)
org.apache.solr.common.SolrException.log(SolrException.java:119)
org.apache.solr.servlet.ResponseUtils.getErrorInfo(ResponseUtils.java:58)
org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:691)
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:380)
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:155)
note: The full stack trace of the root cause is available in the Apache Tomcat/7.0.41 logs.
The problem is that I do have "log4j-1.2.16.jar" on the classpath (it's in the Tomcat lib dir).
Has anyone run into this problem?
Try following the steps outlined in Using the example logging setup in containers other than Jetty. I ran into this same error with Solr 4.3 until I followed those steps to configure logging.
After changing the directory, did you change the directory path in the solrconfig.xml file?
I just want to make sure: after making the changes in the configuration file, did you restart the Tomcat and Solr server?
You need to copy slf4j-log4j12-1.6.6.jar from Solr's example/lib/ext folder into Tomcat's lib folder.
You also need to put the log4j.properties file there.
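Assuming a stock Solr 4.3 download unpacked at ~/solr-4.3.0 and Tomcat installed at /opt/tomcat (adjust both paths to your setup), the copy steps would look roughly like this, followed by a Tomcat restart:
cp ~/solr-4.3.0/example/lib/ext/*.jar /opt/tomcat/lib/
cp ~/solr-4.3.0/example/resources/log4j.properties /opt/tomcat/lib/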

Need help understanding Solr

I'm just getting started with Nutch and Solr. I ran the crawl once with just one seed URL.
I ran this command:
bin/nutch crawl urls -dir crawl -solr http://localhost:8983/solr/ -depth 3 -topN 5
Everything goes fine, and I'm assuming Solr indexes the pages? So how do I go about searching now? I went to localhost:8983/solr/admin/, but when I enter a search query and click search I get this:
HTTP ERROR 400
Problem accessing /solr/select/.
Reason: undefined field text
I also tried an example from the tutorial but when I run this command:
java -jar post.jar solr.xml monitor.xml
I get this:
SimplePostTool: version 1.4
SimplePostTool: POSTing files to http://localhost:8983/solr/update..
SimplePostTool: POSTing file solr.xml
SimplePostTool: FATAL: Solr returned an error #400 ERROR: [doc=SOLR1000] unknown field 'name'
My ultimate goal is to somehow add this data into Accumulo and use it for a search engine.
I'm assuming you are using Nutch 1.4 or newer. If that is the case, you need to change the type of the fields you added in the solr/conf/schema.xml file from "text" to "text_general" (without the quotes).
I am working towards a similar goal right now and used that fix to at least get Solr running properly, although I still cannot get Solr to search the indexed sites. Hope this helps; let me know if you get it working.
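As an illustration (the field name and attributes are just an example; apply the change to whichever fields your Nutch schema declares), a definition like
<field name="content" type="text" stored="false" indexed="true"/>
would become
<field name="content" type="text_general" stored="false" indexed="true"/>
Remember to reload the core or restart Solr after editing the schema.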

org.apache.solr.common.SolrException: missing content stream

I have installed Apache Solr with Tomcat and my /solr/admin is working fine. But when I try to issue a request to /solr/update I get the following error. What could be the reason?
org.apache.solr.common.SolrException: missing content stream
If you add the commit parameter, i.e. ?commit=true, it will work.
/solr/update will look for input documents to be indexed. Running plain /solr/update causes this exception since there is no input for it. The easiest way to run it is:
java -Durl=localhost:8080/<your apache solr context path, mostly solr>/update -jar post.jar *.xml
This can also happen through SolrJ/spring-data-solr if you try to persist an empty collection of documents.
So solrClient.add(new ArrayList<SolrInputDocument>(), 10000);
would also cause the error.
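A minimal SolrJ sketch of the guard (SolrJ 6 or newer is assumed; the URL, core name, and field names are placeholders, and exception handling is omitted):
import java.util.ArrayList;
import java.util.List;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

// Point the client at the core, e.g. http://localhost:8080/solr/collection1
SolrClient client = new HttpSolrClient.Builder("http://localhost:8080/solr/collection1").build();

SolrInputDocument doc = new SolrInputDocument();
doc.addField("id", "doc-1");
doc.addField("office", "Bridgewater");

List<SolrInputDocument> docs = new ArrayList<>();
docs.add(doc);

// Only send the update when there is something to index; an empty
// collection is what triggers "missing content stream".
if (!docs.isEmpty()) {
    client.add(docs, 10000); // commit within 10 seconds
}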

Building Solr indexes through Haystack throws unknown field error

I'm trying to integrate Haystack with Solr. When I try to build the index, I get the error
"Unknown field django_id" from Solr. What's causing this to happen?
You also get this error if you haven't given Solr the schema.xml file which Haystack generates for you, as explained in the docs:
django-haystack.readthedocs.io/en/latest/tutorial.html#reindex
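On a reasonably recent Haystack, that schema comes from the build_solr_schema management command (the output path here is just an example; point it at your core's conf directory), after which Solr needs a restart:
python manage.py build_solr_schema > /path/to/solr/conf/schema.xml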
The schema.xml was malformed as I had copied additional text from the console.
If you added new fields to your database and copied the generated XML files from Haystack, you might also be getting this error because you haven't restarted Jetty/Tomcat/whatever server you are using. This solved it for me on Ubuntu with Jetty:
sudo /etc/init.d/jetty stop
sudo /etc/init.d/jetty start
(by the way, that is the same as simply doing this):
sudo service jetty restart
Or, if you are using tomcat, that would be
sudo service tomcat6 restart
Edit: I tested this with Tomcat as well, and it solved the same problem there, just as it did with Jetty.
