I want to use Solr to index a MySQL database so that I can perform faster searches of the data on my website. Can anyone help me with the code? I don't have any idea how to implement Solr in my code.
Your question is too broad. However, for a head start you could have a look at DataImport in Solr.
You may want to check out the Solr Data Import Handler (DIH) module, which will help you index data from MySQL into Solr without writing any Java code.
If you have downloaded Solr, you can check out the example at solr-4.3.0/example/example-DIH (refer to the readme.txt), which will give you an idea of how the DIH is configured and how the indexing can be done.
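As a rough illustration, a DIH data-config.xml for a single MySQL table might look like the sketch below. The driver, connection details, table and column names are all assumptions for the example; adapt them to your database and your schema.xml.

<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost:3306/mydb"
              user="dbuser"
              password="dbpass"/>
  <document>
    <!-- One Solr document per row; each column maps onto a field declared in schema.xml -->
    <entity name="item" query="SELECT id, title, description FROM items">
      <field column="id" name="id"/>
      <field column="title" name="title"/>
      <field column="description" name="description"/>
    </entity>
  </document>
</dataConfig>

Once the handler is configured, the import can also be triggered from Java with SolrJ, as in the snippet below (the Solr URL is an assumption):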
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.QueryRequest;
import org.apache.solr.common.params.ModifiableSolrParams;

// Trigger a DIH full-import (CommonsHttpSolrServer was renamed HttpSolrServer in later SolrJ versions)
CommonsHttpSolrServer commonsHttpSolrServer = new CommonsHttpSolrServer("http://localhost:8983/solr");
ModifiableSolrParams params = new ModifiableSolrParams();
params.set("command", "full-import");
QueryRequest request = new QueryRequest(params);
request.setPath("/dataimport");
commonsHttpSolrServer.request(request);
NOTE - The request sent is asynchronous, so you would receive an immediate response and would need to check the status to know if it was complete.
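For example, a minimal sketch of polling that status with the same SolrJ objects, assuming the standard DIH status response (which reports "busy" while an import is running and "idle" once it has finished):

// Poll the DIH status until the import has finished
// (NamedList is org.apache.solr.common.util.NamedList)
ModifiableSolrParams statusParams = new ModifiableSolrParams();
statusParams.set("command", "status");
QueryRequest statusRequest = new QueryRequest(statusParams);
statusRequest.setPath("/dataimport");
NamedList<Object> statusResponse = commonsHttpSolrServer.request(statusRequest);
String status = (String) statusResponse.get("status"); // "busy" or "idle"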
Related
I am new to Solr search; I have completed a simple search.
Now I want to index documents directly from the database and set up a scheduler or trigger to update the index whenever there is any change in the DB.
I know that I can do it with DataImportHandler, but I can't understand its flow.
Can you help me with the steps I should take to start this process, or can anyone just give me some pointers?
I want to do all of this using the SolrJ client.
This task requires many parts to work together. Work through https://wiki.apache.org/solr/DataImportHandler
DataImportHandler is a Solr component, which means that it runs inside the Solr instance. All you have to do is configure Solr and then run the DIH through the Dataimport screen.
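Configuring it mainly means registering the handler in solrconfig.xml and pointing it at a data-config.xml that describes your database. A minimal sketch (the config file name is just the conventional one):

<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">data-config.xml</str>
  </lst>
</requestHandler>

The DIH contrib jar (dist/solr-dataimporthandler-*.jar) and your JDBC driver also need to be on Solr's classpath, e.g. via <lib/> directives in solrconfig.xml.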
On the other hand, SolrJ is an API that makes it easy for Java applications to talk to Solr, so you can write your own applications that create, modify, search and delete documents in Solr.
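For instance, a minimal SolrJ sketch of adding, searching and deleting a document might look like this (the URL and the field names are assumptions and must match your schema):

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrInputDocument;

HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr");

// Add (or update) a document
SolrInputDocument doc = new SolrInputDocument();
doc.addField("id", "1");
doc.addField("title", "Hello Solr");
server.add(doc);
server.commit();

// Search for it
QueryResponse rsp = server.query(new SolrQuery("title:hello"));
System.out.println(rsp.getResults().getNumFound() + " documents found");

// Delete it again
server.deleteById("1");
server.commit();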
Try implementing simple edit and delete functions on a button click event: send the id with that URL to a servlet and do your JDBC operation. After that has committed successfully, call your data-import command from SolrJ and redirect to your index page. That's it.
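A rough sketch of such a servlet, assuming a hypothetical items table, JDBC URL and the default Solr URL (all of these are illustrative, not taken from your setup):

import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.request.QueryRequest;
import org.apache.solr.common.params.ModifiableSolrParams;

public class EditItemServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String id = req.getParameter("id");
        String title = req.getParameter("title");

        // 1. Do the JDBC operation against MySQL
        try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/mydb", "dbuser", "dbpass");
             PreparedStatement ps = con.prepareStatement(
                     "UPDATE items SET title = ? WHERE id = ?")) {
            ps.setString(1, title);
            ps.setString(2, id);
            ps.executeUpdate();
        } catch (SQLException e) {
            throw new ServletException(e);
        }

        // 2. Ask DIH to pick up the change, then go back to the index page
        try {
            HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr");
            ModifiableSolrParams params = new ModifiableSolrParams();
            params.set("command", "delta-import");
            QueryRequest request = new QueryRequest(params);
            request.setPath("/dataimport");
            solr.request(request);
        } catch (Exception e) {
            throw new ServletException(e);
        }
        resp.sendRedirect("index.jsp");
    }
}

Note that for delta-import to see the change, your data-config.xml entity needs a deltaQuery (and deltaImportQuery) based on something like a last-modified column.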
I am trying to use Solr to read and search through the indexes provided by another application. These indexes are copied to a NAS every 15 minutes.
Is there a way to force Solr to re-read the indexes every 15 minutes? Is there a way to set a searcher to expire or to be reloaded, perhaps using a CRON expression?
I am aware that I can reload the core... but I'm asking if maybe there is another way...
Thanks.
If you are able to write a CRON expression, it could be done this way:
Solr has an endpoint for reloading a core, so all you need is to hit this URI every X minutes.
Load a new core from the same configuration as an existing registered
core. While the "new" core is initializing, the "old" one will continue
to accept requests. Once it has finished, all new requests will go to
the "new" core, and the "old" core will be unloaded.
http://localhost:8983/solr/admin/cores?action=RELOAD&core=core0
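If you would rather schedule it from Java instead of an external cron job, a sketch along these lines should work with SolrJ 4.x (the core name "core0" and the URL are just the ones from the example above):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.request.CoreAdminRequest;

final HttpSolrServer adminServer = new HttpSolrServer("http://localhost:8983/solr");
ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
scheduler.scheduleAtFixedRate(new Runnable() {
    public void run() {
        try {
            // Reload the core so the searcher picks up the freshly copied index files
            CoreAdminRequest.reloadCore("core0", adminServer);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}, 15, 15, TimeUnit.MINUTES);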
Yes, you can use a CRON expression.
DataImportHandler will allow you to update your Solr index based on your NAS-indexes.
Look for the "delta-import" command "for incremental imports and change detection":
http://<host>:<port>/solr/<collection_name>/dataimport?command=delta-import
Programmatically using a Client API like SolrJ:
CommonsHttpSolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr/<collection_name>");
ModifiableSolrParams params = new ModifiableSolrParams();
params.set("command", "delta-import");
QueryRequest request = new QueryRequest(params);
request.setPath("/dataimport");
server.request(request);
I am writing an application in which I present search capabilities based on Solr 4.
I am facing strange behaviour: during massive indexing, a search request doesn't always "see" newly indexed data. It seems like the index reader is not getting refreshed frequently, and only after I manually refresh the core from the Solr Core Admin window do the expected results return...
I am indexing my data using JsonUpdateRequestHandler.
Is it a matter of configuration? Do I need to configure Solr to somehow reopen its index reader more frequently?
Changes to the index are not available until they are committed.
For SolrJ, do
HttpSolrServer server = new HttpSolrServer(host);
server.commit();
For XML either send in <commit/> or add ?commit=true to the URL, e.g. http://localhost:8983/solr/update?commit=true
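If you would rather not manage commits from the client at all, Solr can be configured to commit on its own. A sketch of the relevant solrconfig.xml section for Solr 4 (the time values are only illustrative); the soft commit is what makes newly indexed documents visible to searchers quickly:

<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxTime>60000</maxTime>           <!-- hard commit every 60 s: flushes to stable storage -->
    <openSearcher>false</openSearcher> <!-- don't reopen the searcher on the hard commit -->
  </autoCommit>
  <autoSoftCommit>
    <maxTime>5000</maxTime>            <!-- soft commit every 5 s: makes changes searchable -->
  </autoSoftCommit>
</updateHandler>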
I want to make queries according to the parameters I get from my server and query the Solr server.
First, I want a guide on how to build a query URL and, second, how to send it to Solr. For the time being I use the default graphical user interface.
Use the client libraries available for Solr in your application to build queries from the parameters you get from the browser and send them to Solr.
These libraries return the Solr results as objects in the respective language, which are very simple to use.
SolrJ for Java, Sunspot and RSolr for Ruby, SolrPHP for PHP, and many more.
As I came to know from your other comment, you are using C#.
Here you go: a code snippet for fetching the XML from Solr using C#.
using System.Net;
using System.IO;
...
...
// Build the GET request against the Solr select URL
HttpWebRequest webrequest = (HttpWebRequest)HttpWebRequest.Create(solruri);
webrequest.Method = WebRequestMethods.Http.Get;
// Get the response from Solr and read the XML body
HttpWebResponse webresponse = (HttpWebResponse)webrequest.GetResponse();
StreamReader reader = new StreamReader(webresponse.GetResponseStream());
The variable solruri is the URL of the running Solr instance, for example: http://localhost:8080/solr/select...
I am trying to add new data to Solandra according to the Solr schema, but I can't find any example of this. My ultimate goal is to integrate Solandra with django-solr.
What I understand about inserting and updating in Solr, based on the original Solr and django-solr, is that you send the new data over HTTP to the appropriate path, for example:
http://localhost:8983/solandra/wikipedia/update/json
However, when I access the URL, the browser keeps telling me HTTP ERROR: 404.
Can you help me understand the steps to add new data and delete data in the Solandra environment?
I also had a look at the reuters-demo, but the procedure to insert data is handled inside reutersimporter.jar, and I can't see its source either. So please help me understand how the system works in terms of inserting and deleting data.
Thank you.
Since you are using the JSON update handler, this UpdateJSON page on the Solr Wiki has some good examples of inserting data using the JSON handler via curl. Also, the Indexing Data section of the Solr Tutorial shows how you can insert data using the post.jar file that is included with the Solr source.
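For example, a typical curl call against the standard JSON update handler looks roughly like this (host, core path and field names are illustrative; with Solandra the path would follow the wikipedia-style URL from your question):

curl 'http://localhost:8983/solr/update/json?commit=true' \
     -H 'Content-type:application/json' \
     --data-binary '[{"id":"doc1","title":"My first document"}]'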
Are you creating the Solr schema.xml and solrconfig.xml and posting them to Solandra? If you add the JSON handler, then this should work. The reuters-demo uses SolrJ. django-solr should work as well.