I have a standard Solr 3.6 index and am looking to get the latest N documents back (date ascending, from when they were indexed).
This site was helpful but not exactly what I'm looking for.
I am looking to do something like this:
localhost:8080/solr/select/?q=greekbailout&wt=json; date asc
Basically, I want to query for whatever, with JSON output, and get the latest N documents submitted to the index. Has anybody run into this before?
Use &sort=date asc for pure sorting and this for boosting newer documents.
Solr query using the date field, with N documents returned in the results:
localhost:8080/solr/select/?q=greekbailout&wt=json&sort=date asc&rows=N
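Note that the space in the sort parameter has to be URL-encoded when the request is actually sent (as + or %20), so the request would look like:
localhost:8080/solr/select/?q=greekbailout&wt=json&sort=date+asc&rows=N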
The default Solr schema has a field called timestamp, which stores the time at which a particular document was created or modified. So if your date field doesn't quite store this and that is your requirement, you can use timestamp instead: just replace date with timestamp.
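For example, assuming the default timestamp field is present in your schema, the ten oldest matching documents (oldest first) would come back with the request below; swap asc for desc to get the newest documents first:
localhost:8080/solr/select/?q=greekbailout&wt=json&sort=timestamp+asc&rows=10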
In your Solr URL just append &sort=<field>+<asc/desc>. Also, your field should be indexed and not multivalued.
You can also sort on multiple fields.
&sort=<field name>+<direction>[,<field name>+<direction>]...
http://wiki.apache.org/solr/CommonQueryParameters#sort
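For example, to sort by relevance first and break ties by date (the date field name here is just an illustration):
&sort=score+desc,date+asc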
I have Solr 7.2.1, and in my managed-schema.xml file I have a field of type "pDate" which represents a date.
Now I also need to index the time of day, but I found that I can't search on the time with the "pDate" field type. If I query Solr for my_date_field:[2018-03-12T00:00:00.000Z TO *] it works; but if I search for [2018-03-12T12:00:00.000Z TO *] I can't find any results.
So, basically, what type is better to use to achieve that? Is the field type the origin of the problem?
Solr Ref Guide says: "Solr’s date fields (DatePointField, DateRangeField and the deprecated TrieDateField) represent "dates" as a point in time with millisecond precision." So the field can not be the origin of your problem.
Check the format "12/03/2018T12:00:00.000Z". Is it really correct? I only know dates formatted like "2018-03-12T12:00:00.000Z". See here: Date Formatting
Also find the document in Solr Admin UI and inspect the JSON response. What is the value of my_date_field?
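For instance, a correctly stored value would show up in the document's JSON roughly like this (field name taken from the question):
"my_date_field": "2018-03-12T12:00:00.000Z"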
I found the solution:
I had to set the $SOLR_TIMEZONE variable in the "solr.in" config file, but the correct value for me was not "CEST" but "Europe/Rome".
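For reference, assuming the Linux include script bin/solr.in.sh (solr.in.cmd on Windows), the setting is a single line:
SOLR_TIMEZONE="Europe/Rome"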
Hi guys, I have started working on a project which needs a Solr implementation for searching.
I am using SolrNet Lib and my question is:
I have two fields in the Solr index, Maxsal and Minsal, and I have a Currentsal parameter which contains a salary amount. What I want is to get all records which satisfy this condition:
currentsal < Maxsal && currentsal > Minsal
Take a look at Solr range queries. They should allow you to create a query like this:
minsal:[* TO PARAM] AND maxsal:[PARAM TO *]
For more information look here - http://www.solrtutorial.com/solr-query-syntax.html
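With a concrete value plugged in (say a current salary of 50000, purely as an illustration), that becomes:
minsal:[* TO 50000] AND maxsal:[50000 TO *]
Square brackets make the bounds inclusive; curly braces ({* TO 50000}) make them exclusive if you need strict inequality.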
Never noticed that Query() takes a string parameter too.
So, something like:
Solr.Query("Minsal:[* TO " + parameter + "] AND Maxsal:[" + parameter + " TO *]")
I recently patched my Solr 4.2.1 with the ComplexPhrase query add-on (https://issues.apache.org/jira/browse/SOLR-1604). When I issue a query such as:
my_text_field:"testin* compl*"~1 AND my_date_field:2013-12-12T04:58:53.732Z
I get results that match the text query I issued and have the date I specified in my_date_field.
But when I do this:
my_text_field:"testin* compl*"~1 AND my_date_field:[2013-01-01T02:58:53.732Z TO 2013-12-12T04:58:53.732Z]
I get no results.
If I remove the complexphrase parser, things go back to normal (but then I have no support for complex phrase queries).
OK, after some time reading the Lucene and Solr code I figured it out.
This patch creates a query parser that extends the Lucene QueryParser. The Lucene QueryParser does not handle range queries other than term ranges (simple strings, in a way). If you want to specialize the behavior of the QueryParser, you must extract the field type and create the appropriate range query (e.g. NumericRangeQuery for numbers, etc.).
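As a rough sketch of that idea (not the actual patch code; the class name, constructor wiring and the null QParser argument are illustrative assumptions), one could extend the Lucene ComplexPhraseQueryParser and delegate range construction to the Solr schema's field type:

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.queryparser.classic.ParseException;
import org.apache.lucene.queryparser.complexPhrase.ComplexPhraseQueryParser;
import org.apache.lucene.search.Query;
import org.apache.lucene.util.Version;
import org.apache.solr.schema.IndexSchema;
import org.apache.solr.schema.SchemaField;

public class SchemaAwareComplexPhraseParser extends ComplexPhraseQueryParser {
    private final IndexSchema schema;

    public SchemaAwareComplexPhraseParser(Version version, String defaultField,
                                          Analyzer analyzer, IndexSchema schema) {
        super(version, defaultField, analyzer);
        this.schema = schema;
    }

    // Let the schema's field type build the range query (date, trie/numeric, ...)
    // instead of the plain term range the Lucene parser would otherwise produce.
    @Override
    protected Query getRangeQuery(String field, String part1, String part2,
                                  boolean startInclusive, boolean endInclusive) throws ParseException {
        SchemaField sf = schema.getFieldOrNull(field);
        if (sf != null) {
            // most field types ignore the QParser argument; a real integration would pass the active one
            return sf.getType().getRangeQuery(null, sf, part1, part2, startInclusive, endInclusive);
        }
        return super.getRangeQuery(field, part1, part2, startInclusive, endInclusive);
    }
}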
I want Solr to create indexes based on a specific field. For example, I have a field in schema.xml, createDate (which might have a value of 2012, 2013, etc.). Now, while indexing, if the value of that specific field is 2013, the document should be indexed into the /data/2013/index folder (or some logically separated folder). I tried to provide the following in my solrconfig.xml just before the <config> tag ends:
<partition>
  <partitionField name="creationYear">
    <value>2004</value>
    <value>2005</value>
    <value>2006</value>
    <value>2007</value>
    <value>2008</value>
    <value>2009</value>
    <value>2010</value>
    <value>2011</value>
    <value>2012</value>
    <value>2013</value>
  </partitionField>
</partition>
While indexing it's not working, and it seems that this was just an idea that was never actually implemented in Solr. Am I assuming correctly? Or is there a way I can get Solr to create dynamic index folders based on the year (as in this example)?
Any help would be appreciated!
1) I have a field storing a timestamp as text (YYYYMMDDHHMM). Can I get results like I do with date faceting (July (30), August (54), etc.)?
As far as I know, Solr currently doesn't support range faceting, and even if it does in the future, text will not be recognized as an integer/long.
2) Is there any way to get the total count of facet results for a particular query in an efficient way?
Thanks,
Even though faceting by range isn't supported in Solr 1.4.x, you can use facet.query to facet by any arbitrary query. So you can build facet queries like facet.query=timestamp:[201006010000 TO 201006302359] for June, facet.query=timestamp:[201007010000 TO 201007312359] for July, etc. Use a copyField to query against a trie field for optimum performance.
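Assembled into a single request (the field name and month ranges just mirror the examples above), that could look like:
/solr/select?q=*:*&rows=0&facet=true&facet.query=timestamp:[201006010000 TO 201006302359]&facet.query=timestamp:[201007010000 TO 201007312359]
Each facet.query then comes back with its own count in the facet_queries section of the response.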
Add them up client-side.