Solrj ClientUtils toQueryString escapes facet.pivot field comma - solr

The toQueryString method inside SolrJ's ClientUtils class is called when the HTTP request is formed internally. But in this process it also encodes the commas (,) that need to be sent in the facet.pivot field.
e.g. facet.pivot=A1,A2 gets sent as facet.pivot=A1%2CA2
Because of this, the query returns no results.
Please suggest a way to report this, or any workaround.

Your question is about escaping/encoding the query for a Solr request.
In the current version of Solr the toQueryString method has moved to SolrParams. But nevertheless, "%2C" is the correct URL encoding for ",".
So most likely you have a problem on the server side with decoding the params.
Try a current version of Solr; then you don't need to configure the servlet container yourself, because it now ships as part of Solr.
BTW: take a look at subfacets instead of pivot faceting: http://yonik.com/solr-subfacets/
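For illustration, a minimal SolrJ sketch (solrClient is assumed to be an HttpSolrClient for your collection, and the field names are made up) showing that the pivot fields are passed as ordinary parameters and the %2C encoding is decoded transparently by the server:
SolrQuery query = new SolrQuery("*:*");
query.setFacet(true);
query.add("facet.pivot", "A1,A2");        // sent as A1%2CA2 on the wire, decoded back to "A1,A2" by Solr
QueryResponse rsp = solrClient.query(query);
System.out.println(rsp.getFacetPivot());  // pivot results keyed by "A1,A2"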

Related

extjs 4.2 - entity encoding with backslash e.g. '\1014'

The value "\1014" is coming from my database and I want to display it in an ExtJS Panel.
The problem is that it gets processed as an escape sequence, and "A4" is displayed instead.
I don't want to have to do entity encoding on the back end.
I tried
Ext.util.Format.htmlEncode('\1014')
But this also returns "A4".
What is the correct way to encode such values on the front-end for display?
This has nothing to do with ExtJS. This is a built-in feature of JavaScript and JSON. If you want to send the text \101 (rather than the escape sequence) as JSON to the frontend, you have to escape the backslash according to the spec in the backend:
{"success":true,"data":{"test":"\\101","id":"extModel2-1"}}
If you don't escape the backslash, the escape sequence is resolved the moment it hits the frontend (\101 is the octal escape for the letter A) and is then indistinguishable from a real A, so this is not reversible on the frontend.
Relevant fiddle
Relevant older answer
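For illustration only, a minimal backend-side sketch, assuming a Java backend with Jackson (any spec-compliant JSON serializer escapes the same way); the class and field names are made up:
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Collections;

public class BackslashDemo {
    public static void main(String[] args) throws Exception {
        // The Java string "\\1014" holds the literal five characters \1014
        String json = new ObjectMapper()
                .writeValueAsString(Collections.singletonMap("test", "\\1014"));
        // Prints {"test":"\\1014"} - the backslash is escaped per the JSON spec,
        // so the frontend keeps the literal text \1014 instead of decoding it to "A4"
        System.out.println(json);
    }
}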
You can parse the data using JSON.parse(response.responseText) instead of Ext.decode.

Apache Camel - Mybatis select with parameters and useIterator

I'm trying to use Apache Camel (version 2.20.0) with the mybatis component.
More specifically, I have to export a large set of records from the database to a file.
I'd like to prevent memory issues so I want to use the option consumer.useIterator. My route is:
from("mybatis:selectItemsByDate?statementType=SelectList&consumer.useIterator=true")
.split()
.body()
.process(doSomething)
.to("file:my-path-file");
But my query takes an input parameter (the starting date for the data). How should I set this parameter?
In many examples on the internet I saw the parameter set in the body or in the header of the Exchange message, but I think that is possible only if the mybatis endpoint is in a "to" method. And the option "consumer.useIterator" works only when the endpoint is in a "from" method.
Please help me understand how I can set the input for my query, or whether this is not supported yet (in that case, any hint on how to implement it would be great).
Thank you.
Then you need to start your route from something else, such as a timer or direct endpoint, and then call the mybatis endpoint in a "to", where you have set the information the mybatis query uses in the message body/header so it's dynamic.
Also you should set the splitter to be in streaming mode so it walks the iterator it gets from mybatis on-demand.
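A rough sketch of such a route (Camel Java DSL inside a RouteBuilder; the timer endpoint, the date value, doSomething and the file URI are placeholders taken from the question, and the exact parameter shape depends on what the MyBatis statement expects):
from("timer:exportItems?repeatCount=1")
    .setBody(constant("2017-10-01"))                           // the starting date the statement needs
    .to("mybatis:selectItemsByDate?statementType=SelectList")
    .split(body()).streaming()                                 // walk the result on demand
        .process(doSomething)
        .to("file:my-path-file")
    .end();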

Adding raw query parameters via Criteria API

I could not find an answer to this; the previous similar question went unanswered. I'd like to use Spring Data Solr for queries, but @Query is insufficient for my needs. As I understand it, whatever you give there becomes the q parameter to Solr's select handler.
In my case I need to add more parameters, for example sfield for a spatial search. If @Query won't cut it, I am ready to write a custom repository implementation by autowiring SolrTemplate, but the Criteria API does not seem to let me add a raw query parameter either.
Any help/pointers will be greatly appreciated.
I worked around it by creating a QueryParser decorator that adds the required parameters to the parsed Solr query. The QueryParser was registered using solrTemplate.registerQueryParser().
Note however that I had to do a really nasty hack to get this working, since all queries sent to solrTemplate.queryForPage are wrapped by a static package-protected inner class in QueryBase. So my registration code above had to live in the package org.springframework.data.solr.core.
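For reference, the raw-parameter part itself is plain SolrJ; a minimal sketch of what the decorating parser ends up adding to the SolrQuery it gets from the default parser (the spatial values here are made-up examples, and the exact QueryParser interface and registration hook differ between Spring Data Solr versions):
// In the decorator, solrQuery would be the query returned by the wrapped default parser.
SolrQuery solrQuery = new SolrQuery("*:*");
solrQuery.setParam("sfield", "location");      // spatial field used by geofilt/geodist
solrQuery.setParam("pt", "52.52,13.40");       // reference point
solrQuery.setParam("fq", "{!geofilt d=10}");   // distance filter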

Using PreAnalyzedField in Solr 4

I am trying to index fields in Solr for which I already have a TokenStream. I don't want Solr to do any analysis; it has already been done. As I understand it, I can get exactly this functionality using Solr's PreAnalyzedField.
The problem is that I cannot find any good resource to help me understand the flow:
I need to define the field in the schema.xml file as a PreAnalyzedField, and the token stream should be parsed using the parse method of the parser implementation. But how do I actually feed the field with my TokenStream? How/when exactly is it sent to the toFormattedString method?
I think PreAnalyzedField is still bleeding edge in Solr as of 4.0/4.1. The main documentation is on the wiki and basically explains the two parser types. The default is JSON; I am not sure how to get the other type to work.
Once you have that type defined, you just supply the fully-tokenized content, in the JSON format described there, as that field's value. When that hits the parser, it is converted back into the token stream, the same way a number gets parsed from its string representation into a real numeric value. Try feeding an unparsable value and you will see the full call stack in the exception.
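For example, a hedged SolrJ sketch of feeding a pre-analyzed value as an ordinary field value (the field names and token values are made up, solrServer is assumed to be an HttpSolrServer for the core, and the JSON keys v/str/tokens/t/s/e/i follow my reading of the JsonPreAnalyzedParser format on the wiki):
SolrInputDocument doc = new SolrInputDocument();
doc.addField("id", "1");
// The pre-analyzed JSON is simply the string value of the field; Solr's parser turns it back into tokens.
doc.addField("content_pre",                      // a field whose type is solr.PreAnalyzedField
    "{\"v\":\"1\",\"str\":\"hello world\","
  + "\"tokens\":[{\"t\":\"hello\",\"s\":0,\"e\":5,\"i\":1},"
  + "{\"t\":\"world\",\"s\":6,\"e\":11,\"i\":1}]}");
solrServer.add(doc);
solrServer.commit();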
The problem is how to query it. My own discussion on the mailing list did not get very far.

Create SOLR index by parsing "content" via GET (URL)

Is it possible to create a Solr index by passing the "content" to index via GET (URL)?
The examples I found allowed this only by reading the content directly from a file.
Thanks in advance!
Have a look at the Solr Data Import Handler, focusing on the XML/HTTP datasource. If I understand what you are asking correctly, this should be able to meet your needs.
You might be able to use curl to do that. The request should most probably be something like (untested):
curl 'http://localhost:8983/solr/update?stream.body=<add><doc><field name="key1">val1</field><field name="key2">val2</field></doc></add>'
where key1 and key2 are the fields and val1 and val2 are their corresponding values (URL-encode the XML as needed). Do note though that you will also need to send a commit, e.g. by appending &commit=true, after sending the above request.
You can also look at the following URL:
http://wiki.apache.org/solr/UpdateXmlMessages
Search for the phrases 'Updating via GET' and 'Updating a Data Record via curl'
