I am very new to Solr. I am trying to add a large number of fields to the schema. I am using version 8.1, and it is my understanding that this should be done through the Schema API.
I am trying to upload all fields using curl, but I keep getting errors. It works fine through the web interface.
1. Where can I find the correct field types? I checked here, but I get error messages like "Field type 'StrField' not found". The values are also different from the ones I am presented with in the web interface.
2. Enum values: I found documentation, but following it also results in an unknown field error. For enums I don't see an option in the web interface.
curl -X POST -H 'Content-type:application/json' --data-binary '{"add-field":{"name":"TEST","type":"StrField","required":"true","stored":true,"indexed":"true"}}' http://localhost:8983/api/cores/tgec/schema
{
"responseHeader":{
"status":400,
"QTime":27},
"error":{
"metadata":[
"error-class","org.apache.solr.api.ApiBag$ExceptionWithErrObject",
"root-error-class","org.apache.solr.api.ApiBag$ExceptionWithErrObject"],
"details":[{
"add-field":{
"name":"TEST",
"type":"StrField",
"required":"true",
"stored":true,
"indexed":"true"},
"errorMessages":["Field 'TEST': Field type 'StrField' not found.\n"]}],
"msg":"error processing commands",
"code":400}}
There is a field type named "string", and its class is "solr.StrField".
It is defined in schema.xml as below:
<fieldType name="string" class="solr.StrField" sortMissingLast="true" docValues="true" />
Then, when you define a field, you reference that type by its name, "string", as below:
<field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
So you need to change "type":"StrField" to "type":"string" in your request.
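For reference, a sketch of the corrected request, keeping the same core (tgec) and field name as in the question:
curl -X POST -H 'Content-type:application/json' --data-binary '{"add-field":{"name":"TEST","type":"string","required":true,"stored":true,"indexed":true}}' http://localhost:8983/api/cores/tgec/schema
To answer question 1, the field type names that are actually defined can be listed through the Schema API as well, for example:
curl http://localhost:8983/solr/tgec/schema/fieldtypes
That list contains the names (such as "string") that the "type" attribute of add-field expects, rather than the underlying class names (such as "solr.StrField").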
I am testing a Solr query for geographical search. This is my query:
SolrQuery query = new SolrQuery();
query.setParam("q","*:*");
query.setParam("fq","geofilt");
query.setParam("d","100000");
query.setParam("pt","51.53750834,-0.19329616");
query.setParam("sfield","location_s");
I am getting no results, although there are points very near to, and even exactly at, the pt.
Any idea what the reason is?
Hint: I am using this field type for spatial search (the one that comes in schema.xml by default):
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
geo="true" distErrPct="0.025" maxDistErr="0.000009" units="degrees" />
I use that one because when I try the following one, as mentioned on the Solr website, I get an error:
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType" spatialContextFactory="com.spatial4j.core.context.jts.JtsSpatialContextFactory"
autoIndex="true"
distErrPct="0.025"
maxDistErr="0.000009"
units="degrees" />
And this is my field definition:
<field name="location_s" type="location_rpt" indexed="true" stored="true"/>
Thanks in advance!
Instead of that kind of field name, please add only one field that serves for both x/y:
<dynamicField name="*_coordinate" type="tdouble" indexed="true" stored="true"/>
That will allow you to put data directly into Solr in its geo format:
"env": "DEV",
"latlgn_0_coordinate": -2.6263,
"latlgn_1_coordinate": -44.1978,
Please take a look at both the Solr spatial search documentation and the Solr wiki page on spatial search.
As said above, please make sure to have JTS in your runtime classpath.
You might need to download and install the JTS library on your path: JTS Library
Solr Manual: Solr install documentation
The JTS jar file must be on Solr's classpath as well. Due to a
combination of things, JTS can't simply be referenced by a <lib/>
entry in solrconfig.xml; it needs to be in WEB-INF/lib in Solr's war
file, basically.
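In practice that means copying the JTS jar into the deployed webapp, for example as below; the Tomcat path and the JTS version are assumptions, so adjust them to your installation:
cp jts-1.13.jar $TOMCAT_HOME/webapps/solr/WEB-INF/lib/
# restart the servlet container afterwards so the jar is picked up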
enjoy :)
Thanks Jean, it worked for me without using the JTS library. The Solr schema already comes with this field type, which I used:
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
geo="true" distErrPct="0.025" maxDistErr="0.000009" units="degrees" />
I defined a field of that type and it worked. This is my query:
SolrQuery query = new SolrQuery();
query.set("q","*:*");
query.set("fq","{!geofilt}");
query.set("pt","32.014708,35.873725");
query.set("sfield","location");
query.set("d","100");
Thanks.
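Note that {!geofilt} only matches documents whose sfield value was indexed as a "lat,lon" string, so the documents themselves need to be added with such a value. A minimal sketch of doing that with curl (the core name gettingstarted and the id are placeholders, not from the question):
curl 'http://localhost:8983/solr/gettingstarted/update?commit=true' -H 'Content-type:application/json' --data-binary '[{"id":"doc1","location":"32.014708,35.873725"}]'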
I am trying to create a very simple search index on Solr 4.5.1 with just two fields, 'id' and 'name', by using a CSV file. Running on Windows 8.
When I run:
curl http://localhost:8983/solr/update/csv --data-binary @mydata.csv -H
"Content-type:text/plain; charset=utf-8"
I get the error: Document is missing mandatory uniqueKey field: id
When I copy/paste the content of the file into the CSV import function in the Solr admin UI (Documents -> Document Type: CSV), it works.
What am I missing? Thanks for any help!
My schema.xml:
<fields>
<field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
<!-- points to the root document of a block of nested documents -->
<field name="_root_" type="string" indexed="true" stored="false"/>
<field name="name" type="text_en" indexed="true" stored="true"/>
<field name="_version_" type="long" indexed="true" stored="true" multiValued="true"/>
</fields>
<uniqueKey>id</uniqueKey>
The simplest csv file I tried is:
id,name
LXOxjksM2z, The simplest cookbook you can ever find
I do not have Windows 8, but I tried with Windows 7. I installed 4.5.1 and ran your curl command on Windows/Cygwin from:
/cygdrive/c/Installs/solr-4.5.1/solr-4.5.1/example/exampledocs
$ curl http://localhost:8983/solr/update/csv --data-binary @test.csv -H "Content-type:text/plain; charset=utf-8"
<?xml version="1.0" encoding="UTF-8"?>
<response>
<lst name="responseHeader"><int name="status">0</int><int name="QTime">17</int></lst>
</response>
Where test.csv is:
id,name
LXOxjksM2z, The simplest cookbook you can ever find
Solr indexed it properly, and I was able to query from the admin UI for id:LXOxjksM2z and found 1 document:
"response": {
"numFound": 1,
"start": 0,
"docs": [
{
"id": "LXOxjksM2z",
"name": " The simplest cookbook you can ever finds",
"_version_": 1452541521661264000
}
]
}
}
Not sure what is wrong on Windows 8.
It's all about the file encoding; it only works with ANSI encoding.
Here is the same issue:
Apach Solr import index from csv(UTF-8) error: undefined field
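If the real cause is a UTF-8 byte order mark at the start of the file (which makes Solr see the first column header as something other than id), one possible workaround, sketched here for a Unix-like shell such as Cygwin and reusing the filename from the question, is to strip the BOM before posting:
sed -i '1s/^\xEF\xBB\xBF//' mydata.csv
curl http://localhost:8983/solr/update/csv --data-binary @mydata.csv -H "Content-type:text/plain; charset=utf-8"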
I am using Drupal 7 with the apachesolr module.
I have an external file field to boost the results I want. The name of the file is external_eff_ranking. In the schema, I have:
<fieldType name="pfloat" class="solr.FloatField" omitNorms="true"/>
<fieldType name="file" keyField="id" defVal="1" stored="false" indexed="false" class="solr.ExternalFileField" valType="pfloat"/>
<dynamicField name="eff_*" type="file"/>
The format of the external file is:
id1=3.1
id2=4.2
id3=5
This works as expected: the results are boosted according to the values in the file. The problem is that when the values change, the results do not reflect the changes. I understand that I need to commit the changes somehow, but I cannot figure out how.
I tried things like:
curl http://localhost:8983/solr/update?commit=true -H "Content-Type: text/xml" --data-binary '<commit />'
but it did not work.
SOLVED
The following line in my solrconfig.xml solved the problem:
<requestHandler name="/reloadCache" class="org.apache.solr.search.function.FileFloatSource$ReloadCacheRequestHandler" />
Then I hit this URL (http://localhost:port/reloadCache) after each file update.
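For example, that reload can be scripted with curl; the port and the /solr prefix below are assumptions based on a default single-core setup, not taken from the original URL:
curl "http://localhost:8983/solr/reloadCache"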
Looks like this is due to a bug in Solr that affects cached results. Maybe trying the reloadCache helps?
I have the following field type (notice: no filters, no tokenizers):
<fieldType name="text_names" class="solr.StrField" />
I create a field in my schema using that type:
<field name="exact_type" type="text_names" indexed="true" stored="true" />
Now I search: q=*:*&fq=exact_type:aa&fl=exact_type
I still get results which have something other than 'aa' in the exact_type field.
What am I missing here?
Also this behaves the same:
q=exact_type:aa&fl=exact_type
I don't think that "q=*:*" works with the DisMax handler, and I believe that is what you are using. The correct syntax for both queries should be:
q=&fq=exact_type:aa&fl=exact_type
fq=exact_type:aa&fl=exact_type
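For a quick check from the command line, the same two requests can be sent with curl; the host, port, and request path below are assumptions for a default single-core setup:
curl "http://localhost:8983/solr/select?q=&fq=exact_type:aa&fl=exact_type"
curl "http://localhost:8983/solr/select?fq=exact_type:aa&fl=exact_type"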
I am trying to index using a curl-based request.
The request is:
curl "http://localhost:8080/solr1/update/extract?literal.id=who.pdf&uprefix=attr_&fmap.content=attr_content&commit=true" -F "myfile=#/root/apache-solr-3.1.0/docs/who.pdf"
On submitting the request, I am getting this error:
HTTP Status 400 - ERROR: unknown field 'ignored_meta' (The request sent by the client was syntactically incorrect.) - Apache Tomcat/6.0.18
Your problem is due to the fact that the default handler for ExtractingRequestHandler defined in solrconfig.xml puts all of the extracted fields that Tika cannot identify into fields named 'ignored_XXXXX'.
To solve this, you can simply add a dynamic field named 'ignored_*' to your Solr schema, like this:
<dynamicField name="ignored_*" type="ignored"/>
Don't forget to also add the 'ignored' type if you removed it from the default configuration:
<fieldtype name="ignored" stored="false" indexed="false" multiValued="true" class="solr.StrField" />
This will stop your Solr from failing when Tika indexes fields that Solr doesn't know about.