I have the following data structure for creating an index:
user
    userid
    username
    userstatus
    friends
        friendid
        friendstatus
        friendcreateddate
I think dynamic fields won't work for me since I need to query based on specific field names.
I need to search based on friendstatus and friendcreateddate. Can someone advise me on the best possible document structure?
That is a very simple data structure. You just need to look at an example schema.xml and put your own field definitions in there. A field like "friends" would be declared as multiValued="true", and userid would be tagged as the <uniqueKey>.
Follow this guide: http://wiki.apache.org/solr/SchemaXml
and ignore complicated stuff like dynamic fields, which you probably don't need.
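A minimal sketch of the field section (the fieldType names assume the stock example schema's definitions; flattening the "friends" group into parallel multiValued fields is one way to apply the multiValued advice, since Solr documents are flat):

<field name="userid" type="string" indexed="true" stored="true" required="true"/>
<field name="username" type="text_general" indexed="true" stored="true"/>
<field name="userstatus" type="string" indexed="true" stored="true"/>
<field name="friendid" type="string" indexed="true" stored="true" multiValued="true"/>
<field name="friendstatus" type="string" indexed="true" stored="true" multiValued="true"/>
<field name="friendcreateddate" type="date" indexed="true" stored="true" multiValued="true"/>
<uniqueKey>userid</uniqueKey>

With something like that in place, queries such as friendstatus:active or a range query on friendcreateddate work directly against those fields.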
NOTE: there are a million Stack Overflow answers about inserting into lookup fields that talk about __c fields. They are NOT duplicates of this question.
So I'm trying to insert a value into a lookup field using the REST API.
If the field name looks like blah__c, it's easy: I just insert a {key, value} into blah__r. I do that all over the place.
But in this case my field is called PlanId. Trying to insert into PlanId__r just says there is no such field. How do I do it for a lookup field that does not end in __c?
You're talking about the insert/upsert by external id trick, right? Because a vanilla lookup is absolutely the same: just chuck the 15/18-character id in. blah__c = '001...', PlanId = '003...'
For a standard lookup field (without __c), most of the time the relationship name will be the same thing minus the "Id" part. So try Plan = { 'UniqueKey__c': '123' } or whatever your equivalent is.
(Shameless plug) Check my answer https://salesforce.stackexchange.com/a/274696/799 and look at how that insert of Asset references Contact and Product2 even though the actual fields are ContactId and Product2Id.
https://salesforce.stackexchange.com/a/23507/799 might be helpful too. Or just use Workbench etc. to describe your object.
There are a few standard objects where upsert by external id won't work. You might have a hard time adding Tasks to Accounts, for example, because the WhatId field is a mutant, a lookup to many tables. It has been a few years since I tried that, though, so maybe something has changed.
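A hedged sketch of what the REST request body could look like (the target object, the Plan relationship name, and the UniqueKey__c external id field are assumptions following the pattern above):

POST /services/data/vXX.X/sobjects/SomeObject
{
  "Name": "Example record",
  "Plan": { "UniqueKey__c": "123" }
}

If you are not resolving the parent by external id, the plain foreign key works the same way as any other field: "PlanId": "a0012300000AbCdEFG".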
Let's say you have a simple forms automation application, and you want to index every submitted form in a Solr collection. Let's also say that form content is open-ended so that the user can create custom fields on the form and so forth.
Since users can define custom forms, you can't really predefine fields in Solr, so we've been using Solr's "schemaless" (managed schema) mode. It works well, except for one problem.
Let's say a form comes through with a field called "ID" and a value of "9". If this is the first time Solr has seen a field called "ID", it dutifully updates its schema, and since the value of this field is numeric, Solr assigns the field one of its numeric data types (we see "plong" a lot).
Now, let's say that the next day, someone submits another instance of this same form, but in the ID field they type their name instead of entering a number. Solr rejects this and won't index the record because the schema says ID should be numeric, but on this record it's not.
The way we've been dealing with this so far is to trap the exception we get when a field's data type disagrees with the schema, and then we use the Solr API to alter the schema, making the field in question a text or string instead of a numeric.
Of course, when we do this, we need to reindex the entire collection since the schema changed, and so we need to persist all the original data just in case we need to re-index everything after one of these schema data-type collisions. We're big Solr fans, but at the same time, we wonder whether the benefits of using the search engine outweigh all this extra work that gets triggered if a user simply enters character data in a previously numeric field.
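The schema change itself is roughly this Schema API call (collection name and field options simplified here):

curl -X POST -H 'Content-type:application/json' \
  --data-binary '{"replace-field":{"name":"ID","type":"text_general","stored":true}}' \
  http://localhost:8983/solr/mycollection/schema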
Is there a way to just have Solr always assign something like "text_general" to every field, or is there a better approach?
I would say that you might need to handle the ID values at your application end.
It would be good to add validation for ID, so that ID is always either a string or a numeric value.
This would resolve your issue permanently. Once the type is decided, you don't have to do anything on the Solr side.
The alternative approach would be to have a fixed schema.xml.
In it, add a field Id with a fixed fieldType.
I would suggest going with string as the fieldType for ID if you don't want Solr to tokenize the data and you want exact matches in search.
If you would like more flexibility when searching on the Id field, you can use the text_general field type instead.
You can also create your own fieldType, with a tokenizer and filters chosen according to your requirements for the Id field.
Also, don't use schemaless mode in production. You can also map your other field names to a dynamic field definition: create a dynamic field such as *_t for text fields, and all fields ending with _t will be mapped to it.
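A rough schema.xml sketch along those lines (the field and type names here are assumptions; string and text_general are the stock example-schema types):

<!-- fixed ID field, stored as a string so both "9" and "John Smith" are accepted -->
<field name="ID" type="string" indexed="true" stored="true"/>
<!-- catch-all dynamic rule for other user-defined text fields -->
<dynamicField name="*_t" type="text_general" indexed="true" stored="true"/>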
I am working on an eCommerce web application developed using .NET MVC. I use Solr to index product details, so I have defined the product-related fields in my Solr schema file.
Now I also want to index search terms in Solr. How can I manage my schema file to store/index SearchTerm when my schema file is product-specific?
Can anyone please advise?
You can have a separate core for this and define a new schema.xml for it. Or, if you want to keep using the existing schema.xml, you can make use of dynamic fields, so you won't have to bother with schema changes whenever you need to add another field in the future.
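For example, with dynamic field rules like *_t (and *_i for integers) in the existing schema, a search-term document can be indexed alongside the products without any schema change (a sketch; the field names here are made up):

{ "id": "searchterm-001", "searchterm_t": "red running shoes", "searchcount_i": 42 }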
You can use Dynamic fields.
Dynamic fields allow Solr to index fields that you did not explicitly define in your schema.
This is useful if you discover you have forgotten to define one or more fields. Dynamic fields can make your application less brittle by providing some flexibility in the documents you can add to Solr.
A dynamic field is just like a regular field except it has a name with a wildcard in it. When you are indexing documents, a field that does not match any explicitly defined fields can be matched with a dynamic field.
For example, suppose your schema includes a dynamic field with a name of *_i.
If you attempt to index a document with a cost_i field, but no explicit cost_i field is defined in the schema, then the cost_i field will have the field type and analysis defined for *_i.
Like regular fields, dynamic fields have a name, a field type, and options.
<dynamicField name="*_i" type="int" indexed="true" stored="true"/>
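So a document like the following (made-up values) would be accepted even though cost_i is never declared explicitly, because it picks up the type and analysis of *_i:

{ "id": "doc1", "name": "Widget", "cost_i": 250 }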
What would be the most natural way to add comments to documents in Solr? I would like to add comments with a user_id, a datetime, and the actual comment text.
Thanks a lot
Depending on your needs, if you want to query on the comments (sort of maintaining the 1 document -> N comments relationship in a more DB-like way), you might want to use a block join too. Be aware of its limitations, though.
If you just want to load them together with the document: stringify the array of comment objects and store it in an additional field.
If you also want to search the comments, you have to split the comment fields up and store the comment text in a searchable multiValued field.
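A sketch combining both ideas (field names are assumptions; comments_json would be a stored string field holding the stringified array, comment_text a searchable multiValued field):

{
  "id": "doc-1",
  "title": "Some document",
  "comments_json": "[{\"user_id\": 42, \"datetime\": \"2020-01-01T10:00:00Z\", \"comment\": \"Nice post\"}]",
  "comment_text": ["Nice post"]
}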
I've been using cakephp for a while, but have not learned all the ins and outs yet so I may be missing something simple. Or the problem may lie with my database structure. Either way, if anyone has any idea of what I'm doing wrong, please share.
Is there a way to order the data returned by cakephp's find using values stored in another table?
I am creating custom form fields on a per category basis, so when I choose a particular category to post in, custom fields will be added to my form. I have 3 tables: Posts, Fields, and Answers. The Posts table stores the basic static information for the post, such as id, category_id, title, and description. The Fields table stores the custom field data, such as category_id, field_label, field size, etc. The Answers table stores the values that are entered for particular fields, such as post_id, field_id, value.
I am trying to display the posts for a particular category and create HTML table headers on the fly from select fields (set by a column toggle in the fields table), and also select the answers associated with each particular field and post.
I am able to select all the data I want, and paginate everything just fine, but what I can't seem to figure out is how to order the data using one of the dynamic column values. For example, if I have year, make, and model as 3 custom fields, I would like to click the year column to sort my results by the year values, and if I click the make column, I would like to sort my results by the make values, etc.
I know how to order the results by a particular field inside the posts table, such as id or title, but is it possible to order using the custom fields? Am I setting up the database and/or something else wrong? If not, is there a particular CakePHP method or SQL command that I need to use in order to sort by the custom fields? I'm not really well versed in complex SQL.
Thanks.
I'd suggest you pass the field name and sort direction in the URL (as GET/named parameters). So when you build your table header link, form it so that it links to a URL like so:
http://somesite.com/pages/index/sort:customfield1/dir:asc
Then, when you're grabbing the data from the DB with your find() call, use those named parameters to build the order option that gets passed to find().
You'll need to determine a default sorting column and direction. Maybe have that be selectable with a boolean field in the schema -- if there are no parameters sent to the action above, pull the field from your other table that has default set to true in the record.
To clarify: when a user visits a given action, first you'll pull the custom fields from the other table. Then using those fields (either the default as mentioned above, or the named params passed in the URL) form the query for the actual data, using the order parameter.
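A rough sketch of the controller side (CakePHP 1.x style; the model, alias, and parameter names are assumptions based on the tables in the question, and the sort named parameter is assumed to carry the custom field's id):

// in PostsController, assuming Field and Post are available via $uses
function index($categoryId = null) {
    // 1. Pull the custom field definitions for this category (used for the table headers)
    $fields = $this->Field->find('all', array(
        'conditions' => array('Field.category_id' => $categoryId)
    ));

    // 2. Work out the sort column: named params if present, otherwise a default
    $sortFieldId = isset($this->params['named']['sort']) ? (int)$this->params['named']['sort'] : null;
    $dir = (isset($this->params['named']['dir']) && $this->params['named']['dir'] === 'desc') ? 'desc' : 'asc';

    $this->paginate = array('conditions' => array('Post.category_id' => $categoryId));

    if ($sortFieldId) {
        // Join the answers for the chosen custom field and order by their value
        $this->paginate['joins'] = array(array(
            'table' => 'answers',
            'alias' => 'SortAnswer',
            'type' => 'LEFT',
            'conditions' => array('SortAnswer.post_id = Post.id', 'SortAnswer.field_id = ' . $sortFieldId)
        ));
        $this->paginate['order'] = array('SortAnswer.value' => $dir);
    } else {
        $this->paginate['order'] = array('Post.id' => 'asc');
    }

    $posts = $this->paginate('Post');
    $this->set(compact('posts', 'fields'));
}

The year/make/model header links would then point at .../index/sort:<field_id>/dir:asc (or desc), and the same action handles all of them.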