How to retrieve custom properties of Solr cores

I'm a bit new to Solr 7.0.
With Solr CoreAdmin APIs it is possible to create new cores with custom properties:
solr/admin/cores?action=CREATE&name=mycore&configSet=myconfigset&property.version=1.2.3
The command above creates a new core with a core.properties file containing the specified custom property "version" with value "1.2.3".
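For reference, the core.properties written by that call would contain something along these lines (the exact contents vary slightly by version):
name=mycore
configSet=myconfigset
version=1.2.3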
The defined custom properties can be used as replacements in Solr configuration files, but I have not been able to retrieve (and possibly update) a specific custom property using the Solr CoreAdmin APIs.
How is it possible to retrieve or update a specific core custom property?
Thanks a lot

To create a user-defined property, you can use the command below.
curl http://localhost:8983/solr/<core-name>/config -H'Content-type:application/json' -d '{
"set-user-property" : {"custom_property":"some_value"}}'
And, to retrieve it:
curl http://localhost:8983/solr/<core-name>/config/overlay?omitHeader=true
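The user-defined properties show up under userProps in the overlay; with omitHeader=true the response should look roughly like this:
{
  "overlay": {
    "znodeVersion": 0,
    "userProps": {
      "custom_property": "some_value"
    }}}
To change a value you can simply call set-user-property again with the new value; the Config API also has an unset-user-property command to remove one, for example:
curl http://localhost:8983/solr/<core-name>/config -H'Content-type:application/json' -d '{
"unset-user-property" : "custom_property"}'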
These properties will be removed once you restart the server, though, so I would suggest adding them to the core.properties file instead.
For more information: https://lucene.apache.org/solr/guide/6_6/config-api.html#ConfigAPI-CreatingandUpdatingUser-DefinedProperties

Related

Can I query a new handler not in solr config?

I am using Solr 5.4.0.
I want to create a new handler in Solr, say "X". This handler is not defined in solr config, but can I define it at run time and include it in a query using the qt parameter?
In the same way that we can override the bq, qf, etc. fields for an already existing handler in solr config, is there support for creating a new handler while issuing the Solr query as well?
I do not remember being able to create additional request handlers via an API in Solr 5.4. You may be able to modify or XInclude a file on the filesystem and reload the core, but that's a bit hacky.
In the latest versions of Solr, you do have:
the Config API, which lets you override parts of solrconfig.xml, and
the Request Parameters API, which lets you define parameter sets that you can apply with a useParams configuration or as a query parameter in the URL (see the sketch below).
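A minimal sketch of the Request Parameters API (core and parameter-set names are made up here): first define a parameter set on the core, then reference it from a query.
curl http://localhost:8983/solr/mycore/config/params -H 'Content-type:application/json' -d '{
  "set": {
    "myHandlerParams": {
      "defType": "edismax",
      "qf": "title^2 body"
    }}}'
curl "http://localhost:8983/solr/mycore/select?q=solr&useParams=myHandlerParams"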

Spring SAML Extension - Programmatically setting entityBaseURL

I am using the Spring SAML extension with WSO2 IS as the IdP. Currently I set the entityBaseURL property for the MetadataGenerator inside the Spring XML config. For now, this works fine going against a single server since the entityBaseURL matches the servername. Since I have several environments (dev, test, and UAT) I need to programmatically set the entityBaseURL because each environment has a different server name and that servername won't match the entityBaseURL prop. It is undesirable to rebuild the WAR artifact for every environment. We keep our config for each environment in a database. So settings and properties specific to a particular stack of machines can be read at runtime. I would like to read the servername for the entityBaseURL property from our DB and set it programmatically. Should I replace the MetadataGenerator with my own class? It is unclear to me where the entityBaseURL property is initialized.
I have found a workable path to solve this. I ended up extending the MetadataGeneratorFilter class and overriding the getDefaultBaseURL method. The default implementation of getDefaultBaseURL computes the value from properties found in the HTTP request; I changed this behavior to do a DB lookup and return the value stored in the database. I could be short-sighted here, but this does work. I was able to verify that the AssertionConsumerServiceURL attribute of the SAML AuthnRequest is getting set properly. The generated metadata is also correct.
Note: the entityBaseURL property can still be set manually in the Spring config. If it is then the value returned from the getDefaultBaseURL method is not used.

How to back up the solrconfig file from a running Solr

I have a single-core Solr server. While Solr was running, the solrconfig.xml and schema.xml files of one collection were replaced by mistake.
The collection currently works correctly and responds to requests, but the valid files in the conf folder have been overwritten with the wrong ones. Surely, if I reload the collection, the bad files will be loaded and my collection will no longer work correctly.
Is there a way to get solrconfig.xml & schema.xml from the running collection, without relying on the solrconfig.xml and schema.xml files that currently exist in the conf folder?
You can read the currently running schema and config through the Solr Schema API and Config API.
Pay attention: the result of these APIs is not the original schema.xml or solrconfig.xml file, but you can rebuild the originals from it.
Also be aware that the Config API is only available in recent versions of Solr.
In older versions (I have tested 4.8.1) there is no API for the Solr configuration, so there is no way to fully rebuild the solrconfig.xml file.
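For example, on a recent Solr version you can dump both from a running core with something like the following (the core name is a placeholder):
curl http://localhost:8983/solr/<core-name>/schema
curl http://localhost:8983/solr/<core-name>/config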
You can retrieve the loaded configuration files using the Solr Administration User Interface:
Go to http://<hostname>:<port>/solr.
Select your core in the dropdown menu in the left pane.
A menu appears below the selected core; select Files.
Load the file you want.
Or you can go straight to http://<hostname>/solr/#/<corename>/files?file=<filename>
See https://cwiki.apache.org/confluence/display/solr/Files+Screen
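The Files screen fetches these files through Solr's file request handler, so a direct request like the following may also work (hostname, port and core name are placeholders):
curl "http://<hostname>:<port>/solr/<corename>/admin/file?file=solrconfig.xml&contentType=text/xml"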
Solr versions prior to 4.x show a slightly different interface; if I remember correctly there is no core dropdown, and solrconfig.xml & schema.xml appear right in the left pane.
On SolrCloud there is an additional dropdown list showing all collections in a given cluster, but you get the idea.
Note: the Solr Admin UI shows you parsed files, so if you ever had to escape special characters, for example in a filter's regex that uses a <, you would have to re-escape it to &lt; once you get the file back in order to prevent a parse error.

Solr-Collection or Core not reloading schema.xml

I am using Solr 4.6 and changed something inside schema.xml. In order to update schema.xml inside my core I used zkcli, which works fine, and I am able to see the modified schema.xml inside the Solr Admin GUI under cloud\tree\config\foobar\schema.xml.
But after calling
http://localhost:8983/solr/admin/collections?action=RELOAD&name=foobar and
http://localhost:8983/solr/admin/cores?action=RELOAD&name=foobar,
the old schema.xml was still in the core named foobar.
Your 2nd HTTP request to the Core API is wrong. Change name to core:
http://localhost:8983/solr/admin/cores?action=RELOAD&name=foobar should be
http://localhost:8983/solr/admin/cores?action=RELOAD&core=foobar.
http://archive.apache.org/dist/lucene/solr/ref-guide/apache-solr-ref-guide-4.6.pdf (page 277)
RELOAD
The RELOAD action loads a new core from the configuration of an existing, registered Solr core. While the new core is initializing, the existing one will continue to handle requests. When the new Solr core is ready, it takes over and the old core is unloaded.
This is useful when you've made changes to a Solr core's configuration on disk, such as adding new field definitions. Calling the RELOAD action lets you apply the new configuration without having to restart the Web container. However the Core Container does not persist the SolrCloud solr.xml parameters, such as solr/@zkHost and solr/cores/@hostPort, which are ignored.
http://localhost:8983/solr/admin/cores?action=RELOAD&core=core0
The RELOAD action accepts a single parameter, core, which is the name of the core to be reloaded.
see also https://cwiki.apache.org/confluence/display/solr/CoreAdmin+API#CoreAdminAPI-RELOAD
You have to reload your cores after giving them a new schema.
Replace name with core in your query, as in:
/solr/admin/cores?action=RELOAD&core=yourcorename
For example
http://localhost:8983/solr/admin/cores?action=RELOAD&core=foobar
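If you want to double-check that the reload picked up the change, the read-only Schema API in Solr 4.6 should let you inspect the currently loaded field definitions, for example:
curl "http://localhost:8983/solr/foobar/schema/fields"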

Solr Cluster + DataImportHandler: can I have autogenerated id?

I'm using Solr 4.3. I've created 4 shards. I configured a UniqueKey autogenerated field as described here:
http://wiki.apache.org/solr/UniqueKey
It works fine if I use the actual update handler to insert documents (i.e. if I make an HTTP POST to /update with some JSON data, the unique key is autogenerated for each document).
If however I use the DataImportHandler to pull some documents from database, they are not added to the index, instead I see a warning in the Solr log saying that "mandatory id field is missing".
I know the DataImportHandler doesn't go through the UpdateHandler to add documents, but I was hoping this feature would work for DIH as well...
So my question is: does anybody know how to make work the id autogeneration for a Solr 4.3 cluster when using the DataImportHandler to insert documents?
Well, the solution I ended up using was this:
I created a custom transformer in Java (actually I was already using one; I find it's faster than writing transformers in JavaScript, the other option Solr offers).
Inside the transformer I pretty much do what the UUIDUpdateProcessorFactory does:
@Override
public Object transformRow(Map<String, Object> row, Context context) {
    // generate a random UUID for the mandatory unique key, like UUIDUpdateProcessorFactory would
    row.put("id", UUID.randomUUID().toString());
    return row;
}
I then removed the <updateRequestProcessorChain name="uuid"> tag from my solrconfig.xml, and only left the schema.xml configuration as per the link in the question.
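For completeness, here is roughly how such a transformer is referenced from DIH's data-config.xml (the class name, entity name and query below are made up, and the dataSource declaration is omitted):
<dataConfig>
  <document>
    <!-- the custom transformer adds the "id" field to every row it emits -->
    <entity name="item" query="select * from item"
            transformer="com.example.UuidTransformer" />
  </document>
</dataConfig>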
