Spring Data MongoDB default type for inheritance

My model originally looked like this:
class Aggregate {
    private SomeClassWithFields property;
}
Now I decided to introduce inheritance to SomeClassWithFields. This results in:
class Aggregate {
    private AbstractBaseClass property;
}
The collection already contains a lot of documents. These documents do not contain a _class property in the database, since they were stored before the inheritance was introduced.
Is there a way to tell Spring Data MongoDB to use SomeClassWithFields as the default implementation of AbstractBaseClass when no _class property is present?
The other option would be to add the _class property to all existing documents with a script, but that would take some time since we have a lot of documents.

I solved it by using an AbstractMongoEventListener.
The AbstractMongoEventListener has an onAfterLoad method, which I used to set the default _class value when none was present. :) This method is called by Spring before the DBObject is mapped to my domain model, so the default is in place by the time the mapping runs.
Do note that I also needed to tell Spring Data MongoDB the mappingBasePackage so that it is able to read an Aggregate before writing one. This can be done by implementing the getMappingBasePackage method of the PreconfiguredAbstractMongoConfiguration class.
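For reference, a minimal sketch of such a listener might look roughly like this (written against the Spring Data MongoDB 1.x event API; the exact listener signature can differ between versions, and the class and field names are taken from the question):
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.stereotype.Component;

@Component
public class DefaultPropertyTypeListener extends AbstractMongoEventListener<Aggregate> {

    @Override
    public void onAfterLoad(AfterLoadEvent<Aggregate> event) {
        DBObject source = event.getDBObject();
        DBObject property = (DBObject) source.get("property");
        // Documents written before the inheritance was introduced carry no
        // _class hint on the embedded object, so fall back to the old type.
        if (property != null && !property.containsField("_class")) {
            property.put("_class", SomeClassWithFields.class.getName());
        }
    }
}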

Related

Hybris custom facet sort provider not working

I made an implementation where I created a custom Facet Value Sort Provider and a custom Facet Top Values Provider.
I assigned them to one of my Solr Indexed Properties. I also changed the Facet Sort type to Custom.
It worked just fine in my local environment and in one of our test environments as well. But in our QA environment only the top values provider is working; the facet sort that actually gets applied is based on the facet result count.
I also noticed after this implementation that no matter which Facet sort I select there, it insists on applying the sort by count.
Do you guys have any idea how to make my custom sort work there? Is there maybe a Solr XML file that I must change?
After selecting "custom" for SolrIndexedPropertyFacetSort and setting the customFacetSortProvider field to your custom bean, you need to make sure your bean implements FacetSortProvider and overrides the comparator method:
@Override
public Comparator<FacetValue> getComparatorForTypeAndProperty(final IndexedType indexedType, final IndexedProperty indexedProperty)
{
    // Return the comparator that defines your custom facet value order
    return null;
}
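For illustration, a comparator that orders facet values alphabetically could look roughly like this (a sketch only; the Hybris imports for FacetSortProvider, IndexedType, IndexedProperty and FacetValue are omitted because their package names vary between platform versions, and it assumes FacetValue exposes a getName() accessor):
import java.util.Comparator;

public class AlphabeticalFacetSortProvider implements FacetSortProvider
{
    @Override
    public Comparator<FacetValue> getComparatorForTypeAndProperty(final IndexedType indexedType,
            final IndexedProperty indexedProperty)
    {
        // Order facet values by name instead of by result count.
        return Comparator.comparing(FacetValue::getName, String.CASE_INSENSITIVE_ORDER);
    }
}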
It worked after changing legacyMode to true in SolrSearchConfig.
It was the only difference between the environments.

How to specify Solr Cloud collection dynamically in Spring Data Solr 2.0.1?

We are trying to implement a two-dimensional Solr Cloud cluster where the first dimension is the collection and the second is the shard. The collection should be determined at runtime based on document properties.
I can see that this functionality is supported by SolrJ: CloudSolrClient has appropriate methods that accept a collection name, such as add(String collection, SolrInputDocument doc), so I registered a @Bean CloudSolrClient("zookeeper.host"). But apparently that isn't enough, because the methods in SolrTemplate, which Spring Data Solr uses, don't accept a collection name.
Since SolrTemplate uses SolrClient under the hood, I tried to work around this problem by extending SolrTemplate and overriding the saveBean and saveBeans methods to delegate to CloudSolrClient#add(String collection, SolrInputDocument doc) and CloudSolrClient#add(String collection, Collection<SolrInputDocument> docs). It worked fine until I needed to do the same for queries. SolrTemplate#executeSolrQuery is package-private and final, so I can't override it. And here I'm stuck!
To summarise my question: is there a way to specify a collection name in spring data solr in runtime?
I would greatly appreciate any help!
Regards,
Eugeny
My problem was a bit different, but I also had a problem with the collection name in queries, and in my case adding @SolrDocument(solrCoreName = "core_to_which_model_class_belong") to the model class solved the problem.
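For reference, that annotation sits on the domain class, so it pins the collection per entity type rather than choosing it at runtime; a rough sketch of what it looks like (the class and field names here are made up for illustration):
import org.springframework.data.annotation.Id;
import org.springframework.data.solr.core.mapping.SolrDocument;

@SolrDocument(solrCoreName = "core_to_which_model_class_belong")
public class Product {

    @Id
    private String id;

    private String name;

    // getters and setters omitted
}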

Spring Data MongoDB - model classes without annotations

This question is related to Spring Data MongoDB model classes without annotations.
I have a situation where I need to store my domain classes in either an RDBMS store or a NoSQL store. Say, for example, my domain classes are User, Feature, PaymentRequest, Order, OrderLine, OrderHeader, etc.
I cannot use any annotations on my domain classes, for various reasons.
The application team will specify in which persistence store they would like to store the data. They might configure it to store in MongoDB, MySQL, Oracle, etc.
My requirement is that when I am storing in MongoDB, say using spring-data-mongodb, I want to leverage DBRefs for associated objects in my domain object.
How can I achieve this with spring-data-mongodb without using annotations in my model classes?
class Role {
    String id;
    String roleName;
}

class User {
    String id;
    String firstName;
    String lastName;
    List<Role> userRoles;
}
When I save a User object, I want to ensure that in MongoDB the Role objects are stored as DBRefs instead of the actual Role object graph.
My question is: how can I achieve this without using annotations in my User and Role classes?
I searched the user forums and could not find a way, which is why I'm posting my question here.
Thanks,
Kishore Veleti A.V.K.
Not sure if you ever figured this out, but you can use AspectJ to create an ITD (inter-type declaration) to weave the annotations into the class without having to modify the original code.
For example, to turn your userRoles into a DBRef, you just need this aspect:
import org.springframework.data.mongodb.core.mapping.DBRef;

privileged aspect User_Mongo {
    declare @field: * User.userRoles : @DBRef;
}
This simply adds the @DBRef annotation to any field within User named userRoles. You can look at the AspectJ documentation for more information on field patterns and ITDs.
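The effect is the same as if the field had been annotated directly, which the question's constraints rule out; the equivalent annotated class is shown here only to make clear what the weaving produces:
import java.util.List;
import org.springframework.data.mongodb.core.mapping.DBRef;

public class User {
    String id;
    String firstName;
    String lastName;

    // Same effect as the ITD above: Role objects are stored as DBRefs
    // rather than embedded in the User document.
    @DBRef
    List<Role> userRoles;
}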

Do Fluent conventions break lazy loading? (uNhAddIns)

I have a simple entity class in a WPF application that essentially looks like this:
public class Customer : MyBaseEntityClass
{
    private IList<Order> _Orders;
    public virtual IList<Order> Orders
    {
        get { return this._Orders; }
        set { this._Orders = new ObservableCollection<Order>(value); }
    }
}
I'm also using the Fluent automapper in an offline utility to create an NHibernate config file which is then loaded at runtime. This all works fine but there's an obvious performance hit due to the fact that I'm not passing the original collection back to NHibernate, so I'm trying to add a convention to get NHibernate to create the collection for me:
public class ObservableListConvention : ICollectionConvention
{
    public void Apply(ICollectionInstance instance)
    {
        Type collectionType =
            typeof(uNhAddIns.WPF.Collections.Types.ObservableListType<>)
                .MakeGenericType(instance.ChildType);
        instance.CollectionType(collectionType);
    }
}
As you can see, I'm using one of the uNhAddIns collections, which I understand is supposed to provide support for both the convention and INotification changes, but for some reason doing this seems to break lazy-loading. If I load a customer record like this...
var result = this.Session.Get<Customer>(id);
...then the Orders field does get assigned an instance of type PersistentObservableGenericList but its EntityId and EntityName fields are null, and attempting to expand the orders results in the dreaded "illegal access to loading collection" message.
Can anyone tell me what I'm doing wrong and/or what I need to do to get this to work? Am I correct in assuming that the original proxy object (which normally contains the Customer ID needed to lazy-load the Orders member) is being replaced by the uNhAddIns collection item, which isn't tracking the correct object?
UPDATE: I have created a test project demonstrating this issue. It doesn't reference the uNhAddIns project directly, but the collection classes have been added manually. It should be pretty straightforward how it works: basically it creates a database from the domain, adds a record with a child list, and then tries to load it back in another session, using the collection class as the implementation for the child list. An assert is thrown because lazy-loading fails.
I FINALLY figured out the answer to this myself... the problem was due to my use of ObservableListType. In NHibernate semantics a list is an ordered collection of entities; if you want to use something for an IList, then you want an unordered collection, i.e. a Bag.
The Eureka moment for me came after reading the answer to another StackOverflow question about this topic.

DRY unique objects in Django

I want to ensure an object is unique, and to throw an error when a user tries to save it (e.g. via the admin) if it is not. By unique, I mean that some of the object's attributes might hold the same values as those of other objects, but they can't ALL be identical to another object's values.
If I'm not mistaken, I can do this like so:
class Animal(models.Model):
    common_name = models.CharField(max_length=150)
    latin_name = models.CharField(max_length=150)

    class Meta:
        unique_together = ("common_name", "latin_name")
But then each time I refactor the model (e.g. to add a new field, or to change the name of an existing field), I also have to edit the list of fields assigned to unique_together. With a simple model that's OK, but with a substantial one it becomes a real hassle during refactoring.
How can I avoid having to repeat the list of field names in unique_together? Is there some way to gather the model's fields into a variable and assign that variable to unique_together instead?
Refactoring models is a rather expensive thing to do:
You will need to change all code using your models, since field names correspond to object properties.
You will have to change your database manually, since Django cannot do this for you (at least the version I used the last time I worked with Django couldn't).
Therefore I think updating the list of unique field names in the model's Meta class is the least issue you should worry about.
EDIT: If you really want to do this and all of your fields must be "unique together", then the guy on freenode is right and you'll have to write a custom metaclass. This is quite complicated and error-prone, plus it might render your code incompatible with future releases of Django.
Django's ORM "magic" is controlled by the metaclass ModelBase (django.db.models.base.ModelBase) of the generic base class Model. This class is responsible for taking your class definition, with all fields and Meta information, and constructing the class you will be using in your code later.
Here is a recipe on how you could achieve your goal:
Subclass ModelBase to use your own metaclass.
Override the method __new__(cls, name, bases, dict)
Inspect dict to gather the Meta member (dict["Meta"]) as well as all field members
Set meta.unique_together based on the names of the fields you gathered.
Call the super implementation (ModelBase.__new__)
Use the custom metaclass for all your unique models using the magic member __metaclass__ = MyMetaclass (or derive an abstract base class extending Model and overriding the metaclass)
