Let's take an example where two types of entities are loaded: Product and Category, with Product.CategoryId -> Category.Id. We have CRUD operations available on Products (not on Categories).
If Categories are updated on another screen (or by another user on the network), we would like to be able to reload the Categories while preserving the context we are currently using, since we could be in the middle of editing data and do not want those changes to be lost (and we cannot rely on saving first, since the data may be incomplete).
Since there is no easy way to tell EF to get fresh data (added, removed and modified), we thought of two possible approaches:
1) Keep Products attached to the context and Categories detached from it. This would mean we lose the ability to access Product.Category.Name, which we sometimes need (for example when printing data), so we would have to resolve it manually.
2) Detach and re-attach all Categories in the current context:
Context.ChangeTracker.Entries().Where(x => x.Entity.GetType() == typeof(T)).ToList().ForEach(x => x.State = EntityState.Detached);
Then reload the Categories, which will get fresh data. Do you see any problem with this second approach? We understand it requires all constraints to be placed on foreign-key properties rather than on navigation properties, since detaching all Categories also resets the Product.Category navigation properties to null. There could also be a potential performance problem, which we have not tested: a couple of thousand Products could be loaded, and all of them would need to resolve the navigation property when reloading.
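For reference, a minimal sketch of what approach 2 could look like in EF6 (assuming a DbContext instance named Context, a Category entity, and using System.Data.Entity / System.Linq directives for Load() and the LINQ operators):

// Detach every tracked Category so the next query brings back fresh copies.
var trackedCategories = Context.ChangeTracker.Entries()
    .Where(e => e.Entity is Category)
    .ToList();
foreach (var entry in trackedCategories)
{
    entry.State = EntityState.Detached;
}

// Re-query; Load() attaches the fresh rows (including added/removed/modified ones)
// to the same context, so Products keep their pending edits.
Context.Set<Category>().Load();

// Because the detach nulls out navigation properties, rely on the FK value
// (Product.CategoryId) or fix up Product.Category references afterwards.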
Which of the two do you prefer, and is there a better way (EF6 + .NET 4.0)?
Sounds like you want to try
/// <summary>
/// Reloads the entity from the database overwriting any property values with values from the database.
/// The entity will be in the Unchanged state after calling this method.
/// </summary>
public void Reload()
{
    ....
}
Sample call:
public virtual void Reload(TPoco poco)
{
    if (poco == null)
    {
        throw new ArgumentNullException("poco");
    }

    try
    {
        Context.Entry<TPoco>(poco).Reload();
    }
    catch (Exception ex)
    {
        // log ex as appropriate, then rethrow
        throw;
    }
}
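Applied to the Category scenario from the question, the same call can be made for every tracked Category. A sketch, assuming an EF6 DbContext named Context (Reload overwrites any local changes on those Category instances but leaves the rest of the context, including edited Products, untouched):

// Refresh every Category the context is currently tracking.
var categoryEntries = Context.ChangeTracker.Entries<Category>().ToList();
foreach (var entry in categoryEntries)
{
    entry.Reload(); // current values overwritten with database values; state becomes Unchanged
}

// Note: Reload only refreshes entities already in the context; Categories added
// in the database since the original query still require a new query
// (e.g. Context.Set<Category>().Load()) to show up.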
You can use a second level cache implementation, for example:
SECOND LEVEL CACHE FOR EF 6.1
Note: it is no longer a beta version; 1.0.0 was released on 2013/8/27. There are also other implementations, including some for older versions of EF.
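If I recall the EFCache setup correctly, registration goes through a code-based configuration roughly like the sketch below; treat the exact constructor signatures as assumptions and check the project's README for the authoritative version (CacheTransactionHandler, InMemoryCache and CachingProviderServices are types from that library):

public class CachingConfiguration : DbConfiguration
{
    public CachingConfiguration()
    {
        // One transaction handler keeps the cache consistent with commits/rollbacks.
        var transactionHandler = new CacheTransactionHandler(new InMemoryCache());
        AddInterceptor(transactionHandler);

        // Wrap the provider services so query results go through the cache.
        Loaded += (sender, args) =>
            args.ReplaceService<DbProviderServices>(
                (inner, key) => new CachingProviderServices(inner, transactionHandler));
    }
}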
I'm using Silverlight + Entity Framework + RIA Services in a business application. The underlying database tables include a Humans table and a HumanAddresses table. Every human can have one or more addresses of different types (e.g. home, job, place of birth, etc.). At least one address of type "Home" must always be present.
The UI allows the user to edit, remove and add several addresses for a given human before submitting. I need to perform validation to find out whether these changes violate the aforementioned rule. What is the best way to do this?
I tried using CustomValidationAttribute, but (AFAIK) it only allows entity-level validation, not validation across multiple entities, some of which are to be deleted while others are to be added or modified.
If you need access to other entities, you need to override ValidateEntity in your database context. It is called for every entity that has changed when you call SaveChanges().
protected override DbEntityValidationResult ValidateEntity(DbEntityEntry entityEntry, IDictionary<object, object> items)
{
    DbEntityValidationResult result = base.ValidateEntity(entityEntry, items);

    // Do your validation
    if (invalid)
    {
        result.ValidationErrors.Add(new DbValidationError("Property", "Error Message"));
    }

    return result;
}
Here are the validation options in EF http://msdn.microsoft.com/en-us/data/gg193959.aspx
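For the specific rule in the question, the override might look roughly like the following sketch. Human, HumanAddress, HumanId and AddressType are assumed names based on the description, and the check only sees addresses that this context is tracking:

protected override DbEntityValidationResult ValidateEntity(DbEntityEntry entityEntry, IDictionary<object, object> items)
{
    var result = base.ValidateEntity(entityEntry, items);

    // Whenever an address is added/modified/deleted, re-check the "at least one
    // Home address" rule against everything tracked for that human, skipping
    // entries that are about to be deleted.
    var address = entityEntry.Entity as HumanAddress;
    if (address != null)
    {
        bool hasHomeAddress = ChangeTracker.Entries<HumanAddress>()
            .Where(e => e.State != EntityState.Deleted)
            .Select(e => e.Entity)
            .Any(a => a.HumanId == address.HumanId && a.AddressType == "Home");

        if (!hasHomeAddress)
        {
            result.ValidationErrors.Add(
                new DbValidationError("AddressType", "At least one address of type 'Home' is required."));
        }
    }

    return result;
}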
My first real (not test) NHibernate/Castle.ActiveRecord project is developing quickly.
I have been working with NHibernate/Castle.ActiveRecord for about a month now, but I still don't have a real idea how to handle sessions in my Windows Forms application.
The common handling methods don't seem to work for me:
SessionPerRequest, SessionPerConversation, etc. only work for web applications.
SessionPerApplication is not recommended / highly dangerous, if I am correct.
SessionPerThread is not very helpful, since I either have only one thread (the Windows Forms thread) or a new thread per button click. The first option would make my application use too much memory and keep old objects alive; worker threads per button click would break lazy loading, since my loaded objects would live longer than the thread.
SessionPerPresenter does not work either, because it is common to open a "sub-presenter" in a form to let the user search/load/select some referenced objects (foreign keys). The sub-presenter is of course destroyed afterwards, which means its session is closed, but the object it loaded is still used in the "super-presenter" to fill the referencing property (foreign key).
I've used Google and Bing for hours and read a lot, but only found one good article about my case: http://msdn.microsoft.com/en-us/magazine/ee819139.aspx . It uses SessionPerPresenter, but only the id is passed to a "sub-presenter", not the entire object. It also seems there are no foreign keys in this example and no scenario in which an object is returned to a "super-presenter".
Questions
Is there any other method of session handling for WinForms/desktop applications?
I could add a session property or a session constructor parameter to all of my presenters, but it doesn't feel right to have session handling all over my UI code.
When an exception occurs, NHibernate wants me to kill the session. But what if it is 'only' a business-logic exception and not an NHibernate exception?
Example
I am trying to give an example that covers most of my problems.
// The persistent classes
public class Box
{
    public virtual int BoxId { get; set; }
    public virtual Product Content { get; set; }
    ...
}

public class User
{
    public virtual int UserId { get; set; }
    public virtual IList<Product> AssignedProducts { get; set; }
    ...
}

public class Product
{
    public virtual int ProductId { get; set; }
    public virtual string ProductCode { get; set; }
}
// The presenter-classes
public class ProductSearchPresenter : SearchPresenter<Product> { ... }
public class ProductEditPresenter : EditPresenter<Product> { ... }
public class UserSearchPresenter : SearchPresenter<User> { ... }
public class UserEditPresenter : EditPresenter<User> { ... }
public class BoxSearchPresenter : SearchPresenter<Box> { ... }
public class BoxEditPresenter : EditPresenter<Box> { ... }
// The search presenters allow the user to perform a search with criteria on the class given as generic argument and to select one of the results
// The edit presenters allow editing a new or loaded (and given as parameter) object of the class given as generic argument
Now I have the following use cases, which can all be performed in the same application at the same time (the user simply switches between the presenters).
Using an instance of BoxSearchPresenter to search for and select an object.
Part of this use case is using an instance of ProductSearchPresenter to fill a criterion of the BoxSearchPresenter.
Part of this use case is using an instance of BoxEditPresenter to edit and save the selected object of the BoxSearchPresenter instance.
Using an instance of UserSearchPresenter to search for and select an object.
Part of this use case is using an instance of UserEditPresenter to edit and save the selected object of the UserSearchPresenter.
Part of this use case is using a ProductSearchPresenter to search for and select objects that will be added to User.AssignedProducts.
Using an instance of ProductSearchPresenter to search for and select an object.
Part of this use case is using an instance of ProductEditPresenter to edit and save a selected object of the ProductSearchPresenter.
It's only a small collection of use cases, but it already shows a lot of the problems I have.
Use cases 1 and 2 run at the same time in the same UI thread.
Use cases 1.1 and 2.2 return their selected objects to other presenters, which use those objects longer than the presenters that loaded them exist.
Use case 3.1 might alter an object loaded by 2.2/1.1 before 3.1 was started; but if 2.2/1.1 is committed before 3.1 is finished, the object would be saved and it would not be possible to "roll back" 3.1.
Here is a short overview of what I found fits best into our WinForms application architecture (based on MVP).
Every presenter takes the repositories it needs as constructor dependencies. For example, an InvoicePresenter depends on an InvoiceRepository, but it will probably also depend on a CustomerRepository and many others depending on complexity (the CustomerRepository for loading all customers into a combobox if you want to change the customer of the invoice, things like that).
Then, every repository has a constructor argument for a unit of work. You can either abstract the session with the unit-of-work pattern, or have your repositories depend on ISession directly.
Everything is wired together by an IoC container, where we create presenters based on a "context". This is a very simple concept: the context covers a presenter and all of its sub-presenters, which we compose into building blocks for more complex presenters to reduce complexity (if, for example, you have multiple tabs of options to edit some entity).
So, in practice, this context is 90% of the time form-based, because one form is at least one presenter/view.
So, to answer your questions:
Session-per-presenter and session-per-conversation (which works with WinForms as well) are really the only usable patterns here (besides opening and closing sessions all over the place, which is not a good way to handle it).
This is best solved by making repositories depend on the session, not presenters. You make presenters depend on repositories and repositories depend on the session, and when you create them all you give them a common session; but, as I said, this is only practical when done per context. You cannot share a session between a presenter editing invoices and another presenter editing customers; but you can share a session when editing an invoice via a main presenter and its invoice-details and invoice-notes sub-presenters.
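A minimal sketch of that wiring, assuming NHibernate's ISession/ISessionFactory and a hypothetical Invoice entity, InvoiceRepository and InvoicePresenter (the IoC container registration is left out; the point is that one session is opened per presenter context and handed to every repository inside it):

public class InvoiceRepository
{
    private readonly ISession session;

    public InvoiceRepository(ISession session) { this.session = session; }

    public Invoice Get(int id) { return session.Get<Invoice>(id); }
    public void Save(Invoice invoice) { session.SaveOrUpdate(invoice); }
}

public class InvoicePresenter : IDisposable
{
    private readonly ISession session;
    private readonly InvoiceRepository invoices;

    // One session per presenter "context": the form-level presenter owns it and
    // shares it with its repositories and sub-presenters.
    public InvoicePresenter(ISessionFactory sessionFactory)
    {
        session = sessionFactory.OpenSession();
        invoices = new InvoiceRepository(session);
    }

    public void Save(Invoice invoice)
    {
        using (var tx = session.BeginTransaction())
        {
            invoices.Save(invoice);
            tx.Commit();
        }
    }

    public void Dispose()
    {
        session.Dispose(); // closing the form/presenter ends the context
    }
}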
Please clarify, didn't understand this...
I have a Silverlight RIA app where I share the models and data access between the MVC web app and the Silverlight app using compiler directives. On the server, to see which context I am running under, I check whether the ChangeSet object is non-null (meaning I am running under RIA rather than MVC). Everything works all right, but I had problems with the default code generated for the domain service methods.
Let's say I have a Person entity who belongs to certain Groups (Group entity). The Person object has a collection of Groups which I add to or remove from. After making the changes, the SL app calls the server to persist them. What I noticed is that the Group entity records get inserted first. That's fine, since I'm modifying an existing person. However, since each Group entity also has a reference to the existing person, calling AddObject marks the whole graph, including the person I'm trying to modify, as Added. Then, when the update is executed, the default generated code tries to Attach the person, which now has a state of Added, to the context, with not-so-hilarious results.
When I make the original query for an entity or set of entities, all of the EntityKeys for the entities are filled in. Once on the client, the EntityKey is filled in for each object. But when the entity comes back from the client to be updated on the server, the EntityKey is null. I created a new RIA Services project and verified that this is the case. I'm running RIA Services SP1 and I am not using composition. I sort of understand the EntityKey problem: the change tracking is done in two separate contexts, and EF doesn't know about the change tracking done on the SL side. However, it IS passing back the object graph, including related entities, so using AddObject is a problem unless I first check the database for the existence of an object with the same key.
I have code that works. I don't know how WELL it works, but I'm doing some further testing today to see what's going on. Here it is:
/// <summary>
/// Updates an existing object.
/// </summary>
/// <typeparam name="TBusinessObject"></typeparam>
/// <param name="obj"></param>
protected void Update<TBusinessObject>(TBusinessObject obj) where TBusinessObject : EntityObject
{
    if (this.ChangeSet != null)
    {
        ObjectStateManager objectStateManager = ObjectContext.ObjectStateManager;
        ObjectSet<TBusinessObject> entitySet = GetEntitySet<TBusinessObject>();
        string setName = entitySet.EntitySet.Name;
        EntityKey key = ObjectContext.CreateEntityKey(setName, obj);
        object dbEntity;
        if (ObjectContext.TryGetObjectByKey(key, out dbEntity) && obj.EntityState == System.Data.EntityState.Detached)
        {
            // An object with the same key exists in the DB, and the entity passed
            // is marked as detached.
            // Solution: Mark the object as modified, and any child objects need to
            // be marked as Unchanged as long as there is no DomainOperation.
            ObjectContext.ApplyCurrentValues(setName, obj);
        }
        else if (dbEntity != null)
        {
            // In this case, TryGetObjectByKey said it failed, but the resulting object is
            // filled in, leading me to believe that it did in fact work.
            entitySet.Detach(obj); // Detach the entity
            try
            {
                ObjectContext.ApplyCurrentValues(setName, obj); // Apply the changes to the entity in the DB
            }
            catch (Exception)
            {
                entitySet.Attach(obj); // Re-attach the entity
                ObjectContext.ApplyCurrentValues(setName, obj); // Apply the changes to the entity in the DB
            }
        }
        else
        {
            // Add it..? Update must have been called mistakenly.
            entitySet.AddObject(obj);
        }
    }
    else
    {
        DirectInsertUpdate<TBusinessObject>(obj);
    }
}
Quick walkthrough: if the ChangeSet is null, I'm not under the RIA context and can therefore call a different method that handles the insert/update and saves immediately. That works fine as far as I can tell. For RIA, I generate a key and see if it exists in the database. If it does and the object I am working with is detached, I apply those values; otherwise, I force a detach and apply the values, which works around the Added state left by any previous insert calls.
Is there a better way of doing this? I feel like I'm doing way too much work here.
In a case like this, where you're adding Group entities to Person.Groups, I would just save the Person and expect RIA to handle the Groups for me.
But let's take a step back: how are you trying to persist your changes? You shouldn't be saving/updating entities one by one. All you have to do is call DomainContext.SubmitChanges and all your changes should be persisted.
I work with pretty complicated projects and I seldom have to touch add/update code.
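For illustration, the client-side pattern described above might look roughly like this sketch (PersonDomainContext, GetPersonQuery, Groups and Name are assumed names generated for your domain service; System.Linq is assumed for FirstOrDefault):

var context = new PersonDomainContext();

// Load the person (and its groups) through the generated query.
context.Load(context.GetPersonQuery(personId), loadOp =>
{
    var person = loadOp.Entities.FirstOrDefault();
    if (person == null) return;

    // Modify the graph on the client: add/remove groups, edit properties, ...
    person.Groups.Add(new Group { Name = "Admins" });

    // One call persists every pending change tracked by this DomainContext;
    // no per-entity attach/insert/update calls are needed.
    context.SubmitChanges(submitOp =>
    {
        if (submitOp.HasError)
        {
            // handle/log, then submitOp.MarkErrorAsHandled();
        }
    }, null);
}, null);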
This question has been around for a while with no solid answer, so I'll tell you what I did... which is nothing. That's how I handled it in RIA Services, using the code above, since I was sharing the RIA client model and the server model.
After working with RIA Services for a year and a half, I'm in the camp that believes RIA Services is good for smaller, less complex apps. If you can use [Composition] for your entities, which I couldn't for many of mine, then you're fine.
RIA Services makes it really quick to throw together small applications where you want to use the entities from EF directly, but if you want to use POCOs, or you foresee your application getting complex in the future, I would stick with building POCOs on the service end, passing them through regular WCF, and sharing behavior by making your POCOs partial classes and sharing the behavior code with the client. When trying to create models that work the same on the client and the server, I had to write a ridiculous amount of plumbing code to make it work.
It definitely IS possible; I've done it. But there are a lot of hoops you must jump through for everything to work well, and I never fully took into consideration things like the shared model pre-loading lists for use on the client while the server didn't need them preloaded every time; that unnecessarily slowed down loading of the web page, and I countered it with hacky method calls that the client then had to adopt. The technique I chose definitely had its issues.
I'm trying to perform an update using WCF RIA Services, but every time I update I keep getting "An entity with the same identity already exists in this EntitySet." Any insight on where I can start looking or how to figure out what is wrong?
Step 1
LoadOperation<Analysis> AnalysisLP = ((App)Application.Current)._context.Load(
    ((App)Application.Current)._context.GetAnalysisQuery()
        .Where(o => o.ProjectID == Convert.ToInt32(((App)Application.Current).Project.ProjectID)));
Step 2
AnalysisLP.Completed += delegate
{
    if (!AnalysisLP.HasError)
    {
        Analysis = AnalysisLP.Entities.FirstOrDefault();
    }
};
Step 3
((App)Application.Current)._context.Analysis.Attach(Analysis);
((App)Application.Current)._context.SubmitChanges(OnSubmitCompleted, null);
Can anyone help me? What is it I'm doing wrong?
Thanks.
Your Analysis object comes from the EntitySet via a query, so it is already attached to that EntitySet.
You just need to change its properties and call SubmitChanges. Do not try to attach it again.
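In terms of the code above, Step 3 then shrinks to something like this sketch (no Attach; the entity loaded in Step 2 is already tracked by the DomainContext; SomeProperty/newValue are placeholders for whatever you actually edit):

// Analysis was loaded through GetAnalysisQuery, so the context already tracks it.
Analysis.SomeProperty = newValue;

((App)Application.Current)._context.SubmitChanges(OnSubmitCompleted, null);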
To avoid the "An entity with the same identity already exists in the EntitySet" exception, entities that are updated, modified or deleted must always be fully refreshed from the server upon saving; there can be no references held in memory to the previous instances of the entities. To prevent orphaned instances from hanging around, I follow these rules:
Entity instances should not have any property-changed event handlers assigned directly to them; use the OnCreated or OnPropertyNameChanged partial methods instead.
When entities are added to an EntitySet, do not assign parent entity instance references; use the foreign-key ID property instead (myEntity.ParentalID = SelectedParent.ParentalID rather than myEntity.Parent = SelectedParent), because the SelectedParent probably isn't getting reloaded upon saving (it isn't part of the unit of work), so that reference would still be held after the save and refresh.
Any combo boxes that are used as sources to populate entity properties of the entity need to have their EntitySet reloaded after saving as well; otherwise the related entities populating the combo will hold references to the previous entity instance.
Currently, for ASP.NET I use a request model where a context is created per request (only when needed) and disposed of at the end of that request. I've found this to be a good balance between not having to use the old using-per-query model and not having a context around forever. The problem is that in WPF I don't know of anything that could play the role of the request. Right now the options seem to be keeping the same context around forever (which can be a nightmare) or going back to the annoying using-per-query model, which is a huge pain. I haven't seen a good answer on this yet.
My first thought was to have an Open and Close (or whatever name) arrangement where the top-level method being called (say an event-handling method like Something_Click) would "open" the context and "close" it at the end. Since nothing in the UI project is aware of the context (all queries are contained in methods on partial classes that "extend" the generated entity classes, effectively creating a pseudo-layer between the entities and the UI), this seems like it would make the entity layer dependent on the UI layer.
I'm really at a loss, since I'm not hugely familiar with state programming.
Addition:
I've read up on using threads, but the problem I have with a context just sitting around is error and recovery. Say I have a form that updates user information and there's an error. The user form will now display the changes to the user object in the context, which is good, since it makes for a better user experience not to have to retype all the changes.
Now what if the user decides to go to another form? Those changes are still in the context. At this point I'm stuck with either an incorrect User object in the context, or I have to get the UI to tell the context to reset that user. I suppose that's not horrible (a reload method on the user class?), but I don't know if that really solves the issue.
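As a side note on that "reload method" idea: with an ObjectContext-based model (as in the answer below), discarding one entity's pending changes can be done with Refresh. A sketch, assuming context is the ObjectContext and user the attached entity:

// Throw away local, unsaved edits to this user and reload its values from the database.
context.Refresh(RefreshMode.StoreWins, user);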
Have you thought about trying a unit of work? I had a similar issue where I essentially needed to be able to open and close a context without exposing my EF context. I think we're using different architectures (I'm using an IoC container and repository layer), so I have to cut up this code a bit to show it to you. I hope it helps.
First, when it comes to that "Something_Click" method, I'd have code that looked something like:
using (var unitOfWork = container.Resolve<IUnitOfWork>())
{
    // do a bunch of stuff to multiple repositories,
    // all of which will share the same context from the unit of work
    if (isError == false)
        unitOfWork.Commit();
}
In each of my repositories, I'd have to check to see if I was in a unit of work. If I was, I'd use the unit of work's context. If not, I'd have to instantiate my own context. So in each repository, I'd have code that went something like:
if (UnitOfWork.Current != null)
{
return UnitOfWork.Current.ObjectContext;
}
else
{
return container.Resolve<Entities>();
}
So what about that UnitOfWork? Not much there. I had to cut out some comments and code, so don't take this class as working completely, but... here you go:
public class UnitOfWork : IUnitOfWork
{
    private static LocalDataStoreSlot slot = Thread.AllocateNamedDataSlot("UnitOfWork");

    private Entities entities;

    public UnitOfWork(Entities entities)
    {
        this.entities = entities;
        Thread.SetData(slot, this);
    }

    public Entities ObjectContext
    {
        get
        {
            return this.entities;
        }
    }

    public static IUnitOfWork Current
    {
        get { return (UnitOfWork)Thread.GetData(slot); }
    }

    public void Commit()
    {
        this.entities.SaveChanges();
    }

    public void Dispose()
    {
        entities.Dispose();
        Thread.SetData(slot, null);
    }
}
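The IUnitOfWork interface isn't shown above; judging from the members used, it would be roughly the following (a sketch inferred from the class, so adjust it to your actual contract):

public interface IUnitOfWork : IDisposable
{
    // The shared EF context for every repository participating in this unit of work.
    Entities ObjectContext { get; }

    // Persist all pending changes; if never called, Dispose simply discards them.
    void Commit();
}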
It might take some work to factor this into your solution, but this might be an option.