Concurrency with a custom DomainService - Silverlight

I have created a class that inherits from DomainService and have a Silverlight app that uses System.ServiceModel.DomainServices.Client to get a DomainContext. I have also created POCO DataContracts that are used in the DomainService's Query, Update, Insert, and Delete methods. I also have a ViewModel that executes all the LoadOperations. Now I'm at the part of my app where I want to add new entities to the generated EntitySets, but I'm unsure what happens when one user creates a new entity and sets its key value while another user creates a similar entity with the same key value.
I have seen in the documentation that an ObjectContext is used, but in my situation I was not able to use the Entity Framework model generator, so I had to create my data contracts by hand.
So I guess my question is: is there any way I can force other Silverlight apps to update when the database changes?

When you make a save operation on your DomainContext, it will automatically refresh, depending on the load behavior.
TicketContext.Load(TicketContext.GetTicketByIdQuery(ticketId),
    LoadBehavior.RefreshCurrent,
    x =>
    {
        Ticket = x.Entities.First();
        Ticket.Load();
        ((IChangeTracking)TicketContext.EntityContainer).AcceptChanges();
    }, null);
Here I've set the LoadBehavior to RefreshCurrent. When you make a save, RIA will send the entity back across the wire to the client and merge the changes with the entity already cached in your client-side context. I don't know if that quite answers your question, however.
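For example, a minimal sketch of the save-then-refresh round trip, reusing the TicketContext and GetTicketByIdQuery from the snippet above (the callback parameter names are just placeholders):

TicketContext.SubmitChanges(submitOp =>
{
    if (!submitOp.HasError)
    {
        // Re-query with RefreshCurrent so the cached entities are overwritten
        // with whatever is now in the database, including other users' edits.
        TicketContext.Load(TicketContext.GetTicketByIdQuery(ticketId),
            LoadBehavior.RefreshCurrent,
            loadOp => { /* loadOp.Entities now reflect the current server state */ },
            null);
    }
}, null);

As far as I know there is no built-in push from the server, though; each client still has to re-issue the Load to see other users' changes.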

Related

Cannot Edit Entity with TextProperty in GAE

I am using the Datastore to save some information that my application requires. Let's say I have a class like this:
class Example(db.Model):
    Entity1 = db.TextProperty()
    Entity2 = db.StringProperty(multiline=True)
When I populate my db and view it locally, I can see my update and even make changes (manually) to both Entity1 (TextProperty) and Entity2 (StringProperty).
But when I deploy this app and, for some reason, want to change my values from the Datastore Viewer on appengine.google.com, only Entity2 (StringProperty) is editable; I just can't change the value of Entity1 (TextProperty). Is there any setting I need to change to make this work?
The Datastore Viewer is just a convenience; it's not surprising that only some property types can be edited directly.
As you've seen, just a difference in property type changes the behavior. And the behavior on the development server is often different from the live system in any case.
The simplest (only) solution is to write code that lets you perform the required edit on the model.

SL RIA app - Insert and Update using standard generated code does not work - is there a better way?

I have a Silverlight RIA app where I share the models and data access between the MVC web app and the Silverlight app using compiler directives. On the server, to see what context I am running under, I check whether the ChangeSet object is non-null (meaning I am running under RIA rather than MVC). Everything works all right, but I had problems with the default code generated by the domain service methods.
Let's say I had a Person entity, who belonged to certain Groups (Group entity). The Person object has a collection of Groups which I add or remove. After making the changes, the SL app would call the server to persist the changes. What I noticed happening is that the group entity records would be inserted first. That's fine, since I'm modifying an existing person. However, since each Group entity also has a reference to the existing person, calling AddObject would mark the whole graph - including the person I'm trying to modify - as Added. Then, when the Update statement is called, the default generated code would try to Attach the person, which now has a state of Added, to the context, with not-so-hilarious results.
When I make the original call for an entity or set of entities in a query, all of the EntityKeys for the entities are filled in. Once on the client, the EntityKey is filled in for each object. When the entity returns from the client to be updated on the server, the EntityKey is null. I created a new RIA Services project and verified that this is the case. I'm running RIA Services SP1 and I am not using composition. I kind of understand the EntityKey problem - the change tracking is done on two separate contexts. EF doesn't know about the change tracking done on the SL side. However, it IS passing back the object graph, including related entities, so using AddObject is a problem unless I check the database for the existence of an object with the same key first.
I have code that works. I don't know how WELL it works but I'm doing some further testing today to see what's going on. Here it is:
/// <summary>
/// Updates an existing object.
/// </summary>
/// <typeparam name="TBusinessObject"></typeparam>
/// <param name="obj"></param>
protected void Update<TBusinessObject>(TBusinessObject obj) where TBusinessObject : EntityObject
{
    if (this.ChangeSet != null)
    {
        ObjectStateManager objectStateManager = ObjectContext.ObjectStateManager;
        ObjectSet<TBusinessObject> entitySet = GetEntitySet<TBusinessObject>();
        string setName = entitySet.EntitySet.Name;
        EntityKey key = ObjectContext.CreateEntityKey(setName, obj);
        object dbEntity;
        if (ObjectContext.TryGetObjectByKey(key, out dbEntity) && obj.EntityState == System.Data.EntityState.Detached)
        {
            // An object with the same key exists in the DB, and the entity passed
            // is marked as detached.
            // Solution: mark the object as modified, and any child objects need to
            // be marked as Unchanged as long as there is no DomainOperation.
            ObjectContext.ApplyCurrentValues(setName, obj);
        }
        else if (dbEntity != null)
        {
            // In this case, TryGetObjectByKey said it failed, but the resulting object is
            // filled in, leading me to believe that it did in fact work.
            entitySet.Detach(obj); // Detach the entity
            try
            {
                ObjectContext.ApplyCurrentValues(setName, obj); // Apply the changes to the entity in the DB
            }
            catch (Exception)
            {
                entitySet.Attach(obj); // Re-attach the entity
                ObjectContext.ApplyCurrentValues(setName, obj); // Apply the changes to the entity in the DB
            }
        }
        else
        {
            // Add it..? Update must have been called mistakenly.
            entitySet.AddObject(obj);
        }
    }
    else
    {
        DirectInsertUpdate<TBusinessObject>(obj);
    }
}
Quick walkthrough: If the ChangeSet is null, I'm not under the RIA context, and therefore can call a different method to handle the insert/update and save immediately. That works fine as far as I can tell. For RIA, I generate a key, and see if it exists in the database. If it does and the object I am working with is detached, I apply those values; otherwise, I force detach and apply the values, which works around the added state from any previous Insert calls.
Is there a better way of doing this? I feel like I'm doing way too much work here.
In this kind of case, where you're adding Group entities to Person.Groups, I would just save the Person and expect RIA to handle the Groups for me.
But let's take a step back, how are you trying to persist your changes? You shouldn't be saving/updating entities one by one. All you have to do is call DomainContext.SubmitChanges and all your changes should be persisted.
I work with pretty complicated projects and I seldom ever have to touch add/update code.
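For example, here's a rough client-side sketch (Person, Groups, and GetPersonQuery are placeholders for whatever your generated DomainContext actually exposes):

context.Load(context.GetPersonQuery(personId), loadOp =>
{
    var person = loadOp.Entities.First();
    person.Groups.Add(newGroup);            // or .Remove(existingGroup)
    context.SubmitChanges(submitOp =>
    {
        if (submitOp.HasError)
        {
            // Inspect submitOp.EntitiesInError, then call submitOp.MarkErrorAsHandled()
        }
    }, null);
}, null);

The DomainContext tracks everything you touched, so one SubmitChanges call sends the person and its group changes as a single change set.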
This question has been around with no solid answer, so I'll tell you what I did... which is nothing. That's how I handled it in RIA services, using the code above, since I was sharing the RIA client model and the server model.
After working with RIA Services for a year and a half, I'm in the camp that believes RIA Services is good for smaller, less complex apps. If you can use [Composition] for your entities, which I couldn't for many of mine, then you're fine.
RIA Services makes it really quick to throw together small applications where you want to use the entities from EF directly. But if you want to use POCOs, or you foresee your application getting complex in the future, I would stick with building POCOs on the service end, passing those through regular WCF, and sharing behavior by making your POCOs partial classes and sharing the behavior code with the client. When I tried to create models that work the same on the client and the server, I had to write a ridiculous amount of plumbing code to make it work.
It definitely IS possible; I've done it. But there are a lot of hoops you must jump through for everything to work well. For example, I never fully took into consideration things like the shared model pre-loading lists for use on the client when the server didn't need them preloaded every time, which slowed down loading of the web page unnecessarily and forced me to counter with hacky method calls on the client. The technique I chose definitely had its issues.
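To illustrate the shared-behavior idea, a tiny hypothetical example (the types and file names are made up; the point is that the behavior file is compiled into both projects):

// PersonDto.cs - server-side WCF data contract (System.Runtime.Serialization)
[DataContract]
public partial class PersonDto
{
    [DataMember] public string FirstName { get; set; }
    [DataMember] public string LastName { get; set; }
}

// PersonDto.Behavior.cs - linked into both the server and the Silverlight
// project (e.g. via "Add as Link") so the behavior is written once.
public partial class PersonDto
{
    public string FullName
    {
        get { return FirstName + " " + LastName; }
    }
}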

Calling WCF services in MVVM?

I am working on a Prism desktop application and would like to know the best way to deal with lookup/reference data lists when using a WCF backend. I think this question may cover a few areas, and I would appreciate some guidance.
For example, consider a lookup that contains Products (codes and descriptions) which would be used in a lot of different input screens in the system.
Does the viewmodel call the WCF service directly to obtain the data to fill the control?
Would you create a control that solely deals with Products, with its own view model etc., and then use that in every place that needs a product lookup, or would you re-implement, say, a combobox that repopulates the Products ItemsSource in every single form view model that uses it?
Would I create a brand new WCF service called something like LookupDataService and use that to populate my lookup lists? I am concerned I will end up with lots of lookups if I do this.
What other approaches are there for going about this?
I suggest creating your lookup object/component as a proxy object for the WCF service. It can work in several ways, but the simplest that comes to my mind would be:
Implement a WCF service with methods that provide all Product entities as well as a single requested one (e.g. by product code)
Implement a component that uses the WCF client to get products; let's call it ProductsProvider
Your view models take a dependency on ProductsProvider (e.g. via constructor injection)
The key element in this model is ProductsProvider; it works as a kind of cache for Product objects. First, it asks the web service for all products (or some subset, as you like) to start with. Then, whenever you need to look up a product, you ask the provider; it's the provider's responsibility to decide how the product should be looked up. Maybe it's already in the local list? Maybe it will need to call the web service for an update? Example:
public class ProductsProvider
{
    private readonly IList<Product> products;
    private readonly IProductsService serviceClient;

    public ProductsProvider(IProductsService serviceClient)
    {
        this.serviceClient = serviceClient;
        this.products = serviceClient.GetAllProducts();
    }

    public Product LookUpProduct(string code)
    {
        // 1: check if our local list contains a product with the given code
        //    (assumes Product exposes a Code property; requires System.Linq)
        var product = this.products.FirstOrDefault(p => p.Code == code);

        // 2: if it does not, call this.serviceClient.LookUpProduct
        if (product == null)
        {
            product = this.serviceClient.LookUpProduct(code);
            if (product != null)
                this.products.Add(product);
        }

        // 3: if the service also doesn't know such a product:
        //    throw, return null, or report an error - here we just return null
        return product;
    }
}
Now, what this gives you is:
you only need to have one ProductsProvider instance
better flexibility with when and how your service is called
your view models won't have to deal with WCF at all
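A hypothetical view model consuming the provider via constructor injection might look like this (OrderEntryViewModel and its members are made-up names):

public class OrderEntryViewModel
{
    private readonly ProductsProvider productsProvider;

    // The DI container supplies the single ProductsProvider instance.
    public OrderEntryViewModel(ProductsProvider productsProvider)
    {
        this.productsProvider = productsProvider;
    }

    public void OnProductCodeEntered(string code)
    {
        var product = this.productsProvider.LookUpProduct(code);
        // Bind the result to the view, or surface a "not found" message if null.
    }
}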
Edit:
As for your second question: a dedicated control may not be needed, but having a view model for the Product entity is definitely a good idea.

An entity with the same identity already exists in this EntitySet

I'm trying to perform an update statement using WCF RIA Services, but every time I update I keep getting "An entity with the same identity already exists in this EntitySet." Any insight on where I can start looking to figure out what is wrong?
Step 1
LoadOperation<Analysis> AnalysisLP = ((App)Application.Current)._context
    .Load(((App)Application.Current)._context.GetAnalysisQuery()
    .Where(o => o.ProjectID == Convert.ToInt32(((App)Application.Current).Project.ProjectID)));
Step 2
AnalysisLP.Completed += delegate
{
    if (!AnalysisLP.HasError)
    {
        Analysis = AnalysisLP.Entities.FirstOrDefault();
    }
};
Step 3
((App)Application.Current)._context.Analysis.Attach(Analysis);
((App)Application.Current)._context.SubmitChanges(OnSubmitCompleted, null);
Can anyone help me? What is it I'm doing wrong?
Thanks.
Your Analysis object came from the EntitySet via a query, so it is still attached to that EntitySet.
You just need to change its properties and call SubmitChanges. Do not try to attach it again.
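In other words, Step 3 boils down to something like this (SomeProperty and newValue are placeholders for whatever you are actually editing):

// Analysis is already tracked by the context from the earlier Load,
// so just change it and submit - no Attach call.
Analysis.SomeProperty = newValue;
((App)Application.Current)._context.SubmitChanges(OnSubmitCompleted, null);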
To avoid the "An entity with the same identity already exists in this EntitySet" exception, entities that are updated, modified or deleted must always be fully refreshed from the server upon saving; there can be no references held in memory to the previous instances of the entities. To prevent orphaned instances from hanging around, I follow these rules:
Entity instances should not have any property-changed event handlers assigned directly to them; use the OnCreated or OnPropertyNameChanged partial methods instead.
When entities are added to an EntitySet, do not assign parent Entity instance references; use the foreign key ID property instead (myEntity.ParentalID = SelectedParent.ParentalID rather than myEntity.Parent = SelectedParent). The SelectedParent probably isn't reloaded upon saving because it isn't part of the unit of work, so that reference would still be held after the save and refresh.
Any combo boxes that are used as population sources for Entity properties need to have their EntitySet reloaded after saving as well; otherwise the related Entities populating the combo will hold references to the previous entity instances (a rough sketch follows below).
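For that last rule, a rough sketch of reloading a lookup EntitySet after a successful save (Products and GetProductsQuery stand in for whatever generated EntitySet and query feed the combo):

context.SubmitChanges(submitOp =>
{
    if (!submitOp.HasError)
    {
        // Drop the cached lookup entities and reload them so the combo's
        // ItemsSource no longer references stale instances.
        context.Products.Clear();
        context.Load(context.GetProductsQuery(), LoadBehavior.RefreshCurrent,
            loadOp => { /* combo items are now fresh */ }, null);
    }
}, null);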

Linq Datacontext and "unit of work"

The buzzword in LINQ nowadays is "unit of work",
as in "only keep your DataContext in existence for one unit of work", then destroy it.
Well, I have a few questions about that.
I am creating a fat client WPF application, so my DataContext needs to track the entire web of instantiated objects available to the user on the current screen. When can I destroy my DataContext?
I build a LINQ query over time based on the actions of the user and their interactions with objects of the first DataContext. How can I create a new DataContext and execute the query on the new context?
I hope I was clear.
Thank you
Unit of Work is not the same as "only keep your DataContext in existence for one unit of work".
Unit of Work is a design pattern that describe how to represent transactions in an abstract way. It's really only required for Create, Update and Delete (CUD) operations.
One philosophy is that UoW is used for all CUD operations, while read-only Repositories are used for read operations.
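As a rough sketch of that split (the interface names are illustrative, not taken from any particular library):

// Transactional boundary for create/update/delete work.
public interface IUnitOfWork : IDisposable
{
    void RegisterNew(object entity);
    void RegisterDirty(object entity);
    void RegisterDeleted(object entity);
    void Commit();
}

// Read-only access; no change tracking needed.
public interface ICustomerRepository
{
    Customer GetById(int id);
    IEnumerable<Customer> FindByName(string name);
}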
In any case I would recommend decoupling object lifetime from UoW or Repository usage. Use Dependency Injection (DI) to inject both into your consuming services, and let the DI Container manage lifetimes of both.
In web applications, my experience is that the object context should only be kept alive for a single request (per-request lifetime). On the other hand, for a rich client like the one you describe, keeping it alive for a long time may be more efficient (singleton lifetime).
By letting a DI Container manage the lifetime of the object context, you don't tie yourself to one specific choice.
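Putting those pieces together, a consuming service might look like this (again just a sketch using the illustrative interfaces above; the container registration itself depends on which DI library you pick):

public class OrderService
{
    private readonly IUnitOfWork unitOfWork;
    private readonly ICustomerRepository customers;

    // The DI container decides whether these live per request, per screen,
    // or as singletons; the service itself doesn't care.
    public OrderService(IUnitOfWork unitOfWork, ICustomerRepository customers)
    {
        this.unitOfWork = unitOfWork;
        this.customers = customers;
    }

    public void RenameCustomer(int id, string newName)
    {
        var customer = this.customers.GetById(id);
        customer.Name = newName;
        this.unitOfWork.RegisterDirty(customer);
        this.unitOfWork.Commit();
    }
}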
I am creating a fat client WPF application.
Ok.
So my data context needs to track the entire web of instantiated objects available to the user on the current screen.
No. Those classes are database mapping classes. They are not UI presentation classes.
How can I create a new DataContext and execute the query on new Context?
Func<DataContext, IQueryable<Customer>> queryWithoutADataContext =
    dc =>
        from cust in dc.Customers
        where cust.name == "Bob"
        select cust;

Func<DataContext, IQueryable<Customer>> moreFiltered =
    dc =>
        from cust in queryWithoutADataContext(dc)
        where cust.IsTall
        select cust;

var bobs = queryWithoutADataContext(new DataContext());
var tallbobs = moreFiltered(new DataContext());
