RIA Services and NHibernate: problem inserting new entities - Silverlight

I have a combination of RIA Services and NHibernate. NHibernate is configured to use identity generation on the database side, so new entities are sent with 0 for the id. NHibernate works as it should: it reads the generated keys back from the database and updates the entities.
I have an example with a compositional hierarchy. My entity is complex; it has two collections:
InvestObject
- MaterialItems
- WorkItems
I work with this structure in one unit of work. Getting and showing data in the Silverlight app is no problem. But if I try to add more than one item to the MaterialItems collection on the client side, I get this error when saving:
Submit operation failed. Invalid ChangeSet: Only one entry for a given entity instance can exist in the ChangeSet.
at System.ServiceModel.DomainServices.Server.ChangeSet.ValidateChangeSetEntries(IEnumerable`1 changeSetEntries)
at System.ServiceModel.DomainServices.Server.ChangeSet..ctor(IEnumerable`1 changeSetEntries)
There is a quick fix on the client side: just generate dummy negative ids for the new MaterialItems. This satisfies RIA and the save is propagated to the server side. But then NHibernate throws an error, because it expects 0 for all new ids, not an assigned value. So this is not OK.
Finally I tricked NHibernate by resetting all the new ids back to 0 on the server. But this does not make me happy; it is a messy, ugly solution.
Please help

It's been a while since I've done this, so the details are hazy, but I think you basically can't use IDs that are generated in the DB with RIA Services. We used the HiLo algorithm instead.
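For reference, here is a minimal sketch of what switching the id mapping to HiLo can look like in an hbm.xml file (the property and column names are illustrative, and the table/column/max_lo values are just typical defaults; tune them for your schema):

<id name="Id" column="Id" type="Int64">
  <generator class="hilo">
    <param name="table">hibernate_unique_key</param>
    <param name="column">next_hi</param>
    <param name="max_lo">100</param>
  </generator>
</id>

With hilo, NHibernate allocates ids itself from a hi/lo range before the insert, so no database-generated identity value is involved.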

Related

Breeze and Data Access SOA

In evaluating Angular + Breeze: does Breeze support tracking changes to the DTOs consumed from services and sending them back to the backend Entity Framework?
Yes and no. Breeze tracks the changes on the client, and when you call saveChanges(), it sends the changed entities (with information about which properties changed) to the server. What happens on the server is up to you, so you could use the received data to modify the states of entities in an existing EF context and accumulate change-tracking info in EF until you decide to save it to the DB.
The provided EF + WebApi server-side components don't do that, however. They are built to streamline the following use case:
1. The client performs add/update/delete operations on entities and calls saveChanges().
2. The server creates a new EF DbContext and applies the changes to it.
3. The server applies validation rules (in the BeforeSaveEntities method) and rejects the save if they fail.
4. The server DbContext saves the changes to the database.
In this scenario, there is no long-lived EF DbContext tracking the changes; the change tracking is done on the client, and EF is used to process those changes on the server and save them in the DB.
That probably covers 90% of what most applications require, but there are hooks in place to intercept the save and make server-side changes before the save, and you can override any parts that don't fit your needs.
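As an illustration of those hooks, here is a minimal, hypothetical sketch of intercepting the save on the server with the standard Breeze EF context provider (MyDbContext, Order, and the validation rule are placeholder assumptions, and the namespace may differ between Breeze versions; treat this as a sketch, not a drop-in implementation):

using System;
using Breeze.WebApi;

public class MyContextProvider : EFContextProvider<MyDbContext>
{
    // Called once per entity before the save. Return false to silently
    // drop the entity from the save, or throw to reject the whole save.
    protected override bool BeforeSaveEntity(EntityInfo entityInfo)
    {
        var order = entityInfo.Entity as Order;
        if (order != null && order.Total < 0)
        {
            throw new InvalidOperationException("Order total cannot be negative.");
        }
        return true;
    }
}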

Unit testing WCF-RIA Services

When data from a Get operation on my DomainService is sent to the DomainContext in my Silverlight application, some rows end up not being sent while others are. I check this by setting a breakpoint in the DomainService and another in the DomainContext load operation callback. How can I create a unit test to check this?
E.g. set up some in-memory data for the DomainService and check whether the Silverlight DomainContext receives this data?
This is usually caused by a primary key that isn't unique. When RIA Services sends rows to the client, it filters the results by the primary key to make sure there are no duplicates, so if you have two rows with different data but the same primary key, only one of those rows will make it to the client.
There is a blog series by Kyle McClellan on how to unit test RIA Services: http://blogs.msdn.com/b/kylemc/archive/2011/08/18/unit-testing-a-wcf-ria-domainservice-part-1-the-idomainservicefactory.aspx, which may be helpful.
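For the duplicate-key symptom specifically, a server-side unit test can bypass the Silverlight client entirely and call the query method on the domain service directly. A rough sketch (MyDomainService, GetCallRecords, and Id are hypothetical names standing in for your own service, query, and key property, and the service is assumed to be wired to in-memory test data):

using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class DomainServiceTests
{
    [TestMethod]
    public void GetCallRecords_ReturnsUniquePrimaryKeys()
    {
        // Arrange: a domain service backed by stubbed, in-memory data.
        var service = new MyDomainService();

        // Act: the query method is just a method returning IQueryable,
        // so no Silverlight client or DomainContext is needed here.
        var rows = service.GetCallRecords().ToList();

        // Assert: every row handed to RIA Services must have a distinct key,
        // otherwise duplicates are silently dropped before reaching the client.
        Assert.AreEqual(rows.Count, rows.Select(r => r.Id).Distinct().Count());
    }
}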

Method for loading a large SharePoint list (~65K records) into Silverlight

What would be a decently performant approach to loading about 65,000 records from a SharePoint list into Silverlight? The list is composed of call data that the Silverlight app will roll up into a summarized chart. I tried the Client OM, and the fastest I could get the data was 2.5-3 minutes, and that was only retrieving a single column of the list.
I've also tried using the SharePoint WCF services, but when I try to add a service reference in Visual Studio 2010 I receive a vague Bad Request (400) error.
Would WCF be any faster than the Client OM or is there a feature in SharePoint for large lists that I could use? If WCF is faster, does anyone have any thoughts on why I would be getting the Bad Request error in VS2010?
Thanks!
I would recommend that you create a server-side web service for this data. With 65k+ records you are never going to get decent performance pulling them all down to the client.
Write custom server-side code that reads and processes the data into a summarized list that the client can render.
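A rough sketch of what that server-side piece could look like, using the SharePoint server object model to pre-aggregate the list before anything reaches Silverlight (the site URL, list name, and field name are placeholders, and error handling plus the WCF/ASMX plumbing are omitted):

using System.Collections.Generic;
using System.Linq;
using Microsoft.SharePoint;

public class CallSummaryService
{
    // Returns call counts grouped by status instead of 65k raw rows.
    public IDictionary<string, int> GetCallCountsByStatus()
    {
        using (var site = new SPSite("http://sharepoint/sites/calls"))
        using (var web = site.OpenWeb())
        {
            var list = web.Lists["CallData"];

            // Only pull the single column needed for the rollup.
            var query = new SPQuery
            {
                ViewFields = "<FieldRef Name='Status' />",
                ViewFieldsOnly = true
            };

            return list.GetItems(query)
                       .Cast<SPListItem>()
                       .GroupBy(item => (string)item["Status"] ?? "Unknown")
                       .ToDictionary(g => g.Key, g => g.Count());
        }
    }
}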

EF4 + STE: Reattaching via a WCF Service? Using a new objectcontext each and every time?

I am planning to use WCF (not RIA) in conjunction with Entity Framework 4 and STEs (self-tracking entities). If I understand this correctly, my WCF service should return an entity or a collection of entities (using a List, for example, and not IQueryable) to the client (in my case Silverlight).
The client can then change or update the entity. At this point I believe it is self-tracking? This is where I get a bit confused, as there are a lot of reported problems with STEs not tracking.
Anyway, to update I just need to send the entity back to my WCF service via another method that does the update. Should I be creating a new ObjectContext every time? In every method?
If I am creating a new ObjectContext every time in every method on my WCF service, then don't I need to re-attach the STE to the ObjectContext?
So basically this alone wouldn't work?
using (var ctx = new MyContext())
{
    ctx.Orders.ApplyChanges(order);
    ctx.SaveChanges();
}
Or should I create the ObjectContext once, in the constructor of the WCF service, so that the first call and every additional call using the same WCF instance use the same ObjectContext?
I could create and destroy the WCF service in each method call from the client, in effect creating a new ObjectContext each time.
I understand that it isn't a good idea to keep the ObjectContext alive for very long.
You are asking several questions, so I will try to answer them separately:
Returning IQueryable:
You can't return IQueryable. IQueryable describes a query that is yet to be executed. When you try to return an IQueryable from a service, it gets executed during serialization of the service response, which usually causes an exception because the ObjectContext is already closed.
Tracking on client:
Yes, STEs can track changes on the client, but only if the client uses STEs! The assembly with the STEs should be shared between the service and the client.
Sharing ObjectContext:
Never share an ObjectContext in a server environment that updates data. Always create a new ObjectContext instance for every call. I described the reasons here.
Attaching STE:
You don't need to attach the STE. ApplyChanges will do everything for you. Also, if you want to return the order back from your service operation, you should call AcceptChanges on it.
Creating object context in service constructor:
Be aware that WCF has its own rules for how it works with service instances. These rules are based on the InstanceContextMode and the binding used (and you can implement your own rules by implementing IInstanceProvider). For example, if you use BasicHttpBinding, the default instancing is PerCall, which means WCF will create a new service instance for each request. But if you use NetTcpBinding instead, the default instancing is PerSession and WCF will reuse a single service instance for all requests coming from a single client (a single client proxy instance).
Reusing service proxy on a client:
This also depends on the binding used and the service instancing. When a session-oriented binding is used, the client proxy is tied to a single service instance. Calling methods on that proxy will always execute operations on the same service instance, so the service instance can be stateful (it can contain data shared among calls). This is generally not a good idea, but it is possible, and when using a session-oriented connection you have to deal with several problems that can arise (it is more complex). BasicHttpBinding does not allow sessions, so even with a single client proxy, each call is processed by a new service instance.
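Putting those pieces together, here is a minimal sketch of a per-call update operation with STEs (MyContext, Order, Orders, and UpdateOrder are placeholder names; the point is a new ObjectContext per call, ApplyChanges, and AcceptChanges before returning):

using System.ServiceModel;

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    Order UpdateOrder(Order order);
}

// PerCall matches the "new ObjectContext per call" advice above;
// with BasicHttpBinding this is the default instancing anyway.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall)]
public class OrderService : IOrderService
{
    public Order UpdateOrder(Order order)
    {
        using (var ctx = new MyContext())
        {
            // ApplyChanges attaches the STE graph and replays its tracked changes.
            ctx.Orders.ApplyChanges(order);
            ctx.SaveChanges();
        }

        // Reset the tracking state so the client receives a clean entity.
        order.AcceptChanges();
        return order;
    }
}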
You can attach an entity to a new ObjectContext; see http://msdn.microsoft.com/en-us/library/bb896271.aspx.
But it will then be in the Unchanged state.
The way I would do it is:
- requery the database for the information
- compare it with the object being sent in
- update the entity from the database with the changes
- then do a normal save changes
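A rough sketch of that POCO-style approach, assuming an EF4 ObjectContext named MyContext with an Orders set (the entity, set, and key names are placeholders):

using System.Linq;

public void UpdateOrder(Order detachedOrder)
{
    using (var ctx = new MyContext())
    {
        // Requery the database; this also attaches the current row,
        // which ApplyCurrentValues needs in order to find it by key.
        var dbOrder = ctx.Orders.Single(o => o.Id == detachedOrder.Id);

        // (Optionally compare dbOrder with detachedOrder here, e.g. for
        // concurrency checks, before applying anything.)

        // Copy the scalar values from the incoming object onto the attached
        // entity; EF then knows exactly which properties changed.
        ctx.Orders.ApplyCurrentValues(detachedOrder);

        // Then do a normal save.
        ctx.SaveChanges();
    }
}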
Edit
The above was for POCO entities, as pointed out in the comments.
For STEs, you create a new context each time but use ApplyChanges; see http://msdn.microsoft.com/en-us/library/ee789839.aspx.

Need advice for my Server part

I'm stuck on my server part.
I think it would be fine if I used a REST architecture, but I'm not sure.
In my application, an authenticated user edits his name, age, hobbies, etc., and I want to store all of the users' information on my data server. Then I could send one user's information to another user, displayed in a ListView with an adapter.
Any ideas to help me?
Thanks in advance
I have recently added a series of posts to my blog that may help. It covers creating a RESTful service using Java EE technologies on the GlassFish server. The example produces and consumes XML, but could easily be adapted to handle JSON.
Part 1 - The Database Model
Part 2 - Mapping the Database Model to JPA Entities
Part 3 - Mapping JPA Entities to XML using JAXB
Part 4 - The RESTful Service
