Data conflict resolution in Silverlight with WCF Service

When using Silverlight together with a WCF service library, how do you solve the age-old problem of two different users loading the same record, making different changes to it, and then both updating it? In other words, when applying an update, how does the WCF service know that the record in the database is still the same as the data it originally returned?
Do you need to check the originally loaded values against the current values in the database (i.e. re-run the original query before updating)?
I was hoping that there would be a more out-of-the-box answer.
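(To make the scenario concrete, here is a minimal TypeScript-style sketch of the kind of check being asked about: optimistic concurrency, where the row version captured at load time is compared against the current stored value before the update is applied. The CustomerRecord type and the in-memory table standing in for the database are made up for illustration.)

    // Optimistic concurrency sketch: the client keeps the row version it loaded,
    // and the update is rejected if the stored version has changed since then.
    interface CustomerRecord {
      id: number;
      name: string;
      rowVersion: number; // bumped by the data store on every successful update
    }

    // In-memory stand-in for the database table (illustration only).
    const table = new Map<number, CustomerRecord>();

    function update(edited: CustomerRecord): void {
      const current = table.get(edited.id);
      if (!current) {
        throw new Error("Record no longer exists");
      }
      if (current.rowVersion !== edited.rowVersion) {
        // Another user saved in the meantime: report a conflict instead of
        // silently overwriting their changes.
        throw new Error("Concurrency conflict: record was modified by another user");
      }
      table.set(edited.id, { ...edited, rowVersion: current.rowVersion + 1 });
    }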

Do you mean using Entity Framework? If so, here is the strategy I used:
When you retrieve data on the client side you will notice that the entity's "RowState" property is gone. What I did was add that property back on the client using the "partial class" feature, and manage its value locally.
When the data goes back to the server for an update, send only what has been modified, filtering by the "RowState" property.
In your Update method, call ApplyCurrentValues() so the incoming values are copied onto the entity attached to the context.
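Purely as an illustration of the client-side half of this approach (the original is a Silverlight partial class in C#; the names below are made up), here is a rough TypeScript-style sketch of tracking a row state per record and filtering by it before sending:

    // Client-side change tracking: each record carries a rowState flag,
    // and only the non-Unchanged records travel back to the update operation.
    type RowState = "Unchanged" | "Added" | "Modified" | "Deleted";

    interface TrackedEntity<T> {
      data: T;
      rowState: RowState;
    }

    function markModified<T>(entity: TrackedEntity<T>): void {
      if (entity.rowState === "Unchanged") {
        entity.rowState = "Modified";
      }
    }

    function changesToSend<T>(entities: TrackedEntity<T>[]): TrackedEntity<T>[] {
      return entities.filter(e => e.rowState !== "Unchanged");
    }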
Maybe there is a better solution, but this is what I am using on my project and it works well in my case :)
Good luck

Related

How to apply a remote filter on a tree panel

I have a tree panel whose store uses a memory proxy. I load the data with a loadData function, inside which I make a service call. I want to add filters to the columns and have them applied remotely, but when I set remoteFilter: true no call is made to the back end. Any suggestions on how to achieve this?
Fiddle with my functions and views (the fiddle is not running)
The code is utter chaos and shows a lack of understanding of the core concepts of ExtJS. You should clean it up and use the standard ExtJS ways of doing things wherever possible.
Right now you are trying to put remoteFilter: true on a store with a memory proxy. A memory proxy is not a server proxy, so it has no way to filter remotely. Adding remoteFilter: true to that store can only do harm or nothing at all.
Then you are loading data into a store with a memory proxy by executing Ext.Ajax.request manually. Incidentally, the filters that you want the server to apply are not part of your hand-crafted Ajax request at all.
Usually, one would call store.load on a store with an ajax proxy to load it from the server (this executes Ext.Ajax.request under the hood, but with all the special settings a store supports). In that case remoteFilter makes sense: the filters you have set are submitted to the server, and the server is then responsible for filtering out the records that should not be displayed client-side. (That said, judging from your front-end code I doubt the back end supports anything like filtering.)
It also seems that you are loading data into a tree store in a format that ExtJS does not expect. You should look into whether you can shape your model so that the tree store can be loaded directly from the server's response, and get rid of all the intermediate code with Ext.Ajax.request and reformatting. Since you already have to modify the server side to enable remote filtering, this would be the most future-proof way to get your code working.
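As a rough sketch of that direction (assuming an ExtJS 5/6-style API and a hypothetical /nodes endpoint that returns tree-shaped JSON), the store could be wired to an ajax proxy so that loading and remote filtering both go through the framework:

    declare const Ext: any; // ExtJS is loaded globally

    const treeStore = Ext.create("Ext.data.TreeStore", {
      remoteFilter: true,            // filters are sent to the server with each load
      proxy: {
        type: "ajax",
        url: "/nodes",               // hypothetical endpoint returning nested node JSON
        reader: { type: "json", rootProperty: "children" }
      },
      root: { expanded: true }
    });

    // Setting a filter now triggers a request; the server receives the filter
    // parameters and is expected to return only the matching records.
    treeStore.filter([{ property: "name", value: "foo" }]);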

DTOs and object graphs

I'm building an Angular 2 SPA with a Web API REST service that exposes an Entity Framework model.
The quickest way to get going is to load up a big object graph in a single controller action and pass out a big blob of JSON to the client, which can then split it up as it walks the object graph.
Is it considered better practice to build API actions for each object in the object graph and have the JS client pull the graph piecemeal as required?
The piecemeal approach requires many more controllers and actions and, correspondingly, angular services, i.e., more work! Do I need to just grasp the nettle and get on with it?
Actually, it depends on whether you are using Entity Framework in a connected or a disconnected scenario. In your case it is a disconnected scenario, which means the context is not attached to the object graph the whole time: you get the data from the database, send it to the client, and then dispose of the context. I would recommend dividing your controllers and actions per POCO or DTO, because that lets you maintain and attach each object individually rather than dealing with the whole object graph at once. The problems start to appear when you edit or manipulate your entities, because in a disconnected scenario you never know which object in a big graph has been edited, deleted, or added. It is therefore better to send each change from the client directly to the server and apply it there.
I don't know if this answers your question, but if you need any further explanation or a code sample, please let me know.
I think you should create one back-end action per Angular 2 page-level component. The user shouldn't have to wait for extra data to load, only for the data that page actually needs.
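As a rough sketch of the piecemeal approach (the endpoint URLs and the Order/OrderLine shapes are made up, and the Angular wiring is omitted in favour of plain fetch for brevity), each resource gets its own small service backed by its own API action:

    interface Order { id: number; customerId: number; }
    interface OrderLine { id: number; orderId: number; productId: number; quantity: number; }

    class OrderService {
      // One focused action per resource instead of one action returning the whole graph.
      getOrder(id: number): Promise<Order> {
        return fetch(`/api/orders/${id}`).then(r => r.json());
      }

      // Children are pulled only when a view actually needs them.
      getOrderLines(orderId: number): Promise<OrderLine[]> {
        return fetch(`/api/orders/${orderId}/lines`).then(r => r.json());
      }
    }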

Am I developing my AngularJS application in the correct way?

I am new to AngularJS. I am creating an application using AngularJS + CodeIgniter. I understand the basic concepts of AngularJS and have implemented the basic operations: adding, viewing, updating, and deleting records in the database. Now I have a few doubts.
Normally we initially fetch the data from the database into a scope variable, and all the listing is done from this JSON data. If we update a record using an $http request, the change is applied to the database, but the data in the scope does not change. Ideas to solve this:
1) Make another request to the server to refresh the scope variable.
2) Update the scope variable directly, alongside updating the database.
Which method should I follow?
I suggest you use the first method: make another request to refresh the scope data. With this approach the scope data will always be in sync with what is stored in the database. You also have to think about what happens if the database raises an error: with the second method the scope data would be updated temporarily, but after refreshing the page the changes would be lost, because nothing was actually stored in the database.
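A rough sketch of the first method (the /api/records endpoints, module name, and field names are made up): after the update succeeds, the controller simply reloads the list so the scope always mirrors the database:

    declare const angular: any; // AngularJS is loaded globally

    angular.module("app").controller("RecordsCtrl", function ($scope: any, $http: any) {
      function loadRecords() {
        $http.get("/api/records").then((response: any) => {
          $scope.records = response.data;
        });
      }

      $scope.updateRecord = function (record: any) {
        $http.put("/api/records/" + record.id, record)
          .then(loadRecords) // refresh the scope from the database after a successful update
          .catch((err: any) => { $scope.error = err; }); // on a DB error the scope stays consistent
      };

      loadRecords();
    });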

Oracle ADF - CommandImageLink is not working

We are developing a web application using Oracle ADF. We have a view object based on a query, and we dragged and dropped this view object as a table onto a JSF page (call it page1). To that table we added a new column containing a commandImageLink.
From another page we add some data to the DB using ADF DC, and that change should be reflected on page1. It was not, so we googled and found the suggestion that setting CacheResults to false on that table's iterator in the binding layer would make it work. I set it to false and the refresh now happens.
But my problem is that with CacheResults set to false my commandImageLink does not work, while with CacheResults set to true it works (the navigation happens).
Please help.
First of all:
I would presume that your page1 is part of a bounded task flow. If it is not, you should refactor your code and move the page into a bounded task flow. I would further presume you are using ADF Business Components for your model layer.
Second of all:
Never use CacheResults=false. That solution is nonsense from an ADF perspective.
Now, your real problem is re-querying the table data when the page opens, therefore:
Generate the View Object implementation class for your VO. Override the executeQuery() method in the newly created Java class. Expose this method as a client method (this makes the method public and visible from the Data Controls panel).
Drag and drop the executeQuery method into your task flow as a method action. Make sure this method action is the start activity of your task flow, and that it is connected to your page activity. This ensures that every time the page is opened, executeQuery is fired.

How to build sets of entity records using Breeze and local storage

I'm trying to create an off-line data collection app, using AngularJS.
I think, adding Breeze.js should help with saving and querying data to and from the browser local storage:
1) present the user with angular data entry form
2) when the "save" button is clicked - create a new Breeze entity and store it locally
3) the next time this form is used - create a second entity, and add/save it as a part of the same collection
I was wondering if anyone has tried to do something similar and could give me some pointers on how this is done.
I think it's viable and these links should help you to get started:
http://www.breezejs.com/documentation/querying-locally
You might want to check this Angular sample as well:
http://www.breezejs.com/samples/todo-angular
One caveat to keep in mind is that Breeze needs to load the model's metadata from somewhere. Typically you hit a Web API asynchronously and get the metadata from there. However, in your particular scenario you should look at loading your metadata from a script file. Here's a how-to and discussion about it:
http://www.breezejs.com/documentation/load-metadata-script
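Putting those pieces together, a rough sketch might look like this (the entity type, resource name, and the appMetadata variable are made up; the metadata is assumed to have been exported ahead of time as described in the last link):

    declare const breeze: any;         // breeze.js loaded globally
    declare const appMetadata: string; // metadata exported ahead of time into a script file

    const manager = new breeze.EntityManager("api/collection");
    manager.metadataStore.importMetadata(appMetadata);

    // Re-hydrate any entities saved in a previous session.
    const cached = window.localStorage.getItem("collection-cache");
    if (cached) {
      manager.importEntities(cached);
    }

    // Called when the "save" button is clicked: create a new entity and persist
    // the whole cache to local storage.
    function saveEntryLocally(formValues: any): void {
      manager.createEntity("CollectionEntry", formValues); // hypothetical entity type
      window.localStorage.setItem("collection-cache", manager.exportEntities());
    }

    // Later, query the accumulated set without touching the network.
    const entries = manager.executeQueryLocally(
      breeze.EntityQuery.from("CollectionEntries")          // hypothetical resource name
    );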
