We have a financial application that we started developing five years ago. It is a migration of an older application, and the main goal was to achieve a nicer user interface.
Because of those UI considerations, the technology chosen at the time was WPF, and the application still targets .NET 3.5 SP1.
The application is layered, but the business logic and the back end use a different technology and are outside the scope of this problem.
The client is not a traditional MVVM application, although it borrows some concepts from MVVM.
In a simple scenario, data is retrieved from an application server in the form of typed datasets. On the client, the datasets are mapped into data objects (POCOs) implementing the INotifyPropertyChanged interface, with each property raising the PropertyChanged event.
Those data objects are bound to the UI, and in response to user interaction routed commands are executed, which ultimately retrieve data from the server.
A variation of the Active Record pattern is implemented: each table in a dataset received from the server is mapped to a data object type, each property of a data object represents a column of the table, and each data object instance represents a row of the table.
Because of the WPF bindings, the retrieved data automatically appears in the UI.
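To make the shape of these data objects concrete, here is a minimal sketch of what one of them looks like (the class and property names are hypothetical):

    using System.ComponentModel;

    // Hypothetical data object mapped from one table of a typed dataset.
    // Each property corresponds to a column and raises PropertyChanged,
    // which is what the WPF bindings listen for.
    public class AccountRow : INotifyPropertyChanged
    {
        private decimal _balance;

        public decimal Balance
        {
            get { return _balance; }
            set
            {
                if (_balance == value) return;
                _balance = value;
                OnPropertyChanged("Balance");
            }
        }

        public event PropertyChangedEventHandler PropertyChanged;

        private void OnPropertyChanged(string propertyName)
        {
            var handler = PropertyChanged;
            if (handler != null)
                handler(this, new PropertyChangedEventArgs(propertyName));
        }
    }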
The problem is that we have started to see performance issues. They appear intermittently, in a non-deterministic fashion, after the data is retrieved from the server: there is a delay between the moment the mapping from the datasets starts and the moment the UI is updated.
On slow machines this delay can be up to a couple of seconds.
On powerful machines the delays seem insignificant. They are also not constant: when retrieving the same data, the UI is sometimes updated within a couple of milliseconds and sometimes only after several seconds.
We think we have isolated the problem: it is the use of bindings in cases where a binding source (data object) has many properties (e.g. 50) and the UI contains many controls (text boxes) that use bindings.
We have also taken measurements with different tools. There appear to be no memory leaks. The processing delays were observed at the moment a property of a data object participating in a binding is updated.
The duration of setting a property value also seems to depend on the value itself (e.g. updating decimal properties with the value zero is faster; with or without converters on the binding, and in tests made without converters using plain strings, the property value "0" is updated much faster than the value "123.456.00").
So we think a major part of the performance issue is having many controls in the UI (e.g. 150 text boxes) bound to objects with many properties (e.g. 50). We have also noticed that targeting .NET 3.5 is much slower than targeting 4.0 or 4.5.
The delays can be reproduced, but they do not occur every time, and they vary enormously: from 5 milliseconds to 2 seconds, or even up to tens of seconds on very slow machines.
We suspect the performance issue (the delay until the UI is updated) is related to WPF and its binding mechanism, but we have not been able to confirm this.
In short, we do not understand why the delays appear, why they appear only sometimes (non-deterministically), and whether they are caused by the WPF binding system.
Any help on this would be appreciated. Thank you.
I am developing a WPF application with EF Core.
After reading the Microsoft documentation, I found it convenient to bind my DataGrid to the DbSet<T>.Local collection:
it represents a local view of all Added, Unchanged, and Modified entities in this set. This local view will stay in sync as entities are added or removed from the context.
Everything works great, but there is a small problem:
Now I can only run synchronous queries against the database through the context, which often freezes the UI. Asynchronous queries throw an exception:
This type of CollectionView does not support changes to its SourceCollection from a thread different from the Dispatcher thread.
Is there a way to bind to this Local collection while keeping the ability to run asynchronous queries through the context?
Here is the simplest test (no MVVM, etc.): https://github.com/OlegFj17/TestingBindingToLocal_EF_6
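For illustration, here is a minimal sketch of the setup described above; the entity, context, and grid names are hypothetical, and ToObservableCollection() assumes EF Core 3.0 or later:

    using System.Windows;
    using Microsoft.EntityFrameworkCore;

    public partial class MainWindow : Window
    {
        private readonly AppDbContext _context = new AppDbContext();

        public MainWindow()
        {
            InitializeComponent();

            // Synchronous load: works, but blocks the UI thread.
            _context.Customers.Load();

            // Bind the grid to the local view of tracked entities.
            CustomersGrid.ItemsSource = _context.Customers.Local.ToObservableCollection();
        }

        private async void Refresh_Click(object sender, RoutedEventArgs e)
        {
            // Asynchronous load: entities are added to Local on a worker thread,
            // which produces the CollectionView exception quoted above.
            await _context.Customers.LoadAsync();
        }
    }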
Thank you.
I have a Silverlight Windows Phone 7 app that pulls data from a public API. I find myself doing much of the same thing over and over again:
In the UI, set a loading message or loading progress bar in place of where the content is
Get the content, which may already be in memory, may be cached in isolated file storage, or may require an HTTP request
If the content cannot be acquired (no network connection, etc.), display an error message
If the content is acquired, display it in the UI
Keep the content in main memory for subsequent queries
The content that is displayed to the user can be taken directly from a data source, such as an ObservableCollection, or it may be a query on a data source.
I would like to factor out this repetitive process into a framework where ideally only the following needs to be specified:
Where to display the content in the UI
The UI elements to show while loading, on failure, and on success
The URI of the HTTP request
How to parse the HTTP response into the data structure that will kept in memory
The location of the file in isolated storage, if it exists
How to parse the file contents into the data structure that will be kept in memory
It may sound like a lot, but two strings, three FrameworkElements, and two methods is less overhead than I currently have.
Also, this needs to work for however the data is maintained in memory, and needs to work for direct collections and queries on those collections.
My questions are:
Has something like this already been implemented?
Are my thoughts about the topic above fundamentally wrong in some way?
Here is a design I'm thinking of:
There are two components, a View and a Model.
The View is given the FrameworkElements for loading, failure, and success. It is also given a reference to the corresponding Model. The View is a UserControl that is placed somewhere in the UI.
The Model is a class that is given the URI for the data, a method for parsing the data, and optionally a filename and a method for parsing the file. It is responsible for retrieving the data and notifying the View whenever the current status (loading/fail/success) changes. If the data downloaded from the network differs from the cache, the network data takes precedence. When the app closes or is tombstoned, the Model writes the data to the cache.
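As a rough sketch of what I mean (all the names are placeholders, and the WebClient-based download is only one possible implementation):

    using System;
    using System.Net;

    public enum LoadStatus { Loading, Failed, Succeeded }

    // Hypothetical Model: given a URI, a parse delegate and an optional cache
    // file name, it retrieves the data and raises StatusChanged so the View
    // can swap in its loading/failure/success elements.
    public class DataModel<T>
    {
        private readonly Uri _uri;
        private readonly Func<string, T> _parseResponse;
        private readonly string _cacheFileName;   // optional, may be null

        public T Data { get; private set; }
        public LoadStatus Status { get; private set; }
        public event EventHandler StatusChanged;

        public DataModel(Uri uri, Func<string, T> parseResponse, string cacheFileName)
        {
            _uri = uri;
            _parseResponse = parseResponse;
            _cacheFileName = cacheFileName;
        }

        public void Load()
        {
            SetStatus(LoadStatus.Loading);

            var client = new WebClient();
            client.DownloadStringCompleted += (s, e) =>
            {
                if (e.Error != null)
                {
                    SetStatus(LoadStatus.Failed);
                    return;
                }

                Data = _parseResponse(e.Result);
                SetStatus(LoadStatus.Succeeded);
                // On close/tombstone the data would be written to _cacheFileName
                // in isolated storage (omitted here).
            };
            client.DownloadStringAsync(_uri);
        }

        private void SetStatus(LoadStatus status)
        {
            Status = status;
            var handler = StatusChanged;
            if (handler != null)
                handler(this, EventArgs.Empty);
        }
    }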
How does that sound?
I took some time to have a good read of your requirements and noted some thoughts to offer as a sounding board.
Firstly, for repetitive tasks with common behaviour this is definitely the way to approach it. You are not alone in thinking about this problem.
People doing a lot of this sort of thing may have created similar abstractions; however, to my knowledge none have been publicly released.
How far you go with it may depend on whether you intend it to be just for your own use (and for those with very similar requirements) or whether you want to handle more general cases and make a product that is usable by a very wide audience.
I'm going to assume the former, but that does not preclude the possibility of releasing it as an open source project that can be developed further and/or forked.
By not trying to cater for all possibilities you can make certain assumptions about the nature of the consuming implementation and, in particular, its UI design choices.
I think overall your thinking is in the right direction. While reading your high-level thoughts I felt some things could be simplified (a good thing) while still delivering a compelling UI.
On your initial points.
You could just assume a performant, indeterminate progress bar is being passed in.
Do this if it's important to you, but you could be buying yourself into some complexity here handling different caching requirements - variance in duration or dirty handling. It may be sufficient to lean on the platform's built-in caching of URLs (which some people have found gets in their way).
Handle network connectivity, yep this is repetitive and somewhat intricate. A perfect candidate for a general solution.
Update UI... arguably better to just return data and defer decisions regarding presentation and format of data to your individual clients.
Content in main memory - see above on caching.
On your potential inputs.
Where to display content - see above re data and defer presentation choices to client.
I would go with a UI element for the progress indicator, again a performant progress bar. Regarding communication of failure, I would consider implementing this in a Completed event which you publish. Then through the event arguments you can communicate the result and defer handling to the client, which can place that result in some presentation control/log/whatever. This is consistent with patterns used by the .NET Framework (see the sketch after these points).
URI - yes, this gets passed in.
How to parse - passing in a delegate to convert a stream or string into an object whose type can be decided by the client makes sense.
Location of the cache - you could pass this in if generalising matters, or hardcode its path. It would be more useful to others if passed in (consider whether you handle folders/creation).
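To make the Completed-event idea concrete, here is a rough sketch of the kind of API surface I mean (the names are only illustrative):

    using System;
    using System.Net;

    // Illustrative only: the loader takes the URI and a parse delegate, and
    // reports the outcome through a Completed event in the style of the
    // event-based asynchronous pattern used by the .NET Framework.
    public class LoadCompletedEventArgs<T> : EventArgs
    {
        public T Result { get; set; }
        public Exception Error { get; set; }
    }

    public class ContentLoader<T>
    {
        public event EventHandler<LoadCompletedEventArgs<T>> Completed;

        public void LoadAsync(Uri uri, Func<string, T> parse)
        {
            var client = new WebClient();
            client.DownloadStringCompleted += (s, e) =>
            {
                var args = new LoadCompletedEventArgs<T>();
                if (e.Error != null)
                    args.Error = e.Error;
                else
                    args.Result = parse(e.Result);

                var handler = Completed;
                if (handler != null)
                    handler(this, args);
            };
            client.DownloadStringAsync(uri);
        }
    }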
On the implementation.
You could go with a UserControl, if it works for you to be bound by that assumption. It would be more flexible, though, and arguably equally simple/elegant, to push presentation back onto the client for both the data display and the status messages, and to control hiding/showing the progress bar that is passed in.
Perhaps you would go so far as to assume the status messages will always be displayed in a TextBlock (if passed) and shift that housekeeping from each of your clients into your generic class.
I still suspect you will benefit from not coupling the data format to its presentation.
Tombstone handling: I would recommend some testing of the platform's built-in caching of URLs here, to see whether its durations and dirty-handling conditions work for your general cases.
Hopefully this gives you some things to think about and some reassurance that you're heading down the right path. There are many ways you could go about this; which path is best will ultimately be driven by your goals.
I'm developing a WP7 application which is basically a client of an existing REST API. The server returns data in JSON. With the help of the library JSON.NET (http://json.codeplex.com/) I was able to deserialize it directly to my .NET C# classes.
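For example, the deserialization itself is essentially a one-liner once a matching class exists (the DTO type here is hypothetical):

    using System.Collections.Generic;
    using Newtonsoft.Json;

    // Hypothetical DTO matching the JSON returned by the REST API.
    public class Article
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    public static class ApiParser
    {
        // Turn the raw JSON response body into typed objects.
        public static List<Article> ParseArticles(string json)
        {
            return JsonConvert.DeserializeObject<List<Article>>(json);
        }
    }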
I store the data locally to handle the offline scenario and also to avoid calling the server every time the user launches the application. I provide two ways to refresh the data: manually and/or after a period of time. To store the data I use Sterling (http://sterling.codeplex.com/), a simple and easy-to-use local database for Silverlight/WP7.
The biggest challenge is handling the asynchronous communication with the server. I provide clear UI feedback (a progress bar and/or loading wheel) to let the user know what is going on.
On a side note, I'm using the MVVM Light Toolkit and SL Unit Testing to do integration tests: ViewModel => my local client code => server. (http://code.google.com/p/nunit-silverlight/wiki/NunitTestsWp7)
We are going through a process of selecting a third-party suite of controls for Silverlight 4.0. We're mostly interested in a feature-rich grid control. I'm surprised to find that most of the products out there focus on client-side paging, filtering, sorting, and grouping. But if the dataset is large enough to benefit from these functions, isn't it also too big to bring to the client in one call? And doesn't this make most of the advertised fancy grid features useless? In my opinion, 200 rows of data is an ideal upper limit on how much I'd request from the server in one request. Yet the sites for Telerik, DevExpress, ComponentOne, Xceed, and others all have fancy demos that bring 10,000+ rows of data to the client and show off the ability to page, filter, group, and sort it. Who brings 10,000+ rows of data to the client? What if you have thousands of concurrent users? What if that data is volatile? What use case does this really address?
Can you share your experiences with any of these control suites and whether you've implemented paging? Also, are you using RIA?
Thanks.
You don't need a third-party grid control to achieve server-side paging. You can use the grid control and ObjectDataSource provided by the Silverlight Toolkit: http://silverlight.codeplex.com/
http://borrell.parivedasolutions.com/2008/01/objectdatasource-linq-paging-sorting.html
I agree with you; it can be crazy for a client to want to view their entire year's worth of data all at the same time, but sometimes the client (and product managers) don't see things the same way you do and insist upon doing stupid things....
In any case, just because the demo is paging through 1 million records, that doesn't mean they are bringing them all to the client. You also have to consider the scenario where you have 200 rows' worth of data but can only show 10 rows at a time due to the data templates you are using (you may only fit 10 rows on a page) - you can still retrieve all 200 rows, because it is simply your presentation that is using up the physical room. You can also implement paging and retrieve the next page's worth of data when it is requested (which will introduce a small delay, but could be well worth it). Possibly the best way to deal with this is to not give the user the ability to retrieve squillions of records at once - if you give them that feature they will use it, and then they will also complain about its performance.
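For reference, server-side paging in this style usually boils down to a query like the following on the service side (the context and entity names are hypothetical):

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical service method: returns one page of orders at a time
    // instead of shipping the whole table to the Silverlight client.
    public List<Order> GetOrders(int pageIndex, int pageSize)
    {
        using (var context = new ShopEntities())
        {
            return context.Orders
                          .OrderBy(o => o.OrderId)   // a stable order is required before Skip/Take
                          .Skip(pageIndex * pageSize)
                          .Take(pageSize)
                          .ToList();
        }
    }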
As for fast client-side sorting/grouping/filtering, that is a real-world necessity. It is common for our users to fetch many thousands of records from the server, then use the filters (which I have extended) to view a handful of records at a time, operate on those records, then modify the filters to view a different bunch. It is important to have these functions working fast because it makes a huge difference to the user experience. I trialled several different component sets earlier this year and found there was a vast difference in performance between them when it came to these functions, so choose wisely :)
I'd like to see a control suite that boasts handling concurrency issues on order fulfillment and also uses queues or stacks to resolve data conflicts. I see too often that these grid and list controls are really nice, pretty, and show you all the data, but they don't solve basic concurrency problems when you have more than one person working on the same set of data. If it automated the locking of a row edited by one user against another, prevented duplication of work, and automatically logged error messages, then I could see purchasing the control suite.
You don't need to load all your data at once: you can specify a maximum load in the XAML of your ObjectDataSource. This will load your data in blocks of the specified size.
Take a look at the 2 RIA services videos here:
https://www.silverlight.net/getstarted/riaservices/
There are segments on paging which may also be useful to you.
Note: some of the assembly references and syntax have changed slightly since these videos were made, but the core functionality is still the same.
I am in the middle of developing a WPF application that uses Entity Framework (.NET 3.5). It accesses the entities in several places throughout the application, and I am worried about consistency with regard to the entities. Should I be instantiating separate contexts in my different views, or should I (and is there a good way to do this) instantiate a single context that can be accessed globally?
For instance, my entity model has three sections: Shipments (with child packages and further child contents), Companies/Contacts (with child addresses and telephones), and disk specs. The Shipments and EditShipment views access the DiskSpecs, and the OptionsView manages the DiskSpecs (Create, Edit, Delete). If I edit a DiskSpec and I have separate contexts, I have to have something in the ShipmentsView to retrieve the latest specs, right?
If it is safe to have one overall context from which the rest of the app retrieves its objects, then I imagine that is the way to go. If so, where should that instance be put? I am using VB.NET, but I can translate from C# pretty well. Any help would be appreciated.
I just don't want one of those applications where the user has to hit reload a dozen times in different parts of the app to get the new data.
Update:
OK so I have changed my app as follows:
All contexts are created in Using blocks so they are disposed of once they are no longer needed.
When loaded, all entities are detached from the context before it is disposed (as sketched below).
A new property in the MainViewModel (ContextUpdated) raises an event that all of the other ViewModels subscribe to, which runs that ViewModel's RefreshEntities method.
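Roughly, the load code now looks like this (the context and entity set names are made up for the example):

    using System.Collections.Generic;
    using System.Linq;

    private List<DiskSpec> LoadDiskSpecs()
    {
        List<DiskSpec> specs;
        using (var context = new ShipmentEntities())
        {
            specs = context.DiskSpecs.ToList();

            // Detach each entity before the context is disposed
            // (ObjectContext.Detach, available in .NET 3.5 SP1).
            foreach (var spec in specs)
                context.Detach(spec);
        }
        return specs;
    }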
After implementing this, I started getting errors saying that an entity can only be referenced by one ChangeTracker at a time. Since I could not figure out which context was still referencing the entity (there shouldn't be any context, right?), I cast the object to IEntityWithChangeTracker and called SetChangeTracker with Nothing (null).
This has led to the current problem:
When I null the change tracker on the entity and then attach it to a context, it loses its changed state and does not get updated in the database. However, if I do not null the change tracker, I can't attach it. I have my own change-tracking code, so that is not a problem.
My new question is: how are you supposed to do this? A good example of an entity query and an entity save code snippet would go a long way, because I am beating my head against the wall trying to get what I once thought was a simple transaction to work.
A global static context is rarely the right answer. Consider what happens if the database is reset during the execution of this application - your SQL connection is gone and all subsequent requests using the static context will fail.
Recommend you find a way to have a much shorter lifetime for your entity context - open it, do some work, dispose of it, ...
As for putting your different objects in the same EDMX, that's almost certainly the right answer: if there are any relationships between the objects, you'll want them in the same EDMX.
As for reloading - the user should never have to do this. Behind the scenes you can open a new context, reload the current version of the object from the database, apply the changes they made in the UI, and then save it back.
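A rough sketch of that save pattern, assuming an EF 3.5 SP1 ObjectContext; the context, entity, and property names are hypothetical, based on the DiskSpec example from the question:

    using System.Linq;

    // 'edited' is the detached object that was bound to (and changed in) the UI.
    public void SaveDiskSpec(DiskSpec edited)
    {
        using (var context = new ShipmentEntities())
        {
            // Reload the current version of the row from the database.
            var current = context.DiskSpecs
                                 .Where(d => d.DiskSpecId == edited.DiskSpecId)
                                 .First();

            // Apply the values edited in the UI.
            current.Name = edited.Name;
            current.Capacity = edited.Capacity;

            context.SaveChanges();
        }
    }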
You might want to look at detached entities also, and beware of optimistic concurrency exceptions when you try to save changes and someone else has changed the same object in the database.
Good question, Cory. Thumbs up from me.
Entity Framework gives you a free choice - you can either instantiate multiple contexts or have just one, static. It will work well in both cases and yes, both solutions are safe. The only valuable advice I can give you is: experiment with both, measure performance, delays, etc., and choose the best one for you. It's fun, believe me :)
If this is going to be a really massive application with tons of concurrent connections, I would advise using one static context, or one static core context and just a few additional ones to support the main one. But, as I wrote a few lines above, it's up to your requirements which solution is better for you.
I especially liked this part of your question:
I just don't want one of those applications where the user has to hit reload a dozen times in different parts of the app to get the new data.
WPF is a really powerful tool, and the days when users had to press buttons to refresh data are basically gone. WPF gives you a wide range of asynchronous and multithreading tools, such as the Dispatcher class or BackgroundWorker, to quietly refresh the desired data in the background. This is great because not only do you avoid having to press refresh buttons, but background threads also don't block the UI, so data is refreshed transparently from the user's point of view.
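A minimal sketch of that idea, assuming a view model exposing an ObservableCollection named Shipments and a hypothetical slow data-access call:

    using System;
    using System.Threading;
    using System.Windows;
    using System.Windows.Threading;

    private void RefreshShipments()
    {
        // Load the data on a thread-pool thread so the UI stays responsive.
        ThreadPool.QueueUserWorkItem(state =>
        {
            var shipments = LoadShipmentsFromServer();   // hypothetical slow call

            // Marshal the results back to the UI thread through the Dispatcher.
            Application.Current.Dispatcher.BeginInvoke(
                DispatcherPriority.Normal,
                new Action(() =>
                {
                    Shipments.Clear();
                    foreach (var shipment in shipments)
                        Shipments.Add(shipment);
                }));
        });
    }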
WPF together with Entity Framework is really worth the effort of learning - please feel free to ask if you have any further concerns.
I have a series of forms and navigate between them.
Each form has a set of controls whose properties I load from an SQLite database, and this is a long operation (about 1 s) that doesn't give users the best impression, because the form is drawn gradually.
I don't mind the delay so much, but I'd like the form to be drawn only once all the data is loaded. I'd like to avoid new threads, because they would lead to cross-thread operation issues.
Is there any good solution apart from speeding the whole application up by caching the loaded data?
There is a simple way to speed up the perceived performance of many controls, especially data-intensive ones like list views, list boxes, combo boxes, etc.
Before you populate them, call the BeginUpdate() method, and when done call EndUpdate(). This disables redrawing of the control until you're done populating it with data.
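For example (the list view and the data call here are hypothetical):

    // Suspend painting while the control is being filled, then resume.
    itemsListView.BeginUpdate();
    try
    {
        foreach (var row in LoadRowsFromDatabase())      // hypothetical SQLite query
            itemsListView.Items.Add(new ListViewItem(row.Name));
    }
    finally
    {
        itemsListView.EndUpdate();                       // the control repaints once, fully populated
    }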
Sorry, but this is what threading is for. "Cross-thread operation issues" are well defined and there are common patterns for dealing with them. Just reduce the places where threads interact to a minimum (in this case, a single place: after the data is loaded) and it becomes trivial.
There are also some classes which make multithreading in a WinForms app much easier, as they abstract away the interaction between threads. The BackgroundWorker (link to blog post about it) will perform work on another thread for you and notify you when it's done by firing an event on the UI thread. You get the benefits of multithreading without any of the pitfalls.
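A minimal sketch of the BackgroundWorker approach for this case (the data-loading and UI-update calls are hypothetical):

    using System.ComponentModel;
    using System.Windows.Forms;

    private void LoadDataInBackground()
    {
        var worker = new BackgroundWorker();

        worker.DoWork += (s, e) =>
        {
            // Runs on a worker thread; no UI access here.
            e.Result = LoadControlPropertiesFromDatabase();   // hypothetical slow SQLite call
        };

        worker.RunWorkerCompleted += (s, e) =>
        {
            // Runs back on the UI thread, so controls can be touched safely.
            if (e.Error != null)
            {
                MessageBox.Show(e.Error.Message);
                return;
            }
            PopulateControls(e.Result);                       // hypothetical UI update
        };

        worker.RunWorkerAsync();
    }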
I've found that WinForms controls like combo boxes and list boxes load a lot faster when they point to database views instead of the tables themselves, especially if you can make the view leaner than having the control look through an entire table.
Back in the VB days I used the LockWindowUpdate API. Since it takes a window handle, it should also be usable with WinForms, though I have never tried it.
This link has a nice solution; I think a BackgroundWorker process should help:
http://devcomponents.com/blog/?p=361