Silverlight UI Thread Blocking

Can someone please explain how Silverlight divides processing between the UI thread and the other "worker" threads?
I have a scenario where I have to update several hundred complex UI objects in the view via a viewmodel. Each item is backed by its own viewmodel.
If each viewmodel has a property, for example IsSelected, which changes a background color through behaviours, how should I go about making these changes with minimal UI thread blocking?
If I update my (several hundred) viewmodels, it blocks the UI thread for around 4 seconds. How can I determine what's doing the blocking? Are there more efficient ways to update?
Thanks

There are definitely more efficient ways than doing it in one go.
A non-Silverlight specific solution would be to space these updates a few milliseconds apart with DispatcherTimer delayed calls, so the thread has some "breathing space" to carry on with the execution path.
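A minimal sketch of that idea, assuming a flat list of item viewmodels with an IsSelected property (names, batch size and interval are illustrative, not taken from the question):

// Sketch only: spread the viewmodel updates over several DispatcherTimer ticks
// so input and render work can interleave between batches.
var pending = new Queue<ItemViewModel>(allItemViewModels);   // hypothetical viewmodels
var timer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(20) };
timer.Tick += (s, e) =>
{
    // update a small batch per tick; tune the batch size to your scenario
    for (int i = 0; i < 25 && pending.Count > 0; i++)
        pending.Dequeue().IsSelected = true;                 // triggers the behaviour/binding

    if (pending.Count == 0)
        timer.Stop();
};
timer.Start();

Anything that yields back to the Dispatcher between batches gives input and render work a chance to run.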
But you should also give some thought to your architecture: if you're dealing with hundreds of VMs it might be worth using lazy loading and updating your screen sequentially, in order of importance for your audience.
See this answer too for more explanation: https://stackoverflow.com/a/1710868/21217

Related

WPF BindingOperations.EnableCollectionSynchronization + ObservableCollection + Reentrancy

Firstly I am terrible at titles and articulating my problem at hand.
I am producing a file-system like structure. Files and Folders are stored in a local sqlite db, you have File entities and Folder entities.
When the screen first opens, it goes to the local db, fetches the data and pushes it into the ObservableCollection. This is done on a new background thread.
Here's the problem: with 50,000 items, it seems like WPF's new BindingOperations mechanism for cross-thread collection access does some special tricks to queue all the changes to the UI on another thread, which then trickles them onto the Dispatcher.
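For reference, the cross-thread collection setup being described presumably looks something like this (the view model, entity and loader names are assumptions, not from the question):

// Assumed setup: a background task fills the ObservableCollection while WPF
// marshals the resulting CollectionChanged notifications onto the Dispatcher.
private readonly object _itemsLock = new object();
public ObservableCollection<NodeViewModel> Items { get; private set; }

public ExplorerViewModel()
{
    Items = new ObservableCollection<NodeViewModel>();
    // Tell WPF that Items may be changed from any thread,
    // as long as every access is synchronized on _itemsLock.
    BindingOperations.EnableCollectionSynchronization(Items, _itemsLock);
}

public void Reload()
{
    Task.Run(() =>
    {
        foreach (var entity in LoadFromSqlite())          // hypothetical data access
        {
            lock (_itemsLock)
                Items.Add(new NodeViewModel(entity));     // hypothetical wrapper VM
        }
    });
}

One common way to let a reload button abandon the previous pass is to check a CancellationToken inside that loop, so already-queued notifications still drain but no new ones are produced.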
Now obviously I have no control over how it does this. But my question is, how do I handle reentrancy in these scenarios? If the user were to spam-click a reload button that I have, how do I either cancel all these slow, pending, queued UI updates, or at least know how many UI updates are still pending?
The updates also seem rather slow: with 50,000 items it takes at least a minute or two to completely process all items into the list.

Process lots of small tasks and keep the UI responsive

I have a WPF application that needs to do some processing of many small tasks.
These small tasks are all generated at the same time and added to the Dispatcher Queue with a priority of Normal. At the same time a busy indicator is being displayed. The result is that the busy indicator actually freezes despite the work being broken into tasks.
I tried changing the priority of these tasks to be Background to see if that fixed it, but still the busy indicator froze.
I subscribed to the Dispatcher.Hooks.OperationStarted event to see if any render jobs occurred while my tasks were processing but they didn't.
Any ideas what is going on?
Some technical details:
The tasks are actually just messages coming from an Observable sequence, and they are "queued" into the dispatcher by a call to ReactiveUI's ObserveOn(RxApp.MainThreadScheduler) which should be equivalent to ObserveOn(DispatcherScheduler). The work portion of each of these tasks is the code that is subscribing through the ObserveOn call e.g.
IObservable<TaskMessage> incomingTasks;
incomingTasks.ObserveOn(RxApp.MainThreadScheduler).Subscribe(SomeMethodWhichDoesWork);
In this example, incomingTasks would produce maybe 3000+ messages in short succession; the ObserveOn pushes each call to SomeMethodWhichDoesWork onto the Dispatcher queue so that it will be processed later.
The basic problem
The reason you are seeing the busy indicator stall is that your SomeMethodWhichDoesWork is taking too long. While it is running, it prevents any other work from occurring on the Dispatcher.
Input and Render priority operations generated to handle animations are lower than Normal, but higher priority than Background operations. However, operations on the Dispatcher are not interrupted by the enqueuing of higher-priority operations. So a Render operation will have to wait for a running operation to finish, even if that operation is at Background priority.
Caveat regarding observing on the DispatcherScheduler
ObserveOn(DispatcherScheduler) will push everything through at Normal priority by default. More recent versions of Rx have an overload that allows you to specify a priority.
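For example, something along these lines (the exact overload names depend on your Rx version; this assumes the Rx 2.x Dispatcher helpers):

// Queue the UI-bound notifications at Background priority instead of Normal,
// so Render and Input operations are not starved by the backlog.
incomingTasks
    .ObserveOnDispatcher(DispatcherPriority.Background)
    .Subscribe(SomeMethodWhichDoesWork);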
One point to highlight that's often missed is that items will be queued onto the Dispatcher by the DispatcherScheduler as soon as they arrive NOT one after the other.
So if your 3000 items all turn up fairly close together, you will have 3000 operations at Normal priority backed up on the Dispatcher blocking everything of the same or lower priority until they are done - including Render operations. This is almost certainly what you were seeing - and that means you might still see problems even if you do all but the UI update work on a background thread depending on how heavy your UI updates are.
In addition to this, you should check you aren't running the whole subscription on the UI thread - as Lee says. I usually write my code so that I Subscribe on a background thread rather than use SubscribeOn, although this is perfectly fine too.
Recommendations
Whatever you do, do as much work as possible on a background thread. That point has been done to death on StackOverflow, and elsewhere. Here are some good resources covering this:
MSDN Entry on WPF Threading Model
MSDN Magazine "Build More Responsive Apps With The Dispatcher", by Shawn Wildermuth
If you want to keep the UI responsive in the face of lots of small updates you can either:
Schedule items at a lower priority, which is nice and easy - but not so good if you need a certain priority
Store updates in your own queue, and have each operation you run invoke the next item from your queue as its last step (see the sketch after this list).
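A sketch of that second option, reusing the work items from the question (the field and method names here are illustrative):

// Own queue that drains one item per Dispatcher pass; each operation schedules
// the next one as its last step, so other Dispatcher work can interleave.
private readonly Queue<TaskMessage> _pending = new Queue<TaskMessage>();
private bool _pumping;

private void Enqueue(TaskMessage message)
{
    _pending.Enqueue(message);
    if (_pumping) return;
    _pumping = true;
    Dispatcher.CurrentDispatcher.BeginInvoke(DispatcherPriority.Background, new Action(PumpOne));
}

private void PumpOne()
{
    SomeMethodWhichDoesWork(_pending.Dequeue());          // the actual work item
    if (_pending.Count > 0)
        Dispatcher.CurrentDispatcher.BeginInvoke(DispatcherPriority.Background, new Action(PumpOne));
    else
        _pumping = false;
}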
The bigger picture
It's worth stepping back a bit and looking at the bigger picture as well.
If you separately dump 3000 items into the UI in succession, what's that going to do for the user? At best they are going to be running a monitor with a refresh rate of 100Hz, probably lower. I find that frame rates of 10 per second are more than adequate for most purposes.
Not only that, human beings supposedly can't handle more than 5-9 bits of information in one go - so you might find better ways of aggregating and displaying information than updating so many things at once. For example, make use of master/detail views rather than showing everything on screen at once etc. etc.
Another option is to review how much work your UI update is causing. Some controls (I'm looking at you XamDataGrid) can have very lengthy measure/arrange layout operations. Can you simplify your animations? Use a simpler Visual tree? Think about the popular busy spinner that looks like circling dots - but really it's just changing their color. A great effect that is fairly cheap to achieve. It's worth profiling your application to see where time is going.
I would think about the overall approach front-to-back as well. If you are reasonably certain you are going to get that many items to update at once, why not buffer them up and manage them in chunks? That might have advantages all the way back to the source - which perhaps is on a server somewhere? In any case, Rx has some nice operators, like Buffer, that can turn a stream of individual items into larger lists - and it has overloads that can buffer by time and size together.
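For instance, a buffered pipeline might look like this (the interval and batch size are illustrative):

// Turn a flood of individual messages into a few list-sized updates.
// Buffer emits on whichever limit is hit first; empty windows are skipped.
incomingTasks
    .Buffer(TimeSpan.FromMilliseconds(250), 100)
    .Where(batch => batch.Count > 0)
    .ObserveOn(RxApp.MainThreadScheduler)
    .Subscribe(batch =>
    {
        foreach (var message in batch)
            SomeMethodWhichDoesWork(message);
    });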
Have you tried using .SubscribeOn(TaskPoolScheduler.TaskPool) to subscribe on a different thread?
@Pedro Pombeiro has the right answer.
The reason you are seeing the freezes in the UI is that you are queueing the work on the Dispatcher. This means the work will be done on the UI thread. You can think of the Dispatcher as a message pump that is constantly draining messages from each of its queues, one per priority (SystemIdle, ApplicationIdle, ContextIdle, Background, Input, Loaded, Render, DataBind, Normal, Send).
Putting your work onto a different priority queue does not make it run concurrently, just asynchronously.
To run your work on another thread using Rx, use SubscribeOn as above. Remember to then schedule any updates to the UI back onto the Dispatcher with ObserveOn.
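Put together, that advice translates to roughly this (Rx 2.x scheduler names assumed):

// Subscription work happens on a pool thread; notifications that touch
// the UI are marshalled back to the Dispatcher.
incomingTasks
    .SubscribeOn(TaskPoolScheduler.Default)
    .ObserveOn(RxApp.MainThreadScheduler)
    .Subscribe(SomeMethodWhichDoesWork);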

Memory leak and performance issue with WPF's TreeView

I'm running into a problem with the WPF TreeView control.
I think I ran into a memory leak issue with this control and also some performance issues.
I've prepared a simple demo solution where you can see these problems.
Download link: http://www.custom-projects.com/TreeViewMemoryAndPerformanceIssue.zip
I'm creating the tree based on some domain objects. The objects are wrapped in view models.
The number of levels is not restricted, but currently we have a maximum of 3 levels.
So, each view model can have children.
When you click on the up/down buttons of the UpDown control and don't release the mouse button, you will see that the update speed of the int value gets slower and slower while the memory consumption constantly rises.
What I'm doing: when you click on the up/down button, the value is sent to the view model via data binding. In the setter I'm raising an event. Our application consists of different view models, and if someone changes data in one of them, the others are notified through these DataChanged events.
For simplicity, my demo solution just consists of the NavigationViewModel. It listens for the DataChanged event and, when it fires, the tree is rendered.
Because we don't have a list which will always stay the same (with just rows being added or removed), I'm not using an ObservableCollection. We always have to regenerate the list based on the objects the user has added/created.
Anyway, I'm adding these view models to a list and raising the NotifyPropertyChanged event so that WPF updates the tree. This works well, but the more often the list is updated, the slower the application gets (and memory goes up).
I checked that the item view models are garbage collected, and they are, so I don't see anything wrong on my side. I also did some performance profiling. It looks like the issue is on the WPF side, because my code does not slow down. The execution time of the Application.Run method rises... strange.
Does anyone have an idea why the memory keeps going up and never gets released, and why the performance starts to decrease the more often the TreeView updates itself?
I would appreciate any help or comment on this issue.
Thanks,
Christian
I profiled your test application using ANTS Memory Profiler and you can see that your class 'NavigationItemBaseViewModel' and the array 'NavigationItemBaseViewModel[]' are still held in memory by references, and this is getting worse with each increment.
If you slowly increment and allow the update to happen, then the references are broken and objects disposed. All good.
However, if you increment fast/continuously then you see that your references are not broken, thus the arrays are kept in memory.
The increments get slower each time because your application is having to update a lot of these view models; at increment #58 I had 172 arrays holding between them 517 NavigationItemBaseViewModels.
Whereas with "normal" functionality you only have 4 arrays and 13 NavigationItemBaseViewModels.
I would recommend you profile your memory if you cannot figure out where your logic is creating new arrays. Typically it is best to reuse arrays.
Profiler I used is here: http://www.red-gate.com/products/dotnet-development/ants-memory-profiler/index2
Hope that helps.
I was investigating lots of memory leaks in WPF and I find this tool very useful: http://www.jetbrains.com/profiler/ It has a trial period of 10 days (I've just checked), so I hope you will be able to find your problem.

Keeping the UI visually updated while running an expensive operation on the UI thread

In my WPF app, I need to run an expensive operation on my UI thread (let's call it ExpensiveUIOperation()), and I want to keep the UI up to date to track its progress.
To track progress, I simply have a TextBlock whose Text property is bound to an integer dependency property, PercentageComplete. During ExpensiveUIOperation(), I set the value of PercentageComplete as required.
Now, I understand enough about threading to know that if I simply ran ExpensiveUIOperation() on my UI thread, the TextBlock would not appear to keep up to date, as the UI thread would be blocked, stopping any interface updates.
And so I thought I could do it asynchronously like this:
Dispatcher.BeginInvoke(new Action(ExpensiveUIOperation), DispatcherPriority.Background);
But that is still not working. The text block is not visually updated until the operation completes.
Is there a way to do this?
Unfortunately in this situation I cannot use a background thread, as the operation makes heavy use of objects owned by the UI thread.
Unfortunately in this situation I cannot use a background thread, as the operation makes heavy use of objects owned by the UI thread.
That is not a good enough reason to abuse the UI thread like this. Use the Dispatcher when accessing those elements (see the threading model reference), or properly bind your view to the relevant properties and you will not even need to do that, as updates are queued to the UI internally.
You're stuck: UI operations have to occur on the UI thread, and while they run no UI updates will happen. You could do the equivalent of an Application.DoEvents in WPF by creating a new dispatcher frame (http://dedjo.blogspot.com/2007/08/how-to-doevents-in-wpf.html), but it is dangerous - you will catch the UI in the middle of updates, and it is not a good thing to do.
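For completeness, the DoEvents-style trick from the linked post is roughly this (use it with the caveats above in mind):

// WPF equivalent of Application.DoEvents(): pump the Dispatcher until the
// queued Background-priority no-op runs, letting pending render/input through.
public static void DoEvents()
{
    var frame = new DispatcherFrame();
    Dispatcher.CurrentDispatcher.BeginInvoke(
        DispatcherPriority.Background,
        new Action(() => frame.Continue = false));
    Dispatcher.PushFrame(frame);
}

ExpensiveUIOperation() would then call DoEvents() after each PercentageComplete update, but as noted, re-entering the message loop like this can bite you.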
Is the expensive UI operation really a CPU-intensive, UI-only operation? Nothing that can be done with a View Model object graph and then finally bound to the UI, for example?

WinForms multi-threaded databinding scenario, best practice?

I'm currently designing/reworking the databinding part of an application that makes heavy use of winforms databinding and updates coming from a background thread (once a second on > 100 records).
Let's assume the application is a stock trading application, where a background thread monitors for data changes and putting them onto the data objects. These objects are stored in a BindingList<> and implement INotifyPropertyChanged to propagate the changes via databinding to the winforms controls.
Additionally, the data objects are currently marshalling the changes to the UI thread via WindowsFormsSynchronizationContext.Send.
The user is able to enter some of the values in the UI, which means that some values can be changed from both sides, and the user's values shouldn't be overwritten by updates.
So there are several question coming to my mind:
Is there a general design guideline for how to do this (background updates in databinding)?
When and how to marshal on the UI thread?
What is the best way for the background thread to interact with binding/data objects?
Which classes/Interfaces should be used? (BindingSource, ...)
...
The UI doesn't really know that there is a background thread that updates the control, and to my understanding in databinding scenarios the UI shouldn't know where the data is coming from... You can think of the background thread as something that pushes data to the UI, so I'm not sure if BackgroundWorker is the option I'm looking for.
Sometimes you want to get some UI response during an operation in the data-/business object (e.g. setting the background during recalculations). Raising a PropertyChanged on a status property which is bound to the background isn't enough, as the control only gets repainted after the calculation has finished. My idea would be to hook the PropertyChanged event and call .Update() on the control...
Any other ideas about that?
This is a hard problem since most “solutions” lead to lots of custom code and lots of calls to BeginInvoke() or System.ComponentModel.BackgroundWorker (which itself is just a thin wrapper over BeginInvoke).
In the past, I've also found that you soon wish to delay sending your INotifyPropertyChanged events until the data is stable. The code that handles one property-changed event often needs to read other properties. You also often have a control that needs to redraw itself whenever the state of one of many properties changes, and you don't want the control to redraw itself too often.
Firstly, each custom WinForms control should read all the data it needs to paint itself in the PropertyChanged event handler, so it does not need to lock any data objects when it receives a WM_PAINT (OnPaint) message. The control should not immediately repaint itself when it gets new data; instead, it should call Control.Invalidate(). Windows will combine the WM_PAINT messages into as few requests as possible and only send them when the UI thread has nothing else to do. This minimizes the number of redraws and the time the data objects are locked. (Standard controls mostly do this with data binding anyway.)
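A small sketch of that pattern for a custom WinForms control (the view model type and its properties are assumptions):

// Copy what the paint needs in the change handler, invalidate, and paint
// only from that copy - never from the shared data objects.
class QuoteControl : Control
{
    private string _latestText = string.Empty;        // snapshot used by OnPaint

    public void Attach(QuoteViewModel quote)          // hypothetical bound data object
    {
        quote.PropertyChanged += (s, e) =>
        {
            _latestText = quote.DisplayText;          // read while handling the event
            Invalidate();                             // queue WM_PAINT; Windows coalesces these
        };
    }

    protected override void OnPaint(PaintEventArgs e)
    {
        // no locks and no reads of the shared data objects here
        e.Graphics.DrawString(_latestText, Font, Brushes.Black, Point.Empty);
    }
}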
The data objects need to record what has changed as the changes are made, then once a set of changes has been completed, “kick” the UI thread into calling the SendChangeEvents method that then calls the PropertyChanged event handler (on the UI thread) for all properties that have changed. While the SendChangeEvents() method is running, the data objects must be locked to stop the background thread(s) from updating them.
The UI thread can be “kicked” with a call to BeginInvoke whenever a set of updates has been read from the database. Often it is better to have the UI thread poll using a timer, as Windows only sends the WM_TIMER message when the UI message queue is empty, which makes the UI feel more responsive.
Also consider not using data binding at all, and having the UI ask each data object “what has changed” each time the timer fires. Databinding always looks nice, but can quickly become part of the problem rather than part of the solution.
As locking/unlocking of the data objects is a pain and may not allow the updates to be read from the database fast enough, you may wish to pass the UI thread a (virtual) copy of the data objects. Making the data objects persistent/immutable, so that any change returns a new data object rather than mutating the current one, enables this.
Persistent objects sound very slow, but need not be, see this and that for some pointers. Also look at this and that on Stack Overflow.
Also have a look at retlang - Message-based concurrency in .NET. Its message batching may be useful.
(For WPF, I would have a View-Model that sits in the UI thread and is then updated in ‘batches’ from the multi-threaded model by the background thread. However, WPF is a lot better at combining data binding events than WinForms.)
Yes, all the books show threaded structures and invokes etc., which is perfectly correct, but it can be a pain to code and is often hard to organise so you can write decent tests for it.
A UI only needs to be refreshed so many times a second, so performance is never an issue, and polling will work fine
I like to use an object graph that is continuously updated by a pool of background threads. They check for actual changes in data values, and when they notice one they update a version counter on the root of the object graph (or on each main item, whatever makes more sense) and update the values.
Then your foreground process can have a timer (which runs on the UI thread by default) that fires once a second or so and checks the version counter; if it has changed, it locks the graph (to stop partial updates) and refreshes the display.
This simple technique totally isolates the UI thread from the background threads
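In code, the polling side of that technique looks roughly like this (the graph type, Version counter and SyncRoot are assumed names; the background threads bump graph.Version after each real change):

// Poll the shared object graph from a WinForms timer and only refresh
// the display when the version counter has moved.
private long _renderedVersion;
private readonly System.Windows.Forms.Timer _pollTimer = new System.Windows.Forms.Timer { Interval = 1000 };

private void StartPolling(ObjectGraph graph)
{
    _pollTimer.Tick += (s, e) =>
    {
        long current = graph.Version;                 // hypothetical thread-safe counter
        if (current == _renderedVersion)
            return;                                   // nothing changed, skip the refresh

        lock (graph.SyncRoot)                         // block writers while reading a consistent view
            RefreshDisplay(graph);                    // hypothetical UI refresh

        _renderedVersion = current;
    };
    _pollTimer.Start();
}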
There is an MSDN article specifically on that topic. But be prepared to look at VB.NET. ;)
Additionally, maybe you could use System.ComponentModel.BackgroundWorker instead of a generic second thread, since it nicely formalizes the kind of interaction with the spawned background thread you are describing. The example given in the MSDN library is pretty decent, so go look at it for a hint on how to use it.
Edit:
Please note: no marshalling is required if you use the ProgressChanged event to communicate back to the UI thread. The background thread calls ReportProgress whenever it needs to communicate with the UI. Since it is possible to attach any object to that event, there is no reason to do manual marshalling. The progress is communicated via another async operation, so you need to worry neither about how fast the UI can handle the progress events nor about the background thread being interrupted by waiting for the event to finish.
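As a sketch (the payload type and the handler methods are assumptions):

// Background work reports each update through ProgressChanged, which the
// BackgroundWorker raises on the UI thread for us.
var worker = new BackgroundWorker { WorkerReportsProgress = true };

worker.DoWork += (s, e) =>
{
    foreach (var update in MonitorQuotes())           // hypothetical source of data changes
        worker.ReportProgress(0, update);             // payload travels in UserState
};

worker.ProgressChanged += (s, e) =>
{
    ApplyUpdate((QuoteUpdate)e.UserState);            // hypothetical method touching the bound objects
};

worker.RunWorkerAsync();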
If you find that the background thread is raising the progress changed event way too fast, then you might want to look at “Pull vs. Push models for UI updates”, an excellent article by Ayende.
I just fought a similar situation - a background thread updating the UI via BeginInvoke. The background loop has a delay of 10ms on every iteration, but down the road I ran into problems where the UI updates, which sometimes fire on every iteration, couldn't keep up with the frequency of updates, and the app effectively stopped working (not sure what happened - blew a stack?).
I wound up adding a flag to the object passed over the invoke, which was just a ready flag. I'd set it to false before calling the invoke, and the bg thread would do no more UI updates until the flag was toggled back to true. The UI thread would do its screen updates etc. and then set the flag back to true.
This allowed the bg thread to keep crunching, but let the UI shut off the flow until it was ready for more.
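In code, that throttle looks roughly like this (the form reference, snapshot and render methods are illustrative):

// Ready flag: the background thread only posts a UI update when the UI has
// finished processing the previous one; otherwise it keeps crunching.
private volatile bool _uiReady = true;

private void OnBackgroundLoopIteration()
{
    if (!_uiReady) return;                            // UI still busy, skip this update

    _uiReady = false;
    var snapshot = CaptureState();                    // hypothetical copy of the data to show
    _form.BeginInvoke(new Action(() =>
    {
        RenderSnapshot(snapshot);                     // hypothetical screen update on the UI thread
        _uiReady = true;                              // allow the next post
    }));
}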
Create a new UserControl, add your control, format it (maybe Dock = Fill) and add a property.
Now configure the property to invoke the UserControl and update your element - then you can change the property from any thread you want!
That's my solution:
private long value;

public long Value
{
    get { return this.value; }
    set
    {
        this.value = value;
        UpdateTextBox();
    }
}

private delegate void Delegate();

private void UpdateTextBox()
{
    if (this.InvokeRequired)
    {
        this.Invoke(new Delegate(UpdateTextBox), new object[] {});
    }
    else
    {
        textBox1.Text = this.value.ToString();
    }
}
On my form I bind my view:
viewTx.DataBindings.Add(new Binding("Value", ptx.CounterTX, "ReturnValue"));
This is a problem that I solved in Update Controls. I bring this up not to suggest you rewrite your code, but to give you some source to look at for ideas.
The technique that I used in WPF was to use Dispatcher.BeginInvoke to notify the foreground thread of a change. You can do the same thing in Winforms with Control.BeginInvoke. Unfortunately, you have to pass a reference to a Form object into your data object.
Once you do, you can pass an Action into BeginInvoke that fires PropertyChanged. For example:
_form.BeginInvoke(new Action(() => NotifyPropertyChanged(propertyName)));
You will need to lock the properties in your data object to make them thread-safe.
This post is old, but I thought I'd give options to others. It seems that once you start doing async programming with Windows Forms databinding, you end up with problems updating a BindingSource's DataSource or updating lists bound to Windows Forms controls. I am going to try using Jeffrey Richter's AsyncEnumerator class from his Power Threading tools on Wintellect.
Reasons:
1. His AsyncEnumerator class automatically marshals background threads to the UI thread, so you can update controls as you would in synchronous code.
2. AsyncEnumerator simplifies async programming. It does this automatically, so you write your code in a synchronous fashion, but it still runs asynchronously.
Jeffrey Richter has a video on Channel 9 (MSDN) that explains AsyncEnumerator.
Wish me luck.
-R
I am late to the party but I believe this is still a valid question.
I would advise you to avoid using data binding at all and use Observable objects instead.
The reason is that data binding looks cool and the code reads well once implemented, but data binding fails miserably when there is a lot of asynchronous UI updating or multi-threading, as in your case.
I have personally experienced this problem with asynchronous updates and databinding in production; we didn't even catch it in testing - once users started exercising all the different scenarios, things began to break down.
