WPF Dispatcher not releasing object for GC

I'm trying to track down some memory leaks in my application and, according to the ANTS profiler, many of my objects are being held up by System.Windows.Threading.Dispatcher. My application is basically single threaded and the only explicit calls I make to Dispatcher.Invoke are unrelated to the objects being held up. The objects do seem to all be UserControl children of a FixedDocument subclass of mine, if that means anything to anyone.
What is causing the dispatcher to not release my objects?

There is an operation scheduled on the dispatcher (e.g., with BeginInvoke), and one of the arguments to that operation references your ReportVisualTable, either directly or indirectly.
Looking at the types included in your retention graph, it looks like a DocumentViewer attempted to bring a page into view, but the page hadn't been loaded yet, so the operation was deferred on the dispatcher. The operation was enqueued at priority level Inactive, which means it will just sit in the queue indefinitely, because that priority level is never processed. When the requested page is loaded, the operation's priority is bumped up to Background, but if that never happens, it seems the operation will just stay inactive.
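To illustrate the mechanism (this is only a sketch, not DocumentViewer's actual code; BringPageIntoView and pageNumber are made-up names), a dispatcher operation can be parked at Inactive priority and later promoted:

// Enqueue work that will never run until its priority is raised.
DispatcherOperation op = Dispatcher.CurrentDispatcher.BeginInvoke(
    DispatcherPriority.Inactive,                     // Inactive items are never processed
    new Action(() => BringPageIntoView(pageNumber)));

// Until something bumps the priority, the operation - and everything its delegate
// captures, such as the page/document objects - stays rooted by the dispatcher queue.
// The viewer would normally do this once the page has loaded:
op.Priority = DispatcherPriority.Background;

If that promotion never happens, the delegate and its captured objects stay reachable from the dispatcher, which matches what the profiler is showing.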

Related

Process lots of small tasks and keep the UI responsive

I have a WPF application that needs to do some processing of many small tasks.
These small tasks are all generated at the same time and added to the Dispatcher Queue with a priority of Normal. At the same time a busy indicator is being displayed. The result is that the busy indicator actually freezes despite the work being broken into tasks.
I tried changing the priority of these tasks to be Background to see if that fixed it, but still the busy indicator froze.
I subscribed to the Dispatcher.Hooks.OperationStarted event to see if any render jobs occurred while my tasks were processing but they didn't.
Any ideas what is going on?
Some technical details:
The tasks are actually just messages coming from an Observable sequence, and they are "queued" into the dispatcher by a call to ReactiveUI's ObserveOn(RxApp.MainThreadScheduler) which should be equivalent to ObserveOn(DispatcherScheduler). The work portion of each of these tasks is the code that is subscribing through the ObserveOn call e.g.
IObservable<TaskMessage> incomingTasks;
incomingTasks.ObserveOn(RxApp.MainThreadScheduler).Subscribe(SomeMethodWhichDoesWork);
In this example, incomingTasks might produce 3000+ messages in short succession; ObserveOn pushes each call to SomeMethodWhichDoesWork onto the Dispatcher queue so that it will be processed later.
The basic problem
The reason you are seeing the busy indicator stall is that your SomeMethodWhichDoesWork is taking too long. While it is running, it prevents any other work from occurring on the Dispatcher.
The Input and Render priority operations generated to handle animations are lower priority than Normal, but higher priority than Background operations. However, an operation already running on the Dispatcher is not interrupted when higher-priority operations are enqueued. So a Render operation still has to wait for whatever is currently running, even if that is a Background operation.
Caveat regarding observing on the DispatcherScheduler
ObserveOn(DispatcherScheduler) will push everything through at Normal priority by default. More recent versions of Rx have an overload that allows you to specify a priority.
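If your Rx build exposes the priority overloads (check System.Reactive.Windows.Threading in your version), something along these lines lets the queued work yield to input and render operations:

// Assumes the subscription is set up in a WPF app; priority overload per newer Rx builds.
incomingTasks
    .ObserveOn(Application.Current.Dispatcher, DispatcherPriority.Background)
    .Subscribe(SomeMethodWhichDoesWork);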
One point to highlight that's often missed is that items will be queued onto the Dispatcher by the DispatcherScheduler as soon as they arrive NOT one after the other.
So if your 3000 items all turn up fairly close together, you will have 3000 operations at Normal priority backed up on the Dispatcher blocking everything of the same or lower priority until they are done - including Render operations. This is almost certainly what you were seeing - and that means you might still see problems even if you do all but the UI update work on a background thread depending on how heavy your UI updates are.
In addition to this, you should check you aren't running the whole subscription on the UI thread - as Lee says. I usually write my code so that I Subscribe on a background thread rather than use SubscribeOn, although this is perfectly fine too.
Recommendations
Whatever you do, do as much work as possible on a background thread. That point has been done to death on StackOverflow, and elsewhere. Here are some good resources covering this:
MSDN Entry on WPF Threading Model
MSDN Magazine "Build More Responsive Apps With The Dispatcher", by Shawn Wildermuth
If you want to keep the UI responsive in the face of lots of small updates you can either:
Schedule items at a lower priority, which is nice and easy - but not so good if you need a certain priority
Store updates in your own queue and have each operation you run invoke the next item from your queue as its last step (see the sketch after this list).
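A rough sketch of that second option, assuming you drain your own queue one item per dispatcher operation; _pending, _uiDispatcher, TaskMessage and ProcessItem are placeholder names, not anything from the question:

// Producer side (any thread): add to your own queue, then make sure the pump runs.
private readonly ConcurrentQueue<TaskMessage> _pending = new ConcurrentQueue<TaskMessage>();
private readonly Dispatcher _uiDispatcher = Application.Current.Dispatcher;

public void Enqueue(TaskMessage message)
{
    _pending.Enqueue(message);
    _uiDispatcher.BeginInvoke(DispatcherPriority.Background, new Action(PumpOne));
}

// Consumer side (UI thread): handle ONE item, then schedule the next as the last step,
// so render and input operations can interleave between items.
private void PumpOne()
{
    TaskMessage item;
    if (!_pending.TryDequeue(out item))
        return;                        // a spare wake-up found nothing to do; harmless

    ProcessItem(item);                 // one small unit of work

    if (!_pending.IsEmpty)
        _uiDispatcher.BeginInvoke(DispatcherPriority.Background, new Action(PumpOne));
}

The advantage over dumping everything on the dispatcher at once is that you own the queue, so you have a single place to throttle, batch or drop items if they arrive faster than the UI can usefully show them.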
The bigger picture
It's worth stepping back a bit and looking at the bigger picture as well.
If you separately dump 3000 items into the UI in succession, what's that going to do for the user? At best they are going to be running a monitor with a refresh rate of 100Hz, probably lower. I find that frame rates of 10 per second are more than adequate for most purposes.
Not only that, human beings supposedly can't handle more than 5-9 pieces of information in one go - so you might find better ways of aggregating and displaying information than updating so many things at once. For example, make use of master/detail views rather than showing everything on screen at once, etc.
Another option is to review how much work your UI update is causing. Some controls (I'm looking at you XamDataGrid) can have very lengthy measure/arrange layout operations. Can you simplify your animations? Use a simpler Visual tree? Think about the popular busy spinner that looks like circling dots - but really it's just changing their color. A great effect that is fairly cheap to achieve. It's worth profiling your application to see where time is going.
I would think about the overall approach front-to-back as well. If you are reasonably certain you are going to get that many items to update at once, why not buffer them up and manage them in chunks? That might have advantages all the way back to the source - which perhaps is on a server somewhere? In any case, Rx has some nice operators, like Buffer, that can turn a stream of individual items into a stream of larger lists - and it has overloads that can buffer by time and size together.
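For example, a hedged sketch of chunking with Buffer, reusing the names from the question (the 250 ms / 100-item limits are arbitrary):

// Flush whichever comes first: every 250 ms or every 100 items.
incomingTasks
    .Buffer(TimeSpan.FromMilliseconds(250), 100)
    .Where(batch => batch.Count > 0)
    .ObserveOn(RxApp.MainThreadScheduler)
    .Subscribe(batch =>
    {
        foreach (var msg in batch)
            SomeMethodWhichDoesWork(msg);   // still keep each unit of work cheap
    });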
Have you tried using .SubscribeOn(TaskPoolScheduler.TaskPool) to subscribe on a different thread?
@Pedro Pombeiro has the right answer.
The reason you are seeing the freezes on the UI is that you are queueing the work on the Dispatcher. This means the work will be done on the UI thread. You can think of the Dispatcher as a message pump that is constantly draining messages from each of its queues, one queue per priority level (SystemIdle, ApplicationIdle, ContextIdle, Background, Input, Loaded, Render, DataBind, Normal, Send).
Putting your work onto a different priority queue does not make it run concurrently, just asynchronously.
To run your work on another thread using Rx, use SubscribeOn as above. Remember to then schedule any updates to the UI back onto the Dispatcher with ObserveOn.
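As a variation (ComputeResult and ApplyToUi are placeholder names), note that for a hot source such as an incoming message stream it is usually ObserveOn, rather than SubscribeOn, that moves the per-message work onto another scheduler; a second ObserveOnDispatcher then brings only the cheap UI update back:

incomingTasks
    .ObserveOn(TaskPoolScheduler.Default)      // heavy per-message work off the UI thread
    .Select(msg => ComputeResult(msg))
    .ObserveOnDispatcher()                     // marshal only the UI update back
    .Subscribe(result => ApplyToUi(result));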

Keeping the UI visually updated while running an expensive operation on the UI thread

In my WPF app, I need to run an expensive operation on my UI thread (let's call it ExpensiveUIOperation()), and I want to keep the UI up to date to track its progress.
To track progress, I simply have a TextBlock, whose Text property is bound to an integer dependency property PercentageComplete. During ExpensiveUIOperation(), I simply set the value of PercentageComplete as required.
Now, I understand enough about threading to know that if I simply ran ExpensiveUIOperation() on my UI thread, the TextBlock would not appear to stay up to date, as the UI thread would be blocked, stopping any interface updates.
And so I thought I could do it asynchronously like this:
Dispatcher.BeginInvoke(new Action(ExpensiveUIOperation), DispatcherPriority.Background);
But that is still not working. The text block is not visually updated until the operation completes.
Is there a way to do this?
Unfortunately in this situation I cannot use a background thread, as the operation makes heavy use of objects owned by the UI thread.
Unfortunately in this situation I cannot use a background thread, as the operation makes heavy use of objects owned by the UI thread.
That is not a good enough reason to abuse the UI thread like this. Use the Dispatcher when accessing those elements (see the threading model reference), or properly bind your view to the relevant properties and you will not even need to do that, as updates are queued to the UI internally.
You're stuck: UI operations have to occur on the UI thread, and while your operation is running no UI updates will happen. You could do the equivalent of an Application.DoEvents in WPF by creating a new dispatcher frame (http://dedjo.blogspot.com/2007/08/how-to-doevents-in-wpf.html), but it is dangerous: you can catch the UI in the middle of updates, so it is not a good thing to do.
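For reference, the DoEvents-style workaround from that post looks roughly like this; treat it as a sketch of the technique rather than a recommendation:

public static void DoEvents()
{
    var frame = new DispatcherFrame();
    Dispatcher.CurrentDispatcher.BeginInvoke(
        DispatcherPriority.Background,
        new DispatcherOperationCallback(f =>
        {
            ((DispatcherFrame)f).Continue = false;   // exit the nested frame
            return null;
        }),
        frame);
    Dispatcher.PushFrame(frame);                     // pump pending messages, then return
}

Inside ExpensiveUIOperation() you would call DoEvents() periodically so the bound TextBlock gets a chance to repaint, accepting the re-entrancy risk described above.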
Is the expensive UI operation really a CPU-intensive, UI-only operation? Nothing that can be done with a View Model object graph and then finally bound to the UI, for example?

OData on Silverlight runs on UI thread only

We're working with OData on Silverlight, using DataServiceCollection to get the data.
All calls for fetching data (LoadAsync(), LoadNextPartialSetAsync()) are done on a worker thread. However, the 'LoadCompleted' callback, as well as deserialization and object materialization, are done on the UI thread.
We decompiled the System.Data.Services.Client.DLL where the DataServiceCollection is, and saw that indeed all code handling the response of the OData is dispatched to the UI thread.
Is there any way to get the deserialization to be called on a worker thread instead?
thanks
Yaron
Well...
It seems the OData collection deliberately moves processing to the UI thread. I'm guessing this is done because previously loaded objects might have properties the UI is bound to, and these properties might change when loading additional data.
Using the query itself, I was able to get the response on a worker thread. However, doing this means one MUST detach the objects from the OData context (or clone them) if the UI is bound to any property. Otherwise subsequent queries might raise property-changed events while objects are materializing on the worker thread.
The problem you have is that DataServiceCollection<T> derives from ObservableCollection<T>, which in turn is designed to be bound to UI elements. When changes are made to the membership of an ObservableCollection<T>, any binding expression observing it is notified. The binding expression then attempts to update the target UI element, and if the notification arrives on a non-UI thread, an exception occurs.
Hence the DataServiceCollection<T> deliberately shifts materialisation to the UI Thread so that as items appear in the collection the resulting change notifications do not result in an exception. If this behaviour is unacceptable to you then DataServiceCollection<T> is not for you.
Instead execute the query yourself via DataServiceQuery<T>.BeginExecute. The callback you pass to BeginExecute will execute on a worker thread (at least it will when the Client HTTP stack is being used; I have yet to establish what happens when XmlHttp is being used). Here you can enumerate the results and place them in whatever collection type you prefer. You can then switch to the UI thread when you are ready to display the results.
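A rough Silverlight-flavoured sketch of that approach; MyEntity, context, MyEntities and viewModel are placeholder names, and the HttpStack line assumes your client library version exposes that property:

// Optionally force the Client HTTP stack so the callback runs off the UI thread.
// context.HttpStack = HttpStack.ClientHttp;

var query = (DataServiceQuery<MyEntity>)context.MyEntities.Where(e => e.IsActive);

query.BeginExecute(ar =>
{
    // With the Client HTTP stack this callback typically runs on a worker thread,
    // and enumerating here is where materialization happens.
    var results = query.EndExecute(ar).ToList();

    // Switch back to the UI thread only to publish the finished list.
    Deployment.Current.Dispatcher.BeginInvoke(() => viewModel.Items = results);
}, null);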
The callback will always be called on the UI thread.
If the request was using the XmlHttp stack (which is the default if you call it from a UI thread), then the networking stack invokes the callback registered by WCF Data Services on the UI thread. So in this case it's not a behavior of the DataServiceCollection/DataServiceContext but a behavior of the underlying network stack.
If you call the request from a non-UI thread or you explicitly set the HTTP stack to Client, then the callback will come back on a non-UI thread (potentially a different one each time). We still move it back to the UI thread before letting the caller know. The reason for this is consistency, especially since you can't interact with UI elements on background threads.
If you execute the query manually, for example through DataServiceContext.BeginExecute, then the materialization (or most of it anyway) is driven by the caller, since the call returns just an IEnumerable which is not yet populated. If you then transfer execution to a worker thread and enumerate the results there, the materialization will happen on that thread.
Just curious, why do you want to move it? Are you processing so much data, that it causes a visible UI lag?

System.Timers.Timer leaking due to "direct delegate roots"

Apologies for the rather verbose and long-winded post, but this problem's been perplexing me for a few weeks now so I'm posting as much information as I can in order to get this resolved quickly.
We have a WPF UserControl which is being loaded by a 3rd party app. The 3rd party app is a presentation application which loads and unloads controls on a schedule defined by an XML file which is downloaded from a server.
Our control, when it is loaded into the application makes a web request to a web service and uses the data from the response to display some information. We're using an MVVM architecture for the control. The entry point of the control is a method that is implementing an interface exposed by the main app and this is where the control's configuration is set up. This is also where I set the DataContext of our control to our MainViewModel.
The MainViewModel has two other view models as properties and the main UserControl has two child controls. Depending on the data received from the web service, the main UserControl decides which child control to display, e.g. if there is an HTTP error or the data received is not valid, then display child control A; otherwise display child control B. As you'd expect, these two child controls bind to two separate view models, each of which is a property of MainViewModel.
Now child control B (which is displayed when the data is valid) has a RefreshService property/field. RefreshService is an object that is responsible for updating the model in a number of ways and contains 4 System.Timers.Timers:
a _modelRefreshTimer
a _viewRefreshTimer
a _pageSwitchTimer
a _retryFeedRetrievalOnErrorTimer (this is only enabled when something goes wrong with retrieving data).
I should mention at this point that there are two types of data; the first changes every minute, the second changes every few hours. The controls' configuration decides which type we are using/displaying.
If data is of the first type then we update the model quite frequently (every 30 seconds) using the _modelRefreshTimer's events.
If the data is of the second type then we update the model after a longer interval. However, the view still needs to be refreshed every 30 seconds as stale data needs to be removed from the view (hence the _viewRefreshTimer).
The control also paginates the data so we can see more than we can fit on the display area. This works by breaking the data up into Lists and switching the CurrentPage (which is a List) property of the view model to the right List. This is done by handling the _pageSwitchTimer's Elapsed event.
Now the problem
My problem is that the control, when removed from the visual tree, doesn't dispose of its timers. This was first noticed when we started getting an unusually high number of requests on the web server end very soon after deploying this control and found that requests were being made at least once a second! We found that the timers were living on and not stopping hours after the control had been removed from view, and that the more timers there were, the more requests piled up at the web server.
My first solution was to implement IDisposable on the RefreshService and do some clean-up when the control's Unloaded event was fired. Within the RefreshService's Dispose method I set Enabled to false for all the timers, then called Stop() on all of them. I then called Dispose() too and set them to null. None of this worked.
After some reading around I found that event handlers may hold references to Timers and prevent them from being disposed and collected. After some more reading and researching I found that the best way around this was to use the Weak Event Pattern. Using this blog and this blog I've managed to work around the shortcomings in the Weak Event pattern.
However, none of this solves the problem. Timers are still not being disabled or stopped (let alone disposed) and web requests are continuing to build up. Mem Profiler tells me that "This type has N instances that are directly rooted by a delegate. This can indicate the delegate has not been properly removed" (where N is the number of instances). As far as I can tell though, all listeners of the Elapsed event for the timers are being removed during the cleanup so I can't understand why the timers continue to run.
Thanks for reading. Eagerly awaiting your suggestions/comments/solutions (if you got this far :-p)
Have you tried using the DispatcherTimer instead of System.Timers.Timer?
If a Timer is used in a WPF application, it is worth noting that the Timer runs on a different thread than the user interface (UI) thread. In order to access objects on the UI thread, it is necessary to post the operation onto the Dispatcher of the UI thread using Invoke or BeginInvoke. Reasons for using a DispatcherTimer as opposed to a Timer are that the DispatcherTimer runs on the same thread as the Dispatcher and a DispatcherPriority can be set. (from this blog)
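For comparison, a hedged sketch of the DispatcherTimer approach; the 30-second interval and RefreshView are placeholders for your own refresh logic:

// Ticks on the dispatcher (UI) thread, so the handler can touch controls directly.
var timer = new DispatcherTimer(DispatcherPriority.Background)
{
    Interval = TimeSpan.FromSeconds(30)
};
timer.Tick += (s, e) => RefreshView();
timer.Start();

// Stopping it in the control's Unloaded handler is enough; there is no cross-thread
// signal queue to drain, unlike System.Timers.Timer.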
It sounds to me like the timer signals are being queued faster than they are processed. For example, this would happen if each elapse event takes 2 seconds to process while the timer elapses every second. This would result in a backlog of "raise Elapse event now" signals being scheduled on the .NET thread pool.
Such a backlog would then continue to cause elapse events even after you stop the timer. From the System.Timers.Timer documentation:
Even if SynchronizingObject is not null, Elapsed events can occur after the Dispose or Stop method has been called or after the Enabled property is set to false, because the signal to raise the Elapsed event is always queued for execution on a thread pool thread. One way to resolve this race condition is to set a flag that tells the event handler for the Elapsed event to ignore subsequent events.
To avoid a backlog of unprocessed timer signals building up in the thread pool, you can set AutoReset to false. That way the timer disables itself at each elapse. You can then set Enabled back to true after handling the elapse event. That way no additional elapse events will be scheduled until the last one has been handled.
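A minimal sketch of that one-elapse-at-a-time pattern; DoRefreshWork and _disposed are placeholders for your own refresh logic and shutdown flag:

// AutoReset = false means the timer disables itself after each elapse.
var timer = new System.Timers.Timer(1000) { AutoReset = false };
timer.Elapsed += (sender, e) =>
{
    try
    {
        DoRefreshWork();               // the actual refresh / web request
    }
    finally
    {
        // Re-arm only if the control hasn't been unloaded/disposed in the meantime.
        if (!_disposed)
            timer.Start();
    }
};
timer.Start();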
Call timer.Elapsed -= new ElapsedEventHandler(YourElapsedHandler); (unsubscribing the same handler method you originally attached) when your control is disposed of, to manually detach the handler.

WinForms multi-threaded databinding scenario, best practice?

I'm currently designing/reworking the databinding part of an application that makes heavy use of winforms databinding and updates coming from a background thread (once a second on > 100 records).
Let's assume the application is a stock trading application, where a background thread monitors for data changes and putting them onto the data objects. These objects are stored in a BindingList<> and implement INotifyPropertyChanged to propagate the changes via databinding to the winforms controls.
Additionally, the data objects currently marshal the changes to the UI thread via WindowsFormsSynchronizationContext.Send.
The user is able to enter some of the values in the UI, which means that some values can be changed from both sides, and the user's values shouldn't be overwritten by updates.
So there are several questions coming to mind:
Is there a general design guideline for how to do that (background updates in data binding)?
When and how to marshal on the UI thread?
What is the best way for the background thread to interact with binding/data objects?
Which classes/Interfaces should be used? (BindingSource, ...)
...
The UI doesn't really know that there is a background thread that updates the control, and as I understand it, in data-binding scenarios the UI shouldn't know where the data is coming from... You can think of the background thread as something that pushes data to the UI, so I'm not sure if BackgroundWorker is the option I'm searching for.
Sometimes you want to get some UI response during an operation in the data/business object (e.g. setting the background during recalculations). Raising a PropertyChanged on a status property which is bound to the background isn't enough, as the control only gets repainted after the calculation has finished. My idea would be to hook the PropertyChanged event and call .Update() on the control...
Any other ideas about that?
This is a hard problem since most “solutions” lead to lots of custom code and lots of calls to BeginInvoke() or System.ComponentModel.BackgroundWorker (which itself is just a thin wrapper over BeginInvoke).
In the past, I've also found that you soon wish to delay sending your INotifyPropertyChanged events until the data is stable. The code that handles one property-changed event often needs to read other properties. You also often have a control that needs to redraw itself whenever the state of one of many properties changes, and you don't want the control to redraw itself too often.
Firstly, each custom WinForms control should read all the data it needs to paint itself in the PropertyChanged event handler, so it does not need to lock any data objects when it handles a WM_PAINT (OnPaint) message. The control should not immediately repaint itself when it gets new data; instead, it should call Control.Invalidate(). Windows will combine the WM_PAINT messages into as few requests as possible and only send them when the UI thread has nothing else to do. This minimizes the number of redraws and the time the data objects are locked. (Standard controls mostly do this with data binding anyway.)
The data objects need to record what has changed as the changes are made, then once a set of changes has been completed, “kick” the UI thread into calling the SendChangeEvents method that then calls the PropertyChanged event handler (on the UI thread) for all properties that have changed. While the SendChangeEvents() method is running, the data objects must be locked to stop the background thread(s) from updating them.
The UI thread can be “kicked” with a call to BeginInvoke whenever a set of updates has been read from the database. Often it is better to have the UI thread poll using a timer, as Windows only sends the WM_TIMER message when the UI message queue is empty, leading to the UI feeling more responsive.
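A loose sketch of the "record changes, then kick the UI thread" idea; _form, _sync, _pendingChanges and the PropertyChanged event are placeholders standing in for your own data-object plumbing:

private readonly object _sync = new object();
private readonly HashSet<string> _pendingChanges = new HashSet<string>();

// Background thread: record what changed, then kick the UI thread once per batch.
private void CommitBatch(IEnumerable<string> changedProperties)
{
    lock (_sync)
    {
        foreach (var name in changedProperties)
            _pendingChanges.Add(name);
    }
    _form.BeginInvoke(new Action(SendChangeEvents));
}

// UI thread: raise PropertyChanged once per changed property while holding the lock,
// so handlers read a consistent snapshot.
private void SendChangeEvents()
{
    lock (_sync)
    {
        var handler = PropertyChanged;
        if (handler != null)
        {
            foreach (var name in _pendingChanges)
                handler(this, new PropertyChangedEventArgs(name));
        }
        _pendingChanges.Clear();
    }
}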
Also consider not using data binding at all, and having the UI ask each data object “what has changed” each time the timer fires. Data binding always looks nice, but can quickly become part of the problem rather than part of the solution.
As locking/unlocking the data objects is a pain and may not allow the updates to be read from the database fast enough, you may wish to pass the UI thread a (virtual) copy of the data objects. Making the data object persistent/immutable, so that any change to the data object returns a new data object rather than modifying the current one, can enable this.
Persistent objects sound very slow, but need not be, see this and that for some pointers. Also look at this and that on Stack Overflow.
Also have a look at retlang - Message-based concurrency in .NET. Its message batching may be useful.
(For WPF, I would have a view model that sits in the UI thread and is then updated in ‘batches’ from the multi-threaded model by the background thread. However, WPF is a lot better at combining data-binding events than WinForms.)
Yes, all the books show threaded structures and invokes etc., which is perfectly correct, but it can be a pain to code and often hard to organise so that you can write decent tests for it.
A UI only needs to be refreshed so many times a second, so performance is never an issue, and polling will work fine
I like to use an object graph that is being continuously updated by a pool of background threads. They check for actual changes in data values, and when they notice an actual change they update a version counter on the root of the object graph (or on each main item, whichever makes more sense) and update the values.
Then your foreground process can have a timer (on the UI thread, as it is by default) that fires once a second or so and checks the version counter; if it has changed, it locks the graph (to stop partial updates) and then refreshes the display.
This simple technique totally isolates the UI thread from the background threads
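A rough sketch of that polling loop, assuming a WinForms Timer (which ticks on the UI thread); _graph, Version, SyncRoot, RefreshDisplay and _lastRenderedVersion are all placeholder names:

private int _lastRenderedVersion;

private void uiTimer_Tick(object sender, EventArgs e)
{
    int current = _graph.Version;             // incremented by the background threads
    if (current == _lastRenderedVersion)
        return;                               // nothing changed since the last repaint

    lock (_graph.SyncRoot)                    // briefly block writers to avoid partial updates
    {
        RefreshDisplay(_graph);               // read a consistent snapshot into the UI
        _lastRenderedVersion = current;
    }
}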
There is an MSDN article specifically on that topic. But be prepared to look at VB.NET. ;)
Additionally, maybe you could use System.ComponentModel.BackgroundWorker instead of a generic second thread, since it nicely formalizes the kind of interaction with the spawned background thread you are describing. The example given in the MSDN library is pretty decent, so go look at it for a hint on how to use it.
Edit:
Please note: no marshalling is required if you use the ProgressChanged event to communicate back to the UI thread. The background thread calls ReportProgress whenever it needs to communicate with the UI. Since it is possible to attach any object to that event, there is no reason to do manual marshalling. The progress is communicated via another async operation, so there is no need to worry about either how fast the UI can handle the progress events or whether the background thread gets interrupted by waiting for the event to finish.
If you find that the background thread is raising the ProgressChanged event way too fast, then you might want to look at "Pull vs. Push models for UI updates", an excellent article by Ayende.
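For illustration, a minimal BackgroundWorker sketch along the lines described above; records, Record, DoHeavyWork and UpdateGrid are placeholders:

var worker = new BackgroundWorker { WorkerReportsProgress = true };

worker.DoWork += (s, e) =>
{
    // Runs on a thread-pool thread.
    for (int i = 0; i < records.Count; i++)
    {
        DoHeavyWork(records[i]);
        worker.ReportProgress(i * 100 / records.Count, records[i]);
    }
};

worker.ProgressChanged += (s, e) =>
{
    // Raised on the UI thread; e.UserState carries the object attached above.
    UpdateGrid((Record)e.UserState, e.ProgressPercentage);
};

worker.RunWorkerAsync();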
I just fought a similar situation - a background thread updating the UI via BeginInvoke. The background thread has a delay of 10 ms on every loop, but down the road I ran into problems where the UI updates, which sometimes get fired on every iteration of that loop, couldn't keep up with the frequency of updates, and the app effectively stopped working (not sure what happened - blew a stack?).
I wound up adding a flag in the object passed over the invoke, which was just a ready flag. I'd set this to false before calling the invoke, and then the background thread would do no more UI updates until this flag was toggled back to true. The UI thread would do its screen updates etc. and then set this variable to true.
This allowed the background thread to keep crunching, but allowed the UI to shut off the flow until it was ready for more.
Create a new UserControl, add your control and format it (maybe Dock = Fill), and add a property.
Now configure the property to invoke the UserControl and update your element each time you change the property, from any thread you want!
That's my solution:
private long value;

public long Value
{
    get { return this.value; }
    set
    {
        this.value = value;
        UpdateTextBox();
    }
}

private delegate void Delegate();

private void UpdateTextBox()
{
    if (this.InvokeRequired)
    {
        // Marshal the call onto the UI thread that owns this control.
        this.Invoke(new Delegate(UpdateTextBox), new object[] { });
    }
    else
    {
        textBox1.Text = this.value.ToString();
    }
}
On my form I bind my view:
viewTx.DataBindings.Add(new Binding("Value", ptx.CounterTX, "ReturnValue"));
This is a problem that I solved in Update Controls. I bring this up not to suggest you rewrite your code, but to give you some source to look at for ideas.
The technique that I used in WPF was to use Dispatcher.BeginInvoke to notify the foreground thread of a change. You can do the same thing in Winforms with Control.BeginInvoke. Unfortunately, you have to pass a reference to a Form object into your data object.
Once you do, you can pass an Action into BeginInvoke that fires PropertyChanged. For example:
_form.BeginInvoke(new Action(() => NotifyPropertyChanged(propertyName)));
You will need to lock the properties in your data object to make them thread-safe.
This post is old but I thought I'd give options to others. It seems once you start doing async programming with Windows Forms data binding, you end up with problems updating a BindingSource's DataSource or updating lists bound to Windows Forms controls. I am going to try using Jeffrey Richter's AsyncEnumerator class from his Power Threading tools on Wintellect.
Reasons:
1. His AsyncEnumerator class automatically marshals background threads back to the UI thread, so you can update controls just as you would in synchronous code.
2. AsyncEnumerator simplifies async programming. It does this automatically, so you write your code in a synchronous fashion, but the code still runs asynchronously.
Jeffrey Richter has a video on MSDN's Channel 9 that explains AsyncEnumerator.
Wish me luck.
-R
I am late to the party but I believe this is still a valid question.
I would advise you to avoid using data binding at all and use Observable objects instead.
The reason is that data binding looks cool and the resulting code looks good, but data binding fails miserably when there is a lot of asynchronous UI updating or multi-threading, as in your case.
I have personally experienced this problem with asynchronous updates and data binding in production; we didn't even detect it in testing. When users started exercising all the different scenarios, things started to break down.
