Apologies for the rather verbose and long-winded post, but this problem's been perplexing me for a few weeks now so I'm posting as much information as I can in order to get this resolved quickly.
We have a WPF UserControl which is being loaded by a 3rd party app. The 3rd party app is a presentation application which loads and unloads controls on a schedule defined by an XML file which is downloaded from a server.
Our control, when it is loaded into the application, makes a web request to a web service and uses the data from the response to display some information. We're using an MVVM architecture for the control. The entry point of the control is a method that implements an interface exposed by the main app, and this is where the control's configuration is set up. This is also where I set the DataContext of our control to our MainViewModel.
The MainViewModel has two other view models as properties and the main UserControl has two child controls. Depending on the data received from the web service, the main UserControl decides which child control to display, e.g. if there is an HTTP error or the data received is not valid, display child control A, otherwise display child control B. As you'd expect, these two child controls bind to the two separate view models, each of which is a property of MainViewModel.
Now child control B (which is displayed when the data is valid) has a RefreshService property/field. RefreshService is an object that is responsible for updating the model in a number of ways and contains four System.Timers.Timer instances:
a _modelRefreshTimer
a _viewRefreshTimer
a _pageSwitchTimer
a _retryFeedRetrievalOnErrorTimer (this is only enabled when something goes wrong with retrieving data).
I should mention at this point that there are two types of data; the first changes every minute, the second changes every few hours. The control's configuration decides which type we are using/displaying.
If data is of the first type then we update the model quite frequently (every 30 seconds) using the _modelRefreshTimer's events.
If the data is of the second type then we update the model after a longer interval. However, the view still needs to be refreshed every 30 seconds as stale data needs to be removed from the view (hence the _viewRefreshTimer).
The control also paginates the data so we can see more than we can fit on the display area. This works by breaking the data up into Lists and switching the CurrentPage (which is a List) property of the view model to the right List. This is done by handling the _pageSwitchTimer's Elapsed event.
Now the problem
My problem is that the control, when removed from the visual tree, doesn't dispose of its timers. This was first noticed when we started getting an unusually high number of requests on the web server end very soon after deploying this control and found that requests were being made at least once a second! We found that the timers were living on, not stopping, hours after the control had been removed from view, and that the more timers there were, the more requests piled up at the web server.
My first solution was to implement IDisposable for the RefreshService and do some clean-up when the control's Unloaded event was fired. Within the RefreshService's Dispose method I've set Enabled to false for all the timers, then called Stop() on all of them. I've then called Dispose() too and set them to null. None of this worked.
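In outline, the clean-up looks something like this (the handler name here is just illustrative; the same steps are repeated for all four timers):

public void Dispose()
{
    // Repeated for _viewRefreshTimer, _pageSwitchTimer and _retryFeedRetrievalOnErrorTimer.
    _modelRefreshTimer.Elapsed -= OnModelRefreshElapsed;
    _modelRefreshTimer.Enabled = false;
    _modelRefreshTimer.Stop();
    _modelRefreshTimer.Dispose();
    _modelRefreshTimer = null;
}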
After some reading around I found that event handlers may hold references to Timers and prevent them from being disposed and collected. After some more reading and researching I found that the best way around this was to use the Weak Event Pattern. Using this blog and this blog I've managed to work around the shortcomings in the Weak Event pattern.
However, none of this solves the problem. Timers are still not being disabled or stopped (let alone disposed) and web requests are continuing to build up. Mem Profiler tells me that "This type has N instances that are directly rooted by a delegate. This can indicate the delegate has not been properly removed" (where N is the number of instances). As far as I can tell though, all listeners of the Elapsed event for the timers are being removed during the cleanup so I can't understand why the timers continue to run.
Thanks for reading. Eagerly awaiting your suggestions/comments/solutions (if you got this far :-p)
Have you tried using the DispatcherTimer instead of System.Timers.Timer?
If a Timer is used in a WPF application, it is worth noting that the Timer runs on a different thread than the user interface (UI) thread. In order to access objects on the UI thread, it is necessary to post the operation onto the Dispatcher of the UI thread using Invoke or BeginInvoke. Reasons for using a DispatcherTimer as opposed to a Timer are that the DispatcherTimer runs on the same thread as the Dispatcher and a DispatcherPriority can be set. (from this blog)
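A minimal sketch of what that could look like for the control in question (the interval and handler names are only illustrative):

// System.Windows.Threading.DispatcherTimer ticks on the UI thread, so no
// Invoke/BeginInvoke is needed, and stopping it in Unloaded reliably stops the callbacks.
private readonly DispatcherTimer _viewRefreshTimer =
    new DispatcherTimer { Interval = TimeSpan.FromSeconds(30) };

private void UserControl_Loaded(object sender, RoutedEventArgs e)
{
    _viewRefreshTimer.Tick += OnViewRefreshTick;
    _viewRefreshTimer.Start();
}

private void UserControl_Unloaded(object sender, RoutedEventArgs e)
{
    _viewRefreshTimer.Tick -= OnViewRefreshTick;   // detach so nothing keeps the control rooted
    _viewRefreshTimer.Stop();
}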
It sounds to me like the timer signals are being queued faster than they are processed. For example, this would happen if each elapse event takes 2 seconds to process while the timer elapses every second. This would result in a backlog of "raise Elapse event now" signals being scheduled on the .NET thread pool.
Such a backlog would then continue to cause elapse events even after you stop the timer. From the System.Timers.Timer documentation:
Even if SynchronizingObject is set, Elapsed events can occur after the Dispose or Stop method has been called or after the Enabled property has been set to false, because the signal to raise the Elapsed event is always queued for execution on a thread pool thread. One way to resolve this race condition is to set a flag that tells the event handler for the Elapsed event to ignore subsequent events.
To avoid a backlog of unprocessed timer signals building up in the thread pool, you can set AutoReset to false. That way the timer disables itself at each elapse. You can then set Enabled back to true after handling the elapse event. That way no additional elapse events will be scheduled until the last one has been handled.
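A rough sketch of that pattern, reusing the timer name from the question (the handler, the work method and the _stopped flag are illustrative):

private void StartModelRefresh()
{
    _modelRefreshTimer = new System.Timers.Timer(30000) { AutoReset = false };
    _modelRefreshTimer.Elapsed += OnModelRefreshElapsed;
    _modelRefreshTimer.Start();
}

private void OnModelRefreshElapsed(object sender, ElapsedEventArgs e)
{
    if (_stopped)                        // flag set during clean-up: ignore late signals
        return;

    RefreshModel();                      // the actual work (placeholder)

    _modelRefreshTimer.Enabled = true;   // re-arm only after the work is done,
                                         // so elapse signals can never pile up
}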
Call timer.Elapsed -= yourElapsedHandler; when your control is disposed of to manually detach the handler (you must unsubscribe the same handler method you originally attached).
Related
I have a DevExpress GridControl one-way binding to a table view in the view model. There are about 20 background threads querying data from databases and updating the table view individually. The update to the table view is guarded with a lock for async updates. The Dispatcher is used for refreshing on the main UI thread. I also have another button to cancel the database and update functions via a CancellationTokenSource.
However, when the application runs, I have to click the cancellation button many times in order to execute the code in the cancel command. In other words, the main UI thread is busy refreshing the GridControl and it blocks the Cancel button.
Is there a way to achieve this functionality?
Edit: Found that this method helps a lot: await Dispatcher.Yield(DispatcherPriority.ApplicationIdle);
It simply gives other UI controls a chance to be executed.
Creates an awaitable object that asynchronously yields control back to the current dispatcher and provides an opportunity for the dispatcher to process other events. (MSDN)
AFAIK, UI stands for User Interface, where the main part is the User! The user is not a robot; it is a person who uses your application. You do not need to query data updates so often on the UI thread. Why do you need 20 threads for the database query operations? You have a single GridControl, which can show only a handful of rows at a time on the screen (20, 50, 100, no more). So I suggest you have only one thread reading data from the database, and do it once per 2-5 seconds, in order to keep the application interactive for the user. For this purpose, the TPL is a good choice. Using it with a CancellationToken is a good way to support cancellation in your scenario. Between database queries you can do the following:
while (true)
{
    myToken.ThrowIfCancellationRequested();
    // database query here
    myToken.WaitHandle.WaitOne(2000);   // pause between queries; returns early if cancelled
}
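For instance, the loop could be hosted on a Task, and the Cancel command then only has to trip the token (the query/update method names below are placeholders):

private readonly CancellationTokenSource _cts = new CancellationTokenSource();

private void StartPolling()
{
    CancellationToken myToken = _cts.Token;
    Task.Run(() =>
    {
        while (true)
        {
            myToken.ThrowIfCancellationRequested();
            var rows = QueryDatabase();                      // placeholder for the database query
            Dispatcher.Invoke(() => UpdateTableView(rows));  // one batched UI update per pass
            myToken.WaitHandle.WaitOne(2000);                // returns early if cancelled
        }
    }, myToken);
}

private void CancelCommandExecute()
{
    _cts.Cancel();   // the loop observes the token at the top of each iteration
}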
Also, before updating the source collection, you can use the following:
GridControl.BeginDataUpdate();
//Update your source collection
GridControl.EndDataUpdate();
This will prevent the DevExpress control from listening to the CollectionChanged event (or any other), because on each Add/Remove action DevExpress calls the UpdateLayout() method, which is not as fast as we would like.
if layout is invalid in either respect, the UpdateLayout call will redo the entire layout. Therefore, you should avoid calling UpdateLayout after each incremental and minor change in the element tree.
We're working with OData on Silverlight, using DataServiceCollection to get the data.
All calls for fetching data (LoadAsync(), LoadNextPartialSetAsync()) are done on a worker thread. However, the LoadCompleted callback as well as deserialization and object materialization are done on the UI thread.
We decompiled the System.Data.Services.Client.DLL where the DataServiceCollection is, and saw that indeed all code handling the response of the OData is dispatched to the UI thread.
Is there any way to get the deserialization to be called on a worker thread instead?
thanks
Yaron
Well...
It seems the OData collections deliberately move processing to the UI thread. I'm guessing this is done because old objects might have properties the UI is bound to. These properties might change when loading additional data.
Using the query itself, I was able to get the response on a worker thread. However, doing this means one MUST detach the objects from the OData context (or clone them) if the UI is bound to any property. Otherwise subsequent queries might raise property changed events while objects are materializing on the worker thread.
The problem you have is that the DataServiceCollection<T> is derived from ObservableCollection<T>, which in turn is designed to be bound to UI elements. When changes are made to the membership of an ObservableCollection<T>, a binding expression observing it is notified. This binding expression then attempts to update the target UI element. If the notification arrives on a non-UI thread, an exception occurs.
Hence the DataServiceCollection<T> deliberately shifts materialisation to the UI thread so that, as items appear in the collection, the resulting change notifications do not result in an exception. If this behaviour is unacceptable to you, then DataServiceCollection<T> is not for you.
Instead, execute the query yourself via DataServiceQuery<T>.BeginExecute. The callback you pass to BeginExecute will execute on a worker thread (at least it will when ClientHTTP is being used; I have not yet established what happens when XmlHttp is being used). Here you can enumerate the results and place them in whatever collection type you prefer. You can then switch to the UI thread when you are ready to display the results.
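A rough sketch of that approach (the entity set, the Quote type and the Quotes collection are made up for illustration):

var query = context.CreateQuery<Quote>("Quotes");   // context is your DataServiceContext
query.BeginExecute(ar =>
{
    // With the Client HTTP stack this callback runs on a worker thread, so EndExecute
    // plus the enumeration (i.e. the materialisation) happen off the UI thread.
    List<Quote> results = query.EndExecute(ar).ToList();

    // Switch to the UI thread only to publish the already-materialised results.
    Deployment.Current.Dispatcher.BeginInvoke(() =>
    {
        foreach (Quote q in results)
            Quotes.Add(q);   // e.g. an ObservableCollection<Quote> bound to the UI
    });
}, null);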
The callback will always be called on a UI thread.
If the request was using the XmlHttp stack (which is the default if you call it from a UI thread), then the networking stack invokes the callback registered by the WCF Data Services client on the UI thread. So in this case it's not a behavior of the DataServiceCollection/DataServiceContext but a behavior of the underlying network stack.
If you call the request from a non-UI thread or you explicitly set the HTTP stack to Client, then the callback will come back on a non-UI thread (potentially a different one). We still move it back to the UI thread before letting the caller know. The reason for this is consistency, especially since you can't interact with UI elements on background threads.
If you execute the query manually, for example through DataServiceContext.BeginExecute, then the materialization (or most of it, anyway) is driven by the caller, since the call returns just an IEnumerable which is not yet populated. If you then transfer execution to a worker thread and enumerate the results there, the materialization will happen on that thread.
Just curious, why do you want to move it? Are you processing so much data, that it causes a visible UI lag?
I have a scenario where a thread updates a form's control. I followed http://msdn.microsoft.com/en-us/library/ms171728.aspx to make it work, but I was not successful.
The program creates a form control (a list view) and a thread to fetch information from the internet (stock quotes). Whenever the user selects a known symbol from another form control, it is added to the list view; this in turn adds it to the thread's watch list, and a delegate is registered for that specific symbol. The thread iterates through all the watch-list symbols to fetch quotes from the internet; whenever there is a change in price, the thread calls the registered delegate. In that delegate I am accessing the listView elements, and this is where I am facing cross-thread consistency problems.
To solve this problem I followed the above-mentioned link:
Approach 1: In the delegate I started a background worker. Same problem.
Approach 2: The main program creates a background worker; this worker loops around a list to update the listView. The delegate adds the new updated price to the list that the background worker is looping over. When the background worker accesses the listView, the thread consistency problems arise again.
How do I resolve this problem?
When the background worker accesses the listView, the thread consistency problems arise again.
Yes. This is because it shouldn't be done. A BackgroundWorker only provides safe access to the UI in the RunWorkerCompleted and ProgressChanged events. The DoWork event still runs on a non-UI thread. To access the UI from the non-UI thread, "marshal back" to the UI thread using Control.Invoke or SynchronizationContext.Send (these should lead to further findings if used as search keywords).
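A minimal sketch of that marshalling, assuming a handler that receives the symbol and the new price (control and method names are illustrative):

// Called by the quote delegate on the worker thread.
private void OnQuoteChanged(string symbol, decimal price)
{
    if (listView1.InvokeRequired)
    {
        // Not on the UI thread: re-invoke this same method on the UI thread.
        listView1.Invoke(new Action<string, decimal>(OnQuoteChanged), symbol, price);
        return;
    }

    // Now on the UI thread: safe to touch the ListView.
    ListViewItem item = listView1.Items[symbol]
                        ?? listView1.Items.Add(symbol, symbol, -1);
    if (item.SubItems.Count < 2)
        item.SubItems.Add(price.ToString());
    else
        item.SubItems[1].Text = price.ToString();
}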
Happy coding.
We've got a Model-View-Presenter setup with our .NET Compact Framework app. A standard CF Form implements the view interface and is passed into the constructor of the presenter. The presenter tells the form to show itself by calling view.Run(); the view then does a Show() on itself, and the presenter takes over again, loading up data and populating it into the view.
The problem is that the view does not finish showing itself before control is returned to the presenter. Since the presenter code blocks for a few seconds while loading data, the effect is that the form is not visible for a few seconds. It's not until after the presenter finishes its data load that the form becomes visible, even though the form called Show() on itself before the presenter started its data loading.
In a standard Windows environment, I could use the .Shown event on the form... but the Compact Framework doesn't have this, and I can't find an equivalent.
Does anyone know of an event, a P/Invoke, or some other way to get my form to be fully visible before kicking off some code on the presenter? At this point, I'd be OK with the form calling the presenter and telling it to start its data load.
FYI - we're trying to avoid multi-threading, to cut down on complexity and resource usage.
The general rule is: never do anything blocking on the UI thread
The UI in Windows (and in Windows CE as well) has an asynchronous nature. Which means that most API calls do not necessarily do whatever they're supposed to do immediately. Instead, they generate a series of events, which are put into the event queue and later retrieved by the event pump, which is, essentially, an infinite loop running on the UI thread, picking events from the queue one by one, and handling them.
From the above, a conclusion can be drawn that if you continue to do something lengthy on the UI thread after requesting a certain action (i.e. showing the window in your case), the event pump cannot proceed with picking events (because you haven't returned control to it), and therefore, your requested action cannot complete.
The general approach is as follows: if you must do complex data transformation/loading/preparing/whatever, do it on a separate thread, and then use Control.BeginInvoke to inject a delegate into the UI thread, and touch the actual UI controls from inside that delegate.
Despite your irrational fear of "complexity" that multithreading brings with it, there is very little to be afraid of. Here's a little example to illustrate the point:
private MyForm theForm;

public void ShowUI()
{
    theForm = new MyForm();
    theForm.Show();

    // BeginInvoke() will take a thread from the thread pool
    // and invoke our delegate on that thread
    new Action(PrepareData).BeginInvoke(null, null);
}

public void PrepareData()
{
    // Prepare your data, do complex computation, etc.

    // Control.BeginInvoke will put our delegate on the UI event queue
    // to be retrieved and executed on the UI thread
    theForm.BeginInvoke(new Action(PutDataInTheForm));
}

public void PutDataInTheForm()
{
    theForm.textBox1.Text = "data is ready!";
}
While you may play with alternative solutions, the general idea always remains the same: if you do anything lengthy on the UI thread, your UI will "freeze". It will not even redraw itself as you add new UI elements on the screen, because redrawing is also an asynchronous process.
Therefore, you must do all the complex and long stuff on a separate thread, and only do simple, small, guaranteed to run fast things on the UI thread. There is no other alternative, really.
Hope this helps.
If your key problem is that the form won't paint before your presenter's data-loading methods have completed, and you have a call to this.Show() in your Form_Load, try putting Application.DoEvents() directly after this.Show() to force/allow the form to paint.
protected void Form_Load(object sender, EventArgs e)
{
    this.Show();
    Application.DoEvents();   // lets the form finish painting before we block

    // ... data loading methods ...
}
No need to create another thread if you don't want to (although those couple of seconds of blocking still have to be dealt with somehow).
You can use the Activated event. Because it fires every time the form is activated, you need a boolean field on the form to check whether or not this is the first activation.
Another option is to disconnect the event handler right after you finish presenting the form.
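Roughly like this (the presenter call is just illustrative):

private bool _firstActivation = true;

private void MainForm_Activated(object sender, EventArgs e)
{
    if (!_firstActivation)
        return;
    _firstActivation = false;

    // The form is now on screen, so it is safe to kick off the slow work,
    // e.g. by telling the presenter to start its data load.
    _presenter.LoadData();

    // Alternatively, instead of the flag, detach the handler after the first call:
    // this.Activated -= MainForm_Activated;
}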
I'm currently designing/reworking the databinding part of an application that makes heavy use of winforms databinding and updates coming from a background thread (once a second on > 100 records).
Let's assume the application is a stock trading application, where a background thread monitors for data changes and putting them onto the data objects. These objects are stored in a BindingList<> and implement INotifyPropertyChanged to propagate the changes via databinding to the winforms controls.
Additionally, the data objects currently marshal the changes to the UI thread via WindowsFormsSynchronizationContext.Send.
The user is able to enter some of the values in the UI, which means that some values can be changed from both sides. The user's values shouldn't be overwritten by background updates.
So there are several questions coming to my mind:
Is there a general design guideline for how to do this (background updates with databinding)?
When and how to marshal on the UI thread?
What is the best way for the background thread to interact with the binding/data objects?
Which classes/Interfaces should be used? (BindingSource, ...)
...
The UI doesn't really know that there is a background thread that updates the controls, and to my understanding, in databinding scenarios the UI shouldn't know where the data is coming from... You can think of the background thread as something that pushes data to the UI, so I'm not sure the BackgroundWorker is the option I'm looking for.
Sometimes you want to get some UI response during an operation in the data/business object (e.g. changing the background during recalculations). Raising a PropertyChanged on a status property which is bound to the background isn't enough, as the control only gets repainted after the calculation has finished. My idea would be to hook into the PropertyChanged event and call .Update() on the control...
Any other ideas about that?
This is a hard problem since most “solutions” lead to lots of custom code and lots of calls to BeginInvoke() or System.ComponentModel.BackgroundWorker (which itself is just a thin wrapper over BeginInvoke).
In the past, I've also found that you soon wish to delay sending your INotifyPropertyChanged events until the data is stable. The code that handles one property-changed event often needs to read other properties. You also often have a control that needs to redraw itself whenever the state of one of many properties changes, and you don't want the control to redraw itself too often.
Firstly, each custom WinForms control should read all the data it needs to paint itself in the PropertyChanged event handler, so it does not need to lock any data objects when it handles a WM_PAINT (OnPaint) message. The control should not immediately repaint itself when it gets new data; instead, it should call Control.Invalidate(). Windows will combine the WM_PAINT messages into as few requests as possible and only send them when the UI thread has nothing else to do. This minimizes the number of redraws and the time the data objects are locked. (Standard controls mostly do this with data binding anyway.)
The data objects need to record what has changed as the changes are made, then once a set of changes has been completed, “kick” the UI thread into calling the SendChangeEvents method that then calls the PropertyChanged event handler (on the UI thread) for all properties that have changed. While the SendChangeEvents() method is running, the data objects must be locked to stop the background thread(s) from updating them.
The UI thread can be "kicked" with a call to BeginInvoke whenever a set of updates has been read from the database. Often it is better to have the UI thread poll using a timer, as Windows only sends the WM_TIMER message when the UI message queue is empty, which leads to the UI feeling more responsive.
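A bare-bones sketch of that record-then-kick idea (SendChangeEvents is the name used above; the rest of the names are illustrative):

public event PropertyChangedEventHandler PropertyChanged;     // System.ComponentModel

private readonly object _sync = new object();
private readonly HashSet<string> _dirtyProperties = new HashSet<string>();

// Called by the background thread(s) as values change.
public void RecordChange(string propertyName)
{
    lock (_sync) { _dirtyProperties.Add(propertyName); }
}

// Runs on the UI thread, e.g. from a Control.BeginInvoke "kick" or a timer tick.
public void SendChangeEvents()
{
    string[] changed;
    lock (_sync)                      // writers are blocked only while we drain the set
    {
        changed = new string[_dirtyProperties.Count];
        _dirtyProperties.CopyTo(changed);
        _dirtyProperties.Clear();
    }

    var handler = PropertyChanged;
    if (handler == null) return;
    foreach (string name in changed)
        handler(this, new PropertyChangedEventArgs(name));
}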
Also consider not using data binding at all, and having the UI ask each data object "what has changed" each time the timer fires. Databinding always looks nice, but can quickly become part of the problem rather than part of the solution.
As locking/unlocking the data objects is a pain and may not allow the updates to be read from the database fast enough, you may wish to pass the UI thread a (virtual) copy of the data objects. Making the data object persistent/immutable, so that any change to it returns a new data object rather than mutating the current one, can enable this.
Persistent objects sound very slow, but need not be, see this and that for some pointers. Also look at this and that on Stack Overflow.
Also have a look at retlang - Message-based concurrency in .NET. Its message batching may be useful.
(For WPF, I would have a view model that sits in the UI thread and is updated in 'batches' from the multi-threaded model by the background thread. However, WPF is a lot better at combining data binding events than WinForms.)
Yes, all the books show threaded structures, invokes, etc., which is perfectly correct, but it can be a pain to code, and it is often hard to organise in a way that lets you write decent tests for it.
A UI only needs to be refreshed so many times a second, so performance is never an issue, and polling will work fine.
I like to use an object graph that is being continuously updated by a pool of background threads. They check for actual changes in data values, and when they notice an actual change they update a version counter on the root of the object graph (or on each main item, whichever makes more sense) and update the values.
Then your foreground process can have a timer (on the UI thread by default) that fires once a second or so and checks the version counter; if it has changed, it locks the graph (to stop partial updates) and then refreshes the display.
This simple technique totally isolates the UI thread from the background threads
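Sketched out, the polling side could look like this (the model's Version field, its SyncRoot and the refresh method are illustrative; Version is assumed to be a long field bumped with Interlocked.Increment by the background threads):

private long _lastSeenVersion;

// UI-thread timer, firing roughly once a second.
private void uiTimer_Tick(object sender, EventArgs e)
{
    long current = Interlocked.Read(ref _model.Version);
    if (current == _lastSeenVersion)
        return;                       // nothing changed since the last refresh

    lock (_model.SyncRoot)            // stop partial updates while we read
    {
        RefreshDisplayFrom(_model);   // repopulate the bound controls / grid
    }
    _lastSeenVersion = current;
}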
There is an MSDN article specific on that topic. But be prepared to look at VB.NET. ;)
Additionally, you could maybe use System.ComponentModel.BackgroundWorker instead of a generic second thread, since it nicely formalizes the kind of interaction with the spawned background thread you are describing. The example given in the MSDN library is pretty decent, so go look at it for a hint on how to use it.
Edit:
Please note: no marshalling is required if you use the ProgressChanged event to communicate back to the UI thread. The background thread calls ReportProgress whenever it needs to communicate with the UI. Since it is possible to attach any object to that event, there is no reason to do manual marshalling. The progress is communicated via another async operation, so there is no need to worry about either how fast the UI can handle the progress events or whether the background thread gets interrupted by waiting for the event to finish.
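For example (FetchQuotes, UpdateRow and the Quote type are placeholders):

var worker = new BackgroundWorker { WorkerReportsProgress = true };

worker.DoWork += (s, e) =>
{
    // Runs on a thread-pool thread.
    foreach (Quote quote in FetchQuotes())
        worker.ReportProgress(0, quote);   // attach the updated object to the event
};

worker.ProgressChanged += (s, e) =>
{
    // Runs on the UI thread; no manual marshalling needed.
    var quote = (Quote)e.UserState;
    UpdateRow(quote);
};

worker.RunWorkerAsync();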
If you find that the background thread is raising the ProgressChanged event way too fast, then you might want to look at Pull vs. Push models for UI updates, an excellent article by Ayende.
I just fought a similar situation: a background thread updating the UI via BeginInvoke. The background thread has a delay of 10 ms on every loop, but down the road I ran into problems where the UI updates, which sometimes got fired on every pass of that loop, couldn't keep up with the frequency of updates, and the app effectively stopped working (not sure what happens, blew a stack?).
I wound up adding a flag to the object passed over the invoke, which was just a ready flag. I'd set this to false before calling the invoke, and then the background thread would do no more UI updates until this flag was toggled back to true. The UI thread would do its screen updates etc., and then set this variable to true.
This allowed the background thread to keep crunching, but let the UI shut off the flow until it was ready for more.
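In sketch form (the snapshot and screen-update methods are placeholders):

private volatile bool _uiReady = true;

// Called from the background thread's loop.
private void PushUpdateIfUiReady()
{
    if (!_uiReady)
        return;                             // the UI hasn't finished the previous update yet

    _uiReady = false;
    var snapshot = CurrentState();          // whatever data the UI needs
    form.BeginInvoke(new Action(() =>
    {
        UpdateScreen(snapshot);             // the actual UI refresh
        _uiReady = true;                    // ready for the next update
    }));
}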
Create a new UserControl, add your control to it and format it (maybe Dock = Fill), and add a property.
Now make the property setter invoke onto the UserControl and update your element; then you can change the property from any thread you want!
That's my solution:
private long value;

public long Value
{
    get { return this.value; }
    set
    {
        this.value = value;
        UpdateTextBox();
    }
}

private delegate void Delegate();

private void UpdateTextBox()
{
    if (this.InvokeRequired)
    {
        // Called from a non-UI thread: re-invoke this method on the UI thread.
        this.Invoke(new Delegate(UpdateTextBox), new object[] { });
    }
    else
    {
        textBox1.Text = this.value.ToString();
    }
}
On my form I bind my view:
viewTx.DataBindings.Add(new Binding("Value", ptx.CounterTX, "ReturnValue"));
This is a problem that I solved in Update Controls. I bring this up not to suggest you rewrite your code, but to give you some source to look at for ideas.
The technique that I used in WPF was to use Dispatcher.BeginInvoke to notify the foreground thread of a change. You can do the same thing in Winforms with Control.BeginInvoke. Unfortunately, you have to pass a reference to a Form object into your data object.
Once you do, you can pass an Action into BeginInvoke that fires PropertyChanged. For example:
_form.BeginInvoke(new Action(() => NotifyPropertyChanged(propertyName)));
You will need to lock the properties in your data object to make them thread-safe.
This post is old, but I thought I'd give options to others. It seems once you start doing async programming and Windows Forms databinding, you end up with problems updating a BindingSource's DataSource or updating lists bound to Windows Forms controls. I am going to try using Jeffrey Richter's AsyncEnumerator class from his Power Threading tools on Wintellect.
Reasons:
1. His AsyncEnumerator class automatically marshals from background threads to the UI thread, so you can update controls as you would in synchronous code.
2. AsyncEnumerator simplifies async programming. It does this automatically, so you write your code in a synchronous fashion, but the code still runs asynchronously.
Jeffrey Richter has a video on Channel 9 MSDN, that explains AsyncEnumerator.
Wish me luck.
-R
I am late to the party but I believe this is still a valid question.
I would advise you to avoid using data binding at all and use Observable objects instead.
The reason is, data binding looks cool and the implemented code looks good, but data binding fails miserably when there is a lot of asynchronous UI updating or multi-threading involved, as in your case.
I have personally experienced this problem with asynchronous updates and data binding in production; we didn't even detect it in testing. Once users started exercising all the different scenarios, things started to break down.