WPF Catch Exception thrown by User Controls inside a Window

I have a couple of custom user controls inside a WPF window dialog which can throw exceptions. What I want to do is purge only that window dialog when an exception is thrown by one of those user controls. However, these controls handle their own click events, so I can't find anywhere within the window itself to wrap them in a try-catch.
The only workaround I can think of is to implement a SomethingWentWrong event in my user controls and subscribe to it within my WPF window. However, this seems to me like a very ugly way to do things.
Is there any better option?

There is no global way to trap exceptions easily within a single Window.
You could, potentially, subscribe to Application.DispatcherUnhandledException, which gives you a way to trap exceptions that occur on the Dispatcher. This handles most typical user interface "events", but it is application-wide.
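For example, a handler subscribed there can inspect what failed and close only the offending dialog while the rest of the application keeps running. This is only a sketch of that idea; MyDialog stands in for the window type hosting the user controls and is not part of the original question.

using System.Linq;
using System.Windows;

public partial class App : Application
{
    protected override void OnStartup(StartupEventArgs e)
    {
        base.OnStartup(e);

        DispatcherUnhandledException += (sender, args) =>
        {
            // If the active window is the dialog hosting the faulty controls,
            // close just that dialog and keep the application alive.
            var dialog = Windows.OfType<MyDialog>().FirstOrDefault(w => w.IsActive);
            if (dialog != null)
            {
                dialog.Close();
                args.Handled = true; // prevent the application from terminating
            }
        };
    }
}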

Related

Removing ViewModels in Prism when Closing Window

I have a WinForms application with data grid rows that contain some icons.
When the user clicks on one of the icons, a WPF window opens.
I have created this WPF window using Prism, i.e. it has a shell and regions mapped to views.
The issue I am facing is:
When I tried to close the WPF window, I get the exception "Cannot set Visibility or call Show, ShowDialog, or WindowInteropHelper.EnsureHandle after a Window has closed."
I understand that we can resolve the issue by hiding the window instead of closing it.
However, this keeps the ViewModels and services for the old WPF window around.
I have kept a static counter in the ViewModels and observed that every time I open the WPF window, the count increases, which means my old view models are not getting destroyed.
I would like to know how to handle this scenario correctly, so that when I close the window everything related to it is disposed of.
I tried to do container.Dispose in the ShellViewModel; however, it still did not work.
There are two aspects here. First, you can either use the RegionMemberLifetimeAttribute on your view model or implement IRegionMemberLifetime to make Prism create a new instance each time.
Second, you have to create your own RegionBehavior (or take the one from this GitHub issue) to make Prism dispose the view models.
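For the first point, a minimal sketch of both options; DetailsViewModel is a hypothetical view model name:

using Prism.Regions;

// Interface form: when the view model is deactivated, Prism checks KeepAlive
// and removes the instance from the region if it returns false, so a fresh
// instance is resolved the next time the view is shown.
public class DetailsViewModel : IRegionMemberLifetime
{
    public bool KeepAlive => false;
}

// Attribute form, equivalent for this purpose:
// [RegionMemberLifetime(KeepAlive = false)]
// public class DetailsViewModel { ... }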

MultiThread in WPF

I have several projects, each of them has a UserControl to manage it. All projects are in one solution and run simultaneously; all the UserControls sit in a TabControl. But if one project doesn't handle its exception, the whole solution goes down. How can I run each UserControl on another thread?
I have several classes which are the models in MVVM. All of them have a ViewModel and a View. Right now all classes start and work in one thread. If one of them throws an exception, the whole app goes down. I want each model to work on an individual thread, but all the Views of those models sit together in a TabControl. How can I organize this scheme?
You can't. WPF has one, and only one, user interface thread. Modifying user interface elements from a background thread won't work and will raise an exception. (EDIT: This is not entirely correct, apparently it is possible to start individual windows in their own threads.)
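As a sketch of the pattern mentioned in that edit (an assumption beyond the original answer): an independent top-level Window can be run on its own STA thread with its own Dispatcher. This does not help for controls that share a TabControl, since those must all live on the same UI thread.

using System.Threading;
using System.Windows;
using System.Windows.Threading;

static void OpenWindowOnItsOwnThread()
{
    var thread = new Thread(() =>
    {
        var window = new Window { Title = "Isolated window" }; // placeholder content
        // Shut down this thread's dispatcher when the window closes.
        window.Closed += (s, e) => Dispatcher.CurrentDispatcher.InvokeShutdown();
        window.Show();
        Dispatcher.Run(); // start pumping messages for this thread
    });
    thread.SetApartmentState(ApartmentState.STA); // WPF requires an STA thread
    thread.IsBackground = true;
    thread.Start();
}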
If you have a problem with uncaught exceptions, have a look at the Application.DispatcherUnhandledException event, which allows you to register a central exception handler for your complete WPF application. If you set e.Handled = true; at the end of your DispatcherUnhandledException handler, an exception will return control to the user interface instead of terminating the application.
More information:
WPF global exception handler

Need help with Dispatcher.PushFrame style process blocking in WPF page

I am using Dispatcher.PushFrame to block my code while allowing the UI to refresh until a long running process is done. This works as expected, so long as my call to Dispatcher.PushFrame is from a button click event. If, however, I use this same code during the Page’s Loaded event or constructor, the UI does not refresh, and so never paints. As a random experiment, I tried using Window.ShowDialog from the constructor, and it does allow the UI to paint, even though control is blocked until the modal dialog closes. Can anyone offer a solution to allow this functionality from the Page Loaded event using Dispatcher.PushFrame or some other manual mechanism?
As an addendum, if I minimize or maximize my window, the UI paints and I can interact with it normally, but not until I manually perform the resize.
From my reading of MSDN on Object Lifetime Events and poking around in Reflector, it appears that the Loaded and Unloaded events are not raised in the same manner as other events. Internally a BroadcastEventHelper class is used, which coordinates the various Loaded events amongst every element in the visual tree before eventually raising them at the DispatcherPriority.Loaded level.
I believe this is why you're seeing this behavior.
As for a concrete solution, I suggest that the long-running work not be placed in the Page.Loaded event handler; instead, hand it off to a BackgroundWorker or Task to complete the work.
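A minimal sketch of that suggestion, in the page's code-behind; DoLongRunningWork and ApplyResult are hypothetical methods standing in for the actual work and the UI update:

using System.Threading.Tasks;
using System.Windows;

private async void Page_Loaded(object sender, RoutedEventArgs e)
{
    // The work runs off the UI thread, so the page keeps painting;
    // execution resumes here on the UI thread when the task completes.
    var result = await Task.Run(() => DoLongRunningWork());
    ApplyResult(result);
}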

Injecting Mouse Input in WPF Applications

I've been working on injecting input into a WPF application. What makes this project hard is that I need to be able to inject the input into the application even though it's running in the background (i.e. another application has the input focus). Using the SendInput() function is therefore out of the question.
So far, I've got keyboard input working but am having trouble injecting mouse input.
I used Spy++ to observe the window messages that get sent to the WPF window when I physically click the mouse button. I then simply craft these same mouse messages (such as WM_LBUTTONDOWN and WM_LBUTTONUP) manually and send them explicitly to the WPF window to emulate mouse input.
Unfortunately, this doesn't work as expected (not even when I, for testing purposes, have set the WPF window as the foreground window).
I've added a button to my test WPF window which when clicked displays a message box. Injecting the appropriate mouse messages when I've manually positioned the cursor over the button doesn't cause the button to be clicked, however (i.e. the clicked event isn't fired by the WPF framework).
If I add a handler for mouse clicks on the actual dialog (the client area), that handler does get called if I position the cursor over the dialog itself and inject the same window messages as before:
this.MouseLeftButtonDown += WndMouseDown;
public void WndMouseDown(object sender, MouseButtonEventArgs e)
{
    // ...
}
Strangely enough, if I change the push mode of the button to Press (i.e. it's considered clicked on mouse down rather than the default mouse up), the button clicked event is now fired when I inject the same messages as before. (It's worth mentioning that the handler from the example above correctly fires for both mouse downs and ups, so it'd seem the WPF framework does process both messages successfully.)
It seems like there are some other criteria that need to be fulfilled in order for a mouse clicked event to be fired by the WPF framework. Does anybody know how mouse input is handled internally in WPF, or why it's not interpreting my mouse up and down messages as a click on the button?
(It's worth mentioning that this approach [sending window messages] works fine on ordinary Win32 windows, such as the Start->Run dialog. The difference here is that WPF only has one physical Win32 window and the rest is WPF specific, which means all window messages go to that top-level window rather than the actual button.)
I've been searching high and low for an answer to this and would appreciate any thoughts or ideas.
I'd highly suggest going the UI Automation route: you create an AutomationElement from the window handle, crawl to the button, and invoke it. I'd also like to know how you managed to get the keyboard input working, since I am currently trying to resolve the converse issue: how to get a WPF window (I've managed to get an hWnd to it via Win32 calls) to respond to virtual keyboard messages. I've logged Spy++ sessions on the window in question and replicated its input without success.
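A minimal sketch of that route (it needs references to UIAutomationClient and UIAutomationTypes; locating the button by its automation Name is an assumption about how the target exposes it):

using System;
using System.Windows.Automation;

static void InvokeButton(IntPtr hwnd, string buttonName)
{
    // Attach to the target window by its Win32 handle.
    AutomationElement window = AutomationElement.FromHandle(hwnd);

    // Find a descendant element whose automation Name matches the button.
    AutomationElement button = window.FindFirst(
        TreeScope.Descendants,
        new PropertyCondition(AutomationElement.NameProperty, buttonName));

    // InvokePattern raises the button's click without needing focus or cursor position.
    if (button != null &&
        button.TryGetCurrentPattern(InvokePattern.Pattern, out object pattern))
    {
        ((InvokePattern)pattern).Invoke();
    }
}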
Use UI Automation to do this - trying to manually simulate input via window messages is a textbook mistake, like trying to start a land war against Russia.
Your strategy is basically sound but in order to send a message to a window owned by another process you must first register the message.
Here is an article explaining the whole business. The sample code is unfortunately in VB but I'm sure that won't stop you.

How do I get ToolStripMenuItem Shortcut Key to work when WinForms control is hosted in VB6 Form

We have a scenario where we are exposing a set of WinForms UserControls via COM to a legacy VB6 application. We have 3 different controls, each with a MenuStrip that maps the Ctrl+F shortcut key to a menu item which invokes a control-specific find-items dialog when the shortcut is entered. When we test this code in our WinForms shell, the appropriate dialog (the one hosted in the active MdiChild) pops up when all 3 controls exist, but in the VB6 host the wrong dialog usually appears (it always seems to be the dialog for the first control that was created).
I'm fairly certain this has something to do with message pumps and all, but I can't seem to figure out how to ensure that the proper ToolStripMenuItem is getting invoked when we enter the shortcut key.
I know the option of using a global/singleton ShortcutKey manager/service that overrides ProcessCmdKey is a possibility, but that would be the last resort we want to fall back on. I just get a feeling that a message pump needs to be started.
This may not fit your needs, and it might only apply to VB6 specifically, but have you considered having VB6 use the standard menus, have it hold onto the keyboard shortcuts, and trigger things appropriately? You could then simply hide the individual menus (they should still fire their events).
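If the ProcessCmdKey fallback mentioned in the question does become necessary, a minimal sketch might look like this; SearchableControl and ShowFindDialog are hypothetical names:

using System.Windows.Forms;

public class SearchableControl : UserControl
{
    protected override bool ProcessCmdKey(ref Message msg, Keys keyData)
    {
        if (keyData == (Keys.Control | Keys.F))
        {
            ShowFindDialog(); // open this control's own find-items dialog
            return true;      // report the shortcut as handled
        }
        return base.ProcessCmdKey(ref msg, keyData);
    }

    private void ShowFindDialog()
    {
        // placeholder: show the control-specific find dialog
    }
}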
