WinForms Touchpad Scrolling - winforms

I'm working on a WinForms application meant for touchpads, and I would like the user to be able to scroll through the contents by swiping a finger across the screen. I see that Windows (at least Windows 7) has this behavior built in for any scrollable control. However, the control I'm trying to scroll is a third-party control that implements its own scrollbar elements, so this built-in behavior is lost.
I could get the same behavior by placing the third-party control in a panel and letting that panel handle the scrolling. However, when the user swipes on the control, its containing panel doesn't hear those events and scrolling never takes place.
Are there any resources on how Windows handles touchpad scrolling, or on how I could make a panel respond to touchpad swipes when the cursor is inside one of its child controls?
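There is no single documented answer I can point to, but one avenue worth sketching: the touchpad driver ultimately delivers its panning as ordinary window messages (wheel or scroll messages), and those go to the control under the finger rather than to its containing panel. The sketch below, under that assumption, subclasses the third-party control's window and hands scroll-related messages up to the parent panel; ForwardScrollToParent is a hypothetical helper name, and whether the panel's AutoScroll actually reacts depends on your driver and control.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

// Sketch: forward scroll-related messages from a child control to its parent
// panel so the panel's AutoScroll logic has a chance to respond to touchpad swipes.
public class ForwardScrollToParent : NativeWindow
{
    private const int WM_HSCROLL = 0x0114;
    private const int WM_VSCROLL = 0x0115;
    private const int WM_MOUSEWHEEL = 0x020A;
    private const int WM_MOUSEHWHEEL = 0x020E;

    private readonly Control child;

    public ForwardScrollToParent(Control child)
    {
        this.child = child;
        if (child.IsHandleCreated)
            AssignHandle(child.Handle);
        child.HandleCreated += (s, e) => AssignHandle(child.Handle);
        child.HandleDestroyed += (s, e) => ReleaseHandle();
    }

    protected override void WndProc(ref Message m)
    {
        bool isScrollMessage = m.Msg == WM_HSCROLL || m.Msg == WM_VSCROLL
            || m.Msg == WM_MOUSEWHEEL || m.Msg == WM_MOUSEHWHEEL;

        if (isScrollMessage && child.Parent != null && child.Parent.IsHandleCreated)
        {
            // Hand the message to the containing panel instead of letting the
            // third-party control swallow it.
            SendMessage(child.Parent.Handle, m.Msg, m.WParam, m.LParam);
            return;
        }

        base.WndProc(ref m);
    }

    [DllImport("user32.dll")]
    private static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);
}

// Usage (keep a reference alive for the control's lifetime):
// var forwarder = new ForwardScrollToParent(thirdPartyControl);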

Related

Does WPF have GUI layers in the editor?

I have a single window whose initialization controls are shown first; those controls then fade out as the menu controls fade in. Depending on what the user clicks next, the current menu controls fade out for the next set of controls to fade in.
While developing in the editor (Visual Studio 2017), I often have to hide and show certain controls so I can see the current 'screen layer' I'm interested in.
While I'm working in the editor, is there a way to click through layers in WPF, the way you click through tabs? Think of it as tabs: "Layer 1" of the window has the initialization controls, "Layer 2" has the menu controls, and so on. I could then show/hide layers while developing in the editor, as you would in a paint program. I'm just thinking that would make development easier.
No, there is no standard way built into Visual Studio.
I strongly suggest splitting the layers into (user) controls; that way the designer can focus on one thing at a time.
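A minimal sketch of that suggestion, assuming each layer is its own UserControl (InitLayer, MenuLayer and the LayerHost grid are made-up names): the window hosts the layers and fades between them at runtime, while at design time you open and edit each UserControl on its own.

using System;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media.Animation;

public partial class MainWindow : Window
{
    // Each "layer" lives in its own UserControl with its own XAML.
    private readonly UserControl initLayer = new InitLayer();
    private readonly UserControl menuLayer = new MenuLayer();

    public MainWindow()
    {
        InitializeComponent(); // assumes a Grid named LayerHost in the window's XAML
        LayerHost.Children.Add(initLayer);
        LayerHost.Children.Add(menuLayer);
        menuLayer.Opacity = 0;
        menuLayer.Visibility = Visibility.Hidden;
    }

    // Called when initialization is done and the menu should appear.
    private void ShowMenu()
    {
        FadeOut(initLayer);
        FadeIn(menuLayer);
    }

    private static void FadeIn(UIElement layer)
    {
        layer.Visibility = Visibility.Visible;
        layer.BeginAnimation(UIElement.OpacityProperty,
            new DoubleAnimation(0, 1, TimeSpan.FromMilliseconds(300)));
    }

    private static void FadeOut(UIElement layer)
    {
        var fade = new DoubleAnimation(1, 0, TimeSpan.FromMilliseconds(300));
        fade.Completed += (s, e) => layer.Visibility = Visibility.Hidden;
        layer.BeginAnimation(UIElement.OpacityProperty, fade);
    }
}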

Is it possible to force touch in WPF to work like touch in WinForms?

I created a WPF application that the user interacts with using touch. Unfortunately, touch works very badly when several instances of the same application are open.
I will try to describe the behaviour that I am facing:
Open two windows and place them right next to each other.
Touch the left window with a finger and do not release it.
Any attempts to interact with the second window using touch will fail.
I have some experience developing touch applications in WinForms, and I never had such problems before. So I performed the trick described above with two WinForms applications, and they work just as they should: the first window (with the finger not released) keeps focus, but the second window still allows the user to perform clicks on its surface.
I also tried the mixed combination: when the WPF window is focused, the WinForms window is still touchable. But not the other way around: when the WinForms window is focused, the WPF window won't respond.
Is there anything that can be done to change the described behaviour of the WPF windows?

Silverlight 5 NotificationWindow captures mainwindow mouse

I'm finding that if you use the NotificationWindow with the Silverlight 5 runtime in an out-of-browser app, then while the NotificationWindow is visible it captures any mouse input in the top-left corner of the application's main window.
This can easily be demonstrated with, for example, the sample here: http://forums.silverlight.net/t/212852.aspx/1
You simply add a button so that it sits at the top left of the screen; it is not possible to click the button while the NotificationWindow is visible. This does NOT happen on the SL4 runtime, but it does happen on every SL5 machine I have tested.
The area that gets captured corresponds to the size of the NotificationWindow, as if the notification window were located at the top left of the MainWindow. If your NotificationWindow content has any controls, you can even click those controls by clicking in the corresponding location on the main window!
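To make the setup concrete, a bare-bones out-of-browser sketch along the lines described above could look like this (it is not the forum sample; the button placement, names and sizes are assumptions):

using System.Windows;
using System.Windows.Controls;

public partial class MainPage : UserControl
{
    private NotificationWindow notification;

    public MainPage()
    {
        InitializeComponent(); // XAML is assumed to contain a Button sitting at the top left
        Loaded += (s, e) => ShowToast();
    }

    private void ShowToast()
    {
        // NotificationWindow only works when the app runs out of browser.
        notification = new NotificationWindow { Width = 400, Height = 100 };
        notification.Content = new TextBlock { Text = "Toast content" };

        // On the SL5 runtime, while this toast is visible, clicks in the matching
        // 400x100 region at the top left of the main window are reportedly swallowed.
        notification.Show(10000); // display for 10 seconds
    }
}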
I've hit this problem in an SL4 application where users with the Silverlight 5 runtime can no longer use the buttons at the top left of the main window.
Any help would be much appreciated.
Thanks
Danny

WPF ScrollViewer PanningMode moves Window?

I have added a ScrollViewer to my WPF 4 Window and set ScrollViewer.PanningMode.
When I drag the ScrollViewer on my multitouch screen it works fine. But the Window moves along with the scroll when I reach the top or the bottom of the scroll range.
How can we avoid this?
Edit, to complete the answer: there is a native event for this which can simply be marked as handled: ManipulationBoundaryFeedback.
This movement is called boundary feedback, which is governed by the operating system (users can configure it in the Pen and Touch settings on the Panning tab). I do not know whether the Windows API allows you to prevent it; this page might be relevant.
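A minimal sketch of that edit, assuming the ScrollViewer is given the x:Name PanningScrollViewer in the window's XAML: handle ManipulationBoundaryFeedback and mark it handled so the boundary nudge never turns into window movement.

using System.Windows;
using System.Windows.Input;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        // PanningScrollViewer is the ScrollViewer with PanningMode set.
        PanningScrollViewer.ManipulationBoundaryFeedback += OnBoundaryFeedback;
    }

    private void OnBoundaryFeedback(object sender, ManipulationBoundaryFeedbackEventArgs e)
    {
        // Swallow the feedback so the OS does not translate it into moving the Window.
        e.Handled = true;
    }
}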

WPF popups or modal windows as user entry screens?

I am building an application which has multiple user entry screens. I would like to know whether there are advantages/disadvantages to using WPF popups rather than modal windows.
I am using mvvm-light.
I have noticed that popups are used extensively in touch applications (e.g. on the iPad).
The issue is really one of desktop vs. web applications. Popups in Silverlight (or other touch apps) involve having only one real window to work with (the mobile surface or the web browser). If you are writing a desktop app, then modal windows will probably match user expectations better, as Popups cannot leave the parent window.
Popups are nice but very difficult to control. In our apps we use adorners as 'pop-up' editors: we have created a control that can hold any other control and display it in the adorner layer of the main window. This lets us do things such as having one control appear next to another while the other control stays usable, or greying out the background and forcing focus to the new control so that nothing else can be used until the 'OK' button is pressed. If you Google for adorners in WPF you will find a lot of excellent articles.
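As a rough illustration only (not the poster's actual control; ContentAdorner and MyEditorControl are made-up names), an adorner that hosts an arbitrary element in another element's adorner layer can be put together like this:

using System.Windows;
using System.Windows.Documents;
using System.Windows.Media;

// Hosts any FrameworkElement in the adorner layer of the adorned element.
public class ContentAdorner : Adorner
{
    private readonly VisualCollection visuals;
    private readonly FrameworkElement child;

    public ContentAdorner(UIElement adornedElement, FrameworkElement content)
        : base(adornedElement)
    {
        child = content;
        visuals = new VisualCollection(this);
        visuals.Add(child);
    }

    protected override int VisualChildrenCount
    {
        get { return visuals.Count; }
    }

    protected override Visual GetVisualChild(int index)
    {
        return visuals[index];
    }

    protected override Size MeasureOverride(Size constraint)
    {
        child.Measure(constraint);
        return child.DesiredSize;
    }

    protected override Size ArrangeOverride(Size finalSize)
    {
        child.Arrange(new Rect(new Point(0, 0), finalSize));
        return finalSize;
    }
}

// Usage: show an editor next to (or on top of) an existing element.
// AdornerLayer layer = AdornerLayer.GetAdornerLayer(targetElement);
// layer.Add(new ContentAdorner(targetElement, new MyEditorControl()));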
