How does touch / gesture support in WPF compare with the Windows Store API?

I have been looking but have so far been unable to find a good comparison of the differences in touch / gesture support between the Windows Store app APIs and WPF.
I have seen that WPF includes some basic touch events, but do the WPF controls handle gestures such as swipe, hold and tap, or would we need to implement our own identification of these gestures using the basic touch events?
Thanks
Gavin

Here is the full list of Windows Store app controls. When developing a Windows Store app, you will use these controls. They have (where applicable) built-in gesture support.
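The standard WPF controls, by contrast, don't surface tap/hold/swipe as high-level events, so one common approach is to recognise them from the raw touch events yourself. Below is a minimal, illustrative sketch; the class name and the 10-pixel / 300 ms thresholds are assumptions, not values from any official source:

```csharp
using System;
using System.Windows;
using System.Windows.Input;

// Illustrative element that recognises a tap from WPF's raw touch events.
public class TapAwareElement : FrameworkElement
{
    Point downPosition;
    DateTime downTime;

    protected override void OnTouchDown(TouchEventArgs e)
    {
        downPosition = e.GetTouchPoint(this).Position;
        downTime = DateTime.Now;
        CaptureTouch(e.TouchDevice);   // keep receiving this finger's events
        e.Handled = true;
    }

    protected override void OnTouchUp(TouchEventArgs e)
    {
        ReleaseTouchCapture(e.TouchDevice);
        Point up = e.GetTouchPoint(this).Position;
        TimeSpan held = DateTime.Now - downTime;

        // Arbitrary thresholds: under 10 px of movement and under 300 ms.
        bool moved = Math.Abs(up.X - downPosition.X) > 10 ||
                     Math.Abs(up.Y - downPosition.Y) > 10;

        if (!moved && held.TotalMilliseconds < 300)
        {
            // Treat as a tap. A longer press without movement would be a hold,
            // and a large translation along one axis would be a swipe.
        }
        e.Handled = true;
    }
}
```

For swipes and pinches, the manipulation events (ManipulationStarted/Delta/Completed) are often a more convenient basis than tracking raw touch points by hand.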

Related

WPF native controls in Xamarin

I was playing around with Xamarin and WPF after following the BoxViewClock tutorial from the Xamarin website, but apart from that one article, which has been duplicated on a number of sites, I cannot find any other advanced examples.
Is it currently possible to add/use a WPF custom control on a Xamarin.Forms page? Or, for that matter, a view/user control?
You typically start creating a custom control by deriving from one of the Xamarin.Forms controls, and you do this in the shared .NET Standard library that holds the Xamarin.Forms UI project. As soon as there is behaviour which is not feasible within Xamarin.Forms, you create a native renderer in a project targeting the specific platform, in our case the WPF project. Thus there are two steps for creating custom controls in Xamarin.Forms:
Derive from an existing Xamarin.Forms control in the shared project.
Extend it via a platform renderer (or a behaviour, which is a special case of a renderer) in the platform project.
You can find an example of an "ExtendedLabel" control, which extends the standard Xamarin.Forms Label with the capability to change the cursor to a hand when the mouse hovers over the label in a WPF application:
Xamarin.Forms control: https://github.com/maxim-saplin/CrossPlatformDiskTest/blob/master/Saplin.CPDT.UICore/Controls/ExtendedLabel.cs
WPF platform renderer: https://github.com/maxim-saplin/CrossPlatformDiskTest/blob/master/Saplin.CPDT.WPF/ExtendedLabelRenderer.cs - you can ignore the code after "if (label.FormattedText != null &&...", which is a workaround for a bug in Xamarin.WPF and not related to the question.
P.S.: the repo also contains a renderer for macOS which implements the same behaviour.
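To make the two-step pattern concrete, here is a minimal sketch; it is not taken from the linked repo, and HoverLabel, ShowHandCursor and HoverLabelRenderer are illustrative names. In a real solution the two classes would live in the shared project and the WPF project respectively:

```csharp
using Xamarin.Forms;
using Xamarin.Forms.Platform.WPF;

[assembly: ExportRenderer(typeof(HoverLabel), typeof(HoverLabelRenderer))]

// Step 1 (shared project): derive from an existing Xamarin.Forms control.
public class HoverLabel : Label
{
    // Hypothetical bindable property the WPF renderer will react to.
    public static readonly BindableProperty ShowHandCursorProperty =
        BindableProperty.Create(nameof(ShowHandCursor), typeof(bool), typeof(HoverLabel), false);

    public bool ShowHandCursor
    {
        get { return (bool)GetValue(ShowHandCursorProperty); }
        set { SetValue(ShowHandCursorProperty, value); }
    }
}

// Step 2 (WPF project): attach native behaviour via a platform renderer.
public class HoverLabelRenderer : LabelRenderer
{
    protected override void OnElementChanged(ElementChangedEventArgs<Label> e)
    {
        base.OnElementChanged(e);
        if (Control != null && e.NewElement is HoverLabel hover && hover.ShowHandCursor)
        {
            // Control is the native WPF TextBlock; use the native cursor API.
            Control.Cursor = System.Windows.Input.Cursors.Hand;
        }
    }
}
```

A production renderer would also override OnElementPropertyChanged so the cursor updates if ShowHandCursor changes after the control is attached.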

Touch pan gesture isn't translated to WM_VSCROLL in Windows Forms

I'm working on implementing native Windows touch support in a legacy WinForms app for .NET 4.0+. The app is being developed in VS 2013, and the main test system is Win 8.1 Pro. I'm doing this in a custom control, which is a descendant of the Control class.
According to the Windows Touch Gestures Overview MSDN article, "the default gesture handler maps some gestures to Windows messages that were used in previous versions of Windows" (see the 'Legacy Support' subsection). However, in my tests the basic pan gesture, which one can use to scroll a control vertically with one finger, isn't translated to WM_VSCROLL. The protected OnMouseWheel method isn't raised either.
Have I missed any important settings, or is there anything else that needs to be turned on to enable this default mapping for the basic touch gestures?
I have tested this on Windows 10. A window without the WS_VSCROLL style will not receive the legacy WM_VSCROLL message for the touch pan gesture.
You need to do the translation yourself, but it is very simple: handling WM_GESTURE in your custom control's WndProc and translating the GID_PAN action to WM_VSCROLL should do the job.
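A rough sketch of that translation follows. It assumes the default gesture configuration (no SetGestureConfig call), the GESTUREINFO interop declaration is simplified, and the class name is illustrative:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class TouchPanControl : Control
{
    const int WM_GESTURE  = 0x0119;
    const int WM_VSCROLL  = 0x0115;
    const int GID_PAN     = 4;
    const int GF_BEGIN    = 0x1;
    const int SB_LINEUP   = 0;
    const int SB_LINEDOWN = 1;

    // Simplified GESTUREINFO; see the Windows SDK headers for the full definition.
    [StructLayout(LayoutKind.Sequential)]
    struct GESTUREINFO
    {
        public uint cbSize;
        public uint dwFlags;
        public uint dwID;
        public IntPtr hwndTarget;
        public short ptsLocationX;
        public short ptsLocationY;
        public uint dwInstanceID;
        public uint dwSequenceID;
        public ulong ullArguments;
        public uint cbExtraArgs;
    }

    [DllImport("user32.dll")]
    static extern bool GetGestureInfo(IntPtr hGestureInfo, ref GESTUREINFO pGestureInfo);

    [DllImport("user32.dll")]
    static extern bool CloseGestureInfoHandle(IntPtr hGestureInfo);

    [DllImport("user32.dll")]
    static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

    int lastPanY;

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_GESTURE)
        {
            var gi = new GESTUREINFO { cbSize = (uint)Marshal.SizeOf(typeof(GESTUREINFO)) };
            if (GetGestureInfo(m.LParam, ref gi) && gi.dwID == GID_PAN)
            {
                if ((gi.dwFlags & GF_BEGIN) != 0)
                {
                    lastPanY = gi.ptsLocationY;   // remember where the pan started
                }
                else
                {
                    // Translate vertical finger movement into legacy scroll messages.
                    int delta = gi.ptsLocationY - lastPanY;
                    lastPanY = gi.ptsLocationY;
                    if (delta < 0)
                        SendMessage(Handle, WM_VSCROLL, (IntPtr)SB_LINEDOWN, IntPtr.Zero);
                    else if (delta > 0)
                        SendMessage(Handle, WM_VSCROLL, (IntPtr)SB_LINEUP, IntPtr.Zero);
                }
            }
            CloseGestureInfoHandle(m.LParam);
            return; // the gesture has been handled
        }
        base.WndProc(ref m);
    }
}
```

A finished version would scale the pan distance to scroll lines rather than sending one SB_LINEUP/SB_LINEDOWN per message, but the structure is the same.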

User focus in multitouch environment

I am trying to create a multitouch application.
I have the hardware which will allow me to do this. On the software side I want to be able to have WPF textboxes, WPF web browsers, multiple focuses, multiple keyboards and multiple users at the same time.
From what I've seen, focus can't be on two controls at the same time.
What is the Microsoft multitouch approach for this kind of job?
The OS limitations are what they are (and don't appear to change in Win8): only one hWnd at a time can have focus.
Since you are using WPF though, everything within your application (with the exception of ActiveX widgets, such as the WebBrowser control, that you may be using) is rendered within one big hWnd.
WPF 4 introduced native support for multitouch, including multi-touch capture. The APIs for this are many but pretty intuitive so I'll just say this... go to http://msdn.microsoft.com/en-us/library/ms590078.aspx and search within the page for all of the members with "Touch" in their name.
The catch however is that the controls shipping with WPF 4 don't work with the touch input events... you'll only be able to interact with one of those controls at a time. To take advantage of the multi-touch capture APIs, you'll have to create controls that are designed with it in mind. Fortunately, the Surface team at Microsoft has you covered on that... the "Surface 2.0 SDK" includes a suite of controls (usable on any Win7 machine, not just for Surface) that were built with this stuff in mind.
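As a rough illustration of the multi-touch capture APIs (the class and field names here are assumptions, not SDK types), a minimal element that tracks each finger independently might look like this:

```csharp
using System.Collections.Generic;
using System.Windows;
using System.Windows.Input;

// Illustrative element: captures each touch device individually so that
// several fingers can interact with the application at the same time.
public class MultiTouchCanvas : FrameworkElement
{
    readonly Dictionary<int, Point> lastPoint = new Dictionary<int, Point>();

    protected override void OnTouchDown(TouchEventArgs e)
    {
        CaptureTouch(e.TouchDevice);   // capture this finger only
        lastPoint[e.TouchDevice.Id] = e.GetTouchPoint(this).Position;
        e.Handled = true;
    }

    protected override void OnTouchMove(TouchEventArgs e)
    {
        if (e.TouchDevice.Captured == this)
        {
            Point p = e.GetTouchPoint(this).Position;
            // ... update whatever visual this particular finger is dragging ...
            lastPoint[e.TouchDevice.Id] = p;
            e.Handled = true;
        }
    }

    protected override void OnTouchUp(TouchEventArgs e)
    {
        ReleaseTouchCapture(e.TouchDevice);
        lastPoint.Remove(e.TouchDevice.Id);
        e.Handled = true;
    }
}
```

Because each TouchDevice is captured separately, two fingers on two such elements can be manipulated simultaneously, which is exactly what the Surface 2.0 SDK controls are built around.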
To create an application with a multitouch UI, you can also use the MultiTouch Framework for .NET:
http://multitouchvista.codeplex.com/

Silverlight Windows Phone 7: Gesture events?

I'm mocking up a WP7 app in Expression Blend and looking to set up an event handler in response to certain gestures. Some Bing-ing shows people recommending handling "on click" or similar and checking the time between taps yourself, but it seems like there should be an easier way.
Is there an event for Silverlight controls that fires for gestures (or for specific types of gestures)?
Silverlight for Windows Phone 7 doesn't natively contain any support for gestures. However, there are a number of options available:
The Silverlight for Windows Phone Toolkit contains a GestureService/GestureListener which I'd recommend looking at first.
Laurent Bugnion has created some MultiTouch behaviours which may be of interest, depending on your specific requirements.
A final option would be to use some of the gestures supported by the XNA Framework. Mike Ormond has written a good introduction to using them.
Beyond that, you'll have to detect/determine gestures yourself through use of the ManipulationStarted, ManipulationDelta and ManipulationCompleted events (a rough sketch follows below). MSDN also has a guide to handling manipulation events which you could use as a pointer to detecting gestures yourself.
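For instance, a horizontal swipe could be recognised from the accumulated translation once a manipulation completes. This is only a sketch under assumptions: the page class is illustrative (in a real WP7 project it would typically be a PhoneApplicationPage), and the 100-pixel threshold is arbitrary:

```csharp
using System;
using System.Windows.Controls;
using System.Windows.Input;

public partial class GesturePage : UserControl
{
    public GesturePage()
    {
        InitializeComponent();
        ManipulationCompleted += OnManipulationCompleted;
    }

    void OnManipulationCompleted(object sender, ManipulationCompletedEventArgs e)
    {
        // TotalManipulation accumulates movement since ManipulationStarted.
        double dx = e.TotalManipulation.Translation.X;
        double dy = e.TotalManipulation.Translation.Y;

        // Mostly-horizontal movement beyond the threshold counts as a swipe.
        if (Math.Abs(dx) > 100 && Math.Abs(dx) > Math.Abs(dy))
        {
            bool swipedRight = dx > 0;
            // ... raise your own Swipe event / navigate / animate here ...
        }
    }
}
```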

Wpf popups or modal windows as user entry screens?

I am building an application which has multiple user entry screens. I would like to know if there are advantages/disadvantages to using WPF popups rather than modal windows.
I am using mvvm-light.
I have noticed that popups are used extensively in touch applications (e.g. on the iPad).
The issue is really one of desktop vs. web applications. Popups in Silverlight (or other touch apps) involve having only one real window to work with (the mobile surface or the web browser). If you are writing a desktop app, then modal windows will probably match user expectations better, as Popups cannot leave the parent window.
Popups are nice but very difficult to control. In our apps we use adorners as 'pop-up' editors: we have created a control that can host any other control and display it in the adorner layer of the main window (a minimal sketch follows below). This allows us to do things such as having one control appear next to another while the other control is still in use, or greying out the background and forcing focus to the new control so that no other control can be used until the 'OK' button is pressed. If you Google for adorners in WPF you will find a lot of excellent articles.
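A minimal sketch of such a hosting adorner, with illustrative names (this is not our production control, just the basic shape of the technique):

```csharp
using System.Windows;
using System.Windows.Documents;
using System.Windows.Media;

// Illustrative adorner that hosts an arbitrary control in the adorner layer
// of another element, so the "popup" stays inside the main window's visual tree.
public class ControlHostAdorner : Adorner
{
    readonly VisualCollection visuals;
    readonly UIElement child;

    public ControlHostAdorner(UIElement adornedElement, UIElement child)
        : base(adornedElement)
    {
        this.child = child;
        visuals = new VisualCollection(this) { child };
    }

    protected override int VisualChildrenCount { get { return visuals.Count; } }
    protected override Visual GetVisualChild(int index) { return visuals[index]; }

    protected override Size MeasureOverride(Size constraint)
    {
        child.Measure(constraint);
        return AdornedElement.RenderSize;
    }

    protected override Size ArrangeOverride(Size finalSize)
    {
        // Lay the hosted control out over the adorned element.
        child.Arrange(new Rect(finalSize));
        return finalSize;
    }
}
```

Usage would be along the lines of AdornerLayer.GetAdornerLayer(target).Add(new ControlHostAdorner(target, editorControl)), with the adorner removed from the layer again when the 'OK' button is pressed.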
