For the basic scenario described in the MSDN overview (under Touch and Manipulation), TouchEnter and TouchLeave fire for every corresponding TouchDown and TouchUp respectively. Unlike the mouse, touch and stylus devices are not constrained to maintain contact with the screen.
Is there a way to use TouchEnter and TouchLeave to capture only the case where a finger is dragged into the UIElement? Since these events fire for every TouchDown and TouchUp, what is the best way to differentiate them?
One strategy that works for the single-finger case is to set a flag on TouchDown and check it on TouchUp, which allows some condition checks on TouchUp. For multiple fingers, however, it isn't feasible.
There are no PreviewTouchEnter and PreviewTouchLeave events fired, only PreviewTouchDown and PreviewTouchUp. The sequence of events for a finger lowered onto a UIElement and then raised while still over it is as follows:
TouchEnter
PreviewTouchDown
TouchDown
PreviewTouchUp
TouchUp
TouchLeave
This sequence doesn't help differentiate a TouchEnter caused by a finger dragged across the screen into the UIElement from one caused by a finger lowered directly onto the UIElement. Am I missing something, or does the framework simply not support this differentiation?
You could use the TouchDevice class to keep track of where touches are generated. New touches are given a new Id, so you can distinguish existing touches from new ones, and see which elements are capturing each device. I guess that circumvents the Manipulation events and the normal processes, but I hope that helps.
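Something like this minimal sketch could work, assuming the Id set lives at the window level; all handler and field names here are illustrative:

// Each finger keeps one TouchDevice Id for its whole lifetime. TouchEnter
// fires *before* TouchDown for a direct press, so an Id that is already in
// the set must have gone down somewhere else and been dragged in.
private readonly HashSet<int> _activeIds = new HashSet<int>();

// Wired on the Window:
private void Window_PreviewTouchDown(object sender, TouchEventArgs e)
{
    _activeIds.Add(e.TouchDevice.Id);
}

private void Window_PreviewTouchUp(object sender, TouchEventArgs e)
{
    _activeIds.Remove(e.TouchDevice.Id);
}

// Wired on the element of interest:
private void Element_TouchEnter(object sender, TouchEventArgs e)
{
    if (_activeIds.Contains(e.TouchDevice.Id))
    {
        // The finger was already down elsewhere and was dragged in.
    }
    else
    {
        // A fresh press directly on the element; TouchDown will follow.
    }
}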
If you retrieve a TouchPoint for the event, it has a property named Action which tells you whether it is a Down, a Move, or an Up event.
void m_element_TouchEnter(object sender, System.Windows.Input.TouchEventArgs e)
{
    // Get the touch point relative to the element of interest.
    var touchPoint = e.GetTouchPoint(m_someElement);

    if (touchPoint.Action == System.Windows.Input.TouchAction.Move)
    {
        // The finger was already down and moved in: a "true" TouchEnter event.
    }
    else if (touchPoint.Action == System.Windows.Input.TouchAction.Down)
    {
        // The finger was lowered directly onto the element: a "true" TouchDown event.
    }
}
Related
Is there a way in WPF to get the active touch points? I need to determine whether the user is touching the screen, similar to the Mouse class's button-state properties.
I just need to know if any touch is present on the screen; I don't care which UIElement it's touching.
Here are two options, but they may not be the most correct way to do it:
1) You could subscribe to MainWindow.PreviewTouchDown and MainWindow.PreviewTouchUp and maintain a list of all the current touch devices. It would be easy to implement but could make your code messy.
2) Subscribe to Touch.FrameReported, from which you can get a collection of touch points via TouchFrameEventArgs.GetTouchPoints(null). This fires for every touch event, so it may be too often, but it allows you to handle the event from any class. A minimal sketch of both options follows.
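Here is that sketch; it assumes the code lives in the main window's code-behind, and the member names (_activeTouches, IsAnyTouchActive, OnTouchFrameReported) are illustrative:

public partial class MainWindow : Window
{
    // Option 1: track active TouchDevice Ids via the window's tunneling events.
    private readonly HashSet<int> _activeTouches = new HashSet<int>();

    public MainWindow()
    {
        InitializeComponent();

        PreviewTouchDown += (s, e) => _activeTouches.Add(e.TouchDevice.Id);
        PreviewTouchUp += (s, e) => _activeTouches.Remove(e.TouchDevice.Id);

        // Option 2: one static event that fires for every touch frame,
        // usable from any class.
        Touch.FrameReported += OnTouchFrameReported;
    }

    public bool IsAnyTouchActive
    {
        get { return _activeTouches.Count > 0; }
    }

    private void OnTouchFrameReported(object sender, TouchFrameEventArgs e)
    {
        // Points with Action == TouchAction.Up are being lifted in this frame.
        var points = e.GetTouchPoints(null);
    }
}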
You can subscribe to your main window's ManipulationStarting event (fired when the first finger makes contact with the screen), ManipulationInertiaStarting event (fired when the last finger lifts off the screen), and/or ManipulationDelta event (fired when any finger moves).
Within your event handlers you can get a list of all current touch points via ManipulationDeltaEventArgs.Manipulators.
Don't forget to set your main window's IsManipulationEnabled to true.
This way you just have to remember whether a manipulation is currently in progress or not. You don't have to keep track of all the individual touch points yourself.
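A minimal sketch of that approach, assuming the wiring happens in the main window's constructor and _manipulationInProgress is an illustrative field:

private bool _manipulationInProgress;

public MainWindow()
{
    InitializeComponent();
    IsManipulationEnabled = true;

    // First finger makes contact with the screen.
    ManipulationStarting += (s, e) => _manipulationInProgress = true;

    // Last finger has lifted off the screen.
    ManipulationInertiaStarting += (s, e) => _manipulationInProgress = false;

    // Any finger moves; enumerate all current touch points.
    ManipulationDelta += (s, e) =>
    {
        foreach (IManipulator manipulator in e.Manipulators)
        {
            Point position = manipulator.GetPosition(this);
        }
    };
}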
I have a Canvas with one child control. The child control receives all PreviewTouchDown and PreviewTouchUp events fine, but after enabling manipulation (IsManipulationEnabled = true) on the Canvas, only the "down" events fire on the child object; the TouchUp and PreviewTouchUp events don't fire at all.
Any ideas what's going on here?
You need to set IsManipulationEnabled to true on the child element, too.
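For example (a sketch; the element names are illustrative):

// Both the parent Canvas and the child need manipulation enabled, or the
// Canvas's manipulation logic will capture the touch away from the child.
myCanvas.IsManipulationEnabled = true;
myChildControl.IsManipulationEnabled = true;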
The relationship between touch and manipulation events is explained in Input Overview / Touch and Manipulation, section The Relationship Between Touch and Manipulation Events:
A UIElement can always receive touch events. When the IsManipulationEnabled property is set to true, a UIElement can receive both touch and manipulation events. If the TouchDown event is not handled (that is, the Handled property is false), the manipulation logic captures the touch to the element and generates the manipulation events. If the Handled property is set to true in the TouchDown event, the manipulation logic does not generate manipulation events. The following illustration shows the relationship between touch events and manipulation events.
[Illustration: Touch and manipulation events]
I know it's a year old, but this might help someone:
As a workaround, you can capture "stylus tap" events just fine if the parent element has IsManipulationEnabled = true:
// ...
MyChildElement.StylusSystemGesture += MyChildElement_StylusSystemGesture;
// ...

void MyChildElement_StylusSystemGesture(object sender, StylusSystemGestureEventArgs e)
{
    if (e.SystemGesture == SystemGesture.Tap)
    {
        // Do something
    }
}
How does a button work with regard to tapping?
I would like to simulate tapping on a control. To do so, I cannot just handle the TouchUp event, because I want to ensure the user actually tapped at that location and didn't just move a finger there and release it over my control.
Good scenario: tapping on my control.
Bad scenario: touching a place outside my control, moving onto my control while still touching, then releasing the touch. This fires TouchUp, yet I don't want to handle that scenario.
Eran.
Thank you for replying. I found the answer.
If you explicitly force capture to a specific element (i.e. elementA), then all the following events will be routed to that element.
private void elementA_TouchDown(object sender, TouchEventArgs e)
{
    // Capture the device so every subsequent event from this touch routes to elementA.
    elementA.CaptureTouch(e.TouchDevice);
}
If you don't force capture to a specific element, then the TouchUp event will fire on whichever element's boundaries the finger is within when you release the touch.
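To complete the pattern, here is a sketch of a matching TouchUp handler; the inside-bounds check is an illustrative extra rather than part of the original answer:

private void elementA_TouchUp(object sender, TouchEventArgs e)
{
    // Release the capture taken in TouchDown.
    elementA.ReleaseTouchCapture(e.TouchDevice);

    // Because of the capture, this TouchUp fires only for touches that began
    // on elementA. Optionally confirm the finger is still over it on release:
    Point position = e.GetTouchPoint(elementA).Position;
    bool releasedInside = new Rect(elementA.RenderSize).Contains(position);
}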
I have implemented a UserControl. I would like to handle an event that is originally handled by the Window (a key press). What is the best way to route an event caught by another component higher in the component tree?
Thanks in advance for the replies and hints!
Cheers
It depends on the event you're trying to access. If it's a Preview event and the Window is setting e.Handled to true, you'll need to use the method Alex suggests to circumvent the Window's handling of the tunneling. If it is a bubbling event (e.g. KeyDown) you don't need to do anything special, since bubbling events hit the handlers on child elements first and travel up the visual tree, so the Window's handler won't run until after your UserControl's.
One thing you need to be careful with when using key events is that the event will only reach your UserControl in the first place if focus is on or inside it. This isn't something you need to worry about with things like mouse events, since they start at a specific location in the tree.
I believe you cannot guarantee that.
The Window class wraps the Win32 message-based event model, and it will be the only WPF entity with access to that information.
I suggest you create an attached property (to be used by the Window) and implement the routing of the events yourself, so that controls can subscribe to it.
You can attach the routed handler specifying that you want to handle handled messages as well:
this.AddHandler(routedEvent, handler, true);
where this is a UIElement or derived class.
However, there may still be events (key presses in this case) which don't make it past the window; I'm not sure.
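A minimal sketch, assuming a UserControl that wants to see KeyDown even after an ancestor has marked it handled (the control and handler names are illustrative):

public partial class MyUserControl : UserControl
{
    public MyUserControl()
    {
        InitializeComponent();

        // The trailing 'true' (handledEventsToo) makes this handler run even
        // for events whose Handled property was already set to true.
        AddHandler(Keyboard.KeyDownEvent,
                   new KeyEventHandler(OnAnyKeyDown),
                   true);
    }

    private void OnAnyKeyDown(object sender, KeyEventArgs e)
    {
        // Runs even if another handler set e.Handled = true, provided the
        // event routes through this element (i.e. focus is within it).
    }
}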
I've messed around with PreviewLostKeyboardFocus, which almost gets you there. I've seen a couple of implementations using LostFocus, but that just forces focus back onto the TextBox after it has already lost focus, and you can easily see the focus shifting on the screen. Basically, I'm looking for the same type of behavior you could get by using OnValidating in WinForms.
In my opinion, the best way is generally not to do it. It is almost always better to just disable the other controls or prevent saving until the value is valid.
But if your design really needs this ability, here is what you should do:
Intercept the Preview version of keyboard and mouse events at your window level, or at whatever scope you want to prevent focus changes within (e.g. maybe not your menu bar).
When a Tab or Return KeyDown is detected in the text box, or when a MouseDown is detected outside the text box while it has focus, call UpdateSource() on the binding expression; then, if validation has failed, set Handled = true to prevent the KeyDown or MouseDown event from being processed further (see the sketch after these steps).
Also continue handling PreviewLostKeyboardFocus to catch any causes of focus change that don't come from the keyboard or mouse, or that your other code didn't recognize.
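Here is a minimal sketch of the keyboard and mouse parts, assuming a TextBox named userTextbox whose Text binding carries the validation; all names are illustrative:

private void Window_PreviewKeyDown(object sender, KeyEventArgs e)
{
    if (!userTextbox.IsKeyboardFocusWithin)
        return;

    if (e.Key == Key.Tab || e.Key == Key.Return)
    {
        BindingExpression be = userTextbox.GetBindingExpression(TextBox.TextProperty);
        be.UpdateSource();

        // Block the focus change while the value is invalid.
        if (be.HasError)
            e.Handled = true;
    }
}

private void Window_PreviewMouseDown(object sender, MouseButtonEventArgs e)
{
    // A click outside the text box while it has focus would also move focus.
    if (userTextbox.IsKeyboardFocusWithin && !userTextbox.IsMouseOver)
    {
        BindingExpression be = userTextbox.GetBindingExpression(TextBox.TextProperty);
        be.UpdateSource();

        if (be.HasError)
            e.Handled = true;
    }
}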
To add onto Ray's answer:
UpdateSource is called like so:
BindingExpression be = userTextbox.GetBindingExpression(TextBox.TextProperty);
be.UpdateSource();
Also, as an alternative, you can set the text box binding to:
UpdateSourceTrigger="PropertyChanged"
The latter causes the source to update (and validate) on every change to the text, whereas the former updates only when needed, which performs better.
If you attempt to focus an element inside its own LostFocus handler, you will face a StackOverflowException. I'm not sure about the root cause (I suspect the focus kind of bounces back and forth), but there is an easy workaround: dispatch it.
private void TextBox_LostFocus(object sender, RoutedEventArgs e)
{
    var element = sender as TextBox;
    if (!theTextBoxWasValidated())
    {
        // doing this would cause a StackOverflowException
        // element.Focus();
        var restoreFocus = (System.Threading.ThreadStart)delegate { element.Focus(); };
        Dispatcher.BeginInvoke(restoreFocus);
    }
}
Through Dispatcher.BeginInvoke you make sure that restoring the focus doesn't get in the way of the in-progress focus loss (and you avoid the nasty exception you'd face otherwise).