Simulate tap in WPF using the Windows 7 touch framework

How does a button work with regard to tapping?
I would like to simulate tapping on a control. To do so I cannot just handle the TouchUp event, because I want to ensure the user actually tapped at that location and didn't just drag a finger over and release it on my control.
Good scenario: tapping on my control.
Bad scenario: touching a place outside my control, moving the finger onto my control while still touching, then releasing the touch. This raises TouchUp on my control, yet I don't want to handle that scenario.
Eran.

Thank you for replying. I found the answer.
If you explicitly force capture to a specific element (e.g. elementA), then all following touch events will be routed to that element.
private void elementA_TouchDown(object sender, TouchEventArgs e)
{
    // Capture the touch so all subsequent events from this touch device
    // are routed to elementA.
    elementA.CaptureTouch(e.TouchDevice);
}
If you don't capture the touch to a specific element, the TouchUp event fires on whichever element's boundaries the touch was within when it was released.
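Building on the handler above, the TouchUp side of a tap check could look roughly like the sketch below (names and wiring are assumptions; you may also want to verify the release point is still within the control's bounds):

private void elementA_TouchUp(object sender, TouchEventArgs e)
{
    // If the touch originally went down somewhere else, elementA never
    // captured it, so this TouchUp is not a tap on the control.
    if (e.TouchDevice.Captured == elementA)
    {
        // Handle the tap here.
        elementA.ReleaseTouchCapture(e.TouchDevice);
        e.Handled = true;
    }
}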

Related

WPF get active touch points

Is there a way in WPF to get the active touch points? I need to determine whether the user is touching the screen, similar to checking a pressed-button property on the Mouse class.
I just need to know if any touch is present on the screen - I don't care which UIElement it's touching.
Here are two options, though they may not be the most correct way to do it:
1) You could subscribe to MainWindow.PreviewTouchDown and MainWindow.PreviewTouchUp and maintain a list of all the current touch devices. It would be easy to implement but could make your code messy.
2) Subscribe to Touch.FrameReported, from which you can get a collection of touch points via TouchFrameEventArgs.GetTouchPoints(null). This fires on every touch event, so it may be too often, but it allows you to handle the event from any class (see the sketch after this list).
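A rough sketch of option 2 (the field and method names are placeholders; requires System.Linq and System.Windows.Input):

private bool _isAnyTouchPresent;

private void HookTouchFrame()
{
    Touch.FrameReported += OnTouchFrameReported;
}

private void OnTouchFrameReported(object sender, TouchFrameEventArgs e)
{
    // GetTouchPoints(null) returns every touch point in the frame,
    // regardless of which element it is over.
    var points = e.GetTouchPoints(null);
    _isAnyTouchPresent = points.Any(p => p.Action != TouchAction.Up);
}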
You can subscribe to your main window's ManipulationStarting event (raised when the first finger makes contact with the screen), its ManipulationInertiaStarting event (raised when the last finger lifts off the screen) and/or its ManipulationDelta event (raised when any finger moves).
Within your event handlers you can get a list of all current touch points via ManipulationDeltaEventArgs.Manipulators.
Don't forget to set your main window's IsManipulationEnabled to true.
This way you just have to remember whether a manipulation is currently in progress or not. You don't have to keep track of all the individual touch points yourself.
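A hedged sketch of that approach (handler names and the flag are placeholders; the handlers are assumed to be wired to the main window, which has IsManipulationEnabled set to true):

private bool _manipulationInProgress;

private void Window_ManipulationStarting(object sender, ManipulationStartingEventArgs e)
{
    _manipulationInProgress = true;    // first finger has made contact
    e.ManipulationContainer = this;    // report coordinates relative to the window
}

private void Window_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // One entry per finger currently taking part in the manipulation
    // (Count() requires System.Linq).
    int fingerCount = e.Manipulators.Count();
}

private void Window_ManipulationInertiaStarting(object sender, ManipulationInertiaStartingEventArgs e)
{
    _manipulationInProgress = false;   // last finger has lifted
}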

Strategy for differentiating TouchUp from TouchLeave, and TouchDown from TouchEnter?

For the basic scenario described in the MSDN overview (under Touch and Manipulation), TouchEnter and TouchLeave are fired for every corresponding TouchDown and TouchUp respectively. Unlike the mouse, touch and stylus input are not constrained to maintain contact with the screen.
Is there a way to use TouchEnter and TouchLeave to capture only the case where a finger is dragged into the UIElement? Since these events fire for every TouchUp and TouchDown, what is the best way to differentiate them?
One strategy that works for the single-finger case is to set a flag on TouchDown and check it on TouchUp, which allows some condition checks on TouchUp. For multiple fingers, however, it isn't feasible.
There are no PreviewTouchEnter and PreviewTouchLeave events, only PreviewTouchDown and PreviewTouchUp. The sequence of events for a finger lowered onto a UIElement and then raised over it is as follows:
TouchEnter
PreviewTouchDown
TouchDown
PreviewTouchUp
TouchUp
TouchLeave
This sequence doesn't help differentiate a TouchEnter that happened because a finger was dragged across the screen into the UIElement from one where the finger was lowered directly onto the UIElement. Am I missing something, or does the framework simply not support this differentiation?
You can use the TouchDevice class to keep track of where touches are generated. New touches are given a new Id, so you can distinguish between existing touches and new ones, and see which elements have captured each device. I guess that circumvents the Manipulation events and the normal processes, but I hope that helps. A rough sketch of the idea follows.
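This sketch tracks touch Ids at the window level, relying on the enter events preceding the down tunnel as in the sequence listed above; all names are placeholders:

// Ids of touch devices currently down anywhere in the window.
private readonly HashSet<int> _activeTouchIds = new HashSet<int>();

private void Window_PreviewTouchDown(object sender, TouchEventArgs e)
{
    _activeTouchIds.Add(e.TouchDevice.Id);
}

private void Window_PreviewTouchUp(object sender, TouchEventArgs e)
{
    _activeTouchIds.Remove(e.TouchDevice.Id);
}

private void Element_TouchEnter(object sender, TouchEventArgs e)
{
    // TouchEnter fires before the Down events, so an Id we already know
    // about must belong to a finger dragged in from elsewhere on the screen.
    if (_activeTouchIds.Contains(e.TouchDevice.Id))
    {
        // Finger was dragged into the element.
    }
    else
    {
        // Finger was placed directly on the element.
    }
}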
If you retrieve a TouchPoint for the event, there is a property on it named Action which tells you whether it is a Down, a Move or an Up event.
void m_element_TouchEnter(object sender, System.Windows.Input.TouchEventArgs e)
{
    var touchPoint = e.GetTouchPoint(m_someElement);

    if (touchPoint.Action == System.Windows.Input.TouchAction.Move)
    {
        // This is a "true" TouchEnter event.
    }
    else if (touchPoint.Action == System.Windows.Input.TouchAction.Down)
    {
        // This is a "true" TouchDown event.
    }
}

Databinding falls behind event notification - discussion

Here's an interesting problem that I first found in WinForms, found again in Silverlight, and that more than likely applies to WPF as well when it comes to databinding.
I have a tab control with several tabs. As the user clicks across the tabs, each tab should be valid before the user is allowed to switch away from it.
For example, the user is in a text box which is updated. A text box's binding is not flushed until the control loses focus, and loss of focus only occurs when focus is given to another control.
In this scenario, the user tabs into a control (let's use a text box for this example) and updates it. At this point the databinding has not flushed the control, and hence the VM has not yet seen the change. The user then uses the mouse to click the next tab of the tab control.
At this point things get interesting. I used PreviewSelectionChanged (Telerik RadTabControl) because I want to check things out before the jump to the next tab occurs, and it also gives me the ability to cancel the event.
However, when I look at the VM in this event, it still does not have the updated data. I see the VM is clean and go ahead and allow the jump to the next tab.
As soon as this event is over, however, the databindings flush and the VM gets updated. What now? The events are out of sync! When the mouse was used to click the next tab, the text box should have lost focus and flushed its bindings before the tab click's Preview event. It's too late to jump back and say oops, we didn't catch that in time!
I think I found an interesting workaround for this issue, but I'm not 100% sure it will work 100% of the time. I cancel the current event, then use the Dispatcher to queue a delegate pointing to another method with the same signature as the current event. The Dispatcher adds this message to the message pump, which by this time will (hopefully?) be behind the messages of the VM updating...
My two questions are:
1) I'm assuming that the text box either didn't flush its binding when the mouse left the control, or the process that was fired was too slow and hence the preview message was on the pump before the databinding - either way I see this as a major issue.
2) Is the workaround a good solution?
Ok, first to answer question 1:
Just because the mouse left the textbox area doesn't mean the textbox lost focus. It only loses focus once something else gets focus. For example, if you move the mouse out of the textbox and click on some other control on your page (anything from a scroll viewer to another textbox, etc.), then your textbox will lose focus.
Based on that, the events do not happen in the wrong order. What happens is: your click on the other tab triggers both the textbox losing focus (and the data binding taking place) and the move to the next tab, so you basically get a race condition where the move to the next tab happens before the databinding takes place.
About question 2:
What you can do is set the UpdateSourceTrigger to Explicit; you will, however, then be forced to have some kind of TextChanged handler and update the binding manually (a sketch follows).
You can read more about that here. It might not be the most complete explanation but is a good place to start.
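A minimal sketch of the Explicit approach (handler and names are placeholders; the binding on the TextBox is assumed to have UpdateSourceTrigger=Explicit):

private void MyTextBox_TextChanged(object sender, TextChangedEventArgs e)
{
    var textBox = (TextBox)sender;

    // Push the current text to the view model immediately, so the
    // tab-change check sees the latest value.
    BindingExpression binding = textBox.GetBindingExpression(TextBox.TextProperty);
    if (binding != null)
    {
        binding.UpdateSource();
    }
}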
On the other hand, you can associate some events with the textbox and force it to lose focus on those events (e.g. mouse out).
Just an idea: Why not do everything in the VM's PropertyChanged event?
protected override void OnThisViewModelPropertyChanged(object sender, PropertyChangedEventArgs e)
{
    if (e.PropertyName == "WhateverProperty")
    {
        // Do your magic here for whatever you want to set
    }
}
Have your TabItems bound to a property that controls whether each tab is enabled or not.
<sdk:TabControl>
    <sdk:TabItem IsEnabled="{Binding SomeProperty, Converter={StaticResource AmIDisabledOrWhatConverter}}" />
</sdk:TabControl>
That way, everything is triggered whenever a property is changed in the VM. No more timing issues, since everything is on the VM.
Just my two cents.
There's a design defect here, and you're trying to work around the defect instead of fixing it. You shouldn't have to figure out how to cancel the Click event on the tab. The tab shouldn't be processing Click events in the first place.
Generally speaking, if it's not legal for the user to click on a control, the control shouldn't be enabled. The tab should be disabled until the state of the view model is valid.
Your view model should be exposing a command for navigating to the next tab, and the tab should be bound to the command. The command's CanExecute method should only return true when the state of the view model on the current tab is valid.
This doesn't fix your other problem, which is that Silverlight doesn't support UpdateSourceTrigger="PropertyChanged" out of the box. But that's a solved problem (here is one example).
Note that if you implement commands to handle this wizard-like navigation in your application, you can, down the road, change the view to use something other than a tab control (e.g. to use navigation buttons like an actual wizard, or something like Telerik's PanelBar) without having to screw around with event handlers.
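A hedged sketch of that approach; RelayCommand here is a minimal hand-rolled ICommand (any framework equivalent such as DelegateCommand works just as well), and the view model wiring at the end is only illustrative:

// Requires System and System.Windows.Input.
public class RelayCommand : ICommand
{
    private readonly Action _execute;
    private readonly Func<bool> _canExecute;

    public RelayCommand(Action execute, Func<bool> canExecute)
    {
        _execute = execute;
        _canExecute = canExecute;
    }

    public event EventHandler CanExecuteChanged;

    public bool CanExecute(object parameter) { return _canExecute(); }

    public void Execute(object parameter) { _execute(); }

    // Call this from the view model whenever validation state changes.
    public void RaiseCanExecuteChanged()
    {
        var handler = CanExecuteChanged;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

// In the view model:
// NextTabCommand = new RelayCommand(
//     () => CurrentTabIndex++,      // navigate
//     () => IsCurrentTabValid);     // tab stays disabled until the current tab is valid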
Change your bindings to include UpdateSourceTrigger="PropertyChanged".
This will ensure that your data sources are updated on every key stroke, not just LostFocus.
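For reference, a quick sketch of the same binding set up in code-behind (myTextBox and MyText are placeholder names; this is WPF, since, as noted above, Silverlight needs the Explicit workaround instead). The equivalent XAML is Text="{Binding MyText, Mode=TwoWay, UpdateSourceTrigger=PropertyChanged}":

var binding = new Binding("MyText")
{
    Mode = BindingMode.TwoWay,
    // Update the source on every keystroke instead of waiting for LostFocus.
    UpdateSourceTrigger = UpdateSourceTrigger.PropertyChanged
};
myTextBox.SetBinding(TextBox.TextProperty, binding);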
public class MyOwnTextBox : TextBox
{
    public MyOwnTextBox()
    {
        this.TextChanged += (s, e) => UpdateText();
    }

    private void UpdateText()
    {
        // Push the current text to the binding source on every change.
        BindingExpression be = GetBindingExpression(TextProperty);
        if (be != null && be.ParentBinding.Mode == BindingMode.TwoWay)
        {
            be.UpdateSource();
        }
    }
}
I am using this class and it updates my binding on the fly. However, there is an issue with empty strings and null values: if your destination property is a nullable string, this will write an empty string into it. You can work around that with some sort of string converter that substitutes null for the empty string in the case of nullable strings (a sketch follows).
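A rough sketch of such a converter (the class name is made up; wire it into the binding as a StaticResource as usual; requires System, System.Globalization and System.Windows.Data):

public class EmptyStringToNullConverter : IValueConverter
{
    public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
    {
        // Source -> target: pass the value through unchanged.
        return value;
    }

    public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
    {
        // Target -> source: write null instead of an empty string.
        var text = value as string;
        return string.IsNullOrEmpty(text) ? null : value;
    }
}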

No ManipulationCompleted event in Surface Toolkit for Windows Touch Beta

I am using the Surface Toolkit for Windows Touch Beta. I have a UserControl within a ScatterViewItem on a ScatterView. I want to receive the ManipulationCompleted event on the UserControl, but it never seems to be raised even though IsManipulationEnabled="True" is set. The same thing works perfectly in a non-Surface WPF 4 app.
Various WPF touch events play well with Surface, but it seems like a lot of work to recreate a tap event and the N/S/W/E flick events that I can easily interpret from ManipulationCompleted.
I am looking for a way to either receive the ManipulationCompleted event on my UserControl or to simulate it by handling the existing touch events.
Any pointers?
Does the ScatterViewItem move when your UserControl is touched? Only one element at a time can track manipulations for a given touch; if the ScatterViewItem is getting the manipulation events, your UserControl will not.
If you only want your UserControl to handle the input, have it listen to TouchDown and call userControl.CaptureTouch(e.TouchDevice). If you want the SVI to do its thing but also handle the completed event yourself, then you will have to register your event handler manually: userControl.AddHandler(ManipulationCompletedEvent, yourHandler, true). The last parameter says you want to handle the event even if the SVI has already handled it (a sketch follows).
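A hedged sketch of that second option (the UserControl type name and handler are assumptions):

public MyUserControl()
{
    InitializeComponent();

    // handledEventsToo = true: invoke our handler even though the
    // ScatterViewItem has already marked the event as handled.
    AddHandler(ManipulationCompletedEvent,
               new EventHandler<ManipulationCompletedEventArgs>(OnManipulationCompleted),
               true);
}

private void OnManipulationCompleted(object sender, ManipulationCompletedEventArgs e)
{
    // Interpret taps or N/S/W/E flicks from e.TotalManipulation here.
}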

How do I capture keystrokes in a Silverlight application?

I am trying to capture any keystroke that happens inside my Silverlight 2 application. There aren't any input fields; I'm writing a game and need to know which of the arrow keys are being pressed. My best guesses at how to capture these through an event handler have been unsuccessful. Any recommendations?
Good to see you again (virtually).
I'm assuming you already tried wiring KeyDown on the root UserControl element. Here are some tips:
The plugin needs to be focused before it sees key events. You'll have to force the user into clicking on the plugin to start.
Make sure you don't have another element (like a ScrollViewer) that is eating the arrow key events. If you have a ScrollViewer in play you'll only be seeing KeyUp.
No switching to full screen mode.
It must be something simple like that which you're missing. Hope that helps.
Handle the KeyDown and/or the KeyUp event on your root Grid.
Silverlight 2 supports the KeyUp and KeyDown events (only, I believe - not KeyPress).
In these events, you get a KeyEventArgs object as a parameter, from which you can get the Key (or the PlatformKeyCode).
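A small sketch of handling the arrow keys on the root element (the class and member names are placeholders):

public partial class Page : UserControl
{
    public Page()
    {
        InitializeComponent();
        // Wire the handler on the root control; the plugin must have focus
        // (e.g. after the user clicks it) before key events arrive.
        KeyDown += OnKeyDown;
    }

    private void OnKeyDown(object sender, KeyEventArgs e)
    {
        switch (e.Key)
        {
            case Key.Up:    /* move up */    break;
            case Key.Down:  /* move down */  break;
            case Key.Left:  /* move left */  break;
            case Key.Right: /* move right */ break;
        }
    }
}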
From reviewing a few threads on different forums about this, it seems best to handle your key events on your Page (root) element, parse the key for the desired effect, and redirect the command to the particular control.
Page
this.KeyDown += (s, e) =>
{
    MyControl control = (MyControl)Layoutroot.FindName("controlname");
    if (control != null)
    {
        control.MyPublicFunction(e.Key);
    }
};
MyControl
public void MyPublicFunction(Key pressedKey)
{
    if (pressedKey == Key.Enter)
    {
        // Do something
    }
}
