How to use a right-click gesture in Kinect v2? - WPF

I am using Kinect (v2) to develop a gesture-based application in WPF. I can already scroll an image, zoom it, and handle the click event. Now I need to perform a right-click via a Kinect gesture. Is there a standard gesture for right-click with Kinect, or do I need to write custom code to achieve it? Could you please help me with it?

Implementing a gestural system is not the same as implementing a WIMP (Windows, Icons, Menus, Pointer) interface. So you should probably abandon the idea of interacting with a desktop by "using a mouse via gestures" altogether. If you decide to follow this latter (wrong) idea, you risk producing a not-so-usable interface.
In any case, there is no standard "right-click gesture", precisely because a gestural interface is different from a classical desktop GUI.
So yes, you will have to write a custom gesture to do this.
For more information, take a look at the Human Interface Guidelines 2.0 by Microsoft.
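
If you do write a custom gesture, here is a minimal sketch of one possible approach, assuming the Kinect for Windows SDK 2.0 (Microsoft.Kinect): it treats "right hand held closed for about a second" as the right-click trigger, using the SDK's built-in hand-state tracking. The gesture itself and the class/event names are my own choices for illustration, not anything the SDK prescribes.

    using System;
    using Microsoft.Kinect;

    // Raises RightClickGesture when the user's right hand stays closed for ~1 second.
    public class RightClickGestureDetector
    {
        private readonly KinectSensor sensor;
        private readonly BodyFrameReader reader;
        private Body[] bodies;
        private DateTime? closedSince;  // when the right hand first closed
        private static readonly TimeSpan HoldTime = TimeSpan.FromSeconds(1);

        public event EventHandler RightClickGesture;

        public RightClickGestureDetector()
        {
            sensor = KinectSensor.GetDefault();
            reader = sensor.BodyFrameSource.OpenReader();
            reader.FrameArrived += OnFrameArrived;
            sensor.Open();
        }

        private void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
        {
            using (BodyFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null) return;
                if (bodies == null) bodies = new Body[frame.BodyCount];
                frame.GetAndRefreshBodyData(bodies);

                foreach (Body body in bodies)
                {
                    if (!body.IsTracked) continue;

                    if (body.HandRightState == HandState.Closed)
                    {
                        if (closedSince == null)
                        {
                            closedSince = DateTime.Now;  // start timing the hold
                        }
                        else if (DateTime.Now - closedSince.Value > HoldTime)
                        {
                            closedSince = null;  // reset so it fires once per hold
                            EventHandler handler = RightClickGesture;
                            if (handler != null) handler(this, EventArgs.Empty);
                        }
                    }
                    else
                    {
                        closedSince = null;  // hand opened or lost: reset the timer
                    }
                }
            }
        }
    }

You would subscribe to RightClickGesture in your window and open a context menu (or whatever "right-click" means in your UI) from there.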

Related

Getting started with UIAutomation

I'm trying to find a good resource to get started with UIAutomation. I need to simulate mouse input in a WPF application. Are there any good examples out there? I couldn't find any, and the MSDN documentation seems too extensive.
UI Automation is not really intended to simulate mouse clicks. It is meant to expose the user interface in a programmatically-accessible fashion.
It organizes controls in a hierarchy that can be easily traversed/navigated by screen readers or similar applications. And, it uses control patterns to allow users to interact with the controls.
A Button, for example, can expose the InvokePattern via its automation peer. You can simulate a click using the Invoke method on that pattern. This is done independently of the mouse, so there would be no mouse over/enter/leave/down events, just a Click event.
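As a minimal sketch of what that looks like (assuming references to UIAutomationClient and UIAutomationTypes; the window title "My App" and button name "OK" are hypothetical placeholders):

    using System.Windows.Automation;

    class InvokeExample
    {
        static void Main()
        {
            // Find the target window among the desktop's children.
            AutomationElement window = AutomationElement.RootElement.FindFirst(
                TreeScope.Children,
                new PropertyCondition(AutomationElement.NameProperty, "My App"));

            // Find the button anywhere inside that window.
            AutomationElement button = window.FindFirst(
                TreeScope.Descendants,
                new PropertyCondition(AutomationElement.NameProperty, "OK"));

            // Invoke the button: raises Click without touching the mouse.
            InvokePattern invoke =
                (InvokePattern)button.GetCurrentPattern(InvokePattern.Pattern);
            invoke.Invoke();
        }
    }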
You can use the Mouse class with UIAutomation. But as CodeNaked rightly said, you should use UIAutomation patterns for mouse-like operations; driving the Mouse class directly is not good practice.
You can refer to this CodeProject article to get started with UIAutomation.
I hope this helps.

WPF - InkCanvas touchscreen problem

I'm developing a touchscreen application. The touchscreen overlay comes with its own SDK, which disables all of the default WPF features. For example, if I don't use this SDK I can easily draw on an InkCanvas, because the program sees the overlay as mouse input; but when I use the SDK, it just doesn't recognize gestures. There are great features in the SDK, so I really want to use it. I can get the position of the touch point, so how can I configure the InkCanvas to recognize this point and let me draw based on it?
BTW, I can draw with mouse when I use the SDK.
Any idea?
Thanks in advance,
The best way is to not use the SDK if you can help it, and just use a multitouch driver. This simplifies things greatly. Once you have this driver you need to:
1. Enable pen and touch in Windows.
2. Respond to the TouchDown, TouchMove, and TouchUp events. Touches don't get translated into events the same way mouse clicks/drags do; look at the "Raw Touch" section of this article.
If you don't have a driver, or you insist on using their SDK, you should still look into the Touch events listed above, as these are most likely what you need; a sketch of bridging SDK touch points to the InkCanvas follows.
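
If you stay with the SDK, one way to bridge it to the InkCanvas is to build strokes yourself from the touch points the SDK hands you. A minimal sketch follows; the OnTouchDown/OnTouchMove/OnTouchUp methods are hypothetical hook points you would call from whatever callbacks the overlay SDK actually exposes, and the points are assumed to already be in InkCanvas coordinates:

    using System.Windows;
    using System.Windows.Controls;
    using System.Windows.Ink;
    using System.Windows.Input;

    // Feeds externally-sourced touch points into an InkCanvas as strokes.
    public class OverlayInkAdapter
    {
        private readonly InkCanvas canvas;
        private Stroke currentStroke;

        public OverlayInkAdapter(InkCanvas canvas)
        {
            this.canvas = canvas;
        }

        // Call from the SDK's touch-down callback: starts a new stroke.
        public void OnTouchDown(Point p)
        {
            var points = new StylusPointCollection { new StylusPoint(p.X, p.Y) };
            currentStroke = new Stroke(points)
            {
                DrawingAttributes = canvas.DefaultDrawingAttributes.Clone()
            };
            canvas.Strokes.Add(currentStroke);
        }

        // Call from the SDK's touch-move callback: extends the current stroke.
        public void OnTouchMove(Point p)
        {
            if (currentStroke != null)
                currentStroke.StylusPoints.Add(new StylusPoint(p.X, p.Y));
        }

        // Call from the SDK's touch-up callback: finishes the stroke.
        public void OnTouchUp()
        {
            currentStroke = null;
        }
    }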

Windows 7 Touch

I want to write a WPF-application with support of touch events.
But I have a problem. I installed the Windows Surface Toolkit for Windows Touch, but none of the Manipulation events (ManipulationStarting, ManipulationDelta, and others) for the Grid are raised when I click and then move the mouse over it.
Could anyone tell how to catch manipulation events in WPF for Windows 7 using Windows Surface toolkit's library?
Thanks in advance.
If you use a touch device the Manipulation events will work directly. If you want to use a mouse device to simulate touch, you can use Blake.NUI's MouseTouchDevice to do just that.
Simply add that file or library reference to your project and call
MouseTouchDevice.RegisterEvents(this);
in the constructor of your Window. You can also pass another element as a parameter if you want to limit the mouse simulation to a smaller region.
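Putting it together, a minimal sketch (assuming Blake.NUI is referenced; the namespace below and the Grid named canvasGrid are my assumptions for illustration):

    using System.Windows;
    using System.Windows.Input;
    using Blake.NUI.WPF.Touch;  // assumed location of MouseTouchDevice

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            // Promote mouse input to touch so Manipulation events fire with a mouse.
            MouseTouchDevice.RegisterEvents(this);

            // Manipulation events only fire on elements that opt in.
            canvasGrid.IsManipulationEnabled = true;
            canvasGrid.ManipulationDelta += OnManipulationDelta;
        }

        private void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
        {
            // DeltaManipulation carries translation/scale/rotation since the last event.
            System.Diagnostics.Debug.WriteLine(e.DeltaManipulation.Translation);
        }
    }

Note that IsManipulationEnabled must be true on the element (set in code or in XAML), or the Manipulation events will never be raised even on real touch hardware.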

Using SketchFlow to prototype MS Surface Applications

We're doing mockups/prototyping on an MS Surface device, and I wonder if anyone has succeeded in using SketchFlow for this. The problem I see is that the code generated by the tool uses normal WPF controls (Button, etc.) instead of their contact-aware Surface counterparts (SurfaceButton), which means they won't work nicely on the Surface unless you also use a mouse.
Additionally, it would be nice if it were possible to hook into other gesture events to trigger SketchFlow transitions, like the pinch gesture to switch pages, etc.
Has anyone had any success with prototyping for Surface using Sketchflow. If so, how did you do it?
You could write a re-usable trigger or behavior that handles the gesture events you mention and navigates the SketchFlow player; that should enable what you are suggesting (see the sketch below).
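As a rough illustration, here is a sketch of such a behavior, assuming the Blend SDK's System.Windows.Interactivity is referenced and WPF 4's manipulation events are available; it detects a pinch and raises an event you would wire to a SketchFlow navigation action (the threshold and names are my own choices):

    using System;
    using System.Windows;
    using System.Windows.Input;
    using System.Windows.Interactivity;

    // Raises PinchDetected when the user pinches the attached element.
    public class PinchBehavior : Behavior<FrameworkElement>
    {
        public event EventHandler PinchDetected;

        protected override void OnAttached()
        {
            base.OnAttached();
            AssociatedObject.IsManipulationEnabled = true;
            AssociatedObject.ManipulationDelta += OnDelta;
        }

        protected override void OnDetaching()
        {
            AssociatedObject.ManipulationDelta -= OnDelta;
            base.OnDetaching();
        }

        private void OnDelta(object sender, ManipulationDeltaEventArgs e)
        {
            // A cumulative scale well below 1.0 means the fingers moved together.
            if (e.CumulativeManipulation.Scale.X < 0.8)
            {
                EventHandler handler = PinchDetected;
                if (handler != null) handler(this, EventArgs.Empty);
                e.Complete();  // end the manipulation so we fire only once
            }
        }
    }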
As for the controls, I am not very familiar with Surface development, but you might also be able to create a behavior that, when attached, adds some sort of Surface ability, such as contact-awareness, to the visual object.
Happy to help if I can be of any assistance.

Track "commands" send to WPF window by touchpad (Bamboo)

I just bought a touchpad which allows drawing and multitouch. The API is not fully supported by Windows 7, so I have to rely on the built-in config dialog.
The basic features are working, so if I draw something in my WPF tool and use both fingers to do a right click, I can e.g. change the color. What I want to do now is assign other functions to the pad's special features in WPF.
Does anybody know how to find out in what way the pad communicates with the app? Scrolling works in Firefox like it should (shown in this photo), but I do not know how to hook up the scroll event: I tried a ScrollViewer (which ignores my scroll attempts), and I also hooked up a key-press event, but it never fires (I assume the pad does not "press a key" but somehow sends the "scroll" command directly). How can I catch that command in WPF?
Thanks a lot,
Chris
[EDIT] I got the scroll to work, but only up and down, not left and right. It was just a stupid "listbox in scrollviewer" mistake. But I am still not sure about commands like zoom in (which works even in Paint). Which API contains such things?
[EDIT2] Funny: the zoom works in Firefox, but the horizontal scrolling does not. Yet in Paint, the horizontal scrolling works...
[EDIT 3] Just asked in the wacom forum, lets see about vendor support reaction time...
http://forum.wacom.eu/viewtopic.php?f=4&t=1939
Here is a picture of the config dialog to give an idea of what I am talking about (Bamboo settings; I am trying to catch these commands in WPF):
Have you had a look at this yet?
WPF 3.5 does not natively support multi-touch (it is coming in WPF 4.0); however, the samples in that kit should get you started with the Windows 7 Integration Library, which accesses the native Win32 APIs to provide the required support. (Don't worry, it's not real ugly.)
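
As a concrete example of catching one of those commands yourself: horizontal scrolling typically arrives as the Win32 WM_MOUSEHWHEEL message, which WPF 3.5 does not surface as a routed event. Here is a minimal sketch of hooking the window procedure for it (the message-based approach is standard; whether the Bamboo driver actually sends WM_MOUSEHWHEEL is my assumption):

    using System;
    using System.Windows;
    using System.Windows.Interop;

    public partial class MainWindow : Window
    {
        private const int WM_MOUSEHWHEEL = 0x020E;

        protected override void OnSourceInitialized(EventArgs e)
        {
            base.OnSourceInitialized(e);
            var source = (HwndSource)PresentationSource.FromVisual(this);
            source.AddHook(WndProc);
        }

        private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam,
                               IntPtr lParam, ref bool handled)
        {
            if (msg == WM_MOUSEHWHEEL)
            {
                // High word of wParam is the signed wheel delta (multiples of 120).
                int delta = (short)((wParam.ToInt64() >> 16) & 0xFFFF);
                System.Diagnostics.Debug.WriteLine("Horizontal scroll: " + delta);
                handled = true;
            }
            return IntPtr.Zero;
        }
    }

Zoom is delivered differently: on Windows 7 it comes in via WM_GESTURE unless you register the window for raw touch, which is exactly the plumbing the Integration Library wraps for you.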
