WPF - InkCanvas touchscreen problem

I'm developing a touchscreen application. The touchscreen overlay comes with its own SDK, which disables all of the default WPF input features. For example, without the SDK I can easily draw on an InkCanvas because the program sees the overlay as mouse input; with the SDK, though, gestures are simply not recognized. The SDK has great features, so I really want to use it, and it does give me the position of each touch point. How can I configure the InkCanvas to accept these points so that I can draw based on them?
BTW, I can still draw with the mouse when I use the SDK.
Any idea?
Thanks in advance,

The best way is to not use the SDK if you can help it, and just use a multitouch driver. This simplifies things greatly. Once you have this driver you need to:
1. Enable pen and touch input in Windows.
2. Respond to the TouchDown, TouchMove, and TouchUp events. Touches don't get translated into events the same way mouse clicks and drags do; look at the "Raw Touch" section of this article.
If you don't have a driver, or you insist on using their SDK, you should still look into the Touch events listed above, as they are most likely what you need; a minimal sketch follows.
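Here is a minimal sketch of that approach, assuming WPF 4 (which added the Touch events) and an InkCanvas named "inkCanvas" in the XAML. With the overlay SDK you would call the same BeginStroke/AddPoint helpers from the SDK's touch callbacks instead; those helper names are mine, not part of any API.

using System.Windows;
using System.Windows.Ink;
using System.Windows.Input;

public partial class MainWindow : Window
{
    private Stroke currentStroke;

    public MainWindow()
    {
        InitializeComponent();
        inkCanvas.TouchDown += (s, e) => BeginStroke(e.GetTouchPoint(inkCanvas).Position);
        inkCanvas.TouchMove += (s, e) => AddPoint(e.GetTouchPoint(inkCanvas).Position);
        inkCanvas.TouchUp   += (s, e) => currentStroke = null;
    }

    // Start a new stroke at the given position (in InkCanvas coordinates).
    private void BeginStroke(Point p)
    {
        var points = new StylusPointCollection { new StylusPoint(p.X, p.Y) };
        currentStroke = new Stroke(points)
        {
            DrawingAttributes = inkCanvas.DefaultDrawingAttributes.Clone()
        };
        inkCanvas.Strokes.Add(currentStroke);
    }

    // Append a point to the stroke in progress; the InkCanvas redraws it live.
    private void AddPoint(Point p)
    {
        if (currentStroke != null)
            currentStroke.StylusPoints.Add(new StylusPoint(p.X, p.Y));
    }
}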

Related

How to use a right-hand gesture in Kinect v2?

I am using Kinect (v2) to develop a gesture-based application in WPF. I am able to scroll the image, zoom it, and get the click event. What I need now is a right-click gesture. Is there a standard gesture for right click in Kinect, or do I need to write custom code to achieve it? Could you please help me with it?
Implementing a gestural system is not the same as implementing a WIMP (Windows-Icons-Menu-Pointer) interface, so you should probably abandon the idea of interacting with a desktop and "using a mouse via gestures" altogether. If you decide to follow that (wrong) idea anyway, you risk producing a not-so-usable interface.
In any case, there is no standard "right-click gesture", precisely because a gestural interface is different from a classical desktop GUI.
You will indeed need to write a custom gesture for this; one possibility is sketched below.
For more information, take a look at Microsoft's Human Interface Guidelines 2.0.
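If you do go down that road, here is a hedged sketch of one option: treating the Kinect v2 "lasso" hand state as a custom right-click trigger. The detector class, its event, and the choice of the lasso pose are my own assumptions; only the Microsoft.Kinect types come from the SDK.

using Microsoft.Kinect;

public class RightClickGestureDetector
{
    private readonly Body[] bodies;
    private bool lassoSeen;                   // debounce so we fire once per pose

    public event System.Action RightClick;    // raised when the gesture is detected

    public RightClickGestureDetector(KinectSensor sensor)
    {
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
        reader.FrameArrived += OnFrameArrived;
    }

    private void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
    {
        using (BodyFrame frame = e.FrameReference.AcquireFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);

            foreach (Body body in bodies)
            {
                if (body == null || !body.IsTracked) continue;

                // Fire once when the right hand enters the "lasso" pose,
                // then wait until it leaves the pose before firing again.
                if (body.HandRightState == HandState.Lasso && !lassoSeen)
                {
                    lassoSeen = true;
                    if (RightClick != null) RightClick();
                }
                else if (body.HandRightState != HandState.Lasso)
                {
                    lassoSeen = false;
                }
            }
        }
    }
}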

Information on developing WPF touch-screen application for Windows XP

I am currently working on a WPF project which involves creating a touch-screen application for Windows XP Embedded. As Windows XP wasn't built for touch interaction, there are some problems and issues with developing such applications.
An example would be a click: on Windows XP a click is a mouse-down followed by a mouse-up event. However, if you use your finger instead of the mouse, you might get a drag motion instead of a click: as you press down, your finger may move slightly to the side from its initial position, and you get a drag instead of a click. This is just a single example of the problems you encounter when developing a touch-screen app for Windows XP.
If you have been working on a WPF touch-screen application for Windows XP, could you share some knowledge and point out the pitfalls you have encountered? If you know of any resources on this topic, please share them as well.
I would agree with @bflosabre91. With a mouse you could have the same problems; in fact they happen quite frequently when someone is learning to use a mouse. I think this problem is more relevant at the hardware level and in how the touchscreen actually interprets what the user is doing.
On the software side, you COULD add some logic along the lines of the following (see the sketch after this list):
On mouse down: record the coordinates and maybe the control (button, etc.) that is under the pointer.
On mouse up: compare the recorded coordinates with the current coordinates. If they are within x pixels of each other, either call "control.Click" or move the mouse back to the old coordinates and tell the mouse to click.
The hardware may already be doing something like this...
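For illustration, a sketch of that logic in WPF, assuming an element named "touchSurface" and an arbitrary 10-pixel tolerance (both are my choices, nothing standard):

using System.Windows;
using System.Windows.Input;

public partial class MainWindow : Window
{
    private const double ClickTolerance = 10.0;   // max allowed finger drift, in pixels
    private Point downPosition;

    public MainWindow()
    {
        InitializeComponent();
        touchSurface.MouseDown += OnSurfaceMouseDown;
        touchSurface.MouseUp += OnSurfaceMouseUp;
    }

    private void OnSurfaceMouseDown(object sender, MouseButtonEventArgs e)
    {
        // Record where the press started.
        downPosition = e.GetPosition(touchSurface);
    }

    private void OnSurfaceMouseUp(object sender, MouseButtonEventArgs e)
    {
        Point upPosition = e.GetPosition(touchSurface);
        Vector drift = upPosition - downPosition;

        // If the finger stayed within the tolerance, treat the whole
        // press/release pair as a click rather than a drag.
        if (drift.Length <= ClickTolerance)
            HandleClick(downPosition);
    }

    private void HandleClick(Point at)
    {
        // App-specific click handling goes here (hypothetical helper).
    }
}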
I have a WPF touch-screen application and it is running on kiosks with XP (although it's not XP Embedded like you said). I haven't had any issues with any type of click event or anything like that. I programmed it using all the normal mouse click events, so it technically works with a mouse or with the touch screen. As long as you build the controls to be large enough to account for the fact that a finger will be touching them instead of a mouse pointer, I did not come across any issues.

Windows 7 Touch

I want to write a WPF application with support for touch events.
But I have a problem: I installed the Windows Surface Toolkit for Windows Touch, but none of the Manipulation events (ManipulationStarting, ManipulationDelta, and so on) for the Grid are raised when I click and then move the mouse over it.
Could anyone tell how to catch manipulation events in WPF for Windows 7 using Windows Surface toolkit's library?
Thanks in advance.
If you use a touch device the Manipulation events will work directly. If you want to use a mouse device to simulate touch, you can use Blake.NUI's MouseTouchDevice to do just that.
Simply add that file or library reference to your project and call
MouseTouchDevice.RegisterEvents(this);
in the constructor of your Window; a fuller sketch is shown below. You can also pass another element as the parameter if you want to limit the mouse simulation to a smaller region.
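Put together, it might look like this; the Blake.NUI namespace is my assumption and may differ between builds, and "myGrid" stands for whatever element you want to manipulate.

using System.Windows;
using System.Windows.Input;
using Blake.NUI.WPF.Utility;   // assumed location of MouseTouchDevice

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Promote mouse input to touch events for this whole Window.
        MouseTouchDevice.RegisterEvents(this);

        // Manipulation events only fire on elements that opt in.
        myGrid.IsManipulationEnabled = true;
        myGrid.ManipulationDelta += OnManipulationDelta;
    }

    private void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        // e.DeltaManipulation carries translation, scale, and rotation deltas.
        System.Diagnostics.Debug.WriteLine(e.DeltaManipulation.Translation);
    }
}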

Using SketchFlow to prototype MS Surface Applications

We're doing mockups/prototyping on an MS Surface device and I wonder if anyone has succeeded in using SketchFlow for this. The problem that I see is that the code generated by the tool uses normal WPF controls (Button, etc.) instead of the contact-aware Surface counterparts (SurfaceButton), which means that they won't work nicely on the Surface unless you also use a mouse.
Additionally, it would be nice if it were possible to hook into other gesture events to trigger SketchFlow transitions, like the pinch gesture to switch pages.
Has anyone had any success prototyping for Surface using SketchFlow? If so, how did you do it?
You could write a trigger or behavior that handles the other gesture events you mention to navigate around the SketchFlow player; it would be re-usable and should enable what you are suggesting (a rough sketch is shown below).
As for the controls, I am not very familiar with Surface development, but you might also be able to create a behavior that, when attached, adds some sort of Surface ability, such as contact-aware clickability, to the visual object.
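As a rough, untested sketch of the first idea: a reusable Blend behavior that turns a completed pinch (overall scale-down) into a command invocation, which you could then bind to a SketchFlow navigation action. The class name, the 0.8 threshold, and the command wiring are all my own assumptions; only the System.Windows.Interactivity and manipulation APIs are standard.

using System.Windows;
using System.Windows.Input;
using System.Windows.Interactivity;

public class PinchNavigateBehavior : Behavior<FrameworkElement>
{
    // Command to run when a pinch completes; set it from XAML.
    public static readonly DependencyProperty PinchCommandProperty =
        DependencyProperty.Register("PinchCommand", typeof(ICommand),
                                    typeof(PinchNavigateBehavior));

    public ICommand PinchCommand
    {
        get { return (ICommand)GetValue(PinchCommandProperty); }
        set { SetValue(PinchCommandProperty, value); }
    }

    private double cumulativeScale = 1.0;

    protected override void OnAttached()
    {
        AssociatedObject.IsManipulationEnabled = true;
        AssociatedObject.ManipulationDelta += OnDelta;
        AssociatedObject.ManipulationCompleted += OnCompleted;
    }

    private void OnDelta(object sender, ManipulationDeltaEventArgs e)
    {
        // Accumulate the scale component of the manipulation.
        cumulativeScale *= e.DeltaManipulation.Scale.X;
    }

    private void OnCompleted(object sender, ManipulationCompletedEventArgs e)
    {
        // Treat a significant overall scale-down as a pinch and fire the command.
        if (cumulativeScale < 0.8 && PinchCommand != null && PinchCommand.CanExecute(null))
            PinchCommand.Execute(null);
        cumulativeScale = 1.0;
    }
}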
Happy to help if I can be of any assistance.

Track "commands" send to WPF window by touchpad (Bamboo)

I just bought a touchpad which allows drawing and supports multitouch. The API is not fully supported by Windows 7, so I have to rely on the built-in config dialog.
The basic features are working: if I draw something in my WPF tool and use both fingers to do a right click, I can, for example, change the color. What I want to do now is assign other functions to the pad's special features in WPF.
Does anybody know how to find out in what way the pad communicates with the app? Scrolling works in Firefox as it should (shown in this photo), but I do not know how to hook up the scroll event. I tried a ScrollViewer (which ignores my scroll attempts), and I also hooked up a key-pressed event, but it does not fire (I assume the pad does not "press a key" but somehow sends the "scroll" command directly). How can I catch that command in WPF?
Thanks a lot,
Chris
[EDIT] I got the scroll to work, but only up and down, not left and right. It was just a stupid "ListBox in ScrollViewer" mistake. I am still not sure about commands like zoom in (which works even in Paint), though. Which API contains such things?
[EDIT2] Funny: the zoom works in Firefox but the horizontal scrolling does not, while in Paint the horizontal scrolling works...
[EDIT 3] I just asked in the Wacom forum; let's see about the vendor's support reaction time...
http://forum.wacom.eu/viewtopic.php?f=4&t=1939
Here is a picture of the config surface to get an idea of what I am talking about (the Bamboo settings whose commands I am trying to catch in WPF): http://img340.imageshack.us/img340/3751/20091008210914.jpg
Have you had a look at this yet?
WPF 3.5 does not natively support multi-touch (it is coming in WPF 4.0); however, the samples in that kit should get you started with the Windows 7 Integration Library, which accesses the native Win32 APIs to provide the required support (don't worry, it's not real ugly). For the horizontal scrolling in particular, you can also hook the window procedure yourself; a sketch follows.
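Since WPF 3.5 never raises a horizontal-scroll event, one option is to listen for the Win32 WM_MOUSEHWHEEL message directly. This is a generic sketch, not Bamboo-specific code; it assumes the pad's driver sends standard horizontal-wheel messages.

using System;
using System.Windows;
using System.Windows.Interop;

public partial class MainWindow : Window
{
    private const int WM_MOUSEHWHEEL = 0x020E;   // horizontal mouse wheel message

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        var source = (HwndSource)PresentationSource.FromVisual(this);
        source.AddHook(WndProc);
    }

    private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam,
                           IntPtr lParam, ref bool handled)
    {
        if (msg == WM_MOUSEHWHEEL)
        {
            // The high word of wParam is the signed scroll delta (multiples of 120).
            int delta = (short)((wParam.ToInt64() >> 16) & 0xFFFF);
            System.Diagnostics.Debug.WriteLine("Horizontal scroll: " + delta);
            handled = true;
        }
        return IntPtr.Zero;
    }
}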
