I want to write a WPF application with support for touch events.
But I have a problem. I installed the Windows Surface Toolkit for Windows Touch, but none of the Manipulation events (ManipulationStarting, ManipulationDelta, and the others) are raised on the Grid when I click and then move the mouse over it.
Could anyone tell me how to catch manipulation events in WPF on Windows 7 using the Windows Surface Toolkit's library?
Thanks in advance.
If you use a touch device, the Manipulation events will work directly. If you want to use a mouse to simulate touch, you can use Blake.NUI's MouseTouchDevice to do just that.
Simply add that file or library reference to your project and call
MouseTouchDevice.RegisterEvents(this);
in the constructor of your Window. You can also pass another element as the parameter if you want to limit the mouse simulation to a smaller region.
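For example, a minimal sketch, assuming the MouseTouchDevice class from Blake.NUI is in the project (myGrid is a placeholder name for the Grid declared in XAML):

using System;
using System.Windows;
using System.Windows.Input;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Promote mouse input to touch for the whole window.
        MouseTouchDevice.RegisterEvents(this);

        // WPF raises Manipulation events only on elements that opt in;
        // forgetting this is a common reason the events never fire.
        myGrid.IsManipulationEnabled = true;
        myGrid.ManipulationDelta += OnManipulationDelta;
    }

    private void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        // DeltaManipulation holds the translation/scale/rotation since the last event.
        Console.WriteLine(e.DeltaManipulation.Translation);
    }
}

Also check that IsManipulationEnabled is true (in code as above, or in XAML) on the Grid; without it the Manipulation events are not raised even for real touch input.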
Related
I am using Kinect (v2) to develop a gesture-based application in WPF. I am able to scroll an image, zoom it, and get the click event. What I need now is a right-click gesture. Is there a standard gesture for right click with Kinect, or do I need to write custom code to achieve it? Could you please help me with it?
Implementing a gestural system is not the same as implementing a WIMP (Windows-Icons-Menus-Pointer) interface, so you should probably abandon the idea of interacting with a desktop and "using a mouse via gestures" altogether. If you decide to follow that (misguided) idea anyway, you risk producing a not-so-usable interface.
In any case, there is no standard "right-click gesture", precisely because a gestural interface is different from a classical desktop GUI.
You will indeed have to write a custom gesture for this.
For more information, take a look at Microsoft's Human Interface Guidelines 2.0.
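If you still need a stand-in for right click, one option is to map it to a pose the Kinect v2 SDK already reports, such as the "lasso" hand state. A rough sketch under that assumption (the detector class and OnRightClickGesture are hypothetical names, and a real implementation would also debounce the pose):

using System.Linq;
using Microsoft.Kinect;

public class RightClickGestureDetector
{
    private readonly Body[] bodies;

    public RightClickGestureDetector(KinectSensor sensor)
    {
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
        reader.FrameArrived += OnFrameArrived;
    }

    private void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
    {
        using (BodyFrame frame = e.FrameReference.AcquireFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);

            Body tracked = bodies.FirstOrDefault(b => b != null && b.IsTracked);
            if (tracked != null && tracked.HandRightState == HandState.Lasso)
            {
                // Treat the "lasso" pose (index finger extended) as a right click.
                OnRightClickGesture();
            }
        }
    }

    private void OnRightClickGesture()
    {
        // Application-specific handling goes here.
    }
}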
I'm attempting to create a UI where I interact with normal WPF controls without the mouse. I want to support multiple cursors, so regular input simulation (such as SendInput) doesn't work. I also tried interleaving SendInput messages to simulate two mouse cursors, but that didn't work either (I only got one mouse input). I also have the constraint that I do not want to use Windows MultiPoint.
I've tried sending events to the controls (testing on a Button) when I detect that my cursor position is above them, using MouseEnterEvent, MouseLeaveEvent, MouseMove, MouseDownEvent, and MouseUpEvent. But except for the MouseDownEvent, none of them seems to work.
Here is an example of how I send the MouseEnterEvent:
System.Windows.Input.MouseEventArgs e =
new System.Windows.Input.MouseEventArgs(System.Windows.Input.Mouse.PrimaryDevice, DateTime.Now.Millisecond);
e.RoutedEvent = System.Windows.Input.Mouse.MouseEnterEvent;
elementUnderCursor.RaiseEvent(e);
Where elementUnderCursor is a UIElement.
I think I wanted the same thing for my multimouse Kinect app, but if you have an emulated mouse driver, you would just have to figure out how to tell the app that an event happened with that driver instead of your USB or PS/2 mouse driver.
For instance, send the mouse-down and mouse-up events to simulate a mouse click for mouse #1 and #2, and also update the positions of mouse #1 and #2.
My reasoning for the above is that I wanted it to work in any application, by running a similar program in the background as a service. A user-mode sketch of the event-sending part is below.
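For a single pointer, the event-sending part can be done from user mode with SendInput; here is a rough sketch via P/Invoke. Note that SendInput drives the one system cursor only, so truly independent mouse #1 and #2 pointers still need the driver-level approach described here:

using System;
using System.Runtime.InteropServices;

static class MouseSimulator
{
    [StructLayout(LayoutKind.Sequential)]
    struct MOUSEINPUT
    {
        public int dx, dy;
        public uint mouseData, dwFlags, time;
        public IntPtr dwExtraInfo;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct INPUT
    {
        public uint type;      // 0 = INPUT_MOUSE
        public MOUSEINPUT mi;
    }

    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP = 0x0004;

    [DllImport("user32.dll", SetLastError = true)]
    static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

    // Simulate a left click at the cursor's current position.
    public static void Click()
    {
        var inputs = new[]
        {
            new INPUT { type = 0, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTDOWN } },
            new INPUT { type = 0, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTUP } }
        };
        SendInput((uint)inputs.Length, inputs, Marshal.SizeOf(typeof(INPUT)));
    }
}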
Update: DSF (the Device Simulation Framework) works for simulating virtual mouse devices. I'm working on a multiple-mouse project with Kinect, so please look at the current progress at http://kinectmultipoint.codeplex.com.
Be warned: multiple-mouse drivers can't be built overnight.
Also, download the Windows DDK to simulate the mouse devices. The TestGenericHID.wsf script has to be changed for your scenario, but it's possible.
I'm developing a touchscreen application. The touchscreen overlay comes with its own SDK, which disables all of the default WPF touch features. For example, if I don't use this SDK I can easily draw on an InkCanvas, because the program sees the overlay as a mouse input; but when I use the SDK, the application just doesn't recognize gestures. There are great features in the SDK, so I really want to use it, and I can get the position of the touch point. How can I configure the InkCanvas to recognize this point and let me draw based on it?
By the way, I can still draw with the mouse when I use the SDK.
Any idea?
Thanks in advance,
The best way is to not use the SDK if you can help it, and just use a multitouch driver instead. This simplifies things greatly. Once you have this driver you need to:
enable Pen and Touch in Windows;
respond to the TouchDown, TouchMove, and TouchUp events. Touches don't get translated into events the same way mouse clicks and drags do; look at the "Raw Touch" section of this article.
If you don't have a driver, or you insist on using their SDK, you should still look into the Touch events listed above, as these are most likely what you need.
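To illustrate the "draw from a raw point" part of the question: you can feed externally reported touch positions into an InkCanvas by building strokes by hand. A rough sketch; the OnSdkTouchDown/Move/Up methods are hypothetical stand-ins for whatever callbacks the overlay SDK actually exposes:

using System.Windows.Controls;
using System.Windows.Ink;
using System.Windows.Input;

public class SdkInkAdapter
{
    private readonly InkCanvas canvas;
    private Stroke currentStroke;

    public SdkInkAdapter(InkCanvas canvas)
    {
        this.canvas = canvas;
    }

    public void OnSdkTouchDown(double x, double y)
    {
        // Start a new stroke at the reported touch position.
        var points = new StylusPointCollection { new StylusPoint(x, y) };
        currentStroke = new Stroke(points);
        canvas.Strokes.Add(currentStroke);
    }

    public void OnSdkTouchMove(double x, double y)
    {
        // Extend the active stroke as the finger moves.
        if (currentStroke != null)
            currentStroke.StylusPoints.Add(new StylusPoint(x, y));
    }

    public void OnSdkTouchUp()
    {
        currentStroke = null;
    }
}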
I am currently working on a WPF project that involves creating a touch-screen application for Windows XP Embedded. Since Windows XP wasn't built for touch interaction, there are problems and issues with developing such applications.
An example would be a click: on Windows XP, a click is a mouse-down plus a mouse-up event. However, if you use your finger instead of the mouse, you might get a drag motion instead of a click, because as you press down your finger may move slightly to the side from its initial position. This is just one example of the problems you run into when developing a touch-screen app for Windows XP.
If you have been working on a WPF touch-screen application for Windows XP, could you share some knowledge and point out the pitfalls you have encountered? If you know of any resources on this topic, please share those as well.
I would agree with @bflosabre91. With a mouse you could have the same problems, and in fact this happens quite frequently when someone is learning to use a mouse. I think this problem is more relevant at the hardware level and in how the touchscreen actually interprets what the user is doing.
On the software side, you COULD add some logic along these lines:
On mouse down: record the coordinates and maybe the control (button, etc.) that is under the pointer.
On mouse up: compare the recorded coordinates with the current coordinates. If they're within x pixels of each other, either call control.Click or move the mouse back to the old coordinates and tell the mouse to click. (A sketch of this follows below.)
The hardware may already be doing something like this...
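A rough sketch of that logic in WPF; the tolerance value is an arbitrary assumption to tune for finger size, and the handlers would be attached to the window's PreviewMouseDown/PreviewMouseUp events:

using System.Windows;
using System.Windows.Input;

public partial class MainWindow : Window
{
    private Point downPosition;
    private const double ClickTolerance = 10.0; // pixels; tune for finger size

    private void OnPreviewMouseDown(object sender, MouseButtonEventArgs e)
    {
        // Record where the press started.
        downPosition = e.GetPosition(this);
    }

    private void OnPreviewMouseUp(object sender, MouseButtonEventArgs e)
    {
        // If the pointer stayed within the tolerance, treat the gesture
        // as a click even though the finger drifted slightly.
        if ((e.GetPosition(this) - downPosition).Length <= ClickTolerance)
        {
            // Handle as a click here.
        }
    }
}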
I have a WPF touch-screen application and it is running on kiosks with XP (although it's not XP Embedded like you said). I haven't had any issues with any type of click event or anything like that. I programmed it using all the normal mouse click events, so it technically works with a mouse or with the touch screen. As long as you build the controls large enough to account for the fact that a finger will be touching them instead of a mouse pointer, I did not come across any issues.
I just bought a touchpad which allows drawing and multitouch. The API is not fully supported by Windows 7, so I have to rely on the built-in config dialog.
The basic features are working: if I draw something in my WPF tool and use both fingers to do a right click, I can e.g. change the color. What I want to do now is assign other functions to special features in WPF.
Does anybody know how to find out in what way the pad communicates with the app? It works in Firefox, for example, to scroll, like it should (shown in this photo). But I do not know how to hook up the scroll event: I tried a ScrollViewer (which ignores my scroll attempts), and I also hooked up a key-pressed event, but it does not fire (I assume the pad does not "press a key" but somehow sends the "scroll" command directly). How can I catch that command in WPF?
Thanks a lot,
Chris
[EDIT] I got the scroll to work, but only up and down, not left and right. It was just a stupid "ListBox in ScrollViewer" mistake. But I'm still not sure about commands like ZOOM IN (which works even in Paint). Which API contains such things?
[EDIT 2] Funny: the zoom works in Firefox, but the horizontal scrolling does not. Yet in Paint, the horizontal scrolling works...
[EDIT 3] Just asked in the Wacom forum, let's see about vendor support reaction time...
http://forum.wacom.eu/viewtopic.php?f=4&t=1939
Here is a picture of the config surface to give an idea of what I am talking about (Bamboo settings; I am trying to catch these commands in WPF):
http://img340.imageshack.us/img340/3751/20091008210914.jpg
Have you had a look at this yet?
WPF 3.5 does not natively support multitouch (it is coming in WPF 4.0); however, the samples in that kit should get you started with the Windows 7 Integration Library, which accesses the native Win32 APIs to provide the required support (don't worry, it's not real ugly :).
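For commands that WPF 3.5 never surfaces as routed events, such as the horizontal scrolling from the question, one option is to hook the window procedure and watch for the raw Win32 message. A rough sketch, assuming the pad sends standard WM_MOUSEHWHEEL (tilt-wheel) messages, which is an assumption about the Bamboo driver rather than anything the vendor documents:

using System;
using System.Windows;
using System.Windows.Interop;

public partial class MainWindow : Window
{
    private const int WM_MOUSEHWHEEL = 0x020E; // horizontal wheel / tilt message

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        var source = (HwndSource)PresentationSource.FromVisual(this);
        source.AddHook(WndProc);
    }

    private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
    {
        if (msg == WM_MOUSEHWHEEL)
        {
            // The high-order word of wParam is the signed horizontal delta.
            int delta = (short)((wParam.ToInt64() >> 16) & 0xFFFF);
            // Scroll a ScrollViewer here, e.g. via ScrollToHorizontalOffset.
            handled = true;
        }
        return IntPtr.Zero;
    }
}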