I am currently working on a WPF project which involves creating a touch-screen application for Windows XP Embedded. Since Windows XP wasn't built for touch interaction, there are some problems and issues with developing such applications.
An example would be a click: on Windows XP, a click is a mouse-down event followed by a mouse-up event. However, if you use your finger instead of the mouse, you might get a drag motion instead of a click: while pressing down, your finger may shift slightly from its initial position, and the gesture is interpreted as a drag. This is just a single example of the problems you encounter when developing a touch-screen app for Windows XP.
If you have worked on a WPF touch-screen application for Windows XP, could you share some knowledge and point out the pitfalls you have encountered? If you know of any resources on this topic, please share those as well.
I would agree with #bflosabre91. With a mouse you could have the same problems, and in fact this happens quite frequently when someone is learning to use a mouse. I think this problem is more relevant at the hardware level and in how the touchscreen actually interprets what the user is doing.
On the software side, you COULD add some logic along the lines of the sketch below:
On mouse down: record the coordinates and maybe the control (button, etc.) that is under the pointer.
On mouse up: compare the recorded coordinates with the current coordinates. If they are within x pixels, either do a "control.click" or move the mouse to the old coordinates and tell the mouse to click.
The hardware may already be doing something like this...
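In case it isn't, here is a minimal WPF sketch of that logic, assuming a release that drifted slightly off a button should still count as a click. The 20-pixel tolerance, the MainWindow class name and the FindButtonUnder helper are illustrative assumptions, not part of the original suggestion:

using System.Windows;
using System.Windows.Controls;
using System.Windows.Controls.Primitives;
using System.Windows.Input;
using System.Windows.Media;

public partial class MainWindow : Window
{
    private Point _downPoint;
    private Button _downControl;
    private const double TolerancePx = 20.0; // the "x pixels" tolerance; tune for finger size

    public MainWindow()
    {
        InitializeComponent();
        this.PreviewMouseDown += OnPreviewMouseDown;
        this.PreviewMouseUp += OnPreviewMouseUp;
    }

    private void OnPreviewMouseDown(object sender, MouseButtonEventArgs e)
    {
        // Record the coordinates and the control under the pointer.
        _downPoint = e.GetPosition(this);
        _downControl = FindButtonUnder(e.OriginalSource as DependencyObject);
    }

    private void OnPreviewMouseUp(object sender, MouseButtonEventArgs e)
    {
        // Compare the recorded coordinates with the current ones. If the
        // finger drifted off the control but stayed within tolerance, the
        // built-in click will not fire, so raise it ourselves.
        if (_downControl != null && !_downControl.IsMouseOver &&
            (e.GetPosition(this) - _downPoint).Length <= TolerancePx)
        {
            _downControl.RaiseEvent(new RoutedEventArgs(ButtonBase.ClickEvent, _downControl));
        }
        _downControl = null;
    }

    private static Button FindButtonUnder(DependencyObject d)
    {
        // Walk up the visual tree to the containing Button, if any.
        while (d != null && !(d is Button))
            d = VisualTreeHelper.GetParent(d);
        return d as Button;
    }
}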
I have a WPF touch-screen application running on kiosks with XP (although not XP Embedded like you said). I haven't had any issue with any type of click event or anything like that. I programmed it using all the normal mouse click events, so it technically works with a mouse or with the touch screen. As long as you build the controls to be large enough to account for the fact that a finger will be touching them instead of a mouse pointer, I did not come across any issues.
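For what it's worth, a tiny sketch of the "large enough" advice, assuming a plain WPF app; the 80x60 minimum and the font size are illustrative numbers, not from the answer above:

using System.Windows;
using System.Windows.Controls;

public partial class App : Application
{
    protected override void OnStartup(StartupEventArgs e)
    {
        base.OnStartup(e);

        // Give every Button a finger-friendly minimum size, application-wide.
        Style touchStyle = new Style(typeof(Button));
        touchStyle.Setters.Add(new Setter(FrameworkElement.MinWidthProperty, 80.0));
        touchStyle.Setters.Add(new Setter(FrameworkElement.MinHeightProperty, 60.0));
        touchStyle.Setters.Add(new Setter(Control.FontSizeProperty, 18.0));
        Resources[typeof(Button)] = touchStyle;
    }
}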
Related
I created a WPF application which allows the user to interact with it using touch. Unfortunately, touch works very badly when several instances of the same application are open.
I will try to describe the behaviour that I am facing:
Open two windows and place them right next to each other.
Touch the left window with your finger and do not release it.
Any attempt to interact with the second window using touch will fail.
I have some experience developing touch applications using WinForms, and I never had such problems before. So I performed the trick described above with two WinForms applications, and they work just as they should: the first window (with the finger not released) keeps the focus, but the second window still allows the user to perform clicks on its surface.
I also tried a mixed combination: when the WPF window is focused, the WinForms window is still touchable, but not the other way around; when the WinForms window is focused, the WPF window won't respond.
Is there anything that can be done to change this behaviour of the WPF windows?
We are building a Windows application in .NET, and one of its requirements is a touch-screen monitor. Other than that, it's a normal Windows Forms based application. But apart from making UI items a little bigger for touch, I can't find anything I as a developer need to do for this requirement, since touch-screen input is basically mouse operations. Am I missing something?
No, you are not missing anything. Do get the actual hardware hooked up so you can test it; "a little bigger" invariably underestimates the problem of fat fingers. Everything should work from a single click: right-clicks are horribly impractical, and double-clicks are best avoided.
The only other thing you'll want to do is go into the Control Panel + Display applet and change the size of standard Windows UI elements. Pick a large window-caption font if you want to allow the user to drag or close windows. Make the scrollbars at least twice as wide, and do the same for the menu and message-box fonts. Go into the Mouse applet to increase the double-click range and time if you want to support double-clicking.
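Those settings are ordinary system metrics, so a Windows Forms app can read them back at runtime to check what it is dealing with; a small sketch (the choice of properties is just an example):

using System;
using System.Windows.Forms;

static class TouchMetrics
{
    static void Main()
    {
        // Rectangle within which two clicks count as a double-click (Mouse applet).
        Console.WriteLine("Double-click size: {0}", SystemInformation.DoubleClickSize);

        // Maximum interval between the two clicks, in milliseconds.
        Console.WriteLine("Double-click time: {0} ms", SystemInformation.DoubleClickTime);

        // Standard vertical scrollbar width (Display applet sizing).
        Console.WriteLine("Scrollbar width: {0}", SystemInformation.VerticalScrollBarWidth);
    }
}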
If you do not need touch-specific event handling, I think that's all you have to do. But touch can mean more than that, and you may want to support it in a better way: http://archive.msdn.microsoft.com/WindowsTouch/Release/ProjectReleases.aspx?ReleaseId=2127
I'm attempting to create a UI where I interact with normal WPF controls without the mouse. I want to support multiple cursors, so regular input simulation (such as SendInput) doesn't work. I also tried interlacing SendInput messages to simulate two mouse cursors, but that didn't work either (I only got one mouse input). I also have the constraint that I do not want to use Windows MultiPoint.
I've tried sending events to the controls (testing it on a Button) through mouse events when I detect that my cursor position is above the controls, using MouseEnterEvent, MouseLeaveEvent, MouseMoveEvent, MouseDownEvent and MouseUpEvent. But except for the MouseDownEvent, none of them seems to work.
Here is an example of how I send the MouseEnterEvent:
System.Windows.Input.MouseEventArgs e =
    new System.Windows.Input.MouseEventArgs(
        System.Windows.Input.Mouse.PrimaryDevice,
        Environment.TickCount); // timestamp in ms; DateTime.Now.Millisecond only yields 0-999
e.RoutedEvent = System.Windows.Input.Mouse.MouseEnterEvent;
elementUnderCursor.RaiseEvent(e);
Where elementUnderCursor is a UIElement.
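One detail worth noting: MouseDownEvent and MouseUpEvent are registered with MouseButtonEventHandler, so they need MouseButtonEventArgs rather than plain MouseEventArgs. A sketch of the same pattern for those two events; the Environment.TickCount timestamp and the left-button choice are assumptions:

var downArgs = new System.Windows.Input.MouseButtonEventArgs(
    System.Windows.Input.Mouse.PrimaryDevice,
    Environment.TickCount,
    System.Windows.Input.MouseButton.Left);
downArgs.RoutedEvent = System.Windows.Input.Mouse.MouseDownEvent;
downArgs.Source = elementUnderCursor;
elementUnderCursor.RaiseEvent(downArgs);

var upArgs = new System.Windows.Input.MouseButtonEventArgs(
    System.Windows.Input.Mouse.PrimaryDevice,
    Environment.TickCount,
    System.Windows.Input.MouseButton.Left);
upArgs.RoutedEvent = System.Windows.Input.Mouse.MouseUpEvent;
upArgs.Source = elementUnderCursor;
elementUnderCursor.RaiseEvent(upArgs);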
I think that i wanted the same thing for my multimose kinect app be but if you have an emulated mouse driver you would just have to figure out how to tell the app that an event happened with that mouse driver instead of your usb mouse or ps/2 mouse driver.
For instance,
send the mouse-down and mouse-up events to simulate a mouse click for mouse #1 and #2. Also, update the mouse position for mouse #1 and #2.
My reasoning for the above is that I wanted it to work in any application, by running a program similar to the one above in the background as a service.
Update: DSF (the Device Simulation Framework) works for simulating virtual mouse devices. I'm working on a multiple-mouse project with Kinect, so please look at the current progress at the site: http://kinectmultipoint.codeplex.com
Be warned: multiple-mouse drivers can't be built overnight.
Also, download the Windows DDK to simulate the mouse devices. The testgenerichid.wsf script has to be changed for your scenario, but it's possible.
I am using Silverlight to develop an application that we anticipate will be used heavily on touch-screen monitors in a healthcare setting. I am finding that simple controls like Button, or even simpler primitive controls that I've written myself to capture mouse events, are very flaky and unresponsive on an HP TouchSmart. Many times clicks do not even register, and the behavior is entirely unpredictable.
Any idea why a plain old Silverlight button is hard to click by touch on an HP TouchSmart?
I just bought a touchpad which allows drawing and multitouch. The API is not fully supported by Windows 7, so I have to rely on the built-in config dialog.
The basic features are working: if I draw something in my WPF tool and use both fingers to do a right click, I can, for example, change the color. What I want to do now is assign other functions to special features in WPF.
Does anybody know how to find out in what way the pad communicates with the app? It works in Firefox, e.g. to scroll, like it should (shown in this photo). But I do not know how to hook up the scroll event: I tried a ScrollViewer (which ignores my scroll attempts), and I also hooked up a key-pressed event, but it does not fire (I assume the pad does not "press a key" but somehow sends the "scroll" command directly). How can I catch that command in WPF?
Thanks a lot,
Chris
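One guess at catching the scroll, assuming the pad's driver synthesizes ordinary mouse-wheel messages (which would explain why Firefox scrolls): hook the wheel event at the window level and watch the deltas. The MainWindow class name is just a placeholder.

using System.Diagnostics;
using System.Windows;
using System.Windows.Input;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        this.PreviewMouseWheel += OnPreviewMouseWheel;
    }

    private void OnPreviewMouseWheel(object sender, MouseWheelEventArgs e)
    {
        // e.Delta is positive when scrolling up, negative when scrolling down.
        Debug.WriteLine("Scroll delta: " + e.Delta);
    }
}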
[EDIT] I got the scroll to work, but only up and down, not left and right. It was just a stupid "ListBox in ScrollViewer" mistake. But I am still not sure about commands like zoom in (which works even in Paint). Which API contains such things?
[EDIT2] Funny: the zoom works in Firefox, but the horizontal scrolling does not. Yet in Paint, the horizontal scrolling works...
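A possible explanation for the horizontal difference: WPF 3.5 has no built-in event for the WM_MOUSEHWHEEL message (0x020E), while apps like Paint handle the raw message themselves. A sketch of hooking it, assuming the pad sends that message for its horizontal gesture:

using System;
using System.Diagnostics;
using System.Windows;
using System.Windows.Interop;

public partial class MainWindow : Window
{
    private const int WM_MOUSEHWHEEL = 0x020E;

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        HwndSource source = (HwndSource)PresentationSource.FromVisual(this);
        source.AddHook(WndProc);
    }

    private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
    {
        if (msg == WM_MOUSEHWHEEL)
        {
            // The high-order word of wParam holds the signed horizontal delta.
            int delta = (short)((wParam.ToInt64() >> 16) & 0xFFFF);
            Debug.WriteLine("Horizontal scroll delta: " + delta);
            handled = true;
        }
        return IntPtr.Zero;
    }
}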
[EDIT 3] Just asked in the Wacom forum; let's see about the vendor's support reaction time...
http://forum.wacom.eu/viewtopic.php?f=4&t=1939
Here is a picture of the config surface to give an idea of what I am talking about (Bamboo settings; I try to catch these commands in WPF):
http://img340.imageshack.us/img340/3751/20091008210914.jpg
Have you had a look at this yet?
WPF 3.5 does not natively support multi-touch (it is coming in WPF 4.0); however, the samples in that kit should get you started with the Windows 7 Integration Library, which accesses the native Win32 APIs to provide the required support. (Don't worry, it's not real ugly.)
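For a feel of what the native route involves, here is a bare-bones sketch, independent of the kit (which wraps this plumbing in a friendlier API): register the window for WM_TOUCH and watch the raw messages arrive. The full touch data would come from GetTouchInputInfo, omitted here for brevity.

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Interop;

public partial class MainWindow : Window
{
    private const int WM_TOUCH = 0x0240;

    [DllImport("user32.dll")]
    private static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        HwndSource source = (HwndSource)PresentationSource.FromVisual(this);
        RegisterTouchWindow(source.Handle, 0); // 0 = default flags
        source.AddHook(WndProc);
    }

    private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
    {
        if (msg == WM_TOUCH)
        {
            // The low-order word of wParam is the number of touch points.
            int touchCount = (int)(wParam.ToInt64() & 0xFFFF);
            Debug.WriteLine("WM_TOUCH with " + touchCount + " contact(s)");
        }
        return IntPtr.Zero;
    }
}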