I am working on a WPF touchscreen application for a tabletop monitor (a 46" 3M display). All touch responses work fine until someone touches the lower edge of the screen, at which point the application appears to hang: no button or drag interactions respond until the finger or hand is moved away. I suspect it has something to do with the touch landing in the region of the taskbar, even though the taskbar itself is not visible.
The program does not hang if the user touches the side or top edges in the same way, or even if they place their entire hand in the middle of the screen.
Since this is a tabletop monitor, it is very common for a user to lean on the monitor and unknowingly touch the lower edge with their hand.
This appears to be a bug in Windows 7 and its multitouch drivers. I can reliably reproduce it on Windows 7 with a multitouch display whose bezel is flat enough to allow touch contact on the lower edge.
Using the same program and multitouch display with Windows 8, the problem does not occur.
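If the Windows 7 edge-gesture handling is the culprit, one mitigation sometimes suggested is to answer WM_TABLET_QUERYSYSTEMGESTURESTATUS and opt the window out of press-and-hold and flick gestures. A minimal sketch for a WPF window (the TouchFixWindow name is illustrative, the message and flag values are from the Windows SDK headers, and whether this helps with the 3M driver is untested):

    // Sketch: answer WM_TABLET_QUERYSYSTEMGESTURESTATUS so the tablet input
    // subsystem skips press-and-hold and flick gestures for this window.
    using System;
    using System.Windows;
    using System.Windows.Interop;

    public class TouchFixWindow : Window
    {
        const int WM_TABLET_QUERYSYSTEMGESTURESTATUS = 0x02CC;
        const uint TABLET_DISABLE_PRESSANDHOLD = 0x00000001;
        const uint TABLET_DISABLE_FLICKS = 0x00010000;

        protected override void OnSourceInitialized(EventArgs e)
        {
            base.OnSourceInitialized(e);
            var source = (HwndSource)PresentationSource.FromVisual(this);
            source.AddHook(WndProc);
        }

        IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
        {
            if (msg == WM_TABLET_QUERYSYSTEMGESTURESTATUS)
            {
                // Return the disable flags instead of letting DefWindowProc answer.
                handled = true;
                return new IntPtr(TABLET_DISABLE_PRESSANDHOLD | TABLET_DISABLE_FLICKS);
            }
            return IntPtr.Zero;
        }
    }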
Related
I have an application written using .NET 4.5 and WPF, mainly for use on Windows 10. In this application, I have created a custom window for capturing images and videos, using LEADTOOLS (DirectShow) to control the camera. Everything looks great with my camera on most devices that I have tried it on except the Surface Pro 4, and that is the device we have bought multiples of for testing the application. If the tablet is moving while I try to set the selected camera, it takes a little over 10 seconds for the cameras to actually switch. The time barely changes (within about 10 milliseconds) whether the tablet is moving only slightly at the moment I tell it to switch cameras or I shake the tablet wildly until the 10 seconds pass and the preview starts back up. Yet if I leave my Surface Pro 4 on its stand on my desk and tell it to switch cameras, it switches in less than half a second. Has anyone else seen this problem? Any idea how to fix it?
I have a WPF application that includes a screen that users can scroll up and down. It uses a ScrollViewer control, which provides a scroll bar, and users can also scroll by dragging the content up or down with touch. This all works fine on all of my PCs, laptops and tablets, and on most of my clients' laptops and tablets.
However, I have one client who can only scroll the form up and down using the scroll bar. Using touch to drag the contents of the form does not work.
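For reference, the touch scrolling relies on WPF's built-in panning support; a minimal sketch of the kind of setup involved (simplified, not my actual form):

    // Sketch: touch-drag scrolling in WPF comes from ScrollViewer.PanningMode;
    // the scroll bar works regardless, but touch panning needs this property.
    using System.Windows;
    using System.Windows.Controls;

    static class ScrollSetup
    {
        public static ScrollViewer Wrap(UIElement formContent)
        {
            return new ScrollViewer
            {
                VerticalScrollBarVisibility = ScrollBarVisibility.Auto,
                PanningMode = PanningMode.VerticalOnly, // enables touch panning
                Content = formContent
            };
        }
    }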
I have seen this before on another tablet and we found that by installing all of the Windows updates the problem was resolved. However, on this tablet Windows is saying there are no more updates and the problem still exists.
The tablet is a Linx 10 that had Windows 10 on it from the start - it's not an upgrade. My hunch is that it needs the relevant Windows update but that Windows isn't reporting it as a required update.
Has anyone experienced this and identified what files/updates are needed to resolve it?
Just to reiterate, the application works fine on most PCs and tablets, so I don't think it's a basic coding problem, which is why I haven't included anything beyond the sketch above.
We are building a Windows application in .NET, and one of its requirements is a touch-screen monitor. Other than that, it's a normal Windows Forms application. But aside from making UI items a little bigger for touch, I can't find anything I as a developer need to do for this requirement, since touch-screen input is basically mouse operations. Am I missing something?
No, you are not missing anything. Do get the actual hardware hooked up so you can test it; "a little bigger" invariably underestimates the problem of fat fingers. Everything should work from a single click, right-clicks are horribly impractical, and double-clicks are best avoided.
The only other thing you'll want to do is go into the Control Panel + Display applet and change the size of standard Windows UI elements. Pick a large window caption font if you want to allow the user to drag or close windows. Make the scrollbars at least twice as wide, and enlarge the menu and message-box fonts as well. Go into the Mouse applet to increase the double-click area and time if you want to support double-clicks.
If you do not need touch-specific event handling, I think that's all you have to do. But touch can mean more than that, and you may want to support it in a better way: http://archive.msdn.microsoft.com/WindowsTouch/Release/ProjectReleases.aspx?ReleaseId=2127
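If you do go beyond mouse emulation, the Windows 7 native route is registering the window for raw touch messages. A minimal sketch for a Windows Forms window, assuming Windows 7 or later (decoding per-contact data via GetTouchInputInfo is omitted):

    // Sketch: opt a Windows Forms window into raw WM_TOUCH messages.
    using System;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    public class TouchForm : Form
    {
        const int WM_TOUCH = 0x0240;

        [DllImport("user32.dll")]
        static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

        protected override void OnHandleCreated(EventArgs e)
        {
            base.OnHandleCreated(e);
            // Ask Windows to send WM_TOUCH instead of emulated mouse messages.
            RegisterTouchWindow(Handle, 0);
        }

        protected override void WndProc(ref Message m)
        {
            if (m.Msg == WM_TOUCH)
            {
                // One or more touch contacts arrived; the low word of wParam
                // holds the contact count, lParam a handle for GetTouchInputInfo.
            }
            base.WndProc(ref m);
        }
    }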
I am working on a Silverlight 4 out of browser (OOB) application on a Windows 7 tablet PC. The majority of the time, the program is in full screen mode. However, if the user rotates the tablet, the application rotates and stays full screen, but is scaled for the previous screen orientation. Taking the application out of full screen and putting it back into full screen rescales everything correctly. Is there any way I can detect when the screen rotates? So far I've tried the app's Resize, LayoutChanged and FullScreenChanged events and have a handler for the main page's SizeChanged event. None of these get fired when the application is rotated in full screen mode.
Per Josh Einstein's suggestion, I tried polling the ActualWidth/Height of the application on a timer. It looks like after the rotation, the ActualWidth/Height values returned from the Silverlight plug-in are the pre-rotated values. Only taking it out of full screen and putting it back in full screen will change the resolution of the plug-in. The HtmlPage.Eval hack didn't work since I am OOB.
The WP7 version of Silverlight has the OnOrientationChanged event, which doesn't seem to be available in the desktop version. Anyone have any ideas?
Great question, but as far as I can tell, it doesn't seem to be possible. I tried changing the screen resolution in a virtual machine (an orientation change is really just a screen-resolution change that results in width being larger than height, or vice versa) and could not trigger any layout events.
The issue seems to be that the Silverlight plugin itself is not resized. With an elevated out-of-browser application you could presumably use COM interop with WMI to get the actual screen resolution, but it's kind of hacky. If it's an in-browser application, you could use the HTML DOM. In both cases, you'll probably have to poll for the current screen resolution instead of being notified via an event.
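Here is a rough sketch of that polling idea for an elevated-trust OOB app; the RotationWatcher name, the one-second interval, and the use of the first video controller are all just illustrative choices:

    // Sketch (elevated-trust OOB only): poll the desktop resolution through
    // WMI via COM automation, since Silverlight raises no event on rotation.
    using System;
    using System.Runtime.InteropServices.Automation;
    using System.Windows.Threading;

    public class RotationWatcher
    {
        int lastWidth, lastHeight;
        public event Action OrientationChanged; // illustrative callback

        public void Start()
        {
            var timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) };
            timer.Tick += (s, e) => Poll();
            timer.Start();
        }

        void Poll()
        {
            if (!AutomationFactory.IsAvailable) return; // not elevated OOB
            dynamic locator = AutomationFactory.CreateObject("WbemScripting.SWbemLocator");
            dynamic services = locator.ConnectServer(".", @"root\cimv2");
            dynamic results = services.ExecQuery(
                "SELECT CurrentHorizontalResolution, CurrentVerticalResolution " +
                "FROM Win32_VideoController");
            dynamic gpu = results.ItemIndex(0); // first adapter, for the sketch
            int w = Convert.ToInt32(gpu.CurrentHorizontalResolution);
            int h = Convert.ToInt32(gpu.CurrentVerticalResolution);
            if (w != lastWidth || h != lastHeight)
            {
                lastWidth = w; lastHeight = h;
                var handler = OrientationChanged;
                if (handler != null) handler();
            }
        }
    }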
Mister Goodcat has written a blog post about how to get at this information in both scenarios.
I would file it as a Silverlight bug too.
I am currently working on a WPF project that involves creating a touch-screen application for Windows XP Embedded. Since Windows XP wasn't built for touch interaction, there are some problems and issues with developing such applications.
An example would be a click: on Windows XP, a click is a mouse-down event followed by a mouse-up event. However, if you use your finger instead of the mouse, you might get a drag motion instead of a click, because as you press down, your finger may slip slightly to the side of the initial position, producing a drag rather than a click. This is just one example of the problems you encounter when developing a touch-screen app for Windows XP.
If you have worked on a WPF touch-screen application for Windows XP, could you share your experience and point out the pitfalls you have encountered? If you know of any resources on this topic, please share those as well.
I would agree with #bflosabre91. With a mouse you could have the same problems; in fact, this happens quite frequently when someone is learning to use a mouse. I think this problem is more relevant at the hardware level and in how the touchscreen actually interprets what the user is doing.
On the software side, you COULD add some logic along the lines of the following (see the sketch below):
On mouse down: record coordinates and maybe the control (button, etc.) that is under the pointer
On mouse up: compare the recorded coordinates with the current coordinates. If they are within x pixels, either do a "control.click" or move the mouse to the old coordinates and tell the mouse to click.
The hardware may already be doing something like this...
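A minimal sketch of that idea in WPF; the TouchClickHelper name and the 20-pixel tolerance are illustrative, not a standard API:

    // Sketch of the mouse-down/mouse-up tolerance check described above.
    using System;
    using System.Windows;
    using System.Windows.Input;

    public static class TouchClickHelper
    {
        const double TolerancePixels = 20; // assumption: tune per device/finger size

        public static void Attach(UIElement element, Action onClick)
        {
            Point down = new Point();
            element.PreviewMouseDown += (s, e) => down = e.GetPosition(element);
            element.PreviewMouseUp += (s, e) =>
            {
                Point up = e.GetPosition(element);
                // Treat a press/release that drifted only slightly as a click.
                if (Math.Abs(up.X - down.X) <= TolerancePixels &&
                    Math.Abs(up.Y - down.Y) <= TolerancePixels)
                {
                    onClick();
                }
            };
        }
    }

Usage would be something like TouchClickHelper.Attach(someButton, () => DoClickAction()).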
I have a WPF touch-screen application running on kiosks with XP (although it's not XP Embedded like you said). I haven't had any issues with any type of click event or anything like that. I programmed it using all the normal mouse click events, so it technically works with a mouse or with the touch screen. As long as you build the controls large enough to account for the fact that a finger will be touching them instead of a mouse pointer, there should be no issues.