Is there a way to test the multitouch capability of an application using a non-multitouch enabled machine? I'd like to simulate user input for zooming, scaling, and rotating at runtime.
This is for a WPF application written in C#.
Try using multiple mouse cursors:
http://www.microsoft.com/presspass/features/2006/dec06/12-14MultiPoint.mspx
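Whatever you use to drive the input, the code being exercised is normally the standard WPF manipulation pipeline. For reference, here is a minimal sketch of the kind of handler you would be testing, assuming an element (illustratively named card) with IsManipulationEnabled="True" and a MatrixTransform as its RenderTransform:

```csharp
// Minimal sketch of a WPF manipulation handler that applies the
// zoom/scale, rotation and translation deltas from touch input.
// Assumes: <Image x:Name="card" IsManipulationEnabled="True"
//                 ManipulationDelta="Card_ManipulationDelta">
//              <Image.RenderTransform><MatrixTransform/></Image.RenderTransform>
//          </Image>
private void Card_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    var element = (UIElement)sender;
    var transform = (MatrixTransform)element.RenderTransform;
    Matrix matrix = transform.Matrix;
    Point center = e.ManipulationOrigin;

    // Apply pinch-zoom, rotation and pan relative to the manipulation origin.
    matrix.ScaleAt(e.DeltaManipulation.Scale.X, e.DeltaManipulation.Scale.Y, center.X, center.Y);
    matrix.RotateAt(e.DeltaManipulation.Rotation, center.X, center.Y);
    matrix.Translate(e.DeltaManipulation.Translation.X, e.DeltaManipulation.Translation.Y);

    transform.Matrix = matrix;
    e.Handled = true;
}
```

However the simulated touch points are injected, they end up as ManipulationDelta events on that element, so zoom, rotation and panning can all be verified against this one handler.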
I built a small desktop app in C# .NET.
It is a local desktop application with no internet connection and no communication with any other computers, and I used WinForms.
As expected, I split the code between logic and UI, and now I want to use WPF for the UI (i.e. change the way the UI part is implemented); in the future I will also want a mobile UI.
What should my steps be?
Do I need to write new code, or is there something automatic?
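For context, the kind of split I already have looks roughly like this (class and control names are just illustrative):

```csharp
using System;
using System.Windows.Forms;

// Plain logic class in its own project, with no UI references.
public class InvoiceCalculator
{
    public decimal Total(decimal net, decimal taxRate)
    {
        return net * (1 + taxRate);
    }
}

// WinForms side: the form only wires controls to the logic.
public class InvoiceForm : Form
{
    private readonly InvoiceCalculator calculator = new InvoiceCalculator();
    private readonly TextBox netTextBox = new TextBox { Left = 10, Top = 10 };
    private readonly Label totalLabel = new Label { Left = 10, Top = 40 };
    private readonly Button calculateButton = new Button { Text = "Calculate", Left = 10, Top = 70 };

    public InvoiceForm()
    {
        calculateButton.Click += (s, e) =>
            totalLabel.Text = calculator.Total(decimal.Parse(netTextBox.Text), 0.20m).ToString();

        Controls.AddRange(new Control[] { netTextBox, totalLabel, calculateButton });
    }
}
```

So the question is really whether classes like InvoiceCalculator can stay exactly as they are behind a new WPF (and later mobile) UI, or whether something more is needed.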
I'm trying to create a WPF application that makes a Skype video call using Kinect gestures (Kinect SDK v2.0). I want to use the Kinect color camera in the video call, while still being able to use the Kinect in my WPF application to detect the "hang up" gesture. Is there a way to do this? While the Skype video call is active, I don't really need the Kinect's color stream in my WPF application; I just need the body stream to detect the gestures. I'm using the Kinect for Xbox One.
Sure. It works out of the box like that.
The SDK 2.0 is designed specifically to enable "simultaneous multi-app support":
"Improved multi-app support enables multiple applications to access a single sensor simultaneously." - Developing with Kinect: Simultaneous multi-app support (at the very bottom of the page)
There isn't anything special you have to consider. Since the SDK doesn't allow you to change any camera settings (like exposure time or gain), your applications essentially only have read access to the sensor data, so they can't interfere with each other.
Just develop your applications separately and run them at the same time.
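As a minimal sketch of the body-only side, assuming the standard Kinect SDK v2.0 types from Microsoft.Kinect and a placeholder for the actual gesture check (your real "hang up" detection would go there):

```csharp
using System;
using Microsoft.Kinect;

// Minimal body-stream-only reader: detects a (placeholder) "hang up" gesture
// while another application uses the color camera of the same sensor.
public class HangUpGestureWatcher : IDisposable
{
    private readonly KinectSensor sensor;
    private readonly BodyFrameReader bodyReader;
    private Body[] bodies;

    public event EventHandler HangUpDetected;

    public HangUpGestureWatcher()
    {
        sensor = KinectSensor.GetDefault();
        sensor.Open();

        // Only the body source is opened; the color source is left untouched.
        bodyReader = sensor.BodyFrameSource.OpenReader();
        bodyReader.FrameArrived += OnBodyFrameArrived;
    }

    private void OnBodyFrameArrived(object sender, BodyFrameArrivedEventArgs e)
    {
        using (BodyFrame frame = e.FrameReference.AcquireFrame())
        {
            if (frame == null)
                return;

            if (bodies == null)
                bodies = new Body[frame.BodyCount];

            frame.GetAndRefreshBodyData(bodies);

            foreach (Body body in bodies)
            {
                if (body.IsTracked && IsHangUpGesture(body))
                {
                    HangUpDetected?.Invoke(this, EventArgs.Empty);
                }
            }
        }
    }

    // Placeholder: replace with your real gesture logic, e.g. a
    // VisualGestureBuilder database or joint-position checks.
    private static bool IsHangUpGesture(Body body)
    {
        return body.HandLeftState == HandState.Closed &&
               body.HandRightState == HandState.Closed;
    }

    public void Dispose()
    {
        bodyReader.Dispose();
        sensor.Close();
    }
}
```

Skype opens its own color stream on the same sensor independently, so nothing in this sketch conflicts with it.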
I have a legacy WinForms desktop app that works perfectly with mouse and keyboard. It has some self-made controls that involve creating threads and so on; for example, the longer a button is pressed, the faster a number is incremented.
The application also uses a Win32 DLL. Now the client wants the application to be touch enabled and to run on a tablet, which also means it needs resizing and rotation capabilities.
My question is: what is the best way to make this application touch enabled and responsive?
I can try to modify the existing WinForms app, but I think it would be a lot of work with poor results. I could also migrate to WPF and reuse the C# code, but I might have trouble with the keyboard, as I have not found a good way to show the on-screen keyboard while keeping the whole app visible on the screen. Or I could migrate to a Windows Store app, but then there is the problem of that Win32 DLL, which I'm not sure can be migrated.
The WinForms application is multilingual, so creating my own keyboard is not a valid option.
If the target is a touch screen, then the best option is certainly a Windows Store app, although there are several limitations.
If you are not going to publish this application in the Windows Store, then you should be able to use all WinAPI functions. (I'm not sure what win32.dll is; if it's your own DLL, it could be a problem.)
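On the on-screen keyboard point: rather than creating your own keyboard, a common workaround for desktop (WinForms/WPF) apps on Windows 8.x tablets is to launch the built-in touch keyboard (TabTip.exe) when a text field gains focus. A minimal sketch; the path below is the standard install location and is an assumption, so adjust it if needed:

```csharp
using System.Diagnostics;
using System.IO;

public static class TouchKeyboard
{
    // Default install location of the Windows touch keyboard host (an assumption;
    // adjust or resolve it via the CommonProgramFiles environment variable if needed).
    private const string TabTipPath =
        @"C:\Program Files\Common Files\Microsoft Shared\ink\TabTip.exe";

    public static void Show()
    {
        if (File.Exists(TabTipPath))
        {
            Process.Start(TabTipPath); // Brings up the built-in touch keyboard.
        }
    }
}
```

You would call TouchKeyboard.Show() from the focus handlers of your text inputs; since it is the system keyboard, it already follows the installed input languages, which sidesteps the multilingual issue.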
Is there a way to prevent or disable video capturing of my WPF application? Perhaps some Win32 API calls or some mask over my WPF content. Or, if that is impossible, is there a way to at least prevent the most popular screen capture programs from recording what is happening in my WPF application?
To prevent an application from capturing your window contents, you can call the SetWindowDisplayAffinity Windows API with the WDA_MONITOR affinity. While this prevents applications from capturing the window, it will not prevent a user from whipping out their smartphone and taking a picture of the screen.
The API is available on Windows 7 and later. It also requires Desktop Window Manager (DWM) composition to be enabled; turning off DWM composition undoes the effect, so you need to prevent users from turning it off. On Windows 8 and later this is not an issue, since the Desktop Window Manager is always on.
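A minimal sketch of how the call might look from WPF via P/Invoke, assuming a standard code-behind MainWindow (WDA_MONITOR is the documented value 0x00000001):

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Interop;

public partial class MainWindow : Window
{
    private const uint WDA_MONITOR = 0x00000001;

    [DllImport("user32.dll", SetLastError = true)]
    private static extern bool SetWindowDisplayAffinity(IntPtr hWnd, uint dwAffinity);

    public MainWindow()
    {
        InitializeComponent();
    }

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);

        // The WPF window only has a Win32 handle once its HwndSource exists.
        IntPtr hwnd = new WindowInteropHelper(this).Handle;

        if (!SetWindowDisplayAffinity(hwnd, WDA_MONITOR))
        {
            // Fails e.g. when DWM composition is disabled (possible on Windows 7).
            throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
        }
    }
}
```

With the affinity set, capture tools and Print Screen see a black rectangle where the window is, while the window still displays normally on the monitor.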
I need to make a multi-touch application for a real estate display centre, with buttons, maps, and floor plans. The application will be running on Windows 8.1 on a 42-inch screen. I want to know which would be the better technology for this kind of application: Flash or WPF?
Does ActionScript have events for the buttons?
If it's going to be on Windows 8.1, can you not do it as a Metro/Modern app? The support for multi-touch will be a lot better than it is in WPF, I think.
There's some information here on how you can approach different app categories.
Using the Windows 8 SDK will also give you the opportunity to use JavaScript instead of C# if you find that easier.