Detect the on-screen keyboard in Windows 10 from WinForms

In a WinForms app running on Windows 10 (Anniversary Update or later), is there a way to detect that the Windows on-screen keyboard has opened?

You can periodically enumerate all top-level windows and search for the DirectUIHWND class.
For more information about window enumeration, take a look at http://improve.dk/finding-specific-windows/
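A minimal P/Invoke sketch of that approach, hedged: the exact class name varies by Windows build (the answer suggests DirectUIHWND; on many Windows 10 builds the touch keyboard's top-level window class is "IPTip_Main_Window"), so verify the name with Spy++ on your target build.

    using System;
    using System.Runtime.InteropServices;
    using System.Text;

    static class TouchKeyboardDetector
    {
        private delegate bool EnumWindowsProc(IntPtr hWnd, IntPtr lParam);

        [DllImport("user32.dll")]
        private static extern bool EnumWindows(EnumWindowsProc lpEnumFunc, IntPtr lParam);

        [DllImport("user32.dll", CharSet = CharSet.Unicode)]
        private static extern int GetClassName(IntPtr hWnd, StringBuilder lpClassName, int nMaxCount);

        [DllImport("user32.dll")]
        private static extern bool IsWindowVisible(IntPtr hWnd);

        // Returns true if a visible top-level window with the given class name exists.
        public static bool IsWindowClassVisible(string className)
        {
            bool found = false;
            EnumWindows((hWnd, lParam) =>
            {
                var buffer = new StringBuilder(256);
                GetClassName(hWnd, buffer, buffer.Capacity);
                if (buffer.ToString() == className && IsWindowVisible(hWnd))
                {
                    found = true;
                    return false; // stop enumerating
                }
                return true;     // keep enumerating
            }, IntPtr.Zero);
            return found;
        }
    }

Poll it from a WinForms Timer, e.g. bool open = TouchKeyboardDetector.IsWindowClassVisible("IPTip_Main_Window");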

Related

Virtual keyboard not shown in kiosk mode

A WPF application targeting .NET Framework 4.6.2 on Windows 10 Enterprise.
The touch keyboard shows up automatically when a text field gets focus.
The computer station must operate in kiosk mode, and one way to achieve this is to replace the default shell for the logged-in user.
According to How to run custom, non window app in kiosk mode in windows 10, the following registry value was added: HKCU\Software\Microsoft\Windows NT\CurrentVersion\Winlogon\Shell
Now the application runs in kiosk mode, but the virtual keyboard is not shown anymore, so the station is unusable.
Can someone confirm that the automatic keyboard behavior introduced in .NET Framework 4.6.2 doesn't work if the application is launched as the default shell?
In the article Use Shell Launcher to create a Windows 10 kiosk, which discusses the Shell Launcher version 2 enhancements, it is stated:
You can use a custom Windows desktop application that can then launch UWP apps, such as Settings and Touch Keyboard.
Is the keyboard mentioned above the same one I need? If I follow the Shell Launcher version 2 procedure, will the keyboard pop up again without rewriting the application code?
Thanks
Filippo
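For reference, a minimal sketch of setting that Winlogon Shell value from C#. The executable path is a placeholder, and since the value lives under HKCU it must be written while the kiosk user is logged in:

    using Microsoft.Win32;

    class KioskShellSetup
    {
        static void Main()
        {
            // Per-user shell replacement, as described above.
            // The executable path is an assumption; point it at your kiosk app.
            using (var key = Registry.CurrentUser.CreateSubKey(
                @"Software\Microsoft\Windows NT\CurrentVersion\Winlogon"))
            {
                key.SetValue("Shell", @"C:\Kiosk\KioskApp.exe");
            }
        }
    }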

Touchscreen Windows 7 WPF

I have an app which I need to make accessible for Windows Touch. It is not a multi-touch application. I've looked at Microsoft's guidelines for touch applications, which are interesting. One thing I am not clear on, though, is text input.
I would like a keyboard to appear when I click in a TextBox field. Is there a way to use the built-in on screen keyboard for this?
The first monitor I tested with was a Wacom. It is an older unit that uses a pen. It had some software that pulled up an on-screen keyboard whenever I clicked in any text field (in any application). It was very handy. I thought this feature was using built-in Windows Tablet software because it didn't look like it came from a third party. A newer monitor I just purchased (Elo) does not have this feature though.
Answering my own question so it won't show up as unanswered any longer... From my comment above:
Looks like I've found the problem. The general-purpose driver for the monitor wasn't installing it as a Tablet PC monitor. The Windows 7-only driver provides the Tablet PC control panel settings. Now a keyboard shows up whenever I click in a TextBox field. The Windows XP-compatible driver must have been using a legacy subsystem...
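If the driver route isn't available, a common fallback is to launch the Tablet PC input panel (TabTip.exe) yourself when a text field gets focus. A sketch, assuming the standard TabTip.exe location on Windows 7 with the Tablet PC components installed:

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Windows;
    using System.Windows.Input;

    public partial class MainWindow : Window
    {
        // Wire this to GotKeyboardFocus on the TextBox in XAML.
        private void TextBox_GotKeyboardFocus(object sender, KeyboardFocusChangedEventArgs e)
        {
            string tabTip = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.CommonProgramFiles),
                @"Microsoft Shared\ink\TabTip.exe");
            if (File.Exists(tabTip))
                Process.Start(tabTip);
        }
    }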

Put an application window as wallpaper

I'm looking to put an application window behind all other windows.
I was thinking of using .NET 4 and WPF or Silverlight.
I don't have any specific drawing code, I just want my application window as my desktop wallpaper while retaining the interactivity (no mucking around with screenshots etc).
Is this even possible and if so, how?
Edit: I'm using Windows 7 x64.
Edit 2: I know the desktop is drawn to a window with id 0 (or something like that). Can't I use interop calls to put my window as a child of window 0?
For Windows XP and earlier versions, use Active Desktop and set a web page that runs a Silverlight application.
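On Windows 7 the interop idea from Edit 2 roughly works: the desktop hierarchy is rooted in the Progman window, and you can re-parent your window under it so it sits behind everything else. A sketch, with the caveat that this behavior is undocumented and differs between Windows builds (on some of them the WorkerW window must be targeted instead):

    using System;
    using System.Runtime.InteropServices;

    static class DesktopParenting
    {
        [DllImport("user32.dll", CharSet = CharSet.Unicode)]
        private static extern IntPtr FindWindow(string lpClassName, string lpWindowName);

        [DllImport("user32.dll")]
        private static extern IntPtr SetParent(IntPtr hWndChild, IntPtr hWndNewParent);

        // Re-parent a top-level window (for WPF, get the handle from
        // WindowInteropHelper) under the desktop's Progman window.
        public static void SendBehindOtherWindows(IntPtr hWnd)
        {
            IntPtr progman = FindWindow("Progman", null);
            if (progman != IntPtr.Zero)
                SetParent(hWnd, progman);
        }
    }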

Windows 7 touchscreen keyboard in a WPF app

I'm building a .NET 3.5 SP1 WPF app for use on Windows 7 - is there a way to use the Windows touchscreen keyboard from my application?
NOTE: In contrast to other similar questions I'm not looking to embed the keyboard in my app, I just want to start the keyboard and use it for text entry.
Like Will said...just start the process.
System.Diagnostics.Process.Start("osk.exe");
Create a desktop shortcut to a batch file that starts both your application and the Windows on-screen keyboard?
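If you'd rather keep that in code than in a batch file, a minimal launcher sketch (the application path is a placeholder):

    using System.Diagnostics;

    class Launcher
    {
        static void Main()
        {
            // Start the main application (path is an assumption).
            Process.Start(@"C:\MyApp\MyApp.exe");

            // Start the Windows on-screen keyboard. Launching osk.exe from a
            // 32-bit process on 64-bit Windows can fail because of WOW64
            // file-system redirection; building the launcher as AnyCPU avoids that.
            Process.Start("osk.exe");
        }
    }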

Testing Windows 7 multitouch on a dev machine without multitouch?

Is there a way to test the multi-touch capability of an application on a machine without multi-touch hardware? I'd like to simulate user input for zooming, scaling, and rotating at runtime.
This is for a WPF application written in C#.
Try using multiple mouse cursors:
http://www.microsoft.com/presspass/features/2006/dec06/12-14MultiPoint.mspx
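Another option is to structure the code so the zoom/rotate/translate math needs no hardware at all: keep it in a plain method that the manipulation event handler calls, then drive that method with synthesized values in tests. A sketch (the helper and its shape are hypothetical):

    using System.Windows;
    using System.Windows.Media;

    public static class ManipulationMath
    {
        // Applies one manipulation step to an existing transform matrix.
        public static Matrix Apply(Matrix current, double scale,
            double rotationDegrees, Vector translation, Point center)
        {
            current.ScaleAt(scale, scale, center.X, center.Y);
            current.RotateAt(rotationDegrees, center.X, center.Y);
            current.Translate(translation.X, translation.Y);
            return current;
        }
    }

A test can then exercise pinch-zoom or rotation without a touchscreen, e.g. ManipulationMath.Apply(Matrix.Identity, 1.5, 30, new Vector(10, 0), new Point(100, 100)).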
