WPF application targeting .NET Framework 4.6.2 on Windows 10 Enterprise.
The touch keyboard shows up automatically when a text field gets focus.
The computer station must operate in kiosk mode, and one way to achieve this is to replace the default shell for the logged-in user.
Following How to run custom, non window app in kiosk mode in windows 10, the application was set as the shell through this registry value: HKCU\Software\Microsoft\Windows NT\CurrentVersion\Winlogon\Shell
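For reference, a minimal sketch of making that change from C#, assuming a hypothetical kiosk executable path (the same value can of course be set with regedit or reg.exe):

    using Microsoft.Win32;

    class KioskShellSetup
    {
        static void Main()
        {
            // Replaces the shell for the current user only (HKCU), not machine-wide.
            // @"C:\Kiosk\KioskApp.exe" is a placeholder path for the kiosk application.
            using (RegistryKey winlogon = Registry.CurrentUser.CreateSubKey(
                @"Software\Microsoft\Windows NT\CurrentVersion\Winlogon"))
            {
                winlogon.SetValue("Shell", @"C:\Kiosk\KioskApp.exe", RegistryValueKind.String);
            }
        }
    }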
Now the application runs in kiosk mode, but the virtual keyboard is no longer shown, which makes the kiosk unusable.
Can someone confirm that the automatic touch keyboard behavior introduced in .NET Framework 4.6.2 doesn't work if the application is launched as the default shell?
In the article Use Shell Launcher to create a Windows 10 kiosk, when discussing the Shell Launcher version 2 enhancements, it is reported:
You can use a custom Windows desktop application that can then launch UWP apps, such as Settings and Touch Keyboard.
Is the keyboard mentioned above the one I need? If I follow the Shell Launcher version 2 procedure, will the keyboard pop up again without rewriting the application code?
Thanks
Filippo
Related
In a Winforms app running on Windows 10 (Anniversary or greater), is there a way to detect that the Windows onscreen keyboard has opened?
You can periodically enumerate all windows and search for the DirectUIHWND class.
For more info about window enumeration, take a look at http://improve.dk/finding-specific-windows/
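A rough sketch of that polling idea, assuming it is enough to walk the top-level windows and their children and compare class names (note that DirectUIHWND is also used by other system windows, so in practice you would probably combine this with a check on the owning window):

    using System;
    using System.Runtime.InteropServices;
    using System.Text;

    static class KeyboardDetector
    {
        delegate bool EnumWindowsProc(IntPtr hWnd, IntPtr lParam);

        [DllImport("user32.dll")]
        static extern bool EnumWindows(EnumWindowsProc lpEnumFunc, IntPtr lParam);

        [DllImport("user32.dll")]
        static extern bool EnumChildWindows(IntPtr hWndParent, EnumWindowsProc lpEnumFunc, IntPtr lParam);

        [DllImport("user32.dll", CharSet = CharSet.Unicode)]
        static extern int GetClassName(IntPtr hWnd, StringBuilder lpClassName, int nMaxCount);

        static string ClassNameOf(IntPtr hWnd)
        {
            var sb = new StringBuilder(256);
            GetClassName(hWnd, sb, sb.Capacity);
            return sb.ToString();
        }

        // True if any top-level window, or one of its child windows,
        // currently uses the given window class.
        public static bool IsClassPresent(string className)
        {
            bool found = false;
            EnumWindows((hWnd, lParam) =>
            {
                if (ClassNameOf(hWnd) == className) { found = true; return false; }
                EnumChildWindows(hWnd, (child, childParam) =>
                {
                    if (ClassNameOf(child) == className) { found = true; return false; }
                    return true;
                }, IntPtr.Zero);
                return !found; // stop enumerating once a match is found
            }, IntPtr.Zero);
            return found;
        }
    }

    // Call periodically, e.g. from a timer:
    //   bool keyboardOpen = KeyboardDetector.IsClassPresent("DirectUIHWND");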
I'm looking to put an application window behind all other windows.
I was thinking of using .NET 4 and WPF or Silverlight.
I don't have any specific drawing code; I just want my application window as my desktop wallpaper while retaining interactivity (no mucking around with screenshots, etc.).
Is this even possible and if so, how?
Edit: I'm using Windows 7 x64.
Edit 2: I know the desktop is drawn to a window with id 0 (or something like that). Can't I use interop calls to put my window as a child of window 0?
For Windows XP and previous versions, use Active Desktop and set a web page that runs a Silverlight application.
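On the interop idea raised in the question above: a minimal WPF sketch (an illustration, not a tested solution) that pushes the window to the bottom of the Z-order with SetWindowPos and HWND_BOTTOM. This keeps the window behind other windows but does not literally turn it into the wallpaper, and the call has to be repeated whenever the window is activated if it should stay there:

    using System;
    using System.Runtime.InteropServices;
    using System.Windows;
    using System.Windows.Interop;

    public partial class MainWindow : Window
    {
        static readonly IntPtr HWND_BOTTOM = new IntPtr(1);
        const uint SWP_NOMOVE = 0x0002, SWP_NOSIZE = 0x0001, SWP_NOACTIVATE = 0x0010;

        [DllImport("user32.dll")]
        static extern bool SetWindowPos(IntPtr hWnd, IntPtr hWndInsertAfter,
            int x, int y, int cx, int cy, uint flags);

        public MainWindow()
        {
            InitializeComponent(); // assumes the usual MainWindow.xaml
            SourceInitialized += (s, e) => SendToBottom();
            Activated += (s, e) => SendToBottom(); // keep it at the bottom when clicked
        }

        void SendToBottom()
        {
            IntPtr hwnd = new WindowInteropHelper(this).Handle;
            if (hwnd != IntPtr.Zero)
                SetWindowPos(hwnd, HWND_BOTTOM, 0, 0, 0, 0,
                    SWP_NOMOVE | SWP_NOSIZE | SWP_NOACTIVATE);
        }
    }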
I'm trying to use a WPF application on our Citrix server, and made a really simple window with a button and a textbox. When running the application, nothing but the application name at the top of the window is shown. Is it not possible to use WPF on Citrix, or is our Citrix server too old?
It should be possible to use WPF on Citrix starting with PS 4.5 Feature pack 1.
Is your version older than this?
I'm building a .NET 3.5 sp1 WPF app for use on Windows 7 - is there a way to place the touchscreen keyboard in my application?
NOTE: In contrast to other similar questions I'm not looking to embed the keyboard in my app, I just want to start the keyboard and use it for text entry.
Like Will said...just start the process.
System.Diagnostics.Process.Start("osk.exe");
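If the keyboard should appear whenever a text box receives focus, a small variant of the same idea is to hook the focus event at the window level and launch osk.exe from there (a sketch only; it does not track whether osk.exe is already running):

    using System.Diagnostics;
    using System.Windows;
    using System.Windows.Controls;
    using System.Windows.Input;

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();
            // Launch the on-screen keyboard whenever a TextBox in this window gets focus.
            AddHandler(UIElement.GotKeyboardFocusEvent,
                new KeyboardFocusChangedEventHandler(OnGotKeyboardFocus), true);
        }

        void OnGotKeyboardFocus(object sender, KeyboardFocusChangedEventArgs e)
        {
            if (e.NewFocus is TextBox)
            {
                // osk.exe ships with Windows; note this starts a new instance each time.
                Process.Start("osk.exe");
            }
        }
    }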
Create a desktop shortcut to a batch file that starts both your application and the Windows on-screen keyboard?
We have a legacy application running on UNIX. Our Windows users log in to the application via a terminal application or the command prompt. The app looks like one of those car-dealer-like applications that run in a DOS-style interface.
I am creating a new WinForms interface to the application but still want to keep the legacy application online for a while. However, I don't want the users to open a command prompt; they should access the legacy application from within my WinForms app. Is there a user control I can place on the form for command prompt access?
Solution: I ended up using a control from Rebex.net; fast integration and easy to use.
See this blog post from Jeffrey.
With his code you can write to a Console window, even from a Windows Forms application.
NOTE
I would NOT recommend doing that.
A Windows application should use windows as its GUI and not the console window...
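For context, one common way to get what that post describes is the classic AllocConsole approach; a minimal sketch (a generic illustration, not Jeffrey's exact code):

    using System;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    static class Program
    {
        [DllImport("kernel32.dll")]
        static extern bool AllocConsole();

        [STAThread]
        static void Main()
        {
            // Attach a console to this WinForms process so Console.WriteLine has somewhere to go.
            AllocConsole();
            Console.WriteLine("Console output from a Windows Forms application.");

            Application.EnableVisualStyles();
            Application.Run(new Form { Text = "Main form" });
        }
    }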
I've been keeping my eye on this one. It looks promising.
http://www.codeproject.com/KB/IP/Terminal_Control_Project.aspx?fid=469468&df=90&mpp=25&noise=3&sort=Position&view=Quick&select=2857639