I've written a Win32 app in C++ (a game) and I want to know when the application has lost focus because the user pressed Ctrl+Alt+Del and started Task Manager. How can I do this? After detecting the event I want to minimize the game's window and pause its processing (animations, audio, etc.). However, if the user returns from the Ctrl+Alt+Del menu to the game, it should keep running as usual. I thought I could check for key presses on Ctrl, Alt and Del, but that doesn't seem to work, and just reacting to losing focus (WM_KILLFOCUS) is not what I want.
You can use WTSRegisterSessionNotification(); you'll get the WM_WTSSESSION_CHANGE message when the user presses Ctrl+Alt+Del and Windows switches to the secure desktop.
Beware that you cannot tell that it was actually the secure desktop the user switched to; that would be a rather nasty security leak. You'll also get the notification when the user switches to another logon session, which is of course also a case where you want to pause your game.
For that matter, a game ought to pause automatically whenever the game window loses the foreground. Nobody likes getting killed because they switched to their email reader :) Use the WM_ACTIVATEAPP message.
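A minimal sketch of wiring both messages into a window procedure might look like this. The `PauseGame`/`ResumeGame` helpers are hypothetical stand-ins for your game's own pause logic:

```cpp
#include <windows.h>
#include <wtsapi32.h>
#pragma comment(lib, "Wtsapi32.lib")

// Hypothetical pause/resume hooks; replace with your game's own logic.
static bool g_paused = false;
void PauseGame()  { g_paused = true;  /* stop animations, mute audio, ... */ }
void ResumeGame() { g_paused = false; /* restart timers, unmute audio, ... */ }

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_CREATE:
        // Ask for WM_WTSSESSION_CHANGE messages for this session.
        WTSRegisterSessionNotification(hwnd, NOTIFY_FOR_THIS_SESSION);
        return 0;

    case WM_WTSSESSION_CHANGE:
        // Fired on lock/unlock and session switches; note you cannot tell
        // that it was specifically the secure desktop.
        if (wParam == WTS_SESSION_LOCK) {
            ShowWindow(hwnd, SW_MINIMIZE);
            PauseGame();
        } else if (wParam == WTS_SESSION_UNLOCK) {
            ResumeGame();
        }
        return 0;

    case WM_ACTIVATEAPP:
        // wParam is TRUE when our app gains the foreground, FALSE on losing it.
        if (wParam) ResumeGame(); else PauseGame();
        return 0;

    case WM_DESTROY:
        WTSUnRegisterSessionNotification(hwnd);
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```

Register in WM_CREATE and unregister in WM_DESTROY so the notification lifetime matches the window's.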
I am trying to write a tiny X11 screen-locker program (something like i3lock) in C. When the program is run it spawns a full-screen window. I want this window to intercept all keyboard input, preventing the user from leaving until they have entered the password. How would I go about intercepting the keyboard input and preventing the user from leaving?
Any help would be much appreciated, thanks.
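One common approach, the one lockers like i3lock use, is to map a full-screen override-redirect window and then grab the keyboard (and pointer) with XGrabKeyboard/XGrabPointer, so the X server routes all input to your client. A minimal sketch, assuming an X11 session with Xlib available; the password check is a placeholder (here it just exits on Enter):

```cpp
#include <X11/Xlib.h>
#include <X11/keysym.h>

int main()
{
    Display *dpy = XOpenDisplay(nullptr);
    if (!dpy) return 1;

    int scr = DefaultScreen(dpy);
    Window root = RootWindow(dpy, scr);

    // Override-redirect: the window manager cannot move, lower or close it.
    XSetWindowAttributes attrs;
    attrs.override_redirect = True;
    attrs.background_pixel = BlackPixel(dpy, scr);
    Window win = XCreateWindow(dpy, root, 0, 0,
                               DisplayWidth(dpy, scr), DisplayHeight(dpy, scr),
                               0, CopyFromParent, InputOutput,
                               (Visual *)CopyFromParent,
                               CWOverrideRedirect | CWBackPixel, &attrs);
    XMapRaised(dpy, win);

    // Grab the keyboard so no other client receives key events. Real lockers
    // retry this in a loop, since the grab fails if another client holds one.
    if (XGrabKeyboard(dpy, win, True, GrabModeAsync, GrabModeAsync,
                      CurrentTime) != GrabSuccess)
        return 1;
    // Grab the pointer too, so the user can't click other windows.
    XGrabPointer(dpy, win, True, 0, GrabModeAsync, GrabModeAsync,
                 win, None, CurrentTime);

    XSelectInput(dpy, win, KeyPressMask);
    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == KeyPress) {
            KeySym ks = XLookupKeysym(&ev.xkey, 0);
            if (ks == XK_Return) break;  // placeholder: verify password here
        }
    }

    XUngrabKeyboard(dpy, CurrentTime);
    XUngrabPointer(dpy, CurrentTime);
    XCloseDisplay(dpy);
}
```

Note that a grab only covers the X session itself; VT switching (Ctrl+Alt+F1) is handled by the kernel and needs separate measures if you want to block it.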
My setup has two machines: one is customer-facing and the other is used by store personnel. When store personnel want to take control of the customer-facing system, we use UltraVNC to remote into it, push our application to a second virtual display, and show a PLEASE WAIT screen to the customer.
With Windows 10 the concept of a virtual display is no longer possible, so our remote view via UltraVNC lands on the primary display, which means the customer can SEE what the store personnel are doing and can also interact/interfere with it. This is my struggle today!
We found that UltraVNC can disable user input, which works for keyboard/mouse but doesn't support blocking TOUCH input (we use a touchscreen), and there seems to be no way to show a PLEASE WAIT screen; the only option is to disable the screen entirely (it goes black, which has customers asking if the program crashed).
So I'm opening this up to the general public to see if anyone has experience with UltraVNC or has a completely different proposal for me to consider. I am open to all suggestions!
You rely on a Windows solution; on Linux there is the D-Bus messaging system, but I don't know whether Windows has a similar one. You could use a PowerShell script that unplugs the display or the touchscreen input and sends a toast message to the user.
https://gist.github.com/atao/12b5857d388068bab134d5f9c36241b1
Using the Lisp implementation of the X11 protocol, get-overlay-window freezes when no compositor is running. If I kill the Lisp process, the XID is printed out.
This also freezes my Lisp window manager running in another thread of the same process. Basically X acts like it's been grabbed, so thank god for Ctrl+Alt+F1.
Some previous questions about composite show others running into similar problems when no compositor is running.
I'm guessing that maybe the server is waiting for some sort of out-of-protocol authorization, or that some particular sequence of events has to be completed first?
Having access to the overlay window when another compositor is active isn't helpful for writing a compositor!
Apparently I had a reading-comprehension fail with the protocol description, or they had a writing fail.
Asking Composite to redirect windows automatically ensures the windows' contents get drawn. It does not ensure they get drawn to the overlay! Nor does the overlay appear to be transparent. So even with all windows set to update automatically, when the overlay window gets mapped by the call that fetches its XID, it blocks you from seeing any other updates to the screen and blocks all input.
That makes the overlay not very useful, or makes the request for automatic updates of redirected windows not very useful. Either way, it seems we will have to paint every single pixel, even of the windows we're not interested in.
Maybe it's just a driver thing?
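For what it's worth, the input-blocking part is a known property of the overlay, and the usual fix compositors apply is to give the overlay window an empty *input* shape via the XFixes extension, so clicks and keys pass through it to the windows underneath. A C-level sketch of that call sequence, assuming you already have the display connection and the overlay's XID (Lisp bindings may additionally need an explicit flush/force-output to avoid the request appearing to hang):

```cpp
#include <X11/Xlib.h>
#include <X11/extensions/Xfixes.h>
#include <X11/extensions/shape.h>

// Make the composite overlay window invisible to input: an empty region as
// the input shape means every event falls through to whatever is below.
void make_overlay_input_transparent(Display *dpy, Window overlay)
{
    XserverRegion region = XFixesCreateRegion(dpy, nullptr, 0);
    // Reset the bounding shape to the default (we still draw to the overlay),
    // but set the *input* shape to the empty region.
    XFixesSetWindowShapeRegion(dpy, overlay, ShapeBounding, 0, 0, 0);
    XFixesSetWindowShapeRegion(dpy, overlay, ShapeInput, 0, 0, region);
    XFixesDestroyRegion(dpy, region);
    XFlush(dpy);
}
```

This only solves the input side; you still have to paint the redirected windows' contents onto the overlay yourself.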
We have a door telephone on our office building. When somebody presses the button it calls a telephone number for which we have a SIM card. Right now that SIM card is in a cellphone. Every time we have a meeting at our office, we have to pick up the phone and press 3 to open the door. I'm looking for a solution to programmatically pick up the phone and press the 3. Does any such software exist? I have googled but found nothing.
TL;DR: I need some software (and a SIM card reader) that can programmatically pick up the phone when it rings and respond with a 3 on the numpad.
The OS doesn't matter.
Not sure if Stack Overflow is the right place to ask. Let me know if you have suggestions for better places.
You could try putting the SIM card in a normal 3G USB dongle and using an application called "Gammu", which can answer a call and send DTMF codes, i.e. number presses. I have only used Gammu on Linux systems, but I believe it works on Windows as well.
Another possible solution:
Set up voicemail for that SIM card, and when you are asked to record your voicemail greeting (the message played to whoever reaches the voicemail), just press the 3 button.
When you want the door to open automatically, use call forwarding to redirect all calls to the voicemail. Alternatively, turn off the phone (on most cellular networks this will redirect all calls to voicemail).
I am trying to write something for Windows using the WinAPI, so I can make the touchpad do whatever the Mac touchpad does.
I checked with Spy++ which WM_* messages two-finger taps and the like send to the OS, and it turns out it only sends, roughly, these:
WM_LBUTTONDOWN
WM_LBUTTONUP
WM_MOUSEHOVER
WM_MOUSEHWHEEL
WM_MOUSELEAVE
WM_MOUSEMOVE
WM_RBUTTONDOWN
WM_RBUTTONUP
When I tried to see what happened when clicking with 2 or 3 fingers, it didn't send any particular message unless I moved them a bit.
Firstly, I would like to start with this: when 5 fingers go down, show the desktop (as Win+D does).
How do I write something (a driver?) that can detect 5 fingers touching the touchpad simultaneously?
Of course there are no OS messages for this, but maybe I can detect it from some unique combination of existing messages.
If I need to write a driver, can I make it generic for most touchpads? Can I do it as an add-on?
If you know a good tutorial for writing a Windows driver, please post it, because I have no clue about driver development.
Is there anything else I need to take into account?
1. Detect 5-finger mouse events.
2. Make a thread in Explorer on startup that handles those new mouse messages.
Thanks in advance.
Mouse Input Notifications
In short, you can't.
First, there are touchpads that can physically detect only one finger, and for those that can detect many, their drivers do the translation for you.
Windows does not have any inherent support for reading multiple touch inputs - it relies on the touchpad drivers to provide them.
You can achieve your goal for SOME devices by writing your own touchpad driver (probably starting from the Linux touchpad drivers and the Windows Driver Kit), but this is far from simple.
And you'll need to do this for each and every touchpad device you want to support (Synaptics, Alps Electric and Cirque, to name a few).
Only after that can you move on to implementing reactions to touchpad actions in applications like Explorer.