Who sent/posted the WM_MOUSEMOVE? - winforms

When using the Win32 API message loop (or any higher-level abstraction of it, such as System.Windows.Forms.IMessageFilter) to get a message, how do I find out who posted the WM_MOUSEMOVE, i.e. which control, which component, or which piece of code?
Does that even make sense in the context of Win32? I'm thinking of something akin to the object sender in .NET events.
I checked the documentation for WM_MOUSEMOVE and couldn't find anything. It's been a decade since I last used the Win32 API extensively.

All window messages are sent to a specific HWND; that is the receiver. The sender, for window messages such as WM_MOUSEMOVE, is the operating system: it generates the message for a specific window when it decides that window should receive it.
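To illustrate the point, here is a minimal sketch of a plain Win32 message loop in C++; the MSG structure carries only the target window, the message id, the parameters, a timestamp and the cursor position, and nothing that identifies a sender.

// Minimal Win32 message loop. MSG has no "sender" field: only the target
// window, message id, parameters, timestamp and cursor position exist.
#include <windows.h>

int RunMessageLoop()
{
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        if (msg.message == WM_MOUSEMOVE)
        {
            // msg.hwnd   - the window the message is destined for (the receiver)
            // msg.wParam - modifier/button state flags (MK_LBUTTON, MK_CONTROL, ...)
            // msg.lParam - cursor position in client coordinates
            // Nothing here identifies which component generated the message.
        }
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return static_cast<int>(msg.wParam);
}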

Related

SFML window always intercepts all events, causing other areas to stop responding

https://github.com/kamirr/QSFML
I used this method to integrate Qt and SFML.
But I found that once this control gains focus, events are no longer delivered to other parts of the program.
Thanks for any help.

Why does xlib interfere XNVCtrl calls? [duplicate]

I'm trying to create a multithreaded OpenGL application with libX11, with one separate thread per window and one manager thread.
I have an event loop in the manager thread:
while (true) {
    // Drain whatever is already in the Xlib event queue.
    while (XQLength(mPlatformData->display)) {
        XNextEvent(mPlatformData->display, &event);
        std::cout << "event" << std::endl;
    }
}
This is a great event loop for single threaded applications, but with this multithreaded setup strange things happen.
When I'm creating a window, I need to disable the event queue, or glXMakeCurrent will just hang: my entire thread stops and does nothing.
I can't find much information about multithreaded X11 applications on the net. Should I handle my events differently?
It is known that Xlib has several unfixable runtime issues that manifest in concurrent access situations. I'm guessing you're running into exactly one of those.
This is one of the reasons Xcb was created in the first place: to fix the problems of Xlib. GLX is specified against Xlib, so this might seem like a show-stopper when it comes to OpenGL. However, there is an Xlib wrapper around Xcb, and one can safely use that to interface with GLX while still using Xcb for the rest of the program: http://xcb.freedesktop.org/opengl/
I see two possible solutions:
Put an XLockDisplay/mutex around XNextEvent and around the GLX calls; you don't have to lock ordinary OpenGL calls, just the functions prefixed glX... (see the sketch after this list).
Use Xcb to get runtime-correct behaviour and follow the guide I linked above to make it work with OpenGL/GLX.
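A minimal sketch of the first option in C++, assuming XInitThreads has already been called before any other Xlib call; XPending is used here to check for readable events:

// Sketch of option 1: serialize access to the shared Display with
// XLockDisplay/XUnlockDisplay (this requires XInitThreads to have been
// called before the first Xlib call).
#include <X11/Xlib.h>

void drainEvents(Display* dpy)
{
    XEvent event;
    XLockDisplay(dpy);
    while (XPending(dpy)) {          // events queued or readable right now
        XNextEvent(dpy, &event);
        // ... dispatch the event ...
    }
    XUnlockDisplay(dpy);
}

// In the render threads, wrap the glX* calls the same way:
//   XLockDisplay(dpy);
//   glXMakeCurrent(dpy, drawable, context);
//   XUnlockDisplay(dpy);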
As eile said, you should check that you call XInitThreads.
I was able to get some good results from it when I used a background thread to do the window drawing for an animation. There seems to be no real problem if you stick to drawing code.
If you need more than that, and because you are using low-level libX11, the best approach is to open multiple X11 connections and use one connection per top-level window. I did this 10 years ago when I played with developing a BeOS cross-platform toolkit, when everything was in a worse state than it is now.
You can use this even for event handling and for child windows of a top-level window, but that requires some very tricky code for the XEvent masks.
What are you doing in your render threads? In any case, if you share a Display* connection across different threads you have to call XInitThreads.
I've had good experience with one Display connection per thread. Use XSelectInput to receive events on your main thread; window IDs can be shared across different Display* connections.
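A minimal sketch of the one-Display-per-thread approach; the GLX setup is elided and renderThread is a made-up name for illustration.

// Sketch: one Display connection per thread. Window IDs obtained on one
// connection can be used on another, so the main thread can still select
// for and read the events of windows rendered elsewhere.
#include <X11/Xlib.h>
#include <thread>

void renderThread(Window win)
{
    Display* dpy = XOpenDisplay(nullptr);   // private connection for this thread
    // ... create a GLX context against dpy, glXMakeCurrent(dpy, win, ctx), draw ...
    XCloseDisplay(dpy);
}

int main()
{
    XInitThreads();                          // must precede any other Xlib call
    Display* mainDpy = XOpenDisplay(nullptr);

    Window win = XCreateSimpleWindow(mainDpy, DefaultRootWindow(mainDpy),
                                     0, 0, 640, 480, 0, 0, 0);
    XSelectInput(mainDpy, win, ExposureMask | KeyPressMask | StructureNotifyMask);
    XMapWindow(mainDpy, win);
    XFlush(mainDpy);

    std::thread renderer(renderThread, win);

    XEvent event;
    while (true) {                           // the main thread owns event handling
        XNextEvent(mainDpy, &event);
        // ... dispatch ...
    }
}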

what is required to get an overlay window using x11 protocol with no compositor running?

Using the Lisp implementation of the X11 protocol, get-overlay-window freezes when no compositor is running. If I kill the Lisp process, the XID is printed out.
This also freezes my Lisp window manager running in another Lisp thread in the same process. Basically X acts as if it has been grabbed, so thank god for Ctrl-Alt-F1.
Some previous questions about composite show others running into similar problems when no compositor is running.
I'm guessing that maybe the server is waiting for some sort of out-of-protocol authorization, or that some particular sequence of events has to be completed first?
Having access to the overlay window when another compositor is active isn't helpful for writing a compositor!
Apparently I had a reading-comprehension failure with the protocol description, or they had a writing failure.
Asking Composite to redirect windows automatically ensures the windows' contents get drawn. It does not ensure they get drawn to the overlay! Nor does the overlay appear to be transparent. So even with all windows set to be automatically updated, when the overlay window gets mapped by the call that retrieves its XID, it blocks you from seeing any other updates to the screen and blocks all input.
That makes the overlay, in a sense, not very useful, or the request for automatic updates of redirected windows not useful. Either way, it seems we will have to paint every single pixel, even for the windows we're not interested in.
Maybe it's just a driver thing?
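For reference, this is roughly what acquiring the overlay looks like through the C bindings (libXcomposite/libXfixes); the XFixes input-shape step is an assumption about the usual way compositors keep the mapped overlay from blocking input, not something taken from the question.

// Sketch in C++ using libXcomposite + libXfixes: acquire the overlay
// window and punch an empty input region through it so it does not block
// input to the windows underneath.
#include <X11/Xlib.h>
#include <X11/extensions/Xcomposite.h>
#include <X11/extensions/Xfixes.h>
#include <X11/extensions/shape.h>

Window acquireOverlay(Display* dpy)
{
    Window root = DefaultRootWindow(dpy);

    // Redirect all children of the root so their contents are rendered
    // off-screen (manual redirection, as a compositor would use).
    XCompositeRedirectSubwindows(dpy, root, CompositeRedirectManual);

    // The overlay itself; the compositor is expected to paint everything here.
    Window overlay = XCompositeGetOverlayWindow(dpy, root);

    // Assumption: make the overlay transparent to input with the usual
    // XFixes empty-input-region trick, so clicks reach the windows below.
    XserverRegion empty = XFixesCreateRegion(dpy, nullptr, 0);
    XFixesSetWindowShapeRegion(dpy, overlay, ShapeInput, 0, 0, empty);
    XFixesDestroyRegion(dpy, empty);

    XFlush(dpy);
    return overlay;
}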

How to create a global window for process inter-communications?

I pretty much like the way a window sends and receives messages, and I want to reuse that for inter-process communication. I've heard of named pipes, but I don't want to write to a file; that seems ugly and unintuitive to me.
So is it possible to create a window with a handle that is shareable across multiple processes?
Window handles are shared by default; you can find them through FindWindow or FindWindowEx. What you want is a bit like socket communication: a client-server-client transit protocol. It's just that sockets are more powerful and can be used across different machines.
You can communicate between processes by defining your own WM_* message type, and you can implement "many-to-many" inter-process communication this way. But it is not very practical (even leaving the ugliness aside): it is not as powerful or as mature as sockets, and it uses more resources (because of the visible window).
Of course, as @IInspectable said, there is another option: message-only windows. But such a window is not visible, which is not "intuitive", and getting a window handle is as "ugly" as opening a file. It amounts to encapsulating a message queue in an invisible window.
In addition, if the window is accidentally closed, the communication will fail.
In summary: you can use a visible window to communicate between processes if you prefer, but this method is not practical (unless there is a special need).
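A minimal sketch of the message-only-window approach mentioned above; the class name, window name and payload are made up for illustration, and WM_COPYDATA is just one convenient way to hand a blob of data to another process.

// Sketch: a message-only window (HWND_MESSAGE parent) as an IPC endpoint.
// Another process can locate it with FindWindowExW(HWND_MESSAGE, ...) and
// send it data with WM_COPYDATA.
#include <windows.h>

static LRESULT CALLBACK IpcWndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    if (msg == WM_COPYDATA)
    {
        const COPYDATASTRUCT* cds = reinterpret_cast<COPYDATASTRUCT*>(lp);
        // cds->lpData / cds->cbData carry the payload from the other process.
        return TRUE;
    }
    return DefWindowProcW(hwnd, msg, wp, lp);
}

HWND CreateIpcWindow(HINSTANCE hInstance)
{
    WNDCLASSW wc = {};
    wc.lpfnWndProc = IpcWndProc;
    wc.hInstance = hInstance;
    wc.lpszClassName = L"MyIpcWindowClass";   // hypothetical class name
    RegisterClassW(&wc);

    // HWND_MESSAGE makes this a message-only window: never visible,
    // receives no broadcasts, exists purely to receive sent/posted messages.
    return CreateWindowExW(0, wc.lpszClassName, L"MyIpcEndpoint", 0,
                           0, 0, 0, 0, HWND_MESSAGE, nullptr, hInstance, nullptr);
}

// In the sending process (sketch):
//   HWND target = FindWindowExW(HWND_MESSAGE, nullptr, L"MyIpcWindowClass", nullptr);
//   COPYDATASTRUCT cds = { 0, sizeof(payload), &payload };
//   SendMessage(target, WM_COPYDATA, reinterpret_cast<WPARAM>(senderHwnd),
//               reinterpret_cast<LPARAM>(&cds));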

How to make a Windows touch pad acts like a Mac touch pad?

I am trying to write something for Windows using the WinAPI, so I can make the touchpad do whatever the Mac touchpad does.
I checked with Spy++ which WM_* messages two-finger taps and the like send to the OS, and found that it sends only roughly these:
WM_LBUTTONDOWN
WM_LBUTTONUP
WM_MOUSEHOVER
WM_MOUSEHWHEEL
WM_MOUSELEAVE
WM_MOUSEMOVE
WM_RBUTTONDOWN
WM_RBUTTONUP
When I tried to see what happened when clicking with 2 or 3 fingers, it didn't send any particular message unless I moved them a bit.
Firstly, I would like to start with this:
when 5 fingers go down, show the desktop (as Win+D does).
How do I write something (a driver?) that can detect 5 fingers touching the touchpad simultaneously?
Of course there are no OS messages for this, but maybe I can detect it from some unique combination of existing messages.
If I need to write a driver, can I make it generic so it works with most touchpads? Can I do it as an add-on?
If you can post a good tutorial you are familiar with for writing a Windows driver, please do, because I have no clue about it.
Is there anything else I need to take into account?
1. Detect 5-finger mouse events.
2. Create a thread in Explorer on startup that handles those new mouse messages.
Thanks in advance.
Mouse Input Notifications
In short, you can't.
First, there are touchpads that can physically detect only one finger touch, and for those that can detect many, their drivers do the translation for you.
Windows does not have any inherent support for reading multiple touch inputs - it relies on the touchpad drivers to provide them.
You can achieve your goal for SOME devices by writing your own touchpad driver (probably starting from the Linux touchpad drivers and the Windows driver development kit), but this is far from simple.
And you'll need to do this for each and every touchpad device you want to support (Synaptics, Alps Electric, Cirque, to name a few).
Only after that can you move on to implementing the reactions to those touchpad actions in applications like Explorer.
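As an aside, if the device happens to expose itself as a HID precision touchpad, one thing worth experimenting with is Raw Input registration; this is only a sketch under that assumption (many of the older Synaptics/Alps devices mentioned above do not report this way), and parsing per-finger contacts from the reports still requires the HID parsing APIs.

// Sketch: register for Raw Input from a HID touchpad (usage page 0x0D
// "Digitizer", usage 0x05 "Touch Pad"). Only devices that expose this HID
// collection will deliver WM_INPUT data; per-finger parsing of the reports
// is not shown here.
#include <windows.h>

bool RegisterForTouchpadInput(HWND hwnd)
{
    RAWINPUTDEVICE rid = {};
    rid.usUsagePage = 0x0D;            // HID digitizer usage page
    rid.usUsage     = 0x05;            // touch pad usage
    rid.dwFlags     = RIDEV_INPUTSINK; // receive input even when not focused
    rid.hwndTarget  = hwnd;
    return RegisterRawInputDevices(&rid, 1, sizeof(rid)) == TRUE;
}

// In the window procedure, WM_INPUT then delivers the raw HID reports:
//   case WM_INPUT: {
//       UINT size = 0;
//       GetRawInputData(reinterpret_cast<HRAWINPUT>(lParam), RID_INPUT,
//                       nullptr, &size, sizeof(RAWINPUTHEADER));
//       // allocate 'size' bytes, call GetRawInputData again, then parse
//       // the RAWHID payload with the HidP_* functions to get contacts.
//   }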

Resources