I have a ListView for which I'd like to show a context menu that changes depending on the selection. As a first step, I'm making sure I can display a menu when the right mouse button is released (as per normal context menu behaviour).
In my ListView WNDPROC I'm using WM_CONTEXTMENU to display the context menu. However, the menu is displayed at the location where the cursor began the selection, not where the button was released.
From the MS documentation:
DefWindowProc generates the WM_CONTEXTMENU message when it processes the WM_RBUTTONUP or WM_NCRBUTTONUP message or when the user types SHIFT+F10. The WM_CONTEXTMENU message is also generated when the user presses and releases the VK_APPS key.
When I inspect the call stack, with a breakpoint in WM_CONTEXTMENU, I see that the message sent prior to WM_CONTEXTMENU was 0x0204 (WM_RBUTTONDOWN), containing the coordinates of the cursor at that time. This probably explains the menu location issue, but why would this be happening?
When I hold the RMB down outside of the ListView and release it inside, the context menu still appears, and I can see from the call stack that the last message was 0x0205 (WM_RBUTTONUP).
I'm not sure whether I have something wrong in my code or I'm misunderstanding something. Any help on this issue would be greatly appreciated, thanks.
Rather than relying on the WM_RBUTTON(DOWN|UP) messages to determine the mouse coordinates, use WM_CONTEXTMENU's own lParam, which carries the screen coordinates of the mouse for the message that generated the WM_CONTEXTMENU. If those coordinates are not what you are expecting, you can use GetMessagePos() instead, which reports the screen coordinates at the time WM_CONTEXTMENU was generated. Either way, you can then convert the screen coordinates into ListView client coordinates using ScreenToClient() or MapWindowPoints().
Just be sure you also handle the case where the popup menu is invoked via keyboard input rather than a mouse click. In that case, the lParam of WM_CONTEXTMENU will carry the screen coordinates [x=-1,y=-1]; query the ListView for the position of its selected item(s) using LVM_GETITEMPOSITION or LVM_GETITEMRECT as needed, convert that position to screen coordinates using ClientToScreen() or MapWindowPoints(), and then display the popup menu at that screen location.
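For illustration, here is a minimal sketch of such a handler, assuming it lives in a subclassed ListView window procedure (hPopupMenu is a hypothetical HMENU created elsewhere; GET_X_LPARAM/GET_Y_LPARAM come from <windowsx.h> and the ListView macros from <commctrl.h>):

    case WM_CONTEXTMENU:
    {
        POINT pt = { GET_X_LPARAM(lParam), GET_Y_LPARAM(lParam) };

        if (pt.x == -1 && pt.y == -1)
        {
            /* Invoked via SHIFT+F10 or VK_APPS: anchor the menu on the
             * focused item instead of the (meaningless) cursor position. */
            RECT rc = { 0 };
            int iItem = ListView_GetNextItem(hwnd, -1, LVNI_FOCUSED);
            if (iItem != -1)
                ListView_GetItemRect(hwnd, iItem, &rc, LVIR_BOUNDS);
            pt.x = rc.left;
            pt.y = rc.bottom;
            ClientToScreen(hwnd, &pt);
        }

        /* TrackPopupMenu expects screen coordinates, which is what we have. */
        TrackPopupMenu(hPopupMenu, TPM_RIGHTBUTTON, pt.x, pt.y, 0, hwnd, NULL);
        return 0;
    }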
My question is about combo boxes in Windows MFC applications.
The dropdown part of the combo box contains items composed of a bitmap and a string.
Sometimes, the strings are too long and I have to adjust the width of the dropdown part of the combo box using the CComboBox::SetDroppedWidth() method.
My problem is that when the combo box is near the right edge of the computer screen, the right part of the dropdown is hidden (see image_1 and image_2 below).
I would like it to behave like in Excel (see image_3 below) meaning I would like the dropdown list to be shifted accordingly so that all its items can be seen without being cropped.
How can this be achieved?
image_1: right part of the dropdown is NOT hidden
image_2: near the computer right edge, the right part of the dropdown is hidden
image_3: Excel combo box
=================================================================
EDIT 1
=================================================================
EDIT 2
Ok. I forgot to mention that m_cbXmodels is a CComboBoxEx object. This is why the handles are NULL. I could get the handles via GetComboBoxCtrl()...
Handle the CBN_DROPDOWN notification.
Get the handle for the list control with GetComboBoxInfo.
Now use MoveWindow to adjust the window as needed.
You can get the current screen's work area with MonitorFromWindow plus GetMonitorInfo; see the rcWork member of MONITORINFO. You just need to adjust the left and right coordinates.
EDIT: As you can read in the comments, my approach of resizing in CBN_DROPDOWN comes too early (thanks to zett42); it is not possible to resize the combo box's list part at that point.
But it is possible to post a user-defined message to the same window and reposition the list when that message arrives, as sketched below.
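A minimal sketch of that idea in plain Win32 C, inside the parent window's procedure (WM_REPOSITION_DROPLIST is an arbitrary WM_APP-based message invented for this example; with a CComboBoxEx you would first fetch the inner combo via GetComboBoxCtrl(), as noted above):

    #define WM_REPOSITION_DROPLIST (WM_APP + 1)

    case WM_COMMAND:
        if (HIWORD(wParam) == CBN_DROPDOWN)
        {
            /* Too early to move the list here; defer via a posted message.
             * For CBN_DROPDOWN, lParam is the combo box's window handle. */
            PostMessage(hwnd, WM_REPOSITION_DROPLIST, 0, lParam);
        }
        break;

    case WM_REPOSITION_DROPLIST:
    {
        HWND hCombo = (HWND)lParam;
        COMBOBOXINFO cbi = { sizeof(cbi) };
        if (GetComboBoxInfo(hCombo, &cbi))
        {
            RECT rcList;
            GetWindowRect(cbi.hwndList, &rcList);

            MONITORINFO mi = { sizeof(mi) };
            GetMonitorInfo(MonitorFromWindow(hCombo, MONITOR_DEFAULTTONEAREST), &mi);

            /* If the list sticks out past the monitor's work area, shift it left. */
            if (rcList.right > mi.rcWork.right)
            {
                int x = mi.rcWork.right - (rcList.right - rcList.left);
                SetWindowPos(cbi.hwndList, NULL, x, rcList.top, 0, 0,
                             SWP_NOSIZE | SWP_NOZORDER | SWP_NOACTIVATE);
            }
        }
        return 0;
    }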
I am trying to implement the following feature using C/GTK3/Cairo:
- Left-click on a GtkDrawingArea widget and printf the coordinates (Xo, Yo).
- While keeping the left button down, move the mouse and draw a line connecting (Xo, Yo) to the current mouse position.
- Release the left mouse button and printf("something").
How do I do this? Does anyone know of a good tutorial showing how to handle mouse click-move events?
So far, the best I found was this ZetCode lines example (which shows how to handle mouse click events, but not button-down/move/button-up) and this, which explains how to change the mouse cursor when hovering over a widget.
Thanks
Did you see this GtkDrawingArea demo from the Gtk people? This one is written in C, but there is a Python version of the same program (links updated - thanks #kyuuhachi).
Anyway, in the constructor (__init__), the callbacks are connected to the motion_notify_event.
You also need to connect to the button_press_event and the button_release_event.
Then, on button press, you save the coordinates of the start point (and copy them to the end point too; they are the same for now).
On each motion_notify_event, you delete the previous line (by overwriting), and redraw it to the new end point.
Finally, when the button is released, the line is final.
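Here is a minimal, self-contained C/GTK3 sketch of that approach (assumptions: GTK3's draw/button/motion signals; the line is simply redrawn from the stored start point to the current end point on every repaint):

    /* Minimal C/GTK3 sketch: rubber-band a line on a GtkDrawingArea.
     * Build: gcc line.c `pkg-config --cflags --libs gtk+-3.0` */
    #include <gtk/gtk.h>

    static double x0_, y0_, x1_, y1_;    /* start and current end point */
    static gboolean dragging = FALSE;    /* left button currently held? */
    static gboolean have_line = FALSE;   /* anything to draw yet? */

    static gboolean on_draw(GtkWidget *w, cairo_t *cr, gpointer data)
    {
        if (have_line) {
            cairo_move_to(cr, x0_, y0_);
            cairo_line_to(cr, x1_, y1_);
            cairo_stroke(cr);
        }
        return FALSE;
    }

    static gboolean on_press(GtkWidget *w, GdkEventButton *ev, gpointer data)
    {
        if (ev->button == 1) {           /* left button */
            x0_ = x1_ = ev->x;
            y0_ = y1_ = ev->y;
            dragging = have_line = TRUE;
            g_print("start: (%.0f, %.0f)\n", x0_, y0_);
        }
        return TRUE;
    }

    static gboolean on_motion(GtkWidget *w, GdkEventMotion *ev, gpointer data)
    {
        if (dragging) {
            x1_ = ev->x;
            y1_ = ev->y;
            gtk_widget_queue_draw(w);    /* repaint with the new end point */
        }
        return TRUE;
    }

    static gboolean on_release(GtkWidget *w, GdkEventButton *ev, gpointer data)
    {
        if (ev->button == 1 && dragging) {
            dragging = FALSE;            /* the last drawn line stays final */
            g_print("something\n");
        }
        return TRUE;
    }

    int main(int argc, char *argv[])
    {
        gtk_init(&argc, &argv);
        GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        GtkWidget *area = gtk_drawing_area_new();
        gtk_container_add(GTK_CONTAINER(win), area);

        /* A plain GtkDrawingArea receives no button/motion events by default. */
        gtk_widget_add_events(area, GDK_BUTTON_PRESS_MASK |
                                    GDK_BUTTON_RELEASE_MASK |
                                    GDK_POINTER_MOTION_MASK);

        g_signal_connect(area, "draw", G_CALLBACK(on_draw), NULL);
        g_signal_connect(area, "button-press-event", G_CALLBACK(on_press), NULL);
        g_signal_connect(area, "motion-notify-event", G_CALLBACK(on_motion), NULL);
        g_signal_connect(area, "button-release-event", G_CALLBACK(on_release), NULL);
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);

        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }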
It's much easier if you use a canvas widget, for example GooCanvas, which takes care of most of the updating. You can just update the coordinates of the line object, and it will move itself. Also, you can easily remove lines. The 'algorithm' is similar to the above:
Connect button_press_event, button_release_event, and motion_notify_event to the canvas,
When a button press occurs, create a GooCanvas polyline object, and set its begin and end point,
Update the endpoint on each motion_notify_event
Finalize with a button_release_event.
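A hedged sketch of the GooCanvas variant follows (assuming the GooCanvas 2.x C API, with goo_canvas_polyline_new_line and the "points" property doing the work; it also assumes no scrolling or zoom, so widget and canvas coordinates coincide):

    /* Build: gcc goo.c `pkg-config --cflags --libs goocanvas-2.0` */
    #include <goocanvas.h>

    static GooCanvasItem *line = NULL;  /* the polyline being dragged */
    static double sx, sy;               /* start point of the drag */

    static gboolean on_press(GtkWidget *canvas, GdkEventButton *ev, gpointer d)
    {
        GooCanvasItem *root = goo_canvas_get_root_item(GOO_CANVAS(canvas));
        sx = ev->x;
        sy = ev->y;
        /* Begin and end point coincide at first. */
        line = goo_canvas_polyline_new_line(root, sx, sy, sx, sy,
                                            "stroke-color", "black",
                                            "line-width", 2.0, NULL);
        return TRUE;
    }

    static gboolean on_motion(GtkWidget *canvas, GdkEventMotion *ev, gpointer d)
    {
        if (line) {
            /* Update the endpoint; the canvas redraws the item itself. */
            GooCanvasPoints *pts = goo_canvas_points_new(2);
            pts->coords[0] = sx;    pts->coords[1] = sy;
            pts->coords[2] = ev->x; pts->coords[3] = ev->y;
            g_object_set(line, "points", pts, NULL);
            goo_canvas_points_unref(pts);
        }
        return TRUE;
    }

    static gboolean on_release(GtkWidget *canvas, GdkEventButton *ev, gpointer d)
    {
        line = NULL;                    /* leave the finished line in place */
        return TRUE;
    }

    int main(int argc, char *argv[])
    {
        gtk_init(&argc, &argv);
        GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        GtkWidget *canvas = goo_canvas_new();
        goo_canvas_set_bounds(GOO_CANVAS(canvas), 0, 0, 600, 400);
        gtk_container_add(GTK_CONTAINER(win), canvas);

        gtk_widget_add_events(canvas, GDK_BUTTON_PRESS_MASK |
                                      GDK_BUTTON_RELEASE_MASK |
                                      GDK_POINTER_MOTION_MASK);
        g_signal_connect(canvas, "button-press-event", G_CALLBACK(on_press), NULL);
        g_signal_connect(canvas, "motion-notify-event", G_CALLBACK(on_motion), NULL);
        g_signal_connect(canvas, "button-release-event", G_CALLBACK(on_release), NULL);
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);

        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }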
I'm trying to write a WinForms program that will capture mouse coordinates upon pressing and (more importantly) releasing the middle mouse button.
My form has topmost set to true (so that text in it can always be visible even when it doesn't have focus).
What I'm aiming for is being able to hover the mouse over a game window after my program starts, hit the middle mouse button, and have it record the mouse position for later use.
I can get it to detect when the middle mouse button is clicked inside the form using the MouseUp event (bound to the form itself) but have no clue what I need to do to detect when the middle mouse button is clicked outside my form.
Thanks for any help guys.
I believe what you are after are called low-level hooks. A quick Google search brings up this: Erroneous Mouse Coordinates Returned from Low Level Mouse Hook C#
A Microsoft example of how to do this can be found here: http://support.microsoft.com/kb/318804
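Underneath the C# examples is the Win32 SetWindowsHookEx(WH_MOUSE_LL, ...) API, which the C# code P/Invokes. A minimal sketch of the idea, written here in C (the same calls are what you would declare via P/Invoke):

    /* Global low-level mouse hook that reports the screen coordinates
     * when the middle button is released anywhere on screen. */
    #include <windows.h>
    #include <stdio.h>

    static HHOOK g_hHook;

    static LRESULT CALLBACK MouseProc(int nCode, WPARAM wParam, LPARAM lParam)
    {
        if (nCode == HC_ACTION && wParam == WM_MBUTTONUP)
        {
            MSLLHOOKSTRUCT *info = (MSLLHOOKSTRUCT *)lParam;
            printf("middle button up at (%ld, %ld)\n", info->pt.x, info->pt.y);
        }
        return CallNextHookEx(g_hHook, nCode, wParam, lParam);
    }

    int main(void)
    {
        /* WH_MOUSE_LL needs no DLL, but the installing thread must pump messages. */
        g_hHook = SetWindowsHookEx(WH_MOUSE_LL, MouseProc, GetModuleHandle(NULL), 0);

        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0) > 0)
            DispatchMessage(&msg);

        UnhookWindowsHookEx(g_hHook);
        return 0;
    }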
In the WinAPI, is there a mouse-move notification for the full desktop (the whole screen), rather than for a single window only?
I would like to receive mouse screen coordinates in my main window procedure.
Edit:
What I'm trying to do is get the coordinates of the mouse when dragging from a button in my window to somewhere outside that window.
Not as such, no. If you wanted to do something anywhere on the desktop from within your program, e.g. point somewhere or draw something anywhere, you could capture the mouse and then follow the movement until the mouse button is released. See SetCapture for this.
For an example, see this article on MSDN: Drawing Lines with the Mouse
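The pattern in that article boils down to the following sketch inside a window procedure (assuming the drag starts with a left-button press in your window; GET_X_LPARAM/GET_Y_LPARAM come from <windowsx.h>):

    case WM_LBUTTONDOWN:
        SetCapture(hwnd);            /* route all mouse input here until release */
        return 0;

    case WM_MOUSEMOVE:
        if (GetCapture() == hwnd)
        {
            /* Client coordinates; they can go negative outside the window. */
            POINT pt = { GET_X_LPARAM(lParam), GET_Y_LPARAM(lParam) };
            ClientToScreen(hwnd, &pt);   /* convert to screen coordinates */
            /* ... use pt.x / pt.y ... */
        }
        return 0;

    case WM_LBUTTONUP:
        if (GetCapture() == hwnd)
            ReleaseCapture();
        return 0;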
Otherwise you can always use Windows hooks to follow mouse movements anywhere.
You can set a mouse hook to be notified about all mouse events.
You can use GetCursorPos(), or GetMessagePos(), which returns the coordinates at the time of the last message retrieved by GetMessage().
Windows 7 has the snapping feature which 'snaps' a window to the edges, changing one dimension of the window size to match the same dimension of the screen, and then choosing another size for the other dimension.
I want to detect that it has happened so that in my WindowStyle.None window with custom chrome, I can implement the proper behavior for double clicking the titlebar.
The snapping does not alter the WindowState, so I cannot detect that it has happened. Since only one dimension of the window is set to match the screen, I cannot distinguish between a snapped window and a user-resized window. Does Win7 send out a specific message, or include a flag in the WM_xxxx messages, when it causes a resize? Is the formula for the other dimension (the one not matching the screen) defined anywhere, so that I can check whether both dimensions match those of a snapped window?
Have you tried using Spy++ to watch window messages to see what gets sent in what order? Maybe there's an extra message in there that Windows is using.