Title pretty much says it all: is it possible to change the mouse cursor icon in response to an event in a terminal app (e.g., a mouse click), using the ncurses library or some other library?
For example: I am running xterm under X, and a curses application inside that xterm. I may or may not be sshed into another box.
A user clicks on an element of my curses app -- is it possible to change the mouse cursor icon from a bar to a plus sign in response to the click?
There is some information here but I'd like a more complete resource:
Mouse movement events in NCurses
I don't believe it is. ncurses can read events from the mouse, but it cannot actually change mouse cursor settings. The terminal sends mouse movements and clicks to the ncurses program as escape sequences.
Some terminals, such as PuTTY, will change the cursor to an arrow when a region is clickable; otherwise, a text-selection cursor is shown. But I don't think this is controllable through escape sequences.
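For completeness, reading those events in ncurses is straightforward. A minimal sketch using the standard ncurses mouse API (press q to quit):

    #include <curses.h>

    int main(void)
    {
        MEVENT event;

        initscr();
        cbreak();
        noecho();
        keypad(stdscr, TRUE);              /* required for KEY_MOUSE delivery */
        mousemask(ALL_MOUSE_EVENTS, NULL); /* ask the terminal to report clicks */

        for (;;) {
            int ch = getch();
            if (ch == 'q')
                break;
            if (ch == KEY_MOUSE && getmouse(&event) == OK)
                mvprintw(0, 0, "mouse event at y=%d x=%d", event.y, event.x);
            refresh();
        }

        endwin();
        return 0;
    }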
Related
I have a ListView for which I'd like a context menu that changes depending on the selection. As a first step, I'm making sure I can display a menu when the right mouse button is released (as per normal context menu behaviour).
In my ListView WNDPROC I'm handling WM_CONTEXTMENU to display the context menu. However, the menu is displayed at the location where the cursor began the selection, not where it ended.
From the MS documentation:
DefWindowProc generates the WM_CONTEXTMENU message when it processes the WM_RBUTTONUP or WM_NCRBUTTONUP message or when the user types SHIFT+F10. The WM_CONTEXTMENU message is also generated when the user presses and releases the VK_APPS key.
When I inspect the call stack, with a breakpoint in WM_CONTEXTMENU, I see that the message sent prior to WM_CONTEXTMENU was 0x0204 (WM_RBUTTONDOWN), containing the coordinates of the cursor at that time. This probably explains the menu location issue, but why would it be happening?
When I hold the right mouse button down outside of the ListView and release it inside, the context menu still appears, and I can see from the call stack that the last message was 0x0205 (WM_RBUTTONUP).
Not sure whether I have something wrong in my code, or I'm not understanding something. Any help on this issue would be greatly appreciated, thanks.
Rather than relying on the WM_RBUTTON(DOWN|UP) messages to determine the mouse coordinates, use WM_CONTEXTMENU's own lParam, which gives you the screen coordinates of the mouse for the message that generated the WM_CONTEXTMENU. If those coordinates are not what you expect, you can use GetMessagePos() instead, which reports the screen coordinates at the time WM_CONTEXTMENU was generated. Either way, you can then convert the screen coordinates into ListView client coordinates using ScreenToClient() or MapWindowPoints().
Just be sure you also handle the case where the popup menu is being invoked by the user via keyboard input rather than mouse click. In that case, the lParam of WM_CONTEXTMENU will carry the screen coordinates [x=-1,y=-1], and you can query the ListView for the position of its selected item(s) using LVM_GETITEMPOSITION or LVM_GETITEMRECT as needed, then convert that position to screen coordinates using ClientToScreen() or MapWindowPoints(), and then display the popup menu at that screen location.
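A rough sketch of that handling in C, inside the parent's window procedure; hList (the ListView) and hPopup (the menu) are hypothetical names assumed to be created elsewhere:

    case WM_CONTEXTMENU:
    {
        if ((HWND)wParam != hList)
            break;

        /* GET_X_LPARAM / GET_Y_LPARAM come from <windowsx.h> */
        POINT pt = { GET_X_LPARAM(lParam), GET_Y_LPARAM(lParam) };

        if (pt.x == -1 && pt.y == -1) {
            /* keyboard-invoked (Shift+F10 / VK_APPS): anchor to the focused item */
            int idx = ListView_GetNextItem(hList, -1, LVNI_FOCUSED);
            if (idx != -1) {
                RECT rc;
                ListView_GetItemRect(hList, idx, &rc, LVIR_BOUNDS);
                pt.x = rc.left;
                pt.y = rc.bottom;
                ClientToScreen(hList, &pt);
            }
        }

        TrackPopupMenu(hPopup, TPM_RIGHTBUTTON, pt.x, pt.y, 0, hwnd, NULL);
        return 0;
    }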
I am trying to implement the following feature using C/GTK3/Cairo:
-Left-click on a GtkDrawingArea widget and printf the coordinates Xo and Yo.
-While keeping the left button down, move the mouse and draw a line connecting (Xo,Yo) to the current mouse position.
-Release the left mouse button and printf("something")
How do I do this? Does anyone know of a good tutorial showing how to handle mouse click-move events?
So far, the best I found was this ZetCode lines example (which shows how to handle mouse click events but not button-down/move/button-up) and this, which explains how to change the mouse cursor when hovering over a widget.
Thanks
Did you see this GtkDrawingArea demo from the Gtk people? This one is written in C, but there is a Python version of the same program (links updated - thanks #kyuuhachi).
Anyway, in the constructor (__init__), a callback is connected to the motion_notify_event.
You also need to connect to the button_press_event and the button_release_event.
Then, on button press, you save the coordinates of the start point (and save them as the end point too; they are the same for now).
On each motion_notify_event, you delete the previous line (by overwriting) and redraw it to the new end point.
Finally, when the button is released, the line is final.
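The question asked for C/GTK3, so here is roughly what that wiring could look like there. The signal names and event masks are standard GTK3; everything else is a minimal sketch (the line is simply redrawn in the draw handler on every update):

    #include <gtk/gtk.h>

    /* trailing underscores avoid clashing with y0/y1 from <math.h> */
    static gdouble x0_, y0_, x1_, y1_;
    static gboolean have_line = FALSE, dragging = FALSE;

    static gboolean on_draw(GtkWidget *w, cairo_t *cr, gpointer data)
    {
        if (have_line) {
            cairo_move_to(cr, x0_, y0_);
            cairo_line_to(cr, x1_, y1_);
            cairo_stroke(cr);
        }
        return FALSE;
    }

    static gboolean on_press(GtkWidget *w, GdkEventButton *e, gpointer data)
    {
        if (e->button == 1) {               /* left button */
            have_line = dragging = TRUE;
            x0_ = x1_ = e->x;
            y0_ = y1_ = e->y;
            g_print("pressed at (%.0f, %.0f)\n", e->x, e->y);
        }
        return TRUE;
    }

    static gboolean on_motion(GtkWidget *w, GdkEventMotion *e, gpointer data)
    {
        if (dragging) {
            x1_ = e->x;
            y1_ = e->y;
            gtk_widget_queue_draw(w);       /* repaint: old line gone, new one drawn */
        }
        return TRUE;
    }

    static gboolean on_release(GtkWidget *w, GdkEventButton *e, gpointer data)
    {
        dragging = FALSE;
        g_print("something\n");
        return TRUE;
    }

    int main(int argc, char **argv)
    {
        gtk_init(&argc, &argv);
        GtkWidget *win  = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        GtkWidget *area = gtk_drawing_area_new();
        gtk_container_add(GTK_CONTAINER(win), area);

        /* a GtkDrawingArea receives no button/motion events unless asked */
        gtk_widget_add_events(area, GDK_BUTTON_PRESS_MASK |
                                    GDK_BUTTON_RELEASE_MASK |
                                    GDK_POINTER_MOTION_MASK);

        g_signal_connect(area, "draw",                 G_CALLBACK(on_draw),    NULL);
        g_signal_connect(area, "button-press-event",   G_CALLBACK(on_press),   NULL);
        g_signal_connect(area, "motion-notify-event",  G_CALLBACK(on_motion),  NULL);
        g_signal_connect(area, "button-release-event", G_CALLBACK(on_release), NULL);
        g_signal_connect(win,  "destroy",              G_CALLBACK(gtk_main_quit), NULL);

        gtk_window_set_default_size(GTK_WINDOW(win), 400, 300);
        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }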
It's much easier if you use a canvas widget, for example GooCanvas, which takes care of most of the updating. You can just update the coordinates of the line object, and it will move itself. Also, you can easily remove lines. The 'algorithm' is similar to the one above:
Connect button_press_event, button_release_event, and motion_notify_event to the canvas,
When a button press occurs, create a GooCanvas.Polyline object and set its begin and end points,
Update the endpoint on each motion_notify_event
Finalize with a button_release_event.
I'm trying to write a WinForms program that will capture mouse coordinates upon pressing and (more importantly) releasing the middle mouse button.
My form has topmost set to true (so that text in it can always be visible even when it doesn't have focus).
What I'm aiming for is being able to hover the mouse over a game window after my program starts, hit the middle mouse button, and have it record the mouse position for later use.
I can get it to detect when the middle mouse button is clicked inside the form using the MouseUp event (bound to the form itself), but I have no clue what I need to do to detect when the middle mouse button is clicked outside my form.
Thanks for any help guys.
I believe what you are after are called low-level hooks. A quick Google search brings up this: Erroneous Mouse Coordinates Returned from Low Level Mouse Hook C#
A Microsoft example of how to do this can be found here: http://support.microsoft.com/kb/318804
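The question is about WinForms/C#, but the mechanism underneath is the Win32 WH_MOUSE_LL hook. A bare C sketch of that mechanism (a console program with a message loop, which the hook requires):

    #include <windows.h>
    #include <stdio.h>

    static HHOOK g_hook;

    static LRESULT CALLBACK MouseProc(int code, WPARAM wParam, LPARAM lParam)
    {
        if (code == HC_ACTION && wParam == WM_MBUTTONUP) {
            const MSLLHOOKSTRUCT *m = (const MSLLHOOKSTRUCT *)lParam;
            printf("middle button released at %ld,%ld\n", m->pt.x, m->pt.y);
        }
        return CallNextHookEx(g_hook, code, wParam, lParam);
    }

    int main(void)
    {
        MSG msg;

        /* WH_MOUSE_LL is global: events arrive even when the cursor is
           over other applications' windows */
        g_hook = SetWindowsHookEx(WH_MOUSE_LL, MouseProc, GetModuleHandle(NULL), 0);
        if (!g_hook)
            return 1;

        while (GetMessage(&msg, NULL, 0, 0) > 0) {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        UnhookWindowsHookEx(g_hook);
        return 0;
    }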
In the WinAPI, is there a mouse-move notification for the full desktop (the whole screen), not just for a window?
I would like to receive mouse screen coordinates in my main window procedure.
Edit:
What I'm trying to do is get the mouse coordinates when dragging from a button in my window to a point outside that window.
Not as such, no. If you wanted to do something anywhere on the desktop from within your program, e.g. point somewhere or draw something anywhere, you could capture the mouse and then follow the movement until the mouse button is released. See SetCapture for this.
For an example, see this article on MSDN: Drawing Lines with the Mouse
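In outline, the capture approach looks like this inside the window procedure (a minimal sketch; GET_X_LPARAM/GET_Y_LPARAM come from <windowsx.h>):

    case WM_LBUTTONDOWN:
        SetCapture(hwnd);   /* mouse messages now arrive even outside the window */
        return 0;

    case WM_MOUSEMOVE:
        if (GetCapture() == hwnd) {
            /* coordinates are client-relative and may be negative outside */
            POINT pt = { GET_X_LPARAM(lParam), GET_Y_LPARAM(lParam) };
            ClientToScreen(hwnd, &pt);
            /* pt now holds the screen coordinates */
        }
        return 0;

    case WM_LBUTTONUP:
        ReleaseCapture();   /* capture ends when the button is released */
        return 0;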
Otherwise you can always use Windows hooks to follow mouse movements anywhere.
You can set a mouse hook to be notified about all mouse events.
You can use GetCursorPos, or GetMessagePos, which gives the cursor coordinates at the time of the last message.
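Both are one-liners; the difference is the moment they sample. GetCursorPos reads the position now, while GetMessagePos reports it as of the last retrieved message (GET_X_LPARAM/GET_Y_LPARAM are from <windowsx.h>):

    POINT pt;
    GetCursorPos(&pt);            /* current cursor position, screen coordinates */

    DWORD pos = GetMessagePos();  /* position at the time of the last message */
    POINT msgPt = { GET_X_LPARAM(pos), GET_Y_LPARAM(pos) };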
I would like to be able to draw anywhere on the screen, so I think I should create a transparent, fullscreen, undecorated window.
The problem is, all events pass through the window. I'd like to catch a mouse-move event and use it.
Any ideas? Might I be able to do this in a higher-level library?
Simply take a screen capture and make a full-screen window filled with those pixels. You won't get live updates, but you will be able to process the mouse and keyboard events however you like.
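A rough sketch of the screenshot step under that approach, using plain GDI (error handling omitted); the resulting bitmap can then be blitted back in the full-screen window's WM_PAINT handler:

    #include <windows.h>

    /* Copy the whole virtual screen into a bitmap that a full-screen
       window can later paint as its background. */
    HBITMAP capture_screen(void)
    {
        int x = GetSystemMetrics(SM_XVIRTUALSCREEN);
        int y = GetSystemMetrics(SM_YVIRTUALSCREEN);
        int w = GetSystemMetrics(SM_CXVIRTUALSCREEN);
        int h = GetSystemMetrics(SM_CYVIRTUALSCREEN);

        HDC screen  = GetDC(NULL);                        /* DC for the screen */
        HDC mem     = CreateCompatibleDC(screen);
        HBITMAP bmp = CreateCompatibleBitmap(screen, w, h);

        HGDIOBJ old = SelectObject(mem, bmp);
        BitBlt(mem, 0, 0, w, h, screen, x, y, SRCCOPY);   /* grab the pixels */
        SelectObject(mem, old);

        DeleteDC(mem);
        ReleaseDC(NULL, screen);
        return bmp;   /* caller frees with DeleteObject() */
    }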