Enabling the mouse in Allegro 5 - C

I am starting to use Allegro in my C program, but I'm having difficulty creating buttons. I am using this kind of logic:
if (event.type == ALLEGRO_EVENT_MOUSE_BUTTON_UP)
{
    if ((event.mouse.x >= 442) && (event.mouse.x <= 471) &&
        (event.mouse.y >= 202) && (event.mouse.y <= 238))
    {
        dig = '1';
        entr = 1;
    }
}
But these areas defined by the coordinates are not 'clickable'. Does anybody here have a tip about the type of command I should use?

I can only guess what was wrong, but there is no answer yet, so I will provide some tips about the possible problem. Your thread is titled enabling the mouse in allegro 5, so I can only assume you're not getting mouse input.
1) You need to install the mouse driver before you will get any mouse input:
if (!al_install_mouse()) {Fail();}
2) The mouse needs to be registered with your event queue.
al_register_event_source(event_queue , al_get_mouse_event_source());
In a typical GUI, most buttons are only considered 'pressed' if they receive both a mouse-button-down event over their click area AND a mouse-button-up event over the same area. This way you don't get a button press when you click on something else, move the mouse over your button, and release it. It also prevents a press from registering when you press the mouse on the button, move it off the click area, and then release it.
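For completeness, here is a minimal, self-contained sketch of that down-then-up pattern, using the rectangle from the question as the only button. It is only an illustration, not your program: the 442/471/202/238 coordinates, the inside() helper, and the button_down flag are placeholder names for your own layout and state.

/* Sketch: a button counts as clicked only when both the press and the
 * release land inside its rectangle.  Link with -lallegro. */
#include <allegro5/allegro.h>
#include <stdbool.h>
#include <stdio.h>

static bool inside(int x, int y)
{
    return x >= 442 && x <= 471 && y >= 202 && y <= 238;
}

int main(void)
{
    if (!al_init() || !al_install_mouse())
        return 1;

    ALLEGRO_DISPLAY *display = al_create_display(640, 480);
    ALLEGRO_EVENT_QUEUE *queue = al_create_event_queue();
    al_register_event_source(queue, al_get_mouse_event_source());
    al_register_event_source(queue, al_get_display_event_source(display));

    bool button_down = false;
    bool running = true;
    while (running)
    {
        ALLEGRO_EVENT event;
        al_wait_for_event(queue, &event);

        if (event.type == ALLEGRO_EVENT_DISPLAY_CLOSE)
        {
            running = false;
        }
        else if (event.type == ALLEGRO_EVENT_MOUSE_BUTTON_DOWN &&
                 event.mouse.button == 1 &&
                 inside(event.mouse.x, event.mouse.y))
        {
            button_down = true;              /* press started on the button */
        }
        else if (event.type == ALLEGRO_EVENT_MOUSE_BUTTON_UP &&
                 event.mouse.button == 1)
        {
            if (button_down && inside(event.mouse.x, event.mouse.y))
                puts("button '1' clicked");  /* e.g. dig = '1'; entr = 1;    */
            button_down = false;
        }
    }

    al_destroy_event_queue(queue);
    al_destroy_display(display);
    return 0;
}

The press arms the button; only a release inside the same rectangle actually fires it.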

Related

Detect mouse click events while ignoring mouse movements in a Windows console

I have a WSAWaitForMultipleEvents-based loop that (in addition to other stuff) triggers on keyboard and mouse events. First I set:
SetConsoleMode(GetStdHandle(STD_INPUT_HANDLE), ENABLE_EXTENDED_FLAGS | ENABLE_PROCESSED_INPUT | ENABLE_MOUSE_INPUT);
And then using:
GetStdHandle(STD_INPUT_HANDLE);
as an event added to WSAWaitForMultipleEvents.
Now this is working fine, but I'm really only interested in the mouse position when clicked. Naturally, it triggers on mouse movement as well. Is there any way of excluding mouse movement from the event so that WSAWaitForMultipleEvents will only wake on a mouse click and ignore mouse movement?
No. Using ENABLE_MOUSE_INPUT means that the console handle satisfies the wait on any mouse activity within the console window while the window is focused. That includes both mouse movements and mouse clicks. The documentation even says as much. So, you will just have to receive and discard the console events that you are not interested in.
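As a rough sketch of that receive-and-discard approach (plain console API calls; the drain_console_input() name and the left-button filter are only illustrative), once the wait is satisfied you can read everything that is pending on the GetStdHandle(STD_INPUT_HANDLE) handle and keep only the click records:

/* Sketch: drain the console input buffer and keep only mouse clicks,
 * ignoring MOUSE_MOVED records. */
#include <windows.h>
#include <stdio.h>

static void drain_console_input(HANDLE hIn)
{
    INPUT_RECORD records[64];
    DWORD count = 0, i;

    while (GetNumberOfConsoleInputEvents(hIn, &count) && count > 0)
    {
        if (!ReadConsoleInput(hIn, records, 64, &count))
            break;
        for (i = 0; i < count; i++)
        {
            if (records[i].EventType != MOUSE_EVENT)
                continue;                               /* keyboard etc.    */
            if (records[i].Event.MouseEvent.dwEventFlags & MOUSE_MOVED)
                continue;                               /* discard movement */
            if (records[i].Event.MouseEvent.dwButtonState &
                FROM_LEFT_1ST_BUTTON_PRESSED)
            {
                COORD pos = records[i].Event.MouseEvent.dwMousePosition;
                printf("click at %d,%d\n", pos.X, pos.Y);
            }
        }
    }
}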

How to handle mouse motion events in GTK3?

I am trying to implement the following feature using C/GTK3/Cairo:
- Left-click on a GtkDrawingArea widget and printf the coordinates Xo and Yo.
- While keeping the left button down, move the mouse and draw a line connecting (Xo, Yo) to the current mouse position.
- Release the left mouse button and printf("something").
How do I do this? Does anyone know of a good tutorial showing how to handle mouse click-move events?
So far, the best I found was this ZetCode lines example (which shows how to handle mouse click events but not button-down/move/button-up) and this, which explains how to change the mouse cursor when hovering over a widget.
Thanks
Did you see this GtkDrawingArea demo from the Gtk people? This one is written in C, but there is a Python version of the same program (links updated - thanks #kyuuhachi).
Anyway, in the constructor (__init__), callbacks are connected to the motion_notify_event signal.
You also need to connect to the button_press_event and the button_release_event.
Then, on button press, you save the coordinates of the start point (and copy them to the end point too, which is the same for now).
On each motion_notify_event, you delete the previous line (by overwriting), and redraw it to the new end point.
Finally, when the button is released, the line is final.
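Since the question asks for C/GTK3/Cairo, here is a rough, self-contained sketch of those steps in C. It is not the demo itself: the callback names and the simple "redraw the whole area" approach are my own. Build with gcc sketch.c $(pkg-config --cflags --libs gtk+-3.0).

#include <gtk/gtk.h>

static double x0_pt, y0_pt, x1_pt, y1_pt;
static gboolean dragging = FALSE;   /* left button currently held */
static gboolean have_line = FALSE;  /* there is something to draw */

static gboolean on_draw(GtkWidget *w, cairo_t *cr, gpointer data)
{
    if (have_line) {
        cairo_move_to(cr, x0_pt, y0_pt);
        cairo_line_to(cr, x1_pt, y1_pt);
        cairo_stroke(cr);
    }
    return FALSE;
}

static gboolean on_press(GtkWidget *w, GdkEventButton *e, gpointer data)
{
    if (e->button == 1) {                       /* left button */
        x0_pt = x1_pt = e->x;
        y0_pt = y1_pt = e->y;
        dragging = TRUE;
        have_line = TRUE;
        g_print("start at (%.0f, %.0f)\n", e->x, e->y);
    }
    return TRUE;
}

static gboolean on_motion(GtkWidget *w, GdkEventMotion *e, gpointer data)
{
    if (dragging) {
        x1_pt = e->x;
        y1_pt = e->y;
        gtk_widget_queue_draw(w);               /* redraw with new end point */
    }
    return TRUE;
}

static gboolean on_release(GtkWidget *w, GdkEventButton *e, gpointer data)
{
    if (e->button == 1 && dragging) {
        dragging = FALSE;
        g_print("something\n");                 /* the line is now final */
    }
    return TRUE;
}

int main(int argc, char **argv)
{
    gtk_init(&argc, &argv);
    GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *area = gtk_drawing_area_new();
    gtk_window_set_default_size(GTK_WINDOW(win), 400, 300);
    gtk_container_add(GTK_CONTAINER(win), area);

    /* The drawing area receives no button/motion events by default. */
    gtk_widget_add_events(area, GDK_BUTTON_PRESS_MASK |
                                GDK_BUTTON_RELEASE_MASK |
                                GDK_POINTER_MOTION_MASK);

    g_signal_connect(area, "draw", G_CALLBACK(on_draw), NULL);
    g_signal_connect(area, "button-press-event", G_CALLBACK(on_press), NULL);
    g_signal_connect(area, "motion-notify-event", G_CALLBACK(on_motion), NULL);
    g_signal_connect(area, "button-release-event", G_CALLBACK(on_release), NULL);
    g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    gtk_widget_show_all(win);
    gtk_main();
    return 0;
}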
It's much easier if you use a canvas widget, for example GooCanvas, which takes care of most of the updating. You can just update the coordinates of the line object, and it will move itself. Also, you can easily remove lines. The 'algorithm' is similar to the above:
Connect button_press_event, button_release_event, and motion_notify_event to the canvas,
When a button press occurs, create a GooCanvas.polyline object and set its begin and end points,
Update the end point on each motion_notify_event,
Finalize with a button_release_event.

OpenTK makes touching button raise click event differently than mouse

What I am trying to do: I am currently hosting an OpenTK glControl (through a WindowsFormsHost) in my WPF application. The application has many buttons, but we will focus on the pause and play buttons. I use VBOs, and this is how I animate: GL.DrawArrays(PrimitiveType.LineStrip, 0, frameCount);. It draws the range of vertices from 0 to frameCount, so when I hit play, it just starts incrementing frameCount, which animates the drawing. When just using a mouse, everything works perfectly.
The Problem: My app needs to work with a touch screen as well (FWIW, when I say touch, WPF is seeing it as a stylus, not touch). When the app is NOT animating, touch works like it should: I don't handle touch events, so touching buttons just raises the Click event. If we aren't animating, I don't have any problems. When I use my mouse to Click the play button, the UI still responds to my mouse (clicks work, hovering changes colors as it should, etc.), but then touches seemingly get ignored for some period of time. I have to touch a button 3 to 5 times before it does what it is supposed to (like pause). Let's go back and say we aren't animating; this time, if I TOUCH the play button to raise the Click event, it will start animating, but my touches now have the same problem of needing multiple touches to raise the Click event, and on top of that, the UI no longer responds to the mouse (the WindowsFormsHost still reacts to mouse events properly). Clicking on buttons or hovering over buttons doesn't do anything. It is not until I can get the animation to pause again that the UI starts responding to the mouse again.
What I have tried and what (I think) I know: When the UI stops responding to mouse input and touch input is messed up, if I use MOUSE or TOUCH to click anywhere outside of my app, it works as expected. This leads me to believe it is not a touch screen driver problem or anything like that. I also used Snoop to see what events were being raised (or not), to see the stylus and the mouse being captured and released, and what has focus. I could not find an instance where one was not released. If anyone would like, I can post Snoop results showing the differences between mouse and touch clicking. I tried putting a PreviewMouseDown handler on the MainWindow, but after TOUCHING play, no mouse clicks raise this event, and touches after touching the play button only randomly fire it. I also tried hiding the WindowsFormsHost, and mouse and touch clicks all work like they should, minus the glControl. Because I hide the WindowsFormsHost, the paint event never gets fired, leading me to believe there might be a problem with the paint function. I am aware of this bug (WPF Touch Bug), however I don't really think that this is my problem. Something tells me the interop is what is giving me problems, but I am not sure.
My Code:
private void playFwdFunc()
{
    //disable undrawing and enable drawing
    undraw = false;
    draw = true;
    //unpause animation
    paused = false;
    //enable/disable appropriate buttons
    pauseBtn.IsEnabled = true;
    stepBackBtn.IsEnabled = false;
    stepFwdBtn.IsEnabled = false;
    clearStart.IsEnabled = true;
    clearCurrent.IsEnabled = true;
    //refresh control
    glControl1.Invalidate();
}

private void playFwdClick(object sender, RoutedEventArgs e)
{
    playFwdFunc();
}
In my glPaint event:
if (frameCount < vertices.Length && !paused)
{
    //draw more vertices
    if (draw)
    {
        if (0 < (int)(vertices.Length / (25000 / speedTrack.Value)))
            frameCount += (int)(vertices.Length / (25000 / speedTrack.Value));
        else
            frameCount++;
    }
    //draw less vertices
    else if (undraw)
    {
        if (0 < (int)(vertices.Length / (25000 / speedTrack.Value)))
            frameCount -= (int)(vertices.Length / (25000 / speedTrack.Value));
        else
            frameCount--;
    }
    //make sure we dont have a negative framecount (null pointer)
    if (frameCount < 0)
    {
        paused = true;
        frameCount = 0;
        // manageCodeBox();
    }
    //make sure we dont exceed # of vertices (null pointer)
    else if (frameCount > vertices.Length)
    {
        frameCount = vertices.Length;
        paused = true;
        // manageCodeBox();
    }
}
My Question: Why is clicking a button with mouse making the UI react differently than touching a button, even though they raise the same event? Any input is appreciated.
So I found what was causing the behavior: in my paint function I was constantly raising an event that called glControl1.Invalidate().
My Best Guess: My only real guess is that certain events like touches, which occur on the stylus thread, somehow get into a live-lock with the UI thread when calling glControl1.Invalidate(). I think that is why some touches got accepted but the UI thread never reached the mouse event listeners. As I said: best guess.
My Workaround: I originally made a DispatcherTimer to call glControl1.Invalidate(). Mouse events worked fine, but because touches got promoted to mouse clicks, there was a delay unless I handled the touch event (which I did not want to do for each button on my UI). My next solution seems to be working so far.
In my window's constructor: CompositionTarget.Rendering += invalidateProcessor;
The function:
private void invalidateProcessor(object sender, EventArgs e)
{
    Dispatcher.BeginInvoke(new Action(() => { if (!paused) glControl1.Invalidate(); }), DispatcherPriority.Background);
}
I chose DispatcherPriority.Background because the higher priorities cause a delay in the touches again.

How to capture mouse coordinates from another program

I'm trying to write a WinForms program that will capture mouse coordinates upon pressing and (more importantly) releasing the middle mouse button.
My form has topmost set to true (so that text in it can always be visible even when it doesn't have focus).
What I'm aiming for is being able to hover the mouse over a game window after my program starts, hit the middle mouse button, and have it record the mouse position for later use.
I can get it to detect when the middle mouse button is clicked inside the form using the MouseUp event (bound to the form itself), but I have no clue what I need to do to detect when the middle mouse button is clicked outside my form.
Thanks for any help guys.
I believe what you are after is called a low-level hook. A quick Google search brings up this: Erroneous Mouse Coordinates Returned from Low Level Mouse Hook C#
A Microsoft example of how to do this can be found here: http://support.microsoft.com/kb/318804
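The linked examples are in C#, but purely to illustrate the mechanism, here is a rough Win32 C sketch of a global low-level mouse hook that reports where the middle button is released; the same calls are what the C# examples P/Invoke.

/* Sketch: WH_MOUSE_LL sees mouse events system-wide, including clicks
 * outside our own windows. */
#include <windows.h>
#include <stdio.h>

static HHOOK g_hook;

static LRESULT CALLBACK MouseProc(int nCode, WPARAM wParam, LPARAM lParam)
{
    if (nCode == HC_ACTION && wParam == WM_MBUTTONUP) {
        MSLLHOOKSTRUCT *info = (MSLLHOOKSTRUCT *)lParam;
        printf("middle button released at (%ld, %ld)\n",
               info->pt.x, info->pt.y);
    }
    return CallNextHookEx(g_hook, nCode, wParam, lParam);
}

int main(void)
{
    MSG msg;
    g_hook = SetWindowsHookEx(WH_MOUSE_LL, MouseProc, GetModuleHandle(NULL), 0);
    if (!g_hook)
        return 1;

    /* A message loop is required for the low-level hook to be called. */
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    UnhookWindowsHookEx(g_hook);
    return 0;
}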

Mouse movement events in NCurses

I wonder if there is such a thing as mouse movement events in NCurses, and if there is a way to catch them. Following the "Interfacing with the mouse" section of the NCurses Programming HOWTO, it seems that by enabling the REPORT_MOUSE_POSITION bit in the call to mousemask, one can indeed catch mouse movement events.
So, I tried that and it does not seem to work. I have something like this:
int ch, count = 0;
mmask_t old;

initscr ();
noecho ();
cbreak ();
mousemask (ALL_MOUSE_EVENTS | REPORT_MOUSE_POSITION, &old);
keypad (stdscr, TRUE);

while ((ch = getchar ()) != 'q')
{
    count++;
    if (ch == KEY_MOUSE)
    {
        MEVENT event;
        assert (getmouse (&event) == OK);
        mvprintw (0, 0, "Mouse Event!\n");
    }
    mvprintw (1, 1, "Event number %4d", count);
}
...
I expected that as I move my mouse cursor, I would see the event counter increasing, but it didn't. I also tried moving it while mouse button 1 is down to see if that generates "drag" events, and that also didn't do anything. The question is whether it's simply a problem with my terminal emulator, or maybe I'm misunderstanding what NCurses considers mouse movement events? All the other mouse events were received (and I can operate programs in the console that use the mouse).
I tried gnome-terminal, xterm, and some other stuff. I also tried a textual environment (without X) by going to the ttys of my Linux machine (Fedora 15, Ctrl+Alt+F2), and that did not work either.
Finally, assuming I do get this right and those events should be reported, what is the bstate field of a MEVENT for a mouse movement event?
Many thanks in advance!
You need:
1) a terminal which supports mouse event reporting;
2) $TERM pointing to a terminfo entry which has an appropriate XM entry to initialise the terminal correctly.
xterm at least satisfies (1); for (2), it's likely that you'll need to set a different value for TERM.
Try:
TERM=xterm-1002 to get a position event when the cursor moves to a different cell while a button is being held down; or
TERM=xterm-1003 to always get a position event whenever the cursor moves to a different cell, even if no button is pressed.
The resulting events have the REPORT_MOUSE_POSITION bit set on the bstate field.
(The "PORTABILITY" section of the curs_mouse(3x) man page describes the terminal initialisation, and the "Mouse Tracking" section of the Xterm Control Sequences documentation describes the relevant "private mode" extensions.)
The code that you've given above needs to use getch(), not getchar(); and needs a refresh() inside the loop! Other than that, it works for me with xterm when using one of the appropriate TERM settings.
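For reference, a corrected version of the loop above might look like this. It is only a sketch: it assumes you run it with an appropriate TERM such as xterm-1003 so that motion events are reported at all, and it should be linked with -lncurses.

#include <curses.h>

int main(void)
{
    int ch, count = 0;

    initscr();
    noecho();
    cbreak();
    keypad(stdscr, TRUE);                 /* required to receive KEY_MOUSE */
    mousemask(ALL_MOUSE_EVENTS | REPORT_MOUSE_POSITION, NULL);

    while ((ch = getch()) != 'q')         /* getch(), not getchar()        */
    {
        count++;
        if (ch == KEY_MOUSE)
        {
            MEVENT event;
            if (getmouse(&event) == OK)
            {
                if (event.bstate & REPORT_MOUSE_POSITION)
                    mvprintw(0, 0, "Motion at %3d,%3d   ", event.x, event.y);
                else
                    mvprintw(0, 0, "Button event        ");
            }
        }
        mvprintw(1, 1, "Event number %4d", count);
        refresh();                        /* the missing refresh()         */
    }

    endwin();
    return 0;
}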
