How to detect multiple mobile inputs at the same time with Godot? - mobile

In the game I am currently making, the player will frequently need to press two buttons at the same time, for example "jump" and "left". Currently I have two buttons on the left side of the screen for left and right, and one on the right for jump. The problem is that if the player is pressing any of them, it effectively locks the other buttons, so you can't press them.
Is there some way, perhaps using InputEventScreenTouch, that I can detect the user's current touchscreen input every frame, and check if they are pressing one or more buttons?
Edit for more info:
Currently each button sends its button_down() and button_up() signals to the same function in the HUD scene root, and I have simple functions to handle that:
func setLeft(val):
    # This is the variable that the player
    # checks each frame to see if it should move
    self.get_parent().get_parent().goingLeft = val
So when the player starts pressing the left button, it sets goingLeft to true, and when they release it it sets goingLeft to false.
This should work fine, but I have discovered that if the player is touching the screen anywhere else, the buttons don't register input. For example, if I press the left button, the player starts going left, and then while I am holding that, I start pressing jump. The jump button doesn't register, because my finger is already pressing somewhere else on the screen (the left one).
It doesn't seem to matter whether the thing I am already pressing is a button; Godot only seems to register button input when there is exactly one touch on the screen.
Is there some way to get around this?
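For illustration, a rough sketch of the InputEventScreenTouch idea mentioned above (the touches dictionary and the way it is used are just placeholders, not code from the project):

var touches = {}  # index -> position of every finger currently down

func _input(event):
    if event is InputEventScreenTouch:
        if event.pressed:
            touches[event.index] = event.position
        else:
            touches.erase(event.index)
    elif event is InputEventScreenDrag:
        touches[event.index] = event.position

Each frame, every button could then test whether any position in touches falls inside its rectangle, instead of relying on a single press.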

I have found the solution, thanks to some people in the comments!
I changed the button types to TouchScreenButton nodes, and since they handle multitouch, I can press more than one at once on a mobile device!
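A minimal sketch of how the TouchScreenButton version can be wired up, assuming Godot 3.x and placeholder node names (LeftButton, Player):

extends CanvasLayer

onready var player = get_parent().get_node("Player")  # placeholder path

func _ready():
    $LeftButton.connect("pressed", self, "_set_left", [true])
    $LeftButton.connect("released", self, "_set_left", [false])

func _set_left(val):
    player.goingLeft = val

Unlike a regular Button, TouchScreenButton responds to raw touch events, so several of them can be held down at the same time.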

Related

How to handle mouse motion events in GTK3?

I am trying to implement the following feature using C/GTK3/Cairo:
- Left click on a GtkDrawingArea widget and printf the coordinates (Xo, Yo).
- While keeping the left button down, move the mouse and draw a line connecting (Xo, Yo) to the current mouse position.
- Release the left mouse button and printf("something").
How do I do this? Does anyone know of a good tutorial showing how to handle mouse click-move events?
So far, the best I found was this ZetCode lines example (which shows how to handle mouse click events, but not button-down/move/button-up) and this, which explains how to change the mouse cursor when hovering over a widget.
Thanks
Did you see this GtkDrawingArea demo from the Gtk people? This one is written in C, but there is a Python version of the same program (links updated - thanks #kyuuhachi).
Anyway, in the constructor (__init__), a handler is connected to the motion_notify_event.
You also need to connect to the button_press_event and the button_release_event.
Then, on button press, you save the coordinates of the start point (and copy them to the end point, which is the same for now).
On each motion_notify_event, you delete the previous line (by overwriting it) and redraw it to the new end point.
Finally, when the button is released, the line is final.
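A minimal, self-contained sketch of that approach in C/GTK3/Cairo (untested; widget and variable names are illustrative):

#include <gtk/gtk.h>

static double start_x, start_y, end_x, end_y;
static gboolean dragging = FALSE;

static gboolean on_press(GtkWidget *w, GdkEventButton *ev, gpointer data) {
    if (ev->button == 1) {                      /* left button */
        start_x = end_x = ev->x;
        start_y = end_y = ev->y;
        dragging = TRUE;
        g_print("start: (%.0f, %.0f)\n", ev->x, ev->y);
    }
    return TRUE;
}

static gboolean on_motion(GtkWidget *w, GdkEventMotion *ev, gpointer data) {
    if (dragging) {
        end_x = ev->x;
        end_y = ev->y;
        gtk_widget_queue_draw(w);               /* redraw with the new end point */
    }
    return TRUE;
}

static gboolean on_release(GtkWidget *w, GdkEventButton *ev, gpointer data) {
    if (ev->button == 1) {
        dragging = FALSE;
        g_print("something\n");
    }
    return TRUE;
}

static gboolean on_draw(GtkWidget *w, cairo_t *cr, gpointer data) {
    cairo_move_to(cr, start_x, start_y);
    cairo_line_to(cr, end_x, end_y);
    cairo_stroke(cr);
    return FALSE;
}

int main(int argc, char **argv) {
    gtk_init(&argc, &argv);
    GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *area = gtk_drawing_area_new();
    gtk_container_add(GTK_CONTAINER(win), area);
    gtk_widget_add_events(area, GDK_BUTTON_PRESS_MASK |
                                GDK_BUTTON_RELEASE_MASK |
                                GDK_POINTER_MOTION_MASK);
    g_signal_connect(area, "button-press-event", G_CALLBACK(on_press), NULL);
    g_signal_connect(area, "motion-notify-event", G_CALLBACK(on_motion), NULL);
    g_signal_connect(area, "button-release-event", G_CALLBACK(on_release), NULL);
    g_signal_connect(area, "draw", G_CALLBACK(on_draw), NULL);
    g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(win);
    gtk_main();
    return 0;
}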
It's much easier if you use a canvas widget, for example GooCanvas, which takes care of most of the updating. You can just update the coordinates of the line object and it will move itself. You can also easily remove lines. The 'algorithm' is similar to the one above:
Connect button_press_event, button_release_event, and motion_notify_event to the canvas,
When a button press occurs, create a GooCanvas.polyline object and set its start and end points,
Update the end point on each motion_notify_event,
Finalize on the button_release_event.

cocos2d-x v3.x: run specific code while the touch screen is pressed and not released

I am developing a mobile game using the cocos2d-x framework in C++. The platforms I target are iOS and Android.
I want to run some code while a user is pressing a sprite, for as long as they hold the press without moving the finger. I went through the documentation and looked at the different callbacks (onTouchBegan, onTouchMoved, onTouchEnded) but couldn't find a way to solve my problem.
In fact I have sprites drawn on the screen to simulate a directional controller. I want to move a character as long as the user is pressing a directional sprite.
Is there a way to run some code as long as a sprite is pressed ?
I would try something like this (class and member names are placeholders):

bool GameLayer::onTouchBegan(Touch* touch, Event* event)
{
    // let the player move, e.g. set some flag
    // (optionally check here that the touch actually hit the sprite)
    _moving = true;
    return true;   // claim the touch so onTouchMoved/onTouchEnded fire
}

void GameLayer::onTouchMoved(Touch* touch, Event* event)
{
    // query how far the finger has moved and cancel movement above a
    // certain threshold, or alternatively check whether the finger is
    // now outside the bounding box of your sprite
    if (!_padSprite->getBoundingBox().containsPoint(touch->getLocation()))
        _moving = false;
}

void GameLayer::onTouchEnded(Touch* touch, Event* event)
{
    // stop movement if not already stopped before
    _moving = false;
}
In your update function you can then simply check the moving flag and execute your code as long as it is true. Is that what you meant?
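For completeness, a sketch of how the listener might be registered and the flag polled each frame (again, the class and member names are placeholders; this assumes the standard cocos2d-x v3 event dispatcher):

// in GameLayer::init()
auto listener = EventListenerTouchOneByOne::create();
listener->onTouchBegan = CC_CALLBACK_2(GameLayer::onTouchBegan, this);
listener->onTouchMoved = CC_CALLBACK_2(GameLayer::onTouchMoved, this);
listener->onTouchEnded = CC_CALLBACK_2(GameLayer::onTouchEnded, this);
_eventDispatcher->addEventListenerWithSceneGraphPriority(listener, this);
this->scheduleUpdate();

// called once per frame while the scene is running
void GameLayer::update(float dt)
{
    if (_moving)
    {
        // execute your per-frame movement code here
    }
}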

(Unity3d) How do I make a "move pad" on the screen? (Mobile)

In my game you can control the character by moving left and right, jumping and attacking. (This is a mobile game.) I have a button that I use to jump and attack, which is easy because I just make a button and jump or attack with OnClick(). But for moving, I don't know how to find out if the user is pressing the button; I only know when it is clicked. How can I find this out? Thanks.
If you don't understand what I'm trying to say, basically here is my web game: http://dugelstudios.weebly.com/weapon-plus-plus.html
(Does not work in Chrome; use Safari or Internet Explorer.)
I am porting it to mobile, and I don't know how to make the player move left and right with touch controls.
You can use other MonoBehaviour methods such as OnMouseOver to check if a button is pressed, OnMouseEnter when a user begins to press a button, and OnMouseExit to check if a user has released the button.
You can also use OnMouseUpAsButton to mimic the behaviour of Button.onClick
For dragging movements (for example, if you have a thumbstick or something similar for movement), you can use OnMouseDrag.
Also, completely unrelated to your question, but something you have mentioned: you can enable NPAPI to get WebPlayer builds running in Chrome.
Just paste chrome://flags/#enable-npapi in a new tab in Chrome and click the "Enable" button to get it running.
I believe there is a thumbstick asset in the standard Unity asset pack that's available on the store.

Clear a character on screen with ncurses

My program currently allows the user to draw the $ character on the screen with ncurses initialized when a key is pressed.
mvaddch(y,x,'$');
I also have a box drawn, and after the user presses a specific key I want the $ to be erased and redrawn at whatever new position the user moves it to, without erasing the entire screen. I tried using erase(), but that cleared the whole screen, and I don't want that; I want to keep the box that was drawn. So how would I do this?
The usual way to approach this is to create windows on the screen, e.g., with newwin or subwin, and create the box and '$' in different windows while using clear/wclear to clear the appropriate places on the screen.
Keep in mind that getch/wgetch does a refresh on the window with which it is associated, so that may overwrite updates from overlapping windows.
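A small sketch of the two-window idea in C (window sizes and coordinates are illustrative):

#include <ncurses.h>

int main(void) {
    initscr();
    noecho();
    curs_set(0);

    WINDOW *boxwin = newwin(20, 40, 1, 1);   /* holds the box border */
    WINDOW *inner  = newwin(18, 38, 2, 2);   /* holds the '$' marker */
    box(boxwin, 0, 0);
    wrefresh(boxwin);

    int y = 5, x = 5;
    mvwaddch(inner, y, x, '$');
    wrefresh(inner);

    wgetch(inner);            /* wgetch refreshes only this window, so  */
                              /* the box window is left untouched       */
    werase(inner);            /* clears just the inner window           */
    mvwaddch(inner, y, x + 1, '$');
    wrefresh(inner);

    wgetch(inner);
    endwin();
    return 0;
}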

How to capture mouse coordinates from another program

I'm trying to write a WinForms program that will capture mouse coordinates upon pressing and (more importantly) releasing the middle mouse button.
My form has topmost set to true (so that text in it can always be visible even when it doesn't have focus).
What I'm aiming for is being able to hover the mouse over a game window after my program starts, hit the middle mouse button, and have it record the mouse position for later use.
I can get it to detect when the middle mouse button is clicked inside the form using the MouseUp event (bound to the form itself), but have no clue what I need to do to have it detect when the middle mouse button is clicked outside my form.
Thanks for any help guys.
I believe what you are after are called low-level hooks. A quick Google search brings up this: Erroneous Mouse Coordinates Returned from Low Level Mouse Hook C#
A Microsoft example of how to do this can be found here: http://support.microsoft.com/kb/318804
