Generating Mouse Scroll Event in Linux - c

I have a small doubt about generating mouse events from a C program. I am writing a program to generate mouse events from C on Linux. I have implemented mouse click, drag, etc. using Xlib, but I don't have any idea how to generate a mouse scroll event.
Operating System: Fedora 15

X11 has two mechanisms for reporting scroll events. The old-fashioned way is to treat the scroll wheel as two extra mouse buttons: scroll up is reported as button 4 and scroll down as button 5. The modern way is to report them via the XInput2 extension, which supports things like horizontal scrolling and smooth scrolling.
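As a concrete illustration of the legacy route (a sketch only, assuming the XTest extension and libXtst are installed; build with something like gcc scroll.c -lX11 -lXtst):

    /* Sketch: emulate one "scroll up" tick by faking a button-4 click.
       Button 4 = scroll up, button 5 = scroll down.                    */
    #include <X11/Xlib.h>
    #include <X11/extensions/XTest.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        /* A scroll "tick" is a button press immediately followed by a release. */
        XTestFakeButtonEvent(dpy, 4, True,  CurrentTime);
        XTestFakeButtonEvent(dpy, 4, False, CurrentTime);

        XFlush(dpy);            /* push the requests to the X server */
        XCloseDisplay(dpy);
        return 0;
    }

Conventionally, buttons 6 and 7 play the same role for horizontal scrolling on servers that support it.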

Related

Handle multi-touch and mouse events with a single event - WPF

I need to provide panning and zooming capabilities for my image in WPF. I have followed the link below and it works fine for touch-screen-based systems:
Multi touch with WPF. But it doesn't work with mouse events, such as scrolling the mouse wheel to zoom or rotating the image with the mouse. Below are my questions:
Is there any possibility of handling both mouse and touch events by means of a single event?
If yes, how do I do that?
If it is not possible, why?
There is no such common thing except the click events like MouseLeftButtonDown, MouseDown, etc. You have to implement your own logic for touch-based and mouse-based interaction, because they are not the same.
Take a touch-and-drag and a click-and-drag: they are superficially the same, but a click can be left, right, middle, or a special button, and it can even be multiple buttons at once; none of that has anything to do with a touch. Likewise, a touch pinch has no equivalent as a mouse scroll-wheel event.
So you need to capture the different events and convert them into a meaningful command that the view model can perform.

Is it possible to change the mouse icon from a terminal app?

Title pretty much says it all: I'm wondering whether it's possible to change the mouse cursor icon in response to input in a terminal app (e.g., a click event), using the ncurses library or another library.
For example: I am running xterm under X, and a curses application inside that xterm. I may or may not be sshed into another box.
A user clicks on an element of my curses app -- is it possible to change the mouse cursor icon from a bar to a plus sign in response to the click?
There is some information here but I'd like a more complete resource:
Mouse movement events in NCurses
I don't believe it is. ncurses can read events from the mouse but not actually change mouse cursor settings. The terminal sends mouse movement and clicks to the ncurses program as escape sequences.
Some terminals, such as PuTTY, will change the cursor to an arrow when a region is clickable; otherwise, a text-selection cursor is shown. But I don't think this is controllable through escape sequences.
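For reference, reading those escape-sequence reports through ncurses looks roughly like this (a sketch of the reading side only; it cannot change the pointer's shape; build with -lncurses):

    /* Sketch: receive mouse clicks in an ncurses program. */
    #include <ncurses.h>

    int main()
    {
        initscr();
        cbreak();
        noecho();
        keypad(stdscr, TRUE);               /* required for KEY_MOUSE  */
        mousemask(ALL_MOUSE_EVENTS, NULL);  /* ask the terminal to report clicks */

        printw("Click anywhere (press q to quit)\n");
        for (;;) {
            int ch = getch();
            if (ch == 'q')
                break;
            if (ch == KEY_MOUSE) {
                MEVENT ev;
                if (getmouse(&ev) == OK)
                    printw("click at row %d, col %d\n", ev.y, ev.x);
                refresh();
            }
        }
        endwin();
        return 0;
    }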

mouse click function for linux

I am trying to build a virtual mouse by detecting finger movements with OpenCV. The finger detection is done, but I am stuck on the mouse click function.
My work is similar to this :
http://8a52labs.wordpress.com/tag/colored-finger-tracking
But he has done it on Windows; I am working on Linux. I just want a library which provides functions for doing left click, right click, mouse movement, etc.
I am working with Fedora 16, OpenCV and Qt.
P.S. I have already moved the mouse cursor in Qt with QCursor::setPos(mouseX, mouseY);
but there is no function to perform a left click or right click.
If you want to do this system-wide, rather than just restricted to your Qt application, see this answer.
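The linked answer is not reproduced here, but the usual system-wide route on X11 is the XTest extension (the same idea as the scroll sketch further up, just with buttons 1 and 3); a rough sketch, assuming libXtst and linking with -lX11 -lXtst:

    /* Sketch: a system-wide left click at the current pointer position. */
    #include <X11/Xlib.h>
    #include <X11/extensions/XTest.h>

    static void fake_click(Display *dpy, unsigned int button)  /* 1 = left, 3 = right */
    {
        XTestFakeButtonEvent(dpy, button, True,  CurrentTime);
        XTestFakeButtonEvent(dpy, button, False, CurrentTime);
        XFlush(dpy);
    }

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;
        fake_click(dpy, 1);     /* left click wherever the pointer currently is */
        XCloseDisplay(dpy);
        return 0;
    }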
You need to post a QMouseEvent through QCoreApplication::postEvent(QObject* receiver, QEvent* event).
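A rough sketch of that approach (the widget and coordinates below are made up for illustration; note that postEvent takes ownership of the event, so it must be heap-allocated):

    // Sketch: synthesize a left click on a widget inside your own Qt application.
    #include <QApplication>
    #include <QMouseEvent>
    #include <QPushButton>

    static void postLeftClick(QWidget *target, const QPoint &pos)
    {
        // Press followed by release; postEvent takes ownership, so use new.
        QCoreApplication::postEvent(target,
            new QMouseEvent(QEvent::MouseButtonPress, pos,
                            Qt::LeftButton, Qt::LeftButton, Qt::NoModifier));
        QCoreApplication::postEvent(target,
            new QMouseEvent(QEvent::MouseButtonRelease, pos,
                            Qt::LeftButton, Qt::NoButton, Qt::NoModifier));
    }

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        QPushButton button("Click me");
        button.show();
        postLeftClick(&button, QPoint(10, 10));   // queued, delivered once the loop runs
        return app.exec();
    }

This only reaches widgets of your own application; for clicks anywhere on the desktop, use the XTest route above.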

Program scroll bar arrow buttons functionality

In some programs, for example Microsoft Office Excel, when you're at the end of the document being edited and press the right or down arrow of the scroll bars, you scroll further and the edited area grows, producing more space for entering data. Can this functionality be implemented in WPF?
I tried using ScrollBar instead of ScrollViewer to get more control, but didn't succeed. If the scroll bar is at its maximum value, pressing its right or down button (depending on the orientation) doesn't fire the Scroll event.

Two Finger Click Event using Windows 7 Multi-Touch

We would like 2 code samples (C# would be great) of Windows 7 multitouch functionality using .NET 3.5 sp1 and the ManipulationProcessor:
A two-finger click sample – an event should be raised when a user "clicks" on a UIElement such as a Rectangle using two fingers simultaneously (close together). The click event should be fired when the "down" events occur, not when the "up" events occur.
A two-finger drag sample – a delta event should be fired when a user places two fingers next to each other and drags them up or down the screen. The data needed is the "delta amount" – how far the fingers have dragged since the last delta event – as well as the "delta direction", indicating whether the user dragged their fingers up or down the screen. This is similar to the Y translation delta data that's already present in ManipulationProcessor.ManipulationDelta, but it should only be triggered when two fingers are present and next to each other throughout the gesture.
Here's a nice demo on building multi-touch applications, littered with code samples.
