Handle multi-touch and mouse events with a single event - WPF

I need to provide panning and zooming capabilities for my image in WPF. I followed the link below, and it works fine on touch-screen-based systems:
Multi touch with WPF. But it doesn't work with mouse events, such as scrolling the mouse wheel to zoom or rotating the image with the mouse. Below are my questions:
Is there any possibility to achieve both mouse and touch events by means of a single event?
If yes, how do I do that?
If it is not possible, why?

There is no such common thing, except for the click events like MouseLeftButtonDown, MouseDown, etc. You have to implement your own logic for touch-based and mouse-based interaction.

Because they are not the same.
Take a touch-and-drag and a click-and-drag: superficially they are the same.
But a click can be with the left, right, middle, or a special button, and it can be multiple buttons at once; none of that has anything to do with a touch.
Likewise, a touch pinch has no meaning as a mouse scroll-wheel event.
So you need to capture the different events and convert them into a meaningful command that the VM can perform.
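As a minimal sketch of that idea (not taken from the linked article): the mouse wheel and the touch pinch are both routed into one hypothetical ApplyZoom method acting on a ScaleTransform. The Image element named image and the 1.1 zoom step are assumptions for illustration.

using System.Windows;
using System.Windows.Input;
using System.Windows.Media;

public partial class MainWindow : Window
{
    private readonly ScaleTransform _zoom = new ScaleTransform(1.0, 1.0);

    public MainWindow()
    {
        InitializeComponent();

        image.RenderTransform = _zoom;
        image.IsManipulationEnabled = true;   // turn on touch manipulation events

        // Mouse path: wheel up/down zooms in/out.
        image.MouseWheel += (s, e) => ApplyZoom(e.Delta > 0 ? 1.1 : 1.0 / 1.1);

        // Touch path: a pinch reports a scale factor per delta.
        image.ManipulationDelta += (s, e) => ApplyZoom(e.DeltaManipulation.Scale.X);
    }

    // Both input types funnel into one method.
    private void ApplyZoom(double factor)
    {
        _zoom.ScaleX *= factor;
        _zoom.ScaleY *= factor;
    }
}

In an MVVM setup, ApplyZoom would typically forward to a single ICommand on the view model, which is the "meaningful command" mentioned above.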

Related

Drag scrolling in Silverlight

I am new to Silverlight and I am looking to create a StackPanel that scrolls by dragging it up or down, similarly to how PDF readers work. I did not know if there was anything built in or if I would have to create my own based on mouse down and up.
Hm, just catch MouseMove while the left mouse button is down and change the scrolling offset by the vertical difference between the last and current points.
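Something like this rough sketch, assuming the StackPanel sits inside a ScrollViewer named scrollViewer (the names and handlers are made up for illustration):

private Point _lastPoint;
private bool _dragging;

private void Panel_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    _dragging = true;
    _lastPoint = e.GetPosition(scrollViewer);   // remember where the drag started
}

private void Panel_MouseMove(object sender, MouseEventArgs e)
{
    if (!_dragging) return;                     // only scroll while the button is held
    Point current = e.GetPosition(scrollViewer);
    double delta = _lastPoint.Y - current.Y;    // vertical drag distance since last move
    scrollViewer.ScrollToVerticalOffset(scrollViewer.VerticalOffset + delta);
    _lastPoint = current;
}

private void Panel_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    _dragging = false;
}

Capturing the mouse on mouse down (see the last answer on this page) avoids losing the drag when the pointer leaves the panel.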

Duplicate a control on two windows on two screens

I need to duplicate a control in two windows on two different screens. I am writing a WPF application in which I have a "WebBrowserControl", and I need to duplicate the "WebBrowserControl" onto another window on another screen. I need to render an exact copy on both screens; moreover, I need to capture mouse movement on both screens. The mouse should move to the exact same locations on both screens.
Can anyone suggest anything? I have tried capturing screen and putting it in another window, but how do I capture mouse movement?
Any ideas? Any help will be of great value.
Thanks.
Handling the mouse move event is the answer to how to capture mouse movement.
MouseMove
The part that confuses me is how you are going to have a mouse on both screens.
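As a rough illustration only (the window and element names are invented), the MouseMove handler on the source window could mirror the pointer position onto a marker in the second window:

private void SourceWindow_MouseMove(object sender, MouseEventArgs e)
{
    Point p = e.GetPosition(this);                     // position relative to this window
    Canvas.SetLeft(mirrorWindow.cursorMarker, p.X);    // move a marker in the duplicate
    Canvas.SetTop(mirrorWindow.cursorMarker, p.Y);     // window to the same spot
}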

Generating Mouse Scroll Event in Linux

I have a small doubt about generating mouse events from a C program. I am writing a program to generate mouse events from a C program in Linux. I have implemented mouse click, drag, etc. using Xlib, but I don't have any idea about generating a mouse scroll event.
Operating System : Fedora 15
X11 has two mechanisms to report scroll events. The old-fashioned way is to treat the scroll wheel as two extra mouse buttons: scroll up is reported as button 4 and scroll down as button 5 (or vice versa, I don't remember). The modern way is to report them via the XInput2 extension, which allows things like horizontal scrolling, smooth scrolling, and suchlike.

WinAPI mouse move notification for full desktop

In WinAPI is there a mouse move notification for the full desktop (full screen) and not for a window only?
I would like to receive mouse screen coordinates in my main window procedure.
Edit:
What I am trying to do is get the mouse coordinates when dragging from a button in my window to outside that window.
Not as such, no. If you wanted to do something anywhere on the desktop from within your program, e.g. point somewhere or draw something anywhere, you could capture the mouse and then follow the movement until the mouse button is released. See SetCapture for this.
For an example, see this article on MSDN: Drawing Lines with the Mouse
Otherwise you can always use Windows hooks to follow mouse movements anywhere.
You can set a mouse hook to be notified about all mouse events.
You can use GetCursorPos, or GetMessagePos, which contains the coordinates of the last message.
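If the surrounding application is .NET/WPF like the rest of this page, the Win32 calls mentioned above can be reached through P/Invoke. This is only a sketch of that combination, using the standard user32 signatures:

using System;
using System.Runtime.InteropServices;

static class NativeMouse
{
    [StructLayout(LayoutKind.Sequential)]
    public struct POINT { public int X; public int Y; }

    [DllImport("user32.dll")]
    public static extern IntPtr SetCapture(IntPtr hWnd);       // route mouse messages to hWnd

    [DllImport("user32.dll")]
    public static extern bool ReleaseCapture();                // stop capturing

    [DllImport("user32.dll")]
    public static extern bool GetCursorPos(out POINT lpPoint); // cursor in screen coordinates
}

// During the drag started from the button:
//   NativeMouse.SetCapture(hwnd);          // keep receiving WM_MOUSEMOVE outside the window
//   NativeMouse.GetCursorPos(out var p);   // p.X / p.Y are screen coordinates
//   NativeMouse.ReleaseCapture();          // call when the mouse button is released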

Manage a space where you can move things (Silverlight)

I'm thinking about the best way to manage a "workspace" in my application, where the user can move things around in this space and pan it, for instance.
Let's imagine I have my application interface, with some buttons all around, a treeview, etc., and in the middle a Canvas with some widgets the user can move and work with, and he can pan this space to move around in it.
For the pan, I was thinking of handling MouseLeftButtonDown in this space (which switches a boolean "UserMoving" to true), and on MouseMove applying a translation to the space (from the delta between the two events), until I detect a MouseLeftButtonUp, so that I know he has stopped (UserMoving back to false). The problem is that if the user releases the left mouse button not on the space itself (a Canvas), I will never know he has released the mouse. What's the best way to handle the fact that he can release the pan (or the drag-and-drop of a widget) anywhere in the application?
Has anybody already managed something like that who can help me?
You should call UIElement.CaptureMouse on your Canvas on mouse down. This will keep the canvas receiving mouse events, even if the user moves outside of its region.
On mouse up, call ReleaseMouseCapture.
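A minimal sketch of that capture-based pan, assuming a Canvas named workspace whose RenderTransform is a TranslateTransform named panTransform (both names are invented here):

private Point _panStart;
private bool _userMoving;

private void Workspace_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    _userMoving = true;
    _panStart = e.GetPosition(this);        // relative to the page, not the moving canvas
    workspace.CaptureMouse();               // keep receiving mouse events outside the canvas
}

private void Workspace_MouseMove(object sender, MouseEventArgs e)
{
    if (!_userMoving) return;
    Point current = e.GetPosition(this);
    panTransform.X += current.X - _panStart.X;   // pan by the delta since the last move
    panTransform.Y += current.Y - _panStart.Y;
    _panStart = current;
}

private void Workspace_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    _userMoving = false;
    workspace.ReleaseMouseCapture();        // hand mouse events back to the rest of the UI
}

Because of the capture, MouseLeftButtonUp is delivered to the canvas even when the button is released over another control, which is exactly the situation described in the question.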
