I'm thinking about the best way to manage a "workspace" in my application, where the user could move things around in this space and pan it, for instance.
Let's imagine I have my application interface, with some buttons all around, a treeview, etc., and in the middle a Canvas with some widgets the user can move and work with, and he could pan this space to move around in it.
For the pan, I was thinking of handling MouseLeftButtonDown on this space (which switches a boolean "UserMoving" to true), then on MouseMove applying a translation to the space (from the delta between the two events), until I detect a MouseLeftButtonUp, to know he has stopped (UserMoving back to false). The problem is that if the user releases the left mouse button somewhere other than on the space itself (a Canvas), I will never know he has released the mouse. What's the best way to manage the fact that he can release the pan (or the drag & drop of a widget) anywhere in the application?
Has anybody already managed something like that and can help me?
You should call UIElement.CaptureMouse on your Canvas on mouse down. This will keep the canvas receiving mouse events, even if the user moves outside of its region.
On mouse up, call ReleaseMouseCapture.
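A minimal sketch of that pattern, assuming a Canvas named workspaceCanvas and a TranslateTransform named panTransform applied to the canvas's content (both names are placeholders):

private Point _lastPosition;
private bool _isPanning;

private void workspaceCanvas_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    _isPanning = true;
    _lastPosition = e.GetPosition(workspaceCanvas);
    workspaceCanvas.CaptureMouse(); // keep receiving mouse events even outside the canvas
}

private void workspaceCanvas_MouseMove(object sender, MouseEventArgs e)
{
    if (!_isPanning) return;
    Point current = e.GetPosition(workspaceCanvas);
    panTransform.X += current.X - _lastPosition.X; // translate the content by the delta
    panTransform.Y += current.Y - _lastPosition.Y;
    _lastPosition = current;
}

private void workspaceCanvas_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    _isPanning = false;
    workspaceCanvas.ReleaseMouseCapture();
}

Because the canvas holds the capture, MouseMove and MouseLeftButtonUp keep firing even when the pointer is over the buttons or the treeview.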
I need to provide panning and zooming capabilities for my image in WPF. I followed the link below and it works fine for a touch-screen-based system.
Multi touch with WPF. But it doesn't work with mouse events, such as scrolling the mouse wheel to zoom or rotating the image with the mouse. Below are my questions:
Is there any possibility to achieve both mouse and touch events by means of a single event?
If yes, how do I do that?
If it is not possible, why?
There is no such common thing, except the click events like MouseLeftButtonDown, MouseDown, etc. You have to implement your own logic for touch-based and mouse-based interaction.
Because they are not the same.
Take a touch-and-drag and a click-and-drag: superficially they are the same.
But a click can be left, right, middle, special, it can be multiple buttons at once; none of that has anything to do with a touch.
Likewise, a touch pinch has no meaning as a mouse scroll wheel event.
So you need to capture the different events and convert them into a meaningful command that the VM can perform.
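For example, both inputs can be funneled into one zoom routine. A rough sketch, assuming an Image with a ScaleTransform named zoomTransform as its RenderTransform, IsManipulationEnabled set to true, and an arbitrary zoom factor (all of these are assumptions, not part of the linked sample):

private void image_MouseWheel(object sender, MouseWheelEventArgs e)
{
    // Mouse path: wheel up zooms in, wheel down zooms out.
    Zoom(e.Delta > 0 ? 1.1 : 1.0 / 1.1);
}

private void image_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // Touch path: the pinch gesture supplies the scale factor directly.
    Zoom(e.DeltaManipulation.Scale.X);
    e.Handled = true;
}

private void Zoom(double factor)
{
    zoomTransform.ScaleX *= factor;
    zoomTransform.ScaleY *= factor;
}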
I have created a UserControl that is essentially a text editor (using Graphics.DrawString in OnPaint).
I have set AutoScroll = true and set AutoScrollMinSize appropriately. Everything is working how it should...
EXCEPT, I would like the control to scroll itself WHILST I am currently scrolling (i.e. click and drag the scroll bar... and whilst it is being dragged the control should be scrolling the entire time). At the moment it only scrolls when the scroll bar is released (mouse up).
I have tried handling the Scroll event and invalidating the control, but that just makes it flicker uncontrollably.
I cannot find any examples online for this, due to it being difficult to describe!
Can anyone point me in the right direction please?
Control.Invalidate() will make something flicker badly. I faced this problem before, drawing cross-hairs over the mouse position on a PictureBox that drew a line chart. The trick is to use (and I can't remember which one is best to come first)
Control.Update();
Control.Refresh();
in the Scroll event. Depending on what else you are drawing in the control and how you are drawing it, this may work better for you. Also, this was tested on a PictureBox; another kind of Control may be a different matter.
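For example, the Scroll handler can force the repaint immediately (a sketch; textEditorControl is a placeholder name for the custom control described above):

private void textEditorControl_Scroll(object sender, ScrollEventArgs e)
{
    // Refresh() is Invalidate() followed by Update(): it marks the control dirty and
    // forces an immediate repaint instead of waiting for the drag to finish.
    textEditorControl.Refresh();
}

If it still flickers, enabling double buffering on the control (DoubleBuffered = true) usually helps, since each forced repaint is then composed off-screen first.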
I need to duplicate a control on two windows on two different screens. I am writing a WPF application in which I have a "WebBrowserControl"; I need to duplicate it onto another window on another screen. I need to render an exact copy on both screens; moreover, I need to capture mouse movement on both screens, and the mouse should move at the exact same locations on both.
Can anyone suggest anything? I have tried capturing the screen and putting it in another window, but how do I capture mouse movement?
Any ideas? Any help will be of great value.
Thanks.
Handling the mouse move event is the answer to how to capture mouse movement.
MouseMove
The part that confuses me is how you are going to have a mouse on both screens.
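A minimal sketch of handling MouseMove on the first window; mirrorWindow and its UpdateCursorOverlay method are hypothetical placeholders for whatever mechanism mirrors the pointer on the second window:

private void MainWindow_MouseMove(object sender, MouseEventArgs e)
{
    // Position relative to this window; forward it so the second window can draw
    // an overlay cursor at the same coordinates (UpdateCursorOverlay is not a WPF API).
    Point position = e.GetPosition(this);
    mirrorWindow.UpdateCursorOverlay(position);
}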
I'm looking to imitate the following control, found in the Zune software, in the quickplay tab. When you move the mouse from left to right, the boxes New, Pinned and History move in the opposite direction.
I'm thinking of attaching a method to the MouseMove handler which checks the relative position of the cursor and then moves a grid containing the panels accordingly, but before I try to reinvent the wheel I'd like to ask around whether there are people with more experience with this, especially with getting a fluid motion.
Please let me know.
Do you mean something like this? http://sachabarber.net/?p=829
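If you end up rolling your own, here is a rough sketch of the relative-position idea from the question (viewport, panelsGrid and panelsTranslate are placeholder names for the outer container, the wide inner grid and a TranslateTransform applied to it):

private void Viewport_MouseMove(object sender, MouseEventArgs e)
{
    double x = e.GetPosition(viewport).X;
    double ratio = x / viewport.ActualWidth;                          // 0 at the left edge, 1 at the right
    double overflow = panelsGrid.ActualWidth - viewport.ActualWidth;  // how much content is hidden
    if (overflow > 0)
        panelsTranslate.X = -ratio * overflow;                        // shift the panels opposite to the cursor
}

For a fluid feel you would typically animate the offset toward that target (e.g. with a DoubleAnimation and an easing function) rather than setting it directly on every MouseMove.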
I'm trying to modify the default graph viewer of the Graph# library because its user interface is awful (just try dragging a node outside of the boundaries, you'll see!)
The basic setup is this: there is a GraphCanvas control (inherited from Panel) which has children of Vertex and Edge control types. What I want to achieve is:
GraphCanvas has scroll bars if the contents do not fit in the screen;
GraphCanvas can also be scrolled by "dragging" it (just click on an empty space and drag);
GraphCanvas can be zoomed in and out (via CTRL+mouse wheel);
Vertices can be dragged around. If a vertex is dragged outside the current boundaries of GraphCanvas, the boundaries are increased. The scroll bars should reflect this; however, the current viewport should not scroll away while the vertex is being dragged. The same goes if dragging a vertex reduces the boundaries of GraphCanvas - it should stay the same size until the drag operation is finished and resize only then. Automatically scrolling the viewport during a drag operation is awfully confusing and easily introduces dragging errors. See the original implementation if you want to know what I mean.
Although I've got a fair bit of experience with .NET, I'm still a complete beginner in WPF. My current attempt is (in the measure/arrange layout phase) to give each vertex the XY coordinate it desires (even if negative) and to implement zooming/scrolling by handling mouse events on the GraphCanvas and modifying the RenderTransform property. Dragging just changes the XY coordinates of the specific vertex (probably triggering a re-layout of the whole thing, which would be nice to avoid too). Scroll bars are implemented by placing the GraphCanvas inside a ScrollViewer and implementing IScrollInfo on the GraphCanvas.
Unfortunately there seems to be a problem: I can get mouse events on the GraphCanvas itself only if it has a background at the point. That would be OK, I want a white background anyway, but in the negative coordinates of the GraphCanvas it does not draw the background - and thus does not respond to mouse events.
I'm also wondering if I'm doing the Right Thing by allowing all my child controls (vertices and edges) to go into negative coordinates. How would you implement this?
Added: To clarify about the background problem check out the following screenshot:
(source: valts.21.lv)
What you see here is a simple Windows Forms form with a WPF Host control on it. That has a ScrollViewer in it, and the ScrollViewer has the GraphCanvas in it. The GraphCanvas contains 4 vertices and 6 edges.
The GraphCanvas is stretched to fill the ScrollViewer. But since some of the vertices are at negative coordinates, it has a RenderTransform applied which simply shifts everything to the right (TranslateTransform). It also has a white background brush.
Note the gray area on the left. That's still a part of the GraphCanvas, but the background brush somehow doesn't extend there. Also, if I left-click there with my mouse (not on a node, but on the gray area), I do NOT get an event. If I left-click on the white area, I get all events just fine.
Call CaptureMouse on the canvas's MouseDown and ReleaseMouseCapture on mouse up. Also, if you set your canvas background to transparent, it will still be hit-testable.
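A sketch of both points together, wired up in the hosting control's constructor (GraphHost and graphCanvas are placeholder names):

public GraphHost() // hypothetical hosting control; wire this up wherever the canvas is created
{
    InitializeComponent();

    // A Transparent background (unlike a null one) still participates in hit testing,
    // so the canvas gets mouse events over its whole area, including "empty" regions.
    graphCanvas.Background = Brushes.Transparent;

    graphCanvas.MouseLeftButtonDown += (s, e) => graphCanvas.CaptureMouse();      // keep events during the drag
    graphCanvas.MouseLeftButtonUp   += (s, e) => graphCanvas.ReleaseMouseCapture();
}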
You can attach a 'Draggable' behavior to each element.
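A minimal sketch of such a behavior as an attached property (the class and property names are illustrative, not part of Graph#). It assumes the dragged element sits on a Canvas and already has Canvas.Left / Canvas.Top set; for a custom panel like GraphCanvas you would update whatever coordinates its layout reads instead:

public static class Draggable
{
    public static readonly DependencyProperty IsEnabledProperty =
        DependencyProperty.RegisterAttached(
            "IsEnabled", typeof(bool), typeof(Draggable),
            new PropertyMetadata(false, OnIsEnabledChanged));

    public static void SetIsEnabled(UIElement element, bool value) => element.SetValue(IsEnabledProperty, value);
    public static bool GetIsEnabled(UIElement element) => (bool)element.GetValue(IsEnabledProperty);

    // One drag at a time is assumed, so shared static state is enough for this sketch.
    private static Point _lastPosition;
    private static IInputElement _dragSurface;

    private static void OnIsEnabledChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
    {
        if (d is UIElement element && (bool)e.NewValue)
        {
            element.MouseLeftButtonDown += OnMouseDown;
            element.MouseMove += OnMouseMove;
            element.MouseLeftButtonUp += OnMouseUp;
        }
    }

    private static void OnMouseDown(object sender, MouseButtonEventArgs e)
    {
        var element = (UIElement)sender;
        _dragSurface = VisualTreeHelper.GetParent(element) as IInputElement; // the containing Canvas
        if (_dragSurface == null) return;
        _lastPosition = e.GetPosition(_dragSurface);
        element.CaptureMouse(); // keep receiving MouseMove/MouseUp even outside the element
    }

    private static void OnMouseMove(object sender, MouseEventArgs e)
    {
        var element = (UIElement)sender;
        if (!element.IsMouseCaptured || _dragSurface == null) return;

        Point current = e.GetPosition(_dragSurface);
        Canvas.SetLeft(element, Canvas.GetLeft(element) + (current.X - _lastPosition.X));
        Canvas.SetTop(element, Canvas.GetTop(element) + (current.Y - _lastPosition.Y));
        _lastPosition = current;
    }

    private static void OnMouseUp(object sender, MouseButtonEventArgs e)
    {
        ((UIElement)sender).ReleaseMouseCapture();
        _dragSurface = null;
    }
}

It would then be attached to each vertex with local:Draggable.IsEnabled="True" in XAML.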