I am using a GridSplitter to resize columns (what else :)). Works fine.
However, I am working in a Surface V2 environment, and when I use touch simulation the touch events are not delivered to my dear GridSplitter.
Any hint on how to make that work?
Just register the event like this:
YourGridSplitter.PreviewDragEnter += new DragEventHandler(YourGridSplitter_PreviewDragEnter);
void YourGridSplitter_PreviewDragEnter(object sender, DragEventArgs e)
{
// nothing here
}
You need to capture the touch device:
gs.PreviewTouchDown += (s, e) => { e.TouchDevice.Capture((s as UIElement)); };
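A slightly fuller sketch of the same idea, assuming the splitter from the line above is named gs; the touch-up release handler is my addition for illustration, not part of the original answer:
gs.PreviewTouchDown += (s, e) =>
{
    // Capturing the touch device keeps the subsequent touch input flowing
    // to the GridSplitter, so its drag logic keeps working under touch.
    ((UIElement)s).CaptureTouch(e.TouchDevice);
};
gs.PreviewTouchUp += (s, e) =>
{
    // Release the capture when the finger is lifted.
    ((UIElement)s).ReleaseTouchCapture(e.TouchDevice);
};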
I have a WPF Canvas in a .NET 4.5 application.
I added handlers (auto-created by the designer) for MouseLeftButtonDown and MouseDown. Using a MessageBox, I have confirmed these handlers are called when a user clicks on the canvas, but I can't find a way to get the mouse position from the MouseButtonEventArgs.
When I added handlers for ManipulationStarted and ManipulationStarting in the same way, their MessageBoxes don't show up.
private void CenterCanvas_ManipulationStarted(object sender, ManipulationStartedEventArgs e)
{
MessageBox.Show("Doesn't show up"); // never shows up
}
private void CenterCanvas_MouseDown(object sender, MouseButtonEventArgs e)
{
MessageBox.Show("Shows up"); // shows up, but can't seem to get click position
}
To get the mouse position from the MouseEventArgs, call its GetPosition method:
private void Canvas_MouseDown(object sender, MouseButtonEventArgs e)
{
var pos = e.GetPosition((IInputElement)sender);
System.Diagnostics.Trace.TraceInformation("MouseDown at {0}", pos);
}
To receive the manipulation events you need to set IsManipulationEnabled to true on the Canvas. You may also want to take a look at the Touch and Manipulation section of the MSDN Input Overview.
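A minimal sketch of that, reusing the CenterCanvas handlers from the question; the ManipulationStarting handler and the use of ManipulationOrigin are just for illustration:
// e.g. in the window's constructor, after InitializeComponent():
CenterCanvas.IsManipulationEnabled = true; // without this, the Manipulation* events never fire
private void CenterCanvas_ManipulationStarting(object sender, ManipulationStartingEventArgs e)
{
    e.ManipulationContainer = CenterCanvas; // report positions relative to the canvas
}
private void CenterCanvas_ManipulationStarted(object sender, ManipulationStartedEventArgs e)
{
    // ManipulationOrigin is the touch position within the ManipulationContainer
    System.Diagnostics.Trace.TraceInformation("ManipulationStarted at {0}", e.ManipulationOrigin);
}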
I want to synchronize two ScrollViewers. How do I get the scroll events of both ScrollViewers and keep them synchronized?
First, get the two scroll bars of the ScrollViewers you want to sync - in this case scrollviewer1 and scrollviewer2.
Then hook up ValueChanged handlers on both vertical scroll bars. In each handler, ScrollToVerticalOffset scrolls one viewer to match the other.
ScrollBar vertical1 = ((FrameworkElement)VisualTreeHelper.GetChild(scrollviewer1, 0)).FindName("VerticalScrollBar") as ScrollBar;
vertical1.ValueChanged += new RoutedPropertyChangedEventHandler<double>(vertical1_ValueChanged);
ScrollBar vertical2 = ((FrameworkElement)VisualTreeHelper.GetChild(scrollviewer2, 0)).FindName("VerticalScrollBar") as ScrollBar;
vertical2.ValueChanged += new RoutedPropertyChangedEventHandler<double>(vertical2_ValueChanged);
void vertical1_ValueChanged(object sender, RoutedPropertyChangedEventArgs<double> e)
{
scrollviewer2.ScrollToVerticalOffset(e.NewValue);
}
void vertical2_ValueChanged(object sender, RoutedPropertyChangedEventArgs<double> e)
{
scrollviewer1.ScrollToVerticalOffset(e.NewValue);
}
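If digging the scroll bar out of the template feels fragile (the scroll bar's name can vary between templates), an alternative sketch is to handle the ScrollViewer's ScrollChanged event directly; scrollviewer1 and scrollviewer2 are the same viewers as above:
// Sync via ScrollChanged instead of reaching into the template for the scroll bars.
scrollviewer1.ScrollChanged += (s, e) => scrollviewer2.ScrollToVerticalOffset(e.VerticalOffset);
scrollviewer2.ScrollChanged += (s, e) => scrollviewer1.ScrollToVerticalOffset(e.VerticalOffset);
// No feedback loop: ScrollChanged only fires again if the offset actually changes.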
Hope this helps!
I am having a problem with Avalon Docking where my second pane, docked at the bottom, is set to AutoHide.
When the UI runs, the pane loads as docked/visible by default. I would like it to start hidden/minimized.
<ad:DockingManager>
<ad:ResizingPanel Orientation="Vertical">
<ad:DocumentPane>
<ad:DocumentContent>
<... data grid that fills the view>
</ad:DocumentContent>
</ad:DocumentPane>
<ad:DockablePane>
<ad:DockableContent Title="output" DockableStyle="AutoHide" IsCloseable="False">
<...some control>
I have tried various "hacks" suggested on the Avalon forums, where on load you can call
outputDockablePane.ToggleAutoHide();
and that works, meaning the pane is hidden when the UI is loaded. However, once you toggle auto-hide in .cs code, clicking the dock header at runtime to make the pane visible/float stops working. So you have to hook up DockingManager.OnMouseUp(), check a couple of boolean states, and manually call ToggleAutoHide() - I guess only the first time. Seems like a hack to me.
Here's what I am doing for now, till I find a proper and clean solution:
private void OnDockManagerLoaded(object sender, RoutedEventArgs e)
{
if(_firstTimeLoad && !_isDataGridLoaded)
{
outputDockablePane.ToggleAutoHide();
_forcedToAutoHide = true;
}
}
private void OnDockingManagerMouseUp(object sender, MouseButtonEventArgs e)
{
if (_forcedToAutoHide)
{
_forcedToAutoHide = false;
outputDockableContent.Activate();
outputDockablePane.ToggleAutoHide();
}
}
Is there a setting/property that I am totally missing, or/and a better way?
Four years later, Avalon Docking still has the same issue. While I haven't found a proper solution yet, I have tried to refine your workaround logic:
private void OnDockingManagerMouseUp(object sender, MouseButtonEventArgs e)
{
if (outputDockableContent.IsAutoHidden)
{
outputDockableContent.IsActive = false;
}
}
I'm building a Windows Presentation Foundation control with Microsoft Blend.
When I leave my control by pressing the left-mouse-button, the MouseLeave-Event is not raised. Why not?
This is intended behaviour: when you press the mouse button down on a control and then leave it, the control STILL retains its "capture" of the mouse, meaning the control won't fire the MouseLeave event. The MouseLeave event will instead be fired once the mouse button is released outside of the control.
To avoid this, you can simply tell your control NOT to capture the mouse at all (note that this example uses the WinForms API, Control.Capture; the WPF equivalent, ReleaseMouseCapture, is shown in a later answer):
private void ControlMouseDown(System.Object sender, System.Windows.Forms.MouseEventArgs e)
{
Control control = (Control) sender;
control.Capture = false; //release capture.
}
Now the MouseLeave Event will be fired even when moving out while a button is pressed.
If you need the Capture INSIDE the Control, you need to put in more effort:
Start tracking the mouse position manually when the mouse button is pressed.
Compare the position with the Top, Left and Size properties of the control in question.
Decide whether you need to stop the control capturing your mouse or not.
public partial class Form1 : Form
{
private Point point;
private Boolean myCapture = false;
public Form1()
{
InitializeComponent();
}
private void button1_MouseDown(object sender, MouseEventArgs e)
{
myCapture = true;
}
private void button1_MouseMove(object sender, MouseEventArgs e)
{
if (myCapture)
{
// Cursor.Position is in screen coordinates; convert it to the parent's
// client coordinates before comparing it with button1's bounds.
point = button1.Parent.PointToClient(Cursor.Position);
if (!button1.Bounds.Contains(point))
{
button1.Capture = false; //this will release the capture and trigger the MouseLeave event immediately.
myCapture = false;
}
}
}
private void button1_MouseLeave(object sender, EventArgs e)
{
MessageBox.Show("Mouse leaving");
}
}
Of course, you also need to stop your own tracking (myCapture = false;) in MouseUp. Forgot that one :)
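A minimal sketch of that missing handler, assuming the same button1/myCapture members from the Form1 above:
private void button1_MouseUp(object sender, MouseEventArgs e)
{
    myCapture = false; // stop the manual tracking once the button is released
}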
When I don't get the mouse events I expect, I typically use Snoop to help me understand what is happening.
Here are a couple of links:
1- Snoop (a WPF utility)
2- CodePlex project for Snoop
And for completeness and historical reasons (not for the bounty - it doesn't make sense having two duplicate questions; you should probably merge them into one if it's not too late)...
I made a thorough solution using a global mouse hook here (approach 2):
WPF: mouse leave event doesn't trigger with mouse down
And simplified its use - you can use it by binding to commands in your view-model - e.g.
my:Hooks.EnterCommand="{Binding EnterCommand}"
my:Hooks.LeaveCommand="{Binding LeaveCommand}"
my:Hooks.MouseMoveCommand="{Binding MoveCommand}"
...more details in there
Old question, but I came across the same problem with a Button (MouseLeave does not fire while the mouse button is down, because MouseDown captures the mouse...)
This is how I solved it anyway:
element.GotMouseCapture += element_MouseCaptured;
static void element_MouseCaptured(object sender, MouseEventArgs e)
{
FrameworkElement element = (FrameworkElement)sender;
element.ReleaseMouseCapture();
}
Hope that helps someone looking for a quick fix :P
For example, here's some code from an implementation of an attached behavior for a double-click command:
private static void fe_MouseDown(object sender, MouseButtonEventArgs e) {
if (e.ClickCount != 2) return;
...
}
Assuming you have a thin client with the bulk of the logic in a testable architecture, is there a cheap and relatively painless way to test logic that depends on user gestures like a mouse event?
Cheers,
Berryl