I am trying to create a window that can be moved or resized via multi-touch gestures. I tried it this way: I capture the TouchDown event of the window and save all active TouchDevices in a list, so I know which TouchDevices are active. I handle the Updated and Deactivated events of the TouchDevices to know when they move and when they are released. I store the Left and Top properties of the window and the position where the TouchDevice started, and every time the Updated event fires I move the window to the new position relative to the new position of the TouchDevice. This works as long as I move the finger. But if I don't move the finger (or move it only very slightly), the window suddenly starts shaking (moving chaotically) and soon ends up at a position outside the screen.
I think the problem is that TouchDevice.GetTouchPoint only returns coordinates relative to the window (even if I pass null instead of the window reference). And because the window moves, the relative position of a TouchDevice that isn't moving changes as well. I did some research but wasn't able to find a way to determine the screen coordinates of the touch device.
So I hope someone can tell me how to get the absolute coordinates of a TouchDevice, or can point me to another way to "DragMove" the window with touch (I tried DragMove, but that only works for mouse clicks, not for TouchDown). I would also like to resize the window when two TouchDevices are active, and for that I also need absolute coordinates, because otherwise the same effect occurs.
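Simplified, the approach I described looks roughly like this (the field names are just for illustration):
private Point _touchStart;
private double _startLeft, _startTop;

private void Window_TouchDown(object sender, TouchEventArgs e)
{
    // GetTouchPoint(null) still returns window-relative coordinates,
    // which is where I suspect the feedback loop comes from.
    _touchStart = e.GetTouchPoint(null).Position;
    _startLeft = Left;
    _startTop = Top;
    e.TouchDevice.Updated += TouchDevice_Updated;
}

private void TouchDevice_Updated(object sender, EventArgs e)
{
    var device = (TouchDevice)sender;
    var current = device.GetTouchPoint(null).Position; // window-relative again
    Left = _startLeft + (current.X - _touchStart.X);
    Top = _startTop + (current.Y - _touchStart.Y);
}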
I ran into this issue because my taskbar is on the right edge of the screen, effectively pushing the maximized window to the right. The issue also arises when the application's window is not maximized and is floating somewhere on the screen.
Here is an extension method that fixes the coordinates based on the application window's position.
public static Point FixCoordinates(this Point point)
{
    var left = Application.Current.MainWindow.Left;
    var top = Application.Current.MainWindow.Top;
    return new Point(point.X + left, point.Y + top);
}
You may want to pass in the window that hosts your touched control as a parameter. In my case it is the application's main window.
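If you do pass the window in, a minimal sketch of such an overload (the window parameter being whatever window hosts the touched control) could look like this:
public static Point FixCoordinates(this Point point, Window window)
{
    // Same idea as above, but relative to an explicitly supplied window
    // instead of Application.Current.MainWindow.
    return new Point(point.X + window.Left, point.Y + window.Top);
}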
Also, since you tagged the question with "multi-touch", here is a method which averages multiple touch coordinates:
public static Point GetAverage(this IEnumerable<Point> points)
{
    var averageX = points.Average(p => p.X);
    var averageY = points.Average(p => p.Y);
    return new Point(averageX, averageY);
}
And I use it in the code like this:
private void TouchAdornment_TouchDown(object sender, TouchEventArgs e)
{
    var touchPosition = (sender as UIElement).TouchesOver
        .Select(t => t.GetTouchPoint(null).Position)
        .GetAverage()
        .FixCoordinates();
}
I am a bit stuck in my current project.
I'm using a terrible API that needs to be told where and when to draw at any given time. This API does not expose any controls, and I need a control in order to place the drawn object properly.
What I have done is put a Canvas in the Grid in my view. This Canvas takes up the space that my drawn API element needs to occupy, so by reading the Canvas's ActualWidth and ActualHeight I can draw my API element at the proper size. The issue I'm having is the position of the API element. The Canvas is in the proper place at all times, but when the program first starts, Canvas.TranslatePoint(new System.Windows.Point(0, 0), App.Current.MainWindow); returns 0,0. As soon as I manipulate the UI and cause the Canvas to resize, the call returns the real location and my API element draws in the proper spot. My question is: why is the initial location 0,0, and how can I remedy this? I call the initial draw function from the UserControl_Loaded event.
Thanks
P.S. Would I be wrong in thinking that the initial 0,0 is a relative coordinate, and not an absolute coordinate?
I figured out my issue - my control wasn't part of the PresentationSource. Long story short, you can't get the location of a control that isn't shown on the screen yet. My fix was this:
private void WaveformCanvas_Loaded(object sender, RoutedEventArgs e)
{
    if ((sender as Canvas).IsVisible)
    {
        (this.DataContext as StudioControlViewModel).MoveWaveform();
    }
}
For me, the MoveWaveform() function calls WaveformCanvas.TranslatePoint(new System.Windows.Point(0, 0), App.Current.MainWindow);
So now the function only gets called when the control is visible, which has solved all of my issues.
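An equivalent guard, if you prefer to check for the PresentationSource directly rather than IsVisible, might look like this (just a sketch using the same names as above):
private void WaveformCanvas_Loaded(object sender, RoutedEventArgs e)
{
    var canvas = (Canvas)sender;
    // PresentationSource.FromVisual returns null until the control is actually
    // connected to a window, in which case TranslatePoint would be meaningless.
    if (PresentationSource.FromVisual(canvas) != null)
    {
        (this.DataContext as StudioControlViewModel).MoveWaveform();
    }
}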
How do I set the start location of my WinForms app so that it always starts in the bottom right-hand corner of the screen? Using fixed x and y coordinates only works for one particular screen resolution; on a smaller or larger screen the form would not appear in the desired location.
Thanks!
You have to do this in the OnLoad() method/event, one of the few real reasons to use it. The form's actual size won't be the designed size because the user might have changed preferences like the window caption height or the form might be rescaled due to a different video DPI setting. This is all sorted out when OnLoad() starts running.
Make it look like this:
protected override void OnLoad(EventArgs e) {
    var scr = Screen.FromPoint(this.Location);
    this.Left = scr.WorkingArea.Right - this.Width;
    this.Top = scr.WorkingArea.Bottom - this.Height;
    base.OnLoad(e);
}
Check this on MSDN:
Setting the Screen Location of Windows Forms
Regards
I have a stack panel with custom controls in it. I attach the standard MouseDragElementBehavior to each item.
When I drag to the right, the item moves underneath the other items.
What would be a viable solution for a better user experience - a better visual cue showing how the item moves and where it is going to be dropped?
After a bit of tinkering I realised that nothing can be dragged to the right within a StackPanel without being covered by other elements, unless you drag the very last (rightmost) item.
What I did to resolve it:
Created a visual cue (a half-transparent shape of a generic item to represent it during the drag operation)
Made the cue invisible (Width = 0) and kept it always as the very last element of the StackPanel's children
Subscribed the stack panel to the mouse left button up, down and move events
Emulated drag/drop in code
Once the drag is initiated, I make the cue visible and set its TranslateTransform to the current mouse coordinates
Adjust the TranslateTransform on every mouse move event
On drop, I hide the cue again and rearrange the items the way I want.
To stress it again: whichever way you do it, you have to manipulate the last element of the StackPanel.Children collection. A rough sketch of the emulation follows below.
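Here is that sketch, assuming the cue is a semi-transparent Rectangle (_cue) that is always the last child of the StackPanel (ItemsPanel) and has a TranslateTransform as its RenderTransform (all names here are illustrative, not from a real project):
private bool _dragging;

private void ItemsPanel_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    _dragging = true;
    _cue.Width = 100;               // make the cue visible
    ItemsPanel.CaptureMouse();
    MoveCue(e.GetPosition(ItemsPanel));
}

private void ItemsPanel_MouseMove(object sender, MouseEventArgs e)
{
    if (_dragging) MoveCue(e.GetPosition(ItemsPanel));
}

private void ItemsPanel_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    _dragging = false;
    _cue.Width = 0;                 // hide the cue again
    ItemsPanel.ReleaseMouseCapture();
    // ...rearrange the real items based on the drop position here...
}

private void MoveCue(Point p)
{
    var transform = (TranslateTransform)_cue.RenderTransform;
    transform.X = p.X;
    transform.Y = p.Y;
}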
If the MouseDragElementBehavior doesn't work the way you need it to, you could always descend from the class and customize it to your needs. For example:
public class DragBehavior : MouseDragElementBehavior
{
    public DragBehavior()
    {
        this.DragBegun += new MouseEventHandler(DragBehavior_DragBegun);
    }

    void DragBehavior_DragBegun(object sender, MouseEventArgs e)
    {
        var element = this.AssociatedObject as UIElement;
        // Calculate the correct ZIndex here.
        Canvas.SetZIndex(element, 100);
    }
}
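If you want to attach the customized behavior from code rather than XAML, something along these lines should work (a sketch; it assumes a reference to System.Windows.Interactivity, and myItemControl is whatever element you want to make draggable):
Interaction.GetBehaviors(myItemControl).Add(new DragBehavior());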
I have a panel, and within that panel are several rectangular controls (the number of controls varies). I want the user to be able to move the controls around within the panel so that they can arrange them in the way that suits them best. Does anyone have any resources I could read or simple tips that would get me headed down the right road?
thanks
I figured out a possible, simple method of moving a control in a drag/move style... Here are the steps.
Select an element in your control which you wish to be the movement area. This is the area in which, if the user holds the mouse down, the control will move. In my case it was a rectangular border at the top of the control.
Use the MouseDown event to set a boolean (in my case IsMoving) to true and the MouseUp event to set it back to false (see the short sketch further below).
On the first MouseDown, store the control's starting position (StartX and StartY) using the following code:
if (FirstClick)
{
    GeneralTransform transform = this.TransformToAncestor(this.Parent as Visual);
    Point StartPoint = transform.Transform(new Point(0, 0));
    StartX = StartPoint.X;
    StartY = StartPoint.Y;
    FirstClick = false;
}
Now that you have the starting position, you need to get the position of the mouse relative to your movement control. This is so you don't end up clicking the middle of your header to move it and have the top left of the control instantly jump to the mouse pointer location. To do this, place this code in the MouseDown event:
Point RelativeMousePoint = Mouse.GetPosition(Header);
RelativeX = RelativeMousePoint.X;
RelativeY = RelativeMousePoint.Y;
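For completeness, the IsMoving flag from the earlier step is just flipped in the same pair of handlers, roughly like this (a sketch using the names above; the event hookup on Header is assumed):
private void Header_MouseDown(object sender, MouseButtonEventArgs e)
{
    // ...the FirstClick and RelativeMousePoint code shown above goes here...
    IsMoving = true;
}

private void Header_MouseUp(object sender, MouseButtonEventArgs e)
{
    IsMoving = false;
}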
Now you have the point the control originated at (StartX and StartY) and the position of the mouse within your movement control (RelativeX, RelativeY); we just need to move the control to a new location. There are a few steps involved in doing this. Firstly, your control needs to have a RenderTransform which is a TranslateTransform. If you don't want to set this in XAML, feel free to set it with this.RenderTransform = new TranslateTransform().
Now we need to set the X and Y coordinates on the RenderTransform so that the control will move to a new location. The following code accomplishes this
private void Header_MouseMove(object sender, MouseEventArgs e)
{
    if (IsMoving)
    {
        // Get the position of the mouse relative to the control's parent
        Point MousePoint = Mouse.GetPosition(this.Parent as IInputElement);
        // Set the distance from the original position
        this.DistanceFromStartX = MousePoint.X - StartX - RelativeX;
        this.DistanceFromStartY = MousePoint.Y - StartY - RelativeY;
        // Set the X and Y coordinates of the RenderTransform to the distance from the original position. This moves the control.
        TranslateTransform MoveTransform = base.RenderTransform as TranslateTransform;
        MoveTransform.X = this.DistanceFromStartX;
        MoveTransform.Y = this.DistanceFromStartY;
    }
}
As you can guess, there is a bit of code left out (variable declarations etc.), but this should be all you need to get started :) Happy coding.
EDIT:
One problem you may encounter is that this allows you to move the control out of the area of its parent control. Here is some quick and dirty code to fix that issue...
if (MousePoint.X + this.Width - RelativeX > (Parent as FrameworkElement).ActualWidth ||
    MousePoint.Y + this.Height - RelativeY > (Parent as FrameworkElement).ActualHeight ||
    MousePoint.X - RelativeX < 0 ||
    MousePoint.Y - RelativeY < 0)
{
    IsMoving = false;
    return;
}
Place this code in your MouseMove event before the actual movement takes place. It checks whether the control is trying to move outside the bounds of its parent. The IsMoving = false line causes the control to exit movement mode, meaning the user will need to click the movement area again to keep moving the control once it has stopped at the boundary. If you want the control to continue moving automatically, just take that line out and the control will jump back onto the cursor as soon as it is back in a legal area.
You can find a lot of inspiration here:
http://www.codeproject.com/KB/WPF/WPFDiagramDesigner_Part1.aspx
http://www.codeproject.com/KB/WPF/WPFDiagramDesigner_Part2.aspx
http://www.codeproject.com/KB/WPF/WPFDiagramDesigner_Part3.aspx
http://www.codeproject.com/KB/WPF/WPFDiagramDesigner_Part4.aspx
I'm rendering a WPF grid with multiple elements (buttons, textbox, ...) to a bitmap which is then used as a texture for a 3D surface in a Direct3D scene. For user interaction I create a 3D ray from the 2D mouse cursor position into the 3D scene finding the intersection point with the gui surface. So I know where the user has clicked on the WPF grid, but from there I'm stuck:
How can I simulate mouse events on the WPF elements while they are not actually shown in an open window but rendered off-screen?
Recently, I was looking into UIAutomation and RaiseEvent but these are used to send events to individual elements, not the whole visual tree. Traversing the tree manually and looking for elements at the cursor position would be an option but I haven't found a way to do this accurately. VisualTreeHelper.HitTest is a good start but instead of finding TextBox it finds TextBoxView and instead of ListBox it finds a Border.
EDIT: returning HitTestResultBehavior.Continue in the result callback of HitTest lets me walk through all elements at a given point. I can now send mouse events to all these elements, but the values of the MouseEventArgs object are those of the real mouse. So I would have to create a custom MouseDevice, which apparently is impossible.
PointHitTestParameters p = new PointHitTestParameters(new Point(
    ((Vector2)hit).X * sourceElement.ActualWidth,
    (1 - ((Vector2)hit).Y) * sourceElement.ActualHeight));

VisualTreeHelper.HitTest(sourceElement,
    new HitTestFilterCallback(delegate(DependencyObject o)
    {
        return HitTestFilterBehavior.Continue;
    }),
    new HitTestResultCallback(delegate(HitTestResult r)
    {
        UIElement el = r.VisualHit as UIElement;
        if (el != null)
        {
            MouseButtonEventArgs e = new MouseButtonEventArgs(Mouse.PrimaryDevice, 0, MouseButton.Left);
            if (leftMouseDown) e.RoutedEvent = Mouse.MouseDownEvent;
            else e.RoutedEvent = Mouse.MouseUpEvent;
            el.RaiseEvent(e);
        }
        return HitTestResultBehavior.Continue;
    }), p);
You might be able to send Windows messages (like WM_MOUSEMOVE) to the WPF window's HWND via the Win32 PostMessage() function. I believe these messages would be read by WPF and handled as if they came from a user.
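For example, a minimal sketch of posting a synthetic left-click (the HWND would come from something like new WindowInteropHelper(window).Handle; the constants are standard Win32 values):
// Requires: using System.Runtime.InteropServices;
const uint WM_LBUTTONDOWN = 0x0201;
const uint WM_LBUTTONUP   = 0x0202;
const int  MK_LBUTTON     = 0x0001;

[DllImport("user32.dll")]
static extern bool PostMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

static void PostClick(IntPtr hwnd, int x, int y)
{
    // x and y are client-area coordinates packed into lParam (low word = x, high word = y).
    IntPtr lParam = (IntPtr)((y << 16) | (x & 0xFFFF));
    PostMessage(hwnd, WM_LBUTTONDOWN, (IntPtr)MK_LBUTTON, lParam);
    PostMessage(hwnd, WM_LBUTTONUP, IntPtr.Zero, lParam);
}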
If you are feeling really brave, you can try my hack to get access to the WPF IDirect3DDevice9Ex. From there you can copy the backbuffer in the swap chain to your D3D scene. If you are using 9Ex (the Vista-era D3D9), you can take advantage of the shared resources feature (sharing surfaces/textures/etc. between devices and processes) to help with performance.
You may still have to do some trickery with windows messages for interactivity.