Two Finger Click Event using Windows 7 Multi-Touch - wpf

We would like 2 code samples (C# would be great) of Windows 7 multitouch functionality using .NET 3.5 sp1 and the ManipulationProcessor:
A two finger click sample – An event should be raised when a user "clicks" on a UIElement such as a Rectangle using two fingers simultaneously (close together). The click event should be fired when the "down" events occur, not when the "up" events occur.
A two finger drag sample – A delta event should be fired when a user places two fingers next to each other and drags them up or down the screen. Data needed is "delta amount" – how far the fingers have dragged since the last delta event – as well as "delta direction", to indicate whether the user dragged their fingers up or down the screen. This is similar to the Y-translation delta data already present in ManipulationProcessor.ManipulationDelta, but it should only be triggered when two fingers are present and next to each other throughout the gesture.
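On .NET 4 / WPF 4 the built-in touch events make the first sample straightforward; on .NET 3.5 SP1 you would wire the equivalent down/up notifications from the Windows 7 Multitouch .NET interop library instead. Here is a minimal sketch of the two-finger click, assuming the handlers are attached to the Rectangle's TouchDown/TouchUp events; `TwoFingerClick` and `MaxFingerDistance` are names I made up for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Windows;
using System.Windows.Input;

public partial class MainWindow : Window
{
    // Positions of fingers currently down on the rectangle, keyed by touch device id.
    private readonly Dictionary<int, Point> _activeTouches = new Dictionary<int, Point>();
    private const double MaxFingerDistance = 100; // px; "close together" threshold, tune for your hardware

    public event EventHandler TwoFingerClick;

    private void Rect_TouchDown(object sender, TouchEventArgs e)
    {
        Point pos = e.GetTouchPoint((UIElement)sender).Position;
        _activeTouches[e.TouchDevice.Id] = pos;

        // Fire on the *down* of the second finger, not on up.
        if (_activeTouches.Count == 2)
        {
            var pts = new List<Point>(_activeTouches.Values);
            if ((pts[0] - pts[1]).Length <= MaxFingerDistance && TwoFingerClick != null)
                TwoFingerClick(this, EventArgs.Empty);
        }
    }

    private void Rect_TouchUp(object sender, TouchEventArgs e)
    {
        _activeTouches.Remove(e.TouchDevice.Id);
    }
}
```

The drag sample can follow the same pattern: track both touch points, and only forward a Y delta when exactly two fingers remain within the distance threshold.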

Here's a nice demo on building multi-touch applications, littered with code samples.

Related

How to animate CN1 Slider progress on load

Can you please help me work out how to build a Codename One Slider control that animates its progress when it first renders, so the user sees the progress bar increase over the course of a few seconds?
My actual implementation uses Chen's awesome ArcProgress control to show how far something has grown, so as the control renders, the arc fills to its 70% (or so) level over a few seconds. I have the image above built and working so far.
Many thanks
You just need to invoke setValue to indicate the current position. I'm guessing you don't see the progress moving because you're doing the work on the EDT, thus blocking painting.
All paint operations are performed on the EDT (Event Dispatch Thread), so if your loading/processing code runs on that thread you're effectively blocking the paint operations. There's a long discussion of this in the EDT section of the developer guide.
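A minimal sketch of one way to do this without blocking the EDT: drive the progress from a repeating UITimer, whose callback runs on the EDT in small steps so repaints happen between ticks. This assumes a standard Codename One `Slider`; Chen's ArcProgress control should expose a similar progress setter:

```java
import com.codename1.ui.Form;
import com.codename1.ui.Slider;
import com.codename1.ui.util.UITimer;

public class ProgressAnimator {
    // Animate the slider from 0 to `target` percent over roughly `durationMs`.
    public static void animate(final Slider slider, final int target, int durationMs) {
        final Form form = slider.getComponentForm();
        final int stepMs = Math.max(1, durationMs / Math.max(1, target));
        final int[] current = {0};
        final UITimer[] holder = new UITimer[1];
        holder[0] = new UITimer(new Runnable() {
            public void run() {
                current[0]++;
                slider.setProgress(current[0]); // runs on the EDT; painting proceeds between ticks
                if (current[0] >= target) {
                    holder[0].cancel();
                }
            }
        });
        holder[0].schedule(stepMs, true, form); // repeat until cancelled
    }
}
```

If the value comes from a long-running computation, run that computation on a background thread and push updates to the EDT with `Display.getInstance().callSerially(...)` instead.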

handle Multi touch and mouse events with single event -WPF

I need to provide panning and zooming capabilities for my image in WPF. I have followed the link below and it works fine for touch-screen based systems.
Multi touch with WPF. But it doesn't work with mouse events, such as scrolling the mouse wheel to zoom or rotating the image with the mouse. Below are my questions:
Is there any possibility of handling both mouse and touch input through a single event?
If yes, how?
If not, why not?
There is no such common thing except the click events like MouseLeftButtonDown, MouseDown, etc. You have to implement your own logic for touch- and mouse-based interaction.
That's because they are not the same.
Take a touch-and-drag and a click-and-drag: superficially they're the same.
But a click can be left, right, middle, or a special button, and it can even involve multiple buttons at once; none of that has anything to do with a touch.
Likewise, a touch pinch has no meaning as a mouse scroll-wheel event.
So you need to capture the different events and convert them into a meaningful command that the VM can perform.
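A minimal sketch of that last point: handle the two input models separately, but funnel both into one command. `ZoomAt` is a hypothetical method you implement against your image's transform:

```csharp
using System.Windows;
using System.Windows.Input;

public partial class ImageViewer : System.Windows.Controls.UserControl
{
    private void Image_MouseWheel(object sender, MouseWheelEventArgs e)
    {
        // Wheel delta arrives in multiples of 120; map it to a scale factor.
        double factor = e.Delta > 0 ? 1.1 : 1 / 1.1;
        ZoomAt(factor, e.GetPosition((IInputElement)sender));
    }

    private void Image_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        // Pinch already arrives as a scale factor.
        ZoomAt(e.DeltaManipulation.Scale.X, e.ManipulationOrigin);
        e.Handled = true;
    }

    private void ZoomAt(double factor, Point center)
    {
        // Apply `factor` to a ScaleTransform centered on `center`;
        // the details depend on how your image is transformed.
        // This single method (or an ICommand it invokes) is all the VM sees.
    }
}
```

The design point is that neither handler contains zoom logic; both translate their device-specific data into the same (factor, center) pair.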

Sudden fps drop during multi-touch

I have a WPF application showing a large image (1024x1024) inside a ZoomableCanvas. On top of the image, I have some 400 ellipses. I'm handling touch events by setting IsManipulationEnabled="True" and handling the ManipulationDelta event. This works perfectly when zooming slowly. But as soon as I start zooming quickly, there is a sudden frame-rate drop resulting in a very jumpy zoom. I can hear the CPU fan speeding up when this occurs. Here are some screenshots from the WPF Performance Suite at the time when the frame-rate drops:
Perforator
Visual Profiler
Software renderer kicks in?
I'm not sure how to interpret these values, but I would guess that the graphics driver is overwhelmed by the amount of graphics to render, causing the CPU to take over some of the job. When zooming using touch, the scale changes by small fractions. Maybe this has something to do with it?
So far, I have tried a number of optimization tricks, but none seem to work. The only thing that seems to work is to lower the number of ellipses to around 100. That gives acceptable performance.
Certainly this is a common problem for zoomable applications. What can I do to avoid this sudden frame-rate drop?
UPDATE
I discovered that e.DeltaManipulation.Scale.X is set to around 3.0 in the ManipulationDelta event. Usually it is around 1.01. Why this sudden jump?
UPDATE 2
This problem is definitely linked to multi-touch. As soon as I use more than one finger, there is a huge performance hit. My guess is that the touch events flood the message queue. See this and this report at Microsoft Connect. It seems the sequence Touch event -> Update bound value -> Render yields this performance problem. Obviously, this is a common problem and a solution is nowhere to be found.
WPF gurus out there, can you please show how to write a high performance multi-touch WPF application!
Well, I think you've just reached the limits of WPF. The problem with WPF is that it tessellates vector graphics on the CPU each time they are rendered, probably to reduce video memory usage. So you can imagine what happens when you place 400 ellipses.
If the ellipses are not resized, you could try the BitmapCache option. That way the ellipses will be rendered just once at the beginning and then stored as textures. This will increase memory usage, but should be acceptable, I think.
If your ellipses are resized, the previous technique won't work: each ellipse will be re-rendered when resized, and it will be even slower, as this rewrites the textures (HW IRTs in Perforator).
Another possibility is to design a special control that uses RenderTargetBitmap to render the ellipses to bitmaps and then displays them through an Image control. That way you can control when the ellipses are rendered; you could even render them in parallel threads (don't forget about STA). For example, you could update the ellipse bitmaps only when user interaction ends.
You can read this article about WPF rendering. I don't agree with the author, who compares WPF with iOS and Android (both rely mainly on bitmaps, unlike WPF), but it gives a good explanation of how WPF performs rendering.
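For reference, enabling the BitmapCache option mentioned above is a one-liner per element (or a Style setter in XAML); this sketch assumes the ellipses are created in code:

```csharp
using System.Windows.Media;
using System.Windows.Shapes;

var ellipse = new Ellipse { Width = 20, Height = 20, Fill = Brushes.Red };

// Cache the rendered ellipse as a texture so WPF doesn't re-tessellate
// it on every frame. RenderAtScale > 1 trades memory for sharpness
// when zoomed in; ClearType is irrelevant for shapes.
ellipse.CacheMode = new BitmapCache { RenderAtScale = 2.0, EnableClearType = false };
```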

How does QTP click if no coordinates are specified?

I am working on a WPF/.NET application that has many WPF buttons that I must click. When I attempt to click the button objects, sometimes it works right away; more often it waits several seconds before proceeding, and sometimes it waits up to 3 minutes before my application proceeds.
There is a workaround (or rather, a kludge): specifying object.Click 5,5, which tells QTP to click 5 pixels in from the top-left corner. This works, but why doesn't a regular click work on the program?
Investigating further, QTP says that by default micNoCoordinate is used when no coordinates are specified (this is supposed to click the center of the object).
I have tried micNoCoordinate as well as 5,5 and even 0,0, and while micNoCoordinate is a bit slower than specifying a real coordinate, all of these methods work much, much faster than not specifying a coordinate at all.
What is really going on when QTP "clicks" an object? I don't see a cursor on the button during a plain click event. Why is a plain .Click event so different from specifying a coordinate, or even just using micNoCoordinate?
EDIT: Specifically, I'm quite interested in how the click events themselves differ when using coordinates vs. when not using coordinates. What type of click event is being simulated in each scenario?
By default, if no X/Y coordinates are provided, the object's .Click method will click directly in the center of the object. This is the same as specifying "micNoCoordinate" for both coordinates. You have correctly identified the functionality.
Make sure you have all of the latest patches relating to .NET and WPF (QTPNET_00062 and possibly others).
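For completeness, the variants discussed above look like this in a QTP script; the window and button names are placeholders for your own object repository entries:

```vbscript
Dim btn
Set btn = WpfWindow("MyApp").WpfButton("OK")

btn.Click 5, 5                              ' explicit offset from the top-left corner (the workaround)
btn.Click micNoCoordinate, micNoCoordinate  ' explicit "center of the object"
btn.Click                                   ' no coordinates: defaults to micNoCoordinate, but slower here
```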

Generating Mouse Scroll Event in Linux

I have a question about generating mouse events from a C program on Linux. I have implemented mouse click, drag, etc. using Xlib, but I don't have any idea how to generate a mouse scroll event.
Operating System : Fedora 15
X11 has two mechanisms for reporting scroll events. The old-fashioned way is to treat the scroll wheel as two extra mouse buttons: scroll up is reported as button 4 and scroll down as button 5 (or vice versa, I don't remember). The modern way is to report them via the XInput2 extension, which allows things like horizontal scrolling, smooth scrolling, and suchlike.
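Using the old-fashioned mechanism, you can synthesize a scroll "tick" by faking a press and release of button 4 or 5 through the XTest extension. A sketch (link with `-lX11 -lXtst`; requires a running X server):

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

/* Fake one scroll-wheel tick: button 4 = scroll up, button 5 = scroll down. */
static void scroll(Display *dpy, int up)
{
    unsigned int button = up ? 4 : 5;
    XTestFakeButtonEvent(dpy, button, True, CurrentTime);   /* press   */
    XTestFakeButtonEvent(dpy, button, False, CurrentTime);  /* release */
    XFlush(dpy);
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }
    scroll(dpy, 1);   /* one tick up */
    XCloseDisplay(dpy);
    return 0;
}
```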
