How to make a Windows touchpad act like a Mac touchpad? - c

I am trying to write something for Windows using the WinAPI so I can make the touchpad do whatever the Mac touchpad does.
Using Spy++ I checked which WM messages two-finger taps and similar gestures send to the OS, but it turns out they send, give or take, only these:
WM_LBUTTONDOWN
WM_LBUTTONUP
WM_MOUSEHOVER
WM_MOUSEHWHEEL
WM_MOUSELEAVE
WM_MOUSEMOVE
WM_RBUTTONDOWN
WM_RBUTTONUP
When I tried to see what happens when clicking with two or three fingers, it didn't send any particular message unless I moved them a bit.
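For reference, the same stream of messages can be watched without Spy++ by installing a low-level mouse hook; a minimal sketch (no error handling), which also shows the core problem, namely that the hook only ever sees ordinary single-pointer messages with no finger count:

    /* Sketch: a system-wide low-level mouse hook that logs the messages listed above. */
    #include <windows.h>
    #include <stdio.h>

    static LRESULT CALLBACK LowLevelMouseProc(int nCode, WPARAM wParam, LPARAM lParam)
    {
        if (nCode == HC_ACTION) {
            const MSLLHOOKSTRUCT *m = (const MSLLHOOKSTRUCT *)lParam;
            /* A multi-finger tap arrives here, if at all, only as one of the
             * ordinary button/move messages; there is no finger count. */
            printf("msg=0x%04X x=%ld y=%ld\n", (unsigned)wParam, m->pt.x, m->pt.y);
        }
        return CallNextHookEx(NULL, nCode, wParam, lParam);
    }

    int main(void)
    {
        HHOOK hook = SetWindowsHookEx(WH_MOUSE_LL, LowLevelMouseProc,
                                      GetModuleHandle(NULL), 0);
        if (!hook) return 1;

        MSG msg;                                  /* the hook needs a message loop */
        while (GetMessage(&msg, NULL, 0, 0) > 0) {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        UnhookWindowsHookEx(hook);
        return 0;
    }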
Firstly, I would like to start with this:
when five fingers go down, show the desktop (as Win+D does).
How do I write something (a driver?) that can detect five fingers touching the touchpad simultaneously?
Of course there are no OS messages for this, but perhaps I can detect it through some unique combination of the existing messages.
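Assuming such a combination can somehow be made to work, the reaction itself is straightforward: the show-desktop action can be injected as a Win+D key chord with SendInput. A minimal sketch, with the gesture detection left out:

    /* Sketch: inject Win+D ("show desktop") once the gesture has been detected. */
    #include <windows.h>

    static void show_desktop(void)
    {
        INPUT in[4] = {0};
        in[0].type = INPUT_KEYBOARD; in[0].ki.wVk = VK_LWIN;    /* Win down */
        in[1].type = INPUT_KEYBOARD; in[1].ki.wVk = 'D';        /* D down   */
        in[2].type = INPUT_KEYBOARD; in[2].ki.wVk = 'D';
        in[2].ki.dwFlags = KEYEVENTF_KEYUP;                     /* D up     */
        in[3].type = INPUT_KEYBOARD; in[3].ki.wVk = VK_LWIN;
        in[3].ki.dwFlags = KEYEVENTF_KEYUP;                     /* Win up   */
        SendInput(4, in, sizeof(INPUT));
    }

    int main(void) { show_desktop(); return 0; }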
If I need to write a driver, can I make it generic enough to cover most touchpads, and can I write it as an add-on?
If you know a good tutorial on writing a Windows driver, please post it, because I have no clue about driver development.
Is there anything else I need to take into account besides:
1. Detecting five-finger mouse events.
2. Creating a thread in Explorer on startup that handles these new mouse messages.
Thanks in advance.
Mouse Input Notifications

In short, you can't.
First, some touchpads can physically detect only a single finger, and for those that can detect more, their drivers do the translation for you.
Windows does not have any inherent support for reading multiple touch inputs - it relies on the touchpad drivers to provide them.
You can achieve your goal for SOME devices by writing your own touchpad driver (probably starting from the Linux touchpad drivers and the Windows Driver Kit), but this is far from simple.
And you'll need to do this for each and every touchpad device you want to support (Synaptics, Alps Electric, and Cirque, to name a few).
Only after that can you move on to implementing the reaction to those touchpad actions in applications like Explorer.

Related

Enable USB Debugging and remote control (scrcpy) with broken screen

Happy holidays! Apologies if this question is more appropriate for Super User; I am asking here in case there is a programmatic solution to my problem.
In summary: on an Android device with a broken screen, assuming that USB debugging is not enabled or that the connected computer is not trusted, is there any way to perform remote control?
So I have a Pixel 3a which I used to control just fine via scrcpy 1.21. Today the screen broke to the point that I only see a few glitchy lines, even though it looks like I can theoretically control it (based on the sounds, I managed to enter the log-in pattern).
The problem is that adb devices does not list it, even though the system (Ubuntu or Windows) recognizes that a Pixel 3a was connected for charging.
Is there any way I can salvage this situation?
ADB commands failed. scrcpy failed.

How to disable TOUCH when remoting in using VNC?

My solution has two machines: one is customer-facing and the other is used by store personnel. In some situations, when store personnel want to take control of the customer-facing system, we use UltraVNC to remote into it, push our application to a second virtual display, and put a PLEASE WAIT screen in front of the customer.
With Windows 10 the virtual display approach is no longer possible, so our remote view via UltraVNC lands on the primary display. This means the customer can SEE what the store personnel are doing and can also interact with or interfere with it, and this is my struggle today.
We found that with UltraVNC we can "disable user input", which works for keyboard and mouse but does not block TOUCH input (we use a touchscreen), and there seems to be no way to put up a PLEASE WAIT screen; we can only disable the screen entirely (it goes black, which has customers asking if the program crashed).
So I am opening this up to see whether anyone has experience with UltraVNC or has a completely different proposal for me to consider. I am open to all suggestions!
You rely on a Windows solution; on Linux there is the D-Bus messaging system, and I don't know whether there is a similar messaging system on Windows.
You could have a PowerShell script that disables the display or the touchscreen input and sends a toast message to the user.
https://gist.github.com/atao/12b5857d388068bab134d5f9c36241b1
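One possible variant of that idea in C, since the gist above only covers the toast part: disable the touch digitizer device itself, roughly the way "devcon disable" does, then re-enable it with DICS_ENABLE when the session ends. This is a rough SetupAPI sketch; the "HID_DEVICE_UP:000D" filter (the HID digitizer usage page) is an assumption, so check the real hardware/compatible IDs in Device Manager, and it must run elevated.

    /* Sketch: toggle a PnP device's enabled state by matching its compatible IDs.
     * Link with setupapi.lib. */
    #include <windows.h>
    #include <setupapi.h>
    #include <stdio.h>
    #include <string.h>

    static BOOL change_device_state(const char *id_substring, DWORD state)
    {
        HDEVINFO devs = SetupDiGetClassDevsA(NULL, NULL, NULL,
                                             DIGCF_ALLCLASSES | DIGCF_PRESENT);
        if (devs == INVALID_HANDLE_VALUE) return FALSE;

        BOOL changed = FALSE;
        SP_DEVINFO_DATA info = { sizeof(SP_DEVINFO_DATA) };
        for (DWORD i = 0; SetupDiEnumDeviceInfo(devs, i, &info); ++i) {
            char ids[4096] = {0};
            if (!SetupDiGetDeviceRegistryPropertyA(devs, &info, SPDRP_COMPATIBLEIDS,
                                                   NULL, (BYTE *)ids, sizeof(ids) - 2, NULL))
                continue;

            BOOL match = FALSE;                       /* walk the REG_MULTI_SZ list */
            for (const char *p = ids; *p && !match; p += strlen(p) + 1)
                match = strstr(p, id_substring) != NULL;
            if (!match) continue;

            SP_PROPCHANGE_PARAMS pc = {0};
            pc.ClassInstallHeader.cbSize = sizeof(SP_CLASSINSTALL_HEADER);
            pc.ClassInstallHeader.InstallFunction = DIF_PROPERTYCHANGE;
            pc.StateChange = state;                   /* DICS_DISABLE or DICS_ENABLE */
            pc.Scope = DICS_FLAG_GLOBAL;
            pc.HwProfile = 0;

            if (SetupDiSetClassInstallParamsA(devs, &info, &pc.ClassInstallHeader,
                                              sizeof(pc)) &&
                SetupDiCallClassInstaller(DIF_PROPERTYCHANGE, devs, &info))
                changed = TRUE;
        }
        SetupDiDestroyDeviceInfoList(devs);
        return changed;
    }

    int main(void)
    {
        if (change_device_state("HID_DEVICE_UP:000D", DICS_DISABLE))
            puts("touch digitizer disabled");
        else
            puts("no matching device changed (wrong ID filter, or not elevated?)");
        return 0;
    }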

What is required to get an overlay window using the X11 protocol with no compositor running?

Using the Lisp implementation of the X11 protocol, get-overlay-window freezes when no compositor is running. If I kill the Lisp process, the XID is printed out.
This also freezes my Lisp window manager running in another Lisp thread in the same process. Basically X acts as if it has been grabbed, so thank goodness for Ctrl+Alt+F1.
Some previous questions about Composite show others running into similar problems when no compositor is running.
I'm guessing that maybe the server is waiting for some sort of out-of-protocol authorization, or that some particular sequence of events has to be completed?
Having access to the overlay window when another compositor is active isn't helpful for writing a compositor!
Apparently I had a reading-comprehension failure with the protocol description, or its authors had a writing failure.
Asking Composite to redirect windows automatically ensures that the windows' contents get drawn. It does not ensure they get drawn to the overlay! Nor does the overlay appear to be transparent. So even with all windows set to update automatically, when the overlay window gets mapped by the call that fetches its XID, it blocks you from seeing any further updates to the screen and blocks all input.
That makes the overlay, in a sense, not very useful; or rather, it makes automatic updates for redirected windows not very useful. Either way, it seems I will have to paint every single pixel, even for the windows I'm not interested in.
Maybe it's just a driver thing?
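For comparison, compositors that do use the overlay typically punch out its input region with the XFixes and Shape extensions immediately after fetching it, so it stops swallowing pointer and keyboard events; whether the Lisp bindings expose the corresponding XFixes requests is an assumption here, but the sequence in C (Xlib) looks roughly like this:

    /* Sketch: take the Composite overlay and immediately make it click-through.
     * Build: cc overlay.c -lX11 -lXcomposite -lXfixes */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xcomposite.h>
    #include <X11/extensions/Xfixes.h>
    #include <X11/extensions/shape.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }
        Window root = DefaultRootWindow(dpy);

        /* Redirect children manually: we are responsible for painting them. */
        XCompositeRedirectSubwindows(dpy, root, CompositeRedirectManual);

        /* The overlay is mapped above everything as soon as it is requested... */
        Window overlay = XCompositeGetOverlayWindow(dpy, root);

        /* ...so punch out its input region right away; events then pass through
         * to the windows underneath instead of freezing the session. */
        XserverRegion empty = XFixesCreateRegion(dpy, NULL, 0);
        XFixesSetWindowShapeRegion(dpy, overlay, ShapeBounding, 0, 0, 0);
        XFixesSetWindowShapeRegion(dpy, overlay, ShapeInput, 0, 0, empty);
        XFixesDestroyRegion(dpy, empty);
        XFlush(dpy);

        printf("overlay xid: 0x%lx\n", (unsigned long)overlay);
        /* ...composite the redirected windows onto the overlay here... */

        XCompositeReleaseOverlayWindow(dpy, overlay);
        XCloseDisplay(dpy);
        return 0;
    }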

Windows TCP/UDP mouse driver

I am working on creating a touchpad device (custom hardware, but similar to an Android device) that acts as a touchscreen drawing pad, similar to the Wacom Bamboo drawing pads. The key feature of the device, however, is that instead of connecting to the computer with wires or via Bluetooth, it connects to the local WiFi network and searches for devices with a given port open (currently 5000 for testing purposes). Currently, I have a client written in C that, when launched, opens a UDP socket on port 5000 and waits for a custom UDP packet containing normalized X, Y, and pressure values. Then, for testing purposes, I am feeding the normalized X and Y into SendInput. SendInput "works", but injecting events into the computer's current mouse is not what I want. Instead, I want the device to be treated as a separate input device, so programs like GIMP will be able to detect it and assign custom functions based on the data (i.e. have GIMP use the pressure data).
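For reference, the testing path described above might look roughly like the sketch below; the packet layout (three floats), the port, and the 0..65535 scaling are assumptions, not the actual protocol:

    /* Sketch: listen on UDP port 5000 for { x, y, pressure } floats normalized
     * to 0..1 and inject an absolute mouse move with SendInput.
     * Link with ws2_32.lib. */
    #include <winsock2.h>
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        WSADATA wsa;
        if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;

        SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
        struct sockaddr_in addr = {0};
        addr.sin_family      = AF_INET;
        addr.sin_addr.s_addr = INADDR_ANY;
        addr.sin_port        = htons(5000);
        if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) == SOCKET_ERROR) return 1;

        for (;;) {
            float pkt[3];
            int n = recvfrom(s, (char *)pkt, sizeof(pkt), 0, NULL, NULL);
            if (n != (int)sizeof(pkt)) continue;

            INPUT in = {0};
            in.type = INPUT_MOUSE;
            in.mi.dx = (LONG)(pkt[0] * 65535.0f);   /* ABSOLUTE uses 0..65535 */
            in.mi.dy = (LONG)(pkt[1] * 65535.0f);
            in.mi.dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE;
            SendInput(1, &in, sizeof(INPUT));
            /* pkt[2] (pressure) is simply dropped here, which is exactly the
             * limitation that motivates a real HID driver. */
        }
    }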
The problem is I don't know where to start in creating a driver that does the latter. I have been looking extensively at the WinDDK, thinking that might be the key, but I cannot find any documentation in it on creating a HID driver from data that does not come from PS/2 or USB. This tutorial got me thinking about using IOCTLs, but I am not really sure how to make that data be treated as input.
As a side note, I said TCP/UDP in the title because I am willing to switch from a UDP connection to TCP, and am considering it for security reasons.
If someone could push me in the right direction or link me to some related documentation and samples, that would be awesome, because right now I am lost. Thank you.

Detecting a dropped call on a mobile device

I'm using a Motorola device and developing for it with J2ME. I'm looking for a way to detect when an incoming or outgoing call is dropped.
I mean, when the call is dropped, I need to recognize that event.
Thanks
There is no standard J2ME telephony API.
There could be a working proprietary Java-based API on that particular handset, but that's both unlikely and not easy to verify.
You could use the application life-cycle to detect interruptions. The device can detect when a phone call comes in and ends, and will trigger showNotify(), hideNotify(), startApp(), and pauseApp(); you can then react accordingly (see the Canvas class). So if you have an application running, you can detect an incoming phone call, wait for it to end, and then do something. This is of course very device/manufacturer specific, and you are in a world of hurt when it comes to porting it to many devices. I'm not sure you can do something similar for outgoing calls, since your app will be in the background and paused on most devices.
You could try checking the Motorola developer webpage. Motorola has its own set of libraries for J2ME; they may support the case you need.
