I'm trying to translate the given vkCode and scanCode to actual characters, based on the keyboard. Example of what I'm trying to achieve is translating AltGr+V to '#', Shift+1 to '!' etc., but based on the current keyboard layout. I've managed to get it working briefly, but after an unknown change it's no longer working. Here is what I currently have:
unsigned char btKeys[256] = {0};
GetKeyboardState(btKeys);
HKL keyboardLayout = GetKeyboardLayout(0);
wchar_t szBuffer[2] = {0};
if (ToUnicodeEx(vkCode, scanCode, btKeys, szBuffer, 2, 0, keyboardLayout)) {
    if (iswcntrl(szBuffer[0])) {
        sendControl(szBuffer[0]);
    } else {
        sendCharacter(szBuffer[0]);
    }
}
But for some reason I only get back the character for the unmodified vkCode (for example, pressing Shift+1 yields '1' instead of '!').
How can I get ToUnicodeEx to return the correct value?
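One frequent cause is that GetKeyboardState reflects the keyboard state as of the last message the calling thread retrieved, so inside a low-level hook the modifier bytes are often stale, and ToUnicodeEx then sees Shift/AltGr as not pressed. A minimal sketch of building the key-state array by hand instead (build_key_state is my own helper, not a Windows API; the VK_* fallbacks exist only so the snippet compiles outside <windows.h>):

```c
#include <string.h>

/* Fallback definitions so the helper compiles without <windows.h>;
   on Windows, include <windows.h> and drop these. */
#ifndef VK_SHIFT
#define VK_SHIFT   0x10
#define VK_CONTROL 0x11
#define VK_MENU    0x12
#endif

/* Build the 256-byte key-state array explicitly. GetKeyboardState()
   returns a snapshot tied to the calling thread's message queue, which
   is often out of date inside a low-level keyboard hook; setting the
   modifier bytes yourself sidesteps that. */
static void build_key_state(unsigned char state[256],
                            int shift, int ctrl, int altgr)
{
    memset(state, 0, 256);
    if (shift)         state[VK_SHIFT]   = 0x80; /* high bit = key is down */
    if (ctrl || altgr) state[VK_CONTROL] = 0x80;
    if (altgr)         state[VK_MENU]    = 0x80; /* AltGr is Ctrl+Alt */
}
```

On Windows you would pass this array to ToUnicodeEx in place of the GetKeyboardState snapshot, deriving the shift/ctrl/altgr flags from GetAsyncKeyState or from the hook events you have already observed.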
I'm trying to translate vk constants (possibly joined together) to their names (e.g., 81 (VK_CONTROL|VK_SHIFT|VK_A) = "Control+Shift+A").
I tried using the GetKeyNameText function (both with vk and scancode constants), but it does not seem to work (the string is blank, it returns 0, and GetLastError returns 0 as well).
As far as I know, there is no API function that directly converts the combined virtual keys into text. For non-character keys (VK_CONTROL, VK_SHIFT, etc.), you need to manually concatenate strings.
For character keys, MapVirtualKey works fine for me.
I tested the code you provided:
UINT t = MapVirtualKey('A', MAPVK_VK_TO_CHAR); // the headers define no VK_A macro; letter keys use their ASCII values ('A' == 0x41)
I can get its return value (65).
Filter out Control and Shift with
char keybuff[64] = {0};
if (keys & VK_CONTROL) {
    strcat(keybuff, "Control+");
}
if (keys & VK_SHIFT) {
    strcat(keybuff, "Shift+");
}
keys &= ~(VK_CONTROL | VK_SHIFT); // turn the bits OFF only after both tests: VK_SHIFT (0x10) overlaps VK_CONTROL's bits (0x11)
And use MapVirtualKey to map the VK code to a character
https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-mapvirtualkeya
So in the following case key becomes 65, which is the ASCII value of capital 'A'. Cast it to a char and you have the letter. This works for all codes below ASCII 127.
UINT key = MapVirtualKey('A', MAPVK_VK_TO_CHAR); // 'A' == 0x41; the headers define no VK_A macro
char c = (char)key; // 'A'
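Putting the two pieces together portably: OR-ing VK codes into one value is lossy (VK_SHIFT is 0x10 and VK_CONTROL is 0x11, so their bits overlap), so a safer sketch keeps the modifiers as separate flags. key_name and its parameters are my own invention, not a Windows API; on Windows the character would come from MapVirtualKey(vk, MAPVK_VK_TO_CHAR), which for letter and digit keys simply returns the VK code itself:

```c
#include <string.h>

/* Build "Control+Shift+A"-style text from separate modifier flags plus
   a letter/digit VK code. For VK codes 'A'..'Z' and '0'..'9' the code
   value is already the ASCII character, so a cast suffices here. */
static void key_name(char *out, size_t cap, int ctrl, int shift, unsigned vk)
{
    out[0] = '\0';
    if (ctrl)  strncat(out, "Control+", cap - strlen(out) - 1);
    if (shift) strncat(out, "Shift+",   cap - strlen(out) - 1);
    char ch[2] = { (char)vk, '\0' };
    strncat(out, ch, cap - strlen(out) - 1);
}
```

Passing the modifiers separately also avoids the ordering trap above, where clearing VK_CONTROL's bits first silently destroys the VK_SHIFT bit.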
I advise against using the GetKeyNameText API at all, since it is outdated and broken for non-US layouts. Create and use your own lookup table instead. :)
If you still decide to use it, the proper call of GetKeyNameText is tricky:
// Get the keyboard-layout-specific localized key name
std::string GetKeyNameTextWrapper(uint16_t scanCode)
{
    wchar_t name[128];
    int charCount = 0;
    // GetKeyNameText does not work for these keys
    // due to its use of the broken MAPVK_VK_TO_CHAR under the hood.
    // See https://stackoverflow.com/a/72464584/1795050
    const uint16_t vkCode = LOWORD(::MapVirtualKeyW(scanCode, MAPVK_VSC_TO_VK_EX));
    if ((vkCode >= (uint16_t)'A') && (vkCode <= (uint16_t)'Z'))
    {
        const uint32_t flags = 1 << 2; // do not change the keyboard state of this thread
        static uint8_t state[256] = { 0 };
        // This call can produce multiple UTF-16 code units in the case of
        // ligatures or non-BMP characters that need a high and low surrogate.
        // See examples: https://kbdlayout.info/features/ligatures
        charCount = ::ToUnicode(vkCode, scanCode, state, name, 128, flags);
        // a negative value is returned on a dead-key press
        if (charCount < 0)
            charCount = -charCount;
    }
    else
    {
        const LONG lParam = MAKELONG(0, ((scanCode & 0xff00) != 0 ? KF_EXTENDED : 0) | (scanCode & 0xff));
        charCount = ::GetKeyNameTextW(lParam, name, 128);
    }
    return utf8::narrow(name, charCount); // utf8::narrow is the author's UTF-16-to-UTF-8 helper
}
Also note that GetKeyNameText returns different strings depending on the string data embedded in the keyboard layout DLL, and those layout DLLs may have bugs.
This is MPLAB Harmony code. I am trying to receive data at the client; I want to control the LEDs from the server socket. The text "START" should turn the LEDs on and "STOP" should turn them off. When I debug the code, Appbuffer[80] is all nulls, the last element of ACK[] is '\0', and the same holds for AOK[]. I want to know whether the way I compare the strings in the code below is right, because when I debug, execution skips this line and jumps to the server task init. Please help me.
case APP_TCPIP_WAIT_FOR_RESPONSE:
{
    char Appbuffer[80];
    static const char ACK[] = "START";
    static const char AOK[] = "STOP";
    memset(Appbuffer, 0, sizeof(Appbuffer));
    if (!TCPIP_TCP_IsConnected(appData.clientSocket))
    {
        SYS_CONSOLE_MESSAGE("\r\nConnection Closed\r\n");
        appData.clientState = APP_TCPIP_WAITING_FOR_COMMAND;
        break;
    }
    if (TCPIP_TCP_GetIsReady(appData.clientSocket))
    {
        TCPIP_TCP_ArrayGet(appData.clientSocket, (uint8_t*)Appbuffer, sizeof(Appbuffer) - 1);
        SYS_CONSOLE_PRINT("%s", Appbuffer);
        if (!strcmp(Appbuffer, ACK)) // breakpoint
        {
            BSP_LEDStateSet(BSP_LED_1, BSP_LED_STATE_ON);
            BSP_LEDStateSet(BSP_LED_2, BSP_LED_STATE_ON);
            BSP_LEDStateSet(BSP_LED_3, BSP_LED_STATE_ON);
        }
        else if (!strcmp(Appbuffer, AOK)) // breakpoint
        {
            BSP_LEDStateSet(BSP_LED_1, BSP_LED_STATE_OFF);
            appData.serverState = APP_TCPIP_CLOSING_CONNECTION;
            SYS_CONSOLE_MESSAGE("Connection was closed\r\n");
        }
    }
}
Appbuffer is all null because of the call to memset.
I don't see, in the code you posted, where Appbuffer is assigned a value.
Hence Appbuffer is essentially a zero-length string, and therefore when you compare it with AOK the result is false.
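On the comparison itself: strcmp is correct only if the received bytes match exactly. Many test clients (telnet, netcat, a println on the server side) append "\r\n" to the command, in which case strcmp("START\r\n", "START") fails. A minimal sketch of a tolerant compare (command_matches is my own helper, not part of Harmony):

```c
#include <string.h>
#include <ctype.h>

/* Strip trailing CR/LF/whitespace from the received buffer, then
   compare against the expected command. Without the trim, a sender
   that appends a newline never matches. */
static int command_matches(char *buf, const char *cmd)
{
    size_t len = strlen(buf);
    while (len > 0 && isspace((unsigned char)buf[len - 1]))
        buf[--len] = '\0';
    return strcmp(buf, cmd) == 0;
}
```

It is also worth checking the byte count returned by TCPIP_TCP_ArrayGet, since a short read leaves only part of the command in Appbuffer.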
I wanted to make a program to simulate key presses. I think I am mostly done, but I must have done something wrong, because it is not doing what I expect. I have made a small example program to illustrate the issue. The main problem is that when I want to generate capital letters, it does not work for strings like 'zZ': it generates only the lowercase letters 'zz'. Symbols like '! $ & _ >' etc. work fine (these require Shift on my German keyboard layout), and even multi-byte ones like '💣' do. What I am doing is this:
preamble:
So basically the main problems with emulating key presses are, first, the layout, which changes from user to user, and, most importantly, modifier keys. If you go the naive route, get a keysym with XStringToKeysym(), get a keycode from that keysym with XKeysymToKeycode(), and fire that event, it does not work the way most newcomers (like me) would expect. The problem is that multiple keysyms are mapped to the same keycode. The keysyms for 'a' and 'A' map to the same keycode because they sit on the same physical button on your keyboard, and that button is what the keycode identifies. So if you go the route above, you end up with the same keycode although the keysyms are different. There is usually no way around this, because it is not clear how the 'A' came into existence in the first place: Shift+a, Caps+a, or a fancy keyboard with separate 'a' and 'A' buttons. The other problem is how to emit key presses for buttons that are not even on the keyboard of the person running the application. What key is pressed on an English layout if I want to type an 'Ä' (German umlaut)? This does not work, because XKeysymToKeycode() will not return a proper keycode: there is no keysym mapping for it in that layout.
my approach:
What I am trying to do to circumvent this is to find a keycode that is not being used. You have 255-8 keycodes at your disposal, but a regular keyboard has only ~110 keys, so there is usually some space left. I try to find one of those keycodes that are unmapped in the current layout and use it to assign my own keysyms. I take the keysym name for each char of my string and pass it to XStringToKeysym(), which gives me the appropriate keysym. In the case of '💣', which is not mapped in any keyboard layout I know of, I map it to the unused keycode and press it with XTestFakeKeyEvent(), and I repeat that for every char in the string. This works great with every fancy glyph one can think of, but it does not work with simple letters, and I really don't know why :( In my debugging sessions the keysyms and keycodes seem to be correct; it's just that XTestFakeKeyEvent() does not do the right thing in that case. It's possible that I messed something up in the keymapping part, but I am not really sure what the problem is, and I hope someone has a good idea and can help me find a way to a working solution.
I am just using this Unicode notation in the strings array because I don't want to deal with the conversion in this example. Just assume there is code producing it from an arbitrary input string.
Be aware that the code below can ruin your keymapping in such a way that you are no longer able to type and use your keyboard and have to restart your X server/PC. I hope it does not in its current state (it is working fine here); just be aware of this if you fiddle with the code.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <X11/X.h>
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>
#include <unistd.h>
//gcc -g enigo2.c -lXtst -lX11
int main(int argc, char *argv[])
{
    Display *dpy = XOpenDisplay(NULL);
    if (dpy == NULL) {
        fprintf(stderr, "cannot open display\n");
        return EXIT_FAILURE;
    }
    //my test string already transformed into unicode,
    //ready to be consumed by XStringToKeysym
    const char *strings[] = {
        "U1f4a3", // 💣
        "U007A", // z
        "U005A", // Z
        "U002f", // '/'
        "U005D", // ]
        "U003a", // :
        "U002a", // *
        "U0020", // ' '
        "U0079", // y
        "U0059", // Y
        "U0020", // ' '
        "U0031", // 1
        "U0021", // !
        "U0020", // ' '
        "U0036", // 6
        "U0026", // &
        "U0020", // ' '
        "U0034", // 4
        "U0024", // $
        "U0020", // ' '
        "U002D", // -
        "U005F", // _
        "U0020", // ' '
        "U003C", // <
        "U003E", // >
        "U0063", // c
        "U0043", // C
        "U006f", // o
        "U004f", // O
        "U00e4", // ä
        "U00c4", // Ä
        "U00fc", // ü
        "U00dc", // Ü
    };
    KeySym *keysyms = NULL;
    int keysyms_per_keycode = 0;
    int scratch_keycode = 0; // scratch space for temporary keycode bindings
    int keycode_low, keycode_high;
    //get the range of keycodes, usually 8 - 255
    XDisplayKeycodes(dpy, &keycode_low, &keycode_high);
    //get all the mapped keysyms available
    keysyms = XGetKeyboardMapping(
        dpy,
        keycode_low,
        keycode_high - keycode_low + 1, // the keycode range is inclusive
        &keysyms_per_keycode);
    //find an unused keycode so we can hook our own keysym up to it
    //and then just 'click' that once-unmapped keycode
    for (int i = keycode_low; i <= keycode_high; i++)
    {
        int key_is_empty = 1;
        for (int j = 0; j < keysyms_per_keycode; j++)
        {
            int symindex = (i - keycode_low) * keysyms_per_keycode + j;
            // for debugging: XKeysymToString(keysyms[symindex])
            if (keysyms[symindex] != 0) {
                key_is_empty = 0;
                break; // any mapped slot means this keycode is in use
            }
        }
        if (key_is_empty) {
            scratch_keycode = i;
            break;
        }
    }
    XFree(keysyms);
    XFlush(dpy);
    usleep(200 * 1000);
    int arraysize = sizeof(strings) / sizeof(strings[0]);
    for (int i = 0; i < arraysize; i++)
    {
        //find the keysym for the given unicode char,
        //map that keysym to our previously unmapped keycode,
        //then 'click' that keycode/'button' with our keysym on it
        KeySym sym = XStringToKeysym(strings[i]);
        KeySym keysym_list[] = { sym };
        XChangeKeyboardMapping(dpy, scratch_keycode, 1, keysym_list, 1);
        KeyCode code = scratch_keycode;
        usleep(90 * 1000);
        XTestFakeKeyEvent(dpy, code, True, 0);
        XFlush(dpy);
        usleep(90 * 1000);
        XTestFakeKeyEvent(dpy, code, False, 0);
        XFlush(dpy);
    }
    //revert the scratch keycode
    {
        KeySym keysym_list[] = { 0 };
        XChangeKeyboardMapping(dpy, scratch_keycode, 1, keysym_list, 1);
    }
    usleep(100 * 1000);
    XCloseDisplay(dpy);
    return 0;
}
When you send a single keysym for a given keycode to XChangeKeyboardMapping and it is a letter, it automatically fills correct upper and lower case equivalents for shift and capslock modifiers. That is, after
XChangeKeyboardMapping(dpy, scratch_keycode, 1, &keysym, 1);
the keycode map for scratch_keycode effectively changes (on my machine) to
tolower(keysym), toupper(keysym), tolower(keysym), toupper(keysym), tolower(keysym), toupper(keysym), 0, 0, 0, 0, ...
In order to inhibit this behaviour, send 2 identical keysyms per keycode:
KeySym keysym_list[2] = { sym, sym };
XChangeKeyboardMapping(dpy, scratch_keycode, 2, keysym_list, 1);
This will fill both shifted and unshifted positions with the same keysym.
The program below uses a keypad and an Arduino to test whether an input is equal to the password. Whenever '#' is pressed the input is checked, and then the variable storing the input is reset to an empty string (char input[257] = ""). The program then loops back to the beginning of loop(). The problem: when I reset input to an empty string, it is reset as it should be, but when the program loops, input changes back to what it was before. So if I originally entered "123abc" and hit '#', the program tells me the input was incorrect and resets the variable to an empty string, but on the next loop the variable holds "123abc" again. What's happening? Why doesn't the variable remain an empty string?
#include <Keypad.h>

const byte ROWS = 4; //four rows
const byte COLS = 4; //four columns
char keys[ROWS][COLS] = {
    {'1','2','3','A'},
    {'4','5','6','B'},
    {'7','8','9','C'},
    {'*','0','#','D'}
};
byte rowPins[ROWS] = {5, 4, 3, 2}; //connect to the row pinouts of the keypad
byte colPins[COLS] = {9, 8, 7, 6}; //connect to the column pinouts of the keypad
char password[9] = "3994A", input[257] = "";
Keypad keypad = Keypad(makeKeymap(keys), rowPins, colPins, ROWS, COLS);
int x;

void setup() {
    Serial.begin(9600);
}

void loop() {
    x = 0;
    Serial.println(input);
    while (1)
    {
        char key = keypad.getKey();
        if (key)
        {
            if (key == '#')
            {
                break;
            }
            input[x] = key;
            x += 1;
        }
    }
    if (strcmp(password, input) == 0)
    {
        Serial.println("Access Granted");
    }
    else
    {
        Serial.println("Access Denied");
    }
    char input[257] = "";
    Serial.println(input);
}
Not the same variable. You've got one input in your top block and another in your loop function.
This line
char input[257] = "";
makes a new local variable called input, but you don't want that. You already made a global one here:
char password[9] = "3994A", input[257]="";
Change
char input[257] = "";
to
memset(input, 0, sizeof(input));
so that you don't shadow the global with a short-lived stack variable and instead clear the one you declared earlier.
The second char input[257] = ""; declares a new variable named input. It does not change the contents of the existing variable named input. Use memset() instead.
I agree that the error is re-declaring the input variable:
char input[257] = "";
but I would not fix it with memset(); instead, do
input[0] = '\0';
because this is faster and doesn't require pulling memset() into the code. We are talking about a microcontroller; CPU and memory are precious, and it is a good habit to write fast and light code.
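Both answers come down to the same C scoping rule, which can be demonstrated off-device in plain C (a standalone sketch, not Arduino code; end_of_loop stands in for the tail of the original loop()):

```c
#include <string.h>

char input[257] = "123abc"; /* stands in for the sketch's global */

/* Mimics the end of the Arduino loop(): this declares a brand-new
   local array that shadows the global, so the global keeps its old
   contents. That is exactly the bug observed. */
void end_of_loop(void)
{
    char input[257] = "";
    (void)input; /* the local dies here; the global was never touched */
}
```

After calling end_of_loop(), the global still holds "123abc". Replacing the local declaration with input[0] = '\0' (or memset(input, 0, sizeof(input))) clears the real global; for strcmp either is sufficient, since the comparison stops at the first NUL.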
Here is how you could achieve this:
Go to the official Arduino website and download Keypad.h from http://playground.arduino.cc/Code/Keypad#Download (Ctrl+F and type "download"). Put the library in C:\Program Files (x86)\Arduino\libraries, close your current Arduino window, then re-open it. Then rewrite your code, putting this line first:
#include <Keypad.h>
I have written this app, which reads input from the console.
for (;;)
{
    GetNumberOfConsoleInputEvents(stdinInput, &numEvents);
    if (numEvents != 0) {
        INPUT_RECORD eventBuffer;
        ReadConsoleInput(stdinInput, &eventBuffer, 1, &numEventsRead);
        if (eventBuffer.EventType == KEY_EVENT) {
            if (eventBuffer.Event.KeyEvent.bKeyDown)
            {
                printf("%c", eventBuffer.Event.KeyEvent.uChar.AsciiChar);
                dataBuffer[bufferLen++] = eventBuffer.Event.KeyEvent.uChar.AsciiChar;
                dataBuffer[bufferLen] = '\0';
                if (dataBuffer[bufferLen] == 99 || eventBuffer.Event.KeyEvent.uChar.AsciiChar == '\r') {
                    printf("User Wrote: %s\n", dataBuffer);
                    memset(dataBuffer, 0, sizeof(dataBuffer));
                    bufferLen = 0;
                }
            }
        }
    }
}
It puts the data in a buffer and then prints the buffer out. The problem occurs when I use Shift or Caps Lock to write capital letters or characters like ! @ # $ %. Then it prints out nothing.
I've tried something with the VK_LSHIFT code, but it didn't work.
Also, if I try to write something in a language other than English, it prints out garbage like 'ββββββββββββ'; it cannot recognize the other language.
Can someone give me a hint on how to fix those problems ?
Thanks!
ReadConsoleInput returns events for each keystroke. For example, if you type SHIFT+A to get a capital A then you'll receive four key events: SHIFT down, A down, A up, SHIFT up.
The SHIFT key does not have a corresponding ASCII code so eventBuffer.Event.KeyEvent.uChar.AsciiChar is set to zero. This zero terminates the string you are building in dataBuffer so you don't see anything typed after the SHIFT key.
The simplest fix is to ignore any key event with an ASCII code of zero.
Additionally, if you want this to work well with foreign languages you might do better to use ReadConsoleInputW and eventBuffer.Event.KeyEvent.uChar.UnicodeChar. Better yet, compile it all as a Unicode app.
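A portable sketch of that filtering rule (append_key is my own helper; on Windows the character would come from eventBuffer.Event.KeyEvent.uChar.AsciiChar on key-down events):

```c
#include <stddef.h>

/* Append a key's character to the buffer only when it is non-zero.
   Modifier keys (SHIFT, CTRL, ...) report a character code of 0, and
   appending that zero terminates the string early, which is why
   nothing typed after SHIFT showed up. */
static void append_key(char *buf, size_t *len, char ch)
{
    if (ch == '\0')
        return;              /* ignore modifier-only key events */
    buf[(*len)++] = ch;
    buf[*len] = '\0';
}
```

For example, typing SHIFT+A then 'b' produces the down-event characters {0, 'A', 0, 'b'}; filtering the zeros yields the buffer "Ab".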