
Fix T85200: Handle WM_KEYDOWN/WM_KEYUP messages for virtual key VK_PACKET as Unicode input
Needs Review · Public

Authored by Sam Hocevar (samhocevar) on Jan 29 2021, 3:44 PM.

Details

Summary

Problem

Right now, most third-party input software that provides custom macros or helps with writing special characters does not work in Blender. This is because Blender ignores the WM_KEYDOWN, WM_KEYUP, and WM_CHAR messages it receives from these applications (sent using e.g. SendInput()), relying instead on raw input inside the WM_INPUT handler.

When an application uses SendInput() with the KEYEVENTF_UNICODE flag, it sets the scan code to the UTF-16 value it wants to send and leaves the virtual key at 0. The receiving application then sees key messages carrying the virtual key VK_PACKET, with the UTF-16 value found in lParam rather than a real scan code. No raw input event is created, which is why these events do not work in Blender.
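As an illustration (not part of the patch), here is roughly what such a sender looks like. The function name `send_unicode_char` is mine, and the non-Windows stand-ins exist only so the sketch stays self-contained; on a real system everything comes from `<windows.h>`:

```cpp
#include <cstdint>

#ifdef _WIN32
#  include <windows.h>
#else
/* Minimal stand-ins so this sketch also compiles off Windows; on a real
 * system these all come from <windows.h>. */
#  define INPUT_KEYBOARD 1
#  define KEYEVENTF_KEYUP 0x0002
#  define KEYEVENTF_UNICODE 0x0004
struct KEYBDINPUT {
  uint16_t wVk;
  uint16_t wScan;
  uint32_t dwFlags;
  uint32_t time;
  uintptr_t dwExtraInfo;
};
struct INPUT {
  uint32_t type;
  KEYBDINPUT ki;
};
static unsigned SendInput(unsigned count, INPUT *inputs, int /*cbSize*/)
{
  (void)inputs;
  return count; /* Pretend every event was injected. */
}
#endif

/* Inject one UTF-16 code unit as a key press plus release, the way tools
 * like WinCompose do. Note that wVk stays 0 while wScan carries the
 * character; the receiver will then see WM_KEYDOWN/WM_KEYUP messages with
 * the virtual key VK_PACKET. Returns the number of events injected. */
int send_unicode_char(uint16_t utf16_unit)
{
  INPUT in[2] = {};
  in[0].type = INPUT_KEYBOARD;
  in[0].ki.wVk = 0;
  in[0].ki.wScan = utf16_unit;
  in[0].ki.dwFlags = KEYEVENTF_UNICODE;
  in[1] = in[0];
  in[1].ki.dwFlags = KEYEVENTF_UNICODE | KEYEVENTF_KEYUP;
  return (int)SendInput(2, in, sizeof(INPUT));
}
```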

Solution

In this patch I suggest handling WM_KEYDOWN and WM_KEYUP messages for the virtual key VK_PACKET. These messages do not originate from a keyboard device; they are generated by third-party input software instead.

Diff Detail

Repository
rB Blender

Event Timeline

Sam Hocevar (samhocevar) requested review of this revision.Jan 29 2021, 3:44 PM
Sam Hocevar (samhocevar) created this revision.
Sam Hocevar (samhocevar) edited the summary of this revision. (Show Details)

I'm fairly certain there should be a way to handle this in WM_INPUT instead of WM_CHAR, but I only have passing familiarity with it. Could you outline why you believe virtually generated unicode input can not be processed by WM_INPUT instead of WM_CHAR?

Could you outline why you believe virtually generated unicode input can not be processed by WM_INPUT instead of WM_CHAR?

Unfortunately, when an application uses SendInput() without a virtual key value to send key events to Blender (or any other application), no WM_INPUT message is synthesised by the system, so Blender will not see any such message.

I cannot see a way around it that only relies on WM_INPUT. The Microsoft documentation for SendInput() and Simulated Keyboard Events explicitly states that when sending KEYEVENTF_UNICODE events, the virtual key must be 0, and the scan code must be the UTF-16 character value. So no physical information about the key can be provided.

Right now the only way Blender can receive non-ASCII characters through WM_INPUT is for characters belonging to the current keyboard mapping.

Fixed a typo that broke surrogate pair handling.
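(For context, the surrogate pair handling mentioned here is the recombination of the two KEYEVENTF_UNICODE events that characters outside the Basic Multilingual Plane require. The helper below is my own sketch of that arithmetic, not code from the patch.)

```cpp
#include <cstdint>

/* Combine a UTF-16 surrogate pair into a single Unicode code point.
 * High surrogates occupy 0xD800-0xDBFF, low surrogates 0xDC00-0xDFFF. */
uint32_t combine_surrogates(uint16_t high, uint16_t low)
{
  return 0x10000u + ((uint32_t(high - 0xD800u) << 10) | uint32_t(low - 0xDC00u));
}
```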

Sorry, I realise I merely paraphrased my original report and did not really clarify things.

  • Tools like AutoHotKey or WinCompose generate keyboard events using the SendInput() API to send characters to applications
  • When used in standard mode (characters restricted to the current keyboard layout), SendInput() triggers both WM_CHAR messages and WM_INPUT messages
  • When used in Unicode mode (using KEYEVENTF_UNICODE), SendInput() triggers WM_CHAR but not WM_INPUT
  • Blender only listens to WM_INPUT and thus cannot receive Unicode characters in the second scenario

@Sam Hocevar (samhocevar) thank you for clarifying. :)

Adding some more context as I research the issue. The RawInput keyboard struct has a field for the virtual key (the Windows abstraction of a key) and a field for the scan code (an identifier standardized per keyboard layout type like QWERTY or Dvorak; edit: I'm less certain this is standardized, it may be OEM-specific). SendInput() co-opts this struct: for Unicode it recycles the scan code member and assigns the virtual key a sentinel value of 0 to indicate it is sending Unicode. Absent a virtual key, WM_INPUT does not receive the input as a key event; instead it is routed through WM_KEYDOWN and WM_KEYUP messages, which, when processed by TranslateMessage(), generate a WM_CHAR message.

The reason for using Unicode is that it maps naturally to text-based character input, whereas the virtual key and scan codes seem, at a glance, to be limited to the range of input expected from the system keyboard.

Does the above line up with your understanding?

Yes, that is correct.

While reviewing my patch I realised there are possible issues caused by not sending GHOST_kEventKeyUp events. I will update the patch to reflect this.

Note that there would be another way to handle this: doing the work in WM_KEYDOWN and WM_KEYUP instead.

  • pros: in that message the virtual key is explicitly set to VK_PACKET, which is a more robust (or at least more explicit) way to detect injected Unicode characters.
  • cons: requires calling ToUnicodeEx() to retrieve the character information, meaning some code duplication with GHOST_SystemWin32::processKeyEvent() and/or a need to refactor these functions

I’m willing to do the additional work if you think it’s worth it.

WM_KEYDOWN/UP seems more appropriate to me, especially given VK_PACKET can be used to identify Unicode events.

Sam Hocevar (samhocevar) retitled this revision from Handle WM_CHAR messages with scancode 0 as Unicode input to Handle WM_KEYDOWN/WM_KEYUP messages for virtual key VK_PACKET as Unicode input.

I reworked the patch to handle VK_PACKET as discussed. The diff viewer is a bit confused because processKeyEvent() now has one level of indentation less, but the patch is actually simpler.

  • The processKeyEvent() function is more generic.
  • WM_KEYDOWN and WM_KEYUP are handled for VK_PACKET by calling the new generic processKeyEvent().
  • A new processRawKeyEvent() function now takes care of calling processKeyEvent() with the proper arguments for raw input. That function is small and could be merged into s_wndProc(), but I decided against it because s_wndProc() is already quite complex.
  • Changed magic number 2 in MapVirtualKey() call to its API name MAPVK_VK_TO_CHAR.
  • Minor code documentation fixes.
ghost/intern/GHOST_SystemWin32.cpp
1262

No longer ignored, needs explanation.

1276–1277

Should these be unsigned shorts?

1601–1612

These should be moved out of the "Keyboard events, ignored" commented section.

1609

Could you link me to the MSDN page which explains this shift? From https://docs.microsoft.com/en-us/windows/win32/inputdev/wm-keydown it looks like only bits 16-23 are valid for the scan code.
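(The bit layout in question can be spelled out with a small helper; the names below are mine, not Blender's.)

```cpp
#include <cstdint>

/* Per the WM_KEYDOWN documentation, bits 16-23 of lParam hold the scan
 * code, and bit 24 flags an extended key. */
uint8_t scan_code_from_lparam(uint32_t lparam)
{
  return uint8_t((lparam >> 16) & 0xFFu);
}

bool is_extended_key(uint32_t lparam)
{
  return ((lparam >> 24) & 1u) != 0;
}
```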

1618

Find and replace error?

Sam Hocevar (samhocevar) marked 4 inline comments as done.

Made all suggested edits. Here is an example of Unicode input using WinCompose:

I'm not familiar with this code, but in general this seems like a safe change as it only affects "VK_PACKET virtual key" (and this event "will not be received by the WM_INPUT handler").
So I'm taking the risk and accepting.

But I added the Windows support team as reviewers in case anyone else wants to check it out.

ghost/intern/GHOST_SystemWin32.cpp
1263

For me that description needs a little more clarification.

  • Where are these messages ignored?
  • What are hardware keys? Are they the physical keys on a keyboard? So what are the other types of keys that are not "ignored"?

(I'm not sure who maintains this area, so whoever is going to revise this patch may have to study it first).

This revision is now accepted and ready to land.Feb 2 2021, 3:37 PM
This revision now requires review to proceed.Feb 2 2021, 4:46 PM

@Sam Hocevar (samhocevar) thanks for looking into my comments. I'd still like answers to the questions so that I understand the reasoning behind the choices made. I could figure a lot of it out by tracing through the code, but you are already familiar with it, so it would save me time. I've added context to the relevant questions.

ghost/intern/GHOST_SystemWin32.cpp
1262

No longer ignored, needs explanation.

1276–1277

To clarify this question: I'd appreciate if you could review when types are being converted to types of different length or signed/unsigned and either explain why it was intentional (in inline comments in this issue), or rewrite them to be consistent and explain what was changed and for what reasons.

Right now there are a few implicit conversions between differently sized types which may be perfectly fine, but reviewing each takes time, and I suspect the type conversions are unnecessary and would become consistent once you go through them.

Address comments from reviewers:

  • clarify comment about why VK_PACKET events not being ignored does not affect us
  • get rid of unneeded implicit integer type conversions
Sam Hocevar (samhocevar) marked 3 inline comments as done.Feb 5 2021, 9:42 PM
Sam Hocevar (samhocevar) added inline comments.
ghost/intern/GHOST_SystemWin32.cpp
1263

I removed the mention of "hardware keys". Do you think the new wording in the comment clarifies what happens, or does it need more explanation? I could add more details about where the messages are ignored (in the large switch/case around line 1600), but that information was already missing before my patch.

I believe a seasoned Windows programmer will immediately know where to look when reading about a WM_* message being ignored.

1276–1277

You are right, going from unsigned short to int and back to unsigned short is unnecessary. I overlooked those.

1609

You’re right, this isn’t good. I got confused by GTK+ which does the same, I guess they’re looking for trouble, too. Fixed.

1618

I just thought the sentence wasn’t very clear here. MSDN says both “The WM_CHAR message contains the character code of the key” and “wParam — The character code of the key” and I thought stating the first part without clarifying with the second part was omitting valuable information.

Sam Hocevar (samhocevar) edited the summary of this revision. (Show Details)Feb 5 2021, 9:45 PM
ghost/intern/GHOST_SystemWin32.cpp
1312

Just to be sure, MapVirtualKeyW does not return the expected value if vk is VK_PACKET?

Sam Hocevar (samhocevar) marked 3 inline comments as done.Feb 6 2021, 12:03 AM
Sam Hocevar (samhocevar) added inline comments.
ghost/intern/GHOST_SystemWin32.cpp
1312

Indeed, it will return 0. That function relies on the current keyboard layout (this is not documented, but it can be confirmed experimentally and is well known), not the current input state (contrary to ToUnicodeEx(), which is used later in the function), so it cannot map VK_PACKET to multiple different Unicode characters.

Nicholas Rishel (nicholas_rishel) retitled this revision from Handle WM_KEYDOWN/WM_KEYUP messages for virtual key VK_PACKET as Unicode input to Fix T85200: Handle WM_KEYDOWN/WM_KEYUP messages for virtual key VK_PACKET as Unicode input.Feb 7 2021, 11:31 PM
Nicholas Rishel (nicholas_rishel) edited the summary of this revision. (Show Details)

For the most part this looks ready to go.

ghost/intern/GHOST_SystemWin32.cpp
1312

@Germano Cavalcante (mano-wii) VK_PACKET indicates the message contains a Unicode character but doesn't embed what that Unicode character is, whereas most (all?) other virtual keys meaningfully map to a single character for a given keyboard layout.

@Sam Hocevar (samhocevar) I'd suggest reordering this to MapVirtualKeyW(vk, MAPVK_VK_TO_CHAR) != 0 || vk == VK_PACKET as the former is the common case.

1611

https://docs.microsoft.com/en-us/windows/win32/inputdev/virtual-key-codes

The VK_PACKET key is the low word of a 32-bit Virtual Key value used for non-keyboard input methods.

Do you have an intuition for what this means? I'm guessing it just means the virtual key is the low word of wParam in keydown/up, which is true for all virtual keys?
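(That reading seems consistent with the message layout: the low word of wParam holds the virtual key for every key message, VK_PACKET included. A throwaway sketch, names mine:)

```cpp
#include <cstdint>

/* VK_PACKET has the value 0xE7; like any other virtual key, it arrives in
 * the low 16 bits of wParam for WM_KEYDOWN/WM_KEYUP. */
constexpr uint16_t kVkPacket = 0xE7;

uint16_t vk_from_wparam(uint64_t wparam)
{
  return uint16_t(wparam & 0xFFFFu);
}
```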

@Sam Hocevar (samhocevar) just checking that you got the last questions. I was ready to merge this after clarifying the last details.