This addresses T77137 VR: Python API for Controller Interaction.
This patch allows users to create custom OpenXR input / output actions via the Python API. It also adds basic VR controller visualization and an OpenGL selection method for VR (partially fulfilling T77127 and T77128, respectively).
Changes are implemented at the GHOST, WM, and RNA / Python layers.
User-created actions can execute a Python operator, query the state of a button or pose, or send a haptic pulse when triggered during a VR session. Action bindings, specified as OpenXR paths, are entirely user-defined, which lets the API extend to any hardware supported by OpenXR.
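To make the action/binding relationship concrete, here is an illustrative pure-Python data model (not the patch's actual API; all class, field, and operator names here are hypothetical) showing how an action can carry an operator to execute plus user-supplied OpenXR binding paths:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

# Hypothetical sketch of the concept, not the real Blender API:
# an action set groups actions, and each action maps to user-defined
# OpenXR paths so any OpenXR-supported hardware can be targeted.

@dataclass
class XrActionBinding:
    profile: str                 # OpenXR interaction profile path
    component_paths: List[str]   # e.g. "/user/hand/right/input/select/click"

@dataclass
class XrAction:
    name: str
    action_type: str                  # e.g. "FLOAT", "POSE", "VIBRATION"
    operator: Optional[str] = None    # operator to execute when triggered
    bindings: List[XrActionBinding] = field(default_factory=list)

@dataclass
class XrActionSet:
    name: str
    actions: Dict[str, XrAction] = field(default_factory=dict)

    def action_add(self, action: XrAction) -> None:
        self.actions[action.name] = action

# Example: a "teleport" action bound to the right-hand select click of
# the Khronos simple controller profile, executing a hypothetical operator.
action_set = XrActionSet("gameplay")
teleport = XrAction("teleport", "FLOAT", operator="wm.xr_teleport")
teleport.bindings.append(XrActionBinding(
    "/interaction_profiles/khr/simple_controller",
    ["/user/hand/right/input/select/click"],
))
action_set.action_add(teleport)
```

Because bindings are plain OpenXR path strings supplied by the user, no per-device knowledge has to be hard-coded on Blender's side.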
The "VR Scene Inspection" add-on has also been extended to use the new VR action functions. This way, users can create and customize actions via the add-on sidebar if they prefer.
VR controllers are visualized by first reading the state of a user-specified VR pose action. The controller axes are then portrayed via custom drawing in the add-on (for on-screen display) and offscreen rendering within Blender (for in-headset display). This basic visualization can later be replaced with more appropriate controller geometry.
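As a minimal sketch of the geometry involved (plain Python, independent of Blender; function names are illustrative), the axis line segments drawn for a controller can be derived from the pose action's location and rotation quaternion like this:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    r = (x, y, z)
    t = tuple(2.0 * c for c in cross(r, v))
    rt = cross(r, t)
    return tuple(v[i] + w * t[i] + rt[i] for i in range(3))

def controller_axis_lines(location, rotation_quat, length=0.1):
    """Return (start, end) segments for the X/Y/Z axes of a controller pose.

    The pose (location + rotation) would come from querying the
    user-specified pose action each frame; the resulting segments can
    then be drawn both on screen and offscreen for the headset.
    """
    axes = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
    segments = []
    for axis in axes:
        d = quat_rotate(rotation_quat, axis)
        end = tuple(location[i] + length * d[i] for i in range(3))
        segments.append((location, end))
    return segments
```

In Blender itself the drawing step would go through the `gpu` module, but the segment computation above is the core of portraying the controller axes.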
Finally, invoke_3d() and modal_3d() functions were added to the basic select and transform operators, allowing them to be controlled via XR input. Internally, a VR controller's 3D location is first projected to 2D mouse coordinates using the perspective of the user's preferred eye, and the result is passed to the regular invoke() / modal() functions. This approach enabled controller-based OpenGL picking.
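The projection step can be sketched as follows (a self-contained illustration of the idea, not the patch's actual implementation; the function name and matrix convention are assumptions):

```python
def project_to_region_2d(point, persp_matrix, region_width, region_height):
    """Project a 3D point to 2D region (pixel) coordinates.

    Mirrors the idea of projecting a controller's 3D location through
    the view-projection matrix of the user's preferred eye, so the
    result can be fed to a regular invoke()/modal() as simulated mouse
    coordinates. persp_matrix is a row-major 4x4 matrix.
    """
    x, y, z = point
    # Homogeneous transform into clip space.
    clip = [sum(persp_matrix[r][c] * v
                for c, v in enumerate((x, y, z, 1.0)))
            for r in range(4)]
    w = clip[3]
    if w <= 0.0:
        return None  # point is behind the eye; no valid 2D coordinate
    # Perspective divide to normalized device coordinates [-1, 1].
    ndc_x, ndc_y = clip[0] / w, clip[1] / w
    # Map NDC to region pixel coordinates.
    return (region_width * 0.5 * (1.0 + ndc_x),
            region_height * 0.5 * (1.0 + ndc_y))
```

Once the controller location is expressed as 2D region coordinates, the existing 2D picking and transform code paths can run unchanged.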
You can test out the XR actions functionality using the sample file below and the xr-actions-D9124 experimental build. Special thanks to blendFX for creating the sample.