This addresses T77137 VR: Python API for Controller Interaction.
This patch allows users to create custom OpenXR input / output actions via the Python API. It also adds basic VR controller visualization and an OpenGL selection method for VR (partially fulfilling T77127 and T77128, respectively).
Changes are implemented at the GHOST, WM, and RNA / Python layers.
User-created actions can be used to execute a Python operator, retrieve state from a button or pose, or send a haptic pulse when performed during a VR session. Bindings for actions, which use [[ https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#semantic-path-interaction-profiles | OpenXR paths ]], are entirely defined by the user, which allows the API to extend to any hardware supported by OpenXR.
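To make the binding model concrete, here is a minimal sketch in plain Python. The `XrAction` / `ActionBinding` names are hypothetical stand-ins for the patch's RNA types (not the actual API); the sketch only illustrates how a user-defined action carries OpenXR interaction-profile and component paths:

```python
# Illustrative data model only; XrAction / ActionBinding are hypothetical
# names, not the patch's real RNA types. The point is that bindings are
# plain OpenXR paths, so any OpenXR-supported hardware can be targeted.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActionBinding:
    profile: str                 # OpenXR interaction profile path
    component_paths: List[str]   # input / output component paths

@dataclass
class XrAction:
    name: str
    action_type: str             # e.g. "FLOAT" (button), "POSE", "VIBRATION"
    bindings: List[ActionBinding] = field(default_factory=list)

# One "teleport" button action, bound for two different controllers.
teleport = XrAction(name="teleport", action_type="FLOAT")
teleport.bindings.append(ActionBinding(
    profile="/interaction_profiles/oculus/touch_controller",
    component_paths=["/user/hand/right/input/a/click"],
))
teleport.bindings.append(ActionBinding(
    profile="/interaction_profiles/htc/vive_controller",
    component_paths=["/user/hand/right/input/trigger/click"],
))
```

Because the paths are ordinary strings, supporting a new device is just a matter of appending another binding with that device's interaction profile.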
The "VR Scene Inspection" add-on has also been extended to use the new VR action functions. This way, users can create and customize actions via the add-on sidebar or preferences if they prefer.
{F8963077}
{F8961078}
{F8961079}
To visualize VR controllers, the state of a user-specified VR pose action is first read. The controller axes are then portrayed via custom drawing in the add-on (for on-screen display) and offscreen rendering from within Blender (for in-headset display). This basic visualization can later be replaced with more appropriate controller geometry.
{F8961129}
{F8961139}
{F8961140}
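The geometry behind the axis visualization can be sketched in plain Python (no `bpy`). This is not the patch's drawing code; the function names are hypothetical, and the sketch only shows how axis line endpoints could be derived from a pose action's state, i.e. a location plus a rotation quaternion:

```python
# Illustrative sketch: derive the three short axis lines drawn at a
# controller from its pose (location + unit rotation quaternion).
# Names are hypothetical; the patch draws these via the add-on's draw
# handler (on screen) and Blender's offscreen rendering (in headset).

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    t = tuple(2.0 * c for c in _cross((x, y, z), v))
    u = _cross((x, y, z), t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

def controller_axes(location, rotation, length=0.1):
    """Endpoints of the X/Y/Z axis lines anchored at the controller."""
    return [tuple(location[i] + length * quat_rotate(rotation, axis)[i]
                  for i in range(3))
            for axis in ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))]
```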
Custom actions can be saved to the VR Scene Inspection add-on's preferences and easily transferred to different blend files.

{F9126706}

{F9126707}

Finally, invoke_3d() and modal_3d() functions were added to the basic select and transform operators, allowing them to be controlled via XR inputs. Implementation-wise, a VR controller's 3D location is first projected to 2D mouse coordinates using the perspective of the user's preferred eye, and the result is then passed to the regular invoke() / modal() functions. Using this approach, controller-based OpenGL picking was implemented. Operator properties for button-based actions are stored in an "XR" keymap added to the Blender default keyconfig and connected to VR actions by means of action set / action identifiers added to XR-type keymap items.

You can test out the XR actions functionality using the sample file below and the xr-actions-D9124 [[ https://builder.blender.org/download/branches/ | experimental build ]]. Special thanks to blendFX for creating the sample.
{F9890711}
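The 3D-to-2D projection step described above can be sketched in plain Python. This is a rough illustration, not the patch's actual code: the function name and the row-major 4x4 view-projection matrix layout are assumptions. It projects the controller's world-space location through the preferred eye's matrix and maps normalized device coordinates to region pixels, which can then be fed to a regular invoke() / modal():

```python
# Rough sketch (assumed names and matrix layout, not the patch's code):
# project a controller's 3D location to 2D region coordinates using the
# view-projection matrix of the user's preferred eye.

def project_to_region(co, view_proj, region_width, region_height):
    """Return (x, y) pixel coordinates, or None if behind the viewer."""
    co4 = (co[0], co[1], co[2], 1.0)
    x, y, z, w = (sum(view_proj[r][c] * co4[c] for c in range(4))
                  for r in range(4))
    if w <= 0.0:
        return None  # point is behind the eye; no valid 2D position
    # Perspective divide, then map NDC [-1, 1] to pixel coordinates.
    return ((x / w * 0.5 + 0.5) * region_width,
            (y / w * 0.5 + 0.5) * region_height)
```

The resulting pixel coordinates behave like ordinary mouse coordinates, which is what lets the existing 2D selection and transform code run unchanged.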