Description
This bug report was migrated from our old Bugzilla tracker.
Reported in version: HG 2.1
Reported for operating system, platform: All, All
Comments on the original bug report:
On 2015-04-15 12:45:49 +0000, wrote:
Currently there are two major HMD APIs available, Oculus and SteamVR, and in the future some window managers (e.g. the Motorcar compositor) may even support HMDs directly, allowing for a 3D/VR desktop with their own API.
It is a pain for developers to have to code against multiple different VR APIs, and then also maintain a separate event loop and window handling for VR and for SDL, depending on whether the application will run in or out of VR.
To support VR, SDL could be extended in the following ways:
Support for giving the application information about the type of stereoscopy the system supports: for example, whether it can use quad buffering, one double-sized framebuffer/texture, or multiple framebuffers/textures. Also whether the application needs to apply the lens distortion itself, with a function for calculating it.
If the system does not support quad buffering, SDL may be able to support multiple "windows", one on the left-eye display and one on the right-eye display, or have the application render to two textures which SDL then arranges itself via a shared context.
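A minimal sketch of what such a capability query might look like; every type and function name below is hypothetical, none of this exists in SDL today:

```c
#include <SDL.h>

/* Hypothetical sketch only -- none of these types or functions exist in SDL. */
typedef enum {
    VR_STEREO_NONE,            /* no stereo output available */
    VR_STEREO_QUAD_BUFFER,     /* driver exposes left/right back buffers */
    VR_STEREO_SINGLE_WIDE,     /* one double-sized framebuffer, split side by side */
    VR_STEREO_TWO_TARGETS      /* two separate framebuffers/textures, one per eye */
} VR_StereoMode;

typedef struct {
    VR_StereoMode mode;        /* how the system wants the two eyes delivered */
    SDL_bool needs_distortion; /* SDL_TRUE if the app must apply lens distortion itself */
    int eye_width, eye_height; /* recommended per-eye render target size */
} VR_StereoCaps;

/* Query the stereo capabilities for the display this window is on.
 * Returns 0 on success, negative on error. */
extern int VR_GetStereoCaps(SDL_Window *window, VR_StereoCaps *caps);
```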
Support for abstracting the eye, projection, and world matrix positioning. This information should have a window associated with it, to allow for situations where the application has multiple windows which are not full-screen and require different matrices.
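For example, the per-eye matrices could be queried per window each frame; again, all names here are made up for illustration:

```c
#include <SDL.h>

/* Hypothetical sketch -- not an existing SDL API. */
typedef enum { VR_EYE_LEFT = 0, VR_EYE_RIGHT = 1 } VR_Eye;

typedef struct {
    float view[16];       /* eye/view matrix for this eye, column-major */
    float projection[16]; /* projection matrix for this eye */
} VR_EyeMatrices;

/* Fetch the matrices to use this frame for one eye of the given window.
 * Tying the call to a window covers the case of several non-fullscreen
 * windows that each need different matrices. Returns 0 on success. */
extern int VR_GetEyeMatrices(SDL_Window *window, VR_Eye eye, VR_EyeMatrices *out);
```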
Support for abstracting 6DOF pointer devices. Events should work similarly to the mouse API and support window IDs and enter/leave events. There should be support for different buttons and different click types, and also for whether the device is to be mapped as a hand, finger, weapon/laser pointer, mouse, etc. Click types could be either "click at point" or "shoot ray", or this could be implied by the mapped type.
Type mapping allows the user to switch seamlessly between VR applications without having to first set up which pointer device is the left or right hand and which are simple 3D mice. This information could either live in a config file, similar to the GameController mapping database, or be read from the OS or VR APIs.
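A hypothetical event structure in the spirit of SDL_MouseButtonEvent; everything below is invented for illustration, not a proposed final API:

```c
#include <SDL.h>

/* Hypothetical sketch -- not an existing SDL API. */
typedef enum {
    VR_POINTER_HAND,
    VR_POINTER_FINGER,
    VR_POINTER_LASER,     /* weapon/laser pointer: "shoot ray" clicks */
    VR_POINTER_MOUSE      /* plain 3D mouse: "click at point" clicks */
} VR_PointerType;

typedef struct {
    Uint32 type;          /* e.g. a hypothetical VR_POINTERMOTION or VR_POINTERBUTTONDOWN */
    Uint32 timestamp;
    Uint32 windowID;      /* window with pointer focus, as with mouse events */
    Uint32 which;         /* device id */
    VR_PointerType mapped_type;
    Uint8 button;         /* which button, as with SDL_MouseButtonEvent */
    float x, y, z;        /* position in tracking space */
    float qx, qy, qz, qw; /* orientation as a quaternion */
} VR_PointerEvent;
```

The mapping side could then be a plain-text database along the lines of gamecontrollerdb.txt, pairing a device identifier with a role such as hand_left, hand_right, or 3d_mouse, so the user never has to reassign hands per application.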
Even outside of VR, people use "3D mice" with various software, so this API would be useful there as well.