Replies: 4 comments
-
If you'd like to give implementing this a shot, that would be great! I don't have a headset, so I'm not sure exactly what the problems are, but I'd guess this might be a blocker, since I doubt those headsets support the hit-test API: https://github.com/google/model-viewer/blob/master/packages/model-viewer/src/three-components/ARRenderer.ts#L135 It could probably be marked optional, since as far as I know all AR-supporting phones support hit test anyway. @klausw might have more insight when it comes to supporting both phone and headset AR with WebXR.
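A minimal sketch, assuming the standard WebXR typings rather than model-viewer's actual session setup, of what marking hit-test optional could look like (the feature list and the capability check are illustrative):

```ts
// Request 'hit-test' as an optional feature so headsets without the
// hit-test API can still enter immersive-ar.
async function requestARSession(xr: XRSystem): Promise<XRSession> {
  const session = await xr.requestSession('immersive-ar', {
    requiredFeatures: ['local'],                     // basic tracking space
    optionalFeatures: ['hit-test', 'hand-tracking']  // granted only if supported
  });
  // Illustrative capability check: when the hit-test module is not exposed
  // on the session, placement would have to fall back to something manual.
  const hasHitTest = 'requestHitTestSource' in session;
  console.log('hit-test available:', hasHitTest);
  return session;
}
```

On phones and HoloLens the hit-test feature should still be granted, so the phone flow would be unchanged; devices that don't expose it would simply start the session without it.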
-
I have to do it anyway for a project.
Ok, that is what I was looking for. That's why you can enter WebXR with HoloLens (because it has hit-test capability) but not with Oculus Quest (for now).
-
Just for clarification, I had nothing to do with the hand-tracking examples ^^. The code was added by @fe1ixz in mrdoob/three.js#21712.
-
Seems as reasonable a path as any. I'll reserve judgement until I have a PR to review.
-
Description
There are plenty of WebXR-ready headsets in use at this point, for example HoloLens 2 and Oculus Quest. They are considered AR devices, and model-viewer has an AR WebXR mode to support them. HoloLens 2 works out of the box, although with many limitations, mostly related to the absence of any UI for positioning the 3D model, whereas mobile does have such a UI. Quest does not work at all, although it has decent AR capabilities.
Again, the main issue is not being able to place the 3D scene in your environment.
1. First, enable all supported headsets to enter WebXR mode.
2. Build UI/UX to manipulate the 3D scene: move, rotate, scale.
3. Make use of hands on devices that support hand tracking (show the hands to the user). For simplicity, controllers are not a priority, since model-viewer is more about AR than VR, although hand and controller input could be unified later.
4. Display the bounding box of the 3D scene with grab points on the vertices for scaling and at the middle of the edges for rotation.
5. Enable far interaction with the bounding box by casting a ray from each hand (see the sketch after this list). You can get a better idea of how this can be done from the Mixed Reality Toolkit for the Unity engine, although this is probably the simplest possible implementation of this kind of UX for now. The idea is to have a minimum UX for moving your 3D scene in space and so reach feature parity with the mobile platform.
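For the far-interaction item, here is a rough three.js sketch of casting a ray from a tracked hand joint against the scene's bounding box; the joint choice, ray orientation, and helper names are assumptions for illustration, not existing model-viewer code:

```ts
import {Box3, Box3Helper, Group, Object3D, Quaternion, Ray, Scene, Vector3} from 'three';

// Wireframe bounding box around the placed model.
export function addBoundsHelper(model: Object3D, scene: Scene): Box3 {
  const bounds = new Box3().setFromObject(model);
  scene.add(new Box3Helper(bounds, 0x00ffff));
  return bounds;
}

const ray = new Ray();
const origin = new Vector3();
const direction = new Vector3();
const quaternion = new Quaternion();
const hitPoint = new Vector3();

// `hand` is what renderer.xr.getHand(0 | 1) returns; three.js fills in a
// `joints` map while the hand is tracked. Returns the intersection point
// with the bounding box, or null if the ray misses or the hand is lost.
export function intersectBoundsWithHandRay(hand: Group, bounds: Box3): Vector3|null {
  const tip = (hand as any).joints?.['index-finger-tip'];
  if (tip === undefined) return null;

  tip.getWorldPosition(origin);
  // Point the ray "forward" along the joint's -Z axis; a real implementation
  // would likely derive a steadier pointing pose from several joints.
  direction.set(0, 0, -1).applyQuaternion(tip.getWorldQuaternion(quaternion));

  ray.set(origin, direction);
  return ray.intersectBox(bounds, hitPoint);
}
```

Grab points on the box corners and edge midpoints could reuse the same ray test against small handle meshes placed at those positions; that part is omitted here.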