Add preliminary support for WebXR Layers #20696
Conversation
Hmm, this one requires a deeper review...

This is to be able to display an image or a video in VR in a more performant and crisper way, correct?

This PR makes it possible to use such constructs. Currently, it enables layer support and draws content to a projection layer instead of a WebGL layer. Projection layers are slightly more efficient and also support texture arrays, which could be enabled later.

Do these layers use the same depth buffer? Or do they render on top of GL?

Each layer has its own depth buffer, and it's up to the user agent whether to use it for reprojection/timewarp.

What I'm trying to understand is whether the video will mix with the 3D scene or not. For example: a video layer surrounded by GL spheres. But from your response I deduce that the GL scene will be rendered first and the video/layer will be rendered on top, ignoring the depth buffer of the GL scene. Correct?

That is correct. Layers are rendered on top of each other in the order of the layers array. Typically, the controllers and a simplified scene are rendered on top, and then there are quad/cylinder layers for video or custom controls.
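The back-to-front ordering described above comes straight from the layers array passed to `updateRenderState`: earlier entries are drawn first, later entries end up on top. A small illustrative sketch (the helper name is made up, not part of the PR):

```javascript
// Assemble a renderState layers array back-to-front: background layers
// (cube/equirect) first, media layers next, and the projection layer
// (scene + controllers) last so it draws on top of everything.
function orderLayers( { background = [], media = [], projection } ) {

	return [ ...background, ...media, projection ];

}
```

The result would be passed as `session.updateRenderState( { layers: orderLayers( ... ) } )` in a real WebXR session.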
```js
stencil: attributes.stencil,
framebufferScaleFactor: framebufferScaleFactor
};

if ( session.renderState.layers === undefined ) {
```
Are we going to have to maintain two code paths from now on?
Or is this transitional and we'll be able to just use the layers code path in the future?
I know Microsoft is going to start on layers, and Google also said that they're interested because it gives them access to texture arrays and, in the near future, WebGPU.
Once they land support, we can remove the dual path.
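The dual path discussed here boils down to a capability check: browsers without the WebXR Layers module never expose `renderState.layers`. A simplified sketch (helper names are illustrative, not the PR's actual code):

```javascript
// Returns true when the session supports the WebXR Layers module.
// session is an XRSession (or a mock with the same shape).
function usesLayersPath( session ) {

	// Without the Layers module, renderState.layers stays undefined.
	return session.renderState.layers !== undefined;

}

// Hypothetical setup helper showing the two code paths side by side.
function setupBaseLayer( session, glBaseLayer, glProjLayer ) {

	if ( usesLayersPath( session ) ) {

		// Layers path: submit a projection layer through renderState.layers.
		session.updateRenderState( { layers: [ glProjLayer ] } );

	} else {

		// Legacy path: a single XRWebGLLayer as the base layer.
		session.updateRenderState( { baseLayer: glBaseLayer } );

	}

}
```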
So many questions...

The controller? The 3D model of the controllers?

Okay, I understand this.

So, ideally we should be rendering... Also, it sounds like cube or equirect layers do not render on top of everything? They render under everything, correct? If so, what happens with transparent objects? I've experimented in the past with rendering the background as the last step in order to save on fillrate, but the idea failed when the scene had transparent objects.

Yes. For instance, if you take the Oculus shell, you can see that the controllers, the ray and the hands are rendered on top of everything else. This is done in a projection layer.

You can see this in a number of games. For instance, if you look at "The Climb" or "Falcon Age", the characters and nearby bits of the scene are drawn to a projection layer. The background is typically a cubemap that is static or has some simple canned animations.

Yes.

Can the background be complex? (i.e. like the background in the Oculus shell) If so, yes.

It depends on their order in the layers array, but typically they render under everything else.

That depends on the blendTextureSourceAlpha attribute. By default, layers will alpha blend with each other, so transparent content will be handled correctly.
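The alpha blending the compositor applies when `blendTextureSourceAlpha` is set is conceptually the standard "over" operator. A minimal sketch of that math in plain JS (the real blend happens in the compositor, not in page script; this is only an illustration of the formula):

```javascript
// Porter-Duff "over": composite a source color channel with alpha onto
// an opaque destination channel. All values are in [0, 1].
function over( src, srcAlpha, dst ) {

	return src * srcAlpha + dst * ( 1 - srcAlpha );

}
```

A fully opaque source replaces the destination, a fully transparent one leaves it unchanged, and anything in between mixes the two, which is why transparent layer content composites correctly against the layers beneath it.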
```js
if ( session && session.renderState.layers !== undefined && session.hasMediaLayer === undefined && video.readyState >= 2 && video2.readyState >= 2 ) {

	session.hasMediaLayer = true;
	session.requestReferenceSpace( 'local' ).then( ( refSpace ) => {

		const mediaBinding = new XRMediaBinding( session );
		const equirectLayer = mediaBinding.createEquirectLayer( video, { space: refSpace, layout: 'mono' } );
		const quadLayer = mediaBinding.createQuadLayer( video2, { space: refSpace, layout: 'stereo-left-right' } );
		quadLayer.transform = new XRRigidTransform( { x: 1.5, y: 1.0, z: - 2.0 } );

		session.updateRenderState( { layers: [ equirectLayer, quadLayer, session.renderState.layers[ 0 ] ] } );
		video.play();
		video2.play();

	} );

}
```
I think we could create an XRMediaLayer class that extends Object3D so we can add it to the scene like any other object, and then let WebXRManager do all this binding stuff when it sees an XRMediaLayer.
What do you think of that design?
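A rough sketch of what that design could look like (entirely hypothetical; the class and property names are made up, and a minimal stand-in replaces THREE.Object3D so the snippet is self-contained):

```javascript
// Minimal stand-in for THREE.Object3D so this sketch is self-contained;
// a real implementation would extend three.js's Object3D instead.
class Object3D {

	constructor() {

		this.position = { x: 0, y: 0, z: 0 };
		this.visible = true;

	}

}

// Hypothetical scene-graph wrapper for a WebXR media layer.
// The idea: WebXRManager looks for instances of this class in the scene
// and creates the matching XRMediaBinding layer (createQuadLayer,
// createEquirectLayer, ...) on the app's behalf.
class XRMediaLayer extends Object3D {

	constructor( video, { type = 'quad', layout = 'mono' } = {} ) {

		super();
		this.isXRMediaLayer = true;
		this.video = video;
		this.type = type; // 'quad' | 'equirect' | 'cylinder'
		this.layout = layout; // 'mono' | 'stereo-left-right' | ...
		this.xrLayer = null; // to be filled in by WebXRManager once bound

	}

}
```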
That sounds reasonable.
How would you handle WebGL Layers?
How would you handle WebGL Layers?
Can you elaborate?
WebGL layers give you another texture to render to.
Just like how a projection layer gives you a stereo texture, other layer types can give you one.
Can three.js draw into multiple textures at a time? Non-projection layers are also special in that they don't have to be drawn at the refresh rate.
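Because non-projection layers don't have to be redrawn every frame, a renderer can skip them unless their content changed. A sketch of that scheduling decision (illustrative: `contentDirty` is a made-up application-side flag; `needsRedraw` is the attribute the Layers spec sets on a composition layer when the compositor loses its texture contents):

```javascript
// Decide whether a non-projection layer must be re-rendered this frame.
// layer: an XRCompositionLayer-like object (or a mock with needsRedraw).
// contentDirty: hypothetical app flag, e.g. set when a new video frame
// or UI texture update arrived.
function shouldRedrawLayer( layer, contentDirty ) {

	// The compositor sets needsRedraw when the layer's texture was lost
	// (e.g. the underlying GPU resources were evicted or recreated).
	if ( layer.needsRedraw ) return true;

	// Otherwise only redraw when the app knows the content changed.
	return contentDirty === true;

}
```

Projection layers, by contrast, are re-rendered every animation frame regardless.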
No, it can't draw into multiple textures yet.
I was not aware of WebGL layers... Seems like I'll have to do a proper read of the spec. I'm too confused right now 😅
We can tackle those later :-)
I believe this project updated three.js so it could target multiple layers at the same time: https://spellew.github.io/webxr-samples/media-layers.html
Is there a talk that explains Layers? Or is https://immersive-web.github.io/layers/ the only resource?
That is the only resource for the WebXR part.
For general information on layers, you can read:
@dmarcos also hooked them into aframe: https://twitter.com/dmarcos/status/1322309762562760704
@mrdoob did you have a chance to think about how to best integrate layers?

@mrdoob We're looking into creating some examples with WebXR Layers and three.js.

What do you mean by surfaces?

Each WebXR layer is pointing to an opaque texture.

Sounds like something similar to multiple render targets?

Yes, is that already available?

Not yet. I was waiting for WebGL 2 to land on iOS so we don't have multiple code paths.

Is it already in progress on a branch or a PR?

Yes, see #16390.

Sidenote: #20135 is required for the MRT PR, so it's good if we manage to merge it this dev cycle.

@cabanier We have MRT support now. Would you like to make any changes to this PR?

I think this PR is a good start. We can add better support for layers in later patches (i.e. render to more than one layer).

Sounds good! Any chance you can resolve the conflicts?

Will do!

I resolved the conflicts and ran some tests.

#22060 has the latest changes.
Description
WebXR Layers is an upcoming standard that allows for more efficient rendering.
This change introduces an initial implementation that lets three.js draw into a projection layer. It also includes an example of how to combine it with quad and equirect media layers.
This contribution is funded by Oculus.
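To try this out, the XR session has to be requested with the layers feature enabled. A minimal sketch (the spec's feature descriptor is "layers"; the helper function is illustrative, not part of the PR):

```javascript
// Build an XRSessionInit dictionary for a session that may use layers.
// Requesting "layers" as an optional feature keeps the session working
// on browsers that don't ship the WebXR Layers module.
function sessionInitWithLayers( extraFeatures = [] ) {

	return { optionalFeatures: [ 'layers', ...extraFeatures ] };

}

// In a browser this would be used roughly as (not runnable outside WebXR):
// navigator.xr.requestSession( 'immersive-vr', sessionInitWithLayers() );
```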