Conversation

@cabanier
Contributor

Description

WebXR Layers is an upcoming standard that allows for more efficient rendering.
This change introduces an initial implementation that lets three.js draw into a projection layer. It also includes an example of how one can merge it with Quad and Equirect media layers.

This contribution is funded by Oculus.

@mrdoob
Owner

mrdoob commented Nov 21, 2020

Hmm, this one requires a deeper review...

@mrdoob mrdoob added this to the r124 milestone Nov 21, 2020
@mrdoob
Owner

mrdoob commented Dec 21, 2020

This is to be able to display an image or a video in VR in a more performant and crisper way, correct?

@cabanier
Contributor Author

This is to be able to display an image or a video in VR in a more performant and crisper way, correct?

This PR makes it possible to use such constructs. Currently, it enables layer support and draws content to a projection layer instead of a WebGL layer. Projection layers are slightly more efficient and also support texture arrays, which could be enabled later.
Once an experience switches to layers, it can use media layers to easily show crisp video or gl layers to draw crisp content.
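As a rough sketch of what "drawing to a projection layer instead of a WebGL layer" looks like at the API level (the helper name here is ours; `XRWebGLBinding`, `createProjectionLayer` and `updateRenderState` are the layers-spec entry points):

```javascript
// Minimal sketch (assumptions: `session` is an immersive XRSession on a
// browser with the WebXR Layers module, `gl` an XR-compatible WebGL 2 context).
function setupProjectionLayer( session, gl ) {

	const binding = new XRWebGLBinding( session, gl );

	// A projection layer replaces the classic XRWebGLLayer baseLayer and
	// covers the entire field of view.
	const layer = binding.createProjectionLayer( { textureType: 'texture' } );
	session.updateRenderState( { layers: [ layer ] } );

	return layer;

}
```

Each frame, the layer's opaque color texture for a view is fetched with `binding.getViewSubImage( layer, view )` and attached to a framebuffer before drawing.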

@mrdoob
Owner

mrdoob commented Dec 22, 2020

Do these layers use the same depth buffer? Or do they render on top of gl?

@cabanier
Contributor Author

cabanier commented Dec 22, 2020

Do these layers use the same depth buffer? Or do they render on top of gl?

Each layer has its own depth buffer, and it's up to the user agent whether to use it for reprojection/timewarp.
In the case of Oculus, we discard the buffer (i.e. we don't use it in the VR Compositor).

@mrdoob
Owner

mrdoob commented Dec 22, 2020

What I'm trying to understand is whether the video will mix with the 3d scene or not. For example: A video layer surrounded by gl spheres.

But from your response I deduce that the gl scene is rendered first and the video layer is rendered on top, ignoring the depth buffer of the gl scene.

Correct?

@cabanier
Contributor Author

That is correct. Layers are rendered on top of each other in the order of the layers array.
No depth sorting happens in the system compositor.

Typically, the controllers and a simplified scene are rendered on top, and then there are quad/cylinder layers for video or custom controls.
The background is then filled in with a cube or equirect layer.
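That layer stack can be sketched like this (the objects below are stubs standing in for real XRCompositionLayer instances; the point is the array order — the compositor draws index 0 first, furthest back, and the last entry on top, with no depth sorting between layers):

```javascript
// Illustrative stubs; in a real session these are layers created through
// XRWebGLBinding / XRMediaBinding.
const equirectBackgroundLayer = { type: 'equirect' };   // static background
const videoQuadLayer = { type: 'quad' };                // video or custom controls
const projectionLayer = { type: 'projection' };         // controllers + foreground scene

const layers = [ equirectBackgroundLayer, videoQuadLayer, projectionLayer ];

// In a real session: session.updateRenderState( { layers } );
const drawOrder = layers.map( ( l ) => l.type ).join( ' -> ' );
// drawOrder === 'equirect -> quad -> projection'
```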

stencil: attributes.stencil,
framebufferScaleFactor: framebufferScaleFactor
};
if ( session.renderState.layers === undefined ) {
Owner

Are we going to have to maintain two code paths from now on?
Or is this transitional and we'll able to just use the layers code path in the future?

Contributor Author

I know Microsoft is going to start on layers, and Google has also said they're interested because it gives them access to texture arrays and, in the near future, WebGPU.
Once they land support, we can remove the dual path.
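The dual path hinges on a simple feature test (the helper name below is hypothetical): browsers without the layers module never expose a `layers` member on `XRRenderState`, so the classic XRWebGLLayer path has to be kept around for them.

```javascript
// Sketch of the feature detection behind the dual code path.
function supportsLayers( session ) {

	// Present (even as an empty array) only when the layers module is implemented.
	return session.renderState.layers !== undefined;

}

supportsLayers( { renderState: {} } );              // false -> XRWebGLLayer path
supportsLayers( { renderState: { layers: [] } } );  // true  -> layers path
```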

@mrdoob
Owner

mrdoob commented Dec 22, 2020

So many questions...

Typically, the controller and a simplified scene is rendered on top

The controller? The 3d model of the controllers?
What do you mean with "simplified" scene?
And, rendered on top of what? On top of the layers?

and then there are quad/cylinder layers for video or custom controls.

Okay, I understand this.

The background is then filled in with a cube or equirect layer.

So, ideally we should be rendering scene.background in VR like this, right?

Also.. sounds like cube or equirect layer do not render on top of everything? They render under everything, correct?

If so, what happens with transparent objects. I've experimented in the past with rendering the background as the last step in order to save on fillrate, but the idea failed when the scene had transparent objects.

@cabanier
Contributor Author

So many questions...

Typically, the controller and a simplified scene is rendered on top

The controller? The 3d model of the controllers?

Yes. For instance, in the Oculus shell you can see that the controllers, the ray and the hands are rendered on top of everything else. This is done in a projection layer.

What do you mean with "simplified" scene?

You can see this in a number of games. For instance, in "The Climb" or "Falcon Age", the characters and nearby parts of the scene are drawn to a projection layer. The background is typically a cubemap that is static or has some simple canned animations.
Because the background can be drawn at a lower framerate and in a simpler way, the GPU and CPU have more headroom to spend on the foreground content.

And, rendered on top of what? On top of the layers?

Yes

and then there are quad/cylinder layers for video or custom controls.

Okay, I understand this.

The background is then filled in with a cube or equirect layer.

So, ideally we should be rendering scene.background in VR like this, right?

Can the background be complex? (i.e. like the background in the Oculus shell) If so, yes.
Also because the system compositor knows that it is drawing content from an equirect or cubemap, it can render the content faster, sharper and with less distortion.

Also.. sounds like cube or equirect layer do not render on top of everything? They render under everything, correct?

It depends on their order in the layers array but typically, they render under everything else.

If so, what happens with transparent objects. I've experimented in the past with rendering the background as the last step in order to save on fillrate, but the idea failed when the scene had transparent objects.

That depends on the blendTextureSourceAlpha attribute. By default, layers will alpha-blend with each other, so transparent content will be handled correctly.

Comment on lines +152 to +163
if ( session && session.renderState.layers !== undefined && session.hasMediaLayer === undefined && video.readyState >= 2 && video2.readyState >= 2 ) {

	session.hasMediaLayer = true;
	session.requestReferenceSpace( 'local' ).then( ( refSpace ) => {

		const mediaBinding = new XRMediaBinding( session );
		const equirectLayer = mediaBinding.createEquirectLayer( video, { space: refSpace, layout: 'mono' } );
		const quadLayer = mediaBinding.createQuadLayer( video2, { space: refSpace, layout: 'stereo-left-right' } );

		quadLayer.transform = new XRRigidTransform( { x: 1.5, y: 1.0, z: - 2.0 } );
		session.updateRenderState( { layers: [ equirectLayer, quadLayer, session.renderState.layers[ 0 ] ] } );

		video.play();
		video2.play();

	} );

}
Owner

I think we could create an XRMediaLayer class that extends Object3D so we can add it to the scene like any other object, and then let WebXRManager do all this binding work when it sees an XRMediaLayer.

What do you think of that design?
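A purely hypothetical sketch of that design (none of these names exist in three.js; `Object3D` below is a stub standing in for THREE.Object3D, and `videoElement` for an HTMLVideoElement):

```javascript
class Object3D {} // stand-in stub for THREE.Object3D

class XRMediaLayer extends Object3D {

	constructor( video, { layout = 'mono' } = {} ) {

		super();
		this.isXRMediaLayer = true;
		this.video = video;
		this.layout = layout;
		this.xrLayer = null; // WebXRManager would create this via XRMediaBinding

	}

}

// WebXRManager could scan the scene for `isXRMediaLayer` objects and call
// XRMediaBinding.createQuadLayer() / createEquirectLayer() for each one.
const videoElement = {}; // stand-in for an HTMLVideoElement
const panel = new XRMediaLayer( videoElement, { layout: 'stereo-left-right' } );
```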

Contributor Author

That sounds reasonable.
How would you handle WebGL Layers?

Owner

How would you handle WebGL Layers?

Can you elaborate?

Contributor Author

WebGL layers give you another texture to render to.
Just like a projection layer gives you a stereo texture, other layer types can give you one.
Can three.js draw into multiple textures at a time? Non-projection layers are also special in that they don't have to be drawn at the refresh rate.
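A sketch of that "not at the refresh rate" property, using the layers spec's `needsRedraw` flag and `getSubImage()` (assumptions: `gl` is the session's WebGL context, `binding` an XRWebGLBinding, `framebuffer` a WebGLFramebuffer created once up front, and `drawFn` whatever draws the layer's content; the helper name is ours):

```javascript
function redrawQuadIfNeeded( gl, binding, quadLayer, frame, framebuffer, drawFn ) {

	// Quad/cylinder/equirect layers only need a redraw when the compositor
	// says so (e.g. after the texture was lost), not every vsync.
	if ( ! quadLayer.needsRedraw ) return;

	// Attach the layer's opaque color texture and draw into it.
	const subImage = binding.getSubImage( quadLayer, frame );
	gl.bindFramebuffer( gl.FRAMEBUFFER, framebuffer );
	gl.framebufferTexture2D( gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
		gl.TEXTURE_2D, subImage.colorTexture, 0 );

	const vp = subImage.viewport;
	gl.viewport( vp.x, vp.y, vp.width, vp.height );
	drawFn();

}
```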

Owner

No, it can't draw into multiple textures yet.

I was not aware of WebGL layers... Seems like I'll have to do a proper read at the spec. I'm too confused right now 😅

Contributor Author

We can tackle those later :-)
I believe this project updated three.js so it could target multiple layers at the same time: https://spellew.github.io/webxr-samples/media-layers.html

Owner

Is there a talk that explains Layers? Or is https://immersive-web.github.io/layers/ the only resource?

@cabanier
Contributor Author

@mrdoob did you have a chance to think about how to best integrate layers?

@mrdoob mrdoob modified the milestones: r125, r126 Jan 27, 2021
@cabanier
Contributor Author

@mrdoob We're looking into creating some examples with WebXR Layers and three.js.
Everything is working but we're wondering how we can render to separate surfaces. Is there a convenient way to do that?

@mrdoob
Owner

mrdoob commented Feb 22, 2021

Everything is working but we're wondering how we can render to separate surfaces.

What do you mean with surfaces?

@mrdoob mrdoob modified the milestones: r126, r127 Feb 22, 2021
@cabanier
Contributor Author

Each WebXR layer points to an opaque texture.
This means that we'll need multiple renderers that can each draw to one of those textures using the same GL context. Does that require changes in three.js?

@mrdoob
Owner

mrdoob commented Feb 23, 2021

Sounds like something similar to multi render targets?

@cabanier
Contributor Author

Yes, is that already available?

@mrdoob
Owner

mrdoob commented Mar 2, 2021

Not yet. I was waiting for WebGL 2 to land on iOS so we don't have to maintain multiple code paths.

@cabanier
Contributor Author

cabanier commented Mar 2, 2021

Is it already in progress on a branch or a PR?

@Mugen87
Collaborator

Mugen87 commented Mar 2, 2021

Yes, see #16390.

@Mugen87
Collaborator

Mugen87 commented Mar 2, 2021

Sidenote: #20135 is required for the MRT PR so it's good if we manage to merge it this dev cycle.

@mrdoob mrdoob modified the milestones: r127, r128 Mar 30, 2021
@mrdoob mrdoob modified the milestones: r128, r129 Apr 23, 2021
@mrdoob mrdoob modified the milestones: r129, r130 May 27, 2021
@mrdoob
Owner

mrdoob commented Jun 26, 2021

@cabanier We have MRT support now. Would you like to do any changes to this PR?

@cabanier
Contributor Author

@cabanier We have MRT support now. Would you like to do any changes to this PR?

I think this PR is a good start. We can add better support for layers in later patches (i.e. render to more than one layer).

@mrdoob
Owner

mrdoob commented Jun 28, 2021

Sounds good!

Any chance you can resolve the conflicts?

@cabanier
Contributor Author

Sounds good!

Any chance you can resolve the conflicts?

Will do!

@cabanier
Contributor Author

Sounds good!

Any chance you can resolve the conflicts?

I resolved the conflicts and ran some tests.
One of the test files that used to work is now broken: https://threejs.org/examples/?q=sandbox#webxr_vr_sandbox
It seems that there is now a problem when switching render targets. Do you know someone who can help me?

@cabanier
Contributor Author

#22060 has the latest changes.
https://threejs.org/examples/?q=sandbox#webxr_vr_sandbox is broken because of a bug in the browser. That will be fixed in the next release.

@cabanier cabanier closed this Jun 29, 2021
@mrdoob mrdoob removed this from the r130 milestone Jun 29, 2021