Examples: Add WebGPU HDR Example #31893
Conversation
Wrap `damp` into `nodeObject()`.
Can we then remove the HDR usage from …?
examples/webgpu_hdr.html
```js
// Enable Extended sRGB output color space for HDR presentation
THREE.ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );
THREE.ColorManagement.workingColorSpace = ExtendedSRGBColorSpace;
```
@RenaudRohlinger The working color space will need to remain Linear-sRGB (the default value). If this causes other issues I think we can work that out! You're correct that we do need to register Extended sRGB and assign it as the output color space, though.
Conceptually, you can think of Extended sRGB as a color space that pre-supposes some fixed representation of "white" relative to the display, and then provides a mechanism for output at greater wattage than this "white". There are technical/conceptual/perceptual issues with that approach, but it's what WebGPU HDR gives us today, so we'll use it as the output space.
In our working color space, prior to image formation (exposure, tone mapping), we can think of the RGB values as stimulus received by the camera. There's no image yet formed, nothing is relative to any display or viewing environment, and without such a reference to anchor perceptual interpretation there is no "white". So Extended sRGB has no practical meaning as a working color space in a lit rendering pipeline.
Additionally, Extended sRGB uses a non-linear transfer encoding that isn't valid for the PBR rendering process.
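For reference, a minimal sketch of the configuration described above, assuming the example defines `ExtendedSRGBColorSpace` / `ExtendedSRGBColorSpaceImpl` as in the diff (the import path below is an assumption): the working color space stays at its Linear-sRGB default and only the output color space changes.

```js
import * as THREE from 'three/webgpu';
// Assumed import path; the PR's color space definitions appear to live in examples/jsm/math/ColorSpaces.js.
import { ExtendedSRGBColorSpace, ExtendedSRGBColorSpaceImpl } from 'three/addons/math/ColorSpaces.js';

// Register Extended sRGB so three.js can convert formed output into it...
THREE.ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );

// ...but leave THREE.ColorManagement.workingColorSpace at its Linear-sRGB default.
const renderer = new THREE.WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = ExtendedSRGBColorSpace;
```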
I see, thanks for the great explanations @donmccurdy!
Thanks for the reviews, should be good now!
@RenaudRohlinger How would you feel about renaming this example to …?
That’s definitely more pertinent, but I’m not sure the concept of …
Well, that is what the demo is about... I think your …
My feeling is that the "Extended sRGB" space is a necessary evil to access the capabilities of the display at this point — if WebGPU were to expose Rec 2100 HLG (or similar) I would prefer to switch the example over to that as the output color space.
I have some questions about this example. (I have an Apple silicon iMac.)
Ah, this line is incorrect:
Instead we want to be checking the color space's defined tone mapping mode (see `examples/jsm/math/ColorSpaces.js`, line 133 at 3220e98)...
... so that use of `outputType = HalfFloat` does not send output beyond [0, 1] as it does here.
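To illustrate the intent only, here is a hypothetical sketch; `spaces` and `outputToneMapping` below are placeholder names, not the actual three.js API (the real definition lives in ColorSpaces.js, linked above).

```js
import { ColorManagement } from 'three';

// Hypothetical helper: extended-range output should depend on the output color
// space's declared tone mapping mode, not merely on outputType being HalfFloatType.
function shouldClampOutput( renderer ) {

	const definition = ColorManagement.spaces?.[ renderer.outputColorSpace ]; // placeholder lookup
	const extendedRange = definition !== undefined && definition.outputToneMapping === 'extended'; // placeholder field

	// Clamp to [0, 1] unless the output color space explicitly opts in to extended range.
	return extendedRange === false;

}
```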
This is how "Extended sRGB" is defined, and so unfortunately necessary under the current WebGPU HDR spec. Similar to my comment above, I think this is a problematic choice in the spec, and I'd be glad to drop support for Extended sRGB given any other option. It is, indeed, a non-linear sRGB encoding on the [0, ∞] domain.
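As an illustration of that last point, an extended sRGB encoding is commonly defined as the ordinary sRGB transfer function applied without clamping and mirrored around zero. A sketch of that idea, not necessarily identical to `ExtendedSRGBColorSpaceImpl`:

```js
// Sketch of an extended (mirrored, unclamped) sRGB transfer encoding.
// Values above 1.0 keep increasing rather than being clipped, which is what
// allows the compositor to drive the display brighter than SDR white.
function linearToExtendedSRGB( value ) {

	const sign = value < 0 ? - 1 : 1;
	const v = Math.abs( value );

	const encoded = ( v <= 0.0031308 )
		? v * 12.92
		: 1.055 * Math.pow( v, 1 / 2.4 ) - 0.055;

	return sign * encoded;

}

console.log( linearToExtendedSRGB( 1.0 ) ); // ≈ 1.0
console.log( linearToExtendedSRGB( 4.0 ) ); // ≈ 1.82 (preserved rather than clamped)
```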
I believe there are two mechanisms operating under the name "tone mapping" here, confusingly. The first is three.js' own tone mapping, which we haven't yet implemented for HDR output. I think it's important that we add that, even if the Extended sRGB output space is not sufficient for us to design it correctly, if only to understand the limitations better. The second is WebGPU's (or the OS's?) own "tone mapping" to adapt the formed image to the display. That behavior is visible in this example in the form of increasing/decreasing contrast when changing the brightness of the laptop display on macOS. Available HDR "headroom" decreases as the laptop brightness increases.
Hm, if I set a breakpoint at WebGPUBackend.js#L258, I'm seeing …
We don't have an …

About the concept of extended, the API seems very declarative to me; I think Three.js should make things easier for this. Using …

We could warn when a tone mapping and output color space configuration isn’t compatible with HDR instead of recreating a public …
Similar to #31893 (comment), the canvas context configuration describes the output; we cannot base it on the working color space, which could be configured independently. Perhaps (unrelated to HDR) we could make post-processing effects work without …
Personally I think …
Hm, this is very challenging to make automatic. The image will not look the same, with differences based on browser, OS, and hardware decisions that I don't think we want to be trying to detect or counteract. See gpuweb/gpuweb#4919; this is a big part of why I prefer to consider HDR an experimental feature today. Also, the browser does allow developers to render wide-gamut or HDR color spaces to a canvas even if no connected display supports it, for example to export/download an image. I tend to think three.js should allow the same?
Adding some warnings is a good idea I think! I don't want to add any more Extended* color spaces though, just ExtendedSRGBColorSpace, and that only reluctantly. The better-defined spaces would be Rec2100 PQ and Rec2100 HLG, which are distinct and HDR-specific, but not currently available in WebGPU.
I don't think this would apply to this case, but if there’s any issue related to this matter in WebGPURenderer, I’d like to take a look at it. The …
I think the proposal is compatible with this thinking. My idea is to integrate … The term HDR is declared as the [0, >1] WebGPU extended range of …
We certainly won’t solve everything at this first stage, but this current scenario is quite feasible for automation, and I think it’s the first step toward simplifying things.
There's no bug, a gradient between two nearby (and especially darker) colors can only be stretched across so many pixels without banding on an 8-bit canvas. The best material I'm aware of is Banding in Games from the makers of LIMBO and INSIDE. You'll find their concluding recommendations (slide 60) mention "Use highest affordable precision available" as the simplest fix. Certainly any render targets in the middle of the pipeline could introduce banding, too, but the canvas drawing buffer is not exempt.
I'm a bit nervous about internalizing/hiding the color spaces... it's a different direction than tools like Blender (with OpenColorIO configs) seem to be headed in, and puts us in a position of needing to explain to users why content looks different in version X than in version X+1 more often. I'm not sure we can safely simplify it much more than .outputType/.outputColorSpace/.toneMapping. That said, I might not be understanding how you'd like to hide the Extended-sRGB color space: would that be by detecting the monitor's capabilities, or by using the .outputType setting to imply HDR?
Agreed, we could safely log a warning in this case.
Apologies, I can't agree with this one. It's a very practical choice. See the deck from the LIMBO and INSIDE developers above, with some references to Skyrim and Kentucky Route Zero. It's more similar to Photoshop or Affinity Designer offering 8-bit, 16-bit, and 32-bit document modes. Using a 16-bit drawing buffer is more widely useful than any HDR color space, and far simpler and safer to adopt.
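For what it's worth, a sketch of what the warning mentioned above might look like, together with an optional display-capability check. It assumes a `renderer` and the example's `ExtendedSRGBColorSpace` constant are in scope, and the compatibility rule itself is an assumption rather than anything three.js defines today.

```js
import { NoToneMapping } from 'three';

// CSS media feature for HDR-capable displays (Media Queries Level 5).
const displaySupportsHDR = window.matchMedia( '(dynamic-range: high)' ).matches;

if ( renderer.outputColorSpace === ExtendedSRGBColorSpace ) {

	// Assumed rule: a tone mapping operator that maps into [0, 1] defeats extended-range output.
	if ( renderer.toneMapping !== NoToneMapping ) {

		console.warn( 'HDR output: current tone mapping clamps to [0, 1]; output will not exceed SDR white.' );

	}

	if ( displaySupportsHDR === false ) {

		console.warn( 'HDR output: no HDR-capable display detected; values above 1.0 may be clipped.' );

	}

}
```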
I think we need to separate working bits from output bits. I'll try to create an analogy with audio. When we record something, we try to do it with the highest sample rate and bit depth possible, just as we do with texture files: the more bits the better, because we have more information. For playback, though, even if we record in 32 bit, the output depends on the hardware. You can save a file in 32 bit or 24 bit, but if your hardware output is 16 bit, the conversion to 16 bit happens either way and you will not get any better result, unlike with working bits as I mentioned. Any analogy with Photoshop or working buffer precision should refer to …

When we refer to HDR, we are referring to the bit precision and range, as the name suggests. In the software I know, this is determined by the bits per channel (BPC) and not by the color space, just as WebGPU itself does; we are dealing with precision beyond [0, >1], and this should be independent of the color space or any other color output transformation used.

I think I was the first to mention issues related to bit precision, which were resolved with …

I'm not suggesting hiding the color spaces, but specifically the term … Previously, HDR screen output was defined by … So in a table we can define it as follows:
For …
That's an important distinction about working bits vs. output bits, I agree! I really do mean output bits. 8-bit output reduces quality in some content, even if nothing exceeds the range [0, 1]. Certainly 16-bit output would be pointless if the color buffer is only 8-bit, but there are plenty of cases where 16-bit working + 16-bit output is a useful combination. This part is conceptually simple and well-defined. WebGPU HDR, on the other hand, requires major changes in three.js' tone mapping such that output can be >1, without clamping, supporting WebGPU's …

You are correct that a lot of software uses the term "HDR" to mean simply 16-bit or 32-bit precision, though. The term "HDR" has too many possible meanings; this is partly why I was opposed to having a …
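For context, the WebGPU-level knob referred to here is, as I understand it, the canvas tone mapping mode from the WebGPU HDR canvas proposal. A sketch of the raw API outside of three.js follows; browser support still varies.

```js
// Raw WebGPU canvas configuration for extended-range (HDR) presentation.
// toneMapping: { mode: 'extended' } comes from the WebGPU HDR canvas proposal
// and is not implemented by every browser yet.
async function initHDRCanvas( canvas ) {

	const adapter = await navigator.gpu.requestAdapter();
	const device = await adapter.requestDevice();

	const context = canvas.getContext( 'webgpu' );

	context.configure( {
		device,
		format: 'rgba16float',             // half-float drawing buffer
		colorSpace: 'srgb',                // interpreted as extended sRGB when values are not clamped
		toneMapping: { mode: 'extended' }, // pass values > 1.0 through to the compositor
		alphaMode: 'premultiplied'
	} );

	return { device, context };

}
```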
I think the combination …

Sorry to be redundant, but since you're going to work on this, I'd like to summarize my opinion.
To make what I said make more sense, this should be changed too:

```js
// WebGPUUtils
getPreferredCanvasFormat() {

	let bufferType = this.backend.parameters.colorBufferType;

	if ( bufferType === undefined ) {
		bufferType = this.backend.parameters.outputType;
	}

	if ( bufferType === undefined ) {
		return navigator.gpu.getPreferredCanvasFormat();
	} else if ( bufferType === UnsignedByteType ) {
		return GPUTextureFormat.BGRA8Unorm;
	} else if ( bufferType === HalfFloatType ) {
		return GPUTextureFormat.RGBA16Float;
	} else {
		throw new Error( 'Unsupported buffer type' );
	}

}
```

This could work as:

```js
const renderer = new THREE.WebGPURenderer( {
	outputType: THREE.HalfFloatType
} );
renderer.outputColorSpace = THREE.SRGBColorSpace;
```

```js
// set a color buffer type as default
const renderer = new THREE.WebGPURenderer( {
	colorBufferType: THREE.HalfFloatType
} );
renderer.outputColorSpace = THREE.SRGBColorSpace;
```
Super excited to see this demo live! 🤩 In case anyone (like me) is trying to find the canonical link to the demo, it's: https://threejs.org/examples/webgpu_hdr.html
@sunag I'm still on the hook for that "opening a PR proposing a workable definition of HDR", sorry things have been busy, but still planning to do that. :) In the meantime – Blender 5.0 was just released with added support for: (1) wide gamut output, (2) HDR output, and (3) a user-configurable working color space. Previously (3) was possible only by modifying Blender's internal OCIO config file. I haven't tested 5.0 deeply yet, but I think they've structured the new options in a thoughtful and appropriate way. Comparable to our …

If an HDR output color space has been selected, additional "View" options (comparable to …) appear.

... and incompatible options, like Khronos Neutral, are hidden. For the newly-added working color space dropdown, the available options are Linear-sRGB, Linear-Rec.2020, and ACEScg. This is comparable to our …

Output bit depth is not configurable on the viewport, but is exposed as a separate option when exporting to image or video, independently of color space.




Related issue: #29573 (comment)
Description
Add a basic example to test and demonstrate the Extended sRGB color space via High Dynamic Range (HDR) output in WebGPU.
Heavily inspired by @greggman's HDR demo: https://github.com/greggman/HDR-draw
Example:
https://raw.githack.com/renaudrohlinger/three.js/utsubo/feat/hdr-example/examples/webgpu_hdr.html
IMG_1211.MOV
This contribution is funded by Utsubo