Conversation

@RenaudRohlinger (Collaborator) commented Sep 12, 2025

Related issue: #29573 (comment)

Description
Add a basic example to test and demonstrate the Extended sRGB color space via High Dynamic Range (HDR) in WebGPU.

Heavily inspired by @greggman's HDR demo https://github.com/greggman/HDR-draw

Example:
https://raw.githack.com/renaudrohlinger/three.js/utsubo/feat/hdr-example/examples/webgpu_hdr.html

[Video attachment: IMG_1211.MOV]

This contribution is funded by Utsubo

@mrdoob mrdoob added this to the r181 milestone Sep 12, 2025
@Mugen87 (Collaborator) commented Sep 12, 2025

Can we then remove the HDR usage from webgpu_tsl_vfx_linkedparticles now that we have a dedicated example?

Comment on lines 86 to 88
// Enable Extended sRGB output color space for HDR presentation
THREE.ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );
THREE.ColorManagement.workingColorSpace = ExtendedSRGBColorSpace;
@donmccurdy (Collaborator) commented Sep 12, 2025

@RenaudRohlinger The working color space will need to remain Linear-sRGB (the default value). If this causes other issues I think we can work that out! You're correct that we do need to register Extended sRGB and assign it as the output color space, though.

Conceptually, you can think of Extended sRGB as a color space that pre-supposes some fixed representation of "white" relative to the display, and then provides a mechanism for output at greater wattage than this "white". There are technical/conceptual/perceptual issues with that approach, but it's what WebGPU HDR gives us today, so we'll use it as the output space.

In our working color space, prior to image formation (exposure, tone mapping), we can think of the RGB values as stimulus received by the camera. There's no image yet formed, nothing is relative to any display or viewing environment, and without such a reference to anchor perceptual interpretation there is no "white". So Extended sRGB has no practical meaning as a working color space in a lit rendering pipeline.

Additionally, Extended sRGB uses a non-linear transfer encoding that isn't valid for the PBR rendering process.
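
(For reference, a minimal sketch of the setup described above; it assumes the example's ExtendedSRGBColorSpace / ExtendedSRGBColorSpaceImpl definitions and a WebGPURenderer instance named renderer:)

// Register the custom color space from the example's definitions.
THREE.ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );

// Leave THREE.ColorManagement.workingColorSpace at its default (Linear-sRGB).

// Use Extended sRGB only for output.
renderer.outputColorSpace = ExtendedSRGBColorSpace;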

@RenaudRohlinger (Collaborator, Author)

I see, thanks for the great explanations @donmccurdy!

@RenaudRohlinger (Collaborator, Author) commented Sep 18, 2025

Thanks for the reviews, should be good now!

@Mugen87 Mugen87 merged commit 8d43f7a into mrdoob:dev Sep 18, 2025
8 checks passed
@Mugen87 Mugen87 mentioned this pull request Sep 18, 2025
@WestLangley (Collaborator)

@RenaudRohlinger How would you feel about renaming this example to webgpu_extended_srgb.html?

@RenaudRohlinger (Collaborator, Author)

That’s definitely more pertinent, but I’m not sure the concept of extended_srgb is very accessible to most people. In practice, many developers would probably expect to find this under HDR. If we rename it, we could address this by using tags.json to alias the search with HDR.
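
(A sketch of the tags.json aliasing idea; the entry below is hypothetical, following the examples/tags.json convention of mapping an example name to search tags:)

"webgpu_extended_srgb": [ "hdr", "high dynamic range" ]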

@WestLangley (Collaborator)

Well, that is what the demo is about.... I think your tags suggestion is a good idea.

@donmccurdy (Collaborator)

My feeling is that the "Extended sRGB" space is a necessary evil to access the capabilities of the display at this point — if WebGPU were to expose Rec 2100 HLG (or similar) I would prefer to switch the example over to that as the output color space.

@WestLangley (Collaborator) commented Oct 4, 2025

I have some questions about this example. (I have an Apple silicon iMac.)

  1. renderer.outputColorSpace = ExtendedSRGBColorSpace - Commenting this out does not appear to have any effect. I also question the API here. I think the output color space should be the linear working color space in this case.

  2. It appears the sRGB OETF (gamma) is being applied by the shader. My understanding is that it should not be; the system applies tone mapping and gamma in this case. Granted, if the OETF is not applied by the shader, the rendering is too dark, so something is not correct...

  3. I see no evidence of tone mapping being applied -- just increased brightness. How can we ensure this is working as intended?

  4. In the WebGPUBackend, context.configure() does not set the color space, leaving it at the default sRGB. I think the color space should be consistent with the working color space, which can be linear display-p3, for example. (See the configuration sketch below.)

/ping @RenaudRohlinger @donmccurdy @sunag
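
(For context on item 4: a sketch of the options the standard WebGPU canvas configuration exposes; the specific values shown are illustrative, not what three.js currently passes.)

context.configure( {
	device: device,
	format: 'rgba16float', // required for extended-range output
	colorSpace: 'display-p3', // 'srgb' is the default
	toneMapping: { mode: 'extended' } // 'standard' clamps output to [0, 1]
} );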

@donmccurdy (Collaborator)

  1. renderer.outputColorSpace = ExtendedSRGBColorSpace ...

Ah, this line is incorrect:

const toneMappingMode = parameters.outputType === HalfFloatType ? 'extended' : 'standard';

Instead we want to be checking the color space's defined tone mapping mode...

outputColorSpaceConfig: { drawingBufferColorSpace: SRGBColorSpace, toneMappingMode: 'extended' }

... so that use of outputType=HalfFloat does not send output beyond [0,1] as it does here.
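
(A hypothetical sketch of that check; the lookup helper is assumed for illustration and is not the actual three.js API:)

// Hypothetical: read the mode from the registered color space definition
// instead of inferring it from outputType.
const colorSpaceConfig = getColorSpaceConfig( renderer.outputColorSpace ); // assumed helper
const toneMappingMode = ( colorSpaceConfig.outputColorSpaceConfig !== undefined )
	? colorSpaceConfig.outputColorSpaceConfig.toneMappingMode
	: 'standard';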

  2. It appears the srgb OETF (gamma) is being applied by the shader...

This is how "Extended sRGB" is defined, and so it's unfortunately necessary under the current WebGPU HDR spec. Similar to my comment above, I think this is a problematic choice in the spec, and I'd be glad to drop support for Extended sRGB given any other option. It is, indeed, a non-linear sRGB encoding on the [0, ∞] domain.
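
(For illustration, the extended sRGB encoding referenced here applies the sRGB curve sign-preservingly and without clamping; this is a sketch, not three.js source:)

function extendedSRGBEncode( x ) {

	const sign = Math.sign( x );
	const v = Math.abs( x );

	// Standard sRGB OETF, mirrored for negative values and unclamped above 1.0.
	const encoded = ( v <= 0.0031308 ) ? v * 12.92 : 1.055 * Math.pow( v, 1 / 2.4 ) - 0.055;

	return sign * encoded;

}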

  3. I see no evidence of tone mapping being applied...

I believe there are two mechanisms operating under the name "tone mapping" here, confusingly. The first is three.js' own tone mapping, which we haven't yet implemented for HDR output. I think it's important that we add that, even if the Extended sRGB output space is not sufficient for us to design it correctly, if only to understand the limitations better.

The second is WebGPU's (or the OS's?) own "tone mapping" to adapt the formed image to the display. That behavior is visible in this example in the form of increasing/decreasing contrast when changing the brightness of the laptop display on macOS. Available HDR "headroom" decreases as the laptop brightness increases.

@donmccurdy (Collaborator)

Hm, if I set a breakpoint at WebGPUBackend.js#L258, I'm seeing this.renderer.outputColorSpace = 'srgb-linear' even though it's configured to ExtendedSRGBColorSpace. I assume this is related to post-processing, but I'm not sure how to detect the final output space in order to configure the canvas correctly...

@sunag (Collaborator) commented Oct 6, 2025

We don't have an ExtendedLinearSRGBColorSpace for workingColorSpace, so in some code paths context.configure() may end up being initialized with LinearSRGBColorSpace. This is due to post-processing, since the color output transformation is done in the RenderOutputNode, after context.configure() has already been called.

About the concept of extended: the API seems very declarative to me; I think Three.js should make things easier here.

Using outputType: THREE.HalfFloatType for an output without HDR should be seen as a sign of poor optimization on the user’s part. So if outputType = THREE.HalfFloatType, it should already be classified as Extended sRGB (or the equivalent for the color gamut), and that should be handled internally. Currently, in any scenario where HDR isn’t available, it’s up to the user to implement all fallback modifications, when this could be automated by the renderer.

We could warn when a tone mapping and output color space configuration isn’t compatible with HDR, instead of recreating public Extended* color spaces one by one. That would be simple to implement in ColorSpaceNode and RenderOutputNode. (A sketch of this idea follows.)
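
(A sketch of the automation being proposed; the inference logic is assumed, not current three.js behavior:)

// Hypothetical internal logic: infer extended output from outputType,
// instead of requiring users to register an Extended* color space.
const hdrOutput = parameters.outputType === HalfFloatType;

context.configure( {
	device: device,
	format: hdrOutput ? 'rgba16float' : navigator.gpu.getPreferredCanvasFormat(),
	toneMapping: { mode: hdrOutput ? 'extended' : 'standard' }
} );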

@donmccurdy (Collaborator) commented Oct 6, 2025

We don't have an ExtendedLinearSRGBColorSpace for workingColorSpace...

Similar to #31893 (comment), the canvas context configuration describes the output; we cannot base it on the working color space, which could be configured independently.

Perhaps (unrelated to HDR) we could make post-processing effects work without renderer.outputColorSpace ever being changed? For example ignoring renderer.outputColorSpace and using renderTarget.texture.colorSpace instead, if a render target is defined. Then we could safely use renderer.outputColorSpace to configure the canvas. Not sure about this approach.
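
(A hypothetical sketch of that resolution order; the function name is assumed:)

// Hypothetical: prefer the render target's color space when one is bound,
// so renderer.outputColorSpace can always describe the canvas.
function resolveOutputColorSpace( renderer, renderTarget ) {

	return ( renderTarget !== null ) ? renderTarget.texture.colorSpace : renderer.outputColorSpace;

}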

Using outputType: THREE.HalfFloatType in an output without HDR should be seen as a sign of poor optimization

Personally I think outputType: THREE.HalfFloatType (alone) is even more useful than HDR support! 🙂 It's a very effective fix for banding, even in sRGB, when dithering isn't enough or content is user-generated. And wide-gamut output (without HDR) benefits from >8-bit precision more than sRGB.

...it’s currently up to the user to implement all fallback modifications, when this could be automated by the renderer.

Hm, this is very challenging to make automatic. The image will not look the same, with differences based in browser, OS, and hardware decisions that I don't think we want to be trying to detect or counteract. See gpuweb/gpuweb#4919, this is a big part of why I prefer to consider HDR an experimental feature today.

Also, the browser does allow developers to render wide-gamut or HDR color spaces to a canvas even if no connected display supports it, for example to export/download an image. I tend to think three.js should allow the same?

We could warn when a tone mapping and output color space configuration isn’t compatible with HDR instead of recreating an public Extended* one by one...

Adding some warnings is a good idea I think! I don't want to add any more Extended* color spaces though, just ExtendedSRGBColorSpace, and that only reluctantly. The better-defined spaces would be Rec2100 PQ and Rec2100 HLG, which are distinct and HDR-specific, but not currently available in WebGPU.

@sunag (Collaborator) commented Oct 6, 2025

Personally I think outputType: THREE.HalfFloatType (alone) is even more useful than HDR support! 🙂 It's a very effective fix for banding, even in sRGB, when dithering isn't enough or content is user-generated. And wide-gamut output (without HDR) benefits from >8-bit precision more than sRGB.

I don't think this would apply to this case, but if there’s any issue related to this matter in WebGPURenderer, I’d like to take a look at it.

The outputType should refer to the canvas texture, but the default scene rendering of WebGPURenderer is not done on the canvas; it is done on a RenderTarget that is 16bpc by default, independent of outputType. All previous render targets, including post-processing and color transformation, should be HalfFloatType by default. We are handling this with the colorBufferType property, so to optimize this process we should use Renderer( { colorBufferType } ). Only the result of the output color transformation is applied at outputType, which is equivalent to the canvas representation.

Also, the browser does allow developers to render wide-gamut or HDR color spaces to a canvas even if no connected display supports it, for example to export/download an image. I tend to think three.js should allow the same?

I think the proposal is compatible with this thinking. My idea is to integrate ExtendedSRGBColorSpace without exposing it to the user. Later, when we have more appropriate color spaces for HDR, as you mentioned, this will be a natural extension, which should also benefit the user by making things simpler.

The term Extended* is bringing up some other implications, like the issue mentioned, so I suggest we simplify this process by incorporating Extended automatically according to the color gamut, and warning if the configuration is incompatible.

HDR in WebGPU is declared as the extended range of rgba16float, covering values outside [0, 1]. So defining outputType: UnsignedByteType with ExtendedSRGBColorSpace is wrong, and outputType: HalfFloatType without ExtendedSRGBColorSpace should be wrong too.

Hm, this is very challenging to make automatic. The image will not look the same, with differences based in browser, OS, and hardware decisions that I don't think we want to be trying to detect or counteract. See gpuweb/gpuweb#4919, this is a big part of why I prefer to consider HDR an experimental feature today.

We certainly won’t solve everything at this first stage, but this current scenario is quite feasible for automation, and I think it’s the first step toward simplifying things.

@donmccurdy (Collaborator) commented Oct 8, 2025

I don't think this would apply to this case, but if there’s any issue related to this matter in WebGPURenderer...

There's no bug; a gradient between two nearby (and especially darker) colors can only be stretched across so many pixels without banding on an 8-bit canvas. The best material I'm aware of is Banding in Games from the makers of LIMBO and INSIDE. You'll find their concluding recommendations (slide 60) mention "Use highest affordable precision available" as the simplest fix.

Certainly any render targets in the middle of the pipeline could introduce banding, too, but the canvas drawing buffer is not exempt.

I think the proposal is compatible with this thinking. My idea is to integrate ExtendedSRGBColorSpace and not expose to the user...

I'm a bit nervous about internalizing/hiding the color spaces... it's a different direction from the one tools like Blender (with OpenColorIO configs) seem to be headed in, and it puts us in a position of needing to explain to users more often why content looks different in version X than in version X+1. I'm not sure we can safely simplify it much more than .outputType/.outputColorSpace/.toneMapping.

That said, I might not be understanding how you'd like to hide the Extended-sRGB color space — would that be by detecting the monitor's capabilities? By using the .outputType setting to imply HDR?

so defining an outputType: UnsignedByteType with ExtendedSRGBColorSpace is wrong...

Agreed, we could safely log a warning in this case.
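
(Something along these lines, as a sketch; the placement and property access are assumed:)

if ( renderer.outputColorSpace === ExtendedSRGBColorSpace && renderer.outputType === UnsignedByteType ) {

	console.warn( 'THREE.WebGPURenderer: ExtendedSRGBColorSpace output requires outputType: HalfFloatType.' );

}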

...and outputType: HalfFloatType without ExtendedSRGBColorSpace should be wrong too.

Apologies, I can't agree with this one. It's a very practical choice. See the deck from the LIMBO and INSIDE developers above, with some references to Skyrim and Kentucky Route Zero. It's more similar to Photoshop or Affinity Designer offering 8-bit, 16-bit, and 32-bit document modes. Using a 16-bit drawing buffer is more widely useful than any HDR color space, and far simpler and safer to adopt.

@sunag (Collaborator) commented Oct 8, 2025

I think we need to distinguish between working bits and output bits.

I'll try to create an analogy with audio. When we record something, we try to do it with the highest sample rate and bit depth possible, just as we do with texture files: the more bits, the better, because we have more information. But for playback, even if we record in 32-bit, the output depends on the hardware.

You can save a file in 32-bit or 24-bit, but if your hardware output is 16-bit, the conversion to 16-bit happens either way and you get no better final result; working bits are different, as I mentioned.

Any analogy with Photoshop or working-buffer precision should refer to colorBufferType, which represents the working bits; the output depends solely on the hardware.

When we refer to HDR, we are referring to bit precision and range, as the name suggests. In the software I know, this is determined by the bits per channel and not by the color space, as WebGPU itself does: we are dealing with precision beyond [0, 1], and this should be independent of the color space or any other color output transformation used.

I think I was the first to mention issues related to bit precision, which were resolved with colorBufferType (#30392 (comment)), hence my question about whether this is a bug, and whether we have a way to reproduce it so we can fix it.

I'm not suggesting hiding the color spaces, but specifically the term Extended.

Previously, HDR screen output was defined by .outputColorSpace, which is quite strange: although they are related, they are different things. The test to determine whether the screen output was HDR was a check for the "extended" string in outputColorSpace, which only exposes the limitation of this approach when we use LinearSRGBColorSpace.

So, in a table, we can define it as follows:

| Name | Description |
| --- | --- |
| .outputType | Screen/output bit precision and range |
| .colorBufferType | Working bit precision and range |

For outputColorSpace:

| Name | LDR | HDR |
| --- | --- | --- |
| SRGBColorSpace | Yes | Yes |
| DisplayP3ColorSpace | Yes | No, currently; show warning |

@donmccurdy (Collaborator)

That's an important distinction about working bits vs. output bits, I agree! I really do mean output bits. 8-bit output reduces quality in some content, even if nothing exceeds the range [0, 1]. Certainly 16-bit output if the color buffer is only 8-bit would be pointless, but there are plenty of cases where 16-bit working + 16-bit output is a useful combination. This part is conceptually simple and well-defined.

WebGPU HDR, on the other hand, requires major changes in three.js' tone mapping such that output can be >1, without clamping, supporting WebGPU's toneMapping: { mode: "extended" } definitions. This part is conceptually very messy; we can't impose those changes on everyone who just wants 16-bit output for quality reasons.

You are correct that a lot of software uses the term "HDR" to mean simply 16-bit or 32-bit precision, though. The term "HDR" has too many possible meanings; this is partly why I was opposed to having a {hdr: true} option on the renderer. I feel it is less confusing to deal in .outputType/.outputColorSpace/.toneMapping settings independently. Let me try opening a PR proposing a workable definition of HDR within three.js... I think that could be helpful for the Color Management page, and for conversation.

@sunag (Collaborator) commented Oct 9, 2025

The term "HDR" has too many possible meanings, this is partly why I was opposed to having a {hdr: true} option on the renderer.... Let me try opening a PR proposing a workable definition of HDR within three.js... I think that could be helpful for the Color Management page, and for conversation.

I think the combination .outputType/.outputColorSpace/.toneMapping is great the way it is; the bit definition is more in line with what we do in the library.

Sorry to be redundant, but since you're going to work on this, I'd like to summarize my opinion.

  • I think when we set .outputType = HalfFloatType we expect the screen output to be HDR.
  • colorBufferType is currently responsible for the precision/range of the internal render textures.
  • The term ExtendedSRGBColorSpace can be simplified and incorporated into SRGBColorSpace.

For what I said to make sense, this would need to change too:

// WebGPUUtils
getPreferredCanvasFormat() {

	let bufferType = this.backend.parameters.colorBufferType;

	if ( bufferType === undefined ) {

		bufferType = this.backend.parameters.outputType;

	}

	if ( bufferType === undefined ) {

		return navigator.gpu.getPreferredCanvasFormat();

	} else if ( bufferType === UnsignedByteType ) {

		return GPUTextureFormat.BGRA8Unorm;

	} else if ( bufferType === HalfFloatType ) {

		return GPUTextureFormat.RGBA16Float;

	} else {

		throw new Error( 'Unsupported buffer type' );

	}

}

This could work as:

const renderer = new THREE.WebGPURenderer( {
	outputType: THREE.HalfFloatType
} );
renderer.outputColorSpace = THREE.SRGBColorSpace;

// or, setting a color buffer type as the default:

const renderer = new THREE.WebGPURenderer( {
	colorBufferType: THREE.HalfFloatType
} );
renderer.outputColorSpace = THREE.SRGBColorSpace;

@lgarron (Contributor) commented Nov 2, 2025

Super excited to see this demo live! 🤩

In case anyone (like me) is trying to find the canonical link to the demo, it's: https://threejs.org/examples/webgpu_hdr.html

@donmccurdy (Collaborator) commented Nov 18, 2025

@sunag I'm still on the hook for that "opening a PR proposing a workable definition of HDR"; sorry, things have been busy, but I'm still planning to do that. :)

In the meantime – Blender 5.0 was just released with added support for: (1) wide gamut output, (2) HDR output, and (3) a user-configurable working color space. Previously (3) was possible only by modifying Blender's internal OCIO config file. I haven't tested 5.0 deeply yet, but I think they've structured the new options in a thoughtful and appropriate way.

Comparable to our renderer.outputColorSpace, Blender offers a "Display" option with a list of color spaces, each categorized as SDR or HDR. Note that both HDR options (Rec.2100-PQ and Rec.2100-HLG) share the Rec.2020 wide gamut.

[Screenshot: Blender's "Display" dropdown listing color spaces categorized as SDR or HDR]

If an HDR output color space has been selected, additional "View" options (comparable to renderer.toneMapping) become available...

[Screenshot: Blender's "View" options available for an HDR display color space]

... and incompatible options, like Khronos Neutral, are hidden.

For the newly-added working color space dropdown, available options are Linear-sRGB, Linear-Rec.2020, and ACEScg. This is comparable to our THREE.ColorManagement.workingColorSpace.

[Screenshot: Blender's working color space dropdown: Linear-sRGB, Linear-Rec.2020, ACEScg]

Output bit depth is not configurable on the viewport, but is exposed as a separate option when exporting to image or video, independently of color space:

[Screenshot: Blender's export settings with a separate bit-depth option]
