> The final piece of the puzzle is gamma correction. Applying gamma correction to these RGB coordinates produces a new set of values which we call (R', G', B') that are related to the original by a transfer function ... The reason this is done is to account for how our eyes perceive brightness nonlinearly. We can distinguish changes in dark shades much more easily than light shades because a linear increase in R has much more of a relative effect when R is small. Switching to (R', G', B') therefore provides more resolution in dark regions of the image where the eye is more sensitive to variations in brightness.
I'm surprised that this isn't mentioned much earlier and much more prominently. Instead, it's practically a footnote.
Maybe I'm mistaken, but I would bet 90% of the awkwardness in the very first image is from averaging the gamma-encoded values (R', G', B') for the gradients, rather than converting to the true linear values, averaging, then converting back. This classic MinutePhysics video covers it well:
https://www.youtube.com/watch?v=LKnqECcg6Gw
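To make the point concrete, here's a minimal sketch (not the article's code) comparing the two ways of averaging two sRGB colors, using the standard sRGB transfer functions. Values are in [0, 1]:

```python
def srgb_to_linear(c):
    """sRGB decoding transfer function: gamma-encoded value -> linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """sRGB encoding transfer function: linear light -> gamma-encoded value."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def mix_gamma(a, b):
    """Naive: average the gamma-encoded (R', G', B') values directly."""
    return tuple((x + y) / 2 for x, y in zip(a, b))

def mix_linear(a, b):
    """Correct: decode to linear light, average, then re-encode."""
    return tuple(
        linear_to_srgb((srgb_to_linear(x) + srgb_to_linear(y)) / 2)
        for x, y in zip(a, b)
    )

red, green = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
print(mix_gamma(red, green))   # (0.5, 0.5, 0.0) -- the dark, muddy midpoint
print(mix_linear(red, green))  # ~(0.735, 0.735, 0.0) -- noticeably brighter
```

The naive average lands at half the encoded value, which is far less than half the physical light; the linear average produces the brighter midpoint you'd expect from actually mixing the two lights.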
This doesn't account for Tetrachromacy, does it?
https://en.wikipedia.org/wiki/Tetrachromacy
It would be interesting to learn more about colour spaces developed with Tetrachromacy in mind. I guess the rest of us should be classed as visually impaired.
There is a subsection of that page that is more relevant: https://en.wikipedia.org/wiki/Tetrachromacy#Tetrachromacy_in...
I also wonder.
Well yes, I don't think anybody's monitor can render it anyway.
These wavelength-indexed spectra always seem a bit weird... the blue is so cramped! When you plot them by frequency they feel just right. We say "ultraviolet" and "infrared" for a reason; never "infraviolet" or "ultrared".
It's like a piano that had the high notes to the left.
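The asymmetry is easy to quantify. A rough sketch, where the band edges (380–700 nm) and the blue/green boundary (~490 nm) are illustrative choices of mine, not values from the article:

```python
C = 299_792_458  # speed of light, m/s

def thz(wavelength_nm):
    """Frequency in THz for a wavelength given in nm."""
    return C / (wavelength_nm * 1e-9) / 1e12

lo, split, hi = 380, 490, 700  # nm: violet edge, blue/green boundary, red edge

# On a wavelength axis, the blue end is a thin sliver...
blue_frac_wl = (split - lo) / (hi - lo)
# ...but on a frequency axis it takes up a much larger share.
blue_frac_f = (thz(lo) - thz(split)) / (thz(lo) - thz(hi))

print(f"blue share of axis: {blue_frac_wl:.0%} by wavelength, "
      f"{blue_frac_f:.0%} by frequency")
```

With these boundaries the blue side occupies roughly a third of a wavelength axis but about half of a frequency axis, which is exactly the "cramped blue" effect described above.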
> The right approach would have been to select a color appearance model (CIECAM02 is the standard), convert all our colors to this coordinate system, do the mixing in this coordinate system and then convert back to RGB. That being said, I did not want to deal with all the extra complexity that would have come along with this. Instead, I opted for a much simpler approach.
Python's nice `colour` package supports several color appearance models.[1]
[1] https://colour.readthedocs.io/en/master/colour.appearance.ht...
But I'm glad for the ground-truthy approach taken. I'd suggest there's a pattern here: interesting data often goes unavailable because gathering and publishing it doesn't align with the incentives of science or commerce. Often it exists, just sitting on someone's disk, because its owner thinks no one is likely to care.
Screw me, I was reading the title as "Rendering the Vibe Spectrum". I clearly need a break.