Version 1, December 2020.
Caution: this is work in progress! It will take at least a few weeks before it stabilizes...
MacOS Color Management
Color Management is already complex in theory, but can be even more complex in practice, in particular with regard to where it is done (apps, OS, GPU).
I found it difficult to find answers to my questions
and to wrap my head around all this. What follows are my notes, which I'm sharing as they could be useful to someone else. If you have any
comments or corrections, don't hesitate to reach out (français, English, Deutsch).
The situation is probably quite different on MacOS, Windows, and Linux. I know next to nothing about Windows and Linux color management.
This blog applies only to the MacOS environment, with a bias towards photography.
This section is a quick summary of what I understand of color management.
Let's start with an example. When a camera takes a JPEG image, it stores the image as pixels. Each pixel has three numbers, in the range 0-255,
labelled "red", "green", "blue". However, these numbers, just by themselves, are not enough. If I say that some sculpture is 27 high, you'll
want to know whether it's 27 centimeters, inches, feet or meters (to stick to plausible cases). Same thing with pixels, we need to know
the "unit" to give them meaning. For RGB pixels, this will be a color profile, such as "sRGB" (web default), "Adobe RGB" or "ProPhoto RGB"
(frequently seen in photography), or one from among hundreds of others. It shouldn't be possible to have pixel values with no profile tag, but it is.
The usual convention is to assume sRGB in such cases.
A color profile describes the color characteristics of some real or imaginary device, expressed in terms of
a standardized reference (known as the profile connection space (PCS)), typically
CIE XYZ or CIE Lab. Both of these are defined by the CIE (Commission Internationale de l'Éclairage).
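As a concrete anchor for the xyY diagram discussed below, here is how the xy chromaticity coordinates used in such diagrams derive from XYZ. The D65 white point values are standard; the code itself is just an illustration.

```python
def xy_chromaticity(X, Y, Z):
    """Project an XYZ color onto the xy chromaticity plane (the
    horseshoe diagram): normalizing makes brightness drop out."""
    s = X + Y + Z
    return X / s, Y / s

# The D65 white point (XYZ of roughly 0.9505, 1.0, 1.089) lands at the
# well-known chromaticity (0.3127, 0.3290).
x, y = xy_chromaticity(0.9505, 1.0, 1.0890)
```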
When converting from one color profile to another, there are options for how to deal with out-of-gamut colors. If the target gamut
entirely includes the source gamut (eg. converting from sRGB to Adobe RGB), then there is no problem. Otherwise, there are four possible
rendering intents, listed further below (the first two are relevant to photography).
- A color profile describes the gamut of the device, ie the colors it can represent among all the colors a human eye can see. This is typically
shown in a diagram (known as xyY) as in the image above. The partially rounded colored area represents all the colors a normal human eye can see. Along the curved
edge, we have the pure colors of the spectrum, from the shortest wavelength violet (bottom left), thru green (top), to the longest wavelength red (bottom right);
and inside the colored area, we have all the possible mixes of these colors.
The gamut of a profile is the shape inside the colored area (with some profiles, the gamut can lie partially outside the visible colors).
For RGB profiles, a gamut is a triangle with three vertices corresponding to the three
RGB primaries. Any color outside the triangle cannot be represented faithfully by the profile.
On the figure above, the smallest triangle (in red) is the gamut of my laptop display.
The second one (in green) is the sRGB gamut, and the largest triangle (in purple) is Adobe RGB.
- A profile has a curve describing how the pixel numbers relate to how bright the color is.
This is known as a tone response curve (TRC).
The curve can be a regular
power curve or something close, typically with a gamma between 1.8 and 2.2, or it can be linear (gamma of 1.0).
- A profile has a white point, more or less centered in the triangle: the point that corresponds to pixel values (255,255,255). Typical values are D50 and D65, corresponding
to 5000 K (warmer) and 6500 K (cooler) respectively.
- A color profile has mathematical formulae and/or tables to convert numbers between the device's color space and the profile
connection space (eg. XYZ).
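To make the TRC concrete, here is the sRGB tone response curve as a sketch. Note that sRGB is not a pure power curve but a piecewise function that approximates gamma 2.2:

```python
def srgb_encode(linear):
    """Map linear light (0.0-1.0) through the sRGB tone response curve."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded):
    """Inverse: sRGB-encoded value back to linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

# Mid-gray: an encoded value of 0.5 is far more than 0.5 in linear light.
linear = srgb_decode(0.5)  # about 0.214
```

The point of the curve is perceptual: it spends more of the 0-255 code values on the dark tones, where the eye is more sensitive.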
Back to the example, assume that
I configured my camera to generate images using the "Adobe RGB" color profile (this conversion is done by the in-camera RAW to JPG processing).
Once my images are on the MacBook, I can open them with Preview, and they get displayed on my MacBook's screen.
However, we cannot just send the pixel numbers of the image
to the GPU for display: the result is very unlikely to match the colors intended in my images, and the colors will look wrong.
Color management is the
solution: somehow the pixel numbers of my images need to be converted to numbers that will cause the correct colors to be displayed.
These are the four rendering intents mentioned earlier:
- relative colorimetric: colors that are in both gamuts are mapped directly, with adjustment of the white point if necessary; colors outside the target gamut are mapped to a nearby point on the edge of the target gamut (this causes clipping; more on this below);
- perceptual: the colors are smoothly re-distributed into the target gamut, with less of a shift for the more central colors;
- absolute colorimetric: like relative colorimetric, but without any adjustment of the white point;
- saturation: I'm not sure what this does, but it's only relevant for computer graphics.
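The difference between the two photographic intents can be sketched in one dimension (a toy illustration with made-up numbers; real intents operate on three-dimensional colors):

```python
def relative_colorimetric(v):
    """Clip: all out-of-gamut values land on the gamut edge (1.0)."""
    return min(max(v, 0.0), 1.0)

def perceptual(v, source_max=1.3):
    """Compress: rescale smoothly so out-of-gamut detail survives,
    at the cost of shifting the in-gamut colors too."""
    return v / source_max

values = [0.5, 1.1, 1.2]
clipped = [relative_colorimetric(v) for v in values]    # 1.1 and 1.2 collapse to 1.0
compressed = [round(perceptual(v), 3) for v in values]  # 1.1 and 1.2 stay distinct
```

Clipping destroys the distinction between the two out-of-gamut values; compression preserves it but shifts the in-gamut 0.5 as well. That trade-off is exactly what shows up in the experiment further below.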
To be able to do this, we measure (using a colorimeter such as SpyderX) the display to determine
what colors it generates and how it reacts to input numbers. From this we can generate
a color profile that represents the characteristics of the display.
With this color profile, we can
now convert the numbers from the original image, thru the profile connection
space (eg. XYZ), to the display's color space by doing two conversions:
Adobe RGB → XYZ → display profile.
Of course, in practice, this would probably be done by a single 3D lookup table, in the GPU.
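As a sketch of the first hop of such a conversion, here is sRGB → XYZ using the standard sRGB/D65 matrix. A real CMS would use the matrices or lookup tables embedded in the actual profiles, and the exact piecewise sRGB curve rather than the gamma 2.2 approximation used here:

```python
# Standard matrix mapping linear sRGB values to XYZ (D65 white point).
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def srgb_to_xyz(r, g, b):
    """Convert 8-bit sRGB to XYZ: undo the TRC, then apply the matrix."""
    linear = [(c / 255) ** 2.2 for c in (r, g, b)]  # TRC approximated as gamma 2.2
    return tuple(sum(m * c for m, c in zip(row, linear)) for row in SRGB_TO_XYZ)

X, Y, Z = srgb_to_xyz(255, 255, 255)  # white → roughly (0.9505, 1.0, 1.089)
```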
In 2020, MacOS is (as far as I can tell) a fully color-managed environment, and Preview is a typical app. When it opens an image,
it finds out what profile is attached to it. Then, conceptually, two steps are needed to display the image:
image profile (eg. Adobe RGB) → XYZ → display profile.
Essentially, as I understand it, when an application sends pixels to the MacOS API, it provides both the RGB values and the associated color profile.
MacOS then handles the conversion necessary for the display device.
A good overview of color management on MacOS can be found on this Apple page.
In the 1990s, color management was often not handled by the OS, but by the application. This probably explains why
GIMP's Preferences dialog still has (including on MacOS)
an item called "Color Management" (as a counterexample, Preview has no such item). I still find this quite confusing.
I am currently running GIMP 2.10. As installed, it does what it should. However, if
one explicitly provides a monitor profile (including via the checkbox for the system
monitor profile), two display profiles end up being used, resulting in false colors, something like:
Image profile → XYZ → GIMP monitor profile → XYZ → MacOS display profile!!!
Also, for a given image, Image → Color Management → Enable Color Management should be checked. Otherwise, the color profile
information carried with the image is ignored.
Much of the head scratching that led to this blog comes from an attempt to calibrate my MacBook's display.
I have a SpyderX puck and the SpyderXPro software.
But the resulting profile, altho it probably results in better colors, introduces rather significant artefacts. I think this is because
the gamut of my display is rather small, about 65% of sRGB (see image at top of this blog), so there is a lot of clipping.
Here's an experiment to illustrate this.
- The first image on the left is a triangle with a regular gradation of each of the three primaries, tagged with the sRGB profile:
green goes from 255 at top vertex to 0 on bottom line and similarly for red and blue.
- The second image is a screen capture of the first image displayed in Preview on my MacBook.
The screen capture shows the effect of my display's
color profile. If you have a well calibrated monitor with a good gamut, you might see the artefacts too.
This image was originally tagged with the display's color profile (by the screen capture); the profile has been removed in this version to hopefully
make the artefacts more visible.
- The third image is a modified version of the second (via a python script): for every pixel that has one or more of its RGB values at zero (black),
the zero colors of the pixel are saturated to 255, and the non-zero colors are blacked out.
These saturated areas show where the display profile is clipping colors to zero. So for example, at the bottom
we see the area where the green channel goes to zero (on the left, it overlaps with the blue channel also going to zero, hence the cyan color).
In that whole area we have lost all detail in the green channel, it's always zero altho it was varying in the original.
The top edge of this bottom area is quite
visible as an artefact on my screen; this is what triggered this whole exploration. It's amazing to see how little of the captured image
is free of any clipping!
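The per-pixel transform of that script might look something like this (a sketch of what's described above, not the original script; the real one would load and save the image with a library such as Pillow):

```python
def highlight_clipped_zeros(pixel):
    """If any channel clipped to 0, saturate those channels to 255
    and black out the others; leave unclipped pixels untouched."""
    r, g, b = pixel
    if 0 in (r, g, b):
        return tuple(255 if c == 0 else 0 for c in (r, g, b))
    return pixel
```

For example, a pixel where both the green and blue channels clipped, say (130, 0, 0), becomes cyan (0, 255, 255), matching the cyan area described above.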
There is also clipping at 255, but in comparison it's very limited.
Because there is clipping, the
conversion must have been done with a colorimetric intent (presumably relative). Perhaps I would prefer a perceptual intent for the
conversion to my display's color profile: I think the artefacts are more disturbing than the color shifts that come with the latter.
If you're curious, here's my display's color profile.
At first, I thought there might be a bug either with the display profile or with MacOS' handling of it. But probably there is no
bug, it's just a not-so-great display, and any other calibration tool would have similar trouble with it. On my main system, with a decent monitor, I see no issue.
I've been working on a picture viewer in python 3.8, based on wxPython Phoenix (wxWidgets 3.1.4).
As far as I can tell, wxPython is not aware of the color profile of images it handles, and (probably implicitly) tells
MacOS that the pixel values it sends are sRGB. MacOS then does the conversion for the display profile based on this assumption.
So for now, my app reads the image's profile itself and converts to sRGB for the few color profiles I deal with.
I wouldn't be surprised if this is eventually addressed in the framework.
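For the curious, the core of such a conversion can be sketched as follows for matrix-based profiles. The matrices come from the published Adobe RGB and sRGB definitions; everything else is simplified (sRGB's TRC approximated as gamma 2.2, out-of-gamut values clipped), and a robust implementation would use an ICC library such as LittleCMS rather than hard-coded math:

```python
# Adobe RGB (1998) linear values to XYZ (D65).
ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2974, 0.6273, 0.0753],
    [0.0270, 0.0707, 0.9911],
]
# XYZ (D65) to linear sRGB.
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def apply(matrix, vec):
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

def adobe_rgb_to_srgb(r, g, b):
    """Adobe RGB (gamma 563/256) → XYZ → sRGB (approximated as gamma 2.2),
    clipping out-of-gamut values, ie a relative-colorimetric-style mapping."""
    linear = [(c / 255) ** (563 / 256) for c in (r, g, b)]
    xyz = apply(ADOBE_TO_XYZ, linear)
    srgb_linear = apply(XYZ_TO_SRGB, xyz)
    return tuple(
        round(255 * min(max(c, 0.0), 1.0) ** (1 / 2.2)) for c in srgb_linear
    )
```

Note that a fully saturated Adobe RGB green, (0, 255, 0), produces negative red and blue values in linear sRGB, which this sketch simply clips: the same clipping phenomenon discussed in the calibration experiment above.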
Tools and techniques
Here are some useful tools and techniques to explore color management on MacOS.
- The "ColorSync Utility" app to look at the various color profiles available.
- The "Digital Color Meter" app to see the actual pixel values as displayed on the screen.
- The "Screenshot" app to capture images with the pixel values as displayed on the screen. Preview also has a screen capture tool.
- The "Preview" app as reference: it displays images correctly with regard to color management.
- There is no way, as far as I know, to disable the display profile on MacOS. However, when looking at various problems, it can be
useful (and simple) to temporarily set the display profile via
(Apple → System Preferences → Display → Color)
to a different one, eg. a larger one to reduce clipping artefacts, or a more dramatic one such as "ACES CG Linear (Academy Color Encoding System AP1)", which has a gamma of 1.0.
Switching to such a dramatically different profile is what made it clear, in my GIMP testing, that the display profile conversion was being done twice.