Gamma Control For Mac
This is the second installment of a 2-part guest post by Jim Perkins, a professor at the Rochester Institute of Technology's medical illustration program. His first post detailed why it's a good idea to calibrate your computer monitor regularly. This next post walks us through the process and explains the mysterious settings known as gamma and white point.
When you connect the colorimeter and run the calibration software, it will ask you to choose several settings. The two most important are gamma and color temperature, both of which are fairly difficult concepts to understand.
Gamma is the relationship between the numerical value of a pixel in an image file and the brightness of that pixel when viewed on screen. The computer translates the numerical values in the image file into voltage that is sent to the monitor. This relationship is non-linear, meaning that a change in voltage does not translate into an equivalent change in brightness. For almost all TVs and computer monitors, brightness is proportional to the voltage raised to roughly the 2.5 power. The gamma for these devices, therefore, is said to be 2.5.
Gamma correction is a way of compensating for this non-linear relationship between voltage and brightness. A combination of hardware and/or software can reduce the gamma to something closer to 1.0, i.e. a perfect linear relationship. This helps ensure that a change in pixel value in the digital file translates into a proportional change in brightness on screen.
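To make the last two paragraphs concrete, here is a minimal sketch in Swift (my own illustration, not from the original post), assuming a native display gamma of 2.5: the inverse exponent is applied to the pixel value first, so the value pushed through the display's power-law response comes back out roughly proportional to the original.

```swift
import Foundation

// Assumed native response of the monitor: brightness = signal ^ 2.5
let displayGamma = 2.5

// What the screen emits for a given normalized drive signal (0...1).
func displayResponse(_ signal: Double) -> Double {
    pow(signal, displayGamma)
}

// Gamma correction: pre-apply the inverse exponent to the pixel value.
func gammaCorrect(_ value: Double) -> Double {
    pow(value, 1.0 / displayGamma)
}

for value in stride(from: 0.0, through: 1.0, by: 0.25) {
    let uncorrected = displayResponse(value)
    let corrected = displayResponse(gammaCorrect(value))
    print(String(format: "pixel %.2f  uncorrected %.3f  corrected %.3f",
                 value, uncorrected, corrected))
}
```

With the correction in place, a pixel value of 0.5 lands at about 0.5 on screen instead of roughly 0.18, which is the proportional behavior described above.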
The original Mac was designed from the outset to be a graphic arts system. Its release coincided with the introduction of the Apple LaserWriter, the Linotype Linotronic imagesetter, and Aldus PageMaker, the first page layout program. All of these components were tied together by the PostScript page description language, also released in 1984 by a fledgling company called Adobe. This launched the desktop publishing revolution of the mid-1980s and beyond. It was no coincidence that Apple chose a system gamma that was geared towards print output.
Windows PCs, on the other hand, have never had built-in gamma correction, although this is an option on some graphics cards. This reflects the fact that PCs were always targeted towards business and the mass consumer market rather than to graphics professionals. With no hardware correction, the Windows system gamma is about 2.2.
With the release of Mac OS X 10.6 (Snow Leopard) in 2009, Apple finally changed their default system gamma from 1.8 to 2.2. They did this, of course, to ensure that video games and web images looked the same on Mac and PC systems. In doing so, however, they abandoned their traditional base of support among graphics professionals.
The choice of gamma settings, therefore, is no longer dictated by the computer platform or operating system. Instead, when calibrating your monitor, you can choose a gamma setting that is best suited to the type of work you normally do. This will override the built-in settings of the system.
So which color temperature setting is best? As with the gamma setting, it depends on what kind of work you do. For many years, the standard color temperature setting for graphic arts work was 5000K (also known as D50). This is closest to neutral white and simulates common lighting conditions for reading printed materials. Therefore, I feel this is the ideal color temperature to select if you do mostly work for print.
5. When prompted, select the values for gamma and color temperature. If you do mostly print work, I recommend gamma 1.8 and 5000K (D50). If you create mostly web graphics, game assets, or other images viewed on screen, choose gamma 2.2 and 6500K (D65).
The Datacolor Spyder3Pro, X-Rite/Pantone ColorMunki Display, and the newer Pantone Huey Pro provide a choice of three or four gamma settings (including 1.8 and 2.2) as well as a choice of color temperature settings (including D50 and D65). These three models are the ideal choices for most digital artists and photographers. All three are in the $100-200 range, a small price to pay for accurate color on screen.
DarkAdapted modifies your screen gamma settings so that you may, for example, preserve your dark adaptation while using your computer. DarkAdapted is being used by astronomers, planetarium operators, graphics professionals, medical professionals, airline pilots, air traffic controllers, and others worldwide to provide flexible, dynamic control over their monitor's gamma (screen color response).
I am reusing two old monitors for which I found no custom way to set brightness, contrast, and gamma (see appendix below for full details). On a bright background, the colors look dull and I want to change the gamma correction to something close to a MacBook Pro Retina display, which has sharp colors on the same background.
On a Windows computer with the same monitor, I can change brightness, contrast, and gamma for the NVIDIA GeForce GT 220 graphics card with the utility at C:\Program Files\NVIDIA Corporation\Control Panel Client\nvcplui.exe. It shows three sliders for brightness, contrast, and gamma that take effect immediately.
On macOS, the Display Calibrator Assistant (from System Preferences > Displays) asks to change brightness and contrast with the monitor's buttons and to choose a white point. It says nothing about gamma, and does not change any of the settings.
I found this page that addresses the same problem. The third option mentions an open-source program, whose link redirects to a blog where I found the tool, which is for Windows only. I installed .NET on a Windows computer and was able to run the tool, but it only controls brightness, not contrast and gamma.
This should be a completely normal use case, and the call should simply do nothing if the gamma tables are already at their default values. But for unknown reasons, this API call will sometimes set all RGB gamma tables to 0, and all colors then show up as black.
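The passage above does not name the call involved, but display gamma on macOS is commonly adjusted through Quartz Display Services. The sketch below is only an illustration of that route, with 1/2.2 as an arbitrary example exponent; it is not the specific code being discussed.

```swift
import CoreGraphics

// Minimal sketch: adjust the main display's transfer curve with
// Quartz Display Services, then restore the ColorSync defaults.
let display = CGMainDisplayID()
let exponent: CGGammaValue = 1.0 / 2.2   // example value only

let err = CGSetDisplayTransferByFormula(display,
                                        0, 1, exponent,   // red:   min, max, gamma
                                        0, 1, exponent,   // green: min, max, gamma
                                        0, 1, exponent)   // blue:  min, max, gamma
if err != .success {
    print("Could not set the display transfer formula: \(err)")
}

// Hand control back to the profile the system normally applies.
CGDisplayRestoreColorSyncSettings()
```

As far as I know, changes made this way last only for the lifetime of the process; once the program exits or the restore call runs, the system's calibrated profile takes over again.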
Gamma is an important but seldom understood characteristic of virtually all digital imaging systems. It defines the relationship between a pixel's numerical value and its actual luminance. Without gamma, shades captured by digital cameras wouldn't appear as they did to our eyes (on a standard monitor). It's also referred to as gamma correction, gamma encoding or gamma compression, but these all refer to a similar concept. Understanding how gamma works can improve one's exposure technique, in addition to helping one make the most of image editing.
2. Gamma-encoded images store tones more efficiently. Since gamma encoding redistributes tonal levels closer to how our eyes perceive them, fewer bits are needed to describe a given tonal range. Otherwise, an excess of bits would be devoted to describing the brighter tones (where the camera is relatively more sensitive), and a shortage of bits would be left to describe the darker tones (where the camera is relatively less sensitive).
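A rough way to see this (my own numbers, not from the article) is to count how many 8-bit codes land in the darkest part of the tonal range for a linear file versus a 1/2.2 gamma-encoded one:

```swift
import Foundation

// Count how many 8-bit codes describe linear luminance below 1/32
// (about five stops under white) for each encoding.
let threshold = 1.0 / 32.0
let encodingGamma = 1.0 / 2.2

var linearCodes = 0
var gammaCodes = 0
for code in 0...255 {
    let normalized = Double(code) / 255.0
    // Linear file: the stored code is directly proportional to luminance.
    if normalized < threshold { linearCodes += 1 }
    // Gamma-encoded file: decode back to linear light first (raise to 2.2).
    if pow(normalized, 1.0 / encodingGamma) < threshold { gammaCodes += 1 }
}
print("Codes below 1/32 luminance — linear: \(linearCodes), gamma-encoded: \(gammaCodes)")
```

This prints roughly 8 versus 53: the gamma-encoded file spends far more of its 256 levels on the shadows, where our eyes are most discriminating.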
3. System Gamma. This represents the net effect of all gamma values that have been applied to an image, and is also referred to as the "viewing gamma." For faithful reproduction of a scene, this should ideally be close to a straight line (gamma = 1.0). A straight line ensures that the input (the original scene) is the same as the output (the light displayed on your screen or in a print). However, the system gamma is sometimes set slightly greater than 1.0 in order to improve contrast. This can help compensate for limitations due to the dynamic range of a display device, or due to non-ideal viewing conditions and image flare.
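To put a number on this, the viewing gamma is simply the product of the exponents applied along the chain: with the usual file encoding gamma of 1/2.2 and a display gamma of 2.2, the overall exponent is (1/2.2) × 2.2 = 1.0, a linear end-to-end response.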
The precise image gamma is usually specified by a color profile that is embedded within the file. Most image files use an encoding gamma of 1/2.2 (such as those using sRGB and Adobe RGB 1998 color), but the big exception is with RAW files, which use a linear gamma. However, RAW image viewers typically display these assuming a standard encoding gamma of 1/2.2, since they would otherwise appear too dark.
If no color profile is embedded, then a standard gamma of 1/2.2 is usually assumed. Files without an embedded color profile typically include many PNG and GIF files, in addition to some JPEG images that were created using a "save for the web" setting.
This is the gamma that you are controlling when you perform monitor calibration and adjust your contrast setting. Fortunately, the industry has converged on a standard display gamma of 2.2, so one doesn't need to worry about the pros/cons of different values. Older Macintosh computers used a display gamma of 1.8, which made non-Mac images appear brighter relative to a typical PC, but this is no longer the case.
The overall display gamma is actually made up of (i) the native monitor/LCD gamma and (ii) any gamma corrections applied within the display itself or by the video card. However, the effect of each is highly dependent on the type of display device.
LCD Monitors. Unlike CRTs, whose native response already sits close to the 2.2 standard, LCD monitors weren't so fortunate; ensuring an overall display gamma of 2.2 often requires substantial corrections, and they are also much less consistent than CRTs. LCDs therefore require something called a look-up table (LUT) in order to ensure that input values are depicted using the intended display gamma (amongst other things). See the tutorial on monitor calibration: look-up tables for more on this topic.
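The following is a toy sketch of what such a table does (my own illustration, with made-up numbers): it remaps input codes so a panel whose native response is, say, gamma 1.6 ends up behaving like the standard gamma 2.2. Real LUTs come from measurement and often use more entries and separate red/green/blue curves.

```swift
import Foundation

let nativeGamma = 1.6      // assumed uncorrected panel response
let targetGamma = 2.2      // what calibration is aiming for

// For each input code, find the drive level whose native output matches the
// target output: solve drive^native = input^target, i.e. drive = input^(target/native).
let lut: [UInt8] = (0...255).map { code in
    let input = Double(code) / 255.0
    let drive = pow(input, targetGamma / nativeGamma)
    return UInt8((drive * 255.0).rounded())
}

print("Mid-grey (code 128) is driven at \(lut[128]) so this panel lands on the 2.2 curve")
```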
Gamma is important because it affects the appearance of dark areas like blacks and shadows, as well as midtones and highlights. Monitors with poor gamma can either crush detail at various points or wash it out, making the entire picture appear flat and dull. Proper gamma gives the image more depth and realism and a more three-dimensional look.
If you have good gamma settings, your monitor will display better image quality and depth. But poor settings will remove essential details in the shadows and highlights. You should also modify your brightness and contrast settings since they also affect the calibration of gamma.
As soon as we compute the final pixel colors of the scene, we have to display them on a monitor. In the old days of digital imaging, most monitors were cathode-ray tube (CRT) monitors. These monitors had the physical property that twice the input voltage did not result in twice the brightness. Instead, brightness followed a power relationship with an exponent of roughly 2.2, known as the gamma of the monitor. This happens to (coincidentally) closely match how human beings perceive brightness, which follows a similar (inverse) power relationship.