Photographers rarely think twice when it comes to buying expensive camera bodies or high-end lenses, but often seem willing to skimp on a monitor. Why is that? In many cases, it’s because one monitor appears very much like another, especially when purchased over the internet, which is how many of us shop for such things.
This article will help you know what you should look for in a monitor, and show you how to interpret many of the tech specs you’ll see when shopping online. Not so long ago, buying a monitor for photography was an expensive business, but today there is more choice available at every price point.
One of the first things to consider when choosing a monitor is panel technology. The “panel” is the main part of the monitor: the screen itself. It comprises polarizing layers, glass substrates, a liquid-crystal layer, and a color filter. It’s a high-tech sandwich.
The main difference between monitor technologies lies in how the liquid crystals are oriented, which fundamentally affects the way your monitor behaves. Here are the three main panel types:
TN (Twisted Nematic) panels are often favored by gamers for their fast response times, which reduce unwanted ghosting and blurring in moving images. The biggest downside of TN panels is that their viewing angles are greatly inferior to those of other panel types. If you move in front of the screen, the color and contrast are liable to shift in appearance. This flaw varies in severity from monitor to monitor.
Be aware that viewing-angle numbers in monitor specs are highly misleading. They’re based on a lenient contrast test, so you should ignore the common claim that a TN panel has 170/160° horizontal and vertical viewing angles. Those figures bear little relation to what you’ll experience when editing a photo.
Most laptops are made with TN panels, which makes them sub-optimal for photo editing. They’re more usable if you can fix your position in front of the screen and maintain a consistent viewing angle.
IPS (In-Plane Switching) panels are consistent in appearance from almost any practical viewing angle. In this respect, they are far superior to most TN panels and better than VA panels. IPS panels are also favored for their innately high-quality color reproduction. In most regards, a monitor with an IPS panel is better for photo editing than one with a TN panel.
A drawback of IPS technology is a phenomenon known as “IPS glow”, which is a glowing effect that appears across much of the panel when viewing dark screens in subdued light. The more money you spend on an IPS monitor, the less likely you are to encounter this, but it’s probably fair to say that it’s more problematic for gamers. IPS glow is different to backlight bleeding, where light appears to seep out from the edges of the screen. That, too, is more likely in budget or mid-priced monitors.
There are various sub-categories of IPS panel, including S-IPS, e-IPS, H-IPS, and P-IPS. The basic benefits of an IPS panel apply to all of them, though the different types may vary in areas like color depth or response time. An e-IPS panel, for instance, is usually cheaper because it typically runs at a lower color depth (i.e. 6-bit) than other IPS types. We’ll look at color depth later.
Proprietary technologies that are similar in behaviour to IPS panels are Super PLS (Samsung) and AHVA (AUO).
VA (Vertical Alignment) panels are not considered as good as IPS panels in terms of viewing angles or color reproduction, but they are better than TN panels in both respects. They are something of a happy medium. The technology is relatively rare, but it is still used by some leading manufacturers in a minority of their displays.
A VA panel typically has a higher contrast ratio than an IPS panel, with an ability to display dark tones and blacks very effectively. High contrast ratios are not always as desirable to photographers as they are to gamers, however, because they make it harder to imitate the dynamic range of a print when soft-proofing.
There is no right or wrong answer when deciding whether to buy a standard-gamut or wide-gamut monitor, but there are pros and cons attached to either choice.
Rather oddly, I run standard-gamut and wide-gamut monitors side by side, and the difference in colors is marked. However, with monitors as with many other things, ignorance is bliss: you don’t miss what you never had.
You’ll find the cheapest monitors typically have a 16:9 aspect ratio, which is fine for watching movies, but a 16:10 aspect ratio is worth aiming for if you can afford it. The latter allows a little more vertical working space and, as Wikipedia notes, is a closer fit for the classic 3:2 ratio used in many photos.
For many years, a myth circulated that said your photos should have a 72ppi resolution for the web. In fact, as most of us now know, a monitor screen is oblivious to image resolution. This is proven, if proof is still needed, by the fact that Photoshop’s “Save for Web” feature does not embed a resolution value in images, even though they appear as 72ppi when reopened.
Although several factors may affect the sharpness of an image on your screen (e.g. contrast, anti-glare filters, viewer-to-screen distance), the central thing that dictates sharpness is the monitor’s pixel density, or dot pitch. A greater pixel density, or a finer dot pitch, indicates a sharper onscreen image, all other things being equal. If you google “PPI calculator”, you’ll find an easy means of calculating the pixel density of any screen.
As an example, an average desktop monitor might have a pixel density of around 90-100 ppi, while the 27” 5K iMac with Retina display has a pixel density of 217 ppi. That’s impressive for such a big screen.
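Those calculators do nothing more than divide the screen’s diagonal resolution in pixels by its diagonal size in inches. A minimal sketch (the 24” monitor here is just an illustrative example of a typical desktop display):

```python
import math

def pixel_density(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

# A typical 24" 1920x1200 desktop monitor:
print(round(pixel_density(1920, 1200, 24)))   # about 94 ppi
# The 27" 5K iMac (5120x2880):
print(round(pixel_density(5120, 2880, 27)))   # about 218 ppi (usually quoted as 217)
```

Run it with any screen’s pixel dimensions and diagonal size to compare candidates directly.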
An extremely high pixel density tends to have a flattering effect on photos (just as every photo looks sharp on a smartphone), but it isn’t a necessity for efficient photo editing.
These days, “bigger is better” seems to be the mantra when it comes to choosing a monitor. Of course, it is pleasant to view your photos on a big screen, but my advice is to buy what you can afford and don’t give precedence to screen size over other important attributes. Also remember that big screens need big resolutions to look as sharp as smaller screens from the same distance, so don’t be deceived by pixel dimensions alone. Scrutinize the pixel density, as outlined above.
Apart from Apple iMacs, nearly all desktop monitors are equipped with anti-glare filters for the obvious purpose of cutting out distracting reflections. This gives the surface of the screen a matte finish. The degree to which it affects the sharpness of the screen image varies a lot, ranging from imperceptible to a noticeable grainy effect. You might make an analogy with glossy versus matte prints; the glossy print typically looks a little sharper.
Avoiding an anti-glare filter in a monitor is almost impossible anyway, but it is worth researching how much the filter affects the image on your desired screen before buying. Ideally, of course, get a look at the monitor in person before investing, and always check negative reviews when buying online.
On to a slightly complicated subject, which we’ll attempt to keep simple. Color depth relates to how many distinct colors a monitor can display.
Theoretically, the more colors a monitor can display, the more smoothly it can reproduce gradual changes in tone and the less prone it is to frustrating “banding” or posterization effects (characterized by ugly pixelated blocks of color).
Most monitors on the market have one of two specs: true 8-bit color, or 6-bit color plus FRC (Frame Rate Control, a form of dithering).
The second of these uses dithering to create colors that aren’t there, which is theoretically inferior to a monitor that can natively display 8-bit color. A monitor with 6-bit color is more prone to banding problems, as previously described.
Note that calibrating a monitor increases the likelihood of banding, so greater color depth offsets this and effectively makes a monitor more adjustable. Laptop screens nearly always use 6-bit color, so they should ideally be calibrated conservatively.
You may see 10-bit color in more expensive monitors. This, again, could be genuine 10-bit color depth or 8-bit + FRC. Bear in mind that a 10-bit monitor can only display its 1.07 billion colors if 10-bit is supported by your graphics processor, software and video connection.
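The arithmetic behind these figures is simple: each channel gets 2 to the power of its bit depth in levels, and the three channels (red, green, blue) multiply together. A quick sketch:

```python
def color_count(bits_per_channel: int) -> int:
    # 2^bits levels per channel, raised to the third power for R, G, and B.
    return (2 ** bits_per_channel) ** 3

for bits in (6, 8, 10):
    print(f"{bits}-bit: {color_count(bits):,} colors")
# 6-bit: 262,144 colors
# 8-bit: 16,777,216 colors (the familiar "16.7 million")
# 10-bit: 1,073,741,824 colors (the "1.07 billion" mentioned above)
```

This is why the jump from 6-bit to 8-bit matters so much for smooth gradients: it multiplies the available palette sixty-four-fold.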
Hardware LUT calibration is a fancy feature you’ll find in some high-end monitors from Eizo and NEC as well as a few consumer brands.
An LUT is a lookup table, which maps the input signals from your PC into, typically, 8-bit RGB color output from your LCD monitor.
In a monitor, greater color depth allows for smoother, more nuanced tonal transitions without banding. Like a monitor, an LUT may also vary in its color depth; the more colors it can process, the better the monitor will be at displaying smooth tones and precise color.
The above is true even if the final output is an 8-bit monitor, so a 10, 12, 14, or 16-bit LUT produces better color in an 8-bit monitor than an 8-bit LUT. The difference between a 10-bit and 16-bit LUT may be less appreciable.
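To see why extra LUT precision helps even when the final output is 8-bit, here is a toy sketch. It is not a model of any particular monitor’s pipeline; the gamma tweak is an arbitrary stand-in for real calibration data. It counts how many distinct tonal levels survive when a smooth 256-step gradient is pushed through a correction curve evaluated at different precisions:

```python
def distinct_levels(lut_bits: int, gamma: float = 1.8 / 2.2) -> int:
    """Push a 256-step gradient through a calibration curve evaluated at
    lut_bits of precision, and count how many distinct levels survive."""
    max_code = 2 ** lut_bits - 1
    # The curve is an arbitrary gamma adjustment standing in for real
    # calibration data; any non-linear correction behaves similarly.
    return len({round((i / 255) ** gamma * max_code) for i in range(256)})

# Correction applied at 8-bit precision: some neighboring tones collapse
# into the same code, which shows up onscreen as banding.
print(distinct_levels(8))   # fewer than 256
# The same correction in a high-bit LUT keeps every input level distinct;
# the monitor can then dither back down to the panel's native depth.
print(distinct_levels(16))  # 256
```

The exact numbers depend on the curve, but the pattern holds: the low-precision pipeline merges levels, while the high-bit one preserves them all.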
The type of hardware calibration under discussion here doesn’t refer to the use of a hardware device like a Spyder. Instead of storing an 8-bit LUT in your video card, as most monitors do, expensive graphics monitors usually have a high-bit LUT built into their own hardware for more refined calibration. You’ll still use a calibration device to measure your monitor’s color, but the final color reproduction should be superior.
Expensive graphics monitors often allow you to store and switch between calibration profiles, so you can alter calibration settings with the click of a mouse using proprietary software. This is impossible in normal monitors, where calibration data is loaded into the video card LUT on startup and not changeable without recalibrating your monitor.
When choosing a monitor for photography, panel type is king. If you buy the best IPS (or equivalent) monitor you can afford, the other features are frosting on the cake. Good luck!