While we’re on the subject of image sensors and pixel-limited resolution, it is worth considering the question of what size your images should be for computer reproduction – say to put on a website like this one.
If we again take the Canon T2i as typical, the 18 M pixel sensor chip consists of a 5184 X 3456 pixel array. As discussed, this defines some ultimate resolution for an image. For computer monitors, a typical high resolution is 1024 X 768 pixels. This is 0.786432 M pixels. That said, there is a trend toward higher and higher pixel density monitors. For a fairly comprehensive and useful list of monitor pixel densities, see this site. The point is that this is what your monitor is capable of displaying. Anything else is wasted space – or, more accurately, wasted memory. And it can seriously slow down your website, because of download times.
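To put concrete numbers on that comparison, here is a quick back-of-the-envelope calculation (the figures come straight from the paragraph above):

```python
# Sensor resolution of the Canon T2i vs. a typical monitor.
sensor_px = 5184 * 3456   # 17,915,904 pixels, i.e. ~17.9 M pixels
monitor_px = 1024 * 768   # 786,432 pixels, i.e. ~0.79 M pixels

# The sensor captures far more pixels than the screen can show;
# the excess is wasted memory (and download time) on a web page.
print(sensor_px / monitor_px)  # ~22.8 times more pixels than the display
```

In other words, a full-resolution T2i image carries nearly 23 times the pixels a 1024 X 768 monitor can actually put in front of your eyes.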
In addition, you don’t usually display an image as full screen. A good size, based on aesthetics and the need to have text with your pictures, is about 600 pixels X 400 pixels = 0.24 M pixels. Also, remember that bigger displays, like the ones at sports events and Times Square, are meant to be viewed at greater distances. So the issue with monitor display is usually the number of pixels as a percentage of the screen, not pixels per inch as it is with printing. So the happy news is that, where the purpose is displaying images on computers – websites, social media, and emails to Aunt Tilly – there’s no need to send huge images.
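The arithmetic for shrinking an image down to a web-friendly size is simple enough to sketch in a few lines. The 600-pixel target and the function name here are just illustrative choices, not anything a particular tool prescribes:

```python
def web_dimensions(width, height, max_width=600):
    """Return (width, height) scaled so the width is at most
    max_width, preserving the aspect ratio."""
    if width <= max_width:
        return (width, height)
    return (max_width, round(height * max_width / width))

# A full 5184 x 3456 frame from the T2i becomes a tidy 600 x 400:
print(web_dimensions(5184, 3456))  # (600, 400)
```

Any photo editor's "resize for web" feature is doing essentially this calculation before resampling the pixels.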
One caveat to keep in mind is that Aunt Tilly might want to have the picture printed, and there, as we shall discuss, the critical numbers are pixels per inch and the size of the picture. I recently wanted to print a family picture someone had posted on Facebook. It was great on the monitor, but as a print – no way!
BTW – there is an interesting and well written website that I would recommend as further reading on the topic of “the myth of dots per inch.” This site also explains several of life’s more profound mysteries, like: why does Photoshop default to 72 pixels per inch, why is your computer screen called a desktop, and why does it have a virtual trash barrel – not to mention the all-consuming question: “what’s a pica?”
It appears that in 1737 Pierre Fournier created the cicero as a unit to measure printing type. Six ciceros were almost an inch – actually 0.998 in. Then, around 1770, François-Ambroise Didot made the cicero bigger, so that it evenly divided the French foot into 6 X 12 = 72 equal parts; this makes one Didot point 0.1776 in. Today, and as established in 1886, the American Point System defines a “pica” as 0.166 in., so that 6 of these make up 0.996 inches. We have (ready for this?) 12 points per pica and 6 picas per inch. That is 72 points per inch. Heard that number before? 72 dots (or points) per inch.
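The unit bookkeeping above is easy to verify with a few lines of arithmetic, using only the numbers quoted in the paragraph:

```python
# American Point System (1886): one pica is defined as 0.166 in.
pica_in = 0.166
picas_per_inch = 6
points_per_pica = 12

print(picas_per_inch * pica_in)          # 6 picas come to about 0.996 in.
print(points_per_pica * picas_per_inch)  # 12 x 6 = 72 points per inch
```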
When Apple introduced the Macintosh in 1984, they wanted something people could relate to. Type was meant to follow the rules of print, as in 72 points per inch. How many people can relate to that – I mean really? We worked on “desktops” and threw old files in the “trash.”