The artificial intelligence inside your camera

Figure 1 – An example of an artificial intelligence system. The top image is the query image. The program, called CIRES, for content-based retrieval system, then searches the web or another database to retrieve similar images, which are shown at the bottom. The program is following a set of rules that define the image and here ultimately lead to a magenta flower, butterfly optional. From the Wikimedia Commons and in the public domain.

We have discussed the fact that cameras used to have a persona about them. It was almost as if they were people, you know, the person who never takes a good picture of you. Today, cameras have become so small and so integrated into our lives that they have lost this persona. They’re just there. And it is really a paradox: just as we have given our cameras artificial intelligence, and therefore a real persona, we have come to take them for granted.

Cameras with artificial intelligence? It almost seems like a silly statement to make. Part of the problem is that we take artificial intelligence for granted. Ray Kurzweil has pointed out in his book, “The Singularity Is Near,” that every time we invent or create artificial intelligence, we call it something else. And this is clearly the case for the modern digital camera.

Photography is a semi-complex process. However, if you break it down, everything that you do in taking and processing a photograph is a series of reasonably well-thought-out steps. Automating a process, that is, translating it from a task performed by a human to a task performed by an automaton or machine, begins with translating the criteria used in the process; after that, the steps come easily.

Perhaps the earliest element of the “taking a picture” process to be automated was automatic exposure. We are really talking here about through-the-lens metering. Early systems took an average, or sometimes a central spot reading, and set the shutter speed and f-stop that would render that reading as neutral grey. Such a system works in some cases but in others is wholly unacceptable. So more and more exposure points were added, and the camera’s microprocessor tried to anticipate what was important to you. Or you can give it a hint by setting it to close-up, night scene, landscape, back-lit, whatever. And don’t be so sure that the human is still required; the systems are getting better and better.
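To make the idea concrete, here is a minimal sketch, in Python, of what averaged metering might look like. The middle-grey target value, the centre weighting, and the luminance-array input are illustrative assumptions, not any particular camera's firmware.

```python
# A minimal sketch of averaged through-the-lens metering, assuming an 8-bit
# greyscale luminance array is available from the sensor.
import numpy as np

MIDDLE_GREY = 118  # roughly 18% reflectance on an 8-bit gamma-encoded scale (assumption)

def metered_exposure_compensation(luminance: np.ndarray, centre_weight: float = 0.7) -> float:
    """Return an exposure adjustment in stops (EV) that would bring the
    weighted scene average to middle grey."""
    h, w = luminance.shape
    centre = luminance[h // 4: 3 * h // 4, w // 4: 3 * w // 4]

    # Blend a central spot reading with the full-frame average.
    average = centre_weight * centre.mean() + (1.0 - centre_weight) * luminance.mean()

    # One stop doubles or halves the light, so the correction is a log2 ratio.
    return float(np.log2(MIDDLE_GREY / max(average, 1.0)))

# Example: a dim scene averaging ~30 needs roughly +2 stops of exposure.
scene = np.full((480, 640), 30, dtype=np.uint8)
print(f"Suggested compensation: {metered_exposure_compensation(scene):+.2f} EV")
```

Scene modes can be thought of, roughly, as presets that change the weighting or the target before a calculation like this one runs.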

What about autofocus? The first autofocus systems were developed by Leica in the 1960s and 1970s. By the 1980s autofocus was becoming popular, widespread, and sought-after. I remember, as a proud Leica M3 rangefinder user, thinking at the time: who needs that? Anyone can focus a camera. Well, let me go on record and say that I was wrong, really way off. The autofocus systems of today are just amazing. Truly, there is a wee little person in my T2i doing all the work for me. Yes, sometimes I can do better with manual focus. However, since they eliminated the split-screen focusing system, it has gotten a bit harder.

Think about the process of focusing. How do you automate that? In the next few technical blogs I’d like to explore that question. But for now let me just say that automating the process was a matter of determining how the human brain measures sharpness of focus and then mimicking it with a microprocessor.
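As a taste of what those later posts will unpack, here is a minimal sketch of contrast-detection autofocus, one common way the "measure sharpness, then maximize it" idea is implemented. The capture_frame callback and the brute-force sweep over lens positions are illustrative assumptions; real cameras refine the search with hill-climbing and, increasingly, phase detection.

```python
# A minimal sketch of contrast-detection autofocus, assuming we can request a
# frame at any lens position through a hypothetical capture_frame(position) callback.
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Score focus as the variance of horizontal and vertical pixel differences;
    a sharp image has stronger local contrast than a blurred one."""
    gx = np.diff(image.astype(float), axis=1)
    gy = np.diff(image.astype(float), axis=0)
    return float(gx.var() + gy.var())

def focus_by_contrast(capture_frame, positions):
    """Sweep the lens positions and return the one with the highest sharpness score."""
    scores = {p: sharpness(capture_frame(p)) for p in positions}
    return max(scores, key=scores.get)

# Toy example with a fake lens: frames carry the most high-frequency detail
# (here, noise, which this metric reads as sharpness) near position 12.
rng = np.random.default_rng(0)
fake_lens = lambda p: rng.normal(128, 40 - abs(p - 12), size=(120, 160))
print("Best focus position:", focus_by_contrast(fake_lens, range(0, 25)))
```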

You might be inclined to say, all right then, but composition, that’s untouchable. As we move towards 2025, when it is predicted that we will have computers with the processing power of the human mind, and something like forty years after that, of the entire human race, I think it is a mistake to say never. After all, when we compose we tend to follow, or violate, a set of rules or conventions, and rules can be programmed into a machine. The aesthetics of a beautiful woman, a sweet child, or a handsome man can all be pretty closely defined and will ultimately be translated into a machine language.
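As one small example of a composition rule a machine can already evaluate, here is a sketch that scores how closely a detected subject sits to a rule-of-thirds intersection. The scoring function and its fall-off are illustrative assumptions, not an established aesthetic model.

```python
# A minimal sketch of scoring one composition convention, the rule of thirds,
# given a detected subject position and the frame size.
from itertools import product
import math

def rule_of_thirds_score(subject_xy, frame_wh):
    """Return 1.0 when the subject sits on a thirds intersection,
    falling toward 0.0 as it drifts away."""
    (x, y), (w, h) = subject_xy, frame_wh
    power_points = [(w * i / 3, h * j / 3) for i, j in product((1, 2), repeat=2)]
    nearest = min(math.dist((x, y), p) for p in power_points)
    return max(0.0, 1.0 - nearest / (0.5 * math.hypot(w, h)))

# A subject near the upper-left thirds intersection scores higher than one dead centre.
print(rule_of_thirds_score((640, 360), (1920, 1080)))   # on a power point -> 1.0
print(rule_of_thirds_score((960, 540), (1920, 1080)))   # centred -> lower score
```

A camera, or a service behind it, could combine many such rule scores to suggest a crop, which is exactly the kind of convention-following described above.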

Right now I am a bit suspicious of the images coming off of the Hubble Space Telescope. They are often so beautiful. Is this because they are intrinsically so, or is someone cropping them and choosing the color lookup tables to make them appealing to human viewers? However, ultimately, perhaps regrettably, all of our aesthetic preferences can be measured and coded.

It is also important to remember that all of the processing power need not be on a chip inside your camera. The near-infinite resources of the cloud are available. Siri is not inside your iPhone, but rather somewhere on a big computer in Cupertino, CA.

In the meantime we can marvel at the complexity of our cameras and the degree to which artificial intelligence has been incorporated into them. My suggestion is that you decide whether your camera’s persona is male or female. Then give him or her a name. Start talking to this person. Pretty soon they will be responding.

 
