Research Topic: Supercomputing Retina
By Ian Mander, 26 May 2003. Minor tweaks 1 May 2009.
Question: Does the eye really do as much computing as a supercomputer?
How could it do that when supercomputers dissipate tens of kilowatts of
heat?
Answer: The human retina has about 10 million (or so) cells, with
complicated interconnects that allow it to do processing for edge detection,
movement, etc. It's thought to perform about 10 billion analogue add operations
per second. That's enough to put it into the supercomputer class, at least
by the definition of "supercomputer"
of a half dozen years ago. The fastest modern supercomputer at the time of
writing is capable of over 35 trillion floating point operations per second,
or 35 teraflops. (The second and third fastest are both around 7.7 teraflops.)
However, the retina even outclasses these beasts, because the analogue
way the eye works is far more efficient than a digital computer and so
requires far fewer operations to do the same thing.
It's also because of this efficiency that the retina does this with very
little heat output – supercomputing without wasted heat.
A Closer Look
There are two issues here. One is how much preprocessing the retina
does (and what it equates to for manmade computers), and the other
is the amount of heat it generates doing this.
Someone complained to me by email that "The total heat dissipation
(thermodynamically demanded for non-reversible computation) of modern
supercomputers is adequate to boil a human skull in under a second –
on the order of tens of kW. Even if the retinal processing is orders
of magnitude more energy efficient per bit processed... Merkle estimates
10^10 (10 billion) analog add operations/sec in a given retina, and
Teraflop (10^12, 1000 billion) clusters outstrip this processing performance
easily."
Let's look at those figures, and presume he didn't really mean boiling
the skull itself (made from bone). A back-of-the-envelope calculation
shows that if a teraflop cluster uses 10 kW and each retina does 100
times fewer operations at the same efficiency, that's 100 W times 2
eyes, or about 200 W total. That's about twice what the whole body
actually produces at rest. The heat from a 200 W light bulb (or a pair
of 100 W bulbs) in the brain box would be enough to cook it.
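For anyone who wants to check that arithmetic, here it is as a few lines of Python. The 10 kW and the operation counts are the figures quoted above, not measurements, and the sketch simply scales the cluster's power by the ratio of operation rates:

# Figures from the discussion above; the 10 kW is the assumed cluster dissipation.
cluster_ops_per_sec = 1e12        # a "teraflop" cluster: 10^12 operations per second
cluster_power_watts = 10_000      # about 10 kW
retina_ops_per_sec = 1e10         # Merkle's estimate: 10^10 analogue adds per second per retina

# If a retina were only as efficient per operation as the cluster:
watts_per_retina = cluster_power_watts * retina_ops_per_sec / cluster_ops_per_sec
print(watts_per_retina)           # 100.0 W per retina
print(2 * watts_per_retina)       # 200.0 W for both eyes – roughly twice the whole body at rest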
Since the eyes do all those analogue add ops and still stay at 37°C
– I know because only my glasses steam up, not my eyeballs – my back-of-the-envelope
conclusion is that something else is going on. Maybe much more efficient
computing, or processing in ways that we're only just beginning to understand.
This brings to mind Charles Darwin, who thought that cells were about
as complex as marbles. Today we know they're just a "little"
more complex, with (for example) transport mechanisms in a single cell
rivalling the combined transport systems of Greater London. We live
and learn, and so to the issue of how much a retina actually does...
Processing Power
Taking a quote from the first Google
result for "retina processing supercomputer" (emporium.turnpike.net/C/cs/eye.htm)
with spelling corrected in a couple of places:
It has been estimated that 10 billion calculations
occur every second in the retina before the light image even gets
to the brain! It is sobering to compare this performance to the most
powerful man-made computer. In an article published in the computer
magazine Byte (April 1985) Dr. John Stevens said:
"To simulate 10 milliseconds of the complete
processing of even a single nerve cell from the retina would require
the solution of about 500 simultaneous non-linear differential equations
one hundred times and would take at least several minutes of processing
time on a Cray supercomputer. Keeping in mind that there are 10 million
or more such cells interacting with each other in complex ways it
would take a minimum of a hundred years of Cray time to simulate what
takes place in your eye many times every second."
This backs my complainant's claim that the retina does 10 billion
analogue add operations per second. But that needs to be put into perspective.
How many digital operations would a manmade computer have to
do to equal it? (Let's ignore the comparison with an 18-year-old Cray.)
How many floating point operations are required to solve any 500 simultaneous
non-linear differential equations? I wouldn't have a clue. Let's be
ridiculously conservative and say 1 per equation. (Yeah, right! It's
probably more like dozens per equation.) Calculating it out: 500 equations
× 100 times each × 100 to make one second × 10,000,000 cells gives 5×10^13
operations per second, which is 50 teraflops. It's probably considerably
more after considering higher-level edge and movement detection processing,
the actual number of floating point operations required to solve that
many simultaneous non-linear equations, etc. The retina's advantage
becomes more pronounced when considering the amount of time wasted by
the manmade computer just shifting data around, not doing useful calculations.
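Here's that deliberately low-balled tally written out in Python, using the figures from the Byte quote plus my one-flop-per-equation assumption:

# All figures are from the Byte quote; the last one is the deliberately silly low-ball.
equations_per_cell = 500        # simultaneous non-linear differential equations per cell
solves_per_10_ms = 100          # each set solved one hundred times per 10 ms simulated
ten_ms_per_second = 100         # 100 lots of 10 ms in one second
cells = 10_000_000              # 10 million (or more) retinal cells
flops_per_equation = 1          # ridiculously conservative assumption

ops_per_second = (equations_per_cell * solves_per_10_ms * ten_ms_per_second
                  * cells * flops_per_equation)
print(ops_per_second)           # 50,000,000,000,000 – i.e. 5x10^13, or 50 teraflops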
So even the fastest of today's supercomputers are well short of being
able to do in real time what your retina does all day. Off popcorn.
Or pizza and Mountain Dew, if that's what you prefer.
Don't forget most people have two super-computing retinae. 8-)
Heat
Any argument that the retina couldn't do that many calculations because
it would boil the brain (not the skull!) with the heat produced reminds
me of the Wright brothers' father who (supposedly) said "If man
was meant to fly God would have given him wings." Just because
people a while back didn't understand flight didn't mean flight was
impossible. The retina (for example) operates in a far different way
to manmade computers, achieving some things "easily" that
computers struggle with.
I'll quote the very next paragraph from the same page mentioned above.
What makes this comparison even more incredible
is the fact that nerve cells such as the photo cells of the retina
conduct electrical signals approximately a million times slower than
the circuit traces or "wires" in a man made supercomputer.
Dr. Stevens said that if it were possible to build a single silicon
chip that could simulate the retina using currently available technology
it would have to weigh about 100 pounds where as [sic] the retina weighs
less than a gram. The "super chip" would occupy 10,000 cubic
inches of space whereas the retina occupies 0.0003 [cubic] inches
of space. The power consumption of the man-made superchip would be
about 300 watts, whereas the retina consumes only 0.0001 watts of
power!
There's the heat problem solved as well – the retina is orders
of magnitude more efficient. It doesn't get hot because it does what
it does without generating huge amounts of heat. A quick calculation
shows that even with no heat loss at all, this much power would take a week
to raise the temperature of a teaspoon of water by just under 3 °C. In other
words, it's not a lot of heat.
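For the curious, here's that quick calculation in Python. The 5 mL (5 g) teaspoon is my assumption; the wattages come from the quote above:

retina_power_watts = 0.0001            # the retina's power consumption, from the quote
superchip_power_watts = 300            # the hypothetical silicon "super chip", from the quote
print(superchip_power_watts / retina_power_watts)   # 3,000,000 – millions of times more power for the same job

seconds_per_week = 7 * 24 * 3600       # 604,800 seconds in a week
energy_joules = retina_power_watts * seconds_per_week    # about 60 J over a whole week

teaspoon_grams = 5.0                   # assuming a 5 mL (5 g) teaspoon of water
specific_heat = 4.186                  # J per gram per °C, for water
print(energy_joules / (teaspoon_grams * specific_heat))  # about 2.9 °C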
Looking Onward
All this is very impressive, but it raises an interesting question
– where did this incredible amount of specified complexity come from?
"Nature" or "Evolution" don't provide any answers.
Consider: Could we make something that powerful now, with our current
technology? Yes – we could put together a computer that could do that
much computing, that would be that fast. However, it would dissipate
horrendous amounts of heat and be quite huge.
Perhaps if we gave ourselves more time we could do it? Actually, no.
Even if we gave ourselves plenty of time, we still couldn't make our
computer small enough or energy-efficient enough, because time is not
what we're missing to be able to make that computer. We simply don't
have the know-how. In order to make it we'd have to boost our technology
considerably to cope with the demands of the task, and couple that
with our intelligence to be able to put it together.
Nature has no intelligence, and is incapable of putting together such
computing power. It's not a matter of "God-of-the-Gaps"; it's
simply logic.