Wednesday, February 13, 2013


New Bern, NC

On a Friday night in 1985, I was reading a copy of Scientific American in bed. I came upon a fascinating article by Martin Gardner on something called the Mandelbrot Set. The article included some stunning graphics which come from the trivial recursive formula Z = Z² + C. The next morning, Saturday, I rushed to my office at ASEA-Atom to write my own program to create such graphics. I happened to have a Tektronix color graphics terminal on my desk, plus access to banks of the most powerful minicomputers in the world at that time. Being a veteran FORTRAN programmer, it took me only minutes to write a program which began producing wondrous graphics. I played with that program for weeks, and showed it to all my friends and neighbors. It was great fun.
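For readers curious what that one-line recursion looks like in code, here is a minimal escape-time sketch in C++ (the function name and iteration limit are my own illustrative choices, not from the original FORTRAN program):

```cpp
#include <complex>

// Escape-time test for the Mandelbrot set: iterate Z = Z^2 + C from Z = 0
// and count how many steps it takes |Z| to exceed 2, after which the orbit
// is guaranteed to diverge. Points that survive max_iter steps are treated
// as members of the set; the returned count is what gets mapped to a color.
int mandelbrot_iterations(std::complex<double> c, int max_iter) {
    std::complex<double> z = 0.0;
    for (int i = 0; i < max_iter; ++i) {
        z = z * z + c;              // the recursion Z = Z^2 + C
        if (std::norm(z) > 4.0)     // |z|^2 > 4, i.e. |z| > 2: diverging
            return i;
    }
    return max_iter;                // never escaped: assume it's in the set
}
```

Looping this function over a grid of C values and coloring each pixel by the returned count produces the familiar pictures; that hour-long 1985 render was, in essence, this loop run over every pixel on the screen.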

The Tektronix terminal had only 800x600 pixels, each with 24 bits of color. Even so, and even with those powerful minis working for me, it took almost an hour to compute a full-screen picture. I went through all sorts of tricks to be able to zoom, pan, preview the region of interest, and to interact with it in real time. There was a business benefit to that, because I taught myself lots of software optimization techniques that I later used in company products. Such is the power of games and play in computing; they motivate us to advance the state of the art.

What prompted this post today is that I discovered and downloaded an app yesterday to do Mandelbrot graphics on the iPad. My God, what an unbelievable difference 30 years of technology makes. The number of pixels and the beauty of the screen are much better on the iPad. Most amazing, the one-hour wait for a new calculation is now reduced to a second or less. It happens so fast that the eye can barely catch something that looks like a flicker. The iPad must be on the order of 10,000 times faster than those superminis from 1986. Even more amazing, I have reason to believe that the Android phone in my pocket is more than 5 times faster than the iPad. The pictures posted here came from that app (in Libby's hand, I might add).

When I first started working professionally with computers in 1966, computers cost millions of dollars and many were as big as houses. They were roughly 10,000 times slower than 1986 computers, or about 1,000,000,000 (one billion) times slower than my phone. Memory cost $1 per bit, and my wage as a graduate electrical engineer was about $4/hour. Today memory costs are lower by a factor of 100 billion, and wages are higher by a factor of 15, so the number of bits an hour of labor will buy is up by a factor of more than a trillion.

It has been a wild and wonderful ride for techies of my generation. Just to remain useful and to avoid obsolescence of skills, we had to surf on the leading edge of that fantastic tidal wave. I'm very grateful for the privilege of having been a part of it. Perhaps now you understand why my fascination with all things computer continues even during our cruising life years.


  1. Dick,
    I had a similar experience observing the evolution of computer technology. When I graduated from college in 1986, I moved back home with my parents while I looked for a job. My dad had an IBM PC Jr: 4.77 MHz, 640 kB of RAM, and no hard drive, just two floppies. It had good graphics, though, and had GWBASIC in ROM. Being a newly minted electrical engineer, I decided to write a program to calculate and plot the frequency response of electrical filters. I would input a few parameters and then start the thing going. While it calculated the response curve I could go downstairs, make a pot of coffee, wait for it to finish, and then head back up with my hot coffee. The plot would be just about finished.

    A dozen years or so later I wrote a program in C++ to calculate the frequency response of loudspeaker enclosures. The calculations were almost identical, but the new program was so much faster that I could resize the window that the program was plotting into, and it appeared that the curve was just stretching and shrinking when in fact it was recalculating the curve in a fraction of a second.


