Sunday, April 23, 2023

Thoughts on Gordon Moore

Gordon Moore passed away on March 24, 2023. He was 94 years old.

He is best known for the article 

Cramming more components onto integrated circuits. 

It appeared in the magazine Electronics (now defunct), Volume 38, No. 8, April 19, 1965. Do you need to track it down in the basement of your library? No. It's here and here. I wonder if Moore would have predicted that his article would still be easily available over 50 years later. Or is it? Link rot is a serious problem, so you might want to download it to your local files. Note that the first link is some sort of official version and the second is my local copy. It's not clear which link will rot first. The article also includes an addendum: an interview with Moore that was done later.

In the article Moore said that the number of components per circuit (I think that means per chip) would double every year. Moore credits Dave House with modifying it to `doubling every 18 months' and Carver Mead with calling it `Moore's Law'. Later it came to be quoted as computer SPEED would double every 18 months. We will take this to be Moore's Law in this blog post.

Is Moore's law dead? Browsing Google the answer seems to be that it is slowing down but not dead yet. (IDEA for a comedy sketch: Redo the Monty Python Dead Parrot sketch about the death of Moore's law.) 

If Moore had 1 penny in April 1965 and it doubled every 18 months then how rich would he be now? How rich was he in April 2022? Compare the two numbers. 
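A quick back-of-the-envelope sketch of that question (my own calculation, using the post's 18-month doubling period; the `penny_wealth` helper is just a name I made up):

```python
# One penny in April 1965, doubling every 18 months.
def penny_wealth(months, doubling_months=18, fractional=False):
    """Value in dollars of $0.01 after `months`, doubling every
    `doubling_months` months. With fractional=True, partial doubling
    periods count pro rata."""
    doublings = months / doubling_months if fractional else months // doubling_months
    return 0.01 * 2 ** doublings

print(penny_wealth(57 * 12))                   # April 2022: ~2.75 billion dollars
print(penny_wealth(58 * 12, fractional=True))  # April 2023: ~4.36 billion dollars
```

From April 1965 to April 2022 is 684 months, i.e., 38 full doublings, so the penny becomes about $2.75 billion; allowing fractional doublings through April 2023 gives about $4.36 billion.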


  1. Grumble. Computer (well, peecee) clock speeds haven't moved much since 2005, e.g. the 2005 Intel Pentium 4 at 3.8 GHz. Sure, the number of transistors has increased over the last 18 years. But are they up 2^9 times since then? Nowhere close. Maybe 2^6.

    St. Gordon's Law is a thing of the distant past.

    Despite the grumbling, I'm still pretty pleased with recent advances: instructions per clock cycle keep increasing (slowly, sure, but this is seriously kewl), and cache sizes and memory architecture keep improving. I submit that improvements due to Intel engineers busting their collective rear ends (due in no small part to AMD nipping at their heels; thank you, AMD) are more impressive than the improvements in the 1985-2005 period, when your speed doubled just because the silicon got faster.

    1. I have a feeling that I'm missing a joke...
      Anyway, to fall for your straight line, it's a company that makes CPUs that compete with Intel, especially in gaming peecees....

    2. No joke. Anyway, based on your response I googled AMD and think it is Advanced Micro Devices. I am often amazed at what some people (including myself) don't happen to know. This was one of those times.

    3. My bad here: having been a freelancer for 30 years, I didn't have a department providing hardware, so I had to think about it. Also, I tend to think of the details of CPU and memory design as being in your bailiwick. (I mean, that's complexity, and there ain't nothin in god's world more complex than a modern out-of-order speculative-execution multi-pipelined CPU with multiple levels of cache, so you complexity folks ought to be working on that, right? Yes, I'm joking...)

    4. I hope Lance does not see your comment, or he'll say that this is one more thing CS theory has missed (he may have said it, already).

  2. There was an article in Science a few months ago by some semiconductor industry folks on the roadmap for the next few generations. They were clear that they thought the industry was good for another 3 or 4 generations of finer and finer feature sizes. My jaundiced reading of it was that they were wildly underestimating the difficulties; that the last couple of generations have been much harder than expected, and that future generations depend on technologies not yet developed and/or not yet even demonstrated in the lab.

    Still, the 40x0 generation of GPU cards is now out with significantly higher performance, albeit at sky-high power consumption levels in the high-end cards. However, the 4070 (35.8 billion transistors) seems to provide about the same performance as the 3080 (released Sept 2020, 28.3 billion transistors) at 200 watts instead of 350. (The review sites are all gaming sites, so the comparison for AI and other SIMD computation-intense applications may vary.)

    Hmm: better comparison would be 3090 (28.3 x 10^9 trs) vs. 4090 (76.3 x 10^9 trs). Ha! No wonder the 4090 is such a power hog.

  3. April 1965 is 57 years before April 2022, so 684 months, allowing the penny to double exactly 38 times. That leads to a wealth just shy of 2.75 billion dollars. Extrapolating this for another year would lead to a wealth of just over 4.36 billion dollars today.
    Now, I don't know Moore's wealth in April 2022, but I would guess it's less than 2.75 billion dollars.

  4. The variant of Moore's Law that died abruptly was the rate of change of CPU clock speeds. The key behind that was the end of Dennard scaling and not the # of transistors on a chip. We get multi-core instead.

    1. Yes, But. IMHO, "interesting" algorithms are ones that depend on intermediate results (think Alpha-Beta in chess vs. weather calculation) and more cores don't help those very much at all.

      So what does, say, a 6x increase in transistor count get you?

      Using GeekBench 6, two of the machines here (an i7-8550U 4-core laptop (1.5 billion transistors (maybe)) and an i7-12700K 12-core desktop (9 billion transistors (maybe))) see a 4.3x multiprocessor improvement, and a mere 1.9x single-core improvement, despite being a monster desktop vs. an aging laptop.

      (The 1.9x single-thread improvement overstates the improvement somewhat, since the laptop is held back by power dissipation limits; the water-cooled desktop has fans that get to go moby whoosh when needed.)

      Also, grumble, you only get 2.5 times as many threads for that 6x increase in transistor count.

      So ever diminishing returns from ever slower increases in transistor counts is the state of the game. In my jaundiced opinion...

  5. Wikipedia lists his net worth in 2023 as $7 billion, so I think that's not far off. This is after he and his wife founded the Gordon and Betty Moore Foundation, with a $5 billion gift, in 2000.