Federico Faggin on the First 40 Years of the Microprocessor — And the Next 40 (Video)
When Federico Faggin designed the Intel 4004 processor four decades ago, he had an inkling that the chip might be able to do more than just power the Busicom calculator for which it had been commissioned.
But what Faggin said he could not imagine was just how far those chips — and the products built around them — would extend into everyday life.
“I could not imagine the plethora of applications that emerged; and even more importantly, what I did not imagine was the social impact,” Faggin said, speaking Tuesday at an Intel event in San Francisco to commemorate the 40th anniversary of the 4004 — the world’s first general-purpose microprocessor.
While Faggin had argued successfully to his superiors that the 4004 could power far more than that basic calculator, he didn’t dream that it would pave the way for a day when people carried PCs and cellphones wherever they went.
“It goes to show that engineers can think the gadget, but they don’t really have the ability to imagine what it is to live with the gadget that they have invented,” Faggin said. “That certainly was the case for me, as well.”
Faggin’s marks are all over the chip industry. Before coming to Intel, he helped pioneer metal oxide semiconductor technology at Fairchild — the granddaddy of Silicon Valley companies. At Intel, he not only led the 4004 project (the design of which includes his initials), but also the 8008 — the world’s first eight-bit microprocessor. He went on to found and lead Zilog, another chip company; and Synaptics, a company best known for its trackpads.
The original 4004 had 2,300 transistors, with features about eight microns across (roughly 8,000 nanometers). By contrast, the current second-generation Intel Core processors have features just 32nm across and pack in nearly a billion transistors.
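The scale of that 40-year progression is easy to put in concrete terms. A back-of-the-envelope sketch, using only the figures quoted above (the "nearly a billion" transistor count is the article's approximation, not an exact spec):

```python
# Rough scaling between the 1971 Intel 4004 and a 2011
# second-generation Intel Core processor, per the figures above.

transistors_4004 = 2_300
transistors_core = 1_000_000_000   # "nearly a billion" (approximate)

feature_4004_nm = 8_000            # ~8 microns, as stated above
feature_core_nm = 32

transistor_growth = transistors_core / transistors_4004
feature_shrink = feature_4004_nm / feature_core_nm

print(f"Transistor count grew roughly {transistor_growth:,.0f}x")
print(f"Feature size shrank roughly {feature_shrink:.0f}x")
```

By these figures, transistor counts grew by a factor of a few hundred thousand while feature sizes shrank by a factor of a few hundred, over 40 years.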
As for the road ahead, Faggin sees enormous potential, but also some significant limits. Processor technology may enable a shift into quantum computation, for example, but it won’t allow for machines that approach human intelligence, he said.
“The human brain is so much more powerful than a computer,” Faggin said. “It has characteristics we have no idea how to implement.”
The computer, meanwhile, is mainly good at performing a more limited set of tasks far faster than humans can.
“The computer is a tool and as such is a wonderful tool,” Faggin said. “It’s not ever going to be a competition [to the] human being.”
“Ever” may be too long a time frame for such a claim, but Faggin said he is convinced computers won’t match human intelligence within the next 40 years.
Consciousness, awareness and creativity are processes in the brain that aren’t even understood, let alone replicated, Faggin said.
“Our experience is a lived thing,” Faggin said. “A computer is a zombie.”