For computer chip builders, only one way to go: Up

May 05, 2011 By JORDAN ROBERTSON, AP Technology Writer
A reporter watches a video showing an image of an Intel 3-D Tri-Gate transistor at an Intel announcement in San Francisco, Wednesday, May 4, 2011. Intel Corp. said Wednesday that it has redesigned the electronic switches on its chips so that computers can keep getting cheaper and more powerful. (AP Photo/Jeff Chiu)

In the race to build a faster computer chip, there is literally nowhere to go but up. Today's chip surfaces are packed with the tiniest electronic switches the laws of physics allow, but Intel Corp. says it is blowing past those limits with a breakthrough, three-dimensional transistor design it revealed Wednesday.

Analysts call it one of the most significant developments in silicon transistor design since the integrated circuit was invented in the 1950s. It opens the way for faster smartphones, lighter laptops and a new generation of supercomputers - and possibly for powerful new products engineers have yet to dream up.

Minuscule fins jutting from the surface of the typically flat transistors improve performance without adding size, just as skyscrapers make the most of a small square of land.

"When I looked at it, I did a big 'Wow,'" said Dan Hutcheson, a longtime industry watcher and CEO of VLSI Research Inc. "What we've seen for decades now have been evolutionary changes to the technology. This is definitely a revolutionary change."

Intel CEO Paul Otellini said that "amazing, world-shaping devices" will be created using the new technology.

Computers are already doing things that were almost unimaginable when Intel co-founder Gordon Moore made his famous prediction in 1965 that computers should double in power every two years. The axiom, known as Moore's Law, has held true ever since as computers have gotten cheaper, smaller and more powerful.

Engineers believe Intel's new transistors will keep the axiom going for years to come. Chips with the 3-D transistors will be in full production this year and appear in computers in 2012.
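That doubling compounds quickly. As a back-of-the-envelope sketch (the 1971 baseline of roughly 2,300 transistors is Intel's first microprocessor, the 4004; the clean two-year doubling is an idealization, not how real products track):

```python
# Toy illustration of Moore's Law: transistor counts doubling every two years.
# Baseline: Intel's first microprocessor, the 4004 (1971), with ~2,300 transistors.
# Real chips don't follow the curve this cleanly; this is an idealized sketch.

def projected_transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming an exact two-year doubling."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011):
    print(year, f"{round(projected_transistors(year)):,}")
```

Forty years of doubling turns a few thousand transistors into a few billion, which squares with the billion-transistor chips of 2011.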

When Moore's Law was first coined, the most advanced computers were large, mainframe-type machines that took up entire rooms and were best suited for narrow tasks done one at a time.

Today we have smartphones that let us carry around the Internet in our pocket, supercomputers that have beaten Jeopardy! and chess champions, and even experimental cars that drive themselves. Technologists entertain visions of even deeper integration of artificial intelligence into our lives as computer technology advances, such as robots performing surgery.

Transistors, tiny on/off switches that regulate electric current, are the workhorses of modern electronics. They're to computers what synapses are to the human nervous system. They have become faster over the years thanks to new materials and manufacturing techniques, but Intel's latest advance is a redesign of the transistor itself.
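As a rough software analogy for that on/off role (this is not Intel's circuit design — real CMOS logic pairs complementary n- and p-type devices — just a sketch of how switch-like behavior composes into logic):

```python
# Rough analogy: a transistor modeled as a switch that conducts when its
# gate is driven high. A few such switches form a NAND gate, and NAND is
# functionally complete: any digital logic can be built from it alone.

def switch(gate: bool) -> bool:
    """Idealized n-type transistor: conducts only when the gate is high."""
    return gate

def nand(a: bool, b: bool) -> bool:
    """Output is pulled low only when both series switches conduct."""
    return not (switch(a) and switch(b))

def inverter(a: bool) -> bool:
    """An inverter (NOT gate) built from a NAND with its inputs tied together."""
    return nand(a, a)

print(nand(True, True), inverter(False))
```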

A chip can have a billion transistors, all laid out side by side in a single layer, as if they were the streets of a city. Chips have had no "depth" - until now. On Intel's chips, the fins will jut up from that streetscape, sort of like bridges or overpasses.

The fins give the transistor three "gates" to control the flow of electric current, instead of just one. That helps prevent current from escaping. There's a limit to how much current a chip can take, and the new design allows more of that power to be spent on computing rather than being wasted.

Intel has been talking about 3-D, or "tri-gate," transistors for nearly a decade, and other companies are experimenting with similar technology. Wednesday's announcement is noteworthy because Intel has figured out how to manufacture the transistors cheaply in mass quantities.

Other semiconductor companies argue that there's still life to be squeezed from the current transistor design, but Intel's approach puts it at least a generation ahead of rivals such as IBM Corp. and Advanced Micro Devices Inc.

Intel's approach carries some risks because the technology is untested on the mass market. But Doug Freedman, an analyst with Gleacher & Co., said Intel's approach might actually reduce chip defects if the multiple gates make the transistors more reliable.

"Intel takes big gambles when it knows what it's doing," Freedman said.

The reduced power consumption also addresses a key need for Intel, which is the dominant maker of chips for personal computers but has been weak in the growing markets for chips used in smartphones and tablet computers. Intel's current chips use too much power for it to be competitive in those markets, and the 3-D chips could help it become more of a player.

Transistors are microscopic, but their performance is felt with every click of a mouse, tap on a smartphone or download from a website. The faster they twitch, the faster a computer "thinks" - and sucks up power.

They need to get smaller without leaking too much power, a worrisome issue as the materials reach the atomic scale and get worse at blocking current from escaping.

Intel's advance does not add a complete third dimension to chip-making - that is, the company can't add an entire second layer of transistors to a chip, or start stacking layers into a cube. That remains a distant but hotly pursued goal of the industry, as cubic chips could be much faster than flat ones while consuming less power.

And the technological advance Intel has achieved won't guarantee success, as it has learned in repeated attempts at cracking the mobile market. The performance expectations and power requirements for phones and tablet computers are not as high as those for PCs.

Other chip makers such as Qualcomm Inc. and Texas Instruments Inc. have entrenched partnerships with cellphone makers that Hutcheson, the industry watcher with VLSI Research, said will be tough to overcome.

"When it comes to the mobile market, they have their work cut out for them," he said. But "this gives you the [...] to build the next great system."




User comments: 8


nighmare
not rated yet May 06, 2011
This is amazing.
Buyck
not rated yet May 08, 2011
Oh... 22nm is just the beginning. Intel will start with 14nm in 2013 and 10nm in 2015! These no-nonsense preparations are on their way; see the PDF file: http://download.i...tion.pdf
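Treating those node names as linear feature sizes (a simplification — node names are partly marketing labels, and real scaling varies), the density gain of each shrink can be estimated:

```python
# Rough density estimate for the process shrinks named above.
# If a node name scaled as a true linear dimension, the area per
# transistor would shrink with the square of the ratio.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Approximate factor by which transistors-per-area grows on a shrink."""
    return (old_nm / new_nm) ** 2

print(f"22nm -> 14nm: {density_gain(22, 14):.2f}x denser")
print(f"14nm -> 10nm: {density_gain(14, 10):.2f}x denser")
```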
Quantum_Conundrum
1 / 5 (1) May 08, 2011
Today's chip surfaces are packed with the tiniest electronic switches the laws of physics allow.


This is not entirely true. The key is how small they can be made while remaining mass-producible.

There was the Australian team that demonstrated a 4nm-process silicon transistor well over a year ago, and Intel and IBM have both already claimed a clear path to mass-producible 2-D transistors at roughly a 10 to 11nm process using top-down technologies.

The key difference here is that Intel is actually taking an active step in moving into the third dimension, even if it is only a "fractional" third dimension, i.e. a fractional dimension of maybe 2.01 or so.

Perhaps a dimension of 3.0 will never be possible, but we may one day see chips based on a fractional dimension arbitrarily approaching 3.0 by just doing a series of "folds".

What I would like to see next is the introduction of vertical walls and platforms, which would use optical circuitry between surfaces.
Quantum_Conundrum
1 / 5 (1) May 08, 2011
Again, the simplest way to get multi-layered chips is to "fold" the surface of the chips, and then communicate between individual components via the circuits on the folded end, OR by direct optical circuits across the voids between the surfaces.

Even with folding, it is likely you will need some scaffolding to hold the components in place.

What I have in mind is this: if a chip is a towel, then what we have currently is an unfolded towel.

If you take the towel and fold it in half, and imagine circuitry on all surfaces with optical circuitry between the layers, that would be the start of 3-d.

Then you would take the towel and fold it in half the other way, and repeat until you have something more cubical, etc.

Now I'm talking chip level, since our expansion cards already do this a bit at system level, but with huge open "wasted" spaces everywhere.
Na_Reth
not rated yet May 08, 2011
IBM has also developed a technique for 3-D transistors: tiny channels in the chip that cool the transistors, and the method reportedly performs so well it can cool chips with 60-degree-Celsius water.
This could also mean that a server farm could be used as a city heater.
Na_Reth
not rated yet May 08, 2011
And if the heat is not required, you could use it to power a turbine, regaining part of the energy that was used.
Quantum_Conundrum
not rated yet May 08, 2011
And if the heat is not required, you could use it to power a turbine, regaining part of the energy that was used.


Close.

The idea is thermoelectric generators incorporated into the scaffolding and heat sink to recapture up to 15% to 20% of the waste heat as electricity, which is pretty significant because it would ultimately cut the chips' net power consumption by an additional 15% to 20% while simultaneously covering 15% to 20% of the cooling needs of the processor and RAM. This also cuts power usage even further, since you don't need as many or as large a fan or other cooling unit...
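Those 15% to 20% figures can be made concrete with a back-of-the-envelope sketch (the 100 W chip here is an arbitrary illustrative number, not a measurement):

```python
# Back-of-the-envelope sketch of the comment's claim: a thermoelectric
# generator in the heat sink recaptures a fraction of the chip's waste
# heat as electricity. The 100 W draw is a made-up illustrative figure.

chip_power_w = 100.0   # hypothetical chip power draw; nearly all becomes heat
recapture = 0.15       # low end of the claimed 15%-20% recovery

recovered_w = chip_power_w * recapture
net_draw_w = chip_power_w - recovered_w

print(f"recovered: {recovered_w:.1f} W, net draw: {net_draw_w:.1f} W")
```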
Quantum_Conundrum
1 / 5 (1) May 08, 2011
A more important question regarding 3d architecture is "what are we going to do with these insanely powerful computers once we have them?"

The local processing power will exponentially outpace transmission speed and bandwidth, and we will even run out of available channels, even in wired networks.

In supercomputing, we are already passing the point where the resolution of the models is capable of exceeding the resolution of our input data (i.e. weather forecasting).

What does that leave? Video games and holograms?

Who the hell is going to program a game for a million core computer?

More importantly, is it even possible to program a game in such a way as to have non-linear logic?

Will the real-time strategy game of the future have every individual unit, structure, and neutral object in its own separate thread, controlled by its own separate processor, in a non-linear interactive environment?

Who is smart enough to program that?