Hot new material can keep electronics cool: Few atomic layers of graphene reveal unique thermal properties

May 10, 2010

Professor Alexander Balandin and a team of UC Riverside researchers, including Chun Ning Lau, an associate professor of physics, have taken another step toward new technology that could keep laptops and other electronic devices from overheating.

Balandin, a professor of electrical engineering in the Bourns College of Engineering, experimentally showed in 2008 that graphene, a recently discovered single-atom-thick carbon crystal, is a strong heat conductor. The problem for practical applications was that it is difficult to produce large, high-quality single atomic layers of the material.

Now, in a paper published in Nature Materials, Balandin and co-workers found that multiple layers of graphene, which are easier to make, retain the strong heat conducting properties.

That's also a significant discovery in fundamental physics. In addition to the measurements, Balandin's group explained theoretically how a material's ability to conduct heat evolves as one goes from conventional three-dimensional bulk materials to two-dimensional, atomically thin films such as graphene.

The results published in Nature Materials may have important practical applications in the removal of dissipated heat from electronic devices.

Heat is an unavoidable by-product of operating electronic devices, which contain many sources of heat, including millions of transistors and interconnecting wiring. In the past, bigger and bigger fans were used to keep computer chips cool, which improved performance and extended their life span. However, as computers have become faster and gadgets smaller and more portable, the big-fan solution no longer works.

New approaches to managing heat in electronics include incorporating materials with superior thermal properties, such as graphene, into silicon computer chips. In addition, proposed three-dimensional electronics, which use vertical integration of computer chips, would depend on heat removal even more, Balandin said.

Silicon, the most common electronic material, has good electronic properties but not so good thermal properties, particularly when structured at the nanometer scale, Balandin said. As Balandin's research shows, graphene has excellent thermal properties in addition to unique electronic characteristics.

"Graphene is one of the hottest materials right now," said Balandin, who is also chair of the Material Sciences and Engineering program. "Everyone is talking about it."

Graphene is not a replacement for silicon, but, instead could be used in conjunction with silicon, Balandin said. At this point, there is no reliable way to synthesize large quantities of graphene. However, progress is being made and it could be possible in a year or two, Balandin said.

Initially, graphene would likely be used in niche applications such as thermal interface materials for chip packaging or transparent electrodes in photovoltaic solar cells, Balandin said. But in five years, he said, it could be used with silicon in computer chips, for example as interconnect wiring or heat spreaders. It may also find applications in ultra-fast transistors for radio-frequency communications. Low-noise graphene transistors have already been demonstrated in Balandin's lab.

Balandin published the paper with two of his graduate students, Suchismita Ghosh, who is now at Intel Corporation, and Samia Subrina; Lau; one of her graduate students, Wenzhong Bao; and Denis L. Nika and Evghenii P. Pokatilov, visiting researchers in Balandin's lab who are based at the State University of Moldova.


User comments: 9


not rated yet May 10, 2010
Go UCR!! Sorry, I'm sitting in the campus library as I read this, not 300 meters from the engineering building and his office. Reducing the need for cooling fans would be spectacular, they draw a lot of power from the battery.
2 / 5 (1) May 10, 2010
Isn't the computer industry going to hit some sort of major tech bubble within a few years?

Keep cranking out multi-core processors that are now selling for as little as $50 U.S. and pretty soon everyone will have a computer that meets or exceeds their needs for decades to come, as video games are the only real thing that uses the new computers' power outside the scientific simulations. But it already takes as much as 10 years to make a video game that uses an existing computer's full power by design (that is, without memory leaks or bad code.)

In ten years, computers will be ~32 times more powerful and so it would take as much as 320 years for a software company to make a "top of the line" video game that actually uses the computer's capabilities.

So we are basically already at a tech bubble in which new desktop computers will only be made for the purpose of replacing existing ones after they break down, as there will be no practical need for "more power"...
2 / 5 (1) May 10, 2010
Barring the invention of "fully interactive holographic environments", I cannot imagine that a normal person would have any way of actually using a ~32 core, 12ghz desktop computer 10 years from now, but Moore's law suggests that will only cost about $1000u.s. for the CPU, and another 5 years later it will cost just 50-100u.s., but an entertainment application that actually uses it's power would take ~320 years to produce...
not rated yet May 10, 2010
Good, another hologram discussion. Game development will speed up as programmers use newer and more advanced tools to create games. Faster computers allow for programs that allow programmers to start somewhere besides the ground up. It may be possible to use personality,morality,logic, and strategic decision algorithms real time to make gaming more interactive and dynamic, not pre-programmed. Again, programming these will be easier as people adopt standards and build on these. Currently, robotics labs write much of their own code and do not build on existing achievements of others.
not rated yet May 10, 2010
hmmmm, maybe human minds can be uploaded to these future video games so that the AI responses would be more realistic. I'm thinking death row inmates and captured terrorists.
5 / 5 (1) May 10, 2010
Bah, even for computer games we're barely scratching the surface. Try photorealistic rendering -- i.e. ray-tracing, with luminosity, caustics, and internal scattering included (requiring back-trace through many bounces per ray, perhaps on the order of 10) -- at high resolutions (HD+) in stereo and in real time (let's say 60 fps: 30 fps per eye), with scenes of near-arbitrary complexity (e.g. a forest, with every leaf and blade of grass procedurally modeled as a distinct volumetric object.) Take the most obscenely powerful supercomputer in existence today, and you just might have enough horsepower for such an application: but it's on an order of 10,000 times more powerful than a typical PC, takes up a building, and draws megawatts of electricity. I'd say there's room for progress =)

Then, like trekgeek1 says, faster processors free up developers from excessively worrying about optimization. That means (much) more automation, faster development cycles, and fewer bugs.
5 / 5 (1) May 10, 2010

And that's before we consider the need for accurate high-fidelity physics and hydrodynamics. This in itself (to be done in real-time) would require yet another separate supercomputer today.

And on top of that, let's also add sophisticated AI.

And then, let's allow 100,000 people to share a single game universe on a single server, up to and including letting all 100,000 congregate into a single crowd. But perhaps 100,000 isn't ambitious enough. How about 10,000,000?

So for games alone, there's never going to be enough computing power. Not for another century, at least =)
not rated yet May 11, 2010
Barring the invention of "fully interactive holographic environments", I cannot imagine that a normal person would have any way of actually using a ~32 core, 12ghz desktop computer 10 years from now, but Moore's law suggests that will only cost about $1000u.s. for the CPU, and another 5 years later it will cost just 50-100u.s., but an entertainment application that actually uses it's power would take ~320 years to produce...

Fear not. Microsoft will always find a way to use 32 cores to the max while you copy and paste files.
not rated yet May 15, 2010

Well, every action needs some level of fail-safe coding.