IBM Scientists Demonstrate World's Fastest Graphene Transistor

February 5, 2010, IBM

In a just-published paper in the journal Science, IBM researchers demonstrated a radio-frequency graphene transistor with the highest cut-off frequency achieved so far for any graphene device: 100 billion cycles per second (100 gigahertz).

This accomplishment is a key milestone for the Carbon Electronics for RF Applications (CERA) program funded by DARPA, in an effort to develop next-generation communication devices.

The high-frequency record was achieved using wafer-scale, epitaxially grown graphene, with processing technology compatible with that used in advanced silicon technology.

"A key advantage of graphene lies in the very high speeds in which electrons propagate, which is essential for achieving high-speed, high-performance next generation transistors," said Dr. T.C. Chen, vice president, Science and Technology, IBM Research. "The breakthrough we are announcing demonstrates clearly that graphene can be utilized to produce high performance devices and ."

Graphene is a single atom-thick layer of carbon atoms bonded in a hexagonal, honeycomb-like arrangement. This two-dimensional form of carbon has unique electrical, optical, mechanical and thermal properties, and its technological applications are being explored intensely.

Uniform, high-quality graphene wafers were synthesized by thermal decomposition of a silicon carbide (SiC) substrate. The graphene transistor itself utilized a metal top-gate architecture and a novel gate stack involving a polymer and a high-dielectric-constant oxide. The gate length was modest, 240 nanometers, leaving plenty of room to further optimize performance by scaling down the gate length.

It is noteworthy that the frequency performance of the graphene device already exceeds the cut-off frequency of state-of-the-art silicon transistors of the same gate length (~40 gigahertz). Similar performance was obtained from devices based on graphene obtained from natural graphite, proving that high performance can be obtained from graphene of different origins. Previously, the team had demonstrated graphene transistors with a cut-off frequency of 26 gigahertz using graphene flakes extracted from natural graphite.
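The reported figures (100 GHz at a 240 nm gate) suggest how performance might improve as the gate length shrinks. As a rough, hedged back-of-envelope sketch only, assuming the idealized 1/L dependence of cut-off frequency on gate length that the article's scaling remark alludes to (real devices fall short of this ideal):

```python
import math

# Hedged sketch: for a FET, cut-off frequency scales roughly as
# f_T ~ v_eff / (2 * pi * L_gate) under idealized scaling.
f_t = 100e9   # Hz, reported for the 240 nm device
L = 240e-9    # m, reported gate length

# Effective carrier velocity implied by the reported numbers.
v_eff = 2 * math.pi * L * f_t

# Project f_T for hypothetical shorter gates (ideal scaling only).
for L_new in (240e-9, 120e-9, 60e-9):
    f_new = v_eff / (2 * math.pi * L_new)
    print(f"L = {L_new * 1e9:.0f} nm -> f_T ~ {f_new / 1e9:.0f} GHz")
```

Under this idealization, halving the gate length doubles the cut-off frequency; in practice, parasitics and short-channel effects erode the gain.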


Related Stories

Next generation devices get boost from graphene research

January 22, 2010

Researchers in the Electro-Optics Center (EOC) Materials Division at Penn State have produced 100 mm diameter graphene wafers, a key milestone in the development of graphene for next generation high-power, ...

AMO Manufactures First Graphene Transistors

February 8, 2007

In the scope of his innovative project ALEGRA the AMO nanoelectronics group of Dr. Max Lemme was able to manufacture top-gated transistor-like field-effect devices from monolayer graphene.

Unzipping Carbon Nanotubes Can Make Graphene Ribbons

April 20, 2009

( -- By "unzipping" carbon nanotubes, researchers have shown how to make flat graphene ribbons. Graphene, which is a one-atom-thick sheet of carbon that looks like chicken wire, has unique electrical properties ...


2.7 / 5 (3) Feb 05, 2010
- 100 billion cycles/second (100 GigaHertz).

...ok...who needs quantum computers?!?
3.8 / 5 (4) Feb 05, 2010
- 100 billion cycles/second (100 GigaHertz).

...ok...who needs quantum computers?!?

Remember when the 486 was way faster than anything we needed? Now my 2.4 GHz computer is SOOOOOO slow!
4.5 / 5 (2) Feb 05, 2010
This is stunning...

What a performance improvement. In 1995, 100 MHz was good; now, in 2010, a 3 GHz quad core is good. But to have 100 GHz! It really looks as though Moore's Law is starting to accelerate to infinity...

I love technology!

Feb 05, 2010
This comment has been removed by a moderator.
4.8 / 5 (5) Feb 05, 2010
Does anybody even want or need 100GHz+? Oh, wait.. I do. Hurry it up will ya?
5 / 5 (4) Feb 05, 2010
Graphene transistors, Germanium optical buses, quantum computing. The next ten years of computer advancement is going to be like moving from vacuum tubes to transistors. We live in an amazing time.
4.5 / 5 (4) Feb 05, 2010
Yeah, well, 100 GHz is nothing compared to the potential capability of light-based transistors, which could probably operate down to the angstrom wavelength, or... let's see... 3*10^18 Hz, or 3 "exahertz". That's a lot faster. Bye bye, little electrons...
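The commenter's figure is a straightforward wavelength-to-frequency conversion; a quick sanity check of the arithmetic (f = c / lambda):

```python
# Check the commenter's claim: frequency of light with a 1 angstrom wavelength.
c = 2.998e8           # m/s, speed of light
wavelength = 1e-10    # m, 1 angstrom
f = c / wavelength
print(f"{f:.2e} Hz")  # ~3e18 Hz, i.e. roughly 3 exahertz
```

The number holds up, though whether any transistor could actually switch at an optical carrier frequency is a separate question the comment glosses over.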
2.2 / 5 (5) Feb 05, 2010
It's a pity that applications (and also some OSes) always quickly find a way of swallowing all of that performance. I hope programming languages will stay at this level (mostly interpreted) and won't move to yet another level. But it sounds intriguing to have one of those at home.
not rated yet Feb 05, 2010
Keep them electrons for me... I still love them little buggers!! HAL? I'm sorry to report HAL died choking on a buckyball prize he missed in his frosted graphene flakes. Sad.
I wonder what the tensile strength of a 1-atom-thick flake is???
1.3 / 5 (4) Feb 05, 2010

Yeah, I don't know why every new version of Windows seems to require about 10 times more RAM than the previous version, even though it doesn't even "do" anything new.
4.5 / 5 (4) Feb 05, 2010
- 100 billion cycles/second (100 GigaHertz).

...ok...who needs quantum computers?!?

Don't confuse core clocks with transistor speeds. A CPU's core clock is determined by the execution time of the CPU's slowest pipeline stage. Each pipeline stage consists of thousands of transistors, many of them connected in series. Thus, the overall clock speed is governed by the collective performance of the entire transistor chain, rather than any single transistor.

Even with 100 GHz transistors, core clock speeds probably won't exceed 10 GHz or so (e.g. if the longest pipeline stage is 10 transistors deep.)

And this is before even considering other slowdowns, due to "cable effects" (capacitance and inductance) of the interconnects between transistors (i.e. the "wires" in the circuit.) Light-based circuits may help somewhat with long-distance connections, but not over shorter distances. Plus there will be delays in optical/electric trans-modulation...
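The commenter's pipeline argument can be sketched numerically. A minimal illustration, noting that the 10-gate critical path depth is the comment's hypothetical figure, not a measured one:

```python
# Sketch of the commenter's point: clock period is bounded by the slowest
# pipeline stage, i.e. the longest chain of gate delays, not by one transistor.
transistor_switch_hz = 100e9                 # per-device speed from the article
gate_delay_s = 1 / transistor_switch_hz      # idealized single-gate delay

gates_in_critical_path = 10                  # hypothetical depth of slowest stage
clock_hz = 1 / (gates_in_critical_path * gate_delay_s)
print(f"max core clock ~ {clock_hz / 1e9:.0f} GHz")
```

This ignores wire (RC) delays entirely, which, as the comment notes, would pull the achievable clock down further still.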
2 / 5 (1) Feb 05, 2010
Look, the readout up on the OLED! 100 GHz! it could only be aliens!
2.2 / 5 (5) Feb 05, 2010
Besides, it won't matter how fast these systems can become. Until many software developers learn how to tighten and manage their code better (be they Microsoft, Mozilla or otherwise), it's quite possible computing will be just as productive ten years from now as it is today.
2.5 / 5 (4) Feb 05, 2010
Besides, it won't matter how fast these systems can become. Until many software developers learn how to tighten and manage their code better (be they Microsoft, Mozilla or otherwise), it's quite possible computing will be just as productive ten years from now as it is today.

Ideally, everything would be written at the lowest level, but it takes a one in a million genius to do that even for programs that would be relatively small for our modern standards.

Creating this bulletin board from scratch in php or a similar scripting language is a very large project for any one person or even small group of people.

Making this thing in machine code directly, without compilers and interpreters, would be well...completely unfathomable...It's hard enough to comprehend in a "fifth generation" language...
3 / 5 (1) Feb 05, 2010
Since graphene transistor development is so early in the research stages, I suspect that commercial transistors will be in the terahertz range.
3.7 / 5 (3) Feb 06, 2010
...ok...who needs quantum computers?!?

Quantum computers can solve certain classes of problems much faster than today's computers. In fact, there is no polynomial expression you can write that is larger than the potential speedup from quantum computers. Ten to the one-hundredth power (a googol) is a polynomial; so is ten to the googol.

You might think that this is hyperbole, and no one would want to solve such a problem. Wrong! Even for very simple computer programs, the question of whether they will halt for any input may be in this class. Technically the general halting problem is unsolvable. But it is possible to write programs that are not expected to crash. Proving that though, in any reasonable time for a large program such as an operating system, will take a quantum computer. (And, yes you want to avoid livelocks as well as deadlocks and other types of crashes. Fine. The problem is that theorem proving for non-toy programs takes way too long.)
4 / 5 (2) Feb 06, 2010
The Singularity is [getting] near
2.5 / 5 (2) Feb 06, 2010
The possibilities are going to be far beyond our imagination. With this technology, (super)computers are going to be much faster (x100 or x1000) and consume less power than they do today. I think release will take five years for professionals and ten for the consumer in the street!
2 / 5 (1) Feb 06, 2010

So far, I have yet to see a "quantum algorithm" that gives a truly reliable computation.

You see these articles talking about "well, the result has a 90% chance of being right," and things like this. Further, most of the algorithms I've seen amount to gimmick algorithms that only work if you already know the output, and only on a specialized machine designed to run just that algorithm...which is largely useless for practical computing purposes. (see next post I address this further.)

In our modern electronic computers, the result of a computation is always 100% correct, and the only source of error is either bad human input or a mechanical malfunction (something broke down).

The other thing is, the principle of superposition is not "magic". No matter how many quantum states a particle has, you still need detectors to "read" those states without influencing the other state(s) the particle has, else you destroy your data every time you check your data.
2 / 5 (1) Feb 06, 2010
So I said it would largely be useless, however I qualify this.

Suppose we have a "Quantum Server".

We might place different types of quantum processors on the "motherboard" for different purposes, and then the CPU queries a different type of processor depending on the algorithm or portion of an algorithm it needs computed...

Even this would leave the computer only able to solve problems which can be reduced to a specific set of algorithms.

Next, in terms of data storage (RAM and ROM,) a "Qubit" is only better than a "bit" if the space required for the data storage itself and the detectors and logic to read it remain smaller than the space required for a normal transistor or magnetic dot on a disk.

While in theory, "Base 3" from Qubits could allow exponentially more data storage per unit of space, this is of course only considering an "ideal" spintronic RAM or ROM device. A real device would have much space used by detectors to read and transmit data.
5 / 5 (1) Feb 07, 2010
Quantum_Conundrum - Quantum computers are going to be useful for problems you can't do on classical computers, they won't replace them. The 'gimmick algorithms' that have been demonstrated are very simple, in the same way that adding two integers on a mechanical computer/calculator was very simple.

The biggest benefit of quantum computing will be in simulating quantum systems - things like molecules, nanoparticles and biological processes. It will take time for quantum computers to be powerful enough to do this, but then, ENIAC wouldn't be much use doing today's CAD.
not rated yet Feb 08, 2010
It should be noted that there is no known method for manufacturing graphene at a reasonable price or quantity, so it is for now limited to being a laboratory curiosity. There may be a cost-effective way to make graphene of a price and quality suitable for mass-production use, but it is also possible there is no cost-effective way to make graphene suitable for this use. Nature makes no guarantees of feasibility, and despite lots of people wanting graphene production, none have figured out how to do it.

This article adds support for graphene as worthy of further research, is all. All they succeeded in doing was showing that it is possible, and likely efficient, to make transistors on graphene. That's a good reason to invest in science to study it further, but a very long way from sitting on your desktop.
not rated yet Feb 08, 2010
It may be there is a cost-effective way to make graphene of a price and quality for mass-production use, but it's also possible there is no cost-effective way to make graphene suitable for this use.

You need to better keep up with recent developments =)

Check this out:

not rated yet Feb 09, 2010
Okay -- let's say that computers will have a new chip with a 100 GHz core processor -- doesn't this mean that the Intel front-side bus will have to scale up to like 20 GHz~25 GHz to even be useful?? Right now the front-side bus delivers 4 bits per motherboard tick -- different than the CPU clock. By the way, the motherboard would have to go a lot faster overall.

And let's say you are AMD and don't use a front-side bus -- you still need RAM to be available at close to 40 GHz, right???

not rated yet Feb 09, 2010
Intel is getting rid of the FSB in its latest designs (by integrating the memory controller onto the core, just as AMD has done). Also, instead of expanding bandwidth to memory, they could just increase cache sizes and count on benefiting from spatial locality in data usage. Memory bandwidth could be boosted by using more lanes, or maybe the whole motherboard could lay out its circuitry over a graphene layer as well. Another potential alternative might be to use optical interconnects for long-distance signaling such as CPU to RAM. And when it comes to RAM speeds, again one could try to boost them directly, or one could just provide more modules in parallel: you boost bandwidth either way.
not rated yet Feb 09, 2010
My issue is not bandwidth with RAM, it's the clock/refresh rate it uses... RAM syncs at its clock rate for the most part, so 400 MHz RAM refreshes at 400 MHz and can only relay information at that same rate -- our fastest RAM is like 1600 MHz?? -- what is the DDR3 rate -- DDR5 isn't too much faster -- so the issue is you need RAM that's at around 25-50 GHz, which means the mobo has to be around 6-12 GHz
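For context on the figures above: a hedged arithmetic sketch of how a DDR3-1600 module's quoted speed relates to its much slower internal clock (the 200 MHz core clock and 8n prefetch are the standard DDR3 organization; the quoted "1600" is transfers per second, not a true 1600 MHz clock):

```python
# Sketch: DDR3 speed grades quote transfer rate (MT/s), not the DRAM core clock.
core_clock_mhz = 200       # internal DRAM array clock for DDR3-1600
prefetch = 8               # DDR3 fetches 8 words per core clock cycle (8n prefetch)
transfer_rate_mt_s = core_clock_mhz * prefetch        # transfers per second on bus
bus_width_bytes = 8        # standard 64-bit DIMM
bandwidth_mb_s = transfer_rate_mt_s * bus_width_bytes # module name: PC3-12800
print(transfer_rate_mt_s, bandwidth_mb_s)
```

So the commenter's worry is partly answered by architecture: DRAM cores stay slow while prefetch and parallelism multiply the external rate.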
not rated yet Feb 10, 2010
The Singularity is [getting] near
I used to think the same thing. But my opinion now is that the singularity will happen once we have the ability to scan and then simulate the human brain in software in real time. It would take a super-powerful and massively parallel computer, but we'll get there.
