Single-atom transistor is 'perfect'

Feb 19, 2012
This is a single-atom transistor: a 3D-perspective scanning tunnelling microscope image of a hydrogenated silicon surface. Phosphorus incorporates into the red-shaded regions, selectively desorbed with an STM tip, to form electrical leads for a single phosphorus atom patterned precisely in the centre. Credit: ARC Centre for Quantum Computation and Communication, at UNSW.

In a remarkable feat of micro-engineering, UNSW physicists have created a working transistor consisting of a single atom placed precisely in a silicon crystal.

The tiny electronic device, described today in a paper published in the journal Nature Nanotechnology, uses as its active component an individual phosphorus atom patterned between atomic-scale electrodes and electrostatic control gates.

This unprecedented atomic accuracy may yield the elementary building block for a future quantum computer with unparalleled computational efficiency.

Until now, single-atom transistors have been realised only by chance: researchers have had either to search through many devices or to tune multi-atom devices to isolate one that works.

"But this device is perfect", says Professor Michelle Simmons, group leader and director of the ARC Centre for Quantum Computation and Communication at UNSW. "This is the first time anyone has shown control of a single atom in a substrate with this level of precise accuracy."

In a remarkable feat of micro-engineering, UNSW physicists have created a working transistor consisting of a single atom placed precisely in a silicon crystal. Credit: UNSWTV

The microscopic device even has tiny visible markers etched onto its surface so researchers can connect metal contacts and apply a voltage, says research fellow and lead author Dr Martin Fuechsle from UNSW.

"Our group has proved that it is really possible to position one phosphorus atom in a silicon environment - exactly as we need it - with near-atomic precision, and at the same time register gates," he says.

The device is also remarkable, says Dr Fuechsle, because its electronic characteristics exactly match theoretical predictions made with Professor Gerhard Klimeck's group at Purdue University in the US and Professor Hollenberg's group at the University of Melbourne, joint authors on the paper.

The UNSW team used a scanning tunnelling microscope (STM) to see and manipulate atoms at the surface of the crystal inside an ultra-high-vacuum chamber. Using a lithographic process, they patterned phosphorus atoms into functional devices on a crystal surface covered with a non-reactive layer of hydrogen.

Hydrogen atoms were removed selectively in precisely defined regions with the super-fine metal tip of the STM. A controlled chemical reaction then incorporated phosphorus atoms into the silicon surface.

Finally, the structure was encapsulated with a silicon layer and the device was contacted electrically, using an intricate system of alignment markers on the silicon chip to position the metallic contacts. The electronic properties of the device were in excellent agreement with theoretical predictions for a single-phosphorus-atom transistor.

It is predicted that transistors will reach the single-atom level by about 2020 to keep pace with Moore's Law, which describes an ongoing trend in computer hardware that sees the number of chip components double every 18 months.
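The 18-month doubling quoted above can be written as a one-line growth model (a minimal sketch of the trend, with a placeholder starting count; none of this comes from the paper itself):

```python
def component_count(years, start=1.0, doubling_months=18.0):
    """Moore's-law-style growth: the component count doubles once per period."""
    return start * 2 ** (years * 12.0 / doubling_months)

# Three years contain two 18-month periods, so the count quadruples.
print(component_count(3.0))  # -> 4.0
```

Under this rule, a decade multiplies the count by 2^(120/18), i.e. roughly a hundredfold.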

This major advance delivers the technology to make that possible well ahead of schedule, and gives manufacturers valuable insights into how devices will behave once they reach the atomic limit, says Professor Simmons.


More information: Nature Nanotechnology, DOI: 10.1038/nnano.2012.21


User comments: 24

tigger
4.3 / 5 (3) Feb 19, 2012
Go UNSW! Best materials engineering team in the world :-)
gwrede
3.7 / 5 (7) Feb 19, 2012
It is predicted that transistors will reach the single-atom level by about 2020 to keep pace with Moore's Law, which describes an ongoing trend in computer hardware that sees the number of chip components double every 18 months.
Well, with or without one-molecule transistors, that will only make a plus-minus 18 month difference into the moment we really will see Moore stopped. While that will truly be the proverbial Interesting Times(tm) for mankind, I hardly think it will be much fun. After half a century of continuous exponential progress, and the ensuing decent prices, we suddenly face an entirely new landscape.

Not to mention Microsoft, whose entire history is based on cavalierly wasting whatever horsepower people have scraped together the money for. Now they, all of a sudden, have to create the next Windows without it being much slower (in absolute terms), needing more memory and more CPU resources. Redmond will look like an ant nest after a big poke.
Skultch
3 / 5 (1) Feb 19, 2012
While that will truly be the proverbial Interesting Times(tm) for mankind, I hardly think it will be much fun. After half a century of continuous exponential progress, and the ensuing decent prices, we suddenly face an entirely new landscape.


why not? what about the landscape do you expect will make it not "fun", as you say?
Skepticus
3 / 5 (4) Feb 19, 2012
Sorry gwrede. The "Interesting Times" can't be trademarked. It has been in the Chinese curse "may you live in interesting times" for thousands of years, if anyone bothers to look beyond English literature.
antialias_physorg
5 / 5 (2) Feb 19, 2012
Interesting Times is a book (and a rather good one) by Terry Pratchett
Benni
3.2 / 5 (9) Feb 19, 2012
If this kind of stuff keeps up, there's going to be a whole army of us Electrical Engineers on the way back to night classes! A couple weeks ago it was the IBM team, now this! This news won't make my boss very happy, because our entire design group is probably going to descend on the Chief Engineer's office this week demanding tuition remuneration for the new classes we'll all want to take.

Don't think I'll be able to sleep tonite.......!
Lurker2358
2.9 / 5 (7) Feb 19, 2012
why not? what about the landscape do you expect will make it not "fun", as you say?


For the past 15 years or so, software companies, especially Microsoft and their OS, have tended to simply rely on hardware getting exponentially more powerful.

As a result they've bloated their software with countless inefficient applications and far from optimized code.

This is why your computer is like 50 times faster than in the 1990's, yet the operating system takes longer to load than ever before. A lot of other software is the same way.

The other thing is that once PCs and smartphones stop being twice as good every 18 to 24 months, people will only buy new ones when the old ones break down. this means revenues to hardware R&D and manufacturing firms will be greatly decreased, and may create problems in funding next generation technologies such as photonics or spintronics.

The good news: in 2020, your smartphone will have as many processors and as much RAM as an Intel server has in 2012.
_ucci_oo
5 / 5 (3) Feb 19, 2012
Cool, now we have the basics to build Deep Thought.
antialias_physorg
5 / 5 (2) Feb 19, 2012
I was rather hoping we'd go photonic, But as it stands this is pretty neat. Now get a good mass production process going and they'll be in business.
SurfAlbatross
4.5 / 5 (2) Feb 19, 2012
This is great work that will offer insight into devices of the future and the rules governing them, but will not itself usher in a new era of devices, perhaps for niche applications, but not for the masses.

For any of us to see this work in our own personal devices requires this technology to be mass produced, and if you're using an STM, that does not even approach mass production, a few orders of magnitude below even MBE.
marraco
5 / 5 (2) Feb 19, 2012
It should be noted that a single atom is the smallest element of the transistor, but it takes more than 108 nm, or 11,600 nm², to make the transistor.

A lot of progress can be made by miniaturizing other parts of the transistor.
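Taking the footprint figure above at face value (11,600 nm² per device is the commenter's number, not a figure from the paper), a quick back-of-the-envelope density check:

```python
area_per_device_nm2 = 11_600       # per-transistor footprint quoted in the comment
nm2_per_mm2 = (10**6) ** 2         # 1 mm = 1e6 nm, so 1 mm^2 = 1e12 nm^2
devices_per_mm2 = nm2_per_mm2 // area_per_device_nm2
print(devices_per_mm2)             # -> 86206896, i.e. ~9e7 devices per mm^2
```

So even at this footprint, most of the remaining density headroom lies in shrinking the leads and gates rather than the atom itself.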
javjav
4.8 / 5 (5) Feb 19, 2012
Well, with or without one-molecule transistors, that will only make a plus-minus 18 month difference into the moment we really will see Moore stopped.

Moore's law does not have to stop at any point - at least not necessarily at the one-atom-transistor level. Moore's law is an economic observation, not a scientific theory. The industry has to fulfil the 18-month performance-doubling rule, and its research is planned to hit that mark. If they go a bit slower, their revenues decrease; going much slower means bankruptcy. In any case, making smaller transistors is only one of the possible ways to double the performance/cost of a device (although it is the most famous one). Reducing manufacturing costs is another. Using 3D layers of transistors rather than 2D is another method in development. And today the cost/performance ratio usually depends on other factors - battery, screen, weight - which have huge headroom to improve; transistor size is not so critical.
dweeb
4.5 / 5 (4) Feb 19, 2012
and how long is it gonna take to build a giga-transistor chip positioning single atoms at a time ?

I'd be buying stock in automated code compression instead; the redundant inefficiencies in Microsoft's code base have to be scandalous.
finitesolutions
1 / 5 (3) Feb 20, 2012
You can now build some nanosized computers that you can take as a pill to augment your brain. The nanosized computers will travel to the brain and connect to the neural network. They will then help the brain access information not stored inside its neurons (access the internet).
Ricochet
not rated yet Feb 20, 2012
I suppose we can all start playing "Society" (Gamer movie) after taking the pill, then...
antialias_physorg
3.7 / 5 (3) Feb 20, 2012
and how long is it gonna take to build a giga-transistor chip positioning single atoms at a time ?

Just like we build current chips: NOT one transistor at a time.

Why do you suspect that construction of this type of transistor cannot be parallelized?
nathj72
not rated yet Feb 20, 2012
A key piece of information was left out of this article. The device operates at millikelvin temperatures (liquid-helium cooling). This makes civilian applications unlikely.

"I was rather hoping we'd go photonic"
This is not practical due to the wavelength of photons that are non-ionizing. However, there was some interesting new work presented using the wave nature of an electron in graphene (It was in a recent issue of IEEE spectrum).
jalmy
1 / 5 (1) Feb 20, 2012
Hardware is by far not the only place improvements in computing will come from. Just because they achieve the smallest architecture doesn't mean the implementation of it will be the best. They have used the same building materials for centuries, yet I would not dare to say the best "house" has been built that can be built from them. This just means future improvements will have to come from other sources, such as software, materials and architecture. Alot of todays computing ability has not come from hardware improvements. It has come from mathematics.
finitesolutions
1 / 5 (1) Feb 20, 2012
"Alot of todays computing ability has not come from hardware improvements. It has come from mathematics."

Try watching a fullHD movie on a 2~3 year old PC : no you can't do.
Most of new PC can cope with fullHD movies mainly because of better hardware. Once you change the hardware the same software runs like hell.
tpb
2 / 5 (1) Feb 20, 2012
Besides bloated software, the biggest factors hurting the performance of our computers are the CPU clock speed and the speed of the external DRAM.

If we can put enough memory on the CPU chip, running at CPU clock speeds, we can get rid of all the level 1 and 2 cache memory and all the logic involved to load and flush the caches, not to mention the branch prediction hardware.

The room taken on the die and the power consumed by all this circuitry is all useless overhead needed only because DRAM is so slow.

If the CPU was faster, using multicore CPUs wouldn't have any advantage.
If all the space consumed by the multiple CPU cores and cache memory subsystems was replaced using single transistor high speed memory our CPUs would be faster, lower power and less expensive.
thematrix606
1 / 5 (1) Feb 22, 2012
"Alot of todays computing ability has not come from hardware improvements. It has come from mathematics."

Try watching a fullHD movie on a 2~3 year old PC : no you can't do.
Most of new PC can cope with fullHD movies mainly because of better hardware. Once you change the hardware the same software runs like hell.


Crack's bad, mmkay.

Maybe if you had said 10 years it would be more believable.

About the other part, hardware IS the key component that is required to push the envelope further, but software is the main driver, patiently waiting.
Deathclock
1 / 5 (1) Feb 22, 2012
why not? what about the landscape do you expect will make it not "fun", as you say?


For the past 15 years or so, software companies, especially Microsoft and their OS, have tended to simply rely on hardware getting exponentially more powerful.

As a result they've bloated their software with countless inefficient applications and far from optimized code.

This is why your computer is like 50 times faster than in the 1990's, yet the operating system takes longer to load than ever before. A lot of other software is the same way.

The other thing is that once PCs and smartphones stop being twice as good every 18 to 24 months, people will only buy new ones when the old ones break down. this means revenues to hardware R&D and manufacturing firms will be greatly decreased, and may create problems in funding next generation technologies such as photonics or spintronics.


There is so much wrong with this I don't even know where to begin...
Callippo
1 / 5 (1) Feb 22, 2012
, yet the operating system takes longer to load than ever before
I load Windows once every half-year, due to updates, as my notebook handles standby quite well - so I can still accept it. I'm just not using the PC for reboots, but for real work.
Ober
not rated yet Feb 25, 2012
Well I'm waiting for when they can send information back in time, so a chip can have the computations done at the time they are requested. Instantaneous computation. Yeah I know it involves technology that doesn't exist yet, but I do remember reading about a bloke trying to achieve time travel with a circulating light beam and passing an electron through it. Not sure what happened to him. Maybe he's time travelling!!! :-)