Next-generation hard drives may store 10 terabits per square inch: research

May 10, 2010, by Lin Edwards
Bit addressing during TAR writing on bit-patterned media. a, Schematic of the head path and write waveforms during the experiment. Both up and down orientations were written. The head path is purposefully misaligned to the track direction by a fraction of a degree. Initial phase is random with each track. Write frequency was incremented by 1% between tracks. b, Large-area HR-MFM image of resulting tracks. Scale bar, 1 µm. Single-tone tracks at the highest data frequency are written properly with no adjacent-track writing when the head is centered on the track and in phase with the island positions. Nebulous light regions are due to reversal of the soft magnetic material in the trenches between islands. c, Close-up HR-MFM image of a single track. 60 islands are written correctly before the write phase and track centering drift too far. Scale bar, 500 nm. Image credit: Nature Photonics, doi:10.1038/nphoton.2010.90.

(PhysOrg.com) -- The majority of today's hard disks use perpendicular recording, which means their storage densities are limited to a few hundred gigabytes per square inch. Scientists have for some time been trying to find ways of increasing the limit, and a new method has been proposed that could stretch the limit as high as ten terabits (Tb) per square inch.

The research, published in this week's Nature Photonics, describes a method that combines two writing procedures to store data on hard drives. The combined procedure writes tightly packed data without affecting the surrounding bits, avoiding the usual challenge with tightly packed data: heat generated in the write head can create superparamagnetism that interferes with surrounding bits and jumbles their data (by flipping a 0 state to a 1, or vice versa).

One of the procedures used is bit-patterned recording (BPR), which writes to “magnetic islands” lithographed into the surface; the islands isolate the write events and prevent superparamagnetic effects from occurring. The other is thermally-assisted recording (TAR), in which a tiny region of the surface is heated while data are being written and then allowed to cool. The heat allows the surface to magnetize quickly, and this, together with the small-grained design of the surface and the distance between bits, helps to prevent superparamagnetism.

The two methods each present difficulties: BPR is limited by the need for a write head that exactly matches the size of the magnetic islands, while TAR is limited by its need for small-grain media that can tolerate heating and cooling, and by the difficulty of controlling the area heated. Combining the two methods solves these problems: BPR’s magnetic islands remove the need for small-grain media, and TAR writes only to the heated bit, so the size of the write head is less important. Using the two methods in combination means surrounding bits are unaffected, and data can be tightly packed on less expensive media.

The new system, developed by Barry C. Stipe and colleagues from Hitachi research groups in California and Japan, uses a plasmonic nano-antenna to write the data: laser light is guided along a waveguide to the antenna, where it is converted into a concentrated electric charge. The “E-shaped” antenna has a 20-25 nanometer (nm) wide middle prong that, rather like a lightning rod, concentrates the charge onto an area as small as 15 nm in diameter, with the outer prongs acting as grounds.

The write speed obtained by the researchers was 250 megabits per second, and the error rate was low. Data tracks were separated by 24 nm, and the researchers reached a data density of one terabit per square inch with high-quality data quite easily. They believe 10 terabits per square inch is theoretically possible.
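
As a sanity check on the units (an editorial back-of-the-envelope sketch, not a calculation from the paper), the one terabit per square inch figure matches the 1.5 Pb m^(-2) in the paper's title, and combining it with the 24 nm track spacing suggests a bit cell roughly 27 nm long along the track. A minimal Python version of the arithmetic:

    # Editorial sanity check of the densities quoted above; the only inputs are
    # the 1 Tb/in^2 and 24 nm figures from the article and the inch-to-metre conversion.
    IN_PER_M = 1 / 0.0254                                # inches per metre
    density_b_per_in2 = 1e12                             # 1 terabit per square inch
    density_b_per_m2 = density_b_per_in2 * IN_PER_M**2
    print(f"{density_b_per_m2 / 1e15:.2f} Pb per m^2")   # ~1.55, matching the paper title

    area_per_bit_nm2 = 1e18 / density_b_per_m2           # 1 m^2 = 1e18 nm^2
    track_pitch_nm = 24.0                                # reported track separation
    print(f"~{area_per_bit_nm2:.0f} nm^2 per bit, "
          f"~{area_per_bit_nm2 / track_pitch_nm:.0f} nm along the track")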


More information: Magnetic recording at 1.5 Pb m^(−2) using an integrated plasmonic antenna, Barry C. Stipe et al., Nature Photonics, Published online: 2 May 2010. doi:10.1038/nphoton.2010.90


User comments: 22


Quantum_Conundrum
1.6 / 5 (7) May 10, 2010
Good God.

What in the world are we going to do with that much information? They just quintupled the data density, and theorize another order of magnitude increase, and I don't even remotely use the 300GB Hard Drive I have now...

I can see uses for this regarding security camera data storage, as well as stellar cartography, since there are more stars "out there" than there are grains of sand on the earth, but eventually there just isn't going to be any need for any more data storage except in space exploration and colonizing other planets.
kshultz222_yahoo_com
5 / 5 (2) May 10, 2010
Back in the '80s people were asking how many thousands of WordStar documents you could store on that XX megabyte hard drive. Strange as it may seem, someone can easily find a use. Maybe everyone will save all their favorite movies, or all movies that have ever been commercially made, etc.
holoman
not rated yet May 10, 2010
Writing cells at 20nm has never been the problem with magnetics, reading is.

Ferroelectric/multiferroics can already write and read at 3-5nm, so what's the big deal?
degojoey
5 / 5 (5) May 10, 2010
I have over 1Tb of just high-def MKVs on my hdd, not including the many GBs of music and home videos and saved work (I'm a graphic designer). Really easy for me to kill a couple of Tb today, and who knows how much info I'll be playing with when 3D becomes the norm. Completely necessary upgrade; my 1.5Tb is almost full, and I got it at Christmas!
JayK
5 / 5 (5) May 10, 2010
Another great example of Moore's Law. Give people access to that much space (developers and users alike) and they'll find uses for it that you can never imagine. "640K ought to be enough for anybody" - Bill Gates 1981
trekgeek1
not rated yet May 10, 2010
It doesn't matter if you need it. It never hurts to have it just in case. We don't know how much some future application may need. How much do you think a holodeck program uses? Just sayin...................
Quantum_Conundrum
not rated yet May 10, 2010
How much do you think a holodeck program uses?


I've thought of that, but the computer technology will likely be waiting around for decades before such a device is invented, if it ever is invented.

One could calculate how much "3d video RAM" you would need to project a holographic environment into a 10ftx10ftx10ft room (1000 cubic feet).

The highest screen resolution I know of for video today is 2400x1600, which I think is on a 21 inch monitor. That comes to about 17.5x11 inches, which is close to 140 pixels in a line an inch long. So to figure how much stronger the "video card" would need to be for a hologram you could just do some divisions and conversions.

140 per inch
140^2 per inch^2 (Existing, across ~192.5 inches^2)
140^3 per inch^3 (across 10ft^3)

So the total number of "pixels" would be:

140^3 * 10^3 * 12^3 = 4.741632*10^12 pixels.

Now to find out how many moore's laws cycles before a video card can do that...see next post...
x646d63
5 / 5 (1) May 10, 2010
Of course we'll use that space. Clouds will just get larger. You may not need 100TB on your home computer, but the clouds will.

The LHC produces 20 petabytes of information each year. Currently sensors are *limited* by storage capacity/speed. Increase capacity and speed and we'll get more sensors and more data to analyze.
Quantum_Conundrum
not rated yet May 10, 2010
to find out how many times we need to increase the video card's processor and RAM, we have...

divide needed pixels for a 10ft^3 room by the existing pixels for an existing monitor at max resolution on the best video card.

4.741632*10^12 / ((140^2 * 192.5)) = 1256727

So a 10ft cubed holodeck would need 1,256,727 times as much processor power and RAM as the best existing video card in order to have the same pixel density and color depth.

Moore's law will tell us how many years it would take to have a video card capable of doing this:

2^X = N

N = 1,256,727

Then X ~ 20.26

X is the number of doublings given the same area, and Moore's law predicts a doubling every 1.5 to 2 years, so you are looking at 30-40 years before a computer could run a holodeck program...

However, it would take several hundred years for any software company to actually make an interactive holodeck gaming program, other than something simple like 3d pong, or just pure "3d-video".
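
(A minimal Python sketch of the arithmetic in the two comments above; the 140 pixels per inch, the 192.5 in^2 monitor and the 10 ft cube are the commenter's assumptions, not figures from the article.)

    import math

    # Reproduce the commenter's holodeck estimate under their stated assumptions.
    px_per_inch = 140                          # assumed linear pixel density
    monitor_area_in2 = 17.5 * 11               # assumed 2400x1600 monitor, ~192.5 in^2
    room_side_in = 10 * 12                     # 10 ft cube, expressed in inches

    voxels = (px_per_inch * room_side_in) ** 3             # ~4.742e+12
    monitor_px = px_per_inch ** 2 * monitor_area_in2       # ~3.77e+6
    doublings = math.log2(voxels / monitor_px)             # ~20.3

    print(f"{voxels:.3e} voxels needed")
    print(f"{doublings:.1f} doublings -> roughly "
          f"{doublings * 1.5:.0f}-{doublings * 2:.0f} years at 1.5-2 years per doubling")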
Quantum_Conundrum
not rated yet May 10, 2010
However, you could "fake" 3d video using a 10^3ft^3 box with screens on all 6 internal surfaces, essentially 6 2-d screens. This would give pixels as:

140^2 * 10^2 * 6 * 12^2 = 1.69344*10^9 pixels.

Which is significantly fewer pixels than a "true" holodeck (~3000 times fewer).

1.69344*10^9 / (140^2 * 192.5) = 448.83

2^X = N

N = 448.83

so X ~ 8.81

or 8.81 doublings.

so...

Making a video card capable of running this "fake holodeck" would be doable in approximately 13.22-17.6 years...and "should" sell for approximately the same price as the existing top of the line video card.
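
(The same sketch adapted for the six-screen "fake holodeck" above; again, all figures are the commenter's assumptions.)

    import math

    px_per_inch = 140
    wall_px = (px_per_inch * 10 * 12) ** 2             # one 10 ft x 10 ft wall
    total_px = wall_px * 6                             # six internal surfaces, ~1.693e+9
    monitor_px = px_per_inch ** 2 * 17.5 * 11          # same reference monitor
    doublings = math.log2(total_px / monitor_px)       # ~8.8

    print(f"{total_px:.3e} pixels, {doublings:.1f} doublings -> roughly "
          f"{doublings * 1.5:.0f}-{doublings * 2:.0f} years")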
Quantum_Conundrum
1 / 5 (2) May 10, 2010
Finally, I'm not convinced there is even a "semi-practical" reason to make 3d projectors. A monitor with rendered graphics in a 2d or 3d environment does a fine job in terms of video gaming, and it would actually be hard to argue in favor of an interactive 3d environment for many types of games. Real Time Strategy isn't about perfect 3d rendering, for example, but more about, well, strategy and execution. Why spend 99.9% of a system's power on graphics when graphics aren't the main focus of strategy-based or execution-based gaming? And as far as first person shooters go, if someone wants to run around physically and play a FPS in a hologram, they could do the same thing for cheaper in an existing laser tag or paint ball game. The only benefit of a hologram would be better "map selection" compared to actually playing laser tag or paint ball...in any other respect, a keyboard and mouse with a 2d monitor works about as well.
trekgeek1
not rated yet May 10, 2010
"semi-practical"?? Uhhhh......... it would be awesome!!! That is my semi-practical reason. I appreciate the calculations and discussion, thank you. We must also consider that not all the space in a holodeck would need a pixel. Indeed, many objects could be hollow or even empty transparent air, thus reducing the number of pixels greatly by allowing heavy compression algorithms. Even simple run length coding and Huffman coding could help. Also, Moore's law may be crushed by quantum computing or other amazing improvements that will result in leaps rather than steps.
dirk_bruere
5 / 5 (1) May 10, 2010
Just ripping a BluRay disc to HDD will take up 25+GB. That's 40 movies per TB. Not a lot. Still, DRM will be doing its best to keep us in the stone age.
Husky
not rated yet May 11, 2010
how about a roomless holodeck, projected directly into the mind instead by neural connections, this would save on pixels, as you only have to provide field of view and do background culling etc...
podizzle
not rated yet May 11, 2010
moores law is only good before the singularity. holodeck in 25 years
podizzle
not rated yet May 11, 2010
how about a roomless holodeck, projected directly into the mind instead by neural connections, this would save on pixels, as you only have to provide field of view and do background culling etc...


holodeck is for people w/o brain chips... too scary for some people. not me
BloodSpill
1 / 5 (1) May 11, 2010
You know what's great about this?

Backing stuff up.
hard2grep
not rated yet May 12, 2010
yep, time for defrag... I sure hope that wasn't a raid-0 get-up.
MadPutz
not rated yet May 15, 2010
@Quantum_Conundrum

People said the same thing when RAM was measured by the kilobyte. It's not a matter of applying current applications, it's a matter of future applications that will be enabled that we do not know of yet. Otherwise you're comparing apples to oranges.
robbor
not rated yet May 15, 2010
i don't care how big a house you get, in no time you'll fill it up and run out of space
Foolish1
not rated yet May 16, 2010
How amazing is this really? A few hundred gigabytes vs a few terabits. They are not in the same terms. 1 gigabyte represents 8x more data than a gigabit. 1 terabit is actually just 125 gigabytes.

1 Tbit per square inch was the density limit of the experiment, with write performance of 250 Mbit/sec (31.25 MB/sec).

This may well signal significant improvements in storage densities but the wording and use of terms is either a typo or exceedingly dishonest in my opinion.
CarolinaScotsman
1 / 5 (1) May 17, 2010
In the early fifties, some engineers at IBM said that two or three computers for the whole country would be all we would ever need. Information technology is the largest growth industry ever and will continue to be so for the foreseeable future.