To DDR3: Thanks for the memory but time for DDR4

May 10, 2012 by Nancy Owano

(Phys.org) -- Micron Technology is polishing up its DDR4 memory modules, “sampling” them with major customers and gathering feedback, with plans to reach volume production later this year. In brief, Micron is getting ready to bring its DDR4 DRAM modules to market. That means the computer industry can expect a new memory standard to make a difference across a range of computing devices, from enterprise computing to so-called ultra-thins and tablets. Boise, Idaho-based Micron this week announced the first piece of its portfolio of DDR4-based modules: a 4-gigabit (Gb) DDR4 x8 part. According to the announcement, the complete portfolio of DDR4-based modules will include RDIMMs, LRDIMMs, 3DS, SODIMMs and UDIMMs (standard and ECC).

For the “soldered down space,” said Micron, x8, x16 and x32 components will be available, with initial speeds up to 2400 megatransfers per second (MT/s), increasing to the JEDEC-defined 3200 MT/s. Observers assume that the DDR4 modules will first appear in enterprise machines, with ultrathins and tablets up next. The advantages will be twofold: power savings and performance enhancements. Beyond Boise, one piece of unfinished business remains: the JEDEC definition of DDR4. JEDEC stands for Joint Electron Device Engineering Council, and publication of its DDR4 specification is awaited; JEDEC apparently intends to publish quite soon.
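
Those transfer rates translate directly into peak theoretical bandwidth: multiply the rate by the width of the data bus. A back-of-the-envelope sketch in Python, assuming a standard 64-bit DIMM channel (these are theoretical peaks, not sustained throughput):

def peak_bandwidth_gbs(mt_per_s, bus_bits=64):
    # megatransfers/s * bytes per transfer, expressed in GB/s
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

# a common DDR3 speed, Micron's initial DDR4 speed, and the JEDEC-defined ceiling
for rate in (1600, 2400, 3200):
    print(f"DDR-{rate}: {peak_bandwidth_gbs(rate):.1f} GB/s per channel")
# 12.8, 19.2 and 25.6 GB/s respectively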

According to JEDEC, “With publication forecasted for mid-2012, JEDEC DDR4 will represent a significant advancement in performance with reduced power usage as compared to previous generation technologies.”

JEDEC said per-pin data rates will range, over time, from 1.6 gigatransfers per second to 3.2 gigatransfers per second. With DDR3 exceeding its expected peak of 1.6 GT/s, it is likely that higher performance levels will be proposed for DDR4 in the future. Also planned in the new standard are a “pseudo open drain interface on the DQ bus, a geardown mode for 2667 MHz data rates and beyond, bank group architecture, internally generated VrefDQ and improved training modes.” JEDEC will host a DDR4 Technical Workshop following the publication of the standard.
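
Of the features in that list, bank group architecture has the most visible performance effect: back-to-back column commands to different bank groups can be issued after the short delay tCCD_S, while commands within the same bank group must wait the longer tCCD_L. A minimal Python sketch of why interleaving across groups helps (the timing values are illustrative assumptions, not JEDEC figures):

TCCD_S, TCCD_L = 4, 6  # clocks between column commands (assumed values)

def command_issue_time(bank_groups):
    """Total clocks to issue a sequence of column commands, given the
    bank group each command targets, in order."""
    total = 0
    for prev, cur in zip(bank_groups, bank_groups[1:]):
        total += TCCD_S if prev != cur else TCCD_L
    return total

print(command_issue_time([0, 1, 0, 1]))  # alternating groups: 12 clocks
print(command_issue_time([0, 0, 0, 0]))  # same group:         18 clocks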

Micron's DDR4 technology was co-developed with Taiwan-based Nanya Technology. Founded in 1995, Nanya does research, development and design, as well as manufacturing, of DRAM products. In 2008, Nanya and Micron entered into a joint technology development agreement.

While the JEDEC publication won't be complete until midyear, Micron is wasting no time, with plans to reach volume production of its DDR4 products in Q4 2012.

Commenting on what to expect from DDR4, Nebojsa Novakovic, writing in VR-Zone, said DDR4 has better ways of handling parity and ECC errors than previous memory types: it can recover from both command and parity errors without crashing the system. Nonetheless, any real deployment will happen in 2014, he said, when AMD and Intel platforms support the new memory.
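
Novakovic's mention of command and parity errors refers to DDR4's command/address parity: the controller sends a parity bit alongside each command so the DRAM can detect a corrupted command and the controller can retry it rather than crash the system. A toy Python sketch of the detection idea (illustrative only; the command word and single-bit error below are hypothetical, not JEDEC specifics):

def even_parity(word: int) -> int:
    # Parity bit chosen so the total count of 1s (word plus parity) is even.
    return bin(word).count("1") % 2

cmd = 0b1011001                  # hypothetical command/address word
sent_parity = even_parity(cmd)
received = cmd ^ 0b0000100       # simulate a single bit flipped in transit
if even_parity(received) != sent_parity:
    print("parity error detected -> retry the command instead of crashing")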

User comments: 43


Vendicar_Decarian
2.6 / 5 (10) May 10, 2012
I fear that, because the DDR4 specification isn't complete, there will be subtle differences between each manufacturer's implementation that will cause an increase in motherboard incompatibility issues.

CHollman82
2.5 / 5 (8) May 10, 2012
Your fear is unwarranted...
El_Nose
5 / 5 (2) May 10, 2012
I agree; the consortium has stated that the DDR4 standard will be complete in a few months.
Sonhouse
3.3 / 5 (4) May 10, 2012
I can't wait for DDR5 and DDR6!
Lurker2358
2.3 / 5 (12) May 10, 2012
I can't wait for DDR5 and DDR6!


I can't wait until humans actually learn how to do something useful with what we have, instead of just running a bigger OS and gossip engines, or adding more pixels to a screen.
CHollman82
2.8 / 5 (9) May 10, 2012
I can't wait for DDR5 and DDR6!


I can't wait until humans actually learn how to do something useful with what we have, instead of just running a bigger OS and gossip engines, or adding more pixels to a screen.


What are you waiting for? You're a human, show us all how it's done.
islatas
5 / 5 (3) May 10, 2012
I can't wait for DDR5 and DDR6!


I can't wait until humans actually learn how to do something useful with what we have, instead of just running a bigger OS and gossip engines, or adding more pixels to a screen.


There are plenty of extremely lightweight linux flavors out there. You can run your OS off a thumbdrive if you so desire. Have at it.
CHollman82
3.2 / 5 (9) May 10, 2012
Why would you want to? I can buy a gig of ram cheaper than I can buy a gallon of milk, I have 16 gigs of DDR3 in my home PC, it's cheap as shit... literally, a bag of fertilizer might be more expensive to produce than a RAM chip.
SteveL
5 / 5 (2) May 10, 2012
I can't wait for DDR5 and DDR6!

But you will. ;)
sirchick
not rated yet May 10, 2012
Why would you want to? I can buy a gig of ram cheaper than I can buy a gallon of milk, I have 16 gigs of DDR3 in my home PC, it's cheap as shit... literally, a bag of fertilizer might be more expensive to produce than a RAM chip.


Shame CPUs and GPUs are not as cheap :(
Vendicar_Decarian
2.3 / 5 (9) May 10, 2012
Then you are one spectacular fool.

"I have 16 gigs of DDR3 in my home PC" - CHollman82

Magnified by the fact that you admit to such idiocy in public.
Lurker2358
2.1 / 5 (7) May 10, 2012
Then you are one spectacular fool.

"I have 16 gigs of DDR3 in my home PC" - CHollman82

Magnified by the fact that you admit to such idiocy in public.


Yeah, there isn't an application out there that would actually use 16gigs for private purposes, and it's also unlikely that the memory would be the bottleneck in a PC above about 6 gigs anyway.

Even in the biggest, newest, memory hog video games, the RAM is pretty insignificant. The CPU and GPU is the bottleneck, always, because they design games to FORCE CONSUMPTION of more advanced GPUs, by always increasing resolution, color depth, shading, frame rates, etc, even when it's already beyond human perception.

Only reason I have 6 gigs of RAM is it came standard with the system. I've never come close to actually using it.
Lurker2358
1.8 / 5 (4) May 10, 2012
Anyway, there's no incentive to upgrade a computer.

I've had this one for nearly two years now, got it from the bargain bin for $500, and it runs Crysis and Starcraft 2, and hope to get another year or two or three out of it. WITH the 20 inch HD monitor and keyboard included.

Next time I buy a bargain bin computer for $500, it'll come with probably 16gigs of RAM, or more, standard, and like 4 times better GPU, so I won't be a hypocrite either.
CHollman82
3.9 / 5 (15) May 10, 2012
Hey idiots...

I do video editing using Sony Vegas and Adobe Premier Pro as well as 3D rendering with 3D studio max and Maya... you clowns have no idea what you are talking about, I can easily use 16gb of RAM.
CHollman82
3.9 / 5 (14) May 10, 2012
What a bunch of morons :rolleyes:

"No one will ever need more than 640kb of RAM"
kaasinees
3.1 / 5 (7) May 11, 2012
I only have 10GB, and starting up Chrome already uses gigs' worth of cache.
Sometimes I just have a dozen things open, including a virtual machine. 10 GB easily fills up when you develop something.
Then again, if you just use email or office, a lightweight Linux with 2 gigs is enough for normal people.
Gamers also might want a few more gigs but can get by with 5; they usually need more GDDR, not DDR.
ShotmanMaslo
5 / 5 (4) May 11, 2012
The CPU and GPU is the bottleneck, always, because they design games to FORCE CONSUMPTION of more advanced GPUs, by always increasing resolution, color depth, shading, frame rates, etc, even when it's already beyond human perception.


Then your perception must be pretty lame. There is still a lot to improve in game graphics.
kaasinees
3 / 5 (6) May 11, 2012
Also, every step up in DDR version decreases energy consumption and increases speed (though sometimes it's slower because of higher CAS latency).
It is also cheaper to produce per GB.

I bought DDR3-2000 with CAS 7, so my calculated cycle time is 1 ns.
I might have overdone it a bit here, but I wanted fast memory.
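
For readers checking that arithmetic: DDR transfers data on both clock edges, so the I/O clock runs at half the transfer rate. A quick Python sketch using the figures from the comment above:

data_rate_mts = 2000           # DDR3-2000: 2000 megatransfers/s
cas_cycles = 7                 # CAS 7, per the comment above
clock_mhz = data_rate_mts / 2  # DDR moves data on both clock edges
cycle_ns = 1000 / clock_mhz
print(cycle_ns)                # 1.0 ns per clock cycle, as the commenter says
print(cas_cycles * cycle_ns)   # 7.0 ns absolute CAS latency
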
Vendicar_Decarian
2.7 / 5 (6) May 11, 2012
Then it isn't your home computer.

"I do video editing using Sony Vegas and Adobe Premier Pro as well as 3D rendering with 3D studio max and Maya.." - Chollman82

Sony Vegas is a perfect example of why Sony is failing as a company.
Vendicar_Decarian
3.3 / 5 (4) May 11, 2012
Mine: W7

https://docs.goog...vb2sxdzQ

"Sometimes i just have a dozen things open including a virtual machine." - Kaas

Foolish.
dgreyz
4.8 / 5 (6) May 11, 2012
Vendicar_Decarian / Lurker2358, Chollman82 is right, 3D artists just have a whole different need for computers than most other people.

And it gets pretty annoying to constantly hear people saying you're an idiot because of our need for many cores or large amounts of RAM etc, and claim that we actually don't need it. As if we're all suffering from some delusion, while it's actually nothing strange in our field.

For many good reasons it's even mandatory for our "home pc" to be capable of doing work as well. So I hope people could just respect that.
Pkunk_
4.5 / 5 (2) May 11, 2012
Great. Now the conversation gets down to the level of how big my memory is.

Beat this -
                   total       used       free     shared    buffers     cached
Mem:               32229      32108        120          0        769      28065
-/+ buffers/cache:            3273       28955
Swap:               1499          0       1499

Keep in mind there are perfectly valid reasons to have 32gigs of RAM and more if you are actually running software that needs it.
Lurker2358
1.1 / 5 (7) May 11, 2012
Then your perception must be pretty lame. There is still a lot to improve in game graphics.


BS.

Developers spend entirely too much time on Graphics now as is, when they should be working on game play, story, dialog, and level design elements.

Do you know that the Starcraft 2 executable is only 6 megabytes, and the Starcraft:Broodwar executable was only about 500kb?

Almost all the other stuff in Starcraft 2 is 3d models and animations, and in broodwar, just animations, ok, plus map files, but those are re-downloaded from the server side every game anyway, to guarantee nobody is cheating. Sound files actually aren't that big.

At any rate, the majority of what you're paying for in a modern video game is 3d graphics and color depth, not the game engine, story, or game play, or anything at all that enhances "replay-ability".

My RAM on max settings isn't even the problem, it's the damned video card. But I could as well go buy a new computer for what a video card costs.
Lurker2358
1.5 / 5 (8) May 11, 2012
I run it on lowest settings, because that prevents lag, and it's also to your advantage to do so, even if you had a premium video card, because you want maximum reaction time in the interface, and minimum crap calculations interfering with the interface.

I run on lowest settings, and don't miss the enhancements, and probably would never use the max settings even if I had a $1000 video card it would require, except possibly in single player, because again, that would be a dumb thing to do.

Additionally, higher resolution actually screws up mouse precision, and requires you to move the mouse farther and farther.
jonnyboy
3 / 5 (4) May 12, 2012
I can't wait for DDR5 and DDR6!


I can't wait until humans actually learn how to do something useful with what we have, instead of just running a bigger OS and gossip engines, or adding more pixels to a screen.


What are you waiting for? You're a human, show us all how it's done.

more of a "troll" actually.
PosterusNeticus
3.8 / 5 (6) May 12, 2012
because they design games to FORCE CONSUMPTION of more advanced GPUs, by always increasing resolution, color depth, shading, frame rates, etc, even when it's already beyond human perception.


I've been a professional game programmer for the last 13 years, and you have no idea what you're talking about. You are literally just making random noises, like a bird that appears to have speech but is really just repeating sounds it once heard.

There are so many self-contradictions in the quoted text that I honestly don't even know where to begin correcting you. So instead I'll advise people to simply skim past your "valued input" on the matter.
wwqq
3.7 / 5 (6) May 12, 2012
Yeah, there isn't an application out there that would actually use 16gigs for private purposes


It doesn't have to be ONE application, but even then it's not hard to use up 16 GB. E.g. if you do a large photo merge in photoshop it will eat all the ram you can give it.

Even in the biggest, newest, memory hog video games, the RAM is pretty insignificant.


The only reason games aren't using every bit of memory they can is that harddrives are ridiculously slow.

[...]they design games to FORCE CONSUMPTION of more advanced GPUs


Most games are extremely conservatively designed, mass-market products. Almost every game has a freaking DX9 path.

[...]even when it's already beyond human perception.


Laughable. Lighting is still a bag of cheap tricks and hacks. Everything is still pretty low poly and blocky. Without correct motion blur you'd need about 500-1000 FPS to approach smooth movement. Environmental sound effects are just a cheap not very convincing hack.
Deathclock
1.8 / 5 (5) May 12, 2012
Then it isn't your home computer.

"I do video editing using Sony Vegas and Adobe Premier Pro as well as 3D rendering with 3D studio max and Maya.." - Chollman82


Because you would know right? I forgot that I had you over for the barbecue last weekend!
Deathclock
2.3 / 5 (6) May 12, 2012
because they design games to FORCE CONSUMPTION of more advanced GPUs, by always increasing resolution, color depth, shading, frame rates, etc, even when it's already beyond human perception.


I've been a professional game programmer for the last 13 years, and you have no idea what you're talking about. You are literally just making random noises, like a bird that appears to have speech but is really just repeating sounds it once heard.


Nailed it, perfect description of this guy, he always acts like an idiot. I think he is young, I remember thinking I knew everything when I was 17 too...
Deathclock
2.7 / 5 (7) May 12, 2012
Do you know that the Starcraft 2 executable is only 6 megabytes, and the Starcraft:Broodwar executable was only about 500kb?


Do you understand that CODE is and has always been a TINY fraction of the total memory requirement of a computer game?

Almost all the other stuff in Starcraft 2 is 3d models and animations, and in broodwar, just animations, ok, plus map files, but those are blah blah blah


Code is small, data is large, this is almost always the case with anything, games or otherwise.

The majority of what you're paying for in a video game is 3d graphics and color depth


Color depth? You have no idea what you are talking about, do you? Yes, the major cost in the production of modern games is the artistic resources, including 3D models, textures, sound effects, music, voice acting, etc. So what? Stories are cheap.

But I could as well go buy a new computer for what a video card costs.


Not one with a decent video card in it...
Deathclock
2.7 / 5 (7) May 13, 2012
I'm going to be honest with you, you sound poor and bitter to me. You sound like you have a serious case of sour grapes because you cannot afford modern hardware to play modern games so you lash out at anyone who can by disparaging those games.

FYI, my Skyrim data directory with all of the mods I have installed is 17.3 gigs with 21,000 files. Just the high resolution replacement textures consume 9.2 gigs of that. If the engine allowed it I could MORE than fill the 16gb of RAM on my system by playing Skyrim alone.
Deathclock
2.3 / 5 (6) May 13, 2012
because they design games to FORCE CONSUMPTION of more advanced GPUs, by always increasing resolution, color depth, shading, frame rates, etc, even when it's already beyond human perception.


I've been a professional game programmer for the last 13 years, and you have no idea what you're talking about. You are literally just making random noises, like a bird that appears to have speech but is really just repeating sounds it once heard.

There are so many self-contradictions in the quoted text that I honestly don't even know where to begin correcting you. So instead I'll advise people to simply skim past your "valued input" on the matter.


Wait, are you saying you guys DON'T keep increasing color depth at the behest of the video card manufacturers? I heard the newest nvidia cards are going to have 512bit color!

I'm shocked and appalled!

But seriously, that guy is a complete idiot.
CreepyD
5 / 5 (3) May 13, 2012
Game graphics are FAR FAR from being beyond human perception in terms of detail and such, what games is that guy playing?!
Even a maxed out Crysis 2 is far from perfect.

I have 3gb of DDR3 RAM, and there is certainly no need for more for the time being, and even more certainly no need for faster ram.
Also, why not DDR5, since that's the norm in GPUs anyway?

A faster GPU - maybe in a couple of years - game releases are so slow these days anyway because consoles are holding back the PC market.
CPU - no need to upgrade, even entry-level ones these days are fast enough for almost all applications, and games rely much more on GPUs now anyway.
Anyone complaining of a slow PC, buy an SSD - it's the biggest single upgrade to your PC's speed that's ever been released.
kaasinees
2.7 / 5 (7) May 13, 2012
Lurker has a good point though.

If you ever played Morrowind, you would know that the game content in Skyrim is just horribly ridiculous.
Vendicar_Decarian
2.7 / 5 (7) May 13, 2012
Ya the reason is called stupidity.

"Keep in mind there are perfectly valid reasons to have 32gigs of RAM and more if you are actually running software that needs it." - Pkunk
Hengine
4.7 / 5 (6) May 13, 2012
Vendicar I don't like to say it but your comments invite people to be nasty to you. You're often ignorant and arrogant.

People have legitimate use for large quantities of RAM at home, deal with it. Home computers often double up as workstations and it may come as a big surprise to you but there is software that benefits greatly from large memory reserves. 3D CAD and analysis tools, HD video/photo editing, virtual boxes etc all demand massive chunks of RAM because they have to juggle and track billions upon billions of information bits quickly and efficiently.

High performance hardware becomes available and then software is made to make use of it and so on.
Deathclock
1.8 / 5 (5) May 13, 2012
Lurker has a good point though.

You would know if you ever played morrowind that game content is just horribly ridiculous in skyrim.


Nostalgia.

Skyrim is at least equal to Morrowind in story elements, and far superior in gameplay and graphics.

And before you ask, yes I played Morrowind when it was new, as well as Daggerfall and Arena.
wwqq
3 / 5 (4) May 14, 2012
Nostalgia.


Well, it's not nostalgia if he actually sits down and plays Morrowind and still prefers it; then it's just poor taste.
wwqq
1 / 5 (1) May 14, 2012
Faster RAM is not very useful for the typical workload CPUs do today; but it is enormously beneficial for GPUs.

The reason CPUs don't benefit much is that they spend most of their time running this nasty, branchy code with lots of dependencies between tasks or waiting for harddrives.

They desperately need lower latency, which they largely achieve through massive caches (if you actually have to wait for a single byte to be read from RAM, it will take hundreds of clock cycles!).

If you have some embarrassingly parallel problem you can easily eat all the bandwidth available; but if you have a problem of that kind, you might as well use the graphics card, because that's exactly the sort of problem it is designed to solve.
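
To put wwqq's "hundreds of clock cycles" in concrete terms, here is a rough sketch with assumed, typical-for-the-era numbers (not measured figures):

cpu_ghz = 3.0    # assumed CPU clock
dram_ns = 60.0   # assumed end-to-end DRAM access latency on a cache miss
stall_cycles = dram_ns * cpu_ghz
print(f"~{stall_cycles:.0f} CPU cycles stalled per uncached read")  # ~180 cycles
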
gopher65
5 / 5 (1) May 16, 2012
CreepyD said:
Also, why not DDR5 since that's the norm in GPU's anyway.

GDDR5, used in high-end graphics cards, is not the same as DDR5 (which doesn't exist yet). GDDR5 is simply a different implementation of DDR3. It generates more heat and uses more power, but it performs certain functions much faster than DDR3 does. It is designed with the needs of graphics cards in mind, rather than general computing.

Lurker2358 said:
I run it on lowest settings, because that prevents lag

I don't think that word (lag) means what you think it means;). But yeah, everyone who plays SCII on Battle.net runs on less than max settings for several reasons (on min settings cloaked units are easier to see). Anyone who claims that people run Starcraft II at max settings doesn't play the multiplayer game:P.

Vendicar_Decarian: I have 8 gigs of ram, and Photoshop regularly throws a fit about my lack of ram when I work with large files.
Deathclock
1.8 / 5 (5) May 16, 2012
I don't think that word (lag) means what you think it means;).


I also hate it when stupid people misuse the word "lag"... As I am sure you know, but for the benefit of others, lag refers to latency, which is the round trip time from your computer to the server. It is a temporal offset between what you see occur in the game and when the game server thinks it occurs. "Lag" should never be used to refer to reduced frame rate on the local system, because that is not what it means.
SteveL
5 / 5 (1) May 17, 2012
Some of the load-caused game latency is somewhat controllable. I used to be a coder in the mid-90s for an online game (GemStoneIII & later GemStoneIV) and, among other things, developed code that monitored internal game latency and automatically adjusted the timings of many of the background tasks to reduce the processor load until the system recovered. There is a lot of code running in the background of online gaming that, if slowed down for a few hundred milliseconds, wouldn't adversely affect the customer experience, while momentarily reducing the load on the processor long enough to let it catch up.
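
SteveL's scheme, monitoring latency and backing off background work until the system recovers, is a classic adaptive throttle. A minimal Python sketch of the idea (all names, thresholds and timings here are assumptions for illustration, not the GemStone code):

LATENCY_BUDGET_MS = 100.0   # assumed acceptable internal latency
interval_s = 0.25           # current delay between background-task runs

def adjust_interval(measured_latency_ms, interval_s):
    # Lengthen the background interval under load; ease back on recovery.
    if measured_latency_ms > LATENCY_BUDGET_MS:
        return min(interval_s * 2.0, 5.0)    # back off, capped at 5 s
    return max(interval_s * 0.9, 0.25)       # shrink toward the 0.25 s floor

for latency in (80, 120, 250, 140, 90):      # simulated latency samples (ms)
    interval_s = adjust_interval(latency, interval_s)
    print(f"latency {latency} ms -> background interval {interval_s:.2f} s")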