Intel's Broadwell may put an end to CPU swap-outs

Nov 30, 2012 by Nancy Owano

(Phys.org)—Never content to fixate on the next signpost on Intel's roadmap, Intel watchers are already looking past the Haswell processors to their successor architecture, Broadwell. They say that Broadwell will not be offered as a land grid array (LGA) product but will instead signal a shift to a ball grid array (BGA). In practice, that means future Intel CPUs may come soldered directly to motherboards, marking the end of user-replaceable CPUs and placing limits on users and system builders.

The talk is that the 14-nanometer Broadwell will be offered only as a BGA product. A CPU soldered onto a motherboard leaves users no opportunity to swap out CPUs and mix and match motherboards. According to ZDNet, one of the sources indicating a shift, "Reports suggest that Intel is preparing to kill off PC upgrades by adopting the BGA rather than an LGA package." The way processors are mounted on motherboards would depart from the familiar socket approach, in which a processor is socketed onto a desktop motherboard rather than soldered directly on, as in devices such as tablets and laptops. A shift to BGA could end socket compatibility for Intel desktop processors, raising the question of what, if any, marketplace impact would ensue.

Most users might not care; those who do their own processor upgrades, however, would notice if Intel took the route of soldering processors to the desktop board with Broadwell. They would lose the ability to upgrade and build their own systems. This may be seen as a drawback for DIY enthusiasts, including gamers, who like to choose from a variety of processors depending on their needs.

Intel watchers appear to be fairly confident that this is what Intel intends to do. "I have now independent confirmation from a PC building OEM, who declined to be named, along with two makers, that Intel has briefed them of the switch from LGA to BGA for Broadwell architecture processors," said Adrian Kingsley-Hughes in ZDNet. Broadwell chips are due in 2014.




User comments: 32


Claudius
2.5 / 5 (8) Nov 30, 2012
Well, this is one way to help AMD and other competitors go after the market of those who like to build their own systems.
tardiz
not rated yet Nov 30, 2012
Optimistically, this is a first step towards a system-in-package with an optical connection to the motherboard. That was supposed to start appearing in 2015; maybe Intel is going to push some boundaries.
FrankHerbert
3 / 5 (5) Nov 30, 2012
You can still swap out BGA chips, it just requires a bit of skill with a heatgun. People do this all the time to fix their Xboxes.
randith
5 / 5 (1) Nov 30, 2012
Intel already had a habit of changing socket types much more often than AMD did, meaning that you had to upgrade your motherboard about as often as you had to upgrade your CPU anyway.

Still, motherboards fail more often than CPUs do, so for those times when your motherboard fails and you have to buy a whole new CPU/mobo combo, this is a bad thing.
El_Nose
3 / 5 (2) Nov 30, 2012
This had to happen anyway.

reasoning: optical interconnects will be mainstream by 2017, so this will get people used to the idea that CPUs and mobos are one and the same

biggest impact:
mobo manufacturers -- this cuts out Asus and Gigabyte and all the others who make top-quality mobos. In fact, Intel wrote contracts saying that information on mobo design could be taken from the companies it partnered with, with no reciprocation if Intel made mobos itself. To mobo makers this was a slap in the face
fmfbrestel
3 / 5 (2) Nov 30, 2012
Hardcore system builders can do what FrankHerbert is suggesting, so the only people this is really going to anger are the wannabe hardcore system builders. And since there are a boatload of those people, I predict a lot of angry users who will claim to switch over to AMD in protest, but then never actually follow through.

If AMD were better at winning converts from the pool of people pissed off with Intel, they wouldn't be having so many financial problems.
dschlink
5 / 5 (3) Nov 30, 2012
As an early adopter, I've only done a CPU swapout once in 43 years and that was 37 years ago. What you can gain from a CPU swap is minor compared to a motherboard upgrade, which isn't much more expensive.
Yevgen
2 / 5 (2) Nov 30, 2012
Anyway, when you upgrade the CPU you have to upgrade the motherboard too if you want a 2x improvement, and if you don't, the upgrade is a waste of time and money. So basically it is a don't-care for DIY system assemblers. They will just buy the CPU/mobo combo, which in many cases they do already (since there are cheap deals for them), except that now the CPU comes pre-soldered (less work).
As for mobo makers, it does indeed change their business model somewhat, since they will have to become distributors for CPUs (which cost twice as much as the mobo) and keep an inventory of them. That is somewhat of a bummer, and I suspect it might kill this concept once it is attempted in the actual marketplace.
kochevnik
2.3 / 5 (3) Nov 30, 2012
I disagree, as I got a proportionate speed improvement upgrading 1.8GHz Celerons to 2.4GHz Prescott CPUs in the same motherboard. Multiply that by a hundred motherboards and it's big money. Also, not all motherboards are equal. I used a pizza-box case, which dictated a (then) small 9x9 motherboard. Bundling the CPU would certainly have constrained my performance options far more than something I arranged myself. Moreover, many Chinese boards have cheap fuses which blow out at low temperatures. I had a Dell desktop fail last summer simply because the air conditioner was off that day. With Intel's meddling I would be out a CPU as well.
TrinityComplex
5 / 5 (1) Nov 30, 2012
As someone who repairs and upgrades computers as part of the job, I can say that I don't see this causing much change. If it appears to be a problem with either the motherboard or the processor, we usually replace both (unless it's an obviously defective capacitor, which is quick and easy to replace). Keeping spare RAM, HDDs, power supplies, and other verified-good components on hand is viable, but keeping spare motherboards and processors to swap out to isolate the fault in any given system build is just not time-, space-, or cost-effective.
kochevnik
2.6 / 5 (5) Nov 30, 2012
...having spare motherboards and processors that we can swap out to determine the source with any given system build is just not time, space and cost effective.
It's VERY time and cost effective if you have a server farm. Intel's approach is simply Apple's: limit consumer choice by bundling to raise prices and force consumers into substandard compromises.

Since Intel doesn't see huge performance gains ahead, they're saying "fuck it" and stamping out candy-ass prefab units to the unwashed masses. At that rate, soon they will be gluing the RAM on the board like Apple. Why not the drive as well? The gaming and overclocking community can go with AMD
TrinityComplex
not rated yet Nov 30, 2012
For a server farm or for large companies that invest in large numbers of computers all at the same time, yes, I agree somewhat. I suppose I should qualify that my position is repairing computers for various small business that buy new machines as they get new employees, because bulk purchasing for all employees at once is not in their budget. Saying that Intel is going to start doing it with other components may be stretching it, though.
Performance gains to cost of upgrading RAM is too high, and Intel knows that. I'd say they've at least put some effort into determining that this is a viable step. Someone running a large server farm will have the money to have a spare board with processor on hand to swap a bad one out quickly (we did that anyway at a server farm I worked at previously), and I can tell you from experience that it's cheaper for a small business to just replace both instead of paying for additional diagnostic time, as much as I hate throwing away possibly good parts.
VendicarD
1 / 5 (1) Nov 30, 2012
Joe Schmoe won't care. He purchases his PC from Walmart.

For him, this will lower his purchase price, because them fancy, schmancy, clampy, wampy CPU sockets won't be needed.
rwinners
1 / 5 (2) Nov 30, 2012
Back to the days of slotted processors?
VendicarD
1 / 5 (1) Nov 30, 2012
"Back to the days of slotted processors?" - Rwinners

Why not? Compared to the CPU all of the peripheral components are as slow as a flowing glacier.
rwinners
1 / 5 (2) Nov 30, 2012
Actually, MBs are becoming so small that this is not likely an issue.... as long as Intel provides adequate I/O for all the accessories... and the ability to accept third party video processors.
PosterusNeticus
1 / 5 (1) Nov 30, 2012
Whenever I decide it's time for a CPU upgrade, I build a whole new system anyway. I can't imagine myself installing a shiny new brain into a creaky old body.

My only concern is how this will impact the variety of options I currently enjoy for board & CPU combos. And at this point I'm not actually that concerned.
bobalony
4.3 / 5 (4) Dec 01, 2012
So couple this with on chip graphics and you have one size fits all computers (read as: tablets that sit in tower cases). I'm guessing the next step is just as another user said, soldering the ram chips onto the board as well. Pretty soon after that they will determine how much storage space the "average person" needs and solder that on as well.

Isn't it awesome to have a company make all your decisions for you? Oh by the way if you do any after market modifications you will get sued for hardware tampering.
kochevnik
3 / 5 (2) Dec 01, 2012
REMEMBER: Reverse engineering is a CRIME!
la7dfa
5 / 5 (1) Dec 01, 2012
I don't see this as a problem. The important part is that we will still have access to a range of different-performing computers.

You don't see a lot of people complaining about their graphics card having every component soldered on the board, so why should the rest of the computer be any different? Actually this means cheaper and more reliable PCs.

These days the improvements you can make to a good "off the shelf" gaming computer are quite limited, and they will be even more limited in the future.
Terratian
3 / 5 (2) Dec 01, 2012
As someone who has always built their PCs from scratch, I think this is a bad thing. I usually pick the parts I want and am not limited to cheap components that degrade the whole system. Once I made the mistake of buying a pre-assembled PC, and it was nothing but headaches (for me). And it cost me much more than if I had built it myself.

For those who want a pre-assembled PC, this is a good thing; it "should" mean cheaper prices.

The best solution, IMO, is for Intel to offer a choice of buying either the CPU/motherboard combo OR the CPU alone.
kochevnik
1 / 5 (2) Dec 01, 2012
Motherboards fail at a much higher rate than CPUs. I've had a CPU go through three motherboards based upon space, video card slot and component failure. PCI/AGP has gone through at least five evolutions now. Among other failures are cheapo capacitors with water in them and the blown fuses on a hot summer day. I've had motherboards catch on fire. Some motherboards are dual CPU and I want to launch a customer with one and allow them an easy upgrade later. Intel is gluing their CPU to a known point of failure because they expect the entire unit to be trashed 90% of the time. Right now there's a lively CPU salvage market on ebay and molotok.ru that Intel wants eliminated.
evropej
1 / 5 (4) Dec 01, 2012
1. If your motherboard or CPU dies, you are forced to buy both, since the removal of a BGA would cost more than the BGA itself, or vice versa.
2. Since this will force mobo assembly to include the CPU, there will be fewer combinations available and fewer options for people who build systems.
3. This will force mobo companies to purchase the processor prior to selling the mobo. This introduces a huge liability and cost to the mobo industry, which will most likely kill variety and small companies.
4. For people who start out with a cheap CPU or mobo and then upgrade later, that is no longer an option; it would push these purchases further out in time or reduce the investment amount.
5. AMD and Intel combinations on mobos will force mobo companies to make some hard decisions on what to invest in and how much to build. Again, this is a huge economic effect.

Bottom line: the market will speak and show that this is not a good business decision. Intel is trying to monopolize market share.
albion01
5 / 5 (1) Dec 01, 2012
Maybe it's time for ATI/AMD to fork from the Intel instruction set?
MachinegunDojo
1 / 5 (3) Dec 01, 2012
At first I thought this was ridiculous, but after thinking on it some the biggest complaint I can see really is expandability since many features now reside or are in the process of residing on the CPU itself anyways. If this offers better bandwidth and perhaps better latency as well then I'd say go for it.
chromosome2
1 / 5 (2) Dec 02, 2012
Everyone who is freaking out, consider the following. Intel graphics engineers have, like everyone else, full access to the transistor counts, clocks, voltages, and benchmarks of their competitors' products... and they're Intel engineers. DDR3 in its current form has a tenth the bandwidth of GDDR5 as commonly implemented. Intel has to do something drastic to get to discrete-graphics-level bandwidth, like reducing line distance to RAM by, say, half. I expect this to be part of that.
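[Editor's note: chromosome2's bandwidth gap can be checked with back-of-envelope arithmetic. The sketch below uses illustrative figures assumed for the era (dual-channel DDR3-1600 versus a hypothetical 384-bit GDDR5 card at 6 GT/s); these numbers are not from the article.]

```python
# Back-of-envelope peak memory bandwidth comparison.
# Peak bandwidth = transfers/sec x bytes per transfer x channels.

def bandwidth_gb_s(transfers_per_sec, bus_bits, channels=1):
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    return transfers_per_sec * (bus_bits / 8) * channels / 1e9

# Dual-channel DDR3-1600: 1600 MT/s on a 64-bit bus per channel.
ddr3 = bandwidth_gb_s(1600e6, 64, channels=2)   # 25.6 GB/s

# Assumed GDDR5 graphics card: 6 GT/s effective on a 384-bit bus.
gddr5 = bandwidth_gb_s(6e9, 384)                # 288.0 GB/s

print(f"DDR3-1600 dual channel: {ddr3:.1f} GB/s")
print(f"GDDR5 (384-bit, 6 GT/s): {gddr5:.1f} GB/s")
print(f"Ratio: {gddr5 / ddr3:.1f}x")            # roughly an order of magnitude
```

With these assumed parts the ratio comes out near 11x, consistent with the comment's "a tenth the bandwidth" claim.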
wwqq
5 / 5 (3) Dec 02, 2012
In electronic systems, more than half of all failures are solder joint failures. Ball grid arrays are fragile. The solder joints fail due to repeated thermal expansion and contraction.

Failure of solder joints is an extremely common problem for graphics cards, XBOX 360 and PS3. Now you're going to have to deal with them on CPUs as well.
wwqq
5 / 5 (1) Dec 02, 2012
Everyone who is freaking out, consider the following[...]


I'm already pissed at intel for not selling CPUs without bundled integrated shit-graphics. I'm not going to be any less irritated making further sacrifices for the sake of marginal improvements in integrated graphics performance.
VendicarD
2 / 5 (4) Dec 03, 2012
Welcome to the Free Market, where all of your wishes instantly materialize and you are never constrained by the marketing whims of Corporations.

"I'm not going to be any less irritated making further sacrifices for the sake of marginal improvements in integrated graphics performance."
antialias_physorg
not rated yet Dec 03, 2012
Well, this is one way to help AMD and other competitors go after the market of those who like to build their own systems.

Which is a (very small) niche market.

The CPU already takes a back seat to the GPU when it comes to what you should upgrade if you want to have a faster (gaming) system.
Tausch
1.8 / 5 (4) Dec 03, 2012
Will the light bulb industry do this?
evropej
1 / 5 (4) Dec 03, 2012
The integrated GPU is crap. Maybe the GPU (Nvidia) will finally put the processor to rest, and this is the spark that was needed. Greed will lead every company to greatness.
