AMD Says It Is Not Abandoning Socketed CPUs

Cabaloc (Fort Pierce, FL) Posts: 116 Member

http://www.tomshardware.com/news/Socketed-LGA-packaging-BGA-packaging-Kaveri-Chris-Cook,19570.html

 

Came across this and something makes me curious: preventing the consumer from upgrading the CPU because it is soldered onto the motherboard. OK, what's going on here, and what direction are we going in?


Comments

  • 13lake Posts: 295 Member Uncommon
    PCs with Intel hardware will become classic consoles?
  • Ridelynn (Fresno, CA) Posts: 4,160 Member Uncommon


    Originally posted by Cabaloc
    http://www.tomshardware.com/news/Socketed-LGA-packaging-BGA-packaging-Kaveri-Chris-Cook,19570.html
    Came across this and something makes me curious: preventing the consumer from upgrading the CPU because it is soldered onto the motherboard. OK, what's going on here, and what direction are we going in?

    Well, to be honest, I wouldn't care if sockets went away. I would probably briefly mourn, because slotting a CPU into the socket for the first time feels a bit like Dr. Frankenstein implanting the brain into his new creation - but aside from that, I don't think much would change.

    Long ago, CPUs were soldered onto the motherboard. It's cheaper than sockets, and it's usually more reliable (one of the leading causes of CPU failure, I would imagine, is people seating them in the socket incorrectly and then returning them as failed). And if I'm honest, I rarely build a computer and later upgrade the CPU without a motherboard upgrade to go with it (usually because of socket incompatibility).

    The loss of the socket wouldn't mean the death of the desktop. It wouldn't even mean the death of the enthusiast community. It would be mildly inconvenient for a small niche of users (mainly reviewers and hardcore overclockers). It would probably save $10-20 on the manufacturing price of CPUs/motherboards if they could be soldered in place (which is easy to automate, has low overhead cost, and doesn't require a big honkin' socket as an extra part).

  • Zuvielify (Fremont, CA) Posts: 168 Member
    Originally posted by Ridelynn


    Originally posted by Cabaloc
    http://www.tomshardware.com/news/Socketed-LGA-packaging-BGA-packaging-Kaveri-Chris-Cook,19570.html

    Came across this and something makes me curious: preventing the consumer from upgrading the CPU because it is soldered onto the motherboard. OK, what's going on here, and what direction are we going in?


    Well, to be honest, I wouldn't care if sockets went away. I would probably briefly mourn, because slotting a CPU into the socket for the first time feels a bit like Dr. Frankenstein implanting the brain into his new creation - but aside from that, I don't think much would change.

    Long ago, CPUs were soldered onto the motherboard. It's cheaper than sockets, and it's usually more reliable (one of the leading causes of CPU failure, I would imagine, is people seating them in the socket incorrectly and then returning them as failed). And if I'm honest, I rarely build a computer and later upgrade the CPU without a motherboard upgrade to go with it (usually because of socket incompatibility).

    The loss of the socket wouldn't mean the death of the desktop. It wouldn't even mean the death of the enthusiast community. It would be mildly inconvenient for a small niche of users (mainly reviewers and hardcore overclockers). It would probably save $10-20 on the manufacturing price of CPUs/motherboards if they could be soldered in place (which is easy to automate, has low overhead cost, and doesn't require a big honkin' socket as an extra part).

    I agree with you. I don't think I've ever changed my CPU without replacing my motherboard.

     

    The biggest loss I see with this is choice. Current motherboards support a wide range of chips because you can put in any CPU that matches the socket. If the CPUs are coupled with the motherboard, the options will go way down. I suspect what you'll see is just Low, Medium, and High end combinations, which may not necessarily be a bad thing. Sometimes a lot of choice is just a distraction.

  • TheLizardbones (Arkham, VA) Posts: 10,910 Member

    By itself, I don't think this is too big a deal. I would think motherboard manufacturers aren't happy, because they would have to pick the CPU to put on a motherboard before they sell it, and if they pick the wrong one and stock up, they lose a bunch of money. If they wait for orders to come in, they run the risk of not supplying enough parts. I guess they have this issue now, but it would be even worse.

    It would be funny if, instead of making PCs more standardized and giving Intel more control, it actually did the opposite. People would buy motherboards, but it would just be a board with the CPU attached and end users bolted that board onto a mainboard that had everything else.

    I can not remember winning or losing a single debate on the internet.

  • fenistil (Gliwice) Posts: 3,005 Member

    I see it as a big deal. This might mean that mid-range processors will be paired with mid-range-priced motherboards, and higher-end processors (like today's i5 or i7) will be paired with very expensive motherboards, leaving the consumer no choice and increasing the price of the whole PC. Also, competition in the motherboard market will be smaller.

    For me as a consumer, I see only downsides. I will not support it. If I have a choice, like AMD chips or maybe, in the more distant future, ARM desktops, I will show Intel my middle finger if they go with that.

  • defector1968 (Nar Shaddaa) Posts: 393 Member Common
    Originally posted by lizardbones

    By itself, I don't think this is too big a deal. I would think motherboard manufacturers aren't happy, because they would have to pick the CPU to put on a motherboard before they sell it, and if they pick the wrong one and stock up, they lose a bunch of money. If they wait for orders to come in, they run the risk of not supplying enough parts. I guess they have this issue now, but it would be even worse.

    It would be funny if, instead of making PCs more standardized and giving Intel more control, it actually did the opposite. People would buy motherboards, but it would just be a board with the CPU attached and end users bolted that board onto a mainboard that had everything else.

    They will create boards with both vendors' CPUs, just as the VGA card companies make cards with both vendors' GPUs.

    My biggest concern is that if the CPU dies, we need to change the mainboard with it. That means more money to spend.

  • TheLizardbones (Arkham, VA) Posts: 10,910 Member


    Originally posted by fenistil
    I see it as a big deal. This might mean that mid-range processors will be paired with mid-range-priced motherboards, and higher-end processors (like today's i5 or i7) will be paired with very expensive motherboards, leaving the consumer no choice and increasing the price of the whole PC. Also, competition in the motherboard market will be smaller.
    For me as a consumer, I see only downsides. I will not support it. If I have a choice, like AMD chips or maybe, in the more distant future, ARM desktops, I will show Intel my middle finger if they go with that.

    Intel isn't operating in a closed system. There are motherboard manufacturers and OEMs that will be participating. I don't see anyone abandoning Intel for desktops, unless AMD creates some kind of miracle processor, but the manufacturers are going to have to compete with each other. It would largely be based on the CPU speed, with the motherboard taking a backseat. If one manufacturer has an i5-3570 CPU $10 cheaper than another manufacturer, then they'll probably sell more parts, and they won't lose money on the margin because they'll pair it with a lower-end motherboard.

    I can not remember winning or losing a single debate on the internet.

  • TheLizardbones (Arkham, VA) Posts: 10,910 Member


    Originally posted by defector1968
    Originally posted by lizardbones
    By itself, I don't think this is too big a deal. I would think motherboard manufacturers aren't happy, because they would have to pick the CPU to put on a motherboard before they sell it, and if they pick the wrong one and stock up, they lose a bunch of money. If they wait for orders to come in, they run the risk of not supplying enough parts. I guess they have this issue now, but it would be even worse.
    It would be funny if, instead of making PCs more standardized and giving Intel more control, it actually did the opposite. People would buy motherboards, but it would just be a board with the CPU attached and end users bolted that board onto a mainboard that had everything else.
    They will create boards with both vendors' CPUs, just as the VGA card companies make cards with both vendors' GPUs.

    My biggest concern is that if the CPU dies, we need to change the mainboard with it. That means more money to spend.




    That would be bad for the person whose CPU dies. That doesn't happen often though, and it would happen even less often with the CPU being part of the motherboard. Now that I've said that, I'll be the person it happens to, though.

    Gah. GPUs hardwired into the boards that you can't bypass would be a nightmare. When you first buy it, things would be great, but a year later? I say again, GAH!

    Then again, maybe better software would result. Developers would have to write better software to get better performance. Imagine software on your PC that was optimized as well as the software for your game consoles.

    I think, overall most consumers will spend less money. On the supply side, money will get concentrated into fewer suppliers, with Intel holding more of the money. Other than that though, there are a lot of things that could happen. AMD might end up dominating the market (ha ha), or the market might push back enough to force a change...or people might just shrug and buy a new PC every year.

    I can not remember winning or losing a single debate on the internet.

  • Terranah (Stockton, CA) Posts: 3,605 Member

    As a casual user but semihardcore gamer, I don't see it as a big deal.  I've changed a lot of components over the years but I've never changed a cpu.  By the time I would think about changing out a cpu, it's probably time to think about upgrading the entire rig anyway, which I do about every 5 years.  And if I was going to change the cpu, I would be changing out the motherboard too.  But that's me, the casual user.  Ignorant, but happy.

     

     

  • Ichmen (Winnipeg, MB) Posts: 1,228 Member

    I'd rather keep the socketing. Sure, it can be a pain making sure the CPU matches the mobo, but frankly the only people who are affected by that are the custom builders.

    I'd rather not have to fork out 500 bucks to replace a mobo or CPU that fried, since they are soldered together.

    Honestly, if I wanted a factory-built PC I would buy one... and I don't want those paperweights >>"

    CPU: Intel Core i7 CPU 860 2.8GHz
    Evga GeForce 670 FTW
    Evga P55 SLI


  • Reizla (Alkmaar) Posts: 3,293 Member Uncommon
    Originally posted by Ridelynn


    Originally posted by Cabaloc
    http://www.tomshardware.com/news/Socketed-LGA-packaging-BGA-packaging-Kaveri-Chris-Cook,19570.html

    Came across this and something makes me curious: preventing the consumer from upgrading the CPU because it is soldered onto the motherboard. OK, what's going on here, and what direction are we going in?


    And if I'm honest, I rarely build a computer and later upgrade the CPU without a motherboard upgrade to go with it (usually because of socket incompatibility).

    So far, I've upgraded at least 3 CPUs since I started building PCs in 1992. My latest upgrade was earlier this year, from an AMD X3 to an X6. If the choice is up to me, I'd rather buy a more expensive motherboard with support for (future) CPUs than be forced to buy some crap that a board builder thinks I might like. Honestly, it's much cheaper to buy a slightly more expensive board (mine was €30 more than the standard one) and upgrade later at a lower price (the second X6 was €125) than to buy a whole mainboard with CPU twice and pay more in the end... (rough numbers sketched below)

    AsRock 990FX Extreme3
    AMD Phenom II 1090T ~3.2Ghz
    GEiL 16Gb DDR3 1600Mhz
    ASUS GTX970 3x HD monitor 1920x1080
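    A minimal back-of-the-envelope sketch of the two paths Reizla describes. Only the €30 board premium and the €125 second CPU come from the post above; the base board price, the first CPU price, and the cost of a second full board-plus-CPU are hypothetical placeholder figures, used purely to illustrate the arithmetic:

        # Rough cost comparison: "upgrade the CPU later" vs "buy a new board + CPU later".
        # Only the +30 EUR board premium and the 125 EUR second CPU are from the post;
        # every other figure is an assumed placeholder.

        BASE_BOARD = 90                        # assumed price of a standard board (EUR)
        FUTURE_PROOF_BOARD = BASE_BOARD + 30   # board with future CPU support (+30 EUR, from the post)
        FIRST_CPU = 150                        # assumed price of the original CPU (EUR)
        SECOND_CPU = 125                       # later X6 upgrade (from the post)
        NEW_BOARD_PLUS_CPU = 270               # assumed price of a whole new board + CPU later

        upgrade_in_place = FUTURE_PROOF_BOARD + FIRST_CPU + SECOND_CPU    # keep the board, swap the CPU
        replace_everything = BASE_BOARD + FIRST_CPU + NEW_BOARD_PLUS_CPU  # buy board + CPU all over again

        print(f"upgrade-in-place total:   {upgrade_in_place} EUR")    # 395 EUR with these numbers
        print(f"replace-everything total: {replace_everything} EUR")  # 510 EUR with these numbers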

  • OG_Zorvan (Fresno, CA) Posts: 615 Member

    The problem I have with this is that it would just be like those motherboard+CPU combo deals you see on Newegg, Best Buy, etc.

    I never buy the combo deals because inevitably I like the motherboard features but the CPU sucks, or I like the CPU but the motherboard has fewer features than the cardboard box it ships in.

    Also, I'm usually able to still beat the prices of those combos by shopping around for the CPU and motherboard individually.

    And finally, because sometimes you do end up with a bad CPU, and having to yank the entire motherboard just to replace a burnt CPU does not appeal to me at all.

    One of the major reasons Apple PCs got left in the dust years ago. You couldn't upgrade individual components, as Apple/IBM would glue, solder, and tape down anything and everything; even the damn RAM sticks were glued in.

    What's next after integrated CPU/MB? CPU/MB/RAM? Then CPU/MB/RAM/GPU?

    No thanks.

    Edit: Don't make me start printing my own circuit boards again, dammit.

    EA CEO John Riccitiello's on future microtransactions: "When you are six hours into playing Battlefield and you run out of ammo in your clip, and we ask you for a dollar to reload, you're really not very price sensitive at that point in time...We're not gouging, but we're charging."

  • cronius77 (Fairfax, VA) Posts: 1,347 Member Uncommon
    The last I've read about AMD, anyway, was that they were moving away from CPUs for computers and moving development into handhelds: http://www.tomshardware.com/news/amd-ultramobile-tablet-apu-cpu,18546.html
  • TheLizardbones (Arkham, VA) Posts: 10,910 Member


    Originally posted by Reizla
    Originally posted by Ridelynn
    Originally posted by Cabaloc
    http://www.tomshardware.com/news/Socketed-LGA-packaging-BGA-packaging-Kaveri-Chris-Cook,19570.html
    Came across this and something makes me curious: preventing the consumer from upgrading the CPU because it is soldered onto the motherboard. OK, what's going on here, and what direction are we going in?
    And if I'm honest, I rarely build a computer and later upgrade the CPU without a motherboard upgrade to go with it (usually because of socket incompatibility).
    So far, I've upgraded at least 3 CPUs since I started building PCs in 1992. My latest upgrade was earlier this year, from an AMD X3 to an X6. If the choice is up to me, I'd rather buy a more expensive motherboard with support for (future) CPUs than be forced to buy some crap that a board builder thinks I might like. Honestly, it's much cheaper to buy a slightly more expensive board (mine was €30 more than the standard one) and upgrade later at a lower price (the second X6 was €125) than to buy a whole mainboard with CPU twice and pay more in the end...


    AMD has no plans to go with Intel's socket scheme. Someday you'll be able to go from an X12 to an X24 with no problem. :-)

    I can not remember winning or losing a single debate on the internet.

  • Quizzical Posts: 14,763 Member Uncommon

    I doubt that Intel will entirely abandon socketed CPUs, either.  I wouldn't be at all surprised if they end up having several bins in a generation that are BGA-only, but still offer socketed versions of the few bins that enthusiasts tend to care about.  If, among Ivy Bridge processors, the Core i3-3220 (which isn't competitive with AMD this generation, but Intel might be competitive at that price point in future generations), Core i5-3570K, and Core i7-3770K had LGA 1155 versions, while the rest were BGA-only, would that really be so troublesome?  On New Egg, the 3570K has 708 reviews, the 3770K has 417 reviews, and the other 13 Ivy Bridge processors all added together only have 482 reviews.

    There are two things that I'd find problematic about abandoning sockets entirely.  First, I don't want to have to replace a $200 CPU that is still perfectly good just because a $100 motherboard broke.  I'm less worried about the other way around, as processors are pretty durable.  Second, no sockets will mean vastly fewer feature combinations.  You'll probably still find a good number of options on the two or three bins that matter, but if you want to go off the beaten path a bit, good luck.

    Ever try to find a sub-$1000 laptop with a sensible hardware configuration?  They're few and far between, and if you want something that is not merely sensible for someone, but sensible for you personally, you're likely to be out of luck.  I really don't want desktops to go down that route.

    Processor upgrades may well become less important in the future. When was the last time Intel released processors that were a sensible upgrade for people with an older motherboard? You could make a case for Gulftown in 2010, but in the sub-$500 market (which is nearly the whole thing from a volume standpoint), you'd have to go back to Wolfdale and Yorkfield in 2008. As more functions get integrated into the same die as the CPU, the likelihood of needing a new socket to accommodate the new functions increases. If every generation would need a new socket anyway, what use is the option of a processor upgrade?

  • Teala (Somewhere) Posts: 7,430 Member Uncommon

    I read the article, and it stated that though Intel is moving toward BGA packaging of CPUs, Intel has not commented on whether they will continue to support socketed CPUs. AMD says they will, because there is still a huge market for high-end gaming rigs. I suspect Intel sees it the same way. It is easier for them to make mobos with the CPUs as part of them for mass-market computers (laptops, tablets, cheap desktops), but I think they see that there is still a need for socketed CPUs, for the simple fact that there are still massive numbers of gamers who use custom/home-built PCs.

    I haven't bought a whole computer (pre-built) since, I think, 1999. It is easier to buy the parts and put it together yourself. You get what you want that way for a lot less cost. I think there will always be a good-sized market for high-end gaming rigs. I just wish to see them get smaller and become more modular. Imagine a home gaming rig not only being able to stack vid cards, like we do now, but CPUs as well. We'd all have little supercomputers for gaming. ^_^

     

  • Ridelynn (Fresno, CA) Posts: 4,160 Member Uncommon


    Originally posted by lizardbones
    By itself, I don't think this is too big a deal. I would think motherboard manufacturers aren't happy, because they would have to pick the CPU to put on a motherboard before they sell it, and if they pick the wrong one and stock up, they lose a bunch of money. If they wait for orders to come in, they run the risk of not supplying enough parts. I guess they have this issue now, but it would be even worse.

    It would be funny if, instead of making PCs more standardized and giving Intel more control, it actually did the opposite. People would buy motherboards, but it would just be a board with the CPU attached and end users bolted that board onto a mainboard that had everything else.

    Well, I don't think it's that drastic. Think about the standard gaming computer - really, there are only a handful of current-generation CPUs that even get considered, despite the high number of SKUs of different CPUs.

    Today, if you're buying a computer, you get:
    Core i7 3770K (high budget)
    Core i5 3570K (mid budget)
    AMD FX-6300 (low budget)
    AMD A10-5700 (super budget)

    And then the motherboards are not that diverse either - despite the sheer number available, really it comes down to:
    A) Which brand you want
    B) How heavy-duty you want the power circuitry to be
    C) Peripheral options (RAID controllers, built-in I/O, SLI/CFX, etc.) - most of which can be added on after the fact.

    You may consider other CPUs for core count or TDP for more specific purposes, but the vast majority of current CPU SKUs are for niche markets: high-core-count servers, laptop bins, small-form-factor ULV low-TDP bins, OEM bins, etc. that just aren't appealing to someone building their own computer, and really aren't necessary to market.

    As for your last paragraph, that reminds me a lot of the old Pentium II SEC design, where the CPU was sold on a daughtercard and plugged into the motherboard via a slot similar to a PCI slot.

  • SoulStain (Tampa, FL) Posts: 202 Member
    By the time I'm ready to upgrade my CPU, I'm ready to upgrade the motherboard for various reasons, including the fact that my old MB won't accept the CPU I want. I really wouldn't have an issue with soldering the CPU to the MB.
  • Ridelynn (Fresno, CA) Posts: 4,160 Member Uncommon


    Originally posted by OG_Zorvan
    One of the major reasons Apple PCs got left in the dust years ago. You couldn't upgrade individual components, as Apple/IBM would glue, solder, and tape down anything and everything; even the damn RAM sticks were glued in.

    Ironically enough, Apple is in the top 5 in the US in PC sales (by volume) and the top 10 globally - considering they only sell Macintosh PCs, they only have a few models available, and they still glue, solder, and tape everything down (I would say even worse today than it used to be). And that doesn't consider anything with iOS on it, which makes up the vast majority of Apple's volume.

    I wouldn't cite hardware upgradeability as any reason Apple was "left in the dust". In fact, I'd argue that today Apple is hardly in the dust at all: in the Global Top 100 by revenue, and the biggest company of all - #1 in the world - by total value for 2012.

  • Quizzical Posts: 14,763 Member Uncommon
    Originally posted by Ridelynn


    Originally posted by OG_Zorvan
    One of the major reasons Apple PCs got left in the dust years ago. You couldn't upgrade individual components, as Apple/IBM would glue, solder, and tape down anything and everything; even the damn RAM sticks were glued in.


    Ironically enough, Apple is in the top 5 in the US in PC sales (by volume) and the top 10 globally - considering they only sell Macintosh PCs, they only have a few models available, and they still glue, solder, and tape everything down (I would say even worse today than it used to be). And that doesn't consider anything with iOS on it, which makes up the vast majority of Apple's volume.

    I wouldn't cite hardware upgradeability as any reason Apple was "left in the dust". In fact, I'd argue that today Apple is hardly in the dust at all: in the Global Top 100 by revenue, and the biggest company of all - #1 in the world - by total value for 2012.

    Desktops aren't what drives Apple's value. If you restrict to devices with processor sockets (basically desktops and servers), then Apple has very much been left in the dust. What is the volume on Mac Pro sales again? Apparently it's low enough that Apple hasn't bothered to update the line in more than two years.

    iOS is for phones and tablets, and there, Apple's competitors can't offer customizability or upgradeability, either.  It's not a matter of choosing not to; it's that the form factor doesn't allow it.  It's a different market from the one relevant to this discussion.  Ford sells a lot of cars, but I wouldn't buy a computer they built.

  • fenistil (Gliwice) Posts: 3,005 Member
    Originally posted by lizardbones


    Originally posted by fenistil
    I see it as a big deal. This might mean that mid-range processors will be paired with mid-range-priced motherboards, and higher-end processors (like today's i5 or i7) will be paired with very expensive motherboards, leaving the consumer no choice and increasing the price of the whole PC. Also, competition in the motherboard market will be smaller.

    For me as a consumer, I see only downsides. I will not support it. If I have a choice, like AMD chips or maybe, in the more distant future, ARM desktops, I will show Intel my middle finger if they go with that.


    Intel isn't operating in a closed system. There are motherboard manufacturers and OEMs that will be participating. I don't see anyone abandoning Intel for desktops, unless AMD creates some kind of miracle processor, but the manufacturers are going to have to compete with each other. It would largely be based on the CPU speed, with the motherboard taking a backseat. If one manufacturer has an i5-3570 CPU $10 cheaper than another manufacturer, then they'll probably sell more parts, and they won't lose money on the margin because they'll pair it with a lower-end motherboard.

     

    Now those manufacturers will have to get into tighter deals with Intel, as they will depend on Intel even more for the production of motherboards. You don't know what this cooperation will look like. It might be that those producers end up as little more than de facto suppliers for Intel. Also, Intel produces its own motherboards, and it will now be much easier for Intel to gain additional share in this market.

    Aside from that, BGA is far LESS reliable and more short-lived. BGA is what is used in laptops for CPUs, GPUs, and other things. It has failed on me multiple times in my gaming laptops, especially since they stopped using lead and there is no viable replacement for it, so 'dry joints' will just keep happening.

  • Ryowulf (Greensburg, PA) Posts: 668 Member

    FYI, I got a Core i7 3770, since I don't plan on OC'ing. If that choice is taken away from me, I'll end up paying more for features I won't be using, or paying less but going without the features I want.

    Overall I don't think it's a huge deal, but I will miss being able to mix and match my CPU and MB.

  • Ridelynn (Fresno, CA) Posts: 4,160 Member Uncommon


    Originally posted by Quizzical

    Originally posted by Ridelynn

    Originally posted by OG_Zorvan
    One of the major reasons Apple PCs got left in the dust years ago. You couldn't upgrade individual components, as Apple/IBM would glue, solder, and tape down anything and everything; even the damn RAM sticks were glued in.

    Ironically enough, Apple is in the top 5 in the US in PC sales (by volume) and the top 10 globally - considering they only sell Macintosh PCs, they only have a few models available, and they still glue, solder, and tape everything down (I would say even worse today than it used to be). And that doesn't consider anything with iOS on it, which makes up the vast majority of Apple's volume. I wouldn't cite hardware upgradeability as any reason Apple was "left in the dust". In fact, I'd argue that today Apple is hardly in the dust at all: in the Global Top 100 by revenue, and the biggest company of all - #1 in the world - by total value for 2012.

    Desktops aren't what drives Apple's value. If you restrict to devices with processor sockets (basically desktops and servers), then Apple has very much been left in the dust. What is the volume on Mac Pro sales again? Apparently it's low enough that Apple hasn't bothered to update the line in more than two years.

    iOS is for phones and tablets, and there, Apple's competitors can't offer customizability or upgradeability, either. It's not a matter of choosing not to; it's that the form factor doesn't allow it. It's a different market from the one relevant to this discussion. Ford sells a lot of cars, but I wouldn't buy a computer they built.


    ...

    Apple is in the top 5 in the US in PC sales (by volume) and the top 10 globally... And that doesn't consider anything with iOS on it...

    Desktops aren't the major item that drives Apple, but they're still very significant. Significant enough to be ranked, even though the parts are glued in, many parts are proprietary, and yes, some models don't get updated nearly as fast as technology evolves.

    Just because you wouldn't buy one doesn't make them irrelevant. Right now, of all computer retailers, Apple is driving the entire industry in industrial design and the "walled garden" concept (which I dislike, but I can't disregard), and has for many years. Love them or hate them, you'll end up using something that Apple brought to market one way or another, even if you loathe them.

    Check out the Samsung Series 9, the Asus K56CM, the Acer S7, the latest Google Chromebook... Where did I see all those before? Oh yeah, in 2008, when Steve Jobs first pulled one out of a manila envelope and called it a MacBook Air.

    Who single-handedly drove the industry to Gorilla Glass? Apple. Before the iPhone, it was a shelved product with no real application. Now it's on nearly every smart phone and many flatscreens.

    Can you find a smartphone that looks like a BlackBerry anymore? How about one that looks like an iPhone? Yup - all of them, including newer BlackBerries.

    How many tablets resemble the Microsoft Tablet PC? I can think of one - Surface Pro. How many resemble the iPad in design and architecture... which strongly resembles the Newton in design and architecture, which was first introduced in 1993? Pretty much all of them.

    Just like Ford - you may not like the products, but Apple's influence is far-reaching, even in the PC market, and goes well beyond their Mac Pro line that is in dire need of a refresh. For a line that hasn't had a proper refresh in several years, they still seem to be pushing along just fine. There has been a lot of speculation that Apple is looking to get out of the PC market entirely - they already dropped their server line (it was split between the Mac Pro and Mac Mini) - and the iMac and Minis have been glorified laptops in the last few iterations (especially so in the current thin model). But if you are talking PC, you have to include those as well - the all-in-ones and the laptops. It's not just midtowers that enthusiasts like to work with - it's all x86, and those are all still socketed.

  • Ridelynn (Fresno, CA) Posts: 4,160 Member Uncommon


    Originally posted by fenistil

    Aside from that, BGA is far LESS reliable and more short-lived. BGA is what is used in laptops for CPUs, GPUs, and other things. It has failed on me multiple times in my gaming laptops, especially since they stopped using lead and there is no viable replacement for it, so 'dry joints' will just keep happening.

    I have a funny feeling that BGA failed on you not because the packaging sucks, but because the parts were subject to too many thermal stresses. Laptops, particularly gaming laptops, are notorious for packing too much heat into too cramped a space. That is little fault of the packaging - something is bound to break when parts heat up that much, that fast, and can't stay properly cooled. It's not the peak temperature that kills it; it's the rapid fluctuation of temperature. That kills all solder joints, and it is especially brutal on small, sensitive spots. If it's not a BGA solder point, it could easily be a reflow joint someplace else, or the silicon degrading, or anything.

  • fatboy21007 (Triadelphia, WV) Posts: 409 Member
    First people scream doom over 12-21-12, then the Twinkies go extinct, and now AMD and Intel consider taking away my ability to change motherboards and CPUs! Aww hell no! Get the pitchforks, gather the army, 'tis time a war starts!