Intel CPU with AMD GPU in the same package?

Quizzical Member EpicPosts: 18,074
There have been some rumors lately that this was going to happen.  It perhaps started at HardOCP last December:

http://www.hardocp.com/news/2016/12/06/amd_licensing_radeon_graphics_to_intel63

And now the same source says that it's not merely going to happen, but will happen this year:

https://hardforum.com/threads/from-ati-to-amd-back-to-ati-a-journey-in-futility-h.1900681/page-72#post-1042797289

So does this make sense?  If it's a one-time thing, I don't think it does.  But if it lets Intel shut down their GPU division entirely, that's a different story.

Intel has spent a ton of money over the years to develop their own GPU architectures.  It's not just fabricating the chips that is the problem.  It costs a lot of money to design the architecture, make particular chips using it and fix problems, license patents from the major GPU vendors, and write drivers.  Intel has spent a lot of money doing this, with results ranging from rather bad to shockingly awful.

If they license someone else's GPU, all those development costs go away.  They probably get a much better product, and provided that they pay substantially less than their own development costs would have been but substantially more than nothing (so that whoever sells the GPU makes money, too), it can be a win-win.  Intel did this in the past with one generation of Atom CPUs licensing an Imagination GPU.

Plenty of cell phone chip manufacturers license a GPU, generally from ARM or Imagination.  And that's in the same SoC as the CPU, even.  The rumor here is a multi-chip module, with an Intel CPU and an AMD GPU as separate chips in the same package.  Intel has done this in the past with Clarkdale, and also had multi-chip modules for pure CPU products, most notably the Core 2 Quad.  AMD has also done multi-chip modules with a number of CPUs, generally all of their high end server chips from Magny-Cours onward.

So why license AMD's GPU in particular?  The markets Intel CPUs target tend to ask for higher GPU performance than cell phones provide.  AMD and Nvidia are the only two proven GPU vendors for high performance GPUs, though Imagination would probably claim that they could offer the performance Intel needs, too.  AMD and Nvidia also happen to conveniently write drivers for Windows and Linux, so there's virtually no additional driver creation cost.  So why AMD and not Nvidia?  I don't know, but it could plausibly be that AMD offered a better price.

So why would AMD do this?  Don't they want a GPU advantage to drive sales of Raven Ridge?  Suppose you're AMD and you have a choice: make $50 in profit on every CPU you sell and nothing on every CPU Intel sells, or make $50 on every CPU you sell plus $5 on every CPU Intel sells.  Even if the deal means you end up with 12% market share rather than 15% because some of the people who care about the integrated GPU buy Intel, you still come out ahead.  This could also greatly reduce the risk to AMD, and keep them in business even if their CPU side has another hiccup.  AMD has been actively looking for ways to monetize their GPU IP, and this is one such way.
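The back-of-the-envelope trade-off above can be sketched in a few lines (all figures here are the hypothetical ones from this paragraph, not real AMD financials):

```python
# Hypothetical licensing trade-off: $50 profit per AMD-sold CPU,
# a $5 royalty per Intel-sold CPU under the deal, illustrative shares.

def amd_profit(amd_units, intel_units, amd_margin, royalty):
    """AMD's profit on a market of amd_units + intel_units CPUs sold."""
    return amd_units * amd_margin + intel_units * royalty

no_deal   = amd_profit(15, 85, 50, 0)  # 15% share, no royalty from Intel
with_deal = amd_profit(12, 88, 50, 5)  # 12% share, $5 per Intel CPU
print(no_deal, with_deal)  # 750 1040
```

Even after ceding three points of market share, the royalty on the other 88 units more than makes up the difference.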

Does Intel want to be dependent on AMD for their GPUs?  Of course not, but even with this deal, they wouldn't be over the long term, even if they shut down their own GPU division.  A multi-chip module means that you could license an AMD GPU one generation and an Nvidia GPU the next--or ARM or Imagination or Qualcomm.  Game consoles have no problem with bouncing between GPU vendors from one generation to the next.  Apple does the same thing.

Another possibility that I'd like to introduce as my own speculation is that it could be Apple driving this.  Apple has long been unhappy with Intel GPUs and AMD CPUs, for obvious enough reasons.  If they threatened to ditch x86 in favor of ARM unless something like this happened, it could provide the impetus to get something done.  Of course, if that's what happened, the chip might end up being Mac-exclusive.

It's all speculation and rumors at this point, so I don't know if this is actually going to happen.  But it will be interesting to see if it does.

Comments

  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    edited February 7
    Seems like we talked about this before, but maybe it was over on a different forum, as I can't readily find it here.

    I'd be surprised if Apple were the driving force here, but it wouldn't be the first time (Crystalwell). Apple could just about jump the Intel ship entirely and move to their own CPU without a huge impact on typical productivity if they wanted to.  And OS X machine sales amount to pretty much a rounding error in total PC sales (as much as I like Apple, that's the fact).

    I would suspect the bigger driving force is the HPC/AI market, in which nVidia currently looks more or less unchallenged and which Intel doesn't want to cede without a fight. Phi is pretty much where the Intel accelerated IGP was born (they had IGP before, but it was extremely basic and didn't support gaming).
  • filmoret Palm Bay, FL Member EpicPosts: 4,906
    It would make sense if Intel actually built the computers.  But they only provide the parts.
    Are you onto something or just on something?
  • Quizzical Member EpicPosts: 18,074
    The rumor linked above says that the first product will be "positioned in the entry-to-mid level performance segment".  That's useless for HPC: in the HPC world, node and interconnect overhead are tremendously expensive, so you need to pack a ton of compute into every node.

    While Larrabee was originally supposed to eventually make its way into Intel GPUs, Intel seems to have abandoned that idea pretty quickly.  I suspect that Intel looked at performance numbers and saw that they got destroyed by AMD and Nvidia and gave up.  The modern Xeon Phi parts are basically Intel Atom cores plus wider AVX instructions, which is a really terrible way to do graphics.  It's not just that the fixed-function graphics hardware like tessellation and rasterization isn't there.  It's that the cache hierarchy is all wrong for graphics, e.g., having to go to L2 cache for things that any modern GPU would keep in registers and never touch a higher level of cache.

    Xeon Phi is also a terrible way to do most other embarrassingly parallel algorithms, but that's a different story.
  • filmoret Palm Bay, FL Member EpicPosts: 4,906
    You did way too much thinking without reading about it.  Intel was already in a contract with Nvidia, and that ended in 2017.  So the rumor is they signed on with AMD this time instead of going with Nvidia again.  It's nothing new; it's the same thing they've been doing for a few years now.  They are just switching from Nvidia to AMD.  So the rumors say, anyway.

    Doesn't make sense considering AMD is a competitor and Nvidia is not.
  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    edited February 7
    filmoret said:
    You did way too much thinking without reading about it.  Intel was already in a contract with Nvidia, and that ended in 2017.  So the rumor is they signed on with AMD this time instead of going with Nvidia again.  It's nothing new; it's the same thing they've been doing for a few years now.  They are just switching from Nvidia to AMD.  So the rumors say, anyway.

    Doesn't make sense considering AMD is a competitor and Nvidia is not.
    Actually, I've heard the same thing: that it's just patent licensing so Intel can keep producing their own GPUs.
  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    Finally found the previous discussion, @bestever talks about it here:

    http://forums.mmorpg.com/discussion/456132/amd-makes-a-bold-move/p8
  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    edited February 7
    Quizzical said:
    The rumor linked above says that the first product will be "positioned in the entry-to-mid level performance segment".  That's useless for HPC: in the HPC world, node and interconnect overhead are tremendously expensive, so you need to pack a ton of compute into every node.

    While Larrabee was originally supposed to eventually make its way into Intel GPUs, Intel seems to have abandoned that idea pretty quickly.  I suspect that Intel looked at performance numbers and saw that they got destroyed by AMD and Nvidia and gave up.  The modern Xeon Phi parts are basically Intel Atom cores plus wider AVX instructions, which is a really terrible way to do graphics.  It's not just that the fixed-function graphics hardware like tessellation and rasterization isn't there.  It's that the cache hierarchy is all wrong for graphics, e.g., having to go to L2 cache for things that any modern GPU would keep in registers and never touch a higher level of cache.

    Xeon Phi is also a terrible way to do most other embarrassingly parallel algorithms, but that's a different story.
    Phi may suck, but it started life as Larrabee, intended for graphics. So Larrabee never made it into a graphics processor, but it did make it into the HPC arena. Intel IGP is still getting destroyed by AMD and nVidia, and yet they are still by far the largest provider of graphics on PCs in the world. So being last in a 3-man race doesn't seem to deter them from creating products, or from successfully installing them in people's computers.

    Actually, that sounds familiar. Didn't nVidia start out with something made for graphics first, then move on to HPC and AI and such that can leverage SIMD....

    So if all this HPC stuff is starting out as graphics, wouldn't it make sense to have your first cross-license deal with a graphics company also start out with... graphics? And then migrate it to HPC, since that seems to be how the evolution of pretty much everything HPC has gone (except IBM or the new Chinese supercomputers; no idea what they are using, honestly).

    Now, that's just as much speculation as saying Apple is driving the deal; honestly, I think it's more to do with patents for general GPU architecture than anything. But it's fun to speculate.
  • Cleffy San Diego, CA Member RarePosts: 5,626
    I find it implausible that AMD will provide GPU chips to Intel. However, I can see Intel making a licensing agreement with AMD for patents, like they previously did with nVidia. It actually makes more sense for Intel to license AMD patents, as the companies already share a lot of licensing agreements, and AMD's patents make more sense for Intel GPUs because AMD's technology is more open than nVidia's.
    It would definitely be interesting if AMD designed Intel GPUs. It would help secure AMD financially. It may also turn out better for Intel, as they would keep an uncompetitive AMD as a competitor instead of facing a company with more resources or dealing with FTC mandates.
  • GladDog Pottstown, PA Member RarePosts: 836
    If you look at Nvidia's history, you see over and over again that the CEO of Nvidia is a jerk that finds new ways to piss off people every time they negotiate a deal.  He may be an exceptional and successful businessman, but he seems to revel in finding ways to piss off people.  Even though AMD is a competitor, Intel may have found that the crow they had to swallow was a lot tastier than dealing with Nvidia.

    Before AMD purchased ATI, they were in a strong merger negotiation with Nvidia.  At the time, it looked like the AMD/Nvidia merger would make a lot more sense.  AMD had great ties with Nvidia, and had partnered in several successful ventures.  ATI was working with Intel on a couple of projects, and if AMD purchased Nvidia that would have overcome a huge anti-trust hurdle allowing Intel to buy ATI.  But the CEO of Nvidia made some unacceptable demands that once again proved his jerkiness, causing the whole deal to fall apart.  AMD quickly made an offer for ATI that was probably pretty favorable to ATI stockholders (since it was accepted very quickly), and the rest is history.


    The world is going to the dogs, which is just how I planned it!


  • filmoret Palm Bay, FL Member EpicPosts: 4,906
    Cleffy said:
    I find it implausible that AMD will provide GPU chips to Intel. However, I can see Intel making a licensing agreement with AMD for patents, like they previously did with nVidia. It actually makes more sense for Intel to license AMD patents, as the companies already share a lot of licensing agreements, and AMD's patents make more sense for Intel GPUs because AMD's technology is more open than nVidia's.
    It would definitely be interesting if AMD designed Intel GPUs. It would help secure AMD financially. It may also turn out better for Intel, as they would keep an uncompetitive AMD as a competitor instead of facing a company with more resources or dealing with FTC mandates.
    Their previous deal with Nvidia was for $66 million a year.  I don't see them spending more than that with AMD, but you never know.

    GladDog said:
    If you look at Nvidia's history, you see over and over again that the CEO of Nvidia is a jerk that finds new ways to piss off people every time they negotiate a deal.  He may be an exceptional and successful businessman, but he seems to revel in finding ways to piss off people.  Even though AMD is a competitor, Intel may have found that the crow they had to swallow was a lot tastier than dealing with Nvidia.

    Before AMD purchased ATI, they were in a strong merger negotiation with Nvidia.  At the time, it looked like the AMD/ Nvidia merger would make a lot more sense.  AMD had great ties with Nvidia, and had partnered in several successful ventures.  ATI was working with Intel on a couple of projects, and if AMD purchased Nvidia that would have overcome a huge anti-trust hurdle allowing Intel to buy ATI.  But the CEO of Nvidia made some unacceptable demands that once again proved his jerkiness, causing the whole deal to fall apart.  AMD quickly made an offer for ATI that was probably pretty favorable to ATI stockholders (since it was accepted very quickly), and the rest is history.
    IDK, man; you're trying to say AMD could merge with Nvidia and somehow the CEO made unreasonable demands?  One company was worth $4 billion and the other was worth over $15 billion.
  • GladDog Pottstown, PA Member RarePosts: 836
    edited February 7
    filmoret said:
    Cleffy said:
    I find it implausible that AMD will provide GPU chips to Intel. However, I can see Intel making a licensing agreement with AMD for patents, like they previously did with nVidia. It actually makes more sense for Intel to license AMD patents, as the companies already share a lot of licensing agreements, and AMD's patents make more sense for Intel GPUs because AMD's technology is more open than nVidia's.
    It would definitely be interesting if AMD designed Intel GPUs. It would help secure AMD financially. It may also turn out better for Intel, as they would keep an uncompetitive AMD as a competitor instead of facing a company with more resources or dealing with FTC mandates.
    Their previous deal with Nvidia was for $66 million a year.  I don't see them spending more than that with AMD, but you never know.

    GladDog said:
    If you look at Nvidia's history, you see over and over again that the CEO of Nvidia is a jerk that finds new ways to piss off people every time they negotiate a deal.  He may be an exceptional and successful businessman, but he seems to revel in finding ways to piss off people.  Even though AMD is a competitor, Intel may have found that the crow they had to swallow was a lot tastier than dealing with Nvidia.

    Before AMD purchased ATI, they were in a strong merger negotiation with Nvidia.  At the time, it looked like the AMD/ Nvidia merger would make a lot more sense.  AMD had great ties with Nvidia, and had partnered in several successful ventures.  ATI was working with Intel on a couple of projects, and if AMD purchased Nvidia that would have overcome a huge anti-trust hurdle allowing Intel to buy ATI.  But the CEO of Nvidia made some unacceptable demands that once again proved his jerkiness, causing the whole deal to fall apart.  AMD quickly made an offer for ATI that was probably pretty favorable to ATI stockholders (since it was accepted very quickly), and the rest is history.
    IDK, man; you're trying to say AMD could merge with Nvidia and somehow the CEO made unreasonable demands?  One company was worth $4 billion and the other was worth over $15 billion.
    The numbers I saw when that merger was on the table showed the two companies with approximately equal value, with AMD being slightly more valuable.  That has changed since then of course.  That was during the heyday of the Athlon and Athlon X2.




  • filmoret Palm Bay, FL Member EpicPosts: 4,906
    GladDog said:
    filmoret said:
    Cleffy said:
    I find it implausible that AMD will provide GPU chips to Intel. However, I can see Intel making a licensing agreement with AMD for patents, like they previously did with nVidia. It actually makes more sense for Intel to license AMD patents, as the companies already share a lot of licensing agreements, and AMD's patents make more sense for Intel GPUs because AMD's technology is more open than nVidia's.
    It would definitely be interesting if AMD designed Intel GPUs. It would help secure AMD financially. It may also turn out better for Intel, as they would keep an uncompetitive AMD as a competitor instead of facing a company with more resources or dealing with FTC mandates.
    Their previous deal with Nvidia was for $66 million a year.  I don't see them spending more than that with AMD, but you never know.

    GladDog said:
    If you look at Nvidia's history, you see over and over again that the CEO of Nvidia is a jerk that finds new ways to piss off people every time they negotiate a deal.  He may be an exceptional and successful businessman, but he seems to revel in finding ways to piss off people.  Even though AMD is a competitor, Intel may have found that the crow they had to swallow was a lot tastier than dealing with Nvidia.

    Before AMD purchased ATI, they were in a strong merger negotiation with Nvidia.  At the time, it looked like the AMD/ Nvidia merger would make a lot more sense.  AMD had great ties with Nvidia, and had partnered in several successful ventures.  ATI was working with Intel on a couple of projects, and if AMD purchased Nvidia that would have overcome a huge anti-trust hurdle allowing Intel to buy ATI.  But the CEO of Nvidia made some unacceptable demands that once again proved his jerkiness, causing the whole deal to fall apart.  AMD quickly made an offer for ATI that was probably pretty favorable to ATI stockholders (since it was accepted very quickly), and the rest is history.
    IDK, man; you're trying to say AMD could merge with Nvidia and somehow the CEO made unreasonable demands?  One company was worth $4 billion and the other was worth over $15 billion.
    The numbers I saw when that merger was on the table showed the two companies with approximately equal value, with AMD being slightly more valuable.  That has changed since then of course.  That was during the heyday of the Athlon and Athlon X2.
    Hmm, looking at financials, it appears AMD was worth around $40 billion in 2006.  Nvidia was worth about $20 billion back then.  Regardless, it was a good move for Nvidia, I would say, considering the outcome.  Nvidia is currently worth over $68 billion and has been on a steady climb.  Meanwhile AMD has had a very rocky road, which is very scary for investors.
  • Ozmodan Hilliard, OH Member RarePosts: 8,637
    edited February 7
    Well, if someone asks me about a laptop and it has an Intel GPU, I laugh at them.  It would behoove Intel to stop wasting resources on GPU design and align with one of the major GPU makers.  Even AMD's CPU/GPU chips make Intel look bad when it comes to games.  I think it would sell more of their chips and save them all that design money.

    As to the thought that Apple might move away from Intel, I tend to discount that because I know a lot of people who buy Apple because they can run windows on their machines for games when they need to.  If Apple were to do that it would hurt their PC sales significantly.

    Some of you don't seem to grasp that Huang, Nvidia's CEO, is compared to Larry Ellison a lot when it comes to business dealings.  I have seen many comments that dealing with Nvidia is like dealing with a viper's nest.
  • GladDog Pottstown, PA Member RarePosts: 836
    edited February 9
    filmoret said:
    GladDog said:
    filmoret said:
    Cleffy said:
    I find it implausible that AMD will provide GPU chips to Intel. However, I can see Intel making a licensing agreement with AMD for patents, like they previously did with nVidia. It actually makes more sense for Intel to license AMD patents, as the companies already share a lot of licensing agreements, and AMD's patents make more sense for Intel GPUs because AMD's technology is more open than nVidia's.
    It would definitely be interesting if AMD designed Intel GPUs. It would help secure AMD financially. It may also turn out better for Intel, as they would keep an uncompetitive AMD as a competitor instead of facing a company with more resources or dealing with FTC mandates.
    Their previous deal with Nvidia was for $66 million a year.  I don't see them spending more than that with AMD, but you never know.

    GladDog said:
    If you look at Nvidia's history, you see over and over again that the CEO of Nvidia is a jerk that finds new ways to piss off people every time they negotiate a deal.  He may be an exceptional and successful businessman, but he seems to revel in finding ways to piss off people.  Even though AMD is a competitor, Intel may have found that the crow they had to swallow was a lot tastier than dealing with Nvidia.

    Before AMD purchased ATI, they were in a strong merger negotiation with Nvidia.  At the time, it looked like the AMD/ Nvidia merger would make a lot more sense.  AMD had great ties with Nvidia, and had partnered in several successful ventures.  ATI was working with Intel on a couple of projects, and if AMD purchased Nvidia that would have overcome a huge anti-trust hurdle allowing Intel to buy ATI.  But the CEO of Nvidia made some unacceptable demands that once again proved his jerkiness, causing the whole deal to fall apart.  AMD quickly made an offer for ATI that was probably pretty favorable to ATI stockholders (since it was accepted very quickly), and the rest is history.
    IDK, man; you're trying to say AMD could merge with Nvidia and somehow the CEO made unreasonable demands?  One company was worth $4 billion and the other was worth over $15 billion.
    The numbers I saw when that merger was on the table showed the two companies with approximately equal value, with AMD being slightly more valuable.  That has changed since then of course.  That was during the heyday of the Athlon and Athlon X2.
    Hmm, looking at financials, it appears AMD was worth around $40 billion in 2006.  Nvidia was worth about $20 billion back then.  Regardless, it was a good move for Nvidia, I would say, considering the outcome.  Nvidia is currently worth over $68 billion and has been on a steady climb.  Meanwhile AMD has had a very rocky road, which is very scary for investors.
    It's good to have hindsight of course, but back then everyone was rooting for that merger.  AMD and Nvidia made some excellent collaborations, especially with SLI.  An Athlon X2 paired with 7800s in SLI was a pretty compelling rig back then.  Not too long after that merger failed and AMD acquired ATI, Intel launched the Core 2 Duo, which slammed AMD to the ground.  And Nvidia launched their 8800 GTS and GTX.

    ***ADDED AS AN EDIT***

    I've often wondered what would have happened if the AMD/Nvidia merger would have gone through.  If it had, and Intel purchased ATI, I think there would have been several very changed results.

    First off, AMD is a company that is willing to take chances more than a mega company like Intel.  When Intel released the Core 2 Duo, AMD would have greatly invested in Nvidia to recoup their losses.  Nvidia has some really top-notch engineers, and with a lot of funding and impetus, they could have choked the GPU market, pretty much owning it (back then the market share between Nvidia and ATI was a lot closer).  With the extra profit from the GPU market, they might have been able to make the CPU race a lot closer, especially if they were marketing Athlon and Phenom CPUs with an Nvidia GPU built in.

    Remember, most sales back then were desktops and laptops to non-gamers.  An Athlon II X4 with an Nvidia 8400 or 8600 built in would have been great for the masses.  It would have been top notch for the web, graphics, and movies, and even capable of gaming.  Even if the CPUs were slower, I think the graphics improvement would have been enough to keep their market share strong.  And such an APU running dual graphics with a discrete 8600 card using SLI tech would have been pretty doggone good, probably the entry level for serious gaming.

    Of course, having ATI patents and tech, along with some really good engineers who would finally be getting what they were worth from Intel, would have been a boon for Intel.  If every Intel chipset motherboard had an ATI graphics processor on it, those Core 2 Duos would have looked really, really good to consumers, way better than with the lame GMA 950 video on most i945, i965, and X31 & X35 motherboards at the time.  And when they started building video into their i3, i5 and i7 chips, if it was using ATI tech they would have been very big, maybe bigger than they ended up.

    But bottom line is I think we would have had a far more competitive market if AMD merged with Nvidia and Intel acquired ATI...




  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    Ozmodan said:

    As to the thought that Apple might move away from Intel, I tend to discount that because I know a lot of people who buy Apple because they can run windows on their machines for games when they need to.  If Apple were to do that it would hurt their PC sales significantly.

    Apple will do what is best for Apple, and that isn't to sell hardware for the purpose of Bootcamp. They are invested in their ecosystem, and while they do make Bootcamp available, it's mostly just a marketing point they can use to try to tempt people over to their ecosystem (sure, you can run Windows, but OS X really just works). 

    Right now, the only thing really keeping them from going ARM-only is that ARM isn't quite up to the performance they need for their OS X machines (yet). Apple is willing to put out a new generation that doesn't represent a speed increase (in fact, they have done generations before that are slightly slower than the previous generation), but it has to come with some other benefit to outweigh it - different design, better battery, something. The A10 started to post numbers close to Intel's own Pentium/Celeron lines, but they are still a ways from the Core lineup that Apple has been using to date.

    Apple has changed desktop processor types before (twice before, in fact: 68k -> PPC -> x86, not accounting for the fact that iOS runs on ARM). Apple has supported multiple CPU architectures in a single operating system before (and supposedly they have a branch of OS X that runs on ARM already). None of that is easy, but it's nothing that Apple hasn't done before. If/when ARM has enough advantages, Apple will jump in a heartbeat and not look back.

    The only reason Intel is even remotely interested in what Apple does (apart from the fact that they like to sell CPUs), is not because of the volume of CPUs Apple buys, but it's because where Apple goes, the rest of the industry seems to follow.
  • Ozmodan Hilliard, OH Member RarePosts: 8,637
    edited February 14
    Ridelynn said:
    Ozmodan said:

    As to the thought that Apple might move away from Intel, I tend to discount that because I know a lot of people who buy Apple because they can run windows on their machines for games when they need to.  If Apple were to do that it would hurt their PC sales significantly.

    Apple will do what is best for Apple, and that isn't to sell hardware for the purpose of Bootcamp. They are invested in their ecosystem, and while they do make Bootcamp available, it's mostly just a marketing point they can use to try to tempt people over to their ecosystem (sure, you can run Windows, but OS X really just works). 

    Right now, the only thing really keeping them from going ARM-only is that ARM isn't quite up to the performance they need for their OS X machines (yet). Apple is willing to put out a new generation that doesn't represent a speed increase (in fact, they have done generations before that are slightly slower than the previous generation), but it has to come with some other benefit to outweigh it - different design, better battery, something. The A10 started to post numbers close to Intel's own Pentium/Celeron lines, but they are still a ways from the Core lineup that Apple has been using to date.

    Apple has changed desktop processor types before (twice before, in fact: 68k -> PPC -> x86, not accounting for the fact that iOS runs on ARM). Apple has supported multiple CPU architectures in a single operating system before (and supposedly they have a branch of OS X that runs on ARM already). None of that is easy, but it's nothing that Apple hasn't done before. If/when ARM has enough advantages, Apple will jump in a heartbeat and not look back.

    The only reason Intel is even remotely interested in what Apple does (apart from the fact that they like to sell CPUs), is not because of the volume of CPUs Apple buys, but it's because where Apple goes, the rest of the industry seems to follow.
    Not true; Apple's computer products have more of a cult following than anything else.  They cost twice as much, are not upgradeable for the most part, and are generally more conservative compared to other PC options.

    As to ARM, they have a LONG way to go to be competitive with Intel CPUs.  Adding more cores generally does not help that much; it all comes down to how the software was written.  When you consider how cheap PCs are right now, putting out a low-end ARM-processor computer would just get laughed at, and of course Apple could not price it at the low end.

    Bootcamp also sells a lot of Apple computers, because writing a game for the Apple OS is often not worth the effort, so if you want to play games Bootcamp is necessary, and Apple darn well knows it.
  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    This doesn't necessarily mean that Apple's ARM is beating Intel, but it does indicate that the gap is rapidly closing:

    http://www.theverge.com/2016/9/16/12939310/iphone-7-a10-fusion-processor-apple-intel-future

    I don't think Apple cares about gaming on the Desktop, otherwise they would do things like release timely graphics driver updates, provide for better graphics performance & hardware, and support more modern versions of OpenGL and Vulkan. Instead, they seem to only care about gaming on iOS, maybe because they get a big cut there. Metal was initially for iOS, and only later was ported to OS X (mainly so that developers could do iOS->OS X ports). 

    Heck, most OS X-capable computers that are sold have no discrete graphics at all, only an Intel IGP.

    Even if you argue that Bootcamp is how Apple is providing that support, the latest Bootcamp drivers date to Aug 12, 2015.

    At one point they did. But that hasn't been the case for a long while now, at least on OS X. 
  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    Old thread but some recent developments:

    http://www.fudzilla.com/news/graphics/43663-intel-is-licensing-amd-graphics

    Still just a rumor as far as I'm concerned, as no official sources have confirmed. But it's keeping an old rumor alive.
  • Quizzical Member EpicPosts: 18,074
    Ridelynn said:
    Old thread but some recent developments:

    http://www.fudzilla.com/news/graphics/43663-intel-is-licensing-amd-graphics

    Still just a rumor as far as I'm concerned, as no official sources have confirmed. But it's keeping an old rumor alive.
    Intel has now apparently denied it:

    http://www.barrons.com/articles/intel-refutes-rumor-of-licensing-amd-graphics-technology-1495064908
  • Drevar College Station, TX Member UncommonPosts: 176
    Another misunderstood thing about the Nvidia/Intel agreement is that it didn't end in March.  What ended was the access to any NEW patents filed by Nvidia AFTER March.  All the patents Nvidia had prior will still be available to Intel forever, as far as I can tell.  It was a CYA agreement more than a real tech swapping deal.

    "If MMORPG players were around when God said, "Let their be light" they'd have called the light gay, and plunged the universe back into darkness by squatting their nutsacks over it."
    -Luke McKinney, The 7 Biggest Dick Moves in the History of Online Gaming

    "In the end, SWG may have been more potential and promise than fulfilled expectation. But I'd rather work on something with great potential than on fulfilling a promise of mediocrity."
    -Raph Koster

  • Ridelynn Fresno, CA Member EpicPosts: 6,005
    Drevar said:
    Another misunderstood thing about the Nvidia/Intel agreement is that it didn't end in March.  What ended was the access to any NEW patents filed by Nvidia AFTER March.  All the patents Nvidia had prior will still be available to Intel forever, as far as I can tell.  It was a CYA agreement more than a real tech swapping deal.
    Looks like you are correct:

    http://www.marketwatch.com/story/intel-and-amd-license-rumors-should-finally-be-dead-2017-05-22