
Intel to launch a CPU with an integrated AMD GPU in the same package

Quizzical (Member, Legendary, Posts: 22,131)
edited November 2017 in Hardware
Rumors that this would happen started late last year with HardOCP insisting that it was going to happen.  The rumors lingered on for a while until Intel pointedly denied them a couple of months ago.  Well, today they've just confirmed that the rumors were right all along.

https://newsroom.intel.com/editorials/new-intel-core-processor-combine-high-performance-cpu-discrete-graphics-sleek-thin-devices/

Or if you want some tech media takes on it:

https://www.anandtech.com/show/12003/intel-to-create-new-8th-generation-cpus-with-amd-radeon-graphics-with-hbm2-using-emib

http://techreport.com/news/32792/intel-brings-a-core-cpu-and-radeon-gpu-together-on-one-package

I always expected that there would be a product with an AMD CPU, AMD GPU, and a stack of HBM2.  I didn't expect that it would be an Intel CPU connecting to an AMD GPU with some sort of PCI Express connection internal to the CPU package.

Apparently it's targeted at laptops and will have a TDP for the combined chip in the range of 35-45 W.  If you want a highly portable gaming laptop with some serious gaming power, this is the product you're looking for.  By comparison, Raven Ridge will be lower power and lower performance.

Comments

  • Wizardry (Member, Legendary, Posts: 17,882)
    I don't want to decipher what it really means; I only look at prices and factual performance.
    I don't think I have ever had a PC perform the way it was hyped to perform; it is always worse in actual use.

    Two players vying for one product can only mean an overpriced product.

    On a long-shot wild guess... is Intel thinking of buying out AMD eventually to further monopolize the industry?

    Never forget Three Mile Island, and never trust a government official or company spokesman.

  • gervaise1 (Member, Epic, Posts: 6,919)
    The industry is bigger than Intel and AMD - the latter being a tiny player.

    Samsung is number 2 after Intel; Broadcom + Qualcomm would be number 3. Then there are various underdogs like ARM (owned by SoftBank), Nvidia (Tegra), etc., some of which are also takeover targets.

    This obviously works financially for Intel and AMD, the same way that the Apple iPhone X has a Samsung screen and battery (13% by value), or LG OLED TV panels are now being used by Sony, Panasonic, and Philips.
  • Cleffy (Member, Rare, Posts: 6,254)
    Intel GPUs have always been worthless. Considering Intel's contract dispute with nVidia, AMD makes the most sense.
  • Ozmodan (Member, Epic, Posts: 9,726)
    edited November 2017
    Good for the gaming public.

    Come on, the Broadcom attempt to pick up Qualcomm will never go anywhere.  It breaks so many antitrust laws that it will not even be considered by the feds.


  • Cleffy (Member, Rare, Posts: 6,254)
    That, and Qualcomm is a bit too large to acquire; it would require a merger. I think a merger would be possible, since they would be competing with Samsung, Apple, and Intel.
  • gervaise1 (Member, Epic, Posts: 6,919)
    Ozmodan said:
    Good for the gaming public.

    Come on, the Broadcom attempt to pick up Qualcomm will never go anywhere.  It breaks so many antitrust laws that it will not even be considered by the feds.


    Maybe. They will still be smaller than Intel and Samsung, and one has to assume that Broadcom believes it will go through. No reports in the press suggest it will be stopped either; there is no "obvious" monopoly. That said, sometimes it's hard to discern the basis on which such decisions are made.
  • gervaise1 (Member, Epic, Posts: 6,919)
    What do Intel get out of this? I struggled with this, as did the writers of the articles. Then it struck me: what I was looking at was a mini-motherboard, designed and patented by Intel (who will decide what to make open source and what to charge for).

    Integration has gone hand in hand with the evolution of the PC. New components being created and over time integrated onto the motherboard e.g. no sound to sound card to on-board sound chip.

    The precise nature of the beast is unknown but in the future will we have the option to buy gpus rather than graphics cards?  
  • Grunty (Member, Epic, Posts: 8,657)
    edited November 2017
    gervaise1 said:
    What do Intel get out of this? I struggled with this as did the writers of the articles. Then it struck me - what I was looking at was a mini-motherboard. Designed and patented by Intel (who will decide what to make open source and what to charge for.)

    Integration has gone hand in hand with the evolution of the PC. New components being created and over time integrated onto the motherboard e.g. no sound to sound card to on-board sound chip.

    The precise nature of the beast is unknown but in the future will we have the option to buy gpus rather than graphics cards?  
    GPUs will still require some form of dedicated RAM, which means more slots on a motherboard.  That won't be a mini-motherboard.

    It will also limit upgradeability and that is still a big thing for graphics. Nowadays you may not want a new CPU for several years but digital graphics are not even close to being in a stable or even stagnant capability.
    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • gervaise1 (Member, Epic, Posts: 6,919)
    Grunty said:
    gervaise1 said:
    What do Intel get out of this? I struggled with this as did the writers of the articles. Then it struck me - what I was looking at was a mini-motherboard. Designed and patented by Intel (who will decide what to make open source and what to charge for.)

    Integration has gone hand in hand with the evolution of the PC. New components being created and over time integrated onto the motherboard e.g. no sound to sound card to on-board sound chip.

    The precise nature of the beast is unknown but in the future will we have the option to buy gpus rather than graphics cards?  
    GPUs will still require some form of dedicated RAM. Which means more slots on a motherboard.  That won't be a mini-motherboard. 

    It will also limit upgradeability and that is still a big thing for graphics. Nowadays you may not want a new CPU for several years but digital graphics are not even close to being in a stable or even stagnant capability.
    It looks like the HBM2 "product" will supply the dedicated memory that the GPU needs, which is why it's in close proximity. Will it supply the system RAM, though, either initially or going forward?

    As you say, "upgradeability" is an issue, at least for game players today; less so for a huge tranche of the market.

    And it might be that this product is simply intended for that tranche of the market: for people who don't need dedicated graphics cards. My idle speculation, though, was whether the design might allow people to upgrade by buying a new GPU!

    This would be simple evolution. After all, people used to buy dedicated sound cards, hard drives and CD drives that they connected to dedicated IDE or SCSI cards, dedicated network cards, dedicated wireless cards. And so forth.

    There will still have to be power connections, external ports, extra storage (presumably), etc., so maybe "card" rather than "mini-motherboard" would be a better choice of word? It looks to be a design that has all the key components, though.
  • Torval (Member, Legendary, Posts: 20,014)
    Intel could also use Optane technology and their own proprietary memory/NAND hybrids. Optane isn't the only crossover memory technology they're working on.

    I have a feeling these will be for systems that aren't as user-serviceable or upgradeable. This will be the SoC for devices. They could use these in NUCs or similar small-form-factor, low-power devices. There could be a lot of enthusiast builder opportunities here, but I think it's intended more as an OEM thing for device makers. At least that's what I've gotten out of it so far.
    Fedora - A modern, free, and open source Operating System. https://getfedora.org/

    traveller, interloper, anomaly, iteration


  • Quizzical (Member, Legendary, Posts: 22,131)
    Grunty said:
    gervaise1 said:
    What do Intel get out of this? I struggled with this as did the writers of the articles. Then it struck me - what I was looking at was a mini-motherboard. Designed and patented by Intel (who will decide what to make open source and what to charge for.)

    Integration has gone hand in hand with the evolution of the PC. New components being created and over time integrated onto the motherboard e.g. no sound to sound card to on-board sound chip.

    The precise nature of the beast is unknown but in the future will we have the option to buy gpus rather than graphics cards?  
    GPUs will still require some form of dedicated RAM. Which means more slots on a motherboard.  That won't be a mini-motherboard. 

    It will also limit upgradeability and that is still a big thing for graphics. Nowadays you may not want a new CPU for several years but digital graphics are not even close to being in a stable or even stagnant capability.
    HBM2 already has the memory in the same package as the GPU, and that's what the product is going to use.
  • Torval (Member, Legendary, Posts: 20,014)
    Quizzical said:
    Grunty said:
    gervaise1 said:
    What do Intel get out of this? I struggled with this as did the writers of the articles. Then it struck me - what I was looking at was a mini-motherboard. Designed and patented by Intel (who will decide what to make open source and what to charge for.)

    Integration has gone hand in hand with the evolution of the PC. New components being created and over time integrated onto the motherboard e.g. no sound to sound card to on-board sound chip.

    The precise nature of the beast is unknown but in the future will we have the option to buy gpus rather than graphics cards?  
    GPUs will still require some form of dedicated RAM. Which means more slots on a motherboard.  That won't be a mini-motherboard. 

    It will also limit upgradeability and that is still a big thing for graphics. Nowadays you may not want a new CPU for several years but digital graphics are not even close to being in a stable or even stagnant capability.
    HBM2 already has the memory in the same package as the GPU, and that's what the product is going to use.

    Anand explains why HBM is only for the GPU here:
    "Firstly, judging by the wording and Intel's launch video, it can basically be confirmed that EMIB is only being used between the GPU and the HBM2. The distance between the CPU and GPU is too far for EMIB, so is likely just PCIe through the package which is a mature implementation. This configuration might also help with power dissipation if the chips are further apart."

    Later on in that commentary they offer further speculation which I found interesting. The idea that this is for Apple devices makes a lot of sense.
  • Cleffy (Member, Rare, Posts: 6,254)
    edited November 2017
    I think the market placement for this makes a lot of sense. There was a time not too long ago when, below a certain price point, you should never have considered Intel, because Intel's GPUs sucked. If you were buying a $500 laptop, it would be hard to suggest the Core i5 with integrated graphics over an A10, purely because of the GPU.
    It also has two other applications due to the compute performance of the Vega chip. It can be used by businesses for demonstration purposes, and it can be placed in a farm to offload a lot of GPGPU work onto the Vega chip, maximizing efficiency.
  • Ridelynn (Member, Epic, Posts: 7,061)
    There are a lot of reasons here, but the more I think about it, the more I think there are underlying currents at work. Maybe I just have my tinfoil hat on too tightly today.

    This very well could be an Apple product - Intel has done Apple-specific configurations before (Crystalwell). They weren't exclusively provided to Apple, but they were definitely designed around Apple's specifications.

    That being said, I don't think that's a huge volume of chips. Apple is a big company now, but that isn't on the back of their x86 computer line. The overall volume of Macbooks and iMacs is still a pretty low number compared to the overall x86 market. This is a great perk, but I don't think this was a driving decision in this particular announcement. 

    There is also the talk of Apple going ARM in their OSX line. This may be a tactic being employed by Intel intended to drive Apple to reconsider that, or at the very least, delay that. I don't know for certain any more than anyone else if Apple really is intending to do that, or what timetable they have to do it, but if that's the course that Apple has chosen to chart, then there is next to nothing that Intel could do to alter that, or significantly impact the time table. Again, I don't see a high enough volume from Apple in the x86 line to really make this a decision driver, but stranger things have happened.

    I do think that it had more to do with IP. I don't know whether this is a signal that Intel has given up on graphics entirely. It could be that Intel needed AMD patents, and the cheapest way to get them was to give the illusion that you're going to integrate their product early on, do a generation or two, and float it out there without a lot of promotion. If it hits on its own merits without a huge marketing campaign, then great: that was a low-risk, high-payoff venture. If it doesn't hit, well, you have Intel's R&D department in the background working the entire time to replace Nvidia-patented items with AMD-patented items, and you re-release Intel HD Graphics and Larrabee 2.0 in a couple-three years. And the price of integrating that one or two generations (and giving the illusion of cooperation) up front was less than the price of purchasing the IP outright.

    Then there is the Raja factor. Raja Koduri has been in the graphics field for a long time. He came from ATI/AMD graphics as CTO, and in 2009 went to Apple and worked on their GPU/Retina transition. Then he went back to be VP of AMD's Radeon Technologies Group in 2013, and was largely the person responsible for pulling AMD over to HBM memory, and has had a very large influence on Vega (and perhaps more so, Navi). He announced a sabbatical from AMD about a month ago. Then, out of the blue on Tuesday, he resigns from AMD.

    The very next day, Intel releases a press release saying Raja is now working at Intel, and Intel is back to pushing a discrete graphics product. Pundits buzzed about this being specifically targeted at Nvidia, and there's something to that. But it's curious: I could see hiring Raja to help with the integration of AMD graphics into the Intel package, as was announced last week. That would make perfect sense.

    I wonder how much of it was just to weaken AMD, though. If Intel really does see Zen as as big a threat as they are reacting to, couldn't this be just another reaction to Zen? What better way to cripple AMD than to hit them where they are really doing well? Say what you want about Vega, but it has been selling and is a generally well-received product. And for a long time, the RTG group has been floating the CPU group, pretty much since the release of the GCN architecture.

    The announcement of a return to discrete graphics is curious. I would imagine that mining and datacenter sales volumes of GPUs have no small part in that. If you can't convince the world to buy your CPUs, and the world has shifted to buying discrete GPU packages, then you need to package your product as a discrete GPU; that is exactly what Knights Landing tries to do, but it hasn't been successful.

    I'm sure there was no single reason that is driving all of this, but it certainly is an interesting sequence of events lately.
  • Quizzical (Member, Legendary, Posts: 22,131)
    Ridelynn said:

    The announcement of a return to discrete graphics is curious. I would imagine that mining and datacenter sales volumes of GPUs have no small part in that. If you can't convince the world to buy your CPUs, and the world has shifted to buying discrete GPU packages, then you need to package your product as a discrete GPU; that is exactly what Knights Landing tries to do, but it hasn't been successful.
    In the consumer space, I think that discrete GPUs are going to mostly but not entirely disappear, though it will take a few more years.  Mid-range to high end gaming desktops will still have discrete GPUs, and perhaps high end gaming laptops, but that will be about it.  We'll end up thinking of discrete video cards as something that you don't really need to even look at unless you're looking to spend perhaps $1000+ on a gaming desktop or $1500+ on a gaming laptop, or perhaps want to upgrade the GPU in an older system without replacing the CPU, too.

    I think that the bigger driver of this is HPC and data centers.  x86 is a latency-optimized architecture, not throughput-optimized, and so it is simply terrible at most embarrassingly parallel algorithms--which is precisely the sort of thing that you buy racks full of servers for.  No amount of wider AVX instructions or more, weaker cores on a chip can change that.  The top end Xeon Phi can have something like 288 threads resident at a time.  The integrated GPU in Raven Ridge probably tops out at 25600, though I haven't checked whether Vega changes some hardware caps from GCN.  My point in citing those numbers is not that Raven Ridge is nearly 100 times as fast as Xeon Phi (it isn't), but that they're using different paradigms entirely and x86 can't be made anything remotely resembling throughput-optimized.

    Heavy use of AVX also has the problem that it's a total nightmare to code for.  x86 CPUs simply don't have a GPU's local memory, so they don't have a good way to swap data between threads or between AVX lanes.  For embarrassingly parallel algorithms, x86 is really a non-starter unless there is some reason why your algorithm cannot fit on GPUs or FPGAs.

    Intel may also be worried that more consumer stuff may offload work onto the integrated GPU.  If that happens and Intel doesn't have a credible integrated GPU, then their consumer CPUs could be in serious trouble.

    There are also reputational effects to consider.  If Intel were to suddenly fix their integrated GPUs so that they were better than AMD's and reliably had excellent drivers, how long would it take the world to notice?  Probably a lot longer than if they also had excellent, high end consumer GPUs of the sort that gamers would push a lot harder in ways that expose whatever problems the hardware and drivers might have.
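    The resident-thread figures above can be sketched with some back-of-the-envelope arithmetic (a rough sketch in Python; the per-wavefront, per-CU, and core-count figures are assumptions drawn from public GCN and Knights Landing specs, and real occupancy depends on register and memory usage):

    ```python
    # Back-of-the-envelope comparison of resident-thread capacity.
    # All figures are assumptions from public specs, not measurements.

    # Xeon Phi (Knights Landing): up to 72 cores, 4 hardware threads each.
    xeon_phi_resident_threads = 72 * 4
    print(xeon_phi_resident_threads)  # 288

    # GCN-style integrated GPU (Raven Ridge-class): 64 threads per
    # wavefront, up to 40 wavefronts resident per compute unit,
    # and roughly 10 compute units.
    threads_per_wavefront = 64
    wavefronts_per_cu = 40
    compute_units = 10
    gpu_resident_threads = threads_per_wavefront * wavefronts_per_cu * compute_units
    print(gpu_resident_threads)  # 25600

    # The ratio illustrates the paradigm gap, not relative performance.
    print(gpu_resident_threads // xeon_phi_resident_threads)  # 88
    ```

    As the post says, the point of the ratio is the difference in scheduling paradigm, not a claim about speed.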
  • Ozmodan (Member, Epic, Posts: 9,726)
    I highly doubt Apple will go to an ARM chip for a CPU in either a laptop or a desktop.  It would basically put a huge dent in those markets, as most Apple users I know run Windows on their Apple computers for games.  Kill the games and you create a huge reason not to buy Apple.
  • Vrika (Member, Epic, Posts: 6,438)
    Ozmodan said:
    I highly doubt Apple will go to an ARM chip for a cpu in either a laptop or a desktop.  It would basically put a huge dent in those markets as most Apple users I know will run windows on their Apple computers for games.  You kill the games and you make a huge reason not to buy Apple.
    They could equip the laptop with a touch screen and have it run all iOS apps and games.

    I think Apple wants to put an ARM chip to their Macs and combine their ecosystems. It's just a question of can they find a way to do it.
     
  • Ridelynn (Member, Epic, Posts: 7,061)
    edited November 2017
    Gaming, and people who run Windows (either via Bootcamp or virtualization), I think, are not a big market segment Apple is worried about upsetting. I do understand a lot of people do just that (I have Win10 installed in VMWare on this Macbook now). Apple has flirted with gaming before on OS X, but by and large, they ignore it on anything except iOS - and even there they aren't exactly pushing any envelopes. Bootcamp hasn't received any serious updates or attention in a long time, and you really don't want to be gaming via virtualization.

    The move by Apple to ARM would be driven by two overarching Apple principles: the ability to exert control over the supply chain, and the ability to deeply customize and integrate the hardware and software.

    Backwards compatibility and cross-compatibility are not issues at all. Apple has a lot of experience supporting cross-architectures and migrating users and software over during a transition period. They've done it twice so far in the lifespan of the Macintosh (from Motorola 68k to PowerPC, and from PowerPC to Intel x86), and handled it fairly well both times.

    They also have a lot of experience working and optimizing for ARM, so it's not like they would be jumping to something entirely new.

    My instinct would say they would willingly (and maybe even gladly) forgo Bootcamp, and in its place offer something similar to allow iOS apps to run natively. Then they can tout that huge library of iOS games out there. I won't get into the quality of a typical iOS game versus a typical Windows game, that's for another discussion, but that would be the direction I would expect Apple to take, and by and large a lot of people would sign right off on it and continue to buy MacBooks and iMacs.

    Besides, Windows already runs on ARM... and has for a while. You just haven't ever wanted to use it in the past, because Windows ARM offerings have sucked bigly.

    That, and, I know a lot of people run Bootcamp and play games on Apple computers. But I don't know of anybody who went out and bought an Apple just to play Windows games on - Apple machines, in general, make for pretty anemic and very expensive gaming rigs.
  • Ozmodan (Member, Epic, Posts: 9,726)
    Apple has killed off more PC generations than any other company via stupid designs.  Jobs' decision to go with Intel was one of the few that actually showed some intelligence.  ARM chips are second-rate compared to either Intel or AMD, and that is not changing any time soon.  Moving to an ARM chip would make their pricey computer designs second-rate, a prime way to ensure their computers are purchased only by people who care more about looks than about what the machine actually does.

    It would make their small market share, minuscule.  I really doubt that even Apple is that stupid.
