Intel, tilting at windmills again.

Ozmodan Member EpicPosts: 9,726
Always have to like articles that provide a morning chuckle.

https://www.marketwatch.com/story/intel-makes-it-a-three-way-race-with-amd-and-nvidia-on-graphics-chips-2018-06-12

Throwing money down the rabbit hole is not going to thrill investors.


Comments

  • Rhoms Member UncommonPosts: 174
    DMKano said:
    Ozmodan said:
    Always have to like articles that provide a morning chuckle.

    https://www.marketwatch.com/story/intel-makes-it-a-three-way-race-with-amd-and-nvidia-on-graphics-chips-2018-06-12

    Throwing money down the rabbit hole is not going to thrill investors.


    Hey I think it's good to have a 3rd option. 

    The whole Green vs Red war is just stale - a 3rd contender (I hope Intel brings something worthy to the table) should make it interesting.
    We could use a good competitor to mix things up a bit.  I can't remember the last time we truly had competition at the high end between AMD and Nvidia.  Was it the HD 7970/GTX 680 in 2012?  It's been a while, and I hope some competition leads to better and cheaper products.

    Current game: Pillars of Eternity

    Played: UO, AC, Eve, Fallen Earth, Aion, GW, GW2 

    Tried: WOW, Rift, SWTOR, ESO 

    Future: Camelot Unchained?  Crowfall?  Bless?

  • Vrika Member LegendaryPosts: 7,882
    If Nvidia and AMD are windmills, then that would make Intel a nuclear power plant. They're going up against much smaller opponents here.
     
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.

  • Amathe Member LegendaryPosts: 7,630
    I picture some poor guy who now has to consider three companies when coding games. Bet he doesn't get a raise for the extra work.

    EQ1, EQ2, SWG, SWTOR, GW, GW2, CoH, CoV, FFXI, WoW, CO, War, TSW, and a slew of free trials and beta tests

  • Quizzical Member LegendaryPosts: 25,347
    edited June 2018
    Amathe said:
    I picture some poor guy who now has to consider three companies when coding games. Bet he doesn't get a raise for the extra work.
    Nonsense. Intel already makes integrated GPUs that a lot of people use, so developers of most games have to check to make sure it works on Intel.

    Furthermore, the commonly used APIs for GPU programming work well and make portability between architectures (whether the same vendor or different vendors) easy to do.  (Well, except for CUDA, which has killing portability as its primary reason to still exist, but games generally don't use it.)  If you want to make architecture-specific optimizations, that can add a little bit of work, but you don't really have to do that.  Other than that, all that you really have to worry about is driver bugs.
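
    A minimal sketch of that point (an illustration, not from the thread; it assumes the Vulkan SDK and loader are installed): the same enumeration path reports Intel, AMD, and Nvidia GPUs alike, and game code only branches per vendor if it chooses to.

    /* Minimal sketch: the same Vulkan calls enumerate GPUs from any vendor.
       Build with something like: cc gpu_enum.c -lvulkan */
    #include <stdio.h>
    #include <stdlib.h>
    #include <vulkan/vulkan.h>

    int main(void) {
        VkApplicationInfo app = {0};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.pApplicationName = "gpu-enum";
        app.apiVersion = VK_API_VERSION_1_0;

        VkInstanceCreateInfo info = {0};
        info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        info.pApplicationInfo = &app;

        VkInstance instance;
        if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
            fprintf(stderr, "vkCreateInstance failed\n");
            return 1;
        }

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, NULL);
        VkPhysicalDevice *devices = malloc(count * sizeof *devices);
        vkEnumeratePhysicalDevices(instance, &count, devices);

        for (uint32_t i = 0; i < count; ++i) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(devices[i], &props);
            /* PCI vendor IDs: 0x10DE Nvidia, 0x1002 AMD, 0x8086 Intel. */
            const char *vendor = props.vendorID == 0x10DE ? "Nvidia"
                               : props.vendorID == 0x1002 ? "AMD"
                               : props.vendorID == 0x8086 ? "Intel" : "other";
            printf("GPU %u: %s (%s)\n", i, props.deviceName, vendor);
        }

        free(devices);
        vkDestroyInstance(instance, NULL);
        return 0;
    }

    Vendor-specific tuning, if done at all, would happen per device inside that loop; the API calls themselves stay the same regardless of whose silicon shows up.
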
  • Amathe Member LegendaryPosts: 7,630
    Quizzical said:
    Amathe said:
    I picture some poor guy who now has to consider three companies when coding games. Bet he doesn't get a raise for the extra work.
    Nonsense. Intel already makes integrated GPUs that a lot of people use, so developers of most games have to check to make sure it works on Intel.

    Furthermore, the commonly used APIs for GPU programming work well and make portability between architectures (whether the same vendor or different vendors) easy to do.  (Well, except for CUDA, which has killing portability as its primary reason to still exist, but games generally don't use it.)  If you want to make architecture-specific optimizations, that can add a little bit of work, but you don't really have to do that.  Other than that, all that you really have to worry about is driver bugs.
    Looks like you contradict yourself about 5 times. So "nonsense" back at ya lol.

    EQ1, EQ2, SWG, SWTOR, GW, GW2, CoH, CoV, FFXI, WoW, CO, War, TSW, and a slew of free trials and beta tests

  • Ridelynn Member EpicPosts: 7,383
    edited June 2018
    I think Intel could come up with a competitor in the discrete GPU arena. There is absolutely no reason they shouldn't be able to. The big question is, why now?

    Right now we are within spitting distance of iGPUs being able to handle 95%+ of what gamers and developers want, and that number is growing. Sure, you can point to 4K/ray tracing/VR... and sure, those things need more power than an iGPU can probably deliver in the next 5 years, but they are also features that only the top 5% or so are really using right now, and with hardware getting more powerful and workloads being offloaded more intelligently, even that is within reach sooner rather than later.

    So entering now with a big push into discrete GPUs: the purpose certainly isn't gaming. Gaming happens to get some nice marketing buzz going, though, and Intel certainly loves their marketing, so I don't doubt such a product would have some gaming variant, probably with its own branding and a big marketing campaign. But it certainly isn't a gaming-first endeavor.

    I think AI has a good bit to do with it. That market has exploded recently. I also think mining has a good deal to do with it - I don't think mining as a hardware driving force has a lot of legs left to it, but related tech (particularly block chain encryption) certainly has potential. 

    The other interesting part is the AMD factor. Intel and AMD were very hush-hush about their partnership on an iGPU, and while Kaby Lake-G does exist, it's been remarkably low-key for what otherwise (at least on paper) should be a breakthrough product. Products that contain Kaby Lake-G have been expensive, poorly optimized for energy, and not all that attractive. I wouldn't necessarily point the finger at either Intel or AMD; I think it's more a matter of optimization than any glaring hardware deficiency, but when you do mashups like this, optimization is a tricky support topic to iron out.

    And on the other side of that coin: even though Intel had AMD signed up as a partner, they sniped Raja Koduri, the former head of the Radeon Technologies Group, who was heavily involved with Vega/Navi.

    There have been rumors that Vega was designed primarily to Apple's specifications (mainly for the iMac Pro), which may be true, with discrete cards just coming along as a byproduct... and that Navi is in a similar situation, except the customer is Sony (presumably for the PS5). Maybe that's why Intel saw the need to go its own path: AMD wasn't catering to Intel's needs, and Intel wanted something to battle Nvidia with, not something to court Apple or Microsoft with, and that isn't the direction AMD seems to be focused on.

    Intel is desperately trying to find new markets. The x86 market is not growing and hasn't been for a long while. They missed the boat entirely on mobile. They are OK on communications, but they aren't a market leader. A lot of their other attempts to branch out have fizzled (not because they didn't have good ideas, but because their support didn't seem to be behind any of them; they just kind of threw a lot of random ideas at the wall halfheartedly and acted surprised that none of them stuck), and now their biggest advantage, their fabrication, appears to be struggling.

    Just speculation and random comments, but it's fun to randomly comment. If I were to bet on a company that will look radically different in 10 years' time, my vote would entirely go to Intel. All it will take is Apple pulling the plug, and the rest of the PC manufacturers will fall in line like they always do, and the entire consumer PC marketplace will change almost overnight. Intel will be left holding just the server business and will look an awful lot like IBM does today... In fact, I wouldn't be surprised if those two look at merging or one buying out the other if/when that occurs, as they would be natural bedfellows, or at least one would drive the other out of existence.

    TL;DR: I wouldn't expect a great gaming GPU out of Intel no matter what... and especially not just because Intel is saying that one is coming (considering Raja is over there). Intel is a fish out of water, gasping and grasping for anything it can to try to remain relevant to shareholders.
  • Ridelynn Member EpicPosts: 7,383
    edited June 2018
    Amathe said:
    I picture some poor guy who now has to consider three companies when coding games. Bet he doesn't get a raise for the extra work.
    Let's just take Fortnite as an example, since it's pretty popular with the kids right now:

    PC nVidia (Kepler, Maxwell, Pascal)
    PC AMD (GCN)
    PC Intel (iGPU)
    PS4 & PS4 Pro
    XB1 & XB1X
    Switch
    iOS A8/A9/A10/A11
    Android (and its 10 bajillion different hardware configurations)

    Developers are already supporting a lot more than just 2 hardware configurations, even if you just look at Red/Green. APIs and good drivers are what make that possible. It doesn't hurt that Fortnite is built on Unreal, and Unreal supports a wide array of hardware with its API, making it a lot less painful for a developer to move pretty quickly across different platforms -- accommodating all the various input form factors becomes a bigger hurdle than graphics at that point.
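
    A toy sketch of the kind of backend abstraction being described (names are hypothetical, not Unreal's actual API): game code renders through one interface, and each platform plugs in its own implementation underneath.

    /* Toy sketch of an engine-style rendering abstraction; names are hypothetical. */
    #include <stdio.h>
    #include <stddef.h>

    typedef struct {
        const char *platform;
        void (*draw_frame)(void);
    } RenderBackend;

    static void draw_d3d(void)    { printf("frame submitted via Direct3D backend\n"); }
    static void draw_vulkan(void) { printf("frame submitted via Vulkan backend\n"); }
    static void draw_metal(void)  { printf("frame submitted via Metal backend\n"); }

    /* Adding support for a new GPU vendor mostly means its drivers handling an
       existing backend well, not rewriting the game code that calls draw_frame(). */
    static const RenderBackend backends[] = {
        { "PC / Xbox",   draw_d3d    },
        { "PC / Switch", draw_vulkan },
        { "iOS / macOS", draw_metal  },
    };

    int main(void) {
        for (size_t i = 0; i < sizeof backends / sizeof backends[0]; ++i) {
            printf("[%s] ", backends[i].platform);
            backends[i].draw_frame();
        }
        return 0;
    }
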
  • tawess Member EpicPosts: 4,227
    It was always foretold... After all it is RGB... =P 

    Anyway... Nobody ever won by doing nothing... more or less. 
    RidelynnZombieCat

    This has been a good conversation

  • Quizzical Member LegendaryPosts: 25,347
    Don't make a graphics chip if graphics isn't your goal.  For anything other than graphics, you can improve the chip a lot by chopping out the fixed-function graphics hardware.

    Intel is already building an ASIC for AI, aren't they?  If you build an ASIC for an algorithm and run the same algorithm on a general purpose chip with the same resources dedicated to both, the ASIC will win, and usually by a lot.  General purpose chips have their uses for algorithms not important enough to have built an ASIC for them (which includes nearly everything), but once you've got an ASIC, the general purpose chips aren't still needed for that particular algorithm.

    I can't see cryptocurrencies as driving this, either.  Other than building an ASIC, the best way to build chips with cryptocurrencies in mind is an FPGA with a suitable memory controller.  Intel conveniently just bought Altera, so they're now one of the top two FPGA vendors.

    It's possible that Intel wants the game console market.  They can't get there without a respectable GPU, which is something that they don't have right now.
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.

  • Quizzical Member LegendaryPosts: 25,347
    Quizzical said:
    Don't make a graphics chip if graphics isn't your goal.  For anything other than graphics, you can improve the chip a lot by chopping out the fixed-function graphics hardware.

    Intel is already building an ASIC for AI, aren't they?  If you build an ASIC for an algorithm and run the same algorithm on a general purpose chip with the same resources dedicated to both, the ASIC will win, and usually by a lot.  General purpose chips have their uses for algorithms not important enough to have built an ASIC for them (which includes nearly everything), but once you've got an ASIC, the general purpose chips aren't still needed for that particular algorithm.

    I can't see cryptocurrencies as driving this, either.  Other than building an ASIC, the best way to build chips with cryptocurrencies in mind is an FPGA with a suitable memory controller.  Intel conveniently just bought Altera, so they're now one of the top two FPGA vendors.

    It's possible that Intel wants the game console market.  They can't get there without a respectable GPU, which is something that they don't have right now.
    Perhaps, but if rumor is to be believed (by Ubisoft and others), there will only be one more generation of dedicated home consoles before it goes all-streaming.
    I don't doubt that there will be another major push for game streaming.  I'm strongly skeptical, however, that playing games by streaming over the Internet (as opposed to over a LAN, which is a viable niche and will remain such) will ever catch on.  Why would anyone move to game streaming if it's inferior to the console you already have in every way except game download times?

    The problem is one of physics.  It's cheaper to move data a millimeter within a chip than a thousand miles across the Internet.  That's what they have to "fix" in order for game streaming to be viable as anything other than a very low end option.

    If game streaming as a way to play games (as opposed to watching someone else play where the latency doesn't matter) at high graphical settings catches on, ISPs will revolt and find some way to sabotage it.  The entire ISP business model completely collapses if your average customer is trying to use 10 TB/month of bandwidth and your network only has enough capacity for them for 100 GB/month per customer.
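
    Some back-of-envelope arithmetic behind that mismatch; the bitrates and play time below are assumptions for illustration, not measurements.

    /* Rough monthly-traffic arithmetic for streamed gaming; all inputs are assumptions. */
    #include <stdio.h>

    static double tb_per_month(double mbps, double hours_per_day) {
        double bytes_per_sec = mbps * 1e6 / 8.0;          /* megabits/s -> bytes/s */
        return bytes_per_sec * hours_per_day * 3600.0 * 30.0 / 1e12;
    }

    int main(void) {
        /* One player on a modestly compressed 1080p60 stream. */
        printf("25 Mbps, 3 h/day:  %.1f TB/month\n", tb_per_month(25.0, 3.0));   /* ~1.0 TB  */
        /* A household pushing higher-quality or 4K streams for more hours. */
        printf("150 Mbps, 5 h/day: %.1f TB/month\n", tb_per_month(150.0, 5.0));  /* ~10.1 TB */
        return 0;
    }

    Either scenario blows far past a network built around 100 GB/month per customer, which is the collapse being described.
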
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.

  • Quizzical Member LegendaryPosts: 25,347
    Quizzical said:
    Quizzical said:
    Don't make a graphics chip if graphics isn't your goal.  For anything other than graphics, you can improve the chip a lot by chopping out the fixed-function graphics hardware.

    Intel is already building an ASIC for AI, aren't they?  If you build an ASIC for an algorithm and run the same algorithm on a general purpose chip with the same resources dedicated to both, the ASIC will win, and usually by a lot.  General purpose chips have their uses for algorithms not important enough to have built an ASIC for them (which includes nearly everything), but once you've got an ASIC, the general purpose chips aren't still needed for that particular algorithm.

    I can't see cryptocurrencies as driving this, either.  Other than building an ASIC, the best way to build chips with cryptocurrencies in mind is an FPGA with a suitable memory controller.  Intel conveniently just bought Altera, so they're now one of the top two FPGA vendors.

    It's possible that Intel wants the game console market.  They can't get there without a respectable GPU, which is something that they don't have right now.
    Perhaps, but if rumor is to be believed (by Ubisoft and others), there will only be one more generation of dedicated home consoles before it goes all-streaming.
    I don't doubt that there will be another major push for game streaming.  I'm strongly skeptical, however, that playing games by streaming over the Internet (as opposed to over a LAN, which is a viable niche and will remain such) will ever catch on.  Why would anyone move to game streaming if it's inferior to the console you already have in every way except game download times?

    The problem is one of physics.  It's cheaper to move data a millimeter within a chip than a thousand miles across the Internet.  That's what they have to "fix" in order for game streaming to be viable as anything other than a very low end option.

    If game streaming as a way to play games (as opposed to watching someone else play where the latency doesn't matter) at high graphical settings catches on, ISPs will revolt and find some way to sabotage it.  The entire ISP business model completely collapses if your average customer is trying to use 10 TB/month of bandwidth and your network only has enough capacity for them for 100 GB/month per customer.
    True, but can all of that still be said 10 or 12 years from now? Console generations have moved beyond the old average of every 5 years. If Ubisoft is indeed right and there is to be only one more generation of hardware, with the next Xbox rumored to release in 2020, that would put its lifespan at what, between 5 and 12 years? Now we are talking 2025 to 2032. It seems we could progress to a streaming-only gaming environment by then.
    Streaming games will improve, but rendering them locally will improve, too.  What you can do today with streaming is inferior to what you could do ten years ago by rendering the game locally.  What you'll be able to do ten years from now with streaming will almost certainly be inferior to what you could do today rendering the game locally.

    To give you some idea of the relative bandwidths involved:

    Typical broadband Internet connection:  0.01 GB/sec
    High end (currently gigabit) Internet connection:  0.125 GB/sec
    Monitor cable for mainstream monitor (currently 1080p at 60 frames per second):  0.5 GB/sec
    PCI Express bandwidth for relatively good video card:  10 GB/sec
    Global memory bandwidth for video card:  300 GB/sec
    Local memory bandwidth for video card:  10000 GB/sec
    Register bandwidth for video card:  100000 GB/sec

    All of those numbers will tend to go up in the future, but the ratios between them will likely stay similar to what they are today.  One thing that should be striking is that some of the numbers are massively larger than the others.  Using one of the items higher on the list (with far less bandwidth) when you should be using one of the lower ones can kill performance in a hurry.  The goal of streaming games is to do exactly that.

    To make things worse, all of those except for the Internet connections are things that you can readily use all day long.  With the Internet connection, if you try to do that all day every day, your ISP will probably try to shut you down one way or another, as that would be more than 10 TB/day on a gigabit Internet connection.

    Computations are cheap.  Internet bandwidth is expensive.  That's dictated by physics, and the laws of physics aren't going to change just because someone doesn't like them.

    The only real advantage of game streaming is that you can skip the time it takes to download a game.  But the amount of data that goes through your monitor cable in one minute is more than the size to download most games.  If you've got an Internet connection that can deliver that kind of bandwidth, then downloading games is pretty trivial and streaming becomes pointless.
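
    A quick sanity check on those figures, using the round numbers from the list above plus standard 1080p arithmetic; treat everything as order-of-magnitude.

    /* Order-of-magnitude checks on the bandwidth figures quoted above. */
    #include <stdio.h>

    int main(void) {
        /* Uncompressed 1080p at 60 frames per second, 3 bytes per pixel. */
        double raw_1080p60_gbs = 1920.0 * 1080.0 * 3.0 * 60.0 / 1e9;
        printf("Uncompressed 1080p60:      %.2f GB/sec\n", raw_1080p60_gbs);    /* ~0.37 */

        /* A gigabit connection (0.125 GB/sec) saturated around the clock. */
        double gigabit_tb_per_day = 0.125 * 86400.0 / 1000.0;
        printf("Gigabit line, 24 hours:    %.1f TB/day\n", gigabit_tb_per_day); /* ~10.8 */

        /* One minute of traffic over a ~0.5 GB/sec monitor cable. */
        double monitor_minute_gb = 0.5 * 60.0;
        printf("Monitor cable, one minute: %.0f GB\n", monitor_minute_gb);      /* 30 */
        return 0;
    }

    That ~10.8 TB/day figure is where the "more than 10 TB/day on a gigabit Internet connection" line comes from, and 30 GB per minute of monitor-cable traffic is indeed larger than most game downloads.
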
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.

  • Ridelynn Member EpicPosts: 7,383
    Rendering remotely on a giant server farm will vastly outstrip whatever resources you could put locally, even in a full blown PC form factor.

    You are limited on bandwidth - local bandwidth will always outstrip remote bandwidth.

    There are probably smart ways to split up that workload rather than saying it all has to be on one side or the other. Just saying streaming will never work isn't, I think, entirely accurate - maybe we don't stream 100% of the assets; we still have a good deal accomplished locally, but some portion of it is calculated and generated remotely and streamed.

    How exactly that looks I don't know, and I'm just looking at things with my head in the clouds... but it seems there is some potential there.

    Both rendering capacity and bandwidth capacity are going up all the time... hard to say how it will look in 5-10 years time... (Except in the US, where everything will be Comcast or AT&T and will still be vastly inferior to the rest of the developed world)
  • Quizzical Member LegendaryPosts: 25,347
    Quizzical said:
    Streaming games will improve, but rendering them locally will improve, too.  What you can do today with streaming is inferior to what you could do ten years ago by rendering the game locally.  What you'll be able to do ten years from now with streaming will almost certainly be inferior to what you could do today rendering the game locally.

    To give you some idea of the relative bandwidths involved:

    Typical broadband Internet connection:  0.01 GB/sec
    High end (currently gigabit) Internet connection:  0.125 GB/sec
    Monitor cable for mainstream monitor (currently 1080p at 60 frames per second):  0.5 GB/sec
    PCI Express bandwidth for relatively good video card:  10 GB/sec
    Global memory bandwidth for video card:  300 GB/sec
    Local memory bandwidth for video card:  10000 GB/sec
    Register bandwidth for video card:  100000 GB/sec

    All of those numbers will tend to go up in the future, but the ratios between them will likely stay similar to what they are today.  One thing that should be striking is that some of the numbers are massively larger than the others.  Using one of the items higher on the list (with far less bandwidth) when you should be using one of the lower ones can kill performance in a hurry.  The goal of streaming games is to do exactly that.

    To make things worse, all of those except for the Internet connections are things that you can readily use all day long.  With the Internet connection, if you try to do that all day every day, your ISP will probably try to shut you down one way or another, as that would be more than 10 TB/day on a gigabit Internet connection.

    Computations are cheap.  Internet bandwidth is expensive.  That's dictated by physics, and the laws of physics aren't going to change just because someone doesn't like them.

    The only real advantage of game streaming is that you can skip the time it takes to download a game.  But the amount of data that goes through your monitor cable in one minute is more than the size to download most games.  If you've got an Internet connection that can deliver that kind of bandwidth, then downloading games is pretty trivial and streaming becomes pointless.
    So all that really seems to just be the limitations of a fiber line. So what could replace a fiber line to increase that speed? 
    Not sure why so many companies would believe it's possible to leave hardware behind for streaming unless it was not only possible but already in the works, as shown at this E3 by EA and Microsoft.
    Many detractors said the same thing about Netflix when it began streaming. Now nearly everything is streamed. Look at PS Now from PlayStation, which streams over 650 games to the PS4 and PC.
    I have a hard time believing that it won't happen. There are already signs it has begun.
    Netflix isn't limited by latency.  It can also buffer far ahead of time so that a hiccup in your Internet connection doesn't matter.  Neither of those holds if you're trying to play a game.

    For streaming a movie, whoever made the movie can look at the whole product and compress it across time.  You can't do that if you're trying to play a game in real time.  You can still try to compress single frames in isolation, but that might mean that a given image quality takes ten times as much space as if you could compress across time.

    Well, you could compress an image across time for games, but the latency would be so awful that it would be useless.  If you press a button and it doesn't seem to register until a full second later, that would be regarded as unplayable by the standards of 40 years ago.

    With Netflix, there's also the issue that rendering the movie locally isn't an option.  If game streaming were the only way that it's possible to play games, then people would do it and it might seem okay if we didn't know of any other way.  But if you can get a massively better experience for less money by rendering the game locally on an integrated GPU, why stream it?  Not only is that the case today, but the gap is probably going to widen in coming years.

    Bandwidth isn't the only problem.  Latency is a huge problem, too, and probably more intractable than bandwidth.  It takes time to send your inputs to the remote server that is running the game.  It also takes time to send the completed image back.  It takes time to compress and decompress the image.  It also effectively backs you up by a frame for the time it takes to transmit that frame.  That could easily add an extra 100 ms of latency.

    For comparison, the difference between a high end gaming rig and a low end one at the same graphical settings might be about 50 ms.  Unless the frame rate is low enough that your brain doesn't interpret it as motion, it's better to think of the effects of higher frame rates in terms of latency than raw frame rates.
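
    To put a rough shape on that, here is an illustrative input-to-display budget for one streamed frame; every figure is an assumption chosen to show how the pieces add up, not a measurement.

    /* Illustrative latency budget for streamed gaming; all figures are assumptions. */
    #include <stdio.h>

    int main(void) {
        double input_upstream_ms   = 30.0;  /* send the button press to the remote server    */
        double server_render_ms    = 17.0;  /* render one frame at ~60 fps                   */
        double encode_ms           = 8.0;   /* compress the frame on the server              */
        double frame_downstream_ms = 30.0;  /* ship the encoded frame back over the Internet */
        double decode_display_ms   = 15.0;  /* decompress and present on the client          */

        double total = input_upstream_ms + server_render_ms + encode_ms
                     + frame_downstream_ms + decode_display_ms;
        printf("Input-to-display for one streamed frame: ~%.0f ms\n", total);   /* ~100 ms */
        return 0;
    }

    Local rendering only pays the render and display portions of that chain, which is roughly the gap being described.
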
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.

  • FlyByKnight Member EpicPosts: 3,967
    Amathe said:
    I picture some poor guy who now has to consider three companies when coding games. Bet he doesn't get a raise for the extra work.
    I picture some smart guy writing a framework that transpiles to all 3 platforms so poor guys have nothing to worry about.
    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • gervaise1 Member EpicPosts: 6,919
    Ridelynn said:
    <snip> The big question is, why now? <snip> 


    That is the question. Some good observations as well.

    And, as commented above, Intel is a very big company. What they lack in discrete graphics experience they make up for in the sheer amount of cash they could decide to invest.

    And if they do create a solid discrete graphics chip, and presumably market it under brand name X, I would be very surprised if this wasn't followed by a CPU featuring X. So this could be a strategy that could boost CPU sales downstream.

    Future consoles - possibly; I would also add TVs. And then, as mentioned, there is also the whole AI angle.
  • Quizzical Member LegendaryPosts: 25,347
    But PlayStation streams games now. EA showed off their streaming at this E3, as did Microsoft. It seems the industry has already decided to move ahead in this direction regardless of the doubt surrounding it.
    They offer streaming as a way to run games built for older consoles that wouldn't otherwise be able to run at all.  That's basically a last-resort approach: at least it lets you play the game badly rather than not letting you play it at all.  But playing a PS3 game by streaming it to a PS4 is going to be a far inferior experience to playing the same PS3 game on a PS3.

    My argument isn't that there will never be any streaming of games at all.  Especially for purely turn-based games, the latency doesn't kill the game entirely.  My argument is that gamers will never accept it if someone tries to make streaming into the primary way to play a game.  Make streaming into the only way that a game can be played at all and you greatly diminish the market for that particular game.

    It's kind of like saying that if someone today tried to launch a new game console to compete with PS4 Pro and Xbox One X, but it offered less performance than the original Xbox One (non-X), cost $700, and didn't have any notable advantages to compensate, people wouldn't buy it.
  • Markn Member UncommonPosts: 307
    Streaming won't be possible until ISPs catch up in terms of capacity and speed.  To be honest, what's in it for them to spend billions replacing the infrastructure they use?  I would guess 90-95% of the US still doesn't have or cannot get gigabit internet.  Then you've got the FCC allowing your ISP to data cap you and charge extremely high prices when you go over, or charge an extra fee for unlimited.
  • Vrika Member LegendaryPosts: 7,882
    But PlayStation streams games now. EA showed off their streaming at this E3, as did Microsoft. It seems the industry has already decided to move ahead in this direction regardless of the doubt surrounding it.
    They're developing it because it's going to sell. That doesn't mean it's going to be the only alternative.

    A bit like how fixed broadband connections and mobile broadband connections are both being developed. The world has enough market for multiple competing technologies, each of which has its own advantages and disadvantages. Developing one thing doesn't mean abandoning the other.
     
  • Jamar870 Member UncommonPosts: 570
    Wouldn't another factor against it be how many "hops" the stream has to make to get to you?
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.