
Nvidia trying to force companies to stop selling AMD GPUs


Comments

  • Moirae, Member Rare, Posts: 3,318
    That's hilarious.
  • Vrika, Member Legendary, Posts: 7,985
    Ridelynn said:
    Quizzical said:
    Someone pointed out on another forum that there's a giant shortage of video cards right now, as there has been for several months, and there's no sign of it ending soon.  Right now, board partners can pretty trivially sell all the cards they can make to miners.  If this program had been in place several months ago, partners who didn't sign up would probably either have gotten exactly nothing in the last few months or else been restricted to low-end cards that aren't suitable for gaming.
    My understanding is that the current lack of cards isn't because nVidia/AMD can't crank out enough GPU chips; it's because there isn't enough RAM manufacturing capacity to produce enough GDDR5/HBM to go around right now.

    So yes, AIBs can sell every card they can produce. However, the GPU chip manufacturers (nV/AMD) could probably get enough manufacturing capacity to crank out enough GPUs if they wanted to. My understanding is that nV has stopped GP102/GP104 production recently anyway, in order to start ramping up for ~whatever~ is supposedly coming next (Ampere/Turing/Volta/whatever they are calling it now). And AMD is gun-shy about ramping up production to meet demand, because they got burned when they did that for the 7970.

    Now, there have been points in the not too distant past where GPU production was the bottleneck - that was almost always connected to a paper launch by the manufacturer though. I can only think of a couple of times where it was due to something else: the 6970 and 7970 were scarce for a while there because of mining, iirc. But I can't think of any nVidia chip that did the same, apart from paper launch issues.
    I think that rumor about NVidia stopping production turned out to be false information. The latest rumor is that NVidia's next consumer graphics card is Turing and that they won't start manufacturing it until mid-June. 

    http://www.tomshardware.com/news/nvidia-turing-graphics-architecture-delayed,36603.html


    But this is just another rumor. NVidia has so far managed to keep details of their next gaming GPU hidden really well.
  • Ridelynn, Member Epic, Posts: 7,383
    Vrika said:
    But this is just another rumor. NVidia has so far managed to keep details of their next gaming GPU hidden really well.
    That or they are just very good at putting out a lot of false information to obfuscate what is true.
  • Vrika, Member Legendary, Posts: 7,985
    Update on the story: NVidia is ending the GeForce Partner Program
      https://www.pcgamer.com/nvidia-just-killed-its-controversial-geforce-affiliate-program/
  • Quizzical, Member Legendary, Posts: 25,493
    Here's a link from HardOCP, which broke the original story:

    https://www.hardocp.com/news/2018/05/04/nvidia_pulling_plug_on_gpp

    Here's Nvidia's post announcing the cancellation:

    https://blogs.nvidia.com/blog/2018/05/04/gpp/

    The obvious lies are rather insulting, but that's what you get when dealing with Nvidia.  There's the simple fact that, if the reporting by Kyle of HardOCP were inaccurate, Nvidia could conclusively prove it wrong by publishing the contracts that they demanded their partners sign.  Some of the statements in the post are real howlers, too.  For example:

    "GPP was about making sure gamers who want NVIDIA tech get NVIDIA tech."

    Nvidia owns a number of trademarks that they can use to mark a product as clearly Nvidia.  "Nvidia" and "GeForce" are the obvious ones.  "Quadro" and especially "Tesla" have other uses in other markets, but as GPUs go, they make it abundantly clear that a product is an Nvidia GPU.  Those are far more strongly associated with Nvidia than the less well-known brands that they tried to force to be Nvidia-only, such as "Republic of Gamers" or "Aorus". 

    "GPP had a simple goal – ensuring that gamers know what they are buying and can make a clear choice."

    If that's Nvidia's goal, then perhaps they should tell it to whoever it is that names their GeForce cards.  And tell them to stop creating different cards with exactly the same name.  
  • IceAge, Member Epic, Posts: 3,202
    Here is Quizzical again, crying that his beloved AMD can't keep up with Nvidia/Intel. I would do the same if I were Nvidia. I mean, take it as you want, but it's business after all. 


  • Quizzical, Member Legendary, Posts: 25,493
    IceAge said:
    Here is Quizzical again, crying that his beloved AMD can't keep up with Nvidia/Intel. I would do the same if I were Nvidia. I mean, take it as you want, but it's business after all. 
    You would do the same?  So you're saying that, given the chance, you'd implement some semi-secret and likely illegal program, take a boatload of negative publicity for it, and then two months later cancel the whole thing and say "never mind" after imposing considerable costs on some of your business partners to redo a bunch of packaging and marketing?  You really think that's the optimal strategy here?
  • IceAge, Member Epic, Posts: 3,202
    Quizzical said:
    You would do the same?  So you're saying that, given the chance, you'd implement some semi-secret and likely illegal program, take a boatload of negative publicity for it, and then two months later cancel the whole thing and say "never mind" after imposing considerable costs on some of your business partners to redo a bunch of packaging and marketing?  You really think that's the optimal strategy here?
    No!

    I would focus on the partners who work exclusively with me. After that, sure, I'd ship my remaining products to the rest.


  • Quizzical, Member Legendary, Posts: 25,493
    IceAge said:
    No!

    I would focus on the partners who work exclusively with me. After that, sure, I'd ship my remaining products to the rest.
    Well then, what I described is what Nvidia just did.  You said you would do the same, but now you're saying that you wouldn't.

    I think the real lesson to be learned here is that it helps to read before replying.
  • Vrika, Member Legendary, Posts: 7,985
    Torval said:
    Dvora said:
    It's when you don't have enough confidence in your own company's products/services that these types of ultimatums are handed out as deflection marketing.  I like Nvidia cards, always have, but next time I'm shopping for a graphics card I might just jump brands. 
    I'm thinking the same thing... Never liked AMD cards, but if there's something close in performance at my next upgrade, I'll give it a go.
    The landscape is changing. Intel is already deep into developing lines of discrete server and consumer graphics cards. Apple is moving CPU and graphics processor design in-house. They've already dropped support for external Nvidia GPUs. AMD is still competitive for standalone cards and can offer robust APUs that Nvidia cannot.

    It's not that Nvidia doesn't have confidence that they can make the top card. They've shown year after year that they can do that. What they can't control is how relevant they are in the landscape. If the landscape shifts poorly, or they have another major Tegra flaw that sends partners like Nintendo looking elsewhere, they could easily go the way of the dodo. They are scared, but for reasons other than a lack of confidence in their product.
    NVidia has been beating AMD in desktop GPUs and managed to hit the jackpot with their AI tech. Right now they're doing better than ever before.
  • Quizzical, Member Legendary, Posts: 25,493
    Vrika said:
    NVidia has been beating AMD in desktop GPUs and managed to hit the jackpot with their AI tech. Right now they're doing better than ever before.
    Nvidia is in a very volatile position right now, as their traditional markets are shrinking.

    Discrete laptop GPUs are probably going away almost entirely.  Five years from now, having a discrete GPU in a laptop might be as rare as putting an HEDT CPU in one.

    Discrete desktop GPUs will be around for a while, but more and more of the low end is being eaten up by integrated graphics.  If AMD (or even Intel, if they can ever be trusted to make working drivers) ever puts a stack of HBM2 in an integrated GPU, that's the end of sub-$200 discrete GPUs in new computers.

    Professional graphics cards aren't going anywhere, but that's not a huge market.

    Nvidia tried to get into cell phones and failed miserably.  I don't see that changing now.

    Without having a viable CPU architecture, Nvidia is completely locked out of all but the lowest end of game consoles for the foreseeable future.

    ---------------------------------------------------------------------

    GPU compute is volatile.  Several years ago, AMD and Nvidia took different approaches to GPU compute.  AMD treated it about as they had treated graphics:  build good hardware for compute and the drivers and APIs to let you write code to use it.  Nvidia tended to build less good compute hardware (Kepler and Maxwell weren't remotely competitive with the contemporary AMD GCN GPUs at compute), but would additionally write a bunch of software for it that would only run on Nvidia GPUs.  The idea was that, if what you needed was a subset of what Nvidia had written, you could probably get better performance for a given price tag on AMD by writing your own, but if you bought Nvidia, you wouldn't have to write your own.

    In a commercial sense, Nvidia won that fight in a landslide.  Several years ago, Nvidia CEO Jensen Huang declared that Nvidia was a software company.  He was ridiculed for it at the time, but this is what he was talking about.  He saw that for GPU compute, most of the people willing to spend most of the money couldn't or wouldn't write much code of their own.  At most, they'd be willing to stitch together some pieces of software that someone else had written, and some wouldn't even be willing to do that much, but required the entire GPU portion to be handed to them on a silver platter or else they wouldn't use it at all.

    But in another sense, GPU compute is still in its infancy.  Relying on Nvidia (or someone they've hired to write code) to write your code for you means that most of the algorithms that would be far more efficient on GPUs aren't yet running on GPUs.  A lot of what is running on GPUs is running very inefficiently because trying to call external libraries on a GPU for part but not all of your kernel code is wildly inefficient.  That's about the best that can be done if you're relying on Nvidia to write code for you.
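
    To make that concrete, here's a minimal CUDA sketch of the fused-versus-unfused difference. It's a toy example of my own; the two small kernels just stand in for "library calls" and aren't anyone's actual library code. Chaining separate kernels forces the intermediate result through a full round trip to global memory, while fusing the same math into one kernel reads and writes each element once.

    #include <cstdio>
    #include <cuda_runtime.h>

    // "Library style": each step is its own kernel launch, so the
    // intermediate result (d_tmp) makes a full round trip through
    // global memory between the two calls.
    __global__ void scaleKernel(const float* in, float* out, float a, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = a * in[i];
    }

    __global__ void addKernel(const float* in, float* out, float b, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = in[i] + b;
    }

    // Fused: the same math in one kernel, one global read and one global write.
    __global__ void scaleAddFused(const float* in, float* out, float a, float b, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = a * in[i] + b;
    }

    int main() {
        const int n = 1 << 20;
        float *d_in, *d_tmp, *d_out;
        cudaMalloc(&d_in, n * sizeof(float));
        cudaMalloc(&d_tmp, n * sizeof(float));
        cudaMalloc(&d_out, n * sizeof(float));
        cudaMemset(d_in, 0, n * sizeof(float));  // contents don't matter for the sketch

        const int block = 256;
        const int grid = (n + block - 1) / block;

        // Unfused: two launches, two full passes over memory.
        scaleKernel<<<grid, block>>>(d_in, d_tmp, 2.0f, n);
        addKernel<<<grid, block>>>(d_tmp, d_out, 1.0f, n);

        // Fused: one launch, one pass.
        scaleAddFused<<<grid, block>>>(d_in, d_out, 2.0f, 1.0f, n);

        cudaDeviceSynchronize();
        printf("status: %s\n", cudaGetErrorString(cudaGetLastError()));
        cudaFree(d_in); cudaFree(d_tmp); cudaFree(d_out);
        return 0;
    }

    For two trivial kernels the difference is modest, but chain a dozen library calls over big arrays and the extra memory traffic dominates the runtime. That's the inefficiency I'm talking about.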

    Nvidia has made a ton of money by doing this with AI.  But that's probably going away.  I don't mean that AI is going away, but Intel and Google are building machine learning ASICs.  Nvidia got there first, yes, and they've made billions as a reward for it.  But if an algorithm is important enough to justify building ASICs, you can't beat an ASIC.  It's possible that ASICs built by Nvidia end up winning the AI market, and a substantial chunk of Nvidia's GV100 die could reasonably be thought of as a machine learning ASIC, as those tensor cores are useless for just about everything else.  But it's vanishingly unlikely that general purpose GPUs will beat ASICs in the exact algorithm that the ASICs are built for.

    On the other hand, rapid turnover has always been the norm in technology.  Virtually all of the desktop, laptop, and server CPUs and GPUs being bought new today are dies that weren't commercially available two years ago.  That includes all of Pascal, Volta, Polaris, Vega, Skylake-X, Kaby Lake, Coffee Lake, Ryzen, Threadripper, EPYC, and Xeon Silver/Gold/Platinum, among other things.

    It's possible that Nvidia will find some other new market and be making billions of dollars five years from now on something that today, we're not even aware will matter.  They've done so in the past.  But when that's what your business model relies upon, it's volatile, and you're one botched generation from your finances being about where AMD was three years ago, and without any obvious path to recovery.
  • Phry, Member Legendary, Posts: 11,004
    Ridelynn said:
    That or they are just very good at putting out a lot of false information to obfuscate what is true.
    Alternatively they could be putting out enough 'false' rumours to give the conspiracy theorists something to play with  :p
    Ozmodan
  • Ozmodan, Member Epic, Posts: 9,726
    Quizzical said:
    Nvidia is in a very volatile position right now, as their traditional markets are shrinking. [...]
    I have to disagree.  There would have to be a major technology change for integrated graphics to become a major challenge to separate graphics cards in gaming computers of any sort.  Even 1080p stresses any integrated chip, and 2K and 4K are just a no-go in most games.  Games are getting even more complex with VR etc.  That will drive the graphics card market for the foreseeable future.  The biggest problem for Nvidia is making major performance gains with their next graphics card series.
  • Quizzical, Member Legendary, Posts: 25,493
    Ozmodan said:
    Quizzical said:
    Nvidia is in a very volatile position right now, as their traditional markets are shrinking. [...]
    I have to disagree.  There would have to be a major technology change for integrated graphics to become a major challenge to separate graphics cards in gaming computers of any sort.  Even 1080p stresses any integrated chip, and 2K and 4K are just a no-go in most games.  Games are getting even more complex with VR etc.  That will drive the graphics card market for the foreseeable future.  The biggest problem for Nvidia is making major performance gains with their next graphics card series.
    The Xbox One X uses an integrated GPU, and offers GPU performance in the same ballpark as a GeForce GTX 1060 or Radeon RX 580.  The cheapest discrete desktop GPU faster than that right now is a GeForce GTX 1070 at around $500.

    There's no technical reason why you can't do that with desktops and laptops.  AMD hasn't yet, but it strikes me as almost inevitable that they will do so eventually.  And if they do, and your integrated GPU can match the performance of a $200 discrete GPU, what happens to the market for $200 discrete GPUs?  It won't completely vanish, but it will certainly be a lot smaller than it is now.
  • Scot, Member Legendary, Posts: 24,354
    Torval said:
    You can still buy discrete sound cards, but most mainboards come with one integrated and the market for standalone is virtually nonexistent. That's about how I see discrete cards going in the next few years.

    As a Creative Sound Blaster Z man I have to differ, and because I can still get such a good card, I imagine I will always be able to get a separate graphics card too. But how significant is the difference? There's the rub, and obviously for sound cards it is nowhere near as big as it used to be.
  • laserit, Member Legendary, Posts: 7,591
    I got screwed in a similar fashion years ago.

    I used to supply building suppliers in Western Canada with joist hangers and a few other products. I had about 25% of the variety of products that Simpson Strong-Tie had. Simpson Strong-Tie went to all the building suppliers and told them: all or nothing. They had to sell Simpson Strong-Tie product exclusively or Simpson wouldn't sell them any product. I'm a wee bit small to take on a multi-billion dollar corp.

    It was a pretty hard hit; I lost around 1.5 million in sales. Luckily I was diversified enough that I survived. 

    Back in those days about 50% of my sales were from my own products and 50% from being a job shop. Now about 10% of my sales are from my own products and 90% from being a job shop.

    Incidentally, Simpson Strong-Tie is my main Kwik Strip Tie customer these days. I produce about a million of them a year.  

  • Ridelynn, Member Epic, Posts: 7,383
    Ozmodan said:

    I have to disagree.  There would have to be a major technology change for integrated graphics to become a major challenge to separate graphics cards in gaming computers of any sort.  Even 1080p stresses any integrated chip, and 2K and 4K are just a no-go in most games.  Games are getting even more complex with VR etc.  That will drive the graphics card market for the foreseeable future.  The biggest problem for Nvidia is making major performance gains with their next graphics card series.
    You could make a mathematical case for how long it would take for the performance of a current top-tier card to be available in an integrated form factor, just based on time between generations, increase in performance between generations, and so forth. If you do the math, it is a surprisingly short amount of time.
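
    For illustration, here's that back-of-the-envelope math as a tiny C++ sketch. Every number in it is an assumption picked just for the example (integrated trailing top-tier discrete by 4x, ~35% more performance per generation, ~2 years per generation), not a measurement; plug in your own figures and the formula does the rest.

    #include <cmath>
    #include <cstdio>

    int main() {
        // Assumed numbers, purely for illustration:
        const double perfGap     = 4.0;   // top-tier discrete vs. today's integrated
        const double gainPerGen  = 1.35;  // performance multiplier per generation
        const double yearsPerGen = 2.0;   // time between generations

        // Generations needed for integrated graphics to close a perfGap-sized gap,
        // assuming performance compounds at gainPerGen each generation:
        const double generations = std::log(perfGap) / std::log(gainPerGen);
        std::printf("~%.1f generations, ~%.1f years\n",
                    generations, generations * yearsPerGen);
        return 0;
    }

    With those assumptions it prints roughly "~4.6 generations, ~9.2 years"; assume a smaller gap or bigger per-generation gains and the number shrinks quickly.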

    But really... it comes down to the question of exactly what is "good enough". For a lot of people, 1080p is already good enough, and honestly, after having recently purchased a 4K monitor, I tend to agree with respect to graphical fidelity. (I do really like the PPI for text, though.)

    Heck, if you look at (and believe) the latest Steam survey: Intel carries 10% of all gaming GPUs, and that low bar is good enough already. And in terms of sales, Intel is still the largest provider of graphics in the world, by far. AMD carries 15%, and of that 15%, only about 5 points are modern discrete GPUs. Only about 7% of all GPUs in use are more powerful than the one found in the XB1X. "More powerful" isn't a very big market right now.

    I think we are close to the point with graphics that sound cards have already reached. Sure, you can always make it better, but for most people it's going to be good enough, and there won't be much point in paying a lot of money to go beyond that. 

  • Gruug, Member Rare, Posts: 1,794
    I do not see this as any different than Ford or Chevy saying that dealerships should only sell their product and not a competitor's. If you make an EXCLUSIVE deal with someone, you are expected to live up to it.

  • Vrika, Member Legendary, Posts: 7,985
    Gruug said:
    I do not see this as any different than Ford or Chevy saying that dealerships should only sell their product and not a competitor's. If you make an EXCLUSIVE deal with someone, you are expected to live up to it.
    NVidia has a dominant market position; Ford and Chevy do not. Exclusivity demands from a company with a dominant position raise antitrust concerns that the same demands from Ford or Chevy would not.
  • Quizzical, Member Legendary, Posts: 25,493
    Scot said:
    As a Creative Sound Blaster Z man I have to differ, and because I can still get such a good card, I imagine I will always be able to get a separate graphics card too. But how significant is the difference? There's the rub, and obviously for sound cards it is nowhere near as big as it used to be.
    Discrete video cards aren't going to disappear entirely.  I don't expect that they'll ever become as rare as discrete sound cards, either.  You can always get more performance out of burning more power, and putting 300 W in a single package is awkward to cool.

    That's why I said sub-$200 video cards.  It used to be that AMD and Nvidia would produce new $30-$50 cards every generation.  The bottom of the line in the current generation is the GeForce GT 1030 and Radeon RX 550, which both have an MSRP of $80.  (Only the GeForce card is available at that price, largely because it uses DDR4 memory, while the Radeon requires GDDR5, which is scarce because of miners.)  Today, you can still get sub-$50 cards from either vendor, but you'd be looking at a Radeon R5 230 or GeForce GT 710, both of which date to about 2013 and are probably more notable today (for being the cheap option) than they were the day they launched.

    Give it a few years and the market for sub-$100 discrete cards will probably look like that.  Maybe even sub-$200 cards.  And no, I'm not predicting hyperinflation.  I am predicting huge jumps in integrated GPU performance in desktops and laptops, however, and that there won't be much point in buying a new video card slower than an integrated GPU.
  • Quizzical, Member Legendary, Posts: 25,493
    Gruug said:
    I do not see this as any different than Ford or Chevy saying that dealerships should only sell their product and not a competitor's. If you make an EXCLUSIVE deal with someone, you are expected to live up to it.
    While a given dealership may only sell a particular brand, the same company could own several dealerships that all sell different brands of cars.  For example:

    https://www.tonkin.com/new-cars/for-sale

    If you click on the "Make" option on the left side, that's a lot of different brands of cars.  Some are different cars produced by the same company, but a lot of them aren't.
  • Cleffy, Member Rare, Posts: 6,414
    I think nVidia's reasoning makes sense.
    ROG GTX 1080 Ti
    ROG RX 580
  • IceAge, Member Epic, Posts: 3,202
    Quizzical said:
    While a given dealership may only sell a particular brand, the same company could own several dealerships that all sell different brands of cars. [...]
    You always have "answers" and "comparisons", eh?

    Seriously, move on already. AMD is a bitch for Nvidia and Intel. I am all for supporting whatever you want, but you are not only supporting it, you are delusionally defending it with no particular common sense. Just blindly supporting. 

    Keep in mind that AMD would have done the same if they were big. But they aren't...

    Oh well ..
