Watch Dogs 2 Benchmarks

Malabooga Member Uncommon, Posts: 2,977
[Benchmark chart: Watch Dogs 2, 1920×1080]

[Benchmark chart: Watch Dogs 2, CPU scaling]

And... for everyone who was counting on playing games at 1440p+/60 FPS on Ultra on a GTX 1070/1080 in the near future:

[Benchmark chart: Watch Dogs 2, 2560×1440]


Comments

  • SavageHorizon Member Epic, Posts: 3,466
    Playing it on my PS4 Pro, it looks and plays wonderfully.




  • Warlyx Member Epic, Posts: 3,363
    edited November 2016
    The money wasted on an i7, an 8 GB GPU, and 32 GB of RAM, lol... it isn't worth it. If you're rich, sure, go ahead, but for the general playerbase that PC budget is insane; just the GeForce GTX 1080 costs as much as my entire current PC :(


  • filmoret Member Epic, Posts: 4,906
    Oh, but wait, I thought you said the RX 480 is better than the GTX 1060. Looks like it's the opposite. I can't believe you posted benchmarks saying otherwise. Isn't that against your rules of conduct?
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    OMG, a whole 1 FPS in an NVidia Gimpworks title, rofl. It's doing great considering it's being actively gimped by NVidia.

    OTOH, just look at that poor Kepler 780 Ti, also being gimped by NVidia, and now it performs at 380 level; it cost $700 just two years ago, lol. At the same time, the 280X/380X cost $299/$239.

    It's funny: in AMD-sponsored titles the 780 Ti performs as it should and beats/is on par with the 970, just like at release, but in NVidia Gimpworks titles the 970 is 30+% faster, lol.

    And your limited capacity can't comprehend that I'm speaking about THE WHOLE, where the 480 is better than the 1060.
  • filmoret Member Epic, Posts: 4,906
    Yeah, but dual RX 480s aren't even close to that GTX 1080. And they're using 2x the power.
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    Right, blind and stoopid, rofl.

    And there you go again, meddling with stuff you have no clue about, rofl.

    [Benchmark chart: Watch Dogs 2, 2560×1440]

  • filmoret Member Epic, Posts: 4,906
    Where did you find benchmarks for this game?
  • 13lake Member Uncommon, Posts: 719
    filmoret said:
    Where did you find benchmarks for this game?
    gamegpu.ru, one of the most objective and real-life-accurate benchmark websites in the world?
  • filmoret Member Epic, Posts: 4,906
    Ah, don't forget this one.

    [Benchmark chart: Watch Dogs 2, 3840×2160]
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    4K is irrelevant.

    1080p is what matters to 97.5%+ of people; some may even care about 1440p, but 4K is tending toward 0, rofl. It matters more for consoles than PC.

    And remember that those morons used to call the 1080 a 4K card, rofl. That's $700 down the drain right there, lol.
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    13lake said:
    filmoret said:
    Where did you find benchmarks for this game?
    gamegpu.ru, one of the most objective and real-life-accurate benchmark websites in the world?
    Nooooo, you told him, even though it says all over the pictures where they're from; but he's too "smart" to figure it out on his own, rofl.
  • Phry Member Legendary, Posts: 11,004
    Checked out the Watch Dogs 2 benchmarks on Gamers Nexus; it makes for interesting viewing, honestly, though whether they are more informative than the OP's benchmarks or not, I couldn't say. Certainly the AMD cards did not fare as well as the Nvidia ones, but apparently that is largely due to AMD not having an equivalent to the 1080 or 1070 at this time.

    https://www.youtube.com/watch?v=VyeyPCzWMQQ
  • Malabooga Member Uncommon, Posts: 2,977
    Funny how the NVidia cards did almost the same/a bit better, but the AMD cards did much worse across the board. Interesting, huh? But the Russians aren't sponsored by anyone, while Gamers Nexus is heavily sponsored, NVidia included.
  • filmoret Member Epic, Posts: 4,906
    Malabooga said:
    4K is irrelevant.

    1080p is what matters to 97.5%+ of people; some may even care about 1440p, but 4K is tending toward 0, rofl. It matters more for consoles than PC.

    And remember that those morons used to call the 1080 a 4K card, rofl. That's $700 down the drain right there, lol.
    So when you're comparing dual RX 480s to the GTX 1080, for some reason you quoted the 1440p chart and ignored the 1080p resolution. According to your own theory that 1080p resolution is the most important, the AMD cards are total crap compared to the Nvidia cards. Even your own benchmarks, from your AMD-loving website, say so.
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    A GTX 1080 / 2x RX 480 aren't 1080p setups, dumbo.
  • Ridelynn Member Epic, Posts: 7,383
    Again, the GPU comparison chart and the CPU comparison chart don't show even close to the same numbers (i7 5960X, 1080 SLI: one shows 104/121, the other shows 92/109).

    Apart from the usual inconsistency, it's nice to see an SLI-enabled title, and beyond that there is a surprising amount of CPU per-core scaling as well. The FX-8150 (the original crappy Bulldozer) beating out a Haswell i5 is almost unheard of in gaming. You can also see a very clear distinction between Sandy Bridge, Haswell, and Skylake in the test, which is also somewhat unusual.
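One way to spot this kind of cross-chart mismatch is to diff the configurations the two charts share mechanically. Below is a minimal C++ sketch; the config string and FPS numbers are hypothetical stand-ins, not data pulled from gamegpu's pages, and whether the gap is an error or (as suggested further down) a stock-vs-overclocked difference, the diff flags it either way:

```cpp
// Illustrative sketch only: mechanically diff two benchmark charts.
// Entries are hypothetical stand-ins for the GPU and CPU charts.
#include <iostream>
#include <map>
#include <string>
#include <utility>

using Fps = std::pair<int, int>;  // (average, maximum)

int main() {
    std::map<std::string, Fps> gpu_chart = {
        {"i7-5960X, GTX 1080 SLI, 1080p VHQ", {104, 121}},
    };
    std::map<std::string, Fps> cpu_chart = {
        {"i7-5960X, GTX 1080 SLI, 1080p VHQ", {92, 109}},
    };

    // Report any configuration that appears in both charts with
    // different numbers -- exactly the mismatch noted above.
    for (const auto& [config, fps] : gpu_chart) {
        auto it = cpu_chart.find(config);
        if (it != cpu_chart.end() && it->second != fps) {
            std::cout << config << ": " << fps.first << '/' << fps.second
                      << " vs " << it->second.first << '/'
                      << it->second.second << '\n';
        }
    }
}
```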
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    1. There's no inconsistency; you simply aren't looking as closely as you SHOULD at what's being tested.

    2. That's what happens when you use 8 cores. You act surprised, but in decently threaded applications the 8150 sat between the i5 and i7 in most cases, and was faster than the i7 in others.

    Games being late to use more CPU cores is just a testament to how cr@ppy devs are, coupled with ancient DX11, which only used 1 core (DX11.3, aka pseudo-DX12, finally opens the gates, and DX12/Vulkan are capable of using all cores without much trouble).

    Also, the 6700K is a whole 13% faster than the 5-year-old 2600K, the 6600 is marginally faster than the 4670K, and the 6700K is marginally faster than the 4770K. That's how much Intel's CPUs "improved" over 5 years; that's 5 generations of CPUs, and Kaby Lake has virtually no improvement over Skylake...
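The per-core scaling point is easy to illustrate outside of any graphics API. Here is a minimal, self-contained C++ sketch of the kind of embarrassingly parallel work split that DX12/Vulkan-style multithreaded engines rely on; the workload and sizes are made up for illustration and are not from the game or the benchmarks. On a chip like the FX-8150, eight slower cores can finish such a job ahead of four faster ones, which is the effect noted above:

```cpp
// Minimal sketch: split a CPU-bound job across all hardware threads.
// Workload and sizes are illustrative only.
#include <chrono>
#include <cstdint>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = std::size_t(1) << 26;  // ~67M elements
    std::vector<std::uint32_t> data(n, 1);

    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;  // fallback when the count is unknown

    std::vector<std::thread> pool;
    std::vector<std::uint64_t> partial(cores, 0);

    const auto t0 = std::chrono::steady_clock::now();
    for (unsigned t = 0; t < cores; ++t) {
        const std::size_t begin = n * t / cores;
        const std::size_t end = n * (t + 1) / cores;
        pool.emplace_back([&data, &partial, t, begin, end] {
            std::uint64_t sum = 0;  // each thread sums its own slice
            for (std::size_t i = begin; i < end; ++i) sum += data[i];
            partial[t] = sum;       // distinct slot per thread: no data race
        });
    }
    for (auto& th : pool) th.join();
    const auto t1 = std::chrono::steady_clock::now();

    const auto total =
        std::accumulate(partial.begin(), partial.end(), std::uint64_t(0));
    std::cout << "sum=" << total << " on " << cores << " threads in "
              << std::chrono::duration<double>(t1 - t0).count() << " s\n";
}
```

Running the same split with one thread versus eight shows why a well-threaded engine rewards core count over per-core clocks.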
  • Ridelynn Member Epic, Posts: 7,383
    edited November 2016
    Can you explain how there is no inconsistency?

    Graph 1:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 104/121 fps
    Graph 2:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 92/109 fps

    So what's the difference?
  • JemAs666 Member Uncommon, Posts: 252
    Ridelynn said:
    Can you explain how there is no inconsistency?

    Graph 1:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 104/121 fps
    Graph 2:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 92/109 fps

    So what's the difference?
    I think the difference is the site being the most objective in the world.
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    Ridelynn said:
    Can you explain how there is no inconsistency?

    Graph 1:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 104/121 fps
    Graph 2:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 92/109 fps

    So what's the difference?
    1. 5960X @ 4.6 GHz
    2. 5960X @ stock 3.0 GHz

    That's a 53% CPU OC.

    It's all there for those who don't just skim and miss important details. Just look at the guy above trying to sound smart but embarrassing himself in the process.
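For what it's worth, the 53% figure checks out from the clocks as quoted: (4.6 - 3.0) / 3.0 ≈ 0.533. A trivial sketch of that arithmetic, with the clock values as read from the charts in this thread:

```cpp
// Worked arithmetic for the quoted overclock; clocks per the thread.
#include <iostream>

int main() {
    const double stock_ghz = 3.0;  // i7-5960X at stock, per the charts
    const double oc_ghz = 4.6;     // overclocked run, per the charts
    const double oc_percent = (oc_ghz - stock_ghz) / stock_ghz * 100.0;
    std::cout << "Overclock: " << oc_percent << "%\n";  // ~53.3%
}
```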
  • Ridelynn Member Epic, Posts: 7,383
    Malabooga said:
    Ridelynn said:
    Can you explain how there is no inconsistency?

    Graph 1:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 104/121 fps
    Graph 2:  i7 5960x, Gigabyte x99 mobo, 1080p, VHQ, 1080SLI - 92/109 fps

    So what's the difference?
    1. 5960X @ 4.6 GHz
    2. 5960X @ stock 3.0 GHz

    It's all there for those who don't just skim and miss important details.
    Thanks, I didn't see that; maybe it's because I have no idea what "LLu" means. I appreciate you pointing it out, because it wasn't clear to me at all.
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    GTX 1080 / i7 6700K, Watch Dogs 2 gameplay, 20-25 FPS @ 1080p, lol

    *brought to you by NVidia Gameworks, lol



  • filmoret Member Epic, Posts: 4,906
    Malabooga said:
    GTX 1080 / i7 6700K, Watch Dogs 2 gameplay, 20-25 FPS @ 1080p, lol

    *brought to you by NVidia Gameworks, lol



    They were using broken, tired AMD cards for this.
  • Malabooga Member Uncommon, Posts: 2,977
    edited November 2016
    Yeah, all them benchmarks are obsolete; they only benched the first mission, and beyond that performance is abysmal (as more and more videos of actual gameplay, like the one above, started to pop up...).
  • cheyane Member Legendary, Posts: 9,100
    I was thinking of upgrading to a 1080, but the price here in Rome is still so high, and in euros even more so. I have an Alienware with an 875-watt power supply, so upgrading should not be an issue; I just wish it did not cost so much.