VESA incorporates Adaptive-sync into DisplayPort 1.2a

Quizzical Member Legendary Posts: 25,353

http://www.vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

A lot of people probably have no idea what that title means, so I'll explain.  The non-technical explanation is that hardware coming soon will be able to:

1)  make a given frame rate appear smoother in games,

2)  make videos shot at arbitrary frame rates appear smoother, and

3)  use less power for the system under most circumstances.

And it will be able to do all of that without costing any more than current hardware.

How so?  Let's delve into some technical details.

The basic problem with current monitors is that they display every frame for the same fixed length of time.  A monitor typically refreshes about every 1/60 of a second, whether or not a new frame is ready.  If a new frame arrives every 1/40 of a second, the monitor will show one frame "twice", then the next frame "once", and alternate like that.  Adaptive-sync lets the video card and monitor coordinate: if a new frame is ready every 1/40 of a second, then the monitor displays a new frame every 1/40 of a second.
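
To make that arithmetic concrete, here's a small Python sketch (my own illustration, not anything from the spec) that works out which frame each 60 Hz refresh ends up showing when a new frame is ready every 1/40 of a second:

    # Sketch: 40 fps content on a fixed 60 Hz display with vertical sync on.
    # Each refresh shows the newest frame that finished rendering by then.
    REFRESHES_PER_SEC = 60
    FRAMES_PER_SEC = 40

    # Frame index shown at each of the first 12 refreshes (exact integer math):
    shown = [tick * FRAMES_PER_SEC // REFRESHES_PER_SEC for tick in range(12)]
    print(shown)  # [0, 0, 1, 2, 2, 3, 4, 4, 5, 6, 6, 7]

    # Frame 0 is held for two refreshes, frame 1 for one, frame 2 for two,
    # and so on: the "twice", "once" alternation described above.  With
    # Adaptive-sync, the panel would instead refresh every 1/40 of a second,
    # holding every frame for exactly one interval.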

In gaming, a common question is whether to turn vertical sync on or off.  With vertical sync on, you get the behavior described above, which makes any frame rate below the monitor's refresh rate look less smooth than it could.  Frame rates in games tend to bounce around some already, and vertical sync adds further variation in how long each frame stays on screen.  It also adds display latency.

The alternative is to turn vertical sync off and start displaying new frames as soon as they are ready.  The problem with this is that it often means that a monitor displays parts of more than one frame simultaneously.  When there is rapid movement, the boundary between the two frames can be very visible, and this "tearing" can look terrible.

By allowing a monitor to start displaying a new frame whenever one is ready, rather than at fixed intervals, you get the best of both worlds:  the complete frames of vertical sync on together with the reduced latency of vertical sync off, as well as smoother animations than either.
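
To put rough numbers on the latency difference, here's another small Python sketch (again just an illustration, with made-up frame completion times) comparing how long a finished frame waits before appearing on screen in each case:

    # Sketch: delay between a frame finishing and appearing on screen.
    # With vsync on, the frame waits for the next fixed 60 Hz refresh boundary;
    # with Adaptive-sync, the monitor can start a refresh as soon as the frame
    # is ready (ignoring the panel's minimum refresh interval for simplicity).
    import math

    REFRESH = 1 / 60  # fixed refresh interval, in seconds

    def vsync_wait(t):
        """Delay from frame completion at time t to the next refresh boundary."""
        return math.ceil(t / REFRESH) * REFRESH - t

    for t in (0.021, 0.030, 0.042):  # made-up, irregular completion times
        print(f"frame ready at {t * 1000:.0f} ms: vsync adds "
              f"{vsync_wait(t) * 1000:4.1f} ms; adaptive adds ~0 ms")

    # Vsync adds anywhere from 0 to 16.7 ms, and the amount varies from frame
    # to frame, which is the added latency and judder described above.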

Furthermore, if new frames don't arrive every 1/60 of a second, the monitor doesn't have to refresh that often, which saves power.  For example, as you read this, how fast are the contents of your monitor changing?  Unless the page happens to be drawing a Flash ad, entire seconds likely go by without any changes.

For what it's worth, VESA is the Video Electronics Standards Association.  They make standards for monitors, with the goal that you can pick your monitor and your video card independently and they should just work together flawlessly.

Last year, Nvidia announced G-sync to do roughly the same thing as the new Adaptive-sync.  AMD quickly responded by announcing FreeSync and pointing out that VESA was working on a standard.  One problem with G-sync is that it requires extra hardware, so only a handful of monitors will support it, and the extra hardware might add $100 to the cost of the monitor.  Another is that it's proprietary to Nvidia, so it will not work with AMD video cards.  AMD is apparently keeping the name FreeSync around, and as best as I can tell, they're using it to describe the GPU side of working with a monitor that supports Adaptive-sync.

Adaptive-sync doesn't require extra hardware, so it doesn't bloat monitor costs.  Furthermore, it will work with video cards from any vendor.  Indeed, AMD has announced that several GPU chips available today already support it, including Hawaii (R9 290 and 290X), Bonaire (R7 260X), Kaveri, and Kabini/Temash.  That includes the top and bottom of AMD's current lineup, so it's probable that all AMD GPUs will support it for many years to come.  I wouldn't be surprised if some Nvidia Kepler GPUs available today support it, and would be surprised if Maxwell doesn't.

So how big of a deal is this?  Let's put it this way:  this is what I'm waiting on before replacing my current computer, not upcoming CPUs or GPUs.  (More monitors are the driving reason for the upgrade.)

Comments

  • Ridelynn Member Epic Posts: 7,383

    I agree, it's a huge deal. The bad news is that DisplayPort adoption has historically been very slow. Only one GPU manufacturer (AMD) and only one computer manufacturer (Apple) have really pushed it hard. DP MST still has very low availability, despite being billed as "the next big thing in display connectors" about 3 years ago, since DP and HDMI more or less directly compete with each other.

    I don't necessarily tie my monitor upgrades to my overall computer upgrades, although if this coincides with a widespread jump to 4K (or multiple monitors, as you mention), then it makes sense to do so.

  • Quizzical Member Legendary Posts: 25,353
    I'd need a new video card to properly drive more, higher resolution monitors.  I figure that as long as I'm upgrading a computer over 4 1/2 years old like that, I might as well just replace the thing outright.
  • RokZon Member Posts: 1

    Thanks Quizzical.  Excellent dissection into layman's terms.

    This information only solidifies my commitment to ATI (AMD) video cards (first and foremost being that I've burned up 3 Nvidia cards and not a single ATI card since my transition 3 cards ago).

    nVidia and their compulsion toward proprietary bs.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by RokZon
    Thanks Quizzical. Excellent dissection into layman's terms. This information only solidifies my commitment to ATI (AMD) video cards (first and foremost being that I've burned up 3 Nvidia cards and not a single ATI card since my transition 3 cards ago). nVidia and their compulsion toward proprietary bs.

    Hardware-wise, I don't think nVidia is that proprietary. They just tend to lean toward DVI/HDMI more than DisplayPort and offer fewer monitor outputs than AMD.

    Software is a different matter (CUDA/PhysX).

  • Quizzical Member Legendary Posts: 25,353
    Originally posted by Ridelynn

    Originally posted by RokZon
    Thanks Quizzical. Excellent dissection into layman's terms. This information only solidifies my commitment to ATI (AMD) video cards (first and foremost being that I've burned up 3 Nvidia cards and not a single ATI card since my transition 3 cards ago). nVidia and their compulsion toward proprietary bs.

    Hardware-wise, I don't think nVidia is that proprietary. They just tend to lean toward DVI/HDMI more than DisplayPort and offer fewer monitor outputs than AMD.

    Software is a different matter (CUDA/PhysX).

    Nvidia's G-sync does about the same thing as Adaptive-sync, but is proprietary.  That's probably what he meant.

    To use G-sync, a monitor has to have a physical card inside that Nvidia sells; Adaptive-sync doesn't require that.  The only real advantage of G-sync is that it is available now (only in the ASUS VG248QE, and only with the separate purchase of an additional card to modify the monitor), whereas Adaptive-sync isn't expected to be available in monitors you can buy until early next year.
