A lot of people probably have no idea what that title means, so I'll explain. The non-technical explanation is that hardware coming soon will be able to:
1) make a given frame rate appear smoother in games,
2) make videos shot at arbitrary frame rates appear smoother, and
3) use less power for the system under most circumstances.
And it will be able to do all of that without costing any more than current hardware.
How so? Let's delve into some technical details.
The basic problem with current monitors is that they display every frame for the same fixed length of time. A monitor typically refreshes about every 1/60 of a second, whether or not it makes sense to do so. If a new frame arrives every 1/40 of a second, the monitor will show one frame for two refreshes, the next for one refresh, and keep alternating like that. Adaptive-sync will make it possible for a video card and monitor to coordinate: if a new frame is ready every 1/40 of a second, the monitor displays a new frame every 1/40 of a second.
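To make the arithmetic concrete, here's a minimal sketch (the numbers and function names are just illustrative) of how long each frame stays on screen when a game renders at 40 fps on a 60 Hz monitor with vertical sync, versus on an adaptive-sync monitor:

```python
import math

REFRESH = 1 / 60     # fixed 60 Hz monitor: one refresh every ~16.7 ms
FRAME_TIME = 1 / 40  # game delivers a new frame every 25 ms (40 fps)

def fixed_refresh_display_times(num_frames):
    """With vsync on a fixed-refresh monitor, a frame stays on screen
    until the first refresh at or after the next frame is ready."""
    times = []
    shown_at = 0.0
    for i in range(1, num_frames + 1):
        ready = i * FRAME_TIME
        # first refresh tick at or after the frame is ready
        # (small epsilon guards against floating-point rounding)
        next_tick = math.ceil(ready / REFRESH - 1e-9) * REFRESH
        times.append(next_tick - shown_at)
        shown_at = next_tick
    return times

# Vsync on a 60 Hz monitor: 40 fps content alternates between
# ~33.3 ms and ~16.7 ms per frame -- that alternation is the judder.
print([round(t * 1000, 1) for t in fixed_refresh_display_times(4)])

# Adaptive-sync: the monitor refreshes when each frame is ready,
# so every frame is displayed for a uniform 25 ms.
print([round(FRAME_TIME * 1000, 1) for _ in range(4)])
```

Same average frame rate in both cases; the difference is that adaptive-sync makes every frame's on-screen time identical instead of bouncing between two values.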
In gaming, a common question is whether to turn vertical sync on or off. With vertical sync on, you get the behavior described above, which makes any frame rate below the monitor's refresh rate look less smooth than it could. Frame rates in games already tend to bounce around, and vertical sync adds extra variation in how long each frame stays on screen. It also adds display latency.
The alternative is to turn vertical sync off and start displaying each new frame as soon as it is ready. The problem is that the monitor then often displays parts of two different frames at once. During rapid movement, the boundary between the frames can be very visible, and this "tearing" can look terrible.
By allowing a monitor to start displaying a new frame whenever one is ready, rather than at fixed intervals, you get the best of both worlds: the complete frames of vertical sync on, the reduced latency of vertical sync off, and smoother animation than either.
Furthermore, if new frames aren't arriving every 1/60 of a second, the monitor doesn't have to work as hard constantly displaying new ones, which saves power. For example, as you read this, how fast are the contents of your monitor changing? Unless you happen to have a Flash ad on screen, entire seconds likely go by without any change.
For what it's worth, VESA is the Video Electronics Standards Association. It publishes standards for monitors, with the goal that you can pick your monitor and your video card independently and they will just work together.
Last year, Nvidia announced G-sync to do roughly the same thing as the new Adaptive-sync. AMD quickly responded by announcing FreeSync and revealing that VESA was working on a standard. One problem with G-sync is that it requires extra hardware, so only a handful of monitors will support it, and that hardware may add around $100 to the cost of a monitor. Another is that it's proprietary to Nvidia, so it won't work with AMD video cards. AMD is apparently keeping the FreeSync name around, and as best I can tell, it now describes the GPU side of working with a monitor that supports Adaptive-sync.
Adaptive-sync doesn't require extra hardware, so it doesn't bloat monitor costs. Furthermore, it will work with video cards from any vendor. Indeed, AMD has announced that several GPU chips available today already support it, including Hawaii (R9 290 and 290X), Bonaire (R7 260X), Kaveri, and Kabini/Temash. That includes the top and bottom of AMD's current lineup, so it's probable that all AMD GPUs will support it for many years to come. I wouldn't be surprised if some Nvidia Kepler GPUs available today support it, and would be surprised if Maxwell doesn't.
So how big of a deal is this? Let's put it this way: this is what I'm waiting on before replacing my current computer, not upcoming CPUs or GPUs. (Adding more monitors is the driving reason for the upgrade.)