Discussion in 'Videocards - AMD Radeon' started by IstariLight, Sep 14, 2017.
Vega 56 and 1070 ti overclocked should be equally fast pretty much. Going by current performance.
Also, without FreeSync it's pointless
Vega 56 + FS Monitor is a steal -> and you will never go back
At this point I would say it boils down to price and power consumption. Considering OP still has a 280X, he keeps his GPUs for a while. That would mean the 1070 Ti costs less in power over a few years, to the tune of ~$70-150 depending on how much uptime the GPU gets. And even with the mining craze you can still get a 1070 for less than a Vega 56 (at least in the States you can).
The only issue with FreeSync is its narrow operating range compared to G-Sync. Whether G-Sync is worth the premium over FreeSync is subjective.
I already had a 390X (which gave up and I replaced with a 280X), so no more heat for me, thanks. I wasn't disappointed so much by the performance as by the power and heat, again.
Someone hasn't heard of LFC. And with FreeSync monitors, you can edit your active range.
LFE? As in low-frequency effects for sound?
If you meant LFC, doesn't your monitor have to support it?
You can enable LFC if you extend the FS range below 37. There's a nice chat about this in the CRU thread. I have mine @ 34-60 currently.
Not the only issue. Another is the lack of enforcement of variable overdrive. Not all FreeSync monitors implement it, and it matters when each refresh cycle can last anywhere between 1 / (maximum refresh rate) and 1 / (minimum refresh rate), in order to keep motion blur and overdrive artifacts down.
For example, the Nixeus EDG 27 does implement a variable overdrive mechanism, as the company reports.
Another issue with FreeSync, compared to G-Sync, is no enforcement of ULMB, which exists on most if not all G-Sync monitors as far as I know. It tends to be the case that when you have a high refresh rate monitor, there are some refresh rates below or at the maximum where you can comfortably complete pixel transitions then strobe the backlight in order to minimize motion blur.
The feature remains optional on most FreeSync monitors, with only a few vendors insisting on its inclusion, such as BenQ, Samsung, and Eizo.
The range issue is a nightmare by itself, with some monitors unable to handle anything meaningfully wider than their default range even after tweaking the FreeSync range via CRU. LFC thus stays disabled on those monitors, as it requires a 2.5:1 range. That sounds wide, yet it amounts to only about a 57Hz minimum on a 144Hz monitor, compared to all G-Sync monitors, which ship with a 30Hz minimum at 144Hz, 180Hz, and 240Hz refresh rates.
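The 2.5:1 figure can be sanity-checked in a couple of lines. A minimal sketch, assuming the commonly cited 2.5x ratio is the threshold the driver uses to enable LFC (exact driver behavior may differ):

```python
def lfc_eligible(vmin_hz: float, vmax_hz: float, ratio: float = 2.5) -> bool:
    """LFC is possible when the max refresh is at least `ratio` times the min."""
    return vmax_hz / vmin_hz >= ratio

def max_allowed_min(vmax_hz: float, ratio: float = 2.5) -> float:
    """Highest FreeSync minimum that still permits LFC at a given maximum."""
    return vmax_hz / ratio

print(lfc_eligible(57, 144))           # True: 144 / 57 ≈ 2.53
print(lfc_eligible(58, 144))           # False: 144 / 58 ≈ 2.48
print(round(max_allowed_min(144), 1))  # 57.6
```

Which is why a unit that only drops to 58Hz after a CRU tweak misses the cutoff while 57Hz just makes it.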
There was a point to Nvidia's quality control. The monitor market is rife with crap. And most monitor makers don't really know, or care, what they're doing.
FreeSync models exist in abundance, but only a few would be a worthwhile purchase.
You get what you pay for.
Someone hasn't considered that not all monitors' FreeSync ranges can go low enough to achieve the 2.5:1 range required for LFC. Flicker and blanking often occur on models that cannot.
As for popular models like the Asus MG279Q, the stock FreeSync range of 35 - 90Hz is laughable and a slap in the face. You can extend the upper bound to 144Hz, barely achieving the required 57Hz minimum for LFC. Some units can't - they stop at 58Hz, or higher. A lottery.
Some other monitors ship with 40 - 60Hz FreeSync ranges. One even ships with a 50 - 60Hz range. Do you get a reliable reduction of the FreeSync minimum on all of these? Some will flicker or blank out.
That's at 60Hz. You're running 4K, it seems. The vast majority of the G-Sync / FreeSync market is about 1080p 16:9, 1440p 16:9, 1080p Ultrawide, and 1440p Ultrawide high refresh rate displays (> 100Hz in most cases).
In all cases, both LFC and G-Sync's way of handling framerates below the minimum refresh rate aren't perfect, and they can't be, as they are predictive. Furthermore, they are worse the closer you are to the minimum (as then you'd have to use the higher refresh rate multiples to compensate). That is, you wouldn't want to float around 55FPS on a 57Hz - 144Hz monitor. Not when you're otherwise getting a pretty much flawless, smooth experience within the range.
G-Sync has made me completely non-tolerant of any frametime inconsistencies. Game engine hitches now stick out like a sore thumb, and any microstutters are immediately noticeable and rather jarring. In those cases, speaking based on my personal experience, I wouldn't want any predictive, non-perfect handling of frame delivery.
LFC and G-Sync's own implementation are meant to (greatly) reduce the impact of dipping below the minimum. But they do not necessarily let you sustain framerates below that minimum as comfortably as framerates within the range. G-Sync monitors keep that minimum at 30Hz, which is enough for most cases. For 24FPS movies it's not, without frame doubling, but movies are generally constant-framerate content, so frame doubling can actually be engaged.
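The frame-multiplication idea described above can be sketched as picking the smallest integer repeat count that lifts a low framerate back into the supported range. This is a simplified model with assumed range values; real drivers predict the next frame time rather than working from a fixed FPS:

```python
import math

def refresh_for_fps(fps: float, vmin: float, vmax: float) -> float:
    """Repeat each frame n times so the effective refresh lands in [vmin, vmax]."""
    if fps >= vmin:
        return fps  # already in range: refresh tracks the framerate directly
    n = math.ceil(vmin / fps)  # smallest multiple that reaches the minimum
    refresh = fps * n
    if refresh > vmax:
        raise ValueError("no integer multiple fits the range")
    return refresh

print(refresh_for_fps(24, 30, 144))  # 48.0: each movie frame displayed twice
print(refresh_for_fps(20, 30, 144))  # 40.0: also doubled
print(refresh_for_fps(10, 30, 144))  # 30.0: tripled
```

This also shows why hovering just below the minimum is the worst spot: the repeat count keeps toggling between n and n+1, which is the jitter the predictive schemes have to smooth over.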
Thx for the info. I like learning something new.
The power argument is easy to understand even if one gets a golden V56 that undervolts and overclocks well, so it actually draws around 180-200W while running at roughly overclocked-1080 speeds. The norm would be closer to 225-250W undervolted and overclocked. For me that ~50-75W difference (I don't think the 1070 Ti will use as little as the 1070) wouldn't cost much at all over three years either, since electricity is cheap here. Running a 64 vs. a 1080 would cost me maybe 40€ over three years if I left both at stock.
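That kind of back-of-the-envelope electricity math looks like this. All the numbers fed in (wattage delta, hours per day, price per kWh) are illustrative assumptions, not measurements:

```python
def extra_cost_eur(delta_watts: float, hours_per_day: float,
                   years: float, eur_per_kwh: float) -> float:
    """Extra electricity cost of a card drawing `delta_watts` more, over `years`."""
    kwh = delta_watts / 1000 * hours_per_day * 365 * years
    return kwh * eur_per_kwh

# e.g. 100W more draw, 4h/day of load, 3 years, 0.12 €/kWh (all assumed)
print(round(extra_cost_eur(100, 4, 3, 0.12), 2))  # 52.56
```

Plug in your own local kWh price and uptime; with cheap electricity and modest gaming hours, the gap lands in the tens of euros over a card's lifetime, which matches the ~40€ estimate above.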
But then again, at current prices Vega makes zero sense, as it costs way over MSRP.