List of FreeSync - Adaptive Sync Compatible Monitors

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 14, 2015.

  1. tigermoth

    tigermoth Banned

    Messages:
    86
    Likes Received:
    0
    GPU:
    2x r9 290 amd cards
  2. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,367
    GPU:
    6900XT+AW@240Hz
    Kind of full of misconceptions.
    There was never a need to sync the GPU to the CPU; the graphics driver tells the DX/Mantle API that it is ready to get new frame information to render.
    That is the only sync there is between the GPU and the "CPU" (the CPU has nothing to do with it, it just processes general x86 instructions of any kind).

    Mantle/DX12 will work with Free/G-Sync without problems, because both solutions use the graphics card's display output to do all synchronization.

    A more interesting question would be whether a non-DP 1.2a-compliant card could be used for rendering, with the image then FreeSynced to the display via a 1.2a-compliant APU output.

    This could increase sales of APUs, since there are many sufficiently powerful GPUs around which don't support FreeSync.
    And if so, could the image be rendered by nVidia HW and displayed via an APU? :infinity:
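    To picture that "only sync", here is a toy C++ model. Every name and timing in it is an illustrative assumption, not a real driver interface: a game thread pushes finished frames into a bounded queue, a pretend-GPU thread drains it, and nothing else couples the two. Scan-out pacing (Free/G-Sync) would live entirely downstream of the consumer.

        #include <chrono>
        #include <condition_variable>
        #include <cstdio>
        #include <mutex>
        #include <queue>
        #include <thread>

        int main() {
            std::queue<int> frameQueue;        // stands in for the API's frame queue
            std::mutex m;
            std::condition_variable cv;
            const std::size_t kMaxQueued = 3;  // typical "frames in flight" limit
            bool done = false;

            // Pretend GPU: drains the queue at its own pace; the display output
            // (and any adaptive sync) sits downstream of this thread.
            std::thread gpu([&] {
                for (;;) {
                    std::unique_lock<std::mutex> lk(m);
                    cv.wait(lk, [&] { return !frameQueue.empty() || done; });
                    if (frameQueue.empty()) return;   // done and nothing left
                    int frame = frameQueue.front();
                    frameQueue.pop();
                    lk.unlock();
                    cv.notify_all();                  // wake the producer
                    std::this_thread::sleep_for(std::chrono::milliseconds(10));
                    std::printf("GPU finished frame %d\n", frame);
                }
            });

            // Pretend game/CPU thread: its only contact with the GPU is pushing
            // frames into the bounded queue.
            for (int frame = 0; frame < 10; ++frame) {
                std::unique_lock<std::mutex> lk(m);
                cv.wait(lk, [&] { return frameQueue.size() < kMaxQueued; });
                frameQueue.push(frame);               // the sole "sync" point
                lk.unlock();
                cv.notify_all();
            }
            {
                std::lock_guard<std::mutex> lk(m);
                done = true;
            }
            cv.notify_all();
            gpu.join();
            return 0;
        }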
     
  3. tigermoth

    tigermoth Banned

    Messages:
    86
    Likes Received:
    0
    GPU:
    2x r9 290 amd cards
    thx fox for that info.
     
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,681
    Likes Received:
    5,155
    GPU:
    2080Ti @h2o
    Yeah, that's what I thought.
     

  5. vazup

    vazup Master Guru

    Messages:
    325
    Likes Received:
    22
    GPU:
    r9 280X
  6. SpecChum

    SpecChum Master Guru

    Messages:
    616
    Likes Received:
    12
    GPU:
    Vega 64 - Acer XR34
    I wonder what the chances are of current high-end monitors being made compatible.

    There was, if I recall, a mention that it's possible (although I'm not sure how likely) that a firmware upgrade could be all that's needed.

    It'd be great if my XL2720Z could be "upgraded" to FreeSync.

    I know I've got an Nvidia card, but in 6 months, who knows. That's why I got this BenQ instead of a G-Sync one.
     
  7. tigermoth

    tigermoth Banned

    Messages:
    86
    Likes Received:
    0
    GPU:
    2x r9 290 amd cards
    Last edited: Jan 14, 2015
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,367
    GPU:
    6900XT+AW@240Hz
    BenQ has been misleading customers for years. When I got mine, I got it because the website stated that there would be FW updates. There is only one type of display which got an official FW update that can be downloaded and flashed into the display unit, and only if you have a special programming adapter.
    Otherwise you have to send them your monitor and they'll do it in-house.

    And since BenQ has a FreeSync screen now, I do not think they will consider older screens as candidates for upgrades even if they are physically capable.
    If that new XL2730Z does not cost too much, I'll get one with my new GPU. Otherwise I'll take another FSync screen. But I really like BenQ pivots.
     
  9. alanm

    alanm Ancient Guru

    Messages:
    10,857
    Likes Received:
    2,930
    GPU:
    Asus 2080 Dual OC
    Surprised. But a 'justified' fail, if you will :D. Given that the rest of LG's 34" ultra-wide line-up is this:

    34UM95 = 3440x1440
    34UC97 = 3440x1440
    34UM67 = 3440x1440

    while all the rest of the 2560x1080 models are 29".
     
  10. sykozis

    sykozis Ancient Guru

    Messages:
    22,041
    Likes Received:
    1,215
    GPU:
    MSI RX5700
    The "technology" is free....meaning that neither AMD nor VESA are charging an additional licensing fee for it. The ASIC and panel needed for such feature still costs money and as such increases the cost of the displays that support it. The companies making the ASIC have to get paid for their product. The companies making the panel have to get paid for their product. Anyone that expects relatively new "technology" to be free when it hits the market is out of their mind. It's not free to develop or implement and as such typically results in higher prices until R&D is covered and uptake increases to the point where profit margins reach a desirable level for the companies making such products.
     

  11. 17seconds

    17seconds Member

    Messages:
    29
    Likes Received:
    1
    GPU:
    EVGA GTX 1080 FTW+
    Quote:
    "The AMD Radeon™ R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming."
    http://support.amd.com/en-us/search/faq/219
     
  12. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,312
    Likes Received:
    2,023
    GPU:
    Zotac GTX980Ti OC
    Any info about this one? There's nothing on their homepage yet. Is this IPS too?

    Nixeus NX-VUE24 24" 1080p 144Hz
     
  13. kanej2007

    kanej2007 Ancient Guru

    Messages:
    8,396
    Likes Received:
    60
    GPU:
    MSI GTX 1080 TI 11GB
    That is a BEAUTIFUL monitor. An ultrawide screen that's also IPS, drool drool...

    There are also several Dell panels that are ultra-wide and IPS.

    Great for movies, and games such as Command & Conquer or Skyrim would look f*cking great on such a screen.

    Once several more ultra-wide IPS panels are released at around 2-4ms, 120-144Hz and 2K or 4K, I'll definitely be getting one...
     
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,367
    GPU:
    6900XT+AW@240Hz
    CS 1.6 would be very bad on a 21:9 screen. Its FOV is horizontally locked, not vertically. That causes a loss of the top and bottom of the view when going from 4:3 to 16:9/10, and the loss of visibility of the upper and lower areas would be too big on 21:9. It would be quite a big disadvantage.
    And I do not know of any modification which would keep the vertical FOV and expand the horizontal. :bang:
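    For reference, the "Hor+" behaviour Fox2232 says no mod provides is just a question of which axis the projection holds constant. A minimal sketch, where the fixed 73.74-degree vertical FOV is an assumption (it corresponds to the classic 90-degree horizontal FOV at 4:3):

        #include <cmath>
        #include <cstdio>

        int main() {
            const double kPi = 3.14159265358979;
            const double vfov = 73.74 * kPi / 180.0;  // assumed fixed vertical FOV
            const double aspects[] = {4.0 / 3.0, 16.0 / 9.0, 21.0 / 9.0};
            for (double a : aspects) {
                // Standard projection relation: tan(hfov/2) = tan(vfov/2) * aspect
                double hfov = 2.0 * std::atan(std::tan(vfov / 2.0) * a);
                std::printf("aspect %.2f:1 -> horizontal FOV %.1f deg\n",
                            a, hfov * 180.0 / kPi);
            }
            return 0;
        }

    With the vertical FOV held, this prints roughly 90, 106 and 121 degrees for 4:3, 16:9 and 21:9, i.e. the wider screen only ever adds view at the sides.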
     
  15. kanej2007

    kanej2007 Ancient Guru

    Messages:
    8,396
    Likes Received:
    60
    GPU:
    MSI GTX 1080 TI 11GB
    Haha, agreed, which is why I specifically mentioned games such as Skyrim, C&C, Warhammer, Dota, Warcraft.

    Some shooters will not look right due to FOV, as you already mentioned...
     

  16. Rugburn

    Rugburn Member Guru

    Messages:
    133
    Likes Received:
    0
    GPU:
    2x EVGA GTX1080 Ti FTW3
    I keep thinking back to a year ago, when Nvidia's Tom Petersen first responded to FreeSync, in an article posted here on Guru3D: http://www.guru3d.com/news-story/nvidia-responds-to-amd-freesync.html

    "Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand."

    and

    """When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists."""

    What bothers me most is that instead of working on a "standard", Nvidia developed a separate module that they only allow to work with Nvidia hardware, and it comes at a premium price...

    So this is the way I understand it... FreeSync can work on both AMD and Nvidia hardware, but Nvidia won't support it because they invested in G-Sync...

    G-Sync can work on both Nvidia and AMD hardware, but Nvidia won't allow it...

    I prefer to support a standard. So unless Nvidia changes direction, or offers G-Sync without the huge premium, I will likely go red team on my next GPU upgrade...
     
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,367
    GPU:
    6900XT+AW@240Hz
    The cable between an nVidia graphics card and a G-Sync monitor is a standard DP cable.
    If you know what kind of signaling nV uses, you can make G-Sync work with Intel/AMD graphics.
    In the same way, an nVidia 3D Vision monitor can be forced to run with AMD/Intel graphics in 3D mode (strobe), which was initially not possible as nV locked those features.

    And when I look at this, nV 3D Vision collides with the variable sync of G-Sync, and only one of those can be used at a time.
    Passive 3D, meanwhile, can be used with G/Free-Sync.

    But the backlight strobe feature, which people around the world tried to unlock for improved response time, has its purpose.
    Hacking a G-Sync monitor to work with AMD/Intel HW gives no benefit, as FreeSync works nearly the same (the same for the average user) and the differences are not to FreeSync's disadvantage.
     
  18. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,681
    Likes Received:
    5,155
    GPU:
    2080Ti @h2o
    Yeah, I can see why you're factoring the monitor into the GPU choice now; I'm really tempted to do so too... I'm already considering NOT going for the 1440p Swift, and probably getting a cheap 120Hz monitor without G-Sync, because if nvidia doesn't make it cheaper (which they won't for the next months / years), I might consider going for AMD. If only their GPUs were my reason to go red, and not the monitors...

    Absurd, isn't it, to choose the GPU because of your monitor? Haven't seen people go for nvidia because of G-Sync...
     
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,367
    GPU:
    6900XT+AW@240Hz
    Some moved for G-Sync, but considering that nVidia would lock ex-AMD users onto their side, G-Sync should have been given away by nV, as that would bring revenue in the future.
    They had practically an entire year to gain unquestionable dominance of the PC market.
     
  20. givmedew

    givmedew Member

    Messages:
    27
    Likes Received:
    0
    GPU:
    (2) CrossFire AMD R9 290X
    So I really never ever stick with AMD or NVIDIA... it always depends on who is the best at that time. I mean, I went from a 2GB 5850, to a GTX 670, to (2) R9 290s (one I bought in early November 2013 for $380 and unlocked to an X; the other was an X that I got for $200 used in late April 2014, when all those idiots who bought 20 290X cards at $500-700 a pop found out it was a pipe dream).

    But I go back and forth... before the 5850 I had NVIDIA...

    BUT!!!

    This TIME!!!!!!!!!

    I have 2 main PCs... one is MINE, and one I build and maintain so my little brother, who is in college, can play top-notch games with me.

    Well, I replaced the (2) GTX 580s I had in his system with a single Gigabyte GTX 970 G1 Gaming...

    I found out that NVIDIA ripped me off!!!

    The card utterly fails if more than 3.5GB of VRAM is used... the 980 does not have this problem, and nobody knew about it at launch. NVIDIA says it has something to do with the way the RAM is partitioned or some junk like that. They knew about this from the get-go... because they are the ones who did the partitioning... for what reason is unknown... they say it's because the last 512MB behaves slower... but it could have been just to cripple it compared to the GTX 980... because that is what NVIDIA does... they play around with stuff...
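    For what it's worth, the cliff givmedew hit can be modelled with simple arithmetic. A rough sketch: the 3.5 GiB + 0.5 GiB split is the partitioning NVIDIA confirmed, but the bandwidth figures below are assumptions in the ballpark reported by reviews at the time, not official specs.

        #include <cstdio>

        int main() {
            const double kFastGiB = 3.5;   // the remaining 0.5 GiB is the slow pool
            const double kFastBW = 196.0;  // GB/s, assumed fast-partition bandwidth
            const double kSlowBW = 28.0;   // GB/s, assumed slow-partition bandwidth
            for (double used = 3.0; used <= 4.01; used += 0.25) {
                double in_fast = (used < kFastGiB) ? used : kFastGiB;
                double in_slow = used - in_fast;
                // Time to touch every byte once, partition by partition:
                double t = in_fast / kFastBW + in_slow / kSlowBW;
                std::printf("%.2f GiB in use -> average %.0f GB/s\n", used, used / t);
            }
            return 0;
        }

    Under these assumptions the average bandwidth falls off sharply as soon as the working set spills past 3.5 GiB, which matches the "fails if more than 3.5GB is used" behaviour described above.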

    IT IS BS!!!

    On a bunch of games I have to run lowered texture settings... I don't have to do that on my 290X setup...

    So now not only is AMD going the legit, ethical way with FreeSync, but NVIDIA's budget enthusiast-level card is JUNK!!!! Absolute JUNK!!!

    It is in no way, shape or form a better card than the 290X...

    I thought it was... which is why I bought my brother the $370 GTX 970 G1 Gaming when I could have grabbed a used R9 290 or 290X for $200-250.

    NOW

    I have a problem with NVIDIA

    It will take at least 1 year, if not 2 years, to clear that issue up...

    They will have to do 2 things...

    Make an enthusiast budget card that works

    AND

    Enable Adaptive-Sync/FreeSync on all of their cards... the latter could take weeks, months, or years, but it WILL happen.
     
