New Upcoming ATI/AMD GPU's Thread: Leaks, Hopes & Aftermarket GPU's

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

  1. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
  2. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
  3. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    FreeSync 2 Explained

    Over the past few weeks we’ve been looking into and exploring the world of FreeSync 2. Now this isn’t a new technology – it was announced at CES 2017 – but it’s only now that we’re starting to see the FreeSync 2 ecosystem expand with new display options. As HDR and wide-gamut monitors become more of a reality over the next year, there’s no better time to discuss FreeSync 2 than now.

    And there’s a fair bit of confusion around what FreeSync 2 really is, how it functions, and how it differs from the original iteration of FreeSync.

    This article will explore and explain FreeSync 2 as the technology currently stands, as it’s a little different to the tech AMD announced more than a year ago. Our detailed impressions of using a FreeSync 2 monitor will come next week.

    What is FreeSync?
    Here’s a quick refresher on the original FreeSync. FreeSync is a brand name that refers to AMD’s implementation of adaptive synchronization technology. It essentially allows a display to vary its refresh rate to match the render rate of a graphics processor, so that, for example, a game running at 54 FPS is displayed at 54 Hz, and when that game bumps up to 63 FPS the display also shifts to 63 Hz. This reduces stuttering and screen tearing compared to a monitor operating at a fixed refresh rate, say 60 Hz, displaying a game running at an unmatched render rate like 54 FPS.
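    To put rough numbers on that, here is a quick illustrative sketch only, using the 54 FPS and 60 Hz figures from the example above (none of this is AMD’s code):

    Code (C++):

    // Illustrative only: the 54 FPS / 60 Hz numbers come from the example above.
    #include <algorithm>
    #include <cstdio>

    int main() {
        const double fps = 54.0;       // game render rate
        const double fixed_hz = 60.0;  // conventional fixed-refresh monitor

        // With vsync on a fixed 60 Hz panel the display still refreshes 60 times a
        // second but only 54 new frames arrive, so roughly 6 refreshes per second
        // repeat the previous frame; those irregular repeats read as stutter.
        // Turning vsync off instead swaps frames mid-scan, which reads as tearing.
        const double repeats = std::max(0.0, fixed_hz - fps);
        printf("Fixed %.0f Hz + vsync: about %.0f of %.0f refreshes repeat a frame each second\n",
               fixed_hz, repeats, fixed_hz);

        // With adaptive sync the display simply refreshes whenever a frame is ready,
        // so it runs at 54 Hz and every refresh carries a new frame.
        printf("Adaptive sync: the display follows the game at %.0f Hz, no repeats\n", fps);
        return 0;
    }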

    FreeSync requires a few modifications to the display’s internal controllers, and also a compatible graphics processor, to function. Nvidia’s competing technology that achieves similar results, G-Sync, uses an expensive proprietary controller module. FreeSync is an open standard, and was adopted as the official VESA Adaptive Sync standard, so any display controller manufacturer can implement the technology.

    The core technology of FreeSync is just this one feature: adaptive sync. Display manufacturers are able to integrate FreeSync into their displays through whatever means they like, provided it passes adaptive sync validation.

    A monitor certified as FreeSync compatible only means that monitor supports adaptive sync; there’s no extra validation for screen quality or other features, so just because a monitor has a FreeSync logo on the box doesn’t necessarily mean it’s a high quality product.

    What is FreeSync 2?
    And this is where FreeSync 2 comes in. It’s not a replacement for the original FreeSync, and it’s not really a direct successor, so the name ‘FreeSync 2’ is a bit misleading. What it does provide, though, are additional features on top of the original FreeSync feature set. Every FreeSync 2 monitor is validated to have these additional features, so the idea is that a customer shopping for a gaming monitor can buy one with a FreeSync 2 badge knowing it’s of a higher quality than standard FreeSync monitors.

    Both FreeSync and FreeSync 2 will coexist in the market. While the naming scheme doesn’t suggest it, FreeSync 2 is effectively AMD’s brand for premium monitors validated to a higher standard, while FreeSync is the mainstream option.

    You’re not getting old technology by purchasing a monitor with original FreeSync tech; in fact, the way adaptive sync works in FreeSync and FreeSync 2 is identical. Instead, FreeSync monitors simply miss out on the more premium features offered through FreeSync 2.

    What are these new features? Well, it breaks down into three main areas: high dynamic range, low framerate compensation, and low latency.


    FreeSync 2: High Dynamic Range
    Let’s tackle HDR support first. When AMD originally announced FreeSync 2 they went into detail on how their implementation was going to differ from a standard HDR pipeline. FreeSync 2’s HDR tone mapping was supposed to use calibration and specification data sent from the monitor to the PC to simplify the tone mapping process.

    The idea was that games themselves would tone map directly to what the display was capable of presenting, with the FreeSync 2 transport passing the data straight to the monitor without the need for further processing on the monitor itself. This was in contrast to standard HDR tone mapping pipelines, where games tone map to an intermediary format and the display then figures out how to map that to its own capabilities. Having the games do the bulk of the HDR tone mapping work was supposed to reduce latency, which is an issue with HDR gaming.
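    To make the contrast concrete, here’s a minimal sketch of the idea. The DisplayCaps struct and the Reinhard-style curve are purely illustrative stand-ins; the real calibration data travels through AMD’s AGS library, whose API isn’t reproduced here.

    Code (C++):

    // Generic sketch of "tone map in the game, straight to the panel's capabilities".
    // DisplayCaps is a made-up stand-in for the calibration data a FreeSync 2 monitor
    // reports; the curve below is a simple Reinhard-style rolloff, used only to
    // illustrate the idea.
    #include <cstdio>
    #include <initializer_list>

    struct DisplayCaps {
        double min_nits;  // lowest luminance the panel can show
        double max_nits;  // peak luminance the panel can show
    };

    // Map a scene-referred luminance (in nits) directly onto the panel's range.
    double tone_map(double scene_nits, const DisplayCaps& caps) {
        const double x = scene_nits / caps.max_nits;  // normalize to the panel's peak
        const double rolled = x / (1.0 + x);          // compress highlights smoothly
        return caps.min_nits + rolled * (caps.max_nits - caps.min_nits);
    }

    int main() {
        const DisplayCaps panel{0.1, 600.0};          // e.g. a DisplayHDR 600 class monitor
        for (double scene : {100.0, 600.0, 4000.0}) {
            printf("scene %.0f nits -> panel %.1f nits\n", scene, tone_map(scene, panel));
        }
        // Because the output already fits the panel's range, the monitor can show it
        // as-is instead of running its own slow tone mapping pass -- which is where
        // the latency saving described above comes from.
        return 0;
    }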

    That’s how AMD detailed FreeSync 2’s HDR implementation back at CES 2017. While it sounded nice in theory, one of the key issues raised at the time was that the games themselves had to tone map specifically to FreeSync 2 displays. This meant games would need to integrate a FreeSync 2 API if this HDR implementation was ever to succeed, and we all know how difficult it is to convince a game developer to integrate a niche technology.

    As FreeSync 2 stands right now, that original HDR implementation isn’t quite ready yet. AMD’s website on FreeSync 2 simply lists the technology as including “support for displaying HDR content,” and there is no mention anywhere of FreeSync 2 supported games. And when you actually use a FreeSync 2 monitor, HDR support relies entirely on Windows 10’s HDR implementation for now, which is improving slowly but isn’t at the same level AMD’s original solution is set to provide in an ideal environment.

    The reason for this is that FreeSync 2 support was only introduced in AMD’s GPU Services (AGS) 5.1.1 in September 2017, so game developers have only had the tools to implement FreeSync 2’s GPU-side tone mapping for a bit over 7 months now. Getting these sorts of technologies implemented in games can take a long time, and right now there’s no word on whether any currently released games have used AGS 5.1.1 in the development process.

    One of the features AMD mentioned as part of their HDR implementation was automatic switching between HDR and SDR modes, so you could game using the full HDR capabilities of your display while returning to a comfortable SDR for desktop apps. Unfortunately this doesn’t seem to be functional at the moment either; instead, FreeSync 2 once again makes use of Windows’ standard HDR implementation, which doesn’t handle the HDR to SDR transition too well.

    However, while the implementation might not be anything special at the moment, FreeSync 2 does guarantee several things relating to HDR. All FreeSync 2 monitors support HDR, so you’re guaranteed to get an HDR-capable monitor if it has a FreeSync 2 badge. FreeSync 2 also ensures you can run both adaptive sync and HDR at the same time for an optimal gaming experience. And finally, AMD states that all FreeSync 2 monitors require “twice the perceptual color space of sRGB for better brightness and contrast.”

    It’s unclear exactly what AMD means by “twice the perceptual color space,” but the idea is a FreeSync 2 monitor would support a larger-than-sRGB gamut and higher brightness than a basic gaming monitor.
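    For what it’s worth, one possible (and admittedly crude) reading is gamut area. The sketch below compares the triangle spanned by the published DCI-P3 primaries against sRGB’s in CIE 1931 xy space; by that measure DCI-P3 only works out to roughly 1.36x sRGB, which hints that AMD’s metric likely also folds in luminance (color volume) rather than chromaticity alone.

    Code (C++):

    // Sketch: comparing gamut triangle areas in CIE 1931 xy space. This is only one
    // possible interpretation of "twice the perceptual color space"; AMD has not
    // published the exact metric, and a truly perceptual measure would likely also
    // account for luminance, not just chromaticity.
    #include <cmath>
    #include <cstdio>

    struct Chromaticity { double x, y; };
    struct Gamut { Chromaticity r, g, b; };

    // Area of the triangle spanned by the three primaries (shoelace formula).
    double area(const Gamut& g) {
        return 0.5 * std::fabs(g.r.x * (g.g.y - g.b.y) +
                               g.g.x * (g.b.y - g.r.y) +
                               g.b.x * (g.r.y - g.g.y));
    }

    int main() {
        const Gamut srgb {{0.640, 0.330}, {0.300, 0.600}, {0.150, 0.060}};
        const Gamut p3   {{0.680, 0.320}, {0.265, 0.690}, {0.150, 0.060}};

        printf("sRGB   xy area: %.4f\n", area(srgb));
        printf("DCI-P3 xy area: %.4f (%.2fx sRGB)\n", area(p3), area(p3) / area(srgb));
        return 0;
    }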

    And it does appear that AMD’s FreeSync 2 validation process is looking for more than just a basic HDR implementation. So far, every FreeSync 2 monitor that’s available or has been announced meets at least the DisplayHDR 400 specification. This is a fairly weak HDR spec but we have seen some non-FreeSync 2 supposedly HDR-capable monitors fail to meet even the DisplayHDR 400 spec, so at least with FreeSync 2 you’re getting a display that meets the new minimum industry standard for monitor HDR.

    Of course, some monitors will exceed DisplayHDR 400, like the original set of Samsung FreeSync 2 monitors such as the CHG70 and CHG90; both of these displays meet the DisplayHDR 600 spec. Ideally I’d have liked to see FreeSync 2 stipulate a DisplayHDR 600 minimum, but 400 nits of peak brightness from DisplayHDR 400 should be fine for an entry-level HDR experience.

    FreeSync 2: Low Input Latency
    The second main FreeSync 2 feature is reduced input latency, which we briefly touched on earlier. HDR processing pipelines have historically introduced a lot of input lag, particularly on the display side; FreeSync 2, however, stipulates low latency processing for both SDR and HDR content. AMD hasn’t published a specific input latency metric they are targeting, but it’s safe to say that 50 to 100 ms of lag, like you might get with a standard HDR TV, would not be acceptable for a gaming monitor.

    The way FreeSync 2 achieves low latency in 2018 appears to rely more on the display side than the original implementation announced at the start of 2017 did.


    As we mentioned when discussing FreeSync 2’s HDR implementation, the original idea was to push all tone mapping into the game engine to cut down on display-side tone mapping, thereby reducing input latency because the display’s slow processor wouldn’t need to get involved as much. As games haven’t started supporting FreeSync 2 yet, today this latency reduction seems to come purely from better processing hardware in the display; for example, current Samsung FreeSync 2 monitors include a ‘low latency’ mode that is automatically enabled when FreeSync 2 is enabled.

    FreeSync 2: Low Framerate Compensation
    The final key feature is low framerate compensation. This is a feature that goes hand-in-hand with adaptive sync, ensuring adaptive sync functions at every framerate from 0 FPS up to the maximum refresh rate supported by the display.

    There is one simple reason why we need low framerate compensation: displays can only vary their refresh rate within a certain window, for example 48 to 144 Hz. If you wanted to run a game below the minimum supported refresh rate, say at 40 FPS when the minimum refresh is 48 Hz, normally you’d be stuck with standard screen tearing or stuttering issues like you’d get with a fixed refresh monitor. That’s because the GPU’s render rate is out of sync with the display refresh rate.

    Low framerate compensation, or LFC, extends the window in which you can sync the render rate to the refresh rate using adaptive sync. When the framerate falls below the minimum refresh rate of the monitor, frames are simply displayed multiple times and the display runs at a multiple of the required refresh rate.

    In our previous example, to display 40 FPS using LFC, every frame is doubled and then this output is synced to the display running at 80 Hz. You can even run games at, say, 13 FPS and have that synced to a refresh rate; in that case the monitor would run at 52 Hz (to exceed the 48 Hz minimum) and then every frame would be displayed 4 times.

    The end result is LFC effectively removes the minimum refresh rate of adaptive sync displays, but for LFC to be supported, the monitor needs to have a maximum refresh rate that is at least double the minimum refresh rate. This is why not all FreeSync monitors support LFC; some come with just 48 to 75 Hz refresh windows, which doesn’t meet the criteria for LFC. However in the case of FreeSync 2, every monitor validated for this spec will support LFC so you won’t have to worry about the minimum refresh rate of the monitor.
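    Here’s how that multiplier logic looks as a small sketch. The 48-144 Hz window and the 40 FPS and 13 FPS cases are the examples above; the selection code itself is just an illustration of the concept, not AMD’s driver logic.

    Code (C++):

    // Sketch of the LFC idea: when the frame rate drops below the display's minimum
    // refresh, show each frame N times and run the panel at fps * N, where N is the
    // smallest multiplier that lands inside the supported window.
    #include <cstdio>
    #include <initializer_list>

    struct VrrWindow { double min_hz, max_hz; };

    // LFC is only possible when the window is wide enough to double the minimum refresh.
    bool supports_lfc(const VrrWindow& w) { return w.max_hz >= 2.0 * w.min_hz; }

    // Smallest whole multiple of fps that reaches the window's minimum refresh.
    int frame_repeats(double fps, const VrrWindow& w) {
        int n = 1;
        while (fps * n < w.min_hz) ++n;
        return n;
    }

    int main() {
        const VrrWindow wide{48.0, 144.0};    // the 48-144 Hz example above
        const VrrWindow narrow{48.0, 75.0};   // a 48-75 Hz window: too narrow for LFC

        printf("48-144 Hz window supports LFC: %s\n", supports_lfc(wide) ? "yes" : "no");
        printf("48-75 Hz window supports LFC: %s\n", supports_lfc(narrow) ? "yes" : "no");

        for (double fps : {40.0, 13.0}) {     // the example frame rates above
            const int n = frame_repeats(fps, wide);
            printf("%.0f FPS -> each frame shown %dx, display runs at %.0f Hz\n",
                   fps, n, fps * n);
        }
        return 0;
    }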

    Current FreeSync 2 Monitors
    This wouldn’t be a look at FreeSync 2 in 2018 without exploring what FreeSync 2 monitors are actually available right now, and what monitors are coming.

    Currently there are only three FreeSync 2 monitors on the market, and all are from Samsung’s Quantum Dot line-up: the C27HG70 and C32HG70 as 27- and 32-inch 1440p 144Hz monitors respectively, along with the stupidly wide C49HG90, a double-1080p 144Hz monitor. All three are DisplayHDR 600 certified.

    -> https://www.techspot.com/article/1630-freesync-2-explained/
     
    Embra and Maddness like this.
  4. Maddness

    Maddness Ancient Guru

    Messages:
    1,644
    Likes Received:
    812
    GPU:
    3080 Aorus Xtreme
    I want a 27" 1440p 25-200Hz FreeSync 2 monitor with HDR. Preferably OLED too.
     

  5. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    You don't need a range as low as 15-25 Hz, because of LFC (mine is 30-70, OC'ed to 30-74): with frame doubling the effective low is 30/2 = 15, so I have 15-74 Hz FreeSync.
    That's why lots of new FS monitors have 40-144 Hz ranges and so on; effectively that reads as 20-144 Hz.
     
    Last edited: May 19, 2018
    Maddness likes this.
  6. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
  7. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    AMD/ATI Computex Press Conference

    @ 8 days 00h
     
    Last edited: May 29, 2018
  8. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    AMD/ATI has slightly increased its GPU market share in the first quarter of 2018; Intel remains the king.

    Jon Peddie Research, the market research firm for the graphics industry, has released its Q1 2018 market report. According to the report, year-to-year total GPU shipments increased 3.4%, desktop graphics increased 14%, and notebook graphics decreased 3%.
    And even though AMD has slightly increased its GPU market share, overall GPU shipments decreased 10% from last quarter (AMD down 6%, Nvidia down 10%, and Intel down 11%).

    As noted, AMD increased its market share again this quarter, benefiting from new products for workstations and cryptocurrency mining, while Nvidia held steady.
    On the other hand, Intel’s market share slightly decreased, although it remains the dominant GPU maker.

    Over three million add-in boards (AIBs), worth $776 million, were sold to cryptocurrency miners in 2017. In the first quarter of 2018, an additional 1.7 million were sold.

    The attach rate of GPUs (including integrated and discrete) to PCs for the quarter was 140%, up 5.75% from last quarter.
    Discrete GPUs were in 39.11% of PCs, up 2.23%, while the overall PC market decreased 14.12% quarter-to-quarter and increased 0.46% year-to-year.

    -> https://www.jonpeddie.com/press-releases/gpu-market-increased-year-to-year-by-3.4
     
  9. Maddness

    Maddness Ancient Guru

    Messages:
    1,644
    Likes Received:
    812
    GPU:
    3080 Aorus Xtreme
    I would love them to release info on a refreshed Vega series card. Either that or some info on Navi and the Threadripper refresh.
     
  10. warlord

    warlord Ancient Guru

    Messages:
    2,761
    Likes Received:
    927
    GPU:
    Null
    I need a faster GPU under $400-450 with at least 8GB of VRAM, not fiasco GPUs with only 4GB. Can AMD bring me something strong with rich features? The Fury/580 were/are disappointing, and Vega is overpriced as hell at the moment.
     

  11. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,502
    Likes Received:
    2,891
    GPU:
    MSI 6800 "Vanilla"
    Yeah, Vega is only worth it with a discount. Prices are starting to come down overall, but not to those (MSRP) levels; about $100 over MSRP has been possible lately, but those cards sell out fast.
    (And the prices fluctuate heavily.)

    A 50-60% performance gain depending on the title, but at $600-800 or more depending on country it's still not a great investment, especially compared to the competition, the 1070 Ti and 1080, if you're not using FreeSync or have some other reason to stick with AMD. :)
    (The 390X already comes in 8 GB versions too, so it's not as badly VRAM-bottlenecked as the Fury can be, though there are 4 GB versions of it and of Polaris as well.)

    If you're an overclocker, Vega 56 is also a bit riskier since many newer production runs use Hynix memory instead of Samsung; these HBM2 modules don't clock as well even if you flash the GPU with a compatible Vega 64 BIOS and feed it some more voltage. Vega 64 is also still going to vary depending on overall quality, so some cards clock better than others for RAM and GPU.
    (Undervolting and balancing power draw and temps tend to really help; RAM speed can see gains up to 1000 MHz for Samsung if stable, and then it drops off a bit.)

    Even the 480 and 580 Polaris GPUs had a price increase until recently, although they're more readily available at least.
     
    Maddness and warlord like this.
  12. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    I really wish you'd get that 1080 Ti and pair it with your X4 860K.
    And then I hope you'll keep that combo till 1080 Ti performance becomes affordable as an entry-level GPU.
     
    Embra and user1 like this.
  13. RzrTrek

    RzrTrek Ancient Guru

    Messages:
    2,523
    Likes Received:
    716
    GPU:
    RX 580 ♥ MESA 21.1
    I would be surprised if AMD could deliver something even remotely close to the GTX 1080 Ti for the same price without needing twice the amount of power in 2018.
     
  14. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    I'm not surprised, I'm patient :D
     
  15. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    AMD Radeon RX Vega 56 in a notebook

    Is this a full-fat desktop variant of the Radeon RX Vega 56? Well, it seems unlikely; you wouldn't expect a 210W graphics card in a mobile form factor. It could be downclocked or optimized for mobile use, or be a proper mobile variant (which AMD confirmed a few months ago).

    Acer did not reveal any details about the graphics card itself, but Vega 56 clearly stands for 56 Compute Units. It is offered as an alternative to the GTX 1070 variant, which itself is a 115-120W graphics solution.

    The Acer Predator Helios 500 notebook will be offered in a few main variants, including Intel Core i9-8950HK + NVIDIA GTX 1070 and AMD Ryzen 7 2700 + AMD Radeon RX Vega 56. This, ladies and gentlemen, is the first true high-performance AMD-only laptop in years.

    The Intel+NVIDIA option will support G-Sync technology, while the AMD solution will give you FreeSync support. All Helios 500 laptops are equipped with Full HD 144Hz panels.

    Prices of Predator Helios 500 notebooks start from 2000 USD (i7-8750H variant). Unfortunately, pricing of i7-8950HK / Ryzen 2700 models has not yet been confirmed.


     

  16. Maddness

    Maddness Ancient Guru

    Messages:
    1,644
    Likes Received:
    812
    GPU:
    3080 Aorus Xtreme
    Looks pretty sweet. Will be interesting to see the numbers.
     
  17. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    AMD/ATI Computex Press Conference

    Counter:
    @ 6 days 22h
     
    Last edited: May 30, 2018
    Maddness likes this.
  18. user1

    user1 Ancient Guru

    Messages:
    1,704
    Likes Received:
    586
    GPU:
    hd 6870
    Hopefully they have something cool to show. Kinda miss the days when we would get big announcements (Fermi, Tahiti, R300, G80); everything seems rather incremental over the past 5 years.
     
    Maddness likes this.
  19. OnnA

    OnnA Ancient Guru

    Messages:
    13,506
    Likes Received:
    3,436
    GPU:
    3080Ti VISION OC
    Yup, it's because we have a bit of next-gen API implementation stagnation.
    We need to wait for the next-gen consoles; then 90% of games will be built with only DX12/VLK in mind.
    The hardware is here, look at VLK DOOM & Wolf in 4K ;)


    "- I'm a software engineer that does distributed computing at the operating system level these days, but i once worked on a game engine (FreeSpace 2 Open project) so i have some experience there.

    The DX12 paradigm of asynchronous computing is powerful crap, the problem is that it's also really hard - like honestly beyond the ability of most game dev shops average devs.

    = You're actually correct. sometimes new generations of the APIs have pretty big changes in how you have to design your program over all to be able to take advantage of them properly. The jump from DX9 to 10/11 was like that and the jump from 11 to 12 is again like that.

    if you write your game in the DX11 fashion but use DX12 you're going to see very little gains - a little more free CPU. but if you write your game from the ground up in the DX12 fashion you could see some pretty big gains on machines with higher processor core counts, etc."
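    To make that last point a bit more concrete, here's a small API-agnostic sketch (plain std::thread and a made-up Command struct, not the real D3D12 interfaces) of the "DX12 fashion" described above: each worker thread records its own command list in parallel and the main thread submits them in order, instead of funnelling everything through a single immediate context as in the DX11 style.

    Code (C++):

    // API-agnostic sketch of DX12-style parallel command recording. "Command" and
    // "record_scene_chunk" are made-up stand-ins, not D3D12 types; the point is the
    // structure: one command list per thread, recorded concurrently, submitted in order.
    #include <cstdio>
    #include <functional>
    #include <string>
    #include <thread>
    #include <vector>

    struct Command { std::string description; };
    using CommandList = std::vector<Command>;

    // Each worker records the draw commands for its slice of the scene into its own list.
    void record_scene_chunk(int chunk, CommandList& out) {
        for (int i = 0; i < 3; ++i) {
            out.push_back({"draw " + std::to_string(i) + " of chunk " + std::to_string(chunk)});
        }
    }

    int main() {
        const int workers = 4;                    // scales with CPU core count
        std::vector<CommandList> lists(workers);  // one command list per worker thread
        std::vector<std::thread> threads;

        // DX12 style: recording happens concurrently, with no shared immediate context.
        for (int i = 0; i < workers; ++i) {
            threads.emplace_back(record_scene_chunk, i, std::ref(lists[i]));
        }
        for (std::thread& t : threads) t.join();

        // Submission stays ordered and cheap: the queue just consumes the finished lists.
        for (const CommandList& list : lists) {
            for (const Command& c : list) {
                printf("submit: %s\n", c.description.c_str());
            }
        }
        return 0;
    }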

    -> https://www.reddit.com/r/pcgaming/comments/7inibd/the_state_of_dx12_games_in_2017/

    -> https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support

    More than 32 DX12 games are out there (some games only got their DX12 patches recently, tho)
    and 4 VLK games (more on Linux).

    Next-gen API adoption in games is 400% better than DX11 managed in the first 2 years of its debut :D
    True DX12 has been with us since last autumn (the DX12.1 patch, with SM/VM up to 6.1).
    DX11.0 has SM 5.0; DX11.1 & 11.2 have SM 5.1 (used in all Frostbite games since BF4, NFS15).

     
    Last edited: May 30, 2018
    user1 likes this.
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    I did read that when you are exposed to fatal radiation, your eyeballs start to glow from the inside. Is that an attempt to show what a person exposed to such a level of radiation sees?

    Or maybe it is just cataracts.
     
    OnnA likes this.
