New Upcoming ATI/AMD GPUs Thread: Leaks, Hopes & Aftermarket GPUs

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

Thread Status:
Not open for further replies.
  1. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
     
    Last edited: May 13, 2018
  2. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    TrueAudio Next


    Do you remember AMD's TrueAudio Next technology? If not, you'd be forgiven; it hasn't gained as much traction as it could (should?) have, considering its open nature. As a quick reminder, this is AMD's GPU-accelerated audio pipeline, which adds "audio raytracing" capabilities by delivering true spatial positioning and object interactions in a given scene - at much higher performance than the usual CPU-based solutions.

    The 1.2 version is being hailed as a "coming of age" for TrueAudio Next, as it includes "a number of notable performance and feature improvements, and it reflects the enhancements built into the version of TAN supported in Steam Audio." Efficiency has also been improved, with minimized "memory, buffer transfer and synchronization overhead". The remainder of the blog post by AMD's Fellow Design Engineer Carl Wakeland follows:

    "The TAN GPU utilities library now supports AMD Resource Reservation, in which a configurable part of the GPU may be reserved for audio processing apart from the normal GPU compute resources. As explained in earlier blogs, Resource Reservation protects audio and graphics queues and compute resources from blocking each other, allowing them to coexist on the GPU as never before possible. Developers can now call a function to query a system's TAN support and available resources, as well.

    Finally, a number of new samples are added to exemplify and streamline the process of building audio applications using TAN:
    • Accelerated mixing. Mixing on the GPU with TAN can minimize buffer transfer overhead.
    • 10-band EQ.
    • IIR (Infinite Impulse Response) filter.
    • Time-domain convolution and Doppler sample.

    We continue to work on exciting new optimizations for future releases. Meanwhile, we welcome contributions from others - please feel free to make a pull request to submit your own examples and optimizations for TAN."
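
    For a sense of what that "time-domain convolution" sample does, here is a minimal CPU-side sketch in Python/NumPy. This is only the underlying DSP operation, not the actual TAN API, and the signal lengths and names are illustrative assumptions:

    Code:
    import numpy as np

    def convolve_reverb(dry, impulse_response):
        # Time-domain convolution of a dry signal with a room impulse
        # response: the core operation TAN accelerates on the GPU.
        # np.convolve is O(N*M), which is why long impulse responses make
        # CPU convolution expensive and GPU offload attractive.
        wet = np.convolve(dry, impulse_response)
        return wet / np.max(np.abs(wet))  # normalize to avoid clipping

    # Toy example: 1 second of noise through a decaying 0.5 s "room" IR.
    sr = 48000
    dry = np.random.randn(sr).astype(np.float32)
    ir = (np.exp(-np.linspace(0, 8, sr // 2)) *
          np.random.randn(sr // 2)).astype(np.float32)
    out = convolve_reverb(dry, ir)
    print(out.shape)  # (71999,) = len(dry) + len(ir) - 1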

     
  3. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    AMD’s RX Vega 56 finally gets miniaturised :D
    Meet Nano 56

    That’s where PowerColor’s Nano Edition card differs, as Hardwareluxx report. It’s the first design to offer a cut-down PCB and a single-fan cooler for mini-ITX builds, and while this isn’t the first time we’ve heard about it, the card will supposedly make an appearance at Computex at the start of June. It seems only an RX Vega 56 version will be available for the time being.

    That might be with good reason, too. AMD cards are pretty power-hungry, and with higher consumption come higher temperatures.
    The single axial fan of the PowerColor Nano Edition is going to have to be well designed to cope with all the heat the RX Vega 56 is capable of pumping out.

    It’s also a rather bland shroud design; even the rather tame R9 Nano miniature graphics cards from a few years back stand out with their reference flair a little more than the PowerColor RX Vega variant.
    The only features that break up the shroud design are the 8+6 pin power connectors.

    Hopefully we will get to see more in the way of performance numbers from Computex to see how this ITX graphics card really copes. While other GPUs have been returning closer to MSRP in recent weeks,
    AMD’s latest graphics cards are still relatively pricey versus comparably performing cards. However, there is some hope that PowerColor’s miniature RX Vega 56 may be affordable,
    as their Red Dragon RX Vega 56 is one of the only RX Vega 56 cards on the market right now for under £500.
     
  4. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    15,623
    Likes Received:
    1,620
    GPU:
    Sapphire 5700XT P.
    Nice to see another Nano Vega, and this one is in a true mini-ITX form factor too, although I wonder how well it cools as a result.
    Going by the PCB teardown (as it's called), the board is fairly similar to the standard PCB but with fewer voltage phases, so it's still plenty powerful. But with everything packed closer together, heat build-up is a bigger concern, and some of the components don't handle heat very well, which is probably why the other Nano variants opted for an extended cooler instead of keeping a smaller form factor. :)

    Shame these, much like any other Vega GPU, fluctuate a lot in pricing, and overall availability can be pretty low. It would be nice to see more reviews of these, but I guess that might be a bit difficult if there are no cards in stock ha ha.
     

  5. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
  6. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
  7. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    What is FreeSync?

    Over the past few weeks we’ve been looking into and exploring the world of FreeSync 2. Now this isn’t a new technology – it was announced at CES 2017 – but it’s only now that we’re starting to see the FreeSync 2 ecosystem expand with new display options. As HDR and wide-gamut monitors become more of a reality over the next year, there’s no better time to discuss FreeSync 2 than now.

    And there’s a fair bit of confusion around what FreeSync 2 really is, how it functions, and how it differs from the original iteration of FreeSync.

    This article will explore and explain FreeSync 2 as the technology currently stands, as it’s a little different to the tech AMD announced more than a year ago. Our detailed impressions of using a FreeSync 2 monitor will come next week.

    What is FreeSync?
    Here’s a quick refresher on the original FreeSync. The name FreeSync is a brand name that refers to AMD’s implementation of adaptive synchronization technology. It essentially allows a display to vary its refresh rate to match the render rate of a graphics processor, so that, for example, a game running at 54 FPS is displayed at 54 Hz, and when that game bumps up to 63 FPS the display also shifts to 63 Hz. This reduces stuttering and screen tearing compared to a monitor operating at a fixed refresh rate, say 60 Hz, displaying a game running at an unmatched render rate like 54 FPS.
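
    To make that concrete, here's a toy sketch (plain Python; the 48-144 Hz window is a hypothetical example, and this is not any AMD API) of how a basic adaptive sync display follows the render rate:

    Code:
    def adaptive_sync_refresh(fps, vrr_min=48, vrr_max=144):
        # Within the VRR window the display simply follows the render rate;
        # outside it the rate is clamped and tearing or stutter can return
        # (LFC, covered later in the article, addresses the low end).
        return max(vrr_min, min(fps, vrr_max))

    print(adaptive_sync_refresh(54))   # 54 FPS -> panel runs at 54 Hz
    print(adaptive_sync_refresh(63))   # 63 FPS -> panel runs at 63 Hz
    print(adaptive_sync_refresh(40))   # clamped to 48 Hz without LFC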

    FreeSync requires a few modifications to the display’s internal controllers, and also a compatible graphics processor, to function. Nvidia’s competing technology that achieves similar results, G-Sync, uses an expensive proprietary controller module. FreeSync is an open standard, and was adopted as the official VESA Adaptive Sync standard, so any display controller manufacturer can implement the technology.

    The core technology of FreeSync is just this one feature: adaptive sync. Display manufacturers are able to integrate FreeSync into their displays through whatever means they like, provided it passes adaptive sync validation.

    A monitor certified as FreeSync compatible only means that monitor supports adaptive sync; there’s no extra validation for screen quality or other features, so just because a monitor has a FreeSync logo on the box doesn’t necessarily mean it’s a high quality product.

    What is FreeSync 2?
    And this is where FreeSync 2 comes in. It’s not a replacement for the original FreeSync, and it’s not really a direct successor, so the name ‘FreeSync 2’ is a bit misleading. What it does provide, though, are additional features on top of the original FreeSync feature set. Every FreeSync 2 monitor is validated to have these additional features, so the idea is that a customer shopping for a gaming monitor can buy one with a FreeSync 2 badge knowing it’s of a higher quality than standard FreeSync monitors.

    Both FreeSync and FreeSync 2 will coexist in the market. While the naming scheme doesn’t suggest it, FreeSync 2 is effectively AMD’s brand for premium monitors validated to a higher standard, while FreeSync is the mainstream option.

    You’re not getting old technology by purchasing a monitor with original FreeSync tech; in fact, the way adaptive sync works in FreeSync and FreeSync 2 is identical. Instead, FreeSync monitors simply miss out on the more premium features offered through FreeSync 2.

    What are these new features? Well, it breaks down into three main areas: high dynamic range, low framerate compensation, and low latency.


    FreeSync 2: High Dynamic Range
    Let’s tackle HDR support first. When AMD originally announced FreeSync 2, they went into detail on how their implementation was going to differ from a standard HDR pipeline. FreeSync 2’s HDR tone mapping was supposed to use calibration and specification data sent from the monitor to the PC to simplify the tone mapping process.

    The idea was that the games themselves would tone map directly to what the display was capable of presenting, with the FreeSync 2 transport passing the data straight to the monitor without the need for further processing on the monitor itself. This was in contrast to standard HDR tone mapping pipelines that see games tone map to an intermediary format before the display then figures out how to tone map it to its capabilities. Having the games do the bulk of the HDR tone mapping work was supposed to reduce latency, which is an issue with HDR gaming.
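
    As a rough illustration of what “tone mapping directly to the display” means, here is a toy Reinhard-style operator in Python. The function and the nit values are illustrative assumptions, not AMD’s FreeSync 2 API:

    Code:
    def tonemap_to_display(scene_nits, display_peak_nits=600.0):
        # Toy Reinhard-style tone map: compress an HDR scene luminance
        # (in nits) into the range the panel can actually reproduce.
        # In the originally announced FreeSync 2 pipeline the game would
        # do this once, against the panel's real peak brightness reported
        # to the PC, so the monitor needs no second tone-mapping pass.
        x = scene_nits / display_peak_nits
        return display_peak_nits * (x / (1.0 + x))

    # A 4000-nit highlight gets compressed to fit a 600-nit panel:
    print(round(tonemap_to_display(4000.0), 1))  # ~521.7 nits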

    That’s how AMD detailed FreeSync 2’s HDR implementation back at CES 2017. While it sounded nice in theory, one of the key issues raised at the time was that the games themselves had to tone map specifically to FreeSync 2 displays. This meant games would need to integrate a FreeSync 2 API if this HDR implementation was ever to succeed, and we all know how difficult it is to convince a game developer to integrate a niche technology.

    As FreeSync 2 stands right now, that original HDR implementation isn’t quite ready yet. AMD’s website on FreeSync 2 simply lists the technology as including “support for displaying HDR content,” and there is no mention anywhere of FreeSync 2 supported games. And when you actually use a FreeSync 2 monitor, HDR support relies entirely on Windows 10’s HDR implementation for now, which is improving slowly but isn’t at the level AMD’s original solution was set to provide in an ideal environment.

    The reason for this is that FreeSync 2 support was only introduced in AMD’s GPU Services (AGS) library version 5.1.1 in September 2017, so game developers have only had the tools to implement FreeSync 2’s GPU-side tone mapping for a bit over 7 months now. Getting these sorts of technologies implemented in games can take a long time, and right now there’s no word on whether any currently released games have used AGS 5.1.1 in the development process.

    One of the features AMD mentioned as part of their HDR implementation was automatic switching between HDR and SDR modes, so you could game using the full HDR capabilities of your display while returning to a comfortable SDR for desktop apps. Unfortunately, this doesn’t seem to be functional at the moment either; instead, FreeSync 2 once again makes use of Windows’ standard HDR implementation, which doesn’t handle the HDR to SDR transition too well.

    However, while the implementation might not be anything special at the moment, FreeSync 2 does guarantee several things relating to HDR. All FreeSync 2 monitors support HDR, so you’re guaranteed to get an HDR-capable monitor if it has a FreeSync 2 badge. FreeSync 2 also ensures you can run both adaptive sync and HDR at the same time for an optimal gaming experience. And finally, AMD states that all FreeSync 2 monitors require “twice the perceptual color space of sRGB for better brightness and contrast.”

    It’s unclear exactly what AMD means by “twice the perceptual color space,” but the idea is a FreeSync 2 monitor would support a larger-than-sRGB gamut and higher brightness than a basic gaming monitor.

    And it does appear that AMD’s FreeSync 2 validation process is looking for more than just a basic HDR implementation. So far, every FreeSync 2 monitor that’s available or has been announced meets at least the DisplayHDR 400 specification. This is a fairly weak HDR spec but we have seen some non-FreeSync 2 supposedly HDR-capable monitors fail to meet even the DisplayHDR 400 spec, so at least with FreeSync 2 you’re getting a display that meets the new minimum industry standard for monitor HDR.

    Of course, some monitors will exceed DisplayHDR 400, like the original set of Samsung FreeSync 2 monitors such as the CHG70 and CHG90; both of these displays meet the DisplayHDR 600 spec. Ideally I’d have liked to see FreeSync 2 stipulate a DisplayHDR 600 minimum, but 400 nits of peak brightness from DisplayHDR 400 should be fine for an entry-level HDR experience.

    FreeSync 2: Low Input Latency
    The second main FreeSync 2 feature is reduced input latency, which we briefly touched on earlier. HDR processing pipelines have historically introduced a lot of input lag, particularly on the display side; however, FreeSync 2 stipulates low latency processing for both SDR and HDR content. AMD hasn’t published a specific metric they are targeting for input latency, but it’s safe to say 50 to 100ms of lag like you might get with a standard HDR TV would not be acceptable for a gaming monitor.

    How FreeSync 2 achieves low latency in 2018 appears to rely more on the display side than on the original game-side implementation announced at the start of 2017.


    As we mentioned when discussing FreeSync 2’s HDR implementation, the original idea was to push all tone mapping into the game engine to cut down on display-side tone mapping, thereby reducing input latency, as the display’s slow processor wouldn’t need to get involved as much. As games haven’t started supporting FreeSync 2 yet, today this latency reduction seems to come purely from better processing hardware in the display; for example, current Samsung FreeSync 2 monitors include a ‘low latency’ mode that is automatically enabled when FreeSync 2 is enabled.

    FreeSync 2: Low Framerate Compensation
    The final key feature is low framerate compensation. This is a feature that goes hand-in-hand with adaptive sync, ensuring adaptive sync functions at every framerate from 0 FPS up to the maximum refresh rate supported by the display.

    There is one simple reason why we need low framerate compensation: displays can only vary their refresh rate within a certain window, for example 48 to 144 Hz. If you wanted to run a game below the minimum supported refresh rate, say at 40 FPS when the minimum refresh is 48 Hz, normally you’d be stuck with standard screen tearing or stuttering issues like you’d get with a fixed refresh monitor. That’s because the GPU’s render rate is out of sync with the display refresh rate.

    Low framerate compensation, or LFC, extends the window in which you can sync the render rate to the refresh rate using adaptive sync. When the framerate falls below the minimum refresh rate of the monitor, frames are simply displayed multiple times and the display runs at a multiple of the required refresh rate.

    In our previous example, to display 40 FPS using LFC, every frame is doubled and then this output is synced to the display running at 80 Hz. You can even run games at, say, 13 FPS and have that synced to a refresh rate; in that case the monitor would run at 52 Hz (to exceed the 48 Hz minimum) and then every frame would be displayed 4 times.

    The end result is that LFC effectively removes the minimum refresh rate of adaptive sync displays, but for LFC to be supported, the monitor needs a maximum refresh rate at least double its minimum refresh rate. This is why not all FreeSync monitors support LFC; some come with just 48 to 75 Hz refresh windows, which doesn’t meet the criteria. In the case of FreeSync 2, however, every monitor validated for the spec supports LFC, so you won’t have to worry about the minimum refresh rate of the monitor.
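
    The frame-multiplication logic is simple enough to sketch. Here is an illustrative Python version of the idea; the real driver heuristics are more involved (e.g. smoothing around the window boundary), so treat this as a sketch, not AMD’s algorithm:

    Code:
    import math

    def lfc_output(fps, vrr_min=48, vrr_max=144):
        # Pick a frame-repeat multiplier so the effective refresh rate
        # lands back inside the display's VRR window.
        if fps >= vrr_min:
            return 1, min(fps, vrr_max)        # normal adaptive sync
        multiplier = math.ceil(vrr_min / fps)  # repeats needed per frame
        return multiplier, fps * multiplier

    print(lfc_output(40))  # (2, 80): each frame shown twice, panel at 80 Hz
    print(lfc_output(13))  # (4, 52): each frame shown 4 times, panel at 52 Hz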

    Current FreeSync 2 Monitors
    This wouldn’t be a look at FreeSync 2 in 2018 without exploring what FreeSync 2 monitors are actually available right now, and what monitors are coming.

    Currently there are only three FreeSync 2 monitors on the market, and all are from Samsung’s Quantum Dot line-up: the C27HG70 and C32HG70 as 27- and 32-inch 1440p 144Hz monitors respectively, along with the stupidly wide C49HG90, a double-1080p 144Hz monitor. All three are DisplayHDR 600 certified.

    -> https://www.techspot.com/article/1630-freesync-2-explained/
     
    Embra and Maddness like this.
  8. Maddness

    Maddness Master Guru

    Messages:
    946
    Likes Received:
    211
    GPU:
    EVGA RTX 2080Ti FTW
    I want a 27" 1440p 25-200hz Freesync 2 monitor with HDR. Preferably OLED to.
     
  9. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    You don't need it as low as 15-25, because of LFC (mine is 30-70, OC'ed to 30-74): with frame doubling the effective low is 30/2 = 15, so I have 15-74 FreeSync.
    That's why lots of new FS monitors have 40-144 Hz windows and so on... effectively that reads as 20-144 Hz.
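
    Plugging that window into the LFC sketch from the article post above (again purely illustrative):

    Code:
    # A 30-74 Hz window: a 20 FPS game gets frame-doubled to 40 Hz,
    # so adaptive sync keeps working below the panel's 30 Hz floor.
    print(lfc_output(20, vrr_min=30, vrr_max=74))  # (2, 40)
    print(lfc_output(16, vrr_min=30, vrr_max=74))  # (2, 32)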
     
    Last edited: May 19, 2018
    Maddness likes this.
  10. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD

  11. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    AMD/ATI Computex Press Conference

    @ 8 days 00h
     
    Last edited: May 29, 2018
  12. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    AMD/ATI slightly increased its GPU market share in the first quarter of 2018, while Intel remains the king.

    Jon Peddie Research, the market research firm for the graphics industry, has released its Q1 2018 market report. According to the report, year-to-year total GPU shipments increased 3.4%, desktop graphics increased 14%, and notebooks decreased 3%.
    And even though AMD slightly increased its GPU market share, overall GPU shipments decreased 10% from last quarter (AMD down 6%, Nvidia down 10%, and Intel down 11%).

    As said, AMD increased its market share again this quarter, benefiting from new products for workstations and cryptocurrency mining, while Nvidia held steady.
    On the other hand, Intel’s market share slightly decreased, although it remains the dominant GPU maker.

    Over three million add-in boards (AIBs), worth $776 million, were sold to cryptocurrency miners in 2017. In this year’s first quarter, an additional 1.7 million were sold.

    The attach rate of GPUs (including integrated and discrete) to PCs for the quarter was 140%, up 5.75% from last quarter.
    Discrete GPUs were in 39.11% of PCs, up 2.23%; the overall PC market decreased 14.12% quarter-to-quarter and increased 0.46% year-to-year.

    -> https://www.jonpeddie.com/press-releases/gpu-market-increased-year-to-year-by-3.4
     
  13. Maddness

    Maddness Master Guru

    Messages:
    946
    Likes Received:
    211
    GPU:
    EVGA RTX 2080Ti FTW
    I would love them to release info on a refreshed Vega series card. Either that or some info on Navi and the Threadripper refresh.
     
  14. warlord

    warlord Ancient Guru

    Messages:
    2,370
    Likes Received:
    778
    GPU:
    Null
    I need a faster GPU under $400-450 with at least 8GB VRAM, not fiasco GPUs with only 4GB. Can AMD bring me something strong with rich features? Fury/580 were/are disappointing, and Vega is overpriced as hell at the moment.
     
  15. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    15,623
    Likes Received:
    1,620
    GPU:
    Sapphire 5700XT P.
    Yeah, Vega is only worth it with a discount. They're starting to come down more in price overall, but not to MSRP levels; about $100 over has been possible lately, but they sell out fast.
    (And the prices fluctuate heavily.)

    A 50-60% performance gain depending on title, but at $600-800 or more depending on country it's still not a great investment, especially compared to the competition (the 1070 Ti and 1080) if you're not using FreeSync or have some other reason to stick with AMD. :)
    (The 390X already comes in 8 GB versions too, so it's not as badly VRAM-bottlenecked as the Fury can be, though there are 4 GB versions of it and of Polaris as well.)

    If you are an overclocker, Vega 56 is also a bit riskier, since many newer production runs use Hynix memory instead of Samsung; these HBM2 modules don't clock as well even if you flash the GPU with a compatible Vega 64 BIOS and push some more voltage. Vega 64 is still going to vary with overall chip quality too, so some cards clock better than others for RAM and GPU.
    (Undervolting and balancing power draw and temps tends to really help; Samsung RAM can see gains up to 1000 MHz if stable, and beyond that it drops off a bit.)

    Even the 480 and 580 Polaris GPUs had a price increase until recently, although they're more readily available at least.
     
    Maddness and warlord like this.

  16. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,738
    Likes Received:
    2,199
    GPU:
    5700XT+AW@240Hz
    I really wish you'd get that 1080 Ti and pair it with your X4 860K.
    And then I hope you'll keep that combo till 1080 Ti performance becomes affordable as an entry-level GPU.
     
    Embra and user1 like this.
  17. RzrTrek

    RzrTrek Ancient Guru

    Messages:
    2,285
    Likes Received:
    588
    GPU:
    RX 580 8GB ❤ 144hz
    I would be surprised if AMD could deliver something even remotely close to the GTX 1080 Ti for the same price without needing twice the amount of power in 2018.
     
  18. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    I'm not surprised, I'm patient :D
     
  19. OnnA

    OnnA Ancient Guru

    Messages:
    9,455
    Likes Received:
    1,923
    GPU:
    Vega 64 XTX LiQuiD
    AMD Radeon RX Vega 56 in a notebook

    Is this a full-fat desktop variant of the Radeon RX Vega 56? Well, it seems unrealistic to expect a 210W graphics card in a mobile form factor. It could be downclocked or optimized for mobile use, or be a proper mobile variant (which AMD confirmed a few months ago).

    Acer did not reveal any details about the graphics card itself, but Vega 56 clearly stands for 56 Compute Units. It is offered as an alternative to the GTX 1070 variant, which is itself a 115-120W graphics solution.

    The Acer Predator Helios 500 notebook will be offered in a few main variants, including Intel Core i9-8950HK + NVIDIA GTX 1070 and AMD Ryzen 7 2700 + AMD Radeon RX Vega 56. This, ladies and gentlemen, is the first true high-performance AMD-only laptop in years.

    The Intel+NVIDIA option will support G-Sync technology, while the AMD solution will give you FreeSync support. All Helios 500 laptops are equipped with Full HD 144Hz panels.

    Prices of Predator Helios 500 notebooks start from 2000 USD (the i7-8750H variant). Unfortunately, pricing of the i9-8950HK / Ryzen 2700 models has not yet been confirmed.


     
  20. Maddness

    Maddness Master Guru

    Messages:
    946
    Likes Received:
    211
    GPU:
    EVGA RTX 2080Ti FTW
    Looks pretty sweet. Will be interesting to see the numbers.
     