Radeon Technology Group - Tech update December 2015

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 8, 2015.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,325
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
    The AMD Radeon Technology Group, the division responsible for everything related to Radeon graphics cards and APUs, today shares some new technology enhancements. In this December 2015 update we'll talk...

    Radeon Technology Group - Tech update December 2015
     
  2. Clouseau

    Clouseau Ancient Guru

    Messages:
    2,841
    Likes Received:
    508
    GPU:
    ZOTAC AMP RTX 3070
    Cool, will not upgrade my graphics card and monitor till 2017.
     
  3. AMDJoe

    AMDJoe AMD rep

    Messages:
    115
    Likes Received:
    0
    GPU:
    AMD
    At least you'll know what's going to be available at a lower price :)
     
  4. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    I mean that's true -- but for the most part, all the features announced here for FreeSync are already available on G-Sync. And I'm sure Nvidia has additional features planned for Pascal's release relating to G-Sync. That's the advantage of having the onboard FPGA -- they can just update the module as they go if they find something new and useful to add.

    That being said, I think AMD's approach is obviously the more consumer-friendly method, and hopefully it will keep G-Sync costs in check.
     

  5. Yakk

    Yakk Guest

    Messages:
    144
    Likes Received:
    21
    GPU:
    Graphics card
    Great read. Really interesting to see what the Radeon Technology Group is working towards. Keep us updated!
     
  6. Caesar

    Caesar Ancient Guru

    Messages:
    1,555
    Likes Received:
    680
    GPU:
    RTX 4070 Gaming X
    According to the last line of the review:

    good for the success :john: of AMD in 2015.
     
  7. holler

    holler Master Guru

    Messages:
    228
    Likes Received:
    43
    GPU:
    Asrock 7900XTX Aqua
    Disagree. How often do you upgrade your monitor? Not nearly as often as the GPU. I'd rather the changes come from the GPU and drivers than have to upgrade (which always means replacing) my monitor every other year... G-Sync is a fail compared to FreeSync long term...
     
  8. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    Uh, it's kind of the opposite -- well, it could be the opposite, depending on where it goes. The G-Sync module is an FPGA. That's why it's so expensive. The module can be reprogrammed specifically for future hardware support, so you don't need a new monitor. AMD has been doing driver-based workarounds for problems that the G-Sync module handles natively. This is fine, but there may come a point where AMD can't physically support a specific feature because the scaler in a specific monitor won't support it, in which case you will need a new monitor. G-Sync may not necessarily run into this problem because the module can be completely reprogrammed as necessary.
     
  9. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,659
    Likes Received:
    592
    GPU:
    RTX3090 GB GamingOC
    When are we getting 8K OLED screens at 144 Hz with FreeSync? I can't wait :(
     
  10. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,511
    Likes Received:
    2,353
    GPU:
    Nvidia 4070 FE
    If you could afford a good Nvidia gaming video card and a G-Sync screen in the first place, you can also afford to upgrade them when necessary. They aren't for the poor man.
     

  11. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    Because you think Nvidia will send you a new module, or that Asus will sell you one, instead of selling you a fresh new monitor? ...

    I don't think you can reprogram the FPGA scaler so easily (it's not like flashing a firmware into it).

    With AMD, all you need is a driver update... and I think it's the same for Nvidia when adding new features to G-Sync....

    As for the monitor side (I'm thinking of HDR, BT.2020, etc.), it still depends on the panel, the electronics, and the driver support.
     
    Last edited: Dec 8, 2015
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Commercially, maybe; it may be phased out, and no new G-Sync screens may be made in a few years.
    But people who bought a G-Sync-based screen will be able to use it till it dies, and it provides the same functionality as FreeSync.
    The only difference is that you can select how FreeSync behaves once you go outside the screen's FreeSync range.
    (But maybe nVidia has already added the same feature too? An owner of a G-Sync screen can fill in this information.)

    I am more interested in that Lenovo Y700. Is the FX-8800P in a user-configurable mode for the TDP limit (or is it at least in 35W mode)?
    If it is in 35W mode, wouldn't it be better to run it in dual graphics mode with an R7 M365? (Cheaper, with the same performance.)

    Will we see the FX-8800P in a 13.3'' or smaller notebook with configurable TDP? (Apparently without a dGPU.)
     
  13. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    What? I never said Asus or Nvidia would send a new module. I said that Nvidia can update the module via a driver. It's an FPGA -- it can be completely reprogrammed via software updates.

    http://www.geforce.com/whats-new/articles/g-sync-gets-even-better

    You can set the behavior for going above the range but not below, although I don't know why you'd want to turn it off for going below. The frame doubling (LFC, as AMD calls it) does a good job of keeping it smooth below the threshold. I'm assuming AMD's solution does the same.

    Overall, G-Sync or FreeSync, the technology is totally worth it. I didn't really believe the hype till I bought my PG278Q, and games in the 35-70 fps range definitely become a way smoother/better experience when G-Sync is enabled. I'm sure FreeSync is the same way. It's something you really have to experience, but I definitely think it's worth the cost of a new monitor, or paying $100 more in Nvidia's case.
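    The frame-doubling idea behind LFC can be sketched roughly like this. This is a minimal illustration of the general technique, not AMD's or Nvidia's actual algorithm, and the 40-144 Hz window is an assumed example range: when the game's frame rate falls below the panel's minimum variable refresh rate, each frame is shown an integer number of times so the effective refresh stays inside the supported window.

    ```python
    def lfc_refresh(fps: float, vrr_min: float = 40.0, vrr_max: float = 144.0) -> float:
        """Pick an effective refresh rate inside the variable-refresh window
        by repeating each frame an integer number of times (frame doubling)."""
        if fps >= vrr_min:
            # Already inside the window: refresh simply tracks the frame rate,
            # capped at the panel's maximum.
            return min(fps, vrr_max)
        multiplier = 1
        # Raise the multiplier until fps * multiplier lands inside the window,
        # without overshooting the panel's maximum refresh.
        while fps * (multiplier + 1) <= vrr_max and fps * multiplier < vrr_min:
            multiplier += 1
        return fps * multiplier

    # e.g. a 30 fps game on a 40-144 Hz panel refreshes at 60 Hz (each frame shown twice)
    print(lfc_refresh(30.0))
    ```

    The point is that the monitor never has to refresh below its hardware minimum, which is why low-framerate games still feel smooth inside the sync range.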
     
    Last edited: Dec 8, 2015
  14. moab600

    moab600 Ancient Guru

    Messages:
    6,658
    Likes Received:
    557
    GPU:
    PNY 4090 XLR8 24GB
    Good, more competition is better, but AMD has a bad record of broken promises.

    This time I hope the promises turn out to be true, but I know Nvidia won't sit and watch.
     
  15. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    Oh well, we'll have to wait 91 more years, what choice do we have.

    Hue.

    G-Sync is not even remotely a fail.
    The FPGA module is reprogrammable, meaning Nvidia can push 'firmware' updates (if you may) through drivers, for example. That way it's less likely that your monitor will go obsolete.
    And as it stands currently, FreeSync is simply playing catch-up to G-Sync, which has more functionality. G-Sync is simply the better tech, but it's hardware-locked.

    I would even argue that G-Sync is more future-oriented, aside from being proprietary tech.

    We're talking about 4K 120+ Hz monitors here. They WILL be expensive. I don't think you want to pay GPU money for a monitor every few years.
     

  16. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,511
    Likes Received:
    2,353
    GPU:
    Nvidia 4070 FE
    Nvidia's whole financial and market share success is based on its exceptionally smart sit and watch strategy. While AMD has spent extra money to guess, develop, and offer people what they might need tomorrow, Nvidia has used the most cost effective current technology to give people what they need today. Obviously people have then bought what they need for the current games, not games that might appear one day.

    That being said, I still don't understand why AMD didn't keep HBM to themselves for the time being, just like Nvidia has kept G-Sync hardware-locked, for example, forcing AMD to develop and market what people generally consider a less refined solution (even if cheaper). Next year AMD could then have watched Nvidia try to compete with GDDR5 against HBM2. If Nvidia now beats AMD to the next generation, like it did last time, people will spend their money on Nvidia first. Actually, they still might in either case, since I reckon people first look at the options from their current supplier, which is more likely Nvidia with their huge market share.
     
  17. moab600

    moab600 Ancient Guru

    Messages:
    6,658
    Likes Received:
    557
    GPU:
    PNY 4090 XLR8 24GB
    Maybe some part of the HBM cash from Nvidia goes to AMD as well; if they had kept it to themselves, Nvidia might have found another way, using their own HBM or whatever it was called.

    Anyway, 2k16 is gonna be interesting; Pascal and AC should bring new stuff to the table.
     
  18. Tronman

    Tronman Guest

    Messages:
    102
    Likes Received:
    0
    GPU:
    XFX 295x2
    Really hope that FreeSync over HDMI makes it into big-screen 4K TVs in the future. I thought I was in a very small minority of PC gamers who play from the couch on a large TV (extremely low input lag coupled with a wireless mouse, keyboard, and controller makes it a dream), but I think the popularity of the ultra-large Wasabi Mango displays has proved there is a market for it.

    @AMDJoe, do you happen to know if AMD has ever considered pushing this tech into that space? Powerful Steam boxes are already available, and everyone knows just how big the console market is... I'm sure there will be an appetite for UHD gaming from console-sized PCs, and FreeSync over HDMI on 4K panels would really complement this.
     
  19. RealNC

    RealNC Ancient Guru

    Messages:
    4,894
    Likes Received:
    3,168
    GPU:
    RTX 4070 Ti Super
    AMDoes what NVidont.

    :flip:
     
  20. Prince Valiant

    Prince Valiant Master Guru

    Messages:
    819
    Likes Received:
    146
    GPU:
    EVGA GTX 1080 ti
    Lower rates don't really feel much different to me, not that I've done a side-by-side or G-Sync on/off comparison, because having it on is nice. The lack of tearing without V-sync is worth it for G-Sync or FreeSync (presuming FreeSync is about the same). The only problem I've had is that some older games won't run at all with G-Sync, and sometimes games require me to manually set G-Sync to on for the specific game in the NV control panel, despite having it as the default.
     