AMD takes the lead in the new Forza Horizon 4 DX12 benchmark

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 18, 2018.

  1. korn87

    korn87 Guest

    Messages:
    9
    Likes Received:
    0
    GPU:
    GTX1080 8Gb
    Can you do a Vega 64 test at maximum settings in 2160p? All the settings to the right, MSAA 4x, FXAA off, motion blur set to fast.

     
  2. H83

    H83 Ancient Guru

    Messages:
    5,511
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    The Vega 56/64 are very nice cards; the problem is that AMD bet everything on HBM, and that bet failed in almost every way... Because of that, the cards were more expensive than they should have been, and their numbers were also limited by insufficient HBM availability. Those facts doomed Vega from ever being successful...
     
    Jayp likes this.
  3. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
    I will just leave it at this. I owned two GTX 670 cards and loved them. I started with one and went SLI shortly after, and since not all games support SLI, I often ran just one. Those cards were great for me all the way until I moved to a 980 Ti. By that point a single GTX 670 or 680 was getting quite long in the tooth, depending on what you like to play. The point is that even a couple of years ago you didn't want to still be on either of those cards, so it's fairly irrelevant how well each runs today. Additionally, there are plenty of other benchmarks that show the GTX 680 and HD 7970 near each other performance-wise.

    Before the GTX 670s I owned two 6950s, and that was the last time I took AMD seriously for my gaming needs. I have an RX 580 in my secondary Ryzen build right now, just for productivity and to test AMD's progress with drivers. The reality is I wish AMD were more competitive, as they have been in the past. They have yet to make a card that takes on the 1080 Ti, and that is where I am currently. I really wanted a Vega card for my Ryzen system, but for gaming and my productivity needs it's a hard pill to swallow considering the performance and power consumption.
     
  4. millibyte

    millibyte Maha Guru

    Messages:
    1,282
    Likes Received:
    244
    GPU:
    Smoothbore125mm gun
    Game seems to run remarkably well on both AMD and Nvidia from what I can tell. Win-win for everyone.
     
    FranciscoCL and Jayp like this.

  5. The Goose

    The Goose Ancient Guru

    Messages:
    3,057
    Likes Received:
    375
    GPU:
    MSI Rtx3080 SuprimX
    Tried the demo on my EVGA GTX 1080 FTW and it runs super smooth @1440p, but it has anti-aliasing issues.
     
  6. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    黃仁勳 stole my 4090
    So... is there any news if AMD will be releasing any graphics cards before 2020?

    For a millisecond after reading the words "AMD takes the lead" I hoped this meant new AMD cards were coming. Then one of my brain cells woke up to answer, said "LUL NO", and I read the rest of the title.
     
  7. Pimpiklem

    Pimpiklem Guest

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    Anti-aliasing issues show up only on Nvidia cards, because Nvidia's AA isn't fit for purpose; it never looks like it's even on.

    It looks super on AMD. Slowly people are beginning to look closer.
    That's why people want 4K: because of the mess AA is on Nvidia cards.
     
  8. slicer

    slicer Member Guru

    Messages:
    140
    Likes Received:
    51
    GPU:
    Sapphire Vega 64
    I would say that all AMD cards basically perform the same or even better after years. There is not one series that has lost its initial performance over the years with newer drivers. That cannot be said for Nvidia cards. Case closed!
    Look at this benchmark:
    https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,7.html

    Already in the 1080p results the Fury X beats the GTX 980 Ti by a considerable margin, not to mention at 1440p and 4K, where the Fury X is even better than the GTX 1070 Ti!
    Not that the technology is bad, but Nvidia gimps its "medium" and "high-end" cards with very poor memory bandwidth! Always has and always will...
    If only the GTX 1070 and GTX 1080 had the bandwidth to support the good Pascal core they are hosting. That's why those cards will lose out in the years to come to today's Vega lineup, just like the GTX 980 Ti is already losing to the Fury X in some titles and will continue to do so in future games, where high-resolution textures will be its demise.

    Fun fact!
    Every time a game developer properly uses the technology, AMD is the winner. And it is strange: when a game is an AMD title or developed closely with AMD, all cards perform nicely, including Nvidia cards. There are no 20 FPS gaps when you switch manufacturer. But when a game is Nvidia-sponsored... suddenly AMD cards are at a 30+ FPS disadvantage. :D Coincidence?
    That already shows the shady side of Nvidia. I will never support that firm by buying new again. The last card I bought new was a GTX 660. The rest of the Nvidia cards I have bought have been used, so no "new" money for Nvidia.
     
    Last edited: Sep 19, 2018
  9. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Not one series, lol... tell that to the Terascale VLIW owners whose cards went to garbage after GCN came out.
     
  10. slicer

    slicer Member Guru

    Messages:
    140
    Likes Received:
    51
    GPU:
    Sapphire Vega 64
    Of course, let's go back to a totally different architecture... That said, the HD 6970 and HD 6990 are still good cards for many titles today at low/medium settings in the 1080p range.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I had multiple Terascale (4000/6000 series) cards where, a year or two after GCN, various titles were straight-up broken and unplayable. That's a little worse than losing performance due to a lack of available VRAM (GTX 680) or a shift in development paradigm towards compute (Kepler).

    You talk about the Fury X's bandwidth but ignore the fact that any game that utilizes over 4GB of VRAM cripples its performance, despite AMD claiming it was comparable to 12GB of GDDR5. The Fury X is an 8.6 TFLOP card vs. a 5.4 TFLOP stock 980 Ti (rough math sketched below).. there were titles at launch where the Fury X was faster or similar in performance; honestly, every title that properly utilizes compute should be faster. So when you sit here three years later and show yet another title similar in performance, developed by a company that has been partnered with AMD for years, on DX12 with async, as an example of how AMD cards age better... it's a weak argument at best.

    That's not to mention that nearly every 980 Ti shipped was factory overclocked, and all of them could be manually overclocked to at least 1450 MHz, most to 1500.
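
    For reference, those TFLOP figures are just theoretical FP32 throughput (shader count x 2 FMA ops per clock x clock speed). A minimal sketch of that arithmetic, assuming reference clocks; real cards, especially factory-overclocked 980 Tis, run higher:

        # Theoretical FP32 throughput: shaders * 2 ops per clock (FMA) * clock in GHz -> TFLOPS
        def fp32_tflops(shaders, clock_ghz):
            return shaders * 2 * clock_ghz / 1000.0

        # Assumed reference specs; partner cards and manual overclocks clock higher.
        cards = [
            ("Fury X (4096 SP   @ 1.05 GHz)", 4096, 1.05),  # ~8.6 TFLOPS
            ("980 Ti (2816 CUDA @ 1.00 GHz)", 2816, 1.00),  # ~5.6 TFLOPS at base clock
            ("980 Ti (2816 CUDA @ 1.50 GHz)", 2816, 1.50),  # ~8.4 TFLOPS when overclocked
        ]

        for name, shaders, clock in cards:
            print(f"{name}: {fp32_tflops(shaders, clock):.1f} TFLOPS")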
     
    Barry J and Jayp like this.
  12. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
    Lol, come on dude, you take an async-compute-enabled DX12 benchmark and try to tout "case closed". Get out of here with that BS. You're using a case that favors your argument, and it isn't the case for all games. We know that AMD has a longer history of being good with DX12 and async compute; they helped develop it. But Nvidia does just as well, if not better, in DX11 with older cards. Linus covered this Nvidia card controversy quite well and showed that, if anything, Nvidia cards have improved over the years, definitely not gotten worse.

    An HD 6970 would be hard pressed to maintain even low settings in a lot of newer titles at 1080p, unless 30 fps sounds good to you. An HD 6990 is a dual-GPU card that only benefits where dual GPUs are supported. Want good performance with older Nvidia cards? It's simple: don't run DX12, or disable async compute if you do. Bandwidth isn't everything, and Nvidia has been a very competitive performer with less of it. Don't confuse a GPU's ability to handle async compute with its memory bandwidth. I want AMD to do better, I really do; we need competition. Some AMD cards have held up a little better, but does it really matter if a 6-7 year old video card gets 5 FPS more than the other when neither can even hold a steady 60 fps at 1080p?

    Look here, starting with the DX11 titles: click through all of them and look at the steady victory the 980 Ti has over the Fury X. Far Cry 5 was developed with AMD and Nvidia still takes the lead. Is three years not enough time in the drivers for the Fury X to become more competitive? Maybe by the time those cards struggle to hit 60 fps on low and no one runs them anymore, the Fury X will take some lead.
    https://www.guru3d.com/articles_pages/asrock_phantom_gaming_x_radeon_rx580_8g_oc_review,16.html
     
  13. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
    Agreed! Memory bandwidth isn't everything, and HBM, for the time being, is overhyped for the consumer market. Nvidia has gone a long way on GDDR memory, and Nvidia cards also overclock nicely, something that can't be said for the Fury X and Vega cards. When I fork over big cash for a video card, I need it to perform at its best from the beginning until I am done with it, not at the end when I am ready to dump it. I want AMD to do better, I really do; I am not brand-loyal to any PC component, I just want the best performance for my use case. It is just exhausting seeing people defend AMD's game performance when it really only holds in isolated cases, and it's too little, too late.
     
  14. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Slicer, here's the thing, though. At release my MSI Gaming X GTX 1070 was £420; the Fury X was £500. With that in mind, now go through the game review benchmarks and compare.
     
  15. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    They had the lead for one day... 2080 Ti :p
     

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    What is shader power good for when you are bottlenecked by slow primitive discard or something else? The amount of that work does not change with resolution, but the required shader work does.
    It really comes down to what the game uses and in what ratio.
    Take Shadow of the Tomb Raider: at 1080p the RX 580 8GB is a bit ahead of the Fury X; at 1440p the Fury X is already ahead. What to take from that? Apparently not a VRAM-related slowdown.
    In BF1 the Fury X is 9% ahead of the RX 580 8G at 1080p and increases its lead to 18% at 1440p.
    ...and many more show a similar thing. AMD has tools showing what kind of work in a frame takes what amount of time, so I am sure they knew what to improve on Polaris; built to the same size as Fiji, it would show a reasonable clock-for-clock improvement (unlike Vega). Or maybe Raja ignored that problem completely, and the only reason for faster discard on Polaris is a clock 40% higher than Fiji's.
     
  17. OnnA

    OnnA Ancient Guru

    Messages:
    17,963
    Likes Received:
    6,823
    GPU:
    TiTan RTX Ampere UV
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Any chance to have some Turing numbers for this?
     
  19. pharma

    pharma Ancient Guru

    Messages:
    2,496
    Likes Received:
    1,197
    GPU:
    Asus Strix GTX 1080
    I think you can find some here, though he doesn't use the latest driver (411.63) optimized for Forza Horizon.
     
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Seems like the 2080 is faster than usual compared to the 1080Ti.
     
