New Upcoming ATI/AMD GPU's Thread: Leaks, Hopes & Aftermarket GPU's

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

  1. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    Hmm, yup, that will work too :)

    But what I mean is that we have been standing still for some time now on DX11.0 or 11.1 (the latter is better because it can utilise 4/6/8 cores).
    Where is the new #Better GFX?
    DX12 now has Shader Model (SM) 6.0

    Here, go and see some CPU core tests (at the bottom of every game's page):
    -> http://gamegpu.com

    And DOOM on Vulkan clearly shows how powerful Fiji really is :)
    It also showed us that #Something is rotten 'in the industry'

    The real POWAH is like this:

    1. 1080 Ti (2x 8-pin)
    2. 1080 (1x 8-pin + 1x 6-pin)
    3. Fury (2x 8-pin)
    4. 1070 / 980 Ti
    5. 580 / 480
    6. 1060 6GB / 470 8GB
    7. 470 4GB
     
    Last edited: May 31, 2017
  2. Denial

    Denial Ancient Guru

    Messages:
    14,004
    Likes Received:
    3,786
    GPU:
    EVGA RTX 3080
    It would help if you just said what you wanted to say instead of typing in weird cryptic messages.

    Is DX12 better than 11? Sure. But it's up to developers to actually go and support it - the graphics companies are already supplying all the tools for it.

    And yeah, we know that AMD has utilization problems in DX11 and obviously DX12/Vulkan help alleviate that.. but I'm not sure what you mean by something "rotten". DX12/Vulkan are considerably more difficult to develop games on. This was made explicitly clear when both APIs launched. In fact, AnandTech had an entire article saying that most smaller game engine developers would most likely never move to DX12 - which is why Microsoft is continuing to develop DX11, with 11.3/4 as a high-level alternative.
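    To make "more difficult" concrete, here is a minimal sketch (illustrative only, not taken from any of the linked articles; the WaitForGpu helper name is made up) of one job D3D12 hands back to the application: waiting for the GPU to finish work before reusing a frame's resources. In D3D11 the driver tracks this behind the immediate context; in D3D12 you own the fence and the wait yourself.

    Code:
    #include <windows.h>
    #include <d3d12.h>
    #pragma comment(lib, "d3d12.lib")

    // Block the CPU until the GPU has reached this point in the queue.
    // D3D11 did an equivalent of this implicitly; in D3D12 it is the app's job.
    void WaitForGpu(ID3D12CommandQueue* queue, ID3D12Fence* fence,
                    UINT64& fenceValue, HANDLE fenceEvent)
    {
        const UINT64 valueToWait = ++fenceValue;
        queue->Signal(fence, valueToWait);              // GPU writes this value when it gets here
        if (fence->GetCompletedValue() < valueToWait)   // GPU not there yet?
        {
            fence->SetEventOnCompletion(valueToWait, fenceEvent);
            WaitForSingleObject(fenceEvent, INFINITE);  // wait until it is
        }
    }

    Getting this wrong doesn't just cost a slow frame; it gives you corruption or a device removal, which is a big part of why the learning-curve argument keeps coming up.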
     
  3. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    but I'm not sure what you mean by something "rotten"

    Tell me:
    Was DX11 more difficult to develop for than DX10, or than DX9, which is still very popular even today?
    So why the heck was it adopted so fast? #Corn Flakes & Milk

    And now tell me why adoption of DX12/Vulkan (which is better, and even a little easier than DX11 to develop for - those are MS's words, not mine, and they provide easy-to-use tools for adopting DX12)
    is slower now than it was a year ago?

    I'm telling you, bro:
    #Money clouded progress
    that's why #something is rotten

    #THX goes to [....] for 2 cores & 4 cores
    #Uff, we got affordable ZEN 6/12 & 8/16

    #THX goes to [....] for everyone going DX11.0 with a max of 1-3 threads
    #Uff, we will have VEGA with HBCC and a tile-based approach to rendering.
     
    Last edited: May 31, 2017
  4. Denial

    Denial Ancient Guru

    Messages:
    14,004
    Likes Received:
    3,786
    GPU:
    EVGA RTX 3080
    You have a link for Microsoft saying DX12 is easier? Because when DX12 was announced I read nothing that said that - in fact I read the exact opposite:

    http://www.anandtech.com/show/8544/microsoft-details-direct3d-113-12-new-features

    DX11 was widely adopted because it enabled a number of highly requested features at the time. It also wasn't any more difficult than DX10, just changed the ways some of the older features were implemented and added new ones.

    DX12, on the other hand, is the opposite. It didn't ship with a new shader model, so there was little incentive in terms of new feature sets. And it's far more difficult to optimize/code for. The only things it really seems to do so far are:

    A. Make optimization significantly harder for the developer.

    B. Lower drawcall overhead, which didn't really seem like much of a bottleneck in the first place (there's a rough sketch of how that works below).

    C. Alleviate scheduling issues on AMD cards, something Nvidia managed to work around in software.

    So I don't know why you'd think adoption would be anywhere near as fast as DX11.
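    For reference, a minimal sketch of the mechanism behind point B (illustrative only; the RecordChunk/SubmitFrame helper names are made up): in D3D12 each worker thread can record its own command list, so drawcall submission cost spreads across cores, whereas in D3D11 everything funnels through a single immediate context.

    Code:
    #include <windows.h>
    #include <d3d12.h>
    #include <thread>
    #include <vector>
    #pragma comment(lib, "d3d12.lib")

    // Each worker records the draws for its slice of the scene, then closes its list.
    void RecordChunk(ID3D12GraphicsCommandList* cl /* , scene data for this chunk */)
    {
        // ... SetPipelineState / DrawIndexedInstanced calls for this chunk ...
        cl->Close();
    }

    // One thread submits everything the workers recorded, in order, in a single call.
    void SubmitFrame(ID3D12CommandQueue* queue,
                     std::vector<ID3D12GraphicsCommandList*>& lists)
    {
        std::vector<std::thread> workers;
        for (auto* cl : lists)
            workers.emplace_back(RecordChunk, cl);   // record in parallel
        for (auto& t : workers)
            t.join();

        std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    }

    The catch, and part of why it is "harder to optimize", is that every command list needs its own command allocator and none of that memory can be reset until a fence says the GPU is done with it - bookkeeping the D3D11 driver used to do for you.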
     

  5. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    I posted on this topic just the other day. But I think the decision of whether to use DX11 or DX12 is mostly a question of whether your users' computers can support DX12.

    I mean, yes, DX12 is more difficult than DX11. DX11 is more difficult than DX10. DX10 is more difficult than DX9 and OpenGL. OpenGL is more difficult than MonoGame. MonoGame is more difficult than Unity. If you want easy: Unity. But DX11 is difficult enough that I wouldn't say it's really "easier" than DX12. Depending on where you're coming from knowledge-wise, DX11 is an enormous learning curve. I say this never having written a DX12 program and only having compiled a Vulkan tutorial. But I think saying DX12 is more difficult than DX11 is like saying it's more difficult to walk to New York from LA than it is to walk from LA to Boston. By the time you've managed to walk to Boston, New York isn't that much further. You've already made it to Boston. If New York is where you really want to be, then why not just keep going?


    I think there's more opportunity to mess up in a really big way and have your game crash in DX12. (But then again, you were already working with unmanaged code and COM in DX11.) Multithreading always scares people. So, it's definitely more difficult. But by the time you've learned HLSL just so you can draw something to the screen, figured out how to deal with the Windows OS (at least enough to get a process and control the window you are running in), and dealt with COM, "you've come a long way, baby" (and that doesn't even get into writing your own Python scripts to extract modeling data from Blender, writing your own modeling class, learning Winsock for Internet game play, etc.). If you can handle that, I figure multithreading can't be that much more difficult.



    From what I read, DX12 handles resources far better than DX11, but it means extra steps on your part to take responsibility for those resources. But that's pretty much the same kind of difference as between DX9 and DX10/11.
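    As a rough illustration of those "extra steps" (a sketch, not something from the post or the linked threads; CreateUploadBuffer is a made-up helper name): in D3D12 you choose the heap type and the initial resource state yourself, and you must keep the buffer alive until the GPU is finished with it, where D3D11's D3D11_USAGE_DYNAMIC + Map() left all of that to the driver.

    Code:
    #include <windows.h>
    #include <d3d12.h>
    #pragma comment(lib, "d3d12.lib")

    ID3D12Resource* CreateUploadBuffer(ID3D12Device* device, UINT64 sizeInBytes)
    {
        D3D12_HEAP_PROPERTIES heap = {};
        heap.Type = D3D12_HEAP_TYPE_UPLOAD;            // CPU-writable, GPU-readable memory

        D3D12_RESOURCE_DESC desc = {};
        desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
        desc.Width            = sizeInBytes;
        desc.Height           = 1;
        desc.DepthOrArraySize = 1;
        desc.MipLevels        = 1;
        desc.SampleDesc.Count = 1;
        desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;   // required layout for buffers

        ID3D12Resource* buffer = nullptr;
        device->CreateCommittedResource(
            &heap, D3D12_HEAP_FLAG_NONE, &desc,
            D3D12_RESOURCE_STATE_GENERIC_READ,         // upload heaps must start in this state
            nullptr, IID_PPV_ARGS(&buffer));
        // The app - not the driver - now owns this resource's lifetime and state transitions.
        return buffer;
    }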



    But as DX becomes more complicated, it's a shame it becomes more difficult to learn. Then again, I'm not sure DX has ever been easy. I tried for the better part of a decade to learn 3D programming in DX9 with no success to speak of. By the time I was actually ready, I ended up teaching myself DX11. And I found it reasonably easy. But that was only because I had over a decade of experience elsewhere. So, maybe it is good to learn in stepping stones.



    Overall though, it seems to me that if you are going to learn DX12, you should probably use it for everything unless a simpler tool is called for. But if a simpler tool is called for, DX11 is probably too much too. For example, I needed a tool to read my model files and examine the data in a human-readable format, so I wrote a C# program. I might prototype something in Unity, though what I actually use for prototyping is XNA. Several things that I was worried about tackling directly in DX11 I prototyped in XNA first.



    Again though, I've never written a single DX12 program. The closest I've come is programming in DX11 and spending a Saturday compiling a Vulkan tutorial. I've also flipped through Frank Luna's DX12 book and found it to be remarkably similar to his DX11 book.



    I would think that if you're going to do DX12, just do it and stick with it. At that point you've already dealt with the difficult part and you probably need the practice anyway even if it is a smaller project.



    Another way of looking at it, though, is that most of your beginner projects would probably run just fine on DX9. I used XNA for years and it was built on top of DX9. Any time I ran into any type of performance issue, it was because of the way I had coded it, not a real problem of exceeding its limitations. So, you might learn DX11 before DX12 just because there's so much more information out there to help you learn DX11, and it's a pretty decent stepping stone to DX12.

    -> https://blogs.msdn.microsoft.com/directx/2014/03/20/directx-12/

    -> https://www.gamedev.net/topic/680854-directx-11-for-small-scale-and-directx-12-for-large-scale

    -> http://www.eurogamer.net/articles/digitalfoundry-2015-why-directx-12-is-a-gamechanger

    We need to wait; let's hope that we will have CGI-like 120 FPS 4K HDR gaming in the near future.
    But without DX12/Vulkan you will need an nV 2999 at 990mm² and 450W, the 50cm Titan GPU with '24GB GDDR77' lol (you catch my drift?)

    ---
    Q:
    You have a link for Microsoft saying DX12 is easier? Because when DX12 was announced I read nothing that said that - in fact I read the exact opposite....

    No, I don't have it; what I mean is that MS gives developers all the tools needed for an easier transition from DX11 -> DX12.
    When you google devs & DX12 you will find some good stuff ;)
    BTW it's not that hard -> look at Forza Apex on DX12 or DOOM on Vulkan

    ---

    Are you fo'real:
    Q. DX11 was widely adopted because it enabled a number of highly requested features at the time. It also wasn't any more difficult than DX10, just changed the ways some of the older features were implemented and added new ones.

    New APIs, especially low-level ones like DX12/Vulkan/Mantle, are more difficult, but that's no problem for real developers.

    Not H/W alone, but more likely H/W + a great low-level API will get us there (4K HDR 120 FPS)

    UPD. OK bro, we can easily drop this subject, because we can't do anything at all to move things forward towards new APIs etc.
     
    Last edited: May 31, 2017
  6. Tuga

    Tuga Member

    Messages:
    48
    Likes Received:
    0
    GPU:
    nvidia 670 gtx
    At the end of the day, AMD needs to deliver some enticing high-end product. Can't give it all to Nvidia.. come on!

    VR is really close now; in the last 4-6 months both Subnautica and Assetto Corsa received official HTC Vive support (earlier it was Oculus Rift only), so I can now natively play those two games with my 3rd-party HMD (which relies on SteamVR). Samsung is starting to push new Quantum Dot VA monitors (C24FG73), and there's even a 32" 4K model (U32H850) with FreeSync.. all this fantastic stuff requires a good GPU!

    (We're finally getting some "real" VA/4K monitors; it's kinda exciting to get away from the norm of 1000:1 IPS)
     
    Last edited: May 31, 2017
  7. Denial

    Denial Ancient Guru

    Messages:
    14,004
    Likes Received:
    3,786
    GPU:
    EVGA RTX 3080
    DX10 adoption sucked because of Vista. DX11 was better because 7 was widely adopted. 10/11 were both easier than 9 because cap bits were dropped in favor of guaranteed feature levels, which forced GPU vendors to unify their architectures under a similar design paradigm. Obviously there was a learning curve, as there is with any major change, but DX11 is still a high-level API.
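    For what that looks like in practice, a minimal sketch (illustrative only; the helper name is made up): with D3D11 you ask for a feature level and get one guaranteed bundle of capabilities back, instead of probing dozens of individual cap bits the way D3D9's GetDeviceCaps() required.

    Code:
    #include <windows.h>
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // Request the best feature level the hardware offers, falling back down the list.
    D3D_FEATURE_LEVEL CreateDeviceAndReportLevel(ID3D11Device** device,
                                                 ID3D11DeviceContext** context)
    {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
        };
        D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          requested, _countof(requested), D3D11_SDK_VERSION,
                          device, &got, context);
        // 'got' is the single feature level the app can rely on - no per-feature
        // cap checking, unlike D3D9's GetDeviceCaps() with its dozens of cap bits.
        return got;
    }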

    Multiple developers and industry people all say the opposite about 12: that it's significantly harder to optimize across a range of hardware. This is evident not only from them literally saying it - but from the released DX12 titles having a multitude of issues on various hardware setups.

    Would I prefer DX12? Obviously - it's the superior API. But I've seen no indication of some imaginary force holding its adoption back, or any low-level API's for that matter. Microsoft is heavily pushing DX12 on both the Xbox and PC. AMD obviously wants both. Nvidia ported its entire GameWorks library over to DX12, and much of the senior leadership of the Khronos Group, including its president, are Nvidia employees.

    I don't really know who else would have the clout or incentive to block next-gen APIs.

    So I think the lack of adoption is clearly due to the learning curve/difficulty of programming versus the advantages it provides, which at the moment only seem to benefit AMD.
     
  8. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    OK :)

    #No Comments

    ...Let us hope for Moar...

    #DOOM
    #FH3
    #ApeX
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,091
    Likes Received:
    933
    GPU:
    Inno3D RTX 3090
    Can I chime in here and just say that Vega will most likely end up being the superior product to Pascal? AMD seems to have tackled GCN's greatest issues with it, and larger and larger parts of it are in the console refreshes. It depends on how long you plan to keep what you get, I guess. I don't believe that the top Vega will be faster than the Ti, but if it's priced between it and the 1080 for the initial launch, it will look like the saner long-term investment to me.
     
  10. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    I want true DX12/Vulkan games
    With SM 6.0 ...

    #Better GFX
    #LowerTDP (Yes, DX12/Vulkan did that)
    #Better CPU AI (We need Moar Cores)
    #Better Physics (We need Moar Cores for this puppy also)
    #Dolby ATMOS
    #32Bit or HDR

    etc.

    Just play some new game = WTF is this in 2017 :thumbdown
    Generally we are stuck in the past :finger:

    I'm now playing Prey 2017 on CryEngine and it looks similar to Crysis 3 from 2013 :bang:

    #EA for BF1 DX12 (I'm playing it in DX11.1 because of SweetFX and I have a constant 70 FPS in multiplayer, so it can be done)
    #Bethesda Softworks for DOOM on Vulkan

    UPD.
    Hmm look here:


    UPD.2
    Also watch the YT video comparing Tomb Raider GFX evolution 1996-2016
    -> https://www.youtube.com/watch?v=Ae5rwNVNVrQ

    And now look at the ground + grass + tree leaves/bushes
    Hmm, so a game from 2006 on DX9 has similar GFX to RotTR from 2016 :3eyes:

    Now play some BF1 & BF3 & BF4 ;)
    Yes, BF1 is superior for sure, but those grounds/bushes/trees look similar....
    New tech & new APIs are needed for more
     
    Last edited: May 31, 2017

  11. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    Yup :banana:
    He has his first Rodeo :formal:
     
    Last edited: May 31, 2017
  12. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    I said it in the other thread and I will say it in this thread. Pascal has been out for over a year. It will have been out for a year and a half when Vega finally launches (and that's not even the gaming variant). You realize that means Vega will have to remain truly relevant until 2 years after Pascal is deemed truly obsolete (BTW, Maxwell is still very much relevant). Are you willing to bet Vega will be a power player in 2020+?
     
  13. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,808
    Likes Received:
    3,370
    GPU:
    6900XT+AW@240Hz
    Yes and no. It is quite possible that if there is a shortage (the foundry does not deliver HBM chips), there are penalties (a reduced price for AMD to pay).

    So it is bad for us, and for AMD's image, as they have trouble delivering the thing they designed because a 3rd party was unable to perfect its part in a year.

    But HBM is good. Just a few years in the future, you will have an entire notebook PCB the size of 5 x 8 cm. You will have an APU + 1 stack of 8GB HBM2/3 (200+ GB/s) and all I/O on the PCB. All very space- and power-efficient.
    And so much space left over for all the other goodness.
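    For reference, the arithmetic behind that bandwidth figure (assuming a single HBM2 stack with the standard 1024-bit interface at 2.0 Gb/s per pin): 1024 bits x 2.0 Gb/s / 8 bits per byte = 256 GB/s from one stack, so "200+ GB/s" from a single stack is realistic.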

    Then apply this to data centers... CPU density will be on another level compared to today.
     
  14. Maddness

    Maddness Ancient Guru

    Messages:
    2,193
    Likes Received:
    1,417
    GPU:
    3080 Aorus Xtreme
    It's just a great pity that HBM hold-ups have affected both the Fury release in the past and now the upcoming Vega release. Sometimes it feels like AMD can't catch a break.
     
  15. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    Yup :)
    We need to see the whole picture from every angle :idea:

    I'm a little sad, because we are all stuck in the old era in 2017
    All we have is DX11/9/10 (yes, Seraph is DX10)
    + we have "next-gen" APIs, but next-gen is performance only, not GFX :bang:
    Ahh

    Earlier there was a mix of Intel/AMD + ATI/nV
    DX9b was the ATI 9800 (1-core CPU)
    DX9c was the GeForce 6800 (yes, on that GPU I was playing NFS Underground 2) (1-core 64-bit CPU)
    Then came the ATI 1950 -> 3870 (1-core 64-bit CPU)
    Before DX11 I was on a GTX 285 1.2GB (Core 2 Duo, then Phenom II X6)
    Then I went for the ATI 5870, then 5 years ago I went for the 7970 GHz, then I got a free upgrade from XFX to a 280X, and now I'm on a Fury HBM with a ZEN 8/16.

    So I remember clearly what progress = :banana: and new next-gen GFX = :banana: feel like
    Do you remember the dragon in Unigine Heaven DX11 (5870)?
    Everybody was talking about The Elder Scrolls in DX11 :) (Those houses and streets in DX11 = lol, because it never happened)

    C'mon, let's share some insights and good things (GFX/games and related things; yes, SCENE & demos are OK)
     
    Last edited: Jun 1, 2017

  16. haste

    haste Ancient Guru

    Messages:
    2,082
    Likes Received:
    1,000
    GPU:
    GTX 1080 @ 2.1GHz
    Is there really not going to be a VEGA launch at Computex? This is just ridiculous...
     
  17. Maddness

    Maddness Ancient Guru

    Messages:
    2,193
    Likes Received:
    1,417
    GPU:
    3080 Aorus Xtreme
    Unfortunately, it looks like HBM unavailability is the culprit.
     
  18. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    Crytek's Senior Cinematic Artist, Joe Garth, has been experimenting with some leaves from Quixel Megascans in CRYENGINE, and shared some screenshots showcasing them. The scene was very light, as it featured only a few assets and a very low number of drawcalls/polygons, which is why it was running at more than 100 fps (the specs of the PC used are unknown).

    As Joe stated:

    “I try to keep my scenes and assets fairly light and not use any slower high end features. An optimized scene has lots of benefits, you can work much more quickly, render quickly (for pitchvis this is essential) and the scene could also be ported to lower spec devices (VR, low end pc’s etc).”

    -> http://i.imgur.com/9oqwCII.mp4

    [screenshots]

    I've talked about how trees & bushes look old/obsolete -> it seems people at Crytek agree ;)
    Those are far better tree/ground/bush visuals
     
    Last edited: Jun 3, 2017
  19. haste

    haste Ancient Guru

    Messages:
    2,082
    Likes Received:
    1,000
    GPU:
    GTX 1080 @ 2.1GHz
    Reminds me of this UE4 render:

    [UE4 render image]
     
  20. OnnA

    OnnA Ancient Guru

    Messages:
    15,768
    Likes Received:
    4,932
    GPU:
    3080Ti VISION OC
    After analysing some data from http://gamegpu.com/Page-7.html

    I've got some numbers on how many CPU cores games need these days :)

    So we've got: the majority of games (those not using CryEngine or Frostbite) need

    2-3 cores!

    Games on CryEngine 4/5 are better because they can utilise 8-16 cores!
    Games like BF1, Titanfall 2 or ME Catalyst utilise 6-8 cores
    Games on UE4: 3-8 cores (depends on the dev)

    So now we have a 16/32-core Threadripper on the way = :bang:

    PS.
    THX goes to the 2-4 core mainstream for so many years :cyclone:
     
    Last edited: Jun 4, 2017
