Witcher 3 Announcement From CD Projekt Red

Discussion in 'Videocards - AMD Radeon Drivers Section' started by LtMatt81, May 15, 2015.

  1. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Nah, GDDR5 wasn't bad; it was nVidia that made the mistake of adopting GDDR4. They bought so much of it that they were putting out low-range cards with 2GB of VRAM just to get rid of it, while still using GDDR3 on their high-end cards.

    By the way, Maxwell GPUs have full DX12 support. Nvidia will support DX12 on every Fermi, Kepler, and Maxwell-class GPU.


    PS. Who is "everybody"?! You mean AMD, right? Intel is hardly going to put VRAM into CPUs :) And AMD still needs a faster card to catch the GTX 980 Ti. Overclocking the memory on Radeon GPUs doesn't yield much extra performance, so I doubt they are having bandwidth issues, especially if we look at the Radeon 285 being faster than the 280 with only a 256-bit bus. It's GCN that they improved, and that's what yielded the performance increase. But we will see.
     
    Last edited: May 23, 2015
  2. onemoar

    onemoar Guest

    Messages:
    292
    Likes Received:
    0
    GPU:
    GTX 1060SC @2250/4550
    You read but you don't understand at all. I suggest you stop talking; you are making yourself look stupider by the moment.
    It's the RAM manufacturers that hold all the power, not the other way around.
     
    Last edited: May 23, 2015
  3. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    wth? Please be more specific, I might learn something. ;)
     
  4. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That is completely wrong. The first company to use GDDR4 was AMD; see the GDDR4 Wikipedia page.
    If you Ctrl-F "GDDR4" in the list of NVIDIA graphics products, there isn't a single NVIDIA product that ever used GDDR4. Not one.

    NVIDIA will provide DX12 drivers for GPUs going back as far as Fermi, but they don't actually support all the DX12 tiers, not even with Maxwell.
    For more information, look at the links and presentations gathered in this post.

    EDIT: The GCN support chart below appears to be inaccurate, according to Wagnard's post later in the thread. Still, apart from Maxwell v2, all NVIDIA cards seem to have lower caps support for DX12 at this point.
    To make it short and sweet, these are the DX12 resource tiers:
    [IMG]

    And this is their support per GPU architecture:
    [IMG]

    Only GCN is fully supporting everything in hardware.

    That's not really a surprise, since DX12 and Vulkan are almost carbon copies of Mantle, which was built around GCN. And no, that's not speculation; even people from DICE joke about it on Twitter.

    And this is a handy hardware support table:
    [IMG]

    You will notice that only Maxwell v2 has full Graphics/Compute/Copy capabilities, and even then it has fewer CPs and compute units than GCN 1.0.

    Except they do plan to, and they are helping developers build applications that will take advantage of it.
    HBM is not only for graphics. It is a change similar to (or even greater than) the one we got from EDO RAM to DDR. It is a new way to make memory and to interface it with chips. Its first applications will be on GPUs (just as the GeForce DDR was the first card to get DDR memory), and then it will be adopted for everything (it already is, if you look at the Intel link). Companies like Samsung, Apple and Qualcomm are interested in it because it enables much, much smaller PCB footprints.

    Last time I checked, the fastest card NVIDIA has out is the Titan X. Once the people who paid $1000 for it have been milked and the R9 3xx cards that reach or exceed its performance are out, they will release the 980 Ti (which is going to be a rebranded Titan X, because unless they go to 20nm there is no magic, and it is crucial to remember that there is no magic), price it at $500 and compete properly once again.

    Memory bandwidth is so important that both AMD and NVIDIA put dedicated silicon in their chips to compress data on the fly to and from memory. So, suddenly a technique that gives you 60% higher bandwidth and 40% lower power consumption, halves the PCB footprint, and makes designs like this:
    [IMG]

    possible, is a non-issue, right?
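    (Rough numbers on that 60% figure, assuming the first-generation HBM specs shown so far: a 512-bit GDDR5 bus at 5 Gbps effective, as on Hawaii, gives 512/8 x 5 = 320 GB/s, while four HBM stacks at roughly 128 GB/s each give about 512 GB/s, i.e. the quoted ~60% uplift, before you even count the on-die compression.)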
     
    Last edited: May 25, 2015

  5. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    You made several points and showed where I was wrong and where I was misled; thank you for that.

    I really don't have the will or the energy to debate this further. Sure, Maxwell is only Tier 2 and has one or two features missing that will be emulated. So what? It remains to be seen whether those are even useful and what toll emulation actually takes on framerate. For that, we will have to wait for someone to actually make a game that supports DX12 fully, and by then I am willing to bet AMD will have Zen out and the Radeon 4xx series around the corner.
    And it's not like my 270X will run future DX12 games with acceptable performance anyway. I am not impressed with Mantle either; a few fps more (or fewer, in some cases) doesn't make it super cool, and from my understanding that's what DX12 will bring, so... meh.

    I am out.
    Have a nice one.
     
  6. Burningcoals

    Burningcoals Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    EVGA 980 Ti SC+ @1450Mhz
    Depending on the amount of memory you have it will be, but when it runs out it has to swap to the HDD.

    My point was: find your bottleneck. I have a 7950, and after the recent update I am getting 45-48 FPS with post-processing and graphics on High, with HairWorks turned off.

    I should be running CrossFire, with HairWorks on and everything on Ultra... Freaking AMD.
     
  7. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    I have plenty of RAM. There is no fixable bottleneck here.
     
  8. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,388
    Likes Received:
    391
    GPU:
    RTX 3090
    In an age where everyone on the internet seems to be childish, you've accepted that you had some stuff wrong and we all learnt something through these posts. That's respectable. People seem to have forgotten how to say "yeah I just don't know" or "I'm not sure". Everyone wants to claim they know everything, and no one is willing to admit they were wrong. We all say or do wrong things all the time. It's okay.

    :banana:
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Thank you for not being a douchebag. Most people would just troll instead of admitting anything at this point. :)

    That's actually a fair point. All the talk up to now is about specs, but we don't have anything concrete in our hands except the overhead test, which is only part of the story. The first DX12 games will start appearing in about 3-4 months (in time for Windows 10). Then we'll finally see.

    That depends. The whole Witcher 3 "downgrade" saga showed us that even companies with big budgets can't afford to invest in a separate PC port. DX12 will make porting even easier, so you might get your money's worth from that 270X, if you are willing to live with around console quality/framerate (or even better, depending on the title).
    Mantle was awesome in Thief, for example, and keep in mind that at this point there is already significant adoption and a couple of years of experience with lower-level APIs on the PC.
     
  10. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)

    +1

    If I don't learn at least ONE new thing every day, I feel the day is somewhat empty... As for mistakes, I'm the champion of the world! :D
     

  11. Wagnard

    Wagnard Ancient Guru

    Messages:
    2,746
    Likes Received:
    519
    GPU:
    MSI Geforce GTX 1080
    This is completely wrong; it was confirmed on the GeForce forum by ManuelG himself.

    Source: https://forums.geforce.com/default/...he-directx-12-features-/post/4497236/#4497236
     
    Last edited: May 25, 2015
  12. Prefix

    Prefix Member Guru

    Messages:
    176
    Likes Received:
    18
    GPU:
    Sapphire R7 260X 2GB
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I didn't know that, thanks for correcting. I'll edit my post too. :)

    I wonder how complete the profiles that DXCapsViewer is reporting are at this point, though. The only difference between GCN versions in the caps seems to be min/max filtering.
    Also, apart from Maxwell v2, all of the GCN cards have more feature support than the NVIDIA cards. I wonder if Tier 2/3 support could come to GCN via a driver update (if that is indeed possible).
     
  14. Wagnard

    Wagnard Ancient Guru

    Messages:
    2,746
    Likes Received:
    519
    GPU:
    MSI Geforce GTX 1080
    While these features are optional, the 290X is missing a few of them:

    290X - Tiled Resources: Tier 2 (GTX 970: Tier 3)
    290X - ASTC Compression: No (same for GTX 970)
    290X - Min/Max Filtering: Yes (same for GTX 970)
    290X - Map DEFAULT Buffers: Yes (same for GTX 970)
    290X - Conservative Rasterization: No (GTX 970: Yes, Tier 1)
    290X - PS-Specified Stencil Ref: No (same for GTX 970)
    290X - Rasterizer Ordered Views: No (GTX 970: Yes)


    **Remember, Beta Windows 10 + Beta SDK = Beta results.
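    Since these caps come from DXCapsViewer, here is a minimal sketch of how the same optional features can be queried programmatically through ID3D12Device::CheckFeatureSupport (assuming the Windows 10 SDK's d3d12.h and a DX12-capable driver; the reported tiers and flags map directly to the entries in the list above):

    [CODE]
    // Minimal sketch: query the optional D3D12 caps discussed above.
    // Build with the Windows 10 SDK; d3d12.lib is linked via the pragma below.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main()
    {
        Microsoft::WRL::ComPtr<ID3D12Device> device;
        // Create a device on the default adapter at the minimum DX12 feature level.
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
        {
            std::printf("No DX12-capable device/driver found.\n");
            return 1;
        }

        // D3D12_FEATURE_DATA_D3D12_OPTIONS holds the optional-feature caps.
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                                  &opts, sizeof(opts))))
        {
            std::printf("Resource Binding Tier:        %d\n", opts.ResourceBindingTier);
            std::printf("Tiled Resources Tier:         %d\n", opts.TiledResourcesTier);
            std::printf("Conservative Rasterization:   tier %d\n", opts.ConservativeRasterizationTier);
            std::printf("Rasterizer Ordered Views:     %s\n", opts.ROVsSupported ? "yes" : "no");
            std::printf("PS-Specified Stencil Ref:     %s\n", opts.PSSpecifiedStencilRefSupported ? "yes" : "no");
        }
        return 0;
    }
    [/CODE]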
     
    Last edited: May 25, 2015
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    My question about all of this is: how many of these are hardware-dependent, and how many have to be implemented in the driver? Since both architectures are more or less completely programmable, apart from a few things everything else could potentially be supported, right? I mean, there is no "real" hardware for min/max filtering (for example), right?
     

  16. Wagnard

    Wagnard Ancient Guru

    Messages:
    2,746
    Likes Received:
    519
    GPU:
    MSI Geforce GTX 1080
    Honestly, I can't answer that at the moment, because it would be pure speculation on my part. I'll try to get information from someone who may know.
    If I get something, I'll be sure to tell you.
     
  17. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    And another thing: most developers will probably target the lowest common denominator of these features, that denominator being the console chips and Intel's integrated graphics.
     
  18. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    I fear the worst if we need explicit support for older-generation AMD GPUs.

    We already have an example of "impossible" backward support: VSR for the 7000 series and the rebranded 280.

    Same hardware, "impossible" support.

    I expect the same "impossible" excuse for supporting this theoretically programmable (and "old") hardware. :3eyes:

    .....
     
  19. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    DX12 was functional in earlier Technical Previews with my card; now it isn't anymore. I am worried about that :)
     
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I'm asking because I have this nagging feeling that someone on the driver team will say they won't support all the features on earlier hardware because it makes no financial sense. (Anyone remember what happened with SSAA some years ago, and more recently with VSR?)

    Although I like to keep a middle ground, this will really be the last straw if they do it.
     
