Nvidia GeForce 364.72 WHQL driver download & discussion

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by masterchan777, Mar 28, 2016.

  1. Damien_Azreal

    Damien_Azreal Ancient Guru

    Messages:
    11,509
    Likes Received:
    2,181
    GPU:
    Gigabyte OC 3070
    I'm not saying specifically for those games.
    I'm saying in general, hopefully newer driver sets will offer improved DX12 performance all the way around.

    It's something Nvidia needs to work on.
    I was simply using those games as an example. There are a couple of big-name betas starting... and normally Nvidia puts out drivers for such releases.
    And if they do, hopefully they're able to offer up some solid improvements overall.
     
  2. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,882
    Likes Received:
    1,015
    GPU:
    RTX 4090
    Well, I wouldn't expect any significant DX12 improvements for Maxwell prior to the Pascal launch now. It's pretty obvious that Pascal's general performance is their main focus at the moment. The improvements for Maxwell's DX12 will come either with the Pascal driver or at a later date.
     
  3. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    What do you base this on?

    I've played two DX12 games, AotS and RotTR. Both run fine.

    Drivers have a smaller influence under DX12.
     
  4. Damien_Azreal

    Damien_Azreal Ancient Guru

    Messages:
    11,509
    Likes Received:
    2,181
    GPU:
    Gigabyte OC 3070
    Even a small improvement is better than nothing.
    I'm not expecting anything grand, and given the state of Nvidia's last few driver releases... I'm not expecting much, if anything.

    But, every little bit helps.
     

  5. otimus

    otimus Member Guru

    Messages:
    171
    Likes Received:
    1
    GPU:
    GTX 1080
    Yeah, I think it's more a case of most DX12 titles being optimized for AMD by the game developers, and/or most of said games coming from console-centered developers, where the only GPU hardware is AMD's.
     
  6. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
  7. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,882
    Likes Received:
    1,015
    GPU:
    RTX 4090
    Well, from what we've seen up till now, NV's DX12 driver is in pretty good shape, if not at the peak of possible performance. Can't say that I've had any driver issues in any DX12 games I have.

    Fermi has the same featureset as Kepler.
     
    Last edited: Apr 13, 2016
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    I think it's mostly because of CUDA compute capability issues; with Kepler it's better, Maxwell advanced some more, and Pascal improved a little further (extra unified memory) - but not as big a jump as Fermi > Kepler.

    Fermi GF100/110: 2.0
    Kepler GK100/110: 3.5
    Maxwell GM200: 5.2
    Pascal GP100: 6.0 (Tesla only so far..)
    https://en.wikipedia.org/wiki/CUDA

    If you read through the numerous Fermi issues at that link, you'll know why. And it's only for CUDA rendering; it's not that they will drop support completely.
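
    If you want to check what a card actually reports, a minimal sketch using the CUDA runtime API should do it (compile with nvcc; untested here, so treat it as a rough example):

    Code:
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            std::printf("No CUDA devices found.\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            // prop.major/prop.minor is the compute capability,
            // e.g. 2.0 for Fermi GF100, 5.2 for Maxwell GM200
            std::printf("Device %d: %s, compute capability %d.%d\n",
                        i, prop.name, prop.major, prop.minor);
        }
        return 0;
    }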



    Also funny how much they mention TDR in that readme documentation.. :infinity:
    Either disable it completely in Windows, or set it to 30 sec, or set it to 60 sec..

    Now I really wonder if raising it to 30 sec would overcome higher GPU OC issues; so far I only got a TDR if I OC'ed beyond, idk, 1490MHz..
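
    For reference, TDR lives under HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers in the registry (TdrDelay in seconds, TdrLevel to disable). A small sketch to read the current TdrDelay - hedged, since on most machines the value simply isn't set and the Windows default of 2 seconds applies:

    Code:
    #include <cstdio>
    #include <windows.h>
    #pragma comment(lib, "advapi32.lib")

    int main() {
        DWORD delay = 0, size = sizeof(delay);
        // TdrDelay is a REG_DWORD in seconds; changing it needs admin rights and a reboot
        LSTATUS rc = RegGetValueA(HKEY_LOCAL_MACHINE,
                                  "SYSTEM\\CurrentControlSet\\Control\\GraphicsDrivers",
                                  "TdrDelay", RRF_RT_REG_DWORD, nullptr, &delay, &size);
        if (rc == ERROR_SUCCESS)
            std::printf("TdrDelay = %lu seconds\n", delay);
        else
            std::printf("TdrDelay not set; Windows default (2 seconds) applies.\n");
        return 0;
    }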
     
    Last edited: Apr 13, 2016
  9. otimus

    otimus Member Guru

    Messages:
    171
    Likes Received:
    1
    GPU:
    GTX 1080
    It does make them look bad. But the sad fact is, no one is really going to care. Probably 99% of the people who are knowledgeable enough to know the differences between videocards upgrade their GPUs every 2-3 years anyway.

    That's sort of the unfortunate nature of PC hardware. The people knowledgeable enough to complain don't bother, because they have little experience with how things pan out on older hardware, and the people not knowledgeable enough probably think everything is fine. (And, frankly, it sort of is fine. Sort of. I mean, it's not, but there are probably worse things Nvidia is doing/has done that deserve more attention than this.)

    I just really want DX12 and Vulkan to become widely used, and right now there seem to be so many stopgaps holding that up that I fear we'll barely get much use out of Vulkan for a good while, and DX12 probably won't be widely used until an iteration or two from now.
     
    Last edited: Apr 13, 2016
  10. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    It seems like there is room for improvement for some Nvidia cards in Killer Instinct, with the GTX 960 running the same as the HD 7850:

    [benchmark chart]
     

  11. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,882
    Likes Received:
    1,015
    GPU:
    RTX 4090
    Not sure what you mean there, as all DX12 cards are running on the exact same compute capability at the moment -- DirectCompute 5.0. There's DC 4.x for older DX10 h/w, but there's nothing above 5.0.
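
    (For the curious: the DirectCompute level falls out of the D3D feature level, and you can query it yourself. A rough C++ sketch on the D3D11 runtime - default adapter assumed, error handling mostly omitted:)

    Code:
    #include <cstdio>
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    int main() {
        ID3D11Device* dev = nullptr;
        D3D_FEATURE_LEVEL fl;
        if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                     nullptr, 0, D3D11_SDK_VERSION, &dev, &fl, nullptr)))
            return 1;
        if (fl >= D3D_FEATURE_LEVEL_11_0) {
            // Feature level 11_0+ mandates full DirectCompute 5.0 (cs_5_0)
            std::printf("DirectCompute 5.0\n");
        } else {
            // On DX10-class h/w compute shaders (cs_4_x) are optional, so query for them
            D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
            dev->CheckFeatureSupport(D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
                                     &opts, sizeof(opts));
            std::printf(opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x
                            ? "DirectCompute 4.x\n" : "No DirectCompute\n");
        }
        dev->Release();
        return 0;
    }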


    DX12 and Vulkan were never meant to be widely used, as they are made for a rather limited number of top-tier graphics programming gurus. With Fermi being one of the DX12 support targets it would actually be harder to build a DX12 renderer, so the lack of Fermi in that picture will actually help those who want to go the D3D12 route.
     
  12. yobooh

    yobooh Guest

    Messages:
    260
    Likes Received:
    15
    GPU:
    Gigabyte 970 G1
    Over time AMD is still continuing to improve HD 7xxx series performance.
    In almost all benchmarks now a 7870 surpasses my 660 Ti (especially since the release of the Crimson drivers), while in the past it was the reverse, and this makes me angry.
    For me this is proof that AMD continues to improve older generations too, while Nvidia is focused on Maxwell... or rather, at the moment, on Pascal.

    Anyway... the Xbox One has a GPU similar to the AMD 7770, so it's obvious that the game is just well optimized for those cards. Same for Quantum Break.
     
    Last edited: Apr 13, 2016
  13. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,941
    Likes Received:
    1,239
    GPU:
    .
    The most annoying compromises in D3D12 are Tier 1 in Resource Binding and in Resource Heap. Fermi is/was the only architecture at Tier 1 in both of them. But the D3D12 API provides everything a developer needs to handle those limitations.
    I am not in a position to state how much Fermi impacted some specifications of the API (though I am aware of some fragments :) ).
    The most curious could do their own research by cross-referencing the different IHVs' architecture documentation (though the CUDA references are not the best thing in the world).. Please note there are more than 3 IHVs, which makes the research more challenging.
    Anyway, the most obvious conclusion is that Fermi was probably a waste of time and resources..
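
    If you're curious what tiers your own card reports, a minimal C++ sketch using CheckFeatureSupport (default adapter assumed; treat it as a rough example):

    Code:
    #include <cstdio>
    #include <d3d12.h>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        ID3D12Device* dev = nullptr;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     __uuidof(ID3D12Device), (void**)&dev)))
            return 1;
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        dev->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
        // Fermi-class h/w would report 1 for both; Tier 3 binding / Tier 2 heap is the current max
        std::printf("Resource Binding Tier: %d\n", (int)opts.ResourceBindingTier);
        std::printf("Resource Heap Tier:    %d\n", (int)opts.ResourceHeapTier);
        dev->Release();
        return 0;
    }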
     
  14. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    And the HD 7850 is mopping the floor with my previous GTX 660; it seems like I upgraded to Maxwell and overclocked to the max just to keep up with that old AMD card :bang:
     
  15. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black

    DirectCompute version is not CUDA compute capability version. Check that CUDA wiki link again. :nerd:

    Nsight requires certain features beyond CUDA 2.x, and that's why Fermi has trouble with it. Imo it's best to drop support for it anyway, but yeah, that doesn't mean Fermi will now be a legacy GPU for normal compute apps/DX12 games, or suddenly lose normal driver support.
     
    Last edited: Apr 13, 2016

  16. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,882
    Likes Received:
    1,015
    GPU:
    RTX 4090
    Nsight isn't about CUDA only; it allows general GPU debugging in graphics as well. Discontinuing Fermi support means that, starting with the next version, devs won't be able to optimize their (DX11/DX12) code for Fermi - although I don't think there are many devs who do this right now anyway, so in my view this isn't a big deal.
     
  17. otimus

    otimus Member Guru

    Messages:
    171
    Likes Received:
    1
    GPU:
    GTX 1080
    It's not so much that they "kept improving" the cards as a combination of things: the cards were always fairly powerful, AMD just had really, really bad DX11 drivers early in GCN's life that got better over time, and AMD kept rebadging and rebadging and rebadging for nearly 4 years straight.
     
