970 memory allocation issue revisited

Discussion in 'Videocards - NVIDIA GeForce' started by alanm, Jan 23, 2015.

Thread Status:
Not open for further replies.
  1. gtx980

    gtx980 Guest

    Messages:
    7
    Likes Received:
    0
    GPU:
    gtx980
    OK, maybe the BIOS has other clocks for CUDA?
    You must have somehow OC'ed your 970.

    I can't change the speed via Afterburner while Nai's benchmark runs, and it reports 3005 MHz.
    It would be helpful to post your observed memory speed.
     
    Last edited: Jan 24, 2015
  2. nanogenesis

    nanogenesis Guest

    Messages:
    1,288
    Likes Received:
    6
    GPU:
    MSI R9 390X 1178|6350
    Nai's benchmark runs in the P02 state. If you want more DRAM bandwidth, match the memory clock of the P02 state to the P00 state in the BIOS.

    Also, here is a (skewed) result from a 256-bit card with a 7 GHz memory clock, the GTX 770:
    [IMG]

    Most likely, if the user had run it headless, the last two memory blocks would fall in line, but let's just look at the DRAM figures: 173 GB/s.

    So my assumption is the GTX 980 runs the DRAM benchmark at a 7 GHz memory clock, and the GTX 770 as well; only the GTX 970 uses the P02 memory clock state and runs at 6 GHz. A GTX 760/670 with 6 GHz, 256-bit memory will show the same figures as a GTX 970 in the DRAM benchmark.

    Can someone confirm?

    Also, my P02 state is set at 3780 MHz. By simple proportional scaling, 3505 MHz would net ~175 GB/s.
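The arithmetic behind these figures can be sketched as follows. This is a sketch only: the 189 GB/s input to the rescaling is an illustrative assumption (the thread doesn't give the observed figure at 3780 MHz), not a measurement.

```python
# Sketch of the GDDR5 bandwidth arithmetic used in the posts above.
# GDDR5 moves data at 2x the clock that tools like Afterburner report
# (3505 MHz shown -> 7010 MT/s effective, the marketing "7 GHz").

def peak_bandwidth_gbs(reported_clock_mhz: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s for a GDDR5 card."""
    effective_mts = reported_clock_mhz * 2               # mega-transfers/s
    return effective_mts * 1e6 * bus_width_bits / 8 / 1e9

def scale_observed(observed_gbs: float, from_mhz: float, to_mhz: float) -> float:
    """Proportionally rescale an observed figure to another memory clock."""
    return observed_gbs * to_mhz / from_mhz

# 256-bit card at the stock 3505 MHz (7 GHz effective):
print(round(peak_bandwidth_gbs(3505, 256), 2))    # -> 224.32

# If ~189 GB/s (illustrative assumption) were observed with the P02 state
# at 3780 MHz, then 3505 MHz would net about the claimed ~175 GB/s:
print(round(scale_observed(189, 3780, 3505), 2))  # -> 175.25
```

The same scaling explains why a 6 GHz card would land around 6/7 of the 7 GHz figure.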
     
    Last edited: Jan 24, 2015
  3. gtx980

    gtx980 Guest

    Messages:
    7
    Likes Received:
    0
    GPU:
    gtx980
    So you just overclocked via the BIOS?

    You should state that, to make the hunt no more difficult than it has to be.
    On my 980 it is 3005 MHz, resulting in 178 GB/s.
     
    Last edited: Jan 24, 2015
  4. gtx980

    gtx980 Guest

    Messages:
    7
    Likes Received:
    0
    GPU:
    gtx980
    The 970 runs Nai's benchmark at 3005 MHz,

    but I have to post two more times before I can share a link.
     

  5. AMN3S1AC

    AMN3S1AC Guest

    Here is a GPU-Z sensor log I captured while playing Shadow of Mordor at 1080p with an Asus GTX 970. It shows my VRAM usage peaking at 3710 MB. My GPU load also drops from around 97% to about 75% during this time, and I get some hitching/stuttering.

    SOM GPUZ Log
     
    Last edited by a moderator: Jan 24, 2015
  6. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
    Funny..
    puu.sh/eXaqD/be18c35f4e.jpg

    puu.sh/eXaGD/b6b8ae32eb.jpg

    puu.sh/eXaMX/0ca19c7e30.jpg
     
  7. SuperAverage

    SuperAverage Guest

    Messages:
    247
    Likes Received:
    2
    GPU:
    Gigabyte xtreme 1080
    As for belittling anyone: I didn't. After being called a retard, I offered that anyone who was fine with their card, regardless of the difference between advertised and actual specifications, could simply move along if the matter didn't concern them.

    Onto the matter at hand:

    This crap's getting confusing.

    First, what version of this "benchmark" should we be using?

    Secondly, are we sure it does what we want it to?

    Thirdly, the fact that people can run it without realising it needs to be headless for accurate results is muddying up any relevance it may have.

    Fourth, the fact that other factors, such as DWM, across multiple platforms can skew results muddies the data pool even further.

    All of these things make for super inconclusive results.

    What's needed is a real benchmarking program, preferably with some kind of GUI and a standardized output window that can be verified and screenshotted.

    It needs to show which blocks are in use and by what, memory speed, card model, memory bandwidth a given block/chunk of addresses has, etc.

    That is to say, show what's filling vRAM and where, and when and where the slowdowns happen.

    I can obviously run this program; I have an iGPU and, hell, two 970s. All I have to do is un-SLI them, run from one, and test the other. But the benchmark, as it stands, is not restrictive or informative enough to use as a yardstick for this issue.

    Believe me, if I could code, I'd give it a try, because I'd like to determine one way or another if 970's in part, majority or whole have an issue.
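A full CUDA benchmark is beyond a forum post, but the reporting side described above could start from something like this sketch: given per-block bandwidth readings (all names and the sample numbers here are illustrative, not measurements), flag the blocks that fall well below the median, which is where a partitioned last memory segment would show up.

```python
from statistics import median

def flag_slow_blocks(readings, ratio=0.5):
    """readings: list of (offset_mb, bandwidth_gbs) per tested VRAM block.
    Returns the blocks whose bandwidth falls below ratio * median,
    i.e. the pattern a slow last segment would produce."""
    med = median(bw for _, bw in readings)
    return [(off, bw) for off, bw in readings if bw < ratio * med]

# Illustrative numbers shaped like the reported 970 results:
# fast blocks around 150 GB/s, the last ~500 MB around 20 GB/s.
sample = ([(i * 128, 150.0) for i in range(27)] +
          [(3456 + i * 128, 20.0) for i in range(4)])
print(flag_slow_blocks(sample))
# -> [(3456, 20.0), (3584, 20.0), (3712, 20.0), (3840, 20.0)]
```

Using the median as the reference makes the check robust to headless-vs-desktop differences, since a few Windows-reserved blocks won't shift it much.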
     
    Last edited: Jan 24, 2015
  8. VultureX

    VultureX Banned

    Messages:
    2,577
    Likes Received:
    0
    GPU:
    MSI GTX970 SLI
    Let me help you there:

    [IMG]

    [IMG]

    [IMG]

    This pretty much illustrates the ungodly performance drop that occurs on the GTX 970. It doesn't happen in every game, but something is surely wrong.
     
  9. Pill Monster

    Pill Monster Banned

    Messages:
    25,211
    Likes Received:
    9
    GPU:
    7950 Vapor-X 1100/1500
    That doesn't sound all that surprising when you think about it, because not only are those old games 32-bit, they are also DX9. DX9 doesn't support unified address space or Tiled Resources.

    As a dev developing a DX9 game, I would want to keep everything in dedicated VRAM even if it didn't need to be used straight away; otherwise the data has to be mirrored in RAM and then copied to VRAM using a swap buffer.

    DX11 apps, on the other hand (e.g. Far Cry 4), can reserve CPU memory and page to and from it as required, and it's zero-copy, so there's no need for a huge buffer. (I'm not 100% certain of that, but I don't think a swap buffer would be necessary.)
    So I would assume what you are seeing is normal (or a bug), depending on the app, OS, and drivers.

    Just some food for thought......
    tbh I haven't seen anything yet that seems conclusive either way. Too many variables in the mix and no control.
     
    Last edited: Jan 24, 2015
  10. AMN3S1AC

    AMN3S1AC Guest

    From Nvidia Chat Support this afternoon - (names changed)

    [10:11:39 PM] NV Chat: We have our entire team working on this issue with a high priority. This will soon be fixed for sure.
    [10:11:54 PM] Me: So, what is the issue?
    [10:12:07 PM] Me: What needs to be fixed?
    [10:12:46 PM] NV Chat: We are not sure on that. We are still yet to find the cause of this issue.
    [10:12:50 PM] NV Chat: Our team is working on it.
     

  11. Memorian

    Memorian Ancient Guru

    Messages:
    4,021
    Likes Received:
    890
    GPU:
    RTX 4090
    If it's hardware-related and NVIDIA tells us to RMA, I can't even imagine how hard it will be to convince the shops to accept an RMA for this reason. Please Lord, let this be fixed with a simple BIOS update.
     
  12. Cru_N_cher

    Cru_N_cher Guest

    Messages:
    775
    Likes Received:
    2
    GPU:
    MSI NX8800GT OC
    For me it's a purely economic decision, based also on the facts of changing development routes in the future, and it will have virtually no impact for users in the long run ;)

    Price/performance/consumption.

    I can't really believe a bug like this went through WHQL and Nvidia's own simulations unnoticed.

    Also, Nvidia's design fully concentrates on variable-framerate results, which are no longer so easily broken perception-wise; Shadowplay's VFR mode shows that, and so does G-Sync.

    Now, when software efficiency can't keep up with this, problems naturally arise ;)

    And that is what many users are currently perceiving: the more complex the systems become (multithreading in a simulated environment), the more problematic it is to keep them perceptually stable. This is especially a problem with the modular PC hardware architecture ;)
     
    Last edited: Jan 24, 2015
  13. flexy

    flexy Guest

    Messages:
    198
    Likes Received:
    3
    GPU:
    Riva 128
    Just wondering whether this test can be compiled to run on Linux, or from a bootable DVD, or even DOS, to entirely exclude Windows, DWM, etc.

    And, edit: having the full 4 GB available at full speed *DOES MATTER*, of course. There are people who downsample from 4K, for instance. Having such huge memory performance drops, sometimes from 2.6 GB onward, is absolutely ridiculous! Please don't argue with the nonsense that it won't matter since "no game uses that much video memory".
     
    Last edited: Jan 24, 2015
  14. davido6

    davido6 Maha Guru

    Messages:
    1,441
    Likes Received:
    19
    GPU:
    Rx5700xt
    Wonder what other cards are affected by this.
     
  15. Im2bad

    Im2bad Guest

    Messages:
    791
    Likes Received:
    0
    GPU:
    3080 Gaming X Trio
    I would think most shops would take them back, especially if the manufacturer says there is a fault in the product.

    It's great to know that Nvidia is at least claiming to look into it. It'll be nice to know whether there really is a problem. Uncertainty is always annoying.
     

  16. SuperAverage

    SuperAverage Guest

    Messages:
    247
    Likes Received:
    2
    GPU:
    Gigabyte xtreme 1080
    Thanks for suggesting that. I meant to.

    That is, a bootable image that excludes Windows altogether.

    An image like GParted or similar that will test video RAM.
     
  17. Cru_N_cher

    Cru_N_cher Guest

    Messages:
    775
    Likes Received:
    2
    GPU:
    MSI NX8800GT OC
    That is not the design goal of the GTX 970, sorry. It targets 2K/4K at moderate framerates; for more complex simulations you have to either cut down, go with the GTX 980, or compensate for the losses with G-Sync.
    The primary goal of Nvidia's hardware design since Maxwell is balanced efficiency at an acceptable price point,
    not the many uneconomical, mostly useless things users here do all the time, which drive no real improvement and mostly just raise the overhead in the system exponentially ;)
     
    Last edited: Jan 24, 2015
  18. SuperAverage

    SuperAverage Guest

    Messages:
    247
    Likes Received:
    2
    GPU:
    Gigabyte xtreme 1080
    Advertised with 4GB of usable vRAM.
     
  19. Cru_N_cher

    Cru_N_cher Guest

    Messages:
    775
    Likes Received:
    2
    GPU:
    MSI NX8800GT OC
    And physically they are usable. So now what?
     
  20. gUNN1993

    gUNN1993 Guest

    Messages:
    289
    Likes Received:
    31
    GPU:
    GTX 670 Windforce 3
    From my 670, it looks like there's only a slowdown in the last couple hundred MB (which I assume is Windows, as I have no idea how to switch to the iGPU without removing the GPU).

    http://puu.sh/eXmCR/e6662a23d6.png
     