3080 Owner's thread!

Discussion in 'Videocards - NVIDIA GeForce' started by Syranetic, Sep 17, 2020.

  1. SpajdrEX

    SpajdrEX AMD Vanguard

    Messages:
    2,547
    Likes Received:
    852
    GPU:
    Sapphire RX 6800XT
    This Gainward non-GS is not bad; it needs only 0.837V for 1905MHz (tested in Quake II RTX). I'll try to go lower on the voltage.
     
    Maddness likes this.
  2. Archvile82

    Archvile82 Member Guru

    Messages:
    192
    Likes Received:
    79
    GPU:
    EVGA 3080 FTW U
    I finally received my EVGA 3080 FTW3 Ultra today. Apart from some teething issues related to drivers and TV firmware, I'm really happy.

    Games feel better and the performance at 4k is solid compared to the 2080ti.

    I'm using the OC BIOS, as the normal one would make the fans spin up and down, which sounded annoying; plus the card idles around 48-50°C due to zero-fan mode. TBH I can't really hear it in OC mode, and the card idles around 32-34°C and hits about 72°C in the Time Spy bench.
     
    mitzi76 likes this.
  3. mitzi76

    mitzi76 Ancient Guru

    Messages:
    8,722
    Likes Received:
    19
    GPU:
    MSI 970 (Gaming)
    Can you test some VRAM usage at 4K please, and have you had any issues (re: going over 10GB)?
     
  4. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,371
    Likes Received:
    43
    GPU:
    MSI RTX2080Ti
    And it's a load of bullcrap... You can't just move a chip designed for Samsung's 8nm fab over to TSMC's 7nm fab; it would need a redesign to work on the TSMC 7nm node! Samsung's 8nm is technically a tweaked 10nm process with higher density, hence they decided to call it 8nm. But in all honesty, the nanometer naming we have for processes now has little bearing on the actual transistor gate size anymore. That kind of went out the window when transistors got smaller than 32nm and designs jumped to FinFETs.

    https://www.techpowerup.com/266351/...w-density-metric-for-semiconductor-technology
    https://www.techpowerup.com/272489/...-7-nm-node-using-scanning-electron-microscope

    Looking at the CPU side of things, Intel's 14 nm chips feature transistors with a gate width of 24 nm, while the AMD/TSMC 7 nm ones have a gate width of 22 nm. Gate width had been the metric for years, up until the 32 nm process, when it really couldn't shrink down significantly anymore.
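    To put the density gap in numbers, here's a back-of-the-envelope sketch. All figures below are approximate public estimates, not official specs (Samsung 8LPP ≈ 61 MTr/mm², TSMC N7 ≈ 91 MTr/mm², GA102 ≈ 28.3 billion transistors on ≈ 628 mm²):

```python
# Back-of-the-envelope die-size comparison between the two nodes.
# All figures are approximate public numbers, not official specs:
#   Samsung 8LPP peak density ~61 MTr/mm^2, TSMC N7 ~91 MTr/mm^2
#   GA102: ~28.3 billion transistors on a ~628 mm^2 die
SAMSUNG_8LPP = 61.0            # MTr/mm^2 (approximate)
TSMC_N7 = 91.0                 # MTr/mm^2 (approximate)
GA102_TRANSISTORS_M = 28_300   # millions of transistors
GA102_AREA_MM2 = 628.0

# Real designs never hit peak density; scaling the achieved density
# by the ratio of the two nodes' peak figures gives a rough estimate.
achieved = GA102_TRANSISTORS_M / GA102_AREA_MM2
hypothetical_n7_area = GA102_AREA_MM2 * (SAMSUNG_8LPP / TSMC_N7)

print(f"Achieved density on 8LPP: {achieved:.1f} MTr/mm^2")          # ~45.1
print(f"Hypothetical N7 die area: {hypothetical_n7_area:.0f} mm^2")  # ~421
```

    Even with that area win on paper, re-targeting means new design rules, new cell libraries, and a fresh tape-out, which is exactly the redesign cost being argued about here.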
     
    Icanium likes this.

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    8,384
    Likes Received:
    2,795
    GPU:
    GTX 1080ti
    Yes you can.
     
  6. Archvile82

    Archvile82 Member Guru

    Messages:
    192
    Likes Received:
    79
    GPU:
    EVGA 3080 FTW U
    Hi, I would, but I don't run any monitoring software; in the past I found it gave me problems. All I can say is everything feels smooth, with no stutters, and load times seem faster.
     
    mitzi76 likes this.
  7. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,371
    Likes Received:
    43
    GPU:
    MSI RTX2080Ti
    endbase likes this.
  8. Icanium

    Icanium Ancient Guru

    Messages:
    1,547
    Likes Received:
    87
    GPU:
    MSI GTX1080ti GamX
    We are not privy to NVIDIA's sourcing strategy. I read earlier that NVIDIA tried to use the threat of Samsung's 8nm to get a better price from TSMC on their 7nm fab and failed, which resulted in Ampere going to Samsung. NVIDIA would not need to move a chip designed for Samsung's 8nm fab over to TSMC's 7nm fab if it had dual-sourced Ampere for both Samsung and TSMC.
    https://www.techpowerup.com/266656/tsmc-secures-orders-from-nvidia-for-7nm-and-5nm-chips
     
  9. Netherwind

    Netherwind Ancient Guru

    Messages:
    7,575
    Likes Received:
    1,491
    GPU:
    MSI 3080 Gaming X
    No issues at 4K so far. Played RDR2, Detroit: Become Human, Horizon: Zero Dawn, Witcher 3.
     
    Maddness likes this.
  10. scope

    scope New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    2 x 7970 Sapphire OC
    All kinds of fake news get multiplied and circulated. As I said, a GPU has to be redesigned from scratch in order to be transferred to a new fab process. You can draw your own conclusions.
     

  11. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,371
    Likes Received:
    43
    GPU:
    MSI RTX2080Ti
    There isn't a single game I can think of that actually exceeds 10GB of memory usage currently. Most of the nonsense about games actually using anywhere close to 10GB comes from the fact that pretty much all monitoring tools only track allocated VRAM. The allocated stat is basically just how much memory the game asks to have reserved so it can't be utilized by Windows or other applications. Most games don't use the entire pool of allocated VRAM; in some cases as much as 40-50% of it is not being used, just reserved.

    However, the most recent Afterburner beta (v4.6.3.15854 Beta 3) added features to the Gpu.dll plugin to track the actual memory usage of the running process!

    Here is RDR2 on Ultra settings at 4K with 1.5x resolution scaling (so actually running at 5760x3240). Actual memory usage is 6.2GB with 7.3GB allocated. I only rode around for maybe 2-3 minutes, though; usage will usually creep up after playing for an hour or so as you load different areas, most likely by another 1GB as more textures get cached to VRAM.

    http://shadowdane.com/public/RDR2/RDR2_6K_downscaled_4K.png
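    For anyone wanting to check this outside Afterburner, NVIDIA's NVML library exposes both device totals and per-process VRAM use. A minimal sketch, assuming the `nvidia-ml-py` package (imported as `pynvml`) and an NVIDIA driver are present; per-process figures can be unavailable under Windows WDDM, so the code guards for that:

```python
# Sketch: query total and per-process VRAM via NVML.
# Assumes the 'nvidia-ml-py' package (imported as pynvml) and an
# NVIDIA driver; per-process numbers may be None under Windows WDDM.

def to_gib(n_bytes):
    """Convert a byte count to GiB."""
    return n_bytes / (1024 ** 3)

try:
    import pynvml
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Total VRAM: {to_gib(info.total):.1f} GiB, "
          f"in use: {to_gib(info.used):.1f} GiB")
    # Per-process figures, similar to what the Afterburner plugin reports:
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        if proc.usedGpuMemory is not None:
            print(f"  pid {proc.pid}: {to_gib(proc.usedGpuMemory):.2f} GiB")
    pynvml.nvmlShutdown()
except Exception as exc:
    print(f"NVML unavailable: {exc}")
```

    Note the device-level "in use" figure still counts every process plus driver reservations; it's the per-process numbers that correspond to what a single game actually uses.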
     
    mitzi76, OnnA, Undying and 1 other person like this.
  12. SpajdrEX

    SpajdrEX AMD Vanguard

    Messages:
    2,547
    Likes Received:
    852
    GPU:
    Sapphire RX 6800XT
    @Shadowdane : Breakpoint at 4K Ultimate settings does use more than 10GB of VRAM, though I haven't tried it yet as it's not exactly a game I really need to play :)
     
  13. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,371
    Likes Received:
    43
    GPU:
    MSI RTX2080Ti
    Have you tested that with the new Afterburner feature I mentioned above? There is a difference between allocated VRAM and the VRAM actually being used! I don't own Breakpoint, but here's Rise of the Tomb Raider with 3 scenes from the built-in bench. The game is basically reserving about 25% more memory than is actually being used. It's like regular RAM: give applications 32GB to work with and they'll allocate more to keep a bigger cache loaded.

    I'm sure if you tested this game on a 3090 for example it might allocate more than 10GB but the actual used memory would be about the same.

    https://imgur.com/gallery/7I2E7uK

    Scene 1
    Allocated: 10.77GB
    Used: 8.37GB

    Scene 2
    Allocated: 10.75GB
    Used: 8.34GB

    Scene 3
    Allocated: 10.08GB
    Used: 7.68GB
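    Running the three scenes' numbers through (just the arithmetic on the figures above, nothing more), roughly a fifth to a quarter of each allocation sits idle:

```python
# Allocated vs. actually-used VRAM (GB) from the three scenes above.
scenes = {
    "Scene 1": (10.77, 8.37),
    "Scene 2": (10.75, 8.34),
    "Scene 3": (10.08, 7.68),
}

for name, (allocated, used) in scenes.items():
    over_used = (allocated - used) / used * 100        # extra vs. what's used
    idle_share = (allocated - used) / allocated * 100  # unused slice of allocation
    print(f"{name}: allocation is {over_used:.0f}% above use; "
          f"{idle_share:.0f}% of it sits idle")
```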
     
    Last edited: Oct 23, 2020
    mitzi76 and endbase like this.
  14. Archvile82

    Archvile82 Member Guru

    Messages:
    192
    Likes Received:
    79
    GPU:
    EVGA 3080 FTW U
    I managed to get 4K, 120Hz, 10-bit and HDR working on my C9 by using PC resolutions instead. Doom Eternal looks amazing, so smooth, and oddly enough everything looks clearer; it could be down to the smoother motion.
     
  15. Chastity

    Chastity Ancient Guru

    Messages:
    2,588
    Likes Received:
    839
    GPU:
    Nitro 5700 XT
    Well, the design wouldn't change much, but it would have to be tweaked for the new density and then taped out again for the 7nm fab. It can be done, but the new tape-out would cost a bundle.
     

  16. Icanium

    Icanium Ancient Guru

    Messages:
    1,547
    Likes Received:
    87
    GPU:
    MSI GTX1080ti GamX
    No transfer required. NVIDIA dual-sourced Ampere to both TSMC's 7nm and Samsung's 8nm fab processes. NVIDIA wanted to use the Samsung threat to get a better price for TSMC 7nm and failed. From a May 5th, 2020 TechPowerUp article: "TSMC has reportedly secured orders from NVIDIA for chips based on its 7 nm and 5 nm silicon fabrication nodes, sources tell DigiTimes". The cost of TSMC's 7nm process has now gone down, which is what NVIDIA was hoping for.
     
  17. sertopico

    sertopico Maha Guru

    Messages:
    1,067
    Likes Received:
    115
    GPU:
    EVGA 3080 FTW3 GU
    Managed to get a 3080 FTW3 Ultra Gaming, out of PURE luck. Now I'm trying to undervolt this beast to lower the power draw and temps; the best I've got is 0.900V @ 1995MHz, but it still draws about 360-380W. What's the sweet spot in your opinion? Also, does anybody know how to disable the RGB Christmas lights? EVGA Precision doesn't let me; there's no option to customize the LEDs.
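    On the power question, a first-order estimate: dynamic power scales roughly with V²·f, so dropping from a typical stock point around 1.06 V (an assumed figure for a 3080, not a measured one) to 0.900 V at similar clocks should cut dynamic draw to roughly 72%:

```python
# First-order dynamic power estimate: P_dyn ~ C * V^2 * f.
# The stock voltage below is an assumed typical value for a 3080,
# not a measured one; leakage and clock differences are ignored.
v_stock = 1.06   # V, assumed stock operating point
v_uv = 0.900     # V, the undervolt from the post

scale = (v_uv / v_stock) ** 2
print(f"Dynamic power scales to ~{scale:.0%} of stock at equal clocks")
```

    In practice total board power also includes memory and leakage, so the measured 360-380W won't drop by the full ratio, but it explains why undervolting pays off so well on Ampere.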
     
  18. nizzen

    nizzen Maha Guru

    Messages:
    1,208
    Likes Received:
    306
    GPU:
    3x2080ti/5700x/1060
    NVIDIA has been making Ampere (GA100) since March 2020 :)
     
  19. N0sferatU

    N0sferatU Ancient Guru

    Messages:
    1,720
    Likes Received:
    49
    GPU:
    EVGA RTX 3080 Ultra
    Yes, Precision X1 controls all the LEDs. I can make the entire card go dark (no lights) or anything in between (static color, rainbow, wave, etc.).

    In other news, my goodness this card is butter. Flew around my area (Tampa, FL) and it's astonishing with the graphics ramped up at 3440x1440 Ultra settings. I've found that in the middle of the ocean (i.e. no scenery other than the current Atlantic hurricane I flew into) I hit the "GPU limit" instead of the "MainThread" (CPU) limit; the card can crank out roughly 92-98 FPS in FS 2020 without the CPU limiting it.

    EDIT: This is with the GeForce software recording in real time while still rendering the game at this frame rate. Stupid. It costs only a few FPS at most over not recording.

     
  20. sertopico

    sertopico Maha Guru

    Messages:
    1,067
    Likes Received:
    115
    GPU:
    EVGA 3080 FTW3 GU
    I dl'ed the wrong version and didn't see there was a specific release for the 3000 series. :D I finally turned everything off. Thank you!

    edit: I'm having a small issue with my FTW3: on Facebook, when a video autoplays in the newsfeed and I refresh the page with the video still on screen, the nvlddmkm driver crashes and recovers. When this happens the core frequency spikes to 1800MHz, the browser window goes black for a second, and afterwards everything works again. This only happens with Facebook videos; if, for instance, I'm watching a 4K video on YT and refresh, nothing unusual happens. It occurs with the card at default settings, with no custom OC and so on; the only thing I did was switch to OC mode on the card itself. Could somebody who owns an FTW3 test this particular scenario and report back? Idk whether it's the card, Windows 10, or the way Facebook is programmed. The card gives me no trouble while playing; the drivers have never crashed, even at the most demanding settings and resolutions.
     
    Last edited: Oct 26, 2020

Share This Page