NVIDIA GeForce RTX 2080 SUPER (Founder/Reference) review

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 23, 2019.

  1. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    BTW I'm not saying the 2080 Super is bad or anything. Just that replying in a somewhat harsh way to someone for saying it should not be recommended is stretching it a lot imo.

    I mean, where I live, considering the price it doesn't make much sense to buy one. They are close to a grand ($950-1000). If I'm gonna spend that amount on a single GPU I might as well go for a Ti at this point ($1400-1500). It's like trying to save money by buying a $70,000 car instead of the $100,000 one you love. Doesn't make much sense to me. If I want something high end but I'm on a "budget", a regular 2080 (around $850-900 where I live) while they are still available, or better imo a 2070 Super ($680-700), looks like a better investment considering RTX 2.0 is probably around the corner.

    Anyway, I just think vbetts' reply was kind of harsh. I don't know the history between him and the guy he was replying to, since I only seldom read this board, but I don't think saying that the 2080 Super is not an easy recommendation is totally stupid. It's definitely not an exciting product according to many reviewers, and I tend to agree with that.
     
    Last edited: Jul 25, 2019
  2. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    We all know that early adopters get screwed over 90% of the time. That's not down to a reviewer's opinion or recommendation of a product; it's just the current competition between AMD and Nvidia that forces these newer models to be released. AMD carries a big portion of the blame here on the GPU side, because they failed to produce enthusiast or even high-end cards to compete with RTX.

    So btw I believe RTX was a semi disaster for Nvidia: releasing RT tech maybe 2 years before it's fully utilized, and having to find new ways to shift their products while people have eyes and can see the lack of RT progress. I fear that by the time RT is fully implemented into most games, the tech will have moved on. Which of course sucks for all 20XX owners. From what I've read around the net there's still only a handful of RT-enabled games, and even those are hardly used by some RTX owners because of issues or low frame rates.

    If only we could see into the future of PC tech then purchasing the right equipment @ the right time would be easy. Unfortunately that's not possible so people are going to get butt hurt.
     
  3. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    So something I noticed that is a bit odd to me:
    It only has an 8-pin and a 6-pin power connector. According to the power consumption graphs, this thing is up there with the Vega 56, 780 Ti, and 390X, each of which needs 2x 8-pins. The 2080S FE/reference card likely was never intended to be overclocked, but it still seems a bit odd to me. Of course, its wattage is actually low enough to get by on what it is provided with (75W from the mobo + 150W from the 8-pin + 75W from the 6-pin = 300W, which is sufficient headroom), but different workloads draw different amounts of power. I'm sure if you were to overclock this thing, run something like FurMark, and then somehow manage to trigger the RT cores, you'd definitely exceed the power limits.
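
    Quick back-of-the-envelope check of that budget; the per-source numbers are just the PCIe spec limits, and the ~250W board power is approximate:

# spec limits for what the reference 2080 Super can legally pull (Python)
slot_w = 75     # PCIe x16 slot
pin8_w = 150    # 8-pin PEG connector
pin6_w = 75     # 6-pin PEG connector
board_w = 250   # approximate board power of the stock 2080 Super

budget_w = slot_w + pin8_w + pin6_w  # 300 W total by spec
print(f"spec budget {budget_w} W, headroom over stock board power {budget_w - board_w} W")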

    Of course, the power limits are really just a guideline, but this is a reference card. I think it would have been wise for Nvidia to build in a bit more tolerance here and not leave room for potential worst-case scenarios.
     
  4. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Reference 780ti only had a 6+8 config.
     

  5. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    Ah whoops, I must have been looking at one of the non-reference cards.
    So, I guess Nvidia hasn't really been doing anything all that different all these years.
    Still goes to show they don't seem to want this thing OC'd.
     
  6. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Of course they don't. They don't want the RMAs involved with OCing.
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Like what? You can pull 400W through 2x 6-pin connectors on a card with no problem for the solder or the cables from the PSU. There are PCB mods that can kill the current sensing and let the card draw everything it wants/can at the moment.

    If the OC results in heat damage, it won't be the connector; it will more likely be undersized VRMs with insufficient cooling. The 8-pin connectors are there just to satisfy a standard that's blown out of proportion.
     
  8. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    Yes, you can do that. But it's not exactly good practice for a company to create a product that doesn't conform to industry standards, even if those standards are blown out of proportion. That being said, Nvidia is actually conforming to the standards; like I said earlier, they are in fact well within the limits with their current setup. My point is just that with a little bit of an OC under certain workloads, that limit would be readily exceeded.
    Again, I know it'll probably be fine, and like Loophole35 said, Nvidia isn't exactly interested in or obligated to allow people to go above the product's specs. My point was really just that, compared to similar products, this seems a bit tight for an enthusiast product.
    But, I'm sure 3rd party variants will offer 2x 8-pins.
     
  9. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The standard is simply safety overkill. A 50cm run of 1.5mm^2 wire can carry 35A, and the V-drop on the way there and back will be ~0.4V on a 12V line, so something like 11.6V at the card.
    The biggest issue would be the connector and potential heat at the contact points, but PCIe connectors are rather good, not like the old 4-pin Molex.
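
    For reference, here is the math behind that ~0.4V, plugging the standard resistivity of copper into the numbers above:

# voltage drop over a 50 cm 12V feed (1.0 m round trip) of 1.5 mm^2 copper at 35 A (Python)
rho_cu = 1.72e-8     # resistivity of copper, ohm*m
length_m = 2 * 0.5   # 50 cm out + 50 cm back
area_m2 = 1.5e-6     # 1.5 mm^2 cross-section
current_a = 35.0

resistance = rho_cu * length_m / area_m2  # ~0.0115 ohm
v_drop = current_a * resistance           # ~0.40 V
print(f"drop {v_drop:.2f} V -> about {12.0 - v_drop:.1f} V left at the card")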

    And the point remains the same. Any decent PSU has one rail for all GPUs, so if there is 600W available, that 8+6 pin configuration is enough, as long as the components on the graphics card's PCB are willing to pull it. So having 8+6 pin is not a real obstacle from Nvidia against OC; the limiting factor would be the power limit in the vBIOS or the design of the PCB itself.
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    In an ideal world, sure. But not every PSU manufacturer conforms to the industry standards. The point of giving so much headroom is to account for the weakest link in the PC, or for those who try to cheat the system. Not everyone uses the proper wire gauge, let alone copper wires. With multi-rail PSUs, some might not offer enough power over the rails the GPU has access to. Some PSUs advertise their peak wattage but not their sustained wattage. Even when you have a good PSU, you have to account for what the rest of the system handles, where vdroop might be more prominent. Of course, having an 8-pin connector doesn't guarantee these things are addressed, but it opens the door to lawsuits when a product fails to comply with the standards. If the card only asks for a 6-pin connector, the PSU manufacturer isn't obligated to offer any more than 75W over that connector.
    All that being said, I'm not so much concerned about the GPU itself offering those last 2 pins, because I trust Nvidia knows how to make solid hardware. The problem isn't them. The overkill safety you mention only applies if the PSU is also overkill.

    The whole point of giving excessive headroom is to guarantee results. Take vehicle tires for example: they can usually withstand well over 10x the pressure they advertise, but there's a reason they tell you where to stop. When you have a product that can be in such a wide range of environments, you have to account for all extremes.

    Like I said, without overclocking this GPU has more than enough headroom, so Nvidia isn't doing anything wrong here. I actually wasn't complaining about what they were doing. I'm just saying that it doesn't leave much headroom for overclocking.

    Anyway, you could argue "anybody who buys this shouldn't cheap out on a crappy PSU" but you can't control the stupidity of other people, and neither can Nvidia. By offering 2x 8-pins, you set aside all concerns. But, if Nvidia doesn't want people OC'ing this (which they probably don't) then the one 6-pin connector makes for a good deterrent (or punishment) to idiots.
     
    Fox2232 likes this.

  11. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    @Hilbert Hagedoorn hey boss, I noticed some odd data on the charts. The 2080 Ti numbers seem low when compared to the 2080S. Then I looked back at all the old reviews of the 2080 Ti and noticed they were performed on an older test bench (the X99 with a 5960X at 4.2GHz), while the newest reviews use the 9900K at default. You have a note about transitioning to the new system, but do any of the charts mix results from both systems? If so, do you plan on running at least an FE 2080 Ti through the full suite again? I just don't see the 2080S being only 10% slower at QHD, and the 2070S only ~17% slower, as realistic.
     
