What do we want to see in hardware benchmarks? (CPU/GPU)

Discussion in 'The Guru's Pub' started by sverek, Oct 31, 2018.

  1. sverek

    sverek Ancient Guru

    Messages:
    5,580
    Likes Received:
    2,439
    GPU:
    NOVIDIA -0.5GB
    I'd like to start with GPU benchmarking

    The major issue I see right now with GPU testing in games is that all benchmarks are run on ULTRA settings.
    Without a doubt it shows us how well a GPU performs in a specific game, without any compromises.
    It's rather helpful for 1080p, because modern mid-to-high tier GPUs handle it pretty nicely and still deliver nice, stable FPS (or at least I hope so).

    However, as we reach 1440p and 4K, I really want to see how games perform on HIGH settings instead.
    This is the visual grading I have in my mind:

    Code:
    Very Low < Low < Medium < HIGH < VERY HIGH <= ULTRA
    As a budget-minded consumer, I find higher-tier GPUs like the GTX 1070 or Vega 56 the most attractive.
    My goal is not to see all the eye candy, but to have stable and smooth gameplay.
    So having an idea of how well a game performs without murdering the GPU would be really helpful for me.

    I am pretty sure some games might lose 50 FPS by simply switching from VERY HIGH to ULTRA.
    So while GPUs like the GTX 1070 or Vega 56 were struggling to maintain 60 FPS at 1440p ULTRA,
    they might deliver a stable 120 FPS on HIGH settings.

    This is also related to how 4K scares consumers with low FPS, which is caused by benchmarks being run on ULTRA settings.
    Games at 4K were never meant to be played on ULTRA settings imho.
    Run 4K on HIGH settings and I am pretty sure we will see totally different FPS numbers.

    ------

    With the above said, I also want to see how bad the lag spikes were during benchmarking.
    Having 0.1% lows would be really helpful for noticing them.

    For example:
    An Intel i5 6600K might have a higher average FPS than a Ryzen 2700X.
    However, if we look at a CPU-heavy game like BF1,
    the 6600K might have issues processing heavy fights, resulting in FPS dropping to 40 at times, while the 2700X maintains 90 FPS as its minimum.

    0.1% lows would also be helpful for GPUs, since they might throttle and drop frames while still having a higher average FPS.
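
    To make this concrete, here's a minimal sketch (plain Python, with made-up frame times) of how 1% / 0.1% low figures can be derived from a frame-time log. Reviewers differ on the exact definition, so treat this as one common interpretation, not any particular site's method:

    Code:
    # One common way to derive "1% low" / "0.1% low" FPS from logged frame times
    # (in milliseconds). The frame times below are invented for illustration.

    def low_percentile_fps(frame_times_ms, percent):
        """Average FPS of the slowest `percent`% of frames."""
        slowest = sorted(frame_times_ms, reverse=True)       # worst frames first
        count = max(1, int(len(slowest) * percent / 100))    # e.g. 0.1% of frames
        avg_ms = sum(slowest[:count]) / count
        return 1000.0 / avg_ms                               # ms per frame -> FPS

    # Mostly ~8 ms frames (125 FPS) with a few 25 ms spikes (40 FPS):
    frame_times = [8.0] * 997 + [25.0] * 3
    print(round(1000.0 / (sum(frame_times) / len(frame_times))))  # average ~124 FPS
    print(round(low_percentile_fps(frame_times, 1.0)))            # 1% low   ~76 FPS
    print(round(low_percentile_fps(frame_times, 0.1)))            # 0.1% low  40 FPS

    The average barely moves, but the 0.1% low exposes exactly the kind of spikes described above.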


    This is something that crossed my mind and I just wanted to share.
    Right now we only have consumers sharing their FPS at different game settings, and there's no reliable source for it.

    What are your thoughts, and how could benchmarking be improved?
     
    Neo Cyrus, user1 and StewieTech like this.
  2. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,664
    Likes Received:
    504
    GPU:
    2070 Super
    I disagree with going easy on ULTRA settings.
    For one simple reason: I never understood benchmarks as "this is the FPS you're going to get".

    Nor do I particularly want to see Guru3D fiddling with custom settings, finding some "sweet-spot" configuration.
    For several reasons:

    1. It's not straightforward. By definition, benching at non-MAX settings is the opposite of well-defined, giving rise to multiple possible configs. It's tiresome and time-consuming.
    2. There are no quick benefits, because running at non-MAX settings still doesn't mean "this is the FPS you're going to get".
    3. If I can't see MAX settings FPS results in gaming benchmarks, then where... what do I have to do to see games running at ULTRA?
    4. ULTRA is a showcase of present graphics technologies and an indication of times to come. Meaning it gives a perspective of what kind of FPS hit we might expect from future games running at sweet-spot mixed settings.
    5. It's an open invitation for trolls and bias accusations.
    6. KISS. No single review is going to give you all the answers. But at least a normal review will cover the basics in a systematic, straightforward manner.
     
    StewieTech and sverek like this.
  3. mbk1969

    mbk1969 Ancient Guru

    Messages:
    8,558
    Likes Received:
    5,445
    GPU:
    GeForce GTX 1070
    I bet the majority of players who can play only on high-end HW with ULTRA settings can't even see the difference between HIGH and ULTRA. But they get joy only from ULTRA.
     
    StewieTech likes this.
  4. jbscotchman

    jbscotchman Ancient Guru

    Messages:
    3,557
    Likes Received:
    2,564
    GPU:
    MSI 1660 Ti Ventus

  5. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,444
    Likes Received:
    678
    GPU:
    GTX1080Ti
    I can see the difference and I'm at native 4K. As you well know, how much distinction there is between setting levels all depends on the title. In several games there are settings beyond even the 'Ultra' preset that can absolutely tank FPS.
    And I don't mean just AA either. I think you're feeling the need for a major upgrade to join the 4K crowd and it's making you salty. :p
     
  6. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,444
    Likes Received:
    678
    GPU:
    GTX1080Ti
    My thoughts are that as each new resolution becomes commonly available (currently 4K), "Ultra" preset benches should be included. This establishes a baseline from which progress can be judged. The earlier the better.
    All benches should be done, where possible, using the game's presets and using the built in benchmark(s) if one exists. Benching other titles with custom scripts is a bit more problematic, but nothing is perfect.
     
  7. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,363
    Likes Received:
    898
    GPU:
    1080Ti H20
    We want 720P results! ;)
     
  8. user1

    user1 Maha Guru

    Messages:
    1,464
    Likes Received:
    477
    GPU:
    hd 6870
    I would tend to agree that Ultra is usually non-representative. I recall quite a few times where going from Very High to Ultra on some settings caused a >15% performance loss for virtually no improvement in visual fidelity.
     
  9. mbk1969

    mbk1969 Ancient Guru

    Messages:
    8,558
    Likes Received:
    5,445
    GPU:
    GeForce GTX 1070
    I am sure you can. I just doubt that everyone can actually see the difference. Many people "see" the difference when they know about the differences. Just like audiophiles.

    No, I praise gameplay, story, and art style (for example, as in the "Dishonored" series), not visuals. I just need good enough effects, textures, and AA. I believe that the more photorealism is implemented, the less imagination is involved. When you read a book you use your imagination at 200% to visualize the content. When you watch a movie you see the result of the crew's imagination, and your own sleeps (unneeded).
     
  10. Celcius

    Celcius Master Guru

    Messages:
    210
    Likes Received:
    247
    GPU:
    Radeon R9 380X
    Things I really don't want to see within a lower-tier graphics card review:

    1. Test results for cards whose suggested retail price is more than, say, $100 USD above the price of the card being reviewed. If someone is reading a review of an inexpensive card, how does it serve them to clutter up the test results with cards selling for 4X the price or higher? Consider an admittedly hypothetical (but not too different from reviews you've actually seen) GT 2030 example:

    HITMAN III DOUBLE-TAP
    Average Framerates, DX12 - 4K
    Graphic Settings: Beyond Ultra Extreme

    GTX 2080Ti ===================================================================================== 119.9 FPS

    (Insert boiler-plate results for 3 dozen other previously-tested, much more expensive cards here.)

    GT 2030 DDR5 == 7.3 FPS

    What is the point? How is someone enlightened by this? In general, I'd be much more interested in seeing results from the sort of cards that would plausibly represent what someone is upgrading from, using contemporary drivers.

    2. (See above.) Benchmarking inexpensive graphics cards in games at 4K at High, Ultra, or their image-quality equivalent. So, the GTX 2050 achieved an average framerate of 15.6 FPS at 4K Ultra in GTA VI: Is 'Frisco Burning? Who frackin' cares? Is anyone going to run with that? I feel a reviewer is just wasting his or her time, as well as the time of their audience.

    3. Exclusively using monster systems with top-end CPU and RAM combinations to evaluate much lower-tier graphics cards. Yes, I've heard the defense of this practice: that it eliminates the possibility of CPU-related bottlenecking. And I'm sure it helps. But who is running an i7-8086K / Z370 / RX 550 combo?

    Doing this doesn't help someone who is trying to get a handle on how something affordable will perform in Lara Croft: Brunch With The Tomb Raider using their i3-3240, FX-6300, or Pentium G4560. Remember that we're discussing lower-tier graphics cards here. One or two throwback "beer & pretzels" platforms, updated with the most current available drivers, should be part of every reviewer's kit to represent the great, unwashed masses. Yeah, it's more work, but someone, somewhere could step up and differentiate themselves from six dozen other review sites. I mean, throw us a bone now and then, yo?
     

  11. H83

    H83 Ancient Guru

    Messages:
    2,831
    Likes Received:
    443
    GPU:
    MSI Duke GTX1080Ti
    Sverek, I agree that using ultra settings in "real life" is unrealistic and almost useless, because the visual difference between Ultra and High settings is minimal and in most games it's extremely demanding. But for reviewers it makes perfect sense to use the highest settings possible to showcase how "hard" a video card can go and to make the benchmarks consistent and repeatable across different sets of hardware. Using a setting other than Ultra would only create problems and offer very little.
    Users only have to look at the charts comparing different cards to know how they perform at Ultra, then extrapolate the performance for High settings, roughly a 20-25% average increase I think (so a card averaging 60 FPS at Ultra would land somewhere around 72-75 FPS at High).

    As for the performance dips, those can be interesting, but it's time-consuming and very boring to look at them and draw conclusions.
     
  12. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,939
    Likes Received:
    2,289
    GPU:
    5700XT+AW@240Hz
    For CPU benchmarking, an FPS target: 60/120/180/240 FPS at 1080p.
    Testing method (rough sketch of the procedure below):
    - Strongest GPU paired with one of the strongest CPUs.
    - Graphical settings set to max and gradually reduced until the required FPS is reached.
    - At these settings, all other CPUs are tested.
    - At these settings, the tests are redone with the strongest GPU from the competition (to check for possible mis-optimization of the driver/game for an architecture, as it is common for games to use multiple code paths specific to AMD or NVIDIA).
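
    As a loose illustration only, here's how that calibrate-then-compare loop could be scripted. The preset names, target value, and the run_benchmark() helper are hypothetical placeholders, not any real tool's API:

    Code:
    # Hypothetical sketch of the FPS-target method described above (Python).
    # run_benchmark() is a stub; it must be wired up to an actual benchmark run.

    TARGET_FPS = 120                                            # one of 60/120/180/240 at 1080p
    PRESETS = ["ultra", "very_high", "high", "medium", "low"]   # highest first

    def run_benchmark(cpu, gpu, preset):
        """Placeholder: run the game's benchmark on this combo and return average FPS."""
        raise NotImplementedError("replace with a real benchmark run")

    def calibrate_preset(reference_cpu, reference_gpu):
        """Step down from max settings until the reference rig reaches the target."""
        for preset in PRESETS:
            if run_benchmark(reference_cpu, reference_gpu, preset) >= TARGET_FPS:
                return preset
        return PRESETS[-1]

    def test_cpus(cpus, reference_cpu, reference_gpu, competitor_gpu):
        preset = calibrate_preset(reference_cpu, reference_gpu)
        results = {cpu: run_benchmark(cpu, reference_gpu, preset) for cpu in cpus}
        # Redo with the competitor's strongest GPU to catch vendor-specific
        # driver/game code-path differences.
        recheck = {cpu: run_benchmark(cpu, competitor_gpu, preset) for cpu in cpus}
        return preset, results, recheck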
     
  13. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,444
    Likes Received:
    678
    GPU:
    GTX1080Ti
    Okay, but logically if you find graphical elements distracting, you should stick to reading. :p Seriously though, I get what you mean. I liked Fallout 3 versus Fallout 4 more for the storyline, even though Fallout 4 had superior visuals. I liked the art style of the Borderlands games; the storyline was entertaining enough. For me the pinnacle of gaming seemed to be the late 90s/early 2000s, with Unreal Tournament and NASCAR Racing 3 being games that were absolutely fantastic online. Spent many hours bouncing around Hall of Giants.
     
  14. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,444
    Likes Received:
    678
    GPU:
    GTX1080Ti
    I was going to read the rest of your post but I could not get past your prediction! Heh heh....:p
     
  15. SerotoNiN

    SerotoNiN Ancient Guru

    Messages:
    3,433
    Likes Received:
    1,055
    GPU:
    EVGA RTX 2080
    Enthusiast site = enthusiast benchmarks. Not sure there's enough user interest in the benchmarks the OP suggested. Most want to be on PC to have all the bells and whistles on. So you get benchmarks with all the bells and whistles on. Anything less would basically be console territory? I'm not throwing shade at consoles. I love them at times. I'm saying, if your PC (and I don't mean to sound snobby here) can't handle low, medium, or whatever the console counterpart is, just buy a console. Don't look to benchmarks to see how a game plays at its worst visual presentation? That's asking a lot of dedicated time, effort, and essentially money to benchmark for a very small percentage of users.
     

  16. sverek

    sverek Ancient Guru

    Messages:
    5,580
    Likes Received:
    2,439
    GPU:
    NOVIDIA -0.5GB
    Show me a console that can play at 144 FPS on a 144Hz panel. Oh wait, you can't get 144 FPS with "all the bells and whistles on". :D

    And you're right, as others pointed out. Just test whatever the devs define as ULTRA, to keep benchmarks consistent ffs.
     
  17. Holocron

    Holocron Ancient Guru

    Messages:
    2,178
    Likes Received:
    8
    GPU:
    GTX 850M / 4GB
    For me, as long as the number of shader cores on my video card closely matches the latest consoles', I am content, knowing that I'd be able to play any game they put out on the market, though definitely not on ultra settings.

    What I really want to see is how future games will make use of raytracing. Now that is what I want to see in graphics hardware benchmarks.
     
  18. airbud7

    airbud7 Ancient Guru

    Messages:
    7,707
    Likes Received:
    4,528
    GPU:
    pny gtx 1060 xlr8
    I cannot shoot with a controller. Grrrrrr!... I tried Black Ops 4 on Xbox and got my azz kicked! Is there some type of auto-aim or something? How do they do it?

    Give me a kb/m and I'll kick all their butt'z; they will be crying to mama! :D


    :p
     
