So what do I think of the new GTX 580 then?

Discussion in 'Videocards - NVIDIA GeForce' started by Darren Hodgson, Nov 13, 2010.

  1. macdaddy

    macdaddy Guest

    Messages:
    2,400
    Likes Received:
    4
    GPU:
    TITAN X


    Exactly my point: you have to tweak it. But defaults are what we're talking about.
     
  2. qstoffe

    qstoffe Active Member

    Messages:
    55
    Likes Received:
    0
    GPU:
    nVidia GTX 580 @ stock
    That means you can't trust today's graphics hardware reviews. They test cards with default settings. Where do you draw the line on different IQ? A black screen?
     
  3. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    Default settings for both NVIDIA and AMD focus on performance optimisations, so you're comparing optimisations with optimisations. Some reviewers occasionally show the differences between cards at the same settings, like Guru3D and AlienBabelTech.

    The best way is to judge it for yourself. Image quality is subjective; everyone has different tastes and preferences.
     
  4. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,222
    Likes Received:
    1,541
    GPU:
    NVIDIA RTX 4080 FE
    That screenshot, although from an older build of GPU-Z, just reminded me of something odd I noticed with the latest version (v0.48) last night.

    When I first run GPU-Z, it shows the Bus Interface as PCI-E 2.0 x16 @ 2.0 x16 or something like that. However, if I leave GPU-Z running, it changes back and forth at random between that and the one you have, PCI-E x16 @ x16. I'm guessing that's a bug in GPU-Z, though.
     

  5. Rollo2

    Rollo2 Banned

    Messages:
    73
    Likes Received:
    0
    I'm not aware of any independent review site finding that NVIDIA's default settings reduce image quality and increase performance across multiple games.

    Do you have a link?

    There's a big difference between finding a couple of bugs in isolated parts of games and a consistent reduction of image quality through reduced AF. Neither company does the latter.



    NVIDIA Focus Group Member
    NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to
    facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members.
     
  6. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    I'm assuming that if you keep mentioning you're an NVIDIA Focus Group member, you actually have an NVIDIA card.

    If so, go into the NVIDIA Control Panel, 3D settings, hit Restore and see the default global settings for yourself.

    Negative LOD bias is set to "allow" by default, and the control panel states that this reduces image quality.
    Trilinear optimisation is on by default, which reduces image quality to improve performance.
    Antialiasing gamma correction defaults to off, which the panel states also reduces image quality for OpenGL.

    You need to change these settings manually if you want better quality.
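    For anyone who would rather check those defaults programmatically than eyeball the control panel, here is a minimal sketch using NVIDIA's public NVAPI driver-settings (DRS) interface. The two setting IDs are assumptions taken from the NvApiDriverSettings.h header that ships with the NVAPI SDK; verify them against your SDK version before relying on the output.

    /* Minimal sketch: read two of the driver's default global 3D settings
     * via NVAPI DRS. The setting IDs below are assumed from
     * NvApiDriverSettings.h and should be checked against your SDK. */
    #include <stdio.h>
    #include "nvapi.h"
    #include "NvApiDriverSettings.h"

    int main(void)
    {
        NvDRSSessionHandle session;
        NvDRSProfileHandle base;
        NVDRS_SETTING s = { 0 };
        s.version = NVDRS_SETTING_VER;

        if (NvAPI_Initialize() != NVAPI_OK) return 1;
        NvAPI_DRS_CreateSession(&session);
        NvAPI_DRS_LoadSettings(session);          /* read current driver settings */
        NvAPI_DRS_GetBaseProfile(session, &base); /* global profile holds the defaults */

        /* Negative LOD bias clamp: 0 = allow (the default described above) */
        if (NvAPI_DRS_GetSetting(session, base, PS_TEXFILTER_NO_NEG_LODBIAS_ID, &s) == NVAPI_OK)
            printf("Negative LOD bias clamp: %u\n", s.u32CurrentValue);

        /* AA gamma correction: per the control panel, off by default for OpenGL */
        s.version = NVDRS_SETTING_VER;
        if (NvAPI_DRS_GetSetting(session, base, AA_MODE_GAMMACORRECTION_ID, &s) == NVAPI_OK)
            printf("AA gamma correction:     %u\n", s.u32CurrentValue);

        NvAPI_DRS_DestroySession(session);
        NvAPI_Unload();
        return 0;
    }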
     
  7. TirolokoRD

    TirolokoRD Guest

    Messages:
    1,934
    Likes Received:
    0
    GPU:
    EVGA 980TI ACX 2
    Can't wait to see an MSI Lightning version of the GTX 580; that monster could almost launch a rocket to the moon.
     
  8. alanm

    alanm Ancient Guru

    Messages:
    12,274
    Likes Received:
    4,478
    GPU:
    RTX 4080
  9. dragomirc

    dragomirc Guest

    Messages:
    280
    Likes Received:
    0
    GPU:
    eVGA GTX580 965/1930/4400
    GTX 580 overclocked, Vantage Xtreme preset - no PhysX.
    A 24% GPU overclock raises the GPU score by 20%.

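    For scale, and assuming the GTX 580's stock clocks of 772 MHz core / 4008 MHz effective memory: 965 / 772 ≈ 1.25, so roughly the 24% core overclock quoted, while the 20% score gain works out to a scaling efficiency of about 0.20 / 0.24 ≈ 0.83. Sub-linear scaling like that is expected when the memory (here ~10% overclocked, 4400 vs 4008) isn't raised by the same margin as the core.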
     
  10. Raidflex

    Raidflex Member Guru

    Messages:
    115
    Likes Received:
    0
    GPU:
    GTX 780 Ti SLI @1300/7500
    I really am considering replacing my GTX 280's with one GTX 580 to get rid of any SLI issues. It would also take about 250 W of heat out of my watercooling loop.

    The price is the biggest issue, though; even after selling my 280's I would only have a little more than half the cost covered for about the same amount of GPU power. It seems (at least on eBay) that GTX 285's tend to sell for much more than 280's.
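    As a sanity check on that heat figure, and assuming NVIDIA's official board powers (236 W per GTX 280, 244 W for the GTX 580): 2 x 236 - 244 = 228 W less for the loop to absorb at full load, so "about 250 W" is in the right ballpark.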
     

  11. MrMicrochip

    MrMicrochip Member Guru

    Messages:
    120
    Likes Received:
    0
    GPU:
    Gainward GTX 1070 Phoenix
    I think the main point is that with default settings NVIDIA cards have better image quality. If you change the ATI driver settings to the same quality level that NVIDIA drivers already use by default, you lose performance.

    I'm not saying it is so, because I don't know, but that's the point they are making.
     
  12. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    That's the point he's 'trying' to make, but the fact is that both sides default to performance instead of high image quality. They're both more or less the same quality-wise until you tweak them.
     
  13. dchalf10

    dchalf10 Banned

    Messages:
    4,032
    Likes Received:
    0
    GPU:
    GTX670 1293/6800
     
  14. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,222
    Likes Received:
    1,541
    GPU:
    NVIDIA RTX 4080 FE
    Sorry for the late reply, as I originally missed this, but I've not seen any issues with the GTX 580 other than some minor texture flickering in Star Wars: The Force Unleashed II. I've had no problems at all, 100% stability in games and certainly no BSODs (what is one of those? ;) ).
     
  15. qstoffe

    qstoffe Active Member

    Messages:
    55
    Likes Received:
    0
    GPU:
    nVidia GTX 580 @ stock
    Hehe, cool. I just got my own 580 today. I haven't had much time to check it out yet, though I'm sure there are some bugs since this is a brand new card. The one thing I've noticed compared to my old 4870X2 is how much smoother 60 fps is. I was unhappy with the microstuttering right when I got the 4870X2, but then I sort of got used to it. Now I know how much smoother a single card is again. Sure, it's a subtle difference and I'm sure some don't notice it, but I certainly do. Looking back, I should never have gone multi-GPU. But at least now I know: multi-GPU sucks for a perfection seeker such as myself ;)

    I will try to fit the Accelero Xtreme Plus (with the 480 mounting) tomorrow. Otherwise the 580 cooler is fairly quiet for a high-end graphics card, but not silent enough for me. Thank god for aftermarket coolers :)
     

  16. MikeMK

    MikeMK Ancient Guru

    Messages:
    11,106
    Likes Received:
    108
    GPU:
    Nvidia RTX 4090
    I've managed to get my hands on a GTX 580 now, and I've only run it for an hour, but I'm quite impressed. It's the first single-GPU setup that has let me run Crysis at 1920x1200, 4xAA, 16xAF, Enthusiast settings at decent FPS. Very nice! Now I've just got to resist getting a second ;)
     
  17. Undying

    Undying Ancient Guru

    Messages:
    25,480
    Likes Received:
    12,886
    GPU:
    XFX RX6800XT 16GB
    @MikeMK, didn't you have 5870's? You're the second guru here who has changed to a GTX 580 from 5870's, lol. I just don't see the point in what you guys did.
     
  18. dragomirc

    dragomirc Guest

    Messages:
    280
    Likes Received:
    0
    GPU:
    eVGA GTX580 965/1930/4400
    Get a GTX 580 and see for yourself.
     
  19. Uncle Dude

    Uncle Dude Ancient Guru

    Messages:
    2,312
    Likes Received:
    5
    GPU:
    RTX 4090
    Yup, this card has spoiled me forever. There's nothing smoother and prettier than max settings, 4xAA (or better) and a rock solid 60fps in most games. This hobby is going to consume a fair portion of my earnings for the foreseeable future. :3eyes:

    Dragomirc: 4.8 GHz? Holy schnikeys....:)

    edit - and 960 MHz? I take it you're on water? :3eyes:
     
    Last edited: Nov 24, 2010
  20. dragomirc

    dragomirc Guest

    Messages:
    280
    Likes Received:
    0
    GPU:
    eVGA GTX580 965/1930/4400
    Well, that 4.8 GHz is an old overclock from January. For everyday use it's 4.4 GHz (Heatkiller waterblock).

    And yes, the card is watercooled (Danger Den waterblock); it never goes over 34°C.
     
    Last edited: Nov 24, 2010
