Next Nvidia GPU architecture?

Discussion in 'Videocards - NVIDIA GeForce' started by IhatebeingAcop, Aug 27, 2015.

  1. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Well, if I'm completely honest with you, it's not an opinion; I'm basing it on the fact that they contradict your mighty saint Nvidia, and I know you're a big NV fan, so..


    If you didn't notice yet, Nvidia is playing dirty tricks again: they made half-assed async compute, it failed badly, and now they've complained to get the feature removed from a game. Yes, really classy. They should just deal with it and provide proper support, or not promote it as a feature of their HW at all.

    So what if I was TheHunter before? Does that make you feel all warm and fuzzy inside, now that you specifically had to mention it in an attempt to troll me? gl with that. :bang:
     
    Last edited: Sep 1, 2015
  2. Catroqui

    Catroqui Guest

    Messages:
    743
    Likes Received:
    9
    GPU:
    RTX 3080/1080 Ti
    BTW, what is the expected Pascal release date?

    And about those days when any new graphics architecture release would double the performance... those were expensive days.

    I remember how disappointed I was back in 2006 with the release of the G80 cards; I had bought a very expensive 7800 GTX by Christmas 2005 and had no money left to buy an 8800 GTX...
     
    Last edited: Sep 1, 2015
  3. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
    What are you on about?
    So you believe a pre-alpha benchmark from a company that was willing to sabotage results in a prior benchmark should be held up as the be-all and end-all?
    This is turning into our last run-in, where you were trying to push your theory that double-precision performance is a meaningful benchmark for gaming.

    No one knows what's going on at this stage. Oxide's statements so far have only contradicted each other.
    Also, from the link you posted earlier:
    Something's up, but nothing is concluded at this stage.

    Just pointing out that you clearly haven't changed, which is a shame, because when you're not trolling you're actually half decent to converse with on the forums.
     
  4. thatguy91

    thatguy91 Guest

    What will be interesting is whether Nvidia has already put a fix for this in place with the next-gen GPU, and if not, whether they rush changes now that it has become 'an issue'. Also interesting is whether they simply encourage other developers, particularly GameWorks partners (and any game with an Nvidia logo on it, despite what they claim), to disable the features that don't suit Nvidia. It would be pretty rotten if that's the case, and you can almost guarantee it will happen. Seeing as games will be ported to/from the Xbox One, which is also getting an updated DirectX to make development between the two easier, a clear sign will be whether these features are enabled on the console for best performance (on AMD hardware) and disabled on PC at Nvidia's request.
     

  5. Batboy

    Batboy Member

    Messages:
    26
    Likes Received:
    0
    GPU:
    EVGA Geforce Titan X Hyb.
    So, AMD is to blame for us receiving poor Intel CPU upgrades. Anyway, we don't really need all that in general. I was playing The Witcher 3 on all ultra settings and my CPU usage still never went above 40% the whole time.

    Yes, video editors (like me sometimes :D) will always need faster and faster CPUs. I was speaking about gamers.
     
  6. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,552
    Likes Received:
    609
    GPU:
    6800 XT
    In GTA 5 or Far Cry 4 you would see a real improvement with, let's say, a 6700K at the same clocks as your 4770K. The Witcher 3 is pretty much equal between the 6700, 3770 and 4770 at 4.4 GHz.
     
