Gigabyte confirms GeForce RTX 4070 Ti graphics cards

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 29, 2022.

  1. Encoder performance, until AMD can at least meet the minimum needed to drive a Quest 2. A 6900XT needs to drop to around 1440p encode resolution at around 200 Mbps to attempt 120Hz, whereas an RTX 3060 does 120Hz at near-4K with 300 Mbps no problem, and apparently even Turing can.

    That matters as soon as you increase the refresh rate from the default 72Hz; on any AMD GPU that currently results in a worse overall image, but it isn't a concern on NVIDIA. The game can render at 4K with all the graphical effects and DLSS it wants, but the end result displayed on the headset relies on the encoder. Paying $1K for a GPU that (potentially) can't handle a soon-to-be-outdated VR headset is silly.

    I have no doubt Lovelace can handle it. It remains to be seen if RDNA3 can.
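    A quick bits-per-pixel calculation makes the encoder gap concrete. The bitrates and refresh rate are the ones quoted above; the exact resolutions are assumptions (taking "1440p" as 2560x1440 and "near-4K" as 3840x2160):

```python
# Average encoded bits spent per pixel per frame for the two setups
# described above. Resolutions are approximations (assumption):
# "1440p" = 2560x1440, "near-4K" = 3840x2160.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Bitrate divided by total pixels pushed per second."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

amd_case = bits_per_pixel(200, 2560, 1440, 120)     # 6900XT example
nvidia_case = bits_per_pixel(300, 3840, 2160, 120)  # RTX 3060 example

print(f"6900XT case:   {amd_case:.3f} bits/pixel")
print(f"RTX 3060 case: {nvidia_case:.3f} bits/pixel")
```

    The point: the 3060 case sustains roughly 2.25x the pixel rate while spending fewer bits per pixel, i.e. the limiting factor is encoder throughput, not the bitrate setting.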
     
  2. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    don't know about any of that.
     
  3. winning.exe

    winning.exe Member

    Messages:
    22
    Likes Received:
    17
    GPU:
    Nvidia
    That claim really has no legs to stand on. CDNA is a rehash of GCN; it's lacking many of the things that big data et al. look for these days. While CDNA supports things like INT8 and FP16, Nvidia has much better mixed-precision and matrix performance (e.g. FP8, INT8 and below, plus sparse formats to accelerate AI workloads). Then you add niceties like CUDA-X with a massive breadth of software support, and the story becomes similar to the desktop: if you have the money, you buy Nvidia. ROCm exists on the AMD side (for better or for worse), but it really isn't a serious competitor. That's why Nvidia ships 9 in 10 accelerators in this space.
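    The mixed-precision point can be sketched with a toy INT8 quantized matmul in NumPy. This is purely an illustration of the tradeoff behind low-precision accelerator formats, not any vendor's actual pipeline: you accept a small rounding error in exchange for formats that dedicated matrix hardware can chew through much faster.

```python
import numpy as np

# Toy symmetric per-tensor INT8 quantization of a matrix multiply.
# Illustrative sketch only -- real INT8/FP8 pipelines are far more
# sophisticated (per-channel scales, calibration, etc.).

def quantize_int8(x):
    """Map float values onto int8 with a single scale factor."""
    scale = np.abs(x).max() / 127.0
    return np.round(x / scale).astype(np.int8), scale

rng = np.random.default_rng(0)
a = rng.standard_normal((64, 64)).astype(np.float32)
b = rng.standard_normal((64, 64)).astype(np.float32)

qa, sa = quantize_int8(a)
qb, sb = quantize_int8(b)

# Accumulate in int32 (as INT8 matrix units typically do), then rescale.
approx = (qa.astype(np.int32) @ qb.astype(np.int32)) * (sa * sb)
exact = a @ b

rel_err = np.abs(approx - exact).max() / np.abs(exact).max()
print(f"max relative error: {rel_err:.4f}")
```

    The error stays small for well-scaled data, which is why compute stacks lean on these formats; the catch is that exploiting them well is as much a software problem (cuDNN, CUDA-X, etc.) as a hardware one.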

    There is no paradox: when you buy Nvidia, you get a much better product and software stack which you'll be glad to lock yourself into, because the alternative is AMD's ongoing compute catastrophe :D

    At this point, even legacy OpenCL software runs much better on Nvidia hardware.

    This is coming from a company that invents entire instruction sets to lock competitors out (see: SSE1-4, AVX, AVX2, AVX-512). Intel is interested in making software that runs well on their products, and if it also runs well on someone else's, that's only by coincidence :D

    I have personally seen this time and time again in the open source space, with things like Intel ISPC, Embree, Clear Linux, OneAPI and so on. OneAPI is a hilarious example of this, because it masquerades as an open standard, but it's very clear that their intention is that you'll be using OneAPI with Intel CPUs and GPUs (i.e. Sapphire Rapids and Ponte Vecchio). If it works on AMD hardware, that will be purely by coincidence, and Intel is certainly not investing any time or effort on this front :p
     
    Krizby likes this.
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
    I'm not talking about CDNA. I'm talking about CDNA2. Pretty different, and very competitive. I suggest you read up on that.
    "Glad to lock yourself into"? Right, I'm glad you don't run any of the businesses I've worked for.
    The point I was trying to make is Nvidia has an enticing platform since they're well equipped whether you're doing CUDA, OpenCL, or Vulkan compute. That makes getting vendor locked not such a big deal, because you're not getting a bad experience no matter what. Under certain workloads though, they don't have the best hardware or the best prices. Both AMD and Intel are behind but they won't be for long.
    Source?
    Not really an apples to apples comparison, seeing as it doesn't seem to take AMD very long to implement the same instructions. Y'know why? Because Intel did all the hard work already. Worst-case scenario, AMD uses an electron microscope to scan Intel's chips and reverse-engineer them.
    When it comes to GPUs, the architectures are so wildly different and much of their functionality is hidden behind closed software/firmware.
    I wouldn't be surprised if Intel decides to do this with their GPUs but currently they have no leverage to do so. For now, Intel will have to play nice and share.
    Nvidia and AMD have done the same. Doesn't change the fact that if you look at the track record of which companies have made open source contributions that don't have caveats, Intel does fairly well in this regard.
    I'm not entirely sure what the issue is with Clear Linux, but it's really just a demo, and a successful one, because most other Linux distros seem to have adopted the optimizations Intel made.
     
    Venix and tunejunky like this.

  5. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,460
    Likes Received:
    3,084
    GPU:
    7900xtx/7900xt
    sales, sales, sales
    that's what drives everything, and so few of those games are hits. not only are there few hits, there are legendarily buggy titles that have taken up to a year to be playable (cp2077 much).
    so when i say there aren't enough titles available, i mean there aren't enough titles that sell anywhere near as well as a WoW expansion
     
  6. winning.exe

    winning.exe Member

    Messages:
    22
    Likes Received:
    17
    GPU:
    Nvidia
    CDNA2 is derived from CDNA, which is derived from GCN. I suggest you read up on that o_O

    Source?

    No point in continuing to discuss in bad faith, so I’ll leave things off here.
     
  7. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    here we go again with games that run on a potato in a thread about next gen graphics.
    what is wrong with you today ?
    every time I say that there are people who enjoy max details with ray tracing, you respond with the same "but a lot of games don't have ray tracing and sell well" over and over again. guess what, they run on rtx gpus too.
    you're all about value now, but at the same time you kept convincing people that when the 6750 XT came out for $100 more, it made more sense than the original 6700 XT.
    your #1 purpose here is to create scenarios where amd cards do relatively well compared to nvidia, completely disregarding all others. well, maybe you should take a look at what people actually want to buy, as reflected in discrete gpu market share numbers, cause nearly 90% of buyers kinda disagree.
    you think gta6, witcher 4 and every major upcoming game won't use rt? cause if I were to take a guess, I'd say they will, and to a large extent.
     
    Last edited: Nov 30, 2022
    Krizby likes this.
  8. geogan

    geogan Maha Guru

    Messages:
    1,271
    Likes Received:
    472
    GPU:
    4080 Gaming OC
    I thought of that... but already had EK Elite 360 for CPU in there taking up all the room... there was no more room in that old case for any other decent sized rad. New case has a lot more room for that sort of thing ;)
     
    Venix likes this.
  9. Krizby

    Krizby Ancient Guru

    Messages:
    3,104
    Likes Received:
    1,788
    GPU:
    Asus RTX 4090 TUF
    People who say that RT/DLSS games are niche are actually the niche LOL. A very small percentage of gamers buy an expensive next-gen GPU just to exclusively play old games at 1000 FPS.
     
    Last edited: Dec 1, 2022
    cucaulay malkin likes this.
  10. beedoo

    beedoo Member Guru

    Messages:
    149
    Likes Received:
    126
    GPU:
    6900XT Liquid Devil
    The problem with this statement is that unless you can get both vendors to provide identical RT performance, one must always lag behind. Unless you meant something else.

    Let me qualify this. If AMD is 10% faster than the 4080 at RT (not that it will be), and also, as currently suggested, cheaper, does that mean the 4080 is lagging behind? Is AMD lagging behind if they're 10% slower, yet cheaper?
     
    Last edited: Dec 1, 2022

  11. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    10% is very comparable imo; it's 20-30% that I'd have a problem with, cause that's usually a tier up/down in gpu segments.
    to make it clearer: to me the 3070 is like the 6700xt in rasterization despite the latter being 10% slower. it's rt where it lags behind.
     
    Last edited: Dec 2, 2022
    beedoo likes this.
  12. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    that is what I'm trying to explain to tunejunky: in the $800+ segment you need different criteria, that gpu should run max settings as best as possible. steam popularity has nothing to do with gpu choice for a high-end system. it's a niche purchase, so if the standard is low-medium at 1080p, then an $800+ gpu should absolutely run niche settings. if I wanted something less I'd buy a 3060/6700 and get 150fps at 1080p in every popular game.

     
    Last edited: Dec 1, 2022
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
    Your statement is twisting the truth a bit:
    If you are buying an expensive next-gen GPU then you're right, people are more focused on RT and other cutting-edge features than playing old games super fast. For the vast majority of people though, playing a slideshow is not worth RT, especially in games where RT is implemented poorly. So, RT as a whole is a rather niche feature, for now. I firmly believe RT is the future.

    As for DLSS, it depends on the implementation and the resolution. In some games it's like magic; in others I would rather just play at a lower resolution or lower some other graphical setting. When DLSS is used to compensate for the performance loss of RT, it has to work like magic, or else you're making your system work much harder without a net visual improvement. There are only a small handful of games where enabling both RT and DLSS makes sense, and of those, only some cases make sense on a GPU worth less than four figures. DLSS can only do so much to upscale 720p.
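    To put numbers on the upscaling ratios being discussed: DLSS's presets render internally at a fixed fraction of the output resolution per axis (the scale factors below are the commonly documented approximate values, quoted here as assumptions):

```python
# Internal render resolution for the common DLSS presets.
# Per-axis scale factors are the commonly documented approximate values.
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def render_resolution(out_w, out_h, scale):
    """Internal resolution the GPU actually renders before upscaling."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    w, h = render_resolution(2560, 1440, scale)
    print(f"1440p {name}: renders at {w}x{h}")
```

    At 1440p output, Performance mode renders internally at 1280x720, which is the "upscale 720p" case mentioned above.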
     
  14. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,460
    Likes Received:
    3,084
    GPU:
    7900xtx/7900xt

    give me a fracking break with your Ad Hominem fallacies.
    insulting me doesn't change a thing, especially when you're short-sighted, misguided, and argumentative.

    i do not create any scenarios whatsoever, i speak to many things that apparently you do not understand (like manufacturing and business).

    my enthusiasm is for technology and manufacturing (which you apparently never noticed in your fan-boy state).
     
  15. Krizby

    Krizby Ancient Guru

    Messages:
    3,104
    Likes Received:
    1,788
    GPU:
    Asus RTX 4090 TUF
    I'm pretty sure there are plenty more RT games coming out in the near future, kinda short sighted to only focus on current games, especially when you are buying high-end GPU.

    DLSS/XeSS/FSR have been improving steadily, and the majority of reviewers/gamers agree that they bring a net improvement to the gaming experience (higher FPS at little to no visual loss), particularly at 1440p and above. You are actually in the minority if you think lowering resolution/settings is somehow better than using DLSS/XeSS/FSR 2.0

    Sure, if you buy a cheap GPU to play older games, nothing wrong with that, but when we are discussing high-end GPUs, RT/DLSS are much more relevant.
     
    cucaulay malkin likes this.

  16. mackintosh

    mackintosh Maha Guru

    Messages:
    1,189
    Likes Received:
    1,094
    GPU:
    .
    You (should) buy for your current needs, not your idea of what the future might bring. You don't need to have been around this hobby for 30 years to realise that future proofing is mostly an illusion; that lesson could and should have been learned in just two or three.
     
    tunejunky and carnivore like this.
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
    I have no doubt you're right there will be more RT games; I wouldn't be surprised if most new AAA games will be RT-capable. This doesn't really change my points though - RT is just too demanding to be appealing to the average user, and DLSS (generally speaking) isn't worth using to compensate for the performance deficit if you're upscaling more than 33% (probably should've said that the first time around). So, it doesn't matter if RT becomes more available, it still isn't going to be widely used for a while.
    You're kinda twisting my words here. Most people, including myself, would agree that supersampling (especially DLSS) where 1440p is the starting resolution usually yields great results. But here's the thing: most people don't have a GPU powerful enough to upscale 1440p with RT and DLSS/XeSS/FSR enabled with playable framerates. In fact, I would argue most people don't have a GPU powerful enough to smoothly/reliably enable RT at 1080p, no upscaling at all. So when you're faced with the reality of what most people will experience, suddenly, DLSS ain't so great if you expect to use RT along with it.
    I agree - for high-end, RT/DLSS is much more relevant. What sparked my response is you were acting like RT in general wasn't niche.
     
  18. Krizby

    Krizby Ancient Guru

    Messages:
    3,104
    Likes Received:
    1,788
    GPU:
    Asus RTX 4090 TUF
    "Near future" is not exactly future proofing, there are plenty of RT games coming out in the next few months
     
    cucaulay malkin likes this.
  19. H83

    H83 Ancient Guru

    Messages:
    5,512
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    More than 90% of gamers don't have a GPU powerful enough to run those games at playable settings, so the question remains: does it really matter?...
     
  20. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Last edited: Dec 1, 2022
    DannyD likes this.
