Quad-slot NVIDIA GeForce RTX 4090Ti/TITAN 800W graphics card / cooler caught on camera

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 30, 2023.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,325
    Likes Received:
    18,407
    GPU:
    AMD | NVIDIA
    mbk1969 and fantaskarsef like this.
  2. Moonbogg

    Moonbogg Master Guru

    Messages:
    306
    Likes Received:
    212
    GPU:
    GTX1080Ti@2.1GHz
    Sounds cool. I predict $5000.00
     
    Odarnok and fantaskarsef like this.
  3. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,636
    Likes Received:
    9,512
    GPU:
    4090@H2O
    Comes with 2 12VHPWR connectors, for extra crit chance. Probably uses 600W still, but lowers clocks to make it happen.

    Maybe it's just noob me, but nobody wants that card with air cooling like those (already old?) leaked screenshots show us; to me this looks like an AIO / custom loop is mandatory.
    If this even ever sees the light of store shelves.
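
    For fun, a quick Python back-of-the-envelope on why a rumored 800W card would come with two plugs (the 600W per-connector and 75W slot figures are the spec ratings; the 800W draw is just the rumor):

        # Power-delivery headroom for a hypothetical 800 W card.
        CONNECTOR_RATED_W = 600   # 12VHPWR rating per connector
        PCIE_SLOT_W = 75          # what the slot itself can supply
        connectors = 2

        board_limit = connectors * CONNECTOR_RATED_W + PCIE_SLOT_W
        rumored_draw = 800
        print(f"limit {board_limit} W, rumored draw {rumored_draw} W "
              f"({rumored_draw / board_limit:.0%} of limit)")
        # One connector + slot = 675 W, short of 800 W; hence the second plug.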
     
  4. mikeysg

    mikeysg Ancient Guru

    Messages:
    3,286
    Likes Received:
    740
    GPU:
    MERC310 RX 7900 XTX
    Honestly, enough with the mocking; everyone knows nobody's gonna buy it at that price. Even Mr Leather Jacket knows that!

    But priced at a reasonable $4,999, Mr Leather Jacket believes it'd be a steal!!! ;):D
     

  5. Picolete

    Picolete Master Guru

    Messages:
    494
    Likes Received:
    261
    GPU:
    Sapphire Pulse 6800

    Only if you buy more, to save more
     
  6. Embra

    Embra Ancient Guru

    Messages:
    1,601
    Likes Received:
    956
    GPU:
    Red Devil 6950 XT
    How about some somewhat affordable mid-range GPUs??
     
    Aniboom, schmidtbag and fantaskarsef like this.
  7. Texter

    Texter Guest

    Messages:
    3,275
    Likes Received:
    332
    GPU:
    Club3d GF6800GT 256MB AGP
    Big is back, because bigger is better...
     
  8. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,004
    Likes Received:
    1,526
    GPU:
    GB 3080Ti Gaming OC
    My only concern is I can't 2-way SLI them, let alone 4-way. :(
     
  9. H83

    H83 Ancient Guru

    Messages:
    5,443
    Likes Received:
    2,982
    GPU:
    XFX Black 6950XT
    An 800W video card?! That's just wrong...
     
  10. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,636
    Likes Received:
    9,512
    GPU:
    4090@H2O
    You just have to get either PCIe riser cables and custom blacksmithed supports for both cards, or go for a 4-way custom watercooling loop with waterblocks and risers, for the price of more than one organ in central Asia. o_O
     
    Solfaur likes this.

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    And yet, even without miners or scalpers, I'm sure these would all sell out immediately. Why make anything affordable to the masses when, somehow, everyone seems to have enough money to buy these behemoths?

    The market trends sure are getting suspicious to me.
     
  12. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    A friend mentioned it to me and at first I was like "eh", but now I think he might be right - I think Nvidia is going to push GeForce Now as the future for the midrange/low end. No one is going to want to buy a $700 midrange card - might as well just subscribe for $10 a month at that point. I think it also makes sense given the general cost of design as well.

    The cost of building a modern chip is increasing exponentially, while per-transistor cost has essentially stalled.

    So what exactly do you do? You can delay it by trying to design for cost instead, like AMD is doing... but at best that buys you a generation or two. I think at some point you just say frack it: cheap GPUs are not a thing - build them massive, run them in the cloud, and scale gaming instances across them to massively increase the efficiency of the hardware. You can still sell them to users who want a dedicated card. Everyone else pays $10-20 a month to run one of those instances, depending on what experience they want.

    It means far fewer GPUs being manufactured, fewer SKUs to build, less headache with inventory and everything else. It gives Nvidia a constant revenue stream, which companies love. It arguably adds more benefit to value-add features like RTX, since everyone gets them as soon as the server is upgraded... which gives developers greater incentive to use Nvidia-specific technologies. I think from Nvidia's viewpoint it makes perfect sense, and from a customer perspective it might actually be the best scenario as well.
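
    Rough subscription math on that (every number below is an assumption for illustration, not anything Nvidia has announced):

        # Cloud GPU vs. dedicated card, back of the envelope.
        CARD_PRICE = 1600        # hypothetical 4090-class card price, USD
        SUB_PER_MONTH = 15       # midpoint of the $10-20/month guess above
        USERS_PER_GPU = 4        # assumed gamers time-sharing one cloud card

        print(f"one subscriber covers a card in {CARD_PRICE / SUB_PER_MONTH:.0f} months")
        monthly = SUB_PER_MONTH * USERS_PER_GPU
        print(f"a shared card earns ~${monthly}/month and pays for itself "
              f"in {CARD_PRICE / monthly:.0f} months")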

    Don't get me wrong - I love dedicated hardware... but it's possible that at some point it's just not doable anymore.
     
    Last edited: Jan 30, 2023
    Aniboom, schmidtbag, Spets and 2 others like this.
  13. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,702
    Likes Received:
    1,843
    GPU:
    EVGA 1070Ti Black
    Probably true, but if I have to do my gaming through the cloud, I will just stick to old stuff. Honestly, I don't buy that it costs them half of what they're charging per card to make these cards. It just really dawned on them that they can charge an arm and a leg, and even at half, if not a quarter, of that price they'd still keep their profit.

    Literally, if that happens there isn't much need for a PC anymore, seeing as TVs can just stream games, and so can our phones and tablets.

    The idea of streaming my games, or renting hardware to stream them, is not appealing to me. I like to physically own my hardware, and my console games on disc, for a reason.

    Never mind the fact that streaming would mean latency issues, and the fact that in the States you can't touch internet for less than $50 a month in most cases; the normal price is double that or more, whereas Europeans seem to pay a whole lot less. Without good internet you'd be screwed trying to stream. They could sell their cards at decent prices and still make a profit; they just don't want to or care to.

    If prices keep going the way they're going, sooner or later these cards won't sell at all. In 3 generations the xx70 series doubled to $799, and in some cases $1000; if that keeps happening, in 3 more generations we will see a $1499 xx70 card, which would just be insane, and would be the death of PC gaming, and consoles for that matter, for a lot of people.
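
    Putting that price creep into numbers (a sketch using the post's own $799 and $1499 figures; the growth rate is just what those two points imply):

        # Per-generation growth implied by $799 -> $1499 over 3 generations.
        start, later, gens = 799, 1499, 3
        per_gen = (later / start) ** (1 / gens)   # ~1.23x per generation
        price = start
        for g in range(gens + 1):
            print(f"gen +{g}: ${price:.0f}")
            price *= per_gen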

    It is what it is, and what happens will happen. Cloud/streaming gaming is not something I want, nor something I would pay monthly for.
     
    Last edited: Jan 30, 2023
    schmidtbag likes this.
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    Likewise, and we have to consider that we are both in places where we have access to high-speed, relatively low-latency internet. There are a lot of people for whom this simply can't work.

    There's the whole dichotomy of demand going up while hardware capabilities hit an upper limit, which is really more a matter of how we need to reflect upon development practices. That's probably one of the reasons Nvidia started investing in DLSS when they did: they knew they couldn't depend on developers optimizing their code, so they needed a head start on AI-assisted performance improvements in the likely event they couldn't keep up with modern demands. Unless there's some major breakthrough, that day will come, and I figure it may come within a decade. If the rumors about the 4090 Ti are true, I would say it already has. It doesn't matter how fast a GPU is if it becomes impractical to operate, let alone buy.
    I think what matters more is finding ways to further optimize games without the need for AI (but AI is fine too). There are plenty of examples of games that run poorly compared to others of a similar (or even better) detail level. Sometimes these games improve in updates, whether from the drivers or the game itself. The thing is, even the games that run well most likely have a lot of room for improvement.

    For too long, we've been spoiled by getting more hardware for the same reasonable price. Developers didn't have to make anything efficient or optimized, because they could just throw more cores or RAM at the problem. It was reasonable for them to assume consumers would just go ahead and upgrade, because it was affordable to do so. Well, now we're left with nearly 20 years of poorly optimized code, and hardware has reached a point where it's either too expensive or you're getting the same level of performance for the same price. That's one of the reasons I like using Linux and think iOS is well-designed: you get an overall better user experience for a fraction of the system resources.

    Something needs to change, or else PC gaming is going to become a hobby for the rich. Nvidia and AMD can only do so much to squeeze more performance out of a reasonable wattage or price while still showing some sign of progress, which is hard when it's getting so difficult to shrink transistors.
     
    tsunami231 likes this.
  15. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,702
    Likes Received:
    1,843
    GPU:
    EVGA 1070Ti Black
    I will probably get castrated for this, but Windows is horribly inefficient compared to iOS. More often than not, iOS is snappy and fast on mobile hardware of a lesser design, while Windows runs like trash on mobile tech; try to run Windows on iPhone-class hardware with that little RAM and it will choke. And with anything less than 4GB or even 8GB of RAM on Win 10, God help you; I could never run Windows with that kind of RAM. Meanwhile, iOS is snappy and fast with 2~4GB and mobile silicon drawing what, 25 watts, if that?

    I want my 1070 Ti replaced with something new, but for the price of something in the same series, or even the xx60 series, I could literally build a new PC: 12700K + Z690 + 32GB 5400 RAM + PSU + HSF + the case. Well, I could if I get the CPU/MB from Microcenter, which is what I've been doing for quite some time.

    I could buy a GPU instead and start playing newer games on PC, just to find out how much of a bottleneck my 6700K and that 16GB of 2400MHz RAM are going to be. Game-wise, for what I play, the 6700K isn't really a bottleneck, but I'm sure that 2400MHz RAM is.

    Shrinking the die just makes things more expensive. They need to go back to the drawing board and design a chip that's super efficient and needs a whole lot fewer CUDA cores, transistors and whatnot to produce the same performance. But I'm sure it's not that easy, and they're not going to want to spend the money when they can just jack up the power and the core counts and call it a day.

    I'm sorry, but if 800W TDP becomes the norm, I want nothing to do with it. 200W TDP already generates way more heat than I like; in winter it's fine, I don't use my heater, but in summer I would need to jack up the AC just to keep the heat in check. I don't even want to imagine how bad it would be with a 300W TDP, let alone 600W or more, seeing as I already think it's too much. And the cost of electricity is skyrocketing in a lot of places, like a lot of other things.
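
    For scale, the heat math in Python (the watt-to-BTU conversion is standard; the TDPs are the thread's figures):

        # A card's electrical draw ends up as room heat; 1 W = 3.412 BTU/h.
        W_TO_BTU_H = 3.412
        for tdp in (200, 300, 600, 800):
            print(f"{tdp} W -> {tdp * W_TO_BTU_H:,.0f} BTU/h of heat")
        # 800 W is ~2,730 BTU/h, over half the capacity of a small
        # 5,000 BTU/h window AC, spent just cancelling the card out.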

    You know all those movies we've seen over the decades where an alien race says "this chip is 1000x more powerful than every computer on the planet and uses next to no energy"? Yeah, we need that, and it will probably never happen. Even if it did, only 1% of the 1% would be able to afford it.

    I love PC gaming, but these prices are going to kill it off for a lot of people. At this point I would not be surprised if next-gen consoles are $899~1000, which would kill console gaming for me and a lot of people I know, and which would pave the path to cloud/streaming gaming.

    Most people I know now won't even touch a PC for gaming, or any console newer than the PS2, because everything after has pushed toward digital and a permanent need for internet.

    I remember when all I did was pop a cartridge into the console and in 5 seconds I was playing the game, and games were released in working order, no broken states. Those were the days.
     
    schmidtbag likes this.

  16. Vmhasegawa

    Vmhasegawa Member

    Messages:
    47
    Likes Received:
    4
    GPU:
    Sapphire RX 480 8gb
    Looking at my 5600 XT and reading what the "new" midrange pricing could look like:

    Please don't die. Just don't.
     
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    You're preaching to the right person, haha. I ditched Windows on all of my PCs almost a decade ago, including my gaming PC. There are many reasons for that, but its inefficiency was one of the main ones.
    Android was pretty inefficient for a long while too, but it does seem to have improved, a little. Companies keep throwing more background processes into the mix, undermining the overall improvements to the OS.
    I share the same mentality. That's why I'm still stuck with my R9 290. Thankfully, it still receives updates on Linux, so it's been more playable.
    Honestly, I predict it'd be less of a bottleneck than you expect. Keep your background processes to a minimum, maybe try to overclock a little higher if possible, and you might be OK for a little while longer. Games aren't that CPU-dependent anymore, and 16GB is plenty when you keep your system lean.
    The CUDA cores are used for gaming, so they ought to stay. This is a lot more difficult than you think it is though - there's no one-size-fits-all solution to how the transistors are arranged. There are solutions that are more optimal than others, but even then, that depends on modern workloads. Nvidia could start from scratch but honestly, I don't think they'd see that big of a difference. With the way modern APIs are designed and the preferences toward easier (rather than optimized) game development, Nvidia doesn't have much room to improve. What really needs to happen is a shift in software development. It's much more practical to make software take advantage of the ISA than it is to go the other way around, but the problem is convincing developers to agree to that.
    Totally agree, though my upper limit is 300W.
    I've mentioned before that even a 400W GPU risks tripping circuit breakers when you have a powerful CPU, a good sound system, and your AC connected to the same breaker. No idea how they think 800W is supposed to be feasible.
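
    The breaker math is easy to sketch (assumed round-number loads on a US 120V / 15A circuit):

        # Can one 15 A household circuit carry a gaming rig plus an AC?
        VOLTS, BREAKER_AMPS = 120, 15
        loads_w = {
            "GPU (rumored)": 800,
            "CPU + rest of PC": 350,
            "monitor + speakers": 150,
            "window AC": 900,
        }
        total = sum(loads_w.values())
        print(f"{total} W -> {total / VOLTS:.1f} A on a {BREAKER_AMPS} A breaker")
        # 2200 W -> ~18.3 A, well past the 15 A rating even before startup surges.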
    That's often why I resort to indie games: being fun is usually their #1 priority. AAA games are very pretty, but more often than not they're annoying to play. That's why I'm fine with getting something like a 6700 XT, because it ought to be powerful enough to play what I want at 4K @ 60FPS+, but... I'm not willing to spend $350 on a mid-tier GPU that's approaching 2 years old.
     
  18. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,702
    Likes Received:
    1,843
    GPU:
    EVGA 1070Ti Black
    @schmidtbag

    Honestly, I'm toying with the notion of getting a 12700K + Z690 ASUS TUF Plus + 32GB 5600MHz CL36 Corsair RAM + Deepcool AK620 HSF from Microcenter before going back to NC, as that would cost $580 before taxes. I really want the (non-K) models, but they only knock $70 off the normal MB price with the K chip; going with a 12700 or 13700 and the same MB would cost $200 more. Honestly, this seems like the better deal, though I really should add a new PSU to it, because the current 750W unit I have is now 5+ years old and has served in both my 920 and this 6700K build.

    A new build won't help with wanting to run everything at 1440p in newer games, but at least the CPU and RAM will be of little concern going forward. And maybe, if the 4070 and 4060 Ti are priced right, I will get one some time down the road, but more likely the 5xxx series will be out before I bother, if I do this at all, or that 1070 Ti will die before I replace it. GPU prices are ridiculous, and so are the TDPs, IMO.
     
    Last edited: Jan 31, 2023
  19. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,939
    Likes Received:
    1,239
    GPU:
    .
    [attachment: rtx4090ti.JPG]


    The meme takes up only 4 PCI slots too :V
     
    fantaskarsef likes this.
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    I still don't think that a GPU in the cloud has a better ROI than just selling it, and despite AMD getting the ridicule they deserve for their Windows software and driver quality, I really think they have the right idea with the chiplets.
     
