Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 8, 2020.
Interesting, to me the AMD looks way better in that image though, way more atmospheric and realistic.
Another disappointing halo product. The 3090 was also disappointing in how little extra it offered. Even if it's nothing that I'll ever have, it's fun to see the really over the top products.
I know it's subjective, but it looks totally washed out to me.
Ignore the 'atmosphere', those are YouTube screenshots, not taken directly from the game.
Outside of the missing reflections, the rest of the game should look exactly the same.
I think the reflections on Nvidia make it look horrible. And yeah, I can see the YouTube compression doing its magic. Would be nice to have direct screenshots though. I meant that the trashcans or whatever do look better on the AMD platform, while in that weird YouTube clip the AMD shot has more detail everywhere. But as you said, it's YouTube, so.
Most reviewers I read said that there is no point buying a 6900 XT at its price point, since it does not improve much on the 6800 XT.
It's AMD's response to the 3090.
A "value-oriented", unreasonably expensive Titan-esque card. Where are all those people screaming at Nvidia for pricing Titans at $1K? Well, at least the Titans were hands-down the fastest of their time. This isn't a better overall product than a 3080 IMO, and the price tag is just as ridiculous as what Nvidia has been charging for theirs. Not even a different memory config: same as the $580 RX 6800.
I'd buy it for $650 though, as long as it has no artificially limited clocks.
Yes, but if they had the current tech to make something more powerful than this card, they probably would've. Why would they not attempt to one-up the 3090 with this product? I know it's $500 less, theoretically, but it's also not anywhere close to being a $350 upgrade over the 6800 XT or the RTX 3080. Just like the RTX 3090 was nowhere near a $750 upgrade over the 3080.
A Titan at least did something more, and the 2080 Ti did something more than a 2080. The 3090 is of course differentiated in naming, and it's a silly over-the-top halo product, but it's nowhere near as good an investment as a Titan was, because it lacks the pro features.
That's because ray tracing in that game is shoved in your face for the sake of showing it off. The broken shader simply looks closer to what the brain expects, because they made the RT on those objects unrealistic.
@AlmondMan When in history did you have a big difference between a full GPU and its cut-down variant while both had the same TBP?
There is a reason why the 60CU cut-down 6800 non-XT comes with a 250W power limit and weaker cooling.
Yeah, I was watching a video and it looks a tad too much. I guess it's hard to strike a balance.
It's a fight between selling new technology and keeping a natural feel.
I have no intention of buying CP2077 till I see their implementation in Witcher 3. Putting shiny objects/materials into a game just for the sake of having shiny surfaces all around will break immersion.
If I do not like the Witcher 3 changes, I am surely not going to like the implementation in CP2077.
Ampere's 3090 is more of a workstation card than any of the Titans except the Titan V.
And yet it still lacks in a lot of workstation features.
Guys, I would recommend updating the way you display benchmark results and their logic: OC results shown afterwards, out of the general context, are not relevant anymore. It would be much more informative to include OC results in the GPU list, since OC has become mainstream compared to 10 years ago, but it seems like you still approach it the same way. Secondly, a final GPU shootout with FPS performance percentage comparisons at the major resolutions, on a separate page, would also be much more convenient. Thanks
I say it is the other way around. Back in the day, people had to know what they were doing, and they could gain huge performance.
OC today is poor. Way more accessible, but the gains are small, for both CPUs and GPUs.
I don't see RT like that. Shiny objects/materials exist in the real world, and with RT we can have them in our game world too, adding to immersion.
I want to upgrade to something since I've had a 2080 Ti for over 2 years, but when I compare my results with benchmarks, it matches a 3080 in performance... confused... And the 2080 Tis used in benchmarks now seem to be 220-240W versions? Mine is just an Asus at 338W that doesn't seem to reach 60 degrees. Guess mine has a higher power profile than the ones used in benchmarks. Either way, since I game at 4K, a few % of actual upgrade maybe seems odd? https://www.3dmark.com/spy/15616392 vs AMD Radeon RX 6900 XT review - DX12: 3DMark Time Spy (2016) (guru3d.com)
Walk around your home. Look at the percentage of shiny surfaces, and note whether they are really reflective, like a mirror or glass, or whether they diffuse light more than they reflect it.
Most surfaces that diffuse light have no need of RT at all; proper shader code will do. But that's not the point, right? The point is realism vs. forcing reflective surfaces.
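To illustrate the point about diffuse surfaces not needing RT: a rough, matte surface can be shaded with a classic Lambertian term from the surface normal and light direction alone (standard rasteriser shader math), while a mirror-like surface needs the scene colour along the reflected ray, which is exactly what ray tracing provides. This is a minimal sketch with made-up vectors, not code from any game:

```python
# Minimal sketch contrasting diffuse vs. mirror shading.
# All vectors/values here are hypothetical examples.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def lambert_diffuse(normal, light_dir, albedo):
    # Classic diffuse term: only needs N.L, no traced rays at all.
    n_dot_l = max(0.0, dot(normalize(normal), normalize(light_dir)))
    return tuple(c * n_dot_l for c in albedo)

def mirror_direction(normal, view_dir):
    # A mirror needs the scene colour along the reflected ray:
    #   r = d - 2 * (d . n) * n
    # Finding what that ray hits is the part that requires ray tracing.
    n = normalize(normal)
    d = normalize(view_dir)
    k = 2.0 * dot(d, n)
    return tuple(di - k * ni for di, ni in zip(d, n))

# Matte surface lit head-on: full albedo, computed with simple math.
print(lambert_diffuse((0, 0, 1), (0, 0, 1), (0.8, 0.5, 0.2)))
# Ray hitting a floor (normal +Z) straight down bounces straight up.
print(mirror_direction((0, 0, 1), (0, 0, -1)))
```

So forcing the mirror path onto a trashcan that should really be shaded with the diffuse path is exactly the immersion problem described above.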
I believe you have at least some hair. Imagine your photo being processed by AI and turning you into a Yeti. Too much is too much. Effects should be used in the places where they belong.
When it clearly puts reflective surfaces where you expect none, and turns materials that should not be reflective into reflective ones, it breaks immersion.
Not only is it poor,
but it seems like the OC headroom is already included in the price.