Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 12, 2019.
Well, the new trailer looked dope... hopefully it stays that way.
Is that a comment from 2005? By today's standards those effects look bad and/or run poorly. There's a reason you just replied with a flat "No!" instead of showing me how awesome they are and how well they run on your GPU.
Ten years ago, nVidia's tech demos looked much better than the implementations we see in games today, even though the available hardware has improved beyond comparison.
Great-looking smoke, liquids, physics. Yeah, you're right, a fanboy doesn't see them in games. Or is it the other way around, and only those "special" people can see them?
Over the years I've had dozens of discussions right here with people who own nVidia cards, and their crowning argument was that they disable those effects in problematic games.
My 2080 Ti SLI is ready. Hopefully the game is SLI-ready and they don't downgrade it to accommodate people with budget computers or old video cards who expect miracles.
I doubt SLI will be supported.
Not unless it has a DX11 code path. And seeing as game devs are unwilling to code for mGPU in DX12 and Vulkan, I wouldn't hold my breath. He should still be more than fine with a single 2080 Ti though.
Ah yes, critical thinking can offend some people. If it bothers you that much, perhaps this isn't the forum for you. However, I will point out that if RTX ray tracing is as good as you, or anyone else who is so pro-RTX, claims, then perhaps we should see a live demonstration of its use in this game instead of still pictures.
You know, because it's supposed to actually be in the game.
And while they're at it, they should be transparent about the PC hardware required, the frame rates achieved, and the best resolutions for it.
The game is 10 months away, plenty of time for an RTX gameplay video. Not sure why you require it now, other than setting up some arbitrary goalpost that hasn't been crossed yet so you can write a poem about it, or whatever it is that you're doing.
Demo right here.
This was posted on YouTube on Aug 27, 2018. They are a bit further along than you realize.
You aren't making much sense, let alone any point.
10 months away? I have a feeling Nvidia will time their Ampere release with it.
Well, that's nice gameplay without DXR. I hope they can keep the same image quality at release with DXR disabled (some bugs aside).
But it should always be remembered that this is not a visual showcase. (As if DXR could be a visual showcase any time soon...) This is a game with a story where players can do a lot of stuff.
I get the feeling that is exactly what will happen. Game is going to be epic, that's for sure. I have no doubt about that.
I somewhat doubt they'll release a new product generation in April for a single game. And if they do, I doubt it will be that much faster at anything than the current Turing lineup. Although I'd like that, tbh, I doubt it.
I'm just wrapping up Witcher 3, so I'm really looking forward to playing this game in 2022 when affordable hardware catches up to it.
I would not be surprised if this game is bundled with Ampere.
I don't think it will be faster, because the cost per transistor is still rather limiting on 7nm, but I think they'll probably be shipping a new architecture early next year. They've stated that they were waiting for Samsung 7nm EUV, which just hit mass production now, and typically it's 8–12 months after a process hits mass production that Nvidia starts shipping GPUs on it. It also wouldn't surprise me if their next architecture used chiplets to some degree; they've been experimenting with scalable inference chips with up to 36 chiplets. It would be interesting if they broke out the tensor cores onto their own chiplet in an effort to improve yields. Idk if they are at that level yet though.
As for the RT performance: a large part of the BVH optimization is algorithmic. It's possible that the next architecture has the same number of RT cores but the performance per core is increased due to new techniques. That's not to mention that game developers are getting better in general at optimizing for it. A large part of the overhead seems to be syncing the raster/BVH environment and starting the BVH build as early as possible in the frame. I think as developers get their hands on DXR, especially now that next-generation consoles are supporting it, the performance will just naturally improve.
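The "start the BVH build early" point can be sketched with a toy frame-time model (all numbers, function names, and per-pass costs below are hypothetical assumptions purely for illustration, not measurements of any real game or GPU):

```python
# Toy model: if the BVH (acceleration structure) build only starts after
# rasterization finishes, its cost adds directly to the frame time. If it is
# kicked off at the start of the frame (e.g. on an async compute queue), it
# overlaps with raster work, and the ray-tracing pass only has to wait for
# whichever of the two finishes last.

def frame_time_sequential(raster_ms: float, bvh_ms: float, rt_ms: float) -> float:
    # BVH build waits for rasterization to finish before starting
    return raster_ms + bvh_ms + rt_ms

def frame_time_overlapped(raster_ms: float, bvh_ms: float, rt_ms: float) -> float:
    # BVH build runs concurrently with raster; the RT pass needs both done
    return max(raster_ms, bvh_ms) + rt_ms

# Hypothetical per-frame costs in milliseconds
raster, bvh, rt = 8.0, 5.0, 4.0
print(frame_time_sequential(raster, bvh, rt))  # 17.0
print(frame_time_overlapped(raster, bvh, rt))  # 12.0
```

With these made-up numbers the overlapped schedule saves the entire 5 ms build whenever the raster work is the longer of the two, which is why scheduling the build early matters even with identical RT hardware.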
I think they won't increase the shader core count but will focus more on RT core count. The GPU is already fast enough at normal rasterization.
Well then they'd introduce a new process (7nm EUV) and a new architecture at once? Like I said, I somewhat doubt it (I thought they weren't able to milk the current generation enough), but if you are right, I'm not that unhappy. And I know there are a lot of people out there wanting higher 4K performance without RTX, and RTX will only really take off once it performs better. It would just feel so un-Nvidia to do that... although I've been wrong about that in the past as well.
edit: On second thought, it does make sense. They milked the first Turing iteration with higher prices, and throwing out a second one rather "rapidly" gets RTX a grip on the ray-tracing market and once again gets people to buy new cards when they were already holding back on Turing.
I reckon they will include DX11... and in that case just using the Witcher 3 SLI profile ought to work.
If not, then I am going to be very sad... 4K with a single 1080... not going to be a good experience.
I just want to know if it will support NVLink, because I'm ready to buy another GPU if I have to.
It will, but you'll need to enable DLSS, so really it will be running at 720p.