Discussion in 'Frontpage news' started by XenthorX, May 13, 2020.
Didn't Half-Life, and thus Counter-Strike, use flats? And if I remember right, you shouldn't exceed 200 flats in CS 1.6 for it to run well.
Honestly I wasn't that impressed by this. It's a canned, scripted demo, and UE4's TAA problems are just as evident here. (We have all this GPU power and no one can be bothered to fix this?) Screen-space shadows are still used for things like foliage and don't look good. The lighting looks nice and is at least more performant than ray tracing without making image quality take a nosedive. The geometry is more impressive than the textures, though they go hand in hand. And I hope developers like Square Enix look at this and fix the last-gen-looking garbage textures and texture streaming in Final Fantasy VII Remake in a PS5 update/PC version.
Running only at 1440p also seems disappointing considering all that hardware. I'd much rather have current games running at 4K60 with just better textures and better AA. So many of them look so good as it is.
Flagship titles at Nintendo consist of Super Mario, The Legend of Zelda, and Pokémon.
Super Mario (627.51 million)
Pokémon (301.5 million)
Wii Series (203.15 million)
The Legend of Zelda (105.62 million)
Yoshi titles outside of the Mario universe are not flagships, and are usually developed by a third-party studio, in this case Good-Feel.
Handheld Zeldas have been made by a mix of third-party and in-house studios: Capcom did the Oracle and Minish Cap games, and the Four Swords content in the Game Boy Advance LTTP release was also done by Capcom.
Spirit Tracks, Phantom Hourglass, Link's Awakening, and A Link Between Worlds were Nintendo EAD, and the remake of Link's Awakening in Unreal was done by Grezzo (and honestly, you can tell it wasn't done by Nintendo...).
For their console-selling titles, they use their own engines because they know the hardware best.
And the fallen rocks disappeared too. So unreal.
I've said this before many times - my PS4 Pr0 does not make any noise whatsoever, no matter the res, the game, or the activity, because I have the very last variant of the Pr0, the 7216b, AKA the "Red Dead Redemption bundle".
Additionally, I installed an MX500 SATA3 SSD in it, which I'm pretty certain added to the lack of noise, due to lower temps, lower power draw, and, let's not forget, the absence of the vibrating HDD that's no longer in it.
If you can find one, and you got the $, I strongly recommend picking one up.
I'm past the last gen. I have an SSD in my X and that is a true difference all around, performance-wise. Having true SATA 6Gb/s is nice, that's for sure.
Speaking of Red Dead: I never knew the game had more than two loading screens explaining shtuff until I tried to show my father-in-law how fast it loads, only to find out it was on my external.
I'm talking seconds when most things took nearly minutes.
And still can not believe Pokemon is truly a thing to this day.
Go chase them different colors down.... Never did get into it... Can't see how anyone can...
Sure - but if you see one (and you want the best) then, it's the one to get (maybe for someone else, or, as a media box, w/e).
As for RDR2 - I averaged <10 seconds for fast travel, save game loads. I did have a weird moment once when it took 25 seconds to load from the SSD, but it was just once.
GoW and HZD are mega-fast loaders... The Last Guardian is not; that still takes >15 seconds for a save game... no idea why, as it's not that graphically taxing from what I can see.
Not only will they, the PS5 architect specifically mentions that they will have more features than the upcoming cards from AMD. He literally says that people might say the PS5 will have the same hardware, when it will actually have more.
This whole video is a great dive into hardware specs, and how games are made in general. If you have the time and patience, give it a look, it's very informative.
I'm not sure I would even like longer games at this moment, as most open-world games already clock in at hundreds of hours. There are issues with AI, audio, and gameplay that might be helped by what the new tech enables, though.
This is exactly what this is doing. If they incorporate the scaler into the engine itself, so that you as a developer can automatically "pack" a specific size of asset for each platform (i.e., specify smaller assets for weaker platforms, which then also get this scaling), then that basically solves cross-platform titles. It probably will work this way, as this engine and these technologies are launching on every platform known to man. As an example, imagine you're making a big AAA title. Your initial models are in the tens of millions of polygons, and you use them as the "source". These get packed for PC and PS5/Xbox, with a minimum streaming spec you specify (like 30MB/sec or something); then, when you need to prepare assets for the Switch, you could ask the engine to "auto-shrink" your assets to your specifications so they fit what the Switch I/O subsystem can do.
This system moves the bottleneck for model fidelity from graphics to I/O, and I also saw it mentioned somewhere that the triangle-to-pixel mapping doesn't need to be 1:1, so for weaker platforms you could literally just set two parameters (the I/O transfer rate and the triangle-to-pixel ratio) and have performance "auto"-scaled.
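To make the two-knob idea concrete, here is a minimal back-of-envelope sketch. Nothing in it is real UE5 API; the function names, platform entries, and numbers are all made up purely to illustrate how an I/O budget and a triangle-per-pixel target could drive per-platform asset packing.

```python
# Hypothetical per-platform asset "auto-shrink" budget calculator.
# All names and figures are illustrative assumptions, not engine API.

def target_triangle_budget(screen_pixels, tris_per_pixel):
    """Triangles worth keeping on screen at the chosen density target."""
    return int(screen_pixels * tris_per_pixel)

def max_asset_mb(io_mb_per_sec, worst_case_load_seconds):
    """Largest asset chunk the I/O subsystem can stream in time."""
    return io_mb_per_sec * worst_case_load_seconds

platforms = {
    # name: (pixels on screen, triangles per pixel, I/O budget in MB/s)
    "console": (3840 * 2160, 1.0, 5500),
    "pc_min":  (2560 * 1440, 1.0, 30),
    "switch":  (1280 * 720,  0.25, 100),
}

for name, (pixels, density, io) in platforms.items():
    print(name,
          target_triangle_budget(pixels, density),
          max_asset_mb(io, 0.5))
```

The point of the sketch is just that both knobs reduce to simple arithmetic the packer could apply per platform, which is why auto-scaling the same source assets sounds plausible.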
I really hope every other engine just straight up copies all this.
There are no traditional textures; that's the point.
The engine supports 1:1 triangle-to-pixel mapping; all of that was pure material and geometry.
I was thinking more along the lines of the world data not being present on the HDD at all, only things like character models, items, etc. It's less about downgrading source art to console quality, and more about saving space by streaming the world data itself live.
Example: a game called "World" is 10TB in size. The client side only needs to have the baseline 500GB of assets; the rest is streamed as you play.
Of course, logically, if internet speeds were fast enough, the client-side install could also be very small, and client assets could then be streamed as needed too.
My understanding is that currently, it doesn't work like that and all assets exist on client PC.
The latencies would be crazy, and even real gigabit Ethernet is only about 100MB/sec in practice. It would be nice though, and it would also enable much smaller install sizes, since there would be a single size the assets need to be, per platform.
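A quick back-of-envelope check on those bandwidth numbers (the SSD figure is the commonly quoted 5.5GB/s raw read speed for the PS5; the comparison is illustrative, not a measurement):

```python
# Rough comparison of gigabit Ethernet vs a fast local NVMe SSD.
GIGABIT_MB_S = 1000 / 8   # 125 MB/s theoretical; ~100 MB/s real-world
PS5_SSD_MB_S = 5500       # commonly quoted raw read speed

ratio = PS5_SSD_MB_S / GIGABIT_MB_S
print(f"local SSD is ~{ratio:.0f}x faster than gigabit Ethernet")

# Time to stream a 500 MB asset chunk from each source:
print(f"over network: {500 / GIGABIT_MB_S:.1f} s")
print(f"from SSD:     {500 / PS5_SSD_MB_S:.2f} s")
```

Even before latency enters the picture, the raw throughput gap is around 40x, which is why streaming world data over the internet instead of a local drive doesn't add up today.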
Turn 4K option on in YouTube, even if you have only a 1080p screen.
I was watching on a 4K TV with the video set to 4K... still wasn't as blown away as everyone else seemed to be.
Well, the results don't impress me too much. Also, did you read the interview where one of the devs said the PS5 demo had an unlocked frame rate, could only manage 40fps, and looked choppy, so they had to cap it at 30fps? The same dev said a laptop RTX 2070 ran the demo fine at 60fps with the same settings. I could be wrong here and have misread some of the articles, but that's what's floating around.
Thanks for sharing this bit of information. I wondered how that would run on a PC, but didn't have the time or interest to google it over the last few days.
But I have to say, I expected something like that, to be honest. We still don't know any details, but the first UE5 games will come around in a few years, I guess.
If the numbers are true btw, 30 capped (40ish) vs 60fps, I'd like a certain CEO's explanation of that, since consoles will supposedly help PCs progress faster.
It could be a number of things: the PS5 demo could be early code that needs more optimisation; maybe they haven't tapped the full power of the system; maybe they weren't using the SSD to its full capability.
The same thing happens every time though. They show demos at these conferences or live streams, everyone gets excited, and hype builds. Tabloid headlines read "PC is dead", etc. Then it turns out there are once again limitations in place and devs have to get creative with their approach to the hardware and the game they're building. Whether it be capping at 30fps or using dynamic resolution, there is always a trick being used to fool people.
Believing anything these CEOs tell you is so dumb. Go back in time and look at the PS3 demos with Killzone 2 or Eight Days, or the PS4 demos with the old face demo and the claim they could do more, or the physics engine they showed for the PS4. We got nothing that looked remotely like these demos or engines. Full-blown lies; I mean, just go back and watch the Fallout 76 reveal and you'll find out "it just works" and has "16x the detail". Even people like Jensen aren't immune from this: he showed up at the PS3 conference and spouted some bullshit about the GPU even though he knew it was based on old tech and that the ATi chip in the 360 used a more modern approach with semi-unified shaders. A few months later he showed off the 8800GTX.
It's all hype to drive sales. Do people actually think AMD has combined a 3700X and an RDNA2 GPU onto the same die, and that both are fully fledged chips with the same cache, controllers, I/O, engines, etc., and that they can keep power and heat down enough to fit inside a small box that will get suffocated by casuals who place it inside an entertainment centre? Something has to give somewhere down the line; it's never as good as they say it is. Sure, it's going to wipe the floor with the last generation of consoles, that's a given. But if it beats the last gen by 5x, you can bet they'll hype it as 10x better.
And that CEO you were talking about... he's done and said this kind of stuff for years now. The whole company has, whenever someone drops a bag of cash on their desk...
Yeah @CPC_RedDawn, I generally concur; I'm just kind of fed up with people feeding into the delusional beliefs of those hyping something that is, by the numbers, clearly behind. I merely meant that now that there are numbers (30-40 vs 60 or even more), I would like that idiot to answer to them.
While that code maybe isn't optimized yet, they'd have to double performance just to get where the PC with a 2070, which is far from the most performant GPU out there, already is. I just don't fancy trash talk that would easily be proven wrong by a simple benchmark. Giving "alternative facts" first and then not following up with a retraction sure seems like the sign of a deranged mind.
"Epic also appreciated the benefits of the PS5's SSD, noting that such graphics cannot be achieved using a hard drive or even a SATA SSD as they do not have enough speed to load the high resolution textures as they are needed"
The size of that SSD is 825 GB... you're not going to get too many high-res textures on it... rofl...
This sounds wrong even on the raw-spec side of things. If a "source" actually says that, they shouldn't have any credibility for anything else they say.
This is literally a bunch of tabloid bullshit, with no regard to what is actually being said.
This is what he said about the PS3:
If that last part is not true, I don't know what is.
He was correct about the PS4. Huge pool of commonly accessible memory, x86, an HDD by default (remember, the previous generations didn't have one), all at a time when current PC GPUs were maybe giving you 3GB of VRAM to play with, with slow main memory and slow PCIe to talk to the CPU. It took going to 6GB+ GPUs with DDR4 to get streaming with no hitching at the same settings as console games. There were a ton of games around at the time that had streaming issues in their PC ports just because of that. I know, I was here.
He's also correct about the PS5. The I/O subsystem should be copied verbatim to the PC, instead of just trying to brute force it like most people seem happy to do.