At this time, more powerful gfx cards are useless. The gaming industry is still making games that look like DX9 titles because they're lagging behind on technology. I mean, I don't know about you, but I haven't seen any substantial change in graphics since UT3 and Crysis. And that was what, 2007? Those daym lazy programmers.
Crysis still performs worse than games that look far better than it does. So yes, it was poorly optimized.
No, the point is he really, really doesn't like Crysis and will bash it at most, if not all, opportunities he gets.
Hopefully they are, because then I wouldn't be so worried about the performance, since they can easily pull ahead.
Do you have a grasp of how market competition works? The HD 6970 performs similarly to Nvidia's second-in-line card and so sells at a similar price... does that make any sense? Or you could buy ATI's top card (the 6990), which outperforms Nvidia's top GTX 590 and costs less. For the price of GTX 570 SLI I would have picked up HD 6970 CF and got much higher performance, 2 GB of VRAM, and lower power usage.
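To put the perf-per-dollar argument in plain numbers, here's a quick Python sketch; the prices and relative-performance figures below are placeholders I made up for illustration, not quotes from any review:

# Rough perf-per-dollar comparison; every number here is an
# illustrative placeholder, not a measured or quoted figure.
setups = {
    "GTX 570 SLI": {"price": 700.0, "relative_perf": 1.00},
    "HD 6970 CF":  {"price": 680.0, "relative_perf": 1.08},
}

for name, s in setups.items():
    value = s["relative_perf"] / s["price"] * 1000  # perf units per $1000
    print(f"{name}: {value:.2f} perf units per $1000")

Same idea works for perf per watt if you swap price for board power.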
Thought we all learned our lesson about hyping something we have little information on. @BlackZero: lower power usage, sure, but performance-wise I don't see a significant boost over it: http://www.anandtech.com/bench/Product/298?vs=307 Looks more like give and take, with more give to the 570s.
Yeah, if you consider a single game (Civilization 5) and maybe HAWX 2, though the fact that they favour Nvidia cards is well known and people hardly play them. Still, even though I don't trust AnandTech's charts anymore due to the large number of oddities brought about by carelessness with drivers, I'll point out that in that same chart the 6970 CF performs 10% faster in Crysis at high resolution, 10% faster in Metro 2033, 10% faster in Battlefield: Bad Company 2, and the same in STALKER: CoP... need I go on?

As for why I pick higher resolutions: higher resolutions give a better indication of future stress levels as polygons become more complex and texture sizes increase. Battlefield 3, anyone? A large number of other reviews agree:
http://www.hardocp.com/article/2011/01/11/amd_69706950_cfx_nvidia_580570_sli_review/4
http://techgage.com/article/amd_radeon_hd_6950_hd_6970_crossfirex/6
http://hexus.net/tech/reviews/graph...-2gb-graphics-card-two-way-crossfirex/?page=6

And according to Guru3D, even 6950 CFX performs at the same level as GTX 570 SLI, let alone HD 6970 CFX:
http://www.guru3d.com/article/radeon-hd-6950-crossfirex-review/8
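If anyone wants to sanity-check "give and take" claims across a pile of games themselves, the usual fair way to aggregate is a geometric mean of the per-game FPS ratios, since it stops one outlier title from skewing the average. A minimal Python sketch; the FPS numbers are made up for illustration:

import math

# Made-up FPS pairs (card_a, card_b) per game, purely illustrative.
fps = {
    "Crysis":      (42.0, 38.0),
    "Metro 2033":  (33.0, 30.0),
    "BC2":         (88.0, 80.0),
    "STALKER CoP": (61.0, 61.0),
}

# Geometric mean of ratios, so no single game dominates the average.
ratios = [a / b for a, b in fps.values()]
geomean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
print(f"Card A overall: {(geomean - 1) * 100:+.1f}% vs card B")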
A lot of games look a lot better than Crysis. Just look at Metro 2033 or the Frostbite engine... Crysis runs on an old engine and it just doesn't really cut it anymore. It was a hit back then because of its graphics; the gameplay was rather dull and the AI was horrible. The only thing I praise Crysis for is its sound. I don't think I've ever had a game that used my 5.1 surround more than Crysis... until Bad Company 2.
Well, we need as much power as we can get because Super HD (or equivalent) is just around the corner. Next-gen monitors are going to have amazing pixel pitch... to the point that you may not see a visual difference from a feature like AA, as long as your rig can run at native resolution. I'm fully expecting at least 4000-6000 pixels on the horizontal. It will make current triple-screen solutions look old and dated.
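Quick back-of-the-envelope in Python on what that pixel pitch would look like; the panel size and resolution here are my own assumptions, not anything announced:

import math

# Hypothetical panel: 5120 horizontal pixels, 16:9, 30-inch diagonal.
h_px, aspect, diag_in = 5120, 16 / 9, 30.0

v_px = h_px / aspect
diag_px = math.hypot(h_px, v_px)
ppi = diag_px / diag_in
print(f"{h_px:.0f}x{v_px:.0f} on {diag_in}\" -> {ppi:.0f} PPI")
# ~196 PPI: at normal desktop viewing distance, individual pixels
# (and hence jaggies) become hard to resolve, which is the AA point.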
It's weird. I remember quite clearly, when I was a kid, never seeing jaggies. Up until around 2006/2007; every game I've played since then has jaggies. I can go back and play games from before that time, and 90% of them have no jaggies even without AA, or at least you can barely notice them. Why are they so common these days? In fact, in BF3 I can't enable AA, as my GPU is already at 99%, but I can see objects almost entirely covered in jaggies at a distance. I can boot up HL1 with no AA or anything and see no real jaggies... I can boot up Quake or something, no jaggies... etc. It's odd.
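For what it's worth, jaggies are just undersampling of high-contrast edges, and supersampling hides them by averaging sub-pixel coverage. A toy Python sketch of the idea (pure illustration, not how any engine actually renders):

# Render a diagonal edge with SS x SS sub-samples per pixel, then
# average them down: fractional coverage turns stair-steps into blends.
N, SS = 8, 4  # output size in pixels, supersample factor

def covered(x, y):
    """1.0 if the sample point lies under the diagonal edge, else 0.0."""
    return 1.0 if y > x else 0.0

for py in range(N):
    row = ""
    for px in range(N):
        total = sum(
            covered((px * SS + i + 0.5) / SS, (py * SS + j + 0.5) / SS)
            for i in range(SS) for j in range(SS)
        )
        shade = total / (SS * SS)          # 0.0..1.0 edge coverage
        row += " .:-=+*#"[int(shade * 7)]  # coverage -> ASCII shade
    print(row)
# Set SS = 1 and you get the hard stair-step back.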
I'd be extremely disappointed if the next gen of cards weren't far more than 45% more powerful. 8800 GTX to GTX 280 to GTX 480: each time there was about a 100% increase in power (after a few driver revisions). I expect the same next time. Even a refresh on the same process yielded a 20% increase in performance, so if a massive die shrink and new tech can't do more than 45%... I'll be skipping the next series of cards.
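The compounding is the point; a quick sanity check using the rough figures from the post (these are the post's round numbers, not benchmarks):

# ~2x per full generation vs a hypothetical 45% jump.
gens = [("8800 GTX -> GTX 280", 2.00),
        ("GTX 280 -> GTX 480", 2.00)]

total = 1.0
for name, gain in gens:
    total *= gain
    print(f"{name}: {gain:.2f}x (cumulative {total:.1f}x over 8800 GTX)")

print(f"Proposed next gen at +45%: {1.45:.2f}x vs the historical ~2.00x")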
They were so rectangular that it was a feature. Memory does great things for the visuals of a game, since you remember how you felt about it and not what it really looked like. So you may be under the effect of nostalgia, which is imo a good thing.
I think you are somewhat correct on this. If my memory serves, it could be due to how games were coded for the GPUs back then; I can't remember all the stages it went through. You know, from coding pixel by pixel to pixel shaders. Correct me if I am wrong. I am interested in knowing if there is any truth to it or if it is in fact just nostalgia.
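Roughly what that shift means in code terms: old software renderers had the CPU compute every pixel in a loop, while pixel shaders run the same kind of per-pixel function, just massively in parallel on the GPU. A toy Python sketch, not anyone's actual engine code:

# Software-rendering style: the CPU walks the framebuffer pixel by pixel.
W, H = 16, 8
framebuffer = [[0] * W for _ in range(H)]

def shade(x, y):
    """Per-pixel 'shader': brightness from a simple horizontal gradient."""
    return int(255 * x / (W - 1))

for y in range(H):
    for x in range(W):
        framebuffer[y][x] = shade(x, y)

# Show the result as coarse ASCII shades.
for row in framebuffer:
    print("".join(" .:-=+*#"[v * 8 // 256] for v in row))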