Radeon R9 290X is looking increasingly good on paper. Most of its rumored specifications and SEP pricing were reported late last week, but the ones that eluded us were clock speeds. A source that goes by the name Grant Kim, with access to a Radeon R9 290X sample, disclosed its clock speeds and ran a few tests for us. To begin with, the GPU core is clocked at 800 MHz. There is no dynamic-overclocking feature, but the chip can lower its clocks, taking load and temperatures into account. The memory is clocked at 1125 MHz (4.50 GHz GDDR5-effective). At that speed, the chip churns out 288 GB/s of memory bandwidth over its 512-bit wide memory interface. Those clock speeds were reported to us by the GPU-Z client, so we give it the benefit of the doubt, even though it goes against AMD's ">300 GB/s memory bandwidth" bullet-point in its presentation. The tests run on the card include frame-rate and frame-latency measurements for Aliens vs. Predator, Battlefield 3, Crysis 3, GRID 2, Tomb Raider (2013), RAGE, and TESV: Skyrim, in no-antialiasing, FXAA, and MSAA modes, at a resolution of 5760 x 1080 pixels. An NVIDIA GeForce GTX TITAN was pitted against it, running the latest WHQL driver. We must remind you that at that resolution, AMD and NVIDIA GPUs tend to behave a little differently due to the way they handle multi-display, so it may be an apples-to-coconuts comparison. In Tomb Raider (2013), the R9 290X romps ahead of the GTX TITAN, with higher average, maximum, and minimum frame rates in most tests. http://www.guru3d.com/news_story/are_these_real_amd_r9_290x_benchmarks.html
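For what it's worth, the quoted 288 GB/s figure does line up with the leaked memory clock; here is a quick back-of-the-envelope sketch (assuming GDDR5's quad-pumped data rate and the rumored 512-bit bus):

```python
# Sanity check of the leaked memory bandwidth figure.
# Assumes GDDR5's 4x effective data rate and the numbers quoted above.

memory_clock_mhz = 1125                      # reported base memory clock
effective_rate_mhz = memory_clock_mhz * 4    # GDDR5 is quad-pumped -> 4500 MHz
bus_width_bits = 512                         # rumored memory interface width

bandwidth_gb_s = effective_rate_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(f"Effective data rate: {effective_rate_mhz / 1000:.2f} GHz")   # 4.50 GHz
print(f"Memory bandwidth:    {bandwidth_gb_s:.0f} GB/s")             # 288 GB/s
```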
I'm colour blind, and I can barely see which card is supposed to be which on those line graph screenshots, as both lines look almost the same colour to me! The Tomb Raider bar graph is perfectly fine though.
I'd guess that those are just engineering-sample (ES) clock speeds; the final thing will likely be above 800 MHz unless they are having some serious production problems... which both nVidia and AMD are always having. They'll make sure to mention that right before telling us that trollolol video cards are now $700-$1000 and not $300-$500 anymore, oh wait... I honestly can't believe I'm looking forward to the PS4, thanks nVidia and AMD.
Why the high res? The percentage of people using a res like that must be a fraction of a percent.
Haha, probably because a similarly tiny percentage of people will own said video cards! Those cards are more suited to uber-high screen res anyway; that's what most people would buy them for, I would think.
How the hell can a TITAN have a min frame rate of 35.0 but an avg of 29.1?? That does not make any sense; the avg should be higher than the min! I call fake on these!
It's incredible. When reviewers use very high resolutions to more accurately measure GPU performance, we hear how hardly anybody uses such high resolutions and so it's unrealistic. Then, when reviewers choose relatively low gaming resolutions to more accurately measure how CPUs perform, it's the exact same argument. You can't win. I suppose if pharmaceutical drug producers started testing on general, random populations rather than using a control mechanism, that would also be more realistic.
I'm not saying whether they're fake or not, just that the graph labels are poorly chosen. The graph is actually made up of two components, FXAA and MSAA 4x. FXAA is represented by the top 3 bars, and MSAA 4x by the bottom 3. For each set, the minimum is less than the average.
You should be able to follow which line is which using this:
Screen 1: green lowest
Screen 2: red starts and finishes lowest
Screen 3: red lowest
Screen 4: green lowest
Screen 5: red starts and finishes lowest
Screen 6: red is highest in the middle
Screen 7: red starts and finishes highest
Screen 8: red finishes highest
Actually, they do test in different population types, and they also use simulation tools, like the ones I develop at Simcyp, to simulate different populations, including paediatric ones, which are generally not ethical to test on (unless it's terminal patients, and then it's sometimes done).
If it's beating Titan at those clocks, and they are real*, there will be no need to flick the switch.
*big bucket of salt in hand
I imagine the fact that you develop software used for modelling supports, rather than disproves, the non-random argument.
Well, those have to be pretty damn ****ed up graphs then. No matter how I read it, the FXAA average is smaller than the FXAA minimum. Tomb Raider doesn't even support MSAA, so I wonder where those numbers come from. I also wonder how it is possible to have the same min and average framerate in that "MSAA" test; that's impossible when the max framerate is higher than the min.
Dunno if they are fake or not, but those clock speeds (both core and memory) don't even fit the presentation given by AMD:
- You need at least ~885 MHz to get 5 TFLOPS with 2816 SPs (the slide says >5 TFLOPS).
- You don't match the triangle rate given by AMD at those clocks.
- You don't match the bandwidth of >300 GB/s given by AMD (288 GB/s is the same as a 7970 GHz Edition).
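For the curious, here's a rough check of that first point (a sketch, assuming the usual 2 FLOPS per shader per clock for fused multiply-add and the rumored 2816 stream processors):

```python
# How fast would 2816 SPs have to be clocked to hit AMD's ">5 TFLOPS" claim?
# Assumes the usual 2 FLOPS per shader per clock (FMA).

shaders = 2816          # rumored stream processor count
target_tflops = 5.0     # figure from AMD's slide

min_clock_mhz = target_tflops * 1e12 / (shaders * 2) / 1e6
print(f"Clock needed for {target_tflops} TFLOPS: {min_clock_mhz:.0f} MHz")
# ~888 MHz -- close to the ~885 MHz figure above, and well over the leaked 800 MHz
```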