Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 14, 2017.
I like it more than the 64.
I think with GDDR6 mass availability hitting in Q1 2018, Volta's consumer-oriented parts were always going to show up then. Nvidia's average launch cycle has historically been roughly 10 months, but it's been as long as 14 in some cases.
I'm kind of curious to see how AMD is going to refresh Polaris. Will they wait for Navi in 2019? Or will they cut down Vega next year on 7nm with GDDR6?
I'm also curious what they'll do next year with 7nm. It might liven up the Vega arch; I think they had that planned for next year.
Not like it really matters. Pascal GP106 and Polaris10 already give the masses all the performance that is needed now at 1920x1080 (the most popular resolution for gaming). GP104 and Vega give the 2560x1440 crowd a perfect option and GP102 is there for those that want high refresh or 4k. Volta and Navi will not really be "needed" for another 6 months to a year.
As long as developers don't get lazier.
Well, I can get £90 for my 970 right now, so if I can nab a 1070 for, say, £300, it would cost me £220 in real cash. But I'm in no hurry to do this; the cheaper I can get it the better, though no doubt I'll have to catch some site having a deal.
I did the same with this card: I got £80 for my 760, so this card cost me the same £210, and it came in a bundle with a 240GB SSD and a few games.
So I can wait.
If readers here want the very best, irrespective of cost, why are they wasting their time here? Get your PA to order a 1080 Ti, and replace it in a year.
Then there is the rest of us. Ryzen 6/8 cores may not be the 'best' for gaming, but a lot of smart money reckons it's the best choice, all considered.
Verdicts so far on Vega 56 seem like predicting the outcome of a war from a static snapshot of one aspect of one fluid battle - the cavalry, perhaps.
i.e. benchmarks on the current crop of games are the ~sole determinant for many reviewers and commenters.
A truism: any game can get playable frames on a half-decent rig; it's just a matter of compromising some settings.
Oh, and by the way, Vega is a fundamental paradigm-shift new-gen GPU.
I will guess the 1070 chip only manifests in a few products.
In the short time since the inaugural Vega FE, we've seen the ~same Vega chip in products ranging from $7k USD pro cards (not even counting Asus's alleged dual-Vega-GPU 32-lane card) to $400 consumer cards, plus a range of Vega mobile and desktop APUs due by Christmas.
ALL simple variants of one stock chip, using Infinity Fabric.
Vega's HBCC feature alone is potentially huge. You've heard there's a memory crisis, right? It's not temporary: production improves, but at a fraction of the rate of the processors that demand more of it.
AMD says HBCC can double or triple your GPU's actual vs. perceived cache without you noticing, and yet it is barely remarked on by pundits or commenters.
You want economical? What better than a 4GB GPU that works like an 8-12GB card?
At the other extreme, Vega presents the almost unfathomable possibility of near-infinite GPU workspace. There is already a $7k USD 2TB (yes, TB) Vega pro card, but with Vega, a Threadripper, and a few small, cheap striped ~128GB NVMe SSDs, anybody has this option.
HBCC, as you may know, intelligently shuffles data for optimal availability into the most suitable medium in the cache pool. AMD demonstrated its effectiveness by turning the SSD array on and off during a memory-maxing task.
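As an illustration only (HBCC's actual paging is done in hardware and the driver; none of these names come from AMD), the basic idea of a small fast memory pool backed by a larger slow one, with least-recently-used eviction, can be sketched like this:

```python
from collections import OrderedDict

# Toy model of tiered memory: a small fast pool (think HBM) backed by a
# larger slow store (think system RAM / SSD), with LRU eviction.
class TieredMemory:
    def __init__(self, fast_capacity_pages):
        self.fast = OrderedDict()          # page_id -> data, kept in LRU order
        self.fast_capacity = fast_capacity_pages
        self.slow = {}                     # backing store always holds a copy
        self.page_faults = 0

    def write(self, page_id, data):
        self.slow[page_id] = data

    def read(self, page_id):
        if page_id in self.fast:           # hit: refresh LRU position
            self.fast.move_to_end(page_id)
            return self.fast[page_id]
        self.page_faults += 1              # miss: page in from the slow tier
        if len(self.fast) >= self.fast_capacity:
            self.fast.popitem(last=False)  # evict least recently used page
        self.fast[page_id] = self.slow[page_id]
        return self.fast[page_id]

mem = TieredMemory(fast_capacity_pages=4)   # a "4GB" fast tier
for p in range(8):                          # an "8GB" working set
    mem.write(p, f"page-{p}")
for p in [0, 1, 2, 3, 0, 1, 2, 3]:          # hot subset fits in the fast tier
    mem.read(p)
print(mem.page_faults)                      # 4: only the initial cold misses
```

The point of the sketch: as long as the hot working set fits in the fast tier, the workload sees the capacity of the big slow pool at close to the speed of the small fast one.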
A PC is a foundation for present and future solutions. Even if I were a gamer, warts and all, I would take the ~cheapest Threadripper with 3200 memory, a ~4GHz CPU OC, 3+ M.2 NVMe ports, and a Vega 56, and make the best cake I can with those ingredients.
Whatever the perceived immediate benefits of a system based on a hybrid of products from commercial enemies like Nvidia and Intel, both with track records as predators (especially Intel), those advantages will soon be outweighed by the harmonious "greater than the sum of its parts" AMD ecosystem of sibling Vega and Zen.
I don't see how Vega being used across the product stack is any different from AMD's previous offerings. They just traded FP64 for FP16, and that's about it. It's also arguably the reason why AMD's Vega GPUs are unable to compete: they are a jack of all trades, master of none.
HBCC is mostly just marketing. Nvidia's hardware is just as capable of unified memory access; Nvidia just limits it to TCC mode on Windows, because WDDM has strict memory-access requirements for both security and compatibility reasons. And AMD's biggest showcase of HBCC with an SSD was 8K real-time rendering - which Nvidia actually beat them to on the P6000 (http://www.tweaktown.com/news/58669/nvidia-beats-amd-real-time-8k-video-editing/index.html). The Radeon SSG that you refer to doesn't even launch until the end of this year, so I don't know why you're saying it's "already" an option.
And while I like the idea of HSA, I've been hearing about how it's going to revolutionize AMD's product stack for the last decade. I get that they are getting closer and closer to their vision of it, but so far it hasn't really brought anything revolutionary (aside from Zen's scalability). Raven Ridge might be our first real look, but I don't think there is some product down the line that's going to magically pair with Vega and make it a product that I want.
True that. I game at 1440p which means I am actually not in a hurry for a long time. I kind of want new stuff but only because I want new stuff.
The voltage is 10% higher?? Jesus, dude. Good reply, thanks for the reality check.
I was going off the CU/stream/TMU counts + clocks and wondering what I was missing. Should have known... I didn't even consider that the silicon could be super leaky or just close to its limits already! Yikes :funny:
Sorry, no. That was just my crude assumption to explain the power difference.
But even if we assume only 5% higher voltage (needed for reliable operation at 5% higher clocks), that's still ~33% over Vega 56:
1.15 × 1.05 × (1.05)² ≈ 1.33
236W × 1.33 ≈ 314W (Vega 64)
Oh, and Vega 64 runs hotter, meaning even more leakage, and that was not even included in my simple calculation. So it's pretty easy to see why there's such a power delta between the two.
And if anything I would expect Vega 64 to be slightly more power efficient at Vega 56 clocks.
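The back-of-envelope estimate above can be sketched as a quick script. Dynamic power scales roughly as P ∝ N_units × f × V²; the 15%/5%/5% figures are the poster's assumptions, not measured values:

```python
# Rough dynamic-power scaling estimate for Vega 64 vs Vega 56.
# Assumed ratios (from the post, not measurements):
#   1.15 = more active silicon, 1.05 = clock, 1.05 = voltage.
def power_ratio(unit_ratio, clock_ratio, voltage_ratio):
    # Dynamic power scales linearly with units and clock, quadratically with voltage.
    return unit_ratio * clock_ratio * voltage_ratio ** 2

ratio = power_ratio(1.15, 1.05, 1.05)
print(round(ratio, 2))     # 1.33 -> ~33% over Vega 56
print(round(236 * ratio))  # 314 -> rough Vega 64 estimate in watts
```

Note the model ignores static (leakage) power, which rises with temperature, so a hotter-running Vega 64 would draw somewhat more than this estimate suggests.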
Blunty's RX Vega 56 vs GTX 1070 (overclocked) video, benchmarked with a Ryzen 5 1600, gives some interesting results.
The most interesting thing for me here is not only seeing the differences in action, but that the overclocked GTX 1070 also seems to lose in GTA V! Interesting Vega 56 performance across the board, apart from the synthetic benchmarks.
Some have commented that the results are different when testing with an Intel CPU. What I'm learning is that consumers will need to put more consideration into the GPU they buy.
If the Vega 56 is only £20-50 more than the GTX 1070, then overall the Vega 56 is the better buy.
If prices end up edging closer to £500, then people need to consider the GTX 1080 as well.