Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 28, 2020.
Probably for 6800 XT or even 6800. But not for 6900 XT.
I had to read up on the differences between G-Sync and FreeSync and from what I gather, there really isn't one. Out of curiosity, I'm checking what it would cost to buy a FreeSync replacement for my X34A.
Last time AMD was this competitive with nvidia it was 6000 series... wait...
On a more serious note, I think the 6800 should be just a tad cheaper, say $550. Right now it's too close to the 3070 in performance to justify the $80 premium, and too close to the 6800 XT in price to justify the performance gap.
Maybe they should have gone with 12 or 14 gigs of memory to save a couple bucks?
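To put rough numbers on that argument: the launch MSRPs below are real ($499 / $579 / $649), but the relative performance figures are placeholder assumptions, not benchmark results. A quick Python back-of-the-envelope:

```python
# Rough perf-per-dollar at launch MSRPs.
# "perf" values are hypothetical placeholders, not measurements --
# substitute real review numbers once they exist.
cards = {
    "RTX 3070":   {"msrp": 499, "perf": 1.00},  # baseline
    "RX 6800":    {"msrp": 579, "perf": 1.10},  # assumed ~10% faster
    "RX 6800 XT": {"msrp": 649, "perf": 1.30},  # assumed
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['msrp'] * 1000:.2f} perf per $1000")
```

With those placeholder numbers the 6800 lands slightly below the 3070 on perf per dollar, which is exactly the "too close to justify" point; at $550 it would be roughly at parity.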
That would be funny now that the tables have turned.
Off topic: in my line of work, SAM stands for System for Award Management.
The 6700 hasn't been announced yet.
I think you might see street prices do just that. Most of this tech doesn't sell at MSRP for very long if there is any competition.
nVidia already returned to pre-Turing pricing. So, while some were impressed by the price of the 3080 and below, it was actually just a return to what we had before.
And AMD competing with that while offering a good alternative is more than we could have expected.
Had nVidia released Ampere with Turing's pricing scheme, AMD's cards would look super cheap now. The same could be said if AMD had released their cards before Ampere.
(Just because nVidia slashed Turing-era prices a month before AMD doesn't make the RX 6000 launch prices any worse.)
At the start, nVidia asked $200 extra, claiming that their magical board would gain new features over time... "Soon™" style.
Then FreeSync (adaptive sync) compatibility was waved off with: "Totally different, inferior, incompatible."
Now: "Soon™" never happened, and FreeSync = G-Sync = Adaptive-Sync.
But when you shop for a new FreeSync screen, AMD has changed the naming a few times, and you want the newest "standard name" that supports Adaptive-Sync and HDR at the same time. (IIRC, FreeSync Premium Pro.)
Going with 12 or 14 gigs would have meant fewer memory chips and a narrower bus, which would have decreased bandwidth and therefore performance.
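For anyone wondering why capacity and bandwidth are tied together: each GDDR6 chip sits on its own 32-bit channel, so dropping chips to cut capacity also narrows the bus. A minimal sketch of the arithmetic, assuming the RX 6000 reference cards' 16 Gbps GDDR6 and 2 GB chips:

```python
# Peak bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8.
# Each 2 GB GDDR6 chip occupies a 32-bit channel, so capacity and bus
# width shrink together.
DATA_RATE_GBPS = 16  # per pin, as on the RX 6000 reference cards

for capacity_gb, chips in [(16, 8), (14, 7), (12, 6)]:
    bus_width = chips * 32                       # bits
    bandwidth = bus_width * DATA_RATE_GBPS / 8   # GB/s
    print(f"{capacity_gb} GB -> {bus_width}-bit bus, {bandwidth:.0f} GB/s")
```

So a 12 GB card would top out around 384 GB/s instead of 512 GB/s; Infinity Cache can hide some of that, but the raw ceiling still drops by a quarter.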
How is it that everyone was angry about the 3070's 8 gigs, but $80 for an extra 8 gigs and a bit more performance isn't good enough?
Are the drivers fine? Just because AMD doesn't have an employee reading bug reports doesn't mean the bugs don't exist. AMD should care less about new iterations of the Raptr-style interface and start supporting their "supported" products.
Sadly, both AMD and Nvidia are sandbagging their products: Nvidia with VRAM capacity and AMD with their "reliable" driver support.
I can't believe I'm saying this. We need Intel to save us...
Nope. The slides said Rage Mode was enabled for the 6900 XT only when it was compared against an nVidia card. For the 6800 XT, the slide compared it against itself, with Rage Mode (and Smart Access Memory) on versus off. The added power draw was not mentioned, though.
Maybe you should read the comment I responded to as well, instead of selectively quoting; then it might make more sense. Corporate PR where? It was a bit of Orwell, plus pointing out that not all products are available dirt cheap to all individuals. If he so chooses, he is free not to buy, but frankly neither the world nor either of these companies would give a rat's ass.
Team red it is: a 6800 XT for me, and my 2080 Ti will be donated to the wife's PC to replace her 2070. Yay, all team red for the first time in a few years. Long may it last, and no more stupid nVidia centre crashes!
This thread is impossible. I was going to read it through before commenting, but every time I thought I was on the second to last page, by the time I'd scanned through the comments and clicked the button for the next page, I was again only on the second to last. This is like an endless maze. Bah.
This looked better than I expected, to be honest, but I'm nonetheless glad AMD didn't hype the GPU beforehand like they would have done in the past. Now we got decent results with no hype (aside from random outsiders who didn't even have any connections). I was pretty much 75% sure I'd get an Nvidia video card before this presentation, but now I feel it's at least 50-50, depending on how these AMD cards look in real reviews from Guru3D. Of course I'd likely get a lesser card anyway, but if the top cards suck, the lesser cards will suck as well. I just hope AMD's RT performance won't suck; if it's useless in the not-yet-announced lesser cards, that would be a deal breaker for me. I expect the 3060 Ti to be at least somewhat capable in RT, after all, though that also remains to be seen.
Can't wait to replace my ancient GTX 1060 with an RX 6800 XT.
Nah, AMD thought they'd show NVIDIA how Ampere is really done.
I do not know in which threads this was said; when threads follow the usual trend we see these days, I disappear.
But yeah, that would have struck me if I had read it.
Strange that the 3000 series can't do it; in the end it's something done over the PCI Express lanes. Maybe some extra wiring on the I/O die?
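For context: Smart Access Memory is AMD's branding of PCIe Resizable BAR, which lets the CPU map the GPU's entire VRAM instead of the classic 256 MB window. A minimal sketch (Linux-only, with a hypothetical device address) to see what window a GPU currently exposes:

```python
# Print the BAR sizes a PCI device exposes on Linux. With resizable BAR
# (Smart Access Memory) active, one BAR should span the full VRAM rather
# than the usual 256 MB aperture.
from pathlib import Path

DEVICE = "0000:03:00.0"  # hypothetical address -- find your GPU's with lspci

lines = Path(f"/sys/bus/pci/devices/{DEVICE}/resource").read_text().splitlines()
for i, line in enumerate(lines[:6]):  # the first six entries are BARs 0-5
    start, end, _flags = (int(field, 16) for field in line.split())
    if end > start:  # unused BARs read back as all zeros
        print(f"BAR{i}: {(end - start + 1) / 2**20:.0f} MB")
```

Resizable BAR itself has been part of the PCIe spec for years, so support tends to hinge on firmware and platform enablement rather than extra wiring.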
Looks really good. However, after watching [the embedded video] and hearing about a list of reproducible bugs in the 5000 series, I've decided to sit on the fence, watch the driver threads, and see how this progresses early next year before upgrading. Without a doubt, AMD is looking really strong right now, but, yes, RT performance...
Which approach will win the RT battle: dedicated RT cores like Nvidia's RTX, or support integrated into the compute units like AMD's?
OK, I was already set on upgrading from an i7 4790K to a 5800X and keeping my Vega 64 for a while yet, but here's AMD just casually showing Smart Access Memory, and I have to say I'm now really tempted by a 6800 XT too.