Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 21, 2020.
If it's RDNA2, shouldn't it be RX 6x00 series?
I don't know if the rumour is true or not, but I doubt it will be made on 7nm. AMD must be waiting for the refined 7nm+ for that extra juice, while refining RDNA to version 2.0 and adding ray tracing: that one will be interesting to see.
AMD already had different architectures within the same series with GCN; they could do the same with RDNA.
Yeah, and it caused confusion.
If there's really RTRT-hardware, they should call it RX 6000 series.
They should reduce the price of the 5700/XT cards as well; they are overpriced.
No point unless it has HDMI 2.1
Nvidia is not afraid of anything here. Data centers are its field, their cards are the top performers, and their drivers work better. Still, many people want to buy a fast card and want competition, exactly as happened with CPUs.
How is this a determining factor? Bandwidth for what kind of resolution/fps?
4K 144 with no subsampling.
I see the 5700 XT priced everywhere like a 2060 Super; it does not look overpriced given the current trend.
I hit around 300-330 watts with my PowerColor Red Devil on air. That is with my OC, of course...
Can't that be achieved somehow even if they cannot get the 2.1 certification? Like reaching the output speed but not the other required specs? Being able to output 4K/144 and being fully 2.1-spec compliant can be two different things, right?
I would assume the next generation will be able to output that many frames, even if it can't reach that in 4K in new titles, or titles from 2018 onward at high settings (imho). Still, it would be bad if they can't do that.
I don't care if Big Navi's power efficiency makes me buy a new PSU. If I can get 2080 Ti performance at half the price, I am in. I have refused to buy any Nvidia RTX cards because of the gluttonous pricing. AMD has such a great chance of carving out a big piece of the enthusiast GPU pie. Just get it right this time. And if you do get it right, raise the salary of the person responsible for getting it right so he/she doesn't go to Big Blue Balls.
OK, this sounds interesting. I think this is the only real chance of catching up in the high-end GPU segment. They need the 5800 XT to match the 2080 Ti, and they need the 5950 to at least match the 3080 while the 5950 XT goes head to head with the 3080 Ti. But as we all know, with AMD the hype train is always big. I'll believe it when I see it this time.
Compared to all the past speculation on their GPUs, this sounds the least likely to happen tbh )))
The 5800 XT sounds good, right on track with what is needed for AMD to compete in the higher segments. It will be interesting to see what sits even above this, considering the names AMD has prepared for release beyond this GPU. I would be all for Big Navi going HBM again.
I don't understand this continuous mention of drivers being better/worse; recent experiences with AMD drivers have been great, with less software bloat too. I do agree, though, that release drivers really need to be up to scratch to make a great first impression on the market.
I decided to go 4K a few years back and am now on my second monitor, as the first 60Hz screen failed inside the extended warranty that was bought for it. I am now using what was easily the best choice for someone not on the bleeding edge of tech (top-end screens get Very Expensive!!!), which ended up being the Acer Nitro XV273K: 4K, QDot, 144Hz capable. Not qualified for FreeSync 2, though.
(I got an amazing price, several hundred cheaper than other sites, from Acer's own website thanks to an anniversary sale, and added a 4-year extended warranty for about £30.)
To achieve its 144Hz, certain settings have to be dialed back: you lose full chroma (subsampling kicks in), HDR, and FreeSync/G-Sync... oh, and you need to use 2x DP 1.4 cables.
On the other hand, you can have all the options on, but then you are limited to 98Hz.
120Hz is a nice option for one-cable usage and some good quality-of-life stuff.
When I researched upcoming panels last year there was not a whole lot on the horizon, but I have noticed a lot of new monitor releases recently, so there should be some good new choices. If I were buying a screen today or in the near future, I would want a single-cable solution that requires no compromises in quality or features. Last I checked, not all features are supported over HDMI yet, so you are looking at DP 2.0.
DP1.4 effective bandwidth = 25.92 Gbps
DP2.0 effective bandwidth = 77.37 Gbps
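Those effective-bandwidth figures make the 4K/144 discussion above concrete. Here is a back-of-the-envelope sketch (my own arithmetic, ignoring blanking overhead, which adds a few percent in real timings) of why DP 1.4 needs DSC or chroma subsampling for 4K 144Hz while DP 2.0 does not:

```python
# Back-of-the-envelope link bandwidth check for 4K @ 144 Hz.
# Blanking overhead is ignored (real reduced-blanking timings add a few percent).

def video_gbps(width, height, hz, bits_per_px):
    """Uncompressed pixel-data rate in Gbit/s."""
    return width * height * hz * bits_per_px / 1e9

DP14_GBPS = 25.92   # effective DP 1.4 payload, per the figures quoted above
DP20_GBPS = 77.37   # effective DP 2.0 payload, per the figures quoted above

for label, bpp in [("8-bit RGB (4:4:4)", 24),
                   ("10-bit RGB (4:4:4)", 30),
                   ("10-bit 4:2:2", 20)]:
    need = video_gbps(3840, 2160, 144, bpp)
    print(f"{label}: {need:.2f} Gbps -> "
          f"DP1.4 {'OK' if need <= DP14_GBPS else 'needs DSC/subsampling'}, "
          f"DP2.0 {'OK' if need <= DP20_GBPS else 'no'}")
```

At 10-bit 4:4:4 the signal needs ~35.8 Gbps, which overshoots DP 1.4's 25.92 Gbps but fits easily inside DP 2.0's 77.37 Gbps; dropping to 4:2:2 (~23.9 Gbps) is exactly how those 4K 144Hz monitors squeeze under the older link's ceiling.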
I would imagine the new HDMI 2.1 will have full feature support, but you would have to check.
It's certainly a good time for PC hardware these days. Really looking forward to a proper enthusiast card from AMD again <3
Yeah it probably can but I think people like the stamp of approval.
For example, we have these monitor companies shipping 4K 144Hz monitors that on paper look fine, but when you get one and actually run it in 144Hz mode, text looks like blurry garbage because it drops to 4:2:2 to hit that. And now they are doing weird resolutions, like that 49" Samsung one, where the issue is still there but it happens at 92Hz or something. Unless you research it, you have no idea what the actual max ceiling is before the image quality degrades. IMO it leads to a frustrating experience. I don't want to have to dig around the internet all day to buy a monitor. I want to see that the monitor is 2.1 supported and my GPU is 2.1 supported, and know it's going to do 4K 144 with no subsampling, without having to rely on reddit posts and whatnot.
As for the frames, no idea. I'm sure with consoles getting a bump in horsepower, games will push for better graphics, especially with RT coming out. So more than likely newer titles on the consoles will still target 1080p/60 but with higher fidelity. That being said, it is nice to have the choice to run games in 4K and turn down the graphics settings. I do play various games like League of Legends, CS, and Siege, where even my 1080 Ti can do 4K @ 90-140 depending on the scene. So I imagine next-gen midrange will easily be able to do that across a ton of games.
Maybe not. If AMD doubles the CU count, they will have to lower clocks. I'd expect a base of ~1200MHz, a game clock of ~1350MHz, and a boost of ~1500MHz. That way they can definitely keep the TDP around 275W. Those clocks should not need more than 900mV.
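A first-order sanity check of that claim: dynamic power scales roughly with CU count × V² × frequency. The reference numbers below (a 5700 XT-like 40-CU part at ~1.2 V / ~1755 MHz game clock and ~225 W board power) are my own illustrative assumptions, and static leakage, memory and VRM losses are ignored:

```python
# First-order dynamic-power scaling: P ~ N_CU * V^2 * f.
# Reference numbers are assumptions (roughly 5700 XT-like), not AMD specs.

def scaled_power(p_ref, cu_ratio, v_ratio, f_ratio):
    """Scale dynamic power by CU count, voltage squared, and frequency."""
    return p_ref * cu_ratio * v_ratio**2 * f_ratio

p_ref = 225.0  # W, assumed reference board power
# Doubled CUs (80 vs 40), 0.9 V vs 1.2 V, 1350 MHz vs 1755 MHz game clock:
p_big = scaled_power(p_ref, 80 / 40, 0.9 / 1.2, 1350 / 1755)
print(f"{p_big:.0f} W")  # lands comfortably under 275 W in this rough model
```

In this rough model the 900 mV figure does most of the work: the V² term alone cuts power by ~44%, which is why a doubled-CU part at ~1350 MHz can still land below a 275 W TDP.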
I sincerely doubt Big Navi will have HBM2 (or 3). The cost is too high, and demand for Instinct cards is high and will remain so (mainly because of PCIe 4 being leveraged in those systems).
Too rich for consumer blood; I expect some variant of GDDR on a wide bus. The pro/sumer market will be fine with that, and the efficiencies and yields of GPU and memory will allow them to sell large numbers (by AMD standards) of Big Navi.
As far as comparison to Nvidia goes, imho AMD only has power efficiency left to conquer, as their designs are finally fully modern.
I am excited about Big Navi. But then there is the elephant in the room called Ampere. 2080 Ti performance for $699 would be a good day, though. Except I already went 2080 Ti for $999 last year. I would like an AMD GPU to pair with my AMD CPU, though.