Radeon RX 5300 Was Silently Released

Although it is an entry model positioned under the Radeon RX 5500 XT, it has the same shader processor count as the RX 5500: 1,408.
"TBP (Typical Board Power) is supposed to be 100W, which is 30W lower than that of the Radeon RX 5500 XT, but it should be noted that an auxiliary power connector is required." Epic fail for a low-end card. Not an option for many Dells and other branded PCs, as well as media centers, etc. Too little, too late, too pathetic.
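For context, the connector complaint comes down to simple slot-power arithmetic: a PCIe x16 slot alone can supply at most 75W, so any card whose board power exceeds that needs an auxiliary plug. A minimal sketch (the helper name is illustrative; the TBP figures are the ones quoted in the article, plus the GTX 1650's 75W):

```python
# Rough power-budget check: a PCIe x16 slot can deliver at most 75 W,
# so any card with a higher typical board power needs an auxiliary connector.
PCIE_SLOT_LIMIT_W = 75   # max power the slot alone can supply (PCIe CEM spec)
SIX_PIN_LIMIT_W = 75     # a 6-pin PCIe connector adds up to another 75 W

def needs_aux_connector(tbp_w: int) -> bool:
    """True if typical board power exceeds what the slot can deliver."""
    return tbp_w > PCIE_SLOT_LIMIT_W

# TBP figures: RX 5300 and RX 5500 XT as quoted above; GTX 1650 is a 75 W card.
for name, tbp in [("RX 5300", 100), ("RX 5500 XT", 130), ("GTX 1650", 75)]:
    print(f"{name}: TBP {tbp} W -> aux connector needed: {needs_aux_connector(tbp)}")
```

This is why the 1650 can drop into an unmodified OEM box while the RX 5300, despite being the slower card, cannot.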
It's 1GB more than the competition, for the same price or so... Don't forget that this board is not for us, so our argument is invalid. The only bad thing is the power connector, as some targeted PCs don't have a PCIe connector... (and so would go NVIDIA instead).
Sadly it might reach the retail market the way the RX 550 Lite did (you know, the 2GB 64-bit version) that was OEM-only at launch...
It's a mistake to make it require a 6-pin. The 1650 is a great card because it can drop into a basic OEM unit and function as a family PC that's also good enough for the kids to play a bit of Fortnite, etc. Needing a bigger PSU means you cut out that market.
I have a 4GB VRAM card and all I can say is: 3GB ain't enough. The narrow bus and memory speed will hold it back a lot too. If you want to game, don't buy this or you'll be disappointed.
...It's a 5300... The GPU isn't going to magically run better at resolutions the chip itself can't handle just because of added RAM. You really have to rework your entire way of thinking: if something won't help the card, but will only make it more expensive, you as a consumer should not want it. In fact, if the benchmarks AMD has given for this GPU are correct (reviews should confirm), then this 3GB card is beating a 4GB card... by a pretty decent margin. But hey, if you want that 4GB card simply because it has more memory, not because it brings you more performance, then by all means, waste your money. Needing the PCI-Express connector in systems that simply don't have it, sure, but your claim of "needing a bigger PSU" doesn't really hold up. It's an additional 25 watts; if the OEM market is cutting its PSUs down so far that they can't spare 25 watts, it was a bad PC in the first place.
Yeah, I really don't get this either. They should have figured out how to make this a 75W card so it wouldn't need auxiliary power; it really doesn't look good when a low-end model needs it. That's very true (too true...), but regardless, this GPU sits in an awkward position. It's not ideal for HTPCs or office use because it's too power-hungry and big. It's not ideal for gaming because there's not enough VRAM. Anyone with heavy compute needs would want something more powerful (or with CUDA support). It probably isn't going to be cheap enough for anyone on a serious budget. I'm guessing it was silently released because it would be compared to the 1650, which is likely going to be the more sensible choice.
4GB is sufficient for 1080p. It's not an ideal amount, but people forget that textures in modern games are optimized for 4K. Most of the time you can turn down texture detail a notch or two and have much less of a VRAM problem with little to no visible difference. Disable AA and you might actually have VRAM to spare.
Not necessarily; there are Molex-to-PCIe adapter cables. Sure, it's not a very "clean" or reliable option, but for 25-30W it's far from a problem, aside from the chance that the OEM unit is so "gimped" it couldn't take those extra watts (which are roughly what two HDDs draw, so...).
So, according to these graphs, this card is just as fast as the 5500 XT (especially in BF5; I couldn't be bothered to check the rest). There must be some magic involved!