Discussion in 'Videocards - AMD Radeon' started by WhiteLightning, Sep 28, 2018.
Are the rumors true that AMD will release lower-end Navi GPUs only after Polaris stock sells out?
I would not say so. The low-end Navi that was speculated was based on a chip stronger than the actual 5700 XT, but cut down by 40%, which is unrealistic.
There may be a lower-end chip altogether, as a Navi part with 12 dual-CUs would be roughly a 25% upgrade over the Polaris RX 580, at a reasonable price and with good power draw too.
10m30: How much does it cost?
ASUS ROG and AMD reveal the world's first Display Stream Compression monitor
Driving 4K resolutions at over 120Hz is a difficult task. Simply put, the bandwidth isn't there on DisplayPort 1.4 to deliver that kind of refresh rate at 4K.
To reach 4K at 120Hz or 144Hz, compromises need to be made, resulting in sub-par image quality through chroma subsampling.
At E3 2019, AMD and ASUS ROG showcased the world's first display with support for "Display Stream Compression" (DSC), a compression technique for DisplayPort which is considered practically lossless.
This technology enables 4K resolutions at 144Hz without chroma subsampling, which means that gamers need not choose between their display's high resolution, refresh rate or chroma quality again. With DSC, pristine 4K 144Hz is achievable.
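The bandwidth claim can be sanity-checked with simple arithmetic. The sketch below assumes DP 1.4's HBR3 link (4 lanes at 8.1 Gbit/s, roughly 25.92 Gbit/s of payload after 8b/10b encoding), 10-bit RGB at 30 bits per pixel, a typical DSC target of about 3:1, and ignores blanking intervals, so the figures are illustrative rather than exact:

```python
# Back-of-the-envelope check: can DP 1.4 carry 4K 144Hz 10-bit RGB?
# HBR3: 4 lanes x 8.1 Gbit/s = 32.4 Gbit/s raw, ~25.92 Gbit/s after 8b/10b.
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10

def stream_gbps(width, height, hz, bpp):
    """Uncompressed video bandwidth in Gbit/s (blanking ignored)."""
    return width * height * hz * bpp / 1e9

uncompressed = stream_gbps(3840, 2160, 144, 30)  # ~35.8 Gbit/s
with_dsc = uncompressed / 3                      # assuming ~3:1 DSC

print(f"uncompressed: {uncompressed:.1f} Gbit/s")  # exceeds 25.92
print(f"with DSC:     {with_dsc:.1f} Gbit/s")      # fits comfortably
```

The uncompressed stream needs roughly 35.8 Gbit/s, which is why 4K 144Hz previously forced chroma subsampling on DP 1.4; with DSC's compression it fits with room to spare.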
DSC will be available on AMD's Radeon Navi RX 5700 series of graphics cards (Polaris & Vega later), with ASUS' unnamed ROG monitor featuring a 43-inch 4K panel which supports 144Hz, FreeSync 2 HDR and ships with a Display HDR 1000 certification.
The monitor itself looks like an upgraded version of the ROG Strix XG438Q monitor which was revealed at CES 2019, but at this time it is unknown whether the XG438Q has been upgraded or if this is a wholly new monitor.
Why is this launch article implying Nvidia doesn't have DSC? Pascal and Turing both support DSC on their DP 1.4 controllers.
Compressed streams incur latency, so that'll be an interesting thing to test for.
LG reveals the world's first 1ms IPS gaming displays
When it comes to high-end gaming monitors, PC users have a choice.
Do I opt for the image quality and wide viewing angles of IPS panels, or the crisp, responsive feel of TN? LG's response to this question: why not have both?
At E3 2019, LG revealed the world's first 1ms IPS gaming monitors, UltraGear series displays which offer the stunning image quality of their Nano IPS panels while offering ultra-fast response times with 144Hz refresh rates. With LG, gamers can get the whole package.
LG's new 1ms IPS panels come in two flavours: a 38-inch 3840x1600 model, the 38GL950G (G-Sync), and a 27-inch 2560x1440 model, the 27GL850, which is a FreeSync (Adaptive-Sync) display.
The 38-inch model also supports LG's Sphere Lighting 2.0 technology.
If you don't believe LG's 1ms response time claims, the company has sought out external validation, with UL, a renowned standards body, certifying these displays as offering 1ms Gray-to-Gray response times.
On top of that, LG has also made multi-monitor setups practically seamless with a 3-side borderless design.
Both monitors support a 10-bit colour depth and offer 98% coverage of the DCI-P3 colour space and 135% of the sRGB colour space.
The 27-inch Ultragear will support HDR10 (it apparently didn't meet any VESA DisplayHDR specification) and a 144Hz refresh rate, with the 38-inch model supporting the DisplayHDR 400 specification and a 144Hz refresh rate, which can be boosted to 175Hz through overclocking.
Anti lag is just lowering the prerender queue to 1, what AMD is not saying is that you will perceive more stutters in doing this during frametime spikes.
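As a toy model of the claim above (my own illustration, not AMD's implementation): in a GPU-bound steady state where the CPU keeps the queue full, a frame submitted now is displayed roughly one frame-time per queue slot later, so queue depth directly scales input-to-display latency:

```python
# Toy model: input-to-display latency vs. pre-render queue depth.
# Assumes a GPU-bound steady state where the CPU keeps the queue full,
# so a new frame waits behind (depth - 1) already-queued frames.
def queue_latency_ms(fps, queue_depth):
    frame_time = 1000.0 / fps
    return frame_time * queue_depth

for depth in (1, 2, 3):
    print(f"depth {depth}: ~{queue_latency_ms(60, depth):.1f} ms at 60 fps")
```

At 60 fps this puts a depth-3 queue around 50 ms versus roughly 17 ms at depth 1, which is why lowering the queue helps responsiveness; the flip side, as noted above, is that a shallow queue has no buffered frames to smooth over frametime spikes.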
Have you heard what they said? Your statement says otherwise.
So, where is your source?
I have Queue Tweaked and set to 1 instead of 3.
1 is Best for me.
You make me laugh.
Can you stay on topic? You made a claim about unreleased technology. I am asking you: what on Earth made you come to your conclusion?
Did that question make you laugh? Why? Because it is so simple that everyone except me already knows the answer?
Then it should be easy for you to explain/prove.
Or do you laugh because someone actually still replies to your fiction posts just to make something look worse?
This is likely the right line of thinking, but I believe there is more to this: anti-lag is a dynamic solution and not just a fixed "flip queue 1" setting, as we've been able to do that for years via a registry setting.
The way I understand the presentation of Anti-Lag is that it dynamically changes the queue size as well as prioritising elements within the queue based on the available frametime, to optimise which frame the user is able to respond to.
If you watch the video from 2:40, it's explained that it's a software optimisation so it can be reasonable to assume that there is more going on than a simple pre-render frame limit setting.
We'll find out when reviewers have cards in hand.
Samsung and AMD could power Nintendo's Switch successor
Samsung's graphics partnership with AMD is set to deliver great things, bringing the power of AMD's RDNA graphics architecture to the masses alongside Samsung's powerful Exynos series of mobile chips.
When Nintendo attempts to replace its Switch console, the company will likely opt to create a new handheld/home console hybrid.
While sourcing chips from Nvidia will be an option for the company, Nvidia's minimal efforts on consumer-grade ARM products leave them in a position where they might not be able to create silicon that's sufficiently "next-gen" for Nintendo's next console, at least on the CPU side.
Nvidia hasn't updated their Shield tablet line with new silicon since the Tegra K1, which contains a Kepler-based graphics chip, while their Shield TV series has not received chips newer than the Maxwell-based Tegra X1, the same chip the Switch uses. As it stands, if Nintendo wanted to create a next-generation Switch console, Nvidia doesn't have a new off-the-shelf SoC to sell them, a factor which would force Nintendo to look at alternative suppliers.
This is where AMD's partnership with Samsung comes in.
Last week, AMD and Samsung entered a strategic partnership which was designed to deliver "low-power, high performance" graphics tech to the masses, merging AMD's RDNA graphics with Samsung's already capable ARM SoCs.
Together, both companies plan to deliver "groundbreaking graphics products", though this deal will take years to start bearing fruit.
While AMD has made ARM processors in the past, their focus with ARM was never on mobile platforms, making the mobile market an area which is practically inaccessible to the company.
While AMD's new Zen 2 series of x86 processors are incredibly efficient, ARM-based CPUs are still the leaders of the mobile market, which means that AMD either needs to invest in ARM components or find a partner that can handle the ARM side for them.
In this regard, Samsung is an ideal partner for AMD, as it enables the company to get its graphics components into more areas of the market and secures additional funding for future GPU development.
AMD Navi Radeon Display Engine and Multimedia Engine Detailed
Two of the often overlooked components of a new graphics architecture are the I/O and multimedia capabilities.
With its Radeon RX 5700-series "Navi 10" graphics processor, AMD gave the two their first major update in over two years, with the new Radeon Display Engine, and Radeon Multimedia Engine.
The Display Engine is a hardware component that handles the graphics card's physical display I/O.
The Radeon Multimedia Engine is a set of fixed-function hardware that provides CODEC-specific acceleration to offload your CPU.
The Navi Radeon Display Engine features an updated DisplayPort 1.4 HDR implementation that's capable of handling 8K displays at 60 Hz with a single cable.
It can also handle 4K UHD at 240 Hz with a single cable. These also include HDR and 10-bit color. It achieves this by implementing DSC 1.2a (Display Stream Compression).
The display controller also supports 30 bpp internal color-depth. The HDMI implementation remains HDMI 2.0. The multi-plane overlay protocol (MPO) implementation now supports a low-power mode.
This should, in theory, reduce the GPU's power draw when idling or playing back video.
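The 8K60 and 4K240 single-cable claims can be checked with the same kind of arithmetic as above. The sketch assumes 30 bpp, ignores blanking, and uses DP 1.4's HBR3 payload of ~25.92 Gbit/s after 8b/10b encoding; notably, both modes work out to the same raw bandwidth, and both depend on DSC:

```python
# 8K60 and 4K240 at 30 bpp imply the same raw bandwidth requirement,
# well beyond what DP 1.4 carries uncompressed - hence DSC 1.2a.
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10  # HBR3 after 8b/10b encoding

def stream_gbps(w, h, hz, bpp=30):
    """Uncompressed video bandwidth in Gbit/s (blanking ignored)."""
    return w * h * hz * bpp / 1e9

for name, (w, h, hz) in {"8K60": (7680, 4320, 60),
                         "4K240": (3840, 2160, 240)}.items():
    need = stream_gbps(w, h, hz)
    ratio = need / DP14_PAYLOAD_GBPS
    print(f"{name}: {need:.1f} Gbit/s -> needs ~{ratio:.1f}:1 compression")
```

Both modes come out near 60 Gbit/s, needing roughly 2.3:1 compression, which is comfortably within DSC's capabilities.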
The Radeon Multimedia Engine is updated with support for more CODECs. The "Navi 10" GPU provides hardware-acceleration for decoding VP9 video at formats of up to 4K @ 90 fps (frames per second), or 8K @ 24 fps.
The H.265 HEVC implementation is more substantial, with hardware-accelerated encoding of 4K at frame-rates of up to 60 fps.
H.265 HEVC decoding is accelerated at 8K @ 24 fps, 4K @ 90 fps, and 1080p at up to 360 fps. H.264 MPEG4 gets 4K @ 150 fps and 1080p @ 600 fps decoding; and 4K @ 90 fps and 1080p @ 150 fps encoding.
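Those decode figures line up with a roughly constant pixel throughput per codec, which is what you'd expect from a fixed-function decoder. A quick check (my own arithmetic, not from AMD's slides):

```python
# Megapixels/second implied by each quoted decode mode; modes for the
# same codec cluster around a single throughput figure.
def mpx_per_s(w, h, fps):
    return w * h * fps / 1e6

modes = {
    "HEVC 8K@24":      (7680, 4320, 24),
    "HEVC 4K@90":      (3840, 2160, 90),
    "HEVC 1080p@360":  (1920, 1080, 360),
    "H.264 4K@150":    (3840, 2160, 150),
    "H.264 1080p@600": (1920, 1080, 600),
}
for name, args in modes.items():
    print(f"{name}: ~{mpx_per_s(*args):.0f} Mpx/s")
```

The HEVC modes all land around 750-800 Mpx/s and the H.264 modes around 1,240 Mpx/s, suggesting the quoted frame rates are just the same decoder throughput expressed at different resolutions.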
Nvidia has Pascal and Volta based Tegra products Nintendo can use - and Nintendo is already suggested to be working them into Switch refreshes.
Nintendo's toolset is OpenGL based; I strongly doubt they will go to a vendor who has inferior OpenGL and break Switch backwards compatibility.
Mind you, AMD drivers might be shiet on OpenGL on Windows, but the hardware mostly isn't any more inferior in that sense. AMD seems to do quite fine on Linux when it comes to open source drivers and OpenGL. And the PS4 using OpenGL is one of the points.
For sure Nintendo would get a more powerful package from AMD overall, but not as energy efficient as an ARM setup. I think ARM is the reason why they won't go x86, since it would make things harder for them.