
New Upcoming ATI/AMD GPUs Thread: Leaks, Hopes & Aftermarket GPUs

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

Thread Status:
Not open for further replies.
  1. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    If you're talking about visual fidelity, then not much :D
    If you're talking about sliders -> High/Medium with some Ultra touches + low AA (or ReShade/RP) (my AC:O settings)
    Well-optimised games, like the EA/Bethesda ones, run almost all @ Ultra
    The industry is slowly leaning towards DX12/VLK, so the future of gaming is IMO bright

    PS. The Division 2 is the first DX12-only game on Steam (yup, like the old times when there were DX11-only ones)
     
  2. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,264
    Likes Received:
    433
    GPU:
    MSI 2070S X-Trio
  3. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
  4. warlord

    warlord Ancient Guru

    Messages:
    2,371
    Likes Received:
    778
    GPU:
    Null
    Grow up, some people. Stop being nervous about the wannabe-gamer thing, because you aren't one, and give some real money to the best and largest programming corp around.

    Microsoft is bigger than any hardware-exclusive company for a reason: evolution and software. Windows 10 is great and 70 times better/faster/more stable than Windows 7. Even Linux is better than Windows 7 in everything.

    Does this ring any bell for you and the other wannabe pro users here still using Windows Vista/7? I want to vomit from my eyes when Guru3D users run Windows XP, Vista, Seven, etc.

    Since 2000 we here were always on the edge of things; we were either nerds or learners with potential :D How did this happen? It is pure sacrilege of the enthusiast class!

    DX12 exclusives and store apps were the best move from Microsoft; they gave an opportunity for a better future, especially for developers. I understand most users here cannot appreciate research and development. You do not know about science at all.

    Even the Xbox One/PS4 is superior to any Windows 7 PC in every aspect. LOL.
     

  5. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    I'm all for new and shiny.
    Was XP, then XP 64 on an Athlon 64 ;) then Vista on a Core Duo, then a Phenom X6 for Win7/8, and now ZEN for WinX

    IMHO, DX12 & VLK are the way to go, no more no less; they are the best thing in this decade.
    [Yes, AMD/ATI with Mantle gave these new APIs a head start]
     
  6. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    AMD/ATI Partners with:
    CAPCOM for Resident Evil 2
    Rebellion for Strange Brigade
    Ubisoft for The Division 2

    During a panel hosted at the E3 2018 Coliseum, Scott Herkelman, Vice President and General Manager of Radeon Gaming at AMD, announced a partnership with Ubisoft on Tom Clancy’s The Division 2, due on March 15th, 2019.
    This is a departure from the first game, which was optimized for NVIDIA cards and even included several GameWorks features.
    However, AMD and Ubisoft seemed to have gotten close again recently after Far Cry 5 was also optimized for AMD cards.

    Herkelman couldn’t provide more details yet on this partnership for The Division 2, though we do know the game should require DirectX 12 compatible hardware according to the Steam page.

    However, he was able to confirm two additional partnerships with Rebellion Developments for Strange Brigade and with CAPCOM for Resident Evil 2 remake.
    This isn’t the first time AMD has partnered with Rebellion: their Sniper Elite III was one of the very few games that supported the now-defunct Mantle API, while Sniper Elite 4 supported DirectX 12.
    The same can likely be expected for Strange Brigade, due to be available on August 28th with a built-in benchmark.

    Perhaps more surprisingly, Herkelman also welcomed CAPCOM Producer Peter Fabiano on stage, who revealed that the Resident Evil 2 remake (launching on January 25th, 2019) will support DirectX 12 and FreeSync 2.
     
  7. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    AMD Navi GPUs Will Not Use MCM Design, Feature Single Monolithic Die Instead, Reveals RTG SVP – Yet To Conclude If MCM Can Be Used in Traditional Gaming Graphics Cards

    It looks like AMD is going to stick with traditional monolithic dies and not aim for Multi-Chip Module (MCM) solutions as far as their next-generation GPUs are concerned. This was revealed to PCGamesN in an interview with the SVP of AMD RTG.

    AMD RTG SVP: Yet To Conclude If MCM Can Be Used For Gaming Graphics Cards, Looking Into It, AMD Navi GPUs Will Stick With Traditional Designs

    The interview was done with the Senior Vice President of Engineering at AMD Radeon Technologies Group, David Wang. Upon being asked whether the AMD Navi GPUs would use an MCM (Multi-Chip Module) approach, David replied that while they are looking into the MCM approach, they haven’t yet concluded whether that is a viable approach for traditional gaming graphics cards. Following is the quote from PCGamesn:

    “We are looking at the MCM type of approach,” says Wang, “but we’ve yet to conclude that this is something that can be used for traditional gaming graphics type of application.” (via PCGamesN)

    We know that GPUs are designed years in advance, and once the designs are finalized there’s little you can do in terms of design changes, since companies are on a tight schedule and engineering teams have to start working on the next design. We saw it with the Vega GPUs, which AMD started designing as soon as Raja Koduri took the helm of the Radeon Technologies Group back in 2015.
    A year later, we saw the company celebrating a key development milestone in 2016, a year prior to the release of the Radeon RX Vega GPUs. We know that Navi GPUs are headed for launch next year, and most of the design work is already completed, with the development phase to begin very soon.

    AMD already has experience with the 7nm process, since their Vega 20 parts based on the new node are heading out early next year. But as much as we wanted to see an MCM die on the Navi GPUs, I guess we have to wait a bit longer.

    With Navi GPUs, AMD is going to stick with the traditional monolithic design that we see on all modern GPUs. Unlike the MCM approach that AMD is taking on their HEDT Threadripper and server EPYC parts, their GPUs have yet to use the full potential of AMD’s Infinity Fabric, something Raja Koduri wanted to implement on their next-gen Radeon parts. Unfortunately, Raja Koduri left AMD for Intel as the chief architect of their Core and Visual Computing Group and is confirmed to be working on Intel’s first discrete graphics cards, aimed at the gaming market for a 2020 release.

    “To some extent you’re talking about doing CrossFire on a single package,” says Wang. “The challenge is that unless we make it invisible to the ISVs [independent software vendors] you’re going to see the same sort of reluctance.

    “We’re going down that path on the CPU side, and I think on the GPU we’re always looking at new ideas. But the GPU has unique constraints with this type of NUMA [non-uniform memory access] architecture, and how you combine features… The multithreaded CPU is a bit easier to scale the workload. The NUMA is part of the OS support so it’s much easier to handle this multi-die thing relative to the graphics type of workload.”

    So, is it possible to make an MCM design invisible to a game developer so they can address it as a single GPU without expensive recoding?

    “Anything’s possible…” says Wang.


    “That’s gaming” AMD’s Scott Herkelman tells us. “In professional and Instinct workloads multi-GPU is considerably different, we are all in on that side. Even in blockchain applications we are all in on multi-GPU. Gaming on the other hand has to be enabled by the ISVs. And ISVs see it as a tremendous burden.”

    Does that mean we might end up seeing diverging GPU architectures for the professional and consumer spaces to enable MCM on one side and not the other?

    “Yeah, I can definitely see that,” says Wang, “because of one reason we just talked about, one workload is a lot more scalable, and has different sensitivity on multi-GPU or multi-die communication. Versus the other workload or applications that are much less scalable on that standpoint. So yes, I can definitely see the possibility that architectures will start diverging.”

    via PCGamesN

    Now we have had multiple topics on what it takes for an MCM die to work, what the yields would be like and what performance estimates we should expect. The way an MCM design works is that it uses several dies connected together through a high-speed interconnect. On EPYC chips, AMD connects four Zen core complexes with their Infinity Fabric interconnect. These small dies combine into a high-core-count chip that runs efficiently, on par with or better than single monolithic chips. AMD’s EPYC has even led Intel to start developing their own MCM solution. NVIDIA is also researching MCM solutions for future GPU architectures, which will probably end up in the server space first before we ever see them on consumer cards.
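    The MCM idea described above boils down to: more dies linked by an interconnect, with some efficiency lost to cross-die traffic. A purely illustrative toy model of that trade-off (the function, its numbers, and the `interconnect_efficiency` parameter are made up for illustration, not AMD figures):

    ```python
    # Toy model (illustrative only): effective throughput of a multi-die
    # (MCM) package vs. ideal scaling, assuming each additional die adds
    # full compute but cross-die traffic discounts its contribution.

    def mcm_effective_throughput(dies, per_die_tflops, interconnect_efficiency):
        """Naive scaling model: the first die contributes fully; every extra
        die is discounted by the cross-die efficiency factor (0..1)."""
        if dies < 1:
            raise ValueError("need at least one die")
        return per_die_tflops * (1 + (dies - 1) * interconnect_efficiency)

    # A 4-die package of 3-TFLOP dies at 90% cross-die efficiency:
    print(round(mcm_effective_throughput(4, 3.0, 0.9), 2))  # 11.1, vs 12.0 ideal
    ```

    Even this crude sketch shows why the software side matters: the closer the interconnect (and driver) can get that efficiency factor to 1.0, the closer an MCM package looks to the single big GPU that games expect.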

    But on the gaming graphics card side, David mentions that while the hardware implementation exists to make it work, the software side still needs a lot of updating. It’s much the same situation as CrossFire or any other multi-GPU interface that exists for gamers: unless there’s proper software support, you won’t see any gains, and with MCM you are basically trying to make multiple GPUs look like a single GPU. You already know how things went with SLI and CrossFire, which are effectively dead and have almost no support from game developers. But the issue only exists in the gaming industry since, in the server space, multiple GPUs can work together seamlessly with proper performance scaling, unlike CrossFire and SLI. So for Navi, an MCM die seems out of the equation, but hopefully AMD will be able to deliver a working solution in their Radeon cards in the future.

    Editorial from WCCFtech
     
  8. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    AMD Radeon PRO V340

    A slide presenting a yet-unseen Radeon PRO V340 graphics card has been published by BenchLife. The slide was not part of the official press deck for the 7nm Vega Computex announcement, so we have no idea where it came from (it could be real, but it could also be fake).

    The Radeon PRO V340 is allegedly a multi-user computing solution for virtualized environments. It features 32GB of HBM2 memory. Now here’s where the story gets interesting.
    There are three possibilities: this is either Vega 10 with higher-density 32GB HBM2 stacks, a dual Vega 10 with 16GB HBM2 each, or 7nm Vega 20 with 32GB HBM2.

    Originally this story focused only on 7nm Vega, as the original source did, but apparently the PRO V340 is already present in the drivers.
    The card is codenamed AMD686C.4 = “AMD Radeon Pro V340 MxGPU”, which marks it as a multi-GPU solution.

    The slide is not very detailed but it does mention shared hardware encoding for H.265 and H.264 codecs.
    On top of that, we have a ‘security processor’, but the slide does not explain the purpose of this device.

    We have not really seen many Radeon PRO products for virtualization recently. Those cards lack display outputs and cannot be used as desktop graphics cards.


    You can also find Radeon Pro V340 at GFXBench:


    Source: BenchLife, GFXBench via HardwareLuxx
     
  9. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    AMD Radeon Pro V340 – 2x Vega 10 GPUs in a Multi-GPU Solution

    The card was quietly unveiled in a presentation given by Nick Pandher, Director of Market Development, Professional Graphics at AMD, during a Chinese press event.
    The card is one of the first AMD cards to feature 32 GB of HBM2 memory and is essentially two Vega 10s running in parallel.
    The graphics card is being positioned as a virtualization solution with capacity for 32 users.

    Cloud providers that offer computing resources as a service can use this card to greatly increase flexibility and allow multiple users to share resources at the same time.
    The drivers for this card already exist, and it’s codenamed AMD686C.4, or in other words the AMD Radeon Pro V340 MxGPU, where MxGPU stands for multi-GPU solution. Interestingly, the card has hardware support for HEVC (H.265), one of the codecs with growing demand right now.

    The slides also mention a security processor, which I assume is something similar to the ones found on AMD processors: it maintains the virtualization layer and keeps the virtual environment detached and inaccessible to the cloud provider.
    This can be a very attractive feature, since it means it would be virtually impossible (hacks like Spectre excepted) for the cloud provider itself to gain access to any of the user data that’s running in the virtualized environment.

    Another key development is native support for Adobe CC being introduced for the Radeon SSG family. I cannot overstate how big a deal this is.
    I have worked in the amateur video industry, and almost everyone I knew used NVIDIA GPUs, primarily because of an ecosystem based entirely on CUDA tools.
    Adobe integrating native support for AMD cards is the first step for the company to start breaking into the closed-down ecosystem of the professional video world and actually put its GPUs’ hardware to the test.

     
  10. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD

  11. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    Last edited: Jun 16, 2018
  12. Maddness

    Maddness Master Guru

    Messages:
    946
    Likes Received:
    211
    GPU:
    EVGA RTX 2080Ti FTW
  13. Komachi

    Komachi Member

    Messages:
    28
    Likes Received:
    9
    GPU:
    RX 480 CF@1350/2200
    Last edited: Jun 16, 2018
  14. Goiur

    Goiur Master Guru

    Messages:
    673
    Likes Received:
    101
    GPU:
    AMD RX Vega 56
    They made that promise with the 480... and well... we all know how it worked out.
     
    Last edited: Jun 17, 2018
  15. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    No, they made a promise that it would be an affordable VR/FHD-capable GPU -> and they delivered :D (189-275€)
     

  16. Goiur

    Goiur Master Guru

    Messages:
    673
    Likes Received:
    101
    GPU:
    AMD RX Vega 56
    Everything started with a promise of high-end performance under 400€/$; I almost sold my Fury X because of all the rumors.

    It ended up being a big smokescreen lol
     
  17. -Tj-

    -Tj- Ancient Guru

    Messages:
    16,385
    Likes Received:
    1,476
    GPU:
    Zotac GTX980Ti OC
    I don't trust anything TweakTown says, not since that fake NV 118x rumor; it was the only site babbling about how the 8GB card would be $1000 and the 16GB one $1400 xD

    And now he's doing the same with the PS5, some imaginary midrange NVIDIA 1160 GTX and whatnot... I can predict that too ;)
     
    Maddness and Fox2232 like this.
  18. Only Intruder

    Only Intruder Maha Guru

    Messages:
    1,109
    Likes Received:
    140
    GPU:
    Sapphire Fury Nitro
    That's all it was though: rumours, speculation and community-driven hype. The truth is that AMD stated high-end performance relative to their own product line, not the competition, i.e. 290/390 performance (which was AMD's "high end", with the Fury being "enthusiast" at the time), at a lower cost and on a more efficient power budget.

    The same thing happened with Vega - AMD actually said very early that Vega would be a competitor to the 1080, but then the community created hype, with rumours spreading that it'd be a 1080 Ti or Titan killer.... When it was released, it proved to be a competitor to the 1080, exactly as the original claims stated.

    It is horrible, though, when the hype train gets going and so many hopes and expectations are created wildly beyond the original claims; it ends up leading to disappointment.
     
    Dekaohtoura, HandR and OnnA like this.
  19. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    Hype, but don't hype tooooo much :D
    And then all is in check!
    I like the Hype Train -> it clearly shows that most of us enthusiasts are big kids deep in our hearts, and that's great :D:rolleyes::p:cool:;)
     
  20. OnnA

    OnnA Ancient Guru

    Messages:
    9,463
    Likes Received:
    1,926
    GPU:
    Vega 64 XTX LiQuiD
    Valve Fixes The Steam Hardware Survey

    Valve identified a problem with its Steam Hardware Survey and has deployed a fix. The survey tracks a portion of Steam's 125 million active users, so we in the media often use it to track the hardware and software that gamers use most frequently.
    Unfortunately, the widely cited survey has suffered from extreme swings in key tracking areas over the last seven months due to an error in the reporting system.
    That means the survey has largely been useless for its intended purpose of reflecting broad trends in CPU and GPU usage, among other areas.

    Steam's survey asks users to agree to participate, and if a user opts in, the survey queries the system for basic information about the hardware and software installed on the system.
    Valve adds the information to its database and then shares some of the most important data with the public.

    Steam designed the survey to query its users’ systems once per year, but the company discovered that Asia-based cyber cafes manage their systems in such a way that the survey could be completed multiple times per year on a single PC.
    This produced duplicate entries, which skewed the results of the survey. Typically, we would expect an automated system to weed out multiple entries using identifying information from the host computer, but either Steam protects its users by not collecting such data, or the duplicate entries simply went undetected.
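    The fix described above is, at heart, a de-duplication task. A minimal sketch of the idea (the `machine_id` field and the data are hypothetical; whether Steam collects any such identifier is exactly the open question in the article): keep only the most recent submission per machine within the survey window.

    ```python
    # Hypothetical sketch: collapsing repeat survey submissions from the
    # same machine, keeping only the newest entry per machine identifier.
    from datetime import datetime

    def dedupe_survey(entries):
        """Return one entry per machine_id: the most recent submission."""
        latest = {}
        for e in entries:
            mid = e["machine_id"]
            if mid not in latest or e["submitted"] > latest[mid]["submitted"]:
                latest[mid] = e
        return list(latest.values())

    # A cyber-cafe PC surveyed twice in one window, plus one home PC:
    entries = [
        {"machine_id": "cafe-pc-01", "gpu": "GTX 1060", "submitted": datetime(2018, 1, 5)},
        {"machine_id": "cafe-pc-01", "gpu": "GTX 1060", "submitted": datetime(2018, 3, 9)},
        {"machine_id": "home-pc-77", "gpu": "RX 480",  "submitted": datetime(2018, 2, 1)},
    ]

    print(len(dedupe_survey(entries)))  # 2 machines remain after collapsing duplicates
    ```

    Without some stable identifier to key on, the cafe PC above counts twice — which is precisely how the cafe installs inflated Windows 7 and quad-core numbers in the published survey.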

    The duplicate entries led to inflated statistics for Windows 7 usage, CPU and GPU market share, and an erroneous report of a rise in the number of quad-core systems.
    Steam has deployed a fix to correct the issue, and the company contends that all its data from April 2018 onward will be correct.

    However, it appears that the company isn't correcting the erratic data it gathered in the previous months.
    That means the April 2018 Steam Hardware and Software Survey is the only accurate measure of hardware usage by Steam members in the last seven months.
    It also isn't clear if the company made other changes to its survey methodology since it last posted accurate data in July 2017.

    AMD's return to prominence in the desktop processor market finds it trading blows with Intel as they release new waves of competing chips, and while it's difficult to find accurate market share information, the company has undoubtedly made up some ground.
    The latest survey doesn’t reflect that change. Assuming Steam's data in July 2017 was accurate, this chart outlines the changes over the previous seven months:

    -> https://www.tomshardware.com/news/steam-hardware-survey-cpu-gpu,37007.html

     