New Upcoming ATI/AMD GPUs Thread: Leaks, Hopes & Aftermarket GPUs

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

  1. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    Maddness likes this.
  2. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,739
    GPU:
    3080 Aorus Xtreme
    Good review. I really like the Strix cooling.
     
  3. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
  4. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    AMD FreeSync 2 is Coming to XBOX ONE X / S & Why It’s a Huge Deal!

    Something amazing happened yesterday. Something I personally have been hoping and waiting for, for a long time. AMD’s FreeSync 2 technology is finally coming to game consoles.
    Microsoft announced yesterday that it’s officially bringing AMD FreeSync 2 support to the XBOX ONE X and XBOX ONE S with its spring update.
    This is an absolutely massive deal and something gamers familiar with FreeSync will surely be extremely excited about.

    In June of last year I published an editorial on how literally game-changing variable refresh rate technology, i.e. FreeSync, would be for console gaming and why both Microsoft and Sony should seriously consider supporting it on their latest XBOX ONE X and Playstation 4 Pro.

    AMD FreeSync 2 Technology is Now a Must Have For XBOX Gamers & Here’s Why

    I’m going to start off with a quote from my editorial from a couple years back, because it kind of explains everything.

    Attempting to force the incredibly complex game worlds of today’s games to adhere to a fixed framerate is like trying to tame a lion. It’s extraordinarily challenging and any failures or hiccups will lead to catastrophic outcomes.
    FreeSync solves all of this.

    Khalid Moammer, June 14, 2016

    In a nutshell, FreeSync completely gets rid of tearing AND makes your game run smoother. Tearing is not something console gamers would normally see, as console games are typically locked to a framerate of 30 or 60 FPS with V-Sync enabled.
    However, V-Sync introduces lag, a lot of it in many cases. This means that even if we exclude screen tearing from the equation, there’s still virtually no situation where an adaptive refresh rate, i.e. FreeSync, is not superior to a fixed refresh rate.

    How FreeSync Actually Works and What it Does

    What Is V-Sync?

    Let’s start by talking about V-Sync, which is a frame delivery technique that forces your game to stay in sync with your display’s refresh rate. It does this by limiting your game’s FPS to a fixed 30 or 60 frames per second. In contrast, FreeSync allows your display to run at a dynamic refresh rate and simply match the game’s variable framerate in real time.

    To understand why this is so important, we have to understand why we need V-Sync in the first place. Traditional displays, like the TV you have hooked up to your game console, deliver images at a fixed rate: be it 60 times a second, 30 times a second, or even up to 144 times a second, as is the case on high-end gaming monitors.

    Why Games Inherently Hate V-Sync & Love Variable Refresh Rate / FreeSync


    This creates a huge problem, because games just don’t work like this. The hardware inside your gaming device is always working to process and deliver game images, or “frames”, as fast as it possibly can.
    Modern games have an incredibly wide variety of environments and scenes, some of which are naturally far more visually and computationally intensive than others, which is why framerates can vary greatly from one moment to the next, as some frames will take more time to process and display.

    This makes the framerate of modern games inherently variable, while your display’s refresh rate is the very opposite of that. To address this disparity V-Sync is used. Vertical Sync, as the name implies, syncs every frame with the vertical refresh rate of your monitor.

    It works by forcing each newly rendered frame to wait for its turn. It has to wait for the next refresh cycle; there’s one cycle every 16.6 milliseconds on a typical 60Hz display.
    As soon as the last cycle ends and a new one starts, it begins drawing the frame on the display. And that’s when you actually get to see it. This makes sure that every frame is always in sync with the display.

    Lag, So Much Lag!
    This ensures that no screen tearing occurs. Tearing is introduced when a new frame arrives while the display is in the middle of a refresh cycle.
    But, as explained above, because frames aren’t displayed as soon as they’re rendered and have to “wait” for a new refresh cycle every time, lag is introduced.

    [​IMG]

    As a result, the image you see in front of you will actually be lagging behind what is happening in the game. And if your framerate drops below its 60 or 30 FPS target, this lag grows dramatically worse, because the display will have to draw the same frame twice, on two consecutive refresh cycles, until it receives a new rendered frame to display.
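    To make the timing concrete, here is a minimal Python sketch (illustrative only, not tied to any real display API) that simulates when a finished frame actually becomes visible on a fixed 60Hz display with V-Sync versus a FreeSync-style display that refreshes the moment the frame is ready:

```python
# Minimal sketch: frame visibility latency under V-Sync on a fixed
# 60 Hz display vs. a FreeSync-style variable refresh display.
# All timings are illustrative, not measurements of real hardware.
import math

REFRESH_INTERVAL_MS = 1000.0 / 60.0  # one refresh cycle every ~16.6 ms

def vsync_display_time(render_done_ms):
    """A finished frame must wait for the start of the next refresh cycle."""
    cycles = math.ceil(render_done_ms / REFRESH_INTERVAL_MS)
    return cycles * REFRESH_INTERVAL_MS

def freesync_display_time(render_done_ms):
    """The display refreshes as soon as the frame is ready: zero added lag."""
    return render_done_ms

# Timestamps (ms) at which four variably-paced frames finish rendering.
for t in [10.0, 24.0, 45.0, 70.0]:
    vs = vsync_display_time(t)
    print(f"frame ready at {t:5.1f} ms -> V-Sync shows it at {vs:5.1f} ms "
          f"(+{vs - t:4.1f} ms lag), FreeSync at {freesync_display_time(t):5.1f} ms")
```

    Run it and you can see the V-Sync lag bounce between zero and a full 16.6 ms per frame; whenever a frame misses a cycle, the previous frame is simply shown again, which is exactly the frame-doubling described above.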

    What All of This Means
    Well, several things. First, for game developers it means they can build richer and more dynamic worlds without having to worry about an arbitrary 30 or 60 FPS limit.
    For gamers it means smoother gameplay and much more fluid game control and response. For TV makers it means they will now have to introduce TVs that support FreeSync.

    There are already over 100 displays on the market that support the technology, but none of them are TVs. This will undoubtedly change very soon. FreeSync actually costs very little to implement because it doesn’t require any special hardware. It relies on an industry standard, and all the processing required is handled by the scaler chip that already exists within the display. It’s also officially supported over HDMI, something that AMD introduced a couple of years back in anticipation of console makers embracing the technology.

    The Playstation 4 Pro’s graphics hardware is based on the same well-established PC graphics architecture that AMD continues to sell today, so there’s no question that it supports FreeSync.
    We expect to hear a similar announcement from Sony about bringing FreeSync support to the Playstation 4 Pro in the near future.
    So stay tuned PS4 gamers, you will not miss out!

    THX goes to -> https://wccftech.com/amd-freesync-2-is-coming-to-xbox-one-x-s-and-its-a-huge-deal
     
    Last edited: Mar 12, 2018

  5. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    ASRock Teases Phantom Gaming Graphics Cards, Rumored to be AMD Radeon Powered


     
    Killian38 likes this.
  6. Goiur

    Goiur Maha Guru

    Messages:
    1,341
    Likes Received:
    632
    GPU:
    ASUS TUF RTX 4080
    Yayyy, another card that no one will ever buy!!
     
  7. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
  8. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    Introducing Project ReSX!

    In recent months, we kicked off a special project inside of the Radeon™ Software group known as Project ReSX (Radeon™ eSports Experience). The goal of this project was to optimize the performance of some of the most popular PC games in the world on Radeon™ GPUs, to ensure the best possible eSports experience for Radeon graphics card owners.


    Radeon Software Adrenalin Edition 18.3.1 incorporates a host of changes made possible by Project ReSX. It delivers some notable performance improvements over the Radeon Software Adrenalin Edition version released late last year. Project ReSX focused on improving FPS, 99th percentile frame times and click-to-response latency to deliver a better overall eSports experience. Here are some of the gains:

    [chart: Project ReSX performance gains in popular eSports titles]

    The positive results shown above are the product of many hours of hard work from our software and engineering teams in cooperation with some of the world’s top game developers.
    Download Radeon Software Adrenalin Edition 18.3.1 and experience faster, more fluid and more responsive gaming for yourself!


    -> https://gaming.radeon.com/en/introducing-project-resx
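    For context on the metrics above: the 99th percentile frame time is the frame time that 99% of frames come in under, so it captures stutter that a plain average-FPS figure hides. A quick illustrative sketch (made-up numbers, not AMD's measurement methodology):

```python
# Illustrative sketch of the 99th percentile frame time metric.
# The frame times below are invented, not Project ReSX measurements.
import math

def percentile(values, pct):
    """Nearest-rank percentile: smallest value covering pct% of the data."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100.0 * len(ordered))
    return ordered[rank - 1]

# 1000 mostly smooth frames (~7 ms each) with 15 stutter spikes of 30 ms.
frame_times_ms = [7.0] * 985 + [30.0] * 15

avg = sum(frame_times_ms) / len(frame_times_ms)
p99 = percentile(frame_times_ms, 99)

print(f"average frame time: {avg:.2f} ms (~{1000 / avg:.0f} FPS)")
print(f"99th percentile:    {p99:.2f} ms (~{1000 / p99:.0f} FPS)")
# The average looks great (~136 FPS), but the 99th percentile (~33 FPS)
# exposes the spikes that players actually feel as stutter.
```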
     
  9. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
  10. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    Last edited: Mar 19, 2018

  11. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    Crytek has released the GDC 2018 tech demo video for its first-person PvP bounty hunting game, Hunt: Showdown, showcasing the capabilities of CRYENGINE V. Some of the highlights of this video are the Vulkan Renderer, voxelization, the Global Illumination effects and the shadows that are cast by the sun.

    Like Crytek’s previous titles, Hunt: Showdown – and all games using CRYENGINE V – features ‘touch bending’, meaning that players can bend the grass or bushes as they pass (a feature first seen in Crysis 3).

    CRYENGINE V also features Vegetation Detail Bending, Terrain Blending, Steam Integration functionalities and PFX2.

     
    Last edited: Mar 20, 2018
  12. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    MicroLED vs OLED Explained in Detail
    Before we dive into the differences between microLED and OLED, let us talk about how they are similar. If you have not yet noticed, both terms feature the name LED, meaning that these panels are made using LEDs, or light-emitting diodes. These diodes give off their own light, unlike LCD, which needs a dedicated backlight to project colors.

    As a result, both microLED and OLED panels are able to produce very high contrast ratios and deeper black colors. Since there is no backlight present, when these diodes no longer emit light, they no longer consume power, leading to better battery life.

    Now let us talk about how these technologies differ from one another.

    • OLED: Organic Light Emitting Diodes use organic materials.
    • MicroLED: Microscopic Light Emitting Diodes use inorganic materials. Devices can be made thinner because, according to Android Authority, there are no polarizing or encapsulation layers. Thanks to this, microLED components can also become very tiny, measuring less than 100µm. This is less than the width of a human hair.
    You can also think of microLEDs as shrunk-down versions of traditional LEDs. The underlying technology is not new, but shrinking the components and placing them on an array is a very expensive and, might we add, difficult process. This is why you will notice that microLED technology is found on smartwatches rather than smartphones, for now.


    Scaling microLED up to larger display sizes is one of the biggest challenges facing the display industry. That being said, you can imagine what Samsung must have gone through when it unveiled its 146-inch microLED TV called ‘The Wall’.

    Another reason why these displays are going to be so expensive to make is the higher resolutions that manufacturers will end up choosing. The higher the resolution, the higher the soldering accuracy required, and that does not come cheap.

    Challenges Encountered in Mass Producing microLED Panels
    As expensive and complex as the process is, another problem for display manufacturers is how to produce those millions of tiny LEDs and merge them with a smartphone’s circuit board. There are two ways to go about this.

    1. LEDs can be placed into a larger array, then soldered onto the smartphone’s logic board to form a complete display.
    Challenge: Accuracy issues in placing these tiny microLED components will arise.

    2. Wafer production methods can be used. This method is more cost-effective and can churn out more panels in a short amount of time.

    Challenge: This method is only suitable for devices that use a low-resolution display, which would explain how the Apple Watch family can be sold in millions of units. Using a wafer production method for a smartphone display will only be possible if the manufacturer opts for a lower resolution, and that is not going to happen in 2018. The highest resolution of the Apple Watch Series 3 is 312 x 390, and that is for the 42mm option. Imagine a microLED panel with a resolution of 2160 x 1080: the difference in pixel count is miles apart, and so is the cost.
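    To put that resolution gap in numbers, here is a rough back-of-the-envelope comparison (assuming the usual three subpixels per pixel; purely illustrative arithmetic, not manufacturing data):

```python
# Rough comparison of how many individual LEDs would need to be placed
# for a smartwatch vs. a smartphone microLED panel, assuming three
# subpixels (R, G, B) per pixel. Purely illustrative arithmetic.

def subpixel_count(width, height, subpixels_per_pixel=3):
    return width * height * subpixels_per_pixel

watch = subpixel_count(312, 390)    # Apple Watch Series 3, 42 mm panel
phone = subpixel_count(2160, 1080)  # a typical FHD+ smartphone panel

print(f"watch: {watch:>12,} subpixels")
print(f"phone: {phone:>12,} subpixels")
print(f"ratio: ~{phone / watch:.0f}x more LEDs to place accurately")
```

    Every one of those roughly seven million smartphone subpixels is a physical LED that has to be positioned and bonded precisely, which is why wafer-scale methods that work for a watch do not yet translate to phones.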

    MicroLED vs OLED – Advantages and Disadvantages
    There are a lot of manufacturing hurdles that companies are going to encounter with microLED technology, but if it turns out to be a viable approach, then this display tech actually holds significant perks over OLED.

    Improved brightness-to-power ratio (measured in lux/W): What this means is that microLED will be able to achieve the same level of brightness as OLED while consuming less power. In fact, the latest iteration can be up to 90 percent more efficient than LCD and up to 50 percent more efficient than OLED.

    Longer lifespan: You have probably heard of the OLED burn-in issues on smartphones, with Pixel 2 XL owners reporting the problem frequently; microLED will not exhibit the same issue. It is even possible for this technology to last longer than LCD panels before you start to witness shifts in color patterns.

    Higher resolutions in smaller form factors: Smartphones with 4K panels could become a common sight, no longer limited to the Sony Xperia XZ series.

    Much faster response time: Measured in nanoseconds, roughly a thousand times faster than the microsecond-class response times of OLED screens.

    Color benefits: These range from a higher contrast ratio to a wider color gamut and higher brightness levels.


    The perks listed above are only half the story. Let us take a look at the disadvantages that come with using microLED displays.

    Very costly: 3-4 times more expensive than OLED panels.

    Though this is just a single disadvantage, it branches out into more problems over time. Companies will be reluctant to invest in the expensive facilities and machinery required to make this sort of panel. According to a source close to the matter, Apple nearly dropped out of pursuing microLED panel manufacturing because of the complications and ridiculous costs that accompany this sort of venture.

    THX goes to -> https://wccftech.com/microled-vs-oled-everything-explained
     
  13. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    ==
    Also, we can soon expect a new addition to the DX12/Vulkan APIs,
    and it will be free for all...

    "AMD is collaborating with Microsoft to help define, refine and support the future of DirectX12 and ray tracing.

    AMD remains at the forefront of new programming model and application programming interface (API) innovation based on a forward-looking, system-level foundation for graphics programming. We’re looking forward to discussing with game developers their ideas and feedback related to PC-based ray tracing techniques for image quality, effects opportunities, and performance."

    UPD:

     
  14. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    AMD’s Open Source Vulkan Ray Tracing Engine Debuting In Games This Year – Radeon Rays 2.0

    Hot on the heels of NVIDIA’s announcement of RTX, a GameWorks ray tracer supported on Volta and later generation GPUs, AMD has announced its own open source Vulkan based real-time ray tracing engine.

    Dubbed Radeon Rays, the company’s ray tracing developer suite now supports real-time ray tracing as of version 2.0.
    The new engine is also compatible with OpenCL 1.2. Built on Vulkan, Radeon Rays 2.0 leverages the API’s advanced support for asynchronous compute to make real-time ray tracing a reality.
    AMD is offering Radeon Rays 2.0 for free; the latest version of the SDK can be downloaded directly from GitHub.

    AMD’s Open Source Vulkan Ray Tracing Engine Debuting In Games This Year
    Contrary to early reports, Radeon Rays 2.0 is not just a developer tool for professional 3D artists; it is also a tool that game developers can use to implement real-time ray tracing in their games for photorealistic lighting and shadow effects.

    In an interview with Golem.de, AMD revealed that it expects the feature to make it into the gaming realm this year. In fact, PC gamers with high-end graphics cards will be able to enjoy this computationally demanding yet visually stunning feature in as soon as a couple of months, just by ticking the “Ultra” setting in an as-yet-unnamed game, the company confirmed.
    So stay tuned folks, looks like PC graphics will be getting a healthy dose of excitement this year.
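    To get a feel for the kind of math such an engine accelerates, here is a minimal ray-sphere intersection test in Python. This is generic illustrative code, not the actual Radeon Rays API (which is a C++ SDK with OpenCL and Vulkan backends):

```python
# Minimal ray-sphere intersection: the kind of primitive visibility test
# a ray tracing engine evaluates millions of times per frame.
# Generic illustrative code, not the Radeon Rays API.
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance to the nearest hit along the ray, or None.
    `direction` must be a normalized 3-component vector."""
    oc = [o - c for o, c in zip(origin, center)]   # origin relative to center
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c          # a == 1 because direction is normalized
    if disc < 0.0:
        return None                 # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None   # ignore hits behind the ray origin

# A ray shot down the z-axis at a unit sphere centered 5 units away.
hit = intersect_sphere((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 5.0), 1.0)
print(f"hit distance: {hit}")  # -> 4.0, the sphere's front surface
```

    A real engine wraps tests like this in acceleration structures and runs them in parallel on the GPU, which is where Vulkan's asynchronous compute support comes in.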
     
  15. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV

  16. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    Editorial:

    Also, this whole Ray Tracing thing is IMO a sign of a lack of new & fresh ideas to push the boundaries of GFX forward.
    It will just be another 'Sky' gimmick to put a sticker on and price at ~1000€ :D

    What do gamers need?
    We need coders and good use of the new APIs DX12/Vulkan + new real features (not the old, inefficient ones -> http://www.pouet.net/prod.php?which=13090 )
    I bet SM/VM (Shader/Vertex) Model 6.0 will give very good IQ....
    Yup, raw Powah is not the way to go (hmm, maybe it is, but for 499€ max lol)

    So we are still waiting for something really new :p
    The Hype machine for next gen is now Ray Tracing.

    NeO Out Peace :D
     
    Last edited: Mar 24, 2018
  17. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    AMD has been working on ray tracing acceleration for a while (e.g. FireRays, now Radeon Rays, from a couple of years ago).



    ATI/AMD also had some real time stuff back in DX11 ;)
     
  18. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    and ->

    AMD/ATi RV770 Ruby demo video - Aug/2008



    and ->

    Simulating Dynamic Skin Microgeometry SIGGRAPH 2015
     
    HK-1 likes this.
  19. OnnA

    OnnA Ancient Guru

    Messages:
    17,952
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    Analyst firm Susquehanna has cut AMD and NVIDIA's share price targets in the wake of confirmed reports of Bitmain's upcoming Ethereum ASIC. There has been talk of such a product for months - and some actual silicon that might support it has reportedly surfaced as well.
    Susquehanna, through analyst Christopher Rolland in a note to clients Monday, cited their travels in Asia as a source of information.

    This has brought confirmations that "(...) Bitmain has already developed an ASIC [application-specific integrated circuit] for mining Ethereum, and is readying the supply chain for shipments in 2Q18."
    And it doesn't seem Bitmain is the only company eyeing yet another extremely lucrative ASIC mining market:
    "While Bitmain is likely to be the largest ASIC vendor (currently 70-80% of Bitcoin mining ASICs) and the first to market with this product, we have learned of at least three other companies working on Ethereum ASICs, all at various stages of development."


    Rolland believes Bitmain's specialized chip offering for Ethereum will hurt demand for PC graphics cards - as well it should, since it is as much a status quo breaker as Bitcoin mining ASICs were at the time of their introduction.

    So? Vega for 499€ :D for those who need it :p
     
    Evildead666 and Fox2232 like this.
  20. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    Yeah, this is looking good :)
    Graphics card prices back where they should be, but memory prices may take a hit - the ASIC might be using HBM, which could tie that memory up for a while.
    Luckily there is GDDR6, which probably won't be used by miners...
     

Share This Page