Intel Expands on Xe Product positioning - Shows DG1 Development card photos and Xe Slides

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 9, 2020.

  1. drac

    drac Ancient Guru

    Messages:
    1,756
    Likes Received:
    31
    GPU:
    Strix RTX2080 Ti OC
Sadly not really interested unless the card supports G-Sync fully (not just G-Sync Compatible). Proper G-Sync is just too good to give up; it was a game changer when it was released and still is.
     
  2. TechEnthusiast

    TechEnthusiast Member

    Messages:
    13
    Likes Received:
    1
    GPU:
    RTX 2080ti
    The other Sync options are not THAT bad tho.
    Yes, G-Sync is slightly better, but do you really notice the difference? Have you tried?

    Unless I see Freesync and G-Sync side by side in a comparison video, I personally have a hard time making out which is which. Even if you learn the differences and try to really look for them, it is hard to spot in most games.
That is basically why so many people did not bother with G-Sync when it had the huge price premium, and still ignore the G-Sync "Ultimate" version now. It is superior on paper, no way around that, but meh... what good is something that you hardly notice, even if you try? ;-)

    (Not gonna lie tho, I enjoy G-Sync anyways. But I am also kinda silly and spend way too much on tech that I don't need, but want anyways.)
     
  3. XenthorX

    XenthorX Ancient Guru

    Messages:
    2,985
    Likes Received:
    888
    GPU:
    EVGA XCUltra 2080Ti
    Picture me intrigued
     
  4. rl66

    rl66 Ancient Guru

    Messages:
    2,405
    Likes Received:
    187
    GPU:
    Sapphire RX 580X SE
??? There are 3 DP and 1 HDMI (but OK if they're fake ones, just for show... ;) )
I would have said you have to judge it by its maximum power consumption compared to what you get on screen...
     

  5. drac

    drac Ancient Guru

    Messages:
    1,756
    Likes Received:
    31
    GPU:
    Strix RTX2080 Ti OC
For me personally, maybe I'm sensitive, G-Sync is quite a bit better than adaptive sync. I've owned both, so yep, I have used proper G-Sync and "G-Sync Compatible" aka adaptive sync.

Adaptive sync blurs/smears the image more than G-Sync, especially at high FPS. The LFC also kicks in at a higher FPS than G-Sync; well, it did on my Acer Nitro XV273K. Lastly, it just plain doesn't work as well as G-Sync: gameplay can still stutter and produce tearing, while with proper G-Sync I've never had a single issue.

There really isn't that much of a premium for G-Sync itself; it's the other features that come with G-Sync monitors which can drive up the price, on the high-end models at least. The way I look at it: your monitor is an incredibly important part of the PC, and with a bad one, what's the point of having nice hardware? Also, how often do you get a new one? Not very often. I'd say the premium for G-Sync is worth it.
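As an aside on the LFC behaviour mentioned above: Low Framerate Compensation is usually described as the driver repeating frames whenever the game frame rate drops below the panel's minimum VRR refresh, so the effective refresh rate stays inside the VRR window. Here is a minimal sketch of that idea in C, assuming a made-up 48-144 Hz VRR range; it is illustrative only, not any vendor's actual implementation:

```c
#include <stdio.h>

/* Sketch of how LFC is commonly described: repeat each game frame enough
 * times that the multiplied refresh rate lands back inside the VRR window.
 * The 48-144 Hz range below is an assumed example, not a spec. */
static int lfc_multiplier(double game_fps, double vrr_min_hz, double vrr_max_hz) {
    int m = 1;
    /* Keep doubling/tripling the frame while we are still below the
     * minimum refresh and the next multiple still fits under the maximum. */
    while (game_fps * m < vrr_min_hz && game_fps * (m + 1) <= vrr_max_hz)
        ++m;
    return m;
}

int main(void) {
    double fps_samples[] = { 30.0, 40.0, 55.0, 100.0 };
    for (int i = 0; i < 4; ++i) {
        int m = lfc_multiplier(fps_samples[i], 48.0, 144.0);
        printf("%.0f fps -> panel refreshes at %.0f Hz (each frame shown %dx)\n",
               fps_samples[i], fps_samples[i] * m, m);
    }
    return 0;
}
```

With those assumed numbers, 30 fps gets shown twice per frame (60 Hz effective), while 55 fps and above sit inside the window and need no compensation.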
     
  6. bernek

    bernek Ancient Guru

    Messages:
    1,529
    Likes Received:
    52
    GPU:
    2080TI/1080TI/VEGA
I think I need to try "real" G-Sync then, since I've only had gaming FreeSync monitors that work as G-Sync Compatible... For example, comparing a FreeSync monitor + Vega 64 against a 1080 Ti + FreeSync monitor (G-Sync Compatible), I didn't notice any difference...
     
  7. AlmondMan

    AlmondMan Master Guru

    Messages:
    502
    Likes Received:
    42
    GPU:
    5700 XT Red Dragon
Gamers Nexus did some high-speed recording of the Destiny 2 gameplay and attempted some frame counting. At all-low settings at 1920x1080, 100% resolution scale, they were seeing 30-45 FPS in a map that was empty and without enemies. They added that it also had significant input lag.

    This is of course pre-production performance. But we probably shouldn't expect much more than this.
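For context on how that kind of frame counting from high-speed footage works: the camera records at a known rate, and the game's frame rate is estimated from how many unique game frames appear across a counted span of camera frames. A rough sketch in C with made-up numbers (not Gamers Nexus's actual data or tooling):

```c
#include <stdio.h>

/* Estimate the game's frame rate from a high-speed recording:
 * unique game frames observed, divided by the real time spanned
 * by the counted camera frames. */
static double estimate_game_fps(int unique_game_frames,
                                int camera_frames_spanned,
                                double camera_fps) {
    double elapsed_seconds = camera_frames_spanned / camera_fps;
    return unique_game_frames / elapsed_seconds;
}

int main(void) {
    /* Hypothetical example: 38 unique game frames seen across 1000 camera
     * frames captured at 1000 fps -> roughly 38 fps in-game. */
    printf("Estimated game fps: %.1f\n", estimate_game_fps(38, 1000, 1000.0));
    return 0;
}
```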
     
  8. asturur

    asturur Master Guru

    Messages:
    647
    Likes Received:
    164
    GPU:
    Geforce Gtx 1080TI
It's not that I am blind; the article states:

"While the plating is ready for display-port and HDMI connectors, these are factual missing for the development product as well."
So those are placeholders, fake ports.
     
  9. TechEnthusiast

    TechEnthusiast Member

    Messages:
    13
    Likes Received:
    1
    GPU:
    RTX 2080ti
Not gonna lie: I was smiling at every comment claiming that, since it is stated so clearly in the article itself. Plenty of commenters never seem to read the article; they just look at the pictures and make up their own version. ;-)
     
  10. isidore

    isidore Ancient Guru

    Messages:
    6,234
    Likes Received:
    29
    GPU:
    RTX 2080TI GamingOC
Sorry Intel, but with Raja Koduri as head of GPU you will be 10 years ahead of your time on design, but 5 years behind on performance. History doesn't lie...
And looking at that design pic... they want an all-in-one that can handle every type of GPU workload.
Looks like Raja has settled in comfortably. The AMD era all over again: he has the impression that the hardware will be the best in every category, but in reality it will be average.
No matter how much time Intel gives him, he is just a visionary. Intel needs a practical head there who can produce performance GPUs that sell, not fantasy.
     
    Last edited: Jan 10, 2020

  11. Dribble

    Dribble Member Guru

    Messages:
    123
    Likes Received:
    48
    GPU:
    Geforce 1070
Also, to be fair to Raja: while at AMD he had his hands tied (well, more his pockets emptied). He can't have had the funding for his grand visions, which IMO was the reason he left. At Intel there is plenty of money, so this is his chance to show what he can do...
     
  12. Denial

    Denial Ancient Guru

    Messages:
    12,748
    Likes Received:
    1,989
    GPU:
    EVGA 1080Ti
This is based on what, exactly? GCN was designed by Eric Demers, who left AMD in 2012. AMD then clearly focused its entire budget on CPUs, leaving Raja to essentially manage GCN with minimal resources across multiple different product sectors. The fact that AMD was at all competitive during this time is honestly incredible. He also completely turned AMD's driver stack around: AMD's drivers were absolute trash when he took over, and now most people consider them on par with Nvidia's.

    The Raja hate makes no sense to me given the cards he was dealt.
     
    schmidtbag likes this.
  13. TechEnthusiast

    TechEnthusiast Member

    Messages:
    13
    Likes Received:
    1
    GPU:
    RTX 2080ti
Not a fan of Raja myself, since he never delivered anything I would consider good. But truth be told: he also never got the chance to deliver something good.
Intel is the first time he has the budget to actually work on his ideas, and the first time he can work on a product from start to finish.

    We will see how it turns out, but Intel will have hundreds of people on the task, not just that one guy. He may have the lead, but a lot of great people will make sure he gets this right.
     
  14. isidore

    isidore Ancient Guru

    Messages:
    6,234
    Likes Received:
    29
    GPU:
    RTX 2080TI GamingOC
I hope you both ^^ are right. I don't hate the man, just saying he looks more like a visionary than a real-world implementer. I was under the impression he was the big backer of HBM GPUs, which, to be fair, were never backed by software and were never used to their full potential. And it doesn't matter whether that was because they were expensive or hard to work with; what matters is that Nvidia didn't waste time on that and delivered. So yeah, maybe he is not the perfect man for the job at Intel right now, when they need something that performs decently and they need it fast.
     
  15. TechEnthusiast

    TechEnthusiast Member

    Messages:
    13
    Likes Received:
    1
    GPU:
    RTX 2080ti
    Why would they need it fast?
    They have all the time in the world. There is zero pressure to release a GPU now for Intel, or is there? They can take all the time they need to make it as good as they want it.
    The only question will be: Once it is good enough for Intel, will it also be good enough for gamers? ;-) We tend to drastically overvalue our own market position when it comes to PC components. Gamers may turn a little profit for them, but in the grand scheme of things, we are meaningless.
    Gamers always talk about "value", want "more power" and "lower prices", even tho the margins are already slim as hell compared to the much bigger markets.

    So in short:
    I am not sure if Intel will really try to focus on the gaming market, when it is much easier and more lucrative to go for servers first. If that works out well, they can downscale for gamers and see if the product can find a place.
     

  16. sykozis

    sykozis Ancient Guru

    Messages:
    21,563
    Likes Received:
    877
    GPU:
    MSI RX5700
Considering it's not a finalized product, and is only being made available to software developers, that's not really a problem. The card, as pictured (images 10 and 11), is not a production card. It's also based on Xe LP. It reportedly doesn't even have outputs on it... It doesn't take a lot of performance to optimize code for an architecture...

Actually, if you read the article... it's destined for software developers and ISVs.
Also, there are only 2 images of the actual card, and they don't show any ports.

We don't see the plating on the actual card, only in the renders. Images 10 and 11 are the actual card. Images 7, 8, 9 and 12 are renders of the card. The production cards are expected to have 3 DisplayPort and 1 HDMI, the same as the plating on the SDV card... These cards are not intended for testing graphics performance. They're for developing, testing and optimizing compute code like OpenCL, etc., for the Xe architecture.
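To illustrate the kind of compute-side bring-up an ISV would start with on a card like this, here is a minimal OpenCL host sketch in C that just enumerates GPU devices and their compute units. It assumes a standard OpenCL runtime and headers; nothing in it is Intel- or Xe-specific:

```c
/* Minimal OpenCL device enumeration - the first step of any compute
 * bring-up. Build against an OpenCL SDK, e.g.: gcc query_devices.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
        return 1;
    if (num_platforms > 8) num_platforms = 8;

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char plat_name[256] = {0};
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof(plat_name), plat_name, NULL);

        cl_device_id devices[8];
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;  /* platform exposes no GPU devices */
        if (num_devices > 8) num_devices = 8;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char dev_name[256] = {0};
            cl_uint compute_units = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(dev_name), dev_name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(compute_units), &compute_units, NULL);
            printf("Platform: %s | Device: %s | Compute units: %u\n",
                   plat_name, dev_name, compute_units);
        }
    }
    return 0;
}
```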
     
  17. Size_Mick

    Size_Mick Master Guru

    Messages:
    499
    Likes Received:
    253
    GPU:
    Asus GTX 1070 8GB
  18. Venix

    Venix Maha Guru

    Messages:
    1,185
    Likes Received:
    407
    GPU:
    Palit 1060 6gb
Not sure what we are arguing about. If this is the actual card that will be released, it's a GT 1030 competitor. While I do not believe Intel will be able to compete at the high end, I hope they put up a good fight in the mid-range!
     
  19. deusex

    deusex Master Guru

    Messages:
    561
    Likes Received:
    47
    GPU:
    Nvidia 2080 Ti FE
PC Gamer got to test the sample, and they weren't impressed at all.
     
  20. EspHack

    EspHack Ancient Guru

    Messages:
    2,521
    Likes Received:
    53
    GPU:
    ATI/HD5770/1GB
Well, I'm hoping this DG1 card is their GT 1030 DDR4 equivalent, and that they don't need a whole year to finally bring guns to the $700+ market like AMD did last time.
     
