Intel Expands on Xe Product Positioning - Shows DG1 Development Card Photos and Xe Slides

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 9, 2020.

  1. drac

    drac Ancient Guru

    Messages:
    1,782
    Likes Received:
    38
    GPU:
    Gigabyte 3080 Ti
    Sadly, not really interested unless the card supports G-Sync fully (not just G-Sync Compatible). Proper G-Sync is just too good to give up; it was a game changer when it was released and still is.
     
  2. TechEnthusiast

    TechEnthusiast Member

    Messages:
    18
    Likes Received:
    4
    GPU:
    RTX 2080ti
    The other sync options are not THAT bad, though.
    Yes, G-Sync is slightly better, but do you really notice the difference? Have you tried?

    Unless I see FreeSync and G-Sync side by side in a comparison video, I personally have a hard time telling which is which. Even if you learn the differences and really look for them, they are hard to spot in most games.
    That is basically why so many people did not bother with G-Sync when it carried a huge price premium, and still ignore the G-Sync "Ultimate" version now. It is superior on paper, no way around that, but meh... what good is something you can hardly notice, even when you try? ;-)

    (Not gonna lie though, I enjoy G-Sync anyway. But I am also kinda silly and spend way too much on tech that I don't need but want anyways.)
     
  3. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,035
    Likes Received:
    3,405
    GPU:
    MSI 4090 Suprim X
    Picture me intrigued
     
  4. rl66

    rl66 Ancient Guru

    Messages:
    3,924
    Likes Received:
    839
    GPU:
    Sapphire RX 6700 XT
    ??? There are 3 DP and 1 HDMI (but OK if they are fake ones, just for show... ;) )
    I would say you have to judge it by its maximum power consumption compared to what you get on screen...
     

  5. drac

    drac Ancient Guru

    Messages:
    1,782
    Likes Received:
    38
    GPU:
    Gigabyte 3080 Ti
    For me personally (maybe I'm sensitive), G-Sync is quite a bit better than adaptive sync. I've owned both, so yes, I have used proper G-Sync and "G-Sync Compatible"/adaptive sync.

    Adaptive sync blurs/smears the image more than G-Sync, especially at high FPS. The LFC threshold is also set at a higher FPS than with G-Sync, at least on my Acer Nitro XV273K. Lastly, it just plain doesn't work as well as G-Sync: gameplay can still stutter and tear, while with proper G-Sync I've never had a single issue.

    There really isn't that much of a premium for G-Sync; it's the other features that come with G-Sync monitors which can drive up the price, on the high-end models at least. The way I look at it, your monitor is an incredibly important part of the PC; with a bad one, what's the point of having nice hardware? And how often do you get a new one? Not very often. I'd say the premium for G-Sync is worth it.
     
  6. bernek

    bernek Ancient Guru

    Messages:
    1,632
    Likes Received:
    93
    GPU:
    Sapphire 7900XT
    I think I need to try the "real" G-Sync then, since I've only had gaming FreeSync monitors that work as G-Sync Compatible... For example, comparing a FreeSync monitor + Vega 64 to a 1080 Ti + FreeSync monitor (G-Sync Compatible), I didn't notice any difference...
     
  7. AlmondMan

    AlmondMan Maha Guru

    Messages:
    1,036
    Likes Received:
    345
    GPU:
    7900 XT Reference
    Gamers Nexus did some high-speed recording of the Destiny 2 gameplay and attempted some frame counting. At all-low settings at 1920x1080 with 100% resolution scale, they were seeing 30-45 FPS in a map that was empty and had no enemies. They added that it also had significant input lag.

    This is of course pre-production performance, but we probably shouldn't expect much more than this.
     
  8. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    It's not that I am blind; the article states:

    "While the plating is ready for display-port and HDMI connectors, these are factual missing for the development product as well."
    So those are placeholders, fake ports.
     
  9. TechEnthusiast

    TechEnthusiast Member

    Messages:
    18
    Likes Received:
    4
    GPU:
    RTX 2080ti
    Not gonna lie: I was smiling at every comment claiming that, since it is so clearly stated in the article itself. Plenty of commenters never seem to read the article; they just look at the pictures and make up their own version. ;-)
     
  10. isidore

    isidore Guest

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    Sorry Intel, but with Raja Koduri as head of GPU you will be 10 years ahead of time on the design, but 5 years behind on performance. History doesn't lie..
    And looking at that design pic... they want an all-in-one that can handle all types of GPU workloads.
    Looks like Raja has settled in comfortably. The AMD era all over again: he has the impression that the hardware will be the best in every category, but actually it will be average.
    No matter how much time Intel gives him, he is just a visionary. Intel needs a real-world leader there who can produce performance GPUs that sell, not fantasy.
     
    Last edited: Jan 10, 2020

  11. Dribble

    Dribble Master Guru

    Messages:
    369
    Likes Received:
    140
    GPU:
    Geforce 1070
    Also, to be fair to Raja, while at AMD he had his hands tied (well, more his pockets emptied). He can't have had the funding for his grand visions, which IMO was the reason he left. At Intel there is plenty of money, so this is his chance to show what he can do...
     
  12. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    This is based on what, exactly? GCN was designed by Eric Demers, who left AMD in 2012. AMD then clearly focused its entire budget on CPUs, leaving Raja to essentially manage GCN with minimal resources across multiple different product sectors. The fact that AMD was at all competitive during this time is honestly incredible. He also completely turned AMD's driver stack around - AMD's drivers were absolute trash when he took over; now most people consider them on par with Nvidia's.

    The Raja hate makes no sense to me given the cards he was dealt.
     
    schmidtbag likes this.
  13. TechEnthusiast

    TechEnthusiast Member

    Messages:
    18
    Likes Received:
    4
    GPU:
    RTX 2080ti
    Not a fan of Raja myself, since he never delivered anything I would consider good. But truth be told, he also never got the chance to deliver something good.
    Intel is the first time he has had the budget to actually work on his ideas, and the first time he can actually see a product through from start to finish.

    We will see how it turns out, but Intel will have hundreds of people on the task, not just that one guy. He may have the lead, but a lot of great people will make sure he gets this right.
     
  14. isidore

    isidore Guest

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    I hope you both ^^ are right. I don't hate the man; I'm just saying he looks more like a visionary than a real-world implementer. I was under the impression he was the proponent of HBM GPUs, which, to be fair, were never backed by software and were never used to their full potential. And it doesn't matter whether that was because they were expensive or hard to work with; what matters is that Nvidia didn't waste time on that and delivered. So yeah, maybe he is not the perfect man for the job at Intel at the moment, when they need something that performs decently and they need it fast.
     
  15. TechEnthusiast

    TechEnthusiast Member

    Messages:
    18
    Likes Received:
    4
    GPU:
    RTX 2080ti
    Why would they need it fast?
    They have all the time in the world. There is zero pressure on Intel to release a GPU now, or is there? They can take all the time they need to make it as good as they want.
    The only question will be: once it is good enough for Intel, will it also be good enough for gamers? ;-) We tend to drastically overvalue our own market position when it comes to PC components. Gamers may turn a little profit for them, but in the grand scheme of things, we are meaningless.
    Gamers always talk about "value" and want "more power" and "lower prices", even though the margins are already slim as hell compared to the much bigger markets.

    So in short:
    I am not sure Intel will really try to focus on the gaming market when it is much easier and more lucrative to go for servers first. If that works out well, they can scale down for gamers and see if the product can find a place.
     

  16. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Considering it's not a finalized product and is only being made available to software developers, that's not really a problem. The card as pictured (images 10 and 11) is not a production card. It's also based on Xe LP, and it reportedly doesn't even have outputs on it... It doesn't take a lot of performance to optimize code for an architecture...

    Actually, if you read the article... it's destined for software developers and ISVs...
    Also, there are only 2 images of the actual card, and they don't show any ports.

    We don't see the plating on the actual card, only in the renders. Images 10 and 11 are the actual card; images 7, 8, 9 and 12 are renders of it. The production cards are expected to have 3 DisplayPort and 1 HDMI, the same as the plating on the SDV card... These cards are not intended for testing graphics performance. They're for developing, testing and optimizing compute code (OpenCL, etc.) for the Xe architecture; see the sketch below for the sort of thing that means in practice.
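    For context on that last point, here is a minimal, hypothetical sketch (not from the article, and nothing Xe-specific) of the kind of environment check a developer or ISV might run on such a card before any tuning work: it simply enumerates OpenCL platforms and GPU devices through the standard OpenCL C host API so that a new architecture's device can be confirmed visible. The file name, buffer sizes and build line are arbitrary assumptions.

        /* list_cl_gpus.c - hypothetical sketch: enumerate OpenCL platforms and GPU devices.
           Build on Linux with something like: gcc list_cl_gpus.c -o list_cl_gpus -lOpenCL */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void) {
            cl_uint num_platforms = 0;
            /* first call only asks how many platforms the runtime exposes */
            if (clGetPlatformIDs(0, NULL, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
                printf("No OpenCL platforms found.\n");
                return 1;
            }
            cl_platform_id platforms[16];
            if (num_platforms > 16) num_platforms = 16;  /* arbitrary cap for the sketch */
            clGetPlatformIDs(num_platforms, platforms, NULL);

            for (cl_uint p = 0; p < num_platforms; ++p) {
                char pname[256] = {0};
                clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(pname), pname, NULL);
                printf("Platform %u: %s\n", p, pname);

                cl_uint num_devices = 0;
                if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 0, NULL, &num_devices) != CL_SUCCESS
                    || num_devices == 0)
                    continue;  /* this platform exposes no GPU devices */
                cl_device_id devices[16];
                if (num_devices > 16) num_devices = 16;
                clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, num_devices, devices, NULL);

                for (cl_uint d = 0; d < num_devices; ++d) {
                    char dname[256] = {0};
                    cl_uint cus = 0;
                    clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(dname), dname, NULL);
                    clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS, sizeof(cus), &cus, NULL);
                    printf("  GPU %u: %s (%u compute units)\n", d, dname, cus);
                }
            }
            return 0;
        }

    On a machine with such a card installed, a developer would expect to see the vendor's platform and the GPU device listed; the actual kernel writing and tuning only starts once a check like this passes.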
     
  17. Size_Mick

    Size_Mick Master Guru

    Messages:
    630
    Likes Received:
    463
    GPU:
    Asus GTX 1070 8GB
  18. Venix

    Venix Ancient Guru

    Messages:
    3,440
    Likes Received:
    1,944
    GPU:
    Rtx 4070 super
    Not sure what we are arguing about; if this is an actual card that will be released, it's a 1030 competitor. While I do not believe Intel will be able to compete at the high end, I hope they put up a good fight in the mid-range!
     
  19. EspHack

    EspHack Ancient Guru

    Messages:
    2,795
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB
    Well, I'm hoping this DG1 card is their GT 1030 DDR4 equivalent and that they don't need a whole year to finally bring guns to the $700+ market like AMD did last time.
     
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
    This DG1 card will not reach retail; it's for a limited developer distribution, to learn the intricacies of Intel's driver and capabilities.
     
