
Possible PCIe successor Gen-Z Reaches Version 1.0 Development Stage

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 21, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    32,895
    Likes Received:
    1,872
    GPU:
    AMD | NVIDIA
  2. RonanH

    RonanH Member

    Messages:
    42
    Likes Received:
    5
    GPU:
    GTX760/2GB
    No Intel = No consumer products

    Do we need more than what PCIe can deliver currently?
    Looks like there are a few large-scale computing companies involved, though, so it might be something for the datacentre/supercomputer crowd.
     
  3. Barry J

    Barry J Ancient Guru

    Messages:
    2,660
    Likes Received:
    54
    GPU:
    MSI 1080ti GAMING X
    I like anything that pushes technology; it can only help improve desktop PCs in the long run. It would be nice if all PCIe connectors were 16x. We might not need it, but if hardware suppliers
    knew they could access that bandwidth, maybe they would produce faster and better hardware... or maybe not.
     
  4. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,964
    Likes Received:
    622
    GPU:
    -NDA +AW@240Hz
    Yes, we need more. Once we get PCIe 5.0, the bandwidth will be good enough for you to buy an APU system and then add another APU into a PCIe slot as an add-in card.

    Imagine buying a 4C/8T APU with 2048 SPs, and then, as time passes, adding a few more 4C/8T APUs with a 2048 SP GPU each. One may need a 140W CPU socket, but considering the year in which we'll have PCIe 5.0 ... maybe those will just be 95W APUs and 95W add-in cards.

    With that much PCIe bandwidth, it may be just a question of time until we see secondary system memory; IIRC there are already a few projects.
    Maybe even GPUs using HSA memory in another slot, and that slot may host RAM + an SSD as a large cache.
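    Rough back-of-envelope numbers for an x16 slot, assuming the usual per-lane transfer rates and 128b/130b encoding (a sketch, not official figures):

        # Rough PCIe x16 bandwidth per generation (128b/130b encoding assumed).
        LANES = 16
        gigatransfers = {"PCIe 3.0": 8, "PCIe 4.0": 16, "PCIe 5.0": 32}
        for gen, gt in gigatransfers.items():
            gbps = gt * (128 / 130) / 8 * LANES  # usable GB/s for an x16 link
            print(f"{gen} x16 ~ {gbps:.0f} GB/s")  # ~16, ~32, ~63 GB/s

    At roughly 63 GB/s, a PCIe 5.0 x16 link is in the same ballpark as dual-channel DDR4, which is why memory sitting behind the link starts to look plausible.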
     

  5. AcidSnow

    AcidSnow Master Guru

    Messages:
    381
    Likes Received:
    5
    GPU:
    VisionTek R9 290X
    I don't believe I've ever heard of Gen-Z. However, it sounds like a great idea, and if it works royalty-free, then that's a win-win, because the savings will be passed on to the OEMs and consumers, hopefully lowering the cost of PCs in the future.
     
  6. Kaarme

    Kaarme Maha Guru

    Messages:
    1,167
    Likes Received:
    173
    GPU:
    Sapphire 390
    When I began to read the article and got to the point where royalty-free was mentioned, I knew Intel wouldn't be aboard. Nvidia, too, loves its restricted in-house technologies these days, so their participation was pretty much 50-50. I guess they judged that since Intel isn't interested, it will have no private consumer future, and in the pro market Nvidia already has its proprietary NVLink, so they wouldn't want their left hand to compete with the right.
     
  7. Dragondale13

    Dragondale13 Maha Guru

    Messages:
    1,289
    Likes Received:
    70
    GPU:
    ZOTAC 1070 AMP!
    Great, now by the time I upgrade, the web will be filled with "Gen-Z is the way to go bro! PCIe is dead!"
    ...and I'll be feeling left out :( :p :D ...it's PCIe 2.0 vs 3.0 all over again :D :(!
     
  8. warlord

    warlord Maha Guru

    Messages:
    1,295
    Likes Received:
    241
    GPU:
    R9 390X MSI(Gaming)
    Now Gen-Z, next should be Gen-Super and then Gen-GT (aka Dragon Ball Z -> Dragon Ball Super -> Dragon Ball GT). I love all this innovation without a useful use case. Find me a proper reason for this much bandwidth to exist in the foreseeable future. Really.
     
    fantaskarsef likes this.
  9. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,964
    Likes Received:
    622
    GPU:
    -NDA +AW@240Hz
    Not so much; this is like the move from AGP to PCIe on graphics cards, but much more innovative.

    Intel & nV are not participating for very specific reasons. This new thing is a strong push towards HSA.
    Early birds will have to support two platforms and invest resources with a low rate of return.

    Intel & nV will feed on legacy clients while AMD pushes for the future (I think it won't be such a near future).
    And they'll make the leap once the market is large/proven enough.
     
  10. LiviuTM

    LiviuTM Member

    Messages:
    15
    Likes Received:
    0
    GPU:
    Gigabyte R9 290 OC 4GB

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    15,168
    Likes Received:
    648
    GPU:
    Zotac GTX980Ti OC
    I thought PCIe 4 and PCIe 5 were the successors?
     
  12. Dragondale13

    Dragondale13 Maha Guru

    Messages:
    1,289
    Likes Received:
    70
    GPU:
    ZOTAC 1070 AMP!
    What!? Those memories were even worse! :(:(:(
    I'm kidding, obviously. :D
     
  13. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    13,789
    Likes Received:
    356
    GPU:
    Nvidia Geforce GTX 960M
    Maybe in data centers. Consumer-grade stuff, no, not really.
     
    fantaskarsef likes this.
  14. tunejunky

    tunejunky Master Guru

    Messages:
    309
    Likes Received:
    51
    GPU:
    gtx 1070 / gtx 1080ti
    Reminds me of when people were skeptical of HDMI connectors on monitors.
    Obviously no one any of us knows has a direct application for this standard,
    but I remember when I didn't have an application (that was worth a darn... at first) for the Apple II.

    To use a cheesy line from "Field of Dreams"... "build it and they will come".
     
  15. wavetrex

    wavetrex Master Guru

    Messages:
    362
    Likes Received:
    106
    GPU:
    Zotac GTX1080 AMP!
    This might be the dawn of the truly modular computer.

    Need more CPU power? Stick in another CPU "unit".
    Need more GPU and higher frame rates for games? Add more GPU units.
    Need more I/O for your amazing media server? Stick in an I/O unit with extra SATA/M.2, etc.
    Need application-specific accelerators? Plug those in.

    One slot to rule them all!
    (And it should be hot-swappable... adding or removing units without even bothering to power down or reboot.)

    I can already picture some crazy high-end motherboard with 8 slots of this thing, all identical, that could go from 1 APU + 1 I/O to a multi-core beast with 6 CPUs, or to a massive gaming/processing machine with 4 GPUs, 2 CPUs, 1 I/O and 1 VR/sensor interface board, and the next day be swapped to a 4 I/O / 4 CPU video station for some massive 4K encoding power.
    I can only dream ;)
     
    tunejunky likes this.

  16. Aura89

    Aura89 Ancient Guru

    Messages:
    6,730
    Likes Received:
    185
    GPU:
    -
    Intel doesn't matter when it comes to this.

    If Intel never supports it in their chipsets, then third-party controllers will step in to make sure it works, increasing the price of those boards. It might be slower than intended because it's not handled by the chipset directly, but it'd still happen, and Intel motherboards would just be that much more expensive.

    The biggest and really main issue would be Nvidia, since it's going to be the adoption race of cards that determines whether this goes further. And since Nvidia isn't on board as of yet, that'll be quite a chunk of devices that will be fitted with PCI-Express instead.


    Honestly, we don't need all PCI-Express connectors to be 16x (that would ultimately be a waste of materials and cost). What would be better is for all PCI-Express connectors, except the 16x slots, to be open-ended and allow any size of card to be inserted, albeit at lower speeds if the card needs more lanes. This is already a thing, just not a universal thing.

    [images: examples of open-ended PCIe slots]

    There's even the option to fit the 16x slot clip (the back piece that holds the card in place).
     
    Last edited: Feb 21, 2018
  17. tsunami231

    tsunami231 Ancient Guru

    Messages:
    8,473
    Likes Received:
    141
    GPU:
    EVGA 1070Ti Black
    Maybe by the time I do a new build this will be the new standard?
     
  18. FlyBy

    FlyBy Member

    Messages:
    14
    Likes Received:
    1
    GPU:
    Asus 980GTX Poseidon WC
    Nice!
     
  19. Glidefan

    Glidefan Don Booze Staff Member

    Messages:
    12,400
    Likes Received:
    14
    GPU:
    GTX 1070 | 8600M GS
    I do think it's nice for this to get some traction for being open. It means lower cost in theory (unless the cost to implement the new controller is high).
    On the other hand, this makes me think that perhaps not all Gen-Z implementations will be created equal if companies are free to make their own implementations of the bus, leaving compatibility all over the place.
     
