Review: Intel Core i9 7900X processor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 19, 2017.

  1. Venix

    Venix Ancient Guru

    Messages:
    3,440
    Likes Received:
    1,944
    GPU:
    RTX 4070 Super
    I got my 3770K, what is it now, five years ago? A bit later I got my Mugen 3 and pushed it up to 4.3 GHz at the maximum stock voltage. Back then my CPU felt like Rambo; now it's more like the veteran soldier you can count on, the one everybody respects when he walks onto the field. That said, my veteran has one more GPU upgrade left in it, to a 1060/580, or with the mining craze it might end up being a 2060/680/670.
     
  2. Prince Valiant

    Prince Valiant Master Guru

    Messages:
    819
    Likes Received:
    146
    GPU:
    EVGA GTX 1080 ti
    As per TPU's last PCIe test, the regular 1080 loses around 5% on average across common resolutions on PCIe 1.1 x16.

    PCIe x16, whatever the generation, constantly being well ahead of GPUs is a good thing, since it means one less thing to worry about :nerd:.
     
  3. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Yeah, exactly. I'd rather have the bus be 10x faster than necessary than have to worry about it bottlenecking my GPU upgrades. I wish they'd take a similar approach with HDMI/DP and the like.
     
  4. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    I'm not saying it's a bad thing that PCI-Express gets updated with new technology. I'm saying that, for how many people freak out about how many PCI-Express lanes their motherboard has, it's unwarranted. There is no reason to worry about it, especially when it comes to GPUs. Yet everyone does.
     

  5. ivymike10mt

    ivymike10mt Guest

    Messages:
    226
    Likes Received:
    12
    GPU:
    GTX 1080Ti SLI
    Both CPUs make perfect sense.
    As good as the R7 1700 is for work, the Core i7-7700K is good for gaming.
    You're forgetting the i7-7700K has a 4.5 GHz turbo boost.

    The 2500K was a good CPU, but if you compare it with the i7-7700K in gaming or rendering, it's a completely different level of performance. I saw a ~90% performance increase.

    You're comparing a 4-core (14nm) with a 4-core (32nm) as if every 4-core CPU had similar performance.
     
  6. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,973
    Likes Received:
    4,341
    GPU:
    HIS R9 290
    I could not agree more. Despite the repeated proof that PCIe 3.0 @ x8 is enough for nearly all (if not 100%) of current workloads, people still scoff at motherboards that split bandwidth between slots.

    Let's look at this from a different perspective:
    No matter how expensive and fancy your PC is, it's puny compared to what some enterprise PCs have. For whatever reason, only now are people ogling 16-core CPUs, when they've been around for at least 6 years (such as the Opteron 6274). Meanwhile, Intel has 24-core CPUs. Then imagine having multiples of those CPUs on a single board. We fawn over M.2 as though it's the fastest thing available, while there are enterprise SSDs that operate at over 2GB/s. We're wooed by a Titan Xp, when there are GPUs with 32GB of VRAM, or others with a built-in SSD controller. We think 10 SATA ports is a lot, when there are SAS controllers that support 32 drives, some of which need their own RAM for cache. We're excited over 4Gbps Ethernet, when servers had 10Gbps fiber optic years ago.

    And despite all this super-powered hardware, very rarely will you ever find anything electrically configured as a full x16 slot.


    So, what exactly is it that people do that utilizes 16 lanes of PCIe 3.0?
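    To put some rough numbers on that, here's a quick back-of-the-envelope sketch (theoretical per-direction figures using the commonly quoted per-lane rates, not measurements):

    ```python
    # Rough theoretical per-direction PCIe bandwidth by generation and link width.
    # Per-lane figures are the usual effective rates after encoding overhead
    # (8b/10b for gen 1/2, 128b/130b for gen 3).
    PER_LANE_GB_S = {
        "PCIe 1.x": 0.25,   # 2.5 GT/s -> ~250 MB/s per lane
        "PCIe 2.0": 0.50,   # 5.0 GT/s -> ~500 MB/s per lane
        "PCIe 3.0": 0.985,  # 8.0 GT/s -> ~985 MB/s per lane
    }

    for gen, per_lane in PER_LANE_GB_S.items():
        for lanes in (4, 8, 16):
            print(f"{gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")

    # PCIe 3.0 x8 comes out to roughly 7.9 GB/s each way, and even PCIe 1.1 x16
    # (~4 GB/s) only costs a 1080 a few percent in the TPU test mentioned above,
    # which is why hardly anything actually needs a full 16 lanes of gen 3.
    ```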
     
  7. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    I think it's similar to the situation with NVMe SSDs and USB-C. Neither of these technologies is particularly useful today, yet people fawn over them, and many consider them essential.

    NVMe SSDs provide no practical speed advantage over SATA SSDs (including in modern games), but high-end/enthusiast builders consider them almost essential. If a motherboard had no M.2 slot for an NVMe SSD, would you buy it?

    When the new Surface Pro was released the vast majority of the comments were about the lack of a USB-C port. Despite the fact that the vast majority of USB accessories are USB-A (including high-end gaming keyboards and mice), people really want USB-C for some reason. My desktop system has a USB-C port and not once have I needed it for anything, so this attitude baffles me.

    The situation is the same with quad-channel memory. Despite quad-channel providing next to no performance benefits compared to dual-channel for the vast majority of applications, I'm willing to bet every high-end gamer with a quad-channel-capable motherboard is using quad-channel. Even if it costs extra, they will go for it, since they don't want to limit themselves to "only" dual-channel when the board can do quad.

    IMO, people do not like compromises, and splitting the PCIe lanes looks like a compromise. If they can help it, they will always opt for the full 16 lanes (whether they need it or not is a different matter).
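    On the quad- vs dual-channel point, a minimal sketch of the theoretical peak numbers, assuming DDR4-2666 purely as an example (the real-world gap is far smaller for most desktop workloads):

    ```python
    # Theoretical peak bandwidth = transfer rate x bus width x number of channels.
    # DDR4-2666 is assumed here only as an example configuration.
    TRANSFERS_PER_S = 2666e6   # DDR4-2666: 2666 MT/s
    BYTES_PER_TRANSFER = 8     # 64-bit channel = 8 bytes per transfer

    for channels in (2, 4):
        gb_s = TRANSFERS_PER_S * BYTES_PER_TRANSFER * channels / 1e9
        print(f"{channels}-channel DDR4-2666: ~{gb_s:.0f} GB/s peak")

    # ~43 GB/s dual vs ~85 GB/s quad on paper, but since most games and desktop
    # apps don't come close to saturating dual-channel, the extra channels
    # rarely show up in benchmarks.
    ```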
     
  8. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,973
    Likes Received:
    4,341
    GPU:
    HIS R9 290
    Yeah, no kidding. I remember someone who avoided SATA like the plague and called it archaic. He acted like M.2 was better than SATA in every way, even though there are things SATA does better. I then linked some real-world benchmarks and basically said "I'd rather save myself the extra $100; I can deal with waiting an extra second on my load times."

    Same here; I have just four USB 3.0 devices, two of which are drives and the other two are for VR. I have yet to find any device that takes advantage of USB 3.1's bandwidth. Note: I know there are USB 3.1 drive enclosures, but to my understanding they still don't saturate it.

    USB-C is nice in many ways, and I understand it isn't going to get popular if it isn't implemented, but at the same time I have lived my entire life never feeling the need for it. If people are really so upset about not having Type-C, just buy this:
    http://www.ebay.com/itm/USB-3-1-Typ...935682?hash=item1a2b612202:g:p5EAAOSwOgdYpnHh

    Haha, it's like you're reading my mind. But there is one exception: IGPs. AMD's APUs are extremely hungry for memory bandwidth; their performance scales very linearly with it, and there's a lot of untapped potential in existing APUs. Intel's IGPs seem to be less starved for bandwidth, but they still benefit from it.
     
  9. TheF34RChannel

    TheF34RChannel Guest

    Messages:
    277
    Likes Received:
    3
    GPU:
    Asus GTX 1080 Strix
    Thanks for the review! Coffee Lake-S it is for me then (gaming rig).
     
  10. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    The only reason I wanted a motherboard with a USB-C port is that my phone uses USB-C. It's much easier to just grab the charging/data cable from my phone's wall-wart than to find a USB-A to USB-C cable just to plug my phone into my computer. Also, my work laptop has a USB-C port... lol. I don't like my laptops having capabilities that my desktop doesn't.
     

  11. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    My phone uses USB-C as well, but I rarely use the port since I charge wirelessly. In fact, the only time that I use the port is when I connect it to my car for Android Auto, and I need an adapter for that (I also need an adapter to connect it to my external charger, which uses USB-A). So far, the port has been more troublesome than anything else (dongle hell and all). USB-C may be the future, but that future is a long time away, and in no way do I consider it an essential port to have for modern devices.
     
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    The worst part about it, in my opinion, is that when you confront people about why they need so many PCI-Express lanes, they'll list all the devices in their PC. Typically it's something like:

    1 x16 GPU
    1 x1 soundcard (which doesn't come close to needing the bandwidth of one PCI-E 3.0 lane)
    1 x4 NVMe drive

    In which case, what they actually "need" is more like 13 lanes (counting the GPU at x8, which is plenty, as I'll get to below), and the soundcard only gets a whole lane because, as far as I know, you can't divide one; it's not about speed.

    or

    2 x16 GPUs
    1 x1 soundcard
    2 x4 NVMe drives

    So at that point you "need" 25 PCI-Express lanes. The key word here is NEED: you don't even really need x8 of PCI-Express 3.0 for a GPU, since x4, while not ideal, wouldn't doom your computer, but let's stick with x8, since there's generally about a 1 fps difference between x8 and x16, even when it's 189 fps vs 190 fps.

    You point this out to them, and it's "Oh no, no, I NEED 41 PCI-Express lanes!" You ask why, after explaining that they don't, and they end up replying with "You can't tell me what I need! I don't need to justify my needs to you! I need 41 PCI-Express lanes at minimum, and that's final!"

    I just don't get it. It's like people have it set in their minds that PCI-Express lanes are life itself, and that without more bandwidth than they will ever need, they will never be happy... I don't get it.

    Personally, I have a Ryzen. It has 16 PCI-Express 3.0 lanes (or 20? 24? I can't quite work out how many this chipset/CPU has...), and I have one x16 GPU, one x1 soundcard and one x4 NVMe drive. Not one device is throttled because of the others.
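    For what it's worth, here's a quick tally of the two example configurations above, just as a sketch (the "needed" figure assumes x8 is plenty for a GPU, per the argument above; the tally function is purely illustrative):

    ```python
    # Tally PCIe lanes for the two example builds above.
    # "as spec'd" counts each GPU at x16; "needed" counts each GPU at x8.
    def tally(gpus, nvme_drives, soundcards=1):
        as_specd = gpus * 16 + nvme_drives * 4 + soundcards * 1
        needed = gpus * 8 + nvme_drives * 4 + soundcards * 1
        return as_specd, needed

    for label, gpus, nvmes in (("1 GPU + 1 NVMe + soundcard", 1, 1),
                               ("2 GPUs + 2 NVMe + soundcard", 2, 2)):
        as_specd, needed = tally(gpus, nvmes)
        print(f"{label}: {as_specd} lanes as spec'd, ~{needed} actually needed")

    # 1 GPU  + 1 NVMe + soundcard: 21 as spec'd, ~13 needed
    # 2 GPUs + 2 NVMe + soundcard: 41 as spec'd, ~25 needed
    ```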
     
  13. ManofGod

    ManofGod Ancient Guru

    Messages:
    1,591
    Likes Received:
    111
    GPU:
    Sapphire R9 Fury Nitro
    Eh, I am happy with my Ryzen 1600 and 1700x builds. :) Competition is a great thing. :)
     
  14. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    SLI requires x8 minimum, IIRC (an artificial limit). Like I said earlier, 20 is more than enough for me, which I believe is what Ryzen offers. The reason I say 20 is that I may go multi-GPU again and don't want an NVMe SSD to close that door. Look, I get it, you feel the need to tell everyone else what they need. So what is your opinion of TR having 64 lanes?

    And yes, there is absolutely no reason to have your soundcard on CPU-fed lanes.
     
    Last edited: Jun 24, 2017
  15. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Unfortunately, the Nexus 5X and 6P don't support wireless charging. The chargers that come with the phones have USB-C ports themselves.

    I agree that having the port isn't essential, but given the option I'd rather have it than not.
     

  16. Matt26LFC

    Matt26LFC Ancient Guru

    Messages:
    3,123
    Likes Received:
    67
    GPU:
    RTX 2080Ti
    My OnePlus 3T has a USB-C port too! Modern platforms should start adopting it; having one USB-C port isn't going to **** things up, and the more quickly it's adopted the better imho.
     
  17. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Exactly. There are already flash drives that can make reasonable use of USB-C, and phones are adopting it too, though not necessarily configured for full speed. There's no reason not to have at least one USB-C port these days.
     
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,973
    Likes Received:
    4,341
    GPU:
    HIS R9 290
    Like I said before, there are Type-A to Type-C converters for super cheap. Even if a modern motherboard for whatever reason doesn't include a Type-C port, it isn't hard to get one, or several if you happen to need them.
     
  19. nosirrah

    nosirrah Guest

    Messages:
    7
    Likes Received:
    0
    GPU:
    GTX 980M
    Those Type-C ports are really cool. I just built a setup for my sister: she connects one cord to her laptop and the screen jumps to a 1440p monitor, with 2.1 speakers, a wireless keyboard and mouse, a Blu-ray drive and backup storage all activated.

    The only improvement I want now is laptop charging through Type-C, so I can power the hub itself and have one less cord going to the laptop.
     
  20. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    Most of Intel's historical competitors started around the same time Intel did (the 1960s) and competed with them for decades, but the vast majority weren't in the right place at the right time during the tech boom of the late '80s and '90s, when Intel became the market leader. Everyone had their own vision of which processing technology would eventually become the industry standard (AIM PowerPC, the ARM Acorn RISC Machine, DEC Alpha, Intel, etc.), and Intel became that standard because theirs was the best overall.

    Additionally, the vast majority of the market competitors, Intel included, were nothing more than clones of the IBM PC, which IBM reluctantly allowed because they were under the dark cloud of 20 years of antitrust litigation by the US Government and didn't want to get dismantled. In the mid '80s, IBM lost control of the IBM PC standard to the companies cloning its hardware, who pushed IBM out of the market by undercutting its prices. During this period, the clone companies were bound by IBM's "Second Source" requirement, which mandated that Intel 'share success' with AMD and other companies. Intel wanted to break out of the agreement and let each company's products rise or fall on their own merits, which ended AMD's partnership with Intel.

    Once Intel became the standard, companies like VIA, Cyrix, and AMD cloned Intel's chips through the mandated technology-sharing agreement, until Intel finally put an end to the sharing, prompting AMD to sue Intel. AMD lost the suit and spent the next five years reverse-engineering Intel's 386, since they were no longer given the tech directly. AMD has done a wonderful job ever since of portraying themselves as a victim of evil Intel, even though the company has spent five of the past seven decades riding on other companies' accomplishments.
     
