
PCIe SSDs slowly replacing SATA3 SSD

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 22, 2019.

  1. Kaleid

    Kaleid Ancient Guru

    Messages:
    2,347
    Likes Received:
    132
    GPU:
    rx480 8GB
By command queue, you surely mean the access times? That's what makes mechanical hard drives extremely slow; they do have NCQ, but that's not enough in itself.
     
  2. tsunami231

    tsunami231 Ancient Guru

    Messages:
    9,350
    Likes Received:
    287
    GPU:
    EVGA 1070Ti Black
The common user doesn't need NVMe; they will never see enough difference between SATA 3 and NVMe to justify the extra cost. SSDs are still more expensive than they should be, and NVMe adds even more on top. If it cost less, it wouldn't matter. So I'm not really sure how it's going to replace SATA 3. They've been saying SSDs will replace HDDs for years, but price and capacity are still big reasons why HDDs are still used.

    I switched my OS drive from HDD to SSD years back, and that was a huge jump in performance. I just recently changed my data/game drive from HDD to SSD and see almost no difference between the two, at least not as far as most of my games go.
     
  3. Glidefan

    Glidefan Don Booze Staff Member

    Messages:
    12,478
    Likes Received:
    34
    GPU:
    GTX 1070 | 8600M GS
    Not just access time; you still have to issue the commands to read the files one after the other.
    With NVMe, I think the OS can issue one command to move *this many* files and start moving them all at once, instead of sequentially.
    NVMe drives can have multiple queues, and each queue has more depth.
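    A toy sketch of that "submit everything up front" idea in Python. This only mimics the concept with threads (real NVMe queueing happens in the driver and hardware); the function names and chunk size are made up for illustration:

    ```python
    import os
    from concurrent.futures import ThreadPoolExecutor

    def read_chunk(path, offset, size):
        # one "read command": fetch `size` bytes starting at `offset`
        with open(path, "rb") as f:
            f.seek(offset)
            return f.read(size)

    def read_all_queued(path, chunk=4096, workers=8):
        # submit every read command up front (a "deep queue") instead of
        # issuing one, waiting for it, then issuing the next
        total = os.path.getsize(path)
        offsets = range(0, total, chunk)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            parts = pool.map(lambda off: read_chunk(path, off, chunk), offsets)
            return b"".join(parts)
    ```

    On a spinning disk the seeks would still serialize; on an SSD the device can actually service many of these in flight at once, which is where the deep queues pay off.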
     
  4. The Goose

    The Goose Ancient Guru

    Messages:
    2,213
    Likes Received:
    35
    GPU:
    Evga 1080FTW
    same way you migrated to m.2
     

  5. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    506
    Likes Received:
    54
    GPU:
    RTX 2080Ti 11Gb
    I have my 2080 Ti at x8 just so I could connect my Optane directly to the CPU instead of going through the chipset, and there is no difference.
    Also, we have more PCIe lanes now; I have two x4 PCIe NVMe drives installed and have space for one more.

    I did that after reading these benchmarks:
    https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_PCI-Express_Scaling/3.html

    Basically there is no real-world difference, especially at 4K, and especially if you have a 60 Hz monitor.
     
  6. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,209
    Likes Received:
    563
    GPU:
    1080 Gaming X SLI
    Did you even look at what you posted before making that statement?

    First of all, they only tested one bandwidth-heavy game, Assassin's Creed Origins, where it makes a 5% difference to average fps. But what you need to keep in mind is that bandwidth limitations affect minimum fps the most... so with limited PCIe bandwidth, you will see bigger drops in fps. It's even more apparent in a game like The Witcher 3, which is very bandwidth-heavy.
    Second, have a look at what happens in Civilization; the reason is that bandwidth requirements increase drastically with resolution.

    I will state what I said again: I would personally never sacrifice actual game performance for slightly faster load times.
     
    Last edited: Mar 24, 2019
  7. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    506
    Likes Received:
    54
    GPU:
    RTX 2080Ti 11Gb
    I bet you rushed to reply without reading my post to the end :(
    I repeat: basically there is no real-world difference, especially at 4K, and especially if you have a 60 Hz monitor.

    Why do you care if it does 115 fps instead of 120 fps? I don't; I game on an OLED TV and it's 60 Hz.
    Based on that chart, even PCIe 3.0 x2 [they tested PCIe 2.0 at x4 and got 78 fps] with a 2080 Ti is playable at 4K.
    I mean, I won't use it and it's unrealistic, but PCIe 3.0 x8 is far above that.

    Only games that reach super high frame rates are hit; regular games that run below 80 fps are identical.
    Their last graph says the difference is 2% between x8 and x16.

    Minimum fps is not affected by this at all.

    Here is a Titan X tested:
    https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/
     
  8. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,209
    Likes Received:
    563
    GPU:
    1080 Gaming X SLI
    There absolutely is a difference. Assuming your rig can pull 4K at 120 fps but you only have a 60 Hz monitor, a smart person would be using downsampling, which increases bandwidth requirements even further, meaning an even bigger bandwidth limitation posed by PCIe.

    That test you link to again includes no bandwidth-heavy games...

    I just tested The Witcher 3, and indeed I do see fairly significant differences.

    2.0 x16 (which has essentially the same bandwidth as 3.0 x8)

    [screenshot]

    3.0 x16

    [screenshot]

    An 8.5% difference in fps, and that is just in a static scene... during heavy action, the difference is even greater.
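    For reference, the "2.0 x16 has the same bandwidth as 3.0 x8" equivalence above can be sanity-checked from the nominal per-lane rates and line encodings in the PCIe specs (a quick back-of-the-envelope sketch, not a measurement):

    ```python
    # Usable bandwidth per PCIe lane after line-encoding overhead.
    def lane_gbps(gt_per_s, payload_bits, total_bits):
        # GT/s * (payload / encoded bits) / 8 bits-per-byte -> GB/s per lane
        return gt_per_s * payload_bits / total_bits / 8

    gen2 = lane_gbps(5.0, 8, 10)     # PCIe 2.0: 5 GT/s, 8b/10b  -> 0.5 GB/s/lane
    gen3 = lane_gbps(8.0, 128, 130)  # PCIe 3.0: 8 GT/s, 128b/130b -> ~0.985 GB/s/lane

    print(f"2.0 x16: {gen2 * 16:.2f} GB/s")  # 8.00 GB/s
    print(f"3.0 x8:  {gen3 * 8:.2f} GB/s")   # 7.88 GB/s
    ```

    So the two configurations land within about 1.5% of each other, close enough to treat as equivalent for this comparison.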
     
  9. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
    Seems like a software issue more than a PCIe limit.
     
  10. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,169
    Likes Received:
    522
    GPU:
    GTX1080Ti
    Why would you say that when there is a factual performance hit at high resolutions using top-tier GPUs at x8 vs x16? Why is this even being disputed?
     
    Dragam1337 likes this.

  11. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
    Because it was tested already.
     
  12. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,209
    Likes Received:
    563
    GPU:
    1080 Gaming X SLI
    Yes, and the TechPowerUp test shows that games that use a lot of bandwidth do get limited by a lack of PCIe bandwidth.
     
  13. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
    Care to share the link? GN tested it very recently and there was no difference at all.

    Hopefully you're not talking about the 2010 article which featured GPUs like the GTX 280...
     
  14. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,209
    Likes Received:
    563
    GPU:
    1080 Gaming X SLI
    It is already posted in this thread...
     
  15. Richard Nutman

    Richard Nutman Active Member

    Messages:
    54
    Likes Received:
    18
    GPU:
    AMD RX 580 8GB
    Problem is most motherboards only have one M.2 slot.

    I migrated from a SATA SSD to an M.2 drive, and I can only plug one M.2 drive into my mobo.
     

  16. BlackZero

    BlackZero Ancient Guru

    Messages:
    8,880
    Likes Received:
    467
    GPU:
    RX Vega
    The TPU results show x16 consistently providing higher numbers, even where there's no apparent limitation, so there may be other factors influencing the results.

    Of course, that doesn't mean there isn't a bandwidth limitation, just that those particular results can't be read as anything more than margin of error.

    If they ran the same tests with a 2080, the comparison could provide more insight into why games like Hellblade show such a difference despite not being very VRAM-intensive, or why Wolfenstein only saturates x4 (in fps) with a 2080 but requires x16 for a 25% faster 2080 Ti.
     
  17. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
    That's how you avoid facts.... well done.
     
  18. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,209
    Likes Received:
    563
    GPU:
    1080 Gaming X SLI
    Are you really unable to scroll up? It's literally on the same page... talk about lazy...
     
  19. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
    Huh? You mean the 2% difference? That's basically margin of error and can come down to clocks or the run itself.

    Looks like you can't read graphs...

    [screenshot]
     
    MegaFalloutFan likes this.
  20. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
    To add, the GN test shows a similar trend of a basically margin-of-error difference:

    [screenshot]

    [screenshot]

    [screenshot]
     
    MegaFalloutFan likes this.
