Star Wars Jedi: Fallen Order

Discussion in 'Games, Gaming & Game-demos' started by Carfax, Apr 13, 2019.

  1. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,415
    Likes Received:
    4,678
    GPU:
    2080Ti @h2o
They should fix the glitches, bugs, and performance...
     
    Last edited: Dec 12, 2019
    WhiteLightning likes this.
  2. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,801
    Likes Received:
    1,792
    GPU:
    3090 Gaming X Trio
Haswell-E CPUs shipped with conservative 3.3GHz core clocks; overclocked to 4.4-4.6GHz they're still totally relevant, and they provide free heat for your room during cold winter nights. :eek:

    https://valid.x86.fr/1r2cfi
     
    Last edited: Dec 12, 2019
    Dragam1337 and angelgraves13 like this.
  3. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,415
    Likes Received:
    4,678
    GPU:
    2080Ti @h2o
Hmm.... *still wondering whether I should upgrade, or if I'm just guilty of the upgrade itch* :confused:o_O
     
    XenthorX likes this.
  4. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,801
    Likes Received:
    1,792
    GPU:
    3090 Gaming X Trio
Got to say I'm playing at 4K and aiming at 60fps, no more. If I were at 2560x1440 chasing 120fps, clearly a 5GHz 9900K/3900X would be the baby to get.
     

  5. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,415
    Likes Received:
    4,678
    GPU:
    2080Ti @h2o
    Yeah... like @Dragam1337 said... :(
     
    XenthorX likes this.
  6. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,801
    Likes Received:
    1,792
    GPU:
    3090 Gaming X Trio
I can't get it stable, mostly for temperature reasons (despite AIO cooling), but cranking the overclock to 4.7GHz puts you totally in 8700K territory, which is three years more recent.

    https://valid.x86.fr/bench/h9pgf3/12

Deciding to spend some extra money on the CPU compared to my initial plans was the best call I made back in 2015.
     
    Dragam1337 likes this.
  7. Memorian

    Memorian Ancient Guru

    Messages:
    3,126
    Likes Received:
    268
    GPU:
    RTX 3080 Ti
8700K territory in CPU-Z, not in games.
     
  8. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,801
    Likes Received:
    1,792
    GPU:
    3090 Gaming X Trio
Again, it depends on resolution. At 4K the CPU bottleneck isn't really noticeable anyway.
     
    Dragam1337 likes this.
  9. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,817
    Likes Received:
    1,682
    GPU:
    Rtx 3090 Strix OC
The mobos outright suck? They might not be the latest and greatest, but I'd wager they're still very decent :p
     
  10. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,801
    Likes Received:
    1,792
    GPU:
    3090 Gaming X Trio

  11. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
I'm still on my 6900K at 4.2GHz and I find that I have no need to upgrade my CPU just yet, as my performance is still very good. I'm going to wait until there's a big leap in performance from upgrading, like at least 40% IPC plus DDR5. In BF5 I'm averaging triple-digit frame rates, and even in Jedi Fallen Order I'm nearing 100 FPS most of the time at 1440p max quality on a 165Hz monitor.

The most important thing about a CPU these days, besides the number of cores, is whether it supports AVX/AVX2. Those two instruction sets are being used more and more frequently by developers in all sorts of newer games and applications, and in GPU drivers as well. As long as your CPU supports those instructions and has around 6-8 real cores, I'd say you're good for at least another two or three years.

That's not to say you should wait that long to upgrade, but with the next gen of consoles having great AVX2 support, games will certainly begin to target those instructions in a big way, which will extend the life of your current CPU. Plus, DX12 and Vulkan make single-threaded performance less relevant than it used to be at any rate.
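(For anyone curious whether their own CPU reports AVX2: on Linux a few lines of Python reading /proc/cpuinfo will tell you. This is just a quick sketch; the path and flag names are Linux-specific, and other OSes need a different method, e.g. querying CPUID directly.)

```python
def cpu_flags(cpuinfo_path="/proc/cpuinfo"):
    """Return the set of CPU feature flags the Linux kernel reports."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    # line looks like: "flags : fpu vme ... avx avx2 ..."
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass  # not Linux, or /proc unavailable
    return set()

flags = cpu_flags()
print("AVX:", "avx" in flags, "| AVX2:", "avx2" in flags)
```

On a 4930K this would show AVX true but AVX2 false, matching the upgrade worry above.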
     
  12. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,817
    Likes Received:
    1,682
    GPU:
    Rtx 3090 Strix OC
That could certainly become an issue for me before too long, as my 4930K is from the last Intel generation without AVX2. Though hopefully it won't be used in Cyberpunk... But then again, I only decided to postpone my system upgrade until monitors can do 4K 144Hz+ without compression, so hopefully that's less than two years away.
     
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,415
    Likes Received:
    4,678
    GPU:
    2080Ti @h2o
See, I'm "only" in the high 80s there for max fps, so that's already 10% less than your 6900K; add another two generations of Intel CPUs on top of that and I'm starting to miss out on more than 20% of the fps I could get with my 2080 Ti. And that's starting to show these days, at least subjectively.
     
  14. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,817
    Likes Received:
    1,682
    GPU:
    Rtx 3090 Strix OC
Well, you are certainly missing out in BF5 with that CPU/GPU/monitor combo, but BF5 is also very hard on the CPU compared to the GPU.
     
    fantaskarsef likes this.
  15. Lurk

    Lurk Member Guru

    Messages:
    159
    Likes Received:
    20
    GPU:
    Gigabyte GTX 1080
Playing at Epic settings on an i5-8600K paired with a GTX 1080, and I have to say the game runs smooth as butter except for the few areas designated for level changes (i.e. when sliding down to the ice caves on Zeffo, with a couple of area changes accompanied by a weird and sudden loud boom).
Other than that I find performance pretty good (definitely improvable though; wish we could tweak engine settings through an .ini file like in the good old days).

What helped greatly for me was vsync off in-game, on in NVCP, and MOST OF ALL limiting fps to the refresh rate through RTSS.
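(Why an fps cap helps: frames get delivered at even intervals instead of in bursts, which smooths frame pacing. RTSS does this far more precisely inside the render pipeline; the sketch below is only an illustration of the basic sleep-based idea, with made-up names and numbers, not how RTSS actually works.)

```python
import time

def run_capped(render_frame, target_fps=60.0, frames=120):
    """Call render_frame() at most target_fps times per second; return effective fps."""
    frame_budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()  # the actual per-frame game work would go here
        # sleep away whatever is left of this frame's time budget
        leftover = frame_budget - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
    elapsed = time.perf_counter() - start
    return frames / elapsed

# even a trivial "frame" that does no work gets held near the 60fps cap
print(round(run_capped(lambda: None, target_fps=60.0, frames=30), 1))
```

Real limiters busy-wait the last fraction of a millisecond instead of sleeping, since OS sleep granularity is too coarse for perfectly even pacing.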
     

  16. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,817
    Likes Received:
    1,682
    GPU:
    Rtx 3090 Strix OC
Btw, apparently this game doesn't like hyperthreading a whole lot. Have any of the people who suffer from stuttering tried turning hyperthreading off?
     
  17. DocStrangelove

    DocStrangelove Ancient Guru

    Messages:
    1,924
    Likes Received:
    458
    GPU:
    MSI RTX2080 Super
Yup, I've played without HT for ages now, and I still get stuttering in this game. Other games run fine. No crashes in GTA V or RDR2 at this point; I can hardly believe it.
     
  18. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,817
    Likes Received:
    1,682
    GPU:
    Rtx 3090 Strix OC
RDR2 benefits from hyperthreading though :)
     
    DocStrangelove likes this.
  19. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,964
    Likes Received:
    1,427
    GPU:
    Aorus 3090 Xtreme
After today's big update for RDR2 it crashed on loading every time, even in safe mode.
It turns out the 441.20 video driver isn't compatible.
After updating to 441.66 it still crashed.
Starting the game in safe mode got it going; settings changed back, now it's working.

Bummer if you need an older driver for anything else.

Oops, I thought this was the NVidia driver thread, my bad.
     
  20. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
It's very likely going to be used in Cyberpunk, because Cyberpunk will be DX12 and Microsoft's compilers target SSE/SSE2/AVX/AVX2. You'll still be able to play the game, to be sure, but it will likely run quite a bit slower for you than if you had, say, a Haswell or newer CPU.
     
