Star Wars Jedi: Fallen Order

Discussion in 'Games, Gaming & Game-demos' started by Carfax, Apr 13, 2019.

  1. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
What's your 5930K clocked to? I used to have that CPU. One of the things I hated about it was the memory controller; it was Intel's first DDR4 memory controller and it had some bugs. Broadwell-E has a much better memory controller than Haswell-E, with better performance and stability at higher memory frequencies.

But yeah, I see your point. You are definitely being CPU limited, but big DX11 games like Jedi: Fallen Order will be a rarity in the coming years, I reckon.
     
  2. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,826
    Likes Received:
    1,690
    GPU:
    Rtx 3090 Strix OC
Well... as long as I can pull a steady 60 fps, I'll be happy. Do you know of any current game that actually uses AVX2? I know several games use AVX, but I haven't been able to find any that use AVX2, nor any benchmark suggesting a 3770K or 2600K gets hammered much harder than newer 4-core/8-thread CPUs in any current game... only the gradual generational differences in performance you'd expect.
     
  3. haste

    haste Maha Guru

    Messages:
    1,391
    Likes Received:
    439
    GPU:
    GTX 1080 @ 2.1GHz
Neither the DX12 API nor any compiler requires AVX2. Compilers use the instruction sets that you tell them to target. The performance gain from AVX2 depends on the specific use case.
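Concretely, "instruction sets that you set" means compiler switches; nothing in DX12 flips them on for you. A minimal sketch with the real MSVC and GCC/Clang flags (the file name is just a placeholder):

```shell
# MSVC: allow the compiler to emit AVX2 (and FMA) instructions
cl /O2 /arch:AVX2 engine.cpp

# GCC/Clang equivalent: enable AVX2 + FMA code generation
g++ -O2 -mavx2 -mfma engine.cpp
```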
     
    (sic)Klown12 likes this.
  4. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
Off the top of my head, no. Currently AVX is used much more frequently, like you said. But as I said before, Microsoft's compilers can generate those instructions, and with the next-gen consoles having full support for AVX2, usage is only going to increase. This will apply to cross-gen games next year as well, like Cyberpunk etc.

From doing a bit of reading, I haven't been able to find any either. It seems floating point is preferred over integers in game development. Perhaps that may change with the PS5 and XSX having full-speed AVX2, though.
     

  5. Damien_Azreal

    Damien_Azreal Ancient Guru

    Messages:
    10,809
    Likes Received:
    1,642
    GPU:
    Gigabyte 1070 Ti
    Has the stuttering been addressed yet?
     
  6. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
Gotta love the bait and switch. I never said the DX12 API or any compiler "requires" AVX2. I simply said that Microsoft's compilers can target or generate those instructions. Just as AVX has become more frequently used these days, AVX2 will eventually grow in prevalence because compilers will target those instructions in games.

    Although as I said above, it seems developers prefer to use floating point for most cases.
     
  7. haste

    haste Maha Guru

    Messages:
    1,391
    Likes Received:
    439
    GPU:
    GTX 1080 @ 2.1GHz
You said: "It's very likely going to be used in Cyberpunk, because Cyberpunk will be DX12". That sentence implies that DX12 games are more likely to use AVX2, which is simply not true. There is absolutely no correlation between DX12 and AVX2. Compilers have supported AVX2 for almost six years, since before DX12 even existed.

Also, sorry, but "it seems developers prefer to use floating point for most cases" makes no sense in relation to AVX. AVX/AVX2 add a few vector instructions: faster copies, logical ops, and some math ops on vector registers.
     
  8. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
Dude, you're reading into this too much. The reason I said that is that DX12 games are more likely to ship with a later Visual C++ redistributable that supports AVX2, because DX12 is the newest API. Have you ever seen a DX12 game that used the 2008 Visual C++ redistributable?

    And secondly, because Cyberpunk will be a cross gen title. It will also be released on XSX and PS5, which means it could very well target AVX2 on those platforms.

AVX's 256-bit wide vectors were restricted to floating point only, as I recall. With AVX2, Intel expanded most integer instructions to 256 bits:

    Source
     
  9. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
    Nope, it's still there. It will probably take them a while to address it fully, if they plan on doing so.
     
  10. haste

    haste Maha Guru

    Messages:
    1,391
    Likes Received:
    439
    GPU:
    GTX 1080 @ 2.1GHz
Oh boy. That is just one change in AVX2, unifying the data types. AVX instructions are meant for vector operations. The best use case is matrix multiplication using FMA (Fused Multiply-Add) with floating-point data types. The most important change in the AVX2 generation is doubling the FMA throughput over the 256-bit registers, which can theoretically speed things up by up to 2x over AVX1.

To keep it simple: the AVX2 generation does improve floating-point ops over AVX1, up to 2x in specific cases.

So in games this can speed up transforms, which is usually a very small part of frame time.
     
    Carfax likes this.

  11. Damien_Azreal

    Damien_Azreal Ancient Guru

    Messages:
    10,809
    Likes Received:
    1,642
    GPU:
    Gigabyte 1070 Ti
    Okay.
    Been holding off on playing more of the game because of the stuttering, but... also haven't really been following it that closely. So wasn't sure if any news had come about fixes/updates.
     
  12. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,810
    Likes Received:
    1,800
    GPU:
    3090 Gaming X Trio
My two cents on AVX2: it added a bunch of new helpful instructions (shuffle, add, subtract, ...) for the 256-bit registers, which made 4x 64-bit, aka Vec4 double precision, practical.

A big deal for games involving double-precision rendering, space sims and such.
     
    Last edited: Dec 15, 2019
  13. haste

    haste Maha Guru

    Messages:
    1,391
    Likes Received:
    439
    GPU:
    GTX 1080 @ 2.1GHz
^ I believe using 4x 64-bit doubles in one 256-bit register was also possible in AVX1, albeit slower.
     
  14. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
OK, I get what you're saying. The way it's presented makes it seem like AVX and AVX2 are different instruction sets, but from what you say they overlap; AVX2 is merely an extension of AVX with enhanced capabilities.

I reckon that when developers get around to targeting AVX2, they will likely use it the same way they're using AVX: for physics, cloth simulation, destruction and particle effects.

Could these instructions be used for AI? Physics and AI will be the prime areas for exploitation on the PS5 and XSX, more so than graphics in my opinion.
     
  15. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,436
    Likes Received:
    4,705
    GPU:
    2080Ti @h2o
It's clocked at 4.5GHz. Still okay, but yeah, it shows. The IMC is horrible; can't do 32GB @ 3200MHz... no way, didn't get it to work. And I think it's showing its age these days... but I'm not really fancying buying a Broadwell CPU (which could work ofc), I'd much rather think about an upgrade in 2020.
     

  16. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,826
    Likes Received:
    1,690
    GPU:
    Rtx 3090 Strix OC
Just get yourself a 4K display, then the CPU will be a non-issue :p
     
    XenthorX and fantaskarsef like this.
  17. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,436
    Likes Received:
    4,705
    GPU:
    2080Ti @h2o
I figured as much; sadly I don't see any current GPU as capable enough for it. Not even the 2080 Ti I already have.
     
  18. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,826
    Likes Received:
    1,690
    GPU:
    Rtx 3090 Strix OC
    2080 ti should do 60+ fps at 4k in all titles, assuming that you don't use craptracing and use optimized settings in games like rdr2.
     
  19. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,436
    Likes Received:
    4,705
    GPU:
    2080Ti @h2o
Honestly, "optimized settings" doesn't sound like what I'd fancy after buying the fastest, most expensive gaming card :D Just my personal, subjective opinion.
I refuse to accept buying a graphics card costing €1000+ (2080 Ti) and playing on reduced details and/or reduced fps. Hence my general hesitation to step up to 4K. I wouldn't want to play on a 60Hz display again, either.

But I'm very aware that these are my subjective feelings and limits, and they may vary for others like you. And if you like it, I think that's perfectly fine. Fallen Order was one of the games I actually wondered about and wanted to see in 4K, to see if it looked noticeably better than "2K".
     
  20. Martigen

    Martigen Master Guru

    Messages:
    465
    Likes Received:
    213
    GPU:
    GTX 1080Ti SLI
This is why you SLI. It's the only way to get maximum details at maximum res at maximum FPS. My 2x 1080 Tis are approx 40% faster than an overclocked 2080 Ti, and I max most games out at 4K @ 60 on an OLED TV.

    Every time someone mentions SLI people who don't actually use it jump in to say it's dead. Let me nip that in the bud: there were 10 new SLI profiles in the last few driver releases, including for RDR2, Outer Worlds, Fallen Order and more. It's alive and well and works gloriously.

    When the 3080Ti comes out, I'll buy 2x2080Tis at their reduced price and still be ahead of the curve.
     
    Dragam1337 likes this.
