Star Wars Jedi: Fallen Order

Discussion in 'Games, Gaming & Game-demos' started by Carfax, Apr 13, 2019.

  1. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,215
    Likes Received:
    4,391
    GPU:
    2080Ti @h2o
Oh well, to each their own. I had SLI back in the days of the 980 and had nothing but trouble with it: blockbuster games not supporting it, having to wait for profiles in driver releases a few weeks after game launches (these days it's not that bad anymore), twice the cost for cooling unless you stay on air... there are quite a few reasons to avoid SLI. I'm happy for you that it works, and maybe I don't have an up-to-date view, but I don't fancy shelling out 300% of the GPU/monitor costs to make the step to 4K yet. Subjective decision though, and even if I could afford it, it doesn't feel worth it to me personally.
     
  2. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,637
    Likes Received:
    1,522
    GPU:
    Rtx 3090 Strix OC
It's true, you can get SLI working in most titles, and I do... but I must admit that I'm tired of fighting the trend, so to speak. That, and I'm increasingly annoyed at not being able to use TAA, as more or less all games are designed around it these days and flicker quite a bit without it, even at 4K. So I will upgrade to a single 3080 Ti when it launches.
     
    fantaskarsef likes this.
  3. Martigen

    Martigen Master Guru

    Messages:
    460
    Likes Received:
    209
    GPU:
    GTX 1080Ti SLI
I've never seen a game flicker with or without TAA using SLI. What games do you have issues with? Maybe I can test and compare. I have seen TAA use more PCIe bus bandwidth, however (as it utilises the previous frame), so having two cards at x16 each is beneficial here.
     
  4. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,637
    Likes Received:
    1,522
    GPU:
    Rtx 3090 Strix OC
Flickering was badly worded; shimmering is the correct word :)

A game that shimmers a lot without TAA is Shadow of the Tomb Raider, especially her hair. Had it just been (significantly) more bandwidth used, and thus lower fps (as in Star Wars Battlefront 2), it would have been fine, but using TAA with SLI in Shadow of the Tomb Raider also causes quite a few artifacts at certain times.

But yeah, 3.0 x16 for both GPUs is why I bought the 4930K all those years ago ;P

But I've noticed that whether or not you can use TAA in a game with SLI depends on how bandwidth-hungry the game is beforehand. Again, TAA can be used in Frostbite titles with an acceptable performance penalty, whereas in other games TAA completely tanks the performance with SLI, Assassin's Creed Unity for instance.
     
    Last edited: Dec 16, 2019

  5. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,771
    Likes Received:
    1,320
    GPU:
    Aorus 3090 Xtreme
I agree with you, but I fear 2x 2080 Ti won't match a 3080 Ti for RTX.
I expect Nvidia to more than double RTX performance; it is sorely needed.
     
  6. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,637
    Likes Received:
    1,522
    GPU:
    Rtx 3090 Strix OC
Who the fack cares about RTX... it's Nvidia BS marketing, nothing more, nothing less...
     
  7. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,771
    Likes Received:
    1,320
    GPU:
    Aorus 3090 Xtreme
    Huh?
     
  8. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,637
    Likes Received:
    1,522
    GPU:
    Rtx 3090 Strix OC
RTX is just Nvidia's excuse for those otherwise useless cores being on a gaming chip, because they don't bother developing separate pro and gaming chip designs. As shown by the CryEngine raytracing demo, they are not needed in any way.

RTX is a gimmick, just like PhysX was, and like most gimmicks, it will soon be gone.

The worst part is that so many people buy into the crap that Nvidia feeds them...
     
  9. Archvile82

    Archvile82 Master Guru

    Messages:
    220
    Likes Received:
    82
    GPU:
    EVGA 3080 FTW U
Have to disagree with that, and so does AMD, as they are also doing hardware raytracing, or at least a hybrid system. CryEngine's raytracing is very light, or lower precision, compared to what DXR is doing, and you would also have to see those effects in a full game to judge whether their version is good or not.

Raytracing is far from a gimmick; it's much better than crappy screen-space reflection techniques, and releasing this tech early gave the industry a head start on implementing it, as it will be core to game development come next-gen console games.
     
  10. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,771
    Likes Received:
    1,320
    GPU:
    Aorus 3090 Xtreme
It's pants on current cards, which is why I didn't buy one. That, and way too expensive.
Next gen will hopefully make it a decent proposition for the first time, assuming it's at least twice as quick.
I think you missed the point that sales are pretty low for this gen, which is why they need to make it a lot faster and cost less.

This is what developers have been waiting for: much simpler lighting of the environment without a ton of gotchas that need compensating.
Not only will it look better, devs can spend more time on other troublesome areas.
This means we get better-looking games.
It's the future.

PS:
Demos are not real-time games.
Plenty of times we have seen demos that exceed what we could actually play.
     

  11. haste

    haste Maha Guru

    Messages:
    1,246
    Likes Received:
    388
    GPU:
    GTX 1080 @ 2.1GHz
AI is a very general term that covers countless algorithms, and above all it depends on the specific implementation whether it will be possible to optimize the code with AVX. But compilers are extremely smart these days, so even without specific optimizations there is a good chance it will boost performance. Nevertheless, the last time we were optimizing some functions for skeletal animations on Xenon, we were able to improve performance by almost 20% over the code straight from the compiler.
     
  12. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,637
    Likes Received:
    1,522
    GPU:
    Rtx 3090 Strix OC
Oh yeah, because having slightly better-looking reflections is totally worth tanking the performance for... not. If I need performance, reflections are one of the first things I turn down or off. They generally add very little to the overall look of a game.

I am not impressed by any stretch of the imagination.

And to make matters worse, Nvidia wants you to use their crappy upscaling in order to use their crappy DXR, because otherwise the performance is too terribad... so they'd make you sacrifice something that actually matters for visual fidelity (resolution) for something that really doesn't matter... it's laughable, really.

I will buy a 3080 Ti because... what alternative is there, if I want the best performance? But you can bet your arse that I won't be using DXR, and especially not DLSS.
     
  13. DocStrangelove

    DocStrangelove Ancient Guru

    Messages:
    1,924
    Likes Received:
    457
    GPU:
    MSI RTX2080 Super
Thx mate. New PC is on the way, and I'll leave it on this time. ;)
     
    Dragam1337 likes this.
  14. Stone Gargoyle

    Stone Gargoyle Ancient Guru

    Messages:
    5,791
    Likes Received:
    440
    GPU:
    GTX 1060 G1
  15. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
My 5930K would do 4.4GHz rock-solid stable at low voltage, and run 32GB of DDR4-3200 at CL16 and CR1. I swear, if I overclocked the memory by even 1MHz the damn thing would crash, LOL! My 6900K, on the other hand, will do 4.3GHz (but I have it at 4.2 because it requires less voltage) and run a 32GB DDR4-3200 kit overclocked to DDR4-3400 at CL14 with much more aggressive memory subtimings than my 5930K. Also, the annoying low write speed bug that Haswell-E had was fixed in Broadwell-E.

I would only recommend Broadwell if you want to stretch out your upgrade to, say, 2021. By then Intel will have their 10nm desktop CPUs in full production, with either Willow Cove or Golden Cove cores and DDR5 memory support. If you prefer to upgrade in 2020, then Zen 3 will definitely be the best option.
     
    fantaskarsef likes this.

  16. Carfax

    Carfax Ancient Guru

    Messages:
    2,913
    Likes Received:
    465
    GPU:
    NVidia Titan Xp
This is what I envision AVX2 being used for in next-gen games. Epic's new Chaos physics engine, with heavy multithreading and SIMD optimization, which is still in development and runs completely in software on CPUs, is a template for future games. Something like this would never have been possible back in the day on older CPUs. When GPU PhysX was in full swing years ago, you would have needed a dedicated GPU to calculate all the physics for a destruction event like the one seen in this demo. And even then it would likely have been laggy, with a substantial hit to framerate.

     
  17. jwb1

    jwb1 Master Guru

    Messages:
    725
    Likes Received:
    156
    GPU:
    MSI GTX 2080 Ti
They need to at least patch in the ability to replay the game with your customizations.
     
  18. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,215
    Likes Received:
    4,391
    GPU:
    2080Ti @h2o
Well, I tried everything; I couldn't get the 32GB to run at 3200MHz even with loosened timings etc. There was just no way it would work... 3000MHz was fine though, but I mainly believe my CPU had a weak memory controller. The core overclock was never an issue and could maybe go higher, but I'm not sure an extra 100MHz would do much.

It's mainly that, being under full water, I'd have to take apart the loop to put another old CPU into an old platform... nothing too fancy. And I don't see much use in investing money in the X99 platform just to save myself a year before upgrading...

But you are right, 2020 will be the year; I'm curious about Zen 3, but I have learned not to put too much hope in either AMD or Intel, so a 2021 CPU upgrade to Intel's platform doesn't look very attractive either... especially when it's about node shrinks or alleged upgrades in architecture... I have simply lost faith in Intel bringing out anything worth calling "new".
Sure, benchmark numbers don't lie, but I don't know; as an enthusiast I'd like a bit more than benchmarks to look at and feel when it comes to an upgrade. And I do nothing but gaming, literally nothing.
     
  19. haste

    haste Maha Guru

    Messages:
    1,246
    Likes Received:
    388
    GPU:
    GTX 1080 @ 2.1GHz
Physics engines are still evolving; it's not just instruction sets. While AVX2 is a nice performance boost in such scenarios, it's hard to quantify the difference between the instruction sets used without profiling or a proper benchmark. It would be really nice if somebody could deliver the numbers, so we could see the difference in frametimes between SSE, AVX and AVX2.

There is an interview with the developers about this, and they stated that the player would be too OP, so they scrapped the idea of New Game+. Personally, I think it's BS. New Game+ would add at least some replayability. TBH I wouldn't mind playing again with unlocked customizations only.
     
  20. Martigen

    Martigen Master Guru

    Messages:
    460
    Likes Received:
    209
    GPU:
    GTX 1080Ti SLI
Ah yeah, I know what you mean; TAA was designed to counteract that shimmering (though oddly, not all game engines seem to suffer this problem).

I have SOTTR and Asscreed Unity, though I haven't played them (backlog of shame!). I'll install both and see how they fare on my system with and without TAA and SLI.
     