Star Wars Jedi: Fallen Order

Discussion in 'Games, Gaming & Game-demos' started by Carfax, Apr 13, 2019.

  1. Carfax

    Carfax Ancient Guru

    Messages:
    2,932
    Likes Received:
    461
    GPU:
    NVidia Titan Xp
    OK, I get what you're saying. The way it's presented makes it seem like AVX and AVX2 are different instruction sets, but from what you say, they overlap. AVX2 is merely an extension of AVX with enhanced capabilities.

    I reckon that when developers get around to targeting AVX2, they will likely use it the same way they are using AVX: namely, for physics, cloth simulation, destruction and particle effects.

    Could these instructions be used for AI? Physics and AI will be the prime areas for exploitation with the PS5 and XSX, more so than graphics, in my opinion.
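    For a concrete picture of how the two overlap, here is a minimal sketch using Intel's standard <immintrin.h> intrinsics (an illustration only, not code from any game engine): AVX already provides 256-bit floating-point operations, while AVX2 extends the same 256-bit registers to integer operations.

```cpp
// Minimal sketch: AVX (256-bit float) vs AVX2 (256-bit integer) intrinsics.
// Build with e.g.:  g++ -mavx2 avx_demo.cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    // AVX (Sandy Bridge, 2011): 256-bit floating-point adds/multiplies.
    alignas(32) float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    __m256 vsum = _mm256_add_ps(_mm256_load_ps(a), _mm256_load_ps(b));

    // AVX2 (Haswell, 2013): the same registers, now usable for 256-bit integer math.
    alignas(32) int ia[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) int ib[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    __m256i visum = _mm256_add_epi32(_mm256_load_si256((const __m256i*)ia),
                                     _mm256_load_si256((const __m256i*)ib));

    alignas(32) float fout[8];
    alignas(32) int   iout[8];
    _mm256_store_ps(fout, vsum);
    _mm256_store_si256((__m256i*)iout, visum);
    printf("%.1f %d\n", fout[0], iout[0]);  // prints "9.0 9"
    return 0;
}
```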
     
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,700
    Likes Received:
    3,676
    GPU:
    2080Ti @h2o
    It's clocked at 4.5GHz. Still okay, but yeah, it shows. The IMC is horrible, can't do 32GB @ 3200MHz... no way, I didn't get it to work. And I think it's showing its age these days... but I'm not really fancying buying a Broadwell CPU (which could work ofc), I'd much rather think about an upgrade in 2020.
     
  3. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    1,740
    Likes Received:
    821
    GPU:
    2080 ti @ 2100 mhz
    Just get yourself a 4K display, then the CPU will be a non-issue :p
     
    XenthorX and fantaskarsef like this.
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,700
    Likes Received:
    3,676
    GPU:
    2080Ti @h2o
    I figured as much, but sadly I don't see any current GPU as capable enough for it. Not even the 2080Ti I already have.
     

  5. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    1,740
    Likes Received:
    821
    GPU:
    2080 ti @ 2100 mhz
    A 2080 Ti should do 60+ fps at 4K in all titles, assuming that you don't use craptracing and use optimized settings in games like RDR2.
     
  6. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,700
    Likes Received:
    3,676
    GPU:
    2080Ti @h2o
    Honestly, "optimized settings" doesn't sound like what I'd fancy with buying the fastest, most expensive gaming card :D Just my personal, subjective opinion.
    I refuse to accept I buy a graphics card costing €1000+ (2080TI) and play on reduced details and / or reduced fps. Hence my general hesitation to step up to 4K. I wouldn't want to play on a 60Hz display again, as well.

    But I am very aware that this is my subjective feeling and limits, and they may wary with others like you. And if you like it I do think that's prefectly fine. Fallen Order was one of the games I actually wondered and wanted to see in 4K and see if it looked noticeably better than "2K".
     
  7. Martigen

    Martigen Master Guru

    Messages:
    428
    Likes Received:
    168
    GPU:
    GTX 1080Ti SLI
    This is why you SLI. It's the only way to get maximum details at maximum res at maximum FPS. My 2x 1080Ti are approx 40% faster than an overclocked 2080Ti, and I max most games out at 4K @ 60 on an OLED TV.

    Every time someone mentions SLI people who don't actually use it jump in to say it's dead. Let me nip that in the bud: there were 10 new SLI profiles in the last few driver releases, including for RDR2, Outer Worlds, Fallen Order and more. It's alive and well and works gloriously.

    When the 3080Ti comes out, I'll buy 2x2080Tis at their reduced price and still be ahead of the curve.
     
    Dragam1337 likes this.
  8. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,700
    Likes Received:
    3,676
    GPU:
    2080Ti @h2o
    Oh well, to each their own. I had SLI back in the days of the 980 and had nothing but trouble with it. Blockbuster games not supporting it, having to wait for profiles in driver releases a few weeks after game launches (these days it's not that bad anymore), twice the cost for cooling unless you stay on air... there are quite a few reasons to avoid SLI. I'm happy for you that it works, and maybe I don't have an up-to-date view, but I don't fancy splashing out 300% of the GPU/monitor costs to make that step to 4K yet. Subjective decision though, and even if I could afford it, it doesn't feel worth it to me personally.
     
  9. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    1,740
    Likes Received:
    821
    GPU:
    2080 ti @ 2100 mhz
    It's true, you can get SLI working in most titles, and I do... but I must admit that I am tired of fighting the trend, so to speak. That, and I am increasingly annoyed at not being able to use TAA, as more or less all games are meant to use it these days and flicker quite a bit without it, even at 4K. So I will upgrade to a single 3080 Ti when it launches.
     
    fantaskarsef likes this.
  10. Martigen

    Martigen Master Guru

    Messages:
    428
    Likes Received:
    168
    GPU:
    GTX 1080Ti SLI
    I've never seen a game flicker with or without TAA using SLI. What games do you have issues with? Maybe I can test and compare. I have seen TAA use more PCIe bus bandwidth however (as it utilises the previous frame), so having two cards with x16 each is beneficial here.
     

  11. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    1,740
    Likes Received:
    821
    GPU:
    2080 ti @ 2100 mhz
    Flickering was badly worded - shimmering is the correct word :)

    A game that shimmers a lot without TAA is Shadow of the Tomb Raider, especially her hair. Had it just been (significantly) more bandwidth used, thus lower fps (as in Star Wars Battlefront 2), then it would have been fine, but using TAA with SLI in Shadow of the Tomb Raider also causes quite a few artifacts at certain times.

    But yeah, PCIe 3.0 x16 for both GPUs is why I bought the 4930K all those years ago ;P

    But I've noticed that whether or not you can use TAA in a game with SLI depends on how bandwidth-hungry the game is beforehand. Again, TAA can be used in Frostbite titles with an acceptable performance penalty, whereas in other games TAA completely tanks the performance with SLI - Assassin's Creed Unity, for instance.
     
    Last edited: Dec 16, 2019
  12. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,029
    Likes Received:
    929
    GPU:
    1080Ti + Xtreme III
    I agree with you, but I fear 2x 2080Ti won't match a 3080Ti for RTX.
    I expect NVidia to more than double RTX performance; it is sorely needed.
     
  13. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    1,740
    Likes Received:
    821
    GPU:
    2080 ti @ 2100 mhz
    Who the fack cares about RTX... it's Nvidia BS marketing, nothing more, nothing less...
     
  14. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,029
    Likes Received:
    929
    GPU:
    1080Ti + Xtreme III
    Huh?
     
  15. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    1,740
    Likes Received:
    821
    GPU:
    2080 ti @ 2100 mhz
    RTX is just Nvidia's excuse for those otherwise useless cores being on a gaming chip, because they don't bother developing separate pro and gaming chip designs. As shown by the CryEngine raytracing demo, they are not needed in any way.

    RTX is a gimmick, just like PhysX was, and like most gimmicks, it will soon be gone.

    Worst part is that so many people buy into the crap that Nvidia feeds them...
     

  16. Archvile82

    Archvile82 Member Guru

    Messages:
    157
    Likes Received:
    66
    GPU:
    Nvidia 2080ti FE
    Have to disagree with that, and so does AMD, as they are also doing hardware raytracing, or at least a hybrid system. CryEngine raytracing is very light or lower precision compared to what DXR is doing, and you would also have to see those effects in a full game to judge whether their version is good or not.

    Raytracing is far from a gimmick - much better than crappy screen-space reflection techniques - and releasing this tech early gave the industry a head start on implementing it, as it will be core to game development come next-gen console games.
     
  17. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,029
    Likes Received:
    929
    GPU:
    1080Ti + Xtreme III
    It's pants on current cards, which is why I didn't buy one. That, and way too expensive.
    Next gen will hopefully make it a decent proposition for the first time, assuming it's at least twice as quick.
    I think you missed the point that sales are pretty low for this gen, which is why they need to make it a lot faster and cost less.

    This is what developers have been waiting for: much simpler lighting of the environment without a ton of gotchas that need compensating.
    Not only will it look better, devs can also spend more time on other troublesome areas.
    This means we get better looking games.
    It's the future.

    PS
    Demos are not real-time games.
    Plenty of times we have seen demos that exceed what we could actually play.
     
  18. haste

    haste Maha Guru

    Messages:
    1,050
    Likes Received:
    293
    GPU:
    GTX 1080 @ 2.1GHz
    AI is a very general term that covers countless algorithms, and above all, it depends on the specific implementation whether it will be possible to optimize the code with AVX. But compilers are extremely smart these days, so even without specific optimizations there is a good chance it will boost performance. Nevertheless, the last time we were optimizing some functions for skeletal animation on Xenon, we were able to improve performance by almost 20% over the code straight from the compiler.
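    As a rough illustration of what hand-tuning beyond the compiler's output looks like, here is a hypothetical scalar weight blend next to an explicit AVX version. This is only a sketch of the idea, not the Xenon code mentioned above (Xenon's vector unit was VMX, which predates AVX), and the function names are made up for the example.

```cpp
// Hypothetical sketch: scalar vs hand-written AVX for a simple weighted blend,
// the kind of inner loop found in skeletal animation skinning.
#include <immintrin.h>
#include <cstddef>

// Scalar version: the compiler may auto-vectorize this, but it isn't guaranteed.
void blend_scalar(const float* a, const float* b, const float* w,
                  float* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] * w[i] + b[i] * (1.0f - w[i]);
}

// Explicit AVX version: processes 8 floats per iteration.
void blend_avx(const float* a, const float* b, const float* w,
               float* out, std::size_t n) {
    const __m256 one = _mm256_set1_ps(1.0f);
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        __m256 vw = _mm256_loadu_ps(w + i);
        __m256 res = _mm256_add_ps(_mm256_mul_ps(va, vw),
                                   _mm256_mul_ps(vb, _mm256_sub_ps(one, vw)));
        _mm256_storeu_ps(out + i, res);
    }
    // Scalar tail for the elements that don't fill a full 8-wide register.
    for (; i < n; ++i)
        out[i] = a[i] * w[i] + b[i] * (1.0f - w[i]);
}
```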
     
  19. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    1,740
    Likes Received:
    821
    GPU:
    2080 ti @ 2100 mhz
    Oh yeah, because having slightly better looking reflections is totally worth tanking the performance for... not. If I need performance, reflections are one of the first things I turn down/off. They generally add very little to the overall look of a game.

    I am not impressed by any stretch of the imagination.

    And to make matters worse, Nvidia wants you to use their crappy upscaling in order to use their crappy DXR, because otherwise the performance is too terribad... so they'd make you sacrifice something that actually matters for visual fidelity (resolution) for something that really doesn't matter... it's laughable, really.

    I will buy a 3080 Ti because... what alternative is there if I want the best performance? But you can bet your arse that I won't be using DXR, and especially not DLSS.
     
  20. DocStrangelove

    DocStrangelove Ancient Guru

    Messages:
    1,944
    Likes Received:
    481
    GPU:
    MSI RTX2080 Super
    Thx mate. New PC is on the way and I'll leave it on this time. ;)
     
    Dragam1337 likes this.
