Vietnamese store put up preorder for ASUS ROG STRIX 1180s

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 2, 2018.

  1. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
Gurus, we should admit that as we approach 2020 we have so much power in our PCs that we should only buy something if it's actually needed. For example, if you need 4K HDR at a minimum of 60fps with ultra settings, YOU ARE GONNA PAY AS MUCH AS NVIDIA WANTS for a new GPU, or you can keep using a 1070/1080/1080 Ti/Titan Xp. We shouldn't really care much about prices anymore, I believe. Our machines can meet our expectations if we lower them a little. If you like being ahead of the curve, then pay gladly.
     
    Last edited: Jul 2, 2018
    Silva likes this.
  2. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
I was not exactly on the Vega hype train, quite the opposite, and that was the case for a long time before Vega came out. On paper they had a few interesting things, but they did not activate them, except a few which were mostly for compute.
And I repeat myself, as I have said this a few times before: Vega does not have additional building blocks in terms of SPs/TMUs/ROPs compared to Fiji. If GCN could do 5120 SPs, and those "failed to be implemented" parts were absent, Vega could still have 12.5B transistors with around a 20% performance boost over what Vega 64 delivered, thanks to a slightly lower clock. And that would make it a sub-300W chip as a bonus.
(And I am not exactly a believer when someone states that all those additional transistors were necessary to get a 10% clock boost over Polaris.)
     
  3. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
Actually, this is not the real goal.

There are quite a lot of high-refresh monitors out there, including many QHD 144Hz and soon QHD 240Hz models. There are even 4K monitors announced that can do 120-144Hz...
A 1080 Ti can barely keep up with 60fps in modern games at 4K...

The era of 60Hz is over. (It has been over for quite a while, but many people still refuse to acknowledge that, for anything gaming, more frames are better than more pixels!)
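For context, the difference between refresh rates is easy to put in numbers as frame time, i.e. the delay between successive frames. This is a minimal sketch of plain arithmetic, not figures from the thread:

```python
# Frame time (ms) at common refresh rates: the gap between successive
# frames shrinks as the refresh rate rises, which is what makes
# high-refresh motion feel smoother.

def frame_time_ms(hz: float) -> float:
    """Milliseconds between frames at a given refresh rate."""
    return 1000.0 / hz

for hz in (30, 60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.1f} ms per frame")
# 60 Hz is ~16.7 ms per frame; 144 Hz is ~6.9 ms per frame.
```

Going from 60Hz to 144Hz cuts the per-frame delay by almost 10 ms, which is why the jump is so visible in fast motion.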
     
    Jayp and warlord like this.
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
Because the 1080 Ti costs more than a 1080, and by a considerable margin. Not everyone needs the performance of a 1080 Ti.
Even pre-overclocked 1080 Tis don't tend to breach 1.7GHz; what people can manually overclock to is irrelevant. Furthermore, this is a different fabrication process, so clocks higher than 2GHz are a possibility. It could be worse at overclocking, too. The point is, just because the architecture may be the same doesn't mean the OC potential will be too.
Considering this is basically 1080 Ti performance at a lower product tier, I think the number of CUDA cores is perfectly reasonable. That said, the 1180 has the same number of CUDA cores as the 1080 Ti, not the 1080 (assuming this Vietnamese store has the official specs).
Wait, you're saying you expect the 1180 to have nearly double the shaders of a 1080 Ti? I don't even think the 1180 Ti could pull that off.

Actually, 60Hz isn't going anywhere for quite some time. I have absolutely no interest in a display above 60Hz. The only games where it yields any benefit are competitive ones. Not only do I not play any of those, but anyone who does likely isn't good enough that doubling the frame rate would make them perform better (except the pro players, obviously). Having better equipment doesn't make you a better gamer.
     
    Last edited: Jul 2, 2018
    signex likes this.

  5. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
That is why I said "minimum 60fps with ultra settings" right after. Who cares about 120Hz/144Hz if you cannot maintain 60fps with maximum visuals for every second of gaming? :)
     
  6. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
    @schmidtbag, you probably didn't read the numbers correctly before quoting.

The 1080 has 2560 shaders (2.5K); I happen to own one.
The 1080 Ti has 3584 shaders (3.6K)... exactly as I wrote.
(The full Titan Xp has 3840.)

I expect the next-gen x80 non-Ti to have ~3200 or so, clocking slightly better than the 1080. Combined, that would make the new chip at least equal to the current 1080 Ti, while costing and consuming less.

We're NOT talking about any new "Ti" version here!
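The back-of-envelope estimate above (shader count times clock as a crude throughput proxy) can be sketched as follows. The reference boost clocks for the 1080 and 1080 Ti are Nvidia's published figures; the 1180's shader count and clock are the speculative numbers from this thread, not real specs:

```python
# Crude relative-throughput proxy: shaders x clock, in arbitrary units.
# This ignores architecture, memory bandwidth, IPC, etc.; it is only
# the thread's back-of-envelope reasoning made explicit.

def relative_throughput(shaders: int, boost_clock_ghz: float) -> float:
    """Naive FP32 throughput proxy (arbitrary units)."""
    return shaders * boost_clock_ghz

gtx_1080    = relative_throughput(2560, 1.733)  # reference boost clock
gtx_1080_ti = relative_throughput(3584, 1.582)  # reference boost clock
gtx_1180    = relative_throughput(3200, 1.80)   # hypothetical: thread's guess

print(gtx_1180 / gtx_1080_ti)  # ~1.02, i.e. roughly 1080 Ti level
```

Under these assumptions a ~3200-shader part with a modest clock bump does land at about 1080 Ti level, which is the point being argued.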
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
I could argue the same thing about you. I'm aware of the 1080 and 1080 Ti specs. Assuming the leaked info in the article is accurate, the 1180 has the same specs as the 1080 Ti. I mentioned no specifics about the 1180 Ti's specs.
     
  8. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
Yeah.
I reopened the pictures and now it makes sense...

They're edited pictures of a standard 1080 Ti, with the "0" changed to a "1" and specs identical to the Ti's, which led me to assume the 3584 shaders referred to an 1180 Ti: unchanged architecture, just a rebrand.

~~~
Anyway, that's already three pages of forum posts about some Vietnamese kid's MS Paint edit.
We should probably stop here.
     
  9. alanm

    alanm Ancient Guru

    Messages:
    12,274
    Likes Received:
    4,479
    GPU:
    RTX 4080
Disagree. I may be in a minority with this, but I much prefer more pixels at a lower refresh rate to fewer pixels at high refresh/FPS. More pixels (4K plus a greater screen size) produce better visuals and immersion, which I value more than smoother gameplay. Again, I know I'm in a minority, but I say this after living with several high-refresh-rate monitors over many years and then switching to a large 4K screen.
     
    Pinscher, signex, warlord and 2 others like this.
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
Seeing as the vast majority of people are content with 30FPS and seem to prefer higher graphical detail over higher framerates, you are by no means in the minority. People who want 120Hz+ are the minority. Most people see 120Hz+ displays in the sense of "having a Ferrari would be so nice, but I know I'll never get one", and such people don't feel like they're really missing out as a result. I'm pretty confident the vast majority of gamers don't have a mentality of "my gaming experience is terrible due to this 30/60Hz display". Obviously, nobody is against having higher refresh/frame rates, but only a select few games actually benefit from them (and, like I said before, only the elite players can actually take advantage of it).
     
    alanm and warlord like this.

  11. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
Exactly, that happens to me. I prefer a stable 30fps, console-like experience at absolute maximum settings in Forza Horizon 3, despite it being a racing game, instead of lowering settings to achieve 60fps or more. It is clearly to each his own. On the other hand, others would bash me for playing a racing game without ultra-fast pacing and tons of frames to feel the speed. :D
     
    alanm and schmidtbag like this.
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
It's a point of view, and one I can only agree with, as high refresh rate vs high resolution are just different ends of the same spectrum. I personally have trouble with 60fps even in an isometric game like Path of Exile, as there is a lot going on on screen; at 90fps it starts to be comfortable.
In MOBA games, 60fps is just fine for me.
In FPS games, I need 100fps+, optimally 120+. I can't watch 30fps even in strategy games.
The only game where I kind of tolerated 30fps was some racing game from Sony with quite nicely done motion blur. But even there, it felt like the input lag was too big.
     
  13. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
Couldn't agree more. I am tired of people talking about 4K60 like it's some gold standard. I use my 1080 Ti to drive a butter-smooth QHD 165Hz panel and I love it that way. Going back to 60Hz hurts my eyes. I don't really have anything against 4K, except that in big titles we are nowhere close to high settings at 120Hz+. Once I can rock 144Hz or so at 4K high settings in major titles, I will switch to a 4K panel. These new 4K 144Hz monitors are out of this world in price right now. It seems like rocking a Samsung TV with FreeSync is the best value, but I need a monitor like that at a more reasonable price. I'll probably rock my PG279Q for some years to come.
     
    wavetrex likes this.
  14. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
Man, I don't even like using a 60Hz monitor for normal desktop and application use anymore. I get on my other system with 60Hz and my eyes are like, wtf. The mouse looks like it's trailing compared to high refresh. I am a huge high-refresh advocate.
     
  15. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
The lower frame rate is still a bit more popular because it comes standard; you don't have to do anything to get there. I know a lot of people who aren't even familiar with what high refresh feels and looks like. Once you go high refresh, you certainly don't go back, not most people anyway. I would take lower visuals and high refresh over 60Hz and max details any day, especially when you start talking 27" QHD vs 27" 4K. The visual difference at that size is minimal, especially if you are playing the game and not just staring at the visuals. I guess everyone is going to have their own opinion, but some have opinions on something they have never used. I am not saying this is you.
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
Relatively speaking, the Forza series isn't even that fast-paced (compared to sci-fi racing games where vehicles regularly travel at hundreds to thousands of km/h), so 60FPS is pretty much as fast as that game really needs to go. I'm not saying people wouldn't notice a higher frame rate, but it wouldn't make enough of a difference. Games like Distance and Audiosurf 2 are probably the fastest-paced I play, and though I would definitely notice an improvement at 144Hz, I'm not bothered by the framerate, and I doubt my scores would improve enough to justify buying a more expensive display.

However... I'm confident that if I played these games on a 144Hz monitor and then went back to 60Hz, it would probably drive me nuts (in those particular games, not all games). This is one of those situations where "ignorance is bliss" rings especially true: if you're happy with the specs you've got, why spend more money on something you don't need?

    I agree with this, though my only gripe with wavetrex's comment is how it came across as so presumptuous and arrogant. Of course higher frame rates can result in a more pleasant experience, but like you said, sometimes a higher resolution is more important. It all depends on the game and personal taste. Such things shouldn't be treated so matter-of-factly.

    I don't disagree with any of this - like you said, it's all about opinion. When it comes to framerate, I definitely notice the difference. I have the money to buy a high-refresh rate 4K display and the GPUs to operate it, but I opt not to, because I simply don't care enough. For me personally, the money I could spend on that would give me more happiness elsewhere.

    It's important to consider this scenario, too:
Go hook up an old N64 to a CRT. At first, the experience is horrible on so many levels, even to the point where you might get a headache. The first thing you think is "how the hell did anybody ever think this looked acceptable?", but after about an hour, the lengthy list of problems suddenly isn't a nuisance anymore. What once was intolerable becomes acceptable. Modern hardware is no different. This is why I don't think the upgrade is worth it. However, I do plan on getting a 4K display, because I actually will have a practical benefit in doing so (there are some games where details are difficult to resolve, or look annoyingly blocky at a distance).
     
    alanm and warlord like this.
  17. slick3

    slick3 Guest

    Messages:
    1,867
    Likes Received:
    234
    GPU:
    RTX 2070 +85/1200
It's really about preference: some people prefer the higher pixel density, some prefer the buttery-smooth experience. I don't see myself ever going back to 60Hz. The difference is too noticeable for me to switch back.

    I don't mind if people agree or disagree, to each their own. I can 'understand' why people would want higher resolution, just as I'm sure it also goes the other way around.

What I don't accept, however, is when people make comments based on ignorance, i.e. people who have never experienced 144Hz arguing against it.
     
    yasamoka and wavetrex like this.
  18. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
I have never once seen a single comment, anywhere on any forum or blog, where someone said "Hey, I bought this 144Hz monitor and it totally sucks to play at 144fps! I'd rather have 4K @ 60Hz instead."

It is almost certain that all the comments about resolution being better than framerate come from people who, as you said, have never experienced it themselves (at home, not just a quick glance at somebody else's PC).
     
  19. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
If you have good eyes, you don't care. Not all games can fill all those Hz with enough FPS, so it's useless in many games; on the other hand, more pixel density and resolution works great in every game available to date.

The best is to have both resolution and Hz to enjoy, but if you have to choose between the two, quality comes first, then speed. The best compromise is the classic monitors, like 27" 1440p 16:9 at 144+Hz, or 30"-35" 1440p 21:9 at 100+Hz.

And I say it again: always better to have more pixels first, and then, if the horsepower exists, use the extra Hz. I would vomit if I had to play like some esports players on 1080p 24-27" 240Hz monitors. I cannot. Disgusting. Nobody deserves that trash before their eyes.
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
    Nobody in their right mind would complain about a 144Hz monitor, but I'm confident there are more people who would prefer a 4K @ 60Hz display than a 1080p @ 144Hz G-Sync display (of a similar price bracket). Whether or not they should is a different story, because again, it comes down to what you do and what your personal preferences are. For me personally, I would benefit more from my GPU power rendering more pixels than more frames.
Most of us here are not advocating that higher resolutions are [unanimously] better than higher framerates. There's a point of diminishing returns for both, and that point varies from person to person AND from workload to workload.

    I disagree with this statement equally as much as wavetrex's "The era of 60Hz is over. (Has been over for quite a while, but many people still refuse to acknowledge that for anything gaming, more frames is better than more pixels !)"
    C'mon people... nothing wrong with having a preference, but let's not undermine others'...
     
    Maddness likes this.
