Event / Tech Coverage: AMD Capsaicin 2017 - Vega - Threadripper

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 30, 2017.

  1. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
I am not in the least disappointed, really. I got my CPU 3 months ago and it has been running great. A Threadripper build would no doubt have cost more, and would have had higher memory bandwidth, but otherwise it should be roughly the same, so I am all fine.
     
  2. AlmondMan

    AlmondMan Maha Guru

    Messages:
    1,037
    Likes Received:
    345
    GPU:
    7900 XT Reference
What's different from a few years ago? Other than that we had one generation or so of less power-hungry mid-spec cards?
     
  3. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,535
    Likes Received:
    2,975
    GPU:
    RX 6750XT/ MAC M1
This, plus the AM4 socket will be around for a while; as a result you can upgrade to a new CPU down the road without switching to a new mobo.
     
  4. labidas

    labidas Guest

    Messages:
    328
    Likes Received:
    119
    GPU:
    R9 290 / RX5700XT
    I'm buying Intel.
    Someone needs to support the competition too :(
     

  5. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,015
    Likes Received:
    4,392
    GPU:
    Asrock 7700XT
HBM isn't the problem; I'm betting the GPU itself is the bottleneck. GCN isn't aging so well - Vega has more serious problems than memory bandwidth and overclockability. Don't forget - Nvidia is also interested in HBM, so whatever negative speculation you have about it clearly isn't true. It isn't often Nvidia so obviously follows in the footsteps of a competitor.

    Maybe 1800X users (who didn't get the sale price) are disappointed, but definitely not 1700 users. Ryzen 7 is for a completely different market. You get fewer memory channels, but from what I recall, frequency has a bigger performance impact. You get fewer PCIe lanes, but most people only want 1 GPU, and any other expansion cards can easily be handled by the chipset. You get more M.2 slots, but most people are better off with just 1 (for performance) with a separate SATA drive (for mass storage). Ryzen 7s also have the advantage of more heatsink availability and ITX form factors. 1900X is currently limited to full ATX with almost no coolers to choose from.

    So all that being said, the only people who would've regretted their purchase are 1800X owners who wanted a little bit extra but settled for less.
     
  6. thesebastian

    thesebastian Member Guru

    Messages:
    173
    Likes Received:
    53
    GPU:
    RX 6800 Waterblock
As long as the technology keeps advancing as fast as possible, I don't really care about this. Too many years of the CPU market being stuck with tiny yearly performance bumps...
0% exciting!
     
  7. reix2x

    reix2x Master Guru

    Messages:
    717
    Likes Received:
    246
    GPU:
    HIS 4870 1GB
Maybe I'm wrong, but I've been seeing something strange here - could this be a gaming forum? Because I see people crying because Ryzen is 5 FPS slower than the 7700, crying because they don't want 16 cores. Come on guys, think about the hardware, about the future - do you want the same tech over and over again?
     
  8. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Vast majority of users on here are gamers first.
     
  9. rm082e

    rm082e Master Guru

    Messages:
    717
    Likes Received:
    259
    GPU:
    3080 - QHD@165hz
    So wait a minute, I'm seeing conflicting information here. Is the 1080 equivalent card going to be $499, or $599?
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,015
    Likes Received:
    4,392
    GPU:
    Asrock 7700XT
Sort of - bragging rights over numbers are the top priority. A good gaming experience is distinctly the #2 priority for most people here. I've seen many forums (here included) where people actively and willingly choose higher frame rates over reducing microstutter. People will also willingly pay an extra $150 for an additional 5-15 FPS. Usually nobody even glances at average minimum frame rates.
     
    Last edited: Jul 31, 2017

  11. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
The $599 price is for a "Radeon Pack", which includes the faster air-cooled RX Vega and a $100 voucher toward an 1800X + X370 mobo, as I understand it. The GPU alone is $499.
     
  12. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
It's funny you say that, as most sites that show 1% and 0.1% numbers showed that Ryzen was still at a disadvantage compared to the 7700. It's just users (most of whom have never used a 7700) claiming smoother gameplay than the 7700. I look at the facts presented to me in reputable reviews and base my decision on that. I don't have the expendable cash to buy one of each system and test them back to back.
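For what it's worth, the 1% and 0.1% "low" figures those sites quote are just percentile statistics over recorded frame times: take the slowest slice of frames and average the FPS over only those. A minimal sketch of that arithmetic (the function name and sample numbers are illustrative, not any review site's actual methodology):

```python
def percentile_low_fps(frame_times_ms, percent):
    """Average FPS over the slowest `percent`% of frames."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * percent / 100))    # size of the worst slice
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# A mostly-smooth ~60 FPS capture with ten 50 ms stutter spikes:
times = [16.7] * 990 + [50.0] * 10
print(round(1000.0 / (sum(times) / len(times))))  # average FPS -> 59
print(round(percentile_low_fps(times, 1.0)))      # 1% low FPS  -> 20
```

This is why the metric matters: the average barely moves, while the 1% low exposes the stutter the average hides.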
     
  13. rm082e

    rm082e Master Guru

    Messages:
    717
    Likes Received:
    259
    GPU:
    3080 - QHD@165hz
    Ah, okay. That sounds like an excuse to put most of your cards out in packs...
     
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,015
    Likes Received:
    4,392
    GPU:
    Asrock 7700XT
    I said nothing about which product is better or in what way, I'm just saying people tend to ignore what gives the best experience and prefer numbers over their own perception. And no, I'm not the type who says "the human eye can only see 30FPS" but people will pay more or push their hardware to unnecessary limits for just a number, a number they won't perceive over a competitor or the next notch down. If someone wants to spend the extra money and deal with extra heat without any noticeable difference, go ahead - not my problem. But I personally prefer practicality over a petty set of numbers.

    EDIT:
    Obviously, if a more expensive or faster product has a difference you can perceive and that difference is a problem, then that product is worth getting. In other words, people should buy what suits their needs, not their ego. That being said, even if you want to go the Intel route, an overclocked i5 can and will handle most games (including 5+ thread games) at 60FPS just fine, but people will still buy the i7 anyway, because of numbers.
     
    Last edited: Jul 31, 2017
  15. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    The cards will also be available alone except for the water cooled version.
     

  16. rm082e

    rm082e Master Guru

    Messages:
    717
    Likes Received:
    259
    GPU:
    3080 - QHD@165hz
Yeah, but my question is: in what quantity? Are 20% of the available cards going to be put into bundles, or 80%? If there aren't enough standalone cards available for the consumers who want to buy them, then some consumers will have to choose between nothing and a bundle. I just wonder if that's what they're trying to engineer.
     
  17. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
    Yes, the actor is Christian Bale...
     
  18. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
The big advantage of HBM2 compared to GDDR5 is power consumption. Big nod to HBM2, from what I've read so far. Additionally, it will be interesting to see whether the Vega GPU engine can actually utilize all of that bandwidth. I think at this point the power savings alone are reason enough to use it.

From HH's article: "The graphics engineers from AMD claimed that HBM2 will offer you 5x the power efficiency compared to any other graphics memory including GDDR5, and yes that is huge." We shall see...

I don't quite know what to think of all of these bundles right out of the gate, however. AMD is being extremely aggressive this year - I'm wondering if Vega will be powerful enough to do 4K in most cases at a decent frame rate. If so, then the bundles ought to lead to massive sales - especially for the $399 non-bundle variant. We shall see... Looking forward to HH's hands-on review!
     
  19. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
I agree completely. I sort of hate what 3dfx did in starting the frame-rate craze many years back... but back then we were talking average GLIDE frame rates of 20-30 fps against competitors' cards on less mature APIs sometimes maxing out as low as ~5 fps, so it made a lot more sense than it does now - it marked the difference between "playable" frame rates and slide shows. Intel's discrete GPUs of those days (the infamous i7xx 3d cards that Intel later dropped because they couldn't compete with 3dfx/nVidia - I owned two of them) and the ATI Rage Fury 128 (I believe that was the brand name; I had a very difficult time getting >10 fps out of that card, so I returned it) were in that boat. The Matrox Millennium was *the* card to own for 2d gaming, but sucked at D3D gaming in terms of frame rates and image quality. 3dfx whipped everyone for several years in terms of playable frame rates. Etc.

Today, in a blind test most people couldn't see the difference between 120 fps and 80 fps (imagine a test where both displays were running at 80 fps but the end user was asked to point out the display running at 120 fps... ;) lots of people would "find" it when it wasn't even there!), but nevertheless we have benchmark bar charts actually illustrating differences of ~1 fps between GPUs and CPUs - which is certainly not worth talking about at all. 1 fps differences simply are not real and would never be perceived by end users.

Lately, I'm enjoying AMD's new "Enhanced Sync" option for the RX 480 and up, introduced last week in the 17.7.2 WHQL Crimson drivers. The idea is that you set the game to run with vsync off but set the Crimson profile for the game to Enhanced Sync, and the game should then run very close to full-bore vsync-off rates but with little to no tearing. It's also supposed to help in games whose engines limit the frame rate to 30 fps or less. It's become my new default and seems to work as advertised - smooth gaming with no stutter and almost no tearing. It's much better than plain vsync off, imo.
     
  20. Amx85

    Amx85 Master Guru

    Messages:
    335
    Likes Received:
    10
    GPU:
    MSI R7-260X2GD5/OC
Only the Vega 56 seems reasonable, and only if it outperforms the GTX 1070. I still don't understand why the Vega 64 draws a lot more power just for 10~15% more performance - naaa. AMD, add more ROPs please; tessellation isn't everything...
     