Listed in etail: AMD Ryzen 9 3900XT, Ryzen 7 3800XT, and Ryzen 5 3600XT

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 1, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,325
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
  2. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    4.8GHz boost? So expect a reported clock of 4.4-4.6GHz depending on your setup, and an actual clock of 4.2-4.4GHz. No, I don't consider 1 in 10,000 setups pulling a 4.8GHz reported clock for just enough nanoseconds to be recorded in HWiNFO sufficient grounds for AMD to be allowed to keep straight-up lying.
     
  3. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    I think they learned their lesson; if they write 4.8 after that media backlash, it will probably be 4.8.
     
    wavetrex likes this.
  4. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,535
    Likes Received:
    2,974
    GPU:
    RX 6750XT/ MAC M1
    Very impressive - especially for those who plan to make a new all-rounder build for gaming. The 3600X seems like a very compelling choice for that now.
     

  5. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Since AMD does not cheat on their TDP values, the question is:
    Are those chips going to have a higher TDP, or is the 200~300MHz gain in base clock achieved through power efficiency?
     
  6. Undying

    Undying Ancient Guru

    Messages:
    25,206
    Likes Received:
    12,611
    GPU:
    XFX RX6800XT 16GB
    It costs about as much as a 3700X/3800X, so idk how compelling that really is.
     
  7. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,813
    Likes Received:
    2,396
    GPU:
    GB 4090 Gaming OC
    I understand that they're releasing this product line to counter Intel, but they'll do that anyway with the release of the 4000 series. Existing 3000-series owners will probably not feel the need to upgrade since they'd rather wait for the 4000 series.
     
  8. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Existing 3000-series users will probably not upgrade anyway, not even to the 5000 series.
    Unless they just like upgrading, in which case they'll buy everything, including the XT versions.

    The point of the 3600XT is not price/value; for that you already have the 3600 and 3600X. If you really want those few extra fps you're missing, here are the extra MHz to get them. I suppose it's only that.
     
  9. Chess

    Chess Guest

    Messages:
    390
    Likes Received:
    57
    GPU:
    ASUS GTX1080Ti Stri
    I was actually thinking just this a few days ago.
    We could use two lines of mainstream CPUs: one with increased frequency for the high-FPS gaming type, one with increased cores and cache for the general usage type.
    I'm not too sad about this. Though we all hope games will use more and more cores in the future, I believe they won't make use of more than six nowadays? So why not up the frequency then?
    Choice is gud ^^.
     
    Ricardo likes this.
  10. Venix

    Venix Ancient Guru

    Messages:
    3,428
    Likes Received:
    1,939
    GPU:
    Rtx 4070 super
    3600XT: if this is the price it is going to sell for, it is a really bad deal versus the 3600/X and Intel's 10400.
     

  11. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    Exaggerating much? People were reporting, at the absolute worst, 100MHz lower than advertised clocks, not 5~10% lower, and most were getting 25~50MHz short of it. All of this before some BIOS updates that definitely improved clocks further and brought them even closer.

    So yeah, they lied, but not nearly as badly as you say.
    I feel like they're releasing these so that they can have bragging rights in the gaming space. Those clocks will definitely put them near/above 10th-gen Intel in many cases. I doubt they can beat an overclocked 10900K/10700K/10600K, but for everything else, they just might be plain faster.
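    A quick bit of arithmetic behind the "100MHz, not 5~10%" point (a minimal sketch; using the 3900X's advertised 4.6GHz boost as the reference is my assumption for the example):

    advertised_mhz = 4600  # assumed reference: 3900X advertised single-core boost
    for shortfall_mhz in (25, 50, 100):
        pct = 100 * shortfall_mhz / advertised_mhz
        print(f"{shortfall_mhz} MHz short of {advertised_mhz} MHz = {pct:.1f}%")
    # -> 25 MHz = 0.5%, 50 MHz = 1.1%, 100 MHz = 2.2%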
     
  12. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    The 10400 is a bad deal all around because you can't overclock it (however, ASRock and ASUS are going to change that, but who knows how long that will last even if it works properly). The only time the 10400 was worth it was if you paired it with expensive RAM or did some very heavy RAM tinkering in the BIOS, and not many people want to waste time on that.
    Also, with them releasing these CPUs, you may get a better deal on the CPUs they are replacing.
     
    SplashDown likes this.
  13. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Having two products, one that is a good deal and one that is as fast as they can offer, is great.
    As long as people have as much choice as possible.
     
  14. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    If anyone needs to go above what Intel delivers today, it is actually a good idea to wait for Zen 3.

    As far as clocks go, I run 4.45GHz & 4.35GHz on the CCDs for daily use/gaming. And if I want to do some longer crunching, I switch to 4GHz on both CCDs. (Where the CPU just sits at 60~62°C.)

    I could not care less about single-core boost claims before and I can't care less today. If AMD declared a 4.8GHz boost with 4 cores under load, I would see it in a different light.

    Workloads that fit on just one core do not need that much computational power. And in most cases such combinations of workloads are still better off spread across multiple cores, and the user would not notice any difference if the CPU clock were much lower. In many situations, the CPU can clock at 2GHz and that single-threaded workload will run just fine.

    I for sure have a better desktop/browsing/media experience with a 240Hz screen, even with the CPU downclocked to 2GHz, than someone who uses a 60Hz screen with a 5GHz CPU.
    (That's not meant to downplay the importance of framerate in games, which is often limited by the CPU at 1080p. But chasing low single-digit improvements from each CPU generation is such a waste.)
     
    Ricardo likes this.
  15. H83

    H83 Ancient Guru

    Messages:
    5,443
    Likes Received:
    2,982
    GPU:
    XFX Black 6950XT
    Those CPUs look interesting because of the higher clocks and I'm curious to see how they perform, but the prices are too high.
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    I still don't quite understand why these are real products. The 3600XT I kinda get, but aren't there still people out there with a 3800X and 3900X who can't achieve the max clocks? Aren't there still motherboards out there that don't have the latest AGESA? I understand these are supposed to be a response to Intel's 10th gen but it doesn't matter if the CPUs don't run at their rated speeds.

    AMD does cheat their TDP, they're just a lot less ridiculous about it than Intel.
     
  17. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I have not had in hand one AMD CPU that would intentionally cheat. Hell, the A10-7870K refused to ignore its 95W TDP. If I could have gotten it to use 125~135W, it would have made a nice gaming micro PC back in the day.
    The Ryzen 2400G boosted within the limits of its TDP. And I built a micro media PC with the stock cooler and limited it to 45W to give the VRMs on the motherboard an easy time.
    There is always a chance that measurements are not exact, but many of them are done by components on the motherboard, not by the CPU itself. And those confirm AMD's values within the margin of error.

    You have a Ryzen CPU, have fun in the BIOS. Limit its PPT to 25W :D
    It will do its best to boost within the given limit.
    When I tested this with a 2700X, I took the PPT down to 15W, and even there the chip did not start to cheat. It simply reported that it was eating 110% of the limit (10% above it).
    Because its lowest auto clock was 600MHz and its lowest voltage 0.8V. (Tested under load in CB R15.)
    And I made this note for the 15W PPT: "(idle ~80-85% PPT 2.2-4.3GHz)"

    So the 2700X reported its idle power draw as some 12W. Which is pretty close given that EDC was 12A at 0.8V.

    I have to admit that I did not do the same experiment with the 3900X, which seems to be able to clock lower and at lower voltage too.
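    To put those numbers side by side (a small sketch using only the figures quoted above, nothing measured by me):

    ppt_limit_w = 15.0                  # PPT limit set in BIOS
    idle_fraction = (0.80, 0.85)        # "idle ~80-85% PPT" from the note above
    edc_amps, vcore = 12.0, 0.8         # quoted EDC and lowest auto voltage

    reported_idle_w = [f * ppt_limit_w for f in idle_fraction]
    floor_w = edc_amps * vcore          # crude current * voltage estimate

    print(f"reported idle draw: {reported_idle_w[0]:.1f}-{reported_idle_w[1]:.1f} W")
    print(f"EDC * Vcore estimate: {floor_w:.1f} W")
    # -> roughly 12.0-12.8 W reported vs ~9.6 W estimated, i.e. "pretty close"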
     
  18. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    It's not 100MHz less. A 3900X will hold 4.5GHz at best for single-core turbo, for a second, then plummet back down. And those are not really the clocks either. Want to see what they actually are? Run Ryzen Master, or, you know, scroll down in HWiNFO and look at the effective clocks. Hitting 4.5GHz or 4.6GHz as the reported clock for a literal millisecond and having it recorded is meaningless when it's never actually running anywhere near that.
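    For anyone curious what an "effective clock" actually averages, here is a minimal sketch of the idea on Linux (assumes root and the msr kernel module loaded; HWiNFO and Ryzen Master have their own internal methods, this is just the generic APERF/MPERF approach, which averages the real clock instead of sampling the requested one):

    import struct, time

    IA32_MPERF = 0xE7   # ticks at a fixed reference rate while the core is active
    IA32_APERF = 0xE8   # ticks at the actual core clock while the core is active

    def read_msr(cpu, reg):
        with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
            f.seek(reg)
            return struct.unpack("<Q", f.read(8))[0]

    def effective_mhz(cpu=0, base_mhz=3800.0, interval=1.0):
        # average clock over 'interval' seconds = base clock * dAPERF / dMPERF;
        # base_mhz is a placeholder (3800 = 3900X base clock), adjust per CPU
        a0, m0 = read_msr(cpu, IA32_APERF), read_msr(cpu, IA32_MPERF)
        time.sleep(interval)
        a1, m1 = read_msr(cpu, IA32_APERF), read_msr(cpu, IA32_MPERF)
        return base_mhz * (a1 - a0) / (m1 - m0)

    print(f"core 0 average clock over 1s: {effective_mhz():.0f} MHz")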
     
    Last edited: Jun 1, 2020
  19. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    The 3900XT looks kinda good if it boosts to 4.8GHz. The other two make little sense at those prices.

    The base 3600 is around 190 EUR; I wouldn't pay 150 EUR more for 500 more MHz.
     
  20. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,445
    Likes Received:
    2,539
    GPU:
    TUF 6800XT OC
    The 3600XT price is dumb; nobody in their right mind will pay that.

    The 3800XT price is even dumber, considering you can get a 3900X today for less than that and get 50% more cores that are pretty much equally fast (within margin of error).

    --
    The 3900XT price, however, is somewhat appealing: more speed and potentially lower power consumption for just €50 extra. I'll take it!


    BUT,
    I think these are simply placeholders, and these will NOT be the prices of the actual products.

    It might even be a joke from someone at that store...
     
