PS4 not worth the cost, says Nvidia

Discussion in 'Frontpage news' started by HonoredShadow, Mar 14, 2013.

  1. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,537
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
    You'll probably have some memory reserved for the OS only; the Wii U uses 2 GB, so let's go from there.

    So you have around 6 GB of memory, shared between the GPU and CPU. Remember that games need quite a lot of memory for handling geometry, scenes, game logic and so on before passing the data to the GPU to render it all. So I guess it will be split roughly 50/50 between the GPU and CPU.

    The PS4's GDDR5 memory is pretty fine for a GPU: 176 GB/s of bandwidth. Not bleeding edge, but faster than most mid-range hardware out there. For a CPU it's absurdly fast, BUT (big but in there): x86 CPUs work with <80 ns memory latency, and I've seen GDDR5 latency numbers going anywhere from 100 ns to 1000 ns, which is way too slow for the CPU.

    Moving data around in small batches is going to be expensive as hell, so devs will need to go the extra mile to pass things in big chunks (which will be fast, given the bandwidth) and make as few memory accesses as possible. A rough sketch of the arithmetic is below.
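
    A back-of-the-envelope sketch of the batching argument. The 176 GB/s figure is from the post above; the 200 ns latency is an assumed value picked from the 100-1000 ns range mentioned, so treat the absolute numbers as illustrative only:

    [code]
    // Why batch size matters: each access pays latency once, then
    // streams at full bandwidth. Many small accesses are latency-bound;
    // one big chunk is bandwidth-bound.
    #include <cstdio>

    int main() {
        const double bandwidth = 176e9;   // bytes/s (176 GB/s, from the post)
        const double latency   = 200e-9;  // 200 ns per access (assumed)
        const double total     = 64e6;    // move 64 MB in total

        // Case 1: a million 64-byte (cache-line sized) accesses.
        double small = 1e6 * (latency + 64.0 / bandwidth);
        // Case 2: one 64 MB chunk, paying the latency only once.
        double big   = latency + total / bandwidth;

        printf("1M x 64 B accesses: %.2f ms\n", small * 1e3); // ~200 ms, latency-bound
        printf("1 x 64 MB chunk:    %.2f ms\n", big * 1e3);   // ~0.36 ms, bandwidth-bound
        return 0;
    }
    [/code]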

    One thing that would give pretty big performance improvements is the GPU and CPU accessing the same memory. One of the caveats of CPU-to-GPU work is that passing data to the GPU is relatively slow, so you need to organize things well and batch your draw calls accordingly.

    If the CPU can access the same memory that the GPU is using, you don't need to pass the data to the GPU anymore! It's all right there. The CPU grabs that chunk of memory, does what it needs to do, and leaves it for the GPU to grab and render, without going back and forth over an intermediary bus (PCIe in desktops). That would reduce drawing costs and GPU/CPU interop costs, and you'd even save memory, since you don't need to keep the same data resident on both the GPU and CPU at the same time.
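
    A minimal sketch of the two data flows. gpu_upload(), gpu_map_shared(), submit_draw() and animate() are hypothetical stand-ins (stubbed out so the example compiles), not any real console or driver API; the point is only where the copy disappears:

    [code]
    #include <cstddef>
    #include <cstdlib>
    #include <cstring>

    struct Vertex { float x, y, z; };

    // --- Hypothetical stand-ins for a GPU API (not a real library) ---
    void* gpu_upload(const void* src, size_t bytes) {  // models a PCIe copy
        void* dst = std::malloc(bytes);
        std::memcpy(dst, src, bytes);
        return dst;
    }
    void* gpu_map_shared(size_t bytes) {               // models one shared pool
        return std::calloc(1, bytes);
    }
    void submit_draw(const void*, size_t) {}           // models a draw call
    void animate(Vertex* v, size_t n) {                // CPU-side update
        for (size_t i = 0; i < n; ++i) v[i].y += 0.1f;
    }

    // Discrete desktop GPU: the CPU writes, then the data crosses PCIe,
    // and two copies of it stay resident.
    void draw_discrete(Vertex* cpu_buf, size_t n) {
        animate(cpu_buf, n);
        void* gpu_buf = gpu_upload(cpu_buf, n * sizeof(Vertex)); // the slow part
        submit_draw(gpu_buf, n);
        std::free(gpu_buf);
    }

    // Unified memory (PS4-style): one allocation, no copy, no second buffer.
    void draw_unified(Vertex* shared, size_t n) {
        animate(shared, n);     // CPU writes the bytes in place...
        submit_draw(shared, n); // ...and the GPU reads those same bytes
    }

    int main() {
        const size_t n = 1024;
        Vertex* cpu_buf = (Vertex*)std::calloc(n, sizeof(Vertex));
        draw_discrete(cpu_buf, n);   // old path: extra copy every frame
        Vertex* shared = (Vertex*)gpu_map_shared(n * sizeof(Vertex));
        draw_unified(shared, n);     // unified path: zero copies
        std::free(cpu_buf);
        std::free(shared);
        return 0;
    }
    [/code]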
     
  2. freeZ

    freeZ Master Guru

    Messages:
    658
    Likes Received:
    0
    GPU:
    MSI GTX 670 PE
    Don't get me wrong, I use both myself (and always have). For HTPC purposes, I always go red: nice cards that have zero issues with HDMI passthrough for sound. Nvidia has this as well, but there are issues. For gaming, though, that's another story (from my personal experience), although that was probably more a matter of software issues than hardware.
     
  3. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Jaguar is supposed to be either 32 or 22 nm, I can't remember which one for sure, but the TDP shouldn't be so high, and I'm sure the GPU will be about the same.
     
  4. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
    This. Low TDP (for a greener environment) doesn't allow a lot of OC.

    I hope that consoles this time will ship with working power profiles: high performance / normal (default) / low consumption. I can't believe they would sacrifice an inch of performance from demanding users just to save electricity...
     

  5. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
    You guys also forget that AMD is streets ahead when it comes to its APUs.
    Integrated GPU and CPU. nVidia just doesn't have the tech at the moment to offer a complete solution.
     
  6. CronoGraal

    CronoGraal Ancient Guru

    Messages:
    4,194
    Likes Received:
    20
    GPU:
    XFX 6900XT Merc 319
    Went from Nvidia to AMD and had no problems with either. You guys are just being fanboys, and it's pathetic. They're both just graphics card brands; stop taking it so seriously.

    The article makes Nvidia seem a bit salty, but I doubt they care.

    Bring on the next-gen consoles, I wanna see what the console-exclusive games will look like. I'd also say what they'll play like, but you'd probably need to go to Nintendo if you want real gameplay in your games.
     
  7. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Low TDP is more for heat than power consumption. On the GPU side, I would expect a downclocked equivalent of the 7870 rather than an OC'd 7850.
     
  8. Stukov

    Stukov Ancient Guru

    Messages:
    4,899
    Likes Received:
    0
    GPU:
    6970/4870X2 (both dead)
    No. Jaguar is, and it's the successor to Bobcat (the one similar to the Atom) with a larger front end. It also has higher IPC (at lower clocks) than BD/PD/SR. If you look at some of the sheets you will see that Jaguar and SR have almost nothing in common in terms of design. I'd have to check, but they were made independently of each other by separate teams.
     
  9. Stukov

    Stukov Ancient Guru

    Messages:
    4,899
    Likes Received:
    0
    GPU:
    6970/4870X2 (both dead)
    28 nm bulk, at the current time.
     
  10. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    Basically, the Jaguar architecture should make its way into "Kabini" (4 cores, LP) and "Temash" (2 cores, ULP)... so basically here you end up with a modified design of 2x Kabini (how they are arranged, I don't know; is it one big die with 8 Jaguar cores or 2x separate "modules"? I don't know...)

    More here (sorry, it's in French): http://www.hardware.fr/focus/71/amd-devoile-steamroller-jaguar.html
     
    Last edited: Mar 15, 2013

  11. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    In that case, someone needs to correct the Wiki article I was reading, as it showed Jaguar as being a low-power branch of Piledriver (not Steamroller; I can't keep AMD's construction-equipment processors straight...)
     
  12. IPlayNaked

    IPlayNaked Banned

    Messages:
    6,555
    Likes Received:
    0
    GPU:
    XFire 7950 1200/1850
    Here:

    http://forwardthinking.pcmag.com/none/308360-amd-details-jaguar-preps-chip-for-playstation-4

    and here:

    http://www.tomshardware.com/news/APU-Jaguar-PlayStation-Kabini-Temash,21229.html

    You can see Jaguar is an evolution of Bobcat. While it no doubt shares some design philosophies with Piledriver, it isn't very similar to it. An 8-core Piledriver in a system like this would definitely be a bit much, especially for the power cost.
     
  13. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    At least by choosing Jaguar over Steamroller, Sony has been able to keep the power draw low. That was also their reasoning for keeping an in-order execution processor when they designed the Cell processor with IBM and Toshiba for the PS3. Now that they have an out-of-order execution processor, it will help in terms of performance, as the pipeline doesn't have to be as deep. Then again, since they're two very distinct architectures, it's not an apples-to-apples comparison.

    If we were comparing the Intel Atom's in-order execution to AMD's "Jaguar", then yeah, we would have a much better comparison: "Jaguar" will blow the Atom away. Though the Atom would use less power, due to having a much smaller prediction window, since it's an in-order execution processor.
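
    A toy illustration of the in-order vs. out-of-order difference (a deliberately simplified model, not a claim about either chip). Both loops below do the same number of memory loads; in the first, each load depends on the previous one, so no core can overlap them, while in the second the loads are independent, which is exactly the kind of work an out-of-order core can run ahead on while a simple in-order core stalls:

    [code]
    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <random>
    #include <vector>

    int main() {
        const size_t N = 1 << 20;
        // A random permutation of 0..N-1 to defeat the prefetcher.
        std::vector<size_t> next(N);
        std::iota(next.begin(), next.end(), size_t{0});
        std::shuffle(next.begin(), next.end(), std::mt19937(42));

        // Dependent loads (pointer chase): each index comes from the
        // previous load, so there is never independent work to run ahead on.
        size_t idx = 0, sum_dep = 0;
        for (size_t i = 0; i < N; ++i) { idx = next[idx]; sum_dep += idx; }

        // Independent loads: all addresses are known up front, so an
        // out-of-order core can keep many loads in flight at once.
        size_t sum_ind = 0;
        for (size_t i = 0; i < N; ++i) sum_ind += next[i];

        printf("%zu %zu\n", sum_dep, sum_ind); // print so the work isn't optimized away
        return 0;
    }
    [/code]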

    deltatux
     
  14. nakquada

    nakquada Guest

    Messages:
    352
    Likes Received:
    0
    GPU:
    Gigabyte GTX 1080 FE
    I don't see how there's a problem with nVidia's handling of the situation. I totally understand their perspective; it's not greedy at all. It's a clever strategy for nVidia not to overstretch their boundaries and resources. AMD seems to be the greedy one, grabbing every console GPU they can.
     
  15. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    I don't know, but I hope Nvidia doesn't make the wrong decisions like 3dfx did, trying to do everything themselves and bombing.

    I mean, there are millions and millions of console-only gamers, and Sony, Microsoft and Nintendo are the only names they know. Do you know how many console fans have never heard of Nvidia and never knew it made the PS3's GPU?

    Way more than you may think. I really think the Nvidia Shield won't last very long, as people cling to the console makers for portable gaming, i.e. Nintendo (3DS/DS) and Sony (PSP/Vita).

    I mean, I can understand if Nvidia doesn't feel it would be profitable, but if they want their name to be known even more, they have to invest a little, not just take profit; do a little of both. Make it more widely known that Nvidia is in the PS-whatever or Xbox-whatever.

    AMD sure is making a big deal about being in the Xbox/PS4, and they sure are making it known, and that will spread their name to the millions of console users who don't even know what an AMD is (again, you'd be surprised how many everyday console users don't even know what the CPU and GPU in their system actually are). All they see is the name Sony or Microsoft and think they did it all.
     
    Last edited: Mar 15, 2013

  16. Darkest

    Darkest Guest

    Messages:
    10,097
    Likes Received:
    116
    GPU:
    3060ti Vision OC V2
    They either cling to console makers or use portable devices they already own, such as tablets and smartphones. I honestly don't see this "Shield" as having much of a market, if any.

    Regardless, whatever was or was not said by Nvidia, the best choice for Sony this round (considering the move to x86) was to go with AMD. AMD can deliver on both fronts; Nvidia can't. I'm pretty sure that's what it really boils down to.
     
  17. H83

    H83 Ancient Guru

    Messages:
    5,515
    Likes Received:
    3,037
    GPU:
    XFX Black 6950XT
    I think Nvidia is right to skip selling chips for consoles this time around, for two reasons:

    First, because they don't have an APU like AMD, and making one from scratch for consoles would be too time- and resource-consuming;

    Second, because key accounts like this, Sony and MS, operate on extremely thin margins, sometimes lower than 5%, and come with several constraints dictated by the buyers, making this type of business not very profitable...

    And let's not forget that Nvidia is being backed into a tight spot right now, because they don't have an x86 license to make their own APU and because the discrete graphics market will probably disappear in the near future, so Nvidia has to concentrate its resources on Tegra and pray they succeed!...

    In the end, I think Nvidia didn't have any other choice but to do what they did.
     
  18. KCjoker

    KCjoker Guest

    Messages:
    2,470
    Likes Received:
    0
    GPU:
    EVGA GTX 460/w LG 24 LCD
    You sound exactly like them, except as a fanboy for AMD.
     
  19. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    NVidia doesn't have an x86 license, and since they have an ARM license, there's no reason for Intel to license x86 to them.

    We're also only assuming that the NVidia "spokesperson" is telling the truth... though his statement suggests he knows nothing about it at all.

    If Sony had decided at the beginning that they wanted to go x86, it would have made little, if any, sense to approach NVidia for graphics unless they already had an arrangement with Intel to design a processor. Why would you go to NVidia for graphics and then to AMD for a processor? Especially when AMD is building APUs for the mobile, entry-level, budget and mainstream markets. It's also generally more cost-effective to get both products from the same source... which makes AMD the best option.
     
  20. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    Does seem a bit strange that Nvidia wouldn't have wanted in on the PS4. Time will tell if the PS4 and the new Xbox will be that much of a step up from their predecessors.

    Sold my PS3 a while back and can't see myself getting a PS4 unless there's a definite improvement in the visuals, i.e. at least some AA in there (4x). Some of the pre-release clips do look good though...
     
