Nvidia Ends 3D Vision And Mobile Kepler Support

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 11, 2019.

  1. XP-200

    XP-200 Ancient Guru

    Messages:
    6,412
    Likes Received:
    1,797
    GPU:
    MSI Radeon RX 6400
I occasionally stick in a 3D movie and watch it via the 3D projector, but only very occasionally. Nice on a big screen and all, but only when you can be bothered to charge up the glasses first. lol

3D gaming for me has been superseded by VR, although I wonder if these VR headsets can do 3D side-by-side; I need to look into that.
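For what it's worth, side-by-side 3D just packs the left- and right-eye views into the two halves of one frame, so any headset player that can split a frame can in principle show it. A toy sketch of the idea (illustrative only, not any particular player's API):

```python
# Side-by-side (SBS) 3D packs the left- and right-eye views into the two
# halves of a single frame; a player only has to split the frame and send
# each half to the matching eye.
# Toy frame: 4 rows x 8 columns, 'L' pixels in the left half, 'R' in the right.
frame = [["L"] * 4 + ["R"] * 4 for _ in range(4)]

def split_sbs(frame):
    """Split a side-by-side frame into (left_eye, right_eye) half-frames."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right

left, right = split_sbs(frame)
print(left[0])   # ['L', 'L', 'L', 'L']
print(right[0])  # ['R', 'R', 'R', 'R']
```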
     
  2. stereoman

    stereoman Master Guru

    Messages:
    887
    Likes Received:
    182
    GPU:
    Palit RTX 3080 GPRO
Yeah, it's good they are removing a feature as long as it doesn't affect you. What a joke. Seriously, this attitude is the reason why Nvidia doesn't give two shits about the consumer anymore, but if you head over to Nvidia's official forums you can see there are plenty of people who are not happy about this.
     
  3. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super

For a company to fail at nearly everything and yet hold a commanding, near-monopoly presence in the markets where it competes - that would be a tremendous achievement, if it were actually true.

    But the reality is slightly different:

    • Hardware T&L
    • CUDA
    • Optimus
    • MXM
    • PhysX
    • NVLINK
    • AI/Deep/Machine learning

Huge bets, all of them, and huge wins.
     
  4. H83

    H83 Ancient Guru

    Messages:
    5,515
    Likes Received:
    3,037
    GPU:
    XFX Black 6950XT
I love how companies promote certain features, telling buyers that those features are amazing and going to change everything, and then drop them after a certain time... The wonders of being an early adopter...

So, any bets on the next fad/feature that's going to disappear? VR, maybe?
     
    Last edited: Mar 11, 2019
    lucidus likes this.

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,045
    Likes Received:
    7,382
    GPU:
    GTX 1080ti
Sounds to me like Nvidia knows that Microsoft is expanding the DXGI Stereo API.
     
  6. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    G-Sync, RTX, HSA :D
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
Nvidia's superior designs are what keep them afloat, not the proprietary products (except for CUDA). That being said, of everything you mentioned: Optimus was good but underwhelming, I would not consider MXM a success, hardware PhysX was a long-term failure, NVLink is basically just next-gen SLI, and AI/deep/machine learning is just glorified mixed-precision FP/INT with some extra libraries to help out.
     
    Last edited: Mar 11, 2019
  8. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
So what are the alternatives that are better, more open, more successful, and with bigger market penetration than those mentioned:

CUDA, Optimus, MXM, PhysX

As for AI/deep/machine learning, Nvidia has only one strategy: full speed ahead. If they fail, they fail.

Compare that to AMD, which came late but eventually said: we're in. In reality, though, they did nothing.
Then later Raja said: we made a mistake, but now we're really in. Our competition has been bungling (??) (a total wtf moment).
No wonder they are still far behind.
     
  9. holler

    holler Master Guru

    Messages:
    228
    Likes Received:
    43
    GPU:
    Asrock 7900XTX Aqua
Not surprised; Nvidia is the EA of hardware companies. PhysX and SLI were both run into the ground: buy up competing tech to retain the Nvidipoly, stagnate the technology, then say "we are dropping the tech because no one uses it". No one uses it because they never iterated on it or attempted to improve it...
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
You're missing the point: it doesn't matter what the alternatives are if nobody prefers Nvidia's option either. I have already stated, more than once, that CUDA is the exception. That one is very successful, though even then, the likes of OpenCL are far from a failure.
What I'm talking about has nothing to do with AMD or existing alternatives. My point is that Nvidia hoarding technologies to themselves rarely pays off, again, with CUDA being the major exception.

It's also worth pointing out that there are AMD MXM GPUs, though I know very little about them.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
SLI was replaced by mGPU, and 3D Vision is being replaced by Microsoft's Stereo API. Why would Nvidia continue to support technologies that are being integrated directly into DX12 in a vendor/GPU-agnostic way?

They just released PhysX 4.0 with a bunch of new features a few months ago, and it's the default physics engine in Unity/Unreal, so idk what you're talking about with that one.
     
    alanm likes this.
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
I do, and more than a few. 120Hz (60 per eye) alternating dimming causes serious headaches for me. If I could get glasses capable of working with my 240Hz screen, I would.

But I would still prefer to go for a VR headset once they have Freesync and at least 100Hz.
     
  13. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
Well, look, previous generations like the HD 4000 and HD 6000 series went EOL at 4 and 5 years respectively. Do you really believe first-edition GCN will live for more than 8 years? Nah...

I expect all pre-RX GCN GPUs to go EOL in late 2019 / early 2020. That includes my 390X too.
     
  14. sideeffect

    sideeffect Master Guru

    Messages:
    326
    Likes Received:
    37
    GPU:
    FE 3070
I thought 3D Blu-ray playback in Windows depended on 3D Vision being installed for Nvidia cards. So will this mean removing that functionality?
     
  15. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
As of the 419.35 driver 3D is still there; not sure why Hilbert wrote 418.

    @Fox2232
    Glasses (to do 3D on a TV/monitor) all run at 60Hz, even if the screen has 240Hz.

    VR sets are different; so far I don't know anyone who would drop higher res/FOV for a higher refresh rate.
    I do have a problem with 60Hz flicker, be it a plasma or (active) 3D glasses, yet no problems with VR running at the same rate.
     

  16. robbo247

    robbo247 Guest

    Messages:
    67
    Likes Received:
    5
    GPU:
    5700XT Anniversary
I also have an SA950D, which has 4 different 3D modes, but they are hardware within the monitor. This isn't Nvidia 3D Vision; that requires extra hardware that doesn't come with the monitor itself. I do like the solid 120Hz when you turn off 3D, though.
     
  17. clamatac

    clamatac Active Member

    Messages:
    54
    Likes Received:
    10
    GPU:
    MSI GTX 1070 X
Disappointing.

    I do not use it a lot, but sometimes I use it for 3D movies in the living room.
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Yes, practically the same issue. When you perceive that dimming, it is a problem. The second problem is that one eye gets data for frame 1, then 8ms later the other eye gets data for the same frame.
    Then 8ms later the cycle repeats... creating temporal latency between the information for each eye. It is like having one eye in lag.
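The timing described above follows directly from how active-shutter glasses work; a rough back-of-the-envelope sketch (assuming a 120Hz panel, matching the ~8ms figure):

```python
# Active-shutter 3D alternates eyes on successive refreshes: on a 120Hz
# panel each eye only sees every other refresh, and the two eyes' views of
# the same frame are offset by one refresh interval (the ~8ms lag above).
panel_hz = 120
per_eye_hz = panel_hz / 2           # updates per second seen by each eye
eye_offset_ms = 1000 / panel_hz     # delay between left- and right-eye views

print(per_eye_hz)                # 60.0
print(round(eye_offset_ms, 1))   # 8.3
```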

VR, on the other hand, shows data to both eyes at the same time, which fixes both the dimming and the eye-to-eye lag. But I have an "issue" with image fluidity too. And the bigger the FoV, the higher the fps I need for the image to feel fluid. Otherwise it is just an annoying rapid slide show.

VR importance for me:
    - frame rate of 100Hz and Freesync, to be safe from stutter
    - black space between pixels must be close to non-existent
    - FoV is less important... and in reality is not hard or costly to achieve today
    - resolution is least important to me, as long as the previous point (no black grid) ensures that one does not clearly recognize pixels (1280x1440 per eye is plenty)
    - wireless is a nice-to-have

VR is about immersion, and low fps/stutter breaks it fast; seeing separate pixels due to the black space between them does that too. A slightly lower FoV is not that big of an issue; it's like looking through a helmet/visor of sorts. And once the device passes a certain resolution, the brain does the rest of the work and composes a stereoscopic 3D representation, which is actually perceived as higher resolution than looking twice at the same image.
     
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super

    Of course AMD GPUs can and they are using it. It's an industry standard developed by Nvidia.
    http://www.informit.com/articles/article.aspx?p=339056

    "I would not consider MXM a success"... lel

    "Optimus was good but underwhelming" ... ... eh.. what? ..."was"?

Hey, even if underwhelming or "not a success", at least we agree that CUDA, Optimus, MXM, and PhysX are the industry-leading solutions :)
    I'll give you this much: MXM was never one of the proprietary solutions you were ranting about originally.

    But IMHO, even if everything else had failed, it's hard to be against proprietary as a general rule when CUDA, which was their biggest bet, is still looking good.
    Small bets you can lose; don't lose the BIG ones.

    The general rule is more like:
    if it's disruptive enough and you can fend off the competition, go proprietary. If not, release it for everyone to use.
    Sure, it's more nuanced than that; for example, you can insist on your meh solution and bleed the competition, but I am armchairing this enough as it is, so I'll stop right there :D
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
Well, what's your definition of success? Because most laptops don't support it; I wouldn't be surprised if most laptops with dGPUs don't support it.
    I don't agree about PhysX, but for the rest, sure. Software PhysX is still commonly found, but there are many physics engines, and I'm not sure which is deemed the "industry leader". Off the top of my head, there's also Havok and Bullet. I'm not suggesting those are industry leaders, industry standards, or more popular, BTW. PhysX is just widely known because it's marketed often and you need to install some hefty libraries for it (which makes you more aware of it).
    Also, I should have clarified earlier that I was mostly talking about hardware-accelerated PhysX, so that's my mistake there. I don't deem PhysX in general to be a failure, just the hardware-accelerated part.
    Well, at least you seem to be somewhat paying attention to my point.
    Funny, in open-source forums I'm told I'm too pro-proprietary or pro-closed-source. Now here I am, being told proprietary can be good.
    Proprietary IP in and of itself isn't bad, so I agree with you that being against it as a general rule isn't a good idea. For things like NVLink/SLI, Optimus, or G-Sync, it hardly matters, because the technology depends on being hardware-specific, and if you don't like it, don't buy it. Considering Nvidia seems to be supporting Adaptive Sync, that shows their proprietary approach with G-Sync didn't pay off, but no harm was done to consumers in any meaningful way, so I don't really care. The other two are relatively niche (before you rebut that, remember we're on a tech forum here; we all have anecdotes about that tech, but we're the minority).
    What I'm against is stuff that is unnecessarily proprietary, which in turn becomes anti-competitive. This is the stuff where the saying "if you don't like it, don't buy it" doesn't work, because in some cases there either is no alternative or it is too costly to transition. CUDA is what I define as unnecessarily proprietary. Sure, it's more open now, but apparently it's pretty difficult to write drivers for, because it seems people would rather take the shortcut of writing a conversion layer to OpenCL/Vulkan than something at the hardware level. Frankly, I'm a little surprised AMD hasn't made more of an effort to write their own drivers, but I digress. The point is, stuff like CUDA is what skews the market in a way that hurts customers in the end.
    From a business/revenue perspective, I totally agree. In terms of a healthy market, I disagree.
     
