No more mobile GPU overclocking - has Nvidia gone insane?!

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by iaTa, Feb 12, 2015.

  1. Yecnot

    Yecnot Guest

    Messages:
    857
    Likes Received:
    0
    GPU:
    RTX 3080Ti
    Haha, this would never happen though?
     
  2. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    Here's one: I bet you would buy my GPUs if they were half the price of the best GPU to date, performed at twice the speed, AND overclocked to be over nine thousand percent faster!
    Pointless hypotheticals. Your attempt at making a point falls flat.
     
  3. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    I can say this: a laptop with an Nvidia or AMD mobile GPU, overclocked or not, is a heck of a lot better than a laptop with Intel HD graphics.
     
  4. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
    @Cyberdyne
    Didn't know you had a crystal ball and could predict the future...


    The chance that AMD will never mess up anything in the next few years, leaving nobody to say "I will never buy an AMD GPU again...", is almost zero.
    So where is the hypothetical in that?

    At that point, the same people who are complaining about NV right now will do what? Not buy AMD (with the problems), AND not buy NVidia because they swore to never buy one again?
    To my knowledge, that leaves only Intel.

    One thing I've learned in life: never say "never again"...
     

  5. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Even if both companies do mess up from time to time, I still think it's worth complaining, pointing out the faults, causing an uproar about it, etc. Ignoring it is boring and defeatist; it's possible these companies care about our opinions, and we might be able to influence them in some way (even if it's small!). Here's this darn petition again!:
    https://www.change.org/p/nvidia-re-...geforce-equipped-notebooks?recruiter=20925400

    Negative press is not wanted by either NVidia or AMD, since it results in a loss of sales for them, so let's go ahead & complain! These discussions also help consumers choose the best products.
     
  6. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    I can only speak from my personal experience regarding both companies and OCing.
    First of all, OCing doesn't bring anything, and I fully support both companies on this. It is the proper approach: with the processing power at hand from the CPU and GPU, especially in high-end systems, desktop or mobile, OCing doesn't bring anything, because these systems can already run the most demanding games at HD quality with all the tech behind them at a steady 60FPS, which is enough. These systems can even play at resolutions beyond HD on multiple monitors with almost no performance loss. Looking at my own system, for example, I'm completely covered for the next 3-4 years. You might be laughing now, but I'm talking from experience and knowledge.
    GPU designs today are divided into budget, mid-range, and high-end, and each of them will do its job according to its specifications. Investing in a high-end system will give you a longer life span of high-end gaming than budget or mid-range, but you also pay a good amount of money for it, and that is justified.
    OCing is for nothing but showing off: squeezing the maximum out of the hardware beyond its rated tolerance and exposing the product and system to unnecessary risk. It will not help you play a game at a higher resolution or frame rate, or whatever the goal is; you will still experience frame drops when the load comes and the hardware simply isn't designed to cope with that amount of load. It exists solely so people can post 3DMark scores and think how cool they are with that synthetic number, although they never mention that the driver was set to performance mode, AA was disabled, and everything was minimized just to squeeze out some score. Stupid, in my point of view.
    When I test my system, and I do it only to see if it's fit and set up properly, all my driver settings are set to MAXIMUM QUALITY and VSYNC is always ON. Nvidia recently introduced a beautiful technology behind VSYNC called Adaptive, so I use that one and it works amazingly well.
    Why VSYNC ON? "Oh, you limit your GFX so it doesn't render 250FPS when it can???" And that's exactly what it should do. First of all, you can't benefit from 250FPS rendered on a screen that can only refresh and raster 60FPS. Did you ever ask yourself where the rest goes? Why should my GPU render frames that can never be projected and shown? Why should my GPU spend resources rendering 250FPS of the same scene when it could render it at 60FPS and skip the processing of unnecessary data, only to satisfy a synthetic benchmark greed that will never be high enough? I hope NVIDIA and ATI will put a hardware LOCK on VSYNC according to the monitor's refresh standard, so that it can't be overridden by any software manipulation.
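    To make that concrete, here's a quick back-of-the-envelope sketch (a simplified model I'm adding for illustration; it assumes a fixed 60Hz panel and a steady render rate):

    ```python
    # Rough illustration: frames rendered beyond the refresh rate are never shown.
    REFRESH_HZ = 60  # what a typical panel can actually present per second

    def wasted_fps(rendered_fps: float, refresh_hz: float = REFRESH_HZ) -> float:
        """Frames per second that are rendered but never displayed (simplified)."""
        return max(0.0, rendered_fps - refresh_hz)

    for fps in (60, 120, 250):
        print(f"{fps:>3} fps rendered -> {wasted_fps(fps):.0f} fps of work thrown away")
    ```

    At 250FPS on a 60Hz screen, roughly 190 frames of work per second are discarded.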
    The lower the difference between minimum fps and maximum fps, the smoother the game will be, and that's a fact.
    You will notice the difference more when you drop from 250fps to 60 or 80 than from 60 to 15 or 20. On my current system I haven't seen a drop below 45, so it stays between 45 and 60, usually around 55-58, in the most demanding games with everything maximized at HD resolution and the driver set to maximum quality. That's the REAL power and beauty; synthetic benchmarks and scores are not.
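    One way to put numbers on the smoothness claim is frame times (a sketch with illustrative figures, not measurements):

    ```python
    # Frame-time view of fps swings: the spread in milliseconds per frame between
    # the minimum and maximum fps is a rough proxy for perceived smoothness.
    def frame_time_ms(fps: float) -> float:
        """Convert frames per second to milliseconds per frame."""
        return 1000.0 / fps

    scenarios = {"capped 45-60 fps": (60, 45), "uncapped 60-250 fps": (250, 60)}
    for name, (fps_max, fps_min) in scenarios.items():
        spread = frame_time_ms(fps_min) - frame_time_ms(fps_max)
        print(f"{name}: frame-time spread {spread:.1f} ms")
    ```

    The capped 45-60 range swings by about 5.6ms per frame, while an uncapped 60-250 range swings by about 12.7ms.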
    I don't play benchmarks, I play games. I want to enjoy my games at the highest standards of visual quality; that's what I pay my money for, and both NVIDIA and ATI provide it in high-end systems.
    Unfortunately, like everything else, hardware has a life span, and when it reaches its end it's done. Nothing can prevent this, and no OCing can really extend it.
    My ALIENWARE M17 R2 supported me for 4 years, allowing me to enjoy maximum quality the whole time. That came to an end with COD: Advanced Warfare, and while I knew I could still play around with lower settings and lower resolutions, no OCing could make it better, so it was time for a change. That's reality, guys, like it or not. From experience, OCing brings more risk and harm than good. :pc1::pc1:
    Let the flaming begin :D
     
    Last edited: Feb 14, 2015
  7. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Haha, "let the flaming begin" - well, it deserves a reaction, but your wall of text was very hard to read given its hatred of the paragraph!

    Overclocking does bring substantial performance gains. For example, I've got an 87% overclock on the core and a 64% overclock on the VRAM, so I get roughly a 70-87% increase in frame rate depending on the game, and temperatures don't go over 69°C. Obviously, overclocking can make a substantial difference to in-game performance. (I also had a Dell M1530 with an 8600M GT that I overclocked by about 45% on the core and 80% on the shader.) None of these systems have failed on me. For NVidia to remove overclocking after 5 years of allowing it through their official drivers is a disastrous move in my eyes.
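    As a rough sanity check on those numbers, here's a crude model (my own sketch, not a measurement: it assumes fps scales with the core clock when core-bound and with the memory clock when bandwidth-bound, with real games landing somewhere in between):

    ```python
    # Crude fps-gain estimate from an overclock: interpolate between the memory
    # overclock (fully bandwidth-bound) and the core overclock (fully core-bound).
    def estimated_fps_gain(core_oc: float, mem_oc: float, core_bound: float) -> float:
        """core_oc/mem_oc as fractions (0.87 = +87%); core_bound weight in [0, 1]."""
        return core_bound * core_oc + (1.0 - core_bound) * mem_oc

    for weight in (0.0, 0.5, 1.0):
        gain = estimated_fps_gain(0.87, 0.64, weight)
        print(f"core-bound weight {weight:.1f}: ~{gain * 100:.0f}% faster")
    ```

    Depending on where a game sits between the two limits, that predicts a 64-87% gain, which lines up with the 70-87% I actually see.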

    Even more modest overclocks of 10-25% are worth it, because they mean you can perhaps flip on an extra setting or two while maintaining a solid 60fps. Plus, it's fun to overclock, and it's good to get the maximum value from your product. This is a positive thing that reflects well on the manufacturer - it gives people a good feeling towards the brand, so it's in NVidia's interest to bring this feature back. They're decreasing brand loyalty and product value by disabling overclocking - they're shooting themselves in the foot!

    (The other stuff you mentioned in your post was largely irrelevant to the topic, so not worth commenting on specifically).
     
    Last edited: Feb 14, 2015
  8. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    @Murcer_Borg
    OC does bring something with new games.
    If your minimum framerate is low at a desired IQ, you can lower the IQ, bring the framerate up with a slight OC, or do both.

    NVidia's technology is called G-Sync; Adaptive-Sync is VESA's technology, which AMD will use for FreeSync.
    Those are better for gaming, since the display dynamically adjusts its refresh rate according to how many fps the card throws at it.
    But they have their downsides: G-Sync, for example, doesn't work too well when fps drops below 30.
    Plus, the dynamic part doesn't step through refresh rates in small increments; it jumps between them instead.
    Still, it beats V-Sync technologies when above 30fps.

    Adaptive V-Sync is crap in most circumstances; I find plain V-Sync to work much better for most games. I'd rather drop IQ a notch on things like shadows and AO to keep a steady 60fps.
     
  9. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    Well, I didn't expect anything less... However, I'm sure in time there will be modified drivers, and people who want to OC will be able to.
    I'm happy for your OC success... truly :) I see an Alienware brother here :D My oldie was powered by dual Radeon 5870s. It is still an amazing machine, though.
     
  10. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    Actually, VESA set the standard for Adaptive V-Sync, and Nvidia adopted and offered it. I have it selected on my 980, and honestly I don't see it as being as bad as you described. As I understand Adaptive V-Sync, it basically engages only when the frame rate climbs above the monitor's refresh rate, and disengages once it falls below that mark. Doing this, it reduces the tearing/stuttering that appears when there is no V-Sync, and reduces the lag you get when V-Sync is permanently enabled. However, I can't say exactly what that lag is about, because I've never been able to identify it; on both my systems V-SYNC is always set to ON from the driver, but my Nvidia-based system is set to Adaptive V-Sync.
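    The behavior I'm describing boils down to something like this (a minimal sketch of the decision logic as I understand it, not NVidia's actual implementation):

    ```python
    # Sketch of Adaptive V-Sync as described above: wait for vblank (no tearing)
    # only while the GPU can keep up with the display; otherwise run unsynced to
    # avoid the fps halving that classic V-Sync causes below the refresh rate.
    def vsync_this_frame(current_fps: float, refresh_hz: float) -> bool:
        """Enable V-Sync for the next frame only if fps meets the refresh rate."""
        return current_fps >= refresh_hz

    for fps in (75, 61, 59, 40):
        state = "ON (capped, no tearing)" if vsync_this_frame(fps, 60) else "OFF (may tear)"
        print(f"{fps} fps on a 60 Hz panel: V-Sync {state}")
    ```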
    G-Sync is a completely different story. The first two technologies are raster and video-signal based; G-Sync is GPU based and synchronizes the GPU's rendering with the monitor's refresh, which in theory should provide the smoothest visual appearance. So far I don't see this option, and I have the latest that Nvidia has put on the market. I don't believe my 980 and monitor are not G-Sync compatible; I believe Nvidia is still testing this feature and will release it later in the driver. So Adaptive V-Sync is a good crossover technology. Works for me.
     

  11. newbuser

    newbuser New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    8800gtx
    Maybe I am being very old here, but I recall the days when you could not just overclock through a nice GUI; it was not supported out of the box by the vendors, and you had to resort to all sorts of tricks to get it working at all (anyone remember pencil marks on their AMD CPUs?).

    This is all like watching rich kids throw their golden GI Joes out of their cot. Who buys a car and then complains "OMFGWTF! It can't go 80% over its manufacturer-rated speed!"? I know overclocking today is relatively trivial and "safe", but a manufacturer has the right to produce something that runs at a certain speed, and the right to not expect it to run faster than that. We could go 50/50 with the vendors and say we get chips we can overclock by a random, unknown amount, and they get to implement a variable warranty :p

    Imagine having to honor a warranty for a product with end users running it upwards of 80% faster than spec. And while there are disclaimers about overclocking and the damage it CAN cause, there is little to no way a manufacturer can look at a dead chip and say for sure it's a result of overclocking (unless a modded BIOS was used, etc.). It would look the same as a chip run at stock speed with bad cooling, or simply a faulty chip.

    Does anyone complain that they cannot overclock their non-K Intel chips? Seems they should :p

    As I offer no solution but am just adding remarks, I might add, as a loooooong-time lurker and very rare poster: what the actual **** has happened here? All I see lately is complaining, sniping, whining, fanboyism, and accusations of fanboyism left, right, and center.

    It should not be about that; I should not be wrong just because I have a certain GPU or CPU. Did nobody see the Star Trek episode "Let That Be Your Last Battlefield"? :p
    It USED to be about numbers, specs, and technologies that translated into performance and a better PC experience.

    Also, with people this clued in to both R&D and marketing, why do we not see more CEOs of Fortune 500 tech companies posting here?

    It's like a complaining disease is spreading! Just look at me! :eek: :D

    As for the issue, did anyone consider simply not upgrading the driver?
     
  12. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    Oh yeah, I remember those techniques; the best one, however, was modding the R8500 by actually replacing resistors ;)
     
  13. snip3r_3

    snip3r_3 Guest

    Messages:
    2,981
    Likes Received:
    0
    GPU:
    1070
    Honestly, the reason may stem from the fact that most OEMs are requesting this block. We may, however, get specially customized drivers in the future for gaming machines. Why?

    Because while many of you think the GPU is just fine as long as temperatures are under control, what about the power regulation? What about your power brick? A lot of these new machines, particularly NON-GAMING/WORKSTATION laptops, are designed to work right at their rated TDP. What happens if you overclock them? The VRMs in laptops don't usually have temperature sensors, meaning that for the vast majority of laptops and dGPU-equipped ultrabooks, overclocking is very risky; given their volume, their typical cost-over-quality component selection, their thinness, and their typically poor cooling, they would account for the majority of returns. Most manufacturers would still honor the warranty, because testing the silicon for overclocking damage would likely cost more in the end than just replacing the boards, and refusing would be a PR/customer-relations disaster. This also means manufacturers would suffer bigger losses, since most boards in non-gaming laptops these days are soldered together (CPU+RAM+GPU): if the GPU/VRM gets damaged, you have to replace everything together.

    Don't forget about the power brick too, those things aren't exactly overbuilt these days, especially with the emphasis on size and weight.

    With all of these factors together, it makes sense from a manufacturer and supplier point of view. Obviously it sucks as a consumer, but I won't be too surprised. I had a laptop with a 640M; overclocking was always limited to around +135, and for anything more you had to mod the driver with a 3rd-party patch. Thermals were already bad in the first place, so I never used it unless I really wanted to play some games while away from my desktop, or while the desktop was down for RMAs. I'm sure, however, that someone will eventually find a way to bypass the block; it will still cut down on the number of "casual" overclockers who really don't know what they are doing. I think overclocking should exist, but it should be capped relatively low (10%) and allowed more on machines with the proper headroom (thermals + OEM customization).
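    The cap I have in mind could look something like this (a purely hypothetical policy sketch, not any vendor's actual scheme; the 10%/25% figures and the OEM flag are made up for illustration):

    ```python
    # Hypothetical overclock-offset clamp in the spirit of the +135 MHz Kepler cap:
    # a small offset everywhere, a bigger one only where the OEM vouches for the
    # thermal headroom.
    def allowed_offset_mhz(base_clock_mhz: float, oem_headroom_flag: bool) -> float:
        """Maximum permitted core-clock offset for this machine."""
        cap_fraction = 0.25 if oem_headroom_flag else 0.10
        return base_clock_mhz * cap_fraction

    print(allowed_offset_mhz(1000.0, oem_headroom_flag=False))  # 100.0 MHz
    print(allowed_offset_mhz(1000.0, oem_headroom_flag=True))   # 250.0 MHz
    ```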
     
  14. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    I think overclocking without increasing voltage is a very safe practice. For a start, you're generally not going to get massive overclocks without increasing voltage, so your consumption in watts is not going to increase greatly either (meaning there's not a lot more stress on the VRMs, power brick, etc.). As you know, Kepler was limited to +135MHz; I don't think there's much risk in that, and the GPUs have temperature and TDP limit controls in place anyway, so there's not really much scope for damage through standard overclocking methods. I still believe NVidia should bring back this level of control to the consumer.
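    The standard first-order dynamic power model backs this up: power scales roughly linearly with frequency but with the square of voltage, so a frequency-only overclock is comparatively gentle (a sketch with illustrative numbers, not measurements from any specific card):

    ```python
    # First-order CMOS dynamic power: P ~ C * V^2 * f. A clock-only overclock
    # raises power linearly; raising voltage as well compounds it quadratically.
    def relative_power(freq_scale: float, volt_scale: float) -> float:
        """Power draw relative to stock for given frequency/voltage multipliers."""
        return freq_scale * volt_scale ** 2

    print(f"+10% clock, stock voltage: {relative_power(1.10, 1.00):.2f}x power")
    print(f"+10% clock, +10% voltage:  {relative_power(1.10, 1.10):.2f}x power")
    print(f"+30% clock, +15% voltage:  {relative_power(1.30, 1.15):.2f}x power")
    ```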
     
    Last edited: Feb 14, 2015
  15. yobooh

    yobooh Guest

    Messages:
    260
    Likes Received:
    15
    GPU:
    Gigabyte 970 G1
    A good OC will not change your hardware, but lets you use it at its maximum. If in a game you reach 50-60 fps, with OC you could reach 60-70 and enjoy a much better experience with V-Sync.
    My 660 Ti overclocked reaches the GTX 670. In general, an overclocked card can reach the next card up in the lineup at stock: that means more for less money!
     

  16. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
    @Robbo9999
    Oh, I'm fine with complaining about NV. My rant was about not-so-smart people saying "I will never buy NV again"
    or users saying "it was always possible...".
     
  17. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    Which is something I never said. I feel like I'm pointing out the obvious, but anyone who does say that is upset about a feature they had that was taken away, lol. The pointless hypothetical of "I bet you would buy from them again if they had the best GPU ever for a dollar!" means nothing.

    Now things have gotten ridiculous. People are posting fake scientific findings in favor of NVidia's decision. The mere fact that NV laptop users have been overclocking their GPUs easily for a good 5 years is just glossed over, and now suddenly it's physically impossible because of the BS I just laid out. Suuure.

    We've even got people in favor of blocking all overclocking in general! People go to some crazy lengths to defend a company they like.
     
  18. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    To be honest, I never felt the need to OC my older Alienware M17 R2 with its 2x 5870s; it wasn't available from the driver and Catalyst suite from the day I bought the system. It was possible with some hacks, registry messing, and so on; like I said, the community will find a way. Is it really necessary? I don't know, it should be a personal decision. All I know is that for whoever has a GTX 980 or any other high-end card, in any type of configuration, OCing isn't really necessary for high-end gaming at all. It is only for synthetic show-off and the "mine is bigger than yours" thing... BS!
     
  19. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Do you have a reference for that?
    I cannot find anything pointing to Adaptive V-Sync being a VESA standard; everything points to NVidia.
    Are you sure you aren't confusing it with Adaptive-Sync?
    http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

    Adaptive-Sync and G-Sync tell the display to switch its refresh rate to match the current fps; the refresh rates come in set intervals, for G-Sync for example 60-85-100-120-144Hz for 3D.
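    If it really does step between fixed rates like that, the switching logic would amount to something like this (my sketch of the stepped behavior described above; whether the G-Sync module actually works this way isn't established here):

    ```python
    # Stepped refresh-rate selection: hop to the lowest supported rate that can
    # still present every rendered frame, falling back to the maximum rate.
    RATES_HZ = (60, 85, 100, 120, 144)  # the intervals quoted above

    def pick_refresh(fps: float, rates=RATES_HZ) -> int:
        for rate in sorted(rates):
            if rate >= fps:
                return rate
        return max(rates)

    for fps in (58, 90, 132, 150):
        print(f"{fps} fps -> display switches to {pick_refresh(fps)} Hz")
    ```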

    V-Sync and Adaptive V-Sync limit fps to match the display's refresh rate; in addition, the adaptive version disables itself when fps is below that refresh rate to eliminate the stuttering.
    So I guess you understood that correctly.

    The problem I experience with Adaptive V-Sync is that it still gives tearing in quite a lot of games, so I use plain V-Sync instead; I get no stuttering when using it anyway.

    I think the future will be displays that can dynamically switch frequency in much smaller steps when the GPU tells them to, especially in the lower frequency range.

    Regarding your monitor, what brand and model is it?
    I highly doubt that it can switch refresh rates dynamically.
    In addition, the G-Sync module on G-Sync enabled displays has a frame buffer to support situations where fps drops below 30; without it you would get the same issues that are reported when using the test driver that enables G-Sync for certain laptop models (the displays in those are built to VESA's DP1.2a standard, but they have no frame buffer).
     
  20. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Overclocking should not be blocked, but neither should it be easily available on OEM systems that will crash and burn; then again, who buys a low-end or business laptop with an NVidia or AMD GPU to overclock it anyway?
    The laptops labeled as gaming laptops with extreme cooling systems can take a lot of OC; the OEMs know that and even sell them as overclockable. There is no reason whatsoever for NVidia to make a blanket move and block laptop overclocking in general.

    It would be nice to get a more detailed statement from NVidia regarding this matter.
     
