No more mobile GPU overclocking - has Nvidia gone insane?!

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by iaTa, Feb 12, 2015.

  1. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    Mineria, we are somewhat on the same page now. After reading this about VESA Adaptive-Sync, I believe it is the same thing Nvidia calls Adaptive under the Vertical Sync setting in the driver.
    I have to ask: do you know how the picture is projected onto the monitor raster and how vertical synchronization works? That's the normal kind, present in any type of screen, CRT (obsolete) or LCD/LED.
    Given that, there is no mystery about Adaptive V-Sync and what it does. G-Sync, however, is something completely different; that's why a monitor has to be G-Sync capable. It is an Nvidia thing, not a standard, which may also leave it hanging in the race.
    The thing is, G-Sync is GPU-based, unlike V-Sync or Adaptive V-Sync in nature. That's why any monitor can support Adaptive V-Sync: at the electrical level the signal is converted into digital impulses and finally back to analog, since the final product we see is analog.
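    To make what Adaptive V-Sync does concrete, here is a minimal sketch of the idea (a toy frame loop of my own, not NVidia's actual driver logic): keep v-sync on while the renderer holds the refresh rate, and drop it when fps falls below, trading a little tearing for less stutter.

    ```python
    # Toy model of Adaptive V-Sync: v-sync stays on while the renderer
    # keeps up with the refresh rate, and is disabled when fps drops
    # below it, so frame rate is not forced down to the next divisor
    # (e.g. 60 -> 30). Names are hypothetical; real drivers do this
    # inside the swap chain.

    REFRESH_HZ = 60.0

    def adaptive_vsync_enabled(recent_fps: float) -> bool:
        """Return True if v-sync should stay on for the next frame."""
        return recent_fps >= REFRESH_HZ

    def present(frame_time_s: float) -> str:
        fps = 1.0 / frame_time_s
        if adaptive_vsync_enabled(fps):
            return "swap on vblank (no tearing)"
        return "swap immediately (tearing possible, no stutter)"

    for ft in (1 / 75, 1 / 60, 1 / 45):  # 75, 60 and 45 fps frames
        print(f"{1 / ft:5.1f} fps -> {present(ft)}")
    ```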

    My primary monitor is my laptop's panel:
    Model LP173WF4-SPF1
    Manufacturer LG Display
    http://www.panelook.com/LP173WF4-SPF1_LG Display_17.3_LCM_overview_21056.html

    Looking at these specs, it is hard to determine whether it is G-Sync capable or not. Check that little question mark (?) under frequency.
     
    Last edited: Feb 14, 2015
  2. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    And I have the leaked G-Sync Nvidia drivers as well, but I never installed them, so I don't know for a fact. :p
     
  3. DLD

    DLD Master Guru

    Messages:
    896
    Likes Received:
    76
    GPU:
    MSI GTX1060 6GB
    ...or is it the real meaning of their latest announcement:

    Redefine Future of Gaming...?
     
  4. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,855
    Likes Received:
    442
    GPU:
    RTX 3080
    Haha, that's me you're referring to. The reason I manage such a high core overclock is that the 670MX runs at a really low stock frequency of 600MHz (OC'd to 1124MHz), so in a way the 670MX was artificially gimped at stock by NVidia. This card is an unusual case. It's not gonna fry the card: according to my extrapolations based on Kill-A-Watt readings, it's using less than 100W, which is fine for an MXM card, and it has the same power delivery circuitry as the 680M, which is a 100W card. I've been running with that 87% core overclock since summer 2013. It's the GK104 chip, the same piece of silicon used on the desktop GTX 680, albeit with just over 500 of its cores disabled, and it's only using 1.05V, which is not much compared to the 1.175V used by the GTX 680. At a max temperature of 69°C, I'm not gonna be frying it anytime soon.

    I also have a 240W PSU and never draw more than 170W when gaming, so I believe there's enough overhead there!
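    For what it's worth, the shape of that extrapolation is simple enough to write down; a rough sketch (the idle figure below is an illustrative placeholder, not a measured Kill-A-Watt reading):

    ```python
    # Rough power-overhead arithmetic for an overclocked laptop GPU.
    # The idle reading below is an illustrative placeholder, not measured.

    PSU_RATED_W = 240.0      # power brick rating
    wall_idle_w = 70.0       # hypothetical Kill-A-Watt reading at idle
    wall_gaming_w = 170.0    # worst draw observed while gaming

    # Everything drawn on top of idle is attributed to CPU + GPU load,
    # so the GPU's share is at most this delta (before PSU efficiency).
    load_delta_w = wall_gaming_w - wall_idle_w
    headroom_w = PSU_RATED_W - wall_gaming_w

    print(f"load delta: {load_delta_w:.0f} W (GPU draws at most this)")
    print(f"headroom:   {headroom_w:.0f} W left of the {PSU_RATED_W:.0f} W brick")
    ```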

    So, my card is an unusual situation given its artificially low stock clocks; laptop gamers are generally not running 87% core overclocks!
     
    Last edited: Feb 14, 2015

  5. iaTa

    iaTa Member

    Messages:
    31
    Likes Received:
    0
    GPU:
    NVIDIA GeForce GT 840M
    Actually, no. You have 3K and 4K panels in notebooks, and with consumer VR on the horizon every frame will count when you need a constant 75 or 90 fps.

    Having been an original backer of the DK1, I'm convinced VR is going to be massive. I was about to purchase a new Clevo P750ZM with a 980M, but I will now hold off until AMD show their hand in a few months.
     
  6. Murcer_Borg

    Murcer_Borg Guest

    Messages:
    567
    Likes Received:
    0
    GPU:
    msi RTX4090 LiquidX
    No, you don't have 4K panels. The best you can buy so far is native UHD, 3840 x 2160; there is no panel on the market that supports native 4096 x 2160. And if your fancy flat panel says 4096 x 2160, the resolution is actually 3840 x 2160.
    4K is used as a standard in the film industry, while UHD is the consumer format. The bad thing is that for marketing reasons they put "4K" on it, but it is really 3840 x 2160, which is still high enough; a 980 can handle it, not to mention 980s in SLI.
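    The two formats are easy to put side by side; a quick comparison in code (just the standard resolutions, nothing else assumed):

    ```python
    # DCI 4K (film standard) vs UHD (consumer "4K") in plain numbers.
    formats = {
        "DCI 4K": (4096, 2160),
        "UHD":    (3840, 2160),
    }
    for name, (w, h) in formats.items():
        print(f"{name:7s} {w} x {h} = {w * h / 1e6:.2f} MPix, "
              f"aspect {w / h:.3f}:1")
    # UHD is exactly 16:9 (1.778:1); DCI 4K is wider at ~1.896:1.
    ```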
     
  7. iaTa

    iaTa Member

    Messages:
    31
    Likes Received:
    0
    GPU:
    NVIDIA GeForce GT 840M
    Facepalm.

    http://en.wikipedia.org/wiki/4K_resolution

    And talk about missing the point of my post.
     
    Last edited: Feb 14, 2015
  8. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    The adaptive v-sync in the control panel is NVidia's thing, as I already wrote.
    More info here:
    http://www.geforce.com/hardware/technology/adaptive-vsync/technology

    DisplayPort Adaptive-Sync uses the same basic mechanism as NVidia's G-Sync; AMD's implementation of it is called FreeSync.
    You can read more about it, and about the differences, here:
    http://support.amd.com/en-us/kb-articles/Pages/freesync-faq.aspx
    I hope NVidia is going to support it, but for it to work we need monitors with DisplayPort 1.2a that actually have it implemented, since it is optional.

    How the GPU tells the display to fill the pixels by sending it a set of bits really doesn't matter; all that matters today is that we align full screen draws (frames) per second with refreshes (Hz) per second.
    Having the display adjust its frequency to however many fps the GPU sends it is the best way, since that reduces tearing, stuttering and input lag.
    If you check a bit of info regarding that driver, you will also notice that it runs G-Sync on top of VESA's DisplayPort Adaptive-Sync, since, as already mentioned, the mechanics are close to each other, with the exception of the frame buffer: G-Sync needs one, since the module polls and ACKs.
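    As a toy illustration of why letting the display follow the GPU wins (hypothetical timings, not the actual G-Sync or Adaptive-Sync handshake), compare when a frame gets shown under a fixed 60Hz scanout versus a variable refresh:

    ```python
    # Toy comparison: fixed 60 Hz scanout vs a display that refreshes
    # whenever a frame is ready (the variable-refresh idea behind
    # Adaptive-Sync / G-Sync). Timings are illustrative only.
    import math

    REFRESH_HZ = 60.0
    SCAN_PERIOD = 1.0 / REFRESH_HZ

    def fixed_refresh_latency(render_s: float) -> float:
        """Frame waits for the next vblank after it finishes rendering."""
        return math.ceil(render_s / SCAN_PERIOD) * SCAN_PERIOD

    def variable_refresh_latency(render_s: float) -> float:
        """Display starts a refresh as soon as the frame is ready."""
        return render_s

    for render_ms in (12.0, 18.0, 25.0):
        r = render_ms / 1000.0
        print(f"render {render_ms:4.1f} ms -> "
              f"fixed: shown at {fixed_refresh_latency(r) * 1000:5.1f} ms, "
              f"VRR: shown at {variable_refresh_latency(r) * 1000:5.1f} ms")
    ```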

    No idea regarding your laptop's display; try comparing it to the ASUS ROG panel it works on.
     
  9. fry178

    fry178 Ancient Guru

    Messages:
    2,067
    Likes Received:
    377
    GPU:
    Aorus 2080S WB
    Yes, there are native 4K (4096 x 2160, roughly 19:10) screens available, but mainly monitors or laptop screens, and only a handful of CinemaScope TVs.
    It just doesn't make sense for TVs, since most BDs and broadcasts are still 16:9.
    And I haven't seen any 4K TV showing anything but 3840-wide resolution on the screen...
    UHD vs. 4K has nothing to do with whether it's consumer or not; it's based on the resolution of the screen.
     
    Last edited: Feb 14, 2015
  10. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    This one was showcased in 2012:
    http://www.eizo.se/default.aspx?page=11&product=FDH3601

    $15,000, lol
     

  11. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,855
    Likes Received:
    442
    GPU:
    RTX 3080
  12. Ac30

    Ac30 Member

    Messages:
    25
    Likes Received:
    0
    GPU:
    860m GTX
    Nvidia isn't going to change ****, and all of you saying that Nvidia has every right are flat-out wrong when laptops were sold with OCing advertised as a feature. That's crap any way you look at it. The dream is dead. Let's hope AMD comes out with some killer mobile chips and wipes the floor with Nvidia; they could use some competition for once.
     
  13. WontonNoodle

    WontonNoodle Active Member

    Messages:
    68
    Likes Received:
    7
    GPU:
    Nvidia GTX 3070M
    You can't overclock without external software, which shows warning messages when you use it, as well. But OK, keep arguing against this case for whatever reason.
     
  14. alexrose1uk

    alexrose1uk Guest

    Messages:
    81
    Likes Received:
    1
    GPU:
    3080 10GB
    I've signed the petition. Overclocking is not a right, but this feature has been available for years, and frankly, to my mind it's up to the user to decide how to use their hardware.
    Most of the people buying insanely powerful laptops are going to be power users or enthusiasts, and this is going to upset many of them, so it seems a strange move.

    I can only see it having happened because of pressure from a massive OEM like Dell, or to try and force people to upgrade sooner.

    I'm also not sure how this will affect companies like ASUS and MSI, who actually provide uprated cooling systems and software, and market overclocking as a feature on some of their machines. If someone has bought a machine with that feature, and the feature has now been removed... that's dubious, I think.

    This coming out right after the 970 VRAM debacle and the G-Sync outage seems very dubious and, frankly, a stupid move. Why annoy a decent percentage of the high-end enthusiasts who buy the high-end mobile chips?
     
    Last edited: Feb 15, 2015
  15. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    OK, whatever. Plus, if you look at my other post, I said that if people want to OC their laptop's GPU, let them. That is not arguing against this case.
     
    Last edited: Feb 15, 2015

  16. fry178

    fry178 Ancient Guru

    Messages:
    2,067
    Likes Received:
    377
    GPU:
    Aorus 2080S WB
    @WontonNoodle
    There is a difference between having a popup warning and the user actually reading it.

    More than 60% of the infected PCs I've fixed so far have toolbars/software installed that is at least partially responsible for the infection (ad/malware), and most of the time the installer actually says so, or at least has a box you can uncheck to prevent it.

    That doesn't mean anyone actually read it.

    Please show me more than 5 non-geek users who FULLY read the COMPLETE disclaimer of EVERYTHING they install.
    Right....
     
    Last edited: Feb 15, 2015
  17. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,855
    Likes Received:
    442
    GPU:
    RTX 3080
    Cheers for signing the petition; the number is steadily growing! Good points too, NVidia has been dropping some big ones lately!
     
  18. AC_Avatar100400

    AC_Avatar100400 Guest

    Messages:
    340
    Likes Received:
    0
    GPU:
    WC GTX 780
    After eating 2 bags of corn chips and reading through this thread and the one on TechPowerUp, I can only conclude that IT enthusiasts are morons who don't care unless it affects them.

    Back on topic. That statement about it being a bug is a lie. Nvidia lied: it's not a bug, and if it were, it's crazy how a bug could exist for 5 years and be advertised by MSI, Asus and Clevo as a feature.

    Fact is, on a well designed laptop overclocking is fine and only gives a performance boost for maybe a slightly higher temp. If a laptop can't handle a small overclock, then its cooling is garbage and will fail eventually anyway.

    The fact is Nvidia lied, or they are incompetent beyond belief, which seems very unlikely since this has been here for 5 years and was advertised by their largest partners as a feature.

    Let the flaming and derps come out of the woodwork :)
     
  19. WontonNoodle

    WontonNoodle Active Member

    Messages:
    68
    Likes Received:
    7
    GPU:
    Nvidia GTX 3070M
    Why would a non-geek install overclocking software in the first place? The same argument could be made for desktops in your case. (Nvidia laptop cards limit the allowable overclock, so you can't OC far enough into dangerous territory without forcing P-states etc., which is even more advanced, and which non-geeks would never know about.)
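    (As a trivial sketch of that kind of driver-side cap, with a made-up +135 MHz limit purely for illustration:)

    ```python
    # Toy sketch of a driver-side overclock cap: requested core offsets
    # are clamped to a fixed window. The +/-135 MHz value is made up
    # here purely for illustration, not Nvidia's actual limit.

    MAX_OFFSET_MHZ = 135

    def clamp_core_offset(requested_mhz: int) -> int:
        """Clamp a requested core-clock offset to the allowed window."""
        return max(-MAX_OFFSET_MHZ, min(MAX_OFFSET_MHZ, requested_mhz))

    for req in (50, 135, 300):
        print(f"requested +{req} MHz -> applied +{clamp_core_offset(req)} MHz")
    ```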
     
  20. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    We'd be missing a lot of nice things if we always worked under the logic 'what if an idiot breaks something with it?'.
     
