GTX1070 + GTX1650 (NOT in SLI) for HDMI 2.1 VRR (LG OLED)

Discussion in 'Videocards - NVIDIA GeForce' started by janos666, Nov 30, 2019.

  1. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    So, I managed to sell my LG C8 at a great price and ordered a C9 on a Black Friday deal (to be delivered sometime next week), not solely but mainly for G-Sync compatibility (the corresponding LG firmware hasn't landed in the EU just yet, but I am fine with installing pre-release firmware if necessary, and the official release shouldn't take much longer either).

    But I looked around and my options on the VGA front are very grim. The good old GTX1070 costs next to nothing on the second-hand market (compared to its old retail price, or even to the closest-performing RTX variants), while the not-so-new RTX2070 still has a very hefty retail price (with no great deals on used ones either).
    And I prefer factory-installed full-cover water blocks, which makes both resale and second-hand sourcing even more complicated and costly (there isn't a good second-hand market for these watered variants).

    But then I stopped to think a bit and figured: why not just buy the cheapest HDMI VRR compatible card I can find (retail or second-hand, and a simple air-cooled variant, since it won't be put under actual 3D load; I can even underclock it as much as possible and try running it at 0% fan speed most of the time: semi-passive air cooling with the stock fan)?
    This would cost significantly less than swapping the GTX1070 for an RTX2070, and it would be much easier to sell the cheap GTX16xx card when full-bandwidth HDMI 2.1 cards hit the market (at which point I will probably be eager to buy one of those next-gen cards and ditch the then actually outdated GTX1070 for something significantly faster as well... hopefully...).

    So, did anybody try this yet?
    I know some people managed to use AMD FreeSync while an nVidia card did the 3D rendering in games. This setup should hit even fewer potential manufacturer soft-locks because it's all-nVidia.
    But it would be nice to hear from someone if it really works before I buy a GTX1650. (And even if I decide to be the first [I have a long history of being the first with these kinds of setups, dating back to the AMD+nVidia PhysX days, LOL], I thought it wouldn't be a bad idea to share this idea with others. I am probably not alone with a similar problem, BF-newcomer OLED buyers included. Haha.)
     
    Last edited: Nov 30, 2019
    ultra5711 likes this.
  2. ultra5711

    ultra5711 Guest

    Messages:
    317
    Likes Received:
    6
    GPU:
    GTX Titan on Agua
    Very interested in hearing how this turns out.

    I have a Titan V I want to use on my C9, and Nvidia hasn't released HDMI VRR support for Volta.
     
  3. southamptonfc

    southamptonfc Ancient Guru

    Messages:
    2,626
    Likes Received:
    654
    GPU:
    Zotac 4090 OC
    I'm probably missing something; I don't understand what having two cards would do for you. FWIW, neither the GTX1650 nor the GTX1070 will provide acceptable framerates at 4K, even with the lowest detail settings in the latest games.

    Nice screen btw! Definitely can see something like that in my future once all you early adopters have ironed out all the niggles :)
     
    Last edited: Nov 30, 2019
  4. ultra5711

    ultra5711 Guest

    Messages:
    317
    Likes Received:
    6
    GPU:
    GTX Titan on Agua
    Render on the 1070 and output VRR via HDMI on the 16xx card. Should work in theory, but who knows what the drivers will do.
     

  5. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    You don't have to go 2160p120. In fact you can't (well, unless you also buy a DP->HDMI converter, drop to YCC 4:2:0 and miss out on G-Sync VRR). Even if you wanted to, the HDMI port on both of these cards is limited to 2160p60 or 1440p120. But with Turing you get a 40-120 VRR G-Sync range (or 40-60 in the case of 2160p60), which is better than a fixed 60 or fixed 120 with Pascal. I am currently using Fast Sync, which is a little stuttery for my taste (and V-Sync is too laggy).
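    Rough math on why those ports top out there (a back-of-the-envelope sketch; I'm assuming 8 bpc RGB and the usual CTA-861/CVT-RB blanking totals, so treat the numbers as approximate):

        // HDMI 2.0 TMDS tops out at 18 Gbps; required rate = total pixels
        // incl. blanking x refresh x 24 bit x 10/8 TMDS coding overhead.
        #include <cstdio>

        int main() {
            const double limit  = 18.0; // HDMI 2.0, Gbps
            const double uhd60  = 4400.0 * 2250 *  60 * 24 * 1.25 / 1e9; // ~17.8
            const double uhd120 = 4400.0 * 2250 * 120 * 24 * 1.25 / 1e9; // ~35.6
            const double qhd120 = 2720.0 * 1481 * 120 * 24 * 1.25 / 1e9; // ~14.5
            printf("4K60:     %.1f Gbps -> %s\n", uhd60,  uhd60  <= limit ? "fits" : "needs HDMI 2.1");
            printf("4K120:    %.1f Gbps -> %s\n", uhd120, uhd120 <= limit ? "fits" : "needs HDMI 2.1");
            printf("1440p120: %.1f Gbps -> %s\n", qhd120, qhd120 <= limit ? "fits" : "needs HDMI 2.1");
        }

    So 4K60 just squeezes into HDMI 2.0, while 4K120 needs roughly double the bandwidth, hence HDMI 2.1.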
     
  6. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Ah. There might be a problem though...
    I think the render GPU is automatically assigned to whichever card is connected to the main display according to the Windows extended desktop settings. So, in this case, my small LCD would have to hang on the old card and be the main display, but not all games support selecting another display to go fullscreen on. And I don't have much experience with G-Sync, but I guess not all games work nicely in borderless window mode with G-Sync. Or do they (not to mention that some games don't even offer borderless and have to be tricked into it)?
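    For what it's worth, that matches how most D3D games pick a device: they simply take the first adapter DXGI hands them, and EnumAdapters lists the adapter driving the primary desktop first. A minimal sketch (plain DXGI, nothing game-specific):

        // Dump the DXGI adapter order: adapter 0 is the one with the primary
        // display output, which is what most games render on by default.
        #include <dxgi.h>
        #include <cstdio>
        #pragma comment(lib, "dxgi.lib")

        int main() {
            IDXGIFactory* factory = nullptr;
            if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
                return 1;
            IDXGIAdapter* adapter = nullptr;
            for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
                DXGI_ADAPTER_DESC desc;
                adapter->GetDesc(&desc);
                wprintf(L"adapter %u: %s\n", i, desc.Description);
                adapter->Release();
            }
            factory->Release();
        }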
     
  7. ultra5711

    ultra5711 Guest

    Messages:
    317
    Likes Received:
    6
    GPU:
    GTX Titan on Agua
    I'd purchase a 16xx card to try this out, but I have a mini-ITX mobo. May have to pick up an ATX board sometime in the future.
     
  8. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    I bought a board with 3 long slots with things like this in mind. (The 4x slot is occupied by a 10GbE card.) :)
     
  9. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Maybe I would be better off selling the GTX1070 Sea Hawk EK now (which will be a slow process whenever it starts) and camping out the time before the 3000 series (or however nV will name its next gen) with a custom 1660 Ti or 1660 Super that has a semi-passive factory air cooler (something like the MSI Gaming X). My ears are not what they used to be (sadly), so maybe I can tolerate the noise now. And it will be much easier and faster to sell an air-cooled 1660 when the time comes for HDMI 2.1 (4K120) and a complementary performance upgrade.
    What are your thoughts?
     
  10. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    No, that is not how it works, or 'in theory' everyone would be doing this. There's no passthrough.
     

  11. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Don't you remember the test case where someone managed to get FreeSync working with an nVidia dGPU and an AMD iGPU, before nVidia opened up G-Sync for VESA Adaptive-Sync displays?
    However, this is a good point for a GTX1660 Super instead of a 1650, because I can keep the new card running solo if the dual-card setup doesn't work.
    And this combo setup only makes sense with an expensive card like the GTX1070 or better (and if you are talented enough to figure out the workarounds for selecting the render GPU, etc.); otherwise a simple upgrade of the main card is the evident choice (I wouldn't even hesitate if the 1070 weren't a Sea Hawk EK but a simple air-cooled 1070, and I would barely think about it if there were a 2060 Sea Hawk EK...).
     
    Last edited: Dec 2, 2019
  12. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti

    When the game has no setting to choose which GPU is used, this is exactly how it works.
     
  13. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Having to set my physically secondary screen up as primary is a minor annoyance if G-Sync works in borderless window mode with no latency added by the Desktop Window Manager.
    I think I will capitulate and swap the 1070 for its closest Turing part (a 1660 Super or 2060 with the MSI Gaming cooler) though. But I will still test this (before selling the 1070) and report back.
     
  14. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    I'm actually surprised nobody has made a tool that allows device selection; it's not impossible.

    nVidia has recently done this driver-side for OpenGL, and it's not impossible for DX either (hell, I'd like to launch my web browsers on my second card so the primary card can provide all its VRAM to games).
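    The API side of such a tool is simple; the hard part would be hooking the game's device creation. In D3D11 terms, picking the GPU yourself just means passing an explicit adapter instead of nullptr (nullptr = default = the primary-display adapter). A sketch:

        // Create a D3D11 device on an explicitly chosen adapter rather than
        // the default one. With a non-null adapter, the driver type must be
        // D3D_DRIVER_TYPE_UNKNOWN.
        #include <d3d11.h>
        #include <dxgi.h>
        #pragma comment(lib, "d3d11.lib")
        #pragma comment(lib, "dxgi.lib")

        ID3D11Device* CreateDeviceOnAdapter(UINT adapterIndex) {
            IDXGIFactory* factory = nullptr;
            if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
                return nullptr;
            IDXGIAdapter* adapter = nullptr;
            if (factory->EnumAdapters(adapterIndex, &adapter) == DXGI_ERROR_NOT_FOUND) {
                factory->Release();
                return nullptr;
            }
            ID3D11Device* device = nullptr;
            D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                              nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);
            adapter->Release();
            factory->Release();
            return device;
        }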
     
    ultra5711 and janos666 like this.
  15. ultra5711

    ultra5711 Guest

    Messages:
    317
    Likes Received:
    6
    GPU:
    GTX Titan on Agua
    Yep, there are many advantages to device selection. The question is: would it introduce latency?
     

  16. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    My feeble brain is telling me I have seen some games with a GPU selection box. But I would have to go back and look through dozens I've not run for a long time. It is not an option I think I have seen in a game menu for a while.
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    Yep, there are quite a few with an in-game control for it, and some also select the rendering GPU based on which card a display is connected to.

    Not a universal thing though; some games just choose the system primary display, and most browsers do too, with no ability to select otherwise (for all the preferences they do provide the user).
     
  18. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Most of the time it's a display selector and not a GPU selector. Maybe some OpenGL games had GPU selection though (or games with hardware PhysX)?

    I got the C9 and the RTX2060 today. I had to hook up the new VGA with a PCI-E riser cable because it's ~2.2 slots wide (the fan cover is oversized) and I won't pull my 10GbE NIC out to make room (there is no free UTP wire in this room). :D
    G-Sync didn't work with the out-of-box firmware (some 03.xx.xx; the screen flashed to black a lot with G-Sync enabled) but it works fine (so far, I have only launched a few games for a few minutes) with the latest EU firmware (04.70.05), although it still has to be enabled manually.

    I tried to test the hybrid setup but it's inconclusive so far. I couldn't make any games go fullscreen on the secondary TV (connected to Turing) after they launched on the main LCD (connected to Pascal). Frostbite games only list the displays connected to the Pascal card (one in total). Other games sneak back to the main display after I hit ALT+ENTER. And windowed mode is tricky because the Desktop Window Manager will V-Sync windowed games if G-Sync is off (so I will never see tearing in a window).
    Interestingly, the Pendulum demo, when launched in a window and dragged to the TV's screen area, seems to be locked to G-Sync mode (I can't use the built-in switches to turn it off) and it looks like it's running with G-Sync (although there are some occasional small stutters).

    So, yeah, this might work but definitely with a lot of quirks and workarounds and only with a few games. Well, at least I bought a Turing which is slightly faster and not considerably slower than the GTX1070.

    Just for fun, I tried to connect the TV to the Pascal card with a passive DP->HDMI adapter (DP++ dual-mode). -> No G-Sync. :( I can't try the same with the Turing card's DP port because the card is held up by a cardboard box in a way that blocks the DP socket. :D
    (Yes, I know DP dual-mode is limited to HDMI 1.4 bandwidth with current adapters and drivers.)

    Ah, and there is this funny thing in Windows 10: it decided the RTX2060 is both my Power Saving and my High Performance GPU, so that won't help me trick anything into rendering on the GTX1070.
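    (That Power Saving / High Performance pair is the per-app GPU preference from the Settings > Graphics page; under the hood it's just a string in the registry, so it can also be set programmatically. A sketch, with a made-up exe path; whether it helps at all when one card fills both roles, as here, is doubtful:)

        // Per-app GPU preference stored by Windows 10 (1803+) Graphics settings:
        // GpuPreference=0 (default), 1 (power saving), 2 (high performance).
        // The exe path below is a hypothetical example.
        #include <windows.h>
        #include <cstring>

        int main() {
            HKEY key;
            RegCreateKeyExA(HKEY_CURRENT_USER,
                            "Software\\Microsoft\\DirectX\\UserGpuPreferences",
                            0, nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr);
            const char* pref = "GpuPreference=2;"; // ask for the high-performance GPU
            RegSetValueExA(key, "C:\\Games\\SomeGame\\game.exe", 0, REG_SZ,
                           (const BYTE*)pref, (DWORD)strlen(pref) + 1);
            RegCloseKey(key);
        }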
     
    Last edited: Dec 6, 2019
  19. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    I can chime in with my own experience. I have a GTX 1080Ti and a GTX 1650 in the same PC. The purpose is being able to run Ubuntu with KDE Plasma at full framerate on my 1440p 165Hz display connected to the GTX 1650 while folding 24/7 on my GTX 1080Ti. Works wonderfully. When I want to play a game, I fire up Windows natively or reboot to put the GPU in VFIO passthrough mode and launch Windows under a VM.

    Once, when I launched Windows natively for the first time in weeks and ran Doom with the 1440p 165Hz display connected to my GTX 1080Ti, I was getting 40-50 FPS on max settings. I disabled my GTX 1650 in Device Manager and the framerate shot back up to my RTSS framerate cap of 162 FPS. This means the GTX 1650 was doing the rendering while the GTX 1080Ti was passing through the rendered frames. Perhaps the most important part of this is that G-Sync was working perfectly (no tearing and no stutter).

    I am unsure what other games would do. Doom has no GPU selection option in its settings. Perhaps it has something to do with Vulkan.
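    Vulkan could well explain it: unlike the DXGI default of "first adapter = primary-display adapter", a Vulkan application is handed every physical device and must pick one itself, so Doom may simply have chosen the wrong one. A minimal enumeration sketch:

        // Vulkan hands the application *all* physical devices and lets it
        // choose; there is no "primary display adapter" default as in DXGI.
        #include <vulkan/vulkan.h>
        #include <cstdio>
        #include <vector>

        int main() {
            VkInstanceCreateInfo ci = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
            VkInstance instance;
            if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS)
                return 1;
            uint32_t count = 0;
            vkEnumeratePhysicalDevices(instance, &count, nullptr);
            std::vector<VkPhysicalDevice> gpus(count);
            vkEnumeratePhysicalDevices(instance, &count, gpus.data());
            for (uint32_t i = 0; i < count; ++i) {
                VkPhysicalDeviceProperties props;
                vkGetPhysicalDeviceProperties(gpus[i], &props);
                printf("GPU %u: %s\n", i, props.deviceName);
            }
            vkDestroyInstance(instance, nullptr);
        }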
     
  20. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    The problem is that games in general insist on going fullscreen (even when that means borderless windowed fullscreen, not exclusive fullscreen) on the main display in extended desktop mode, and even Frostbite 3's built-in display selector lists only the displays connected to the card it's currently rendering on. Hence, I am left with windowed mode (at least for the overwhelming majority of games), but even after I drag a window to the TV's screen (connected to the Turing card) I don't think it works with G-Sync. There is no tearing, but that's expected, since the Desktop Window Manager always does some kind of V-Sync in normal windowed mode.
    The utilization of the display-only GPU is fairly high (roughly 25%, with spikes up to almost 50% at 4K), so a very weak card could possibly face some performance issues (and I guess it taxes the CPU to some degree as well, since this is not SLI/NVLink-like communication; probably not a DMA copy but a GPU1->systemRAM->GPU2 "software" copy). And the render GPU seems stuck at ~85% utilization, although I am not sure how accurate those Windows Task Manager numbers are. The utilization remains ~0% on the Pascal GPU when the Turing card is both the render and display GPU and the Pascal card only has a small static desktop extended to it (for a small LCD, chained through an old HDMI 1.3 AVR for LPCM sound).

    What are the optimal settings for HDMI 2.1 VRR G-Sync anyway? G-Sync ON + V-Sync ON + Low Latency ULTRA, or G-Sync ON + V-Sync OFF + an RTSS frame limit slightly below the upper end of the range (or an in-game limit exactly at the range's top) with Low Latency at user preference?
    I used to understand this for the old (module-assisted) G-Sync, but it's not so clear to me with VESA Adaptive-Sync G-Sync, and even less so with HDMI VRR G-Sync (this is probably the same as VESA G-Sync in terms of V-Sync / limiter configuration, but I can't be sure).
     
    Last edited: Dec 6, 2019
