LG UltraFine 5K monitor rendered useless with nearby wireless router

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 31, 2017.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,845
    Likes Received:
    9,223
    GPU:
    AMD | NVIDIA
  2. thatguy91

    thatguy91 Ancient Guru

    Messages:
    6,643
    Likes Received:
    98
    GPU:
    XFX RX 480 RS 4 GB
     So if you plug your computer directly into the router and use the wifi for other devices, you basically have to buy a much, much longer LAN cable (and other cables) and hide the router away somewhere if you use this monitor? Is it 2.4 GHz or 5 GHz that is affected, or both? Technically, phones with wifi turned on would cause the issue as well, but I guess their output power isn't high enough to be a problem. Same with Bluetooth etc.
     
  3. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
     (Former) LG product design employee at work:

     [image]
     
  4. SirDremor

    SirDremor Master Guru

    Messages:
    586
    Likes Received:
    0
    GPU:
    Nvidia GTX 1050
    Sorry, I have no pity for such users.

     If they are "smart" enough to buy Apple-only devices (apparently thinking how great they are, when they definitely are not), then they can only blame themselves.
     

  5. Zeka

    Zeka Active Member

    Messages:
    77
    Likes Received:
    7
    GPU:
    PowerColor R9 280
     This is a big issue for LG. A monitor that is useless next to a router???

     I'm not a fan of Apple's overpriced products, but I don't think this would have happened if he had purchased an Apple Display...
     
  6. ChisChas

    ChisChas Member Guru

    Messages:
    199
    Likes Received:
    45
    GPU:
    ASUS Strix 3080 OC
    This monitor is described as an 'Ultrafine 5K monitor' but the resolution is printed as 2K in the article?

    Is this a typo, Hilbert?
     
  7. slyphnier

    slyphnier Master Guru

    Messages:
    813
    Likes Received:
    71
    GPU:
    GTX1070
     Well, my guess is that there's an issue with hardware shielding, especially on the Thunderbolt port or cable.

     It's pretty basic electronics: as the data rate gets higher, the signal also becomes more sensitive to noise.
     1080p is said to use 1.485 Gbps, while HDMI 2.0 goes up to 6 Gbps.
     Not sure how high Thunderbolt can go, but as a plain calculation, if 5K is 5x 1.4 Gbps then it's about 7 Gbps.

     It's like why a 10 Gbps LAN cable (Cat 7) has more shielding compared to lower-speed cables (Cat 5 or 6).
     And it's not only LAN cables; the same goes for other cables, such as HDMI.
     That's why, in the past (DVI & VGA), monitor cables always came equipped with a ferrite core to reduce noise.
     The port should also have some shielding.
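     A rough back-of-the-envelope sketch of that scaling, in Python, assuming uncompressed 24-bit RGB at 60 Hz and ignoring blanking intervals and protocol overhead (so real link rates come out higher):

         # Raw pixel data rate, ignoring blanking and line-coding overhead.
         def raw_gbps(width, height, refresh_hz=60, bits_per_pixel=24):
             return width * height * refresh_hz * bits_per_pixel / 1e9

         fhd = raw_gbps(1920, 1080)     # ~3.0 Gbps of raw pixel data
         five_k = raw_gbps(5120, 2880)  # ~21.2 Gbps of raw pixel data
         print(f"1080p60: {fhd:.1f} Gbps, 5K60: {five_k:.1f} Gbps, "
               f"ratio: {five_k / fhd:.1f}x")  # 5K carries ~7x the pixels of 1080p

     So the jump from 1080p to 5K is roughly 7x in raw pixel data, which is why shielding on the cable and port matters so much more at these rates.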
     
  8. Mda400

    Mda400 Master Guru

    Messages:
    927
    Likes Received:
    112
    GPU:
    3060Ti 2055/16.2Ghz
     You are thinking of individual channel bandwidth (HDMI has 3 channels). The first HDMI version can support up to 5 Gbps worth of video/audio data, and 1.3/1.4 can support up to 10.2 Gbps. Version 2 supports up to 18 Gbps, and 2.1 supposedly 48 Gbps. 1080p@60 Hz 32-bit color needs 4 Gbps, 4K@60 Hz 32-bit color needs 18 Gbps, and 5K@60 Hz 32-bit color needs 22 Gbps.

     Thunderbolt versions 1 and 2 both provide 20 Gbps; the only difference is how they split that bandwidth (version 1 as two 10 Gbps channels, version 2 as a single 20 Gbps channel). Version 3 is supposedly able to provide 40 Gbps.

    Here's a good calculator for video bandwidth: Digital Bandwidth Calculator
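     For comparison, a minimal sketch of that kind of calculation in Python, assuming uncompressed pixels at the 32-bit figure above and ignoring blanking and line-coding overhead (such as TMDS 8b/10b), so the numbers land a bit below what a full timing calculator reports:

         # Compare raw pixel data rates against the nominal link rates quoted above.
         LINKS_GBPS = {
             "HDMI 1.0": 5.0, "HDMI 1.3/1.4": 10.2, "HDMI 2.0": 18.0,
             "Thunderbolt 1/2": 20.0, "Thunderbolt 3": 40.0,
         }

         def needed_gbps(width, height, refresh_hz=60, bits_per_pixel=32):
             return width * height * refresh_hz * bits_per_pixel / 1e9

         for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160),
                              "5K": (5120, 2880)}.items():
             need = needed_gbps(w, h)
             fits = [link for link, cap in LINKS_GBPS.items() if cap >= need]
             print(f"{name}@60Hz: {need:.1f} Gbps -> fits: {', '.join(fits) or 'none of the above'}")

     On raw pixel data alone, 5K@60 Hz already exceeds a single 20 Gbps Thunderbolt 1/2 link, so of the links listed only Thunderbolt 3 has the headroom.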
     
