Spec for IEEE 802.11ay Is in Development Stages: Wi-Fi Going to 176 Gb/s

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 21, 2017.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,317
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
  2. WaroDaBeast

    WaroDaBeast Ancient Guru

    Messages:
    1,963
    Likes Received:
    0
    GPU:
    Gigabyte HD7950
    ¡Ayyyy, qué rápido! ("Ayyyy, how fast!")
     
  3. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    Nice speeds.

    I have yet to fully utilize 802.11g; it handles Twitch 1080p60 streams without a problem.

    4K video streaming will probably make better use of recent Wi-Fi standards (ac, ad, ay...).
    We just need more infrastructure, bandwidth, and stronger PC hardware to get there... so not soon...
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    I'm not sure how I feel about all of this. Wireless displays are cool and have a lot of practical uses, but wifi is a TERRIBLE way to go about it, and I don't like that the wifi standard is being manipulated for WiDi. Take ATSC for example - you can get full HD television with surround sound audio through the air, using the UHF range. Sure, it's analog, but it still looks decent. I don't see why a similar approach couldn't be taken for wireless displays. We are getting way too dependent on wifi and it's crippling its abilities. Consider this:
    * For every device added to a wifi network, the theoretical maximum bandwidth available to each device shrinks, since they all share airtime. I don't think it's a good idea for someone who is using Facebook to potentially impact my wireless display.
    * If you are streaming video AND using a wireless display, not only does that imply at least 3 connected devices (wireless router, wireless display, and the playback device), but bandwidth will also be used up just to get the media stream in the first place. That isn't very efficient.
    * You are likely only able to project 1 wireless display per network at a time; depending on your network, there might not be enough bandwidth to do multiple displays. Meanwhile, if you used something like an ATSC emitter, you could do as many as you have open channels. You could theoretically do a side-by-side wireless triple-monitor setup while streaming 4K content. Not only that, but since you're removing a wifi device, you're reducing EMI in the 2.4 GHz range.
    * Even if you do ad-hoc networks (in which case, you wouldn't need an 802.11ay connection), you're still adding EMI, which effectively reduces bandwidth.
    * Using a new wifi generation to accomplish WiDi requires all devices to comply with the same generation, or else you don't benefit from it. I wouldn't consider that any more cost-effective than, for example, an ATSC emitter meant for home use.
    * Compression is almost a requirement for WiDi to work. Not only does this worsen quality and add CPU/GPU usage, it also worsens latency.

    And yes, I understand that using ATSC would conflict with FCC regulations or whatever. I'm not saying to specifically use ATSC either (though, most TVs already support it anyway, which reduces equipment costs and other complications). My point is, there are much better ways to go about wireless displays than wifi.
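    The shared-medium point above can be illustrated with a toy calculation (a minimal sketch, assuming ideal equal airtime sharing, which real wifi only approximates; the 176 Gb/s figure is from the article):

```python
def per_device_throughput(link_rate_gbps: float, n_devices: int) -> float:
    """Toy model: a shared wireless medium splits airtime equally,
    so each device's ceiling is the link rate divided by the number
    of active devices (protocol overhead ignored)."""
    if n_devices < 1:
        raise ValueError("need at least one device")
    return link_rate_gbps / n_devices

# A 176 Gb/s 802.11ay link shared by a wireless display,
# a streaming box, and a laptop on Facebook:
print(per_device_throughput(176, 3))  # roughly 58.7 Gb/s each, at best
```

    In practice contention, retransmissions, and per-client modulation rates cut this further, which is the crux of the complaint above.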
     
    Last edited: Feb 21, 2017

  5. Amx85

    Amx85 Master Guru

    Messages:
    335
    Likes Received:
    10
    GPU:
    MSI R7-260X2GD5/OC
    Haha, you speak Spanish? I'm Dominican :infinity:
     
  6. Prince Valiant

    Prince Valiant Master Guru

    Messages:
    819
    Likes Received:
    146
    GPU:
    EVGA GTX 1080 ti
    This is nifty but I'll be sticking to wired so long as it's around.
     
  7. David Lake

    David Lake Master Guru

    Messages:
    765
    Likes Received:
    46
    GPU:
    Titan V watercooled
    And here I am, stuck with InfiniBand at a pathetic 56 Gb/s!
     
  8. Incredible Lama

    Incredible Lama Member Guru

    Messages:
    164
    Likes Received:
    30
    GPU:
    Gigabyte GTX970 G1
    With all this wireless signal development nowadays, am I the only one who is reading: "cancer rate increasing at 176 Gb/s"?
     
  9. destruya

    destruya Member

    Messages:
    14
    Likes Received:
    4
    GPU:
    EVGA 2080 FTW3
    Honestly, they're missing out not nicknaming this new tech after The Fonz.
     
  10. Mda400

    Mda400 Maha Guru

    Messages:
    1,087
    Likes Received:
    199
    GPU:
    4070Ti 3GHz/24GHz
    4K streaming is easy today; only 4K gaming is still difficult. The only hurdles are producing 4K content and getting lazy, greedy ISPs to offer the minimum needed to stream 4K (which for good quality is around 12-15 Mb/s). The definition of broadband here in the States is 25 Mb/s down, 3 Mb/s up, so technically two 4K streams can be watched at a time.

    Any cable company or better can do it. DSL varies between ISPs that don't care to invest in existing infrastructure. Even cellular carriers can offer it. But the data caps... oh, the data caps... the big money grab.
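    The arithmetic above checks out (a quick sketch; the 12.5 Mb/s per-stream figure is an assumed midpoint of the 12-15 Mb/s range quoted above):

```python
broadband_down_mbps = 25   # US broadband definition, downstream
stream_mbps = 12.5         # assumed mid-range bitrate for good-quality 4K

# Whole 4K streams that fit inside the broadband minimum:
concurrent_streams = int(broadband_down_mbps // stream_mbps)
print(concurrent_streams)  # 2
```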
     

  11. WareTernal

    WareTernal Master Guru

    Messages:
    267
    Likes Received:
    53
    GPU:
    XFX RX 7800 XT
    This seems like a bunch of jargon; to me, it's far from concise. I hope this comment can be taken in a constructive way.

    "a transmission rate of 20–40 Gbit/s"
    "The link-rate per stream is 44Gb/s, with four streams this goes up to 176Gb/s."
    - So the "transmission rate" is both 20-40 Gb/s AND 176 Gb/s...

    "It will have a frequency of 60 GHz"
    "802.11ay bonds four of those channels together for a maximum bandwidth of 8.64 GHz"
    The article mixes up channel bandwidth (which is measured in hertz) with data rate (bits per second), so I guess the takeaway is that "ay" operates in the 60 GHz band with 8.64 GHz of channel bandwidth.

    If "ad" is 7 Gb/s and this is 4 times that, it's not 176 Gb/s, so maybe the 20-40 figure was correct. 60 GHz radios will need higher power to match the range of sub-5 GHz wifi, and they perform poorly without a clear line of sight. The "300–500 meters" seems unlikely without high-gain, directional antennas. A 300 meter range seems pointless for streaming video to my TV, especially compared to the 7 meter range of "ad". So is this for wifi bridging, or wireless displays?
     
  12. Rabbitdude92

    Rabbitdude92 Guest

    Messages:
    25
    Likes Received:
    0
    GPU:
    HD7850 2GB
    I'm pretty sure that over-the-air broadcasts were converted to digital a few years ago, and that's why you need a TV with a built-in digital tuner or a digital converter box to even watch over-the-air broadcasts.
     
  13. heffeque

    heffeque Ancient Guru

    Messages:
    4,413
    Likes Received:
    205
    GPU:
    nVidia MX150
    Nobody cares. That's what Private Messages are for. Use them.
     
  14. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,702
    Likes Received:
    1,843
    GPU:
    EVGA 1070Ti Black
    Eh, did they figure out how to keep things from interfering with wifi yet? If not, I don't think I care how fast they make it. I limit all wireless clients to 10 Mbit up/down, and to this day wifi (both 2.4 GHz and 5 GHz) is still the only part of my network I get complaints about. A client will be connected to the router and transmitting, yet act like it's not connected; rebooting the client fixes it, and so does rebooting the router, but rebooting the router is annoying when it's easier to reboot the device, and people can't be bothered to do that. I've lost count of how many times a wireless client said there was no internet when the router showed it was connected and transmitting, or simply wouldn't connect. That was fixed by rebooting the client.


    Which is why everything of mine that can be wired is wired.
     
    Last edited: Feb 21, 2017
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    You're right, they have been converted; that's what ATSC is. But to my understanding, it's basically an analog signal carrying a modulated form of digital data. Modern TV tuners are basically just specialized modems:
    https://en.wikipedia.org/wiki/ATSC_tuner#How_an_ATSC_tuner_works

    EDIT:
    I may be wrong, but I get the impression that wireless technologies designed for digital use (like wifi or Bluetooth) don't handle signals the same way ATSC does, for several reasons. For example, ATSC signals, whether from a cable TV provider or an over-the-air antenna, reach your TV through a single coax cable (two conductors, including ground). That one cable handles 1080p video, 5.1-channel audio, and some I/O controls. When you consider that the same cable can also carry high-speed internet, that's a crazy amount of data pumping through one wire, and all of it is done in less than 2 GHz of spectrum. Wifi's bandwidth is minuscule in comparison, but it's also a completely different set of hardware that consequently must handle data in a different way.
     
    Last edited: Feb 21, 2017

  16. illLoGiQ

    illLoGiQ Member

    Messages:
    11
    Likes Received:
    0
    GPU:
    GTX 970 SC

    LMAO IKR :puke2:
     
  17. WaroDaBeast

    WaroDaBeast Ancient Guru

    Messages:
    1,963
    Likes Received:
    0
    GPU:
    Gigabyte HD7950
    ¡Sí, pero lo hablo como el culo! ("Yes, but I speak it like crap!")

    In all seriousness, I can't speak Spanish and that was just a pun with 802.11ay. I simply couldn't resist. :D

    Anyhow... this is all theoretical. We probably won't reach speeds close to that theoretical maximum unless we buy top-of-the-line products. Of course, most manufacturers will go for the bare minimum requirements that let them put a shiny sticker on the box so as to fool the customer.

    Which all leads to the following reaction: yawn. (Not directed at you Hilbert, but rather at the industry. ;))
     
  18. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    - "Hey man, what wifi you using?"

    -
     
