Acer XB280HK is a 28-inch 4K Ultra HD monitor with G-Sync

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 23, 2014.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,560
    GPU:
    AMD | NVIDIA
  2. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
Wow, this and two 780 Ti 6GB cards would set me back £2000.
I much prefer lower-scale gaming for now; perhaps when the GTX 990 is here.
     
  3. Smikis

    Smikis Active Member

    Messages:
    61
    Likes Received:
    0
    GPU:
    Sapphire Tri-X R9 290
Why would you need G-Sync anyway? I still believe screen tearing is a made-up problem that doesn't actually exist.
If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
     
  4. Corbus

    Corbus Ancient Guru

    Messages:
    2,469
    Likes Received:
    75
    GPU:
    Moist 6900 XT
You'd be surprised how wrong you are. Screen tearing is real and it's out to get you! It depends on what games you play, I guess; I don't mind it that much myself, but there are a lot of people who can't play because of it.
     

  5. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,220
    Likes Received:
    1,589
    GPU:
    RTX 3060 12GB
    I think the problem is deeper than that.

For affordable 4,096-horizontal-pixel gaming there needs to be a complete redesign of the basic PC architecture. AMD, Intel, IBM, Apple, NVIDIA, Samsung and Microsoft, with a considerable contribution from the global community, need to come up with a new, clean design from scratch.

    All this x86 crap needs to go. Too expensive, too much power draw and running out of steam fast.

When I was a kid, I was told we'd have flying cars by 2015 - but no, we've still got x86 processors running binary... 64-bit? Pfft, I would have expected 1,024-bit tech running my kettle by now, ffs.

    This is what happens when incremental increases are more profitable.

    /gets off soapbox.
     
  6. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
    4k G-Sync, do want :D
     
  7. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    9,633
    Likes Received:
    3,413
    GPU:
    NVIDIA RTX 4070 Ti
    I'm guessing you also believe there is no difference between 30 and 60 FPS, because the human eye can't see past 30 FPS eh?
     
  8. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
    Don't forget narrow-FOV motion sickness :infinity:

    :D
     
  9. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    13,832
    Likes Received:
    7,593
    GPU:
    ASUS 3060 OC 12GB
    Is this like a joke comment?
     
  10. FrenchKiss

    FrenchKiss Master Guru

    Messages:
    345
    Likes Received:
    7
    GPU:
    PNY RTX 4090 XLR8

  11. FrenchKiss

    FrenchKiss Master Guru

    Messages:
    345
    Likes Received:
    7
    GPU:
    PNY RTX 4090 XLR8
    Nice troll bro :D
     
  12. Ven0m

    Ven0m Ancient Guru

    Messages:
    1,851
    Likes Received:
    31
    GPU:
    RTX 3080
Well, G-Sync drastically reduces the hardware requirements, as you don't need your system to stay at 60fps+ levels 100% of the time. So if you accept frame rates around 40, a single 780 Ti or your dual 760s would do.
     
  13. AC_Avatar100400

    AC_Avatar100400 Guest

    Messages:
    340
    Likes Received:
    0
    GPU:
    WC GTX 780
Assassin's Creed Black Flag - checkmate.

I think G-Sync is great, but Nvidia really screwed up with it. What a blunder.
     
  14. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
I like G-Sync, and it's nice to see it move beyond the 1080p 27" monitors.

But I'm a little bit worried about 4K + G-Sync. We'll need to wait for reviews, but G-Sync has a limit at 33.3 ms (30 Hz, or 30 fps). That's the maximum time it can hold a frame, so basically when you're in the 30 fps range or under, it has to redraw the frame again (which can create more or less visible stutter).

So with a 4K monitor, what I fear is the minimum framerate: if you drop too far under 30 fps (i.e. more than 33.3 ms to render a frame), this could start to be a problem. You'll need to be really careful about minimum framerates and set the graphics options accordingly to avoid it.
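Rough numbers, if anyone wants to check their own framerates against that floor (just a quick Python sketch, assuming the ~33.3 ms / 30 Hz limit I described above; what the module does below that floor is my guess at the behaviour, not something Nvidia has documented):

Code:
# Frame-time check against an assumed G-Sync minimum-refresh floor of 30 Hz.
# Below that floor the module has to repeat the previous frame on its own,
# which is where the possible stutter comes from.

GSYNC_FLOOR_HZ = 30.0
MAX_HOLD_MS = 1000.0 / GSYNC_FLOOR_HZ   # ~33.3 ms

def frame_time_ms(fps: float) -> float:
    """Time the GPU needs to deliver one frame at the given framerate."""
    return 1000.0 / fps

for fps in (60, 45, 40, 35, 30, 25, 20):
    ft = frame_time_ms(fps)
    if ft <= MAX_HOLD_MS:
        verdict = "within the floor, panel simply waits for the next frame"
    else:
        verdict = "over 33.3 ms, panel must repeat the old frame (possible stutter)"
    print(f"{fps:>3} fps -> {ft:5.1f} ms/frame: {verdict}")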
     
    Last edited: May 23, 2014
  15. anub1s18

    anub1s18 Member Guru

    Messages:
    157
    Likes Received:
    9
    GPU:
    Nvidia RTX2070S /WC
Yeah, but if you're packing a display like this, what's the point of medium/high textures instead of ultra? :p

I'm guessing single 4K displays are hitting the 3GB VRAM limit in high-end games already (or they probably should be, if the textures are up to snuff), and that's only going to get worse over the next two years. So bundling two 2GB or two 3GB GPUs with a display like this almost seems like a waste :/
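A quick back-of-the-envelope on why 2-3GB gets tight at 4K (sketch only; the buffer counts below are my own assumptions, and real usage depends entirely on the engine, AA mode and texture pool):

Code:
# Rough VRAM cost of the render targets alone at 3840x2160.
# Assumes 4 bytes per pixel; the buffer counts are illustrative guesses.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4

def targets_mb(count: int) -> float:
    """Size of 'count' full-resolution render targets, in MiB."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * count / (1024 ** 2)

print(f"one 4K colour buffer          : {targets_mb(1):6.1f} MB")
print(f"double-buffered + depth       : {targets_mb(3):6.1f} MB")
print(f"plus four deferred G-buffers  : {targets_mb(7):6.1f} MB")
# Textures, shadow maps and any MSAA come on top of that, which is why a
# 2GB or 3GB card starts feeling cramped well before 'ultra' textures at 4K.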
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    I don't get it - how exactly is x86 a problem when it comes to high-res gaming? And what do you propose to improve it? x86 may be old, kind of messy (but it is CISC after all...) and a bit power hungry, but there's a certain point where x86 becomes very efficient, if not more efficient than any other architecture.

Try getting an ARM CPU to compete with an i7 on performance and its performance-per-watt will fail miserably; try getting an x86 CPU down to the power draw of an existing ARM CPU and the ARM will most likely perform better. This is my gripe with Intel - they want to dominate EVERYTHING, but x86 is not a one-size-fits-all architecture by any means.

    I both agree and disagree. x86 should have been obsoleted a long time ago, but in the Windows world, software compatibility would be a nightmare if that were the case. But why go beyond 64 bit architectures? In the server world, where software compatibility in new systems often doesn't matter at all, they still stick with 32 bit and 64 bit architectures. Every year servers have the opportunity to increase the bus width but they don't. GPUs are the only exception, but their operation isn't comparable to a CPU.

If money wasn't in the equation, we'd likely all own a quantum computer at this point. But since that isn't the case, and since companies only do things in their own interest, your demands seem very naive.
     
  17. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
x86 is an architecture; 64-bit or 32-bit has nothing to do with that.

In the late 70s x86 started with a 16-bit instruction set, which then moved to a 32-bit instruction set. It now uses the x86_64 instruction set, which is also the AMD64 instruction set (AMD64 is used by Intel processors too, with some small differences in implementation, as well as VIA processors, etc.).

Note that AMD64 is also backward compatible with both 16-bit and 32-bit code. This is why you can run full 32-bit software on it, but not the reverse: x86_64 isn't 32-bit extended to 64-bit, it's fully 64-bit while staying backward compatible with 32-bit operands and so on.
     
    Last edited: May 23, 2014
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
x86-64 is backward compatible on a HARDWARE level. In contrast, I believe IA64 is a 64-bit x86 architecture, but it isn't backward compatible on either a hardware or software level. This brings me back to my point about Windows being an issue - if you want newer, better architectures, the software has to be designed for them. 32-bit Windows is STILL prevalent. For whatever reason, MS didn't try pushing for wider buses 8 years ago, and because of this they're crippling the computer industry.
     
  19. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
The Itanium version is different (64-bit from scratch), but it is not an x86 architecture anyway.

For software / the OS, that's a different question, I agree.
     
    Last edited: May 23, 2014
  20. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    There is no reason to retire x86. Modern x86 processors barely resemble "x86" processors anyway. They all have sophisticated frontends that decode CISC based instructions to internal formats that resemble RISC where they need to. ARM's only advantage is that they focused on low power from the start, Intel has slowly been going through and modifying their instructions/internal arch to better suit those needs. Moorefield is x86 and is power/performance competitive with ARM and should be seen in products later this year.

    Also nothing is black and white, it isn't like ARM is only capable of low power, they could easily scale their design and make internal changes to better suit high performance needs. They just knew it would be more difficult to compete with Intel so they went a completely different route to avoid competing with them.

And yeah, the compatibility thing is a problem. It would be like retiring all world languages in favor of a superior one that is more accurate with fewer words/phonics/whatever. Languages evolve naturally to fill the voids/gaps/concerns that populations have. x86/ARM is exactly the same way; neither one is set in stone, they are constantly evolving. There have also been tons of other instruction sets that have claimed all kinds of benefits but failed to gain traction, mainly because Intel can do whatever it wants internally on a chip and mimic those benefits.

As for G-Sync, I think I may personally wait to see what the deal is with FreeSync and see if my questions about G-Sync get answered. Nvidia keeps saying they are going to fix things in the future with G-Sync, but is that going to require a new module? Are they going to issue firmware updates to the current G-Sync module? Is that why they went with an FPGA instead of an ASIC?

    I really don't want to lock myself into Nvidia only with my monitor purchase.
     
    Last edited: May 23, 2014
