Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display’s refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.

The issue for Nvidia is that G-Sync isn’t what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside of displays, increasing the costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync’s most important features without requiring extra hardware. Nvidia more or less acknowledged that the free-to-use, cheap-to-implement VRR technologies had won in 2019 when it announced its “G-Sync Compatible” certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.

    • GamingChairModel@lemmy.world · 3 months ago

      They always win, unless they don’t. History is littered with examples of the freer standard losing to the more proprietary standard, with plenty of examples going the other way, too.

      Openness is an advantage in some cases, but tight control can be an advantage in some other cases.

        • Vik@lemmy.world · 3 months ago

          Up until relatively recently, it was great to see that Vulkan and DX12 were in a practically even split.

          Still great to see that some of the best talent in terms of visual fidelity showcases Vulkan, like rdr2 (originally defaulted to dx12, now vulkan), doom eternal and so on. Fully expect the next GTA to.

          stadia was derp but it forced interested publishers to get acquainted with vk. I think it ended up doing more good for the industry in the end as a failure, rather than harm by succeeding and locking subscribers into such a restrictive game “ownership” paradigm.

        • pycorax@lemmy.world · 2 months ago

          OpenGL is a bit more complicated since it’s more than just a specification in practical terms. The documentation and tooling for OpenGL were really awful compared to Direct3D. This is a huge issue when developers are working on implementing features. For instance, the documentation for glReadPixels was incorrect for years and you had to refer to the wiki instead. Yet, the only way you would know this is if you scoured the internet and happened to find a StackOverflow page asking about symptoms that may not even match your issue.

          Thankfully, Vulkan seems to be a lot better in this regard, but I still curse the heavens every time I need to go back to OpenGL when supporting older hardware.

    • Psythik@lemmy.world · 2 months ago

      I’ll buy an AMD GPU once they have an answer to the 4090 (actually the 5090 at this point). I need AI upscaling, SDR-to-HDR conversion for videos, and way better ray tracing performance. Until that happens, my PC will unfortunately remain a mixed-breed bastard.

      • barsoap@lemm.ee · 2 months ago

        I need AI upscaling,

        Not a hardware thing.

        SDR-to-HDR conversion for videos,

        Not a hardware thing.

        and way better ray tracing performance.

        Ray accelerators are a hardware thing. The AI to denoise them, again, not so much.

        Just because AMD cards don’t come with tensor cores doesn’t mean they can’t run AI workloads; tensor cores are essentially cut-down GPU cores. They make sense in mobile devices to save on energy consumption, but on desktop? Just use the TFLOPS you already have for the basic matrix math. The important bit, the gather/scatter memory architecture for dealing with giant matrices, GPUs already have.

        • Psythik@lemmy.world · 2 months ago (edited)

          If there are software alternatives that allow me to automatically AI upscale and convert to HDR any video I play in any program, streaming or local (and it just works automatically in everything with no effort on my part), then please share download links with me. Until then I’ll stick with Nvidia GPUs.

          • barsoap@lemm.ee · 2 months ago (edited)

            “Any video in any program” is not how it works for you right now, either. And if you need something, then definitely not in every program but in your video editor because you’re a professional.

            As to software: I’m on Linux. You won’t get that nvidia software there, either, in Linux land everyone gets those features because they have nothing to do with what GPU you have. Well at least mpv does it all, natively or via standard plugins (also AI frame interpolation), TBH I don’t really care how firefox plays videos as long as VRR works, which it does.

  • ScampiLover@lemmy.world · 3 months ago

    TL;DR: The stuff the dedicated module currently does will go inside specific MediaTek chips on specific premium monitors

    Really weird it’s taken this long - I remember reading that the modules were expensive and assumed it was just because they were early generations and Nvidia was still working things out

  • MonkderVierte@lemmy.ml · 2 months ago

    Nvidia says it’s partnering with chipmaker MediaTek to build G-Sync capabilities directly into scaler chips that MediaTek is creating for upcoming monitors.

    Meaning, the same in blue?

    • AlotOfReading@lemmy.world · 2 months ago

      No. Nvidia will be licensing the designs to mediatek, who will build out the ASIC/silicon in their scaler boards. That solves a few different issues. For one, no FPGAs involved = big cost savings. For another, mediatek can do much higher volume than Nvidia, which brings costs down. The licensing fee is also going to be significantly lower than the combined BOM cost + licensing fee they currently charge. I assume Nvidia will continue charging for certification, but that may lead to a situation where many displays are gsync compatible and simply don’t advertise it on the box except on high end SKUs.

  • WalnutLum@lemmy.ml · 2 months ago

    This is why I think eventually FSR will win over DLSS in the end, despite DLSS having better performance.

    • hamsterkill@lemmy.sdf.org · 2 months ago

      I think it’s unlikely one of those techs “wins” at all. It’s relatively easy to support them all from a software perspective and so gamers will just use whichever corresponds to their GPU.

      • WalnutLum@lemmy.ml · 2 months ago

        Unless something has changed recently you still have to submit builds to Nvidia to have them train the DLSS kernel for you, so FSR is substantially easier to integrate.

    • melroy@kbin.melroy.org · 3 months ago

      haha. I wouldn’t, but yes please sell your stocks. I will buy them. We still have a $300B inflow of AI shizzle (bubble) that goes into Nvidia.

  • Vik@lemmy.world · 3 months ago

    Good for them if it helps eliminate the markup on displays advertising G-Sync Ultimate. I have my doubts, but it’d make sense if they’re no longer using dedicated boards with FPGAs and RAM.

    One has to wonder if VESA will extend their VRR standard to support refresh rates as low as 1 Hz

    • AngryMob@lemmy.one · 3 months ago

      Yeah, it feels premature since so many FreeSync displays still only go down to 48 Hz.

      Maybe if the MediaTek chip can go to 30 Hz then VESA will update.

      • Vik@lemmy.world · 2 months ago

        I think below that range they can frame-double (low framerate compensation, LFC) to go as low as 24 FPS

    • barsoap@lemm.ee · 2 months ago (edited)

      I’m not aware of any protocol limitations there, it’s just that monitors don’t bother to support refresh rates that low.

      Experience at low frame rates will be choppy anyways, if it’s a fixed low framerate you can use LFC without quality degradation (say for movies) and if it’s a variable low framerate (where LFC causes jitter)… you should be lowering your graphics settings to get better fps. Why spend extra engineering and hardware on a capability that won’t ever result in a good experience anyway?
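
      The fixed-vs-variable distinction can be made concrete with a little arithmetic (a sketch; the 48 Hz floor is an assumed example): a steady 24 fps source always gets the same repeat count, while a frame rate wobbling around the floor flips the count back and forth, and that flip is the jitter.

```python
VRR_MIN = 48.0  # assumed panel VRR floor in Hz

def lfc_repeats(fps: float) -> int:
    """Smallest repeat count that lifts the presented rate to the floor."""
    n = 1
    while fps * n < VRR_MIN:
        n += 1
    return n

# Steady 24 fps film: always 2 repeats, so frame pacing stays even.
print([lfc_repeats(fps) for fps in (24, 24, 24)])      # [2, 2, 2]
# Frame rate hovering around the floor: count flips 1 <-> 2.
print([lfc_repeats(fps) for fps in (49, 47, 50, 46)])  # [1, 2, 1, 2]
```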

      …has it really come to this? From laughing at console people for their “cinematic FPS” to nvidia fanboys saying “my monitor supports lower framerates than yours”? Aren’t we supposed to brag about our displays (pointlessly) reaching haptic fps? (That’d be 1 kHz, btw.)

      • Vik@lemmy.world · 2 months ago

        Higher-end phones have the capability to gear down to 1 Hz to save power on static content. Would be nice to see that on notebook eDP and hell, even on desktop monitors too.

  • p1mrx@sh.itjust.works · 2 months ago

    Will the Mediatek modules also support VESA Adaptive Sync, or will they have fixed frame rate on AMD cards?

    • Cort@lemmy.world · 2 months ago

      Iirc freesync (extreme/ultimate) works down to 48fps. I use it when playing in 4k since my card struggles to keep a locked 60fps at that resolution

      • barsoap@lemm.ee · 2 months ago

        It works down to whatever the implementation in the monitor supports which tends to be 40 or 48fps. There’s a minimum you have to support if you want the FreeSync sticker but in principle you could call it AdaptiveSync and only support down to 60 or such, or support everything down to 1fps (which doesn’t happen in practice) and still call it FreeSync, AMD doesn’t mind you exceeding specs.

    • Psythik@lemmy.world · 2 months ago

      Freesync works at up to 120Hz on my TV (LG C1), the maximum refresh rate of the set. I’m particularly sensitive to screen tearing, and confirmed that it’s working by playing various PC games with a framerate limiter.

  • ditty@lemm.ee · 2 months ago

    So I have two Acer monitors with the dedicated G-Sync hardware; are those compatible with AMD FreeSync? I think both displays predate FreeSync though.

    I’m curious because, now that I’ve switched to Linux, I’m running into issues with my NVIDIA GPU and Wayland suspend/hibernate functionality.

    • barsoap@lemm.ee · 2 months ago

      So I have two Acer monitors with the dedicated G-Sync hardware, are those compatible with AMD FreeSync?

      Have a look at the manual but I don’t think chances are good.

      I’m curious now that I’ve switched to Linux I’m running into issues with my NVIDIA GPU and Wayland suspend/hibernate functionality.

      Suspend/hibernate is iffy in general and doesn’t necessarily have anything to do with the GPU. You’ll need sufficient swap space and a BIOS that plays nice. Aside from slogging through logs to see if anything throws particular errors, you can try booting without the nvidia drivers (plain VESA console if you have to) and see whether hibernation works then.