This is for multiplying your fps 3x or 4x, but the input lag, ghosting, stuttering, and other artifacts make everything worse. Overall I’d recommend sticking to Lossless Scaling at 2x, or not using framegen at all.

  • orca@orcas.enjoying.yachts
    1 hour ago

    Last time I tried one of these filters, the game looked terrible (it was Death Stranding). It applied a weird squiggly, distorted look that wasn’t even remotely better looking. For games where you need precision and timing, I can see this making multiplayer games hell.

  • brucethemoose@lemmy.world
    1 hour ago (edited)

    (NSFW warning)

    When I had an RTX 2060 laptop, I had the most jank setup for Cyberpunk 2077.

I’d plug it into an older Sony OLED, which only supported low-res HDMI input (can’t remember which res; 1080p, I think?).

The RTX 2060 would run DLSS Quality (for antialiasing) and output 2077 at a low-res 60fps, and the TV would use its big interpolation ASIC to bring it up to 120Hz and upscale it to 4K.

    And actually, it looked good! It felt smooth! Input lag wasn’t great, but absolutely playable.

    I don’t have a PC that can do framegen (I have a 3090 now), but ironically, the DLSS framegen demos I’ve seen didn’t have interpolation as good as the Sony’s. And I believe Sony/Samsung TVs support “no next frame” interpolation, so they don’t blow up input lag.
    I’m not saying it’s a great idea, but there are ways of doing this that aren’t terrible.