• Visstix@lemmy.world · 22 days ago

    ray-tracing? Sure let’s give it a try.

    Ok I don’t see a difference and my fps dropped by a 100.

      • Hazzard@lemmy.zip · 22 days ago (edited)

        The big benefit of raytracing now, imo (which most games aren’t doing), is that it frees games up to introduce dynamic destruction again. We used to have all kinds of destructible walls and bits and bobs around, with flat lighting, but baked lighting has really limited what devs can do: if you break something, you need a solution to handle all the ways the lighting changes, so for the majority of games devs just make everything stiff and unbreakable.

        Raytracing is that solution. Plug and play, the lighting just works when you blow stuff up. DOOM: TDA is the best example of this currently (although still not a direct part of gameplay), with a bunch of destructible stuff everywhere, and that actually blows up with a physics sim rather than a canned animation. All the little boards have perfect ambient occlusion and shadows, because raytracing just does that.

        It’s really fun, if minor, and one of the things I actually look forward to more games doing with raytracing. IMO that’s why raytracing has whelmed most people, because we’re used to near-flawless baked lighting, and haven’t really noticed the compromises that baked lighting has pushed on us.
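To make the “lighting just works” point concrete, here is a toy sketch in Python (the function name and the tiny grid scene are made up for illustration, not taken from any real engine): with ray-cast visibility there is nothing to rebake when geometry is destroyed; the next lighting query simply sees through the new hole.

```python
def blocked(grid, x0, y0, x1, y1):
    """Walk the cells between two points (naive line march); True if a wall blocks the ray."""
    steps = max(abs(x1 - x0), abs(y1 - y0))
    for i in range(1, steps):
        x = x0 + round((x1 - x0) * i / steps)
        y = y0 + round((y1 - y0) * i / steps)
        if grid[y][x] == "#":
            return True
    return False

grid = [list(row) for row in [
    "........",
    "...#....",
    "........",
]]
light, target = (0, 1), (6, 1)

print(blocked(grid, *light, *target))   # wall at (3, 1) blocks the light -> True
grid[1][3] = "."                        # "blow up" the wall; no lighting rebake needed
print(blocked(grid, *light, *target))   # the lighting query just works -> False
```

With baked lightmaps, that second query would need precomputed data for every possible destruction state; with per-query ray casts, the scene edit is the whole update.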

        • Narauko@lemmy.world · 22 days ago

          If ray tracing can give me back the fun of tunneling through the ground with explosives that the first Red Faction games let me do, I will 100% change my mind on the technology. I have missed 100% destructible environments.

          • boonhet@sopuli.xyz · 22 days ago

            That absolutely is something that Ray Tracing could simplify/enable, yes!

            It’s not that ray tracing is a gimmick, it’s more that modern cards still aren’t powerful enough to fully trace every ray, so some cheating is still done with baked lighting I believe. Plus a lot of devs might not have gotten accustomed to using it properly yet.

        • yogurtwrong@lemmy.world · 22 days ago

          Oh yeah. For example, the game “Teardown” uses software ray tracing for lighting. Most Minecraft shaders also do ray tracing, I think…

          Of course these are voxel based examples which are a lot easier on the processor. You need hardware ray tracing for high poly destructible structures and I have absolutely nothing against the technology.

          I just don’t like how the technology is abused by studios to push out unoptimized games running at ~50 fps on 3090s.
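As a rough illustration of why voxel scenes are comparatively easy to ray trace in software (a sketch with made-up names, not Teardown’s actual code): tracing a ray through a voxel grid is just stepping through cells and testing membership, with no per-triangle intersection tests at all.

```python
import math

def first_hit(solid, origin, direction, max_dist=100.0, step=0.05):
    """March a ray through a voxel set; return the first solid voxel, or None.
    Assumes non-negative coordinates (int() truncation would misbehave below zero)."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / n, dy / n, dz / n   # normalize the ray direction
    t = 0.0
    while t < max_dist:
        voxel = (int(ox + dx * t), int(oy + dy * t), int(oz + dz * t))
        if voxel in solid:
            return voxel
        t += step
    return None

solid = {(3, 0, 0)}                               # a single voxel "wall" on the x axis
print(first_hit(solid, (0.5, 0.5, 0.5), (1, 0, 0)))   # hits (3, 0, 0)
```

Destroying geometry is just removing voxels from the set, which is part of why a voxel game could ship fully software-ray-traced lighting on GPUs without RT hardware.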

          • KubeRoot@discuss.tchncs.de · 21 days ago

            Isn’t Teardown fully raytraced? As in, all rendering being raytracing? I don’t have a source, but remember it being talked about.

            • yogurtwrong@lemmy.world · 21 days ago

              It is. Instead of hardware RT, it uses a software implementation of ray tracing that runs on all GPUs, since it doesn’t need to do that much ray tracing. As a side note: Teardown has its own engine.

              It also boils down to the fact that RTX GPUs were not that popular when the game was released.

          • Hazzard@lemmy.zip · 22 days ago (edited)

            Oh, does it? I was literally thinking to myself that Teardown was an interesting example of destruction, and wondering how they did their lighting. RT makes perfect sense, that must be one of the earliest examples of actually doing something you really couldn’t without RT (at least not while lighting it well).

            But yes, agreed that recent performance trends are frustrating, with DLSS and frame gen smeared on to cover for terrible performance. Feels like we’re in a painful in-between period with a lot of awkward stuff going on, and also deadlines/crunch/corporate meddling etc. causing games to come out half-baked. Hopefully this stuff does reach maturity soon and we can have some of this cool new stuff without so many other compromises.

        • Semperverus@lemmy.world · 21 days ago (edited)

          If you want it to read as “a hundred,” you simply write out in letters “a hundred” (without the quotes).

          If you want to say “one hundred,” you can write 100 or “one hundred” (without the quotes). Typically, numbers in the single or double digits are written out in letters, switching to numerals at three digits and above, to be considered “correct” by some academics. Always use numerals when working in scientific or data-oriented settings.

          Finally, if you want to write “a one hundred,” simply write “a 100” (without the quotes).

          • Visstix@lemmy.world · 21 days ago

            Thanks, I will try to remember it. We just say “hundred” for 100, and I know “a hundred” is a thing, so “a 100” makes sense to me.

  • bulwark@lemmy.world · 22 days ago

    What, am I supposed to play on “High” settings like some kinda peasant? Jk, my GPU is so old that if it were a kid, it would be starting 1st grade this year.

    • LH0ezVT@sh.itjust.works · 22 days ago

      We really need to get rid of this line-go-up mentality, because it translates directly into tech companies telling you to buy something new every few months. Phones, GPUs… Every time they can push for shorter replacement cycles, they will. Good on you for not caving in to the pressure; my 1050 Ti still runs the games I’ve been playing since day one as well as it did back then.

      • Rai@lemmy.dbzer0.com · 22 days ago

        Maaaan. My 1080 was chuggin’ for games I was playing three years ago. I’m lucky I got a sweet deal on some secondhand 3070 Tis for my partner and myself from someone my mum knows. I got a new game a couple days ago for us, and we both had to drop down to “high” for 100 FPS at 1440p.

        • InFerNo@lemmy.ml · 22 days ago

          I have a 1060 with 3 GB of VRAM, my kids both have a 1080, and they still work great.

          • Rai@lemmy.dbzer0.com · 22 days ago

            I don’t doubt they work great for a lot of games! But I’m not going to be able to use those to play Abiotic Factor at 1440p…

      • CallMeAnAI@lemmy.world · 22 days ago

        “I don’t have a large TV or monitor so nobody needs an up to date GPU!”

        It’s a much different story at 1440p and 4K.

        • LH0ezVT@sh.itjust.works · 22 days ago (edited)

          I mean, if a slight increase in visual fidelity is worth a couple hundred (if not a thousand) bucks every year or two to you, then sure, treat yourself. But I don’t see the need to buy a slightly faster thing every year that basically does the same as the old one. And that’s before mentioning the resources used up producing soon-to-be e-waste, or software bloat.

          It is always the same story, cars, phones, computers, smart fridges, clothes… companies try to push people to buy the shiny new thing for obvious reasons. Companies trying to build products that last get out-competed. The line must go up.

          • CallMeAnAI@lemmy.world · 22 days ago

            GTFO 🤣

            The card is 9 years old, not 2. I bet you the number of gamers swapping cards every 2 years is in the single-digit percentages.

            I am in a peer group of friends and colleagues making 175-300k a year and not a single one of them is swapping a GPU every 2 years.

            • LH0ezVT@sh.itjust.works · 22 days ago

              As I said, if 4k and high settings is worth the investment for you, sure, go for it, treat yourself. I am in no position to preach about ascetic life or anything. Eat out, go on holidays, buy a new gaming PC, life is short.

              I cannot seem to find numbers on GPUs in particular, only marketing or AI bullshit “articles”. But smartphones seem to have a replacement cycle in western countries ranging from 1.5-3 years, depending on who you ask. And that is average, meaning that for every weirdo like me, who keeps their phones until they break, there are around three people who get a new phone every year.

              That sounds pretty insane to me. Sure, new products are better. But I don’t want to own a better product. I want to play games, or in the case of smartphones, chat and doom-scroll on the go.

      • Final Remix@lemmy.world · 22 days ago

        My 1660 Super works with everything except shitty Unreal 5 games with forced Lumen and stuff. I just replaced mine with an AMD something-or-other, but it wasn’t because of performance.

  • rumschlumpel@feddit.org · 22 days ago (edited)

    I remember when I was playing the early access version of Baldur’s Gate 3 at potato fidelity. I bought a new (to me) GPU since then, but I’m pretty sure FSR and its Linux implementation massively improved since then, too.

    TBH I never bother with “high” settings in the first place unless the game is like 15 years old, I’d rather have an even quieter PC and/or a more stable framerate than that little bit more of visual fidelity.

    • MudMan@fedia.io · 22 days ago

      I recognize the words you are using, but they don’t seem to make any sense to me when put in that order.

  • wander1236@sh.itjust.works · 22 days ago

    Man, DLSS (the upscaling part) is such a great technology. I’m definitely glad that FSR isn’t bound to a brand or model, but DLSS just does so much better.

    It’s a shame they decided to give up on improving the upscaling and instead go with fake frames that add ghosting and latency.

    • sp3ctr4l@lemmy.dbzer0.com · 22 days ago (edited)

      Have you heard of our lord and savior Optiscaler?

      https://github.com/OptiScaler/OptiScaler

      Doo dee doo, there I go again, hacking more frames and render quality into CyberPunk 2077, so I can prettify my cyberdeck while I’m on my Steam Deck, wheee!

      Hey you wouldn’t maybe happen to have any unsecure bluetooth devices nearby you, set to maybe accept any pairing attempts?