Oh hi there, person I’ve upset by expressing my opinion, who is now going back through my post history to find something to use against me! If you’re reading this, it’s because you’ve already lost the argument :)

  • 0 Posts
  • 28 Comments
Joined 2 months ago
Cake day: April 26th, 2025



  • I’m not sure if English isn’t your first language or if you’re just being wilfully obtuse, but I didn’t call it emulation. I said it is essentially emulation, like WINE. I know WINE isn’t emulation; that’s why I said “essentially”. It’s doing the same job: translating calls from one set of APIs so they run on other hardware/architectures. It’s not emulation, but it’s essentially the same thing.

    Why would Nvidia want competition? AMD don’t want competition either, but they made FSR work on everything because they were so far behind Nvidia (and because it was all done in software, requiring no special hardware) that they had to give it away to try to catch up.

    Companies making proprietary tech is not anti-consumer - unless of course you think that anything short of making everything free and open source is “anti-consumer”, which I suspect you might?
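The WINE comparison above can be sketched in code. This is a purely hypothetical illustration of what a translation layer does conceptually (all names here are made up, not WINE's actual internals): foreign API calls are forwarded to native equivalents, with no instruction emulation involved.

```python
# Hypothetical sketch of a translation layer: map calls from one API
# onto equivalent calls in another. Nothing is emulated - the work is
# simply redirected to the native implementation.

def native_draw(x: int, y: int) -> str:
    """Stands in for the target platform's own API."""
    return f"native draw at ({x}, {y})"

# Translation table: foreign API name -> native implementation.
TRANSLATION_TABLE = {
    "ForeignDraw": lambda args: native_draw(args["x"], args["y"]),
}

def translate_call(name: str, args: dict) -> str:
    """Forward a 'foreign' API call to its native equivalent."""
    if name not in TRANSLATION_TABLE:
        raise NotImplementedError(f"no translation for {name}")
    return TRANSLATION_TABLE[name](args)

print(translate_call("ForeignDraw", {"x": 10, "y": 20}))
```

No foreign CPU instructions are interpreted anywhere, which is exactly why WINE insists it "Is Not an Emulator" - yet from the application's point of view the effect is the same.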






  • Linking to an 81-page document isn’t helpful. What specifically in there are you referring to?

    > No it doesn’t. It allows you to run a game at a higher resolution for no reason at all

    Other than the reasons I gave - running it at higher settings while maintaining a playable framerate. The point is you don’t have to lower settings as much with DLSS.

    You fundamentally don’t understand what it is and what it allows you to do.



  • > Second cuda is not hardware dependend

    That’s essentially an emulation layer. Nvidia make DLSS specifically for their GPUs, which have CUDA cores on them. It’s the reason why DLSS doesn’t work on their pre-CUDA core hardware.

    Could they make DLSS work on AMD’s hardware? Sure, they could - but it would not be DLSS as we know it, and again - why would they? They are allowed to make stuff exclusively for their hardware.


  • I 100% know what DLSS is, though by the sounds of it you don’t. It is “AI” as much as any other thing is “AI”. It uses models to “learn” what it needs to reconstruct and how to reconstruct it.

    What do you think DLSS is?

    > You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before ejected from cache on the card. It doesn’t upsample, it does intelligently render anything new, and there is no additive anything. It seems you think it’s magic, but it’s just fast sorting memory tricks.

    This is blatantly and monumentally wrong lol. You think it’s literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    > It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    That’s not what I claimed though. Where did I claim that?

    What it does is allow you to run a game at higher settings than you could usually at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at “1080p” Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

    > Go read up.

    Ditto.
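The performance claim in the comment above comes down to pixel counts: rendering internally at a fraction of the output resolution means far fewer pixels are shaded per frame before the upscale. A minimal sketch, assuming the commonly cited per-axis scale factors for DLSS quality modes (these are widely reported values, not official constants):

```python
# Commonly cited DLSS per-axis render-scale factors (assumed, not
# official constants): the internal render resolution is this fraction
# of the output resolution on each axis.
SCALE = {
    "Quality": 2 / 3,          # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution for a given output size and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

out_w, out_h = 1920, 1080
for mode in SCALE:
    w, h = internal_resolution(out_w, out_h, mode)
    saved = 1 - (w * h) / (out_w * out_h)
    print(f"{mode}: render {w}x{h}, ~{saved:.0%} fewer pixels shaded")
```

At 1080p Quality mode, for instance, the GPU shades a 1280x720 frame - under half the pixels of native 1080p - which is where the headroom for higher settings at the same framerate comes from.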








  • > First: https://www.corsair.com/us/en/explorer/gamer/gaming-pcs/rtx-5090-5080-and-5070-series-gpus-everything-you-need-to-know/

    What exactly am I supposed to be looking at here? Do you think that says that the GPUs need their own PSUs? Do you think people with 50 series GPUs have 2 PSUs in their computers?

    > It’s not innovative, interesting, or improving performance, it’s a marketing scam. Games would be run better and more efficiently if you just lower the requirements.

    DLSS isn’t innovative? It’s not improving performance? What on earth? Rendering a frame at a lower resolution and then using AI to upscale it to look as good as or better than a full-resolution render isn’t innovative?! Getting an extra 30fps vs native resolution isn’t improving performance?! How isn’t it?

    You can’t just “lower the requirements” lol. What you’re suggesting is making the game worse so that people with worse hardware can play it at max settings. That is absolutely absurd.

    Let me ask you this - do you think that every new game should still be being made for the PS2? PS3? Why or why not?
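The 20fps-to-30fps example used earlier is easier to see in frame-time terms: every extra frame per second has to come out of the per-frame time budget, and that is exactly what shading fewer pixels buys. A quick sketch:

```python
# Framerate vs per-frame time budget: fps is frames per second, so
# each frame gets 1000/fps milliseconds of GPU time.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given fps."""
    return 1000.0 / fps

before, after = 20, 30
print(f"{before} fps = {frame_time_ms(before):.1f} ms/frame")
print(f"{after} fps = {frame_time_ms(after):.1f} ms/frame")
# Going from 20 fps to 30 fps means shaving ~16.7 ms off every
# frame - which rendering fewer pixels and upscaling can provide.
```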



  • FreedomAdvocate@lemmy.net.au to Technology@lemmy.world · NVIDIA is full of shit

    I assume people mean 3440x1440 when they say 1440, as it’s way more common than 2560x1440.

    Your card is comparable to a 5070, which is basically the same price as yours. There’s no doubt the 5080 and 5090 are disappointing in their performance compared to these mid-high cards, but your card can’t compete with them, and Nvidia offer a comparable card at the same price point as AMD’s best card.

    Also the AMD card uses more power than the Nvidia equivalent (9070 XT vs 5070).