I noticed Debian does this by default, and the Arch wiki recommends it, citing improved security and upstream practice.

I don’t get why that’s more secure. Is the assumption that torrents might be infected, and the goal to limit what a virus can access to the dedicated user’s home directory (/var/lib/transmission-daemon on Debian)?
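For context, the confinement looks roughly like this in systemd terms. This is only a sketch: the User= value matches Debian’s packaging, but the hardening lines are illustrative additions, not necessarily what Debian actually ships.

```ini
# Sketch of a systemd drop-in confining the daemon to a dedicated user.
# User= matches Debian's packaging; the other options are illustrative.
[Service]
User=debian-transmission
ProtectHome=true
ProtectSystem=full
ReadWritePaths=/var/lib/transmission-daemon
```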

        • BaumGeist@lemmy.ml

          The point of security isn’t just protecting yourself from the threats you’re aware of. Maybe there’s a compromise in your distro’s password hashing, maybe your password sucks, maybe there’s a kernel compromise. Maybe the torrent client isn’t a direct route to root, but one step in a convoluted chain of attack. Maybe there are “zero days” that are only called such because the clear web hasn’t been made aware yet, but they’re floating around on the dark web already. Maybe your passwords get leaked by a flaw in Lemmy’s security.

          You don’t know how much you don’t know, so you should implement as many good security practices as you can. It’s called the “Swiss Cheese” model of security: you layer enough that the holes in one layer are blocked by a different layer.

          Plus, keeping strong security measures in place for something that’s almost always internet-connected is a good idea regardless of how cautious you think you’re being. It’s why modern web browsers are basically their own VM inside your PC these days, and it’s why torrent clients shouldn’t have access to anything besides the download/upload folders and whatever minimal set of network permissions they need.

          • nanook@friendica.eskimo.com

            @BaumGeist @Quail4789 If you get software from an untrusted source, you run this risk regardless of whether it arrives via torrent, FTP, HTTPS, SCP, etc. And usually when you download via a torrent, the supplying site will publish a hash you can compare against to make sure the file wasn’t corrupted in transit.
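            That check is easy to exercise end to end. A small sketch (the file names here are made up, and the hash file stands in for what a site would publish):

```shell
# Simulate publishing and verifying a hash; filenames are illustrative.
printf 'payload' > download.bin        # stand-in for the torrented file
sha256sum download.bin > SHA256SUMS    # the digest the site would publish
sha256sum -c SHA256SUMS                # reports "download.bin: OK" while intact
printf 'tampered' > download.bin       # simulate in-transit corruption
sha256sum -c SHA256SUMS || echo 'mismatch detected'   # now reports FAILED
```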

        • nanook@friendica.eskimo.com

          @Quail4789 @rc__buggy@sh.itjust.works there is no known exploit in sudo itself, but there IS a known exploit in the library it uses to elevate privileges, at least in older versions. Also, I make full weekly system backups, so worst comes to worst I’m never going to lose more than a week’s data.

      • mik@sh.itjust.works

        It may be mostly “security theater,” but it requires almost no extra effort and drastically increases the difficulty of compromise by adding privilege escalation as another requirement for gaining root access.

        • loutr@sh.itjust.works

          The point is also to minimize potential damages caused by a bug in the software. Just this year there have been multiple data-destroying bugs in publicly released software. If the app runs as a server it’s usually trivial to have it run as a dedicated user, with just enough permissions to do its job.
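          The “dedicated user with just enough permissions” setup really is a few commands. A hedged sketch, assuming root; “torrentd” and all the paths are illustrative names, not a real package:

```shell
# Create an unprivileged system account and a directory it owns.
# "torrentd" and the paths are illustrative, not a real package.
useradd --system --home-dir /var/lib/torrentd --shell /usr/sbin/nologin torrentd
mkdir -p /var/lib/torrentd/downloads
chown -R torrentd:torrentd /var/lib/torrentd
# Started under that account, the process can write only where it owns files:
su -s /bin/sh -c 'touch /var/lib/torrentd/downloads/ok' torrentd   # allowed
su -s /bin/sh -c 'touch /etc/nope' torrentd || echo 'write refused'
```

A service manager (systemd’s User=, an init script’s su, etc.) then does the same thing at boot.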

          It’s just good practice; even if the risks are low, why take them at all?

        • Fonzie!@ttrpg.network

          Not yet, but if every system were only protected against what has already happened, rather than also against what could happen, we’d get hacked a lot more often!