• fearout@kbin.social · 1 year ago

    Locking social norms at some predetermined stage is a great way to curb all progress. Like, slavery was a social norm at some point.

    • moistclump@lemmy.world · 1 year ago

      I haven’t read the article. Are you being sarcastic? Or is it a more secure option?

      • ramble81@lemmy.world · 1 year ago

        It’s probably the most secure, commonly available messaging platform right now. They keep a bare minimum of metadata on their servers, basically just enough to link you to your account on the platform. Beyond that, everything is e2e encrypted and they can’t tell authorities anything.

        Other platforms are a sliding scale, from to/from/time data all the way up to full messages.

    • FaceDeer@kbin.social · 1 year ago

      Alternately, you could be getting a personal AI buddy who can whisper a warning in your ear when you’re about to misread the room and do something that’ll cause you a lot of trouble.

  • Pons_Aelius@kbin.social · 1 year ago

    If they used social media as training data, it will say everything is normal human behaviour…

    • FaceDeer@kbin.social · 1 year ago

      But there was nothing wrong with the basic idea of the tech in Minority Report. It worked. They saved many lives by preventing imminent murders with it. The main problem in the movie was that they leapt straight from “your name came out of this machine” to “ten years dungeon. No trial.”

      Movies are designed to sell as many tickets as possible by presenting scenarios that provoke endorphins. They’re not serious scenarios you should be making real-world decisions based off of.

    • Audbol@lemmy.world · 1 year ago

      It appears there is. They’re using it to gauge an area’s general feelings toward a US military presence (whether the local population feels they need help from the US or not), as a means of helping to determine the best locations for setting up garrisons and bases during a conflict. Which makes sense: you don’t want to choose an area that really doesn’t want you there, as the locals would likely become an asset to the enemy and put your soldiers at risk.

  • krzschlss@kbin.social · 1 year ago

    We should be wary of anything the Pentagon considers a “study.” Especially when it wants to control social norms.

  • hawkwind@lemmy.management · 1 year ago

    DAE feel like they woke up one day recently and “AI” suddenly has the answer to EVERY SINGLE PROBLEM EVER? Yet, nothing is getting noticeably better?

    “AI” doesn’t have to work a dead-end job to feed its family, or turn to alcohol because it’s lonely and scared of being forgotten. Its training data is a curated version of the human experience based on the Internet!

    It’s playing human instead of being human and ALL of its solutions will assume that’s “normal.”

    Imagine a five star general googling “should I attack this country?” That’s silly right? Well that’s what’s happening. It’s just being wrapped in a way that makes it look novel.

    These are algorithms designed to mimic humans. When faced with any actual controversy, they must be persuaded to answer in an “acceptable,” predetermined manner.

    The golden rule.