• Ekky@sopuli.xyz

    So now LLM makers actually have to sanitize their datasets? The horror

      • Ekky@sopuli.xyz

        Oh no, it’s very difficult, especially on the scale of LLMs.

        That said, the rest of us (those of us with any amount of respect for ourselves, our craft, and our fellow humans) have been sourcing our data carefully since long before NNs, for example by asking the relevant authority for it (e.g. asking the postal service for images of handwritten addresses).

        Is this slow and cumbersome? Oh yes. But it delays the need for over-restrictive laws, just like with RC craft before drones. And by extension, it lets those who couldn’t source the material they needed through conventional means, or those small new startups with no idea what they were doing, skirt the gray area and still end up with a small and hopefully usable dataset.

        And now someone had the grand idea to not only scour and scavenge the whole internet with wild abandon, but also to boast about it. So now everyone gets punished.

        Lastly: don’t get me wrong, laws are good (duh), but less restrictive or incomplete laws can be nice as long as everyone respects each other. I’m excited to see what the future brings in this regard, but I hate the idea that those who brought this change about will likely be the only ones to walk away free.

      • Ekky@sopuli.xyz

        You don’t have to sanitize the weights; you have to sanitize the data you use to get the weights. Two very different things, and while I agree that sanitizing an LLM after training is close to impossible, sanitizing the data you feed it is much, much easier.
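
        The distinction fits in a few lines of Python. This is purely a toy sketch with made-up record fields (`source`, `license`) and a hypothetical allowlist, not anyone’s actual pipeline:

        ```python
        # Sketch: sanitize the corpus *before* training, so the weights never see
        # material of unknown provenance. Field names here are illustrative only.

        ALLOWED_LICENSES = {"cc0", "cc-by", "public-domain", "explicit-consent"}

        def sanitize(records):
            """Yield only records whose origin and license can be accounted for."""
            for record in records:
                if not record.get("source"):
                    continue  # unknown origin: drop rather than risk it
                if record.get("license", "").lower() not in ALLOWED_LICENSES:
                    continue  # not cleared for training: drop
                yield record

        # train(model, sanitize(raw_corpus))  # filtering happens before any weights exist
        ```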

    • leftzero@lemmynsfw.com

      They can’t.

      They went public too fast chasing quick profits, and now the well is too poisoned to train new models on up-to-date information.