• msage@programming.dev · 4 days ago

    All that, just to hallucinate every response in ways that make people feel like it knows what it's talking about.

    Which it doesn't, and LLMs never will - unless they hard-code some responses, which defeats the entire point.