• @Monument
    72 months ago

    Huh. That’s interesting —

    I’ve heard that people with severe schizophrenia often don’t know they are schizophrenic.
    It isn’t fair or right to characterize LLMs with human medical conditions (unfair to the humans, that is). But similar to humans in that situation, LLMs don’t “know” they are speaking nonsense.

    I wonder if it would be possible to have clusters of LLMs “talk to” each other to establish baselines.
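    One toy way to picture that idea: ask several models the same question and treat majority agreement as the baseline, flagging whichever model disagrees. This is only a sketch of the cross-checking concept — the `model_*` functions below are hypothetical stubs standing in for real LLM calls, not any actual API.

    ```python
    from collections import Counter

    # Hypothetical stubs standing in for three independent LLMs.
    # A real version would call actual model APIs instead.
    def model_a(q): return "Paris"
    def model_b(q): return "Paris"
    def model_c(q): return "Lyon"

    def cross_check(question, models):
        """Collect answers, take the majority as the baseline,
        and flag any model that disagrees with it."""
        answers = [m(question) for m in models]
        majority, _count = Counter(answers).most_common(1)[0]
        outliers = [m.__name__ for m, a in zip(models, answers) if a != majority]
        return majority, outliers

    consensus, outliers = cross_check(
        "What is the capital of France?", [model_a, model_b, model_c]
    )
    # consensus is "Paris"; model_c is flagged as the outlier
    ```

    Majority voting is the crudest possible baseline, of course — correlated errors (all models confidently wrong the same way) would sail right through it.
    
    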

    (Sorry, I just say whatever pops into my head. I know this comment isn’t contextually relevant.)