• Architeuthis@awful.systems
    11 days ago

    Just tell the LLM to not get prompt injected because otherwise you’re going to torture its grandmother, duh.