• circuitfarmer · 2 years ago

    This is the thing about LLMs and AI/ML in general that I think people need to understand (especially corporate higher-ups throwing money at AI to save on labor costs): it doesn’t learn anything. It doesn’t generate anything novel. It finds patterns and repeats those patterns in, ultimately, pseudorandom ways (and the nondeterminism that results makes the output look more impressive than it actually is).
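
    A minimal sketch of what I mean, assuming a toy next-token setup (the vocabulary, logits, and temperature below are invented for illustration): different seeds give different outputs, but every output is just a pseudorandom draw from the same fixed set of learned patterns.

        import numpy as np

        # Toy "learned" next-token distribution; tokens and logits are made up.
        vocab = ["cat", "dog", "bird", "fish"]
        logits = np.array([2.0, 1.5, 0.5, 0.1])

        def sample(temperature=1.0, seed=None):
            """Pseudorandomly pick a token in proportion to its learned frequency."""
            rng = np.random.default_rng(seed)
            probs = np.exp(logits / temperature)
            probs /= probs.sum()
            return vocab[rng.choice(len(vocab), p=probs)]

        # Different seeds -> different outputs (nondeterminism), but nothing
        # outside the four patterns the "model" already has.
        print([sample(seed=s) for s in range(5)])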

    We’ll see tremendous stagnation as AI adoption increases, because you simply can’t feed models enough fresh data when so much of what’s available is already AI-generated to begin with. The entropy of the patterns they can learn from keeps shrinking.
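
    A rough simulation of that feedback loop, under the simplifying assumption that each “generation” is trained only on samples from the previous one (the 50-symbol vocabulary and sample size are arbitrary): the entropy of the learned distribution tends to drift downward, because rare patterns drop out of the sample and never come back. Real pipelines mix fresh and synthetic data, so this is only the direction of the drift, not a forecast.

        import numpy as np

        rng = np.random.default_rng(0)

        def entropy_bits(p):
            """Shannon entropy (bits) of a categorical distribution."""
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        # Arbitrary toy "language": 50 symbols with random base frequencies.
        dist = rng.dirichlet(np.ones(50))
        print(f"generation  0: {entropy_bits(dist):.3f} bits")

        for generation in range(1, 16):
            # Train the next model only on output sampled from the current one.
            samples = rng.choice(50, size=500, p=dist)
            dist = np.bincount(samples, minlength=50) / 500.0
            print(f"generation {generation:2d}: {entropy_bits(dist):.3f} bits")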