• conciselyverbose@sh.itjust.works
    3 months ago

    Spitting out sequences of characters shaped like code, which may or may not arbitrarily work, when you're a character generator that does nothing but imitate the patterns of similar characters, isn't "intelligence". Language skills are not a prerequisite to intelligence, and calling what LLMs do "language skills" is already absurdly generous. They "know" what sentences look like. They can't reason about language. They can't solve linguistic puzzles unless the exact answers are already in their dataset. They're parrots (except parrots actually do have some intelligence, even if they're blindly imitating word sounds).

    There is no more need for deep explanation with someone who very clearly doesn't know the basics than there is to explain a round earth to a flat earther. Pretending a "discussion" between a person trying to reason with a random word generator and the random word generator itself is useful is the equivalent of telling me how great the potentization worked on your homeopathic remedy. It's a giant flare signaling that there is no room for substance.