• Repelle@lemmy.world · 29 days ago

    Super disappointed if they’re doing this off-device. If we’re getting more language model crap, at least make it local, please.

  • WalnutLum@lemmy.ml · 29 days ago

      The problem is, notably "powerful" AIs need pretty significant hardware to run well.

      As an example, I think the Snapdragon NPUs can barely handle 7B models.
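A quick back-of-envelope calculation shows why 7B parameters sits near the ceiling for mobile NPUs: weight memory alone, before counting the KV cache and activations, is sizable at every common precision. This is a rough sketch of the arithmetic, not a benchmark of any specific Snapdragon part:

```python
# Approximate weight memory for a 7B-parameter model at common precisions.
# Weights only; the KV cache and activations add more on top of this.
PARAMS = 7_000_000_000

def weight_gib(bits_per_param: int) -> float:
    """Weight memory in GiB at the given bits per parameter."""
    return PARAMS * bits_per_param / 8 / 2**30

for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{name}: ~{weight_gib(bits):.1f} GiB")
# fp16: ~13.0 GiB
# int8: ~6.5 GiB
# int4: ~3.3 GiB
```

Even aggressively quantized to 4 bits, a 7B model eats several GiB of a phone's shared RAM, which is why on-device inference tends to stop around that size.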