• AlexWIWA@lemmy.ml
    4 months ago

    I’m willing to bet we’ll see something to train language models on the user’s hardware soon enough. Folding at home, but instead of helping science, Google steals your electricity.

    • vvv@programming.dev
      4 months ago

      I really think that’s the secret end game behind all the AI stuff in both Windows and macOS. MS account required to use it. (Anyone know if you need to be signed in to an Apple ID for Apple’s AI?) “On device” inference that sometimes reaches out to the cloud, when it feels like it. Maybe sometimes the cloud will reach out to you and ask your CPU to help out with training.

      That, and better local content analysis. “No, we aren’t sending everything the microphone picks up to our servers, of course not. Just the transcript that your local STT model made of it, you won’t even notice the bandwidth!”