I’m sure it depends on the AI tools and features being used, but with all the “magic” obfuscation companies wrap around them, it’s not exactly clear how much of the processing happens locally versus remotely.

With some of the text stuff, I’m relatively sure most of it involves data exchange to work, but what about some of the image/video editing and audio processing? That’s where things get much murkier, at least to me, and where this question largely stems from.

I’m aware more processors are being made specifically to support these features, so it seems like there are efforts to make more of this happen locally, on one’s own devices, but… what does the present situation look like?

  • Lung@lemmy.world

    Basically: even a decent model installed locally is gigabytes in size. If you didn’t install gigabytes of data, then it’s remote.

  • sun_is_ra@sh.itjust.works

    Most AI providers do the processing remotely. As a rule of thumb, if you can’t use a specific AI service without internet, then it’s done remotely.

  • ArbiterXero@lemmy.world

    It varies some….

    Most of it is remote, but “Siri” actually does a lot locally, and I assume Google Assistant does too.

    Those are likely the only two that do much locally; everyone else does it all remotely.

    • Atemu@lemmy.ml

      “Siri” actually does a lot locally, and I assume Google assistant does too.

      On what basis? It’s Google, so I would assume any and all data you could possibly input into their apps and services is used against you.

      • ArbiterXero@lemmy.world

        Mostly on a “cost” basis, but for Google Assistant it’s definitely an assumption. Siri I’ve tested.

      • Venator@lemmy.nz

        Yeah, most “Hey Google” features don’t work if you don’t have an internet connection. I couldn’t even start music while driving through an area with no signal; I had to pull over and use the GUI…

  • Archpawn@lemmy.world

    If the download size is in the gigabytes and you need a good graphics card to run it, you’re doing it locally. Otherwise, it’s remote.

  • Yer Ma@lemm.ee

    It’s all remote and they keep everything you give them

  • LainTrain@lemmy.dbzer0.com

    It depends.

    Are you using ChatGPT in a browser window? Then yeah, that’s remote.

    Are you running SD through Automatic1111 locally? Are you running Mistral or Llama models locally?

    Then you’re running locally.
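
As an illustration of that last case, here is a minimal sketch of fully local inference, assuming the llama-cpp-python bindings and a GGUF model file you have already downloaded (the path below is just a placeholder, not a real model you necessarily have). Once the weights are on disk, nothing in this snippet touches the network.

    # Minimal local-inference sketch using llama-cpp-python.
    # The model path is a placeholder -- point it at any GGUF weights
    # (Mistral, Llama, etc.) already on disk; no network access is needed.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/mistral-7b-instruct.Q4_K_M.gguf")  # placeholder path
    out = llm("Q: Is this prompt processed locally or remotely? A:", max_tokens=40)
    print(out["choices"][0]["text"])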