• andallthat@lemmy.world · 14 hours ago

    It’s not that LLMs aren’t useful as they are. The problem is that they won’t stay as they are today, because they are too expensive. There are two ways for this to go (or an eventual combination of both):

    • Investors believe LLMs are going to get better and keep pouring money into “AI” companies, allowing them to operate at a loss for longer. That’s tied to the promise of an actual “intelligence” emerging out of a statistical model.

    • Investments stop pouring in, the bubble bursts and companies need to make money out of LLMs in their current state. To do that, they need to massively cut costs and monetize. I believe that’s called enshittification.

    • pixxelkick@lemmy.world · 14 hours ago

      You skipped possibility 3, which is actively happening:

      Advancements in tech enable us to produce results at a much much cheaper cost

      Which is happening with diffusion-style LLMs that simultaneously cost less to train and less to run, while also producing faster and better-quality outputs.
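
      To make the speed part concrete, here is a back-of-the-envelope sketch (my own illustration with assumed numbers, not something from this thread): autoregressive decoding needs roughly one forward pass per generated token, while a diffusion-style text model refines the whole sequence in parallel over a fixed number of denoising steps, so its pass count does not grow with output length. The 32-step figure is an arbitrary assumption, and the sketch ignores per-pass cost differences such as KV caching.

      ```python
      # Toy comparison of forward-pass counts; all numbers are illustrative assumptions.

      def autoregressive_passes(num_tokens: int) -> int:
          # Roughly one forward pass per generated token.
          return num_tokens

      def diffusion_passes(num_tokens: int, denoise_steps: int = 32) -> int:
          # The whole sequence is denoised in parallel over a fixed number of steps,
          # independent of output length (32 is an assumed, arbitrary step count).
          return denoise_steps

      for n in (128, 1024, 8192):
          print(f"{n:>5} tokens: autoregressive ~{autoregressive_passes(n)} passes, "
                f"diffusion ~{diffusion_passes(n)} passes")
      ```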

      That’s a big part people forget about AI: it’s a feedback loop of improvement as soon as you can start using AI to develop AI

      And we are past that mark now: most developers have easy access to AI as a tool to improve their performance, and AI is made by… software developers

      So you get this loop where, as we make better and better AIs, we get better and better at making AIs with the AIs…

      It’s incredibly likely the new diffusion AI systems were built with AI assisting in the process, enabling them to make a whole new tech innovation much faster and easier.

      We are now in the uptick of the singularity, and have been for about a year now.

      Same goes for hardware: it’s very likely now that Nvidia has AI incorporated into their production process, using it for micro-optimizations in its architectures and designs.

      And then those same optimized GPUs turn around and get used to train and run even better AIs…

      In 5-10 years we will look back on 2024 as the start of a very wild ride.

      Remember we are just now in the “computers that take up entire warehouses” step of the tech.

      Remember that in the 80s, a “computer” cost a fortune, took tonnes of resources and multiple people to run, took up an entire room, was slow as hell, and could only do basic stuff.

      But now, 40 years later, they fit in our pockets and are (non-hyperbole) billions of times faster.

      I think by 2035 we will be looking at AI as something mass-produced for consumers to just put in their homes: you go to Best Buy and compare different AI boxes to pick which one you are gonna get for your home.

      We are still at the stage of people in the 80s looking at computers and pondering, “Why would someone even need to use this? Why would someone put one in their house, let alone their pocket?”

      • andallthat@lemmy.world · 10 hours ago

        I want to believe that commoditization of AI will happen as you describe, with AI made by devs for devs. So far what I see is “developer productivity is now up and 1 dev can do the work of 3? Good, fire 2 devs out of 3. Or you know what? Make it 5 out of 6, because the remaining ones should get used to working 60 hours/week.”

        All that increased dev capacity needs to translate into new useful products. Right now the “new useful product” that all energies are poured into is… AI itself. Or even worse, shoehorning “AI-powered” features into every existing product, whether it makes sense or not (welcome, AI features in MS Notepad!). Once this masturbatory stage is over and the dust settles, I’m pretty confident that something new and useful will remain, but for now the level of hype is tremendous!

        • pixxelkick@lemmy.world · 4 hours ago

          “Good, fire 2 devs out of 3.”

          Companies that do this will fail.

          Successful companies respond to this by hiring more developers.

          Consider the taxi cab driver:

          With the invention of the automobile, cab drivers could do their job way faster and way cheaper.

          Did companies fire drivers in response? God no. They hired more

          Why?

          Because the service became more affordable, less wealthy clients could now afford it, which means demand went way, way up

          If you can do your work for half the cost, demand usually goes up by way more than 2x, because as you go down the wealth levels of your target demographic, your pool of clients grows exponentially

          If I go from “it costs me 100k to make you a website” to “it costs me 50k to make you a website”, my pool of possible clients more than doubles
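
          A rough sketch of why halving the price more than doubles the pool (my own toy model with assumed numbers, not something from this thread): if client budgets follow a heavy-tailed Pareto distribution, the share of clients who can afford a given price scales like price^(-alpha), so cutting the price in half multiplies the pool by 2^alpha. The population size, scale, and alpha = 1.16 (the classic “80/20” value) are all illustrative assumptions.

          ```python
          # Toy affordability model; every parameter here is an illustrative assumption.

          def clients_who_can_afford(price, population=1_000_000, scale=5_000, alpha=1.16):
              """Clients whose budget >= price, assuming budgets ~ Pareto(scale, alpha)."""
              # Pareto survival function: P(budget >= price) = (scale / price) ** alpha
              return population * (scale / price) ** alpha

          for price in (100_000, 50_000):
              print(f"${price:,}: ~{clients_who_can_afford(price):,.0f} potential clients")

          # Halving the price multiplies the pool by 2 ** 1.16 ≈ 2.23, i.e. it more than
          # doubles, and the effect compounds further at lower price points.
          ```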

          Which means… you need to hire more devs asap to start matching this newfound level of demand

          If you fire devs when your demand is about to skyrocket, you fucked up bad lol