• LostXOR@fedia.io · 19 hours ago

    I liked generative AI more when it was just a funny novelty and not being advertised to everyone under the false pretenses of being smart and useful. Its architecture is incompatible with actual intelligence, and anyone who thinks otherwise is just fooling themselves. (It does make an alright autocomplete though).

    • devfuuu@lemmy.world · 13 hours ago

      Like all the previous scam bubbles that were kinda interesting or fun as a novelty, and once the money came pouring in became absolute chaos and maddening.

    • Sheridan@lemmy.world · 18 hours ago

      The peak of AI for me was generating images of Muppet versions of the Breaking Bad cast; it’s been downhill since.

    • torrentialgrain@lemm.ee · 17 hours ago

      AGI models will enter the market in under 5 years according to experts and scientists.

      • morgunkorn@discuss.tchncs.de · 17 hours ago

        trust me bro, we’re almost there, we just need another data center and a few billions, it’s coming i promise, we are testing incredible things internally, can’t wait to show you!

          • LostXOR@fedia.io · 7 hours ago

            Around a year ago I bet a friend $100 we won’t have AGI by 2029, and I’d do the same today. LLMs are nothing more than fancy predictive text and are incapable of thinking or reasoning. We burn through immense amounts of compute and terabytes of data to train them, then stick them together in a convoluted mess, only to end up with something that’s still dumber than the average human. In comparison, humans are “trained” with maybe ten thousand “tokens” and ten megajoules of energy a day for a decade or two, and take only a couple dozen watts for even the most complex thinking.
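
            Rough back-of-envelope behind those figures, assuming ~2,400 kcal of food a day and the brain taking roughly 20% of resting metabolism (both ballpark assumptions, not measurements):

            ```python
            JOULES_PER_KCAL = 4_184
            SECONDS_PER_DAY = 86_400

            kcal_per_day = 2_400                      # assumed daily food intake
            daily_energy_j = kcal_per_day * JOULES_PER_KCAL
            print(daily_energy_j / 1e6)               # ~10.0 MJ of energy a day

            avg_power_w = daily_energy_j / SECONDS_PER_DAY
            print(avg_power_w)                        # ~116 W whole-body average
            print(0.2 * avg_power_w)                  # ~23 W for the brain: "a couple dozen watts"

            tokens_per_day = 10_000                   # the rough figure above
            print(tokens_per_day * 365 * 15 / 1e6)    # ~55 million tokens over 15 years
            ```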

            • pixxelkick@lemmy.world · 6 hours ago

              Humans are “trained” with maybe ten thousand “tokens” per day

              Uhhh… you may wanna rerun those numbers.

              It’s waaaaaaaay more than that lol.

              and take only a couple dozen watts for even the most complex thinking

              Mate’s literally got smoke coming out of his ears lol.

              A single Wh is 860 calories…

              I think you either have no idea wtf you are talking about, or you just made up a bunch of extremely wrong numbers to try and look smart.

              1. Humans will encounter hundreds of thousands of tokens per day, ramping up to millions in school.

              2. A human, by my estimate, has burned about 13,000 Wh by the time they reach adulthood. Maybe more depending on activity levels.

              3. While yes, an AI costs substantially more Wh, it’s also done in weeks, so it’s obviously going to be way less energy efficient due to the exponential laws of resistance. If we grew a functional human in like 2 months it’d prolly require way WAY more than 13,000 Wh during the process for similar reasons.

              4. Once trained, a single model can be duplicated infinitely. So it’d be more fair to compare how much it costs to raise millions of people against how much it costs to train a single model. Because once trained, you can now make millions of copies of it…

              5. Operating costs are continuing to go down and down and down. Diffusion-based text generation just made another huge leap forward, reporting around a twentyfold efficiency increase over traditional GPT-style LLMs. Improvements like this are coming out every month.

              • LostXOR@fedia.io · 5 hours ago

                True, my estimate for tokens may have been a bit low. Assuming a 7-hour school day where someone talks at 5 tokens/sec, you’d encounter about 120k tokens. You’re off by 3 orders of magnitude on your energy consumption though; 1 watt-hour is 0.86 food Calories (kcal).
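
                The arithmetic, in case anyone wants to check it (the 5 tokens/sec speaking rate is just an assumption):

                ```python
                school_day_tokens = 7 * 3600 * 5        # 7 h * 3600 s/h * 5 tokens/s
                print(school_day_tokens)                # 126000, i.e. roughly 120k tokens

                WH_IN_JOULES = 3_600                    # 1 Wh = 3600 J
                KCAL_IN_JOULES = 4_184                  # 1 food Calorie (kcal) = 4184 J
                print(WH_IN_JOULES / KCAL_IN_JOULES)    # ~0.86 kcal per Wh, not 860
                ```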