• Rooty@lemmy.world · 8 hours ago

    Makes a planet-burning bullshit machine.

    Does not have any monetization plans.

    “I’m telling you guys, it’s only a matter of time before investor money starts rolling in”

  • k0e3@lemmy.ca · 12 hours ago

    I hate when they make these “holographic screen” images and the screen isn’t mirrored. If the guy that’s working on it is looking at it normally, then it should be mirrored for the camera.

    • Doorknob@lemmy.world · 11 hours ago

      It’s actually a really good representation of how execs view AI. It’s a bunch of meaningless graphs and pictures of robots with the word ‘AI’ sprinkled all over the place, the whole thing is backwards for the worker, and it’s imaginary.

  • Kissaki@feddit.org · 15 hours ago

    > for example, “have seen revenues jump from zero to $20 million in a year,” he said. “It’s because they pick one pain point, execute well, and partner smartly with companies who use their tools,” he added.

    Sounds like they were able to sell their AI services. That doesn’t really measure AI success, only product-market success.

    Celebrating a revenue jump from zero, presumably because they did not exist before, is… quite surprising. It’s not like they became more efficient thanks to AI.

  • Danitos@reddthat.com · 21 hours ago

    This is happening at my company. They gave us 6 months to build an AI tool to replace a non-AI tool that has been very well built and tested over 6 years, and works perfectly well. The AI tool has some amazing features, but it could never replace the good old tool.

    The idiot in charge of the project has such a bad vision for the tool, yet likes to overhype it and oversell it so much.

    • UnderpantsWeevil@lemmy.world · 21 hours ago

      > The idiot in charge of the project has such a bad vision for the tool, yet likes to overhype it and oversell it so much.

      AI in a nutshell.

      A shame, because the underlying technology - with time and patience and less of an eye towards short-term profits - could be very useful in sifting through large amounts of disorganized information. But what we got was so far removed from what anyone asked for.

        • UnderpantsWeevil@lemmy.world · 11 hours ago

          As an enhancement to an existing suite of diagnostic tools, certainly.

          Not as a stand in for an oncology department, though.

          • willington@lemmy.dbzer0.com · 7 hours ago

            As an assist to an actual oncologist, only.

            I can see AI as a tool in some contexts, doing some specific tasks better than an unassisted person.

            But as a replacement for people, AI is a dud. I would rather be alone than have an AI gf. And yes, I am taking trauma and personal + cultural baggage into account. LLMs are also a product of our culture for the most part, so they will have our baggage anyway. But even if, in principle, they could be trained not to have certain kinds of baggage, I would still rather deal with a person, save for the simplest and lowest-stakes interactions.

            If we want better people, we need to enfranchise them and remove most paywalls from the world. Right now the world, instead of being inviting, is bristling with physical, cultural, and virtual fences, saying to us, “you don’t belong and aren’t welcome in 99.99% of the space, and the other 0.01% will cost you.” Housing, for now, is only a privilege. In a world like that it’s a miracle that people are as decent as they are. If we want better people we have to deliberately, on purpose, choose broad-based human flourishing as a policy objective, and be ruthless to any enemies of said objective. No amnesty for the billionaires and wannabe billionaires. Instead they are trying to shove AI/LLMs and virtual worlds down our throats as replacements for an actually decent and inviting world.

      • Derpgon@programming.dev · 11 hours ago

        Management was planning to implement Google Vertex (AI search platform), but since we already have all our data in ElasticSearch and it supports vectors, I said why not try to implement it myself. With an integrated GPU and a very small model, I was able to create a working POC, and it is going to be - not overexaggerating - 50 times cheaper.
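
        Roughly the shape of the POC, if anyone wants to try the same thing (a simplified sketch, not our actual code - the index name, field name, and embedding model below are placeholders I picked for illustration):

        ```python
        # Embed documents with a small local model, store the vectors in
        # ElasticSearch, and answer queries with a kNN search.
        # Assumes an index whose mapping has a dense_vector field "embedding" (384 dims).
        from elasticsearch import Elasticsearch
        from sentence_transformers import SentenceTransformer

        es = Elasticsearch("http://localhost:9200")
        model = SentenceTransformer("all-MiniLM-L6-v2")  # tiny model, fine on an iGPU or CPU

        INDEX = "docs"  # placeholder index name

        def index_doc(doc_id: str, text: str) -> None:
            # Store the raw text together with its embedding for later semantic search.
            es.index(index=INDEX, id=doc_id, document={
                "text": text,
                "embedding": model.encode(text).tolist(),
            })

        def search(query: str, k: int = 5) -> list[str]:
            # kNN query over the stored vectors; ElasticSearch does the ranking.
            resp = es.search(index=INDEX, knn={
                "field": "embedding",
                "query_vector": model.encode(query).tolist(),
                "k": k,
                "num_candidates": 50,
            })
            return [hit["_source"]["text"] for hit in resp["hits"]["hits"]]
        ```

        The “AI search” part is just an embedding model plus ElasticSearch’s built-in kNN query - no hosted platform in the loop.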

        • k0e3@lemmy.ca · 11 hours ago

          Don’t tell management. Start a new company then sell them what you made.

      • regedit@lemmy.zip · 14 hours ago

        Capitalism strikes again! All the good generative AI could do (and sometimes does) didn’t matter; some capitalist made an email sound less like a soulless corporate turd and it was off to the moon with whatever state the tech was at! Rich people have no creativity, imagination, or understanding of the tech. They’re always looking for ways to remove labor costs and make those phat stacks! We could have used generative AI to handle a lot of the shitty, mundane stuff at a faster rate, but no, they chose to replace artists’ creations so they didn’t have to pay the cost of labor.

  • phutatorius@lemmy.zip · 19 hours ago

    It depends on the objectives. They were successful at selling useless crap to fools.

    Frankly, I don’t believe that even 5% were successful by any objective criteria.

  • graycube@lemmy.world · 1 day ago

    We also don’t know the true cost of these tools since most AI service providers are still operating at a loss.

    • Thorry84@feddit.nl · 24 hours ago

      Not simply operating at a loss; they’re absolutely dumping their prices, giving away their products for almost nothing to gain market share. They are burning money at an impressive rate, just for some imaginary payoff in the future.

      • scarabic@lemmy.world · 8 hours ago

        It’s hard to imagine that gaining market share is even meaningful right now. There’s such a profusion of stuff out there. How much does it actually mean if someone is using your product today, I wonder?

      • mmmac@lemmy.zip · 11 hours ago

        This is true of most VC-backed tech companies, not just AI.

      • altphoto@lemmy.today · 22 hours ago

        A future where we don’t have jobs so the rich can make more money by selling us stuff? But I won’t have money to pay for stuff! Hmmm!

        • Zron@lemmy.world · 21 hours ago

          All MBAs and CEOs are like puppies chasing their own tails.

          They want the growth because number go up good. They’ll do anything for number go up. And when number go up, they get the good and then they need to focus on next number go up.

          They have no long-term plan other than number go up. For the next few quarters, they can slap AI on anything and number go up. What happens if AI takes all the non-manual labor jobs? Or if it turns out AI is useless and they wasted billions on snake oil? They don’t know, because they were thinking about number go up right now, not number go up later.

          Our economy is a farce.

          • choco_crispies@lemmy.ml · 9 hours ago

            It’s fine, because even though the CEO eventually drives the company into the ground in pursuit of indefinite growth over long-term stability, that accomplishment is no deterrent to getting hired at another company to do it again. The idea of companies with a long-term vision and plan that provides employees with stability and a career is dead.

        • SoftestSapphic@lemmy.world · 21 hours ago

          The real reason is they want enough money pumped into AI so someone can automate fascism.

          That’s seriously the plan

          Fucking clown world

      • corsicanguppy@lemmy.ca · 19 hours ago

        And, in doing so, they’ve set the market price at that value for the service they advertise, which is more than they deliver already.

        When AI enters the Valley of Discontent, the price it can set for what it actually offers will be even less than it is now.

      • Taldan@lemmy.world · 17 hours ago

        So many companies are going to get burnt by it.

        I know people replacing basic tools with AI versions that are basically just running the simple tool and pretty-printing the output.

        They’re only doing it because it’s basically free to run it through AI. That “whois, but with AI” is going to be so expensive when these companies enshittif-AI.

      • Buffalox@lemmy.world · 23 hours ago

        The same was true for YouTube: in the beginning they operated at a loss, and once people were hooked on the service, they monetized it.

        • frongt@lemmy.zip · 23 hours ago

          YouTube wasn’t created to make money; it was created to watch the wardrobe malfunction.

            • ag10n@lemmy.world · 18 hours ago

              lol, I have llama.cpp and ollama set up on a separate PC just so I can understand and discuss from experience.

        • Thorry84@feddit.nl · 23 hours ago

          What’s your point?

          Sure, that’s the point of venture capital: throwing some money at the wall and seeing what sticks. You’d expect most of them to fail, but the one good one makes up for it.

          However in this case it isn’t people throwing some money at startups. It’s large companies like Microsoft throwing trillions into this new tech. And it’s not just the one company looking for a little niche to fill; all of them are all in, flooding the market with random shit.

          Uber and Spotify are maybe not the best examples to use, although they are examples of people throwing away money in hopes of some sort of payoff (even though they both made a small profit recently, it’s nowhere near digging themselves out of the hole). They are, however, problematic in the way they operate. Uber’s whole deal is exploiting workers, turning employees into contractors, and skirting regulations around taxis for the most part. They have been found to be illegal in a lot of civilised countries and had to change the way they do business there, limit their services, or not operate in those countries at all. Spotify is music, and the music industry is a whole thing I won’t get into.

          The current AI bubble isn’t comparable to venture capital investing in some startups. It’s more comparable to the dotcom bubble, where the industry is perceived to move in a certain direction. Either companies invest heavily and get with the times, or they die. And smart investors put their money in anything with the new tech, since that’s where the money is going to be made. Back then the new tech was the internet; now the new tech is AI. We found out the hard way that it was total BS. The internet wasn’t the infinite money glitch people thought it was, and we all paid the price.

          However, the scale of that bubble was small compared to this new AI bubble. And the internet was absolutely a transformative technology, changing the way we work and live forever. It’s too early to say if this LLM-based “AI” technology will do the same, but I doubt it. The amount of BS thrown around these days is too high. As someone with a somewhat good grasp of how LLMs actually work on a fundamental level, the promises made aren’t backed up by facts. And the amount of money being put into this is nowhere near justified by even an optimistic future payoff.

          If you want a simple, oversimplified example: this AI boom is more like people throwing money at Theranos than anything else.

            • Thorry84@feddit.nl · 23 hours ago

              Well, maybe one person is a little more impressed by some pretty pictures than another person. I really don’t see what that has to do with a company like Microsoft putting their money into this. They don’t make songs or movie trailers.

              > To me I’m stunned but that’s just me, on top of this we’re only in year like 5 of AI going mainstream, where will it be in 10 years? 20 years?

              This is a common trap a lot of people fall into. See what improvements have been made over the last couple of years; who knows where it will end up, right? Unfortunately, reality doesn’t work like that. Improvements made in the past don’t guarantee improvements will continue in the future. There are ceilings that can be run into that are hard to break. There can even be hard limits that are impossible to break. There might be good reasons not to further develop promising technologies from the past into the future. There is no such thing as infinite growth.

              Edit:

              Just checked out that song, man that song is shit…

              “My job vanished without lift.” What does that even mean? That’s not even English.

              And that’s just one of the dozens of issues I’ve seen in 30 seconds. You are kidding yourself if you think this is the future; that’s one shit future, bro.

                • Thorry84@feddit.nl · 22 hours ago

                  All right, we are done here. I’ve tried to engage with you in a fair and honest way. Giving you the benefit of the doubt and trying to respond to the points you are trying to make.

                  But it appears you are just a troll or an idiot; either way, I’m done.

            • absentbird@lemmy.world · 22 hours ago

              The gains in AI have been almost entirely in compute power and training, and those gains have run into powerful diminishing returns. At the core it’s all still running the same Markov chains as the machine learning experiments from the dawn of computing; the math is over a hundred years old and basically unchanged.
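
              For anyone who hasn’t seen one, this is roughly the old-school version of “predict the next word from the previous ones” (a toy sketch for illustration only - modern LLMs learn this with a huge transformer instead of a count table, but the framing is the same):

              ```python
              # Toy word-level Markov chain text generator (illustration only).
              import random
              from collections import defaultdict

              def train(text, order=2):
                  words = text.split()
                  table = defaultdict(list)
                  for i in range(len(words) - order):
                      state = tuple(words[i:i + order])
                      table[state].append(words[i + order])  # record what followed this state
                  return table

              def generate(table, order=2, length=20):
                  out = list(random.choice(list(table.keys())))
                  for _ in range(length):
                      followers = table.get(tuple(out[-order:]))
                      if not followers:
                          break
                      out.append(random.choice(followers))  # sample the next word
                  return " ".join(out)

              corpus = "the number goes up and the number goes down and the bubble keeps growing"
              print(generate(train(corpus)))
              ```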

              For us to see another leap in progress we’ll need to pioneer new calculations and formulate different types of thought, then find a way to integrate that with large transformer networks.

                • absentbird@lemmy.world · 19 hours ago

                  Mixture of experts has been in use since 1991, and it’s essentially just a way to split up the same process as a dense model.
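
                  If it helps, a toy sketch of the idea (illustrative only: plain numpy, untrained random weights): a gate routes each input to a couple of small “expert” networks instead of one dense network handling everything.

                  ```python
                  # Minimal mixture-of-experts forward pass (toy example).
                  import numpy as np

                  rng = np.random.default_rng(0)
                  d, n_experts = 8, 4
                  experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]  # toy linear "experts"
                  gate_w = rng.normal(size=(d, n_experts))

                  def moe_forward(x, top_k=2):
                      logits = x @ gate_w
                      weights = np.exp(logits - logits.max())
                      weights /= weights.sum()                # softmax gate over experts
                      chosen = np.argsort(weights)[-top_k:]   # route to the top-k experts only
                      out = np.zeros(d)
                      for i in chosen:
                          out += weights[i] * (x @ experts[i])  # weighted sum of the chosen experts' outputs
                      return out

                  print(moe_forward(rng.normal(size=d)))
                  ```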

                  Tanks are an odd comparison, because not only have they changed radically since WW2, to the point that many crew positions have been entirely automated, but also because the role of tanks in modern combat has been radically altered since then (e.g. by the proliferation of drone warfare). They just look sort of similar because of basic geometry.

                  Consider the current crop of LLMs as the armor that was deployed in WW1, we can see the promise and potential, but it has not yet been fully realized. If you tried to match a WW1 tank against a WW2 tank it would be no contest, and modern armor could destroy both of them with pinpoint accuracy while moving full speed over rough terrain outside of radar range (e.g. what happened in the invasion of Iraq).

                  It will take many generational leaps across many diverse technologies to get from where we are now to realizing the full potential of large language models, and we can’t get there through simple linear progression any more than tanks could just keep adding thicker armor and bigger guns; it requires new technologies.

        • UnderpantsWeevil@lemmy.world · 21 hours ago

          Uber’s value-add wasn’t in putting black car services on the internet. It was in operating as a universal middleman for transactions between riders and cabbies.

          Similarly, Spotify found a way of scamming both media content creators and advertisers at an industrial scale, while propagating a bunch of far-right influencers for the benefit of anti-tax / anti-environment / anti-LGBTQ conservative groups.

          It’s worth interrogating what the real business model is for any of these services. If you get under the hood of AI, what you’re going to find is a lot of CYA applications - the military can cheaply exploit AI to produce fountains of “This guy is a terrorist” data points that help justify the next airstrike, call centers can churn out convincing scam calls more quickly and cheaply than ever before, visual arts studios can rapidly and covertly plagiarize professionally produced media.

          These are legalistic tools for evading bureaucratic oversight. They aren’t value-add to the retail customer in any way that matters.

  • Buffalox@lemmy.world · 24 hours ago

    As a non native English speaker, I found the “pilot” thing a bit confusing. But:

    pilot = pilot program

    And then it made sense.

    Anyways, I think it’s not so much the 95% that fail that matter; it’s the 5% that succeed.

    • UnderpantsWeevil@lemmy.world · 21 hours ago

      > Anyways, I think it’s not so much the 95% that fail that matter; it’s the 5% that succeed.

      Succeeding in what is also a critical point.

      Succeeding in dramatically improving the library sciences? In rapidly sequencing and modeling data in chemistry or biology? In language translation? Cool. Cool. Cool.

      Succeeding in slop-ifying mass media? In convincingly fabricating data to dupe industry professionals and regulatory officials? In jamming up my phone with automated sales calls? Less cool.

      • Buffalox@lemmy.world · 19 hours ago

        Correct, not all things that matter are positive.

        But it’s the 5% we need to focus on.

  • SocialMediaRefugee@lemmy.world · 21 hours ago

    All work with AI has to be double-checked. It only works on the first try in the simplest of cases. Even then I need to run it through a few iterations to get the code features I want. You still have to be able to run through the code and understand it, regardless of the source.

  • Kissaki@feddit.org · 15 hours ago

    I’m confused by the article suddenly changing to seemingly other semi-related topics and pieces.

    • chobeat@lemmy.ml (OP) · 1 day ago

      > That surprises me, marketing and sales being the main user of AI, I thought the back-office automation for sure was going to be by far number 1

      Generative AI is a bullshit generator. Bullshit in your marketing = good. Bullshit in your backend = bad.

      > So the number 1 user is sales/marketing but it’s back office admin jobs that are most impacted?

      GenAI is primarily adopted to justify mass layoffs and only secondarily to create business value. It’s the mass layoffs that drive AI adoption, not the other way around.

    • manxu@piefed.social · 1 day ago

      I read that slightly differently: the jobs “disrupted” away are customer support, generally outsourced due to their perceived low value (i.e. phone support). Basically, phone customer support is being terminated in favor of chatbots.

    • RememberTheApollo_@lemmy.world · 21 hours ago

      Think of how many ads you see and hear. From pharmaceuticals to entertainment, the amount of money poured into sales and marketing is absurd. If there’s any one massively under-accounted-for scourge on modern society and finance, it’s ad agencies consuming huge amounts of budget and bandwidth and just being constantly in your face. Pharma alone spends ~$20Bn on ads.

    • Eagle0110@lemmy.world · 24 hours ago

      This is really interesting but it doesn’t surprise me.

      AI, and implementations of AI, tend to be inherently good at optimizing the efficiency of a system, and they can be exceptionally good at it. Look at Nvidia DLSS real-time AI upscaling for video games, for example: it’s fundamentally a conventional TAA pipeline (a normally computationally expensive anti-aliasing technique) that’s super-boosted in efficiency by AI in only one of the steps of the pipeline, and it’s so efficient that it can be used to make the image even clearer than the original, in real time. And many of the actually practical machine learning systems that have demonstrated good results in scientific and engineering applications are fundamentally conventional algorithms whose efficiency is boosted so that the computation takes merely hours instead of many decades, which is what made them practical. Not surprising the same approach can be used for business systems and give you actually good results.

      But fortunately the majority of the snake-oil marketing for AI products and services seems to focus on claims about generating new content with AI, which is exactly what the marketing people would want LMAO

  • FlashMobOfOne@lemmy.world · 20 hours ago

    I hope you’re all keeping some money set aside for when the LLM bubble pops. It could end up being the best time to invest at a discount since March 2020.

    • bier@feddit.nl · 17 hours ago

      When Trump got elected I sold some of my stocks. My investments are also my retirement funds, so I don’t need them for a while. I’m waiting until the next crash starts, or something else that’s pretty bad (war, rogue AI, whatever). If the market crashes I can immediately step in.

      • FlashMobOfOne@lemmy.world · 17 hours ago

        Nice.

        I recently sold a portion of my crypto at a profit and am just keeping it in a money market for the moment. I want to have a certain amount to throw in stocks when the next crash comes, but don’t want to lose out on some of the good things happening now in terms of investing.

      • FlashMobOfOne@lemmy.world · 20 hours ago

        Oil stocks were down 90% in March 2020. That’s what I went with. You can profit off a lot of things if you’re willing to hold for a few years.

        • bier@feddit.nl · 17 hours ago

          I once had stocks in Shell, mainly because it’s a Dutch company (I’m Dutch) and it’s a pretty stable investment. But it felt very wrong to invest in oil. So after a few months I sold them (thankfully with a little profit).

          Personally I just don’t want to invest in anything I think fucks up the world. It’s also why I don’t want to invest in Meta and some other tech companies.

          • ubergeek@lemmy.today · 17 hours ago

            > Personally I just don’t want to invest in anything I think fucks up the world.

            For better or worse, then you won’t want to invest in, well, anything. Capitalism is always fucking up the world.

            That said, most don’t have a choice about it, if they ever want to retire.

            The best I can get you is investing in Seed Commons, or something similar: https://seedcommons.org/invest

          • FlashMobOfOne@lemmy.world · 17 hours ago

            It’s true. A person has to make the best choices for themselves.

            Oil stocks, in my case, paid off my student loans and paid for the down payment on my home, and just in time too, given that US leaders have allowed things to get so bad for people in those arenas.

            The way I justify it to myself is that, as far as investing goes, I am among the smallest of the small fish. Unless you own enough to sway votes in a public company, your investment doesn’t particularly matter as far as a company’s policies and behavior go.