• redsunrise@programming.dev · 16 hours ago

    Obviously it’s higher. If it was any lower, they would’ve made a huge announcement out of it to prove they’re better than the competition.

    • T156@lemmy.world · 6 hours ago

      Unless it wasn’t as low as they wanted it to be. It’s at least cheap enough to run that they can afford to drop the API pricing compared to their older models.

    • Chaotic Entropy@feddit.uk · 11 hours ago

      I get the distinct impression that most of the focus for GPT5 was on making it easier to divert their overflowing volume of queries to less expensive routes.

    • Ugurcan@lemmy.world · 12 hours ago (edited)

      I’m thinking otherwise. I think GPT5 is a much smaller model - with some fallback to previous models if required.

      Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to “inferior quality” in American Investor Language (AIL).

      And 2025’s investors don’t give a flying fuck about energy efficiency.

    • morrowind@lemmy.ml · 6 hours ago

      It’s cheaper though, so very likely it’s more efficient somehow.

      • SonOfAntenora@lemmy.world · 5 hours ago

        I believe in verifiable statements, and so far, with few exceptions, I’ve seen nothing. We are now speculating about magical numbers we can’t see, but we know that AI is demanding, and we know that even small models are not free. The only accessible data comes from Mistral; most other AI devs are not exactly happy to share the inner workings of their tools. Even then, Mistral didn’t release all their data, and even if they had, it would only apply to Mistral 7B and above, not to ChatGPT.