AI Summary:

Apple’s first-quarter earnings report revealed a mixed performance. While overall sales increased by 4%, iPhone sales showed weakness, particularly in China, where they declined by over 11%. CEO Tim Cook attributed this partly to the lack of Apple Intelligence in China and inventory changes. However, the Mac, iPad, and Services categories saw significant growth, with Services up 14% and both the Mac and iPad up 15%. The company reported $36.33 billion in net revenue, a 7.1% increase from the previous year.

  • Jrockwar@feddit.uk · 4 points · 23 hours ago (edited)

    If it’s for AI, loading huge models is something you can do with Macs but not easily in any other way.

    I’m not saying many people have a use case at all for them, but if you have a use case where you want to run 60 GB models locally, a whole 192GB Mac Studio is cheaper than the GPU alone you need to run that if you were getting it from Nvidia.
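    As a rough sanity check on those sizes: the memory a quantized model needs is approximately parameter count × bits per weight ÷ 8, plus overhead for the KV cache and activations. A minimal sketch (the 70B size and ~6.5 bits/weight figure are illustrative of a Q6_K-style quantization, not exact numbers for any specific release):

```python
def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of a quantized model's weights in decimal GB."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A hypothetical 70B-parameter model at ~6.5 bits per weight:
print(round(model_size_gb(70, 6.5)))  # ~57 GB for the weights alone
```

    That is why a single consumer GPU with 24 GB of VRAM can't hold such a model, while a Mac with 192 GB of unified memory can.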

    • Boomkop3@reddthat.com · 1 point · 19 hours ago

      I’ve run them on Intel CPUs before. Putting a CPU with more than two memory channels and a several-hundred-watt power budget up against a beefed-up mobile CPU isn’t a fair fight.

      Second-hand Xeons are cheaper, though.

      • Jrockwar@feddit.uk · 3 points · 8 hours ago

        I’m talking about running them on a GPU, and that comparison favours the GPU even when it’s between an AMD Epyc and a mediocre GPU.

        If you want to run a large version of DeepSeek R1 locally, with many quantized variants being over 50 GB, I think the cheapest Nvidia GPU that fits the bill is an A100, which you might find used for around $6K.

        For well under that price you can get a whole Mac Studio with those 192 GB the first poster in this thread mentioned.

        I’m not saying this is for everyone (it’s certainly not for me), but I don’t think we can dismiss that there is a real niche where Apple has a genuine value proposition.

        My old flatmate has a PhD in NLP and used to work in research, and he’d have gotten soooo much use out of >100 GB of RAM accessible to the GPU.

          • Boomkop3@reddthat.com · 1 point · 6 hours ago

          I had found one for about 400 recently, but it was a bit far away. I ended up going with a GPU closer by. I don’t need that many GBs.