There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. A subtler addition, though, came in Xcode 16 — the development environment for Apple platforms like iOS and macOS — in the form of a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. Predictive Code Completion in Xcode 16 has a minimum memory requirement, and it’s the closest thing we’ll get from Apple to an admission that 8GB isn’t really enough for a new Mac in 2024.

  • Hux@lemmy.ml · 3 months ago

    This isn’t a big deal.

    If you’re developing in Xcode, you did not buy an 8GB Mac in the last 10 years.

    If you are just using your Mac for Facebook and email, I don’t think you know what RAM is.

    If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely aware of your limited demands and/or made an informed compromise.

  • Jtee@lemmy.world · 3 months ago

    And now all the fanboys and girls will go out and buy another MacBook. That’s planned obsolescence for ya.

    • bamboo@lemm.ee · 3 months ago

      Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).

    • m-p{3}@lemmy.ca · 3 months ago

      And that’s why they solder the RAM, or, even worse, make it part of the SoC.

      • rockSlayer@lemmy.world · 3 months ago

        There are real-world performance benefits to RAM being as close as possible to the CPU, so it’s not entirely without merit. But that’s what CAMM modules are for.

        • akilou@sh.itjust.works · 3 months ago

          But do those benefits outweigh doubling or tripling the amount of RAM by simply inserting another stick that you can buy for dozens of dollars?

          • BorgDrone@lemmy.one · 3 months ago

            Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.

            Take for example a modern high-end PC with an RTX 4090. Those only have 24GB of VRAM, and that VRAM is reachable from the rest of the system only through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger models. You can spec an M2 Ultra with 192GB of RAM, and almost all of it is accessible by the GPU directly. Even better, the GPU can access it without any need for copying data back and forth over the PCIe bus, so literally zero copy overhead.
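            To put rough, illustrative numbers on that bus cost (the bandwidth figure below is an assumed theoretical peak, not a benchmark), here is a back-of-the-envelope sketch in Python:

```python
# Rough PCIe transfer-cost estimate. The bandwidth constant is an assumed
# theoretical peak for PCIe 4.0 x16, not a measured figure.
PCIE4_X16_GBPS = 32.0  # GB/s

def transfer_seconds(gigabytes: float, bus_gbps: float = PCIE4_X16_GBPS) -> float:
    """Time to move `gigabytes` of data across the bus once."""
    return gigabytes / bus_gbps

# Copying a hypothetical 24 GB model into VRAM once:
print(f"24 GB over PCIe 4.0 x16: ~{transfer_seconds(24):.2f} s")
# On unified memory the GPU reads the same pages the CPU wrote: no copy at all.
```

And that is for a single one-way copy at the theoretical peak; real workloads shuttle intermediate results back and forth repeatedly.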

            The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.

            For example: you could have an application that replaces the background on a video using AI. It takes a video and decompresses it using the video decoder, and the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to an AI model and generate a new frame, and the video encoder can immediately access that result and compress it into a new video file.

            The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
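            The copy tax being described can be tallied with a toy estimate (frame size, duration, and copies-per-frame below are all illustrative assumptions, not measurements):

```python
# Hypothetical tally of inter-device copies for the background-replacement
# pipeline described above. All figures are illustrative assumptions.
FRAME_MB = 1920 * 1080 * 4 / 1e6  # one decoded 1080p RGBA frame, ~8.3 MB
FRAMES = 30 * 60                  # one minute of 30 fps video

# Discrete system: decoder -> system RAM -> GPU VRAM -> NPU -> encoder.
copies_per_frame_discrete = 4
# Unified memory: every block reads the same shared pages.
copies_per_frame_unified = 0

discrete_mb = FRAME_MB * FRAMES * copies_per_frame_discrete
print(f"Discrete: ~{discrete_mb / 1000:.1f} GB shuffled; unified: 0 GB")
```

Even under these modest assumptions, one minute of video means tens of gigabytes moved across buses purely as overhead.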

          • gravitas_deficiency@sh.itjust.works · 3 months ago

            It’s highly dependent on the application.

            For instance, I could absolutely see having certain models with LPCAMM expandability as a great move for Apple, particularly in the pro segment, so they’re not capped by whatever they can cram into their monolithic SoCs. But for most consumer (that is, non-engineer/non-developer users) applications, I don’t see them making it expandable.

            Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.

    • Mongostein@lemmy.ca · 3 months ago

      And the apple haters will keep making this exact same comment on every post using their 3rd laptop in ten years while I’m still using my 2014 MacBook daily with no issues.

      Be more original.

      • Jtee@lemmy.world · 3 months ago

        Nice attempt to justify planned obsolescence. To think Apple hasn’t done this time and time again, you’d have to be a fool.

        • Mongostein@lemmy.ca · edited · 3 months ago

          👍

          -posted from my ten year old MacBook which shows no need for replacement

  • _number8_@lemmy.world · 3 months ago

    imagine showing this post to someone in 1995

    shit has gotten too bloated these days. i mean even in my head 8GB still sounds like ‘a lot’ of RAM and 16GB feels extravagant

    • rottingleaf@lemmy.zip · 3 months ago

      I still can’t fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you’re ever gonna need.

      If only it got bloated for some good reasons.

      • Aux@lemmy.world · 3 months ago

        High quality content is the reason. Sit in a terminal and your memory usage will be low.

        • lastweakness@lemmy.world · 3 months ago

          So we’re just going to ignore stuff like Electron, unoptimized assets, etc… Basically every other known problem… Yeah let’s just ignore all that

          • Aux@lemmy.world · 3 months ago

            Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That’s definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.

            • lastweakness@lemmy.world · 3 months ago

              Yes, it really is that bad. 350 MBs of RAM for something that could otherwise have taken less than 100? That isn’t bad to you? And also, it’s not just RAM. It’s every resource, including CPU, which is especially bad with Electron.

              I don’t really mind Electron myself because I have enough resources. But pretending the lack of optimization isn’t a real problem is just not right.

              • Aux@lemmy.world · 3 months ago

                First of all, 350MB is a drop in a bucket. But what’s more important is performance, because it affects things like power consumption, carbon emissions, etc. I’d rather see Slack “eating” one gig of RAM and running smoothly on a single E core below boost clocks with pretty much zero CPU use. That’s the whole point of having fast memory - so you can cache and pre-render as much as possible and leave it rest statically in memory.
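                Trading memory for CPU time like that is the classic memoization pattern: spend RAM keeping a result resident so it never has to be recomputed. A minimal sketch in Python (the `slow_render` function and its 10ms cost are made-up stand-ins for expensive UI work):

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)          # spend RAM on cached results...
def slow_render(widget_id: int) -> str:
    time.sleep(0.01)              # ...to skip this stand-in for expensive work
    return f"<widget {widget_id}>"

start = time.perf_counter()
slow_render(1)                    # cold call: pays the full cost
cold = time.perf_counter() - start

start = time.perf_counter()
slow_render(1)                    # warm call: served from memory, near-zero CPU
warm = time.perf_counter() - start
print(f"cold {cold * 1000:.1f} ms vs warm {warm * 1000:.3f} ms")
```

The warm call costs almost nothing precisely because the result is sitting idle in memory, which is the tradeoff being argued for here.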

                • jas0n@lemmy.world · edited · 3 months ago

                  Just wanted to point out that the number 1 performance blocker in the CPU is memory. In the general case, if you’re wasting memory, you’re wasting CPU. These two things really cannot be talked about in isolation.
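                  A rough model makes the point (the latency constants below are ballpark orders of magnitude in the spirit of the classic “latency numbers every programmer should know,” not measurements of any specific CPU):

```python
# Approximate access latencies in nanoseconds: ballpark figures only.
L1_NS, RAM_NS = 1, 100

def traversal_ns(accesses: int, miss_rate: float) -> float:
    """Estimated time for `accesses` loads at a given cache-miss rate."""
    return accesses * ((1 - miss_rate) * L1_NS + miss_rate * RAM_NS)

tight = traversal_ns(1_000_000, miss_rate=0.01)    # working set fits in cache
bloated = traversal_ns(1_000_000, miss_rate=0.50)  # working set blown out
print(f"tight: {tight / 1e6:.2f} ms, bloated: {bloated / 1e6:.2f} ms")
```

A cache miss costs roughly two orders of magnitude more than a hit, so a bloated working set turns directly into stalled CPU cycles.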

                • Verat@sh.itjust.works · edited · 3 months ago

                  When (according to about:unloads) my average Firefox tab is 70-230MB depending on what it is and how old the tab is (YouTube tabs, for example, bloat up the longer they’re open), a chat app using over 350MB is a pretty big deal.

                  Just checked: my Firefox is using 4.5GB of RAM, while Telegram is using 2.3GB while minimized to the system tray. Granted, Telegram doesn’t use Electron, but this is a trend across lots of programs, and Electron is a big enough offender that I avoid apps using it. When I get off shift I can launch Discord and check it too, but it’s usually bad enough that I close it entirely when not in use.