During WWDC24’s opening keynote, Apple previewed some of the generative AI features rolling out on select devices later this year. From text and notification summaries to AI-generated images and emoji, Apple Intelligence will equip compatible iPhones, iPads, and Macs with a plethora of handy tools that work at the system level.
Notably, several devices are omitted from that list—Apple Watch, Vision Pro, and perhaps most glaring of all, HomePod. For the time being, Apple’s smart speakers ironically won’t be getting any of the Apple Intelligence perks. And they probably never will.
HomePod’s hardware constraints
Perhaps the main reason Apple Intelligence won’t be supported on current HomePod models is their internal specs. The upcoming AI features will only work on some of Apple’s highest-end products: iPads and Macs with M-series chips, plus the iPhone 15 Pro and 15 Pro Max, thanks to the A17 Pro chip.
What all of these compatible devices have in common is at least 8GB of RAM and a powerful Neural Engine. So, while the iPhone 15 and 15 Plus launched alongside the 15 Pro models, they won’t support Apple Intelligence features. That’s presumably due to the A16 Bionic chipset and the lower 6GB of RAM it packs.
The latest HomePod, the second-generation model, runs on an S7 chip with just 1GB of RAM and no Neural Engine. It’s the same SoC found in 2021’s Apple Watch Series 7. If the S9-equipped Apple Watch Series 9 and Ultra 2, the most capable watches Apple sells, can’t handle Apple Intelligence, then neither can devices powered by older, weaker processors, such as the HomePod lineup.
Private Cloud Compute isn’t the answer
Apple Intelligence, by design, prioritizes on-device processing. However, Apple has also developed a fallback, cloud-based infrastructure to handle more advanced queries that consumer devices can’t process locally. Dubbed “Private Cloud Compute,” this system runs larger server-side models on Apple silicon, using the chips’ Secure Enclave to keep user data encrypted and private. So, why not bake support for Private Cloud Compute into HomePods?
Well, as mentioned earlier, Apple wants Private Cloud Compute to act as a fallback when a device can’t perform a certain task, not as the default or sole engine powering Apple’s AI features on incompatible products. There are several likely reasons for this, chief among them the server overload that would result if millions of underpowered devices routed every request to the cloud.
John Gruber at Daring Fireball explains in greater detail: “The models that run on-device are entirely different models than the ones that run in the cloud, and one of those on-device models is the heuristic that determines which tasks can execute with on-device processing and which require Private Cloud Compute or ChatGPT.” He adds that the on-device processing component of Apple Intelligence “isn’t just nice to have, it’s a keystone to the entire thing.”
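To make Gruber’s point concrete, here’s a minimal sketch, in Swift, of what such a routing heuristic might look like. To be clear, this is not Apple’s actual code or API; every type, function, and threshold below is hypothetical. The takeaway is structural: the router itself is one of the on-device models, so a device that can’t run the local models has no path to the cloud fallback in the first place.

```swift
// Purely illustrative, not Apple's actual API: every type, name, and
// threshold here is invented to visualize the routing idea Gruber describes.

enum ProcessingTier {
    case unsupported   // device can't run the local models, so no Apple Intelligence at all
    case onDevice      // handled entirely by the on-device models
    case privateCloud  // escalated to Private Cloud Compute
    case thirdParty    // handed off (with the user's consent) to ChatGPT
}

struct Request {
    let estimatedComplexity: Double  // 0.0 (trivial) through 1.0 (very demanding)
    let needsWorldKnowledge: Bool    // needs broad knowledge beyond personal context
}

// Hypothetical router. The real heuristic is itself an on-device model,
// which is why a device that can't run local models never reaches the
// cloud fallback: the router simply isn't there.
func route(_ request: Request, deviceMemoryGB: Int) -> ProcessingTier {
    guard deviceMemoryGB >= 8 else { return .unsupported }
    if request.needsWorldKnowledge { return .thirdParty }
    return request.estimatedComplexity < 0.5 ? .onDevice : .privateCloud
}

// A demanding request on an 8GB iPhone 15 Pro escalates to the cloud;
// the same request on a 1GB HomePod is simply unsupported.
print(route(Request(estimatedComplexity: 0.7, needsWorldKnowledge: false), deviceMemoryGB: 8))  // privateCloud
print(route(Request(estimatedComplexity: 0.7, needsWorldKnowledge: false), deviceMemoryGB: 1))  // unsupported
```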
That’s not to say this vision couldn’t change down the road, once Apple polishes Private Cloud Compute and upgrades its servers. After all, Apple Intelligence is launching as a beta, and the company is only getting started in this domain. Additionally, we’ve read rumors about Apple possibly relying on cloud computing in the future to power lighter wearables. So things may change once Apple masters the balance between on-device AI and cloud processing.
A future AI HomePod
Beyond potentially adopting Private Cloud Compute down the road, which seems somewhat unlikely, Apple’s most straightforward path for bringing Apple Intelligence to the HomePod is to introduce a new form factor. According to reputable leakers, the company has been working on an overhauled HomePod that features a display and a FaceTime camera. While we aren’t expecting this long-rumored HomePod with a display to launch this year, it could address the current hardware limitations when it does.
Could Apple open up Private Cloud Compute to one day include less powerful devices such as HomePod and Apple Watch? Possibly not.
Given that the added screen and camera would unlock new HomePod capabilities, Apple will naturally have to adopt a more powerful processor with more RAM. That upgrade could push the HomePod’s hardware past the minimum specs required to run Apple Intelligence.
Furthermore, a display may justify bringing Apple Intelligence to the HomePod, since many of Apple Intelligence’s features are primarily visual. A HomePod in its current form factor, for example, can’t generate images or compose emails on your behalf. The Siri it packs revolves around voice commands for controlling music playback and HomeKit accessories, along with answering basic questions (or at least trying to).
Apple Intelligence, on the other hand, seemingly excels at productivity and entertainment tasks that a smart speaker can’t really deliver. For the most part, I think using Apple Intelligence features on the current HomePods would be an unintuitive experience, except, of course, for the promised “new era for Siri.”
The best-case scenario for current HomePod users is Apple expanding its ChatGPT integration to include the HomePod. That way, the underpowered HomePods could answer more advanced questions without Apple needing to change the way Private Cloud Compute works, as OpenAI would be handling the queries instead.
Even though that approach is technically possible, it’s unlikely Apple will go that route. ChatGPT and whatever other third-party AI chatbots Apple partners with are meant to fill the gaps in Apple Intelligence, not serve as the main AI engine. Nevertheless, Apple needs to find some way to upgrade the current subpar Siri experience, especially once the assistant on our iPhones, iPads, and Macs becomes so much better. Perhaps Siri can break free from the larger Apple Intelligence experience so current HomePods can get an upgrade too. Until then, we’re all going to keep hearing that all-too-familiar Siri response: “I found some web results. I can show them if you ask again from your iPhone.”