Apple's "It's Glowtime" event on September 9th is not just another iPhone launch. It is the first hardware designed from the ground up to run Apple Intelligence, and that changes the conversation for every developer building on iOS.

What Apple Intelligence actually is

When Apple announced Apple Intelligence at WWDC in June, the broad reaction fell somewhere between scepticism and cautious interest. On-device AI, Private Cloud Compute, an upgraded Siri that can actually understand context, writing tools, image generation: the demos looked polished. The question was always whether the hardware could deliver it.

The iPhone 16 answers that. Apple designed the A18 chip specifically for AI workloads: its Neural Engine is 60 percent faster than the A17's, and, more importantly, Apple added a dedicated hardware accelerator for the kind of matrix operations LLMs depend on. The iPhone 15 Pro could run Apple Intelligence, but with limitations; the iPhone 16 was purpose-built for it across the entire lineup, not just the Pro models.

What this means for iOS developers

Apple's privacy-centred approach to on-device AI is different from what Google and Microsoft have been doing. When you use Gemini Nano on Android or Copilot on Windows, the model runs locally on the device, but it is still essentially a Google or Microsoft model you address directly. Apple Intelligence is instead woven into the OS: you do not call a model, you call system APIs that happen to use AI under the hood.

That is both a constraint and an advantage. The constraint is that you cannot swap in your own model or fine-tune for your use case. The advantage is that your app inherits the quality and privacy guarantees of Apple's system without any infrastructure cost. Writing tools, summarisation, smart replies in notifications, photo editing with AI: all of this is available through standard APIs starting with iOS 18.
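As a sketch of how little integration work the system features demand: in iOS 18, standard text views pick up Writing Tools automatically, and a writing-tools trait on `UITextInputTraits` lets you restrict or disable them per field. The behaviour values below reflect my reading of the iOS 18 SDK; verify the exact names against the current headers.

```swift
import UIKit

final class NotesViewController: UIViewController {
    let editor = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Standard text views get Writing Tools (rewrite, proofread,
        // summarise) with no extra code on Apple Intelligence devices.
        // .complete allows full inline rewriting of the text.
        editor.writingToolsBehavior = .complete

        // A field holding sensitive content can opt out entirely.
        let tokenField = UITextField()
        tokenField.writingToolsBehavior = .none

        view.addSubview(editor)
        view.addSubview(tokenField)
    }
}
```

The point is the asymmetry: your app opts in or out, but the model, the prompting, and the privacy handling are all the system's problem.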

The Private Cloud Compute story is the interesting part from an engineering standpoint. When a task is too complex for on-device processing, Apple routes it to its own cloud infrastructure running on Apple Silicon servers. The privacy claim is that Apple cannot see the data and that it is not retained after the request completes; cryptographic attestation lets the device verify it is talking to genuine Apple hardware. That is an unusual architecture and worth understanding if your app handles sensitive data.
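The shape of that handshake, as I understand it from Apple's public description, looks roughly like the following. Everything here is illustrative: third-party apps never implement any of this, the OS performs it on their behalf, and all names are hypothetical.

```swift
import Foundation

// Illustrative sketch only — not a real SDK. The OS does this internally.

enum PCCError: Error { case untrustedNode }

struct PCCNode {
    let attestation: Data   // signed measurement of the node's software image
    let nodeKey: Data       // per-node public key
}

/// Stub standing in for the device's check of the node's measurement
/// against Apple's transparency log of released PCC software images.
func appearsInTransparencyLog(_ attestation: Data) -> Bool { true }

/// Stub standing in for end-to-end encryption to the node's key.
func encrypted(_ payload: Data, to key: Data) -> Data { payload }

func sendOffDeviceRequest(_ payload: Data, to node: PCCNode) throws -> Data {
    // 1. Cryptographic attestation: refuse any node whose software
    //    image cannot be verified, before data ever leaves the device.
    guard appearsInTransparencyLog(node.attestation) else {
        throw PCCError.untrustedNode
    }
    // 2. Encrypt to that specific node's key, so only the attested
    //    software image can decrypt the request; the claim is that
    //    nothing is retained once the response comes back.
    return encrypted(payload, to: node.nodeKey)
}
```

The design choice worth noticing is that trust is established cryptographically per request, not by a blanket terms-of-service promise.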

The Siri integration

The upgraded Siri has context awareness across apps. The demos show asking Siri to send a message to someone you were just emailing, or asking it to add a restaurant from a recent Safari search to your calendar. This works through on-device context, not a cloud call. For developers, there are new App Intents APIs that let you make your app's actions available to this upgraded Siri. If you build something users do repeatedly, this is worth adding support for before iOS 18 ships widely.
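A minimal App Intents entry point looks like the sketch below. The `AppIntent` protocol, `@Parameter` wrapper, and `AppShortcutsProvider` are the real framework pieces; the reading-list intent and `ReadingListStore` are hypothetical stand-ins for your own app logic.

```swift
import AppIntents
import Foundation

// Hypothetical app-side store; replace with your own persistence.
final class ReadingListStore {
    static let shared = ReadingListStore()
    func add(_ url: URL) async throws { /* persist the URL */ }
}

// Exposes one app action to Siri and Shortcuts.
struct AddToReadingListIntent: AppIntent {
    static var title: LocalizedStringResource = "Add to Reading List"
    static var description = IntentDescription("Saves an article for later.")

    @Parameter(title: "Article URL")
    var url: URL

    func perform() async throws -> some IntentResult & ProvidesDialog {
        try await ReadingListStore.shared.add(url)
        return .result(dialog: "Saved to your reading list.")
    }
}

// Registering a phrase makes the action discoverable to Siri.
struct AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddToReadingListIntent(),
            phrases: ["Add this to my \(.applicationName) reading list"]
        )
    }
}
```

Intents declared this way are what the context-aware Siri can chain together, which is why it pays to expose your app's repeatable actions early.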

Apple Intelligence is rolling out in US English first, with other languages following in 2025. The features are opt-in initially. Expect a slower adoption curve than a typical iOS feature.