Apple has debuted Apple Intelligence, its personal intelligence system for iPhone, iPad and Mac that combines generative models with personal context to deliver useful and relevant intelligence.

Apple Intelligence is integrated into iOS 18, iPadOS 18, and macOS Sequoia.

Apple aims to ensure privacy in AI with Private Cloud Compute, which flexes and scales computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers.

“We’re thrilled to introduce a new chapter in Apple innovation. Apple Intelligence will transform what users can do with our products — and what our products can do for our users,” says Tim Cook, Apple’s CEO.

“Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence. And it can access that information in a completely private and secure way to help users do the things that matter most to them.

“This is AI as only Apple can deliver it, and we can’t wait for users to experience what it can do.”


 

Understanding and creating language

Apple Intelligence unlocks new ways for users to enhance their writing and communicate more effectively. With new systemwide Writing Tools built into iOS 18, iPadOS 18, and macOS Sequoia, users can rewrite, proofread, and summarise text in Mail, Notes, Pages, and third-party apps.

With Rewrite, Apple Intelligence allows users to choose from different versions of what they have written, adjusting the tone to suit the audience and task at hand.

Proofread checks grammar, word choice, and sentence structure while also suggesting edits — along with explanations of the edits — that users can review or accept.

With Summarize, users can select text and have it recapped in the form of a digestible paragraph, bulleted key points, a table, or a list.
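
For developers, this suggests that apps using the system's standard text views pick up Writing Tools with little or no extra work. As a rough, unofficial sketch in SwiftUI — the `writingToolsBehavior` modifier and its cases are assumptions, since the announcement doesn't document the API — an editor view might look like this:

```swift
import SwiftUI

struct NoteEditor: View {
    @State private var draft = ""

    var body: some View {
        // Standard system text views are expected to surface Rewrite,
        // Proofread, and Summarise automatically on OS versions with
        // Apple Intelligence enabled.
        TextEditor(text: $draft)
            // Assumption: a behaviour setting along these lines lets an app
            // opt for the full experience or limit it; the exact API is not
            // described in this announcement.
            .writingToolsBehavior(.complete)
            .padding()
    }
}
```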

In Mail, Priority Messages is a new section at the top of the inbox showing the most urgent emails, like a same-day dinner invitation or boarding pass.

Across the inbox, instead of previewing the first few lines of each email, users can see summaries without needing to open a message. For long threads, users can view pertinent details with just a tap.

Smart Reply provides suggestions for a quick response, and will identify questions in an email to ensure everything is answered.

Priority Notifications appear at the top of the stack to surface what’s most important, and summaries help users scan long or stacked notifications to show key details right on the Lock Screen, such as when a group chat is particularly active.

Reduce Interruptions is a new Focus that surfaces only the notifications that might need immediate attention.

In the Notes and Phone apps, users can now record, transcribe, and summarise audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.

 

Image Playground

Apple Intelligence powers Image Playground, where users can create images, choosing from three styles: Animation, Illustration, or Sketch.

Image Playground is built into apps including Messages. It’s also available in a dedicated app.

With Image Playground, users can choose from a range of concepts from categories like themes, costumes, accessories, and places; type a description to define an image; choose someone from their personal photo library to include in their image; and pick their favourite style.


Image Playground in Messages lets users create images or see personalised suggested concepts related to their conversations.

In Notes, users can access Image Playground through the new Image Wand in the Apple Pencil tool palette. Rough sketches can be turned into images, and users can select empty space to create an image using context from the surrounding area.

Image Playground is also available in apps like Keynote, Freeform, and Pages, as well as in third-party apps that adopt the new Image Playground API.
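
Apple hasn't detailed the Image Playground API here, so the following is only an illustrative sketch: the `ImagePlayground` import, the `imagePlaygroundSheet` modifier, and its parameters are assumptions about how a third-party app might present the generation sheet and receive the finished image.

```swift
import SwiftUI
import ImagePlayground // assumed module name; only "the new Image Playground API" is announced

struct CardDesigner: View {
    @State private var showPlayground = false
    @State private var artworkURL: URL?

    var body: some View {
        Button("Add Playground Image") { showPlayground = true }
            // Assumption: a system sheet that takes a text concept, runs the
            // generation UI, and hands back a file URL for the finished image.
            .imagePlaygroundSheet(isPresented: $showPlayground,
                                  concept: "birthday party in the Animation style") { url in
                artworkURL = url
            }
    }
}
```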

 

Genmoji creation

To create an original Genmoji, users simply type a description, and their Genmoji appears along with additional options.

Like emoji, Genmoji can be added inline to messages, or shared as a sticker or reaction in a Tapback.

 

New features in Photos

With Apple Intelligence, natural language can be used to search for specific photos, while search in videos becomes more powerful with the ability to find specific moments in clips.

The new Clean Up tool can identify and remove distracting objects in the background of a photo, without accidentally altering the subject.

With Memories, users can create the story they want to see by typing a description. Using language and image understanding, Apple Intelligence will pick out the best photos and videos based on the description, craft a storyline with chapters based on themes identified from the photos, and arrange them into a movie with its own narrative arc. Users will even get song suggestions to match their memory from Apple Music.

 

New life for Siri

Apple Intelligence gives Siri richer language-understanding capabilities, making it more natural, more contextually relevant, and more personal, with the ability to simplify and accelerate everyday tasks.

Siri can now follow along if users stumble over words, and it maintains context from one request to the next. Users can also type to Siri and switch between text and voice.

Siri can now operate across iPhone, iPad, and Mac, and will be able to understand and take action with users’ content in more apps over time.

With Apple Intelligence, Siri will be able to take new actions in and across Apple and third-party apps, delivering intelligence that’s tailored to the user and their on-device information.
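
The announcement doesn't say how third-party apps expose these actions; assuming it builds on the existing App Intents framework, an app could declare an action Siri might invoke along these lines (the invoice model and store are hypothetical):

```swift
import AppIntents

// Hypothetical app model used only for illustration.
struct InvoiceStore {
    static let shared = InvoiceStore()
    func latest() async throws -> (title: String, id: String) {
        ("March retainer", "INV-0042")
    }
}

struct OpenLatestInvoiceIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Latest Invoice"
    static var description = IntentDescription("Opens the most recent invoice in the app.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        let invoice = try await InvoiceStore.shared.latest()
        // Siri can speak or display this dialog when it runs the action.
        return .result(dialog: "Opening \(invoice.title).")
    }
}
```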

 

Privacy in AI

To be truly helpful, Apple Intelligence relies on understanding deep personal context while also protecting user privacy.

A cornerstone of Apple Intelligence is on-device processing, and many of the models that power it run entirely on device.

To run more complex requests that require more processing power, Private Cloud Compute extends the privacy and security of Apple devices into the cloud.

With Private Cloud Compute, Apple Intelligence can flex and scale its computational capacity and draw on larger, server-based models for more complex requests. These models run on servers powered by Apple silicon.

Independent experts can inspect the code that runs on Apple silicon servers to verify privacy, and Private Cloud Compute cryptographically ensures that iPhone, iPad, and Mac do not talk to a server unless its software has been publicly logged for inspection.
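
As a conceptual sketch only (the types below are illustrative, not Apple's actual client code), that guarantee amounts to the device refusing to send a request to any server whose attested software measurement doesn't appear in the public transparency log:

```swift
import Foundation
import CryptoKit

// Illustrative types only; not Apple's actual Private Cloud Compute client.
struct ServerAttestation {
    let softwareMeasurement: SHA256.Digest  // hash of the software image the server reports running
}

struct TransparencyLog {
    let publishedMeasurements: Set<Data>    // measurements of publicly logged, inspectable releases

    func contains(_ attestation: ServerAttestation) -> Bool {
        publishedMeasurements.contains(Data(attestation.softwareMeasurement))
    }
}

// The device only releases a request to a server whose software
// has been publicly logged for inspection.
func shouldSend(request: Data, to attestation: ServerAttestation, log: TransparencyLog) -> Bool {
    log.contains(attestation)
}
```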

 

Integrated ChatGPT

Apple is integrating ChatGPT access into experiences within iOS 18, iPadOS 18, and macOS Sequoia. Users are asked before any questions, along with any documents or photos, are sent to ChatGPT, and Siri then presents the answer directly.

ChatGPT will be available in Apple’s systemwide Writing Tools through Compose, where users can also access ChatGPT image tools to generate images.

Privacy protections are built in for users who access ChatGPT — their IP addresses are obscured, and OpenAI won’t store requests. ChatGPT’s data-use policies apply for users who choose to connect their account.

ChatGPT will come to iOS 18, iPadOS 18, and macOS Sequoia later this year, powered by GPT-4o. Users can access it for free without creating an account, and ChatGPT subscribers can connect their accounts and access paid features.