Apple has concluded the first day of its WWDC 2024 event and introduced the world to what it calls Apple Intelligence, the short form for which is smartly "AI". Apple Intelligence is Apple's suite of AI features, similar to what Samsung calls Galaxy AI. Here's everything you need to know about what Apple Intelligence is, where it will be available, and more.
Apple Intelligence: What Is It?
In Apple’s words, Apple Intelligence is the personal intelligence system for iPhone, iPad, and Mac that combines “the power of generative models with personal context to deliver intelligence that’s incredibly useful and relevant.” Apple Intelligence is intricately woven into iOS 18, iPadOS 18, and macOS Sequoia, leveraging the prowess of Apple silicon. It comprehends and generates language and imagery, executes tasks seamlessly across applications, and utilizes personal context to streamline and expedite routine activities.
Apple also leverages Private Cloud Compute, which gives it the ability to flex and scale computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers.
Apple Intelligence: Features
Apple Intelligence brings new systemwide Writing Tools built into iOS 18, iPadOS 18, and macOS Sequoia. With them, users can rewrite, proofread, and summarise text nearly everywhere they write, including Mail, Notes, Pages, and third-party apps. Apple Intelligence lets users choose from different versions of what they have written, adjusting the tone to suit the audience and task at hand, from finessing a cover letter to adding humour and creativity to a party invitation. Users can also check for grammatical mistakes and adjust word choice and sentence structure.
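For developers, Apple's pitch is that these tools come largely for free in standard text controls. As a rough illustration, here is a minimal UIKit sketch of opting a text view into the full Writing Tools experience; note that the `writingToolsBehavior` property and `UIWritingToolsBehavior` type used here are assumptions based on Apple's announcement, not a confirmed API surface.

```swift
import UIKit

// Minimal sketch: opting a third-party text view into Writing Tools.
// NOTE: `writingToolsBehavior` and `UIWritingToolsBehavior` are assumed names
// based on Apple's announcement, not a confirmed API.
final class NoteEditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        // Standard text views are expected to pick up rewrite/proofread/summarise
        // automatically; this requests the complete (rather than limited) experience.
        textView.writingToolsBehavior = .complete
        view.addSubview(textView)
    }
}
```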
The default Apple Mail app will also use Apple Intelligence to automatically categorise messages under a new "Priority Messages" section. This section at the top of the inbox shows the most urgent emails, like a same-day dinner invitation or a boarding pass.
Across a user’s inbox, instead of previewing the first few lines of each email, they can see summaries without needing to open a message. For long threads, users can view pertinent details with just a tap. Smart Reply provides suggestions for a quick response, and will identify questions in an email to ensure everything is answered.
Apple’s notification system on iPhones and iPads has long been criticised for being too cluttered. Apple tries to solve that with Priority notifications, which appear at the top of the stack in a grouped manner to surface what’s most important, while summaries help users scan long or stacked notifications by showing key details right on the Lock Screen, such as when a group chat is particularly active.
Reduce Interruptions is a new Focus that surfaces only the notifications that might need immediate attention, like a text about an early pickup from daycare. Then, in the Notes and Phone apps, users can now record, transcribe, and summarise audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.
Read More: iOS 18: Embracing Features Long Loved by Android Users
Next up, Apple Intelligence also powers Image Generation features, such as Image Playground, where users can create fun images in seconds, choosing from three styles: Animation, Illustration, or Sketch. Image Playground is built right into apps including Messages. It’s also available in a dedicated app, so users can try out different concepts and styles. All images are created on device, says Apple.
In Notes, users can access Image Playground through the new Image Wand in the Apple Pencil tool palette. Rough sketches can be turned into full-fledged images, and users can even select empty space to create an image using context from the surrounding area. Image Playground is also available in apps like Keynote, Freeform, and Pages, as well as in third-party apps that adopt the new Image Playground API.
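Apple hasn't published full documentation alongside the announcement, but to give a sense of what adopting the Image Playground API might look like for a third-party app, here is a minimal SwiftUI sketch. The `ImagePlayground` module, the `imagePlaygroundSheet` modifier, and its parameters are assumptions based on Apple's description, not a confirmed interface.

```swift
import SwiftUI
import ImagePlayground // assumed module name for the Image Playground API

struct PartyInviteView: View {
    @State private var isShowingPlayground = false
    @State private var artworkURL: URL?

    var body: some View {
        VStack {
            Button("Create invite artwork") { isShowingPlayground = true }
            if let artworkURL {
                Text("Generated image saved at \(artworkURL.lastPathComponent)")
            }
        }
        // Presents the system Image Playground sheet seeded with a text concept.
        // The modifier name and parameters are assumptions based on Apple's announcement.
        .imagePlaygroundSheet(isPresented: $isShowingPlayground,
                              concept: "birthday party by the beach") { url in
            artworkURL = url // the generated image comes back as a file URL
        }
    }
}
```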
The new Genmoji feature lets users create custom emoji by simply typing a description, along with additional options. Users can even create a Genmoji of friends and family based on their photos. Just like emoji, Genmoji can be added inline to messages, or shared as a sticker or a reaction in a Tapback.
Read More: iPadOS 18: Top 5 New Features We Love
In addition, searching for photos and videos has become even more convenient with Apple Intelligence. Natural language can be used to search for specific photos, such as "Maya skateboarding in a tie-dye shirt," or "Katie with stickers on her face." Search in videos also becomes more powerful with the ability to find specific moments in clips so users can jump right to the relevant segment. Additionally, there's a new Clean Up tool that can identify and remove distracting objects in the background of a photo, working much like Google's Magic Eraser.
A new Memories feature lets users create the story they want to see simply by typing a description. Using its language and image understanding, Apple Intelligence will pick out the best photos and videos that match the description, craft a storyline with chapters based on themes identified in the photos, and arrange them into a movie with its own narrative arc.
Siri is getting a significant upgrade with Apple Intelligence. With richer language-understanding capabilities, Siri is more natural, more contextually relevant, and more personal, with the ability to simplify and accelerate everyday tasks. It can follow along if users stumble over words and maintain context from one request to the next.
Siri can now give users device support everywhere they go and answer thousands of questions about how to do something on iPhone, iPad, and Mac. It will also be able to read what's on the user's screen and, over time, understand and take action with users' content in more apps. Siri will be able to take hundreds of new actions in and across Apple and third-party apps; users can, for example, ask Siri to email a particular photo to a specific contact, and it will handle the request by voice alone.
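Apple exposes this kind of in-app action through its App Intents framework, which third-party apps use to describe what Siri can do on their behalf. The sketch below shows a hypothetical intent for the email-a-photo example above; the intent name, parameters, and dialog are made up for illustration, and a real app would wire `perform()` to its own sharing logic.

```swift
import AppIntents

// Hypothetical intent: the name, parameters, and dialog are illustrative only.
struct EmailPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Email Photo"
    static var description = IntentDescription("Emails the selected photo to a contact.")

    @Parameter(title: "Recipient")
    var recipient: String

    @Parameter(title: "Photo Name")
    var photoName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would locate the photo and hand it to its mail/share logic here.
        return .result(dialog: "Sent \(photoName) to \(recipient).")
    }
}
```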
Apple is also boasting about the privacy and security measures it has employed for Apple Intelligence. It even claims that independent experts can inspect the code that runs on Apple silicon servers to verify privacy, and Private Cloud Compute cryptographically ensures that iPhone, iPad, and Mac do not talk to a server unless its software has been publicly logged for inspection. In other words, Apple says that the device will only talk to the cloud for more complex requests and will try to handle most of the requests locally on the device itself.
Apple is also integrating ChatGPT access into experiences within iOS 18, iPadOS 18, and macOS Sequoia, allowing users to access its expertise — as well as its image- and document-understanding capabilities — without needing to switch across tools.
Siri can tap into ChatGPT's expertise when it's needed. Users are asked before any questions, documents, or photos are sent to ChatGPT, and Siri then presents the answer directly. Siri will leverage the latest GPT-4o model. For those who choose to access ChatGPT, Apple says their IP addresses will be hidden and OpenAI won't store requests.
ChatGPT will come to iOS 18, iPadOS 18, and macOS Sequoia later this year. Users can access it for free without creating an account, and ChatGPT subscribers can connect their accounts and access paid features right from these experiences.
Apple Intelligence: Where & When Will It Be Accessible?
Apple Intelligence is free for users and will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia this fall in U.S. English. Some features, software platforms, and additional languages will come over the course of the next year. Apple Intelligence will be available on iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac models with M1 and later, with Siri and device language set to U.S. English.