iOS 18.1 and Beyond: Siri's Apple Intelligence Features
With Apple Intelligence, Apple is aiming to make Siri smarter than ever before. The personal assistant will be able to learn more about you, do more in apps, and hand requests over to ChatGPT when it needs the help of a more capable assistant.
Siri has a refreshed design on devices that support Apple Intelligence. Rather than the small waveform that used to appear when activating Siri, there's now a pink/purple/blue/orange variegated glow that wraps around the edge of the display, with the colors shifting as Siri listens to a command.
Siri can better understand conversational language and requests, so if you stumble over your words or change your mind mid-sentence, Siri can still follow what you're saying.
Siri has a new, more natural-sounding voice.
There is a built-in Type to Siri feature, so you don't need to speak to interact with the personal assistant. To use it, double-tap the bottom edge of the screen on an iPhone or iPad to bring up a text field, then type in your request.
In iOS 18.2, ChatGPT integration is available with Siri. If a user asks something that Siri is not capable of handling, ChatGPT can provide a response instead, so long as the user gives permission.
The Siri ChatGPT integration can essentially be used to do anything you can do with the ChatGPT app or ChatGPT on the web; it's simply a quicker way to get to ChatGPT.
You can ask Siri a question about anything on your screen, such as an image. If you have a photo of a plant, for example, asking "What is this?" will prompt Siri to send a screenshot over to ChatGPT, and ChatGPT will attempt to provide context.
ChatGPT can be used to describe a scene, which is useful for people who might have issues with sight. Opening the Camera app, activating Siri, and asking "What is this?" will provide a detailed description of whatever is in front of you.
For emails, documents, PDFs, and more, ChatGPT can provide a summary. When you ask "Can you summarize this?", Siri sends ChatGPT either a screenshot or the entire document, including full PDFs. It's a useful feature for getting a quick overview of a long document.
Rather than asking for a summary, you can instead ask a specific question about a document. If you're looking at an insurance policy, you can ask "What are the limits of this policy?" or "What are the exclusions?" to get more tailored information.
If you've written an email, rather than selecting it and using Writing Tools to check it for spelling and grammar errors, you can ask Siri to take a look, and Siri will send a screenshot to ChatGPT. "Can you look this over for errors?" works as a command for this feature.
ChatGPT can generate text from scratch based on prompts that it is given. You can, for example, ask Siri to ask ChatGPT to write a poem or compose a polite letter to a friend, and ChatGPT will create something from scratch.
If you have ChatGPT write something for you, you can tap on the copy icon to copy it to the clipboard to paste it into Notes, Messages, a document, or an email.
You can also create images. Using the DALL·E 3 engine, ChatGPT can make realistic AI-generated images, something that can't be done with Apple Intelligence. For image requests, it's easiest to say "Tell ChatGPT to make an image of [thing you want an image of]," because if you simply ask Siri to make or generate an image, it will often bring up web images instead.
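For context on what that engine does behind the scenes, here is a minimal Swift sketch of a direct DALL·E 3 request through OpenAI's public images API. This is not how Siri talks to ChatGPT internally; the endpoint, parameters, and the OPENAI_API_KEY environment variable are assumptions used purely for illustration.

```swift
import Foundation

// Hypothetical sketch: send a prompt to OpenAI's images endpoint and
// return the URL of the generated image. Assumes OPENAI_API_KEY is set
// in the environment; this is unrelated to Siri's own plumbing.
func generateImage(prompt: String) async throws -> URL? {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/images/generations")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let apiKey = ProcessInfo.processInfo.environment["OPENAI_API_KEY"] ?? ""
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "model": "dall-e-3",
        "prompt": prompt,
        "n": 1,
        "size": "1024x1024"
    ] as [String: Any])

    let (data, _) = try await URLSession.shared.data(for: request)
    // The response carries a "data" array whose entries include an image URL.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let first = (json?["data"] as? [[String: Any]])?.first
    return (first?["url"] as? String).flatMap(URL.init(string:))
}
```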
One of the best use cases for ChatGPT through Siri is getting answers to queries that are just a bit too complex for Siri. Questions that Siri can't handle will be handed over to ChatGPT with your permission, but you can also force Siri to use ChatGPT instead of its own engine by prefacing questions with "Ask ChatGPT."
Some example queries that Siri will automatically consult ChatGPT on:
While Siri can do all of these things with ChatGPT's help, the lack of continuity with the Siri version of ChatGPT makes it difficult to complete tasks that are not one-off requests. Creating a meal plan, for example, works better with the actual ChatGPT interface because you can have more of a conversation rather than relying on a single request.
ChatGPT integration has to be turned on, and after that, each request requires user permission by default. You can skip the per-request prompt by toggling off the "Confirm ChatGPT Requests" option.
The toggle can be found by opening the Settings app, choosing Apple Intelligence & Siri, and tapping ChatGPT. With the feature disabled, Siri will not ask each time before sending information to ChatGPT.
Siri will, however, always ask permission before sending a file to ChatGPT even with the confirm requests feature turned off.
As for privacy, no login is required to use ChatGPT, and neither Apple nor OpenAI logs your requests. If you sign in with a paid ChatGPT account, however, OpenAI can keep a copy of your requests.
ChatGPT integration includes a limited number of free requests that use GPT-4o, the more advanced model. After those are used up, ChatGPT integration falls back to GPT-4o mini, which is less capable but uses fewer resources.
Apple users essentially get the equivalent of ChatGPT's free tier, so limits on requests that use the advanced capabilities reset every 24 hours. On this tier, two images can be generated per day.
There is overlap between what's possible with Apple Intelligence and what you can do with ChatGPT integration, but there are some distinctions. Apple Intelligence has Writing Tools for rewriting and editing what you've already written, but ChatGPT can write content from scratch.
Image Playground, Image Wand, and Genmoji allow you to generate images, but Apple Intelligence won't generate realistic-looking images; styles are limited to those that look animated or sketched. ChatGPT, on the other hand, will generate lifelike images.
Apple Intelligence can also summarize documents, but only when you select text and choose the Summarize option from Writing Tools. It can't answer more specific questions about PDFs and documents, so ChatGPT does have an edge for that kind of query.
When you ask ChatGPT a question through Siri, you need to make sure to read the answer right away because it doesn't stay on the screen long. Apple does not keep a record of it, either.
If you're logged in to ChatGPT, a history is kept in your OpenAI account; if you're not logged in, there's no log and no way to get back information you've received from ChatGPT.
Apple has only added ChatGPT integration right now, but support for Google Gemini is planned in the future.
There are several Siri features that are still in development, with Apple planning to add these capabilities to Siri next year. Timing isn't concrete yet, but rumors suggest we'll see them in iOS 18.4 in the spring.
Siri will be able to keep track of your emails, messages, files, photos, and more, learning more about you to help you complete tasks and keep track of what you've been sent.
Siri will be able to tell what's on your screen and complete actions involving whatever you're looking at. If someone texts you an address, for example, you can tell Siri to add it to their contact card. Or if you're looking at a photo and want to send it to someone, you can ask Siri to do it for you.
Siri will be able to do more in and across apps, performing actions and completing tasks that are just not possible with the personal assistant right now. We don't have a full picture of what Siri will be capable of, but Apple has provided a few examples of what to expect.
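Apple hasn't said exactly how apps will hook into these deeper Siri actions, but third-party apps already expose actions to Siri and Shortcuts through the App Intents framework, which is widely expected to be the plumbing for the new features. Below is a minimal, hypothetical sketch of the kind of action an app might expose; the intent name and behavior are illustrative, not Apple sample code.

```swift
import AppIntents

// Hypothetical intent an app could declare so Siri can invoke it.
// The name, parameter, and dialog are assumptions for illustration.
struct SendLatestPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Latest Photo"
    static var description = IntentDescription("Sends the most recent photo to a contact.")

    // Siri fills this in from the user's request, e.g. "send it to Mom."
    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would hand the photo to its messaging layer here.
        return .result(dialog: "Sending your latest photo to \(recipient).")
    }
}
```

Once an app declares intents like this, Siri and Shortcuts can surface them; the expectation is that the smarter, context-aware Siri will chain these same actions together on the user's behalf.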
After all of the Siri Apple Intelligence features have been implemented in iOS 18, Apple plans to unveil the next-generation Siri, which will rely on large language models. An LLM version of Siri is already in development, and it will be able to better compete with chatbots like ChatGPT.
LLM Siri will be able to hold ongoing conversations, and it will be more like speaking with a human. Large language model integration will let Siri perform more complex tasks, and in the future, Siri likely won't need to rely on ChatGPT.
The updated version of Siri will replace the current version of Siri in the future. Apple is expected to announce LLM Siri in 2025 alongside the introduction of iOS 19, but the update likely won't launch until spring 2026.
Apple Intelligence was designed with privacy in mind, and many requests are handled on-device. All personal context learning, for example, is done with on-device intelligence and nothing leaves your iPhone or iPad.
For requests that need the processing power of a cloud server, Apple uses Private Cloud Compute on Apple silicon servers to handle complex tasks while preserving user privacy. Apple promises that data is never stored and is used only to fulfill the request.
Apple Intelligence is available on the iPhone 15 Pro, the iPhone 15 Pro Max, all iPhone 16 models, the iPad mini with the A17 Pro chip, and all iPads and Macs with an M-series chip.