Apple has released the first set of features in its AI-powered Apple Intelligence system.
iPhone, iPad and Mac users can access the tools through the latest software update, available from today.
Apple CEO Tim Cook said Apple Intelligence built on years of innovation in AI and machine learning.
“It puts Apple’s generative models at the core of our devices, giving our users a personal intelligence system that is easy to use,” Mr Cook said.
He said Apple Intelligence was a personal intelligence system that could understand and create language and images, take action across apps and draw from personal context to simplify and accelerate everyday tasks.
Apple’s senior vice president of Software Engineering, Craig Federighi, said the release included a range of useful features, such as writing tools to help refine text and the ability to search photos and videos by describing what you were looking for.
“And it’s all built on a foundation of privacy with on-device processing and Private Cloud Compute, a groundbreaking new approach that extends the privacy and security of iPhone into the cloud to protect users’ information,” Mr Federighi said.
He said other features would be available in December, including “localized English in Australia”.
“Also coming in December, a new visual intelligence experience will build on Apple Intelligence and help users learn about objects and places instantly, thanks to the new Camera Control on the iPhone 16 lineup.
“Users will be able to pull up details about a restaurant in front of them and interact with information – for example, translating text from one language to another.”
Mr Federighi said that in the months to come, Priority Notifications would surface what’s most important, and Siri would become even more capable, with the ability to draw on a user’s personal context to deliver intelligence tailored to them.
“Siri will also gain onscreen awareness, as well as be able to take hundreds of new actions in and across Apple and third-party apps.”