Siri is about to get a lot smarter.
Today at Apple’s annual Worldwide Developers Conference (WWDC 2016) in San Francisco, the company announced that its virtual personal assistant, Siri, will start to actually integrate with the apps on your phone.
This means you’ll be able to ask Siri to order you a pizza, or post a new Facebook status for you, or order a ride from Uber, or a host of other tasks within your favorite iPhone and iPad apps (including apps developed by companies other than Apple).
Before this announcement, Apple allowed Siri to control devices like smart lights, but most developers who didn’t make physical products couldn’t take advantage of the assistant. Developers will now have access to a Siri API that lets connected apps tap into Siri’s natural language processing, so users can issue commands through the personal assistant.
In iOS, Siri will be able to read your iMessages and make intelligent suggestions about places to eat, or auto-fill locations referenced in the conversation. Siri won’t add to the conversation like a bot, but it will augment your responses.
Apple is also bringing a different flavor of the technology that powers Siri, deep learning, to your photos. The Photos app will be able to run facial recognition and object recognition to build highlight albums from your photos, grouping them into “memories” based on who’s in them and where they were taken.
Siri will also make an appearance in the next version of Apple’s desktop operating system, macOS. The assistant will live in the upper right-hand corner of the menu bar and integrate with macOS’ file search feature.
The last major update to Siri came at WWDC 2015, when Apple expanded the virtual assistant’s capabilities to answer more complex questions. At that time, Apple added abilities like finding specific photos in a user’s Camera Roll and surfacing more detailed sports information.
Follow all of our WWDC 2016 coverage here.