Apple Now Has A Neural Network API

Unlock Apple's machine learning with a few lines of code
Image: WWDC 2016 keynote. Apple's keynote will take place the first day of WWDC, June 13. Credit: Apple


Neural networks are all the rage these days: they’re a way to get software to make decisions based on overwhelming amounts of data, and they’re the fundamental component of deep learning, which allows computers to recognize photos, speech, and text with unparalleled accuracy.

Apple, which has traditionally kept its artificial intelligence research under wraps, is now allowing developers to build neural networks by calling on the company’s simple API.

Those developing with Apple’s neural networks, called Basic Neural Network Subroutines, won’t be able to train the networks on their own data. Instead, Apple has pretrained them for certain tasks, and from the documentation the API seems very focused on image recognition.

“For example, the initial input might be an image and the inference might be that it’s an image of a dinosaur,” the documentation says, in the only concrete use case given.
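
For developers, the subroutines surface as plain C functions shipped with Apple’s Accelerate framework, so using a layer comes down to describing its inputs, outputs, and weights and then applying it. The sketch below builds a single fully connected layer and runs one input vector through it; the layer sizes, weights, and input values are made-up placeholders rather than anything from Apple’s documentation, and a real image classifier would chain many such layers together.

```c
// Minimal sketch of calling BNNS from C via the Accelerate framework.
// All numbers here are illustrative placeholders, not a real trained model.
#include <Accelerate/Accelerate.h>
#include <stdio.h>

int main(void) {
    // Describe a tiny fully connected layer: 4 inputs -> 2 outputs, 32-bit floats.
    BNNSVectorDescriptor in_desc  = { .size = 4, .data_type = BNNSDataTypeFloat32 };
    BNNSVectorDescriptor out_desc = { .size = 2, .data_type = BNNSDataTypeFloat32 };

    // Placeholder weights and biases; in practice these come from a network
    // trained elsewhere, since the API itself only performs inference.
    float weights[8] = { 0.1f, 0.2f, 0.3f, 0.4f,
                         0.5f, 0.6f, 0.7f, 0.8f };
    float bias[2] = { 0.0f, 0.1f };

    BNNSFullyConnectedLayerParameters layer_params = {
        .in_size    = 4,
        .out_size   = 2,
        .weights    = { .data = weights, .data_type = BNNSDataTypeFloat32 },
        .bias       = { .data = bias,    .data_type = BNNSDataTypeFloat32 },
        .activation = { .function = BNNSActivationFunctionRectifiedLinear },
    };

    // Create the filter, run one input vector through it, then clean up.
    BNNSFilter filter = BNNSFilterCreateFullyConnectedLayer(&in_desc, &out_desc,
                                                            &layer_params, NULL);
    float input[4]  = { 1.0f, 2.0f, 3.0f, 4.0f };
    float output[2] = { 0.0f, 0.0f };

    if (filter != NULL && BNNSFilterApply(filter, input, output) == 0) {
        printf("output: %f %f\n", output[0], output[1]);
    }
    BNNSFilterDestroy(filter);
    return 0;
}
```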

The API will run on macOS, iOS, tvOS, and watchOS, and is optimized for each device’s CPU.

Follow all of our WWDC 2016 coverage here.