Apple's Worldwide Developers Conference begins June 3. Last week we released exclusive details about iOS 13 and macOS 10.15. Today, we're sharing details of new features and APIs for developers expected to be announced at the event, according to sources familiar with the development of Apple's new operating systems.
New Siri Intents, UIKit Apps on the Mac
Siri will gain new intents available to third-party developers, covering media playback, search, voice calls, event ticketing, message attachments, train trips, flights, and airport gate and seat information.
Developers porting their iOS apps to the Mac will have access to new APIs that let their UIKit apps integrate with Mac-specific features such as the Touch Bar and menu bar (including keyboard shortcuts). UIKit apps on the Mac will also be able to open multiple windows.
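The exact shape of the new intents hasn't been announced. For context, this is roughly how SiriKit's existing media playback intent (available since iOS 12) is handled today; the new intents would presumably follow the same pattern, and the class name here is illustrative:

```swift
import Intents

// Sketch only: handling Siri's existing media playback intent
// (INPlayMediaIntent, iOS 12). The new intents described above
// would presumably use the same resolve/handle flow.
class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp tells Siri to launch the app in the background
        // to start playback of the media items it resolved.
        completion(INPlayMediaIntentResponse(code: .handleInApp,
                                             userActivity: nil))
    }
}
```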
Ported iOS apps that use split view will let users resize panes by dragging the divider and reset its position by double-clicking it, just like native Mac apps.
Enabling Mac support for an existing iOS app will be as easy as checking a checkbox in the target settings in Xcode, just as you would to add iPad support to an iPhone-only app.
AR on Apple's platforms will see significant improvements this year, including a brand-new, Swift-only framework for AR and a companion app that lets developers create AR experiences visually. ARKit will be able to detect human poses. For game developers, the operating systems will support controllers with touch pads, as well as stereo AR headsets.
Taptic Engine, Links, NFC, More
A new framework will give developers more control over the Taptic Engine, which is currently exposed to third-party developers only through a small set of predefined feedback styles. There will also be new functionality for developers to include link previews in their apps, much like those in iMessage conversations.
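For reference, the current API surface is limited to UIKit's feedback generators, which offer only a few fixed styles; a minimal example of what's available to third parties today:

```swift
import UIKit

// Today's API: a handful of canned feedback styles via the
// UIFeedbackGenerator subclasses. A new framework would presumably
// allow finer-grained control of the Taptic Engine than these presets.
let impact = UIImpactFeedbackGenerator(style: .medium) // .light / .medium / .heavy
impact.prepare()        // spins up the Taptic Engine to reduce latency
impact.impactOccurred() // plays the canned impact feedback

let notify = UINotificationFeedbackGenerator()
notify.notificationOccurred(.success) // .success / .warning / .error
```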
NFC will receive important enhancements, including the ability for third-party apps to read ISO 7816, FeliCa, and MiFare tags.
A new version of Core ML will allow developers to update their machine learning models on device. Currently, models must be trained in advance and are static after deployment; on-device updating will let apps adapt their behavior as their models learn from user actions. Apple is also adding a new API for developers to do sound analysis with machine learning. The Vision framework is getting a built-in image classifier, so developers won't have to embed a machine learning model of their own just to classify images into common categories.
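To see what the built-in classifier would replace: today, classifying an image with Vision requires bundling your own Core ML model. A minimal sketch of the current approach, where `MyClassifier` stands in for an Xcode-generated class from an embedded model:

```swift
import Vision
import CoreML

// Current approach: the developer ships a Core ML model
// ("MyClassifier" here is a placeholder for an embedded model's
// generated class). A built-in Vision classifier would make the
// bundled model unnecessary for common categories.
func classify(_ image: CGImage) throws {
    let model = try VNCoreMLModel(for: MyClassifier().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        // Print the top three labels with their confidence scores.
        for observation in results.prefix(3) {
            print(observation.identifier, observation.confidence)
        }
    }
    try VNImageRequestHandler(cgImage: image).perform([request])
}
```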
The document scanning feature currently available in some parts of iOS, such as the Notes app, will be made available to developers through a new public framework. With another new API, apps will be able to capture photos from external devices, such as cameras and SD cards, without having to go through the Photos app.
On the Mac, apps will be able to offer file provider extensions, allowing apps like Dropbox to integrate with the Finder. There will also be a new API for writing device drivers.
Apple is expected to introduce iOS 13, tvOS 13, macOS 10.15, and watchOS 6 on June 3 during the WWDC keynote. Developers get immediate access to the first beta, and public betas are later released to members of the public beta program. The final versions of the systems should be released to consumers in September.
Thanks to Steve Troughton-Smith for his help with this report.