Once again 9to5Mac’s Gui Rambo has published a big report about features coming in iOS 13 and macOS 10.15. This particular report focuses on the developer side of things and new frameworks.
There will be new Siri intents developers can adopt, including media playback, search, voice calling, event ticketing, message attachment, train trip, flight, airport gate and seat information.
Nothing surprising here. Siri needs many more intents, and this is a step in the right direction.
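For context on what "adopting" an intent means: today a SiriKit intent is implemented by conforming to a handling protocol in an Intents extension, and the new media and ticketing intents will presumably follow the same shape. A minimal sketch using the existing `INSendMessageIntent`:

```swift
import Intents

// Minimal sketch of today's SiriKit pattern: an Intents-extension handler
// resolves parameters, then handles the intent. The rumored new intents
// (media playback, ticketing, …) would presumably get analogous
// protocols and response types.
final class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue())
        }
    }

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the message off to the app's own sending machinery here.
        let activity = NSUserActivity(activityType: String(describing: INSendMessageIntent.self))
        completion(INSendMessageIntentResponse(code: .success, userActivity: activity))
    }
}
```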
Developers porting their iOS apps to the Mac will have access to new APIs that allow their UIKit apps to integrate with Mac-specific features such as the Touch Bar and menu bar (including keyboard shortcuts). UIKit apps on the Mac will also be able to open multiple windows.
We already knew that we (developers) will be able to port iOS apps over to the Mac. What we didn’t know until now was whether Marzipan apps would be able to open multiple windows. Now we know.
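We don’t know yet what the UIKit-side API will look like. For comparison, here is roughly what opening an extra window takes in a native Mac app today — a sketch, assuming a plain storyboard-less AppKit setup; whatever Apple ships for Marzipan will presumably wrap something equivalent:

```swift
import AppKit

// Sketch: how a native Mac app opens an additional window with AppKit.
// Each window controller owns its own window, so showing a second
// window is trivial.
final class DocumentWindowController: NSWindowController {
    convenience init(title: String) {
        let window = NSWindow(contentRect: NSRect(x: 0, y: 0, width: 480, height: 320),
                              styleMask: [.titled, .closable, .resizable],
                              backing: .buffered,
                              defer: false)
        window.title = title
        self.init(window: window)
    }
}

let second = DocumentWindowController(title: "Second Window")
second.showWindow(nil)
```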
AR on Apple’s platforms will gain significant improvements this year, including a brand new Swift-only framework for AR and a companion app that lets developers create AR experiences visually.
I’m in trouble! This companion app sounds very much like my app Yttrium. It still seems like you have to use a Mac to design and build everything, but this is Apple at their best: Sherlocking other people’s products. It’s fine if they do a good job at it!
On the other hand, the new framework sounds very interesting. We already have SceneKit and SpriteKit for AR, but both are gaming-focused frameworks. If Apple really wants the future to be AR, they still need a UI framework. I’m not sure whether this is it, but it’s very exciting nonetheless.
ARKit gets the ability to detect human poses.
Cool, but not revolutionary.
For game developers, the OS will support controllers with touch pads and stereo AR headsets.
With a new version of CoreML, developers will be able to update their machine learning models on-device.
This is great because CoreML models will be able to improve themselves the more you use them. It’s also great because it means the app you’re using won’t have to send any raw data back to the developer just to improve a model.
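A purely hypothetical sketch of what this could look like — the real API is unannounced, so every name below is a guess: some kind of update task that takes the compiled model plus new on-device training examples and writes an improved model back to disk.

```swift
import CoreML

// Hypothetical sketch - we don't know the real API yet. Something like
// an "update task" fed with on-device training examples seems plausible.
// All names below are guesses, not confirmed API.
func personalize(modelAt modelURL: URL, with examples: MLBatchProvider) throws {
    // Imagined: kick off an on-device training pass over the new examples.
    let task = try MLUpdateTask(forModelAt: modelURL,
                                trainingData: examples,
                                configuration: nil) { context in
        // The updated model never leaves the device - no raw user data
        // has to be sent back to the developer.
        try? context.model.write(to: modelURL)
    }
    task.resume()
}
```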
Apple is also adding a new API for developers to do sound analysis with machine learning.
I’m all over this. Expanding the ML frameworks beyond image recognition is really important, and with Create ML integration this could be huge.
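Again purely speculative, since nothing has been announced: a plausible shape for such an API would be running a Create ML-trained classifier over an audio file and observing classification results. Every type name here is an assumption:

```swift
import CoreML
import SoundAnalysis

// Hypothetical sketch - the API is unannounced, all names are guesses.
// Imagined flow: attach a classification request backed by an ML model
// to an audio-file analyzer, then receive results via an observer.
final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }
}

func classifySounds(in fileURL: URL, using model: MLModel) throws {
    let observer = SoundObserver()  // keep a strong reference while analyzing
    let analyzer = try SNAudioFileAnalyzer(url: fileURL)
    let request = try SNClassifySoundRequest(mlModel: model)
    try analyzer.add(request, withObserver: observer)
    analyzer.analyze()  // processes the file and calls the observer
}
```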
With a new API, apps will be able to capture photos from external devices such as cameras and SD cards, without having to go through the Photos app.
Basically addressing one of Nilay Patel’s problems with the iPad Pro.
On the Mac, apps will be able to offer file provider extensions, improving the way certain apps such as Dropbox can integrate with Finder.
Cleaning up the user folder.
This is another great report from 9to5Mac and now WWDC can’t come fast enough. Especially since I’ll be there.