With less than two months to go until WWDC 2019 (Apple's Worldwide Developers Conference), rumors about the next-generation systems for Apple's four software platforms are picking up. Following last week's exclusive on some of the new features coming in iOS 13 and macOS 10.15, the tech site 9to5Mac today brings several developer-focused reports.
▲ WWDC 2019 poster
A large batch of new Siri intents
For developers who integrate with Siri, Apple has prepared a set of new Siri intents, covering media playback, search, voice calling, event tickets, message attachments, train trip information, flight information, and gate and seat information, among others.
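As a rough idea of how such an intent might be handled, here is a sketch in the style of the existing SiriKit handler protocols; the media-playback intent and response types named here are assumptions about an unreleased API, not confirmed by the report.

```swift
import Intents

// Hypothetical sketch: handling a rumored media-playback Siri intent.
// The INPlayMediaIntent types follow SiriKit's existing handler pattern;
// treat the exact names as assumptions.
class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Hand playback off to the app so it can start playing the requested items.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```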
Improvements to the Marzipan cross-platform app project
For developers looking to build cross-platform iOS and macOS apps, Apple is offering new APIs for integrating a UIKit app with Mac-specific features such as the Touch Bar on the MacBook Pro and the desktop menu bar (including keyboard shortcuts). UIKit apps will also be able to open multiple windows on the Mac.
For apps that offer Split View on iOS, the Mac version will let users resize the panes by dragging the divider with the mouse, and double-clicking the divider will reset it to its default position.
In addition, enabling Mac support for an iOS app will be as simple as checking a box in the target settings in Xcode.
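If the multi-window support mentioned above is exposed through a scene-style API, requesting a second window from a UIKit app might look something like the sketch below; the method name and scene-session model are assumptions about an API that had not shipped at the time of the report.

```swift
import UIKit

// Hypothetical sketch: opening an additional window from a UIKit app,
// assuming a scene-session API. Names here are assumptions.
func openNewWindow(restoring activity: NSUserActivity) {
    UIApplication.shared.requestSceneSessionActivation(
        nil,                    // nil asks the system to create a new scene (window)
        userActivity: activity, // the state the new window should restore
        options: nil,
        errorHandler: nil)
}
```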
This year Apple is also bringing a number of improvements to AR development, including a new Swift-only AR framework and a new companion tool that developers can use to author AR experiences visually. ARKit will gain the ability to detect human poses. For AR game developers, the new framework will support game controllers with touchpads as well as stereo AR headsets.
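Following ARKit's existing pattern of per-feature session configurations, the rumored human-pose detection might be enabled roughly like this; the configuration class named here is an assumption, not something the report confirms.

```swift
import ARKit

// Hypothetical sketch: enabling the rumored human pose detection in ARKit,
// assuming it follows the existing ARConfiguration pattern.
func startBodyTracking(in session: ARSession) {
    // Pose tracking would likely be hardware-dependent, so check support first.
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
    // Detected people would then surface as anchors carrying skeleton/joint data.
}
```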
Improvements to the haptic engine, Core ML, and NFC
This time Apple will let developers take full advantage of the haptic engine in the iPhone, Apple Watch, and Mac. Until now, third-party developers have only been allowed to trigger a handful of predefined vibration feedback types.
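A fuller haptics API would plausibly expose an engine that plays events with adjustable intensity and sharpness, rather than a fixed menu of feedback types. The sketch below assumes such an engine/event design; the framework and parameter names are assumptions.

```swift
import CoreHaptics

// Hypothetical sketch of fine-grained haptic control, assuming an
// engine/event API. Parameter values are illustrative only.
func playCustomTap() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // A single transient "tap" with explicit intensity and sharpness,
    // instead of picking from a few predefined feedback styles.
    let tap = CHHapticEvent(eventType: .hapticTransient, parameters: [
        CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
    ], relativeTime: 0)

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: 0)
}
```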
A new framework will let developers add link previews to their apps, similar to the previews already shown in iMessage conversations.
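Using such a framework would presumably involve fetching a link's metadata and dropping a ready-made preview view into the interface, roughly as sketched below; the `LPMetadataProvider`/`LPLinkView` names are assumptions about an unannounced API.

```swift
import UIKit
import LinkPresentation

// Hypothetical sketch: showing an iMessage-style rich link preview,
// assuming a metadata-fetch + preview-view API. Names are assumptions.
func showPreview(for url: URL, in container: UIView) {
    let provider = LPMetadataProvider()
    provider.startFetchingMetadata(for: url) { metadata, error in
        guard let metadata = metadata else { return }
        DispatchQueue.main.async {
            // The framework renders the title, icon, and image for us.
            let linkView = LPLinkView(metadata: metadata)
            linkView.frame = container.bounds
            container.addSubview(linkView)
        }
    }
}
```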
NFC support is also being improved: third-party developers will be able to build apps that read ISO 7816, FeliCa, and MIFARE tags. Previously, third-party apps could only read tags in the NDEF format.
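Reading these tag types would require a lower-level session than Core NFC's current NDEF reader. The sketch below assumes a tag-level reader session in the style of the existing Core NFC delegate APIs; the session class and polling options are assumptions.

```swift
import CoreNFC

// Hypothetical sketch of reading non-NDEF tags (ISO 7816 / FeliCa / MIFARE),
// assuming a tag-level session API in Core NFC's existing delegate style.
class TagReader: NSObject, NFCTagReaderSessionDelegate {
    func beginScanning() {
        // .iso14443 would cover ISO 7816 and MIFARE; .iso18092 covers FeliCa.
        let session = NFCTagReaderSession(pollingOption: [.iso14443, .iso18092],
                                          delegate: self)
        session?.begin()
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let tag = tags.first else { return }
        session.connect(to: tag) { _ in
            // Once connected, raw tag commands could be exchanged here.
            session.invalidate()
        }
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}
    func tagReaderSession(_ session: NFCTagReaderSession,
                          didInvalidateWithError error: Error) {}
}
```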
With the updated Core ML framework, developers will be able to update machine-learning models on the device. Currently, models must be trained in advance and remain static after deployment; the new framework lets an app learn from user actions and improve its behavior over time. Apple is also providing developers with a new API for analyzing sound with machine learning, and the Vision framework will gain a built-in image classifier, so developers no longer need to bundle their own models for image classification.
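The built-in classifier mentioned above would presumably slot into Vision's existing request/handler pattern, needing no model file at all. The request type in this sketch is an assumption about the unreleased API.

```swift
import Vision

// Hypothetical sketch: classifying an image with a classifier built into
// Vision itself, so no Core ML model needs to be bundled with the app.
// VNClassifyImageRequest is an assumed name following Vision's conventions.
func classify(_ image: CGImage) throws -> [VNClassificationObservation] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // Each observation pairs a category label with a confidence score.
    return request.results as? [VNClassificationObservation] ?? []
}
```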
The document-scanning feature built into iOS will be opened to developers through a new public framework. Apps will also be able to capture images directly from external devices such as cameras and SD cards without going through the Photos app.
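If the public scanning framework wraps the existing system scanner UI, using it might look like the sketch below, with the app presenting a camera view controller and receiving the scanned pages via a delegate; the class and delegate names are assumptions.

```swift
import UIKit
import VisionKit

// Hypothetical sketch of the public document-scanning framework, assuming
// a system-provided camera view controller plus delegate callbacks.
class ScannerPresenter: NSObject, VNDocumentCameraViewControllerDelegate {
    func presentScanner(from host: UIViewController) {
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        host.present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        // Each scanned page arrives as a perspective-corrected image.
        for page in 0..<scan.pageCount {
            _ = scan.imageOfPage(at: page)
        }
        controller.dismiss(animated: true)
    }
}
```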
WWDC is now Apple's biggest event of the year. WWDC 2019 will be held from June 3 to 7 at the McEnery Convention Center in San Jose, California, where the company will publicly preview the core features of its next-generation systems: iOS 13, tvOS 13, macOS 10.15, and watchOS 6. In addition to hundreds of carefully prepared sessions and labs, Apple will send nearly a thousand engineers on site to provide technical support to the thousands of attending developers.