On November 22nd, rumors swirled that Apple would launch an augmented reality (AR) or mixed reality (MR) device in the near future. While Apple hasn't discussed the device in detail, AR features have become a major focus on the iPhone and are rapidly getting better.
Apple entered AR in 2017, when virtual IKEA furniture and realistic-looking outdoor Pokémon Go battles caused a sensation. This year, Scott Stein, a senior U.S. technology editor, used Apple's new iPhone 12 Pro to scan fire hydrants, map the interior of his home, and navigate a lava river flowing across his floor. In many ways, the depth-sensing lidar sensor on Apple's newest iPhones and iPads, with its advanced 3D scanning capabilities, is becoming the backbone of Apple's future AR devices.
Facebook, Microsoft and Magic Leap are already exploring goggles and glasses designed to blend the virtual and the real, and more VR devices built on Qualcomm chips will arrive in the future. But Apple AR director Mike Rockwell said:
Apple's killer AR app: the smartphone
Virtual reality devices such as the Oculus Quest 2 keep improving in quality, but compared with smartphones they still reach relatively few people. Moor Insights senior consumer chip analyst Anshel Sag said:
Apple says there are already 10,000 AR-enabled iOS applications from 7,000 developers, many of them focused on shopping or home design as practical ways to use AR at home. Practicality seems to be what Apple values most at the moment. Rockwell said:
Although the COVID-19 pandemic has closed many physical businesses and slowed travel for most people, AR tools for shopping from home remain a major focus for Apple. Much like approaches Google and Microsoft are pursuing, phone-based AR tools let you preview items you might want to buy as 3D scans, right on your phone at home. The integration of these tools with the Safari browser makes pop-up AR shopping look like a ready substitute for a trip to the store.
McGinnis, citing Shopify and Build.com data, said:
The same seems to be true for phone-based AR creative tools, including Adobe's: the company built its AR authoring app, Aero, for Apple's platform. Adobe's head of AR, Stefano Corazza, said:
Meanwhile, smartphones such as the $999 iPhone 12 Pro could be a major creative tool for future AR devices. Corazza said:
That's the same model Qualcomm is already planning for future AR/VR devices, but it could take years to arrive. In the meantime, smartphones will play a larger role. Corazza said of the iPhone 12 Pro:
Lidar as a creative tool for AR
Apple's first step into AR used the phone's motion sensors, gyroscope and built-in cameras to recognize floors, and later walls and people. The lidar-equipped iPhones and iPads go a step further, firing an array of infrared lasers from a small black circle near the rear cameras to quickly build a 3D mesh of an entire room.
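For a sense of how developers tap that sensor, here is a minimal sketch using ARKit's real scene-reconstruction API (the class and property names are Apple's; the fragment itself is illustrative, not Apple sample code, and runs only on a lidar-equipped iOS device):

```swift
import ARKit

// Sketch: enabling lidar scene reconstruction in ARKit.
// ARKit delivers the room's surfaces incrementally as ARMeshAnchor objects.
class MeshScanner: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene reconstruction requires lidar hardware
        // (iPhone 12 Pro, 2020 iPad Pro and later).
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }

        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh   // build a 3D mesh of the room
        config.frameSemantics = .sceneDepth  // per-frame depth map from the lidar
        session.delegate = self
        session.run(config)
    }

    // Each anchor carries a chunk of the reconstructed mesh geometry.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for mesh in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            print("Mesh chunk with \(mesh.geometry.faces.count) faces")
        }
    }
}
```

Apps like the scanning tools mentioned below build on exactly this mesh data, exporting it as shareable 3D models.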
This also extends to 3D scanning of objects in a space. It's an evolution of technology Google explored years ago with its depth-sensing Tango phones, but more advanced and more widely deployed. Many of the early lidar apps, such as Polycam, 3D Scanner App and Record3D, are creative and focused on 3D capture, a big shift from the eye-catching, game-playing AR apps of 2017.
Sketchfab, one of the largest repositories of 3D objects on the internet, has seen growth even though 3D scanning has been explored for years. Sketchfab just reached 4 million subscribers and had its first profitable month since it began offering the service in 2012.
But as Sketchfab chief executive Alban Denoyel said:
Snapchat has explored using lidar for AR effects, placing virtual objects into the real world, and has even run larger experiments scanning entire city neighborhoods. Snap's vice president of camera platform, Eitan Pilipski, said:
But even with these possibilities, learning to use these new tools can be daunting. Apple's own AR authoring tool, Reality Composer, and Adobe's 3D AR authoring app, Aero, aren't apps most people will download right away, if ever. So far, 3D scanning apps have been fascinating but experimental, and not always intuitive. Apple has largely left the world of 3D scanning to third-party developers, and its core iOS tools don't yet integrate these functions at all.
Apple's support for 3D-scanned objects across iOS does suggest that scanned objects may eventually be shared as easily as PDFs or photos. But in some ways, the creative tools of the future don't fully exist yet.
Photo effects are also seeing striking improvements: Apple's own Camera app uses the iPhone 12 Pro's lidar to improve focusing for night photos and portraits. But Apple hasn't yet built AR into the Camera app, and it doesn't do any 3D scanning; those ideas are left to developers to explore. Some apps, such as DSLR Camera, already use the iPhone's lidar to build a custom layer of 3D information on top of photo data, for instance to place 3D text into photos.
Fervio, DSLR Camera's founder, said:
AR as an accessibility tool for extending the senses
Apple believes discoverability is key to AR's killer app, but there's another big opportunity in accessibility: AR can extend a person's senses. In the audio space, Apple has used AirPods for hearing assistance, and Facebook is exploring spatial audio technology for hearing aids.
The same can happen with assisted vision. Future vision aids, such as the augmented contact lenses promised by Mojo Lens, aim to become useful tools for people with impaired vision. Apple is likely to follow a similar path with AR on the iPhone, and with future devices working as assistive tools. iOS 14.2 already includes a new People Detection feature, which uses Apple's AR and lidar to gauge how far away people are, assisting users with low vision on the newest iPhones.
This may just be the beginning. Rockwell said:
AR's future killer advantage: immediacy
Even people who use AR regularly often forget to seek out new AR apps in their daily iPhone or iPad use. When we're busy in the real world, discovering new things in the virtual one isn't a seamless process.
Rockwell believes the future of iPhone AR lies not in standalone apps but in quick, glanceable experiences. He explained:
One way to get there is App Clips, Apple's new iOS 14 mechanism for launching small micro-apps on the iPhone without downloading anything. App Clips can be triggered by NFC tags or scannable codes placed in the real world. It also ties into Apple's mapping work: the company's new Location Anchors mean virtual AR objects can be pinned to real-world locations, so a virtual artwork in Times Square, for example, could be seen and shared by many people at once.
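The Location Anchors feature is exposed to developers through ARKit's geo-tracking API. A minimal sketch (real ARKit class names; the Times Square coordinates are illustrative, and geo-tracking only works in cities Apple has mapped):

```swift
import ARKit
import CoreLocation

// Sketch: pinning a shared virtual object to a real-world location
// with ARGeoAnchor (iOS 14's Location Anchors).
func placeSharedAnchor(in session: ARSession) {
    // Geo-tracking requires a supported, lidar/A12+ device.
    guard ARGeoTrackingConfiguration.isSupported else { return }
    session.run(ARGeoTrackingConfiguration())

    // Availability is limited to areas Apple has mapped for geo-tracking.
    ARGeoTrackingConfiguration.checkAvailability { isAvailable, _ in
        guard isAvailable else { return }

        // Illustrative coordinates: Times Square, New York.
        let coordinate = CLLocationCoordinate2D(latitude: 40.7580,
                                                longitude: -73.9855)
        // Everyone who resolves this anchor sees content at the same spot.
        session.add(anchor: ARGeoAnchor(coordinate: coordinate))
    }
}
```

Because the anchor is defined by latitude and longitude rather than a single device's session, multiple users can see the same virtual object in the same physical place.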
About Location Anchors, Rockwell said:
Both efforts are still works in progress for Apple's AR technology, and location-based AR may be unlikely to appear in public places, shops or museums while pandemic lockdowns continue. But Apple believes they're crucial to making AR something people use every day. Rockwell said:
Acute Art chief executive Jaco
Laying the groundwork now for the future
Combine Apple's lidar-based 3D scanning, its increasingly powerful AR visual tools, and the spatial audio introduced with AirPods Pro (which can make what you're listening to sound as if it's moving through 3D space), and it's not hard to imagine a future Apple AR headset.
Apple declined to comment. But in the meantime, the company is working to get a large pool of developers building AR-enabled apps. Whether or not an Apple AR device arrives soon, increasingly space-aware iPhones and iPads are turning phones into world-scanning devices, ones that could even find use in robotics, or in unexpected places as computer-vision cameras.