According to foreign media reports, Apple will introduce a rear-facing 3D sensor array on its new iPhone, and plans to buy the laser components for the array from Lumentum, a California-based company that already supplies the lasers for the front-facing TrueDepth system on current iPhones. Apple engineers have reportedly been working on the rear 3D camera for two years, and the company now plans to release at least one product with the camera in the second half of this year.

A new generation of sensors will improve the iPhone's AR capabilities

Of course, Apple is not the only company adding this feature to its 2020 flagship phones. The new Galaxy S20+ and S20 Ultra, which Samsung released last month, carry rear time-of-flight (ToF) sensors, used for real-time focus effects (optional background blur in photos) and quick measurement (letting users measure objects in front of the phone).

However, Apple has made greater progress in building application interfaces and tools that let third-party developers and its own software teams create new experiences and features.
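To make the developer-tooling point concrete, here is a minimal sketch of how an app opts into Apple's existing AR framework, ARKit. The session and configuration types are real ARKit APIs; the view-controller scaffolding and print statement are assumptions for illustration.

import ARKit
import UIKit

final class ARViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        // World tracking uses the cameras and motion sensors to track
        // the device's position and to detect flat surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        session.run(configuration)
    }

    // ARKit calls this when it detects new surfaces in the scene.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARPlaneAnchor {
            print("Detected a surface at \(anchor.transform)")
        }
    }
}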

In 2017, Apple launched a similar front-facing TrueDepth array on the iPhone X. Its key function is Face ID, which verifies a scan of the user's face before unlocking access to personalized files and services on the device.
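For third-party apps, that verification is exposed not as raw sensor data but through Apple's LocalAuthentication framework. Below is a minimal sketch of the usual pattern; the function name and prompt string are illustrative assumptions, while the framework calls are real.

import Foundation
import LocalAuthentication

func unlockWithFaceID(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Confirm that biometric authentication (Face ID or Touch ID)
    // is available and enrolled on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // Prompt the user. The system performs the face scan itself and
    // reports only success or failure back to the app.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your personalized files") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}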

However, this technology still has great untapped potential. Apple may want to put these sensors on the back of the iPhone to spur the creation of new, practical applications and to prepare for the arrival of AR technology.

Tim Cook, Apple's chief executive, has previously said he believes augmented reality will be something of a watershed moment, similar to the launch of the App Store. There is plenty of evidence that Apple has been developing AR glasses internally. Earlier this week, 9to5Mac, a website focused on Apple news, claimed to have found evidence of a new AR app in leaked iOS 14 code, as well as signs that the new iPad Pro models will also carry a rear ToF sensor. The new AR app would let users call up information about products and other items in the space around them.

So far, augmented reality applications on the iPhone have relied on the parallax between the multiple conventional cameras on the back of recent iPhones to estimate depth, and none of those estimates are as accurate as what the new generation of sensor arrays can deliver.
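For a sense of what the newer hardware looks like to software, here is a minimal sketch using ARKit's scene-depth frame semantics, which Apple exposes on devices with a dedicated depth sensor. The frameSemantics and sceneDepth APIs are real; the DepthReader class name and the print statement are illustrative assumptions.

import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()

        // Scene depth is only offered on devices with a dedicated
        // depth sensor, so fall back gracefully everywhere else.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    // With scene depth enabled, every frame carries a dense depth map.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthData = frame.sceneDepth else { return }
        // depthMap is a CVPixelBuffer of per-pixel distances in meters.
        let depthMap = depthData.depthMap
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}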

The introduction of these more precise tools will certainly improve the iPhone's AR capabilities, but it may not be enough to open the watershed Cook predicted. AR experiences on phones are still a bit awkward to use, and it may take a developer gold rush to make AR glasses successful.
