The iPhone X has only been shipping for a few weeks, but already we’re starting to learn what capabilities future iPhones may hold. While 2018 looks likely to showcase only small improvements, 2019 may yield a phone with exciting new features.
This year, besides its external redesign, Apple’s 10th-anniversary handset most prominently features an upgraded front-facing camera capable of sensing depth. This TrueDepth camera, as it’s called, projects a pattern of more than 30,000 infrared dots to build a highly accurate 3-D map of your face. That map can then be used for biometric authentication, for powering the movements of Animoji characters (best experienced while singing karaoke), and for fun 360-degree selfie scenes, among other applications.
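The depth math behind a dot-projector system like this comes down to simple triangulation: a dot that lands on a nearby surface appears shifted, relative to a reference pattern, by an amount inversely proportional to its distance. A minimal sketch of that relationship, with made-up focal length, baseline, and shift values purely for illustration (these are not Apple’s specs):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulate depth from a projected-dot shift: z = f * b / d.

    focal_px     -- camera focal length, in pixels (illustrative value)
    baseline_m   -- projector-to-camera separation, in meters (illustrative)
    disparity_px -- how far the dot shifted versus the reference pattern
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A dot shifted 24 pixels, with a 600-pixel focal length and a 2 cm
# baseline, would sit half a meter from the camera.
z = depth_from_disparity(focal_px=600.0, baseline_m=0.02, disparity_px=24.0)
```

Repeat that calculation for tens of thousands of dots and you get the dense depth map that drives Face ID and Animoji.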
Oft-accurate Apple analyst Ming-Chi Kuo of KGI Securities predicts that next year we’ll see three new iPhones, all of which will include TrueDepth camera-sensor technology. The new models are said to include an iPhone X follow-up and a larger 6.5-inch iPhone X–style handset. The third model, while similar to the iPhone X in appearance, would have a traditional LCD display instead of an OLED display. It’s also expected to come in at the typical iPhone price point of $649–$749. This lineup follows Apple’s traditional two-year cycle, with a “big” hardware launch every other year followed by a more iterative, software-driven launch the year in between.
With this tactic, Apple also gets its augmented reality–optimized phone into a greater number of hands before expanding the capability even further in 2019. Right now, Apple is limited: depth-based augmented reality applications are feasible only with the front-facing camera. In coming years, however, Apple is expected to bring its depth-sensing technology to the rear-facing camera, too.
According to Bloomberg, Apple is experimenting with a different laser-based method for bringing 3-D sensing to the iPhone’s rear camera. Reportedly set for inclusion in the 2019 iPhone, the method would measure time of flight, the interval it takes a laser pulse to bounce off the objects in a scene and return, and use that timing to build a 3-D image. With depth-sensing technology built into both of the phone’s cameras, the opportunity for game-changing augmented-reality applications grows substantially.
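The time-of-flight arithmetic itself is straightforward: light travels at a known, fixed speed, so the round-trip time of a reflected pulse directly yields distance. A minimal sketch (the example timing is illustrative, not a figure from Apple’s hardware):

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second, in vacuum

def distance_from_round_trip(seconds: float) -> float:
    """The pulse travels out and back, so one-way distance is half the round trip."""
    return SPEED_OF_LIGHT_M_S * seconds / 2

# A pulse that returns after 20 nanoseconds implies an object roughly 3 m away,
# which is why ToF sensors need timing precision on the order of picoseconds
# to resolve centimeter-scale depth.
d = distance_from_round_trip(20e-9)
```

Doing this per pixel across the sensor produces a depth value for every point in the scene at once, which is what distinguishes the approach from the dot-pattern method used by the front camera.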
We’ve already seen how addictive and entertaining augmented reality–based gaming can be, with titles such as Pokémon Go among the most popular mobile games of 2016. With more detailed 3-D imagery to work with, games like this can become even more immersive and challenging for players.
Adding this technology to the rear-facing camera could also be useful in more practical applications; designers and creators who need to do 3-D scans and renderings, for example, could put it to work. Right now, a desktop-size 3-D scanner costs anywhere from $130 to $500, and 3-D–scanning apps are typically less accurate and far more time-consuming. With laser-based time-of-flight technology built into the iPhone, 3-D scanning could become faster, more efficient, and cheaper than current solutions. At the least, it could make the question of whether that new couch will fit in your living room easier to answer.
As it’s still early days for testing this technology, there’s no guarantee it’ll end up in the iPhone. But given Apple’s dedication to augmented reality and its excitement about the field, it would be surprising not to see the company advance the iPhone’s augmented reality capabilities in some way in the near future.
With augmented reality driving the future of iPhone innovation, the next few years of iPhone evolution should be interesting to watch and exciting to experience. With augmented reality built into our handsets, we can move beyond the screen and meld technology more seamlessly into our real-world experiences.