
Apple puts a Map to the future on iPhone

Apple has begun rolling out its long-in-the-making augmented reality (AR) city guides, which use the iPhone's camera and display to show you where you are going. The feature also shows part of the future Apple sees for active uses of AR.

Through the looking glass, we see clearly

The new AR guide is available in London, Los Angeles, New York City, and San Francisco. Now, I'm not terribly convinced that most people will feel particularly comfortable waving their $1,000+ iPhones in the air while they weave their way through tourist spots. Though I'm sure there are some people out there who really hope they do (and they don't all work at Apple).

But many will give it a try. What does it do?

Apple announced its plan to introduce step-by-step walking guidance in AR when it unveiled iOS 15 at WWDC in June. The idea is powerful, and it works like this:

  • Grab your iPhone.
  • Point it at buildings that surround you.
  • The iPhone will analyze the images you provide to recognize where you are.
  • Maps will then generate a highly accurate position to deliver detailed directions.

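The steps above can be sketched in code. This is a minimal illustration of the flow, not Apple's implementation: the type and function names here are hypothetical, and the real system matches camera frames against Apple's visual map of the street rather than a simple lookup.

```swift
import Foundation

// Hypothetical sketch of the localization flow described above.
// None of these names are Apple's; they only illustrate the idea.

struct Landmark {
    let name: String
    let latitude: Double
    let longitude: Double
}

// Stand-in for on-device image recognition: in reality the Neural Engine
// analyzes camera frames to identify surrounding buildings.
func recognizeLandmark(inFrame frame: String, from known: [Landmark]) -> Landmark? {
    known.first { frame.contains($0.name) }
}

// Once a landmark is recognized, the coarse GPS fix can be snapped to the
// landmark's surveyed position, giving a much tighter position estimate
// from which Maps can deliver detailed directions.
func refinedPosition(gps: (Double, Double), seen: Landmark?) -> (Double, Double) {
    guard let seen = seen else { return gps }   // no match: fall back to GPS
    return (seen.latitude, seen.longitude)
}

let known = [Landmark(name: "Bond Street Station",
                      latitude: 51.5142, longitude: -0.1494)]
let frame = "camera frame: Bond Street Station entrance"
let fix = refinedPosition(gps: (51.51, -0.15),
                          seen: recognizeLandmark(inFrame: frame, from: known))
// fix now carries the landmark's surveyed coordinates, not the raw GPS guess
```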
To illustrate this in the UK, Apple highlights an image showing Bond Street Station with a big arrow pointing right along Oxford Street. A caption beneath the picture notes that Marble Arch station is just 700 meters away.
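As a sanity check on that figure, the great-circle distance between the two stations can be computed with the standard haversine formula. The coordinates below are approximate values I've supplied, not Apple's; the straight-line result comes out somewhat under 700 meters, which fits, since the quoted figure is the walking route along Oxford Street.

```swift
import Foundation

// Great-circle (haversine) distance, in meters, between two coordinates.
func haversineMeters(lat1: Double, lon1: Double,
                     lat2: Double, lon2: Double) -> Double {
    let r = 6_371_000.0               // mean Earth radius in meters
    let toRad = Double.pi / 180.0
    let dLat = (lat2 - lat1) * toRad
    let dLon = (lon2 - lon1) * toRad
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1 * toRad) * cos(lat2 * toRad)
          * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * atan2(sqrt(a), sqrt(1 - a))
}

// Approximate station coordinates (author's estimates):
let bondStreet = (51.5142, -0.1494)   // Bond Street station
let marbleArch = (51.5136, -0.1586)   // Marble Arch station

let d = haversineMeters(lat1: bondStreet.0, lon1: bondStreet.1,
                        lat2: marbleArch.0, lon2: marbleArch.1)
// d is the straight-line distance; the walking route is a bit longer
```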

This is all useful stuff. Like so much of what Apple does, it makes use of a range of Apple's smaller innovations, particularly (but not entirely) the Neural Engine in Apple's A-series iPhone processors. To recognize what the camera sees and provide accurate directions, the Neural Engine must be drawing on a host of machine learning tools Apple has developed. These include image classification and alignment APIs, trajectory detection APIs, and possibly text recognition and horizon detection APIs. That's the pure image analysis part.
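Apple's internal Maps pipeline is not public, but each of those capabilities has a counterpart in Apple's shipping Vision framework. The sketch below only shows how a developer would instantiate those public requests; whether Maps uses these exact APIs is my assumption.

```swift
import Vision
import CoreMedia

// Public Vision framework requests corresponding to the capabilities above.
// This does not reproduce Apple Maps' pipeline; it only shows that each
// named capability exists as a shipping, developer-facing API.
let classify = VNClassifyImageRequest()          // image classification
let readText = VNRecognizeTextRequest()          // text recognition (signs, station names)
let horizon  = VNDetectHorizonRequest()          // horizon / camera-tilt estimation
let trajectories = VNDetectTrajectoriesRequest(frameAnalysisSpacing: .zero,
                                               trajectoryLength: 5) { _, _ in }
// Image alignment is covered by VNTranslationalImageRegistrationRequest and
// VNHomographicImageRegistrationRequest, which take a reference image.

// Requests are run against camera frames via a request handler, e.g.:
// let handler = VNImageRequestHandler(cvPixelBuffer: frame)
// try handler.perform([classify, readText, horizon])
```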

Copyright © 2021 IDG Communications, Inc.