How Microsoft built an AR app with OpenStreetMap, ElasticSearch and Mapzen

Erik Schlegel dives into how Microsoft built Cities Unlocked, an AR navigation system for visually impaired users.


Cities Unlocked announces location information as the user moves or turns their head, so the responsiveness and availability of our backend service had to be realtime. The planetary shapes we’re working with are often composed of complex geometry (e.g. parks and buildings). The expectation was that our services could seamlessly onboard new cities to the platform while maintaining acceptable performance as data volumes grow.

Our mapping data provider also had to have a strong story around accessibility (e.g. crossings, curbs, sidewalks, stairs). The data model had to be flexible as well, so we could easily add new places, features and tags.

Microsoft turned to ElasticSearch, Mapzen vector tiles and Valhalla routing, and Wikipedia.

We implemented a vector-tile-based solution that collects mapping data as square GeoTiles into ElasticSearch. Each tile is a square area of land with a defined size and location, identified by a column, row and zoom level (X/Y/Z). The lowest zoom level consists of four tiles, and each successive zoom level divides a parent tile into four children. We used a zoom level of 16, which gives an average bounding box of roughly 460 meters x 460 meters.
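For readers unfamiliar with the X/Y/Z scheme, here is a minimal sketch of the standard slippy-map tile math (not the Cities Unlocked code itself), showing how a WGS84 coordinate maps to tile indices at a given zoom:

```python
import math

def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int]:
    """Convert a WGS84 coordinate to slippy-map tile indices (X/Y) at a zoom level."""
    n = 2 ** zoom                      # number of tiles per axis at this zoom
    lat_rad = math.radians(lat_deg)
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Example: downtown Seattle at the zoom level the article uses.
print(latlon_to_tile(47.6062, -122.3321, 16))
```

At zoom 16, each tile spans 360 / 2^16 ≈ 0.0055 degrees of longitude: about 611 meters at the equator, and closer to the ~460 meters quoted above at mid latitudes.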

We fetch nearby GeoTile data based on the user’s position and a 3-kilometer radius, using Mapzen’s GeoJSON-based Vector Tile Service. Streets, buildings and parks are stored as GeoJSON LineStrings, while addresses and points of interest are stored as single GeoJSON Points.
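As a rough illustration of that ingestion path (a sketch, not the production pipeline), the snippet below pulls one GeoJSON tile and indexes its features into ElasticSearch as geo_shape documents. The geotiles index name and document layout are hypothetical, and the tile URL follows the v1 endpoint pattern Mapzen documented at the time (the hosted service has since been retired):

```python
import requests
from elasticsearch import Elasticsearch

# "all" bundles every layer: roads, buildings, pois, and so on.
TILE_URL = "https://tile.mapzen.com/mapzen/vector/v1/all/{z}/{x}/{y}.json"

es = Elasticsearch("http://localhost:9200")
# Hypothetical index: one document per feature, geometry stored as geo_shape.
es.indices.create(index="geotiles", body={
    "mappings": {"properties": {"geometry": {"type": "geo_shape"}}}})

def ingest_tile(x: int, y: int, zoom: int, api_key: str) -> None:
    tile = requests.get(TILE_URL.format(z=zoom, x=x, y=y),
                        params={"api_key": api_key}).json()
    for layer, collection in tile.items():
        for feature in collection.get("features", []):
            es.index(index="geotiles", body={
                "layer": layer,                    # e.g. "roads", "pois"
                "tile": f"{zoom}/{x}/{y}",         # tile key for later lookups
                "properties": feature.get("properties", {}),
                "geometry": feature["geometry"],   # LineString or Point
            })
```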

And best of all, they are giving data back to OpenStreetMap, making it better for everyone.

Utilizing an open spatial data solution like OpenStreetMap was critical to our success, as we needed full read/write access to sidewalk objects (hydrants, mailboxes, trash cans, etc.) to maintain an acceptable level of data quality. The current coverage of sidewalks and crossings in OSM is sparse. This presented a challenge, as the open source Valhalla routing engine falls back to roads and intersections in the absence of pedestrian data in OSM. We’re currently undertaking an exploratory effort that pairs machine learning models with aerial and street-level imagery to infer sidewalk and crossing features, with the goal of contributing that dataset back to OpenStreetMap to improve the quality of Valhalla pedestrian routes.
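To make the routing side concrete, here is a hedged sketch of a pedestrian request against Valhalla’s documented /route API. The host is a placeholder for wherever the engine is deployed, and the coordinates and costing values are illustrative rather than the app’s actual settings:

```python
import requests

# Placeholder host; Valhalla's route endpoint accepts a JSON request body.
VALHALLA_URL = "http://localhost:8002/route"

request_body = {
    "locations": [
        {"lat": 47.6062, "lon": -122.3321},   # origin (example: downtown Seattle)
        {"lat": 47.6205, "lon": -122.3493},   # destination (example: Space Needle)
    ],
    "costing": "pedestrian",
    # Factors below 1.0 bias the route toward sidewalks and walkways
    # where OSM actually has them mapped.
    "costing_options": {"pedestrian": {"sidewalk_factor": 0.8,
                                       "walkway_factor": 0.9}},
}

route = requests.post(VALHALLA_URL, json=request_body).json()
print(route["trip"]["summary"]["length"], "km")
```

With sparse sidewalk coverage in OSM, those factors have little to bias toward, which is exactly why the inferred-sidewalk dataset matters for pedestrian routing quality.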

If you want to learn more, Erik spoke about this at State of the Map US in July.

We are thrilled to be involved with this effort, as it falls in line with our philosophy on mobility and accessibility, where people and the freedom of movement come first.