
5 Cool Uses of Augmented Reality 2.0 (Now With Computer Vision!)


For some time now, augmented reality (AR) has been one of Silicon Valley’s most talked-about new technologies, but the accessibility of the tech has been limited at best—stuff like Snapchat filters that just stick a few animations on faces, or Pokémon Go-like games that use the player’s location to spawn AR characters rather than actually “see” what’s around them.

Recent developments, however, look to change all that. Google's Tango technology, currently available in Lenovo's Phab 2 Pro and Asus's ZenFone AR smartphones, uses computer vision to let mobile devices detect their position relative to the world around them, achieved through a combination of depth-sensing cameras, accelerometers and machine learning algorithms rather than relying exclusively on GPS or other external signals. This allows app developers to create breakthrough experiences like indoor navigation, 3D mapping and measurement of physical space, and environmental recognition, blurring the boundaries between the real and the synthetic by placing users in virtual environments and bringing virtual objects into the real world.

Now, with the wider launch of Google's ARCore and Apple's ARKit, both of which can use the existing cameras and sensors on most current smartphones, augmented reality 2.0 is coming to an Android or iOS device near you. In the coming months, we'll start to see applications that go well beyond Pokémon Go or Snapchat selfie whiskers. Essentially, our smartphones will soon gain proper computer vision capabilities that let them interpret the world around them more completely and accurately, rather than imprecisely (and sometimes haphazardly) projecting disembodied virtual objects or skins onto the immediate environment.

Along with technologies such as Google's recently unveiled Lens (which lets mobile devices do things like read street signs and crop unwanted objects out of photos), AR 2.0 is primed to change how we perceive and interact with the physical world around us. This, of course, has implications for brands and marketers, who must find ways to incorporate everything from advertising and branding to shopping assistance and customer service into this new and increasingly robust medium. Here are a few of the most compelling and promising uses of AR 2.0 from 2017 so far, all of which offer a glimpse into how brands might augment their use of the technology in the very near future.


Judging by some early demos on the eve of Apple's AR-optimized iOS 11 announcements, one of the first ways we'll see AR enter our everyday lives will be in the context of restaurant dining, reducing the anxiety of deciding what to eat. A recently released video by AR startup Kabaq certainly reinforces this argument: rather than ordering based on menu descriptions or, at best, tiny pictures of dishes, Kabaq's AR will let diners view nearly photorealistic 3D renderings of a restaurant's offerings in 360 degrees, presented to scale on the plate in front of them. The possibilities extend beyond simply ordering food in restaurants or for delivery; the tech can also help consumers visualize recipes in cookbooks or bring catering menus to life, and the 3D renderings can be shared on social media and other marketing channels. Fifteen restaurants have already signed up to prototype the service, and Kabaq is looking to develop custom applications for brands that would, for instance, let consumers point their phones at product packaging and visualize dishes prepared with a specific product.


Ever find yourself hopelessly lost inside a big-box store, searching for something specific like a wrench or a 20-pack of toilet paper? AR looks to change all that in the near future, as demonstrated by the Lowe's Vision: In-Store Navigation app. Developed by Lowe's Innovation Labs, the app lets customers search for products online, add them to a shopping list and then use their Tango-enabled smartphone to physically locate each item inside the store: computer vision-enabled directional prompts are overlaid onto the real-world store setting, guiding the customer along the most efficient route to everything on their list. Lowe's has already added the feature to 400 of its stores. Is AR guidance better than human guidance? If you've ever been led to the wrong aisle at your local big-box chain, the answer may well be yes. This kind of AR application might work in the wild too, with brands leveraging it outside the context of individual stores. Walking around NYC looking for your favorite brand of kombucha? You'll likely be pointing your phone at your surroundings to home in on the nearest bottle, which might just be at that organic hot dog cart right across the street.


One of the most exciting things about the next wave of AR is its use of computer vision to put virtual objects in their proper context and scale them to their destinations. IKEA, for instance, recently showed off an Apple ARKit app called IKEA Place that aims to help shoppers try out furnishings in the comfort of their homes. Rolling out with iOS 11, the app lets users virtually place furniture in their living space: it senses walls and determines the correct scale of a virtual piece inside an actual living room (the company claims 98% accuracy). To boot, the ability to see how light and shadows fall on one's virtual furnishings helps users visualize how everything looks in its proper context. And if the user likes how something fits, they can add it to a shopping basket and purchase it right then and there.


It's not just homes that will benefit from AR tech; faces will, too. Earlier this year, makeup purveyor Sephora introduced its Virtual Artist iOS app, developed in partnership with AR company ModiFace. The app scans your face, locates your eyes and lips, and lets you try on different looks. Currently, users can experiment with lip color, eyeshadow and false lash styles, or follow virtual tutorials that show them how to properly contour and apply eyeliner, right on their own faces.

L'Oréal also has several branded AR apps (Makeup Genius, Style My Hair) that let users apply makeup and hair looks to their selfies before trying a product. Beyond pure functionality, the beauty behemoth is strategically mining those engagements for data about what customers like and what they don't. The brand recently signed a deal with AR beauty app YouCam to bring its products into YouCam's e-commerce platform, and is installing AR kiosks at beauty counters to track whether consumers buy a product after using the app. AR and beauty brands seem to be a match made in heaven, and the more precise AR gets, the more we'll see it implemented in all kinds of cosmetics-related commerce.


Car manufacturers have been quick to jump aboard the AR bandwagon, and that's no surprise. The tech lets people use their smartphones to view photorealistic 3D models of cars in their own driveway (or living room), walk around them and even step inside to explore the interiors in detail. BMW's app is currently available for Tango-enabled devices, and Audi is working on similar technology using the Unity game engine. The AR 2.0 ability to render vehicles at accurate scale and proportion is essential here, and the precision with which the cars are stabilized in the environment (rather than jerking around like a windsock) makes all the difference when dealing with something as weighty (and expensive!) as a car.

Animations by Eran Mendel
