Google has been quietly working to improve its augmented reality platform, ARCore, since its official launch early last year. Now, the company says it's ready to unveil some of the next-generation upgrades to depth detection and physics it's achieved, which promise to make AR experiences feel much more realistic in the future.
The upgrades, part of an overhaul to ARCore's Depth API, will soon allow developers to perform what's known as occlusion, in which a virtual object can be blocked from view by real-world objects in a scene. Place a virtual cat in your living room, for instance, and you can see it disappear from view when you angle your camera so that a bed, a table, or some other object sits in between.
The result is a more believable scene, because the depth detection going on under the hood means your smartphone better understands every object in a scene and how far apart those objects are from one another. Google says it's able to do this by optimizing existing software, so you won't need a phone with a specific sensor or type of processor. It all happens on the device itself, too, without relying on any help from the cloud. So long as you have a phone that supports ARCore, which includes pretty much every new Android phone released in the last few years, you'll be able to access these new features.
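The core idea behind depth-based occlusion can be sketched in a few lines: the device estimates how far away the real world is at each pixel, and a virtual object's pixel is drawn only if it is closer to the camera than the real surface behind it. This is a hypothetical, simplified illustration of the concept, not Google's actual Depth API; the function name and one-dimensional "depth maps" here are invented for clarity.

```python
# Illustrative sketch of depth-based occlusion (not the real ARCore API).
# Depths are in meters from the camera; pixels are simplified to a 1-D row.

def occlude(real_depth, virtual_depth):
    """Return a per-pixel visibility mask for a virtual object.

    real_depth:    estimated depth of the real scene at each pixel
                   (what a depth map like ARCore's provides).
    virtual_depth: depth of the rendered virtual object at each pixel,
                   or None where the object does not cover that pixel.
    A virtual pixel is visible only when it is closer to the camera
    than the real-world surface at the same pixel.
    """
    mask = []
    for real, virtual in zip(real_depth, virtual_depth):
        visible = virtual is not None and virtual < real
        mask.append(visible)
    return mask

# A virtual cat 1.5 m away, partly hidden behind a table edge 1.0 m away:
scene = [3.0, 3.0, 1.0, 1.0]    # real-world depth per pixel
cat   = [None, 1.5, 1.5, None]  # virtual object's depth per pixel
print(occlude(scene, cat))      # [False, True, False, False]
```

In the example, the cat's second pixel is drawn because the wall behind it is farther away, while its third pixel is hidden because the table is closer to the camera, which is exactly the "disappearing behind furniture" effect described above.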
We've seen occlusion on mobile phones before. Pokémon Go creator Niantic showed off a video of an occlusion demo featuring a tiny virtual Pikachu darting around an urban plaza, dashing in between objects and blending seamlessly with the environment. That was in July 2018. But it was just a video, not a demo that members of the press were able to see running on a device in real time.
During a meeting with members of Google’s AR division last week, I was able to play around with real-time demos the team built to show off the new depth technology. Granted, it was in a test environment Google had arranged for the demo, but the technology does work. In fact, it’ll be available starting today, as part of updates to home design app Houzz and Google’s own AR in Search feature.
Furniture you find on Houzz that's part of the app's "view in my room" option will now support occlusion. Google says more than 200 million Android devices will also get occlusion for any object that has an AR model in Google Search.
I did get to see some demos of the Depth API's new capabilities that won't be turning up in commercial apps or services today; Google says those advancements will be made available in the future, after it works more closely with developers and other collaborators to polish some of its approaches.
These go beyond occlusion and into more realistic physics and 3D mapping. Google has developed a way for AR objects to interact with the real world more realistically, move through an environment the way a real-world 3D object would, and interact with surfaces like you might expect physical matter would. For instance, in the demo I got to experience, I was able to create colorful shaped blocks out of thin air that could bounce off virtually any surface, even the handlebars of an exercise bike.
Google also made a mini-game of sorts showing off the ability for AR objects to move through an environment by going around and over real-world objects and the new Depth API’s surface interaction capabilities. It involved a cooking robot that engages in a food fight with you that takes into account the furniture and walls of the environment, with desserts leaving realistic splatters on surfaces.
Google isn't making these demos available to the public, but the company says it hopes app makers will build similarly improved experiences once it's ready to release the updated Depth API to all developers. The company doesn't have a timeline for when it expects to release this toolset more broadly, but these capabilities will likely start showing up in apps and AR web experiences sometime next year.