5 ARCore Concepts Android App Developers Need to Catch on ASAP

ARCore, Google’s augmented reality counterpart to Apple’s ARKit, is still hot news. In the first couple of months of its existence, Android app developers have used it to create hundreds of AR apps, and users should prepare for an even bigger wave of apps to deluge the Google Play Store. Although augmented shopping and gaming apps make up the greatest share of all AR apps, many original app ideas are emerging to test the newly developing AR platform for Android phones. To stand up to the competition and create Android apps with a truly immersive experience, Android app developers need to catch up on the recent releases and features.
ARCore Concepts
ARCore allows users to augment the real world through the phone’s camera. By tracking the user’s position in real time and projecting digital objects onto the real-world landscape, an app draws users into a more absorbing reality. ARCore version 1.0, launched a few months back, was part of the first SDK, giving Android app developers the tools to publish new apps in the Google Play Store and bringing augmented reality to over 100 million AR-compatible devices. Since then, ARCore has been updated to version 1.2, which includes improved features such as cloud anchors, vertical plane detection, and Sceneform. So, what are the five core concepts that should be the focus of Android app developers at the current stage of AR development?

Keeping Pace with Smartphones  

ARCore doesn’t work on every single Android phone, so Android app developers should keep track as more smartphone brands and models become AR-supported. At the moment, most flagship phones are built with AR capabilities, including the Google Pixel, as well as models from Samsung Galaxy, LG, and OnePlus. Sony, Motorola, and Huawei are next in line to become AR-capable. Some smartphones require additional work on the manufacturer’s side, so it’s important to follow this progress, especially if you are developing for a specific audience segment that is accustomed to devices that aren’t AR-supported yet.

Shared User Experiences with Cloud Anchors

Cloud anchors enable multiplayer AR experiences. A cloud anchor works by syncing an object’s location in 3D space to the cloud. If you and another person use the same AR application – such as a gaming app – cloud anchors will help you both see the same virtual object in the same location. The really big news is that cloud anchors don’t work only on Android devices; Google is working to expand them to iOS phones. An example of a cross-platform tool that allows Android app developers to use the same codebase for Android and iOS devices is ViroReact, a development platform that supports both ARCore and Apple’s ARKit, matching the AR capabilities of Android phones, iPhones, and iPads.
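The host-and-resolve flow described above can be sketched with the ARCore SDK’s cloud anchor API. This is a minimal sketch, not a complete app: it assumes a running AR `Session` on a supported device, and the networking step that transfers the anchor ID between users is only indicated in a comment.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session

// Cloud anchors must be enabled in the session config first.
fun enableCloudAnchors(session: Session) {
    val config = Config(session)
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
    session.configure(config)
}

// Device A: host a local anchor so its pose is synced to the cloud.
fun shareAnchor(session: Session, localAnchor: Anchor): Anchor {
    val hostedAnchor = session.hostCloudAnchor(localAnchor)
    // Poll hostedAnchor.cloudAnchorState each frame; once it reports
    // SUCCESS, send hostedAnchor.cloudAnchorId to the other device
    // over your own networking layer.
    return hostedAnchor
}

// Device B: resolve the same anchor from the ID it received.
fun joinSharedScene(session: Session, cloudAnchorId: String): Anchor =
    session.resolveCloudAnchor(cloudAnchorId)
```

Both devices then attach their virtual object to the resolved anchor, which is what makes the object appear in the same real-world spot for everyone.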

Vertical Plane Detection

Imagine using your phone for multidimensional object detection in the real world. The first version of ARCore included only horizontal plane detection, but with version 1.2 it’s possible to detect vertical surfaces and place augmented objects on walls. This feature enables augmented interior design experiences, such as hanging pictures or placing furniture against a wall, or creating portals to alternate dimensions.
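Enabling wall detection is a one-line change to the session configuration. The sketch below assumes a running AR `Session`; the second function shows how detected walls surface as trackables of type `VERTICAL`.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Plane
import com.google.ar.core.Session

// Opt in to vertical (and horizontal) plane detection, available
// since ARCore 1.2.
fun enableVerticalPlanes(session: Session) {
    val config = Config(session)
    config.planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
    session.configure(config)
}

// Detected walls then appear among the session's tracked planes.
fun wallPlanes(session: Session): List<Plane> =
    session.getAllTrackables(Plane::class.java)
        .filter { it.type == Plane.Type.VERTICAL }
```

Once a vertical plane is tracked, an anchor created on it behaves exactly like one on the floor, so placing a picture frame on a wall uses the same code path as placing furniture on the ground.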

Augmented Images

The whole AR experience is based on an image-recognition system that enables building image-responsive apps. By pointing the camera at an object, ARCore can detect 2D images in the user’s environment, such as posters or product packaging, helping the user bring a picture to life or view product assembly instructions as augmented images. ARCore can only track up to 20 images in the same environment, and the images must be flat, static, and at least 15cm x 15cm in size. The ARuler app, for instance, is an AR measurement tool that can measure real-world objects placed on a horizontal or vertical surface when the user points the phone camera at them.
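Setting this up means registering the reference images in an `AugmentedImageDatabase` and attaching it to the session config. The sketch below is an illustration, not a full app: `"poster.png"` is a placeholder asset name, and the 0.15m width reflects the article’s 15cm minimum size.

```kotlin
import android.content.Context
import android.graphics.BitmapFactory
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Session

fun configureAugmentedImages(context: Context, session: Session) {
    // Load the reference image; "poster.png" is a placeholder asset.
    val bitmap = context.assets.open("poster.png")
        .use(BitmapFactory::decodeStream)

    val imageDatabase = AugmentedImageDatabase(session)
    // The optional third argument is the image's physical width in
    // meters; supplying it helps ARCore estimate the pose faster.
    imageDatabase.addImage("poster", bitmap, 0.15f)

    val config = Config(session)
    config.augmentedImageDatabase = imageDatabase
    session.configure(config)
}
```

At runtime, each frame then reports matched images as `AugmentedImage` trackables, and the app can anchor its overlay content to the image’s center pose.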

An Easy Route From 2D to 3D with Sceneform SDK

Augmented reality apps require coding skills for a three-dimensional environment. If you are an experienced gaming app developer, the jump to augmented reality will not be a big challenge. However, if you are used to coding for the 2D world, you may encounter some difficulties. To encourage building apps in 3D, Google has developed the Sceneform SDK, an Android Studio plugin and API that helps developers create 3D user interfaces without first having to master complex 3D game engines.

The hundreds of AR projects already underway in the Google Play Store indicate that Google may finally be catching up with Apple’s leading position in the AR domain. The latest features added to ARCore, along with Google Lens, the real-world search tool that provides information about an object based on an image taken with a smartphone camera, promise exciting new avenues for Android app developers to turn into practical solutions.

Meanwhile, you can also check out our article on Choosing a Career in Mobile App Development.