Google has made multiple attempts to break into the augmented reality market. Commonly abbreviated as AR, augmented reality has the potential to revolutionize society if implemented correctly, yet no technology firm has fully mastered the field. While Google and others rely on software alone, Apple has opted for specialized hardware, such as the LiDAR scanner found in recent iPads and iPhones. Even so, the industry is coming closer and closer to ideal AR, and Google remains at the forefront with its augmented reality software development kit (SDK) and technology.
As the name suggests, ARCore is Google’s SDK for developing augmented reality applications. It debuted in 2018, and since then, it has found its way to a broad variety of devices from a number of different brands. It’s compatible with most high-end and mid-range Android smartphones, and it’s been put to several fascinating uses, some of which are helpful and others of which are more gimmicky.
ARCore is not Google’s first attempt at augmented reality. As some have noted, Google’s first foray into the technology was Project Tango. ARCore and Project Tango share many similarities, but they also differ in significant ways. In particular, Project Tango required specialized hardware: an array of cameras and sensors, including a fisheye camera. When Project Tango first emerged, phones with that many rear sensors, let alone the camera technology Tango needed, were quite rare.
ARCore, on the other hand, is more appealing because it functions without any specialized hardware. Your phone’s camera and built-in motion sensors are all it needs, so the ARCore augmented reality experience works properly on any standard smartphone. Because ARCore scales better and is compatible with a far wider variety of mobile devices, Google discontinued Project Tango and replaced it with ARCore.
How does ARCore operate?
If you’re curious about how ARCore accomplishes its magic, this primer on its core principles is a fantastic place to start. There’s more to the story than what’s on that page, but it should give you a fair idea of how it works. Unlike Project Tango, which needed specific sensors in order to function, ARCore can run on any consumer device with a standard camera configuration. That’s right: it does everything with only your phone’s camera, its motion sensors (the gyroscope and accelerometer), and some of Google’s software ingenuity. To properly observe and understand what the camera is seeing, and to build an augmented reality experience on top of that information, ARCore relies on a handful of fundamental concepts.
ARCore uses a technique called simultaneous localization and mapping (SLAM) to determine the phone’s precise location in its environment. It identifies visually distinctive elements in the camera image and uses them as feature points to calculate position shifts: whether the device has moved, and what its new location looks like. Those feature points are then used to detect planes, that is, horizontal and vertical surfaces, which give the algorithm more context about the scene.
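To make the feature-tracking idea concrete, here is a minimal Python sketch, not ARCore’s actual API, of how matched feature points from two consecutive frames can be turned into a motion estimate. The function name and the 2D pure-translation model are illustrative assumptions; real SLAM systems match hundreds of points and solve for a full six-degree-of-freedom pose.

```python
# Toy illustration of the feature-point idea behind SLAM (NOT ARCore's API):
# if the same feature points are found in two consecutive camera frames,
# their average displacement hints at how the view has shifted.

def estimate_translation(prev_points, curr_points):
    """Estimate an (dx, dy) image shift from matched 2D feature points."""
    if len(prev_points) != len(curr_points) or not prev_points:
        raise ValueError("need equally sized, non-empty point lists")
    n = len(prev_points)
    dx = sum(c[0] - p[0] for p, c in zip(prev_points, curr_points)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_points, curr_points)) / n
    return dx, dy

# Three feature points, each observed in the previous and current frame.
prev = [(10.0, 20.0), (30.0, 40.0), (50.0, 60.0)]
curr = [(12.0, 19.0), (32.0, 39.0), (52.0, 59.0)]
print(estimate_translation(prev, curr))  # (2.0, -1.0)
```

Averaging over many points is what makes the estimate robust: a single mismatched point barely moves the mean, while a consistent shift across all points signals real camera motion.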
Visual data is combined with inertial data from the device’s IMU to estimate the camera’s pose relative to the surroundings over time. Developers can use this data and context to render a layer over the camera feed that integrates seamlessly into the actual environment. In addition, ARCore estimates the scene’s lighting, so a virtual object can be rendered brighter or darker to match the amount of light ARCore infers is striking the plane it sits on.
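The visual-inertial blend described above can be sketched with a simple complementary filter: the IMU is fast but drifts, while visual estimates are slower but stable, so each corrects the other. This is a hypothetical illustration of the principle, with an assumed blend factor, not ARCore’s internal algorithm.

```python
# Toy sketch of visual-inertial fusion via a complementary filter
# (an assumption for illustration, NOT ARCore's internal implementation).

def fuse_orientation(prev_angle, gyro_rate, dt, visual_angle, alpha=0.98):
    """Blend an integrated gyroscope reading with a visual angle estimate.

    alpha near 1.0 trusts the gyro in the short term; the small visual
    term slowly pulls the estimate back, correcting accumulated drift.
    """
    gyro_angle = prev_angle + gyro_rate * dt  # dead-reckon from the IMU
    return alpha * gyro_angle + (1.0 - alpha) * visual_angle

angle = 0.0
# The gyro reports a slightly biased 10.5 deg/s while the true rotation
# rate (what the visual tracker keeps observing) is 10 deg/s.
for step in range(1, 6):
    true_angle = 10.0 * step * 0.1
    angle = fuse_orientation(angle, 10.5, 0.1, true_angle)
print(angle)  # closer to the true 5.0 than pure gyro integration (5.25)
```

Pure gyro integration would end at 5.25 degrees after half a second of biased readings; the fused estimate lands between that and the true 5.0, and over longer runs the visual correction keeps the drift bounded instead of growing without limit.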