SLAM, what is it?

Brandon Mendoza
3 min read · May 8, 2021


When starting this research I honestly had no idea what to expect, what exactly SLAM could entail, or even how it connected to AR and VR. SLAM stands for Simultaneous Localization And Mapping, and it is pretty much what it sounds like: a way to build a live map of an object’s surroundings while simultaneously tracking where the object is within that map.

As I was researching I came upon a method for SLAM-ing that utilizes LiDAR sensors to build a geographic map. It turns out that LiDAR is a technology that has been used for many years on airplanes to create a visual map of the land they were flying over. The way it works is that it shoots pulses of laser energy outward and measures how long each pulse takes to return, and it calculates the distance that way. This is a lot like how sonar works, except sonar uses sound instead of light, similar to how bats use echolocation. The new iPhone 12 Pro actually included a LiDAR sensor in its camera module. According to Apple, “LiDAR can instantly understand the surfaces in your space. So AR apps can get right to work analyzing the scene and creating custom experiences.” I have included some links to videos that show off the potential that LiDAR technology can have while working in combination with other systems.
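The time-of-flight idea above boils down to one formula: the pulse travels to a surface and back, so the one-way distance is half of the speed of light times the round-trip time. Here is a minimal sketch of that calculation (the function name and the example timing are my own, just for illustration):

```python
# Time-of-flight ranging, as a LiDAR sensor does it:
# distance = (speed of light * round-trip time) / 2,
# halved because the pulse travels out AND back.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_pulse(round_trip_seconds: float) -> float:
    """Distance in meters to the surface that reflected the pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that comes back after about 66.7 nanoseconds
# hit something roughly 10 meters away.
print(distance_from_pulse(66.7e-9))
```

The tiny round-trip times involved (tens of nanoseconds for room-scale distances) are why LiDAR feels "instant": the sensor can fire and time thousands of pulses per second.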

I think this has a lot of potential for improving AR experiences. If the phone can essentially create a live geo-map of the world around it instantly, it gains much better awareness of its surroundings. I think this can also improve the capability and reliability of self-driving cars; while doing my research I found out that many self-driving cars already use LiDAR to assist their systems. This also has big implications for the future of VR and AR technology. If you start putting LiDAR sensors all over a VR headset or a pair of AR glasses, the device can always know its surroundings. This would make the experience seamless and more realistic, and prevent glitches and tracking errors in the elements the device renders, creating a more stable experience for the user.

Visual representation of LiDAR scans

In the reading, there were many different technologies that allow devices to track location, surroundings, movement, and acceleration. I think that combining many of these different sensors will be very important for making the most seamless and stable experiences in future AR and VR technologies. Many smartphones already carry several of the sensors the reading included, such as an accelerometer, a gyroscope, and a GPS sensor. These are what apps like Instagram, Snapchat, and Facebook use to run their AR filters.
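One simple way to picture how those sensors get combined is a complementary filter, a common fusion technique (my example here, not something from the reading): the gyroscope is fast but slowly drifts, the accelerometer is noisy but drift-free, so you blend them. The weights below are illustrative, not tuned values.

```python
# A minimal sensor-fusion sketch: blend an integrated gyroscope
# rate (responsive, but drifts over time) with the accelerometer's
# tilt estimate (noisy, but anchored to gravity).
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Return a fused tilt-angle estimate in degrees.

    angle:       previous fused estimate (degrees)
    gyro_rate:   gyroscope angular rate (degrees/second)
    accel_angle: tilt angle implied by the accelerometer (degrees)
    dt:          time step (seconds)
    alpha:       how much to trust the gyro (illustrative value)
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Example step: previous angle 0°, gyro says 10°/s over 10 ms,
# accelerometer says we are tilted 1°.
angle = complementary_filter(0.0, 10.0, 1.0, 0.01)
print(angle)  # 0.98 * 0.1 + 0.02 * 1.0 = 0.118
```

Real AR frameworks use far more sophisticated fusion than this, but the core idea is the same: lean on each sensor for what it does best.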
