wondARland

Device 1

Device 2

About
The project proposes a simple and quick approach to synchronize Augmented Reality sessions on handheld devices.
The result of the project is a simple mobile application that implements this methodology to synchronize virtual objects in the environment across multiple devices.
Project Info
Role: Co-Author and Programmer
Team Size: 2
Time Frame: 20 Weeks
Engine: Unity (C#)
Introduction
Collaborative Augmented Reality (AR) refers to a technology that allows multiple users to share a common augmented reality environment in real-time. It demands quick and seamless synchronization between multiple connected devices.
To picture what synchronized AR sessions look like, imagine both devices stacked so that they overlap when their AR sessions are initialized: they would share a common origin and therefore be synchronized. This project aims to achieve the same synchronization without requiring the devices to overlap.
The proposed methodology uses a QR code displayed on the host device as a shared reference point to which the client devices synchronize their sessions.
System Overview
The diagram below shows the key events in the synchronization process. The host starts an AR session while the client prepares to scan the QR code on the host's device. Once the client scans the QR code, two processes take place:
1. The client establishes a connection with the host using the IP address decoded from the QR code.
2. Simultaneously, the client adds the scanned QR code as an image to its reference library for tracking.
Once both steps are completed, the synchronization process starts.
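The endpoint exchange in step 1 can be kept as simple as encoding the host's address into the QR payload. A minimal sketch, assuming a hypothetical `<ip>:<port>` payload format (the actual payload layout used by the app is not shown here):

```csharp
using System;
using System.Net;

static class QrPayload
{
    // Hypothetical payload format: "192.168.1.5:7777" (host IP and port).
    // The host encodes this string into the QR code; the client decodes it
    // after scanning and uses it to open the connection.
    public static IPEndPoint Parse(string payload)
    {
        string[] parts = payload.Split(':');
        if (parts.Length != 2)
            throw new FormatException("expected <ip>:<port>");
        return new IPEndPoint(IPAddress.Parse(parts[0]), int.Parse(parts[1]));
    }
}
```

For example, `QrPayload.Parse("192.168.1.5:7777")` yields the endpoint the client connects to, while the same scanned image is simultaneously registered for tracking.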

Synchronization Step
The methodology requires the client device to track an image presented on the host device's screen. While the image is being tracked, we obtain its position and rotation relative to the client device; together, these constitute the pose of the tracked image. This pose data is only meaningful in the client's coordinate system, since the image is tracked by the client device; it will not match the host device's pose because the two devices have different coordinate origins.
Using the pose data, we can synchronize the devices through two approaches.
The first approach involves shifting the coordinate origin of both devices to a common reference point, which is the position and rotation of the QR code.
The second approach requires the client device to use the host device's current position and rotation together with the tracked image's pose to construct the transformation matrix that converts the host device's pose data into the client's local coordinates.
Even if the device loses sight of the environment, the positions of the objects will be recalculated once the environment comes back into view. This is a result of the SLAM implementations in both ARCore and ARKit, which store a map of the environment in memory.
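The second approach's transformation matrix can be built from the QR code's pose as observed in each frame. A minimal standalone sketch using `System.Numerics` (whose row-vector `Matrix4x4` convention differs slightly from Unity's); all names are illustrative, and the QR poses are assumed to be known in both frames:

```csharp
using System.Numerics;

static class HostToClientMatrix
{
    // Builds a matrix that maps host-frame points into the client frame,
    // given the QR code's pose as observed in each device's own frame.
    public static Matrix4x4 Build(
        Vector3 qrPosInHost, Quaternion qrRotInHost,
        Vector3 qrPosInClient, Quaternion qrRotInClient)
    {
        // Pose matrices: QR-local coordinates -> each device's frame.
        Matrix4x4 qrToHost = Matrix4x4.CreateFromQuaternion(qrRotInHost);
        qrToHost.Translation = qrPosInHost;
        Matrix4x4 qrToClient = Matrix4x4.CreateFromQuaternion(qrRotInClient);
        qrToClient.Translation = qrPosInClient;

        // host -> QR-local -> client. With System.Numerics' row-vector
        // convention, the first transform applied is the left factor.
        Matrix4x4.Invert(qrToHost, out Matrix4x4 hostToQr);
        return hostToQr * qrToClient;
    }
}
```

`Vector3.Transform(hostPoint, matrix)` then yields that point in client coordinates.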
Vector3 posOffset = qrCode.position - hostPos;
Quaternion rotOffset = qrCode.rotation * Quaternion.Inverse(hostRot);
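Once the QR code's pose is known in both frames, a host-space position can be mapped into client space directly from those offsets. A minimal sketch that runs outside Unity by using `System.Numerics` (whose `Vector3`/`Quaternion` closely mirror Unity's); the names are illustrative:

```csharp
using System.Numerics;

static class PoseSync
{
    // Maps a position from the host's coordinate frame into the client's,
    // given the QR code's pose as observed in each frame.
    public static Vector3 HostToClient(
        Vector3 hostPoint,
        Vector3 qrPosInHost, Quaternion qrRotInHost,
        Vector3 qrPosInClient, Quaternion qrRotInClient)
    {
        // Rotation carrying host-frame directions into client-frame ones.
        Quaternion rotOffset = qrRotInClient * Quaternion.Inverse(qrRotInHost);
        // Re-express the point relative to the QR code, rotate it into the
        // client frame, and re-anchor it at the QR code's client-side pose.
        return qrPosInClient + Vector3.Transform(hostPoint - qrPosInHost, rotOffset);
    }
}
```

Applying this to every object position received from the host keeps both sessions consistent without ever moving either device's session origin.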

Achievements
This project was my first venture into academic publishing. I successfully transformed it into a research paper and had the honor of presenting it at a Scopus-indexed international conference1. Later in the year, the paper was published in the journal2. This achievement stands as one of my proudest moments, as it marked a significant milestone in my academic journey. Through this process, I gained invaluable insights and learned a great deal about research methodologies, paper writing, and the rigorous process of academic publishing.
1. Proceedings of the 2nd International Conference on Challenges in Information, Communication, and Computing Technology (ICCICCT 2024), April 26th & 27th, 2024, Namakkal, Tamil Nadu, India
2. Challenges in Information, Communication and Computing Technology
Check out the repository at github.com