Project Aria Research Kit: Case Studies + a Call for Applications | Meta Quest Blog

Announced at Connect 2020, Project Aria aims to help us better understand how to build the software and hardware necessary for future AR and AI-powered glasses. While originally designed solely for internal use, our teams quickly realized the value of opening up access to Aria to the broader research community. Since then, we’ve partnered with BMW and others to work on unlocking new forms of hands-free communication, entertainment, and utility. Today, we’re sharing some case studies of exciting work undertaken with Project Aria and inviting members of the research community to apply for the Aria Research Kit (ARK).

University of Bristol

Researchers at the University of Bristol are involved in Ego-Exo4D, a project that captures egocentric data from highly skilled people to learn more about how people interact with and change the world to achieve their goals. Using the rich sensor suite in Aria glasses, researchers can build 3D reconstructions of a space and concurrently track participants and objects as they move through it.

University of Iowa

At the University of Iowa, researchers are using Aria to better understand the environments in which people with hearing loss may experience difficulties. Participants wear Aria glasses in a variety of environments, and because the glasses carry an array of microphones, researchers can pinpoint the direction from which sounds arrive. In the future, this data could disrupt hearing aid technology as we know it today.
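
The direction-finding idea can be illustrated with a textbook two-microphone sketch. This is illustrative only, not the Iowa team's or Aria's actual pipeline, and the function name and parameters are invented: cross-correlate the two channels, convert the peak lag into a time difference of arrival, and recover a bearing from the microphone spacing.

```python
import numpy as np

def estimate_angle(sig_left, sig_right, mic_distance, fs, c=343.0):
    """Estimate a sound's bearing from the delay between two microphones.

    A textbook time-difference-of-arrival sketch: cross-correlate the
    channels, turn the peak lag into a path difference, and recover the
    angle (degrees) from the mic spacing. Positive lag means the sound
    reached the right microphone first.
    """
    corr = np.correlate(sig_left, sig_right, mode="full")
    lag = np.argmax(corr) - (len(sig_right) - 1)  # lag in samples
    delay = lag / fs                              # seconds
    # Path difference = speed of sound * delay; clamp for safety.
    sin_theta = np.clip(c * delay / mic_distance, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))
```

With more than two microphones, the same principle generalizes: each pair contributes a delay estimate, and the array geometry constrains the source direction.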

IIIT Hyderabad

Researchers at IIIT Hyderabad are working on the Driver Intent Prediction Project, a computer vision application for accident prevention. Thanks to Aria’s eye gaze data, researchers can determine where a driver is looking. They can also use Aria’s sensor output to generate point clouds that detect vehicles outside the driver’s field of view.
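
As a toy illustration of the geometry involved (the function, names, and inputs below are invented for this sketch, not taken from the project): given a gaze direction from eye tracking and vehicle positions in the driver's frame, one can flag vehicles that fall outside a cone around the gaze.

```python
import numpy as np

def outside_field_of_view(gaze_dir, points, fov_deg=60.0):
    """Flag 3D points falling outside a cone around the gaze direction.

    gaze_dir: a direction vector from eye tracking.
    points: (N, 3) array of positions relative to the driver.
    Returns a boolean array, True where a point lies outside the
    fov_deg-wide cone centered on the gaze.
    """
    gaze = np.asarray(gaze_dir, dtype=float)
    gaze = gaze / np.linalg.norm(gaze)
    pts = np.asarray(points, dtype=float)
    dirs = pts / np.linalg.norm(pts, axis=1, keepdims=True)
    angles = np.degrees(np.arccos(np.clip(dirs @ gaze, -1.0, 1.0)))
    return angles > fov_deg / 2.0
```

In a real system the points would come from the reconstructed point cloud and the threshold from the driver's measured visual field, but the cone test captures the core idea of "outside the field of view."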

Carnegie Mellon University

Finally, researchers at Carnegie Mellon University’s Robotics Institute are using Aria to develop NavCog, a mobile app designed to solve the challenge of indoor localization and provide audio wayfinding to people who are visually impaired. By working with Project Aria, CMU was able to reduce its dependence on Bluetooth beacons and bring new environments online faster, making NavCog available to more people who need it.

What Is the Aria Research Kit?

The Aria Research Kit (ARK) is a research ecosystem available to approved research partners. ARK partners use Project Aria glasses and tools for a broad range of research topics, including embodied AI, contextualized AI, human-computer interaction (HCI), robotics, and more.

Machine perception services, or cloud APIs, let research partners use Meta’s algorithms so they can focus on what matters most for their research. ARK includes a companion app for Android and iOS, as well as tools on web and desktop. The SDK lets you stream data from the Aria glasses to a PC in real time for prototyping. Additionally, we open sourced a face and license plate blurring model called EgoBlur so researchers can preserve the privacy of individuals in their image data.
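
The blurring step itself is simple once a detector supplies bounding boxes. Below is a minimal NumPy mean-filter sketch of that anonymization step; the function and its signature are invented for illustration, and EgoBlur's real models and interface differ (detection of faces and plates is the hard part and is out of scope here).

```python
import numpy as np

def blur_regions(image, boxes, ksize=9):
    """Mean-blur each (x0, y0, x1, y1) region of an H x W x C image.

    A stand-in for the anonymization step that a face / license plate
    detector's boxes would drive.
    """
    out = image.astype(np.float32).copy()
    pad = ksize // 2
    for x0, y0, x1, y1 in boxes:
        region = out[y0:y1, x0:x1]
        # Pad with edge values so the blur is defined at region borders.
        padded = np.pad(region, ((pad, pad), (pad, pad), (0, 0)),
                        mode="edge")
        blurred = np.zeros_like(region)
        h, w = region.shape[:2]
        for dy in range(ksize):
            for dx in range(ksize):
                blurred += padded[dy:dy + h, dx:dx + w]
        out[y0:y1, x0:x1] = blurred / (ksize * ksize)
    return out.astype(image.dtype)
```

Pixels outside the supplied boxes are left untouched, so the rest of the frame remains usable for research.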

We’re committed to open source and have released a number of datasets that leverage egocentric data from Project Aria glasses. We also encourage the research community to open source the datasets and models that they build using Aria.

For more information, visit projectaria.com. To apply for ARK, click here.