Facebook is working on AR glasses that could change the way we hear

2020-09-05 05:00:00

Facebook imagines you’d wear the AR glasses in a restaurant where there’s a lot of background noise competing with your conversation. (Jennifer Leahy)

When we talk about augmented reality in 2020, that term typically refers to our vision. We picture digital objects appearing in the real world, or handy head-up displays that overlay mapping information onto our field of view so we can find the nearest public restroom the same way the Terminator would. But AR isn’t all about the eyes, and Facebook’s Reality Labs (FRL) research division is working on a pair of augmented reality glasses that will change how wearers hear as well as see.

It’s not a totally new project for the company. Facebook has been working on creating immersive audio experiences since at least 2014 when it spent $2 billion to buy Oculus, the VR powerhouse. While immersive audio is great for gaming, bringing it into the realm of augmented reality glasses that people wear in the real world could greatly expand its usefulness.

Facebook made it very clear that these glasses were just for demo purposes and were not representative of any potential products. (Jennifer Leahy)

The primary pitch for the new technology involves using an array of microphones in the frame of the glasses to focus on a person as they’re speaking. The glasses could then identify the source of the sound you’re trying to hear and use a combination of hardware and software solutions to isolate it from background noise. The final result isn’t unlike what you might expect from a hearing aid.
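
The source-isolation idea described above resembles classic delay-and-sum beamforming: if you know which direction a voice is coming from, you can time-align each microphone’s copy of it and average them, reinforcing the speech while uncorrelated background noise partially cancels. Below is a minimal NumPy sketch of that principle, with a simulated two-microphone array and a made-up 3-sample arrival delay; it is an illustration of the general technique, not Facebook’s actual implementation.

```python
import numpy as np

def delay_and_sum(signals, delays_samples):
    """Align each microphone signal by its steering delay and average.

    signals: array of shape (n_mics, n_samples)
    delays_samples: per-mic arrival delay (in samples) to undo
    """
    n_mics, n = signals.shape
    out = np.zeros(n)
    for sig, d in zip(signals, delays_samples):
        out += np.roll(sig, -d)  # advance the signal to undo its arrival delay
    return out / n_mics

# Simulate a 1 kHz "voice" reaching two mics 3 samples apart,
# with independent noise at each mic.
fs = 16_000
t = np.arange(fs) / fs
voice = np.sin(2 * np.pi * 1000 * t)
rng = np.random.default_rng(0)
mic0 = voice + 0.5 * rng.standard_normal(fs)
mic1 = np.roll(voice, 3) + 0.5 * rng.standard_normal(fs)

# Steering toward the voice (delays of 0 and 3 samples) keeps the
# speech coherent while the two noise signals partially cancel.
focused = delay_and_sum(np.stack([mic0, mic1]), [0, 3])
noise_before = np.mean((mic0 - voice) ** 2)
noise_after = np.mean((focused - voice) ** 2)
```

With two mics the noise power roughly halves; a larger array in a glasses frame would suppress more, and a real system would combine this with adaptive filtering rather than fixed integer delays.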

By adding glasses and AI to the mix, Facebook hopes to give users more specific control over what sounds make it through and which ones get suppressed. Because AR glasses would presumably have a built-in camera, they could use computer vision to analyze a scene in real time to track the wearer’s gaze. That gives the glasses a strong clue about where to focus the microphones. And since people don’t always stare directly at the subject they’re listening to, AI would need to figure out whether the attention should stay where it is or if it should shift using variables like your head and eye movement.

The uneven walls in this chamber prevent sound waves from bouncing around the space. (Facebook FRL)

While that research could help people hear better, the company is also working on improving its “audio presence” technology, which attempts to mix digital sounds coming through headphones with the environmental noise happening around you in the real world.

Audio presence isn’t a totally new concept, either—you can experience some semblance of it in VR headsets such as the Valve Index. Facebook, however, aims to make audio so convincing that a person couldn’t distinguish between sound from headphones and sound from the world around them.

The company is currently developing the technology in massive rooms called anechoic chambers, which have jagged foam padding lining every surface to absorb every bit of errant sound. The room in which you’re standing profoundly affects the sounds you hear thanks to variables such as ceiling height, distance from the walls, and even the finish on the floors. With those variables suppressed, Facebook is working on isolating how audio cues tell us about our surroundings.

In the real world, sound typically hits one ear before the other—and that first ear will hear it with slightly more volume than the second. By replicating those slight differences in headphones, the company can better trick a user’s ears and brain.
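
Those two cues have names: the interaural time difference (the lag between ears) and the interaural level difference (the volume gap). A simple way to fake them in headphones is to delay and attenuate the far ear’s copy of a mono signal. The sketch below uses Woodworth’s classic spherical-head approximation for the time difference; the 0.7 gain for the shadowed ear is an illustrative placeholder, not a measured head-shadow model, and `render_binaural` is a hypothetical helper name.

```python
import numpy as np

def render_binaural(mono, fs, azimuth_deg, head_radius=0.0875, c=343.0):
    """Pan a mono signal with crude interaural time/level differences.

    azimuth_deg: source angle, positive = to the listener's right.
    ITD follows Woodworth's spherical-head formula; ILD is illustrative.
    """
    az = np.radians(azimuth_deg)
    itd = head_radius / c * (abs(az) + np.sin(abs(az)))  # seconds of lag
    delay = int(round(itd * fs))                         # far-ear lag, samples
    near = mono
    far = np.concatenate([np.zeros(delay), mono[:len(mono) - delay]])
    far = far * 0.7  # illustrative level drop from head shadowing
    if azimuth_deg >= 0:   # source on the right: right ear is the near ear
        left, right = far, near
    else:
        left, right = near, far
    return np.stack([left, right])

fs = 16_000
mono = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)
stereo = render_binaural(mono, fs, azimuth_deg=45)  # source 45° to the right
```

Convincing “audio presence” goes well beyond this: real systems use full head-related transfer functions, which also capture how the outer ear filters different frequencies, which is why individual ear shape matters so much.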

Everyone’s ear structure is different, which makes tuning audio devices to work perfectly a difficult task. (Facebook FRL)

Implementing these effects is somewhat straightforward in a perfectly silent room, but doing it in the real world will take a lot more work. Right now, Facebook is working on tech to control for not just the room variables, but also the size and structure of people’s individual ears.

In the trials, Facebook is using an open-air headphone design, which is typical for audio presence demos. Rather than capturing and digitizing every sound from the environment and trying to mix it with digital audio, the natural sounds come in as usual and the headphones only handle the signal from the device. Many high-end audiophile headphones use a similar open design so listeners can hear the room in addition to the tunes.

For now, these technologies are still in the research phase. Facebook hasn’t been secretive about the fact that it’s working on AR glasses, but now we’re starting to see more concrete technology that may end up in the final hardware. The company still hasn’t said when—or if—any of this will ever hit the market in one of its gadgets, but we’ll hear more about Facebook’s VR ambitions at the Oculus Connect conference starting on September 16th.
