Bose CEO Lila Snyder on the Next Revolution in Sound

by Tyler Wells Lynch

Lila Snyder, CEO of Bose, speaks at Discover Experiential AI. (Photo by Heratch Ekemekian)

As one of the most iconic brands in audio, Bose is no stranger to revolutionary technologies. At several points in its 50+ year history, it helped create them. The company’s ethos is perhaps best embodied in the words of its founder, Amar Bose: “You have to have the courage to be different. You can never do anything better unless it’s different.”

For CEO Lila Snyder, those words explain how Bose was able to make the successful leap from analog to digital — a transition that left many audio companies in the lurch. Now Snyder sees another revolution on the horizon, one that will be every bit as transformative as the move to digital.

Artificial intelligence (AI) promises to change the way we experience sound. Through contextual awareness, selective amplification, and immersive acoustics, AI will enhance the way people experience music, audio, and the very sound of our environments.

Speaking at Discover Experiential AI, the inaugural event of the Institute for Experiential AI (EAI) at Northeastern University, Snyder offered a sneak peek into the ways her company is using AI to make all this and more a reality. She stressed the need to look beyond the immediate demands of technical talent to examine the “golden questions” that drive innovation — questions that only form in collaborative environments that span industry, academia, and domain expertise. That’s also why Bose partnered with EAI: to help unearth the potential buried in the massive data sets made possible by the digital revolution.

To round out her talk, Snyder offered three examples of how Bose is using AI to advance its mission.

1. Contextual Awareness

Data helps sound engineers get a better sense of the environments in which people listen to music or podcasts or take phone calls. Noise-canceling technology has been around for a few decades, allowing customers to selectively tune out the din of noisy streets, subway cars, or airplanes. With Active Sense technology, headphones can trigger noise cancellation when moving from a quiet setting to a noisy one, such as stepping out of your home onto a busy street.
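The transition-detection idea can be illustrated with a toy sketch. Bose has not published how Active Sense works; the thresholds, the loudness estimate, and the hysteresis logic below are all assumptions chosen to show the general pattern of switching noise cancellation based on ambient level without rapid toggling.

```python
import math

# Hypothetical illustration only -- not Bose's Active Sense algorithm.
# Estimate ambient loudness per frame and switch noise cancellation (ANC)
# with hysteresis, so a brief loud sound doesn't cause rapid on/off flicker.

NOISY_DB = 70.0   # assumed level above which the environment counts as noisy
QUIET_DB = 55.0   # assumed level below which it counts as quiet again

def rms_db(samples):
    """Loudness of one frame of ambient samples, in (relative) decibels."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-9))

def update_anc(anc_on, samples):
    """Decide whether noise cancellation should be active for this frame."""
    level = rms_db(samples)
    if not anc_on and level > NOISY_DB:
        return True        # e.g. stepping onto a busy street: engage ANC
    if anc_on and level < QUIET_DB:
        return False       # back in a quiet room: relax ANC
    return anc_on          # inside the hysteresis band: keep current state
```

The gap between the two thresholds is what prevents the mode from chattering when the ambient level hovers near a single cutoff.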

The same technology can be used to control dynamics in home audio systems. Most people are familiar with the irritating experience of volume spikes during commercial breaks or volume dips during dialogue sequences in movies. Active Sense powered by AI allows sound systems to recognize those transitions and ride the volume knob accordingly.
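“Riding the volume knob” can be sketched the same way. The leveler below is a hypothetical illustration, not Bose’s implementation: it nudges playback gain toward an assumed target loudness, with a per-frame step limit so commercial-break spikes are tamed gradually rather than with audible pumping.

```python
# Hypothetical loudness leveler -- illustrative only, not Bose's algorithm.
# Each audio frame's measured loudness is compared with a target; gain is
# nudged toward the target, capped per frame to avoid audible pumping.

TARGET_DB = -20.0   # assumed target program loudness
MAX_STEP = 1.0      # assumed max gain change (dB) per frame

def level_gain(gain_db, frame_db):
    """Return an updated gain that moves frame loudness toward the target."""
    error = TARGET_DB - (frame_db + gain_db)   # how far off we are, in dB
    step = max(-MAX_STEP, min(MAX_STEP, error))
    return gain_db + step
```

A loud commercial (frame above target) drives the gain down step by step; quiet dialogue drives it back up.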

2. Hearing what you want

Snyder explained how the future of sound is about hearing what you want to hear. That means a fundamental rethinking of the purpose of noise-canceling technology, which right now is entirely binary: You can turn it either on or off.

But people don’t want to hear everything or nothing. They want to be able to tune into some things, like voices or sirens or approaching buses, while tuning out the racket. AI allows us to do that. Indeed, Bose is working on technology that amplifies certain sounds or voices while dynamically adjusting noise cancellation.
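One way to picture the contrast with binary noise cancellation is per-sound gains. The sketch below is purely illustrative: it assumes a (hypothetical) upstream classifier has already separated a frame into labeled sources, and simply mixes them back with different gains instead of an all-or-nothing switch.

```python
# Hypothetical illustration of selective hear-through -- not Bose's system.
# Assume an upstream classifier has separated one audio frame into labeled
# source signals; each class gets its own gain instead of on/off cancellation.

GAINS = {
    "voice": 1.2,    # amplify nearby speech
    "siren": 1.0,    # pass safety-critical sounds through unchanged
    "traffic": 0.1,  # strongly attenuate background din
}
DEFAULT_GAIN = 0.05  # unknown sounds are suppressed by default

def mix(sources):
    """Mix labeled source signals (equal-length sample lists) with per-class gains."""
    n = len(next(iter(sources.values())))
    out = [0.0] * n
    for label, samples in sources.items():
        gain = GAINS.get(label, DEFAULT_GAIN)
        for i, sample in enumerate(samples):
            out[i] += gain * sample
    return out
```

The hard part in practice is the source separation itself, which is where the AI comes in; the mixing stage is the easy final step.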

3. Immersive Audio

The secret truth about audio is that it’s rarely heard the way musicians, engineers, and designers intended. Elaborate home theater systems can help get you part of the way there, but a truly immersive audio experience depends on fine details like position, distance, and room acoustics — not to mention the fidelity of the sound system in use.

AI allows sound engineers to imagine immersive audio environments that dynamically render acoustics to match ideal settings. They can disaggregate the data in music, gaming, or movie environments to emulate venues or settings that are better suited for the medium. That could be a living room, a symphony hall, or a recording studio.

“We can render and recreate the music in that environment so you feel like you’re sitting there,” Snyder said. “But we can’t do that without AI and without data.”
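At its simplest, “sitting there” means modeling how sound reaches the listener from a position in a room. The toy sketch below is an assumption-laden stand-in for real spatial rendering (which uses HRTFs and measured room responses): it places a mono source at a distance using only propagation delay and inverse-distance attenuation.

```python
# Toy spatial-rendering sketch, purely illustrative. Real immersive audio
# uses HRTFs and room impulse responses; here a mono source is "placed" at
# a distance with just a propagation delay and inverse-distance attenuation.

SPEED_OF_SOUND = 343.0   # meters per second, at room temperature
SAMPLE_RATE = 48000      # assumed samples per second

def place_source(samples, distance_m):
    """Render a mono source at a given distance: delay plus attenuation."""
    delay = int(round(distance_m / SPEED_OF_SOUND * SAMPLE_RATE))
    gain = 1.0 / max(distance_m, 1.0)   # clamp so nearby sources don't blow up
    return [0.0] * delay + [gain * s for s in samples]
```

Doing this per ear with direction-dependent filters, and layering in a venue’s reflections, is what turns the toy model into an immersive rendering problem.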

Sound is Data

Snyder revealed a few other ways Bose is using AI to make the future of sound a reality, some of which are found on the business side: The company is using machine learning tools to better understand its supply chain, making sense of a breadth of data once thought unmanageable. They’re also using marketing data to better understand their customers and personalize interactions with them. And they’re doing it all with an eye toward collaboration.

By partnering with the Institute for Experiential AI, Bose sets an example for the true promise of AI. The company sees a future immersed in all kinds of data — not just sound. To be a leader in that future will require a deeper understanding of the challenges and opportunities buried in that data. What better way to distinguish yourself, to carry on the legacy of Amar Bose, than to harness the game-changing power of artificial intelligence?

Watch the replay of Lila Snyder’s discussion or contact EAI to learn more about how the institute can help you achieve your AI goals.