Tech Journal Now
UW researchers put tiny cameras into earbuds for hands-free AI – GeekWire

News Room
Last updated: May 15, 2026 1:39 pm
VueBuds, a prototype developed by University of Washington researchers who have embedded a rice-grain-sized camera into each earbud of a standard pair of Sony wireless earbuds. (UW Photo)

Wireless earbuds seemingly sprang out of nowhere. Popularized by Apple’s AirPods, they were suddenly everywhere — on the subway, in the grocery store, in the ears of the person sitting across from you — until somewhere along the way, they became the thing nearly everyone wears without a second thought.

Could that popularity make earbuds better than smart glasses for AI? That is the bet behind VueBuds, a prototype developed by University of Washington researchers who have embedded a rice-grain-sized camera into each earbud of a standard pair of Sony wireless earbuds. The result is a visual AI assistant hiding in plain sight: look at a can of food and ask how many calories it has, hold up an unfamiliar kitchen tool and get an answer in about a second. 

The system processes images on-device and responds through a connected AI model — no cloud required, no images stored.

The UW team believes it is the first to embed cameras directly in commercial wireless earbuds.

The earbuds don’t remember anything, but the people around you might not know that. That tension sits at the heart of what the UW team built and raises a question the researchers take seriously: what are the social norms when cameras are embedded in objects nobody thinks of as cameras?

The team’s answer is to lean hard on minimizing data collection. Images are processed and discarded; nothing is saved. But the system offers no outward signal to bystanders that a camera is present, which the researchers acknowledge is an open challenge rather than a solved one.

For technology like this to earn trust, Maruchi Kim, lead researcher and UW doctoral student in the Paul G. Allen School of Computer Science & Engineering, argued that privacy can’t be an afterthought. 

“We don’t support saving the images,” Kim said. “It’s mainly just to bridge the interaction between a person and having access to AI on the go, especially in hands-free scenarios.”

The team’s other central argument is about form factor — and it’s a pointed challenge to Meta, which has spent years and hundreds of millions of dollars trying to make camera glasses a mainstream product.

The UW team’s position is that smart glasses will never fully shed their social baggage: the memory of Google Glass, the discomfort of being watched, the visible signal that the wearer has opted into something most people haven’t. Earbuds carry none of that history.

“From the get-go, we didn’t want to be associated with that,” Kim said.

Getting cameras into earbuds required solving a power problem first. Cameras consume far more energy than microphones, so the team opted for a low-power sensor that captures roughly one frame per second in black and white — slow by video standards, but fast enough for the question-and-answer style of interaction the researchers had in mind.

The cameras are angled five to 10 degrees outward, providing a 98- to 108-degree field of view, and images from both earbuds are stitched into a single frame before processing, cutting response time to about one second.
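The article doesn't detail how the two per-ear images are combined, but the simplest reading of "stitched into a single frame" is a side-by-side merge so one model query covers the combined field of view. A minimal sketch of that idea, assuming plain horizontal concatenation of the two grayscale frames (the frame size and the `stitch` helper are illustrative, not from the research):

```python
import numpy as np

# Hypothetical grayscale frames from the left and right earbud cameras,
# captured at roughly one frame per second (8-bit intensity values).
left = np.zeros((240, 320), dtype=np.uint8)
right = np.full((240, 320), 255, dtype=np.uint8)

def stitch(left_frame: np.ndarray, right_frame: np.ndarray) -> np.ndarray:
    """Merge the two per-ear frames into one wide frame so a single
    vision-model request sees both cameras' views at once."""
    return np.hstack([left_frame, right_frame])

combined = stitch(left, right)
```

Sending one wide frame instead of two separate images halves the number of model round trips per query, which is consistent with the roughly one-second response time described above.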

The applications range from the practical to the significant. The system can read text on food packaging, identify objects, and translate written Korean. But for people with low vision or cataracts, the implications run deeper. 

The team received more than a dozen emails from people with visual impairments describing what they’d use it for: understanding facial expressions, reading books, watching television — tasks that existing AI tools can’t easily support in a hands-free, ambient way.

Kim sees another underserved group in the workforce. Electricians, plumbers, and workers in industrial settings often can’t pause to pull out a phone mid-task — a pipe fitting wedged in place, a live wire that needs both hands.

For those workers, a voice-queryable visual assistant that doesn’t require touching a screen is the difference between having access to AI and not having it at all.

“There’s a lot of blue collar work where those people aren’t really able to harness the benefits of recent AI advances,” Kim said. “They can’t just whip out their phones and take a photo.”

The hands-free framing extends broadly: surgeons, cooks, anyone who has ever tried to follow a recipe with wet hands.

The system remains experimental and isn’t available for purchase. Shyam Gollakota, a professor in the Allen School and the project’s senior researcher, said interest from technology companies has been significant, and camera-equipped earbuds could reach consumers within a few years.

On cost, Gollakota is optimistic. The camera sensor could run under a dollar at the component level, he said, or closer to $10 under a more conservative estimate at smaller production volumes. At the scale of a major consumer electronics manufacturer, that would make the price premium over standard earbuds modest.

“What we do at the universities is show that you can solve technical problems,” Gollakota said. “Then we show a path for these companies and other people to say that this is actually possible.”
