AiSee: AI-Powered Wearable Restores “Sight” for the Visually Impaired
What is AiSee and How Does it Work?
Researchers at the National University of Singapore (NUS) have developed a wearable assistive device that leverages Meta’s Llama models to help people with visual impairments “see” the world around them.
Called AiSee, the headphone-like gadget is equipped with a camera and operates as an artificial intelligence (AI) companion that helps users process visual data, integrating into their daily lives and even helping them return to the workforce.
From Ring to Headphone: The Evolution of AiSee’s Design
Initially conceived as a finger-worn ring in 2018, AiSee’s design has since evolved into an open-ear headphone form factor. Suranga Nanayakkara, professor at the NUS Department of Information Systems and Analytics, who led the research team, said this design was chosen over options such as glasses to avoid potential social stigma and, more importantly, to keep the user’s ears uncovered, preserving their natural spatial awareness through sound.
Based on user feedback, Nanayakkara said the design has been further improved, addressing issues such as hair obstructing the camera and the need for longer battery life. The latest iteration also functions as a standard headphone for music and calls, with AI capabilities available on-demand.
Nanayakkara said this dual-purpose design enhances AiSee’s utility and appeal to users. “It doesn’t make sense to have something that’s used once a day or maybe few times a week,” Nanayakkara said. “We’ve built it more as a smart headphone.”
The Power of Large Language Models (LLMs)
A major turning point for AiSee was the integration of large language models (LLMs), which transformed the device from a simple object identifier into a conversational assistant. This allows users to not only identify an object but also ask follow-up questions about it.
The device runs an agentic AI framework where computer vision and reasoning models work together to understand and describe the user’s environment.
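The article does not publish AiSee’s code, but the pipeline it describes can be sketched in broad strokes: a computer-vision model first turns the camera frame into a text description, and a language model then answers follow-up questions grounded in that description and the dialogue so far. The sketch below is hypothetical; the model calls are stubs standing in for a real captioning model and an LLM such as Llama.

```python
# Minimal sketch (not AiSee's actual implementation) of an agentic
# vision + LLM loop: describe the scene once, then answer follow-ups.

def run_vision_model(frame: bytes) -> str:
    """Stub for a computer-vision captioning model."""
    return "a carton labelled 'whole milk', expiry date 12 May"

def run_language_model(prompt: str, question: str) -> str:
    """Stub for a reasoning LLM; a real system would generate from `prompt`."""
    if "expire" in question.lower() or "when" in question.lower():
        return "The label says it expires on 12 May."
    return "You are holding a carton of whole milk."

class AssistantSession:
    """Keeps the scene description and dialogue so follow-ups stay grounded."""

    def __init__(self) -> None:
        self.scene = ""
        self.history: list[tuple[str, str]] = []

    def see(self, frame: bytes) -> str:
        # Step 1: the vision model turns pixels into text.
        self.scene = run_vision_model(frame)
        return self.scene

    def ask(self, question: str) -> str:
        # Step 2: the LLM reasons over scene + history to answer follow-ups.
        prompt = f"Scene: {self.scene}\n"
        for q, a in self.history:
            prompt += f"User: {q}\nAssistant: {a}\n"
        prompt += f"User: {question}\nAssistant:"
        answer = run_language_model(prompt, question)
        self.history.append((question, answer))
        return answer

session = AssistantSession()
session.see(b"<camera frame>")
print(session.ask("What am I holding?"))
print(session.ask("When does it expire?"))
```

The key design point the article highlights is the conversation loop: because each answer is appended to the session history, the device can handle follow-up questions about an object rather than only naming it once.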
