NUS team launches AI headset to help people with vision loss to commute and shop
Madam Teresa Ng is looking forward to using the device to identify colours, sizes and prices when she goes shopping.
ST PHOTO: NG SOR LUAN
SINGAPORE – Madam Teresa Ng, who has been partially blind since her teenage years, often relies on the kindness of other commuters to help her look out for her bus number.
“My challenge is (that) sometimes people cannot tell that I’m blind. So they will look at me and think: ‘What’s wrong with her?’” said Madam Ng, recounting how some people have rejected her requests for help.
She also sometimes needs assistance when shopping, relying on others to read labels or locate items.
Such occurrences, however, will soon be a thing of the past for Madam Ng, who is in her 50s.
She will soon relish newfound independence with a complimentary unit of AiSee, an assistive headset equipped with artificial intelligence (AI) capabilities, a camera and speakers to help her “see” her surroundings.
Developed by a team of researchers from the National University of Singapore (NUS), the device connects to the internet via Wi-Fi or a hotspot from the user’s phone, with the team working to integrate SIM card functionality in future iterations.
Two years after the prototype was unveiled in 2024, an NUS spin-off named after the device has been launched to commercialise AiSee, which is priced at US$299 (S$380) apiece.
Developed by a team of NUS researchers, AiSee is equipped with artificial intelligence capabilities, speakers and a camera to help users "see" their surroundings.
ST PHOTO: NG SOR LUAN
The spin-off is seeking endorsement by national disability agency SG Enable so that users can get the device at a subsidised rate.
The agency administers a fund for persons with disabilities to buy up to $40,000 worth of endorsed assistive tech.
Madam Ng, who has been testing a prototype of AiSee since mid-2025, said one of the most important features is being alerted when her bus or private-hire ride is arriving.
AiSee can detect the bus stop users are at through Global Positioning System technology. The device draws bus arrival timings from the Land Transport Authority’s database, providing users with an estimated arrival time for the bus of their choice.
All users need to do is ask: “When is bus 190 arriving?”
A built-in camera allows AiSee to monitor the road and alert users verbally through the speakers when their bus approaches.
Similarly, users can speak to AiSee to book a private-hire ride and receive verbal alerts when the car arrives.
Madam Ng, a consultant for an organisation that promotes inclusion for the special needs community, said she knows of many people like her who do not leave their homes unless someone accompanies them.
As an avid shopper, she is most excited to take AiSee for a spin at the malls. “It could tell me the price, size, colour, and even describe the kind of occasion each outfit is suitable for – very useful,” said Madam Ng.
“Sometimes when I go grocery shopping, I have to ask the promoters for help and later feel obligated to buy from them. But with AiSee, I won’t have to trouble anyone.”
Built with large language models, AiSee can tell Madam Teresa Ng what occasions each outfit is appropriate for.
ST PHOTO: NG SOR LUAN
Though the initial target users were people with visual impairment, the NUS team has now broadened its efforts to benefit the sighted as well.
The team is in the midst of testing AiSee’s ability to lead people on curated tours in the Botanic Gardens.
“We are mapping points of interest within the garden so that people can learn more beyond what they see, and ask questions,” said lead researcher Suranga Nanayakkara, who is also an associate professor at NUS’ School of Computing.
Possible points of interest include the handkerchief tree – known for its young leaves that resemble soft, white handkerchiefs – and the history of trees planted by the late prime minister Lee Kuan Yew.
AiSee’s open-source code allows developers to build their own hands-free, screen-free applications. During a recent workshop, 17 developers gathered to design features such as the ability to search LinkedIn based on someone’s face captured in real time, and to give users possible conversational points.
AiSee was developed by a team from NUS led by Associate Professor Suranga Nanayakkara.
ST PHOTO: NG SOR LUAN
“From phones with buttons to touch-screen smartphones to smartglasses – screen-free devices could be the next wave of computing,” said Prof Suranga.
“We want AiSee to be a virtual guide (that helps us) to really understand beyond what we see.”
Correction note: This story has been updated to correct a source error that said AiSee had signed an agreement with Botanic Gardens.


