(Yicai) April 16 -- Gu Leilei, a tenured associate professor and doctoral supervisor at the Qingyuan Research Institute of Shanghai Jiao Tong University's School of Computer Science, has developed a wearable device that uses artificial intelligence to provide navigation assistance for the visually impaired, according to a media report.
Combining innovative software and hardware, the new system allows people with visual impairments to go about daily life normally, The Paper reported yesterday, citing Gu.
The system captures images and analyzes them with AI algorithms to identify targets, obstacles, and other key elements, then guides users through audio cues from bone conduction headphones and tactile feedback on the wrist. The instructions update as users move, gradually leading them to their destinations.
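As a rough illustration of the capture, analysis, and feedback loop described above, the sketch below simulates one guidance cycle in Python. The class names, the detection stub, and the audio and haptic cue functions are hypothetical placeholders for this article, not details of Gu's actual device or software.

```python
# Hypothetical sketch of the capture -> AI analysis -> audio/haptic feedback loop.
# All names and thresholds are illustrative assumptions, not Gu's implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str          # e.g. "doorway", "chair"
    bearing_deg: float  # angle relative to the wearer; negative = left
    distance_m: float   # estimated distance from the depth camera


def detect_key_elements(frame) -> List[Detection]:
    """Placeholder for the AI model that picks out targets and obstacles."""
    # A real system would run object detection and depth estimation here.
    return [Detection("doorway", bearing_deg=12.0, distance_m=3.5),
            Detection("chair", bearing_deg=-20.0, distance_m=1.2)]


def audio_cue(text: str) -> None:
    """Stand-in for speech sent to the bone conduction headphones."""
    print(f"[audio] {text}")


def haptic_cue(side: str, intensity: float) -> None:
    """Stand-in for vibration on the wrist-worn electronic skin."""
    print(f"[haptic] {side} wrist, intensity {intensity:.1f}")


def guidance_step(frame) -> None:
    """One update cycle: analyze the latest frame and issue cues."""
    for det in detect_key_elements(frame):
        side = "left" if det.bearing_deg < 0 else "right"
        if det.distance_m < 1.5:  # nearby obstacle: warn on the wrist and by voice
            haptic_cue(side, intensity=1.0)
            audio_cue(f"{det.label} {det.distance_m:.1f} meters to your {side}")
        else:                     # distant target: steer the wearer toward it
            audio_cue(f"{det.label} ahead, bear slightly {side}")


if __name__ == "__main__":
    guidance_step(frame=None)  # a real loop would feed camera frames repeatedly
```

In practice, the thresholds and cue wording would be tuned from user testing; the point here is only the division of labor between the camera input, the AI analysis, and the two feedback channels.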
However, the system is still at the basic research stage, with data collected from only 20 test subjects, Gu noted, adding that more user feedback is needed for optimization before it can reach the market.
The device weighs about 200 grams and comprises glasses fitted with color and depth cameras, two small pieces of artificial electronic skin, and a miniature single-board computer. It takes only 200 to 300 milliseconds from capturing an image to responding, comparable to human reaction time, which allows it to work smoothly with users.
To address the cameras' limitations in low light, the system also integrates an infrared detector that actively senses the environment, "similar to lidar," providing information such as distance and height.
In addition, the system includes a pair of friction-powered smart insoles and a virtual reality training platform that lets users practice in simulated scenarios, reducing the risk of falls and collisions before real-world use.
On April 14, the international academic journal Nature Machine Intelligence published a paper by Gu about the new device.
Editor: Martin Kadiev