Intel and AI Developer Create Backpack for Visually Impaired

By Ujala Chowdhry
April 20, 2021

What differentiates an animal from a machine may be its mirror neurons: neurons in an organism’s brain that fire both when the organism performs an action and when it observes another performing the same action. First reported in the early 1990s, research on mirror neurons is still ongoing, and Giacomo Rizzolatti’s research group believes these neurons are the biological basis of compassion and empathy.

Jagadish K. Mahendran, an independent Artificial Intelligence (AI) developer, and his team have collaborated with Intel to help visually impaired people get around more easily. He shared: “Last year when I met up with a visually impaired friend, I was struck by the irony that while I have been teaching robots to see, there are many people who cannot see and need help. This motivated me to build the visual assistance system with OpenCV’s Artificial Intelligence Kit with Depth (OAK-D), powered by Intel.” For Mahendran, his friend’s daily difficulties became the seed of this project.


Navigating the world

Mahendran and the team conducted case studies, asking visually impaired people about the difficulties they face while walking on the road, and identified a set of obstacles the system could help users handle on their own. Tests were conducted in Downtown Monrovia, California, and neighboring areas. With the data gathered, they developed an artificially intelligent, voice-activated backpack.

The backpack holds a host computing device, such as a laptop, that drives the OAK-D unit — a versatile and powerful AI device built on the Intel Movidius VPU and the Intel Distribution of OpenVINO toolkit for on-chip edge AI inferencing. Voice assistance is delivered through a Bluetooth headset, and the user interacts with the system via voice queries and commands.

As the user moves through their environment, the system audibly conveys information about common obstacles, such as signs, tree branches, and pedestrians, and warns of upcoming crosswalks, curbs, staircases, and entryways. In effect, it lets the user perceive their surroundings through hearing.
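To make the idea concrete, here is a minimal, hypothetical sketch of how detections from a depth-aware camera could be turned into spoken messages. The labels, distance threshold, and function names are illustrative assumptions, not the project’s actual code.

```python
# Hypothetical sketch of detection-to-announcement logic.
# Labels and the 3-meter threshold are assumptions for illustration,
# not taken from Mahendran's actual system.

WARN_LABELS = {"crosswalk", "curb", "staircase", "entryway"}
OBSTACLE_LABELS = {"sign", "tree branch", "pedestrian"}

def announce(detections, warn_distance_m=3.0):
    """Turn (label, distance_in_meters) detections into spoken messages.

    detections: iterable of (label, distance) pairs, as might come
    from an object detector paired with a depth camera.
    """
    messages = []
    # Announce nearest objects first; ignore anything beyond the threshold.
    for label, distance in sorted(detections, key=lambda d: d[1]):
        if distance > warn_distance_m:
            continue
        if label in WARN_LABELS:
            messages.append(f"Warning: {label} ahead, {distance:.1f} meters")
        elif label in OBSTACLE_LABELS:
            messages.append(f"{label} at {distance:.1f} meters")
    return messages

print(announce([("pedestrian", 2.0), ("curb", 1.2), ("car", 10.0)]))
```

In a real system, the messages would be handed to a text-to-speech engine and played over the Bluetooth headset rather than printed.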

The setup was designed to be discreet, aiming to free the user from visible aids such as a white cane. Naturally, the project raises questions: How safe is the user? Will they need human assistance to manage the device? And if the visual cues signaling that a person is visually impaired are removed, will others still extend the consideration and empathy they otherwise would? The team at Intel, however, did not leave these questions hanging.

Intel’s response to such questions or concerns is, “This AI backpack is currently still a proof of concept and is not commercially available. The next step for Jagadish Mahendran and his team is to raise funds and expand testing. Their goal is to release an open-source, AI-based, visual-assistance system. The researchers have formed a team called Mira, made up of some visually impaired volunteers. Much more user testing will be done before it is ready to go to market.”

Questions aside, the backpack project seems promising and is a great step toward building a brighter future for everyone to see. If a single meeting with a visually impaired friend can lead to such a productive expression of concern, then mirror neurons are indeed a wonder, and empathy a blessing to humanity.

Original at https://techacute.com/intel-and-ai-developer-create-backpack-for-visually-impaired/