By Matthew Futterman
New York Times, Dec. 26, 2020
A blind runner issued a challenge to technologists last year to find a way for him to run safely without a guide. They did.
Thomas Panek dreamed for years of running the way he did before he lost his sight, without fear and without a human or a dog tethered to his wrist as a guide.
That dream took Panek, 50, to the north end of Central Park one frigid morning last month, to test-drive something that might one day liberate thousands of other people with severely impaired vision. As a camera crew and a team of technologists made some final adjustments, he stood on the downslope of West Drive. He straddled a painted yellow line and waited for the signal to go.
A little more than a year had passed since Panek delivered a challenge to a group of engineers at Google’s Manhattan offices during a company hackathon. Could they develop a way for him to run by himself, with a clear sense of where he was going and without having to worry about hazards along the way? There had been other attempts to find technological solutions to this problem, but none of them completely untethered runners from their guides.
Panek has retinitis pigmentosa, a genetic condition that causes the loss of photoreceptor cells. As a child, he lost the ability to see stars in the night sky. He was legally blind by young adulthood.
To run blind, Panek said, is to always fear that you are about to slam your face into a tree.
Panek, who lives in Westchester County, just north of New York City, loves to run with his guide dog, Blaze. But Blaze can’t run faster than about a nine-minute mile.
“I’m a little faster than that,” said Panek, who completed the 2015 Boston Marathon in 3 hours 42 minutes, with Scott Jurek, an ultramarathoner, guiding him. That translates to about an eight-and-a-half-minute mile for the marathon, though Panek can go faster at shorter distances.
By the end of that one-day hackathon last year, the engineers had sketched a basic idea of a solution. They placed a line of masking tape on the floor and had Panek and his dog follow the line. Then the mission became devising an app that worked the same way as the dog, and, in the process, perhaps solving a problem for Panek and a lot of other people.
“If you start with one person and their challenges, you can bring a great benefit for him and also people like him,” said Ryan Burke, a leader at the Google Creative Lab.
It’s hard to say how many blind runners there are, or, more important, how many there could be if running became something that blind people could do on their own.
The National Federation of the Blind estimates that 7.6 million people in the United States have a visual impairment that requires them to use alternate means to engage in an activity that people with vision can do without assistance. The United States Association of Blind Athletes, which holds camps and regional competitions for people who are blind or visually impaired, has more than 700 members who could benefit from the technology Panek requested.
Panek is chief executive of Guiding Eyes for the Blind, which provides guide dogs to people with severe vision loss. Three years ago, the organization began a program to train dogs as running guides, and since then has provided dogs to 75 runners.
In a typical year, more than 50 blind runners complete the California International Marathon with guides; 53 runners who identified themselves as having a vision impairment finished the New York City Marathon in 2019. Many others ran shorter races.
Dror Ayalon, a creative technologist on the project inspired by Panek’s challenge, said a plan became clear fairly quickly.
A crew would paint a yellow line for Panek to follow throughout a race. He would strap a phone with a camera and the newly devised app to his midsection, and the camera would track the yellow line on the ground. The app would take information from the camera and convey vibrating signals through a headset. As Panek ran, the signals would tell him how to adjust his steps to stay on the line. Signals in the right ear meant he was wandering too far to his right, and vice versa. Another signal would help him navigate a turn when the camera spotted the line curving.
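The steering logic described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not Google's implementation: the function names, the normalized offset convention, and the dead-zone threshold are all hypothetical. The one rule taken from the article is the direction convention: a signal in the right ear means the runner has wandered to the right, which happens when the camera sees the line drift toward the left side of the frame.

```python
# Illustrative sketch of the left/right steering-cue rule described in the
# article. All names and thresholds are assumptions for demonstration.

def steering_cue(line_offset: float, dead_zone: float = 0.1):
    """Map the line's horizontal position in the camera frame to an ear cue.

    line_offset: -1.0 means the line appears at the far left of the frame,
    +1.0 at the far right, 0.0 dead center. If the line appears to the
    camera's LEFT, the runner has drifted RIGHT, so per the article the cue
    plays in the right ear (and vice versa).

    Returns (ear, intensity), where ear is "none", "left", or "right" and
    intensity grows from 0.0 to 1.0 as the drift worsens.
    """
    if abs(line_offset) <= dead_zone:
        return ("none", 0.0)  # on the line: no signal
    ear = "right" if line_offset < 0 else "left"
    intensity = min(1.0, (abs(line_offset) - dead_zone) / (1.0 - dead_zone))
    return (ear, intensity)
```

The dead zone keeps the app quiet while the runner is close enough to the line, so small, normal wobbles in stride do not produce a constant stream of corrections.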
Software programs used for video game design helped the app learn to interpret the images from the camera. The work became more complicated when engineers began to consider all the things that might interfere with tracking a line on pavement that a runner would be trying to follow while carrying a phone.
The camera was going to shake constantly. The sunlight would change, making a yellow line look different at noon from the way it did at dawn.
And what would happen when leaves blew onto the line, covering a part of it? Would the app interpret that break in the line as a parked car, and signal the runner to stop?
“We take examples and feed them into the model, classifying the pixels as one class and everything else as not in the class,” Ayalon said, referring to obstacles that might block the view of the line. “The model learns over time.”
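The classification Ayalon describes, labeling each pixel as "line" or "not line," was done with a trained model. As a toy stand-in for that idea, the sketch below classifies pixels by color distance from an assumed reference yellow and locates the line as the mean column of the matching pixels in a row. The reference color, threshold, and function names are illustrative assumptions, not the project's actual model.

```python
# Toy stand-in for per-pixel "line vs. background" classification. The real
# system learned this from labeled examples; here we use a fixed
# color-distance threshold. All constants are assumptions.

REFERENCE_YELLOW = (230, 200, 40)  # assumed RGB of the painted line

def is_line_pixel(rgb, threshold=80.0):
    """Classify one pixel as line (True) or background (False) by its
    Euclidean distance from the reference yellow in RGB space."""
    dist = sum((a - b) ** 2 for a, b in zip(rgb, REFERENCE_YELLOW)) ** 0.5
    return dist <= threshold

def line_position(row):
    """Estimate where the line crosses a row of pixels: the mean column
    index of the pixels classified as line, or None if no line is seen
    (e.g., the line is fully covered by leaves or a parked car)."""
    cols = [i for i, px in enumerate(row) if is_line_pixel(px)]
    return sum(cols) / len(cols) if cols else None
```

A learned model replaces the fixed threshold with classifications that hold up under the complications the engineers worried about, such as shifting sunlight and a shaking camera, and improves as more labeled examples are fed in.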
So does the runner. Panek tested the technology for months over short distances, slowly gaining confidence, learning to trust the directional messages in his ears. Then, in November, it was time for a 5-kilometer run.
“Liberation is a huge motivation,” he said, “the idea of being self-reliant.”
Working with New York Road Runners, the organizer of the New York City Marathon, technologists received permission to paint their yellow line around the north loop of Central Park, a 1.42-mile circle that includes the climb known as Harlem Hill.
Despite the cold, Panek wore short sleeves. He has the wiry build of a veteran runner. The only hint of his sight loss is that his eyes sometimes appear to focus in different directions. But he adeptly compensates, following a voice and picking up on people’s unique sounds, looking toward them as he talks.
As noon approached, he was ready to run.
“Let’s go,” he said when it was time.
A starter told him to go, and he was off. He sprinted downhill toward his first turn as though he knew where he was headed. And then, about a minute in, the voice in Panek’s headset — as well as everyone around him — told him to stop. A car from the Parks and Recreation Department was parked on the line.
As a park ranger helped track down the driver to move the car, Panek headed back to the start line. Time to try this again.
And off he went. Except for a few stutter steps as he rounded the first corners, he pounded the line with confidence, rarely drifting more than a few inches from it on either side. For the first time in decades, Panek was running the way he did as a child.
He crossed the finish line. An official from the running organization draped a medal around his neck.
“Perfect,” Panek said of the run. “It was just perfect.”
It’s unclear where the technology goes from here. Engineers still have some kinks to work out. It would help if the app could navigate around a parked car. But once it can, park officials anywhere might be persuaded to paint yellow lines marking loops for blind runners on routes that are free of cars, allowing them to run freely and safely.
Perhaps one day, after the app can learn how to follow a yellow line that is partly covered by discarded paper cups, blind runners may be able to run an official 5K, or even a marathon.
“There’s that blue line on the New York City Marathon route,” Panek said, referring to the race’s signature marking of the 26.2-mile route. “Maybe one day we’ll be able to get our own yellow line, too.”