With activity and step trackers now built into nearly every smartphone and wearable, walking is more popular than ever. But not every neighborhood is well suited to the task. Researchers at Arizona State University have unveiled a new tool that applies computer vision and deep learning to Google Street View images to determine whether a neighborhood is walkable. The tool looks for everything from the presence and density of crosswalks and street lights to the condition of curb cuts and sidewalks. The idea is not only to give people a heads-up before they venture out into unfamiliar streets—essential intel for anyone with mobility issues—but also to eventually provide data that spurs repairs and improvements.
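The article doesn't detail how the tool combines its detections, but the general idea—turning per-image feature detections into a single walkability measure—can be sketched as a toy weighted score. The feature names, weights, and confidence values below are hypothetical illustrations, not the ASU team's actual method.

```python
# Toy sketch: aggregate per-image feature detections into a walkability score.
# Weights and feature names are hypothetical, not the ASU tool's real model.

FEATURE_WEIGHTS = {
    "crosswalk": 0.30,
    "street_light": 0.20,
    "curb_cut": 0.25,
    "sidewalk": 0.25,
}

def walkability_score(detections: dict) -> float:
    """Combine detector confidences (0.0-1.0 per feature) into one 0-1 score."""
    return sum(FEATURE_WEIGHTS[f] * detections.get(f, 0.0)
               for f in FEATURE_WEIGHTS)

# Example: a scene with a clear crosswalk and sidewalk but no street light.
scene = {"crosswalk": 1.0, "sidewalk": 0.8, "curb_cut": 0.5, "street_light": 0.0}
print(round(walkability_score(scene), 3))  # → 0.625
```

In a real system, the confidence values would come from a deep-learning detector run on Street View imagery; this sketch only shows the aggregation step.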