Institute for Experiential AI core faculty member Sarah Ostadabbas is changing the baby monitor as we know it. From bench to bedside, her team’s cutting-edge device, AiWover, uses AI to track infants’ poses and daily activity with unprecedented precision. She calls her new startup Paretofront.
The Northeastern University spinoff aims to bring AiWover to market to help put parents’ minds at ease about the safety and development of their babies. Her patented new cloud-based baby monitor collects massive amounts of data on a child’s daily activities, from crib to playroom, and generates helpful summaries for parents who don’t have the time to manually record their child’s every move. The data allows them to understand not only their infants’ day-to-day activity, but also their physical and neurological development over time. Today’s standard baby monitors can trigger false alarms while failing to prevent accidents and injuries or to detect early warning signs of developmental disorders. The AI used in AiWover, on the other hand, is linked to data on the cloud that predicts adverse events and tracks infants’ posture and movement to detect abnormalities.
“In the last decade or so, words like ‘smart’ and ‘intelligent’ have been overused in this market,” says Ostadabbas, who doubles as assistant professor of electrical and computer engineering at Northeastern. “Bringing a truly AI-powered system to the market to address the lack of long-term, intelligent, and objective monitoring products for infants will allow us to adapt to a changing culture by educating customers about the true benefits of AI in this sector.”
Her group was the first to publicly release datasets accompanied by domain adaptation tools that successfully close the data gap between infants and adults in AI modeling. With her research situated at the intersection of computer vision and machine learning, Ostadabbas explores advanced representation learning algorithms for visual perception problems, in particular human pose estimation and activity recognition.
“Human pose estimation has received a lot of attention and success lately, but much of it can’t be translated to infants. If you look at adult poses, you can imagine big data from social media, Hollywood images, and sports are not going to work with infant poses,” says Ostadabbas. “Infants have more complex and unique movement and a different body shape with a different muscle-to-bone ratio.”
Her algorithms, which have been funded by the National Science Foundation and others, have gone on to help scientists estimate, predict, and detect behavior and motor function to innovate and understand developmental disorders across many applications. She’s implemented AI to study the behavior of bats to help roboticists design flying drones, and to analyze patients’ poses to elucidate the effects of Parkinson’s drugs. Now, in this next chapter, as the creator of AiWover, she’ll continue validating her algorithms to zero in on the motor function of infants — and ultimately keep children safe and help prevent a host of conditions, from autism spectrum disorder and cerebral palsy to sudden infant death syndrome.
“The state-of-the-art computer vision algorithms don’t work on infants moving, but our models do,” says Ostadabbas. “Hopefully it can come full circle and be in every baby monitor out there.”
Learn more about Ostadabbas’ research and our institute.