Researchers are studying technology that would let hospitals monitor staff 24 hours a day, seven days a week. By maintaining a constant state of vigilance, computers could help cut down on preventable mistakes.

Hospitals have spent considerable resources trying to reduce the number of preventable mistakes that doctors and nurses make, such as skipping hand washing. But it's hard to ensure that caregivers take every preventive step every time. Perhaps they need to be watched all the time.

That's what a group of researchers is trying to find out. In a handful of recent pilot studies, computer scientists and doctors have installed depth sensors and other types of monitors in hospital hallways, next to patients' bedsides and in operating rooms. These sensors generate video images that look like blurry silhouettes, protecting people's privacy, but can be used to train computer algorithms to identify certain movements, like someone stopping at a sanitizer dispenser to clean their hands.

The technology, called computer vision, would allow hospitals to monitor workers around the clock across an entire hospital ward. Ultimately, researchers hope to use the data generated by these algorithms to help hospitals find ways to influence their workers' behavior and even rethink how care is given in the first place.

"People are prone to mental slips and lapses," says Arnold Milstein, a professor of medicine and director of the Clinical Excellence Research Center at Stanford University. Even the most attentive physicians and nurses are likely to skip steps occasionally without realizing it, especially when the care they give in the course of a day involves hundreds of small tasks, he says. Computers, on the other hand, can maintain a constant state of vigilance.
Teaching the technology

There is much room for improvement in reducing preventable mistakes. For example, one in 25 patients develops a health-care-associated infection in the hospital, according to the Centers for Disease Control and Prevention.

To help tackle the problem, Dr. Milstein and his colleagues designed a series of studies to see if computer vision could tell whether people used hand-sanitizer dispensers before entering and exiting patients' rooms. The project began in the hallway of an acute-care ward at Lucile Packard Children's Hospital at Stanford, where many of the young patients were awaiting organ transplants and were at high risk of infection because of their suppressed immune systems.

By installing sensors above the hand-sanitizer dispensers, the researchers collected thousands of images and annotated about 80% of them, labeling whether someone had sanitized their hands when entering or exiting a patient's room. They fed the annotated images into an algorithm to teach it to make that distinction. The remaining images served as a test of whether the algorithm could identify hand washing without the annotations.

Then the algorithm was applied to images collected at an intensive-care unit of Intermountain Healthcare in Salt Lake City. Even though the hallway configuration was different, the algorithm identified hand sanitization almost 85% of the time. When the algorithm was further trained on images captured at Intermountain Healthcare, its accuracy increased to 98%, says Serena Yeung, a Ph.D. student in the Stanford Artificial Intelligence Laboratory.

Researchers are now figuring out how to use the data to design a program that encourages vigilance in hand washing. One idea is creating an alert on the sanitizer dispensers, perhaps with a flashing screen, to remind people to wash their hands before they enter a room.
Another is creating a digital dashboard to track compliance throughout the entire unit over time.

[Graphic: Keeping an Eye. Sensors throughout a hospital generate blurred images of staff activity; from these, computer algorithms can recognize key actions such as use of hand sanitizer.]

"People would know how they are performing as a group, and hopefully be motivated to improve," says Ms. Yeung, adding that training programs or incentives could be used to reward the unit with the best compliance record.

Researchers also are beginning to use computer vision to study hand-washing protocols inside patients' rooms. The idea is to see if current practices, like washing hands after touching a patient, effectively fight harmful and drug-resistant bacteria, or if different steps are needed.

Beyond hand washing

Researchers also are using computer vision to identify movements other than hand washing. For instance, they are identifying when patients on ventilators are given oral care, since failing to clean a patient's mouth and throat regularly can cause pneumonia. They are also identifying when staff change patients' position in bed, since failing to turn a patient regularly can cause bedsores.

Stanford, Intermountain and, separately, Johns Hopkins Hospital and Johns Hopkins University School of Medicine in Baltimore are working on identifying signs of patients' mobility, like sitting, standing or walking. Immobility during a hospital stay is linked to delirium, long-term disabilities and hospital readmission, and is widely considered a preventable ailment. The researchers also hope to train an algorithm to identify positions or movements that indicate a patient may be about to fall, and then devise an intervention that could prevent that from happening.
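The annotate-train-evaluate workflow described earlier (label roughly 80% of the images, hold out the rest, then measure accuracy on the held-out set) can be sketched in outline. This is a toy illustration only: the data is random, and the "model" is a trivial majority-label stand-in, not the researchers' actual depth-sensor system.

```python
import random

random.seed(0)

# Toy stand-in for the annotated dataset: (image_id, sanitized_hands) pairs.
annotated = [(i, random.random() < 0.7) for i in range(1000)]

# Hold out ~20% of the labeled images as a test set, as described above.
random.shuffle(annotated)
cut = int(0.8 * len(annotated))
train, test = annotated[:cut], annotated[cut:]

def fit_majority(examples):
    """Stand-in 'training': learn only the majority label."""
    positives = sum(label for _, label in examples)
    return positives >= len(examples) / 2

def accuracy(model, examples):
    """Fraction of held-out labels the model predicts correctly."""
    return sum(label == model for _, label in examples) / len(examples)

model = fit_majority(train)
print(f"held-out accuracy: {accuracy(model, test):.1%}")
```

The same split-and-score loop is what lets researchers quote numbers like "almost 85%" on a new hospital's images, and re-running the training step on those images is what raised accuracy further.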
Other research in computer vision focuses on training algorithms to recognize correct surgical procedures. Down the line, that could lead to better training of surgeons, reviews of procedures that go wrong, or safety checks as a surgery proceeds.

In France, several institutions, including the University of Strasbourg; the University Hospital of Strasbourg; the Institute of Image-Guided Surgery of Strasbourg, a hospital and research center; and Ircad, a training center in minimally invasive surgery, are using sensors along with more-traditional video images to teach computers to recognize different stages of surgery. Researchers and medical professionals hope they can use the technology to improve care, for example by finding key safety checkpoints where a computer could intervene by issuing a warning before a mistake is made.

But Nicolas Padoy, an associate professor at the ICube Laboratory at the University of Strasbourg, cautions that the research is still in its infancy and that applications like safety checkpoints could be years away. "The technology is really likely to take off over the next five to 10 years," he says.
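One way such a safety checkpoint might work, once a system can recognize surgical stages, is to compare the recognized sequence against the expected order and warn on deviations. This is a speculative sketch; the stage names and the ordering rule are invented for illustration and are not taken from the Strasbourg systems.

```python
# Hypothetical expected order of recognized stages for some procedure.
EXPECTED = ["prep", "incision", "dissection", "closure"]

def checkpoint_warnings(recognized):
    """Warn when a stage is recognized before an earlier expected stage
    has been seen, e.g. 'closure' starting before 'dissection'."""
    seen = set()
    warnings = []
    for stage in recognized:
        idx = EXPECTED.index(stage)
        missing = [s for s in EXPECTED[:idx] if s not in seen]
        if missing:
            warnings.append(f"'{stage}' started before: {', '.join(missing)}")
        seen.add(stage)
    return warnings

print(checkpoint_warnings(["prep", "incision", "closure"]))
```

A real system would of course work from noisy, probabilistic stage predictions rather than a clean sequence, which is part of why researchers say such applications remain years away.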