In the 1940s, detective Dick Tracy prophesied Apple's two-way wristwatch in the funny papers, just one example of pop culture and science influencing each other. Some 75 years after Tracy's gadget caught the public's imagination, "hololens goggles" in the Iron Man films inspired Johns Hopkins lecturer Sakul Ratanalert when he was an undergraduate.
"Ironman II really stretched my imagination," says Ratanalert, a faculty member in the Whiting School's Department of Chemical and Biomolecular Engineering. "Specifically, where [the hero] scans a diorama to create a 3D model. Then he chooses parts to keep and discard by grabbing, swiping, and flicking—finally expanding his hands out to enlarge the model and sit in this new element."
Ratanalert and colleagues are researching the use of "holo" goggles with the help of a 2020 DELTA grant, short for Digital Education and Learning Technology Acceleration. Their objective is to pair mixed-reality headsets with e-notebooks that support data collection and analysis across collaborators, making cooperative remote laboratory work a reality as cool as the gear that thrilled Ratanalert in the movies.
Now in its third year, the DELTA program awarded grants of up to $75,000 each in July to six universitywide teams made up of JHU faculty, staff, and students. Bestowed by the Office of the Provost, the grants are designed to help "unleash the potential of digital technology." The program was founded with proceeds from the university's massive open online courses, or MOOCs, offered in partnership with Coursera.
"Creating technology that's as intuitive and user-friendly as possible are the hallmarks of good design," says Ratanalert, the principal investigator on a team that also includes his Whiting School colleagues Luo Gu, Orla Wilson, and Patty McGuiggan, along with Krieger School faculty members Robert Leheny and Meredith Safford. The formal name of their project is Mixed Reality Headsets in Teaching Laboratory Courses: Changing the Pedagogy Through Remote Collaboration and Experimentation.
Grants also were given for the following proposals:
- Improving Pediatric Cardiopulmonary Resuscitation Education and Performance via Augmented Reality. Principal Investigators: Justin Jeffers, Therese Canares, and Keith Kleinman, all of the School of Medicine; and James Dean, Blake Schreurs, and Scott Simpkins, all of the Applied Physics Laboratory.
- Broadening the Message: Making Videos More Usable at Johns Hopkins University and Beyond. Principal Investigators: Jeff Day and Bonnielin Swenor, both of the School of Medicine; Donna Schnupp, of the School of Education; and Valerie Hartman, of the Peabody Institute.
- Deconstructing Health Care Silos: Interprofessional Education Using Multiplayer Virtual Simulation and Virtual Reality for Medical and Nursing Trainees. Principal Investigators: Kristen Brown, Shawna Mudd, Catherine Horvath, and Nancy Sullivan, all of the School of Nursing; and Nicole Shilkofski, Justin Jeffers, Julianne Perretta, and Sandy Swoboda, all of the School of Medicine.
- Socratic Artificial Intelligence Learning (SAIL): A Virtual Study Assistant for Educating Medical Professionals. Principal Investigators: T. Peter Li, Alex Johnson, and Dawn LaPorte, all of the School of Medicine; and Stewart Slocum and Arpan Sahoo, both of the Whiting School of Engineering.
- Accessible Experiential Lesson Guide Delivery Platform. Principal Investigators: Carrie Wright, Stephanie Hicks, Leah Jager, Margaret Taub, and John Muschelli, all of the Bloomberg School of Public Health.
Carrie Wright and Leah Jager, both in Biostatistics at the Bloomberg School, are on the Lesson Guide Delivery Platform team.
Says Wright, "The project is supposed to expand our data science and online education materials and tools to make them versatile and usable by the largest number of people."
Wright hopes that her team's work, which centers on creating materials that better teach people how to analyze data, will result in education that reaches the widest possible audience.
"This is really useful now that so much education has moved online" in the wake of the COVID-19 pandemic, Jager says. "The hope is that traditional educators will use [the website] in the classroom but also for [others] to use at their own pace."
The finished open-access site (with a function to translate the material into other languages) may be online before the end of the year, Jager says.
The first three words of Jeffrey Day's video project could speak for all the work taking place with DELTA awards this year: Broadening the Message. The end of the title is also instructive: Johns Hopkins University and Beyond.
Again, ease and accessibility—in this case for an audience with a variety of disabilities—are the keys.
"One of our main study questions is testing audio descriptions in [instructional] videos," says Day, a physician in the School of Medicine's Department of Art as Applied to Medicine and a principal investigator on the project.
Although most federal guidelines on accessible electronic media are very clear, Day says, those for audio description in video creation are written in a way that can be misinterpreted. The goal of his study is to put instructional videos in front of users with low vision and users with normal vision to gain greater insight into how much audio description is most usable and useful, and thereby provide better educational media experiences to low-vision users.
"We're looking for insights that clarify what is most helpful," he says. "We're hoping our work will inform the rulemakers and creators to see that the entire audience is best served," he says.