Smartphone images taken at night often come out blurry, dark, and indistinct, even with the right settings. But a team at Google led by Alexander Schiffhauer, Engr '13, has developed a new mode for the Pixel phone called Night Sight, which sharply renders details and true colors even in low light.
So how does Night Sight work? Before the user presses the shutter button, it measures the natural shake of their hand as well as how much motion is in the scene. If the Pixel is steady and the scene is still, Night Sight spends more time capturing light to minimize visual noise; if the device moves or there is significant motion in the scene, it uses shorter exposures, capturing less light to minimize motion blur.
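In rough code terms, that trade-off might look something like the sketch below. This is a minimal illustration, not Google's implementation: the thresholds, frame counts, and names (planCapture, CapturePlan) are hypothetical, standing in for whatever motion metering the camera actually performs.

```kotlin
// Hypothetical thresholds for illustration only; the real motion-metering
// parameters used by Night Sight are not public.
const val HANDSHAKE_THRESHOLD = 0.2    // normalized hand-shake score
const val SCENE_MOTION_THRESHOLD = 0.3 // normalized scene-motion score

data class CapturePlan(val frameCount: Int, val exposureMs: Long)

// Choose a burst plan from measured motion, mirroring the trade-off above:
// a steady phone and a still scene allow longer per-frame exposures (more
// light, less noise), while motion forces shorter exposures (less blur).
fun planCapture(handShake: Double, sceneMotion: Double): CapturePlan {
    val steady = handShake < HANDSHAKE_THRESHOLD &&
                 sceneMotion < SCENE_MOTION_THRESHOLD
    return if (steady) {
        // Tripod-like conditions: fewer, longer exposures gather more light.
        CapturePlan(frameCount = 6, exposureMs = 1000)
    } else {
        // Handheld or moving subject: many short exposures limit motion blur,
        // and the frames are later merged into a single brighter image.
        CapturePlan(frameCount = 15, exposureMs = 66)
    }
}

fun main() {
    println(planCapture(handShake = 0.05, sceneMotion = 0.1)) // steady -> long exposures
    println(planCapture(handShake = 0.50, sceneMotion = 0.1)) // shaky  -> short exposures
}
```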
Since debuting the feature, Schiffhauer's team has already extended Night Sight to astrophotography. "Wild, right?" says Schiffhauer, the product manager for computational photography at Google. "This was previously only possible with expensive cameras, lenses, and post-production software. We simplified it to a tap of a button."