Carol Reiley, Engr '07 (MSE), has long been captivated by the creative side of engineering. As a graduate student at Johns Hopkins, she worked with her classmates to design and build a robot that made pin and thread art. In JHU's Laboratory for Computational Sensing and Robotics, she and her fellow labmates coaxed a $1.5 million surgical robot into playing the popular board game Operation. Later, she and a classmate co-wrote a DIY hack for the video game Guitar Hero that allowed amputees to play.
Now she has her sights set on combining music with artificial intelligence, teaming up with Hilary Hahn—a noted violinist, Peabody Preparatory alum, and winner of three Grammy Awards—to launch a new initiative called Deepmusic.ai.
"I'm interested in cracking how we can make humans supercreative with AI tools," Reiley says. "Imagine the potential to unlock the human mind and imagination. Creativity is one of the uncharted territories of AI because it is so personal and subjective. It's also often thought to be uniquely human because it's what we excel at as a species and where AI is said to fall short."
Reiley and Hahn met in Baltimore when Reiley was a graduate student in computer science at Johns Hopkins and Hahn was living in the city a few years after graduating from the renowned Curtis Institute of Music. They saw a need for a community to connect their two seemingly disparate fields of AI and the arts, "a space where ideas can be shared without judgment, feedback can be exchanged, and projects created," Reiley says.
"The brainstorming process for that was really fascinating," adds Hahn, recalling one phone conversation early on in the development of Deepmusic.ai. "We were talking while I was taking a walk. I ended up sitting on a park bench in the rain, talking for three hours."
The pair believe that artificial intelligence could unlock new frontiers for music. "For example, at our fingertips, we can organize and search all the history of music," Reiley says. "AI can help scan through all the data and suggest common chord progressions. AI-generated sounds might lead to new genres of music, like the synthesizer did for electronic music. I'd like AI to be a natural extension of the artist, or if not that, a helpful sidekick."
Prior to this new project, Reiley had co-founded the self-driving car startup drive.ai, which raised over $77 million in funding. But Deepmusic.ai isn't a typical startup company with something to sell. "Our goal is not really to create a product," Hahn says. Instead, Reiley says, the objective is to figure out how AI researchers can work together with creatives. "This AI creativity revolution seems to be missing the voice of the current generation of artists themselves," Reiley says. "So we want to find ways to empower the artists by giving them a seat at the table now as these early tools are being created to help shape them before it becomes too late."
Scientists and engineers have long been captivated by the creative possibilities of artificial intelligence. In 1955, the computing pioneer John McCarthy coined the term "artificial intelligence" and famously experimented with writing computer programs to play chess. In 1957, Max Mathews, the director of acoustics research at Bell Labs, laid the groundwork for the modern field of computer music by writing software to get an early computer to generate 17 seconds of sound. In 1963, he published an article in the journal Science titled "The Digital Computer as a Musical Instrument."
In the 21st century, composing music on a laptop is completely normal—even expected. But composing music using artificial intelligence, with the computer's "brain" acting as an equal partner in the creative process, has been less explored.
For the first phase of the project, they commissioned AI-assisted pieces from three well-known figures in music—David Lang, co-founder of the Bang on a Can collective and the recipient of both a Grammy and a Pulitzer Prize; Michael Abels, composer of the score for the Oscar-winning film Get Out; and Dana Leong, a two-time Grammy-winning multi-instrumentalist and composer. Each musician created a new work using either the AI software platform AIVA or OpenAI's MuseNet. "They gave enormous amounts of time and a lot of trust and creative energy to the project," Hahn says.
The results were often surprising. In Abels' case, he first wrote a solo piano piece and then had the AI generate five new piano pieces influenced by it. The AI-generated pieces, he found, were energetic, intense, and sometimes unplayable by a human pianist. He folded parts of the AI-created pieces into his work to create the rousing, up-tempo "Gift of the Machine."
"I was surprised at how much work was required by human composers in order to collaborate with the raw AI output," Reiley says. "And I was impressed by how they worked the challenges into the concepts of the pieces, inventing new solutions to bring their ideas to life."
Working with professional composers and musicians is important, Reiley says, because of the unique insights they can provide. "They bring with them subject matter expertise and are better able to see the mistakes and inherent errors in the AI-generated output that novices may miss," she says. "Novices generally focus only on listenability. Professionals have different needs. Professionals have certain tools and are trying to express a cohesive thought or idea. They generally have constraints and understand the theory." She thinks that AI could assist professionals in numerous ways. "AI could help automate some of the tedious parts of their job or act as a 24/7 improvisation partner," she says.
Reiley foresees tremendous opportunities for Deepmusic.ai to bring together AI and creative expression. "I want to expand past music into different areas such as visual art, dance, writing, cooking, designing websites," she says. A sonnet written with an assist from AI? Why not. "My ultimate wish is that we preserve the precious beauty of human creativity and save it from becoming formulaic."