NASA’s Sonification Project allows everyone to enjoy the beauty of the cosmos in a different way.
What would the depths of our galaxy sound like if we could hear their melody?
In an effort to answer this question, NASA's Sonification Project is turning data gathered from the outer reaches of the universe into sounds - and even symphonies.
The project is part of the US space agency’s efforts to allow visually impaired people to experience the beauty of the cosmos through sounds. It recently released 16 data performances that bring audio interpretations of the Milky Way galaxy and beyond to listeners for the first time.
Sonification from visualisations: How is it done?
Telescopes allow scientists to see different objects in our universe by filtering images through different types of light. For example, adding colour to those observed objects allows scientists to pick out the various chemical elements that are part of the phenomena being observed. Without telescopes or image processing, the visual representations that explain the cosmos would be invisible to us.
Sonification works on the same principle but with a different kind of processing: in a nutshell, the technique involves translating information of some kind into a sound of some kind.
Data is data, and its output can take many forms: stunning photographs, 3D prints, virtual reality applications, or in this case, sounds.
"This is a process of translation and we are making choices along the way, but... we are using the science to guide us," Kimberly Arcand, project lead at NASA's Chandra X-ray Observatory, said during an interview with Universe Unplugged, a project that is part of NASA's Universe of Learning initiative.
For the Universe of Sound project, Arcand and her team of visualisation scientists took actual observational data from telescopes - series of ones and zeros - which they translated into frequencies that can be heard by the human ear.
Sonifications combined the output of telescopes such as NASA’s Chandra X-ray Observatory, Hubble Space Telescope and James Webb Space Telescope, together making “a kind of symphony, each playing their own instrument,” astrophysicist and musician Matt Russo explained in the same interview.
Because different telescopes capture different layers of light - optical, infrared and X-ray - they each represent “a different instrument,” he said.
For some visualisations, time - that is, the progression of the sounds - was read from left to right across the image; for others, from top to bottom.
The different shapes and structures in the image determined the pitch of the notes in each sonification. For example, bright lights near the bottom of an image could represent low notes, and bright lights near the top, higher notes. The brighter the light or radiation in the image, the louder the sound.
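The mapping described above can be sketched in code. The following is a minimal illustration, not NASA's actual pipeline: it assumes a small made-up brightness grid standing in for telescope data, scans it left to right as time, maps vertical position to pitch, and maps brightness to loudness. The function name and frequency range are illustrative choices, not part of the project.

```python
def sonify(image, f_low=220.0, f_high=880.0):
    """Turn a 2D brightness grid (rows ordered top to bottom) into a list of
    (time_step, frequency_hz, loudness) events, one per bright pixel."""
    n_rows = len(image)
    events = []
    for x in range(len(image[0])):        # scan left to right = time
        for y, row in enumerate(image):
            brightness = row[x]
            if brightness > 0:
                # Pixels lower in the image -> lower notes; higher -> higher notes.
                height = (n_rows - 1 - y) / (n_rows - 1)
                freq = f_low + height * (f_high - f_low)
                # Brightness carries through as loudness.
                events.append((x, round(freq, 1), brightness))
    return events

# A tiny 3x4 "image": a bright spot at the top left, a dim spot at the
# bottom right. The bright spot plays first, high and loud; the dim spot
# plays later, low and quiet.
image = [
    [0.9, 0.0, 0.0, 0.0],   # top row -> high pitch
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.3],   # bottom row -> low pitch
]
print(sonify(image))
# → [(0, 880.0, 0.9), (3, 220.0, 0.3)]
```

In a real sonification, each event would drive a synthesiser or be rendered to an audio waveform; the sketch stops at the event list to keep the mapping itself in focus.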
Not all sonifications are created equal
Sonifications were created using different techniques based on the object and the data available. This allowed the team to portray the scientific data in the way that made the most sense for each cosmic phenomenon.
Below are some of our favourite sonifications in NASA’s latest album. Read the descriptions of each YouTube video to get the details about the composition choices and the source of the visualisations.
The Galactic Center sonification
The Bullet Cluster Sonification
The Crab Nebula Sonification
The M51 (aka Whirlpool Galaxy) Sonification
The Perseus Cluster (the black hole at the center of the Perseus galaxy) Sonification
For years, NASA has been trying to make space knowledge available and accessible in innovative ways. From tools allowing people to explore the cosmic landscape in its full spectrum with the help of sophisticated telescope technology, to programmes educating elementary school children about Earth science, and partnerships with video game companies to create educational video games: the space agency has a myriad of projects to help us understand the distant wonders in our universe.