Wanda Díaz-Merced is an astronomer and a leading proponent of the sonification of astrophysical data. When Dr. Díaz-Merced lost her sight in her early 20s, her dream of studying stars in the visually oriented scientific world suffered a major setback -- until she discovered "sonification," a way to turn huge data sets into audible sound. She realized that she could use her ears to detect patterns in stellar radio data and uncover connections obscured by graphs and other visual representations.
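The core idea can be illustrated with a minimal sketch: map each value in a data series to the pitch of a short tone, so that rises, dips, and periodicities in the data become audible. This is a hypothetical toy example using only the Python standard library; the function name `sonify` and all parameters are illustrative and are not Dr. Díaz-Merced's actual research tools.

```python
import math
import struct
import wave

def sonify(values, out_path="sonified.wav", rate=8000,
           tone_dur=0.15, f_min=220.0, f_max=880.0):
    """Toy sonification: each data value becomes a short pure tone
    whose pitch rises linearly with the value (f_min..f_max Hz)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0                      # avoid divide-by-zero
    samples = []
    n_per_tone = int(rate * tone_dur)
    for v in values:
        freq = f_min + (v - lo) / span * (f_max - f_min)
        for i in range(n_per_tone):
            samples.append(math.sin(2 * math.pi * freq * i / rate))
    # Write mono 16-bit PCM so any audio player can open the result.
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))
    return len(samples)

# A listener would hear five tones whose pitch tracks the data.
sonify([0.1, 0.9, 0.4, 1.0, 0.2])
```

Real sonification pipelines add far more nuance (volume, timbre, rhythm, stereo position can each carry a data dimension), but the pitch mapping above captures the basic principle of listening to data instead of plotting it.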
After receiving a bachelor's degree in Physics from the University of Puerto Rico, she undertook astrophysical research at NASA's Goddard Space Flight Center and completed a Ph.D. in computer science at the University of Glasgow in Scotland. She has held positions at the Harvard-Smithsonian Center for Astrophysics and the South African Astronomical Observatory. She co-chaired the 2019 conference Astronomy for Equity, Diversity and Inclusion at the National Astronomical Observatory of Japan. Recently, Dr. Díaz-Merced joined the European Gravitational Wave Observatory. She is an important voice and leader in increasing equality of access to astronomy and STEM.
Dr. Díaz-Merced will be giving a talk about her research and her personal journey into science. This event is hosted by LIGO and will be moderated by Dr. David Reitze, Executive Director of LIGO.
More than one sensorial modality to do mainstream science in astronomy?
I hope we will have a dialogue about the possible efficiency of using sound to contribute to mainstream research in astronomy, and, symmetrically, to contribute to human development by closing the skill-development gap caused by the current performance and digital discourse. I will raise questions such as: if astronomy was using audio at the end of the 1800s, why did multisensorial perception become secondary, or perhaps even obsolete or forgotten? I will mention my perception experiments at the Center for Astrophysics and ask: How are we going to transform the field of astronomy into one that does not exclude other perceptual experiences as a way to generate science? What is needed to carry out that transformation? I will mention some examples of perception techniques I have used to hear data streams from the MESSENGER spacecraft, Ulysses, ACE, and ARTEMIS.