The sound of a sunset | MIT News

What does a sunset sound like? Or ascending shades of red? Now you can listen in via the Sonification Toolkit, a polished, up-and-running prototype from a year-long initiative powered by the combined imagination and expertise of two dozen MIT undergraduates.

The latest major endeavor from MIT’s Digital Humanities Lab (DH Lab) began last spring with an initial idea from Evan Ziporyn, the lab’s faculty fellow and the Kenan Sahin Distinguished Professor of Music. Ziporyn reflects that the DH Lab faculty fellowship came at a fortuitous moment, just as he was considering how to expand on his previous sonification project, which turned intricate spiderwebs into immersive audio installations.

The newly released prototype, a work in progress with cutting-edge capabilities, is a robust first foray into the possibilities of sonification. The lab has many further ambitions for the toolkit: Among the most exciting is a web application that will transform almost anything digital — from numerical data to drawings — into sound.

True sonification

Sonification is the process of using computers to translate an object or a dataset’s structure into sound. To take a simple example, think of a set of stairs (like the ones at the Museum of Science in Boston) with each step assigned a note on the major scale. Movement along the staircase produces the corresponding notes. But is this really sonification? Ziporyn makes an important distinction: He is seeking to faithfully produce sound based on the attributes of the object itself, not to impose sound onto an object.

He explains: “The staircase is a human object, as is a major scale. The staircase isn’t literally analogous to a major scale — it’s just that [the] major scale happens to sound good to many people.” The staircase itself isn’t actually being sonified, only assigned values that give a pleasant, digestible result. “What we were trying to do by using the pure numerical relationships between [material and sound] is to get away from that … to make sure you’re getting a result that actually reflects the structure of the object rather than a humanized, palatable version of that.”
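
To make the distinction concrete, here is a minimal Python sketch, not the lab’s code, that contrasts the two approaches: the step heights, pitch range, and mappings below are all invented for illustration. The same hypothetical measurements are sonified once by quantizing them to a major scale and once by letting their own ratios set the frequencies directly.

```python
# A minimal sketch (not the lab's code): the same hypothetical measurements
# sonified two ways, once quantized to a major scale and once with the raw
# numerical ratios of the object setting the frequencies directly.
import numpy as np

step_heights = np.array([1.0, 1.25, 1.6, 2.1, 2.8])   # hypothetical step heights

# "Humanized" version: snap each measurement to the nearest major-scale degree.
major_scale = np.array([0, 2, 4, 5, 7, 9, 11, 12])     # semitones above the root
positions = np.interp(step_heights,
                      (step_heights.min(), step_heights.max()),
                      (0, len(major_scale) - 1))
scale_freqs = 220.0 * 2 ** (major_scale[np.round(positions).astype(int)] / 12)

# "True sonification" version: frequencies follow the object's own ratios.
ratio_freqs = 220.0 * step_heights / step_heights[0]

print("major-scale version:", np.round(scale_freqs, 1))
print("raw-ratio version:  ", np.round(ratio_freqs, 1))
```

The first version always sounds “in tune” because the scale is imposed; the second preserves whatever intervals the object’s proportions happen to produce, which is the result Ziporyn is after.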

The toolkit includes five avenues for sonification with which users can experiment. Teams of student researchers composed tools to sonify time data, polygons, colors, gestures, and text shapes — building accessible software to analyze the digital object and allow untrained users to listen in and play around.   
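
As a toy illustration of the color tool’s general idea (the names and mapping below are invented, not the toolkit’s), the article’s opening image of “ascending shades of red” might be rendered by letting the red channel’s value drive pitch directly, so the gradient itself, rather than a musical scale, shapes the sound:

```python
# A toy sketch, not the toolkit's color tool: "ascending shades of red" become
# an ascending pitch, with the red channel's 0-255 value setting the frequency.
shades_of_red = [(60, 0, 0), (110, 0, 0), (160, 0, 0), (210, 0, 0), (255, 0, 0)]

def color_to_freq(rgb, f_lo=220.0, f_hi=880.0):
    r, _g, _b = rgb
    return f_lo + (f_hi - f_lo) * r / 255   # brighter red = higher pitch

print([round(color_to_freq(c), 1) for c in shades_of_red])
```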

The imaginative scope of the project was a major draw for sophomore Jessica Boye-Doe, a computation and cognition major. “I had thought about the idea of turning digital items into sound before,” she says, “but wasn’t aware of how much research there was already on sonification. I saw this UROP as a way to explore this application of technology in music.”

The life cycle of the Sonification Project has unfolded in the DH Lab over the course of Ziporyn’s faculty fellowship. Forty Undergraduate Research Opportunities Program (UROP) students, along with the lab’s instructional staff, assembled for the fall semester, following the competitive faculty fellow selection process in the spring and initial project steps over the summer.

Using the new sonification toolkit, senior Moises Trejo converted a year’s worth of sunrise and sunset times in one location into music. The lower tone tracks the sunrise time and the higher tone the sunset time; the two tones gradually move apart as the days get longer, then back toward each other as the days get shorter.
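
A rough sketch in the same spirit, with a plain sinusoid standing in for the real sunrise and sunset data Trejo used (the sample rate, pitch range, and file name below are arbitrary choices, not the toolkit’s), might look like this:

```python
# A rough sketch in the spirit of the piece described above, not the actual
# code: two short tones per day, the lower pitched by the sunrise hour and the
# higher by the sunset hour. A plain sinusoid stands in for real almanac data.
import numpy as np
import wave

SR = 22050                                                 # sample rate (Hz)
days = np.arange(365)
daylight = 12 + 3 * np.sin(2 * np.pi * (days - 80) / 365)  # hours of daylight (hypothetical)
sunrise = 12 - daylight / 2                                # hour of day (0-24)
sunset = 12 + daylight / 2

def tone(freq, dur=0.05):
    t = np.arange(int(SR * dur)) / SR
    return np.sin(2 * np.pi * freq * t)

def hour_to_freq(hour, f_lo=200.0, f_hi=800.0):
    return f_lo + (f_hi - f_lo) * hour / 24                # earlier hour = lower pitch

samples = np.concatenate([tone(hour_to_freq(r)) + tone(hour_to_freq(s))
                          for r, s in zip(sunrise, sunset)])
samples = (samples / np.abs(samples).max() * 32767).astype(np.int16)

with wave.open("sunrise_sunset.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)      # 16-bit samples
    f.setframerate(SR)
    f.writeframes(samples.tobytes())
```

At a twentieth of a second per day, the year compresses to roughly 18 seconds of audio, with the two tones drifting apart toward the summer solstice and converging again in winter.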

Endless curiosity

Peihua Huang, a sophomore majoring in computer science and engineering, came on board early, after discovering the DH Lab through the lab’s first-year orientation event. She helped build the project’s initial infrastructure over the summer and stayed on in the fall semester to build and polish the “Gestures to Sound” application, an interface that lets users make gestures with their cursor that are then transmuted into music.
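
The sketch below is a toy stand-in for that gesture-to-sound idea rather than the application’s actual mapping; the cursor path, screen height, and pitch range are all made up for illustration. Vertical position drives frequency and horizontal spacing drives note duration.

```python
# A toy stand-in for the gesture-to-sound idea, not the application's code: a
# hypothetical cursor path becomes a pitch contour, with vertical position
# driving frequency and horizontal spacing driving note duration.

# Hypothetical cursor samples as (x, y) pixel coordinates, y = 0 at the top.
path = [(0, 300), (40, 260), (90, 200), (150, 220), (230, 120), (320, 80)]

def gesture_to_notes(path, height=400, f_lo=110.0, f_hi=880.0, px_per_sec=200):
    notes = []
    for (x0, y0), (x1, _y1) in zip(path, path[1:]):
        freq = f_lo + (f_hi - f_lo) * (1 - y0 / height)   # higher on screen = higher pitch
        dur = max(x1 - x0, 1) / px_per_sec                # wider spacing = longer note
        notes.append((round(freq, 1), round(dur, 3)))
    return notes

print(gesture_to_notes(path))
```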

“The lab’s intersection with humanities meant that, while developing, we think about the project through the discipline lens of the professors we are working with, as well as our intended audience,” says Huang. “As a computer science major, I really value these opportunities to put my skills into use, to work in small teams and collaborate to create an overarching project.”

Ryaan Ahmed, the DH Lab’s associate director and senior research engineer, connects all of the lab’s many moving pieces, from brainstorming with faculty to organizing dozens of UROP students. Projects like the Sonification Toolkit reveal the rich and unique range of the DH Lab’s programming. Over the course of the lab’s first five years, it has tackled emerging technologies for language learners, enhanced analyses of visual archives, simulations around topics affecting democratic development in Africa, and computational analysis of gendered language in novels, among other projects. The level of creativity and capability demanded by these projects speaks to the multidisciplinary strengths — and endless curiosity — of MIT undergraduates.

“There just aren’t that many people in the world who have this kind of really deep, interdisciplinary expertise,” says Ahmed. “The lab’s students have the level of excellence in engineering that you find at MIT, and they also have a deep seriousness and investment in the humanistic side of things. I hope we’re showing our students here that you don’t have to choose between those realms: You can actually integrate them.”

Artistic technologists

The blend of programming and arts was what initially drew first-year Grace Jau, majoring in computer science and engineering, to the DH Lab. Jau worked as a student researcher on the Sonification Toolkit last fall.

“As someone who loves making art and music,” says Jau, “I appreciate that working in the DH Lab gave me an opportunity to gain experience in my technical field while also connecting to my interests in the humanities.”

A love of computing and music also drew sophomore Emeka Echezona to the lab. An electrical engineering and computer science major, Echezona also plays trumpet in the MIT Festival Jazz Ensemble. Reflecting on the Sonification Toolkit, he says, “A lot of the work we put into designing and coding the instruments for the toolkit was only possible through knowledge of different fields. Even outside of the toolkit itself, we learned about musical concepts like dynamics, pitch, and timbre from the view of fields like physics.”

The imaginative connection to the arts and humanities through technology explored in the Sonification Toolkit has any number of further applications. UROP students brainstormed how to sonify paintings, maps, and photography. What might the Mona Lisa sound like? With recent MIT sonification projects reaching down to the level of proteins and particle energy, the limits of this art form appear to be only the limits of technology and imagination.

This breed of technology shows another aspect of MIT innovation: The Sonification Toolkit is about what can be dreamed up and accomplished with the perspective of the arts and humanities.  

“We’ll have a generation of technologists who are better equipped to think about those subtle, usefully complex areas that the humanities are great at addressing,” says Ahmed, “thinking about how the technologies that they’re working on affect people and bringing the nuances of the arts, social sciences, and humanities to their technical work.”
