Ludwig van Beethoven died in 1827, three years after completing his Ninth Symphony, which many consider his masterpiece. He had started work on a 10th Symphony, but his declining health prevented him from finishing it, and the project never progressed beyond sketches.
Since then, musicologists and Beethoven lovers have wondered – and lamented not knowing – what that symphony might have become. Now, thanks to the work of a team of music historians, musicologists, composers and computer engineers, Beethoven’s vision has come to life.
Responsible for the artificial intelligence (AI) side of the project, I led a scientific team from the AI start-up Playform that taught a machine Beethoven’s entire body of work and his creative process.
A full recording of the 10th Symphony was released on 9 October 2021, and the work was performed on stage for the first time in Bonn, Germany, crowning more than two years of effort.
Past attempts hit a wall
Around 1817, the Royal Philharmonic Society in London commissioned Beethoven to write his Ninth and 10th Symphonies. Written for an orchestra, symphonies typically have four movements: the first played at a fast tempo, the second slower, the third at a moderate or fast tempo, and the last fast again.
In 1824, Beethoven completed his Ninth Symphony, which concludes with the timeless “Ode to Joy”.
When it came to the 10th Symphony, however, the composer left little behind, other than some musical notes and a handful of ideas he had jotted down.
There have been some past attempts to reconstruct parts of Beethoven’s 10th Symphony. Most famously, in 1988, the musicologist Barry Cooper ventured to complete the first and second movements. He wove together 250 bars of music from the sketches to create what was, in his view, a rendition of the first movement that was faithful to Beethoven’s vision.
Yet the sparseness of Beethoven’s sketches made it impossible for symphony experts to go beyond that first movement.
Assembling the team
In early 2019, I was contacted by Dr Matthias Roeder, the director of the Karajan Institute, an organisation in Salzburg, Austria, that aims to promote the links between music and technology.
He explained that he was putting together a team to complete Beethoven’s 10th Symphony in celebration of the 250th anniversary of the composer’s birth. Aware of my work on AI-generated art, he wanted to know if AI could help fill in the blanks left by Beethoven.
The challenge seemed daunting. To pull it off, AI would need to do something it had never done before. But I replied that I was ready to give it a try.
Roeder then formed a team that included the Austrian composer Walter Werzowa. Famous for writing Intel’s signature jingle, Werzowa was tasked with putting together a new kind of composition that would integrate what Beethoven left behind with what the AI would generate.
Mark Gotham, a computational music expert, led the effort to transcribe Beethoven’s sketches and process his entire body of work so the AI could be properly trained.
The team also included Robert Levin, a musicologist at Harvard University who also happens to be an incredible pianist. Levin had previously finished a number of incomplete 18th-century works by Mozart and Johann Sebastian Bach.
The project takes shape
In June 2019, the group gathered for a two-day workshop at Harvard’s music library. In a large room with a piano, a blackboard and a stack of Beethoven’s sketchbooks spanning most of his known works, we talked about how fragments could be turned into a complete piece of music and how AI could help solve this puzzle, while still remaining faithful to Beethoven’s process and vision.
The music experts in the room were eager to learn more about the sort of music AI had created in the past. I told them how AI had successfully generated music in the style of Bach. However, that system only harmonised a melody it was given so that the result sounded like Bach. It didn’t come close to what we needed to do: construct an entire symphony from a handful of phrases.
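To make clear how narrow that earlier task was, here is a deliberately simple, rule-based sketch of “harmonise a given melody” in Python. It is not the Bach-style AI mentioned above, and certainly not the system we went on to build; the C-major triad table and the first-match rule are assumptions chosen purely for illustration.

```python
# A toy, rule-based harmoniser in C major: for each melody note, pick a
# diatonic triad that contains it. This is only meant to show how narrow
# the "harmonise a given melody" task is compared with composing new music.

# Diatonic triads in C major, keyed by their root.
TRIADS = {
    "C": {"C", "E", "G"},
    "D": {"D", "F", "A"},
    "E": {"E", "G", "B"},
    "F": {"F", "A", "C"},
    "G": {"G", "B", "D"},
    "A": {"A", "C", "E"},
    "B": {"B", "D", "F"},
}

def harmonise(melody):
    """Return one chord root per melody note: the first triad containing it."""
    chords = []
    for pitch in melody:
        candidates = [root for root, notes in TRIADS.items() if pitch in notes]
        chords.append(candidates[0] if candidates else None)
    return chords

if __name__ == "__main__":
    melody = ["E", "D", "C", "D", "E", "E", "E"]  # a simple given tune
    print(list(zip(melody, harmonise(melody))))
```

The point of the toy is the shape of the problem: the melody is already supplied, and the program merely decorates it. Our task ran in the opposite direction, generating the melodic material itself.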
Meanwhile, the scientists in the room – myself included – wanted to learn about what sort of materials were available, and how the experts envisioned using them to complete the symphony.
The task at hand eventually crystallised. We would need to use notes and completed compositions from Beethoven’s entire body of work – along with the available sketches from the 10th Symphony – to create something that Beethoven himself might have written.
This was a tremendous challenge. We didn’t have a machine that we could feed sketches to, push a button and have it spit out a symphony. Most AI available at the time couldn’t continue an uncompleted piece of music beyond a few additional seconds.
We would need to push the boundaries of what creative AI could do by teaching the machine Beethoven’s creative process – how he would take a few bars of music and painstakingly develop them into stirring symphonies, quartets and sonatas.
Piecing together Beethoven’s creative process
As the project progressed, the human side and the machine side of the collaboration evolved. Werzowa, Gotham, Levin, and Roeder deciphered and transcribed the sketches from the 10th Symphony, trying to understand Beethoven’s intentions. Using his completed symphonies as a template, they attempted to piece together the puzzle of where the fragments of sketches should go – which movement, which part of the movement.
They had to make decisions, such as determining whether a sketch indicated the starting point of a scherzo, which is a very lively part of the symphony, typically in the third movement. Or they might determine that a line of music was likely the basis of a fugue, which is a melody created by interweaving parts that all echo a central theme.
The AI side of the project – my side – found itself grappling with a range of challenging tasks.
First, and most fundamentally, we needed to figure out how to take a short phrase, or even just a motif, and use it to develop a longer, more complicated musical structure, just as Beethoven would have done. For example, the machine had to learn how Beethoven constructed the Fifth Symphony out of a basic four-note motif (a toy sketch of this kind of continuation appears after this list of tasks).
Next, because the continuation of a phrase also needs to follow a certain musical form, whether it’s a scherzo, trio or fugue, the AI needed to learn Beethoven’s process for developing these forms.
The to-do list grew: we had to teach the AI how to take a melodic line and harmonise it. The AI needed to learn how to bridge two sections of music together. And we realised the AI had to be able to compose a coda, which is a segment that brings a section of a piece of music to its conclusion.
Finally, once we had a full composition, the AI was going to have to figure out how to orchestrate it, which involves assigning different instruments for different parts.
And it had to pull off these tasks in the way Beethoven might have done.
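As a rough illustration of the first item on that list, the sketch below extends a four-note motif using a first-order Markov chain learned from a tiny, made-up corpus of pitch sequences. It is emphatically not the system we built, which worked with fully notated scores, musical form and orchestration; the corpus, the pitch spelling and the sampling rule here are assumptions chosen only to keep the example self-contained.

```python
import random

# A deliberately tiny sketch of "continue a motif in a learned style".
# It learns which pitch tends to follow which in a small corpus
# (a first-order Markov chain) and then extends a four-note seed.

CORPUS = [
    # Stand-in "training" melodies given as pitch names (invented here,
    # not actual Beethoven transcriptions).
    ["G", "G", "G", "Eb", "F", "F", "F", "D"],
    ["G", "Ab", "G", "F", "Eb", "D", "C"],
    ["Eb", "F", "G", "Ab", "G", "F", "Eb", "D", "Eb"],
]

def train(corpus):
    """Count, for each pitch, which pitches follow it in the corpus."""
    table = {}
    for melody in corpus:
        for a, b in zip(melody, melody[1:]):
            table.setdefault(a, []).append(b)
    return table

def continue_motif(motif, table, length=12, seed=0):
    """Extend a short motif by sampling the learned transitions."""
    rng = random.Random(seed)
    result = list(motif)
    while len(result) < length:
        followers = table.get(result[-1])
        if not followers:  # dead end: nothing ever followed this pitch
            break
        result.append(rng.choice(followers))
    return result

if __name__ == "__main__":
    transitions = train(CORPUS)
    motif = ["G", "G", "G", "Eb"]  # the famous four-note shape
    print(continue_motif(motif, transitions))
```

Even this toy shows why the list above was so demanding: a pitch-by-pitch continuation says nothing about form, harmony, bridges, codas or instrumentation, all of which the real system had to handle in a way that was recognisably Beethoven’s.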
Passing the first big test
In November 2019, the team met in person again – this time at the Beethoven House Museum, in Bonn, where the composer was born and raised.
This meeting was the litmus test for determining whether AI could complete this project. We printed musical scores that had been developed by the AI, building on the sketches from Beethoven’s 10th, and a pianist performed them in a small concert hall in the museum before a group of journalists, music scholars and Beethoven experts.
We challenged the audience to determine where Beethoven’s phrases ended and where the AI extrapolation began. They couldn’t.
A few days later, one of these AI-generated scores was played by a string quartet at a news conference. Only those who intimately knew Beethoven’s sketches for the 10th Symphony could determine when the AI-generated parts came in.
The success of these tests told us we were on the right track. But these were just a couple of minutes of music. There was still much more work to do.
Ready for the world
At every point, Beethoven’s genius loomed, challenging us to do better. As the project evolved, so did the AI. Over the ensuing 18 months, we constructed and orchestrated two entire movements of more than 20 minutes apiece.
We anticipate some pushback to this work – those who will say that the arts should be off-limits to AI, and that AI has no business trying to replicate the human creative process. Yet when it comes to the arts, I see AI not as a replacement, but as a tool – one that opens doors for artists to express themselves in new ways.
This project would not have been possible without the expertise of human historians and musicians. It took an immense amount of work – and, yes, creative thinking – to accomplish this goal.
At one point, one of the music experts on the team said that the AI reminded him of an eager music student who practises every day, learns, and becomes better and better.
Now that student, having taken the baton from Beethoven, has presented the 10th Symphony to the world.
Ahmed Elgammal is a professor and the director of the Art & AI Lab at Rutgers University. This article first appeared on The Conversation.