Microsoft Corporation

05/15/2019 | News release

Ever-changing music shaped by skies above NYC hotel

Barwick composed five movements within an overall soundscape that reflect the constantly changing nature of the sky throughout the day, each with its own background of bass, synthesizer and vocal lines that weave in and out. For each 'event' identified by Microsoft AI, she then created six synthesized and six vocal sounds per movement for the generative audio program to choose from - twelve options per event in each movement, or 60 different musical options a day for every time an airplane passes overhead. The sounds are an expression of Barwick's emotions in response to each stimulus.
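The selection logic described above, where a detected sky event triggers a random pick from a movement-specific pool of sounds, might be sketched roughly as follows. This is an illustrative sketch only: the event names, sound labels, and function names are assumptions, not details from the actual installation.

```python
import random

# Hypothetical sketch of the generative audio program described above:
# five movements, and for each AI-detected sky event, a pool of six
# synthesized and six vocal sounds per movement (12 per event, 60 a day).
MOVEMENTS = 5
SOUNDS_PER_EVENT = {"synth": 6, "vocal": 6}

# Event names here are illustrative placeholders.
EVENTS = ["airplane", "cloud", "moon"]

# Sound library: movement index -> event -> list of sound IDs.
library = {
    m: {
        event: [f"m{m}_{event}_{kind}{i}"
                for kind, n in SOUNDS_PER_EVENT.items()
                for i in range(n)]
        for event in EVENTS
    }
    for m in range(MOVEMENTS)
}

def trigger(event: str, movement: int) -> str:
    """When the AI detects a sky event, choose one sound from the
    current movement's pool for that event."""
    return random.choice(library[movement][event])

# Each event has 12 candidate sounds per movement, 60 across the day.
options_per_day = MOVEMENTS * sum(SOUNDS_PER_EVENT.values())
print(options_per_day)  # 60
```

In this sketch the randomness stands in for whatever selection strategy the real program uses; the key point is that the AI supplies the event detections while the composer supplies the pool of sounds.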

'I didn't want it to be too literal,' she says. 'I could have made it sound 'raindroppy,' but it's more about the attitude of the event. An airplane is a lot different than the moon, so it has more of a metallic sound than a warm sun sound or a quiet 'moony' kind of feeling. I wanted people who listen to it to be curious and wonder what that sound meant, what's going across the sky right now.'

Barwick has never been afraid of technology, even if she didn't have access to it. She recorded her first album in 2007 using a guitar pedal to form vocal loops on a cassette tape. 'I didn't even have a computer then,' she remembers. 'I took my bag of tapes in somewhere to get mastered to produce the CD.'

Now she relies on technology to compose, record and perform her multilayered, ambient music. She uses effects on everything, including her voice. There's no such thing as an unplugged Julianna Barwick set. Still, she says, 'Before I was approached to do this project, the only thing I knew about artificial intelligence was from the movies. I'd never seen an application of it in my daily life.'

So as she began exploring sounds, Barwick grappled not only with what AI was and could do, but also with how her role compared with the program's. Who was the actual composer - she or the program? Was AI a partner or a tool?

'I contemplated how the project would play out in my absence and realized that I can make all the sounds, but I'm not going to be there to detect all the events - you have to rely on the AI to do that,' Barwick says. 'And that's such an important part of the score; it's almost like it's a 50-50 deal. And that's what makes this project interesting. It almost brings in another collaborator, and the possibilities are endless. It's opened up a new world of thinking and approaching future compositions and scores.'