Chris Chafe

Chris Chafe is a composer, improviser, and cellist, developing much of his music alongside computer-based research. He is Director of Stanford University’s Center for Computer Research in Music and Acoustics (CCRMA). In 2019, he was International Visiting Research Scholar at the Peter Wall Institute for Advanced Studies at The University of British Columbia, Visiting Professor at the Politecnico di Torino, and Edgard-Varèse Guest Professor at the Technical University of Berlin. At IRCAM (Paris) and The Banff Centre (Alberta), he has pursued methods for digital synthesis, music performance, and real-time internet collaboration. An active performer, whether on the net or physically present, his music reaches audiences in sometimes novel venues. Chafe’s works include gallery and museum music installations, now in their second decade, with “musifications” resulting from collaborations with artists, scientists, and MDs.

Data sonification allows listeners to perceive data in a new, more tangible way.

We caught up with Chafe recently to speak about his work in composition and computer research in music and acoustics.

ClimateMusic: When did you start engaging with climate topics and why did you start including them in your music?

CC: I began engaging with them in Kyoto, August, 2002 in reaction to the Kyoto Protocol and the United States’ role in it. “More than 160 countries have signed on, including more than 30 industrialized countries. The United States — which produces about one-quarter of the world’s greenhouse gases — initially signed the agreement, but later rejected it” (NPR 2007).

See Chris’ project Carbon Path, which uses patterns in levels of carbon dioxide measured within the performance chamber to create music. This piece premiered in August 2002.

ClimateMusic: Can you tell us about your recent project “1,200 Years of Earth’s Climate, Transformed into Sound”?

CC: Read this excerpt from KQED Article “LISTEN: 1,200 Years of Earth’s Climate, Transformed into Sound”:

“‘In all climate data you see it in a long chart with time that is way longer than human life time so it’s impossible to experience,’ says Gordon [project data compiler]. ‘But when you sonify it you actually experience time in a way that you can’t experience when you look at the chart.’

‘As you hear in the piece that Chris has composed there’s really not a lot happening for a really long time and it’s kind of soothing,’ says Pennington [project data compiler]. ‘We have a normal state of the world, and life has evolved relative to that normal state of the world.’

Starting in the 1700s, however, you begin to hear a change. The Industrial Revolution and widespread deforestation in Europe take hold. Carbon concentrations begin to creep up. Approaching the 1900s, the tone becomes a higher-pitched wail. The last few seconds of the piece sound like an alarm, the result of a meteoric rise in CO2 concentrations.”
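The effect Pennington and Gordon describe, a long stable baseline followed by a rising wail, is what a direct pitch mapping of CO2 data produces. As an illustration only (this is not the actual method or data used in the piece; the ppm values, frequency range, and function names are assumptions), one could map concentration to pitch like so:

```python
def co2_to_freq(ppm, ppm_lo=270.0, ppm_hi=420.0, f_lo=220.0, f_hi=1760.0):
    """Map a CO2 concentration (ppm) to a frequency (Hz).

    Interpolates in log-frequency space, so equal ppm steps sound
    like equal musical intervals. Out-of-range values are clamped.
    """
    t = (ppm - ppm_lo) / (ppm_hi - ppm_lo)
    t = min(max(t, 0.0), 1.0)
    return f_lo * (f_hi / f_lo) ** t

# Approximate historical CO2 levels, for illustration:
for year, ppm in [(800, 279), (1700, 277), (1900, 296), (2020, 414)]:
    print(year, round(co2_to_freq(ppm), 1), "Hz")
```

Because the mapping is logarithmic in frequency, the pre-industrial centuries hover near the same pitch, while the post-1900 rise sweeps up several octaves, the “alarm” effect described above.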

ClimateMusic: What inspired you to begin this project, and how did it evolve over time?

CC: I was initially referred to the UCB team by my daughter, Zoe, who at the time was a grad student in climate and public health. The team forwarded some data sets and we iterated on the project together. KQED was helping produce the final version for broadcast and web.

ClimateMusic: How can sonification of climate change data, like in your 1,200 Years of Earth’s Climate project, help listeners to better understand the urgency of the climate crisis? How can it drive them to act on these issues?

CC: It’s another means of science communication, if we can somehow hear the wake-up calls that are all around us.

ClimateMusic: What would you say to someone who is critical of the use of data when composing music? What does balancing data sonification with your creative freedom as a composer look like to you?

CC: Extra-musical ideas are material for all kinds of music, even love songs. “Pure” music, “program music,” and now data-driven music are all music. How do we know if it’s music? My colleague, Ge Wang, says, “we’ll know it when we hear it.” I’ve been lucky to be free to experiment with all kinds of composition over the decades. When incorporating data started to become tractable, it was a natural extension of what I’d been doing already: generating musical characters and behavior from sensors, from numbers, and from algorithms.

Read Chafe’s interviews with Stanford News and Aeon Magazine to learn more about how he translates data into music.

ClimateMusic: Can you tell us a bit more about the work you’ve been doing in the past year with real-time online musical collaboration software?

CC: JackTrip is a multi-machine technology that supports bi-directional flows of uncompressed audio over the internet at the lowest possible latency. Developed in the early 2000s, it was used in intercontinental telematic music concerts and a variety of musical experiments using high-speed research networks as the audio medium. Its ability to carry hundreds of channels simultaneously and its lightweight architecture led to a range of applications, from IT for concert halls to small embedded systems.

The pandemic has ushered in a new phase of development driven by musicians seeking solutions during lockdown. Major improvements have focused on ease of use and the ability to scale across worldwide cloud infrastructure. With orchestral-sized ensembles urgently in need of ways to rehearse on the network and most participants running their systems over commodity connections, this “new reality” runs counter to what’s required for ultra-low-latency rhythmic synchronization. Many developers and musical practitioners have joined in the cause of finding adequate solutions.
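The tension between commodity connections and rhythmic synchronization comes down to a latency budget: tight ensemble playing is commonly said to degrade beyond roughly 25–30 ms of one-way delay, and signals in fiber travel at only about two-thirds the speed of light, so distance alone consumes much of that budget before any buffering or processing is added. A back-of-the-envelope sketch (the route distances and the 30 ms threshold are illustrative assumptions, not JackTrip specifications):

```python
# Rough one-way propagation floor for networked ensemble playing.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s in vacuum
FIBER_FACTOR = 2 / 3            # approximate propagation speed in fiber

def propagation_ms(distance_km):
    """Minimum one-way propagation delay in milliseconds over fiber."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000

# Illustrative route lengths (fiber paths are longer than great circles):
for route, km in [("same city", 50),
                  ("SF to NYC", 4_700),
                  ("SF to Berlin", 9_100)]:
    print(f"{route}: at least {propagation_ms(km):.1f} ms one-way")
```

Even before adding audio buffering, operating-system jitter, and last-mile delays, a coast-to-coast path uses most of the budget, which is why ultra-low-latency systems fight for every millisecond elsewhere in the chain.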

JackTrip, which has generally run as a native software application, is now complemented by dedicated solutions including numerous Raspberry Pi-based systems, standalone physical web devices, and browser-based WebRTC and Pure Data versions. The recently established JackTrip Foundation is a non-profit clearinghouse for open-source development, training, and support of partners and affiliates providing their own roll-outs of the technology.

ClimateMusic: What sort of impact do you see it having in the future?

CC: My crystal ball says we’ll be interested in remote rehearsals, remote audiences for live events, and distance teaching.

ClimateMusic: What advice would you give to someone who wants to start getting involved in fighting climate change? 

CC: Start locally, tame the transportation addiction, restore open space and give room for trees to return in important numbers (trillion trees).

ClimateMusic: Any advice for musicians and scientists specifically?

CC: First off, stop flying all over the place for festivals and conferences. Use remote technologies where possible. We had more than a million people in the air at any instant before the lockdowns. Support groups that are involved with the “wake-up calls” and with mitigating the causes and effects.

ClimateMusic: Where can we learn more about the work that you’re doing?

CC: My site is https://chrischafe.net/

ClimateMusic: Thank you for speaking with us!