
Part 1

Name: Chris Korda
Occupation: Transgendered suicide cult leader, electronic music composer, digital artist, free software developer
Nationality: American
Current Release: Apologize To The Future on Perlon
Recommendations: My main inspiration for polymeter and phase art is Thomas Wilfred. Look him up on Wikipedia. I saw his work at the Museum of Modern Art in NYC when I was a child. Some of his “Lumia” machines permute for years without repeating. For harmony, I recommend “Drifting Petals” by Ralph Towner and Gary Burton, as well as Ralph Towner’s 1979 “Solo Concert.”

If you enjoyed this interview with Chris Korda, find out more about Chris's work and music on the following pages: Personal website, Twitter, Instagram.


 

When did you start writing/producing music - and what or who were your early passions and influences? What is it about music and/or sound that drew you to it – especially compared to the other activities you've been engaged with?

I started making music as a child by tapping on household objects. I displayed an unusual aptitude for rhythm, but unfortunately it was misdiagnosed as twitchiness. I also improvised on the piano every chance I got, but this was similarly discouraged. I dazzled my schoolmates by using my mouth to accurately imitate rock drumming, beat-boxing long before I knew the term. I also started building my own instruments: for example, I taped a microphone to the end of a wooden recorder, plugged it into a radio, and played wild Hendrix-inspired solos.

Odd time was the height of musical fashion in the late 1960s and early 1970s, and I developed a lifelong fascination with it as a result. My strongest influence was the band Yes, and I still listen to their album “Relayer” regularly and use it as an example of peak complexity in popular music. The rock opera “Jesus Christ Superstar” was another major influence. Many years later, odd-time influences such as these predisposed me to discover complex polymeter and use it in my composing.

At the age of twelve I acquired a toy organ, and a cheap acoustic guitar shortly thereafter. I started studying jazz guitar in 1979, practicing countless hours every day, and had a series of excellent teachers, including tenor saxophonist Jerry Bergonzi. I studied composition at Sarah Lawrence, and attended a summer session at Berklee College of Music. The latter was a decisive influence, for two reasons: I learned to sight-read jazz charts, and my roommate introduced me to a group of artists who vastly expanded my musical taste: Pat Metheny, John Abercrombie and Ralph Towner.

For most artists, originality is preceded by a phase of learning and, often, emulating others. What was this like for you? How would you describe your own development as an artist and the transition towards your own voice? What is the relationship between copying, learning and your own creativity?

My dream since early adolescence was to play guitar in a band, and after some false starts I eventually fulfilled my ambition by playing in a few Boston-area jazz and rock bands. I even spent a summer busking, playing jazz standards on street corners, but ultimately I found the technical aspects of guitar intensely frustrating and limiting.

I was a huge fan of John Abercrombie, attended many of his shows, and consciously imitated his style, for example by transcribing his solos, but it made me increasingly unhappy. Eventually a friend persuaded me that my strategy was mistaken, and that I needed to escape from Abercrombie’s shadow in order to find my own creative path. So in 1991, I quit the guitar, moved to Provincetown, and started a new life as a female impersonator. This drastic transition gave me the inspiration and courage to reinvent myself, first as founder of the Church of Euthanasia, and then as an electronic musician.

What were your main compositional and production challenges in the beginning, and how have they changed over time, especially after a long break from producing music?

I started producing in 1993 using the MS-DOS version of Cakewalk, which by chance happened to allow each track to have its own independent loop length. Due to this happy accident, I immediately discovered and fell in love with polymeter and phasing. Oscillators having different frequencies will drift in and out of sync, and this is called phasing. Polymeter is quantized phasing, wherein the drift occurs in discrete steps. I soon began composing in complex polymeter, which I define as the simultaneous use of three or more prime meters. For example “Buy”—the opening track of “Six Billion Humans Can’t Be Wrong”—is in 3, 4, 5, 7, 11, 13, 23, and 31, all at once. I didn’t discover Steve Reich’s work until decades later.
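To make the idea of quantized phasing concrete, here is a minimal Python sketch, not code from Korda's software: each track loops at its own length in steps, and loops of different prime lengths only realign after the least common multiple of all the lengths. The loop lengths below echo a few of the primes mentioned for “Buy”; the patterns themselves are made up for illustration.

```python
# A minimal sketch of quantized phasing (polymeter); illustrative only, not
# code from the Polymeter sequencer. Each track is a loop with its own length
# in steps, and every track reads position (global_step % loop_length), so
# loops of different prime lengths drift against each other in discrete steps.
from math import lcm

loop_lengths = [3, 4, 5, 7]  # hypothetical subset of the primes used in "Buy"

# One hit at the start of each loop, rests elsewhere.
patterns = {n: [1] + [0] * (n - 1) for n in loop_lengths}

print("loops realign every", lcm(*loop_lengths), "steps")  # 420 for 3, 4, 5, 7

# Show the first 24 global steps so the phase drift is visible.
for n in loop_lengths:
    row = "".join("x" if patterns[n][step % n] else "." for step in range(24))
    print(f"length {n}: {row}")
```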

My immediate problem was that in Cakewalk you were either editing or listening but not both. I wanted to escape from this dichotomy, and improvise my arrangement in real time. I frequently performed live sound-collage, and was influenced by that style’s free-flowing aesthetic.

My goal was to live-arrange my polymeter loops, and have the arrangement recorded, not as audio, nor as MIDI, but as mute automation. In other words, I wanted to record when each track was muted or unmuted. The advantage of this is that the mute events can be edited afterwards—for example to fine-tune the transitions—without disturbing the underlying polymeter loops. The concept is analogous to a stencil. Unmuting tracks cuts holes in the stencil, and the underlying tracks show through the holes, with their phase relationships always preserved.
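As a rough illustration of that stencil idea, here is a hedged sketch rather than the Polymeter sequencer's actual format: mute automation can be stored as a list of (step, track, muted) events that is edited freely without touching the loops underneath.

```python
# A minimal sketch of mute automation as an event list; the track names and
# event format are invented for illustration and are not Korda's actual data.
mute_events = [
    (0,  "kick", False),   # unmute kick at step 0
    (16, "hats", False),   # unmute hats at step 16
    (32, "bass", False),   # unmute bass at step 32
    (48, "hats", True),    # mute hats again at step 48
]

def muted_at(track, step, events):
    """Return the mute state of a track at a given step (muted by default).

    Because the arrangement is stored as mute events rather than rendered
    audio or MIDI, editing these events later only reshapes the "stencil";
    the polymeter loops underneath keep their phase relationships intact.
    """
    state = True  # every track starts muted
    for when, name, muted in sorted(events):
        if name == track and when <= step:
            state = muted
    return state

for step in (0, 20, 40, 60):
    audible = [t for t in ("kick", "hats", "bass") if not muted_at(t, step, mute_events)]
    print(f"step {step:>2}: audible tracks = {audible}")
```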

I started by hacking Cakewalk, but this proved too limiting, so I developed a live-arranging program of my own—partially modeled on a lighting controller—which eventually grew into a full-fledged polymeter MIDI workstation. It consisted of three separate programs: one for polymeter composing, one for live-arranging and recording the arrangement as mute events, and still another for fine-tuning the resulting arrangement.

It was all written from scratch in C and assembler language—in those days you had to write your own device drivers—and it was clumsy and hard to use by today’s standards. Originally I drove music hardware, but after Reason came out I simplified my rig to two laptops, one running my sequencer and the other running Reason, connected by a hardware MIDI interface.

Decades later, my sequencer has evolved from these humble roots into powerful integrated composing software with many features that aren’t found in commercial DAWs.

What was your first studio like? How and for what reasons has your set-up evolved over the years and what are currently some of the most important pieces of gear for you?

When I started producing electronic music, racks full of hardware—synths, drum machines, effects, mixers and so forth—were still a necessity, but I ditched all that stuff as soon as it became practical to do so. It’s fashionable to be obsessed with hardware, but it reminds me of collecting antique cars. I like unlimited undo.

My studio currently consists of a Windows laptop running my custom composing software (called Polymeter), along with Propellerhead Reason connected to Polymeter via a virtual MIDI loopback cable. I use Reason only to translate Polymeter’s MIDI output into audio. I also have a flat-screen monitor, a digital-to-analog converter, a pair of powered speakers, and headphones for working at night.
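As a rough sketch of that kind of routing, not the actual Polymeter-to-Reason setup: the mido Python library can send MIDI notes through a virtual loopback port, with whatever synth is listening on the other end turning them into audio. The port name and the loopback driver are assumptions here.

```python
# A minimal sketch of sending MIDI through a virtual loopback port with the
# mido library; this stands in for the Polymeter -> Reason routing and is not
# Korda's actual setup. Assumes a loopback driver (e.g. loopMIDI) is installed.
import time
import mido

ports = mido.get_output_names()
if not ports:
    raise SystemExit("No MIDI output ports found; install a virtual loopback port first.")

with mido.open_output(ports[0]) as port:  # use the first available port
    port.send(mido.Message('note_on', note=60, velocity=100))  # middle C on
    time.sleep(0.5)
    port.send(mido.Message('note_off', note=60))               # middle C off
```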


 
Next page: Part 2