
Name: Kid Arrow / Markus Reuter
Occupation: Producer, composer, guitarist, educator (Markus Reuter), process-based music creator (Kid Arrow)
Nationality: Japanese (Kid Arrow), German (Markus Reuter)
Recent release: The latest Kid Arrow album The Day I Met You is out via Bandcamp. It is the 23rd collaboration between Markus Reuter and Kid Arrow, a process based on transformations of a single MIDI file.

If you enjoyed these thoughts by Markus Reuter and Kid Arrow and would like to find out more about their work, visit Markus's official homepage.

To keep reading, we also recommend our earlier Markus Reuter interview.

This interview is part of an ongoing series of conversations with Kid Arrow about the role of human composers in music. Read Part 1. Read Part 2.



You've criticised the word “expression” when it comes to art. But even if we replace “expression” with “experience”, it is hard to say how an AI could meet the criteria for that term. What role does experience play in your own work? How is the music you're making today different from the music you made when you were 15?

The quality of our decisions is governed by the quality of intention and attention. And, at least for me, I can say both have grown over time and have become better - for lack of a better word - in me. So I'm better at awarding attention to things, and the things I do are more intentional.

The tools you're using, ultimately, don't play much of a role. Every writer can use the same software to write a book and the same keyboard to type. But it's not about that. It's about how you use it. And then you could say: “Ah, but everybody's using the same alphabet.” Which is true, but although we're using the same alphabet, we don't have to write the same words, and we don't have to read them in the same order. There's so much freedom and there are so many possibilities and opportunities to create something. So it is with music.

I can jump to some other conclusion here as well. In the end, you could say that even attention and intention don't actually make any difference. Because the person who's reading the book, or listening to a piece of music has the same importance when it comes to the decoding, right?

Let's take the example I already gave in another interview: A few 15-year-olds get together and make dark rock music – let's say in the vein of The Cure's early phase. And their sole intention was simply to look attractive to their potential partners. But then people start listening to it and it becomes a musical phenomenon. Suddenly, it's not about that original intention – which had nothing to do with the actual music – anymore at all. And something really beautiful and big grew out of that initial, simple motivation.

What I'm saying is that you cannot really judge any art just by looking at the intention behind its creation.

I know one very famous jazz musician who no longer enjoys giving interviews. He feels as though the questions always suggest that there was a big, spiritual, creative masterplan behind his path. While in reality, he was a professional musician and many decisions were taken just for commercial reasons. Very simple, mundane reasons.

As human beings, we tend to assume similar mental processes in other people. We assume that our way of thinking is the only way of thinking. We can't imagine what lies beyond what we can imagine, which is trivial and totally human.

If you think about it this way, AI tools give you a way to interact with materials where you don't really know what the underlying process is. It's cool because it's not a human process. The black box is one step further away than the black box that is another human being.

As we said, it's already impossible to understand how another human being operates. And an AI takes the input that many, many human beings have generated, analyses it, and creates something that is many magnitudes removed from that. It's potentially very inspiring.

It expands the realm of the imaginable.

Ultimately, yes. Very much so.

And yet, even artists who are actively using artificial intelligence systems will often tell me that machines do not have any real experiences and so they cannot make decent music.

I'm reminded once again of the great book The Manual by The KLF. In a sense, it's exploring the very thing we're talking about right now: The idea that to produce a hit song, you don't enter the studio with an idea, but with a person who is only interested in fiddling with knobs and trying out things with a new piece of gear.

Whatever comes out of that process is the material for the song and the song basically writes itself, because people just want to fiddle about with things. It's not about intention, it's not about anything, really.

Likewise, we could take advantage of an AI to give us material for our song or for our music. It's been done forever. There's absolutely nothing new about it. It's just that we can create ideas and materials faster than ever before.

Is it actually correct to say that AI does not have experience? It does seem as though it at least has a translation of select experiences from its creators. And it collects experiences, of whatever kind, through a continuous learning process.

Exactly, it has amassed experience not as a sensation, but as an output. But that is something very different.

We could make the analogy that if Kid Arrow were a character in a story, he wouldn't physically exist, either, and he wouldn't “have” any experiences. And yet, we assume him to be human and to have all these human traits nonetheless.

That's what I love about human imagination - that we can nest these imaginary worlds. To have the capacity to identify ourselves with a character from a book, a fictional character. I'm sure this could somehow be represented by a machine, but only as a result. There is still a difference between learning and experiencing.

I think the question is what happens if you manage to integrate sensors into an AI system, and you make it susceptible to sensory interaction. So it could “hear”, it could “smell”, it could “touch”. How long would it take for that being to actually become sentient? Once you interface it, that's where things get interesting.

You spoke about the “remainder of humanity”. That human part is getting smaller, then, isn't it?

The remainder of humanity is: Do I like it or not? I don't like the word “like”, but it's about what I prefer. What do I vote for? Would I rather have a Beatles song or a Rolling Stones song? That has always been the question. And that's what's going to remain until the end. It's your particular choice of library. What music do you carry with you on your phone?

Being human means believing you have a choice. Believing that you're making those choices. It has nothing to do with performance. At least, I don't think it does. It has nothing to do with how the material was generated.

Again, let's return to the example of the band that wrote that Cure-like song. They just wanted to hang out and snort cocaine. But the music that came out was great. Let's now say that song was created by an AI. Would anybody care?

I think the current majority opinion is simply that an album like Disintegration could not have been made by an AI.

I disagree. An album like Disintegration, especially, could have been made by an AI. It's so particular in its set of rules. You can describe that music in relatively simple terms, and you could easily have somebody create more music by those rules, and it would come very close to the original.

I think it's the more chaotic, less defined, less artsy, less recognisable music that is harder to create for a machine. Take someone like Prokofiev. He was a really out-there composer and then became neoclassical at some point, for reasons that I think we all know. This is where things get more complex if you want to put a defining stamp on him as a brand.

The simpler the recipe, the easier it is to read and to use to create similar dishes.

Ironically, the simpler the recipe, probably, the deeper the emotion expressed through the music. Which would make the threshold of creating meaning lower for a composing AI.

Exactly. I am saying that as someone who really likes Disintegration and the lyrics of Robert Smith. They are absolutely genius. They are genius in their simplicity.

I think the real test would be to have an AI create a song in the vein of Disintegration and then have The Cure perform it. To me, even if Robert didn't write the music and the words, it would still be him.

So you still see a value in the act of physical music making?

I do. There's a physicality of human bodies being in the same room. Air being moved, the insides of somebody's stomach being moved … all these things. The communal aspect is really, really important. There's this wonderful little sad thing that Mike Oldfield says in a video: “A concert is like a celebration of life. And it shows that some of us are happy to be here and alive.” Some of us! It's pretty amazing.

We experience being alive in a performance situation. It's like having an oral exam at school. You feel your nerves.