Pop culture has programmed humanity to fear an impending uprising of robots that will wipe out civilisation. But Charlotte Kemp-Muhl, one of the directors of Finis Musicae, sees robots as another means to push music forward alongside humanity.

“New genres are born on the back of new technology.
Rock and roll from electric guitars. Hip-hop from samplers. Laptops were the last [enabler of] new genres of music, but there hasn’t really been anything new since then,” Kemp-Muhl says.
Finis Musicae, a futurist art and transhuman tech collective, is applying robotics as the next step in the evolution of music performance. Hundreds of thousands of people saw their technology in action at Anyma’s Sphere residency. Frederik Gran, Robotics Director at Finis Musicae, programmed four robotic arms, two for each cello, to play in concert with Anyma.
“Instead of being a replacement that people fear, there’s a big opportunity for robotics to be an extension and augmentation,” says Sage Morei, the third director of Finis Musicae alongside Gran and Kemp-Muhl. “Somebody who has never played an instrument before, and maybe physically cannot, we can enable them to do it for the first time.”

Finis Musicae has been active for two years, and all three members have longstanding musical careers.
But Gran has been working in the robotics space since 2009. Another example of his work besides the robot cello is Spiegelreigen, a project in which a robot arm moves a microphone around a circle of speakers to create music with feedback frequencies.

“Frederik’s research with the cello has been a guiding light into this uncharted territory.
We refer to him as our cyber Gandalf,” says Kemp-Muhl.

Building on Gran’s research, they have designed robotic apparatuses such as synthetic embouchures for brass and prosthetic hands playing synths. Currently, they program these robots themselves using processes like electromyography (EMG), which measures muscle movements, and electroencephalography (EEG), which records brainwaves and translates them into electrical signals.
But they are also developing AI systems so the robots can improvise on their own.

“We have this grand vision of playing at a superhuman level. But first reaching the human level is a task in itself,” Morei says.
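To make the EMG idea above concrete: a muscle signal is typically rectified and smoothed into an "activation envelope" before it can drive anything. The sketch below is purely illustrative (the window length, amplitude scale, and 0-180 degree servo range are assumptions, not Finis Musicae's actual software):

```python
# Minimal illustrative sketch of EMG-style control, not the collective's
# real pipeline: square the signal (which also rectifies it), take a
# moving-RMS envelope, and map the result to a hypothetical servo angle.
import numpy as np

def emg_envelope(signal, window=50):
    """Moving-RMS envelope, a common proxy for muscle activation."""
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(signal**2, kernel, mode="same"))

def envelope_to_angle(envelope, max_amplitude=0.3):
    """Linearly map activation (0..max_amplitude) onto a 0-180 degree servo."""
    norm = np.clip(envelope / max_amplitude, 0.0, 1.0)
    return norm * 180.0

# Synthetic "muscle burst": noise whose amplitude swells and fades.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
emg = np.sin(np.pi * t) * rng.standard_normal(1000) * 0.3

angles = envelope_to_angle(emg_envelope(emg))
print(angles.shape)  # one servo angle per sample: (1000,)
```

The RMS envelope is a standard first step in EMG processing; real systems add band-pass filtering, calibration per muscle, and latency compensation on top of it.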
Read on to learn about Finis Musicae’s philosophy of integrating technology and music, the challenges they’ve faced programming robots to mimic human movement, and how this tech can expand beyond humanity and collaborate with plants and animals.

A major concern about the rise of AI and robotics is that it will eliminate human jobs. How are you intending to use your tech to augment and extend how musicians can perform and record?

Gran: The electric guitar did not silence the acoustic.
The arrival of sound recording sparked fears for live performance, yet both coexist to this day.

Throughout history, humanity’s reaction to new technology has followed familiar patterns, repeating with a regularity not unlike a mechanical process. Resistance, scepticism, and fear of replacement are recursive themes, whether in industry or society at large. In music, this pattern is clear.
Innovation and tradition move forward together, as new technology and art walk hand in hand. More broadly, I see technology not as a threat to human endeavour, but as a means to broaden and deepen it.

What is the ultimate role technology can play in the world of music?

Morei: Consider how the material science of Neolithic archery led to the first bowed instruments.
The Trois Frères hunting bow doubled as a musical tool. War-driven radio communication led to the invention of vacuum tubes, which, when overdriven in amps, enabled a vast new range of sounds. DAWs and laptops democratised industry-grade tools, putting them in the hands of teenagers in their bedrooms.
Now, from mechanical automation efforts and brain-computer interfaces (BCIs), emerges a whole new range of expressive possibilities for collaborative man-machine performance art.

Physical mastery of an instrument will always command respect, but I am equally excited about shortening the path from thought to sound using cutting-edge tools.

What about AI specifically? What is the best result of AI and music coexisting?

Kemp-Muhl: Scientists generated a musical composition with analog computers in the 1950s; giving our inventions the autonomy to invent themselves is fascinating.
One application I’m particularly curious about is applying machine learning to allow our robotic fleet to improvise their own singularity symphonies together. While AI is helping do our taxes, making creepy videos of Will Smith eating spaghetti, and piloting war drones, we might as well use it to make something alien and beautiful.

Frederik’s work with robot arms has been your “guiding light”. What new possibilities will open up as you bring more human-based parts into the robotic world?

Morei: If you want to measure robotic dexterity, playing instruments designed for humans is the ultimate litmus test.

Trying to replicate a trumpet embouchure has been particularly humbling. Brass players make it look effortless, but the sheer complexity of what’s actually happening (lips buzzing, tonguing, air pressure, shifts in embouchure constriction) has only deepened our awe for human anatomy.
Ultimately, we want to push things further by designing instruments specifically for robots, taking advantage of their precision and speed to explore art forms beyond what humans could ever pull off.

So far, many of your presentations have focused on playing classical music. Why are you choosing to bring together some of the oldest music still appreciated by a general audience and the most modern technological means of playing it?

Gran: My core interest in using robotics for music lies in my role as a composer, deeply obsessed with and focused on sound itself.
I am driven to create new sonic worlds, new compositions, and new ways of creating and playing music, sometimes pushing beyond what is possible with conventional techniques or equipment. Robotics opens doors to unique expressions and textures, expanding my compositional palette.

At the same time, as with any human musician training on an instrument, there is much to learn from existing repertoire.
Classical works, such as those of Bach, provide a rich framework to test and refine both the robot’s technical and expressive abilities. In practice, this is part of the process of developing the system, just as a human cellist would practise core repertoire.

Every tool, human or mechanical, imparts its own character.
Different robots and programming choices yield distinct musical personalities, much like the differences between instruments, tools, individual musicians, or even between instrument makers. I view this diversity as fundamental.

You’ve discussed collaborating with plants and fish through this tech.
Can you explain how that works? What have been the results?

Kemp-Muhl: We’ve been adapting medical electrodes to the outside of aquariums to harness the electroconductive fields of albino koi, so that their swimming modulates ambient synthesizers. We also have plants permanently wired up to a Moog synth and delay pedal in our lab window, and the beautiful thing is they start to sing louder when the sun rises each morning. It gives you a very panpsychist lens on life when you realise everything is just action potentials, electric fields, and mechanical actuators, even down to individual skin cells.
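At its core, the fish-and-plant rig described here is a mapping problem: a slowly varying electrode voltage becomes a synth modulation value. A minimal sketch, assuming a ±0.5 V electrode range and the standard 7-bit MIDI CC scale (both are illustrative assumptions, not the collective's actual hardware):

```python
# Hypothetical sketch of the biosignal-to-synth idea: clamp and scale a
# slowly varying electrode voltage into a 0-127 MIDI CC value that could
# modulate a synth parameter. The +/-0.5 V range is an assumption.
def voltage_to_cc(volts, v_min=-0.5, v_max=0.5):
    """Clamp and linearly map an electrode reading onto MIDI CC 0..127."""
    norm = (volts - v_min) / (v_max - v_min)
    norm = min(max(norm, 0.0), 1.0)
    return round(norm * 127)

readings = [-0.6, -0.25, 0.0, 0.3, 0.7]  # simulated electrode samples (volts)
cc_values = [voltage_to_cc(v) for v in readings]
print(cc_values)  # [0, 32, 64, 102, 127]
```

In practice such a rig would also smooth the signal and pick which synth parameter (filter cutoff, delay feedback, amplitude) the CC stream drives.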
Based on your research and work, what do the AI-infused instruments of tomorrow look like? We have AI-assisted plugins and synths; will it be more of the same?

Morei: During our first tests playing the violin using mere muscle signals and controlling the robot cello via brainwave-derived motor imagery, it became clear that our definition of ‘musician’ must expand. Entirely new ways of interfacing with the medium already exist. Like traditional musicianship, these can be honed and trained.
While I find the full automation of art a generally despicable pursuit (and there is still much to refine in clean dataset acquisition), humans have long since welcomed AI-assisted plugins and soft synths as valuable tools in an artist’s creative arsenal.

There are many steps of creative work (I am looking at you, rotoscoping and breath comping) that I always thought were too drainingly repetitive for humans to do, and I am glad to see us shorten the path from initial thought to artistic output.

Do you have any advice or messages for influential brands and artists looking to further integrate AI and music?

Gran: Break some rules.
Take tools, such as AI or technologies designed for one purpose, and push them into unexpected territories. I use industrial robots, originally engineered for mass production, to create experimental music. I find that fun and interesting.
And while working with artificial intelligence, let’s also confront our own natural foolishness.

The post “Instead of robotics replacing musicians, there’s a big opportunity for extension and augmentation”: Finis Musicae on programming robots to play live instruments appeared first on MusicTech.
The futurist art and transhuman tech collective is using AI, EEG, EMG, and other technical processes to allow robots to play strings, brass, and synths in ways humans physically can’t.