Kasper Jordaens



___artistic CV________

__selected works____


artistic CV

The purpose of the following slides is to highlight some of the more artistic paths I have followed, specifically collaborations and commissioned or employed work. For an overview of my professional career, please have a look at my LinkedIn. Although both paths have always been intertwined, the angle from which you look at things matters.

Wim Delvoye - artistic engineering (2005-2009)

Art&D - Co-curating, directing (2014)

imec/STUFF. - tech/creative production (2018)

sondervan - live interactive visual and light mapping (2018)

chords & cracks - live interactive visual and light mapping (2018)
with sioen & Koen van den Broek

selected works

This section highlights individual artistic research and practice, fully driven by my own artistic vision. It is less about "working" and all about expressing how I see the world and interact with it. Collaboration is very important: it can take the form of close interaction with people during performances, but equally important is how we as humans interact with machines (computers, AI, algorithms, instruments, ...), or how two worlds (digital and analogue) reinforce each other through touch points in an artistic process. In my research process I build tools, and I try to evolve them into instruments for live artistic production.

pumpkin orchestra 2012 - 2013

H.AL.I.C - live coded visuals (2014 - now)

plotter works___

I've been making plotter works for quite a while. The delicate interaction between tight digital lines and the imperfections added by pen and paper makes this a powerful medium. The vast possibilities of digital variation, combined with a production process that is never the same twice, give each plot a unique appearance. So although numbered, each piece in a series is unique.

__triangle yur_______

music by sondervan - bandcamp page

triangle yur series (300pc), record cover, pen on cardboard


CoverCover is a bot that makes reinterpretations of record covers. You can either show the record to a camera or type in a record name (easier, as people rarely carry a 12″ record around nowadays). The bot will try its best to find a match, scrape all available metadata, generate associations in natural language, find some Creative Commons licensed images, convert them to plottable shapes (using potrace, python and chiplotle), make a composition and plot it on a record inlay, including the Creative Commons references. The result questions the roles of the robot, the programmer, the original record cover maker, the person who shows the record and the people who put CC-licensed images online… is this a collaborative project?
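The last step of that pipeline, driving the pen plotter, ultimately comes down to emitting HP-GL pen commands (chiplotle wraps this for real hardware). A minimal sketch of that conversion, with hypothetical helper names and plain polylines standing in for the traced shapes:

```python
def polyline_to_hpgl(points, pen=1):
    """Convert a list of (x, y) plotter-unit coordinates into HP-GL commands."""
    if not points:
        return ""
    cmds = [f"SP{pen};"]                       # select pen
    x0, y0 = points[0]
    cmds.append(f"PU{x0},{y0};")               # pen up, travel to the start point
    body = ",".join(f"{x},{y}" for x, y in points[1:])
    cmds.append(f"PD{body};")                  # pen down, draw through the points
    cmds.append("PU;")                         # lift the pen when the shape is done
    return "".join(cmds)

def shapes_to_plot_job(shapes):
    """Wrap shapes in init/teardown so the output is a complete HP-GL job."""
    return "IN;" + "".join(polyline_to_hpgl(s) for s in shapes) + "SP0;"
```

The resulting string can be streamed to the plotter's serial port; chiplotle adds buffering and margin handling on top of the same command vocabulary.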

covercover, auto-generated record cover cover, pen on LP-sleeve
installation at data and music conference the future of data

plotter experiments

twitter feed

These experiments help me learn, for example, how paper reacts to ink or how the computer models I make translate to paper. They are essential for progressing towards finished artistic work and for evolving my plotting skills. The process itself is also important: I can automatically publish the sketches to Twitter straight from my workflow.

density tests
ink on paper tests
parametrisation iterations
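The parametrisation iterations boil down to sweeping a small grid of sketch parameters and saving each variation under a name that records its settings, so any plot can be reproduced later. A sketch of that idea, with hypothetical helper names:

```python
import itertools

def parameter_grid(**ranges):
    """Yield every combination of the given parameter ranges as a dict."""
    keys = list(ranges)
    for values in itertools.product(*(ranges[k] for k in keys)):
        yield dict(zip(keys, values))

def sketch_filename(series, params):
    """Build a reproducible filename that encodes the parameter values."""
    tag = "_".join(f"{k}{v}" for k, v in sorted(params.items()))
    return f"{series}_{tag}.svg"
```

Each generated file can then be plotted and, because the parameters live in the filename, posted alongside its settings.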

interactive robot piano

This robot piano tweets, replies to tweets, rhymes along with the music you play and prints the text for the users.

123 Piano ascii art + RPi software (2019) - picture Evy Ottermans

interactive visuals performances

I see music/sound as shapes blending into the environment. I try to reproduce that vision and add a certain feel to it so it becomes part of the music, aiming for a (hard to explain, I admit) blend of image and sound, as if it were possible to merge the audible and visible spectrum. This got me started in the early days making animated shapes synced to music with early versions of 3ds Max at school (which took ages to render). Today I use creative coding tools or game engines to make it all dynamic and realtime, and take it to the stage.

live coding

My live coding practice takes place as one half of H.AL.I.C., which stands for Heuristic ALgorithmic Interactive Computing: we combine the power of straightforward computing with the gut feeling of creative beings and use that power to augment our output. Furthermore, we strive for harmonic cooperation not only between the human participants and the computer, but also between the audio and the graphical output.

live visuals

Interactive graphical shows, made up of Quartz Composer patches, Processing sketches, Quil, Unity and other flavours of code, tied together with Syphon and Pure Data, controlled with CoGe and homebrew hardware controllers, and projection-mapped with Millumin. It's a recipe for intense, audio-reactive visuals that claim their spot on stage in dialogue with live music.


testing a 16-speaker spatial audio setup in a dead room

For a long time I have been exploring methods of making sound. I have always seen myself as a musical instrument builder, using tools to build instruments, both software and hardware. Along the road I started doing projects with musicians and, lacking classical musical training, learned about music in a very pragmatic way. Today I'm a member of a band in which I play the modular synth, and I keep building these instruments, tailored to my own concept of music making (not classically trained, that is). I make use of several open-source hardware and software projects, hacking in and out to fit my needs and contributing back where I can. This is where I am right now, and it is the starting grid for what's next.

___modular synth__________


I was involved as an artist in the development of axoloti, and I continue to invest time in this versatile "arduino for audio". It's a cornerstone of my setup, integrates with other eurorack modules and has served me in numerous artistic projects.

alpha version of axo, connected to forks that read signals from vegetables

artistic vision

I stated before that I believe in collaboration. Humans and machines live together in a society where decisions made by one often impact the other. People don't always connect with each other. They often do connect with their machines though, letting algorithms decide what they do, where they eat and who they love. A better understanding of the human soul and of the ghost in the machine builds the resilience we need. Only depth and focus can bring you there, whether that means talking, interacting and opening yourself up to someone to reach a full understanding, or diving into a technology to fully grasp what it is capable of. I try to express this in a performative way, building tools to improve all forms of collaboration... I have not given up on humanity, and I seek the right way to embrace technology while remaining critical of what humans want to use it for. This is my quest...