The MIDI sequencer sends MIDI to the synthesizers and samplers; that MIDI is forwarded to a soundcard connected to a Raspberry Pi. The Raspberry Pi then turns the MIDI signals into visuals using Processing, a Java-like scripting language. Unfortunately it doesn't work when I'm streaming, because the soundcard is then connected to the laptop 😅
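(For anyone curious how that kind of setup looks in code: this isn't their actual sketch, just a minimal Processing example of MIDI-reactive visuals, assuming The MidiBus library is installed. The device index 0 is a placeholder; MidiBus.list() prints the real input names on your machine.)

```processing
// Minimal MIDI-reactive visuals sketch, assuming The MidiBus library
// (Sketch > Import Library > The MidiBus). Not the author's actual code.
import themidibus.*;

MidiBus midi;
float brightness = 0;  // fades out after each note

void setup() {
  size(640, 480);
  MidiBus.list();                 // print available MIDI inputs/outputs to the console
  midi = new MidiBus(this, 0, -1); // 0 = first listed input (placeholder), -1 = no output
}

void draw() {
  background(0);
  fill(255, brightness);
  ellipse(width / 2, height / 2, brightness * 3, brightness * 3);
  brightness = max(0, brightness - 4);  // simple decay so the visuals pulse with the music
}

// The MidiBus calls this automatically on every incoming Note On message
void noteOn(int channel, int pitch, int velocity) {
  brightness = map(velocity, 0, 127, 0, 255);
}
```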
I know Processing! I made an interactive art installation with it more than 10 years ago!
Got any references / pictures / code for that? I'd be interested to see!
https://attack.sebastix.nl/
I have a video somewhere as well, will send it later when I’ve found it
Reminds me of Dexter, the series 😂