Livecoding soundscapes in Kotlin with Compose Multiplatform
- Track: Music Production
- Room: UA2.220 (Guillissen)
- Day: Sunday
- Start (UTC+1): 14:30
- End (UTC+1): 14:55
Kotlin's Compose Multiplatform allows for the creation of beautiful user interfaces in a declarative, functional paradigm. But the Compose compiler isn't limited to creating UI or even visuals.
In this talk, we explore using the Compose compiler to create soundscapes and other pieces of music. I will present a library and domain-specific language (DSL) for musical composition – complete with a plugin to turn IntelliJ into a livecoding environment.
We'll start by looking at the building blocks of musical compositions and how Kotlin and Compose Multiplatform can be used to implement them in a declarative and functional way. This includes tone generators and synths, shaping the envelope of a sound, playing samples, expressing various timings, playing multiple tracks simultaneously, and building a drum machine. We'll also see how a powerful DSL lets us express complex rhythms more conveniently.
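To give a flavour of what such a rhythm DSL might look like, here is a minimal sketch in plain Kotlin. The `pattern` function and the `"x--x"` step notation are assumptions for illustration, not the talk's actual library API: each character is one step, `x` triggers the sound and `-` is a rest.

```kotlin
// Hypothetical rhythm notation: turn a step string into the list of
// step indices on which the sound should trigger.
fun pattern(steps: String): List<Int> =
    steps.withIndex()
        .filter { (_, c) -> c == 'x' }
        .map { (i, _) -> i }

// A four-on-the-floor kick over 16 sixteenth-note steps...
val kick = pattern("x---x---x---x---")

// ...and an off-beat hi-hat.
val hat = pattern("--x---x---x---x-")
```

A drum machine could then schedule each returned index at `index * stepDuration`, which is where the declarative timing primitives from the talk come in.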
Following that, we'll look at how to implement the audio generation backend on multiple platforms, and some ways to optimise the Compose code – the precise timings required for music present a particular challenge. We'll also see how the audio generation integrates with existing Compose components, and how to create enlightening visualisations of the code and music.
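As a rough idea of what an audio backend ultimately has to produce, the sketch below generates raw samples for a sine tone in plain Kotlin. The function name and the fixed 44.1 kHz sample rate are assumptions; a real backend would stream these samples to a platform-specific audio API (for example AudioTrack on Android) rather than materialise them in an array.

```kotlin
import kotlin.math.PI
import kotlin.math.sin

// Assumed CD-quality sample rate.
const val SAMPLE_RATE = 44_100

// Generate the samples of a sine wave at the given frequency (Hz)
// for the given duration (seconds). Sample i is taken at time
// i / SAMPLE_RATE, so the values always lie in [-1.0, 1.0].
fun sineWave(frequency: Double, durationSeconds: Double): DoubleArray {
    val sampleCount = (durationSeconds * SAMPLE_RATE).toInt()
    return DoubleArray(sampleCount) { i ->
        sin(2.0 * PI * frequency * i / SAMPLE_RATE)
    }
}
```

The sample-index arithmetic here also hints at why timing is the hard part: a scheduling error of even a few milliseconds is hundreds of samples, which is clearly audible in a rhythm.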
Finally, we'll look at how to turn IntelliJ into a livecoding environment, allowing musical pieces to be created and modified on the fly. This opens up experimentation and improvisation to any programmer or musician.
Speakers
- Merlin Pahic