February 26, 2020

A Piano for SuperCollider

Complete code for the project and some documentation.

Last winter, I started to work on an acoustic piano multi-sampler for SuperCollider. The earliest versions had many issues, and my budding programming skills in sclang did not allow me to fix them. But I’ve used the instrument a lot since then, and I’ve improved the code along the way. I plan to keep working on the instrument as I keep using it, but I’m thinking that it could already be useful for other people if they are interested. So here it is.

A very simple demonstration of the sampler using two patterns created with Pbind.

The instrument uses a public domain sample pack that you can download from freesound.org. You’ll need to create an account to download it, but it’s free to join, free to download, and I highly recommend this website. I used this same sample pack for the soundtrack of my short film Étude for Cellular Automata no 2. I composed this piece with Ableton Live because I didn’t know about SuperCollider at the time. The piece uses only one or two samples from the pack, as I had found that building a good multi-sampler with Ableton Live was mostly a nightmare. I also used the piano samples in this live coding experiment, which was one of my first SuperCollider projects.

Below is the entirety of the code for the sampler, along with some Pbind examples to try it out. It is free software distributed under an Apache 2.0 licence. You can also find this code on GitHub.

// Run this block of code once the server is booted.
// You also need to make sure that the packLocation variable
// is set to the actual location of the downloaded sample pack.

var pianoSamples, pianoFolder, makeLookUp, indices, pitches, dynAmnt, maxDyn, maxNote,
packLocation = "/21055__samulis__vsco-2-ce-keys-upright-piano/",
quiet = false;

dynAmnt = if (quiet, {2}, {3});
maxDyn = if (quiet, {1}, {2});
maxNote = if (quiet, {46}, {1e2});
pianoSamples = Array.new;
pianoFolder = PathName.new(packLocation);
pianoFolder.entries.do({
    |path, i|
    if (i < maxNote, {
        pianoSamples = pianoSamples.add(Buffer.read(s, path.fullPath));
    });
});

makeLookUp = {
    |note, dynamic|
    var octave = floor(note / 12) - 2;
    var degree = note % 12;
    var sampledNote = [1,  1,  1,  1,  2,  2,  2,  3,  3,  3,  3,  3];
    var noteDeltas = [-1, 0,  1,  2, -1,  0,  1, -2, -1,  0,  1,  2];
    var dynamicOffset = dynamic * 23;
    var sampleToGet = octave * 3 + sampledNote[degree] + dynamicOffset;
    var pitch = noteDeltas[degree];
    [sampleToGet, pitch];
};

indices = dynAmnt.collect({|j| (20..110).collect({|i| makeLookUp.(i, j)[0]})}).flat;
pitches = dynAmnt.collect({|j| (20..110).collect({|i| makeLookUp.(i, j)[1]})}).flat;

Event.addEventType(\pianoEvent, {
    var index;
    if (~num.isNil, {~num = 60}, {~num = min(max(20, ~num), 110)});
    if (~dyn.isNil, {~dyn = 0}, {~dyn = floor(min(max(0, ~dyn), maxDyn))});
    index = floor(~num) - 20 + (~dyn * 91);
    ~buf = pianoSamples[indices[index]];
    ~rate = (pitches[index] + frac(~num)).midiratio;
    ~instrument = \pianoSynth;
    ~type = \note;
});

SynthDef(\pianoSynth, {
    arg buf = pianoSamples[0], rate = 1, spos = 0, pan = 0, amp = 1, out = 0, atk = 0, sus = 0, rel = 8;
    var sig, env;
    env = EnvGen.kr(Env.new([0, 1, 1, 0], [atk, sus, rel]), doneAction: 2);
    sig = PlayBuf.ar(2, buf, rate * BufRateScale.ir(buf), startPos: spos, doneAction: 2);
    sig = sig * amp * 18 * env;
    sig = Balance2.ar(sig[0], sig[1], pan, 1);
    Out.ar(out, sig);
}).add;

// Below are examples of patterns that show how to use the instrument.
// I recommend running both patterns at the same time;
// they are made to complement each other.

var key = 62;
var notes = key + ([0, 3, 7, 10] ++ [-5, 2, 3, 9]);
~pianoRiff = Pbind(
    \type, \pianoEvent,
    \dur, Pseq(0.5!1 ++ (0.25!3), inf),
    \num, Pseq(notes, inf),
    \dyn, Pseq([1, 0, 0, 1], inf),
    \amp, Pseq([0.5, 2, 2, 0.5], inf),
    \pan, Pwhite(-0.75, 0.75, inf),
    \rel, 4
).play(quant: [2]);

var key = 62 + 36;
var notes = key + [2, -5, 0, -2];
~pianoRiff2 = Pbind(
    \type, \pianoEvent,
    \dur, Pseq([0.25, 1.75], inf),
    \num, Pseq(notes, inf),
    \dyn, Pseq([1, 1, 1, 1], inf),
    \amp, Pseq([0.5, 1, 1, 0.5], inf),
    \pan, Pwhite(-0.75, 0.75, inf),
    \rel, 4
).play(quant: [2]);


How to use the sampler

Using the instrument will probably be very straightforward for most users of SuperCollider, but I thought that writing down some instructions could still be useful, particularly for beginners. It’s also an opportunity to talk about the design decisions that went into creating the instrument, and the ways in which it could potentially be improved.
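For instance, once the main block of code has been evaluated, a single note can be triggered by creating an event of the \pianoEvent type directly (the parameter values here are arbitrary):

// Play one note through the custom event type.
(type: \pianoEvent, num: 64, dyn: 1, amp: 0.8, pan: -0.2, rel: 6).play;

The event type takes care of choosing the right sample and playback rate, so you only need to specify the parameters you care about.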

October 4, 2019

Colourful ravines

Adapting a short film into a live coding set.

I currently have the vague idea of reusing some concepts from my short film Ravines for a live coding set that I could perform in various ways (as opposed to the short film itself, which exists in a static, definite form). The laptop I use for live coding is quite slow, so I have to really simplify and optimize the original algorithms to be able to run them in real time. It’s an interesting challenge to work around those limitations. The animations in Ravines were too resource-intensive to be rendered in real time. It took around 12 hours to render the whole thing.

As you can see in the still images above and below, my goal is also to play with colour. I enjoy the stark black and white look of Ravines, but I generally prefer to work with colours and I envision my live coding sets as being very warm and colourful. The colourful backgrounds here were created by adapting the fog effect taken from this ShaderToy post.

April 8, 2019

Wavetable synthesis

Musical sketches and scattered notes.

I’m currently learning the basics of wavetable synthesis with SuperCollider, and I will gather my notes and my first musical sketches below. I started to learn with this excellent video by Eli Fieldsteel, who takes part of his material in the documentation of the Shaper class.

It all begins with the creation of a wavetable:

~sig = Signal.newClear(513);
~sig.waveFill({
    arg x, y, i;
    // i.linlin(0, 512, -1, 1);
    // sin(x);
    sin(x.cubed * 20);
}, 0, 1);
~w = ~sig.asWavetableNoWrap;
~b = Buffer.loadCollection(s, ~w);

We create an instance of Signal, we fill it with the waveFill method, we transform it into an instance of Wavetable, and then load it into a Buffer. The expression sin(x.cubed * 20) used to fill the Signal was written arbitrarily and I really love the sound that it produces.
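To actually hear the result, the buffer can be fed to the Shaper class as a waveshaping transfer function, which is the use case asWavetableNoWrap is designed for. A minimal sketch, assuming ~b has finished loading (the frequency and levels are arbitrary):

// Waveshape a sine oscillator through the wavetable stored in ~b.
{ Shaper.ar(~b, SinOsc.ar(220, 0, 0.8)) * 0.2 ! 2 }.play;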

March 29, 2019

Spring on Phobos

A live coding set using WebGL and SuperCollider.

I’m continuing to learn live coding with SuperCollider and WebGL shaders, and I composed another audiovisual set, trying more complicated things that I wasn’t able to do for the first one. I performed this new piece at the concert venue La Vitrola on March 29th 2019, as part of the third edition of Signes vitaux, a series of concerts organised by Rodrigo Velasco (Yecto) and Toplap Montréal.

Unfortunately, my computer is not fast enough to render this piece to disk while I perform it, so for now I’m unable to share a completed version of this work.

March 6, 2019

Swaying loops

Using 4D OpenSimplex noise in the context of algorithmic botany.

The code written to make the animation above and the one below can be found on the noise-loops branch of the Evolutionary Botany GitHub repository. The OpenSimplex noise implementation that I’m using is called SimplexNoiseJS and was written by Mark Spronck.

February 15, 2019

Tangent lines

Tiptoeing into calculus.

While getting to know the basics of machine learning, I stumbled on some notions of calculus, something that I’m very unfamiliar with but that I would love to understand. I have been thinking for a while about devising some small experiments that would allow me to familiarize myself with mathematical concepts in an intuitive way. So this sudden need to understand a bit of calculus seemed like a good opportunity to delve into this.

The animation below is one of those experiments. It helped me to familiarize myself with the notions of limits and tangents. A line is tangent to a circle when its slope is identical to the slope of the circle at the point where they meet. This slope is calculated by taking two different points on the circle, and then shrinking the distance between those two points towards zero. This is what I have done while writing the code for this animation. I was interested in what kinds of shapes I would obtain by playing with these notions, and I ended up with a dense mesh of tangent lines evenly distributed around a circle. I then got curious about how the mesh would “react” if I morphed the shape of the circle, and this animation was the result.
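In symbols, this is the standard limit definition of the derivative (a general formula, not code taken from the animation): the slope at a point is what the slope between two nearby points approaches as the distance h between them shrinks to zero.

$$ f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} $$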

January 27, 2019

Small neural network

Notes taken while learning how to build a neural network for artistic purposes.

Below are some notes that I took while following Daniel Shiffman’s video tutorials on neural networks, which are heavily inspired by Make Your Own Neural Network, a book written by Tariq Rashid. The concepts and formulas here are not my original material, I just wrote them down in order to better understand and remember them.

Feedforward algorithm

The calculations made by one of the network’s layers, which take into account its synaptic “weights”, can be represented by the matrix product written down below, in which h represents an intermediary layer (or “hidden layer”) of the network, w represents the weights and x represents the inputs. In this inverted notation, w_ij indicates the weight from neuron j to neuron i.
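The original post showed this product as an image, which is missing here; reconstructed from the description (using a two-neuron layer purely as an example size), it looks like this:

$$ \begin{bmatrix} h_1 \\ h_2 \end{bmatrix} = \begin{bmatrix} w_{11} & w_{12} \\ w_{21} & w_{22} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} $$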


This product can also be represented thusly:
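The original image is missing here; reconstructed from the description, the compact form of the same product, where each hidden value sums the weighted inputs, is:

$$ h = W x, \qquad h_i = \sum_j w_{ij}\, x_j $$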

January 26, 2019

Linear regression

Small steps towards machine learning.

Below are some notes that I took while following Daniel Shiffman’s video tutorials on neural networks, which are heavily inspired by Make Your Own Neural Network, a book written by Tariq Rashid. The concepts and formulas here are not my original material, I just wrote them down in order to better understand and remember them.

Slope and intercept

A line is traditionally represented by this equation, in which m represents the slope and b the y intercept at the origin:
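The missing figure showed the standard form of this equation:

$$ y = mx + b $$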


In the case of linear regression (or the ordinary least squares method), we will calculate the m value with the formula below. This formula is taken from this video made by Daniel Shiffman for his Intelligence and Learning course. We will consider that the numerator of this fraction represents the correlation between the growth of the value x and the growth of the value y (which determines the slope of the line).
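Reconstructed in LaTeX from the description above (x̄ and ȳ denote the means of the x and y values):

$$ m = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2} $$

Once m is known, the intercept follows from b = ȳ − m·x̄, so that the fitted line passes through the point of means.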

January 12, 2019

Singing trees

Animated graphs.


This blog post is part of my research project Towards an algorithmic cinema, started in April 2018. I invite you to read the first blog post of the project to learn more about it.

December 17, 2018

Blurry quadrilaterals

Geometric studies with WebGL.

Drawing blurry shapes with WebGL can be quite complicated. All the basic shapes it creates are very precise, with well-defined (and often poorly aliased) contours. Because I want to create animations that are foggy, atmospheric, and uncertain, I’m slowly building a set of tools dedicated to drawing blurry WebGL geometry.

Blurry circles are not so hard to draw because the distance from the center of a circle can be used to draw a gradient, and then this gradient can be adjusted with the smoothstep() function; these are all fast calculations for the GPU. Drawing more complex shapes requires me to learn a lot more about WebGL, a technology that I do not yet know a lot about.

I started with quadrilaterals. I created a system where I only need to specify the position and blurriness of a quadrilateral, and all the triangles necessary to interpolate the colour of the shape are created automatically. In the upper left part of the diagram below, asking for a rectangle whose area is equal to the area of triangles a and b will generate the triangles a, b, c, d, e, f, g, h, i, and j.