Context

This blog post is part of my research project Towards an algorithmic cinema, started in April 2018. I invite you to read the first blog post of the project to learn more about it.

Drawing blurry shapes with WebGL can be quite complicated. All the basic shapes it creates are very precise, with well-defined (and often poorly aliased) contours. Because I want to create animations that are foggy, atmospheric, and uncertain, I’m slowly building a set of tools dedicated to drawing blurry WebGL geometry.

Blurry circles are not so hard to draw: the distance from the center of a circle can be used to draw a gradient, and that gradient can then be shaped with the smoothstep() function. These are all fast calculations for the GPU. Drawing more complex shapes requires me to learn a lot more about WebGL, a technology I do not yet know well.
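As a rough sketch of that idea in plain JavaScript (in the real project this runs per-pixel in a fragment shader; the function names and parameters here are my own, not taken from the project's code):

```javascript
// Hermite interpolation, matching GLSL's built-in smoothstep(edge0, edge1, x):
// returns 0 below edge0, 1 above edge1, and a smooth S-curve in between.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Alpha of a blurry circle: fully opaque well inside `radius`,
// fading smoothly to transparent across a band of width 2 * blur.
function circleAlpha(dist, radius, blur) {
  return 1 - smoothstep(radius - blur, radius + blur, dist);
}
```

Evaluating `circleAlpha` for every pixel's distance from the circle's center produces the soft, fog-like edge instead of a hard aliased contour.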

I started with quadrilaterals. I created a system where I only need to specify the position and blurriness of a quadrilateral, and all the triangles necessary to interpolate the shape's colour are generated automatically. In the upper-left part of the diagram below, asking for a rectangle covering the same area as triangles a and b generates the triangles a, b, c, d, e, f, g, h, i, and j.
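Without the project's actual code at hand, the construction can be sketched roughly like this (`blurryQuad` and everything in it is hypothetical): the requested rectangle becomes two opaque triangles (a and b), and each of its four edges gets two fringe triangles whose outer vertices fade to transparent, giving the ten triangles a through j.

```javascript
// Hypothetical sketch: expand a rectangle (x, y, w, h) by a `blur` margin
// and return 10 triangles whose vertices carry an alpha value for the GPU
// to interpolate: [x, y, alpha].
function blurryQuad(x, y, w, h, blur) {
  // Inner corners (alpha 1) and outer corners (alpha 0), in the same winding order.
  const inner = [[x, y], [x + w, y], [x + w, y + h], [x, y + h]];
  const outer = [[x - blur, y - blur], [x + w + blur, y - blur],
                 [x + w + blur, y + h + blur], [x - blur, y + h + blur]];
  const tris = [];
  // Core: two fully opaque triangles (a and b in the diagram).
  tris.push([inner[0], inner[1], inner[2]].map(p => [...p, 1]));
  tris.push([inner[0], inner[2], inner[3]].map(p => [...p, 1]));
  // Fringe: two triangles per edge (c through j), fading from alpha 1 to 0.
  for (let i = 0; i < 4; i++) {
    const j = (i + 1) % 4;
    tris.push([[...inner[i], 1], [...outer[i], 0], [...outer[j], 0]]);
    tris.push([[...inner[i], 1], [...outer[j], 0], [...inner[j], 1]]);
  }
  return tris;
}
```

Because WebGL interpolates vertex attributes linearly across each triangle, the alpha fades automatically from the opaque core to the transparent outer ring.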

This project represents my very first experience with live coding. The animation is made with p5.js and WebGL, and the music is made with SuperCollider. The music also uses this free pack of piano samples.

The animation is based on the Boids flocking algorithm, which I first learned about in video tutorials made by Daniel Shiffman. Shiffman has covered this subject a few times already, but this video in particular was the inspiration behind this project.
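A Boids step combines three local rules for each agent: separation (steer away from close neighbours), alignment (match their average heading), and cohesion (drift toward their average position). A minimal sketch in plain JavaScript, with illustrative weights that are not taken from the project's code:

```javascript
// One Boids update: each boid reacts to neighbours within `radius`.
// Weights and scaling factors here are illustrative defaults.
function step(boids, { radius = 50, sep = 1.5, ali = 1.0, coh = 1.0, maxSpeed = 4 } = {}) {
  return boids.map(b => {
    let count = 0, sx = 0, sy = 0, ax = 0, ay = 0, cx = 0, cy = 0;
    for (const o of boids) {
      if (o === b) continue;
      const dx = o.x - b.x, dy = o.y - b.y;
      const d = Math.hypot(dx, dy);
      if (d > 0 && d < radius) {
        count++;
        sx -= dx / d; sy -= dy / d;  // separation: push away from neighbour
        ax += o.vx;   ay += o.vy;    // alignment: accumulate headings
        cx += o.x;    cy += o.y;     // cohesion: accumulate positions
      }
    }
    let vx = b.vx, vy = b.vy;
    if (count > 0) {
      vx += sep * (sx / count) + ali * (ax / count - b.vx) * 0.05
          + coh * (cx / count - b.x) * 0.01;
      vy += sep * (sy / count) + ali * (ay / count - b.vy) * 0.05
          + coh * (cy / count - b.y) * 0.01;
    }
    const speed = Math.hypot(vx, vy);
    if (speed > maxSpeed) { vx = vx / speed * maxSpeed; vy = vy / speed * maxSpeed; }
    return { x: b.x + vx, y: b.y + vy, vx, vy };
  });
}
```

Calling `step` once per animation frame and drawing each boid as a blurry shape is enough to get the foggy flocking motion the project is after.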

The project’s full video can be watched on YouTube, and a shorter version (which contains, in my opinion, the most interesting part of the longer video) is on my Twitter feed.


I’m starting to learn how to program WebGL shaders, a field that seems immense and often confusing. The code for my first few experiments (which are mostly variations on the spiral you can see above) can be found on GitHub.

While learning to work with graphs, I had the idea of using them to create animations that would combine visual movement and sound movement. The idea also came from seeing the circular sequencer that Sam Tarakajian discusses on his YouTube channel devoted to Max/MSP. A video demonstration of this project can be found on my Twitter feed.
