Linear regression

Small steps towards machine learning.
January 26, 2019

Below are some notes that I took while following Daniel Shiffman’s video tutorials on neural networks, which are heavily inspired by Make Your Own Neural Network, a book written by Tariq Rashid. The concepts and formulas here are not my original material; I just wrote them down in order to better understand and remember them.

Slope and intercept

A line is traditionally represented by the following equation, in which m represents the slope and b the y-intercept (the point where the line crosses the y-axis):

y = mx + b

In the case of linear regression (or the ordinary least squares method), we calculate the value of m with the formula below, which is taken from this video made by Daniel Shiffman for his Intelligence and Learning course. The numerator of this fraction can be read as the correlation between the variation of the x values and the variation of the y values, which determines the slope of the line.

m = \frac{\sum_{i=0}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=0}^{n} (x_i - \bar{x})^2}

Here, \bar{x} represents the average of all the x values, which is the sum of all the x divided by the number n of those values:

\bar{x} = \frac{\sum_{i=0}^{n} x_i}{n}

As for b, the y-intercept, we calculate it as follows:

b = \bar{y} - m\bar{x}
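The formulas above translate almost directly into code. Here is a minimal sketch in plain Python (the function name and the sample points are my own, for illustration): it computes the means, then the slope m from the ratio of summed deviations, then the intercept b.

```python
def linear_regression(xs, ys):
    """Ordinary least squares fit: returns the slope m and intercept b."""
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    # Numerator: how the deviations of x and y from their means correlate.
    numerator = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    # Denominator: the squared deviations of x from its mean.
    denominator = sum((x - x_mean) ** 2 for x in xs)
    m = numerator / denominator
    b = y_mean - m * x_mean
    return m, b

# Points lying exactly on the line y = 2x + 1 (hypothetical data).
m, b = linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
```

With these points the fit recovers the slope 2 and the intercept 1 exactly, since the data contain no noise.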

Context

This blog post is part of my research project Towards an algorithmic cinema, started in April 2018. I invite you to read the first blog post of the project to learn more about it.