In this blog post, I use a random forest model to predict employment status from demographic characteristics excluding race. I then perform a fairness audit to assess whether the algorithm displays bias with respect to race.
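A minimal sketch of this kind of group-based audit, using scikit-learn on synthetic data. The variable names, the synthetic data, and the specific metrics reported here are illustrative assumptions rather than the exact setup from the post.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in data: the group attribute (`group`) is excluded from the
# features and used only for the audit.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=1000) > 0).astype(int)
group = rng.integers(0, 2, size=1000)   # hypothetical group indicator, not a feature

model = RandomForestClassifier(random_state=0).fit(X, y)
y_hat = model.predict(X)

# Audit: compare accuracy and positive prediction rate across groups.
for g in np.unique(group):
    mask = group == g
    acc = (y_hat[mask] == y[mask]).mean()
    ppr = y_hat[mask].mean()
    print(f"group {g}: accuracy = {acc:.3f}, positive rate = {ppr:.3f}")
```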
This blog post presents the motivation, methods, and results of our final project on speech emotion recognition.
In this blog post, I introduce Dr. Timnit Gebru, a guest speaker at Middlebury, propose some questions for her, and reflect on her talk.
The first part of this blog post implements least-squares linear regression in two ways: 1) analytically, using the closed-form formula for the optimal weight vector, and 2) iteratively, using gradient descent. The second part runs experiments that illustrate the effect of overfitting. The last part demonstrates LASSO regularization, an alternative algorithm that minimizes a modified loss function with a regularization term, on overparameterized problems.
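A minimal sketch of the two approaches on toy data. The data, the learning rate, and the iteration count are illustrative assumptions, not the settings used in the post.

```python
import numpy as np

# Toy data: a column of ones is appended so the bias is part of w
# (an assumption about the convention, not necessarily the one in the post).
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(size=(100, 1)), np.ones(100)])
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=100)

# 1) Analytical solution: w_hat = (X^T X)^{-1} X^T y
w_analytic = np.linalg.solve(X.T @ X, X.T @ y)

# 2) Gradient descent on the least-squares loss L(w) = ||Xw - y||^2 / n
w = np.zeros(X.shape[1])
alpha = 0.1                               # learning rate (illustrative value)
for _ in range(5000):
    grad = 2 * X.T @ (X @ w - y) / len(y) # gradient of the mean squared error
    w -= alpha * grad

print(w_analytic, w)  # the two estimates should roughly agree
```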
This blog post walks through a standard machine learning workflow by classifying three penguin species. The primary goal is to determine the smallest number of measurements needed to confidently identify the species.
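One way to search for the smallest sufficient set of measurements is to score every small feature subset by cross-validation, as in the sketch below. The synthetic data, the choice of logistic regression as the classifier, and the column indices are assumptions for illustration; the post presumably works with the actual penguin measurements.

```python
from itertools import combinations

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the penguin measurements (four numeric columns, three species).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = rng.integers(0, 3, size=300)

# Exhaustively score every subset of k columns and report the best one per size.
for k in (1, 2, 3):
    best = max(
        combinations(range(X.shape[1]), k),
        key=lambda cols: cross_val_score(
            LogisticRegression(max_iter=1000), X[:, cols], y
        ).mean(),
    )
    score = cross_val_score(LogisticRegression(max_iter=1000), X[:, best], y).mean()
    print(f"best {k}-feature subset: {best}, CV accuracy = {score:.3f}")
```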
This post fits logistic regression models with both standard and stochastic gradient descent, with experiments on the learning rate, batch size, momentum, and the number of features.
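A minimal sketch of minibatch stochastic gradient descent with momentum for logistic regression. The function name `fit_stochastic`, the hyperparameter values, and the synthetic data are illustrative assumptions, not the code from the post.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def fit_stochastic(X, y, alpha=0.1, beta=0.9, batch_size=16, epochs=100, seed=0):
    """Minibatch SGD with momentum for the logistic (cross-entropy) loss."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p)
    velocity = np.zeros(p)
    for _ in range(epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            # Gradient of the logistic loss on the minibatch
            grad = X[idx].T @ (sigmoid(X[idx] @ w) - y[idx]) / len(idx)
            velocity = beta * velocity - alpha * grad   # momentum update
            w += velocity
    return w

# Toy usage on synthetic data with a bias column appended
rng = np.random.default_rng(1)
X = np.column_stack([rng.normal(size=(200, 2)), np.ones(200)])
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w = fit_stochastic(X, y)
print(((sigmoid(X @ w) > 0.5) == y).mean())  # training accuracy
```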
Implementation of the perceptron algorithm.
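A minimal sketch of the classic perceptron update on toy separable data, assuming labels in {-1, +1}; the function name, step budget, and data are illustrative, not the implementation from the post.

```python
import numpy as np

def perceptron_fit(X, y, max_steps=5000, seed=0):
    """Classic perceptron: update on randomly chosen misclassified points."""
    rng = np.random.default_rng(seed)
    X_ = np.column_stack([X, np.ones(X.shape[0])])  # append a bias feature
    w = np.zeros(X_.shape[1])
    for _ in range(max_steps):
        i = rng.integers(X_.shape[0])               # pick a random point
        if y[i] * (X_[i] @ w) <= 0:                 # misclassified (or on the boundary)
            w += y[i] * X_[i]                       # perceptron update
    return w

# Toy usage on linearly separable data with a margin
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
X = X[np.abs(X[:, 0] - X[:, 1]) > 0.5]              # keep points away from the boundary
y = np.where(X[:, 0] - X[:, 1] > 0, 1, -1)
w = perceptron_fit(X, y)
preds = np.sign(np.column_stack([X, np.ones(X.shape[0])]) @ w)
print((preds == y).mean())  # should reach 1.0 on separable data
```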
An example blog post illustrating the key techniques you’ll need to demonstrate your learning in CSCI 0451.