Representing a Triangulation – Introduces the notion of a triangulation of a manifold, and a data structure for representing one.
Testing a Manifold For Orientability – Demonstrates the power of a triangulation by testing for orientability.
Representing Abstract Simplicial Complexes – Introduces simplicial complexes and a Python class for working with them.
Computing Homology – Puts the representation of abstract simplicial complexes to good use and calculates the Betti numbers of a complex.
A First Taste of Object Oriented Programming – An introduction to OOP in Python, covering inheritance, multiple inheritance, composition and ‘dynamic’ inheritance.
Decorators in Python and Syncing Objects – Introduces decorators in Python, used to allow objects in a hierarchical relationship, say subgroup to group, to update one another.
Theano and Autoencoder – Introduces the Theano symbolic computation library with a basic implementation of an autoencoder.
A First Taste of Graph Theory – A basic introduction to concepts and terminology in graph theory.
Graphs as Objects in Python – Introduces the main graph object along with vertices, edges, and directed and undirected graphs as subclasses.
Special Graphs and Dynamic Inheritance – Introduces some subclasses for the main kinds of graph such as complete, linear and cycle. It also shows how the Cycle class can dynamically inherit from either the directed or undirected class.
Finding Triangles – A discussion of the complexity of finding all triangles in a graph, and a Python implementation of the ‘forward’ algorithm for doing so.
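To give a flavour of the post, here is a minimal sketch of the ‘forward’ algorithm (my own illustration, not the post’s actual code): vertices are processed in order of decreasing degree, and for each edge we intersect the sets of already-processed neighbours.

```python
def forward_triangles(vertices, edges):
    """Enumerate all triangles in an undirected graph with the
    'forward' algorithm: order vertices by degree, then for each
    edge (u, v) intersect the sets of already-seen neighbours."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    # Process vertices in order of decreasing degree.
    order = sorted(vertices, key=lambda v: len(adj[v]), reverse=True)
    rank = {v: i for i, v in enumerate(order)}
    A = {v: set() for v in vertices}  # processed neighbours seen so far
    triangles = []
    for u in order:
        for v in adj[u]:
            if rank[u] < rank[v]:
                for w in A[u] & A[v]:
                    triangles.append((w, u, v))
                A[v].add(u)
    return triangles
```

Each triangle is reported exactly once, and the degree ordering is what gives the algorithm its good worst-case complexity on sparse graphs.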
Find the Connected Components – Introduces the depth-first search algorithm and shows how to use it to find the connected components of a graph.
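The basic idea can be sketched in a few lines (again my own illustration, not the post’s code): a depth-first search started from an unvisited vertex collects exactly one component.

```python
def connected_components(vertices, edges):
    """Find the connected components of an undirected graph
    using an iterative depth-first search."""
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen = set()
    components = []
    for start in vertices:
        if start in seen:
            continue
        # DFS from an unvisited vertex collects its whole component.
        stack, component = [start], []
        seen.add(start)
        while stack:
            v = stack.pop()
            component.append(v)
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        components.append(component)
    return components
```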
A First Taste of Bayesian Theory – Here I introduce Bayes’ Theorem and work through an example of multiple testing for a disease.
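The disease-testing calculation boils down to one application of Bayes’ Theorem; a minimal sketch (the numbers below are illustrative, not necessarily those used in the post):

```python
def posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' Theorem.

    sensitivity = P(positive | disease),
    specificity = P(negative | no disease)."""
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

# A rare disease with an accurate test still gives a low posterior:
p = posterior(prior=0.001, sensitivity=0.99, specificity=0.99)
```

Even with a 99% accurate test, one positive result only raises the probability of disease to about 9%, because the disease is so rare. Feeding that posterior back in as the prior for a second independent positive test pushes it above 90%, which is the point of the multiple-testing example.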
Linear Regression: The Math(s) – An introduction to the maths behind basic linear regression; assumes some knowledge of linear algebra.
Linear Regression: The Code – Implements linear regression in Python and tests it with some synthetic data.
Logistic Regression – Explains logistic regression, then implements it in Python.
Entropy – A quick primer on entropy and related concepts.
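As a quick sketch of the central definition (my own illustration): the Shannon entropy of a discrete distribution, in bits.

```python
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability
    distribution given as a sequence of probabilities."""
    # Terms with p == 0 contribute nothing, by the convention 0*log(0) = 0.
    return -sum(p * log2(p) for p in probs if p > 0)
```

A fair coin has entropy 1 bit; a certain outcome has entropy 0.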
Binary Search Trees – Introduces Binary Search Trees and how they can be used to sort in time proportional to the height of the tree, with a Python implementation.
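The sorting trick is that an in-order traversal of a BST visits keys in sorted order; a minimal sketch (not the post’s implementation):

```python
class Node:
    """A node in a binary search tree."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def insert(self, key):
        # Smaller keys go left; larger or equal keys go right.
        side = 'left' if key < self.key else 'right'
        child = getattr(self, side)
        if child is None:
            setattr(self, side, Node(key))
        else:
            child.insert(key)

    def in_order(self):
        # In-order traversal yields the keys in sorted order.
        if self.left:
            yield from self.left.in_order()
        yield self.key
        if self.right:
            yield from self.right.in_order()

def tree_sort(items):
    """Sort by inserting into a BST and reading back in order."""
    it = iter(items)
    root = Node(next(it))
    for x in it:
        root.insert(x)
    return list(root.in_order())
```

Each insertion costs time proportional to the height of the tree, which is why keeping the tree balanced (the subject of the next post) matters.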
AVL Trees – Introduces an improvement on Binary Search Trees: trees that balance themselves, keeping their height logarithmic and so allowing us to sort in O(n log n) time.
Munging the Titanic – An introduction to handling data in Python: dealing with missing values, discretisation and so on. Uses the Titanic data.
Plotting with Python – An introduction to making basic plots in Python, using matplotlib and the Titanic Data.
Basic Feature Engineering – Some basic ideas in feature engineering, grabbing important strings, taking combinations of columns, using the Titanic data.
Bootstrapping – Introduces out-of-bag resampling, what it is used for, some of the theory and Python code for doing it.
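The core of the technique fits in a few lines (a sketch of the idea, not the post’s code): resample the data with replacement many times and recompute the statistic on each resample.

```python
import random

def bootstrap(data, statistic, n_resamples=1000, seed=0):
    """Estimate the sampling distribution of a statistic by
    drawing resamples of the data with replacement."""
    rng = random.Random(seed)
    n = len(data)
    return [statistic([data[rng.randrange(n)] for _ in range(n)])
            for _ in range(n_resamples)]
```

The spread of the returned values estimates the sampling variability of the statistic; the points left out of each resample are the ‘out-of-bag’ observations.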
Decision Trees Part 1: Representation – Here we create classes for representing a decision tree.
Decision Trees Part 2: Growing your Tree – Shows how to grow a tree by splitting leaves according to a metric such as the Gini coefficient, entropy or residual sum of squares for regression problems.
Decision Trees Part 3: Pruning your Tree – Shows how to prune a tree by penalizing tree complexity and error on a data set.
To Prune or Not to Prune – Introduces cross-validation and how it can be used to control how a tree is pruned. Also introduces the main decision tree model I use for the Titanic data set.
A Complete Guide to Getting 0.79903 in Kaggle’s Titanic Competition – Provides an overview of the decision tree model, and source code for producing the model used to obtain the score.
My First Genetic Algorithm – A straightforward introduction to genetic algorithms with a Python implementation, applied to a guessing game.
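To illustrate the flavour of such a guessing game (my own sketch; the target string, operators and parameters here are illustrative, not the post’s): a population of random strings evolves towards a target via selection, crossover and mutation.

```python
import random

TARGET = "hello world"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(guess):
    """Score a guess by how many characters match the target in place."""
    return sum(g == t for g, t in zip(guess, TARGET))

def mutate(guess, rate=0.05):
    """Randomly replace each character with probability `rate`."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in guess)

def crossover(a, b):
    """Single-point crossover of two parent strings."""
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=200, generations=500, seed=0):
    random.seed(seed)
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if pop[0] == TARGET:
            break
        parents = pop[:pop_size // 5]   # truncation selection
        children = [pop[0]]             # elitism: keep the best as-is
        while len(children) < pop_size:
            a, b = random.choice(parents), random.choice(parents)
            children.append(mutate(crossover(a, b)))
        pop = children
    return max(pop, key=fitness)
```

Keeping the best individual unchanged each generation (elitism) guarantees the best fitness never decreases.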
Introduction to the Random Forest – Introduction to the random forest algorithm, with an implementation in Python, demonstrating how it can be used to determine variable importance. I demonstrate it on the Titanic dataset, and show that the ‘Spouse’ variable I engineered is completely useless.
Enter the Perceptron – An introduction to the perceptron algorithm with a Python implementation.
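A minimal sketch of the learning rule (my own illustration, not the post’s code): whenever a point is misclassified, nudge the weights towards classifying it correctly.

```python
def train_perceptron(data, epochs=100, lr=0.1):
    """Train a perceptron on (features, label) pairs, labels in {-1, +1}.

    For linearly separable data the algorithm is guaranteed
    to converge after finitely many updates."""
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in data:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # misclassified (or on the boundary)
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    """Classify x by the sign of the learned linear function."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```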
Neural Networks Part 1 – An introduction to feedforward neural networks, deriving the backpropagation algorithm.
Neural Networks Part 2 – Continues by expressing the model in vector notation, and implements this in Python with examples of regression and classification.