The Support Vector Machine (SVM) is one of the most commonly used supervised machine learning algorithms for data classification. A binary classifier, the SVM works in vector space to sort data points by finding the best hyperplane separating them into two groups. Thanks to kernel methods, it can find frontiers between groups of data points even for nonlinear patterns and high-dimensional feature spaces.
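To make the idea concrete, here is a minimal sketch of a linear SVM trained by stochastic subgradient descent on the hinge loss (Pegasos-style). The toy data, hyperparameters, and function names are made up for illustration; a real application would use a tuned library implementation and, for nonlinear frontiers, a kernel.

```python
import random

def train_svm(points, labels, lam=0.01, epochs=300, seed=0):
    """Linear SVM via Pegasos-style subgradient descent.

    points: list of [x1, x2]; labels: +1 or -1.
    Minimizes lam/2 * ||w||^2 + average hinge loss.
    """
    rng = random.Random(seed)
    w = [0.0, 0.0, 0.0]  # weights for [x1, x2, constant bias feature]
    t = 0
    for _ in range(epochs):
        order = list(range(len(points)))
        rng.shuffle(order)
        for i in order:
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            x = points[i] + [1.0]          # append bias feature
            margin = labels[i] * sum(wj * xj for wj, xj in zip(w, x))
            if margin < 1:                 # point inside margin: hinge active
                w = [(1 - eta * lam) * wj + eta * labels[i] * xj
                     for wj, xj in zip(w, x)]
            else:                          # only the regularizer contributes
                w = [(1 - eta * lam) * wj for wj in w]
    return w

def predict(w, point):
    """Sign of the decision function gives the class."""
    score = sum(wj * xj for wj, xj in zip(w, point + [1.0]))
    return 1 if score >= 0 else -1

# Two linearly separable toy clusters
pts = [[1, 1], [1, 2], [2, 1], [5, 5], [5, 6], [6, 5]]
ys = [-1, -1, -1, 1, 1, 1]
w = train_svm(pts, ys)
print(predict(w, [1.5, 1.5]), predict(w, [5.5, 5.5]))
```

The hinge loss only penalizes points that fall inside the margin, which is why the learned hyperplane depends on a few "support" points near the boundary rather than on the whole dataset.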
K Nearest Neighbors (KNN) is a popular classification algorithm for supervised machine learning. It divides data points into groups, defining a model that can then assign an unknown data point to one group or another. The parameter K, chosen when the model is configured, sets how many of the closest known data points the algorithm examines: the unknown point is assigned to the class most common among its K nearest neighbors.
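The majority-vote idea can be sketched in a few lines. This is a from-scratch illustration with invented toy data, not a production implementation (which would use spatial indexing rather than scanning every point):

```python
from collections import Counter
import math

def knn_predict(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k closest training points."""
    # Pair each training point with its Euclidean distance to the query,
    # then keep the k nearest.
    dists = sorted(
        (math.dist(p, query), label)
        for p, label in zip(train_points, train_labels)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy dataset: two clusters labeled "A" and "B"
points = [[1, 1], [1, 2], [2, 1], [5, 5], [6, 5], [5, 6]]
labels = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(points, labels, [1.5, 1.5], k=3))  # near the "A" cluster
print(knn_predict(points, labels, [5.5, 5.5], k=3))  # near the "B" cluster
```

Note that KNN has no training phase to speak of: the "model" is the labeled dataset itself, and all the work happens at prediction time.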
Machine learning makes use of many mathematical formulas and relations to implement the different tasks it can handle. Gathered in the following “cheat sheets” by Afshine and Shervine Amidi, the concepts for supervised and unsupervised learning and deep learning, together with machine learning tips and tricks and reminders on probabilities, statistics, algebra, and calculus, are all presented in detail with the underlying math.
Based on the Stanford course on Machine Learning (CS 229), the cheat sheets summarize the important concepts of each branch with simple explanations and diagrams, such as the following table covering underfitting and overfitting.
| | Underfitting | Just right | Overfitting |
|---|---|---|---|
| Symptoms | • High training error<br>• Training error close to test error<br>• High bias | • Training error slightly lower than test error | • Very low training error<br>• Training error much lower than test error<br>• High variance |
| Deep learning illustration | *(figure)* | *(figure)* | *(figure)* |
| Possible remedies | • Complexify model<br>• Add more features<br>• Train longer | — | • Perform regularization<br>• Get more data |
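As a concrete illustration of one remedy listed for overfitting, L2 regularization shrinks model weights toward zero. The sketch below uses a one-feature ridge regression, whose closed form is w = Σxy / (Σx² + λ); the data and λ values are made up for the example:

```python
def ridge_weight(xs, ys, lam):
    """Closed-form ridge regression weight for a single feature, no intercept:
    minimizes sum((y - w*x)^2) + lam * w^2."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]  # roughly y = 2x

# Increasing lambda pulls the learned weight toward zero.
for lam in (0.0, 1.0, 10.0):
    print(lam, ridge_weight(xs, ys, lam))
```

Stronger regularization trades a little extra training error for a simpler model that generalizes better, which is exactly the underfitting/overfitting trade-off the table summarizes.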
The main machine learning cheat sheets can be found here:
- Supervised Learning
Results about linear models, generative learning, support vector machines and kernel methods
- Unsupervised Learning
Formulas about clustering methods and dimensionality reduction
- Deep Learning
Main concepts around neural networks, backpropagation and reinforcement learning
- Machine Learning Tips and Tricks
Good habits and sanity checks to make sure that your model is trained the right way
Other mathematics and coding cheat sheets can be found here:
- Probabilities and Statistics
Formulas about combinatorics, random variables, main probability distributions, and parameter estimation
- Linear Algebra and Calculus
Matrix-vector notations as well as algebra and calculus properties
- Getting started with Matlab
Main features and good practices to adopt
The complete cheat sheets can also be found on GitHub.