Notes for machine learning
This repo contains tutorials that implement various ML algorithms from scratch or with pre-built libraries. It is a living repo and I will keep adding tutorials as I learn more. I hope it is helpful for anyone who wants to understand these algorithms conceptually as well as learn how to implement them in Python.
01GradientBoosting_Scratch.ipynb This Jupyter notebook contains an implementation of the basic gradient boosting algorithm with an intuitive example. Learn about decision trees and the intuition behind gradient boosted trees; a minimal sketch follows below.
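As a taste of the idea, here is a minimal gradient boosting sketch for regression that fits shallow sklearn decision trees to residuals. It is only an illustration; the notebook's own example, data, and function names differ.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_rounds=50, lr=0.1, max_depth=2):
    """Simple gradient boosting with squared-error loss."""
    base = y.mean()                       # start from the mean prediction
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_rounds):
        residuals = y - pred              # negative gradient of squared error
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        pred += lr * tree.predict(X)      # each tree corrects the remaining error
        trees.append(tree)
    return base, trees

def gradient_boost_predict(base, trees, X, lr=0.1):
    pred = np.full(len(X), base)
    for tree in trees:
        pred += lr * tree.predict(X)
    return pred
```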
02CollaborativeFiltering.ipynb Building a MovieLens recommendation system with collaborative filtering using PyTorch and fast.ai (see the sketch below).
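For context, a minimal dot-product (matrix factorization) model in plain PyTorch is sketched below; the notebook itself uses the fast.ai training pipeline on MovieLens, and the sizes and names here are illustrative only.

```python
import torch
import torch.nn as nn

class DotProductCF(nn.Module):
    """Predicted rating = user · movie + user bias + movie bias."""
    def __init__(self, n_users, n_movies, n_factors=40):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, n_factors)
        self.movie_emb = nn.Embedding(n_movies, n_factors)
        self.user_bias = nn.Embedding(n_users, 1)
        self.movie_bias = nn.Embedding(n_movies, 1)

    def forward(self, users, movies):
        dot = (self.user_emb(users) * self.movie_emb(movies)).sum(dim=1)
        return dot + self.user_bias(users).squeeze(1) + self.movie_bias(movies).squeeze(1)

# Usage: train with MSE loss against observed ratings.
model = DotProductCF(n_users=1000, n_movies=1700)
ratings = model(torch.tensor([0, 1]), torch.tensor([10, 20]))
```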
03RandomForest_Interpretetion.ipynb How to interpret a seemingly black-box algorithm: feature importance, tree interpreter, and confidence intervals for predictions (a short sklearn sketch follows).
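The sketch below shows two of these ideas with plain sklearn: impurity-based feature importances, and the spread of individual tree predictions as a rough confidence measure. The dataset and helpers are placeholders, not the ones in the notebook.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

data = load_diabetes()
X, y = data.data, data.target
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Feature importance: how much each feature reduces impurity across the forest.
for name, imp in sorted(zip(data.feature_names, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")

# Confidence: spread of per-tree predictions around the forest's mean prediction.
tree_preds = np.stack([t.predict(X[:5]) for t in rf.estimators_])
print("mean:", tree_preds.mean(axis=0), "std:", tree_preds.std(axis=0))
```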
04NeuralNet_Scratch.ipynb Using MNIST data, this notebook implements a neural net from scratch using PyTorch; a bare-bones version is sketched below.
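Here is a bare-bones one-hidden-layer network written with raw tensors and autograd instead of nn.Module. Random data stands in for MNIST batches; the notebook's architecture and training loop differ.

```python
import torch

# Random stand-in for MNIST: 64 flattened 28x28 images, 10 classes.
x = torch.randn(64, 784)
y = torch.randint(0, 10, (64,))

# Parameters created by hand; requires_grad lets autograd track them.
w1 = (torch.randn(784, 50) * 0.01).requires_grad_()
b1 = torch.zeros(50, requires_grad=True)
w2 = (torch.randn(50, 10) * 0.01).requires_grad_()
b2 = torch.zeros(10, requires_grad=True)

lr = 0.1
for step in range(100):
    hidden = torch.relu(x @ w1 + b1)          # forward pass
    logits = hidden @ w2 + b2
    loss = torch.nn.functional.cross_entropy(logits, y)
    loss.backward()                           # autograd computes gradients
    with torch.no_grad():                     # manual SGD update
        for p in (w1, b1, w2, b2):
            p -= lr * p.grad
            p.grad.zero_()
```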
05LossFunctions.ipynb Exploring regression and classification loss functions.
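As a quick reference for the kind of losses explored there, here are NumPy versions of three common ones; the notebook covers more and its exact implementations may differ.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: penalizes large residuals quadratically."""
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    """Mean absolute error: more robust to outliers than MSE."""
    return np.mean(np.abs(y_true - y_pred))

def binary_cross_entropy(y_true, p, eps=1e-12):
    """Log loss for binary classification; p is the predicted probability of class 1."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

print(mse(np.array([1.0, 2.0]), np.array([1.5, 1.0])))
print(binary_cross_entropy(np.array([1, 0]), np.array([0.9, 0.2])))
```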
06NLPFastai.ipynb Naive Bayes, logistic regression, and bag of words on IMDB data; a toy version of the pipeline is sketched below.
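A toy bag-of-words plus Naive Bayes pipeline with sklearn is shown below on made-up sentences; the notebook works on the full IMDB dataset with fast.ai utilities.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["a wonderful, moving film", "terrible plot and wooden acting",
        "loved every minute", "boring and way too long"]
labels = [1, 0, 1, 0]                 # 1 = positive review, 0 = negative

vec = CountVectorizer()               # bag of words: token counts per document
X = vec.fit_transform(docs)
clf = MultinomialNB().fit(X, labels)

print(clf.predict(vec.transform(["wooden acting but wonderful film"])))
```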
07_Eigenfaces.ipynb Preprocessing of face images and PCA analysis on the data to reconstruct faces and see similarities among different faces; see the sketch below.
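The core PCA step looks roughly like the sketch below, here using sklearn's Olivetti faces as a stand-in dataset (the notebook's data and preprocessing may differ).

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA

faces = fetch_olivetti_faces()             # 400 face images, 64x64, scaled to [0, 1]
X = faces.data                             # flattened to shape (400, 4096)

pca = PCA(n_components=50, whiten=True).fit(X)
codes = pca.transform(X)                   # each face as 50 "eigenface" coefficients
reconstructed = pca.inverse_transform(codes)  # approximate faces from the codes

# Similar faces end up with similar low-dimensional codes.
print(codes.shape, reconstructed.shape)
```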
08kmeansscratch.ipynb Implementation and visualization of the k-means algorithm from scratch (a compact version is sketched below).
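A compact NumPy version of the algorithm is below for reference; the notebook's implementation and visualizations go further.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Plain k-means: alternate assigning points to the nearest centroid
    and moving each centroid to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iters):
        # Distance of every point to every centroid, shape (n_points, k).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                  else centroids[j] for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
centroids, labels = kmeans(X, k=2)
```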
09QuantileRegression.ipynb Implementation of quantile regression using sklearn.
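One common sklearn route is a gradient boosting model with the quantile (pinball) loss, one model per quantile, as sketched below on synthetic data; the notebook's estimator and data may differ.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=500)

# loss="quantile" optimizes the pinball loss for the given alpha.
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, y)
          for q in (0.1, 0.5, 0.9)}

X_test = np.array([[2.0], [5.0]])
for q, m in models.items():
    print(q, m.predict(X_test))    # lower, median, and upper quantile predictions
```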
10TransferLearn_MXNet.ipynb Tutorial on how to perform transfer learning using MXNet; a rough outline of the approach is sketched below. Notebook used in this blog post.
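The usual Gluon fine-tuning pattern, load a pretrained model from the model zoo, swap the output layer, and retrain, looks roughly like the sketch below. The network, class count, and training details here are assumptions, not necessarily what the notebook or blog post uses.

```python
from mxnet import init
from mxnet.gluon import nn
from mxnet.gluon.model_zoo import vision

num_classes = 10                           # assumed class count for the target task

# Load a ResNet pretrained on ImageNet and keep its feature extractor.
net = vision.resnet18_v2(pretrained=True)

# Replace the classification head with a fresh Dense layer for the new task.
with net.name_scope():
    net.output = nn.Dense(num_classes)
net.output.initialize(init.Xavier())

# From here, train as usual; optionally pass only net.output's parameters to the
# Trainer at first so the pretrained features stay frozen.
```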