
About the developer

anujdutt9



Feature Selection for Machine Learning

This repository contains code for the three main approaches to feature selection in machine learning: Filter Methods, Wrapper Methods, and Embedded Methods. All code is written in Python 3.

Status: Ongoing

Requirements

1. Python 3.5+

2. Jupyter Notebook

3. Scikit-Learn

4. Numpy [+mkl for Windows]

5. Pandas

6. Matplotlib

7. Seaborn

8. mlxtend
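Assuming `pip` is available, the dependencies above can be installed in one step (package names inferred from the list; on Windows, a NumPy+MKL wheel may need to come from a separate source):

```shell
pip install jupyter scikit-learn numpy pandas matplotlib seaborn mlxtend
```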

Datasets

1. Santander Customer Satisfaction Dataset

2. BNP Paribas Cardif Claims Management Dataset

3. Titanic Disaster Dataset

4. Housing Prices Dataset

Filter Methods

| S.No. | Name | About | Status |
| ----- | ----------------- | ------------------------------------------------------------------ | ------------ |
| 1. | Constant Feature Elimination | This notebook explains how to remove constant features during the pre-processing step. | Completed |
| 2. | Quasi-Constant Feature Elimination | This notebook explains how to find quasi-constant features and remove them during pre-processing. | Completed |
| 3. | Duplicate Feature Elimination | This notebook explains how to find duplicate features in a dataset and remove them. | Completed |
| 4. | Correlation | This notebook explains how to measure the correlation between features, and between features and the target, to choose the best features. | Completed |
| 5. | Machine Learning Pipeline | This notebook explains how to use all the above methods in an ML pipeline, with a performance comparison. | Completed |
| 6. | Mutual Information | This notebook explains the concept of mutual information, using classification and regression to find the best features in a dataset. | Completed |
| 7. | Fisher Score (Chi-Square) | This notebook explains the concept of the Fisher score (chi2) for feature selection. | Completed |
| 8. | Univariate Feature Selection | This notebook explains the concept of univariate feature selection for classification and regression. | Completed |
| 9. | Univariate ROC-AUC/MSE | This notebook explains univariate feature selection using ROC-AUC scoring. | Completed |
| 10. | Combining all Methods | This notebook compares the combined performance of all the methods explained above. | Completed |
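The first three filter techniques (constant, quasi-constant, and duplicate feature elimination) can be sketched with scikit-learn and pandas, using a toy DataFrame in place of the real datasets (column names and the `threshold` value are illustrative):

```python
import pandas as pd
from sklearn.feature_selection import VarianceThreshold

# Toy frame standing in for a real dataset.
df = pd.DataFrame({
    "constant":       [1.0, 1.0, 1.0, 1.0, 1.0, 1.0],
    "quasi_constant": [0.0, 0.0, 0.0, 0.0, 0.0, 0.1],
    "informative":    [3.1, 0.2, 5.4, 1.8, 4.4, 2.9],
    "duplicate":      [3.1, 0.2, 5.4, 1.8, 4.4, 2.9],  # exact copy of "informative"
})

# 1-2. Constant / quasi-constant elimination: drop features whose variance
# is (near) zero. threshold=0.01 treats anything below it as quasi-constant.
vt = VarianceThreshold(threshold=0.01)
vt.fit(df)
df = df[df.columns[vt.get_support()]]

# 3. Duplicate elimination: transpose so identical columns become identical
# rows, drop the duplicated rows, then transpose back.
df = df.T.drop_duplicates().T

print(list(df.columns))
```

Only the informative column survives; the constant and quasi-constant columns fall below the variance threshold, and the copy is caught by the duplicate check.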

Wrapper Methods

| S.No. | Name | About | Status |
| ----- | ----------------- | ------------------------------------------------------------------ | ------------ |
| 1. | Step Forward Feature Selection | This notebook explains the concept of Step Forward Feature Selection. | Completed |
| 2. | Step Backward Feature Selection | This notebook explains the concept of Step Backward Feature Selection. | Completed |
| 3. | Exhaustive Search Feature Selection | This notebook explains the concept of Exhaustive Search Feature Selection. | Completed |
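The notebooks use mlxtend for these methods; the same step forward / step backward idea can be sketched with scikit-learn's built-in `SequentialFeatureSelector` (the iris dataset is used here purely as a stand-in):

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# Step forward selection: start from zero features, greedily add the one
# that most improves cross-validated score, stop at n_features_to_select.
sfs = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),
    n_features_to_select=2,
    direction="forward",   # "backward" gives step backward selection
    cv=3,
)
sfs.fit(X, y)
print(sfs.get_support())  # boolean mask over the 4 iris features
```

Exhaustive search differs in that it scores every possible feature subset rather than adding or removing greedily, so it is far more expensive but guaranteed to find the best subset under the chosen scorer.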

Embedded Methods

| S.No. | Name | About | Status |
| ----- | ----------------- | ------------------------------------------------------------------ | ------------ |

*Notebooks for this section are in progress.*
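Since this section is still in progress, here is a sketch of one typical embedded method not yet listed above: L1 (Lasso) regularization, where selection happens inside model training because the penalty drives uninformative coefficients to exactly zero (the synthetic dataset and `alpha` value are illustrative):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso

# Synthetic regression problem: 10 features, only 3 carry signal.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=0.1, random_state=0)

# Embedded selection: fit a Lasso and keep only the features whose
# coefficients the L1 penalty left non-zero.
selector = SelectFromModel(Lasso(alpha=1.0)).fit(X, y)
X_selected = selector.transform(X)
print(X_selected.shape)
```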
