Machine Learning

Quality, not quantity

This 8-week program provides comprehensive theoretical and practical training in Classical Machine Learning, with no prerequisites required. It covers the subject end to end, introducing the three major paradigms — Supervised Learning, Unsupervised Learning, and Reinforcement Learning — and then going deep into regression and classification: starting with data extraction, moving through feature engineering and feature selection, and culminating in model training and evaluation. This structured approach ensures a thorough understanding of each step in the machine learning process.

Content

No prerequisites required
Mock Interview Included
Learn Classical ML in 8 weeks (2.5 hours each)
Week 1: Introduction to Machine Learning

Lecture:

  • Overview of Machine Learning

  • Types of Machine Learning: Supervised, Unsupervised, Reinforcement

  • Applications of Machine Learning in Engineering

Lab:

  • Setting up the environment (Python, Jupyter Notebooks, Libraries)

  • Introduction to basic Python for ML
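As a first lab checkpoint, a minimal sketch like the one below can confirm the environment is ready (it assumes NumPy, pandas, and scikit-learn were installed in the setup step, e.g. via `pip install numpy pandas scikit-learn`):

```python
# Environment check: import the core ML stack and exercise it once.
import numpy as np
import pandas as pd
import sklearn

# A small array and DataFrame to confirm the stack works end to end.
arr = np.arange(6).reshape(2, 3)
df = pd.DataFrame(arr, columns=["a", "b", "c"])

print(np.__version__, pd.__version__, sklearn.__version__)
print(df.shape)  # (2, 3)
```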

Week 2: Data Preprocessing

Lecture:

  • Understanding Data: Types, Features, and Quality

  • Data Cleaning: Handling missing values, outliers

  • Data Transformation: Normalization, Standardization, Encoding categorical variables

Lab:

  • Practical exercises on data preprocessing using pandas and scikit-learn

Week 3: Supervised Learning - Regression

Lecture:

  • Linear Regression: Concepts, Assumptions, and Implementation

  • Evaluation Metrics: MAE, MSE, RMSE, R²

  • Regularization techniques: Ridge, Lasso

Lab:

  • Implementing Linear Regression models

  • Evaluating and tuning the models
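The Week 3 lab can be sketched as follows — fitting plain and Ridge-regularized linear regression on synthetic data and computing the four metrics from the lecture (the data-generating line y = 3x + 2 is illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Synthetic data: y = 3x + 2 plus Gaussian noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X.ravel() + 2 + rng.normal(0, 1, size=100)

model = LinearRegression().fit(X, y)
pred = model.predict(X)

# Evaluation metrics: MAE, MSE, RMSE, R^2.
mae = mean_absolute_error(y, pred)
mse = mean_squared_error(y, pred)
rmse = np.sqrt(mse)
r2 = r2_score(y, pred)

# Ridge adds an L2 penalty on the coefficients; alpha controls its strength.
ridge = Ridge(alpha=1.0).fit(X, y)
print(mae, rmse, r2, model.coef_[0], ridge.coef_[0])
```

Lasso (`sklearn.linear_model.Lasso`) follows the same pattern with an L1 penalty, which can drive some coefficients exactly to zero.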

Week 4: Supervised Learning - Classification

Lecture:

  • Evaluation Metrics for Classification

  • Confusion Matrix

  • F1 score, Precision, Recall

  • ROC Curve, ROC-AUC

Lab:

  • Building and evaluating classification models
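A minimal sketch of the Week 4 lab, tying the lecture's metrics together — confusion matrix, precision, recall, F1, and ROC-AUC — on a synthetic binary problem (logistic regression is used here only as a convenient baseline classifier):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (confusion_matrix, precision_score,
                             recall_score, f1_score, roc_auc_score)

# Synthetic binary classification data, held-out test split.
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred = clf.predict(X_te)

cm = confusion_matrix(y_te, pred)   # rows: true class, columns: predicted class
prec = precision_score(y_te, pred)  # TP / (TP + FP)
rec = recall_score(y_te, pred)      # TP / (TP + FN)
f1 = f1_score(y_te, pred)           # harmonic mean of precision and recall
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(cm, prec, rec, f1, auc)
```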

Week 5: Unsupervised Learning

Lecture:

  • Clustering: K-means, Hierarchical Clustering

  • Dimensionality Reduction: PCA, t-SNE

Lab:

  • Practical exercises on clustering and dimensionality reduction
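The Week 5 lab pairing of clustering and dimensionality reduction can be sketched like this — K-means on synthetic blobs, then PCA down to two components (t-SNE follows the same fit/transform pattern via `sklearn.manifold.TSNE`):

```python
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Three well-separated blobs in 5 dimensions (illustrative data).
X, _ = make_blobs(n_samples=300, centers=3, n_features=5, random_state=0)

# K-means partitions points into k clusters by minimizing within-cluster variance.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = km.labels_

# PCA projects the data onto the two directions of maximum variance.
pca = PCA(n_components=2)
X2 = pca.fit_transform(X)
print(len(set(labels)), X2.shape)  # 3 (300, 2)
```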

Week 6: Model Evaluation and Selection

Lecture:

  • Cross-Validation techniques

  • Bias-Variance Tradeoff

  • Hyperparameter tuning: Grid Search, Random Search

Lab:

  • Applying cross-validation and hyperparameter tuning on models
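A compact sketch of the Week 6 lab: 5-fold cross-validation, then an exhaustive grid search over hyperparameters (the SVC parameter grid here is a small illustrative choice, not a recommended tuning range):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score, GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# 5-fold cross-validation: average accuracy over 5 train/validation splits.
scores = cross_val_score(SVC(), X, y, cv=5)

# Grid search: try every combination in param_grid, scoring each by CV.
param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.1]}
grid = GridSearchCV(SVC(), param_grid, cv=5).fit(X, y)
print(scores.mean(), grid.best_params_)
```

Random search (`RandomizedSearchCV`) uses the same interface but samples a fixed number of configurations instead of trying them all, which scales better to large grids.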

Week 7: Advanced Topics

Lecture:

  • Ensemble Methods: Bagging, Boosting (Random Forest, Gradient Boosting)

  • Support Vector Machines (SVM)

Lab:

  • Implementing and evaluating ensemble methods and SVM
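The Week 7 lab comparison can be sketched by training the three lecture models side by side on one dataset (the synthetic data and train/test split are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    # Bagging: many trees on bootstrap samples, predictions averaged.
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    # Boosting: trees fit sequentially, each correcting the last one's errors.
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
    # SVM: maximum-margin separator, here with an RBF kernel.
    "svm": SVC(kernel="rbf"),
}
accs = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
print(accs)
```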

Week 8: Project and Review

Lecture:

  • Overview of a machine learning project workflow

  • Best practices in model deployment and maintenance
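One deployment best practice from the lecture — shipping preprocessing and model as a single persisted artifact — can be sketched with scikit-learn pipelines and joblib (the filename `model.joblib` is an illustrative choice):

```python
import joblib
from sklearn.datasets import load_iris
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# Bundle preprocessing and model so deployment loads one artifact.
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X, y)

# Persist to disk and reload: the restored pipeline predicts identically.
joblib.dump(pipe, "model.joblib")
restored = joblib.load("model.joblib")
print((restored.predict(X) == pipe.predict(X)).all())  # True
```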

Lab:

  • Students work on a capstone project, applying all the concepts learned

  • Presentations and reviews

Additional Resources

Miscellaneous: Doubt-clearing (Q&A) session