Machine Learning Fundamentals (en)
This course provides an introductory overview of the core concepts and techniques of machine learning. Structured over two intensive days, it combines theory and practice to give students a solid foundation in machine learning.
During the course, students will be introduced to the Jupyter Notebook development environment and explore key concepts such as model representation, cost functions, and gradient descent. Through a series of lectures and hands-on labs, they will gain skills in applying multiple linear regression and classification techniques, with an emphasis on logistic regression. Students will learn to implement these models using Python and the Scikit-Learn library, tuning the learning rate and applying feature engineering techniques.
The course also covers advanced topics such as overfitting and introduces basic regularization techniques.
CODE: DSAI200
Category: Artificial Intelligence
Teaching methodology
The course includes hands-on labs in which each student completes training exercises for every topic covered, gaining practical experience with the tools introduced during the course.
Prerequisites
- Basic knowledge of computer science and programming.
- Familiarity with Python and its syntax.
- Understanding of basic concepts of linear algebra and differential calculus.
- Familiarity with the Scikit-Learn library for machine learning.
The following is an overview of course content:
Day 1
- Introduction to Jupyter Notebook
- Model Representation
- Cost Function
- Gradient Descent
- Vectorization
- Multiple Linear Regression
- Learning Rate
- Feature Engineering
- Linear Regression with Scikit-Learn
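As an illustrative sketch of the Day 1 material (the actual lab code may differ), the cost function, vectorized gradient descent, and learning rate come together in a simple linear regression like this; the toy data and parameter values here are assumptions for demonstration:

```python
import numpy as np

# Toy data (illustrative only): y = 3x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.random((100, 1))
y = 3.0 * X[:, 0] + 1.0 + rng.normal(0, 0.01, 100)

# Vectorized batch gradient descent for linear regression:
# minimize the mean-squared-error cost J(w, b)
w, b = 0.0, 0.0
alpha = 0.5  # learning rate: too large diverges, too small is slow
for _ in range(2000):
    y_hat = w * X[:, 0] + b          # model prediction
    err = y_hat - y                  # residuals
    w -= alpha * (err @ X[:, 0]) / len(y)  # dJ/dw, computed without a loop
    b -= alpha * err.mean()                # dJ/db

print(round(w, 1), round(b, 1))  # recovers approximately 3.0 and 1.0
```

The same fit is obtained in one call with `sklearn.linear_model.LinearRegression`, which is how the Scikit-Learn portion of the day approaches it.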
Day 2
- Classification
- Logistic Regression
- Decision Boundary
- Logistic Loss
- Cost Function for Logistic Regression
- Gradient Descent for Logistic Regression
- Logistic Regression Using Scikit-Learn
- Overfitting
- Regularized Cost and Gradient
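As a hedged sketch of the Day 2 material (not official course code), logistic regression with L2 regularization in Scikit-Learn can be exercised on assumed toy data like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy binary dataset (illustrative only): label is 1 when the two
# features sum to more than 1, giving a linear decision boundary
rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

# C is the inverse regularization strength: smaller C means stronger
# L2 regularization, one way to limit overfitting
clf = LogisticRegression(C=1.0)
clf.fit(X, y)

print(clf.score(X, y))  # training accuracy on the toy data
```

The fitted `coef_` and `intercept_` define the decision boundary `w·x + b = 0`, which students interpret during the labs.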
Students will obtain:
- Basic understanding of how machine learning models are represented and the differences between various types of models.
- Familiarity with cost functions for evaluating the accuracy of machine learning models.
- Introductory knowledge of the gradient descent method for optimizing machine learning models.
- Basic skills in implementing gradient descent in Python.
- Awareness of vectorization techniques to improve the computational efficiency of machine learning models.
- Initial proficiency in applying multiple linear regression techniques to predict continuous values.
- Working knowledge of using the Scikit-Learn library to implement linear regression.
- Understanding of the importance of learning rate in model training.
- Introduction to feature engineering techniques to improve the performance of machine learning models.
- Understanding of the basic concepts of classification and logistic regression.
- Hands-on experience implementing logistic regression using Python and Scikit-Learn.
- Ability to interpret decision boundaries for simple classification models.
- Introductory knowledge of cost functions specific to logistic regression, such as logistic loss.
- Awareness of regularization techniques to prevent overfitting and improve model generalization.
Duration – 2 days
Delivery – in Classroom, On Site, Remote
PC and SW requirements:
- Internet connection
- Web browser (Google Chrome)
- Zoom
Language
- Instructor: English
- Workshops: English
- Slides: English