Fundamentals of Regression in Machine Learning

Regression is a fundamental supervised machine learning technique used to predict continuous outcomes from input data. In this interactive four-hour workshop, participants will explore the core concepts of regression, including simple and multiple linear regression, regularisation techniques (Ridge and Lasso), and model evaluation, with an introduction to Bayesian methods and uncertainty quantification. Through hands-on coding exercises in Python (Scikit-Learn, statsmodels, cmdstanpy, NumPy, Pandas, Matplotlib), attendees will learn how to build, interpret, and optimise regression models on real-world datasets. They will also learn how to enable implicit parallelisation to make effective use of NCI facilities. By the end of the session, participants will have the practical skills to apply regression techniques to predictive modelling problems. No prior machine learning experience is required, but basic Python and statistics knowledge is assumed.
This workshop is ideal for researchers looking to apply machine learning techniques to their data, enabling them to build and evaluate regression models for predictive analysis.
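To give a flavour of the hands-on exercises, the sketch below fits and evaluates a simple linear regression with Scikit-Learn. The synthetic dataset, random seed, split ratio, and coefficient values are illustrative assumptions only, standing in for the real-world datasets used in the workshop.

```python
# Minimal sketch: fit and evaluate a simple linear regression with scikit-learn.
# The synthetic data below is a stand-in for the workshop's real-world datasets.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(200, 1))             # single predictor
y = 2.5 * X[:, 0] + 1.0 + rng.normal(0, 1, 200)   # linear signal plus noise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LinearRegression().fit(X_train, y_train)  # ordinary least squares fit
y_pred = model.predict(X_test)

print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("MSE:", mean_squared_error(y_test, y_pred))
print("R²:", r2_score(y_test, y_pred))
```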
By the end of this workshop, participants will be able to:
Define key concepts of regression in supervised learning, including types of regression models, their applications and mathematical basis.
Compare different regression techniques (e.g., simple vs. multiple regression, Ridge vs. Lasso, frequentist vs. Bayesian) and explain their advantages and when each is appropriate.
Apply linear regression models using Python and relevant statistics packages to analyse real-world datasets.
Evaluate model performance using metrics such as the R² score and Mean Squared Error (MSE), alongside cross-validation and parameter uncertainty estimates.
Optimise regression models through feature engineering, regularisation techniques, and hyperparameter tuning (see the sketch after this list).
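The sketch below illustrates the last two outcomes: regularised regression (Ridge and Lasso) with cross-validated hyperparameter tuning. The scikit-learn diabetes dataset and the candidate alpha grid are illustrative assumptions, not the workshop's actual datasets or settings.

```python
# Illustrative sketch: regularised regression with cross-validated tuning.
# The diabetes dataset is used purely as a stand-in for the workshop data.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge, Lasso
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)

for name, estimator in [("ridge", Ridge()), ("lasso", Lasso(max_iter=10_000))]:
    # Scale features before penalised regression so the penalty
    # treats all coefficients comparably.
    pipe = make_pipeline(StandardScaler(), estimator)
    grid = GridSearchCV(
        pipe,
        param_grid={f"{name}__alpha": [0.01, 0.1, 1.0, 10.0]},  # illustrative grid
        scoring="r2",
        cv=5,
    )
    grid.fit(X, y)
    print(name, "best alpha:", grid.best_params_, "CV R²:", grid.best_score_)
```

Scaling inside the pipeline keeps the cross-validation honest: the scaler is refit on each training fold, so no information from the held-out fold leaks into the penalised fit.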