Overview
In this tutorial you will explore advanced regression techniques that go beyond ordinary least squares. Topics include regularization, polynomial features, and ridge, lasso, and elastic net regression, with code examples in Python and an interactive demo.
Regularization
Regularization adds a penalty term to the loss function that discourages large coefficients, reducing overfitting. Two popular forms are the L2 penalty (ridge) and the L1 penalty (lasso).
from sklearn.linear_model import Ridge, Lasso

# alpha controls the penalty strength: larger values shrink coefficients more
ridge = Ridge(alpha=1.0)  # L2 penalty
lasso = Lasso(alpha=0.1)  # L1 penalty
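In symbols, with coefficients β and penalty strength α, the two penalized objectives can be written as follows (a standard textbook formulation; libraries differ in exact scaling — scikit-learn's Lasso, for instance, also divides the squared-error term by twice the number of samples):

\min_\beta \; \|y - X\beta\|_2^2 + \alpha \|\beta\|_2^2 \quad (\text{ridge, L2})
\min_\beta \; \|y - X\beta\|_2^2 + \alpha \|\beta\|_1 \quad (\text{lasso, L1})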
Polynomial Regression
Extend linear models by adding polynomial features of the inputs, letting a model that is linear in its parameters capture non-linear relationships.
from sklearn.preprocessing import PolynomialFeatures

# expand the feature matrix X with squared and cubed terms (and interactions);
# include_bias=False skips the constant column, which the model adds itself
poly = PolynomialFeatures(degree=3, include_bias=False)
X_poly = poly.fit_transform(X)
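In practice the expansion is usually chained with the downstream model so both steps are applied consistently. A minimal sketch using make_pipeline; the toy cubic dataset here is an illustrative assumption, not part of the original example:

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# illustrative toy data: a noisy cubic
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.5 * X[:, 0] ** 3 - X[:, 0] + rng.normal(scale=2.0, size=200)

# chain the feature expansion and the linear fit in one estimator
model = make_pipeline(PolynomialFeatures(degree=3, include_bias=False), LinearRegression())
model.fit(X, y)
print(model.predict([[1.5]]))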
Ridge Regression
Ridge minimizes the sum of squared residuals plus a penalty proportional to the squared L2 norm of the coefficient vector.
# solver='auto' lets scikit-learn choose a solver suited to the data
ridge = Ridge(alpha=0.5, solver='auto')
ridge.fit(X_train, y_train)
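In practice alpha is usually tuned rather than fixed. A minimal sketch using scikit-learn's RidgeCV, assuming the same X_train and y_train as above; the alpha grid is an arbitrary illustration:

import numpy as np
from sklearn.linear_model import RidgeCV

# search a log-spaced grid of penalty strengths with built-in cross-validation
alphas = np.logspace(-3, 3, 13)
ridge_cv = RidgeCV(alphas=alphas, cv=5)
ridge_cv.fit(X_train, y_train)
print(ridge_cv.alpha_)  # the selected penalty strength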
Lasso Regression
Lasso adds an L1 penalty, encouraging sparse solutions in which some coefficients become exactly zero, effectively performing feature selection.
# a higher max_iter helps the coordinate-descent solver converge at small alphas
lasso = Lasso(alpha=0.01, max_iter=10000)
lasso.fit(X_train, y_train)
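The sparsity is easy to verify after fitting. A short sketch, assuming the lasso fit above:

import numpy as np

# count how many coefficients the L1 penalty drove exactly to zero
n_zero = np.sum(lasso.coef_ == 0)
print(f"{n_zero} of {lasso.coef_.size} coefficients are exactly zero")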
Elastic Net
Elastic Net combines the L1 and L2 penalties, balancing lasso's sparsity against ridge's stability; the l1_ratio parameter controls the mix.
from sklearn.linear_model import ElasticNet

# l1_ratio=0.5 gives equal weight to the L1 and L2 penalties
elastic = ElasticNet(alpha=0.1, l1_ratio=0.5)
elastic.fit(X_train, y_train)
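Both the overall strength and the mix can be selected by cross-validation. A sketch using scikit-learn's ElasticNetCV; the candidate l1_ratio values are illustrative:

from sklearn.linear_model import ElasticNetCV

# jointly select alpha and the L1/L2 mix by cross-validation
enet_cv = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5, max_iter=10000)
enet_cv.fit(X_train, y_train)
print(enet_cv.alpha_, enet_cv.l1_ratio_)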
Interactive Demo
Select a regression type and penalty strength to see the fitted curve on a synthetic dataset.
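If you are reading a static copy of this page, the sketch below reproduces the demo's idea offline: fit each penalized model on a synthetic curve and plot the results. The dataset, polynomial degree, and alpha values are illustrative assumptions, not the demo's exact settings:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Ridge, Lasso, ElasticNet

# synthetic non-linear data (assumed; the demo's dataset may differ)
rng = np.random.default_rng(42)
X = np.sort(rng.uniform(-3, 3, size=(120, 1)), axis=0)
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=120)

grid = np.linspace(-3, 3, 300).reshape(-1, 1)
models = {
    "ridge": Ridge(alpha=0.5),
    "lasso": Lasso(alpha=0.01, max_iter=10000),
    "elastic net": ElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=10000),
}

plt.scatter(X, y, s=10, alpha=0.5, label="data")
for name, estimator in models.items():
    # degree-5 polynomial features, standardized, feed each penalized model
    model = make_pipeline(PolynomialFeatures(degree=5, include_bias=False),
                          StandardScaler(), estimator)
    model.fit(X, y)
    plt.plot(grid, model.predict(grid), label=name)
plt.legend()
plt.show()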