Regularization

Prevent overfitting with regularization

Regularization techniques are used to prevent statistical overfitting in a predictive model. Regularization algorithms typically work by applying either a penalty for complexity, such as adding a function of the model's coefficients to the quantity being minimized, or a roughness penalty. By introducing this additional information into the model, regularization algorithms can deal with multicollinearity and redundant predictors, making the model more parsimonious and accurate. A penalized objective takes the general form shown below.
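For linear regression, a common formulation (standard notation, not specific to any particular toolbox) is

    minimize over β:   ‖y − Xβ‖₂² + λ·P(β)

where λ ≥ 0 controls the strength of the penalty and P(β) is ‖β‖₁ for lasso or ‖β‖₂² for ridge regression.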

Popular regularization techniques include ridge regression (also known as Tikhonov regularization), the lasso and elastic net algorithms, and the method of shrunken centroids. Trace plots and cross-validated mean squared error are not regularization techniques themselves, but are commonly used to choose the strength of the penalty, as sketched below. You can also apply the Akaike information criterion (AIC) as a goodness-of-fit metric.
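A minimal sketch in MATLAB, assuming the Statistics and Machine Learning Toolbox and using simulated data as a placeholder for a real dataset:

    % Simulated data: 100 observations, 20 predictors, 5 of them truly active
    rng(0);
    X = randn(100, 20);
    y = X(:, 1:5)*ones(5, 1) + 0.5*randn(100, 1);

    % Fit the lasso over a grid of lambda values with 10-fold cross-validation
    [B, FitInfo] = lasso(X, y, 'CV', 10);

    % Coefficients at the lambda value that minimizes cross-validated MSE
    bestCoefs = B(:, FitInfo.IndexMinMSE);

    % Trace plot: how each coefficient shrinks as lambda grows
    lassoPlot(B, FitInfo, 'PlotType', 'Lambda', 'XScale', 'log');

    % Cross-validated MSE as a function of lambda
    lassoPlot(B, FitInfo, 'PlotType', 'CV');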

Each regularization technique offers advantages for certain use cases.

  • Lasso uses an L1 norm penalty and tends to force individual coefficient values completely to zero. As a result, lasso works very well as a feature selection algorithm: it quickly identifies a small number of key variables.
  • Ridge regression uses an L2 norm penalty on the coefficients (you're adding the sum of the squared coefficients to the quantity being minimized). Ridge regression tends to spread shrinkage across a larger number of coefficients rather than zeroing any of them out. If you think your model should contain a large number of coefficients, ridge regression is probably a good technique.
  • Elastic net blends the L1 and L2 penalties and can compensate for lasso's inability to identify additional predictors: when predictors outnumber observations, lasso can select at most as many predictors as there are observations, while elastic net is not bound by that limit. See the sketch after this list.
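In MATLAB, the 'Alpha' name-value argument of the lasso function mixes the two penalties (Alpha = 1 is pure lasso; values between 0 and 1 give elastic net), while pure ridge is provided by the separate ridge function. A minimal sketch with simulated data; the penalty values are illustrative:

    % Simulated data (illustrative): 100 observations, 20 predictors
    rng(0);
    X = randn(100, 20);
    y = X(:, 1:5)*ones(5, 1) + 0.5*randn(100, 1);

    lambdaVal = 0.1;  % illustrative penalty strength

    Blasso = lasso(X, y, 'Alpha', 1,   'Lambda', lambdaVal);  % pure L1
    Bnet   = lasso(X, y, 'Alpha', 0.5, 'Lambda', lambdaVal);  % L1/L2 blend

    % Pure L2; note the ridge parameter is on a different scale than Lambda.
    % The trailing 0 returns coefficients on the original data scale
    % (first element is the intercept).
    Bridge = ridge(y, X, lambdaVal, 0);

    % Lasso zeros out coefficients entirely; ridge only shrinks them
    fprintf('Nonzero coefficients -- lasso: %d, elastic net: %d, ridge: %d\n', ...
        nnz(Blasso), nnz(Bnet), nnz(Bridge(2:end)));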

Regularization is related to feature selection in that it forces a model to use fewer predictors. Regularization methods have some distinct advantages:

  • Regularization techniques can operate on much larger datasets than most feature selection methods (univariate feature selection being the exception). Lasso and ridge regression can be applied to datasets that contain thousands, even tens of thousands, of variables.
  • Regularization algorithms often generate more accurate predictive models than feature selection. Regularization operates over a continuous space, while feature selection operates over a discrete space. As a result, regularization can often fine-tune the model and produce more accurate estimates.

However, feature selection methods also have advantages:

  • Feature selection is somewhat more intuitive and easier to explain to third parties. This is valuable when you have to describe your methods while sharing your results.
  • MATLAB® and Statistics and Machine Learning Toolbox™ support all popular regularization techniques, which are available for linear regression, logistic regression, support vector machines, and linear discriminant analysis. If you're working with other model types, such as boosted decision trees, you need to apply feature selection instead. The sketch after this list shows how regularization is exposed for these supported model types.
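An illustrative sketch, assuming the Statistics and Machine Learning Toolbox; the fitting functions are real, but the data and parameter values are placeholders:

    % Simulated classification data (illustrative)
    rng(1);
    X = randn(200, 10);
    yClass = double(X(:, 1) + X(:, 2) + 0.5*randn(200, 1) > 0);

    % Lasso-regularized logistic regression
    logitModel = fitclinear(X, yClass, 'Learner', 'logistic', ...
        'Regularization', 'lasso');

    % Ridge-regularized linear SVM
    svmModel = fitclinear(X, yClass, 'Learner', 'svm', ...
        'Regularization', 'ridge');

    % Regularized linear discriminant analysis: Gamma in [0, 1]
    ldaModel = fitcdiscr(X, yClass, 'Gamma', 0.5);

    % Ridge-regularized linear regression for a continuous response
    yReg = X(:, 1) + 0.5*randn(200, 1);
    regModel = fitrlinear(X, yReg, 'Regularization', 'ridge');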

Key points

  • Regularization is used (alongside feature selection) to prevent statistical overfitting in a predictive model.
  • Because regularization operates over a continuous space, it can outperform discrete feature selection for machine learning problems that lend themselves to various kinds of linear modeling.

Example scenario

Let's assume that you are running a cancer research study. You have gene sequences for 500 different cancer patients, and you're trying to determine which of 15,000 different genes have a significant impact on the progression of the disease. You could apply one of the feature ranking methods, or univariate feature selection if you're concerned about runtime; only sequential feature selection is completely impractical with this many variables. Alternatively, you can explore models with regularization. You can't use ridge regression, because it never forces coefficients all the way to zero. At the same time, you can't use lasso, since with more predictors than observations it can select at most as many predictors as there are observations, and you might need to identify more than 500 different genes. The elastic net is one possible solution.
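A minimal sketch of this scenario, with simulated data standing in for real gene measurements (the sizes match the scenario, so the cross-validated fit can take a while to run):

    % Simulated stand-in for the study: 500 patients, 15,000 genes
    rng(42);
    nPatients = 500;
    nGenes = 15000;
    X = randn(nPatients, nGenes);

    % Assume, illustratively, that 600 genes truly affect progression --
    % more active predictors than observations, so lasso alone falls short
    beta = zeros(nGenes, 1);
    beta(randperm(nGenes, 600)) = randn(600, 1);
    y = X*beta + randn(nPatients, 1);

    % Elastic net: Alpha = 0.5 blends the lasso and ridge penalties
    [B, FitInfo] = lasso(X, y, 'Alpha', 0.5, 'CV', 5, 'NumLambda', 20);

    % Genes selected at the lambda within one standard error of the minimum
    selected = find(B(:, FitInfo.Index1SE));
    fprintf('Elastic net selected %d candidate genes\n', numel(selected));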


Examples and how to

  • What regularization does to a function y = f(x)

See also: feature selection, machine learning, supervised learning, linear model, AutoML

Free white paper

Machine Learning Challenges: Choosing the Best Classification Model and Avoiding Overfitting. Understand the strengths of the most common classification models, learn how to correct and prevent overfitting, and see useful functions in MATLAB.