In statistics, an additive model (AM) is a nonparametric regression method. It was suggested by Jerome H. Friedman and Werner Stuetzle (1981)[1] and is an essential part of the ACE algorithm. The AM uses a one-dimensional smoother to build a restricted class of nonparametric regression models. Because of this, it is less affected by the curse of dimensionality than a p-dimensional smoother. Furthermore, the AM is more flexible than a standard linear model, while being more interpretable than a general regression surface, at the cost of approximation error. Like many other machine-learning methods, the AM is subject to problems of model selection, overfitting, and multicollinearity.
Given a data set $\{y_i,\, x_{i1}, \ldots, x_{ip}\}_{i=1}^n$ of $n$ statistical units, where $\{x_{i1}, \ldots, x_{ip}\}_{i=1}^n$ represent the predictors and $y_i$ is the outcome, the additive model takes the form

$E[y_i \mid x_{i1}, \ldots, x_{ip}] = \beta_0 + \sum_{j=1}^{p} f_j(x_{ij})$

or

$Y = \beta_0 + \sum_{j=1}^{p} f_j(X_j) + \varepsilon$

where $E[\varepsilon] = 0$, $\operatorname{Var}(\varepsilon) = \sigma^2$ and $E[f_j(X_j)] = 0$. The functions $f_j(x_{ij})$ are unknown smooth functions fit from the data. Fitting the AM (i.e. the functions $f_j(x_{ij})$) can be done using the backfitting algorithm.
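The backfitting idea can be illustrated with a minimal sketch: start with the intercept $\beta_0 = \bar{y}$, then repeatedly smooth the partial residuals against each predictor in turn, re-centering each $f_j$ so that it averages to zero. The helper names below (`knn_smooth`, `backfit`) and the choice of a k-nearest-neighbour running mean as the one-dimensional smoother are illustrative assumptions, not part of the original algorithm's specification; any 1-D smoother could be substituted.

```python
import numpy as np

def knn_smooth(x, y, k=15):
    """Running-mean smoother: for each point, average y over the k
    nearest neighbours in x (a stand-in for any 1-D smoother)."""
    fitted = np.empty_like(y)
    for i, xi in enumerate(x):
        idx = np.argsort(np.abs(x - xi))[:k]
        fitted[i] = y[idx].mean()
    return fitted

def backfit(X, y, n_iter=20, k=15):
    """Fit E[y|x] = beta0 + sum_j f_j(x_j) by backfitting.
    Returns the intercept beta0 and the fitted f_j at the data points."""
    n, p = X.shape
    beta0 = y.mean()
    F = np.zeros((n, p))  # column j holds the current f_j(x_ij)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove everything except f_j's contribution.
            r = y - beta0 - F.sum(axis=1) + F[:, j]
            F[:, j] = knn_smooth(X[:, j], r, k)
            F[:, j] -= F[:, j].mean()  # enforce E[f_j] = 0
    return beta0, F

# Synthetic data with an additive truth: y = 1 + sin(x1) + x2^2 + noise
rng = np.random.default_rng(0)
n = 300
X = rng.uniform(-2, 2, size=(n, 2))
y = 1.0 + np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.2, n)

beta0, F = backfit(X, y)
fitted = beta0 + F.sum(axis=1)
```

Because each component is a one-dimensional smooth of a partial residual, the fit never requires a p-dimensional smoother, which is precisely how the additive structure sidesteps the curse of dimensionality.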