Encyclopedia Entry

Residual Sum of Squares

Shannon Howle Schelin

One of the key components of ANALYSIS OF VARIANCE (ANOVA) and REGRESSION, both LINEAR and multiple, is the residual sum of squares, also called the sum of squared errors or error sum of squares. A residual is the difference between the observed and predicted values of the dependent variable, so the residual sum of squares reflects the variance in the dependent variable left unexplained by the linear regression model. In ordinary least squares regression, each residual is squared and the squared residuals are summed across all observations to form the residual sum of squares, denoted SSE. The residual sum of squares thus measures the variation not accounted for by the linear model: the scatter of the observations about the prediction line. In mathematical notation, SSE = Σ(Y_i − Ŷ_i)², where Y_i is the ith observed value of the dependent variable and Ŷ_i is the corresponding predicted value.
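The computation described above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original entry: the data are made up, and the least-squares line is fit with NumPy's `polyfit` rather than any particular statistics package.

```python
import numpy as np

# Illustrative data (hypothetical, for demonstration only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit a simple linear regression by ordinary least squares
# (degree-1 polynomial fit returns slope and intercept).
slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept      # predicted values, Ŷ_i

# Residuals are observed minus predicted; square and sum them
# to obtain the residual sum of squares, SSE = Σ(Y_i − Ŷ_i)².
residuals = y - y_hat
sse = np.sum(residuals ** 2)
print(sse)
```

Because ordinary least squares chooses the slope and intercept that minimize exactly this quantity, any other line drawn through the same points would yield a larger SSE.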
