The true line and a regression line are compared. The idea is that the regression line you draw is your attempt to guess what the true line is, based on the data. A distinction is drawn between errors (deviations from the true line) and residuals (deviations from the regression line).
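The error/residual distinction can be sketched in a few lines of code. This is an illustrative simulation, not the demo's own code; the true line's intercept and slope (2 and 0.5) and the noise level are made up for the example.

```python
import random

random.seed(0)

# Hypothetical true line: Y = 2 + 0.5*X + normal error (parameters are illustrative)
a_true, b_true = 2.0, 0.5
xs = [float(x) for x in range(10)]
ys = [a_true + b_true * x + random.gauss(0, 1) for x in xs]

# Fit the least squares line from the data (our "guess" at the true line)
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b_hat = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
a_hat = ybar - b_hat * xbar

# Errors: deviations from the TRUE line (unobservable in practice)
errors = [y - (a_true + b_true * x) for x, y in zip(xs, ys)]
# Residuals: deviations from the FITTED line (what we can actually compute)
residuals = [y - (a_hat + b_hat * x) for x, y in zip(xs, ys)]

# Residuals from a least squares fit with an intercept sum to (essentially) zero;
# the errors generally do not.
print(sum(residuals))
```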
Least squares regression illustrated. You can move a line around and watch as squares get bigger or smaller. The squares are the squares of the residuals. The least squares method makes the total size of the squares the smallest possible. (Least squares regression with a true line with 0 slope.)
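The "total size of the squares" in this demo is the sum of squared residuals. A minimal sketch (not the demo's code) of why the least squares line is the winner, using made-up data from a true line with slope 0 plus noise:

```python
import random

random.seed(0)
xs = [float(x) for x in range(10)]
ys = [5.0 + random.gauss(0, 1) for _ in xs]  # true line: y = 5 (slope 0), illustrative

def ssr(a, b):
    """Total size of the squares: sum of squared residuals for line y = a + b*x."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Least squares fit (closed form)
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b_hat = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
a_hat = ybar - b_hat * xbar

# Moving the line away from the least squares fit only makes the total bigger
best = ssr(a_hat, b_hat)
others = [ssr(a_hat + da, b_hat + db) for da in (-1, 0, 1) for db in (-0.3, 0, 0.3)]
print(all(o >= best for o in others))  # True
```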
Bias concept illustrated with a target shooting analogy.
Normal distribution of the error illustrated, using discrete values of X. For hypothesis testing, we assume that the errors are normally distributed. This shows how tight normally distributed errors are and how rare outliers are.
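The tightness of normal errors can be checked by simulation. This sketch (illustrative, with an assumed standard deviation of 1) counts how often errors stay within two standard deviations and how rarely they stray beyond three:

```python
import random

random.seed(3)

# Draw many normal errors with standard deviation 1 (illustrative sigma)
errors = [random.gauss(0, 1) for _ in range(100_000)]

within_2sd = sum(abs(e) <= 2 for e in errors) / len(errors)
beyond_3sd = sum(abs(e) > 3 for e in errors) / len(errors)
print(within_2sd)   # about 0.95: most errors stay close to zero
print(beyond_3sd)   # about 0.003: outliers beyond 3 sd are rare
```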
Normal distribution of the error illustrated, using continuous values of X. A prettier version of the above demo.
Efficiency comparison between connect-the-end-points and least squares regression. An inferior method of drawing a regression line is compared with least squares. You see why that method is inferior.
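The inferiority of connect-the-end-points shows up as higher variance of the slope estimate across repeated samples. A simulation sketch (not the demo's code; true line, sample size, and trial count are made up):

```python
import random

random.seed(1)

def slope_variances(n_trials=2000):
    """Variance of the slope estimate from two methods over repeated samples.
    Assumed true line: Y = 1 + 0*X + N(0, 1) noise (illustrative)."""
    xs = [float(x) for x in range(11)]
    xbar = sum(xs) / len(xs)
    sxx = sum((x - xbar) ** 2 for x in xs)
    endpoint, ols = [], []
    for _ in range(n_trials):
        ys = [1.0 + random.gauss(0, 1) for _ in xs]  # true slope is 0
        # Connect-the-end-points slope: uses only the first and last points
        endpoint.append((ys[-1] - ys[0]) / (xs[-1] - xs[0]))
        # Least squares slope: uses all the points
        ybar = sum(ys) / len(ys)
        ols.append(sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx)
    def var(v):
        m = sum(v) / len(v)
        return sum((s - m) ** 2 for s in v) / len(v)
    return var(endpoint), var(ols)

v_endpoint, v_ols = slope_variances()
# The least squares slope varies less around the true slope: it is more efficient
print(v_ols < v_endpoint)  # True
```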
Students' eyeball lines and least squares lines compared. As a group, which are bunched more closely around the true line: the eyeball lines or the least squares lines?
Scatter of points from log(Y)=log(A)+b log(X)+normal error. This is for the week on non-linear regression.
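Such a scatter can be generated and fitted by taking logs, which turns the power-law relationship into a straight line that ordinary least squares can handle. A sketch under assumed parameters (A = 3, b = 0.7, and the noise level are made up for illustration):

```python
import math
import random

random.seed(2)

# Hypothetical power law: Y = A * X**b with multiplicative lognormal error
A, b = 3.0, 0.7
xs = [random.uniform(1, 100) for _ in range(50)]
ys = [A * x ** b * math.exp(random.gauss(0, 0.2)) for x in xs]

# Taking logs linearizes the model: log(Y) = log(A) + b*log(X) + normal error,
# so ordinary least squares applies to the (log X, log Y) pairs
lx = [math.log(x) for x in xs]
ly = [math.log(y) for y in ys]
lxbar, lybar = sum(lx) / len(lx), sum(ly) / len(ly)
b_hat = sum((u - lxbar) * (v - lybar) for u, v in zip(lx, ly)) / sum((u - lxbar) ** 2 for u in lx)
print(b_hat)  # close to the true exponent 0.7
```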