The Least Squares Line

The page below is a sample from the LabCE course Linear Regression Analysis. Access the complete course and earn ASCLS P.A.C.E.-approved continuing education credits by subscribing online.


According to the method of least squares, the line of best fit is the one that minimizes the sum of the squared differences between the data points' observed (experimental) y-values and their expected (theoretical) y-values. This line is known as the least squares regression line.
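In symbols, this criterion can be written as follows, where y_i is the observed y-value of point i and ŷ_i is the expected y-value predicted by the line:

```latex
\mathrm{SSE} = \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2
```

The least squares regression line is the line whose slope and intercept make this sum as small as possible.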

To calculate a line's sum of squares, first find the expected y-value of each point by substituting the corresponding x-value into the line's equation. Then, find the difference between the observed y-value and the expected y-value for each point. Finally, square each of those differences and sum the results. The lines that were shown on the previous page are calculated below:
    Point   x    Observed y   Expected ŷ (A / B / C)   y - ŷ (A / B / C)      (y - ŷ)² (A / B / C)
    1       10   5.0          8.0 / 10.0 / 12.0        -3.0 / -5.0 / -7.0     9.00 / 25.00 / 49.00
    2       18   24.0         14.4 / 18.0 / 22.0       9.6 / 6.0 / 2.0        92.16 / 36.00 / 4.00
    3       38   27.5         30.4 / 38.0 / 45.0       -2.9 / -10.5 / -17.5   8.41 / 110.25 / 306.25
    4       50   60.0         40.0 / 50.0 / 60.0       20.0 / 10.0 / 0.0      400.00 / 100.00 / 0.00
    5       63   50.0         48.0 / 63.0 / 74.0       2.0 / -13.0 / -24.0    4.00 / 169.00 / 576.00
The total sum of squares for line A = 513.57; for line B = 440.25; for line C = 935.25. As stated above, the line that minimizes this value is the line of best fit according to the least squares method. Therefore, line B is the best fit of these three lines.
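The calculation above can be sketched in a few lines of Python. The observed and expected y-values are taken directly from the table; the function name `sum_of_squares` is illustrative, not part of any standard library:

```python
# Observed y-values for the five data points in the table above.
observed = [5.0, 24.0, 27.5, 60.0, 50.0]

# Expected y-values predicted by candidate lines A, B, and C.
expected = {
    "A": [8.0, 14.4, 30.4, 40.0, 48.0],
    "B": [10.0, 18.0, 38.0, 50.0, 63.0],
    "C": [12.0, 22.0, 45.0, 60.0, 74.0],
}

def sum_of_squares(obs, exp):
    """Sum the squared differences between observed and expected y-values."""
    return sum((o - e) ** 2 for o, e in zip(obs, exp))

for line, exp in expected.items():
    print(f"Line {line}: {sum_of_squares(observed, exp):.2f}")
```

Running this reproduces the totals from the table (513.57, 440.25, and 935.25), and the smallest value identifies line B as the best fit.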