Machine Learning - Linear Regression Algorithm

Linear regression is a pivotal machine learning algorithm used for predicting outcomes by analyzing relationships between variables, essential in fields like finance, health, and business planning.

Posted by Judith Winkler MBA on February 14th, 2024

Linear regression is a supervised machine learning algorithm that learns from labeled data and fits the linear function that best maps the input data points to the output, which can then be used to make predictions on new data. It predicts a continuous output variable based on one or more independent input variables: it models the relationship between the variables so that the dependent (response) variable can be predicted from the independent (predictor) variables. This relationship can be shown by drawing a straight line on a graph, with the independent variable on the x-axis and the dependent variable on the y-axis.

For example, it can predict the price of houses depending on the house's age, location, area, rooms, etc.
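As a rough illustration, here is a minimal sketch of fitting such a model with scikit-learn; the house areas and prices below are made-up numbers for demonstration, not real market data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: house area in square feet (input) and sale price (output)
area = np.array([[800], [1200], [1500], [1800], [2400]])          # independent variable
price = np.array([150_000, 210_000, 260_000, 300_000, 390_000])   # dependent variable

# Fit a straight line: price ≈ intercept + slope * area
model = LinearRegression().fit(area, price)

# Predict the price of a new 2,000 sq ft house
predicted = model.predict(np.array([[2000]]))
print(f"Slope: {model.coef_[0]:.2f}, Intercept: {model.intercept_:.2f}")
print(f"Predicted price for 2,000 sq ft: {predicted[0]:,.0f}")
```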

Businesses use it to develop forecasts and make informed decisions. Linear regression models can be used in different areas such as finance, business planning, marketing, health, medicine, etc.

The two main types of linear regression are simple and multiple.

Simple linear regression estimates the relationship between two quantitative variables, that is, the value of a dependent variable at a particular value of a single independent variable, such as estimating the number of sales when a certain percentage of customers engage with the company on social media.
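In formula terms, simple linear regression fits the line y = b0 + b1*x + e, where y is the dependent variable, x is the independent variable, b0 is the intercept (the value of y when x is zero), b1 is the slope of the line, and e is the error term the model tries to keep as small as possible.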

Multiple linear regression estimates the relationship between two or more independent variables and one dependent variable, such as how a product's price, interest rates, and competitors' prices affect a company's sales.
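The multiple case extends the same idea to several inputs: y = b0 + b1*x1 + b2*x2 + ... + bn*xn + e. Below is a minimal sketch with scikit-learn using three made-up predictors (product price, interest rate, competitor price) for a company's sales; all numbers are illustrative assumptions, not real data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical rows: [product price, interest rate (%), competitor price]
X = np.array([
    [19.99, 3.5, 21.50],
    [18.49, 3.7, 20.00],
    [21.00, 4.0, 22.75],
    [17.99, 4.2, 19.25],
    [20.50, 3.9, 21.00],
])
sales = np.array([1200, 1350, 980, 1500, 1100])  # units sold per month (made up)

model = LinearRegression().fit(X, sales)

# One coefficient per independent variable shows its estimated effect on sales
for name, coef in zip(["product price", "interest rate", "competitor price"], model.coef_):
    print(f"{name}: {coef:.1f}")
```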

There are certain conditions that the variables of a linear regression model need to meet in order to produce accurate and dependable predictions.

Simple Linear Regression Assumptions

  1. Linearity- The independent and dependent variables have a linear relationship with each other: changes in the dependent variable follow changes in the independent variable(s), and a straight line can be drawn through the data points.
  2. Independence- The observations in the datasets are independent of each other.
  3. Homoscedasticity- The variance of the errors is constant across all levels of the independent variable.
  4. Normality- The residuals should be normally distributed and follow a bell-shaped curve (a quick check of homoscedasticity and normality is sketched after this list).
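In practice, the homoscedasticity and normality assumptions are usually checked by inspecting the residuals after fitting. A minimal sketch, reusing the hypothetical house data from the earlier example:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats
from sklearn.linear_model import LinearRegression

# Same hypothetical house data as in the earlier sketch
area = np.array([[800], [1200], [1500], [1800], [2400]])
price = np.array([150_000, 210_000, 260_000, 300_000, 390_000])
model = LinearRegression().fit(area, price)

# Residuals = observed values minus the model's fitted values
fitted = model.predict(area)
residuals = price - fitted

# Homoscedasticity: the spread of residuals should look roughly constant across fitted values
plt.scatter(fitted, residuals)
plt.axhline(0, color="red")
plt.xlabel("Fitted price")
plt.ylabel("Residual")
plt.show()

# Normality: a Shapiro-Wilk p-value well above 0.05 is consistent with normally distributed residuals
print(stats.shapiro(residuals))
```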

Assumptions for Multiple Linear Regression

  1. No Multicollinearity- The independent variables should not be highly correlated with one another. Multicollinearity occurs when two or more independent variables are highly correlated with each other, making it difficult to determine the effect of each variable on the dependent variable (a multicollinearity check is sketched after this list).
  2. Additivity- The model assumes that the effect of changes in a predictor variable on the response variable is consistent regardless of the values of the other variables.
  3. Feature Selection- In multiple linear regression, it is important to carefully select the independent variables included in the model, since adding irrelevant or redundant variables can lead to overfitting and make the model harder to interpret.
  4. Overfitting- Overfitting happens when the model fits the training data too closely, capturing noise or random fluctuations that do not represent the true underlying relationship between variables.
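One common way to screen for multicollinearity is the variance inflation factor (VIF); values much larger than about 5 to 10 usually signal a problem. A minimal sketch, assuming statsmodels is installed and reusing the hypothetical predictor matrix from the multiple regression example:

```python
import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Same hypothetical predictors as in the multiple regression sketch
X = np.array([
    [19.99, 3.5, 21.50],
    [18.49, 3.7, 20.00],
    [21.00, 4.0, 22.75],
    [17.99, 4.2, 19.25],
    [20.50, 3.9, 21.00],
])

# Add a constant column so each VIF is computed against a model with an intercept
X_const = np.column_stack([np.ones(len(X)), X])

# VIF for each independent variable (column 0 is the constant, so start at 1);
# with such a tiny toy dataset the exact numbers are only illustrative
for i, name in enumerate(["product price", "interest rate", "competitor price"], start=1):
    print(f"{name}: VIF = {variance_inflation_factor(X_const, i):.2f}")
```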

How to prepare data for linear regression?

When preparing data for linear regression, we should look for:

  1. Outliers - The linear regression model assumes a linear relationship between variables, so it is important to identify and remove outliers, because a single extreme point can pull the fitted line toward itself and distort the results.
  2. Collinearity - It is valuable to remove collinearity to avoid overfitting, which can lead to inconsistent results from the model.
  3. Normalize the data - The data should follow a normal distribution for a more accurate prediction outcome.
  4. Standardize the data - This can be done by subtracting a measure of location, such as the mean, and dividing by a measure of scale, such as the standard deviation; it is especially important when the variables have very different ranges, such as zero to one versus zero to 1,000.
  5. Impute missing data - Some datasets are not large enough and contain missing values. Rather than discarding those rows, the missing values can be filled in (imputed) with reasonable estimates such as the column mean or median (see the preparation sketch after this list).
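A minimal preparation sketch covering these steps with pandas and scikit-learn; the column names, toy values, and outlier rule below are assumptions for illustration, not fixed requirements:

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

# Hypothetical dataset with an obvious outlier and a missing value
df = pd.DataFrame({
    "area":  [800, 1200, 1500, 1800, 2400, 30000],               # 30000 is an outlier
    "rooms": [2, 3, np.nan, 4, 5, 4],                            # missing value to impute
    "price": [150_000, 210_000, 260_000, 300_000, 390_000, 410_000],
})

# 1. Outliers: drop rows outside 1.5 * IQR of the area column
q1, q3 = df["area"].quantile([0.25, 0.75])
iqr = q3 - q1
df = df[df["area"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)].copy()

# 2. Missing values: impute with the column mean instead of discarding the row
df[["rooms"]] = SimpleImputer(strategy="mean").fit_transform(df[["rooms"]])

# 3. Standardize: subtract the mean and divide by the standard deviation
features = StandardScaler().fit_transform(df[["area", "rooms"]])
print(features)
```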

Benefits of linear regression

  1. Predicting outcomes: Linear regression models can predict outcomes, which helps organizations decide on risks or investments and make long-term business plans. For example, a company could estimate how many people pass in front of a billboard and use that prediction to choose billboard locations strategically to maximize views and sales.
  2. Preventing mistakes: Regression analysis can determine whether a decision is likely to lead to unfavorable outcomes and help prevent them, which saves companies costs and increases revenue. For example, if a manager wants to know whether keeping a retail store open for an extra two hours daily will increase revenue, a regression analysis can be run to predict the outcome. If the results suggest the extra hours would mostly add costs, the company may decide against them and save money.
  3. Increasing efficiency: Linear regression models can be used to optimize business processes by determining how a change in a process affects an outcome, and the results can be used to implement new policies and protocols that increase efficiency. For example, to reduce customer complaints, a company can assess the relationship between customers' wait time when calling customer service and the number of negative reviews or complaints the company receives.

I have a small favor to ask: if you find this information useful, please share this blog with other business owners who might find this content useful as well. I will be putting a lot of effort into posting regular content to help share knowledge about all things related to business and how data analytics can be used to improve companies. Thank you!