The Taxonomy Of Regression Algorithms That Many Don't Bother To Remember

8 standard regression algorithms summarised in a single frame.

Regression algorithms allow us to model the relationship between a dependent variable and one or more independent variables.

After estimating the parameters of a regression model, we can gain insight into how changes in one variable affect another.

Because these algorithms are so widely used in data science, an awareness of their various forms is crucial to convey precisely which one you are using.

Here are eight of the most standard regression algorithms described in a single line:

Linear Regression

  • Simple Linear Regression: One independent (x) and one dependent (y) variable.

  • Polynomial Linear Regression: Polynomial features and one dependent (y) variable.

  • Multiple Linear Regression: Two or more independent variables (features) and one dependent (y) variable.
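
For concreteness, here is a minimal sketch of the three variants. It uses scikit-learn (one common choice, not prescribed by this post) on synthetic data, so treat it as illustrative only:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Synthetic data: two features, one target (purely illustrative).
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(100, 2))
y = 3 * X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(0, 1, 100)

# Simple linear regression: one feature (x), one target (y).
simple = LinearRegression().fit(X[:, [0]], y)

# Polynomial linear regression: polynomial features of x,
# but still linear in the coefficients.
X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X[:, [0]])
poly = LinearRegression().fit(X_poly, y)

# Multiple linear regression: several features at once.
multiple = LinearRegression().fit(X, y)
print(multiple.coef_, multiple.intercept_)
```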

Regularized Regression

  • Lasso Regression: Linear Regression with L1 Regularization.

  • Ridge Regression: Linear Regression with L2 Regularization.

  • Elastic Net: Linear Regression with BOTH L1 and L2 Regularization.
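
As a rough illustration, here is how the three regularized variants look in scikit-learn (again a sketch on synthetic data): `alpha` sets the penalty strength, and `l1_ratio` mixes the L1 and L2 penalties for Elastic Net.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge, ElasticNet

# Synthetic data where some true coefficients are exactly zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, -1.0, 0.0, 0.5]) + rng.normal(0, 0.1, 100)

lasso = Lasso(alpha=0.1).fit(X, y)            # L1: can drive coefficients to exactly zero
ridge = Ridge(alpha=0.1).fit(X, y)            # L2: shrinks all coefficients toward zero
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)  # both penalties, mixed by l1_ratio

print(lasso.coef_)  # note the exact zeros produced by the L1 penalty
```

A handy rule of thumb: the L1 penalty tends to zero out uninformative coefficients (implicit feature selection), while the L2 penalty only shrinks them.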

Categorical Probability Prediction

  • Logistic Regression: Predict binary outcome probability.

  • Multinomial Logistic Regression (or Softmax Regression): Predict the probabilities of multiple categories.
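
To make the binary vs. multinomial distinction concrete, here is a minimal scikit-learn sketch on synthetic classification data (illustrative only):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Binary logistic regression: probability of one of two outcomes.
Xb, yb = make_classification(n_samples=200, n_classes=2, random_state=0)
binary = LogisticRegression().fit(Xb, yb)
print(binary.predict_proba(Xb[:1]))   # [[P(class 0), P(class 1)]]

# Multinomial (softmax) regression: with 3+ classes, LogisticRegression
# fits a softmax model by default with the lbfgs solver.
Xm, ym = make_classification(n_samples=200, n_classes=3, n_informative=4,
                             random_state=0)
softmax = LogisticRegression().fit(Xm, ym)
print(softmax.predict_proba(Xm[:1]))  # one probability per class, summing to 1
```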

Over to you: What other regression algorithms would you include here?


Find the code for my tips here: GitHub.

I like to explore, experiment and write about data science concepts and tools. You can read my articles on Medium. Also, you can connect with me on LinkedIn and Twitter.
