Why are regression problems called regression problems? I was just wondering why regression problems are called "regression" problems. What is the story behind the name? One definition for regression: "Relapse to a less perfect or developed state."
regression - When should I use lasso vs ridge? - Cross Validated Ridge regression is useful as a general shrinking of all coefficients together. It shrinks them to reduce variance and overfitting. It relates to the prior belief that coefficient values shouldn't be too large (and these can become large in fitting when there is collinearity). Lasso is useful as a shrinking of a selection of the coefficients, as sketched below.
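A minimal sketch of that contrast, assuming scikit-learn and synthetic data (the penalty strengths and the data itself are illustrative assumptions, not from the original question): ridge shrinks every coefficient toward zero, while lasso drives the irrelevant ones exactly to zero.

```python
# Ridge shrinks all coefficients; lasso zeroes out a selection of them.
# Synthetic data and alpha values are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
# Only the first two features actually matter; the rest are noise.
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=0.5, size=100)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.5).fit(X, y)

print("ridge coefficients:", np.round(ridge.coef_, 2))  # all shrunk, none exactly zero
print("lasso coefficients:", np.round(lasso.coef_, 2))  # noise features driven to zero
```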
regression - How to calculate the slope of a line of best fit that . . . This kind of regression seems to be much more difficult. I've read several sources, but the calculus for general quantile regression is going over my head. My question is this: How can I calculate the slope of the line of best fit that minimizes L1 error? Some constraints on the answer I am looking for:
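One practical route, sketched here under assumed data, is to minimize the sum of absolute residuals numerically rather than by calculus; fitting the median (the 0.5 quantile) in this way is exactly least-absolute-deviations regression.

```python
# Minimal sketch: fit intercept and slope by minimizing the sum of absolute
# residuals (L1 / least-absolute-deviations, i.e. median regression).
# The data below are an illustrative assumption.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.standard_t(df=2, size=50)  # heavy-tailed noise

def l1_loss(params):
    intercept, slope = params
    return np.sum(np.abs(y - (intercept + slope * x)))

# Start from the least-squares fit and refine with a derivative-free method,
# since the L1 objective is not differentiable everywhere.
start = np.polyfit(x, y, 1)[::-1]  # polyfit returns [slope, intercept]
result = minimize(l1_loss, start, method="Nelder-Mead")
print("intercept, slope:", result.x)
```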
regression - What does it mean to regress a variable against another . . . Those words connote causality, but regression can work the other way round too (use Y to predict X). The independent/dependent variable language merely specifies how one thing depends on the other. Generally speaking, it makes more sense to use correlation rather than regression if there is no causal relationship.
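A quick illustration of that asymmetry, on assumed synthetic data: the correlation coefficient is the same whichever way round you take X and Y, but the slope from regressing Y on X is not the reciprocal of the slope from regressing X on Y.

```python
# Correlation is symmetric in X and Y; regression slopes are not.
# Data are an illustrative assumption.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=200)
y = 0.6 * x + rng.normal(scale=0.8, size=200)

r = np.corrcoef(x, y)[0, 1]
slope_y_on_x = np.polyfit(x, y, 1)[0]
slope_x_on_y = np.polyfit(y, x, 1)[0]

print(f"correlation (symmetric):  {r:.3f}")
print(f"slope of Y on X:          {slope_y_on_x:.3f}")
print(f"slope of X on Y:          {slope_x_on_y:.3f}")  # not 1 / slope_y_on_x
```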
Interpretation of R's output for binomial regression For a simple logistic regression model like this one, there is only one covariate (Area here) and the intercept (also sometimes called the 'constant'). If you had a multiple logistic regression, there would be additional covariates listed below these, but the interpretation of the output would be the same.
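For readers without R to hand, here is an analogous sketch in Python with statsmodels; the "Area" values and the binary outcome are simulated assumptions, not the original data, but the summary table lists the same pieces (intercept, covariate coefficient, standard errors, z and p values) as R's glm() output.

```python
# Minimal sketch of a simple binomial (logistic) regression with one covariate.
# The "Area" data and outcome are hypothetical, simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
df = pd.DataFrame({"Area": rng.uniform(0, 10, size=200)})
# Simulate a binary outcome whose log-odds increase with Area.
log_odds = -2.0 + 0.5 * df["Area"]
df["present"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

model = smf.logit("present ~ Area", data=df).fit()
# The summary lists the intercept (constant) and the Area coefficient,
# each with a standard error, z value, and p value.
print(model.summary())
```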