Why assume normal errors in regression? - Cross Validated
The errors in linear regression are often assumed to be normal. My understanding is that this is because of the following reasons; if there are more, please feel free to let me know: Mathematical...
regression - Why do we say the outcome variable "is regressed on" the ...
The word "regressed" is used instead of "dependent" because we want to emphasise that we are using a regression technique to represent this dependency between x and y. So the sentence "y is regressed on x" is shorthand for: every predicted y "depends on" a value of x through a regression technique.
Does every variable need to be statistically significant in a ...
I recently fit a regression model (ARIMAX) in which some variables (3) were statistically significant and some were not (1). I removed the statistically insignificant variable and refit the model; now all variables are statistically significant (i.e., the new model has 3 variables, the old model has 4).
Support Vector Regression vs. Linear Regression - Cross Validated
Linear regression can use the same kernels used in SVR, and SVR can also use the linear kernel. Given only the coefficients from such models, it would be impossible to distinguish between them in the general case (with SVR, you might get sparse coefficients depending on the penalization, due to $\epsilon$-insensitive loss).
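The $\epsilon$-insensitive loss mentioned above can be contrasted with the squared loss of ordinary linear regression in a few lines. A minimal pure-Python sketch (the residual values and the tube width eps = 0.1 are made up for illustration):

```python
def squared_loss(residual):
    """Loss minimized by ordinary least-squares regression."""
    return residual ** 2

def eps_insensitive_loss(residual, eps=0.1):
    """Loss used by SVR: zero inside the epsilon tube,
    linear outside it. Residuals inside the tube contribute
    nothing, which is what can make SVR solutions sparse."""
    return max(0.0, abs(residual) - eps)

for r in (0.05, 0.1, 0.5):
    print(r, squared_loss(r), eps_insensitive_loss(r))
```

Note that a residual of 0.05 is penalized by squared loss but not at all by the ε-insensitive loss, which is the source of the sparsity the answer refers to.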
What's the difference between correlation and simple linear regression ...
Note that one perspective on the relationship between regression & correlation can be discerned from my answer here: What is the difference between doing linear regression on y with x versus x with y?.
regression - When is R squared negative? - Cross Validated
Also, for OLS regression, $R^2$ is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is equivalent to the squared correlation between the predictor and the dependent variable -- again, this must be non-negative.
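The claim for the one-predictor case can be checked directly: fit simple OLS, compute $R^2$ from the residual sum of squares, and compare it with the squared correlation between x and y. A minimal pure-Python sketch with made-up data:

```python
# Made-up data for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
syy = sum((yi - my) ** 2 for yi in y)

# Simple OLS fit: y_hat = a + b * x
b = sxy / sxx
a = my - b * mx
y_hat = [a + b * xi for xi in x]

# R^2 computed from the residual sum of squares.
ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
r_squared = 1 - ss_res / syy

# Squared correlation between predictor and response.
r2_corr = sxy ** 2 / (sxx * syy)

print(r_squared, r2_corr)  # the two values agree
```

Since a squared correlation cannot be negative, $R^2$ is non-negative here; negative $R^2$ only arises in settings outside plain OLS with an intercept (e.g., when evaluating a model on new data or fitting without an intercept).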
What is the relationship between R-squared and p-value in a regression?
Context - I'm performing OLS regression on a range of variables and am trying to develop the best explanatory functional form by producing a table containing the R-squared values between the linear, logarithmic, etc., transformations of each explanatory (independent) variable and the response (dependent) variable.
What is the lasso in regression analysis? - Cross Validated
LASSO regression is a type of regression analysis in which variable selection and regularization occur simultaneously. This method uses a penalty that shrinks the values of the regression coefficients, setting some of them exactly to zero.
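In the special case of an orthonormal design, the lasso coefficient is the soft-thresholded OLS coefficient, which makes the shrinkage-plus-selection behavior explicit. A minimal sketch (the OLS coefficients and the penalty lam = 0.5 are made up for illustration):

```python
def soft_threshold(beta_ols, lam):
    """Lasso coefficient under an orthonormal design:
    shrink the OLS estimate toward zero by lam, and set it
    exactly to zero when |beta_ols| <= lam (selection)."""
    if beta_ols > lam:
        return beta_ols - lam
    if beta_ols < -lam:
        return beta_ols + lam
    return 0.0

# Made-up OLS coefficients; lam = 0.5 is a hypothetical penalty.
ols = [2.0, -0.3, 0.8, 0.1]
lasso = [round(soft_threshold(b, 0.5), 10) for b in ols]
print(lasso)  # [1.5, 0.0, 0.3, 0.0]
```

The small coefficients are zeroed out (variable selection) while the large ones are shrunk toward zero (regularization), both from the same penalty.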
Why Isotonic Regression for Model Calibration?
I think an additional reason why it is so common is the simplicity (and thus reproducibility) of isotonic regression. If we give the same classification model and data to two different analysts, each of them might get a different recalibration depending on the regression function they choose and its parameters.
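The simplicity argument is concrete: isotonic regression has a short, deterministic fitting algorithm with no tuning parameters, the pool-adjacent-violators algorithm (PAVA). A minimal unweighted sketch (the input values are made up):

```python
def pava(y):
    """Pool Adjacent Violators: the non-decreasing least-squares
    fit to y. Deterministic and parameter-free, which is the
    reproducibility point made above."""
    # Each block holds (mean, weight, count of pooled points).
    blocks = []
    for yi in y:
        blocks.append([yi, 1.0, 1])
        # Merge blocks while monotonicity is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    fit = []
    for m, _, n in blocks:
        fit.extend([m] * n)
    return fit

print(pava([1.0, 3.0, 2.0, 4.0]))  # [1.0, 2.5, 2.5, 4.0]
```

Two analysts running PAVA on the same scores and labels get the same calibration map, whereas, say, a parametric recalibration depends on the chosen functional form.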
Discrete variables in regression model? - Cross Validated
I know that in theory, for regression, both Y and the factors should be continuous variables. However, I have some factors that are discrete but both correlate with Y and would fit a regression model...