  1. regression - Converting standardized betas back to original …

    Where β∗ are the estimators from the regression run on the standardized variables and β̂ is the same estimator converted back to the original scale, S_y is the sample standard …
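
    The usual back-conversion, written out as a sketch (assuming each predictor x_j and the response y were z-scored before the standardized fit):

        \hat{\beta}_j = \beta^{*}_j \cdot \frac{S_y}{S_{x_j}}, \qquad
        \hat{\beta}_0 = \bar{y} - \sum_j \hat{\beta}_j \bar{x}_j

    Here S_y and S_{x_j} are the sample standard deviations and \bar{y}, \bar{x}_j the sample means used for the standardization.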

  2. regression - What's the difference between multiple R and R …

    Mar 21, 2014 · In linear regression, we often get multiple R and R squared. What are the differences between them?
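
    For the common case of OLS with an intercept (an assumption about the setting, not a quote from the answer), the relationship is:

        R = \operatorname{cor}(y, \hat{y}), \qquad R^2 = \big(\operatorname{cor}(y, \hat{y})\big)^2

    Multiple R is the correlation between observed and fitted values; R squared is its square, interpretable as the proportion of variance explained.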

  3. regression - Trying to understand the fitted vs residual plot?

    Dec 23, 2016 · A good residual vs fitted plot has three characteristics: The residuals "bounce randomly" around the 0 line. This suggests that the assumption that the relationship is linear is …
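
    A minimal, self-contained Python sketch of the plot being discussed (synthetic data; names are illustrative):

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 10, 200)
        y = 2.0 + 1.5 * x + rng.normal(0, 1, 200)       # linear truth, constant variance

        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # ordinary least squares fit
        fitted = X @ beta
        resid = y - fitted

        plt.scatter(fitted, resid, s=10)
        plt.axhline(0, color="red")                     # residuals should bounce randomly around this line
        plt.xlabel("Fitted values")
        plt.ylabel("Residuals")
        plt.show()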

  4. regression - Difference between forecast and prediction ... - Cross ...

    I was wondering what the difference and the relation are between a forecast and a prediction, especially in time series and regression. For example, am I correct that: In time series, forecasting seems …

  5. regression - What is residual standard error? - Cross Validated

    A quick question: Is "residual standard error" the same as "residual standard deviation"? Gelman and Hill (p.41, 2007) seem to use them interchangeably.
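
    For a linear model with n observations and p estimated coefficients (intercept included), the quantity both names refer to is, under the usual convention:

        \widehat{\sigma} = \sqrt{\frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{n - p}}

    i.e. the square root of the residual sum of squares divided by the residual degrees of freedom.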

  6. regression - Linear vs Nonlinear Machine Learning Algorithms

    Jan 6, 2021 · Three linear machine learning algorithms: Linear Regression, Logistic Regression and Linear Discriminant Analysis. Five nonlinear algorithms: Classification and Regression …

  7. regression - When should I use lasso vs ridge? - Cross Validated

    Ridge regression is useful as a general shrinking of all coefficients together. It shrinks them to reduce variance and overfitting. It relates to the prior belief that coefficient values …
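
    The contrast can be made concrete with the two penalized least-squares objectives (a standard formulation, not taken from the answer):

        \text{ridge: } \min_{\beta} \sum_i (y_i - x_i^{\top}\beta)^2 + \lambda \sum_j \beta_j^2, \qquad
        \text{lasso: } \min_{\beta} \sum_i (y_i - x_i^{\top}\beta)^2 + \lambda \sum_j |\beta_j|

    The squared (L2) penalty shrinks all coefficients smoothly toward zero, while the absolute-value (L1) penalty can set some coefficients exactly to zero, which is why the lasso also performs variable selection.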

  8. How should outliers be dealt with in linear regression analysis ...

    What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multilinear regression?
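
    One common diagnostic for flagging influential points, sketched in Python with statsmodels (the 4/n cutoff is a rule of thumb, not a rule from the thread):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        x = rng.normal(size=100)
        y = 1.0 + 2.0 * x + rng.normal(size=100)
        y[0] += 10                                           # inject one gross outlier

        model = sm.OLS(y, sm.add_constant(x)).fit()
        cooks_d, _ = model.get_influence().cooks_distance    # Cook's distance per observation
        flagged = np.flatnonzero(cooks_d > 4 / len(y))       # common 4/n rule of thumb
        print(flagged)

    Flagged points are candidates for closer inspection, not automatic exclusion.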

  9. Linear model with both additive and multiplicative effects

    Sep 23, 2020 · In a log-level regression, the independent variables have an additive effect on the log-transformed response and a multiplicative effect on the original untransformed response:
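
    Spelled out for a single predictor (a standard identity, with the symbols as usually defined):

        \log y = \beta_0 + \beta_1 x + \varepsilon
        \;\;\Longrightarrow\;\;
        y = e^{\beta_0} \, e^{\beta_1 x} \, e^{\varepsilon}

    so a one-unit increase in x adds \beta_1 on the log scale but multiplies the untransformed response by e^{\beta_1} (roughly a 100·\beta_1 percent change when \beta_1 is small).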

  10. regression - When is R squared negative? - Cross Validated

    For linear regression with no constraints, R² must be positive (or zero) and equals the square of the correlation coefficient, r. A negative R² is only possible with linear …
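
    Using the usual definition (an assumption about which definition the answer has in mind):

        R^2 = 1 - \frac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y})^2}

    This is negative exactly when the model's residual sum of squares exceeds that of the mean-only model, which can happen, for example, when the intercept is dropped or when the fit is evaluated on data it was not trained on.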