#1
What does the correlation coefficient measure in linear regression?
The strength and direction of the linear relationship between two variables
The slope of the regression line
The intercept of the regression line
The variance of the dependent variable
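The quantity behind #1 can be checked numerically. A minimal Python sketch of the Pearson correlation coefficient (function and variable names are illustrative, not part of the quiz):

```python
import math

def pearson_r(x, y):
    # Strength and direction of the linear relationship: r lies in [-1, 1]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / math.sqrt(sxx * syy)

x = [1, 2, 3, 4, 5]
r_up = pearson_r(x, [2, 4, 6, 8, 10])    # perfectly increasing line: r = +1
r_down = pearson_r(x, [10, 8, 6, 4, 2])  # same strength, opposite direction: r = -1
```

Note that r reports strength and direction only; it says nothing about the slope's magnitude.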
#2
Which of the following is the most common method for estimating the parameters of a linear regression model?
Least squares estimation
Maximum likelihood estimation
Bayesian estimation
Gradient descent optimization
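For #2: least squares picks the line that minimizes the sum of squared residuals, and in simple linear regression this has a closed form. A sketch under made-up data (helper names are illustrative):

```python
def ols_fit(x, y):
    # Closed-form ordinary least squares for y = b0 + b1*x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
         sum((xi - mx) ** 2 for xi in x)
    b0 = my - b1 * mx
    return b0, b1

def ssr(x, y, b0, b1):
    # Sum of squared residuals for a candidate line
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
b0, b1 = ols_fit(x, y)
# Perturbing either coefficient can only increase the sum of squared residuals.
```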
#3
What is the purpose of the coefficient of correlation in linear regression?
To measure the strength and direction of the linear relationship between two variables
To estimate the intercept of the regression line
To determine the significance level of the independent variable
To compute the sum of squared residuals
#4
Which statement about the residuals in linear regression is true?
They represent the difference between the observed and predicted values of the dependent variable
They are equal to the independent variable
They are used to estimate the coefficient of determination
They determine the significance of the independent variable
#5
What is the formula for the slope (β₁) in simple linear regression?
β₁ = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)^2
β₁ = Σ(x − x̄)^2 / Σ(x − x̄)(y − ȳ)
β₁ = Σ(x − x̄) / Σ(y − ȳ)
β₁ = Σ(y − ȳ)^2 / Σ(x − x̄)(y − ȳ)
#6
Which of the following best describes the purpose of the intercept (β₀) in linear regression?
It represents the predicted value of the dependent variable when the independent variable is zero
It represents the predicted value of the independent variable when the dependent variable is zero
It represents the average of the independent variable
It represents the average of the dependent variable
#7
What is the primary difference between correlation and regression analysis?
Correlation measures the strength and direction of the relationship between variables, while regression predicts the value of one variable based on the value of another.
Correlation predicts the value of one variable based on the value of another, while regression measures the strength and direction of the relationship between variables.
Correlation is used for categorical data, while regression is used for continuous data.
Regression is used for categorical data, while correlation is used for continuous data.
#8
What is the main drawback of using R-squared as a measure of model fit in regression analysis?
It does not indicate the direction of the relationship between variables.
It is sensitive to outliers.
It cannot be calculated for nonlinear relationships.
It does not account for the number of independent variables in the model.
#9
What is the purpose of the residual standard error in linear regression?
To measure the average distance between the observed and predicted values of the dependent variable.
To determine the significance level of the independent variable.
To assess the normality assumption of the residuals.
To compute the coefficient of determination.
#10
Which of the following statements best describes a type I error in the context of regression analysis?
Rejecting the alternative hypothesis when it is true.
Failing to reject the null hypothesis when it is false.
Incorrectly concluding a significant relationship between variables when there is none.
Incorrectly concluding no significant relationship between variables when there is one.
#11
In linear regression, what does the coefficient of determination (R-squared) indicate?
The proportion of the variance in the dependent variable that is predictable from the independent variable
The sum of squared residuals
The significance level of the independent variable
The slope of the regression line
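For #11, R² can be computed directly from residual and total sums of squares. A small sketch (the data and fitted values below are made up for illustration):

```python
def r_squared(y, y_hat):
    # Proportion of the variance in y explained by the fitted values:
    # R^2 = 1 - SS_res / SS_tot
    my = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

y     = [2.1, 3.9, 6.2, 8.0, 9.8]        # observed values
y_hat = [2.1, 4.05, 6.0, 7.95, 9.9]      # fitted values from a least squares line
r2 = r_squared(y, y_hat)                  # close to 1: nearly all variance explained
```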
#12
What does it mean if the p-value associated with a coefficient in linear regression is less than the significance level (e.g., 0.05)?
The coefficient is statistically significant at the given significance level
The coefficient is not statistically significant at the given significance level
The coefficient is equal to zero
The coefficient is negatively correlated
#13
What does multicollinearity refer to in the context of linear regression?
High correlation among independent variables
High correlation between the dependent and independent variables
Low correlation among independent variables
Low correlation between the dependent and independent variables
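For #13, multicollinearity is often quantified with the variance inflation factor; with exactly two predictors it reduces to 1 / (1 − r²), where r is the correlation between them. A sketch (data and helper names are made up for illustration):

```python
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def vif_two_predictors(x1, x2):
    # With exactly two predictors, both share VIF = 1 / (1 - r^2)
    r = pearson_r(x1, x2)
    return 1.0 / (1.0 - r ** 2)

x1 = [1, 2, 3, 4, 5]
x2 = [1.1, 2.0, 2.9, 4.1, 5.0]   # nearly a copy of x1: severe multicollinearity
vif = vif_two_predictors(x1, x2)  # far above the common rule-of-thumb cutoff of 10
```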
#14
Which of the following is NOT an assumption of linear regression?
Homoscedasticity
Independence of residuals
Normality of the dependent variable
Linearity
#15
In multiple linear regression, what does the adjusted R-squared measure?
The proportion of the variance in the dependent variable explained by the independent variables
The sum of squared residuals
The significance level of the independent variables
The slope of the regression line
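For #15, adjusted R² applies a penalty for the number of predictors, so it can only match or fall below plain R². A one-function sketch (the numbers are made up for illustration):

```python
def adjusted_r2(r2, n, p):
    # Penalizes R^2 for the number of predictors p given sample size n:
    # adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

adj = adjusted_r2(0.90, n=20, p=3)   # lower than the raw R^2 of 0.90
```

Unlike R², this value can decrease when an uninformative predictor is added, which is why it is preferred for comparing models with different numbers of predictors.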
#16
What is the purpose of residual plots in linear regression analysis?
To detect patterns or trends in the residuals
To estimate the coefficients of the regression model
To compute the correlation coefficient
To determine the significance level of the independent variables
#17
In linear regression, what does the residual represent?
The difference between the observed and predicted values of the dependent variable.
The difference between the dependent and independent variables.
The difference between the mean and median of the dependent variable.
The difference between the maximum and minimum values of the dependent variable.
#18
What does it mean if the p-value associated with a coefficient in linear regression is greater than the significance level (e.g., 0.05)?
The coefficient is statistically significant at the given significance level.
The coefficient is not statistically significant at the given significance level.
The coefficient is equal to zero.
The coefficient is negatively correlated.
#19
What is the primary advantage of using standardized coefficients in multiple regression analysis?
They are easier to interpret than unstandardized coefficients.
They are less affected by outliers and differences in scale between variables.
They allow for direct comparison of the strength of different predictors.
They provide information about the direction of the relationship between variables.
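For #19, standardizing both variables to z-scores makes the slope scale-free, so coefficients can be compared across predictors measured in different units. A sketch (data and helper names are made up for illustration):

```python
def zscores(v):
    # Standardize to mean 0, sample standard deviation 1
    n = len(v)
    m = sum(v) / n
    s = (sum((vi - m) ** 2 for vi in v) / (n - 1)) ** 0.5
    return [(vi - m) / s for vi in v]

def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

x_m = [1.2, 2.5, 3.1, 4.8, 5.0]     # predictor in metres
x_mm = [v * 1000 for v in x_m]      # same predictor in millimetres
y = [2.0, 4.1, 5.9, 8.2, 10.1]

# The unstandardized slopes differ by a factor of 1000, but the
# standardized slopes are identical regardless of the unit of measure.
beta_m = ols_slope(zscores(x_m), zscores(y))
beta_mm = ols_slope(zscores(x_mm), zscores(y))
```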
#20
In regression analysis, what does the Durbin-Watson statistic measure?
The presence of autocorrelation in the residuals.
The strength and direction of the relationship between variables.
The normality of the residuals.
The homoscedasticity of the residuals.
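For #20, the Durbin-Watson statistic is the ratio of squared successive residual differences to squared residuals: values near 2 suggest no autocorrelation, below 2 positive autocorrelation, above 2 negative. A sketch with made-up residual sequences:

```python
def durbin_watson(e):
    # DW = sum_t (e_t - e_{t-1})^2 / sum_t e_t^2, ranging from 0 to 4
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    return num / sum(r ** 2 for r in e)

# Slowly drifting residuals (positive autocorrelation): DW well below 2
dw_pos = durbin_watson([1.0, 0.8, 0.6, -0.6, -0.8, -1.0])

# Sign-alternating residuals (negative autocorrelation): DW well above 2
dw_neg = durbin_watson([1.0, -1.0, 1.0, -1.0, 1.0, -1.0])
```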
#21
Which assumption of linear regression states that the residuals should be normally distributed?
Normality of residuals
Homoscedasticity
Independence of residuals
Linearity
#22
What is the purpose of residual analysis in linear regression?
To assess the validity of the regression assumptions
To estimate the coefficient of determination
To compute the sum of squared residuals
To determine the significance level of the independent variable
#23
Which of the following regression techniques is suitable for modeling nonlinear relationships between variables?
Polynomial regression
Simple linear regression
Multiple linear regression
Logistic regression
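For #23: polynomial regression handles a curved relationship while remaining linear in the coefficients, so the ordinary normal equations (XᵀX)b = Xᵀy still apply once x² is added as a column. A pure-Python sketch (helper names are illustrative):

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for a small linear system
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    out = [0.0] * n
    for r in range(n - 1, -1, -1):
        out[r] = (M[r][n] - sum(M[r][c] * out[c]
                                for c in range(r + 1, n))) / M[r][r]
    return out

def polyfit2(xs, ys):
    # Least squares for y = b0 + b1*x + b2*x^2 via the normal equations
    X = [[1.0, x, x * x] for x in xs]
    XtX = [[sum(X[i][a] * X[i][b] for i in range(len(xs)))
            for b in range(3)] for a in range(3)]
    Xty = [sum(X[i][a] * ys[i] for i in range(len(xs))) for a in range(3)]
    return solve(XtX, Xty)

xs = [0, 1, 2, 3, 4]
ys = [x * x for x in xs]          # exactly quadratic data
b0, b1, b2 = polyfit2(xs, ys)     # recovers coefficients near (0, 0, 1)
```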
#24
What assumption of linear regression states that the variance of the residuals should be constant across all values of the independent variable?
Homoscedasticity
Normality of residuals
Independence of residuals
Linearity
#25
What is the primary goal of transforming variables in regression analysis?
To reduce the influence of outliers.
To make the relationship between variables more linear.
To decrease the variability of the residuals.
To increase the coefficient of determination.
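For #25, a log transformation linearizes exponential growth: y = a·e^(bx) becomes ln y = ln a + b·x. A sketch with made-up exponential data, showing the correlation improving after the transform:

```python
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

x = [1, 2, 3, 4, 5, 6, 7]
y = [math.exp(0.5 * xi) for xi in x]              # exponential: curved in y
r_raw = pearson_r(x, y)                           # linear fit is imperfect
r_log = pearson_r(x, [math.log(yi) for yi in y])  # ln y = 0.5*x: perfectly linear
```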