ltdbrown (New Member, 1 post)
    Dec 8, 2011, 06:07 AM
    When both variables are categorical, the t-test should be used for hypothesis testing
    1. When both variables are categorical, the t-test should be used for hypothesis testing.
    2. The term parametric statistics refers to tests that make assumptions about the distribution of data. True
    3. Recoding continuous variables as categorical variables is discouraged because it results in a loss of information.
    4. The t-test has four test assumptions.
    5. The critical values of the t-test are provided by Student's t-test distribution.
    6. One-tailed tests are used most often, unless compelling a priori knowledge exists or it is known that one group cannot have a larger mean than the other.
    7. The term robust is used, generally, to describe the extent to which test conclusions are unaffected by departures from test assumptions.
    8. A combination of visual inspection and statistical testing should always be used to determine normality.
    9. Nonnormality is sometimes overcome through variable transformation.
    10. When problems of nonnormality cannot be resolved adequately, analysts should consider nonparametric alternatives to the t-test.
    11. Analysts should always examine the robustness of their findings.
    12. All t-tests first test for equality of means and then test for equality of variances.
    13. The paired-samples t-test tests the null hypothesis that the mean difference between the before and after test scores is zero.
    14. Simple regression is appropriate for examining the bivariate relationships between two continuous variables.
    15. A scatterplot is a plot of the data points of two continuous variables.
    16. The null hypothesis in regression is that the intercept is zero.
    17. The slope indicates the steepness of the regression line.
    18. A negative slope indicates an upward sloping line.
    19. The statistical significance of regression slopes is indeterminable.
    20. R-square is the percent variation of the dependent variable explained by the independent variable(s).
    21. R-square varies from 1 to 2.
    22. A perfect fit is indicated when the coefficient of determination is zero.
    23. A regression line assumes a linear relationship that is constant over the range of observations.
    24. The dependent variable is also called the error term.
    25. Pearson's correlation coefficient, r, measures the association (significance, direction, and strength) between two continuous variables.
    26. The Pearson's correlation coefficient, r, always has the same sign as b.
    27. Multiple regression is one of the most widely used multivariate statistical techniques for analyzing three or more variables.
    28. Full model specification means that all variables are measured that affect the dependent variable.
    29. A nomothetic mode of explanation isolates the most important factors.
    30. The search for parsimonious explanations often leads analysts to first identify different categories of factors that most affect their dependent variable.

    31. It is okay for independent variables not to be correlated with the dependent variables, as long as they are highly correlated with each other.
    32. In multiple regression the adjusted R-square controls for the number of dependent variables.
    33. Values of R-square adjusted below 0.20 are considered to suggest weak model fit, those between 0.20 and 0.40 indicate moderate fit, those above 0.40 indicate strong fit, and those above 0.65 indicate very strong model fit.
    34. It is common to compare beta coefficients across different models.
    35. The global F-test examines the overall effect of all independent variables jointly on the dependent variable.
    36. Outliers are observations whose multiple regression residuals exceed three standard deviations.
    37. When two variables are multicollinear, they are strongly correlated with each other.
    38. Curvilinearity is indicated by residuals that are linearly related to each other.
    39. Heteroscedasticity occurs when one of the dependent variables is linearly related to the independent variable.
    40. Autocorrelation is common with time series data.
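For anyone working through these statements by hand: here is a minimal Python sketch (standard library only; the data values and function names are illustrative, not from the quiz) of the two quantities most of the items revolve around — the pooled two-sample t statistic (statements 1–13) and the OLS slope, intercept, and R-square of a simple regression (statements 14–26).

```python
# Illustrative sketch only: hand-computing a pooled two-sample t statistic
# and a simple OLS regression. Data values are made up for demonstration.
from statistics import mean, variance

def two_sample_t(a, b):
    """Student's t for two independent samples using a pooled variance
    (this assumes equal group variances, one of the t-test's assumptions)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

def simple_regression(x, y):
    """OLS slope, intercept, and R-square for two continuous variables."""
    mx, my = mean(x), mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx                    # slope: steepness of the fitted line
    a = my - b * mx                  # intercept
    ss_tot = sum((yi - my) ** 2 for yi in y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    r2 = 1 - ss_res / ss_tot         # proportion of variation explained
    return b, a, r2

# Hypothetical before/after-style groups:
group1 = [5.1, 4.9, 5.6, 5.2, 5.0]
group2 = [4.4, 4.7, 4.1, 4.5, 4.3]
t = two_sample_t(group1, group2)

# Hypothetical bivariate data for the regression:
x = [1, 2, 3, 4, 5]
y = [2.0, 4.1, 5.9, 8.2, 9.8]
slope, intercept, r2 = simple_regression(x, y)
print(t, slope, intercept, r2)
```

Note that R-square falls between 0 and 1 (a value near 1, not 0, indicates a near-perfect fit), which is the kind of fact several of the items above test.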
