
Thursday, January 24, 2013

Checking the Additional Assumptions of a MANOVA



A MANOVA is typically seen as an extension of an ANOVA to designs with more than one continuous dependent variable. The typical assumptions of an ANOVA, such as normality, equality of variance, and absence of univariate outliers, should still be checked. However, there are additional assumptions that should be checked when conducting a MANOVA.

The additional assumptions of the MANOVA include:


  • Absence of multivariate outliers
  • Linearity
  • Absence of multicollinearity
  • Equality of covariance matrices


Absence of multivariate outliers is checked by assessing Mahalanobis distances among the participants. To do this in SPSS, run a multiple linear regression with all of the dependent variables of the MANOVA entered as the independent variables of the regression, and a simple ID variable entered as the dependent variable. SPSS has an option to save the Mahalanobis distances when running the regression. Once this is done, sort the Mahalanobis distances from greatest to least. To identify an outlier, the distances are compared to the critical chi-square value at p = .001, with degrees of freedom equal to the number of dependent variables. With 3 dependent variables, the critical value is 16.27, so any participant with a Mahalanobis distance greater than 16.27 should be removed.
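The same screening can be sketched outside of SPSS. The following Python example, using made-up data with one deliberately planted outlier, computes each participant's squared Mahalanobis distance and flags cases above the critical chi-square value described above:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)
# Hypothetical data: 50 participants measured on 3 dependent variables
X = rng.normal(size=(50, 3))
X[0] = [8.0, -8.0, 8.0]  # plant an obvious multivariate outlier

# Squared Mahalanobis distance of each participant from the centroid
mean = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mean
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Critical chi-square at p = .001 with df = number of dependent variables
critical = chi2.ppf(0.999, df=X.shape[1])
print(round(critical, 2))  # 16.27 for 3 dependent variables

outliers = np.where(d2 > critical)[0]
print(outliers)  # the planted case is flagged
```

The critical value of 16.27 is simply the .999 quantile of the chi-square distribution with 3 degrees of freedom, matching the figure quoted above.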

Linearity assumes that all of the dependent variables are linearly related to each other. This can be checked with a scatterplot matrix of the dependent variables, and it should be assessed for each group of the MANOVA separately.

Absence of multicollinearity is checked by conducting correlations among the dependent variables. The dependent variables should all be moderately related, but any correlation over .80 presents a concern for multicollinearity.
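As a rough sketch, this check amounts to inspecting the correlation matrix of the dependent variables. The Python example below uses hypothetical data in which one pair of variables is deliberately near-redundant, so it trips the .80 flag:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical dependent variables: y2 is moderately related to y1,
# while y3 is nearly a copy of y1 (to trigger the .80 concern)
y1 = rng.normal(size=100)
y2 = 0.5 * y1 + rng.normal(size=100)
y3 = y1 + 0.1 * rng.normal(size=100)
dvs = np.column_stack([y1, y2, y3])

# Correlation matrix of the dependent variables
r = np.corrcoef(dvs, rowvar=False)

# Flag any pair of dependent variables correlated above .80
high = [(i, j) for i in range(3) for j in range(i + 1, 3)
        if abs(r[i, j]) > 0.80]
print(high)  # only the y1/y3 pair should appear
```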

Equality of covariance matrices is an assumption checked by running a Box's M test. Unlike most tests, Box's M is highly sensitive, so the level of significance is typically set at .001. As long as the p value for the test is above .001, the assumption is met.
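SPSS reports Box's M directly as part of the MANOVA output. For illustration only, here is a rough Python sketch of the statistic and its common chi-square approximation, on simulated groups (group sizes and data are made up) that genuinely share a covariance matrix:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
# Hypothetical: two groups, three dependent variables, equal covariance
groups = [rng.normal(size=(40, 3)), rng.normal(size=(45, 3))]

p = groups[0].shape[1]            # number of dependent variables
ns = [len(x) for x in groups]     # group sizes
g, N = len(groups), sum(ns)

covs = [np.cov(x, rowvar=False) for x in groups]
pooled = sum((n - 1) * S for n, S in zip(ns, covs)) / (N - g)

# Box's M statistic: pooled log-determinant vs. group log-determinants
M = (N - g) * np.log(np.linalg.det(pooled)) \
    - sum((n - 1) * np.log(np.linalg.det(S)) for n, S in zip(ns, covs))

# Chi-square approximation with Box's correction factor
c = (sum(1 / (n - 1) for n in ns) - 1 / (N - g)) \
    * (2 * p ** 2 + 3 * p - 1) / (6 * (p + 1) * (g - 1))
df = (g - 1) * p * (p + 1) / 2
p_value = chi2.sf(M * (1 - c), df)
print(p_value > 0.001)  # assumption met if True, per the .001 cutoff
```

This is a sketch of the standard chi-square approximation, not a replacement for the SPSS output; the small-sample behavior of Box's M is more involved than shown here.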

Monday, April 6, 2009

Analysis of Variance

Analysis of variance (ANOVA) is a statistical technique that was invented by Fisher, and it is therefore sometimes called Fisher's analysis of variance. In survey research, ANOVA is used to compare the means of more than two populations.

The technique can also be applied to two samples, in which case it gives the same result as the t-test. For example, if we want to compare income between two gender groups, the t-test and the ANOVA will lead to the same conclusion. With more than two groups, we could run t-tests on every pair, but this procedure is tedious and inflates the Type I error rate. ANOVA is therefore the best technique when the independent variable has more than two groups. Before performing an ANOVA, we should consider some basics and the assumptions on which this test is performed:

Assumptions:

1. Independence of cases: Observations of the dependent variable should be independent of one another, which is usually ensured by selecting the sample randomly. There should not be any pattern in the selection of the sample.

2. Normality: The distribution of each group should be normal. The Kolmogorov-Smirnov or the Shapiro-Wilk test may be used to check the normality of each group.

3. Homogeneity: Homogeneity means the variances of the groups should be the same. Levene's test is used to test the homogeneity of variances across groups.
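Assumptions 2 and 3 can be sketched in Python with SciPy, on hypothetical scores for three groups:

```python
import numpy as np
from scipy.stats import shapiro, levene

rng = np.random.default_rng(1)
# Hypothetical outcome scores for three groups
g1 = rng.normal(10, 2, size=30)
g2 = rng.normal(12, 2, size=30)
g3 = rng.normal(11, 2, size=30)

# Shapiro-Wilk per group: p > .05 is consistent with normality
shapiro_ps = [shapiro(g)[1] for g in (g1, g2, g3)]
print([round(p, 3) for p in shapiro_ps])

# Levene's test across groups: p > .05 supports equal variances
stat, p_lev = levene(g1, g2, g3)
print(round(p_lev, 3))
```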

If the data follow the above assumptions, then analysis of variance (ANOVA) is the best technique to compare the means of two or more populations. ANOVA has three types.

One-way analysis of variance (ANOVA): When we are comparing two or more groups based on a single factor variable, it is said to be a one-way analysis of variance (ANOVA). For example, if we want to compare whether or not the mean output of three workers is the same based on their working hours, this is a one-way ANOVA.

Two-way analysis of variance (ANOVA): When there are two factor variables, it is said to be a two-way analysis of variance (ANOVA). For example, based on both working conditions and working hours, we can compare whether or not the mean output of three workers is the same. In this case, it is a two-way ANOVA.

K-way analysis of variance (ANOVA): When there are k factor variables, it is said to be a k-way analysis of variance (ANOVA).

Key terms and concepts:

Sum of squares between groups: For the sum of squares between groups, we calculate the mean of each group and the grand mean of all observations. For each group, we take the deviation of the group mean from the grand mean, square it, and multiply it by the group size; summing these quantities across groups gives the between-groups sum of squares.
Sum of squares within groups: To get the sum of squares within groups, each observation's deviation from its own group mean is squared, and these squared deviations are summed across all observations in all groups.

F-ratio: To calculate the F-ratio, the sum of squares between groups is first divided by its degrees of freedom to give the mean square between groups, and the sum of squares within groups is divided by its degrees of freedom to give the mean square within groups. The F-ratio is the mean square between groups divided by the mean square within groups.

Degrees of freedom: The degrees of freedom for the sum of squares between groups equal the number of groups minus one. The degrees of freedom for the sum of squares within groups equal the total number of observations minus the number of groups.

BSS df = (g - 1), where BSS is the between-groups sum of squares, g is the number of groups, and df is the degrees of freedom.

WSS df = (N - g), where WSS is the within-groups sum of squares and N is the total sample size.
Significance: At a predetermined level of significance (usually 5%), we compare the calculated F value with the critical table value. Today, however, computers can automatically calculate the probability value (p-value) for the F-ratio. If the p-value is less than the predetermined significance level, we conclude that the group means are different. If the p-value is greater than the predetermined significance level, we can say that there is no difference between the groups' means.
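The quantities above can be put together in a short Python sketch on made-up worker-output data, cross-checked against SciPy's built-in one-way ANOVA:

```python
import numpy as np
from scipy.stats import f as f_dist, f_oneway

# Hypothetical output scores for three workers (three groups)
groups = [np.array([12., 14., 11., 13., 15.]),
          np.array([16., 18., 17., 15., 19.]),
          np.array([11., 10., 12., 13., 11.])]

all_obs = np.concatenate(groups)
grand_mean = all_obs.mean()
g, N = len(groups), len(all_obs)

# Between-groups SS: group means around the grand mean, weighted by group size
bss = sum(len(x) * (x.mean() - grand_mean) ** 2 for x in groups)
# Within-groups SS: observations around their own group mean
wss = sum(((x - x.mean()) ** 2).sum() for x in groups)

# F = mean square between / mean square within, with df = (g-1) and (N-g)
F = (bss / (g - 1)) / (wss / (N - g))
p = f_dist.sf(F, g - 1, N - g)

# Cross-check against SciPy's one-way ANOVA
F2, p2 = f_oneway(*groups)
print(round(F, 3), round(F2, 3))  # both print as 19.81
```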

Analysis of variance (ANOVA) in SPSS: In SPSS, analysis of variance (ANOVA) can be performed in many ways. We can perform this test by clicking on the "One-Way ANOVA" option, available under "Compare Means." When we are performing a two-way or higher-order analysis of variance (ANOVA), we can use the "Univariate" option available in the GLM menu. SPSS will give additional results as well, like partial eta squared, power, the regression model, post hoc tests, a homogeneity test, etc. A post hoc test is performed when there is a significant difference between groups and we want to know exactly which groups have means that are significantly different from the other groups.

Extension of analysis of variance (ANOVA):

MANOVA: Analysis of variance (ANOVA) is performed when we have one metric dependent variable and one nominal independent variable. However, when we have more than one dependent variable and one or more independent variables, we use multivariate analysis of variance (MANOVA).

ANCOVA: The analysis of covariance (ANCOVA) test is used to determine whether or not certain factors have an effect on the outcome variable after removing the variance accounted for by quantitative predictors (covariates).
