Friedman's Test
In Two-Way Analysis of Variance (ANOVA), we used two-way analysis of variance to study the effects of car model and factory on car mileage. We tested whether either factor had a significant effect on mileage and whether there was an interaction between the two factors. We concluded that there was no interaction, but that each factor individually had a significant effect. Now we will see whether a nonparametric analysis leads to the same conclusion.
Friedman's test is a nonparametric test for data having a two-way layout (data grouped by two categorical factors). Unlike two-way analysis of variance, Friedman's test does not treat the two factors symmetrically and it does not test for an interaction between them. Instead, it is a test for whether the columns are different after adjusting for possible row differences. The test is based on an analysis of variance using the ranks of the data across categories of the row factor. Output includes a table similar to an ANOVA table.
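To make the mechanics concrete, here is a minimal sketch of the statistic for an n-by-k matrix X with one observation per cell; X and the intermediate variable names are illustrative assumptions, and the sketch omits the tie correction and replicate handling that the toolbox function performs.

    % Sketch of the Friedman statistic (no replicates, no tie correction).
    % X is assumed to be n-by-k: n blocks (rows), k treatments (columns).
    [n, k] = size(X);
    R = zeros(n, k);
    for i = 1:n
        R(i,:) = tiedrank(X(i,:));          % rank the data within each row
    end
    Rj = sum(R, 1);                         % rank sum for each column
    chi2 = 12/(n*k*(k+1)) * sum(Rj.^2) - 3*n*(k+1);
    p = 1 - chi2cdf(chi2, k-1);             % chi-square approximation, k-1 df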
We can run Friedman's test as follows.
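A minimal sketch of the call, assuming mileage is the 6-by-3 matrix of mileage data from the two-way ANOVA example, with 3 replicates per cell (the p-value printed by the original example is not reproduced here):

    % Friedman's test on the mileage data; the second argument is the
    % number of replicates per cell. The returned p-value tests for
    % column (car model) effects.
    p = friedman(mileage, 3)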
Recall that the classical analysis of variance gave p-values to test column effects, row effects, and interaction effects. The p-value returned by friedman is for column effects. Using either this p-value or the p-value from ANOVA (p < 0.0001), we conclude that there are significant column effects.
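For comparison, a sketch of the classical two-way ANOVA call on the same data; anova2 returns the column, row, and interaction p-values in that order:

    % Two-way ANOVA with 3 replicates per cell.
    % p(1) = column (car model), p(2) = row (factory), p(3) = interaction.
    p = anova2(mileage, 3);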
In order to test for row effects, we need to rearrange the data to swap the roles of the rows and columns. For a data matrix x with no replications, we could simply transpose the data and run Friedman's test on the transpose, as in the sketch below.
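A minimal sketch, assuming x is a matrix with one observation per cell:

    % Transposing swaps the roles of rows and columns, so this p-value
    % tests for row effects of the original matrix.
    p = friedman(x')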
With replicated data it is slightly more complicated. A simple approach is to reshape the matrix into a three-dimensional array with the first dimension representing the replicates, swap the other two dimensions, and restore the two-dimensional shape.
    x = reshape(mileage, [3 2 3]);
    x = permute(x, [1 3 2]);
    x = reshape(x, [9 2])

    x =
       33.3000   32.6000
       33.4000   32.5000
       32.9000   33.0000
       34.5000   33.4000
       34.8000   33.7000
       33.8000   33.9000
       37.4000   36.6000
       36.8000   37.0000
       37.6000   36.7000

    friedman(x, 3)

    ans =
        0.0082
Again, the conclusion is similar to that of the classical analysis of variance. Both this p-value and the one from ANOVA (p = 0.0039) lead us to conclude that there are significant row effects.
You cannot use Friedman's test to test for interactions between the row and column factors.