Correlation, as the name suggests, depicts a relationship between two or more variables under study. Correlation is generally categorized into two types, namely Bivariate Correlation and Partial Correlation.
Bivariate Correlation is the one that shows an association between two variables. Partial Correlation is the one that shows the association between two variables while controlling for, or adjusting for, the effect of one or more additional variables.
A Correlation is a degree of measure, which means that a Correlation can be negative, positive, or perfect. A positive Correlation is a type of Correlation in which the two variables move in the same direction. In other words, if there is an increase (or decrease) in one variable, then there is a simultaneous increase (or decrease) in the other variable. A negative Correlation is a type of Correlation where if there is a decrease (or increase) in one variable, then there is a simultaneous increase (or decrease) in the other variable.
A perfect Correlation is that type of Correlation where a change in one variable is accompanied by an exactly proportional change in the other variable.
A British biometrician named Karl Pearson developed a formula to measure the degree of Correlation, called the Correlation Coefficient. This Correlation Coefficient is generally depicted as ‘r.’ In mathematical language, the Correlation Coefficient is defined as the ratio between the covariance of the two variables and the product of the square roots of their individual variances. The value of the Correlation Coefficient always lies between -1 and +1. If the value of the Correlation Coefficient is ‘+1,’ then the variables are said to be perfectly positively correlated. If, on the other hand, the value of the Correlation Coefficient is ‘-1,’ then the variables are said to be perfectly negatively correlated.
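The definition above can be sketched directly in code: covariance divided by the product of the square roots of the individual variances. This is a minimal illustration with made-up data, not a substitute for a statistics package.

```python
import math

def pearson_r(x, y):
    """Pearson's r: covariance of x and y divided by the product of
    the square roots of their individual variances."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    var_x = sum((a - mx) ** 2 for a in x) / n
    var_y = sum((b - my) ** 2 for b in y) / n
    return cov / math.sqrt(var_x * var_y)

# A perfectly positive linear relationship gives r = +1,
# a perfectly negative one gives r = -1.
x = [1, 2, 3, 4, 5]
print(pearson_r(x, [2 * v + 1 for v in x]))   # → 1.0
print(pearson_r(x, [-3 * v for v in x]))      # → -1.0
```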
The value of the Correlation Coefficient does not depend upon a change of origin or a change of scale (a negative scale factor changes only the sign of ‘r,’ not its magnitude).
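This invariance can be checked numerically. In the sketch below (with arbitrary illustrative data), one series is shifted in origin and rescaled by a positive factor, and the coefficient is unchanged.

```python
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
    return cov / (sx * sy)

x = [2, 4, 6, 9, 13]
y = [1, 3, 2, 7, 9]

r_original = pearson_r(x, y)
# Change of origin (+100) and positive change of scale (*5): r is unchanged.
r_transformed = pearson_r([100 + 5 * a for a in x], y)
print(round(r_original, 9) == round(r_transformed, 9))  # → True
```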
If the value of the Correlation Coefficient is zero, then the variables are said to be uncorrelated, meaning there is no linear relationship between them. The researcher should note that if two variables are independent, then their covariance, and hence their Correlation Coefficient, is zero. The converse, however, is not true: if the covariance of two variables is zero, it does not necessarily mean that the two variables are independent.
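A standard counterexample makes this concrete: take values symmetric about zero and their squares. The second variable is completely determined by the first, yet the covariance is zero. (The data here are chosen purely for illustration.)

```python
def covariance(x, y):
    """Sample covariance (dividing by n) of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / n

x = [-2, -1, 0, 1, 2]
y = [v ** 2 for v in x]   # y depends entirely on x, so they are NOT independent
print(covariance(x, y))   # → 0.0  (zero covariance despite the dependence)
```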
There are certain assumptions that come along with the Correlation Coefficient. The following are the assumptions for the Correlation Coefficient:
The Correlation Coefficient assumes that the variables under study are linearly related.
The Correlation Coefficient assumes that a cause-and-effect relationship exists between the forces operating on the items of the two variable series, and that such forces are common to both series.
Where the operating forces are entirely independent, the value of the correlation coefficient should be zero. If the value of the correlation coefficient is not zero in such cases, the correlation is termed a chance correlation or spurious correlation. For example, the correlation between a person's income and a person's height is a case of spurious correlation. Another example of spurious correlation is the correlation between shoe size and intelligence in a certain group of people.
A Pearsonian coefficient of correlation computed between the ranks of two variables, say, x and y, is called the rank correlation coefficient between those variables.
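The definition above can be sketched as: replace each observation by its rank within its own series, then apply Pearson's formula to the ranks. This minimal version assumes no tied values (ties require averaged ranks, which are omitted here for brevity).

```python
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
    return cov / (sx * sy)

def ranks(values):
    # Rank 1 = smallest value; assumes no ties for simplicity.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def rank_correlation(x, y):
    # Rank correlation: Pearson's r applied to the ranks of x and y.
    return pearson_r(ranks(x), ranks(y))

x = [10, 20, 30, 40, 50]
y = [1, 4, 9, 16, 25]   # monotonically increasing in x
print(rank_correlation(x, y))  # → 1.0
```

Note that the relationship between x and y here is nonlinear, yet the rank correlation is exactly 1 because the ordering of the two series agrees perfectly.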