Covariance
Revision as of 16:53, 25 January 2010
The covariance — usually denoted as Cov — is a statistical parameter used to compare
two real random variables on the same sample space.
It is defined as the expectation (or mean value)
of the product of the deviations (from their respective mean values)
of the two variables.
The value of the covariance indicates a linear trend between the two variables.
- If one variable increases (in the mean) with the other, then the covariance is positive.
- It is negative if one variable tends to decrease when the other increases.
- And it is 0 if the two variables are not linearly correlated (note: there may still be other dependencies).
A normalized version of the covariance, the correlation coefficient, is independent of scale.
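The three cases above can be illustrated numerically. The following sketch uses made-up data (not from the article): a linearly increasing, a linearly decreasing, and a purely nonlinear (quadratic) relationship between the two variables.

```python
# Sign of the covariance for three assumed example data sets:
# positive for an increasing linear trend, negative for a decreasing one,
# and 0 for a symmetric nonlinear dependence (y = x^2).

def cov(xs, ys):
    """Sample covariance: mean of (x - mean(x)) * (y - mean(y)) over paired data."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
print(cov(x, [2 * v for v in x]))    # positive: y increases with x
print(cov(x, [-2 * v for v in x]))   # negative: y decreases as x increases
print(cov(x, [v * v for v in x]))    # 0: y depends on x, but not linearly
```

The last case shows why a covariance of 0 does not rule out other dependencies: y = x^2 is completely determined by x, yet the two are not linearly correlated.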
Formal definition
The covariance of two real random variables X and Y
with expectations (mean values) E(X) and E(Y)
is defined by

    Cov(X,Y) = E[ (X − E(X)) (Y − E(Y)) ]
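The definition can be sketched directly for random variables on a finite sample space. The example below is hypothetical (two fair coin tosses; the distribution and the variables X and Y are not from the article) and computes the expectation and covariance exactly as defined.

```python
# Covariance from the formal definition, for random variables on a finite
# sample space: p maps each outcome to its probability, and X, Y are
# real-valued functions on the outcomes. Hypothetical example data.

def E(f, p):
    """Expectation of a random variable f under the distribution p."""
    return sum(p[w] * f(w) for w in p)

def covariance(X, Y, p):
    """Cov(X, Y) = E[(X - E(X)) * (Y - E(Y))]."""
    mx, my = E(X, p), E(Y, p)
    return E(lambda w: (X(w) - mx) * (Y(w) - my), p)

# Sample space: two fair coin tosses, each outcome equally likely.
p = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
X = lambda w: w[0] + w[1]   # total number of heads
Y = lambda w: w[0]          # result of the first toss
print(covariance(X, Y, p))  # positive: X increases (in the mean) with Y
```

Here the covariance comes out positive, as the first bullet above predicts: more heads on the first toss means more heads in total, in the mean.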
Remark:
If the two random variables are the same, then
their covariance is equal to the variance of the single variable: Cov(X,X) = Var(X).
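This identity follows by taking Y = X in the definition, which leaves E[(X − E(X))^2], the variance. A quick numerical check on made-up data:

```python
# Check on toy data (not from the article) that Cov(X, X) = Var(X):
# with Y = X, the defining product becomes the squared deviation.

def cov(xs, ys):
    """Sample covariance of paired data."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def var(xs):
    """Sample variance: mean squared deviation from the mean."""
    mx = sum(xs) / len(xs)
    return sum((x - mx) * (x - mx) for x in xs) / len(xs)

data = [1.0, 4.0, 2.0, 7.0, 6.0]
print(cov(data, data))  # same value as var(data)
print(var(data))
```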