For observations \(u_{i,j},\ i=1,\ldots,N,\ j=1,2,\) the K-plot considers two quantities: first, the ordered values of
the empirical bivariate distribution function
\(H_i:=\hat{F}_{U_1U_2}(u_{i,1},u_{i,2})\) and, second, \(W_{i:N}\),
the expected values of the order statistics from a random sample
of size \(N\) from the random variable \(W=C(U_1,U_2)\) under the null
hypothesis of independence between \(U_1\) and \(U_2\). \(W_{i:N}\)
can be calculated as follows: $$ W_{i:N}= N {N-1 \choose i-1}
\int\limits_{0}^1 \omega\, k_0(\omega) \left( K_0(\omega) \right)^{i-1} \left( 1-K_0(\omega)
\right)^{N-i} d\omega, $$ where
$$K_0(\omega) = \omega - \omega \log(\omega), $$
and \(k_0(\cdot)\) is the corresponding density, i.e. \(k_0(\omega) = -\log(\omega)\).
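As a minimal sketch of this computation (not part of the original text), \(W_{i:N}\) can be approximated by numerical integration in base R, using \(k_0(\omega) = -\log(\omega)\) from above; the helper names K0, k0 and W_iN are purely illustrative.

```r
# Kendall distribution function of the independence copula and its density
K0 <- function(w) w - w * log(w)
k0 <- function(w) -log(w)   # d/dw [w - w*log(w)] = -log(w)

# Expected value W_{i:N} of the i-th order statistic of W = C(U1, U2)
# under independence, via numerical integration on (0, 1)
W_iN <- function(i, N) {
  integrand <- function(w) w * k0(w) * K0(w)^(i - 1) * (1 - K0(w))^(N - i)
  N * choose(N - 1, i - 1) * integrate(integrand, 0, 1)$value
}

W_iN(5, 20)  # e.g. expected 5th order statistic for a sample of size 20
```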
K-plots can be seen as the bivariate copula equivalent of QQ-plots. If the
points of a K-plot lie approximately on the diagonal \(y=x\), then
\(U_1\) and \(U_2\) are approximately independent. Any deviation from
the diagonal points towards dependence. In the case of positive dependence,
the points of the K-plot should be located above the diagonal, and vice
versa for negative dependence. The larger the deviation from the diagonal,
the stronger the degree of dependence. Perfect positive dependence is
present if the points \(\left(W_{i:N},H_i\right)\) lie on the curve
\(K_0(\omega)\), which is located above the main diagonal. If, however, the
points \(\left(W_{i:N},H_i\right)\) lie on the x-axis, this indicates
perfect negative dependence between \(U_1\) and \(U_2\).
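To make this interpretation concrete, the following is a minimal, self-contained base-R sketch (not taken from any package) that constructs a K-plot by hand for simulated independent data and overlays the diagonal and the curve \(K_0\). The helpers K0 and W_iN are again illustrative, and the empirical distribution function is computed here with the common leave-one-out convention (divisor \(N-1\)), which is one of several possible choices.

```r
# K-plot constructed by hand for simulated data; K0 and W_iN are repeated
# from the sketch above so that this example runs on its own
K0   <- function(w) w - w * log(w)
W_iN <- function(i, N) N * choose(N - 1, i - 1) *
  integrate(function(w) -w * log(w) * K0(w)^(i - 1) * (1 - K0(w))^(N - i),
            0, 1)$value

set.seed(42)
N  <- 100
u1 <- runif(N); u2 <- runif(N)        # independent uniforms for illustration

# Empirical bivariate distribution function at each observation
# (leave-one-out convention, i.e. divisor N - 1)
H <- sapply(seq_len(N), function(i)
  sum(u1[-i] <= u1[i] & u2[-i] <= u2[i]) / (N - 1))

# Expected order statistics under the independence null hypothesis
W <- sapply(seq_len(N), W_iN, N = N)

plot(W, sort(H), xlab = expression(W[i:N]), ylab = expression(H[(i)]))
abline(0, 1, lty = 2)                       # diagonal: independence
curve(x - x * log(x), from = 1e-3, to = 1,
      add = TRUE, lty = 3)                  # K0: perfect positive dependence
```

For independent data, the points scatter around the diagonal; for strongly positively dependent data they move towards the \(K_0\) curve, and for strongly negatively dependent data towards the x-axis.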