In probability theory and statistics, the noncentral chi-squared distribution (or noncentral chi-square distribution, noncentral \chi^2 distribution) is a noncentral generalization of the chi-squared distribution.
Let (X_1, X_2, \ldots, X_i, \ldots, X_k) be k independent, normally distributed random variables with means \mu_i and unit variances. Then the random variable

\sum_{i=1}^{k} X_i^2

is distributed according to the noncentral chi-squared distribution. It has two parameters: k, which specifies the number of degrees of freedom (i.e. the number of X_i), and λ, which is related to the means of the random variables X_i by

λ = \sum_{i=1}^{k} \mu_i^2.

λ is sometimes called the noncentrality parameter. Note that some references define λ in other ways, such as half of the above sum, or its square root.
This distribution arises in multivariate statistics as a derivative of the multivariate normal distribution. While the central chi-squared distribution is the squared norm of a random vector with N(0_k, I_k) distribution (i.e., the squared distance from the origin of a point drawn at random from that distribution), the noncentral \chi^2 is the squared norm of a random vector with N(\mu, I_k) distribution. Here 0_k is a zero vector of length k, \mu = (\mu_1, \ldots, \mu_k) and I_k is the identity matrix of size k.
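As an illustrative check of this definition, the following minimal sketch (assuming NumPy and SciPy are available; the values k = 4, λ = 2.5, the choice of means, and the seed are arbitrary) simulates the sum of squares of shifted unit-variance normals and compares its sample mean and variance with those reported by scipy.stats.ncx2.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k, lam = 4, 2.5                       # arbitrary example values
mu = np.zeros(k)
mu[0] = np.sqrt(lam)                  # any means with sum(mu**2) == lam would do

# Sum of squares of k independent N(mu_i, 1) random variables
samples = ((rng.standard_normal((100_000, k)) + mu) ** 2).sum(axis=1)

# Reference mean and variance of the noncentral chi-squared distribution
ref_mean, ref_var = stats.ncx2.stats(df=k, nc=lam, moments="mv")
print(samples.mean(), ref_mean)       # both close to k + lam
print(samples.var(), ref_var)         # both close to 2*(k + 2*lam)
```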
The probability density function (pdf) is given by
f_X(x;k,λ) = \sum_{i=0}^{\infty} \frac{e^{-λ/2}(λ/2)^i}{i!} f_{Y_{k+2i}}(x),

where Y_q is distributed as chi-squared with q degrees of freedom.

From this representation, the noncentral chi-squared distribution is seen to be a Poisson-weighted mixture of central chi-squared distributions. Suppose that a random variable J has a Poisson distribution with mean λ/2, and the conditional distribution of Z given J = i is chi-squared with k + 2i degrees of freedom. Then the unconditional distribution of Z is noncentral chi-squared with k degrees of freedom and noncentrality parameter λ.
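A minimal numerical check of this mixture representation (a sketch assuming SciPy; the truncation at 200 Poisson terms and the values k = 3, λ = 5 are arbitrary choices): the truncated Poisson-weighted sum of central chi-squared densities is compared against scipy.stats.ncx2.pdf.

```python
import numpy as np
from scipy import stats

k, lam = 3, 5.0                       # arbitrary example values
x = np.linspace(0.1, 30, 200)

# Poisson(lam/2)-weighted mixture of central chi-squared densities,
# truncated after enough terms for the remaining weights to be negligible
i = np.arange(200)[:, None]
weights = stats.poisson.pmf(i, lam / 2)            # e^{-lam/2} (lam/2)^i / i!
mixture_pdf = (weights * stats.chi2.pdf(x, df=k + 2 * i)).sum(axis=0)

# Difference from SciPy's own noncentral chi-squared pdf: negligible
print(np.max(np.abs(mixture_pdf - stats.ncx2.pdf(x, df=k, nc=lam))))
```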
Alternatively, the pdf can be written as
f_X(x;k,λ) = \frac{1}{2}\,e^{-(x+λ)/2}\left(\frac{x}{λ}\right)^{k/4-1/2} I_{k/2-1}(\sqrt{λx}),

where I_\nu(y) is a modified Bessel function of the first kind given by

I_\nu(y) = (y/2)^\nu \sum_{j=0}^{\infty} \frac{(y^2/4)^j}{j!\,\Gamma(\nu+j+1)}.
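The same kind of check can be run for this Bessel-function form (a sketch assuming SciPy, whose scipy.special.iv implements I_\nu; the parameter values are arbitrary):

```python
import numpy as np
from scipy import stats, special

k, lam = 3, 5.0                       # arbitrary example values
x = np.linspace(0.1, 30, 200)

# (1/2) exp(-(x+lam)/2) (x/lam)^(k/4 - 1/2) I_{k/2-1}(sqrt(lam*x))
bessel_pdf = (0.5 * np.exp(-(x + lam) / 2)
              * (x / lam) ** (k / 4 - 0.5)
              * special.iv(k / 2 - 1, np.sqrt(lam * x)))

# Difference from SciPy's noncentral chi-squared pdf: negligible
print(np.max(np.abs(bessel_pdf - stats.ncx2.pdf(x, df=k, nc=lam))))
```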
Using the relation between Bessel functions and hypergeometric functions, the pdf can also be written as:[2]
f_X(x;k,λ) = {\rm e}^{-λ/2}\,{}_0F_1\!\left(;\tfrac{k}{2};\tfrac{λx}{4}\right)\frac{1}{2^{k/2}\Gamma(k/2)}\,{\rm e}^{-x/2} x^{k/2-1}.
The case k = 0 (zero degrees of freedom), in which the distribution has a discrete component at zero, is discussed by Torgersen (1972) and further by Siegel (1979).[3][4]
The derivation of the probability density function is most easily done by performing the following steps:
1. Since X_1, \ldots, X_k have unit variances, their joint distribution is spherically symmetric, up to a location shift.
2. The spherical symmetry then implies that the distribution of X = X_1^2 + \cdots + X_k^2 depends on the means only through the squared length λ = \mu_1^2 + \cdots + \mu_k^2. Without loss of generality, we can therefore take \mu_1 = \sqrt{λ} and \mu_2 = \cdots = \mu_k = 0.
3. Now derive the density of X = X_1^2 (i.e. the k = 1 case). Simple transformation of random variables shows that

f_X(x;1,λ) = \frac{1}{2\sqrt{x}}\left[\phi(\sqrt{x}-\sqrt{λ}) + \phi(\sqrt{x}+\sqrt{λ})\right] = \frac{1}{\sqrt{2\pi x}}\,e^{-(x+λ)/2}\cosh(\sqrt{λx}),

where \phi(\cdot) is the standard normal density.
4. Expand the cosh term in a Taylor series. This gives the Poisson-weighted mixture representation of the density, still for k = 1; the indices on the central chi-squared densities in the series are 1 + 2i in this case.
5. Finally, the general case. We have assumed, without loss of generality, that X_2, \ldots, X_k are standard normal, so X_2^2 + \cdots + X_k^2 has a central chi-squared distribution with (k - 1) degrees of freedom, independent of X_1^2. Using the Poisson-weighted mixture representation for X_1^2, and the fact that the sum of independent chi-squared random variables is also chi-squared, completes the result. The indices in the series are (1 + 2i) + (k - 1) = k + 2i, as required.
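As a quick sanity check of the k = 1 density in step 3 (a sketch assuming SciPy; λ = 2 and the grid are arbitrary):

```python
import numpy as np
from scipy import stats

lam = 2.0                             # arbitrary noncentrality
x = np.linspace(0.05, 20, 100)

# f_X(x; 1, lam) = (phi(sqrt(x) - sqrt(lam)) + phi(sqrt(x) + sqrt(lam))) / (2 sqrt(x))
phi = stats.norm.pdf
k1_pdf = (phi(np.sqrt(x) - np.sqrt(lam)) + phi(np.sqrt(x) + np.sqrt(lam))) / (2 * np.sqrt(x))

# Difference from SciPy's noncentral chi-squared pdf with df=1: negligible
print(np.max(np.abs(k1_pdf - stats.ncx2.pdf(x, df=1, nc=lam))))
```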
The moment-generating function is given by
M(t;k,λ) = \frac{\exp\left(\frac{λt}{1-2t}\right)}{(1-2t)^{k/2}}.
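A minimal Monte Carlo check of this expression (assuming SciPy; t must satisfy t < 1/2 for the MGF to exist, and the values below are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, lam, t = 4, 2.5, 0.1               # arbitrary, with t < 1/2 so the MGF exists

samples = stats.ncx2.rvs(df=k, nc=lam, size=500_000, random_state=rng)
mc_mgf = np.mean(np.exp(t * samples))                            # Monte Carlo E[exp(tV)]
closed_form = np.exp(lam * t / (1 - 2 * t)) / (1 - 2 * t) ** (k / 2)

print(mc_mgf, closed_form)            # should agree up to Monte Carlo error (well under 1%)
```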
The first few raw moments are:
\mu'_1 = k+λ

\mu'_2 = (k+λ)^2 + 2(k+2λ)

\mu'_3 = (k+λ)^3 + 6(k+λ)(k+2λ) + 8(k+3λ)

\mu'_4 = (k+λ)^4 + 12(k+λ)^2(k+2λ) + 4(11k^2+44kλ+36λ^2) + 48(k+4λ).
The first few central moments are:
\mu_2 = 2(k+2λ)

\mu_3 = 8(k+3λ)

\mu_4 = 12(k+2λ)^2 + 48(k+4λ).
The nth cumulant is
\kappa_n = 2^{n-1}(n-1)!\,(k+nλ).
Hence
\mu'_n = 2^{n-1}(n-1)!\,(k+nλ) + \sum_{j=1}^{n-1} \frac{(n-1)!\,2^{j-1}}{(n-j)!}\,(k+jλ)\,\mu'_{n-j}.
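A small check of the raw-moment formulas (a sketch assuming SciPy, whose ncx2.moment returns raw moments; k = 3 and λ = 4 are arbitrary):

```python
from scipy import stats

k, lam = 3, 4.0                       # arbitrary example values
raw_moment_formulas = [
    k + lam,
    (k + lam) ** 2 + 2 * (k + 2 * lam),
    (k + lam) ** 3 + 6 * (k + lam) * (k + 2 * lam) + 8 * (k + 3 * lam),
    (k + lam) ** 4 + 12 * (k + lam) ** 2 * (k + 2 * lam) +
        4 * (11 * k ** 2 + 44 * k * lam + 36 * lam ** 2) + 48 * (k + 4 * lam),
]
for n, value in enumerate(raw_moment_formulas, start=1):
    # each pair should agree up to small numerical error
    print(n, value, stats.ncx2.moment(n, df=k, nc=lam))
```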
Again using the relation between the central and noncentral chi-squared distributions, the cumulative distribution function (cdf) can be written as
P(x;k,λ) = e^{-λ/2}\sum_{j=0}^{\infty}\frac{(λ/2)^j}{j!}\,Q(x;k+2j),

where Q(x;k) is the cumulative distribution function of the central chi-squared distribution with k degrees of freedom, given by

Q(x;k) = \frac{\gamma(k/2,\,x/2)}{\Gamma(k/2)},

and where \gamma(k,z) is the lower incomplete gamma function.
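A numerical sketch of this series (assuming SciPy; scipy.special.gammainc is the regularized lower incomplete gamma function \gamma(a,z)/\Gamma(a), and the truncation and parameter values are arbitrary):

```python
import numpy as np
from scipy import stats, special

k, lam = 4, 6.0                       # arbitrary example values
x = np.linspace(0.1, 40, 200)

j = np.arange(200)[:, None]
weights = stats.poisson.pmf(j, lam / 2)                  # e^{-lam/2} (lam/2)^j / j!
central_cdf = special.gammainc((k + 2 * j) / 2, x / 2)   # Q(x; k+2j) = gamma(.,.)/Gamma(.)
series_cdf = (weights * central_cdf).sum(axis=0)

# Difference from SciPy's noncentral chi-squared cdf: negligible
print(np.max(np.abs(series_cdf - stats.ncx2.cdf(x, df=k, nc=lam))))
```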
The Marcum Q-function Q_M(a,b) can also be used to represent the cdf:

P(x;k,λ) = 1 - Q_{k/2}\left(\sqrt{λ},\sqrt{x}\right).
When the degrees of freedom k is a positive odd integer, we have a closed-form expression for the complementary cumulative distribution function given by[6]

\begin{align} P(x;2n+1,λ) &= 1 - Q_{n+1/2}(\sqrt{λ},\sqrt{x}) \\ &= 1 - \left[Q(\sqrt{x}-\sqrt{λ}) + Q(\sqrt{x}+\sqrt{λ}) + e^{-(x+λ)/2}\sum_{m=1}^{n}\left(\frac{x}{λ}\right)^{m/2-1/4} I_{m-1/2}(\sqrt{λx})\right], \end{align}
where n is a non-negative integer, Q is the Gaussian Q-function, and I is the modified Bessel function of the first kind with half-integer order. The modified Bessel function of the first kind with half-integer order can itself be represented as a finite sum in terms of hyperbolic functions.
In particular, for k = 1, we have
P(x;1,λ)=1-\left[Q(\sqrt{x}-\sqrt{λ})+Q(\sqrt{x}+\sqrt{λ})\right].
Also, for k = 3, we have
P(x;3,λ)=1-\left[Q(\sqrt{x}-\sqrt{λ})+Q(\sqrt{x}+\sqrt{λ})+\sqrt{\frac{2}{\pi}}\,\frac{\sinh(\sqrt{λx})}{\sqrt{λ}}\,e^{-(x+λ)/2}\right].
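A sketch checking the k = 1 and k = 3 closed forms (assuming SciPy; scipy.stats.norm.sf plays the role of the Gaussian Q-function, and λ = 3 is arbitrary):

```python
import numpy as np
from scipy import stats

lam = 3.0                             # arbitrary noncentrality
x = np.linspace(0.1, 30, 200)
Q = stats.norm.sf                     # Gaussian Q-function

cdf_k1 = 1 - (Q(np.sqrt(x) - np.sqrt(lam)) + Q(np.sqrt(x) + np.sqrt(lam)))
cdf_k3 = cdf_k1 - (np.sqrt(2 / np.pi) * np.sinh(np.sqrt(lam * x)) / np.sqrt(lam)
                   * np.exp(-(x + lam) / 2))

# Differences from SciPy's noncentral chi-squared cdf: negligible
print(np.max(np.abs(cdf_k1 - stats.ncx2.cdf(x, df=1, nc=lam))))
print(np.max(np.abs(cdf_k3 - stats.ncx2.cdf(x, df=3, nc=lam))))
```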
Abdel-Aty derives (as "first approx.") a non-central Wilson–Hilferty transformation:[7]
\left(\frac{\chi'^2}{k+λ}\right)^{1/3} \sim N\left(1-\frac{2}{9f},\,\frac{2}{9f}\right),

i.e.,

P(x;k,λ) \approx \Phi\left\{\frac{\left(\frac{x}{k+λ}\right)^{1/3}-\left(1-\frac{2}{9f}\right)}{\sqrt{\frac{2}{9f}}}\right\}, \qquad \text{where } f := \frac{(k+λ)^2}{k+2λ} = k+\frac{λ^2}{k+2λ},

which is quite accurate and adapts well to the noncentrality. Also, f = f(k,λ) reduces to f = k for λ = 0, the (central) chi-squared case.
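A sketch of this approximation against the exact cdf (assuming SciPy; the parameter values and grid are arbitrary):

```python
import numpy as np
from scipy import stats

k, lam = 6, 10.0                      # arbitrary example values
x = np.linspace(0.5, 60, 200)

f = (k + lam) ** 2 / (k + 2 * lam)
wh_approx = stats.norm.cdf(((x / (k + lam)) ** (1 / 3) - (1 - 2 / (9 * f)))
                           / np.sqrt(2 / (9 * f)))

# Maximum deviation from the exact cdf: small
print(np.max(np.abs(wh_approx - stats.ncx2.cdf(x, df=k, nc=lam))))
```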
Sankaran discusses a number of closed form approximations for the cumulative distribution function.[8] In an earlier paper, he derived and states the following approximation:[9]
P(x;k,λ) \approx \Phi\left\{\frac{\left(\frac{x}{k+λ}\right)^{h}-\left(1+hp(h-1-0.5(2-h)mp)\right)}{h\sqrt{2p}\,(1+0.5mp)}\right\}

where

\Phi\lbrace\cdot\rbrace is the cumulative distribution function of the standard normal distribution;

h = 1-\frac{2}{3}\,\frac{(k+λ)(k+3λ)}{(k+2λ)^2};

p = \frac{k+2λ}{(k+λ)^2};

m = (h-1)(1-3h).
This and other approximations are discussed in a later text book.[10]
More recently, since the cdf of the noncentral chi-squared distribution with an odd number of degrees of freedom can be computed exactly, the cdf for an even number of degrees of freedom can be approximated by exploiting the monotonicity and log-concavity properties of the Marcum Q-function as

P(x;2n,λ) \approx \frac{1}{2}\left[P(x;2n-1,λ)+P(x;2n+1,λ)\right].

Another approximation that also serves as an upper bound is given by

P(x;2n,λ) \approx 1-\left[(1-P(x;2n-1,λ))(1-P(x;2n+1,λ))\right]^{1/2}.
For a given probability, these formulas are easily inverted to provide the corresponding approximation for x.
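A sketch of the averaging approximation and the upper bound for even degrees of freedom (assuming SciPy, and using its exact ncx2.cdf for the neighboring odd-degree terms; the parameters are arbitrary):

```python
import numpy as np
from scipy import stats

n, lam = 3, 5.0                       # even degrees of freedom 2n = 6; arbitrary values
x = np.linspace(0.5, 40, 200)

exact = stats.ncx2.cdf(x, df=2 * n, nc=lam)
avg_approx = 0.5 * (stats.ncx2.cdf(x, df=2 * n - 1, nc=lam)
                    + stats.ncx2.cdf(x, df=2 * n + 1, nc=lam))
upper = 1 - np.sqrt(stats.ncx2.sf(x, df=2 * n - 1, nc=lam)
                    * stats.ncx2.sf(x, df=2 * n + 1, nc=lam))

print(np.max(np.abs(avg_approx - exact)))   # small approximation error
print(np.min(upper - exact))                # should be non-negative (upper bound)
```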
If V is chi-squared distributed, V \sim \chi^2_k, then V is also noncentral chi-squared distributed: V \sim {\chi'}^2_k(0).

A linear combination of independent noncentral chi-squared variables \xi = \sum_i λ_i Y_i + c, with Y_i \sim {\chi'}^2(m_i, \delta_i^2), is generalized chi-square distributed.

If V_1 \sim {\chi'}^2_{k_1}(λ) and V_2 \sim {\chi'}^2_{k_2}(0), and V_1 is independent of V_2, then the ratio \frac{V_1/k_1}{V_2/k_2} \sim F'_{k_1,k_2}(λ) is noncentral F-distributed.

If J \sim \mathrm{Poisson}(λ/2), then \chi^2_{k+2J} \sim {\chi'}^2_k(λ).

If V \sim {\chi'}^2_2(λ), then \sqrt{V} has a Rice distribution with parameter \sqrt{λ}.

Normal approximation: if V \sim {\chi'}^2_k(λ), then \frac{V-(k+λ)}{\sqrt{2(k+2λ)}} converges to a standard normal distribution as either k \to \infty or λ \to \infty.

If V_1 \sim {\chi'}^2_{k_1}(λ_1) and V_2 \sim {\chi'}^2_{k_2}(λ_2) are independent, then W = V_1+V_2 is noncentral chi-squared distributed: W \sim {\chi'}^2_{k}(λ_1+λ_2), where k = k_1+k_2.

In general, for a finite set of independent V_i \sim {\chi'}^2_{k_i}(λ_i), i \in \{1,\ldots,N\}, the sum Y = \sum_{i=1}^{N} V_i is distributed according to Y \sim {\chi'}^2_{k_y}(λ_y), where k_y = \sum_{i=1}^{N} k_i and λ_y = \sum_{i=1}^{N} λ_i. This can be seen using moment generating functions: M_Y(t) = M_{\sum_i V_i}(t) = \prod_{i=1}^{N} M_{V_i}(t), by the independence of the V_i random variables.
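A quick simulation sketch of this additivity property (assuming SciPy; a two-sample Kolmogorov–Smirnov test compares V_1 + V_2 against direct draws from the claimed sum distribution, with arbitrary parameters and seed):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
k1, lam1, k2, lam2 = 3, 2.0, 5, 4.0   # arbitrary example values
n = 200_000

v1 = stats.ncx2.rvs(df=k1, nc=lam1, size=n, random_state=rng)
v2 = stats.ncx2.rvs(df=k2, nc=lam2, size=n, random_state=rng)
w_direct = stats.ncx2.rvs(df=k1 + k2, nc=lam1 + lam2, size=n, random_state=rng)

# Two-sample KS test: a large p-value is consistent with V1 + V2
# having the chi'2_{k1+k2}(lam1+lam2) distribution
print(stats.ks_2samp(v1 + v2, w_direct).pvalue)
```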
Let (z_1,\ldots,z_k) be k independent complex random variables, circularly symmetric about their means \mu_i and with unit variances, \operatorname{E}\left|z_i-\mu_i\right|^2 = 1. Then the real random variable

S = \sum_{i=1}^{k}\left|z_i\right|^2

follows the complex noncentral chi-squared distribution {\chi'}^2, with density

f_S(S) = \left(\frac{S}{λ}\right)^{(k-1)/2} e^{-(S+λ)}\, I_{k-1}(2\sqrt{Sλ}),

where

λ = \sum_{i=1}^{k}\left|\mu_i\right|^2.
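A simulation sketch of this density (assuming SciPy and circularly symmetric complex Gaussians, i.e. independent real and imaginary parts each with variance 1/2, which is one concrete model satisfying E|z_i - \mu_i|^2 = 1; the means, seed, and binning are arbitrary):

```python
import numpy as np
from scipy import special

rng = np.random.default_rng(3)
k = 3
mu = np.array([1.0 + 0.5j, -0.3j, 0.8])       # arbitrary complex means
lam = np.sum(np.abs(mu) ** 2)

# Circularly symmetric complex Gaussians with E|z_i - mu_i|^2 = 1
z = mu + (rng.standard_normal((200_000, k))
          + 1j * rng.standard_normal((200_000, k))) / np.sqrt(2)
S = np.sum(np.abs(z) ** 2, axis=1)

# Compare a histogram of S with the stated density
hist, edges = np.histogram(S, bins=60, range=(0.0, 15.0), density=True)
centers = (edges[:-1] + edges[1:]) / 2
pdf = ((centers / lam) ** ((k - 1) / 2) * np.exp(-(centers + lam))
       * special.iv(k - 1, 2 * np.sqrt(centers * lam)))

print(np.max(np.abs(hist - pdf)))             # small (Monte Carlo and binning error)
```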
Sankaran (1963) discusses transformations of the form z = [(X-b)/(k+λ)]^{1/2}. He analyzes the expansions of the cumulants of z up to the term O((k+λ)^{-4}) and shows that the following choices of b produce reasonable results:

b = (k-1)/2 makes the second cumulant of z approximately independent of λ;

b = (k-1)/3 makes the third cumulant of z approximately independent of λ;

b = (k-1)/4 makes the fourth cumulant of z approximately independent of λ.
Also, a simpler transformation z_1 = (X-(k-1)/2)^{1/2} can be used as a variance-stabilizing transformation; it produces a random variable with mean (λ+(k-1)/2)^{1/2} and variance O((k+λ)^{-2}).
Usability of these transformations may be hampered by the need to take the square roots of negative numbers.
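A simulation sketch of the simpler variance-stabilizing transformation (assuming SciPy; the rare negative arguments under the square root, mentioned above, are simply dropped here, and the parameters and seed are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
k, lam = 10, 20.0                     # arbitrary example values

X = stats.ncx2.rvs(df=k, nc=lam, size=200_000, random_state=rng)
arg = X - (k - 1) / 2
z1 = np.sqrt(arg[arg >= 0])           # drop any negative arguments (see the note above)

print(z1.mean(), np.sqrt(lam + (k - 1) / 2))  # sample mean close to (lam + (k-1)/2)^(1/2)
```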
Name | Statistic
---|---
chi-squared distribution | \sum_{i=1}^k \left(\frac{X_i-\mu_i}{\sigma_i}\right)^2
noncentral chi-squared distribution | \sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2
chi distribution | \sqrt{\sum_{i=1}^k \left(\frac{X_i-\mu_i}{\sigma_i}\right)^2}
noncentral chi distribution | \sqrt{\sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2}
Two-sided normal regression tolerance intervals can be obtained based on the noncentral chi-squared distribution.[12] This enables the calculation of a statistical interval within which, with some confidence level, a specified proportion of a sampled population falls.