Sum of normally distributed random variables

In probability theory, calculation of the sum of normally distributed random variables is an instance of the arithmetic of random variables, which can be quite complex based on the probability distributions of the random variables involved and their relationships.

This is not to be confused with the sum of normal distributions which forms a mixture distribution.

Independent random variables

Let X and Y be independent random variables that are normally distributed (and therefore also jointly so); then their sum is also normally distributed, i.e., if

X \sim N(\mu_X, \sigma_X^2)
Y \sim N(\mu_Y, \sigma_Y^2)
Z = X + Y,

then

Z \sim N(\mu_X + \mu_Y,\ \sigma_X^2 + \sigma_Y^2).

This means that the sum of two independent normally distributed random variables is normal, with its mean being the sum of the two means, and its variance being the sum of the two variances (i.e., the square of the standard deviation is the sum of the squares of the standard deviations).[1]

In order for this result to hold, the assumption that X and Y are independent cannot be dropped, although it can be weakened to the assumption that X and Y are jointly, rather than separately, normally distributed.[2] (See here for an example.)

The result about the mean holds in all cases, while the result for the variance requires uncorrelatedness, but not independence.
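
For readers who want to verify this empirically, here is a minimal Monte Carlo sketch in Python; the parameter values are arbitrary illustrations, not anything prescribed by the text:

    import numpy as np

    rng = np.random.default_rng(0)      # seeded generator for reproducibility
    mu_x, sigma_x = 1.0, 2.0            # illustrative parameters
    mu_y, sigma_y = -3.0, 0.5

    x = rng.normal(mu_x, sigma_x, 1_000_000)
    y = rng.normal(mu_y, sigma_y, 1_000_000)   # drawn independently of x
    z = x + y

    print(z.mean())   # close to mu_x + mu_y = -2.0
    print(z.var())    # close to sigma_x**2 + sigma_y**2 = 4.25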

Proofs

Proof using characteristic functions

The characteristic function

\varphi_{X+Y}(t) = \operatorname{E}\left(e^{it(X+Y)}\right)

of the sum of two independent random variables X and Y is just the product of the two separate characteristic functions:

\varphi_X(t) = \operatorname{E}\left(e^{itX}\right), \qquad \varphi_Y(t) = \operatorname{E}\left(e^{itY}\right)

of X and Y.

The characteristic function of the normal distribution with expected value μ and variance σ² is

\varphi(t) = \exp\left(it\mu - \frac{\sigma^2 t^2}{2}\right).

So

\begin{aligned}
\varphi_{X+Y}(t) = \varphi_X(t)\varphi_Y(t)
&= \exp\left(it\mu_X - \frac{\sigma_X^2 t^2}{2}\right)\exp\left(it\mu_Y - \frac{\sigma_Y^2 t^2}{2}\right)\\
&= \exp\left(it(\mu_X + \mu_Y) - \frac{(\sigma_X^2 + \sigma_Y^2)t^2}{2}\right).
\end{aligned}

This is the characteristic function of the normal distribution with expected value \mu_X + \mu_Y and variance \sigma_X^2 + \sigma_Y^2.

Finally, recall that no two distinct distributions can both have the same characteristic function, so the distribution of X + Y must be just this normal distribution.
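
For a numerical cross-check of this argument, the sketch below estimates the characteristic function of X + Y by Monte Carlo and compares it with the closed form just derived; the parameters and the evaluation point t are arbitrary illustrations:

    import numpy as np

    rng = np.random.default_rng(0)              # seeded generator for reproducibility
    mu_x, sigma_x = 0.7, 1.3                    # illustrative parameters
    mu_y, sigma_y = -1.1, 0.6
    t = 0.9                                     # an arbitrary evaluation point

    z = rng.normal(mu_x, sigma_x, 1_000_000) + rng.normal(mu_y, sigma_y, 1_000_000)
    empirical = np.mean(np.exp(1j * t * z))     # Monte Carlo estimate of E[exp(itZ)]
    closed = np.exp(1j * t * (mu_x + mu_y) - (sigma_x**2 + sigma_y**2) * t**2 / 2)

    print(empirical, closed)                    # should agree to a few decimal places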

Proof using convolutions

For independent random variables X and Y, the distribution f_Z of Z = X + Y equals the convolution of f_X and f_Y:

f_Z(z) = \int_{-\infty}^{\infty} f_Y(z - x) f_X(x)\,dx

Given that f_X and f_Y are normal densities,

\begin{aligned}
f_X(x) &= \mathcal{N}(x; \mu_X, \sigma_X^2) = \frac{1}{\sqrt{2\pi}\,\sigma_X}\, e^{-(x-\mu_X)^2/(2\sigma_X^2)} \\
f_Y(y) &= \mathcal{N}(y; \mu_Y, \sigma_Y^2) = \frac{1}{\sqrt{2\pi}\,\sigma_Y}\, e^{-(y-\mu_Y)^2/(2\sigma_Y^2)}
\end{aligned}

Substituting into the convolution:

\begin{aligned}
f_Z(z) &= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma_Y} \exp\left[-\frac{(z-x-\mu_Y)^2}{2\sigma_Y^2}\right] \frac{1}{\sqrt{2\pi}\,\sigma_X} \exp\left[-\frac{(x-\mu_X)^2}{2\sigma_X^2}\right]\,dx\\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\sqrt{2\pi}\,\sigma_X\sigma_Y} \exp\left[-\frac{\sigma_X^2(z-x-\mu_Y)^2 + \sigma_Y^2(x-\mu_X)^2}{2\sigma_X^2\sigma_Y^2}\right]\,dx\\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\sqrt{2\pi}\,\sigma_X\sigma_Y} \exp\left[-\frac{\sigma_X^2(z^2 + x^2 + \mu_Y^2 - 2xz - 2z\mu_Y + 2x\mu_Y) + \sigma_Y^2(x^2 + \mu_X^2 - 2x\mu_X)}{2\sigma_Y^2\sigma_X^2}\right]\,dx\\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\sqrt{2\pi}\,\sigma_X\sigma_Y} \exp\left[-\frac{x^2(\sigma_X^2 + \sigma_Y^2) - 2x\left(\sigma_X^2(z-\mu_Y) + \sigma_Y^2\mu_X\right) + \sigma_X^2(z^2 + \mu_Y^2 - 2z\mu_Y) + \sigma_Y^2\mu_X^2}{2\sigma_Y^2\sigma_X^2}\right]\,dx
\end{aligned}

Defining \sigma_Z = \sqrt{\sigma_X^2 + \sigma_Y^2}, and completing the square:

\begin{aligned}
f_Z(z) &= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma_Z} \frac{1}{\sqrt{2\pi}\,\frac{\sigma_X\sigma_Y}{\sigma_Z}} \exp\left[-\frac{x^2 - 2x\frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_Z^2} + \frac{\sigma_X^2(z^2+\mu_Y^2-2z\mu_Y)+\sigma_Y^2\mu_X^2}{\sigma_Z^2}}{2\left(\frac{\sigma_X\sigma_Y}{\sigma_Z}\right)^2}\right]\,dx\\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma_Z} \frac{1}{\sqrt{2\pi}\,\frac{\sigma_X\sigma_Y}{\sigma_Z}} \exp\left[-\frac{\left(x - \frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_Z^2}\right)^2 - \left(\frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_Z^2}\right)^2 + \frac{\sigma_X^2(z-\mu_Y)^2+\sigma_Y^2\mu_X^2}{\sigma_Z^2}}{2\left(\frac{\sigma_X\sigma_Y}{\sigma_Z}\right)^2}\right]\,dx\\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma_Z} \exp\left[-\frac{\sigma_Z^2\left(\sigma_X^2(z-\mu_Y)^2 + \sigma_Y^2\mu_X^2\right) - \left(\sigma_X^2(z-\mu_Y) + \sigma_Y^2\mu_X\right)^2}{2\sigma_Z^2\left(\sigma_X\sigma_Y\right)^2}\right] \frac{1}{\sqrt{2\pi}\,\frac{\sigma_X\sigma_Y}{\sigma_Z}} \exp\left[-\frac{\left(x - \frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_Z^2}\right)^2}{2\left(\frac{\sigma_X\sigma_Y}{\sigma_Z}\right)^2}\right]\,dx\\
&= \frac{1}{\sqrt{2\pi}\,\sigma_Z} \exp\left[-\frac{(z - (\mu_X + \mu_Y))^2}{2\sigma_Z^2}\right] \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\frac{\sigma_X\sigma_Y}{\sigma_Z}} \exp\left[-\frac{\left(x - \frac{\sigma_X^2(z-\mu_Y)+\sigma_Y^2\mu_X}{\sigma_Z^2}\right)^2}{2\left(\frac{\sigma_X\sigma_Y}{\sigma_Z}\right)^2}\right]\,dx
\end{aligned}

The expression in the integral is a normal density distribution on x, so the integral evaluates to 1. The desired result follows:

f_Z(z) = \frac{1}{\sqrt{2\pi}\,\sigma_Z} \exp\left[-\frac{(z - (\mu_X + \mu_Y))^2}{2\sigma_Z^2}\right]
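
The convolution identity can also be checked numerically. The sketch below (with arbitrary illustrative parameters) discretizes the two densities on a grid, convolves them with np.convolve as a Riemann sum, and compares the result against the closed-form normal density:

    import numpy as np

    def normal_pdf(x, mu, sigma):
        # density of N(mu, sigma^2)
        return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

    mu_x, sigma_x = 1.0, 1.5    # illustrative parameters
    mu_y, sigma_y = -2.0, 0.8

    dx = 0.01
    x = np.arange(-20.0, 20.0, dx)              # grid wide enough to hold essentially all mass

    fz = np.convolve(normal_pdf(x, mu_x, sigma_x),
                     normal_pdf(x, mu_y, sigma_y)) * dx      # Riemann-sum convolution
    z = 2 * x[0] + dx * np.arange(fz.size)      # output grid of the linear convolution

    fz_exact = normal_pdf(z, mu_x + mu_y, np.hypot(sigma_x, sigma_y))
    print(np.max(np.abs(fz - fz_exact)))        # small: discretization error only
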
Using the convolution theorem

It can be shown that the Fourier transform of a Gaussian, f_X(x) = \mathcal{N}(x; \mu_X, \sigma_X^2), is[3]

\mathcal{F}\{f_X\} = F_X(\omega) = \exp\left[-j\omega\mu_X\right]\exp\left[-\tfrac{\sigma_X^2\omega^2}{2}\right]

By the convolution theorem:

\begin{aligned}
f_Z(z) &= (f_X * f_Y)(z) \\
&= \mathcal{F}^{-1}\big\{\mathcal{F}\{f_X\}\cdot\mathcal{F}\{f_Y\}\big\} \\
&= \mathcal{F}^{-1}\big\{\exp\left[-j\omega\mu_X\right]\exp\left[-\tfrac{\sigma_X^2\omega^2}{2}\right]\exp\left[-j\omega\mu_Y\right]\exp\left[-\tfrac{\sigma_Y^2\omega^2}{2}\right]\big\} \\
&= \mathcal{F}^{-1}\big\{\exp\left[-j\omega(\mu_X+\mu_Y)\right]\exp\left[-\tfrac{(\sigma_X^2+\sigma_Y^2)\omega^2}{2}\right]\big\} \\
&= \mathcal{N}(z; \mu_X+\mu_Y, \sigma_X^2+\sigma_Y^2)
\end{aligned}
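
The discrete analogue of this argument is a short computation with NumPy's FFT: transform, multiply, invert. A sketch on a symmetric grid, with arbitrary illustrative parameters (the fftshift realigns the circular convolution with the grid):

    import numpy as np

    def normal_pdf(x, mu, sigma):
        return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

    dx = 0.01
    x = np.arange(-40.0, 40.0, dx)      # symmetric grid, wide enough to avoid wraparound

    fx = normal_pdf(x, 1.0, 1.5)        # illustrative parameters
    fy = normal_pdf(x, -2.0, 0.8)

    # Convolution theorem on the grid: multiply FFTs, invert, rescale by dx.
    fz = np.real(np.fft.ifft(np.fft.fft(fx) * np.fft.fft(fy))) * dx
    fz = np.fft.fftshift(fz)            # recentre the circular convolution on the grid

    fz_exact = normal_pdf(x, 1.0 + (-2.0), np.hypot(1.5, 0.8))
    print(np.max(np.abs(fz - fz_exact)))    # small up to discretization error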

Geometric proof

First consider the normalized case when X, Y ~ N(0, 1), so that their PDFs are

f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}

and

g(y) = \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}.

Let Z = X + Y. Then the CDF for Z will be

z \mapsto \int_{x+y\leq z} f(x)g(y)\,dx\,dy.

This integral is over the half-plane which lies under the line x + y = z.

The key observation is that the function

f(x)g(y) = \frac{1}{2\pi}\, e^{-(x^2+y^2)/2}

is radially symmetric. So we rotate the coordinate plane about the origin, choosing new coordinates x', y' such that the line x + y = z is described by the equation x' = c, where c = c(z) is determined geometrically. Because of the radial symmetry, we have f(x)g(y) = f(x')g(y'), and the CDF for Z is

\int_{x'\leq c,\ y'\in\mathbb{R}} f(x')g(y')\,dx'\,dy'.

This is easy to integrate; we find that the CDF for Z is

\int_{-\infty}^{c(z)} f(x')\,dx' = \Phi(c(z)).

To determine the value c(z), note that we rotated the plane so that the line x + y = z now runs vertically with x'-intercept equal to c. So c is just the distance from the origin to the line x + y = z along the perpendicular bisector, which meets the line at its nearest point to the origin, in this case (z/2, z/2). So the distance is c = \sqrt{(z/2)^2 + (z/2)^2} = z/\sqrt{2}, and the CDF for Z is \Phi(z/\sqrt{2}), i.e., Z = X + Y \sim N(0, 2).
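
The conclusion \Phi(z/\sqrt{2}) is straightforward to confirm by simulation; a minimal sketch using SciPy's normal CDF, with test points chosen arbitrarily:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 1_000_000
    z = rng.normal(size=n) + rng.normal(size=n)     # Z = X + Y, with X, Y ~ N(0, 1)

    for q in (-2.0, 0.0, 1.5):                      # arbitrary test points
        print(q, np.mean(z <= q), norm.cdf(q / np.sqrt(2)))   # empirical CDF vs Phi(q/sqrt(2))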

Now, if a, b are any real constants (not both zero), then the probability that aX + bY \leq z is found by the same integral as above, but with the bounding line ax + by = z. The same rotation method works, and in this more general case we find that the closest point on the line to the origin is located a (signed) distance

\frac{z}{\sqrt{a^2 + b^2}}

away, so that

aX + bY \sim N(0,\ a^2 + b^2).

The same argument in higher dimensions shows that if

X_i \sim N(0, \sigma_i^2), \qquad i = 1, \dots, n,

then

X_1 + \cdots + X_n \sim N(0,\ \sigma_1^2 + \cdots + \sigma_n^2).

Now we are essentially done, because

X \sim N(\mu, \sigma^2) \Leftrightarrow \frac{1}{\sigma}(X - \mu) \sim N(0, 1).

So in general, if

X_i \sim N(\mu_i, \sigma_i^2), \qquad i = 1, \dots, n,

then

\sum_{i=1}^{n} a_i X_i \sim N\left(\sum_{i=1}^{n} a_i\mu_i,\ \sum_{i=1}^{n} (a_i\sigma_i)^2\right).
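
A direct numerical check of this general formula; the coefficients and parameters below are arbitrary illustrations:

    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([1.0, -2.0, 0.5])     # illustrative means
    sigma = np.array([1.0, 2.0, 0.3])   # illustrative standard deviations
    a = np.array([2.0, -1.0, 4.0])      # arbitrary coefficients

    samples = rng.normal(mu, sigma, size=(1_000_000, 3))   # independent X_i in each column
    combo = samples @ a                                    # sum_i a_i X_i, one value per draw

    print(combo.mean(), a @ mu)                            # mean: sum_i a_i mu_i
    print(combo.std(), np.sqrt(np.sum((a * sigma)**2)))    # std: sqrt(sum_i (a_i sigma_i)^2)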

Correlated random variables

In the event that the variables X and Y are jointly normally distributed random variables, then X + Y is still normally distributed (see Multivariate normal distribution) and the mean is the sum of the means. However, the variances are not additive due to the correlation. Indeed,

\sigma_{X+Y} = \sqrt{\sigma_X^2 + \sigma_Y^2 + 2\rho\sigma_X\sigma_Y},

where ρ is the correlation. In particular, whenever ρ < 0, then the variance is less than the sum of the variances of X and Y.

Extensions of this result can be made for more than two random variables, using the covariance matrix.
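
Both the formula above and its covariance-matrix view can be spot-checked by sampling from a bivariate normal; a minimal sketch with arbitrary illustrative parameters:

    import numpy as np

    rng = np.random.default_rng(0)
    sigma_x, sigma_y, rho = 1.0, 2.0, -0.6     # illustrative parameters

    cov = [[sigma_x**2,              rho * sigma_x * sigma_y],
           [rho * sigma_x * sigma_y, sigma_y**2]]
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)

    z = xy[:, 0] + xy[:, 1]                    # Z = X + Y
    print(z.std(), np.sqrt(sigma_x**2 + sigma_y**2 + 2 * rho * sigma_x * sigma_y))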

Proof

In this case (with X and Y having zero means), one needs to consider

\frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}} \iint_{x\,y} \exp\left[-\frac{1}{2(1-\rho^2)}\left(\frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2} - \frac{2\rho xy}{\sigma_x\sigma_y}\right)\right]\delta(z-(x+y))\,\mathrm{d}x\,\mathrm{d}y.

As above, one makes the substitution y \rightarrow z - x.

This integral is more complicated to simplify analytically, but can be done easily using a symbolic mathematics program. The probability distribution f_Z(z) is given in this case by

f_Z(z) = \frac{1}{\sqrt{2\pi}\,\sigma_+}\exp\left(-\frac{z^2}{2\sigma_+^2}\right)

where

\sigma_+ = \sqrt{\sigma_x^2 + \sigma_y^2 + 2\rho\sigma_x\sigma_y}.

If one considers instead Z = X − Y, then one obtains

f_Z(z) = \frac{1}{\sqrt{2\pi(\sigma_x^2+\sigma_y^2-2\rho\sigma_x\sigma_y)}}\exp\left(-\frac{z^2}{2(\sigma_x^2+\sigma_y^2-2\rho\sigma_x\sigma_y)}\right)

which can also be rewritten with

\sigma_- = \sqrt{\sigma_x^2 + \sigma_y^2 - 2\rho\sigma_x\sigma_y}.

The standard deviations of each distribution are obvious by comparison with the standard normal distribution.
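
The \sigma_- formula admits the same kind of sampling spot check as \sigma_+ above, again with arbitrary illustrative parameters:

    import numpy as np

    rng = np.random.default_rng(0)
    sigma_x, sigma_y, rho = 1.0, 2.0, 0.7      # illustrative parameters

    cov = [[sigma_x**2,              rho * sigma_x * sigma_y],
           [rho * sigma_x * sigma_y, sigma_y**2]]
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)

    d = xy[:, 0] - xy[:, 1]                    # Z = X - Y
    print(d.std(), np.sqrt(sigma_x**2 + sigma_y**2 - 2 * rho * sigma_x * sigma_y))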

References

  1. ^ Lemons, Don S. (2002), An Introduction to Stochastic Processes in Physics, The Johns Hopkins University Press, p. 34, ISBN 0-8018-6866-1
  2. ^ Lemons (2002), pp. 35–36
  3. ^ Derpanis, Konstantinos G. (October 20, 2005). "Fourier Transform of the Gaussian" (PDF).

See also

  • Propagation of uncertainty
  • Algebra of random variables
  • Stable distribution
  • Standard error (statistics)
  • Ratio distribution
  • Product distribution
  • Slash distribution
  • List of convolutions of probability distributions
