Statistical Inference 1~4
1 Probability Theory
1.1 Set Theory
Definition 1.1.1 The set, S, of all possible outcomes of a particular experiment is called the sample space for the experiment.
Definition 1.1.2 An event is any collection of possible outcomes of an experiment, that is, any subset of $S$ (including $S$ itself).
Definition 1.1.5 Two events $A$ and $B$ are disjoint (or mutually exclusive) if $A\cap B=\emptyset$. The events $A_1,A_2,\ldots$ are pairwise disjoint if $A_i\cap A_j=\emptyset$ for all $i\neq j$.
Definition 1.1.6 If $A_1,A_2,\ldots$ are pairwise disjoint and $\bigcup_{i=1}^{\infty}A_i=S$, then the collection $A_1,A_2,\ldots$ forms a partition of $S$.
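For a finite sample space, Definitions 1.1.5 and 1.1.6 can be checked mechanically. A small sketch (the helper name and example sets are my own, not from the text):

```python
# Check whether a collection of events forms a partition of a sample space S:
# the events must be pairwise disjoint and their union must equal S.
from itertools import combinations

def is_partition(collection, S):
    """Return True iff the sets in `collection` partition S."""
    pairwise_disjoint = all(a & b == set() for a, b in combinations(collection, 2))
    covers_S = set().union(*collection) == S
    return pairwise_disjoint and covers_S

S = {1, 2, 3, 4, 5, 6}               # sample space of a die roll
A = [{1, 2}, {3, 4}, {5, 6}]         # pairwise disjoint, union is S: a partition
B = [{1, 2, 3}, {3, 4}, {5, 6}]      # {1,2,3} and {3,4} overlap: not a partition
```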
1.2 Basics of Probability Theory
Definition 1.2.1 A collection of subsets of $S$ is called a sigma algebra (or Borel field), denoted by $\mathcal{B}$, if it satisfies the following three properties:
 a. $\emptyset\in\mathcal{B}$ (the empty set is an element of $\mathcal{B}$).
 b. If $A\in\mathcal{B}$, then $A^{c}\in\mathcal{B}$ ($\mathcal{B}$ is closed under complementation).
 c. If $A_{1},A_{2},\ldots\in\mathcal{B}$, then $\bigcup_{i=1}^{\infty}A_{i}\in\mathcal{B}$ ($\mathcal{B}$ is closed under countable unions).
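For a finite $S$ the three properties can be verified by brute force; for a finite collection, closure under countable unions reduces to closure under pairwise unions. A minimal illustration (the helpers are my own construction, not from the text):

```python
# Verify the three sigma-algebra properties of Definition 1.2.1 for a
# collection B of subsets of a finite S.  Sets are encoded as frozensets.
from itertools import chain, combinations

def powerset(S):
    s = list(S)
    return {frozenset(c)
            for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))}

def is_sigma_algebra(B, S):
    S = frozenset(S)
    has_empty = frozenset() in B                               # (a)
    closed_complement = all(S - A in B for A in B)             # (b)
    closed_union = all(A1 | A2 in B for A1 in B for A2 in B)   # (c), finite case
    return has_empty and closed_complement and closed_union

S = {1, 2, 3}
# The power set is always a sigma algebra; {∅, {1}, S} is not, since {1}^c = {2,3} is missing.
```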
Definition 1.2.4 Given a sample space $S$ and an associated sigma algebra $\mathcal{B}$, a probability function is a function $P$ with domain $\mathcal{B}$ that satisfies
 1. $P(A)\geq 0$ for all $A\in\mathcal{B}$.
 2. $P(S)=1$.
 3. If $A_{1},A_{2},\ldots\in\mathcal{B}$ are pairwise disjoint, then $P\big(\bigcup_{i=1}^{\infty}A_{i}\big)=\sum_{i=1}^{\infty}P(A_{i})$.
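A quick numerical check (example of my own choosing): for equally likely outcomes, $P(A)=|A|/|S|$ satisfies all three axioms.

```python
# For a fair die, P(A) = |A| / |S| is a probability function: nonnegative,
# P(S) = 1, and finitely additive over pairwise disjoint events.
S = {1, 2, 3, 4, 5, 6}
P = lambda A: len(A) / len(S)

disjoint = [{1}, {2, 3}, {5}]        # pairwise disjoint events
union = set().union(*disjoint)       # their union {1, 2, 3, 5}
```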
1.3 Conditional Probability and Independence
1.4 Random Variables
1.5 Distribution Functions
1.6 Density and Mass Functions
1.8 Miscellanea
2 Transformations and Expectations
2.1 Distribution of Functions of a Random Variable
Theorem 2.1.4 Let $X$ have pdf $f_X(x)$ and let $Y=g(X)$, where $g$ is a monotone function. Let $\mathcal{X}$ and $\mathcal{Y}$ be defined by (2.1.7). Suppose that $f_X(x)$ is continuous on $\mathcal{X}$ and that $g^{-1}(y)$ has a continuous derivative on $\mathcal{Y}$. Then the pdf of $Y$ is given by $$f_Y(y)=\begin{cases}f_X\big(g^{-1}(y)\big)\left|\dfrac{d}{dy}g^{-1}(y)\right| & y\in\mathcal{Y}\\ 0 & \text{otherwise.}\end{cases}$$
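A Monte Carlo sanity check (an example of my own, assuming $X\sim\text{Uniform}(0,1)$ and the monotone map $g(x)=x^2$ on $(0,1)$): the theorem gives $f_Y(y)=f_X(\sqrt{y})\,|\frac{d}{dy}\sqrt{y}|=\frac{1}{2\sqrt{y}}$, so $F_Y(c)=\sqrt{c}$, which simulated data should match.

```python
# Simulate Y = X^2 with X ~ Uniform(0,1) and compare the empirical
# probability P(Y <= 0.25) with the analytic value F_Y(0.25) = sqrt(0.25) = 0.5.
import math
import random

random.seed(0)
samples = [random.random() ** 2 for _ in range(200_000)]
empirical = sum(y <= 0.25 for y in samples) / len(samples)
analytic = math.sqrt(0.25)   # = 0.5 from Theorem 2.1.4
```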
2.2 Expected Values
2.3 Moments and Moment Generating Functions
Definition 2.3.6 Let $X$ be a random variable with cdf $F_X$. The moment generating function (mgf) of $X$ (or $F_X$), denoted by $M_X(t)$, is $$M_X(t)=\mathrm{E}\,e^{tX},$$ provided that the expectation exists for $t$ in some neighborhood of 0.
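A small check (Bernoulli example of my own choosing): the mgf of a Bernoulli($p$) variable is $M_X(t)=(1-p)+pe^{t}$, and differentiating at $t=0$ recovers the first moment $\mathrm{E}X=p$, which is the central use of the mgf.

```python
# The derivative of the Bernoulli mgf at 0, approximated by a central
# difference, should equal the mean EX = p.
import math

p = 0.3
M = lambda t: (1 - p) + p * math.exp(t)     # Bernoulli(p) mgf
h = 1e-6
first_moment = (M(h) - M(-h)) / (2 * h)     # numerical M'(0)
```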
2.4 Differentiating Under an Integral Sign
Theorem 2.4.1 (Leibnitz’s Rule) If $f(x,\theta)$, $a(\theta)$, and $b(\theta)$ are differentiable with respect to $\theta$, then $$ \frac{d}{d\theta}\int_{a(\theta)}^{b(\theta)}f(x,\theta)\,dx=f(b(\theta),\theta)\frac{d}{d\theta}b(\theta)-f(a(\theta),\theta)\frac{d}{d\theta}a(\theta)+\int_{a(\theta)}^{b(\theta)}\frac{\partial}{\partial\theta}f(x,\theta)\,dx $$
Notice that if $a(\theta)$ and $b(\theta)$ are constant, we have a special case of Leibnitz’s Rule: $$\frac{d}{d\theta}\int_{a}^{b}f(x,\theta)\,dx=\int_{a}^{b}\frac{\partial}{\partial\theta}f(x,\theta)\,dx$$
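The special case can be verified numerically. An illustration (integrand and quadrature rule chosen by me): for $f(x,\theta)=e^{\theta x}$ on $[0,1]$, the derivative of the integral should equal the integral of the derivative.

```python
# Compare d/dtheta ∫_0^1 e^{theta x} dx (central difference of a midpoint-rule
# integral) with ∫_0^1 x e^{theta x} dx (midpoint rule applied directly).
import math

def midpoint(f, a, b, n=10_000):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

theta, eps = 0.7, 1e-5
I = lambda t: midpoint(lambda x: math.exp(t * x), 0.0, 1.0)
lhs = (I(theta + eps) - I(theta - eps)) / (2 * eps)          # d/dtheta of the integral
rhs = midpoint(lambda x: x * math.exp(theta * x), 0.0, 1.0)  # integral of d/dtheta
```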
2.6 Miscellanea
3 Common Families of Distributions
3.1 Introduction
3.2 Discrete Distributions
Discrete Uniform Distribution
Hypergeometric Distribution
Binomial Distribution
Poisson Distribution
Negative Binomial Distribution
Geometric Distribution
3.3 Continuous Distributions
Uniform Distribution
Gamma Distribution
$$f(x\mid\alpha,\beta)=\frac{1}{\Gamma(\alpha)\beta^{\alpha}}x^{\alpha-1}e^{-x/\beta},\quad 0<x<\infty,\quad \alpha>0,\ \beta>0$$
Normal Distribution
Beta Distribution
$$f(x\mid\alpha,\beta)=\frac{1}{B(\alpha,\beta)}x^{\alpha-1}(1-x)^{\beta-1},\quad 0<x<1,\quad \alpha>0,\ \beta>0$$
$$B(\alpha,\beta)=\int_{0}^{1}x^{\alpha-1}(1-x)^{\beta-1}\,dx=\frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}$$
Cauchy Distribution
Lognormal Distribution
Exponential Distribution
Double Exponential Distribution
3.4 Exponential Families
A family of pdfs or pmfs is called an exponential family if it can be expressed as $$f(x\mid\boldsymbol{\theta})=h(x)c(\boldsymbol{\theta})\exp\Big(\sum_{i=1}^{k}w_i(\boldsymbol{\theta})t_i(x)\Big)$$
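A worked instance (parameters of my own choosing): the binomial($n,p$) pmf is an exponential family with $h(x)=\binom{n}{x}$, $c(p)=(1-p)^{n}$, $w(p)=\log\frac{p}{1-p}$, and $t(x)=x$, since $\binom{n}{x}p^{x}(1-p)^{n-x}=\binom{n}{x}(1-p)^{n}\exp\!\big(x\log\frac{p}{1-p}\big)$.

```python
# Check that the direct binomial pmf and its exponential-family form agree
# at every support point x = 0, ..., n.
import math

n, p = 10, 0.4
direct = lambda x: math.comb(n, x) * p**x * (1 - p)**(n - x)
expfam = lambda x: (math.comb(n, x)                       # h(x)
                    * (1 - p)**n                          # c(p)
                    * math.exp(math.log(p / (1 - p)) * x))  # exp(w(p) t(x))
```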
3.5 Location and Scale Families
3.6 Inequalities and Identities
3.6.1 Probability Inequalities
Theorem 3.6.1 (Chebychev’s Inequality) Let $X$ be a random variable and let $g(x)$ be a nonnegative function. Then, for any $r>0$, $$P\big(g(X)\geq r\big)\leq\frac{\mathrm{E}\,g(X)}{r}$$
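A Monte Carlo sketch (distribution and constants chosen by me): taking $g(x)=(x-\mu)^2/\sigma^2$ and $r=k^2$ specializes the theorem to the familiar form $P(|X-\mu|\geq k\sigma)\leq 1/k^2$. For a standard normal and $k=2$ the bound is $0.25$, while the true tail probability is about $0.046$.

```python
# Estimate P(|X - mu| >= k*sigma) for X ~ N(0, 1) and confirm it sits well
# under the Chebychev bound 1/k^2 = 0.25.
import random

random.seed(1)
mu, sigma, k = 0.0, 1.0, 2.0
xs = [random.gauss(mu, sigma) for _ in range(100_000)]
tail = sum(abs(x - mu) >= k * sigma for x in xs) / len(xs)
```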
3.6.2 Identities
3.8 Miscellanea
3.8.2 Chebychev and Beyond
4 Multiple Random Variables
4.1 Joint and Marginal Distributions
The marginal pmf of $X$: $f_X(x)=\sum_{y}f_{X,Y}(x,y)$.
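A tiny concrete example (joint pmf values of my own choosing): summing a joint pmf table over $y$ produces the marginal pmf of $X$.

```python
# Marginalize a joint pmf f_{X,Y}(x, y), stored as a dict keyed by (x, y),
# by summing over all values of y.
joint = {(0, 0): 0.1, (0, 1): 0.2,
         (1, 0): 0.3, (1, 1): 0.4}

marginal_X = {}
for (x, y), prob in joint.items():
    marginal_X[x] = marginal_X.get(x, 0.0) + prob
```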
4.2 Conditional Distributions and Independence
4.3 Bivariate Transformations
4.4 Hierarchical Models and Mixture Distributions
Theorem 4.4.3 If $X$ and $Y$ are any two random variables, then $$\mathrm{E}X=\mathrm{E}\big(\mathrm{E}(X\mid Y)\big),$$ provided that the expectations exist.
Theorem 4.4.7 For any two random variables $X$ and $Y$, $$\operatorname{Var}X=\mathrm{E}\big(\operatorname{Var}(X\mid Y)\big)+\operatorname{Var}\big(\mathrm{E}(X\mid Y)\big),$$ provided that the expectations exist.
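Both identities can be verified exactly on a small discrete joint pmf (the numbers below are my own). The check computes $\mathrm{E}X$ and $\operatorname{Var}X$ directly, then rebuilds them from the conditional mean and variance of $X$ given each value of $Y$.

```python
# Verify EX = E(E(X|Y)) and Var X = E(Var(X|Y)) + Var(E(X|Y)) on a 2x2 joint pmf.
joint = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.4}

EX = sum(x * p for (x, y), p in joint.items())
EX2 = sum(x * x * p for (x, y), p in joint.items())
VarX = EX2 - EX**2

ys = {y for (_, y) in joint}
pY = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in ys}
cmean = {y: sum(x * p for (x, yy), p in joint.items() if yy == y) / pY[y] for y in ys}
cm2 = {y: sum(x * x * p for (x, yy), p in joint.items() if yy == y) / pY[y] for y in ys}
cvar = {y: cm2[y] - cmean[y]**2 for y in ys}

E_cond_mean = sum(cmean[y] * pY[y] for y in ys)                        # E(E(X|Y))
decomposed = (sum(cvar[y] * pY[y] for y in ys)                         # E(Var(X|Y))
              + sum((cmean[y] - E_cond_mean)**2 * pY[y] for y in ys))  # + Var(E(X|Y))
```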
4.5 Covariance and Correlation
covariance: $\operatorname{Cov}(X,Y)=\mathrm{E}\big[(X-\mu_X)(Y-\mu_Y)\big]$
correlation: $\rho_{XY}=\dfrac{\operatorname{Cov}(X,Y)}{\sigma_X\sigma_Y}$
Theorem 4.5.3 For any random variables $X$ and $Y$, $\operatorname{Cov}(X,Y)=\mathrm{E}XY-\mu_X\mu_Y$.
Theorem 4.5.5 If $X$ and $Y$ are independent random variables, then $\operatorname{Cov}(X,Y)=0$ and $\rho_{XY}=0$.
Theorem 4.5.6 If $X$ and $Y$ are any two random variables and $a$ and $b$ are any two constants, then $$\operatorname{Var}(aX+bY)=a^{2}\operatorname{Var}X+b^{2}\operatorname{Var}Y+2ab\operatorname{Cov}(X,Y)$$ If $X$ and $Y$ are independent random variables, then $\operatorname{Var}(aX+bY)=a^{2}\operatorname{Var}X+b^{2}\operatorname{Var}Y$.
Theorem 4.5.7 For any random variables $X$ and $Y$,
 a. $-1\leq\rho_{XY}\leq 1$.
 b. $|\rho_{XY}|=1$ if and only if there exist numbers $a\neq 0$ and $b$ such that $P(Y=aX+b)=1$. If $\rho_{XY}=1$, then $a>0$, and if $\rho_{XY}=-1$, then $a<0$.
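Part (b) can be seen directly in computation (data of my own choosing): when $Y=aX+b$ exactly with $a<0$, the sample correlation comes out as $-1$.

```python
# For the exact linear relation Y = aX + b with a = -2 < 0, the correlation
# coefficient rho = Cov(X,Y) / (sigma_X sigma_Y) equals -1.
from statistics import mean

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
a, b = -2.0, 3.0
ys = [a * x + b for x in xs]

mx, my = mean(xs), mean(ys)
cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
sx = mean((x - mx) ** 2 for x in xs) ** 0.5
sy = mean((y - my) ** 2 for y in ys) ** 0.5
rho = cov / (sx * sy)
```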
4.6 Multivariate Distributions
4.7 Inequalities
4.7.1 Numerical Inequalities
Lemma 4.7.1 Let $a$ and $b$ be any positive numbers, and let $p$ and $q$ be any positive numbers (necessarily greater than 1) satisfying $$\frac{1}{p}+\frac{1}{q}=1$$ Then $$\frac{1}{p}a^{p}+\frac{1}{q}b^{q}\geq ab,$$ with equality if and only if $a^{p}=b^{q}$.
Theorem 4.7.2 (Hölder’s Inequality) Let $X$ and $Y$ be any two random variables, and let $p$ and $q$ satisfy $\frac{1}{p}+\frac{1}{q}=1$. Then $$|\mathrm{E}XY|\leq\mathrm{E}|XY|\leq\big(\mathrm{E}|X|^{p}\big)^{1/p}\big(\mathrm{E}|Y|^{q}\big)^{1/q}$$
Theorem 4.7.5 (Minkowski’s Inequality) Let $X$ and $Y$ be any two random variables. Then for $1\leq p<\infty$, $$\big[\mathrm{E}|X+Y|^{p}\big]^{1/p}\leq\big[\mathrm{E}|X|^{p}\big]^{1/p}+\big[\mathrm{E}|Y|^{p}\big]^{1/p}$$
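A numeric check (data of my own choosing): with $p=q=2$, Hölder's Inequality is the Cauchy–Schwarz Inequality $\mathrm{E}|XY|\leq(\mathrm{E}X^{2})^{1/2}(\mathrm{E}Y^{2})^{1/2}$, verified here for a uniform distribution over five sample points.

```python
# Cauchy-Schwarz (Holder with p = q = 2) on a finite, equally weighted sample.
xs = [1.0, -2.0, 3.0, 0.5, -1.5]
ys = [2.0, 1.0, -1.0, 4.0, 0.0]

E_absXY = sum(abs(x * y) for x, y in zip(xs, ys)) / len(xs)
bound = ((sum(x * x for x in xs) / len(xs)) ** 0.5
         * (sum(y * y for y in ys) / len(ys)) ** 0.5)
```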
4.7.2 Functional Inequalities
Theorem 4.7.7 (Jensen’s Inequality) For any random variable $X$, if $g(x)$ is a convex function, then $$\mathrm{E}\,g(X)\geq g(\mathrm{E}X)$$
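A one-line consequence (example of my own): with the convex function $g(x)=x^{2}$, Jensen's Inequality says $\mathrm{E}X^{2}\geq(\mathrm{E}X)^{2}$, i.e. $\operatorname{Var}X\geq 0$.

```python
# Check E g(X) >= g(EX) for g(x) = x^2 on a uniform distribution over four points.
xs = [1.0, 4.0, 2.0, 7.0]
EX = sum(xs) / len(xs)
Eg = sum(x * x for x in xs) / len(xs)   # E X^2
```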
Theorem 4.7.9 (Covariance Inequality) Let $X$ be any random variable and $g(x)$ and $h(x)$ any functions such that $\mathrm{E}\,g(X)$, $\mathrm{E}\,h(X)$, and $\mathrm{E}\big(g(X)h(X)\big)$ exist.
 a. If $g(x)$ is a nondecreasing function and $h(x)$ is a nonincreasing function, then $$\mathrm{E}\big(g(X)h(X)\big)\leq\big(\mathrm{E}\,g(X)\big)\big(\mathrm{E}\,h(X)\big)$$
 b. If $g(x)$ and $h(x)$ are either both nondecreasing or both nonincreasing, then $$\mathrm{E}\big(g(X)h(X)\big)\geq\big(\mathrm{E}\,g(X)\big)\big(\mathrm{E}\,h(X)\big)$$
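A quick check of part (a) (functions and data chosen by me): with $g(x)=x$ nondecreasing and $h(x)=-x$ nonincreasing, the inequality says $\mathrm{E}\big(g(X)h(X)\big)\leq\mathrm{E}\,g(X)\,\mathrm{E}\,h(X)$, i.e. $\operatorname{Cov}\big(g(X),h(X)\big)\leq 0$.

```python
# Covariance Inequality, part (a), for X uniform over four points with
# g(x) = x (nondecreasing) and h(x) = -x (nonincreasing).
xs = [1.0, 2.0, 5.0, 9.0]
g = lambda x: x
h = lambda x: -x

Egh = sum(g(x) * h(x) for x in xs) / len(xs)
EgEh = (sum(g(x) for x in xs) / len(xs)) * (sum(h(x) for x in xs) / len(xs))
```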