Many students find that the obligatory Statistics course comes as a shock: the set textbook is difficult, the curriculum is vast, and secondary-school maths feels infinitely far away. "Statistics" offers friendly instruction on the core areas of the subject. The focus is on overview, and the numerous examples give the reader a "recipe" for solving all the common types of exercise.
Contents:
- Preface
- Basic concepts of probability theory
- Probability space, probability function, sample space, event
- Conditional probability
- Independent events
- The Inclusion-Exclusion Formula
- Binomial coefficients
- Multinomial coefficients
- Random variables
- Random variables, definition
- The distribution function
- Discrete random variables, point probabilities
- Continuous random variables, density function
- Continuous random variables, distribution function
- Independent random variables
- Random vector, simultaneous density and distribution function
- Expected value and variance
- Expected value of random variables
- Variance and standard deviation of random variables
- Example (computation of expected value, variance and standard deviation)
- Estimation of expected value µ and standard deviation σ by eye
- Addition and multiplication formulas for expected value and variance
- Covariance and correlation coefficient
- The Law of Large Numbers
- Chebyshev’s Inequality
- The Law of Large Numbers
- The Central Limit Theorem
- Example (distribution functions converge to Φ)
- Descriptive statistics
- Median and quartiles
- Mean value
- Empirical variance and empirical standard deviation
- Empirical covariance and empirical correlation coefficient
- Statistical hypothesis testing
- Null hypothesis and alternative hypothesis
- Significance probability and significance level
- Errors of type I and II
- Example
- The binomial distribution Bin(n, p)
- Parameters
- Description
- Point probabilities
- Expected value and variance
- Significance probabilities for tests in the binomial distribution
- The normal approximation to the binomial distribution
- Estimators
- Confidence intervals
- The Poisson distribution Pois(λ)
- Parameters
- Description
- Point probabilities
- Expected value and variance
- Addition formula
- Significance probabilities for tests in the Poisson distribution
- Example (significant increase in sale of Skodas)
- The binomial approximation to the Poisson distribution
- The normal approximation to the Poisson distribution
- Example (significant decrease in number of complaints)
- Estimators
- Confidence intervals
- The geometrical distribution Geo(p)
- Parameters
- Description
- Point probabilities and tail probabilities
- Expected value and variance
- The hypergeometrical distribution HG(n, r, N)
- Parameters
- Description
- Point probabilities and tail probabilities
- Expected value and variance
- The binomial approximation to the hypergeometrical distribution
- The normal approximation to the hypergeometrical distribution
- The multinomial distribution Mult(n, p₁, …, pᵣ)
- Parameters
- Description
- Point probabilities
- Estimators
- The negative binomial distribution NB(n, p)
- Parameters
- Description
- Point probabilities
- Expected value and variance
- Estimators
- The exponential distribution Exp(λ)
- Parameters
- Description
- Density and distribution function
- Expected value and variance
- The normal distribution
- Parameters
- Description
- Density and distribution function
- The standard normal distribution
- Properties of Φ
- Estimation of the expected value µ
- Estimation of the variance σ²
- Confidence intervals for the expected value µ
- Confidence intervals for the variance σ² and the standard deviation σ
- Addition formula
- Distributions connected with the normal distribution
- The χ²-distribution
- Student’s t-distribution
- Fisher’s F-distribution
- Tests in the normal distribution
- One sample, known variance, H₀: µ = µ₀
- One sample, unknown variance, H₀: µ = µ₀ (Student’s t-test)
- One sample, unknown expected value, H₀: σ² = σ₀²
- Example
- Two samples, known variances, H₀: µ₁ = µ₂
- Two samples, unknown variances, H₀: µ₁ = µ₂ (Fisher-Behrens)
- Two samples, unknown expected values, H₀: σ₁² = σ₂²
- Two samples, unknown common variance, H₀: µ₁ = µ₂
- Example (comparison of two expected values)
- Analysis of Variance (ANOVA)
- Aim and motivation
- k samples, unknown common variance, H₀: µ₁ = ⋯ = µₖ
- Two examples (comparison of mean values from 3 samples)
- The chi-square test (or χ²-test)
- χ²-test for equality of distribution
- The assumption of normal distribution
- Standardised residuals
- Example (women with 5 children)
- Example (election)
- Example (deaths in the Prussian cavalry)
- Contingency tables
- Definition, method
- Standardised residuals
- Example (students’ political orientation)
- χ²-test for 2 × 2 tables
- Fisher’s exact test for 2 × 2 tables
- Example (Fisher’s exact test)
- Distribution-free tests
- Wilcoxon’s test for one set of observations
- Example
- The normal approximation to Wilcoxon’s test for one set of observations
- Wilcoxon’s test for two sets of observations
- The normal approximation to Wilcoxon’s test for two sets of observations
- Linear regression
- The model
- Estimation of the parameters β₀ and β₁
- The distribution of the estimators
- Predicted values ŷᵢ and residuals êᵢ
- Estimation of the variance σ²
- Confidence intervals for the parameters β₀ and β₁
- The coefficient of determination R²
- Predictions and prediction intervals
- Overview of formulas
- Example
- A Overview of discrete distributions
- B Tables
- B.1 How to read the tables
- B.2 The standard normal distribution
- B.3 The χ²-distribution (values x with Fχ²(x) = 0.500 etc.)
- B.4 Student’s t-distribution (values x with FStudent(x) = 0.600 etc.)
- B.5 Fisher’s F-distribution (values x with FFisher(x) = 0.90)
- B.6 Fisher’s F-distribution (values x with FFisher(x) = 0.95)
- B.7 Fisher’s F-distribution (values x with FFisher(x) = 0.99)
- B.8 Wilcoxon’s test for one set of observations
- B.9 Wilcoxon’s test for two sets of observations, α = 5%
- C Explanation of symbols
- D Index