\"Programmer\'s Mathematics 2: Probability and Statistics\" follows the approachable style of \"Programmer\'s Mathematics\". It uses popular language and specific charts to explain in depth the various probability and statistics knowledge that programmers must master. It is rich in examples, clear in explanation, and provides a lot of extended content to guide readers to further study. \"Programmer\'s Mathematics 2: Probability and Statistics\" involves random variables, Bayesian formula, probability distribution of discrete and continuous values, covariance matrix, multivariate normal distribution, estimation and test theory, pseudo-random numbers, and various applications of probability theory. It is suitable for programmers and mathematics enthusiasts to read, and can also be used as an introductory reading on probability theory for high school or university students who are not majoring in mathematics. Part 1 Let’s talk about probability Chapter 1 The definition of probability 3 1.1 The mathematical definition of probability 3 1.2 The three doors (Monty Hall problem) — From the perspective of an airship 4 1.2.1 The Monty Hall problem 5 1.2.2 Correct answers and common mistakes 6 1.2.3 From the perspective of an airship 6 1.3 Triplet (Ω, F, P) — From the perspective of God 9 1.4 Random variables 13 1.5 Probability distributions 17 1.6 Shorthand for practical use 19 1.6.1 Random variables 19 1.6.2 Probability 20 1.7 Behind the scenes 21 1.7.1 Don’t worry about anything 21 1.7.2 The customary way of dealing with 22 1.7.3 Probability theory without (without the perspective of God) 23 1.8 Some precautions 23 1.8.1 What I want to do 23 1.8.2 Because it is the area… 24 1.8.3 Explanation 26 Chapter 2 Relationships between Multiple Random Variables 29 2.1 Land Use in Each County (Warm-up for Area Calculation) 29 2.1.1 Statistics for Different Counties and Different Uses (Warm-up for Joint and Marginal Probabilities) 30 2.1.2 Proportions of Specific Counties and Specific Uses (Warm-up for Conditional Probabilities) 31 2.1.3 Back-calculating Proportions (Warm-up for Bayesian Formula) 32 2.1.4 When Proportions Are Equal (Warm-up for Independence) 34 2.1.5 End of Warm-up 38 2.2 Joint and Marginal Probabilities 38 2.2.1 Two Random Variables 38 2.2.2 Three Random Variables 41 2.3 Conditional Probability 42 2.3.1 Definition of Conditional Probability 42 2.3.2 Relationship between Joint, Marginal, and Conditional Distributions 45 2.3.3 Conditions Still Work Even When the Equal Sign Is Not Used 50 2.3.4 2.4.2 Bayes’ Plot 57 2.4.3 Bayes’ Plot 61 2.5 Independence 63 2.5.1 Independence of Events (Definition) 64 2.5.2 Independence of Events (Equivalent Statement) 67 2.5.3 Independence of Random Variables 70 2.5.4 Independence of Three or More Random Variables (More Attention Needed) 73 Chapter 3 Probability Distribution of Discrete Values 79 3.1 Some Simple Examples 79 3.2 Binomial Distribution 82 3.2.1 Derivation of Binomial Distribution 82 3.2.2 Supplement: Permutation nPk, Combination nCk 83 3.3 Expected Value 85 3.3.1 Definition of Expected Value 85 3.3.2 Basic Properties of Expected Value 87 3.3.3 3.4.1 Even if the expected values are the same 99 3.4.2 Variance is the expected value of the “expected value dispersion” 100 3.4.3 Standard deviation 102 3.4.4 Addition, multiplication and standardization of constants 104 3.4.5 When the terms are independent, the variance of the sum is equal to the sum of the variances 108 3.4.6 Expected value and variance of squares 110 3.5 The law of large numbers 112 
3.5.1 Independent and identically distributed 114
3.5.2 Expected value of the mean and variance of the mean 116
3.5.3 The law of large numbers 117
3.5.4 Notes on the law of large numbers 118
3.6 Supplementary content: conditional expectation and the least squares method 120
3.6.1 Definition of conditional expectation 120
3.6.2 The least squares method 121
3.6.3 The God's-eye view 122
3.6.4 Conditional variance 123

Chapter 4 Probability Distributions of Continuous Values 127
4.1 The gradient color printing problem (warm-up for density calculation) 128
4.1.1 Describing ink consumption with a graph (warm-up for the cumulative distribution function) 128
4.1.2 Describing ink printing density with a graph (warm-up for the probability density function) 129
4.1.3 The effect of stretching the printed product on ink density (warm-up for variable transformation) 133
4.2 The case of zero probability 136
4.2.1 The case of zero probability 137
4.2.2 What problems does zero probability bring? 139
4.3 The probability density function 140
4.3.1 Probability density function 140
4.3.2 Uniform distribution 146
4.3.3 Variable transformation of probability density functions 147
4.4 …
4.4.1 Joint distribution 152
4.4.2 How to read the rest of this section 155
4.4.3 Marginal distribution 155
4.4.4 Conditional distribution 159
4.4.5 Bayes' formula 162
4.4.6 Independence 163
4.4.7 Probability of an arbitrary region 166
4.4.8 Mixtures of real and discrete values 174
4.5 Expected value, variance, and standard deviation 174
4.5.1 Expected value 175
4.5.2 Variance and standard deviation 179
4.6 The normal distribution and the central limit theorem 180
4.6.1 The standard normal distribution 181
4.6.2 The general normal distribution 184
4.6.3 The central limit theorem 187

Chapter 5 Covariance Matrix, Multivariate Normal Distribution, and Ellipses 195
5.1 Covariance and the correlation coefficient 196
5.1.1 Covariance 196
5.1.2 Properties of covariance 199
5.1.3 The significance of distribution tendency and the correlation coefficient 200
5.1.4 Limitations of covariance and the correlation coefficient 206
5.2 The covariance matrix 208
5.2.1 Covariance matrix = a table of variances and covariances 208
5.2.2 Vector form of the covariance matrix 209
5.2.3 Vector and matrix operations and expected values 212
5.2.4 Supplementary notes on vector-valued random variables 215
5.2.5 Variable transformation of the covariance matrix 217
5.2.6 Degree of dispersion in any direction 218
5.3 The multivariate normal distribution 220
5.3.1 The multivariate standard normal distribution 220
5.3.2 The general multivariate normal distribution 223
5.3.3 Probability density function of the multivariate normal distribution 228
5.3.4 Properties of the multivariate normal distribution 230
5.3.5 Cross sections and projections 232
5.3.6 Supplementary knowledge: the chi-square distribution 239
5.4 Relationship between the covariance matrix and ellipses 242
5.4.1 (Example 1) The identity matrix and circles 242
5.4.2 (Example 2) Diagonal matrices and ellipses 244
5.4.3 (Example 3) General matrices and tilted ellipses 247
5.4.4 Limitations of the covariance matrix 251

Part 2 Exploring the Applications of Probability

Chapter 6 Estimation and Testing 257
6.1 Estimation theory 257
6.1.1 Descriptive statistics and inferential statistics 257
6.1.2 Descriptive statistics 258
6.1.3 How to understand some concepts in inferential statistics 260
6.1.4 Problem setting 264
6.1.5 Expected fine amount 265
6.1.6 Multi-objective optimization 266
6.1.7 (Strategy 1) Reduce the number of candidates: minimum variance unbiased estimation 267
6.1.8 (Strategy 2) Weaken the definition of optimality: maximum likelihood estimation 269
6.1.9 (Strategy 3) Use a single value as the evaluation benchmark: Bayesian estimation 272
6.1.10 Considerations for strategy selection 275
6.2 Testing theory 276
6.2.1 Logic in testing theory 276
6.2.2 Overview of testing theory 278
6.2.3 Simple hypotheses 279
6.2.4 Composite hypotheses 282

Chapter 7 Pseudo-Random Numbers 285
7.1 Basics of pseudo-random numbers 285
7.1.1 Random number sequences 285
7.1.2 Pseudo-random number sequences 286
7.1.3 Typical application: Monte Carlo methods 287
7.1.4 Related topics: pseudo-random number sequences in cryptography and low-discrepancy sequences 289
7.2 Generating random numbers that follow a specific distribution 291
7.2.1 Generating random numbers that follow a discrete-value distribution 292
7.2.2 Generating random numbers that follow a continuous-value distribution 293
7.2.3 Generating random numbers that follow a normal distribution 296
7.2.4 Supplementary knowledge: uniform distributions in a triangle and on a sphere 298

Chapter 8 Various Applications of Probability Theory 305
8.1 Regression analysis and multivariate analysis 305
8.1.1 Fitting a line by the least squares method 305
8.1.2 Principal component analysis 312
8.2 Random processes 319
8.2.1 Random walks 321
8.2.2 The Kalman filter 326
8.2.3 Markov chains 331
8.2.4 Some supplementary notes on random processes 342
8.3 Information theory 343
8.3.1 Entropy 343
8.3.2 Binary entropy 347
8.3.3 Source coding 349
8.3.4 Channel coding 352

Appendix A Basic Mathematics Covered in This Book 359
A.1 Greek letters 359
A.2 Numbers 359
A.2.1 Natural numbers and integers 359
A.2.2 Rational numbers and real numbers 359
A.2.3 Complex numbers 360
A.3 Sets 360
A.3.1 Ways to represent sets 360
A.3.2 The size of infinite sets 361
A.3.3 Strengthening exercises 361
A.4 The summation symbol 362
A.4.1 Definition and basic properties 362
A.4.2 Double summation 364
A.4.3 Range specification 366
A.4.4 Geometric sequences 366
A.5 Exponentiation and logarithms 368
A.5.1 The exponential function 368
A.5.2 The Gaussian integral 371
A.5.3 The logarithmic function 374
A.6 Inner product and length 377

Appendix B Approximate Formulas and Inequalities 381
B.1 Stirling's formula 381
B.2 Jensen's inequality 381
B.3 Gibbs' inequality 384
B.4 Markov's inequality and Chebyshev's inequality 385