Pure Programmer

## Project: Sample Mean and Standard Deviation

Write a program that computes the [[Sample_mean_and_covariance|mean]] (or average) and [[standard deviation]] of a sample of N random floating-point values in the interval [0, 1). This closed/open [[Interval_(mathematics)|interval]] notation means that 0 is in the range but 1 is not. The sample size N should be passed as a command-line argument. Print the mean and standard deviation to six decimal places. Run the program multiple times to confirm that you get different results each time. Do you notice that the mean and standard deviation settle toward consistent values as you increase the sample size N? As N gets larger, the mean should approach the mean of the generator at 0.5, and the standard deviation should approach 1/√12 ≈ 0.288675 if our generator is perfectly uniform.

### Output

```
$ g++ -std=c++17 SampleMeanAndStdDev.cpp -o SampleMeanAndStdDev -lfmt
$ ./SampleMeanAndStdDev 10
Sample mean: 0.383306
Sample std. dev.: 0.284761
$ g++ -std=c++17 SampleMeanAndStdDev.cpp -o SampleMeanAndStdDev -lfmt
$ ./SampleMeanAndStdDev 1000
Sample mean: 0.496937
Sample std. dev.: 0.287970
$ g++ -std=c++17 SampleMeanAndStdDev.cpp -o SampleMeanAndStdDev -lfmt
$ ./SampleMeanAndStdDev 1000000
Sample mean: 0.499982
Sample std. dev.: 0.288611
```

### Solution