Project: Sample Mean and Standard Deviation
Write a program that computes the [[Sample_mean_and_covariance|mean]] (or average) and [[standard deviation]] of a sample of N random floating-point values in the interval [0,1). This closed/open [[Interval_(mathematics)|interval]] notation means that 0 is in the range but 1 is not. The sample size N should be passed as a command-line argument. Print the mean and standard deviation to six decimal places. Run the program multiple times to confirm that you get different results each time. Do you notice that the mean and standard deviation get closer to consistent values as you increase the sample size N? As N gets larger, the mean should approach the generator's mean of 0.5, and the standard deviation should approach 1/√12 ≈ 0.288675 if the generator is perfectly uniform.
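The task can be sketched as follows; this is one possible solution in Python (the task itself is language-agnostic), using the population standard deviation, i.e. dividing the squared deviations by N rather than N−1:

```python
import random
import sys


def sample_stats(n):
    """Return (mean, standard deviation) of n uniform [0,1) samples."""
    xs = [random.random() for _ in range(n)]
    mean = sum(xs) / n
    # Population standard deviation: square root of the mean squared deviation.
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return mean, sd


if __name__ == "__main__":
    # Sample size N comes from the command line, as the task requires.
    n = int(sys.argv[1])
    mean, sd = sample_stats(n)
    print(f"mean = {mean:.6f}  stddev = {sd:.6f}")
```

Running it repeatedly with a small N (say, 100) gives noticeably different results each time, while a large N (say, 1000000) yields values close to 0.5 and 0.288675.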