Pure Programmer


Project: Sample Mean and Standard Deviation

Write a program that computes the [[Sample_mean_and_covariance|mean]] (or average) and [[standard deviation]] of a sample of N random floating point values in the interval [0,1). This closed/open [[Interval_(mathematics)|interval]] notation means 0 is in the range but 1 is not. The sample size (N) should be passed as a command line argument. Print the mean and standard deviation to six decimal places. Run the program multiple times to confirm that you get different results each time. Do you notice that the mean and standard deviation get closer to consistent values as you increase the sample size (N)? As N gets larger, the mean should approach the mean of the generator at 0.5, and the standard deviation should approach 0.288675, if our generator is perfectly uniform.
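As a sanity check on those target values, here is where 0.5 and 0.288675 come from for a continuous uniform distribution on [0,1):

```latex
\mu      = \int_0^1 x\,dx = \tfrac{1}{2}
\qquad
\sigma^2 = \int_0^1 x^2\,dx - \mu^2 = \tfrac{1}{3} - \tfrac{1}{4} = \tfrac{1}{12}
\qquad
\sigma   = \tfrac{1}{\sqrt{12}} \approx 0.288675
```

So as N grows, the sample statistics should converge toward these population values.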

Output
$ javac -Xlint SampleMeanAndStdDev.java
$ java -ea SampleMeanAndStdDev 10
Sample mean: 0.487202
Sample std. dev.: 0.289480
$ javac -Xlint SampleMeanAndStdDev.java
$ java -ea SampleMeanAndStdDev 1000
Sample mean: 0.495143
Sample std. dev.: 0.297671
$ javac -Xlint SampleMeanAndStdDev.java
$ java -ea SampleMeanAndStdDev 1000000
Sample mean: 0.500445
Sample std. dev.: 0.288701

Solution
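One possible sketch in Java, not necessarily the intended solution: it uses `java.util.Random.nextDouble()` (which returns values in [0,1)), stores the sample in an array, and computes the standard deviation with Bessel's correction (dividing by N − 1). The class name matches the one in the transcript above; the helper methods `mean` and `stdDev` are this sketch's own additions.

```java
import java.util.Random;

public class SampleMeanAndStdDev {

    /** Arithmetic mean of the sample. */
    static double mean(double[] xs) {
        double sum = 0.0;
        for (double x : xs) sum += x;
        return sum / xs.length;
    }

    /** Sample standard deviation, using Bessel's correction (divide by n - 1). */
    static double stdDev(double[] xs) {
        double m = mean(xs);
        double sumSq = 0.0;
        for (double x : xs) sumSq += (x - m) * (x - m);
        return Math.sqrt(sumSq / (xs.length - 1));
    }

    public static void main(String[] args) {
        int n = Integer.parseInt(args[0]);   // sample size N from the command line
        Random rng = new Random();
        double[] xs = new double[n];
        for (int i = 0; i < n; i++) {
            xs[i] = rng.nextDouble();        // uniform random value in [0, 1)
        }
        System.out.printf("Sample mean: %.6f%n", mean(xs));
        System.out.printf("Sample std. dev.: %.6f%n", stdDev(xs));
    }
}
```

Because the sample is random, each run prints slightly different numbers, and larger N pulls the results toward 0.5 and 0.288675 as described above.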