I'm trying to create a program that calculates the standard deviation of a sample of r random numbers between 0 and 1 in Python 2.7.
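(As I understand it, the quantity I want is SD = sqrt( (1/r) * sum_i (x_i - a)^2 ), where a is the average of the r numbers; since I divide by r rather than r - 1, this is the population-style formula.)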
This is my program:
from random import random

r = input("please enter a number r: ")

# generate r random numbers between 0 and 1 and keep them in a list
nums = []
sum1 = 0
for i in range(r):
    x = random()
    nums.append(x)
    sum1 += x

a = sum1 / float(r)  # this computes the average of the random numbers

def SD():
    L = []
    for x in nums:
        L.append((x - a) ** 2)  # squared distance from the average
    return (sum(L) / float(r)) ** 0.5

print "The standard deviation is", SD()
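One check I thought of (not sure it is the only way to verify it): for uniform random numbers between 0 and 1 the theoretical standard deviation is 1/sqrt(12), about 0.2887, so running the program with a large r should give a value close to that. A minimal sketch of that check, where sd_of_uniform_sample is just a helper name I made up:

from random import random

def sd_of_uniform_sample(r):
    # same computation as above, wrapped in a function so it is easy to rerun
    nums = [random() for _ in range(r)]
    a = sum(nums) / float(r)
    return (sum((x - a) ** 2 for x in nums) / float(r)) ** 0.5

print sd_of_uniform_sample(100000)  # should print something near 0.2887
print (1.0 / 12) ** 0.5             # theoretical value, about 0.28868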
It gives me a value, but I would like to know for sure whether this program is correct. Thanks!