I'm trying to create a program that calculates the standard deviation of a sample of r random numbers between 0 and 1 in Python 2.7.

This is my program:

from random import random
r= input("please enter a number r: ")
for i in range(r):
   i = random() 
   sum1 += i 
a= (sum1 / r*1.0)
# this computes average of random numbers between 0 and 1

def SD():
    L = []
    for r in range(i):
        if r[i] > a:
             L.append((r[i] - a)**2)
        if r[i] < a:
             L.append((a - r[i])**2)
    SD = (float(L)/r)**0.5
    return SD
print "The standard deviation is", SD

It gives me a value, but I would like to know for sure whether this program is correct. Thanks!


Discussion span: 6 years. Last post by woooee.

Come up with a test suite, with known inputs and known results, to test against. Also, be careful about whether multiplication or division is done first in this line:

a= (sum1 / r*1.0)

Multiplication and division have equal precedence and evaluate left to right, so the division happens first; if sum1 and r are both integers, that is integer division in Python 2.7, and multiplying the truncated result by 1.0 afterwards does not bring the lost fraction back. Print the result to see for yourself. Instead, convert the numerator or the denominator to a float as you do later in the code, compute r*1.0 on a separate line first, or add parentheses.
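The precedence point can be checked directly. Python 2.7's / between two ints truncates; the sketch below uses // (which truncates the same way in both Python 2 and 3) so it runs anywhere, and contrasts it with the two suggested fixes:

```python
# Left-to-right evaluation: the (integer) division runs before the * 1.0.
truncated = 7 // 2 * 1.0     # (7 // 2) * 1.0 -> 3 * 1.0 -> 3.0, fraction lost
fixed_paren = 7 / (2 * 1.0)  # denominator made float first -> 3.5
fixed_float = float(7) / 2   # numerator converted to float  -> 3.5

print(truncated, fixed_paren, fixed_float)
```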

Don't use the same name (SD) for both the function and a variable inside it, as it can confuse the interpreter in some situations.

def SD():
    SD = (float(L)/r)**0.5
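Putting the fixes together, here is a minimal corrected sketch (the names r and standard_deviation are illustrative; note that (x - mean)**2 gives the same value whether x is above or below the mean, so the two if branches in the original aren't needed):

```python
import random

def standard_deviation(values):
    """Population standard deviation of a list of numbers."""
    n = len(values)
    mean = sum(values) / float(n)  # float() avoids integer division in Python 2.7
    variance = sum((x - mean) ** 2 for x in values) / float(n)
    return variance ** 0.5

r = 1000  # sample size (the original reads it with input())
samples = [random.random() for _ in range(r)]
print("The standard deviation is", standard_deviation(samples))
```

For r uniform numbers on [0, 1) the printed value should hover near 1/sqrt(12), about 0.289.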

