Python standard deviation program?

I'm learning Python, and I tried to write a program to find the standard deviation of a list of numbers, but it isn't working.
Here it is:

def sd(a):
    m = sum(a)/len(a)
    b = []
    for n in range((len(a)-1)):
        if a[n] > m:
            b.append((a[n]-m)**2)
        if a[n] < m:
            b.append((m-a[n])**2)
    sda = (sum(b)/len(a))**(1/2)
    return sda

Every time I run it, it returns 1, regardless of the numbers. Why does it do this?
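For reference, this is the result I'm expecting: the population standard deviation (squared deviations from the mean, averaged over all len(a) elements, then square-rooted). A minimal sketch of that calculation, assuming population rather than sample standard deviation:

```python
import math

def sd_expected(a):
    # population standard deviation: sqrt(mean of squared deviations)
    m = sum(a) / len(a)                # mean of the list
    b = [(x - m) ** 2 for x in a]      # squared deviation of every element
    return math.sqrt(sum(b) / len(a))  # divide by len(a), then take the root

print(sd_expected([2, 4, 4, 4, 5, 5, 7, 9]))  # → 2.0
```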