I have another problem. Same dataset as before: a huge collection of floats stored as a multi-dimensional list, with about 37 floats per inner list and over 60,000 lists. So something like: [[16166.00, 4123.15, ...], [761.1364, ...]].

I'm trying to find the Euclidean distance between one list and each of the lists in another set. What I have is below. I guess it works, but it's too bloody slow: it takes a couple of seconds to run the inner loop once, and since the outer loop runs over 60,000 times, we're talking more than a day to complete. On top of that, I need to find the 3 shortest distances for each row, which will take just as long to process.

(Though they're labeled *matrix, each is a standard list of lists.)

```
from numpy import array, sqrt

alldistances = []
for i in datamatrix:
    distances = []  # one row of distances per entry in datamatrix
    for j in testmatrix:
        # element-wise squared differences between the two rows
        temp = (array(i, float) - array(j, float)) ** 2
        distances.append(sqrt(temp.sum()))
    alldistances.append(distances)
```

Is there some library that will compute this quickly or some better means of writing this in python?
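To make it concrete, here's a small self-contained sketch of the fully vectorized computation I'm hoping for, using NumPy broadcasting (the 5×37 and 8×37 random arrays are just stand-ins for my real data). I believe `scipy.spatial.distance.cdist` does the pairwise part in one call, too.

```
import numpy as np

# Hypothetical stand-ins for datamatrix (5 rows) and testmatrix (8 rows).
data = np.random.rand(5, 37)
test = np.random.rand(8, 37)

# Broadcast (5, 1, 37) against (1, 8, 37) to get all pairwise differences,
# then sum the squares over the last axis and take the square root.
diff = data[:, None, :] - test[None, :, :]
alldistances = np.sqrt((diff ** 2).sum(axis=-1))  # shape (5, 8)

# Indices of the 3 smallest distances per row, without a full sort.
nearest3 = np.argpartition(alldistances, 3, axis=1)[:, :3]
```

This replaces the nested Python loops with a handful of array operations, which is the part I assume a library would do far faster than my version.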