I have written code to compute the discriminant of a polynomial f(x) as the determinant of its Sylvester matrix, which for

f(x) = x^5 - 110*x^3 + 55*x^2 + 2310*x + 979 looks like

```
[[   1    0 -110   55 2310  979    0    0    0]
 [   0    1    0 -110   55 2310  979    0    0]
 [   0    0    1    0 -110   55 2310  979    0]
 [   0    0    0    1    0 -110   55 2310  979]
 [   5    0 -330  110 2310    0    0    0    0]
 [   0    5    0 -330  110 2310    0    0    0]
 [   0    0    5    0 -330  110 2310    0    0]
 [   0    0    0    5    0 -330  110 2310    0]
 [   0    0    0    0    5    0 -330  110 2310]]
```
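For context, this is roughly how I build the matrix (a sketch rather than my exact code; `sylvester` is just a helper name used here for illustration):

```python
import numpy as np

def sylvester(p, q):
    """Sylvester matrix of p (degree m) and q (degree k): an (m+k) x (m+k)
    matrix holding k shifted copies of p's coefficients over m shifted
    copies of q's coefficients."""
    m, k = len(p) - 1, len(q) - 1
    s = np.zeros((m + k, m + k), dtype=np.int64)
    for i in range(k):
        s[i, i:i + m + 1] = p      # rows of f's coefficients, shifted right
    for i in range(m):
        s[k + i, i:i + k + 1] = q  # rows of f''s coefficients, shifted right
    return s

# coefficients of f and f', highest degree first
f  = [1, 0, -110, 55, 2310, 979]
fp = [5, 0, -330, 110, 2310]
a = sylvester(f, fp)
```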

Using numpy I simply execute

`(-1)**(n*(n-1)/2) * det(a)`

where `a` is the matrix above and `n` is the degree of f(x). Without converting, Python returns 6.31724313067e+17, or 631724313067344384L if I convert to long.
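Concretely, the evaluation looks like this (a minimal sketch of the computation described, with the matrix hardcoded; note that `np.linalg.det` factors the matrix in double precision, so the result is a float carrying at most ~16 significant digits):

```python
import numpy as np

# Sylvester matrix of f and f' from above
a = np.array([
    [1, 0, -110,   55, 2310,  979,    0,    0,    0],
    [0, 1,    0, -110,   55, 2310,  979,    0,    0],
    [0, 0,    1,    0, -110,   55, 2310,  979,    0],
    [0, 0,    0,    1,    0, -110,   55, 2310,  979],
    [5, 0, -330,  110, 2310,    0,    0,    0,    0],
    [0, 5,    0, -330,  110, 2310,    0,    0,    0],
    [0, 0,    5,    0, -330,  110, 2310,    0,    0],
    [0, 0,    0,    5,    0, -330,  110, 2310,    0],
    [0, 0,    0,    0,    5,    0, -330,  110, 2310],
])
n = 5  # degree of f

# np.linalg.det works in float64 (LU factorization), so this is a float,
# not an exact integer
disc = (-1) ** (n * (n - 1) // 2) * np.linalg.det(a)
```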

In either case my result conflicts with Maxima, which returns 631724313067340625.

I am confident in my code, since the results agree for polynomials with smaller or sparser coefficients. For instance, for f(x) = x^6 - 2, Python returns 1492992 as the discriminant and Maxima returns 1492992 as well.
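For what it's worth, one way to rule out Python's integer arithmetic would be to evaluate the same determinant exactly over the integers, e.g. with fraction-free (Bareiss) elimination, where every division is exact so the computation never leaves plain Python ints. A sketch (not the code I am actually running):

```python
def bareiss_det(rows):
    """Exact determinant of an integer matrix via Bareiss fraction-free
    elimination; all divisions below are exact, so the result is an int."""
    m = [list(r) for r in rows]
    n = len(m)
    sign, prev = 1, 1
    for k in range(n - 1):
        if m[k][k] == 0:  # pivot: swap in a lower row with a nonzero entry
            for i in range(k + 1, n):
                if m[i][k] != 0:
                    m[k], m[i] = m[i], m[k]
                    sign = -sign
                    break
            else:
                return 0  # whole column is zero -> singular
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                # the division by prev is exact (Bareiss identity)
                m[i][j] = (m[i][j] * m[k][k] - m[i][k] * m[k][j]) // prev
            m[i][k] = 0
        prev = m[k][k]
    return sign * m[-1][-1]

# Sylvester matrix of f and f' from above, as plain Python ints
a = [
    [1, 0, -110, 55, 2310, 979, 0, 0, 0],
    [0, 1, 0, -110, 55, 2310, 979, 0, 0],
    [0, 0, 1, 0, -110, 55, 2310, 979, 0],
    [0, 0, 0, 1, 0, -110, 55, 2310, 979],
    [5, 0, -330, 110, 2310, 0, 0, 0, 0],
    [0, 5, 0, -330, 110, 2310, 0, 0, 0],
    [0, 0, 5, 0, -330, 110, 2310, 0, 0],
    [0, 0, 0, 5, 0, -330, 110, 2310, 0],
    [0, 0, 0, 0, 5, 0, -330, 110, 2310],
]
exact = bareiss_det(a)  # arbitrary-precision int, no rounding anywhere
```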

Could this be a result of how Python handles large integers? I was pretty sure Python's large-integer capabilities were up to this. BTW, I am running on Arch64 with an Intel E8400 @ 3.0 GHz.

Thanks in advance