I was trying to calculate the angle between two points using `math.atan2()`, as was suggested in numerous forums and tutorials. After trying it myself, it didn't work as charmingly as it did for others. So I made a simple example of what it does, and I need some help figuring out why it behaves this way and how to fix it:
    import math

    p1 = (0.0, 0.0)
    p2 = (1.0, 1.0)

    dx = p1[0] - p2[0]
    dy = p1[1] - p2[1]

    angle = math.atan2(dy, dx)
    angle = math.degrees(angle)
    print(angle)
This should obviously give a 45° angle as a result, right? Instead, it gives me a -135° angle, so it's 180° off. What I suspected was that it starts from the left-hand side of a full circle and the angle increases clockwise instead of counter-clockwise like it should. After testing this, I found out this was indeed the case. It's as if it calculates off a mirror image of the usual angle. Is there any way to fix this?
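For reference, this is the test I used to confirm the mirroring: if I reverse the subtraction order (`p2 - p1` instead of `p1 - p2`), the flipped vector gives me the 45° I expected in the first place.

```python
import math

p1 = (0.0, 0.0)
p2 = (1.0, 1.0)

# Subtracting the other way round (p2 - p1) points the vector
# from p1 towards p2 instead of away from it.
dx = p2[0] - p1[0]
dy = p2[1] - p1[1]

print(math.degrees(math.atan2(dy, dx)))  # 45.0
```

So swapping the order shifts the result by exactly 180°, which matches the mirror-image behaviour I described above.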
(I'm running Python 3.2.2 if that helps)