This problem is related to an unsolved problem in game theory...
Consider a blind cat and a blind mouse that live on a (unit) circle. They start off in random positions, then each chooses a speed and starts running in a random direction.
On average how far will the cat run before it catches the mouse?
To make things a little easier, always start the cat at x=0 and step its velocity between 0 and 2, choosing its direction randomly. Place the mouse at a random position on the circle. Do lots of trials for each cat velocity, with the mouse's velocity fixed at vm=1, to find the average distance travelled by the cat at each cat velocity. You will need to do some basic geometry to find the distance travelled by the cat;
remember that it will change depending on which directions are chosen and who is running faster (sometimes the mouse will catch the cat!).
What happens at different velocities? Plot the average distance travelled for a range of cat velocities. What happens to this graph if you change the mouse velocity?
Can someone help me write the code for this program, please?

