
Is there a systematic method to analyze the time complexity of a program strictly from output data such as run-time, the number of data swaps, data comparisons, etc? For example...

```
Data Size | Run Time | Comparisons | Moves
----------|----------|-------------|---------
1000      | 0.00     | 15448       | 23965
10000     | 0.00     | 251006      | 369906
100000    | 0.04     | 4029787     | 5551078
1000000   | 0.65     | 64737104    | 84244884
```

I'm pretty sure the algorithm that produced this data has a time complexity of O(n^1.5). Is there a way to verify that from the data above?
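For instance, I imagine you could fit a straight line through the data in log-log space, since if the count grows like c·n^k then log(count) = log(c) + k·log(n) and k is the slope. A rough sketch of that idea in plain Python, using the comparison counts from the table above:

```python
import math

# Data from the table above
sizes = [1000, 10000, 100000, 1000000]
comparisons = [15448, 251006, 4029787, 64737104]

# If comparisons ~ c * n^k, then log(comparisons) = log(c) + k*log(n),
# so k is the slope of a least-squares line fitted in log-log space.
xs = [math.log(n) for n in sizes]
ys = [math.log(c) for c in comparisons]
m = len(xs)
mx = sum(xs) / m
my = sum(ys) / m
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"estimated exponent k = {slope:.2f}")
```

I'm not sure how reliable this is in general (constant factors and lower-order terms could skew the slope at small n), which is part of what I'm asking.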

I want to do this because I am trying to analyze another, more complex algorithm. I can't determine its complexity simply by reading its code, so an alternate, data-driven method would be helpful.