Slides from Princeton Algorithms.
If you do it as a lg-lg plot, very often you'll get a straight line, and the slope of that straight line is the key to what's going on. In this case, the slope of the straight line is three. You can run what's called a regression to fit a straight line through the data points, and then it's not difficult to do the math to show that if you get a straight line and its slope is b, then your running time is proportional to a N^b. That's called a power law.
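Spelling out the step the lecture alludes to: a straight line on a lg-lg plot means lg T(N) is a linear function of lg N, and exponentiating both sides turns that into a power law.

$$\lg T(N) = b \,\lg N + c \;\;\Longrightarrow\;\; T(N) = 2^{c} N^{b} = a\,N^{b}, \quad \text{where } a = 2^{c}.$$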
If there is a power law, and again the great majority of computer algorithm running times do follow a power law, what we can do is just double the size of the input each time, the way we were, and take the ratio of the running times for N and 2N. If you do that, the ratio converges to a constant, 2^b, and the lg of that ratio converges to b, the exponent of N.
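To make the doubling experiment concrete, here is a minimal, self-contained sketch (not the course's own program; the brute-force three-sum routine and the names threeSum and timeTrial are just illustrative stand-ins for any cubic-time workload). It doubles N, times each run, and prints the ratio of consecutive times together with its lg, which should settle near 3 for this workload.

```java
public class DoublingRatio {
    // Stand-in N^3 algorithm: count triples that sum to 0 by brute force.
    private static int threeSum(int[] a) {
        int n = a.length, count = 0;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                for (int k = j + 1; k < n; k++)
                    if (a[i] + a[j] + a[k] == 0) count++;
        return count;
    }

    // Time one run of threeSum on a random input of size n (seconds).
    private static double timeTrial(int n) {
        java.util.Random rng = new java.util.Random(12345);
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = rng.nextInt(2_000_000) - 1_000_000;
        long start = System.nanoTime();
        threeSum(a);
        return (System.nanoTime() - start) / 1e9;
    }

    public static void main(String[] args) {
        double prev = timeTrial(125);
        for (int n = 250; n <= 2000; n += n) {   // double the input size each time
            double time = timeTrial(n);
            double ratio = time / prev;
            // lg(ratio) converges to the exponent b in T(N) ~ a N^b (about 3 here)
            System.out.printf("%6d %9.3f  ratio %5.2f  lg(ratio) %5.2f%n",
                    n, time, ratio, Math.log(ratio) / Math.log(2));
            prev = time;
        }
    }
}
```

For the smallest sizes the measured times are noisy, so the printed ratios only approach 8 (and lg of the ratio only approaches 3) as N grows; that is exactly the "converge to a constant" behavior described above.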