Hey there! :)

I've got no idea if this is the place to put this post, but since I work a little with Java, I thought it would be better to put it here. I was looking into some notes about algorithms - which were in a slide show - and I got stuck on divide-and-conquer algorithms.

The problem is the maths, really, and I wondered if anyone could shed some light on it. I have pasted below what it said on that slide,

and I don't understand the reasoning behind the evaluation of T(N). Especially how in the world he got it to T(N) = N log N + N. Can anyone help, please?

Running time for a divide-and-conquer algorithm

• How to divide?

• 2 equally sized smaller problems (subproblems):

  T(N) = 1                  for N = 1
  T(N) = 2 T(N/2) + O(N)    (conquer takes O(N))

  ---> T(N) = N log N + N, i.e. T(N) = O(N log N)
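Here is as far as I got unrolling that recurrence myself - just a sketch, assuming N = 2^m is a power of two and that the O(N) term is exactly N. Is this the reasoning the slide is skipping over?

```latex
\begin{aligned}
T(N) &= 2\,T(N/2) + N \\
     &= 4\,T(N/4) + 2N \\
     &= 8\,T(N/8) + 3N \\
     &\;\;\vdots \\
     &= 2^m\,T(1) + mN \qquad (\text{after } m = \log_2 N \text{ levels}) \\
     &= N + N\log_2 N .
\end{aligned}
```

If that's right, each of the log N levels of recursion contributes N work, plus N for the base cases, which would explain the N log N + N.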

• A subproblems, each of size N/B, and conquer takes O(N^k):

  T(N) = A T(N/B) + O(N^k), where A ≥ 1, B > 1

  T(N) = O(N^(log_B A))   for A > B^k
  T(N) = O(N^k log N)     for A = B^k
  T(N) = O(N^k)           for A < B^k
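For what it's worth, I tried checking the first bullet numerically in Java. It is the A = 2, B = 2, k = 1 instance of the general formula (so A = B^k, the middle case), and assuming T(1) = 1 and that the O(N) term is exactly N, the closed form N log2 N + N matches the recurrence exactly for powers of two:

```java
// Check the closed form T(N) = N log2(N) + N against the recurrence
// T(N) = 2 T(N/2) + N with T(1) = 1, for N = 1, 2, 4, ..., 2^20.
// (Assumes the O(N) conquer cost is exactly N - my assumption, not the slide's.)
public class Recurrence {
    static long t(long n) {
        if (n == 1) return 1;        // base case: T(1) = 1
        return 2 * t(n / 2) + n;     // two half-size subproblems + linear conquer
    }

    public static void main(String[] args) {
        for (long n = 1; n <= (1L << 20); n *= 2) {
            long log2 = Long.numberOfTrailingZeros(n); // log2(n) for powers of two
            long closedForm = n * log2 + n;            // N log N + N
            if (t(n) != closedForm)
                throw new AssertionError("mismatch at n = " + n);
        }
        System.out.println("closed form matches for all powers of two up to 2^20");
    }
}
```

So at least the closed form is consistent with the recurrence; it's the derivation I can't follow.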