Why does

start+(last-start)/2

work to calculate the midpoint while

(start+last)/2

doesn't?

To be more specific, I put the contents of a binary tree into an array, sorted it, destroyed the old tree, and plan to rebuild the new one. I know that start+(last-start)/2 as a midpoint calculation works correctly. My problem is that I don't actually know WHY it works. I need to know for a test I have tomorrow. Here's the code for the rebuildTree function.

```cpp
binaryTreeNodePtrType BinaryTree::rebuildTree(int* &a, int start, int last)
{
    if (start > last) {
        return NULL; // empty range: nothing to build
    }

    binaryTreeNodePtrType ptr = new binaryTreeNodeType; // create new node

    /* Had to find out a new way to calculate the midpoint.
       The "average" way [(start+last)/2] did not work
       correctly.  This new way of calculating a midpoint
       works.  It's hard to explain, but drawing out the
       process helps to understand it. */
    int mid = start + (last - start) / 2; // calculate midpoint

    ptr->key = a[mid];                                // set this node's key to the midpoint value
    ptr->leftchild  = rebuildTree(a, start, mid - 1); // continue to the left
    ptr->rightchild = rebuildTree(a, mid + 1, last);  // continue to the right
    return ptr;
}
```

## Replies

You have to do the whole algebra thing.

```
(S + L)/2 = S/2 + L/2

S + (L - S)/2 = S + (L/2 - S/2)
              = S + L/2 - S/2
              = S - S/2 + L/2
              = (2S/2 - S/2) + L/2
              = S/2 + L/2
```

that help?
Val


I appreciate the algebra work, but that wasn't really my question. I am probably wording this badly.

I need to know why one midpoint calculation inserts all the array elements correctly while the other fails. Say the array was 1-2-3-4-5-6-7-8-9: the tree would end up with two 3's and missing the 9.

The midpoint calculations give the same results - the problem must lie elsewhere.

Can you post the exact code that works, and the exact code that doesn't?

You know, I went and replaced just the midpoint, and you're right, it does work now. I'm not sure what else I fixed in this whole process, but I'm glad to see it works. Well, that should save me some time explaining, then. Thanks a lot!

Programming can be so strange, sometimes.

Why does

start+(last-start)/2

work to calculate the midpoint while

(start+last)/2

doesn't?

I assume you mean why does (start+last)/2 fail when start+(last-start)/2 succeeds? Most of the time it doesn't. Both calculations have the same result except when the addition of start and last would overflow. The second calculation avoids the overflow problem, so it's more correct.
