What is this really saying? For all sufficiently large values of n (greater than n0), f(n) is sandwiched between two multiples of g(n): c1 * g(n) <= f(n) <= c2 * g(n). Dividing through by g(n), that means c1 <= f(n) / g(n) <= c2. In other words, the ratio of the two functions doesn't grow towards infinity or shrink towards zero -- it stays within a fixed range. (This is slightly different from the definition I gave in the other thread. That's why I put the disclaimer at the bottom -- what you posted is the correct definition of theta notation.)
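You can see that "bounded ratio" idea numerically. Here's a quick sketch in Python, using a pair of functions I made up for illustration (they're not from this thread), where the ratio settles between c1 = 3 and c2 = 4:

```python
# A made-up example pair: f(n) = 3n^2 + 5n + 7 is Theta(n^2).
def f(n):
    return 3 * n**2 + 5 * n + 7

def g(n):
    return n**2

# The ratio f(n)/g(n) = 3 + 5/n + 7/n^2 stays between 3 and 4
# for every n >= 10 -- it neither blows up nor shrinks to zero.
for n in [10, 100, 1000, 10**6]:
    r = f(n) / g(n)
    assert 3 <= r <= 4
```

This isn't a proof, of course -- it just shows what the definition is describing.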
Here's where you just want to try different functions (for g(n)) and see how they work. (You'll get good at picking the right functions after a few practice problems.)
For example, the dominant (fastest-growing) term in (6 * 2 ^ n + n ^ 2) is 6 * 2 ^ n, because exponential terms grow faster than polynomial terms.
Now, suppose that g(n) = 2^n. Then we already know that 1 * g(n) <= f(n), that is, 2 ^ n <= 6 * (2 ^ n) + n ^ 2. This is true no matter what the value of n is. (Because 0 <= 5 * (2 ^ n) + n ^ 2, it follows that 2 ^ n <= 6 * (2 ^ n) + n ^ 2.)
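If you want to sanity-check that lower bound, here's a small sketch (just a spot check over a range of n, not a proof -- the function names are mine):

```python
# The thread's example: f(n) = 6 * 2^n + n^2, with g(n) = 2^n.
def f(n):
    return 6 * 2**n + n**2

def g(n):
    return 2**n

# c1 = 1 works with no restriction on n: since 0 <= 5 * 2^n + n^2,
# adding 2^n to both sides gives 2^n <= 6 * 2^n + n^2.
for n in range(0, 50):
    assert 1 * g(n) <= f(n)
```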
Now we need to find a value c2 _and_ a value n0 such that 6 * (2 ^ n) + n ^ 2 <= c2 * (2 ^ n) whenever n >= n0.
Let's try n0 = 5 and c2 = 1006. I picked a big value of c2 because it gives us breathing room.
Do these work? When n >= 5, is it true that 6 * (2 ^ n) + n ^ 2 <= 1006 * (2 ^ n)? Subtracting 6 * (2 ^ n) from both sides, that's the same as asking whether n ^ 2 <= 1000 * (2 ^ n). This seems true -- it holds when n == 5 (25 <= 32000), and as you increase n, the right side of the inequality doubles each time, while the left side grows much more slowly.
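A quick numeric spot check of those choices (again, checking a finite range of n isn't a proof -- it just builds confidence before you try to prove it):

```python
# Checking c2 = 1006, n0 = 5 against the thread's f(n) = 6 * 2^n + n^2.
# Note 6 * 2^n + n^2 <= 1006 * 2^n is equivalent to n^2 <= 1000 * 2^n.
for n in range(5, 64):
    assert n**2 <= 1000 * 2**n
    assert 6 * 2**n + n**2 <= 1006 * 2**n
# Intuition: each step doubles the right side, while the left side's
# growth factor (n + 1)^2 / n^2 approaches 1, so the gap only widens.
```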
But can you prove it? You might want (or be required by your teacher) to prove that your values of c1, c2, and n0 work. (We already handled c1 -- it followed from a simple inequality.)
Methods for proving this vary. You could use induction or you could use calculus. I'm not going to go on without knowing your mathematical background.