Macro definitions are expanded automatically by the preprocessor before compilation.
Function-like macros look similar to functions, but they are pure text substitution rather than real function calls.
There are a number of benefits to using macros over hard-coded values.
For example, suppose you have written the value of pi many times throughout your program and now want to change it: with a macro you only change the single definition, whereas with the literal value written out everywhere you would have to change every occurrence in the program.
This isn't really a general computer science question, it's more of a language-specific one. The definition of 'macro' and 'constant' varies from language to language. In some, constants aren't constants, and in others, macros aren't macros.
And in some, macros must have a data type, while in others, constants don't have to.