Should I use #define or const to declare constant values? Which one is better, and when should I use each?

For example, when I declare a hexadecimal value (0x01), should I use #define or const?

All 2 Replies

It depends.

A const variable may take up storage, and fetching its value may cost a memory access. A #define, as in your example, is simply a text substitution: the preprocessor replaces each occurrence with 0x01 before compilation, so the value is typically encoded directly into the instructions of the executable.

But with a const, you get type checking. That may be an advantage and a reason to prefer it.
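
A minimal sketch of the difference (the names here are just for illustration):

#include <stdio.h>

#define FLAG_DEFINE 0x01             /* pure text substitution: no type, no storage */
static const int flag_const = 0x01;  /* a typed object; the compiler checks its use */

int main(void)
{
    printf("%d %d\n", FLAG_DEFINE, flag_const);
    return 0;
}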

For array sizes, I prefer neither. Using sizeof array / sizeof *array gets you the number of elements while the array is in scope; where it isn't, the element count IMO should be passed as a parameter. See the sketch below.
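
For example (a minimal sketch; the array name is just illustrative):

#include <stdio.h>

int main(void)
{
    int values[] = { 2, 3, 5, 7, 11 };
    size_t count = sizeof values / sizeof *values;  /* number of elements: 5 */
    size_t i;

    for (i = 0; i < count; i++)
        printf("%d\n", values[i]);
    return 0;
}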

For bit masks, I would use a #define, if not a macro like this:

#define BIT(x)  (1 << (x))
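
For example, with an illustrative variable:

unsigned flags = BIT(0) | BIT(3);  /* 0x09: bits 0 and 3 set */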

For data types other than integers, it too depends -- but I generally try to minimize the number of #defines in my code.

hey thanks...
