I'm learning C mostly for Linux development, but right now I am working through a few general books to get started. I am having some trouble understanding #define versus declaring a variable directly. I know that #define declares a constant, but normally I would declare a variable with a specific type, like int Celsius;, and then later assign it a value. With #define, as it's written in the book I'm reading, I don't declare it as anything specific:

#define CELSIUS 20

So later in the code it automatically gets converted to an int, right? For example:

int celsius;

       for (CELSIUS = LOWER; CELSIUS <= UPPER; CELSIUS = CELSIUS + STEP)
...

It seems to me this could hurt readability down the road, and wouldn't it also encourage bad programming habits? Maybe I've taken all this out of context. Either way, can someone please explain what exactly the difference is?


Using #define this way, CELSIUS is a symbolic constant. Everywhere CELSIUS appears in this source file, the preprocessor replaces it with 20; it is simple text substitution.

So the following line:

for (CELSIUS = LOWER; CELSIUS <= UPPER; CELSIUS = CELSIUS + STEP)

becomes

for (20 = LOWER; 20 <= UPPER; 20 = 20 + STEP)

As you can see, this doesn't make sense in this context as 20 will be interpreted during compilation (after the text substitution has taken place) as an integer literal, not as a variable. You can't assign a value to an integer literal.
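To make the difference concrete, here is a small sketch (my own, not from the book) contrasting the two: celsius the variable has a type and storage and can be reassigned, while CELSIUS the macro is replaced by 20 before the compiler ever sees it.

#include <stdio.h>

#define CELSIUS 20        /* preprocessor only: pure text substitution, no type, no storage */

int main(void)
{
    int celsius = 20;     /* a real int variable: has a type, storage, an address */

    celsius = 25;         /* fine, celsius is a variable you can assign to */
    /* CELSIUS = 25; */   /* would not compile: it expands to 20 = 25      */

    printf("%d %d\n", celsius, CELSIUS);   /* prints 25 20 */
    return 0;
}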

A better example might be to declare a variable named Celsius as an int, then use symbolic constants for the (unchanging) values of LOWER, UPPER and STEP. For example:

#define LOWER 0
#define UPPER 100
#define STEP 2

int Celsius;
for(Celsius = LOWER; Celsius <= UPPER; Celsius += STEP)
......
......

This would be particularly useful when the symbolic constants LOWER, UPPER and STEP appear in several places in your file because:

- if the value ever changes, you only have to change it in one place: the #define.

- it gives the value a meaningful name that conveys its purpose to the reader.
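
If you want something that actually compiles and runs, here is a minimal sketch built from the snippet above; the loop body (which just prints each value) is filler of my own, since the original leaves it out:

#include <stdio.h>

#define LOWER 0     /* lower limit of the table */
#define UPPER 100   /* upper limit              */
#define STEP  2     /* step size                */

int main(void)
{
    int Celsius;
    for (Celsius = LOWER; Celsius <= UPPER; Celsius += STEP)
        printf("%d\n", Celsius);
    return 0;
}

Change a single #define and every use of LOWER, UPPER or STEP picks up the new value the next time you compile.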

Does that help?

Yes, I know this is a very old topic :) but I have a question about it...
Do I only have one option for the type?

#define SIZE 10

This line makes SIZE an integer, which is fine, but how can I make it a char, double, long...?

As was said about 7 years ago, anywhere the word SIZE is found it is replaced with 10. There is no type, no rules, nothing.

#define SIZE 10

int a = SIZE SIZE 2 + 2;
int *b SIZE = SIZE &a SIZE SIZE SIZE; SIZE

The code above turns into the code below:

int a = 10 10 2 + 2;
int *b 10 = 10 &a 10 10 10; 10

Of course there will be errors, but the point is that #define doesn't care how it's used; it's just a simple search and replace.
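
If you ever want to see exactly what the compiler gets after this search and replace, most compilers can show you the preprocessed source. With gcc, for example, running

gcc -E yourfile.c

prints the source with every #define already expanded (yourfile.c is just a stand-in for whatever your file is called).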

To make SIZE a char, you can do this:

#define SIZE (char)10

It will replace every SIZE with (char)10.
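
More generally, the type comes from how the literal itself is written, not from the #define. A few variations (the names are just for illustration):

#define SIZE_INT    10          /* an int literal                   */
#define SIZE_LONG   10L         /* long, via the L suffix           */
#define SIZE_DOUBLE 10.0        /* double, because of the decimal   */
#define SIZE_CHAR   ((char)10)  /* char, via an explicit cast       */

Wrapping a cast (or any expression) in parentheses is a common habit so the macro still behaves as expected when it expands inside a larger expression.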

You can also use SIZE as a char just by assigning it: char c = SIZE; or as a double: double d = SIZE; the value is converted to the type of the variable it's assigned to. Or you can use defines to construct strings, as in this simple example:

#include <stdio.h>

#define SIZE 65    /* 65 is the ASCII code for 'A' */

int main(void)
{
    char p[] = {SIZE, 'p', 'p', 'l', 'e', '\0'};
    printf("%s\n", p);   /* prints "Apple" */
    return 0;
}