#include <stdio.h>
#define a b
#define b a
int main(void)
{
 int a = 20, b = 30;
 printf("%d %d", a, b);
}

What will be the output of this program?
I am really confused by this. Can somebody tell me what exactly will happen, and how the macros will be replaced in the preprocessing stage?
This is my preprocessor output:

int main(void)
{
 int a   = 20, b   = 30;
 printf("%d %d", a  , b  );
}

No change .... :rolleyes:


You haven't done anything with your definitions; try this:

#include <stdio.h>
#define a_b

int main(void)
{
#ifdef a_b
 int a = 20, b = 30;
#endif
#ifdef b_a
  int b = 20, a = 30;
#endif
 printf("%d %d", a, b);

}

The output will be 20 30. This seems too straightforward to me.

#include <stdio.h>

#define a b
#define b a
int main(void)
{
 int a = 20, b = 30;
 printf("%d %d", a, b);
}

In the above code

a will be replaced by b and
b will be replaced by a.

In that case

int a = 20

will become

int b = 20

That I understood.
But what is confusing me is why

int b = 30

is not getting replaced by

int a = 30

?

Doh! I can't believe I am soooo thick!

It swaps every instance of a and b so

printf("%d %d", a, b);

becomes

printf("%d %d", b, a);
int main(void)
{
 int a   = 20, b   = 30;
 printf("%d %d", a  , b  );
}

But why is there no change in the preprocessor output (see above)?

I'm guessing here, but I reckon they cancel each other out.

All directives run on each instance, so:

The preprocessor finds the first instance of a and runs ALL directives on it:
a -> b -> a
Then it finds the first instance of b and applies all directives to it:
b -> a -> b

and so on...


Are you sure about this? I just want to confirm. Do you know of any good material on this?

I stated I was guessing; if I had any good material I would quote it.

I came to this conclusion by thinking about it and applying a process of elimination in my mind, by thinking about how a compiler program would implement a preprocessor.

Take one of your directives away and look at the output, then put it back, remove the other directive, and look at the output again. Can there be any other explanation?

Using one directive, the output is as I expect; in that case it gives a redefinition error. But when I use both, I am not able to figure out what is happening.
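
For reference, here is a minimal single-macro variant (only b is declared, so the rewrite causes no clash) that shows the lone directive really does substitute. This is just a sketch to illustrate the point:

#include <stdio.h>

#define a b   /* only one macro this time: every later 'a' token becomes 'b' */

int main(void)
{
    int b = 30;          /* declare only b, so the rewrite causes no redefinition */
    printf("%d\n", a);   /* 'a' expands to 'b', so this prints 30 */
    return 0;
}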

I'm guessing here, but I reckon they cancel each other out.

All directives run on each instance, so:

The preprocessor finds the first instance of a and runs ALL directives on it:
a -> b -> a
Then it finds the first instance of b and applies all directives to it:
b -> a -> b

and so on...

I also think this is the correct interpretation.

-8- A preprocessing directive of the form # define identifier replacement-list new-line
defines an object-like macro that causes each subsequent instance of the macro name* [Footnote: Since, by macro-replacement time, all character literals and string literals are preprocessing tokens, not sequences possibly containing identifier-like subsequences (see 2.1.1.2, translation phases), they are never scanned for macro names or parameters. --- end footnote] to be replaced by the replacement list of preprocessing tokens that constitute the remainder of the directive.* [Footnote: An alternative token (lex.digraph) is not an identifier, even when its spelling consists entirely of letters and underscores. Therefore it is not possible to define a macro whose name is the same as that of an alternative token. --- end footnote] The replacement list is then rescanned for more macro names as specified below.

16.3.4 - Rescanning and further replacement [cpp.rescan]

-1- After all parameters in the replacement list have been substituted, the resulting preprocessing token sequence is rescanned with all subsequent preprocessing tokens of the source file for more macro names to replace.
-2- If the name of the macro being replaced is found during this scan of the replacement list (not including the rest of the source file's preprocessing tokens), it is not replaced. Further, if any nested replacements encounter the name of the macro being replaced, it is not replaced. These nonreplaced macro name preprocessing tokens are no longer available for further replacement even if they are later (re)examined in contexts in which that macro name preprocessing token would otherwise have been replaced.
-3- The resulting completely macro-replaced preprocessing token sequence is not processed as a preprocessing directive even if it resembles one.
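
Applied to the original two macros, the rescanning rule plays out as follows. This is a sketch of the token-by-token trace as I read the rule, not actual compiler output:

#include <stdio.h>

#define a b
#define b a

/* Expanding the token 'a':
 *   its replacement list is:  b
 *   rescan: 'b' is a macro, so expand it; its replacement list is:  a
 *   nested rescan: this 'a' is the name of the macro currently being
 *   replaced, so it is NOT expanded again (it is "painted blue")
 *   net result:  a   (unchanged, and no longer replaceable)
 *
 * Expanding the token 'b' is symmetric and yields:  b
 * So the preprocessed output is token-for-token identical to the source.
 */
int main(void)
{
    int a = 20, b = 30;
    printf("%d %d\n", a, b);   /* prints: 20 30 */
    return 0;
}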

You can see the above more clearly by preprocessing the following program.

#define a b a
#define b a
int main(void)
{
 int a = 20, b = 30;
 printf("%d %d", a, b);
}
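
Feeding that snippet (as is, with no #include, to keep the output short) through a conforming preprocessor, e.g. gcc -E, should give roughly the following. The extra tokens that appear, and the ones that are left alone, make the rescanning rule visible; note that the expanded form is no longer valid C, which is exactly the point:

int main(void)
{
 int a a = 20, b a = 30;
 printf("%d %d", a a, b a);
}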

At last the cavalry arrives! I was starting to feel like General Custer!

I would put this under "undefined behavior" and assume that defining a to be b and then b to be a is bad practice. Therefore I would give up on trying to understand what the compiler does and move on to a different problem -- one that makes sense...

Just my 2 cents.

Thank you all for your suggestions.

hollystyles - Good material.

I would put this under "undefined behavior"

But it is clearly defined in the document to which I posted a link. It is important to understand in great depth the tools you work with. It is very possible for a large project to have potentially conflicting preprocessor directives -- not as simplistic as the example postulated here, but no less poignant -- like this.

I would put this under "undefined behavior"

But it is clearly defined in the document to which I posted a link. It is important to understand in great depth the tools you work with. It is very possible for a large project to have potentially conflicting preprocessor directives -- not as simplistic as the example postulated here, but no less poignant -- like this.

But is that defined only for Dev or is it defined in the language standards? If only in Dev, then it's a bad thing to rely on. When you change compilers, according to your suggestion, you must then start the "great depth" learning curve from scratch as you try to track down why your programs no longer function correctly.

IMO, you need to understand the language in great depth, not the tool's version of the language. Try to avoid compiler-dependent functions and implementations whenever possible. Wolfpack has the correct idea, and that is to quote the C++ Standard.

Hmmm...

Yes you should know the language well, but without a compiler (standards compliant or not) it is useless.

The language is just the wood; it is the tool, and the "in depth" knowledge of it, which produces the craft. A cabinet maker chooses his tools very carefully and lives with them until they become extensions of his own limbs. Standards are important and have their job to do, but IMHO you certainly shouldn't rely on them: they change often, will always be open to interpretation, and are sometimes blatantly ignored.

But still I agree with WaltP.

If any construct or statement, e.g. fflush(stdin), happens to be defined under one compiler, it makes little sense to adopt it if it results in undefined behaviour on other compilers (though it really is undefined under all implementations).

In the end, undefined behaviour of any kind should very much be avoided.

And by the way, carrying on with your example: if the artist knows what wood he is supposed to use, and how that wood can be made into a beautiful engraving or a piece of art, then it really doesn't matter in which part of the world he is; he just needs to know what tools are at his disposal and he is all set to make the next masterpiece.

In the end it's the knowledge which is more important than the tool, so if asked to make a choice I would very much choose knowledge over proficiency with a particular tool.

Hope you take all this in good faith.
Bye.

This article summarizes why a standard is needed.
Specifically this part.

Impact: The standard will make it easier to teach C++ (which is just coming into use for the Advanced Placement Computer Science courses in US high schools), to use C++ in applications, and to port C++ programs from one kind of computer to another. Basically, the standard heralds a new era of C++ use where more advanced techniques can be used effectively in industrial, research, and educational software. Software tools providers are already shipping C++ implementations and tools that approximate the standard. The standard allows users greater freedom of choice of C++ implementations, allows implementers and major users to check implementations against the standard using test suites and to compare implementations using performance tests. The increased stability and portability offered by the standard is a boon to library providers and tools providers as well as implementers. These improvements will help C++ application developers to build better applications faster, and to maintain them with less cost and effort. The result will be further improvements in the quality of applications delivered to end users - who, typically, will have no idea that they are relying on C++ in their everyday life.

Hmmm...

Yes you should know the language well, but without a compiler (standards compliant or not) it is useless.

The language is just the wood; it is the tool, and the "in depth" knowledge of it, which produces the craft. A cabinet maker chooses his tools very carefully and lives with them until they become extensions of his own limbs. Standards are important and have their job to do, but IMHO you certainly shouldn't rely on them: they change often, will always be open to interpretation, and are sometimes blatantly ignored.

Completely disagree. The language is the tool. The compiler is the wood. With knowledge of the language (how to make a cabinet) it doesn't matter what wood (Borland, Dev-C, MSVC...) you use. The compiler simply gives you the finished product. Each wood has its own distinct color and grain, but the functionality will be the same with the proper design. Sometimes the wood can have a knot in it that can be a detriment (system("pause")/fflush(stdin)) or an enhancement (getch()/kbhit()) to the final product.

This whole metaphor is interesting.... and slightly inaccurate. But it illustrates our points, I guess :)

If any construct or statement, e.g. fflush(stdin), happens to be defined under one compiler, it makes little sense to adopt it if it results in undefined behaviour on other compilers (though it really is undefined under all implementations).

Assuming I'm reading you correctly, your parenthetical statement is incorrect. "It [undefined behavior] really is undefined under all implementations" is not true. If the compiler designers have implemented a specific behavior for something the standard states is undefined (fflush(stdin)), it is not undefined in that implementation; it is an enhancement to the standard. This is perfectly reasonable for the designers to do. Whether a user should utilize this new behavior is another question.
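
For what it's worth, the portable way to get the effect people usually want from fflush(stdin) is simply to read and throw away the rest of the input line. A minimal sketch (the helper name discard_line is just made up for the example):

#include <stdio.h>

/* Portable alternative to the non-standard fflush(stdin):
   read and discard characters up to the end of the current line. */
static void discard_line(FILE *in)
{
    int c;
    while ((c = getc(in)) != '\n' && c != EOF)
        ;   /* just consume the character */
}

int main(void)
{
    int value;
    if (scanf("%d", &value) != 1)
        discard_line(stdin);        /* clear the bad input instead of fflush(stdin) */
    else
        printf("read %d\n", value);
    return 0;
}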

Completely disagree. The language is the tool. The compiler is the wood. ... Sometimes the wood can have a knot in it that can be a detriment (system("pause")/fflush(stdin)) or an enhancement (getch()/kbhit()) to the final product.

No, no, no -- I'm sorry, that's completely the wrong way around. system("pause")/fflush(stdin) and getch()/kbhit() are not wood (the language); they are prefabricated table legs/Lego bricks made for convenience.

You don't shape your compiler into native machine instructions with the code!

With knowledge of the language (how to make a cabinet)

Hmm, not really; just because you know C doesn't mean you know how to make a killer app. THAT's where the standards come in, with the recipe for a good cabinet, but to make a stunning cabinet you have to be brave enough to realise that the standard may be holding you back and shouldn't necessarily be relied on. You try to use best practice (courtesy of the standard): store the wood in the dry, plane with the grain, use pre-fabbed parts. BUT it's your knowledge of the wood AND the tool (with its library, options and optimizers) that makes the product shine.

All technologies (compilers/interpreters/languages) have their knots; I refer to them as piles of saucepans and knicker elastic. I mean, take null-terminated char arrays in C: not its strongest feature. strcat from the standard library? Not exactly scalable. I read that the Excel team at Microsoft made their own compiler and used Pascal strings to ensure they shipped a fast and shiny product, and they made a lot of money without the standard.

In the end it's the knowledge which is more important than the tool, so if asked to make a choice i would very much choose Knowledge over proficiency of a particular Tool.

On this point I would argue that if you have lots of knowledge but no special knowledge, you are in danger of becoming a jack of all trades instead of a master craftsman. (This is my biggest problem: I'm too interested in all of it to stick to a genre.)

On this point I would argue that if you have lots of knowledge but no special knowledge, you are in danger of becoming a jack of all trades instead of a master craftsman. (This is my biggest problem: I'm too interested in all of it to stick to a genre.)

Come on, my good friend -- if you mean to tell me that using a particular compiler or tool, and using it perfectly, is what crosses the thin line between a jack of all trades and a craftsman, then I completely disagree.

Take, for example, the two UML tools, Sparx Enterprise and Rational Rose. I completely agree that once you know what is where and the nifty tricks of the software, productivity is definitely boosted, but that is no replacement for the knowledge of UML which is driving you to use the software. And suppose you have to switch to some other UML modelling tool: it is the knowledge and understanding of UML which would save you, not the knowledge of a particular tool.

But this is just what I think...
Opinions may vary.

Come on, my good friend -- if you mean to tell me that using a particular compiler or tool, and using it perfectly, is what crosses the thin line between a jack of all trades and a craftsman, then I completely disagree.

Well, OK, I concede that was perhaps a little strong.

But you are changing tack with the UML argument. UML is just a method for designing; the field is much narrower and you don't 'create' with it. UML is UML is UML: you can't create a 'poetic' UML diagram like you can with a language.

I'm trying to keep this going but I'm running out of steam and the debate is all over the place.

This all started over WaltP dismissing dilips' pre-processor macros as "undefined behaviour" and not a real problem.

My argument is that it could be a real problem, and very possible in a large project. And the behaviour we have tested is clearly documented for the GCC compiler and in the standard (courtesy of Wolfpack).

And I never said one should use only ONE compiler, but I stand by the notion that you should take the time to know it well, and I know all programmers have their favourite tools; that you cannot deny.

LOL, OK, I agree.

I still think that you have got some things right there, and I also have got some things right there. And that UML example was meant to show the difference between knowledge and the use of a particular tool.

And err...

UML is just a method for designing; the field is much narrower and you don't 'create' with it. UML is UML is UML: you can't create a 'poetic' UML diagram like you can with a language.

Come on, man... UML and narrow? They are more like antonyms. The history of software engineering research has been dedicated more to software design and smart practices than to any particular language. If languages have power, then UML is what helps us harness that power and make it available to the masses.

But still ALL PEACE.

This all started over WaltP dismissing dilips' pre-processor macros as "undefined behaviour" and not a real problem.

My argument is that it could be a real problem, and very possible in a large project. And the behaviour we have tested is clearly documented for the GCC compiler and in the standard (courtesy of Wolfpack).

I concede that it is no longer undefined behaviour -- thanks to Wolfpack too. I didn't actually look up the standard to see; I made an erroneous assumption. (But it was a logical assumption. :))

And I never said one should use only ONE compiler, but I stand by the notion that you should take the time to know it well...

Still disagree. Regardless of the compiler, you need to know the language. All you need to know about the compiler is how to start the compile process. You don't need to know how it parses, how it preprocesses, how to debug (although that can admittedly help), nor any of the esoteric stuff that's packaged in.

Now if by 'compiler' you mean the IDE (which is not part of the compiler but a surrounding package containing the compiler), I still disagree. A bare-bones editor (Notepad) is certainly enough. I still remember editing on teletypes -- not even a screen existed -- and we did all right. The help an IDE gives you is a big benefit, but it is not required to be a top-notch programmer. It all comes down to knowing the language and being able to go from code to program. How you get there is moot IMO.

... and I know all programmers have their favourite tools; that you cannot deny.

Absolutely true. That's actually why I usually use an editor (VEdit) and a command line compiler (Borland 5.5) for my programming. I use what I like.

I still remember editing on teletypes -- not even a screen existed

Wow, how old are you? I'd expect to find you at slashdot rather than daniweb. :cheesy: I've read about those days: you submitted your code and didn't get output until that afternoon, or maybe the next day!

Yes, I think the command line is much overlooked these days; simple things don't break so easily. Thank god for Cygwin, so us Windows souls don't miss out.

My favourite editors are vim and codepad. VS is the ultimate armchair with built-in remote and mini beer fridge; when it's working, it's a dream.

It all comes down to knowing the language and being able to go from code to program. How you get there is moot IMO.

Hmmm, you still haven't convinced me. I still think you're attaching too much importance to the language; isn't it more about knowing how it all works? Essentially, languages expose the same things: variables, loops and decisions. I mean, my own experience has been that the language is easy; what I have found hardest is *exactly* that: getting from code to program. The real craft for me has been learning how to organise my header and class files, how to compile them as a static library or a dynamic library, how to keep track of what version I'm on, go back a step when a build breaks, build for a debug session, build for a release version. Learning the tools that help me with these tasks makes me effective, because I can then concentrate on solving the problem and writing the solution, which is the fun part.

Wow, how old are you? I'd expect to find you at slashdot rather than daniweb. :cheesy: I've read about those days: you submitted your code and didn't get output until that afternoon, or maybe the next day!

Old... But we were on a PDP-10 timeshare machine. It looked like the command line today -- compile, run, fix, repeat. It was the batch people that had to wait -- but only a few minutes on this system. Batch means those punch cards -- even older than me... My record collection used to be on punch cards... :eek:


We're obviously talking about the same thing from different angles. I thought you were talking about the tool -- Dev-C, MSVC, etc. -- not the knowledge of how to get the job done with the language.

Hmmm, you still haven't convinced me. I still think you're attaching too much importance to the language; isn't it more about knowing how it all works? Essentially, languages expose the same things: variables, loops and decisions. I mean, my own experience has been that the language is easy; what I have found hardest is *exactly* that: getting from code to program.

If you don't know the language, you can't get the job done, no matter how much you know about "how it all works". But then again, you have to know how it all works to know what to code...

The real craft for me has been learning how to organise my header and class files, ...

What you are talking about here is, to me, the language. Organizing the code, the procedures, the flow -- it's the language, not the tools. I think we're really on the same page; we just have different definitions for the terms.

The rest you mention are tools for project organization/management, for the most part. Also important, but you can still finish the task without any of them.
