In this program I have put a numeric value into a char variable, and I'm trying to observe what happens when printing it with the %d and %c specifiers.

int main(void) {
    char ch = 23;
    /* ... printf calls using %d and %c ... */
}

Here, when I use %d I get the value 23.
When I use %c I get some odd character.
Why is the result different in the two cases?
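
For reference, here is a complete, compilable sketch of what the question describes (the exact printf format strings are an assumption based on the description above):

#include <stdio.h>

int main(void)
{
    char ch = 23;
    printf("%d\n", ch);   /* prints 23 */
    printf("%c\n", ch);   /* prints the character whose code is 23 (a control character) */
    return 0;
}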

If the result were the same, why would we have both %d and %c specifiers? If you use %d then the value of ch is interpreted as an integer and the value 23 is displayed. If you use %c then the value of ch is interpreted as a character and the character representation of that value (a control character in this case) gets printed. Because there's no good glyph for that character, you're probably getting whatever decent facsimile your command shell can come up with.

Try changing the value from 23 to 65 for a more obvious test of what's happening.
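
Something like this (a quick sketch of the suggested change, with the expected output noted in the comments):

#include <stdio.h>

int main(void)
{
    char ch = 65;
    printf("%d\n", ch);   /* prints 65 */
    printf("%c\n", ch);   /* prints A, the character whose ASCII code is 65 */
    return 0;
}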

Because the different format specifiers tell printf() to interpret the value differently when displaying it. When you use %c, it tells the program to treat ch as a single character, in this case printing out the character with the ASCII value 23. When you use %d, it tells the program to display ch as an integer, and just print out the number 23 in decimal form.

The printf() function can't really tell the difference between a char and an int without these format specifiers; it just sees ones and zeros. (In fact, a char argument is promoted to int when it's passed to printf(), so both %d and %c receive the same value, and only how it is displayed changes.) A char, in most environments, is really just an 8-bit integer that is used to store a printable character.
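
A small sketch of that point, printing the very same value with both specifiers:

#include <stdio.h>

int main(void)
{
    char letter = 'A';                          /* stored as the small integer 65 */
    printf("%c %d\n", letter, letter);          /* prints: A 65 */
    printf("%c %d\n", letter + 1, letter + 1);  /* prints: B 66 */
    return 0;
}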
