Hi guys,
I have a query regarding the "sizeof" operator, related to this program:
#include <stdio.h>

int main(void)
{
    int x;
    x = sizeof(-32768);
    printf("%d\n", x);
    return 0;
}

It prints the value as 4, but logically it should be 2, because the integer range is from -32768 to 32767.
==============================================
And my second question: why is the range of int always something like -32768 to 32767?
Why is it not -32768 to 32768? Why is there a gap?

>Then it's printing it's value as 4, But logically it should be 2 because
>the integer range is from -32768 to 32767.
Get with the times: it looks like your machine has 32-bit ints, not 16-bit ones. sizeof reports the size of the expression's type, and on your machine that type is a 4-byte int.

>why it is not like -32768 to 32768, why there is a gap
It's related to how your machine represents signed numbers (two's complement). There are other representations, such as sign-magnitude and one's complement, that don't have the same asymmetry.

Look in <limits.h> to find out the maximum/minimum values of the data types for your compiler.

>And my second question is why the range of int, float ,are always like -32768 to 32767 etc...
>why it is not like -32768 to 32768, why there is a gap

0 (zero) has to come out of the pot somewhere, and the way signed ints work in two's complement, it comes out of the positive half.