I found that the limits of int and short int are the same. I just want to know: what is the need for two types with the same limits?

I doubt they're the same.

If they are the same, then why are two names being used? What are the uses of the keyword "short"?

Essentially, it's about programmer choice. Originally, it went something like:
Choose 'short' if you are concerned about the amount of storage taken up.
Choose 'long' if you are concerned about the numeric range of the data.
Choose 'int' if you are concerned about getting the 'best' performance.

The common assignments are that short is 16 bits, long is 32 bits, and int is whatever the natural word size of the machine is.
However, any assignment is valid so long as it is consistent with the limits in the ANSI standard. Some DSP chips, for example, have 32 bits for everything.

Nowadays, it doesn't make a great deal of difference on modern desktop architectures. But it can be important for small footprint embedded systems.
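
If you want to see what your own compiler chose, the ranges in <limits.h> show it directly. A minimal sketch (the printed values are implementation-defined, so they will vary by platform):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* These macros expand to whatever limits the implementation chose;
       the standard only guarantees minimums, e.g. INT_MAX must be at
       least 32767. */
    printf("short: %d to %d\n", SHRT_MIN, SHRT_MAX);
    printf("int:   %d to %d\n", INT_MIN, INT_MAX);
    printf("long:  %ld to %ld\n", LONG_MIN, LONG_MAX);
    return 0;
}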

Also, C was originally developed on a 16-bit machine with only 32K words of memory. Memory space was a premium commodity.

Yes.

>I found that the limits of int and short int are the same

Also, C is supposed to be a portable programming language, which means that in theory you can use it on different platforms and architectures.
That's the reason why C defines only minimum storage sizes:
the type short is at least two bytes, long at least four bytes, and long long at least eight bytes. And even though they can be larger than their minimum sizes, they must follow this order:
sizeof( short ) ≤ sizeof( int ) ≤ sizeof( long ) ≤ sizeof( long long )
Check it out and you'll see that this applies to your machine.
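
For instance, a few lines like these (just a sketch; the numbers depend on your compiler and platform) let you verify the ordering:

#include <stdio.h>

int main(void)
{
    /* sizeof yields a size_t, so %zu is the matching specifier (C99) */
    printf("sizeof(short)     = %zu\n", sizeof(short));
    printf("sizeof(int)       = %zu\n", sizeof(int));
    printf("sizeof(long)      = %zu\n", sizeof(long));
    printf("sizeof(long long) = %zu\n", sizeof(long long));
    return 0;
}

On a typical 64-bit Linux machine this prints 2, 4, 8 and 8, but a 16-bit compiler could legitimately print 2, 2, 4 and 8.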

> If they are the same, then why are two names being used?

Surprised that no one mentioned that they are NOT necessarily the same, and that's the reason two different names are used. As Aia said, the only rule (according to the standard) is
sizeof( short ) ≤ sizeof( int ) ≤ sizeof( long ) ≤ sizeof( long long )
which means that sizeof( short ) < sizeof( int ) is also possible; the sizes are not always the same.
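
If your code genuinely depends on a particular size, you can state that assumption so the compiler checks it for you. A sketch using C11's static_assert (the 4-byte requirement here is only an example assumption):

#include <assert.h>   /* C11: provides the static_assert macro */
#include <stdio.h>

/* Compilation fails on any platform where this assumption is false */
static_assert(sizeof(int) >= 4, "this code assumes int is at least 4 bytes");

int main(void)
{
    printf("int is %zu bytes on this machine\n", sizeof(int));
    return 0;
}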

I tried asking some contacts, and one of them said that C uses 2 bytes for int on Windows, i.e. int is treated as short int, but gcc on Linux uses 4 bytes, i.e. int is treated as long int.

Can somebody clarify this for me?

>one of them said that C uses 2 bytes in windows for int
It hasn't been that way for a while. On many modern implementations, int is 4 bytes and short is 2 bytes.

> one of them said that C uses 2 bytes in windows for int
Someone trying out a sizeof(int) on a prehistoric compiler would definitely say that.
