In undefined cases, anything can happen: the program may crash, or unexpected results may be produced. What happens is up to the compiler.

In unspecified cases, the Standard has deliberately left the exact behaviour of certain constructs open, so that the compiler is at liberty to do whatever it deems most efficient.

In both cases, what happens is up to the compiler. So a compiler can be written to do some fixed thing when it encounters an unspecified or undefined construct. So in essence, what is undefined in the Standard becomes defined in the context of a particular compiler.

Doesn't it blur the difference between the two, when the implementation of both the undefined and the unspecified is handled in a fixed way by the compiler?

Or is it that, during the coding of the compiler, the undefined cases are simply not accounted for, so their result unintentionally depends on the way the compiler is written, which is obviously different for different compilers, and hence the difference?

>So in essence, what is undefined in the Standard becomes
>defined in the context of a particular compiler.

That rather goes without saying, but doesn't change anything. The result can still be unpredictable, and it's certainly not portable.

>Doesn't it blur the difference between the two, when the
>implementation of both the undefined and the unspecified
>is handled in a fixed way by the compiler?

Not really. Unspecified means the standard offers multiple options to choose from, but the compiler must choose one of them. Undefined means the standard says "you're on your own" and the compiler can do anything.

>Or is it that, during the coding of the compiler, the undefined
>cases are simply not accounted for, so their result
>unintentionally depends on the way the compiler is written

That's one possibility, yes.
