Hi, I was browsing through the contents of an SD card containing a FAT16 filesystem with a hex editor. I was also reading through someone else's program that navigates the filesystem, i.e. reads the master boot record, boot sector, etc., and he has used a "long" for fields that are 4 bytes long. The output of his program is absolutely correct, matching what I see in the hex editor.

What I'm wondering is how the C compiler knows the endianness of the filesystem, i.e. how it can tell which byte is the most significant and which is the least significant just by assigning that field to a long variable. For example, one 4-byte field shows up as 81 00 00 00 in my hex editor, and the program prints 00000081! The filesystem is using little-endian notation, but I can't understand how the C compiler knows that endianness beforehand.
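To make the observation concrete, here is a minimal test I wrote myself (this is not his program, just the four example bytes copied straight into a 32-bit variable):

    /* My own minimal test, not the program I was reading: copy the four
       bytes 81 00 00 00 (as shown in the hex editor) into a 32-bit
       variable and print it. */
    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        uint8_t raw[4] = { 0x81, 0x00, 0x00, 0x00 }; /* bytes as seen on disk */
        uint32_t value;

        memcpy(&value, raw, sizeof value); /* reinterpret in the host's byte order */

        /* Prints 00000081 here, exactly like his program does. */
        printf("%08" PRIX32 "\n", value);
        return 0;
    }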

All 2 Replies

The filesystem is using little-endian notation, but I can't understand how the C compiler knows that endianness beforehand.

The compiler doesn't know anything about the filesystem's endianness; it only knows the endianness of the platform it is written for, which is fixed and well known. FAT16 stores its multi-byte fields little-endian, and your machine is almost certainly little-endian as well, so copying the raw bytes into a long happens to give the correct value. On a big-endian machine, the same code would read that field as 81000000 unless it swapped the bytes explicitly.
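If you ever need the code to work on a big-endian host as well, don't rely on the platform's byte order at all; build the value from the individual bytes. A minimal sketch (read_le32 is my own helper name, not anything from a library, and the bytes are just your 81 00 00 00 example):

    /* Sketch of reading a 4-byte little-endian on-disk field portably. */
    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Assemble a 32-bit value from four bytes stored little-endian on disk. */
    static uint32_t read_le32(const uint8_t *p)
    {
        return (uint32_t)p[0]
             | ((uint32_t)p[1] << 8)
             | ((uint32_t)p[2] << 16)
             | ((uint32_t)p[3] << 24);
    }

    int main(void)
    {
        uint8_t field[4] = { 0x81, 0x00, 0x00, 0x00 }; /* bytes from the boot sector */

        /* Prints 00000081 on any host, little- or big-endian. */
        printf("%08" PRIX32 "\n", read_le32(field));
        return 0;
    }

The shift-and-OR expression fixes the byte order in the source code itself, so the result no longer depends on the CPU the code runs on.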

Okay...Thanks again!!
