
Hi, I was browsing the contents of an SD card containing a FAT16 filesystem with a hex editor. I was studying someone's program that navigates the filesystem, i.e. reads the master boot record, boot sector, etc., and he used a "long" for fields that are 4 bytes long. The output of his program is absolutely correct, as I can verify against the hex editor. I was just wondering how the C compiler knows the endianness of the filesystem, i.e. how the compiler can tell which is the most significant byte and which is the least significant byte just by assigning that field to a long variable. For example, a 4-byte field contains 81 00 00 00 in my hex editor, and the program's output is 00000081! It is using little-endian notation, but I can't understand how the C compiler knows the endianness beforehand.


It is using little-endian notation, but I can't understand how the C compiler knows the endianness beforehand.

The compiler is written for a specific platform, and that platform's endianness is well known. The compiler knows nothing about the filesystem; it simply generates loads and stores in the CPU's native byte order. FAT16 stores its multi-byte fields little-endian, and your CPU is also little-endian, so reading the raw bytes straight into a long happens to produce the right value. On a big-endian machine the same code would print a byte-swapped value (81000000 for your example), which is why portable filesystem code decodes each field byte by byte instead of relying on the host's byte order.
