Someone please tell me the difference between 0 and '\0' and NULL.
I thought I knew it! 0 and NULL are essentially the same thing.. '\0' is the null terminator for a string..
It's just that sometimes when I convert a string to a char array and do some encryption on it, I see TWO null-terminating characters.. The second one should never be there and should never get reached, because the string terminates as soon as it hits the first. So WHY does this second null terminator appear out of nowhere? Here's where the problem lies in my mind.
I can convert all the chars in the char array (INCLUDING the null terminator) to ASCII codes, and they all convert perfectly fine into int values.. and vice versa back into chars.. So no problem there..
Now when doing the same with hex, it leaves out the null terminator. I.e. char to hex to char: when it hits the hex part, the result is one char shorter, and thus the output char array is also one char shorter (the original minus the null terminator).
Now why does ASCII have a code for the null-terminating character but hex does not? Hex has the value 0.. shouldn't it use that?
Edited 4 Years Ago by triumphost: n/a