Thanks for your answers
Just like I thought. The computer doesn't know anything by itself. We have to tell it what is what and how to handle it. We can tell it to treat a value as an ASCII character with: char variable = 37;
or as an integer with: int variable = 37;
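For example, here's a minimal sketch of that difference (the names c and i are just placeholders I picked):

    #include <iostream>

    int main() {
        char c = 37; // same value, but treated as an ASCII code
        int  i = 37; // treated as a plain number

        std::cout << c << '\n'; // prints % (the character whose ASCII code is 37)
        std::cout << i << '\n'; // prints 37
    }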
There's nothing wrong with mixing char and int per se; whether or not it's "bad" depends entirely on context.
As a general rule, an int must always be at least as large as a char. Both int and char are integral types, although the default representation of a char (when output as text) is as a character rather than a number.
It becomes "bad" in situations where you try to fit a value into a variable that isn't big enough to hold it.
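To illustrate, a sketch of what can go wrong (assuming a typical 8-bit char; the exact result of the narrowing assignment is implementation-defined):

    #include <iostream>

    int main() {
        int big = 300;    // fits comfortably in an int
        char small = big; // a char usually holds only 8 bits, so 300 doesn't fit
        // On a typical system, 300 wraps around to 44 (300 - 256),
        // so the original value is silently lost.
        std::cout << static_cast<int>(small) << '\n'; // prints 44, not 300
    }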