Thanks for your answers.
Just like I thought. The computer doesn't know anything by itself; we have to tell it what is what and how to handle it. We can tell it to treat a value as an ASCII character with: char variable=37;
or as an integer with: int variable=37;
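For instance, here is a quick sketch of that difference (the variable names are just for illustration):

#include <stdio.h>

int main(void)
{
    char c = 37; /* same value, shown as a character by default */
    int  i = 37; /* shown as a plain number */

    printf("as a char: %c\n", c); /* prints '%', the ASCII character 37 */
    printf("as an int: %d\n", i); /* prints 37 */
    return 0;
}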
There's nothing wrong with mixing char and int per se; whether it's 'bad' depends entirely on context.
An int is always at least as large as a char; the C standard guarantees it. Both int and char are integral types, although the default representation of a char (when output as text) is a character rather than a number.
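Because both are integral types, you can mix them freely in arithmetic; a char simply promotes to int. A small sketch (the names are made up for the example):

#include <stdio.h>

int main(void)
{
    char letter = 'A';        /* stored as the integer 65 */
    int  next   = letter + 1; /* the char promotes to int here */

    printf("%c is %d; the next character is %c\n", letter, letter, next);
    return 0;
}

This prints "A is 65; the next character is B".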
It becomes "bad" when you try to store a value in a variable that isn't big enough to hold it.
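For example, stuffing 300 into a plain char (assuming the usual 8-bit char) silently loses information:

#include <stdio.h>

int main(void)
{
    int  big   = 300; /* fits easily in an int */
    char small = big; /* 300 doesn't fit in 8 bits */

    /* On a typical implementation with a signed 8-bit char this prints 44
       (300 modulo 256) rather than 300; the narrowing conversion is
       implementation-defined. */
    printf("%d\n", small);
    return 0;
}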