When you use char(130) you get é. When you do int('é') you get -23... and with my program that prints the value, I get -126...

How on earth can I get the value 130 when entering é?

Thanks for looking.

With this program I printed out the character list with each numeric value:

#include <iostream>
#include <cstdlib>   // for system()
using namespace std;

int main() {
    // Print every 8-bit character next to its numeric code (0..255).
    for (int i = 0; i < 256; i++) {
        cout << char(i) << " " << i << '\n';
    }
    system("pause"); //not safe
    return 0;
}

With this program I enter a letter and it gives me the numeric value:

#include <iostream>
#include <cstdlib>   // for system()
using namespace std;

int main() {
    // Read characters until end of input and print each one's numeric value.
    char i;
    while (cin >> i) {
        int temp = int(i);
        cout << temp << '\n';
    }
    system("pause"); //not safe
    return 0;
}

I cannot test any script at the moment as I am in the middle of a long run and looking for things to do before it finishes.

The extended ASCII character charts show that 130 should be é, so char(130) is working.

As for the -126: a char is 8 bits and, like an int, it can be treated as a signed number, so byte values above 127 wrap around into the negatives; since 256 - 126 = 130, the byte 130 shows up as -126. The char-to-int conversion is treating the char as a signed number, not as a character code. Casting through unsigned char recovers the byte value:

char c = 'é';
int value = static_cast<unsigned char>(c); // should be 130

Whether plain char is signed or unsigned is implementation-defined; most compilers default to signed, and some offer an option (e.g. GCC's -funsigned-char) to change it.
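A minimal runnable sketch of that cast, hard-coding the byte value 130 so it does not depend on the source file's encoding (the assumption here is a console using an OEM code page such as 437/850, where byte 130 is é):

#include <iostream>
using namespace std;

int main() {
    char c = static_cast<char>(130);  // byte 0x82; é on a CP437/CP850 console (assumed)
    int as_signed   = c;                              // -126 when char is signed
    int as_unsigned = static_cast<unsigned char>(c);  // 130 either way
    cout << c << " signed: " << as_signed
         << " unsigned: " << as_unsigned << endl;
    return 0;
}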

But it helps if you output the char next to the number, to check that everything is what you think it is in your code. Also, '\n' should be replaced by endl, which flushes the output; this can be important for later examples.

cout << ">" << i << "< >" <<temp <<endl;
As for the -23, this doesn't look right. Where are you getting the é from? If you copy and paste from Word, it may be a different symbol from what you think it is. It may also be an encoding mismatch: in Windows-1252 (the usual Windows text encoding) é is byte 233, which wraps around to 233 - 256 = -23, while in the console's code page 437 it is byte 130, which wraps to -126. And if the source is UTF-8, é is two bytes (195 169), so you would be reading only one of the two bytes it uses.
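A quick way to see what you are really reading is to dump the raw byte value of every character that arrives; a sketch:

#include <iostream>
using namespace std;

int main() {
    // Print the unsigned byte value of each character read.
    // A single-byte code page prints one number per é;
    // UTF-8 input would print two (195 169).
    char c;
    while (cin >> c) {
        int b = static_cast<unsigned char>(c);
        cout << b << ' ';
    }
    cout << endl;
    return 0;
}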


Thanks.
In my project's code I will just test whether the value is negative and, if so, add 256 to it (= 130 in the example). :-)
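A sketch of that fix applied to the value-printing loop above (it assumes 8-bit chars; casting through unsigned char gives the same result in one step):

#include <iostream>
using namespace std;

int main() {
    char i;
    while (cin >> i) {
        int temp = i;
        if (temp < 0)
            temp += 256;   // -126 becomes 130, matching the chart
        cout << temp << '\n';
    }
    return 0;
}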

It has to do with the character encoding the compiler expects for your source code and the character encoding your program expects for its input and output. Surprisingly, the two may be different.

Unfortunately, anything outside the ASCII range (0..127) is not safe to put in a string/character literal.

Alas.
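If you need such a character anyway, one workaround is to spell the byte out with a numeric escape, so the source file's encoding no longer matters. A sketch (it assumes a console code page, such as CP437/CP850, where é is byte 130):

#include <iostream>
using namespace std;

int main() {
    cout << "caf\x82" << endl; // \x82 = 130; prints "café" on a CP437/CP850 console (assumed)
    return 0;
}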
