I'm trying to write a program for my C++ class that takes binary numbers and converts them to decimal. I have the conversion working fine and dandy, but I'm having trouble with my EOF statement. The program requires that I use 3 functions: the first prompts the user for a binary number of up to 8 digits, the second converts it to decimal, and the third displays the results.
I have everything coded so far, but the teacher wants the user to type '@' when they want to quit the program. Then I should assign @ the value of -1 and send it back to main. When I run my program it freaks out when I type @, because the function is an int function. How do I change it so that I can type '@' without pissing off my computer?
Note: I am not allowed to use arrays, just functions, classes, and templates, but I'm not that familiar with classes and templates. PLEASE HELP!

Discussion span: 11 years · Last post by Ancient Dragon

Every character you can type on the keyboard is entered into your program as a number. The char data type is really a one-byte integer; as far as C and C++ are concerned, a char is just a small int you can compare and do arithmetic with. So if the '@' character is causing your program problems, then it is due to a bug in your program. But since I can't see your code from where I am sitting, I suppose you will just have to post it here for public display.

#include <iostream>
using namespace std;

//function prototypes
int promptUser(int);
void convertNumber(int, int &);
int outputResults(int, int);

//begin main program
int main()
{
	//initialize variables
	int userChoice;
	int binary;
	int decimal;

	//display to the user what the program will do
	cout << " This program will prompt you to input a binary number. \n"
		 << "Then it will convert your choice into the decimal equivalent.\n"
		 << "Once it has done this, it will display your original number\n"
		 << "and the decimal equivalent for the number.  If you wish to \n"
		 << "exit the program, simply type '@' and the program will end.\n\n";

	//define variables
	binary = 0;
	decimal = 0;
	userChoice = promptUser(binary);

	//call other functions if userChoice is successful
	while (userChoice != -1)
	{
		convertNumber(userChoice, decimal);
		outputResults(userChoice, decimal);
		userChoice = promptUser( binary );
	}//end while loop

	return 0;
}//end main program

//begin promptUser function 
int promptUser(int binary) 
{
	//initialize variables
    int userChoice;
	//define variable
	userChoice = 0;

	//prompt user to input binary number
	cout << "Enter binary number with a maximum of 8 characters (type @ to quit): ";
	cin >> binary;

	//perform error checking to make sure only 8 digits are used
	while (binary > 11111111 || binary < 0)
	{
		cout << "\nYou are only allowed up to 8 digits and they must be zeros or ones. \n"
			 << "Please enter binary number again: ";
		cin >> binary;
	}//end error checking

	//test for EOF using switch structure
	switch (binary)
	{
		case '@':
			userChoice = -1;
			break;
		default:
			userChoice = binary;
	}// end switch
	return userChoice;
}//end promptUser function

//begin convertNumber function
void convertNumber(int userChoice, int &decimal)
{
	//initialize local variables
	int tenMillionsDigit;
	int millionsDigit;
	int hundredThousandsDigit;
	int tenThousandsDigit;
	int thousandsDigit;
	int hundredsDigit;
	int tensDigit;
	int onesDigit;

	//perform calculations based on binary number
	tenMillionsDigit = userChoice / 10000000 % 10;
	tenMillionsDigit = tenMillionsDigit * 128;
	millionsDigit = userChoice / 1000000 % 10;
	millionsDigit = millionsDigit * 64;
	hundredThousandsDigit = userChoice / 100000 % 10;
	hundredThousandsDigit = hundredThousandsDigit * 32;
	tenThousandsDigit = userChoice / 10000 % 10;
	tenThousandsDigit = tenThousandsDigit * 16;
	thousandsDigit = userChoice / 1000 % 10;
	thousandsDigit = thousandsDigit * 8;
	hundredsDigit = userChoice / 100 % 10;
	hundredsDigit = hundredsDigit * 4;

	tensDigit = userChoice / 10 % 10;
	tensDigit = tensDigit * 2;

	onesDigit = userChoice % 10;
	onesDigit = onesDigit * 1;

	//add up all the values
	decimal = tenMillionsDigit + millionsDigit + hundredThousandsDigit + tenThousandsDigit 
		      + thousandsDigit + hundredsDigit + tensDigit + onesDigit;

}//end convertNumber function

//begin outputResults function
int outputResults(int userChoice, int decimal)
{
	//output results for user
	cout << "\nOriginal binary number: " << userChoice << endl;
	cout << "Decimal equivalent: " << decimal << endl << endl;

	return 0;
}// end outputResults function

This is my code so far. I just need to figure out how to let the user type '@' when they are done, and I need to assign -1 as the value for @ and send it back to main. Can't use arrays.
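(As an aside on convertNumber: the eight divide-and-modulus pairs all follow one pattern, so the same conversion could be sketched as a loop, still with no arrays. The function name below is made up for illustration and is not part of the assignment code.)

```cpp
// Hypothetical stand-alone version of the conversion: peel off the
// decimal digits of the typed number (each one is a binary digit)
// and weight them by successive powers of two.
int convertWithLoop(int userChoice)
{
	int decimal = 0;
	int placeValue = 1;                 // 1, 2, 4, 8, ... per digit
	while (userChoice > 0)
	{
		decimal += (userChoice % 10) * placeValue;
		userChoice /= 10;
		placeValue *= 2;
	}
	return decimal;                     // e.g. 1011 -> 11, 11111111 -> 255
}
```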


Oh yes -- now I see the problem. cin is attempting to get an integer from the keyboard, but it fails (and puts the stream in an error state) when a non-digit character like '@' is entered. What I would do is get the input as a character array (or C++ string) and then check it -- starting on line 77 in your program.

char buf[16] = {0};
cin >> buf;               // read the raw input first
if( buf[0] == '@')
    return -1;            // the sentinel your main() tests for
binary = atol(buf);       // atol is in <cstdlib>

is there a way to do that without using arrays? I'm allowed to use strings but not arrays. However I'm not sure how to use a string in this case


Oh yes, I forgot. Here it is using string

string buf;
cin >> buf;               // read the raw input first
if( !buf.empty() && buf[0] == '@')
    return -1;            // the sentinel your main() tests for
binary = atol(buf.c_str());

And there are other C++ ways to make the conversion on the last line of the code above.


Where in my code would that snippet go? When I tried it, it compiled, but it returned 0 for my calculations and still didn't allow me to type '@'. Also, what does buf stand for? Thanks again for helping me.


>>where in my code would that snippet go

Replace line 72 with that snippet (buf is just short for buffer). If your program still does not work, then there are other logic errors that you will have to figure out how to fix. Sorry, but I can't help you any more tonight.
