I couldn't understand what (ASCII code) is. Thanks

Salem commented: Try google - a lot less effort than joining a forum for just 1 post. -4

ASCII characters are just a bunch of random characters assembled to create one grand character. Here is an example of a 'T' in ASCII:

*************
*************
     ***
     ***
     ***      
     ***

Sorry restrictment, but that's "ASCII art" you're describing: making shapes/pictures out of ASCII characters.

ASCII is the American Standard Code for Information Interchange. Contrast it with a slightly older standard like EBCDIC (which I think still exists on some mainframes). From my understanding of it, if your system supports ASCII (see this table), lowercase 'a', for example, must be represented by 97, as must all the other letters, symbols, and things (like "beep").

Unicode is a newer standard (late '80s/early '90s) which encompasses a much larger set, including many of the international alphabets, but it still keeps the original ASCII codes as a subset (the UTF-8 encoding stores them unchanged).

Well, that's probably way more than you wanted to know, but in a nutshell it's just the numbers used to represent characters and a standard for how that conversion should be laid out. This wasn't meant to be exhaustive, just to give you the links in case you wanted to know anything else.
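If you want to see that mapping for yourself, here is a minimal C++ sketch (assuming an ASCII-compatible system):

#include <iostream>

int main() {
    char letter = 'a';
    // Casting the char to int shows the numeric ASCII code the machine stores.
    std::cout << "'" << letter << "' is stored as "
              << static_cast<int>(letter) << '\n';   // prints 97 on an ASCII system
    return 0;
}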

commented: Excellent post :) +6

Oy, I just looked it up, and I am wrong again... boy, have I been on a roll lately... sorry for the wrong info. :icon_sad:

Nah, it happens, not to worry. Worst case he/she'll come up with some amazing art (speaking of which check out sknake's post from 12/25 (http://www.daniweb.com/forums/thread248920.html) -- it's in C#, but it could be easily translated).

In terms of ASCII, I think people raised on BASIC probably still dream of CHR$ -- EDIT: oops just looked it up and it still exists in VB.
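For anyone wondering, a rough C++ equivalent of CHR$ is just casting the code back to a char; a minimal sketch, assuming an ASCII system:

#include <iostream>

int main() {
    // CHR$(65) in BASIC gives "A"; in C++ we can cast the int 65 to a char.
    std::cout << static_cast<char>(65) << '\n';   // prints 'A' on an ASCII system
    return 0;
}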

A computer is a machine that can process data and information. Just as a human uses languages to process data, a computer uses a language called machine language. This machine language has only two letters, or in fact two values: 0 and 1. Why does a computer support only 0 and 1? Because that is what is easiest for the machine to work with.
So how are we going to store the letter 'A' in a computer? Well, we only have 0s and 1s, or bits. Most of today's computer memory cells are 1 byte long, or in other words 8 bits. Using these 8 bits, we can represent 256 different bit patterns. These 256 values are more than enough to store the uppercase and lowercase letters of the English alphabet. The extra values can be used to represent numbers, symbols, etc. (just look at your keyboard). So we can make this a standard, so that the same bit pattern will represent the same thing on another machine. Hence the ASCII standard was created.

In ASCII, 'A' is represented as 65.
If you convert 65 into binary --> 01000001.
If you use the ASCII standard and tell your computer that a value in memory should be handled as a character, the computer will take that bit pattern, compare it with the ASCII table, and give you the corresponding character or symbol.
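Here is a minimal C++ sketch of that lookup (assuming an ASCII system), showing the same byte as a character, a number, and a bit pattern:

#include <bitset>
#include <iostream>

int main() {
    char c = 'A';
    // One byte viewed three ways: the character, its decimal code, and its 8-bit pattern.
    std::cout << c << " = " << static_cast<int>(c) << " = "
              << std::bitset<8>(static_cast<unsigned char>(c)) << '\n';
    // Expected output on an ASCII system: A = 65 = 01000001
    return 0;
}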

Note that the Enter, space, and delete keys also have values in the ASCII table, but because you cannot print them, they are commonly called non-printable characters. Now you may wonder why they are still called characters: because they are represented in the ASCII table, I guess. :P

Why does a computer support only 0 and 1? Because that is what is easiest for the machine to work with.

I just wanted to add this for the sake of clarity:

Computer systems are constructed of digital electronics. That means that their electronic circuits can exist in only one of two states: on or off.
These patterns of "on" and "off" stored inside the computer are used to encode numbers using the binary number system. The binary number system is a method of storing ordinary numbers such as 42 or 365 as patterns of 1's and 0's. Because of their digital nature, a computer's electronics can easily manipulate numbers stored in binary by treating 1 as "on" and 0 as "off." Computers have circuits that can add, subtract, multiply, divide, and do many other things to numbers stored in binary.

(Source: http://www.swansontec.com/binary.html)
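To make that concrete, here is a small C++ sketch (my own, not from the quoted article) that converts an ordinary number such as 42 into its binary digits by repeated division by 2:

#include <iostream>
#include <string>

// Build the binary digits of n from right to left using the remainders of division by 2.
std::string to_binary(unsigned int n) {
    if (n == 0) return "0";
    std::string bits;
    while (n > 0) {
        bits.insert(bits.begin(), char('0' + (n % 2)));  // remainder is the next bit
        n /= 2;
    }
    return bits;
}

int main() {
    std::cout << 42 << " in binary is " << to_binary(42) << '\n';   // 101010
    return 0;
}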

Thank you all very much :)
Can I ask another question?
I started to program a snake game using C++. How can I use the arrow keys (in a console application) to make the snake move? Is it linked with ASCII code?
Thanks everybody :)

Yes, you can, but you will have to figure out what codes the arrow keys produce; they aren't ordinary ASCII characters (see the sketch below).
In future, start a separate thread for your questions, even if you feel that your question is related to the current discussion.
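Here is a minimal sketch of reading the arrow keys in a Windows console with <conio.h>. The codes used (0/224 as the prefix, then 72/80/75/77) are what _getch() typically reports for the arrow keys on Windows compilers, so verify them on your own setup; on other platforms you would need a different approach (e.g. ncurses).

#include <conio.h>
#include <iostream>

int main() {
    std::cout << "Press the arrow keys (q to quit)\n";
    while (true) {
        int ch = _getch();
        if (ch == 'q') break;
        if (ch == 0 || ch == 224) {        // prefix: an extended key was pressed
            int code = _getch();           // second call gives the actual key code
            switch (code) {
                case 72: std::cout << "up\n";    break;
                case 80: std::cout << "down\n";  break;
                case 75: std::cout << "left\n";  break;
                case 77: std::cout << "right\n"; break;
            }
        }
    }
    return 0;
}

In your snake game loop you would replace the std::cout lines with code that changes the snake's direction.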
