Hello, I would like a good reference for a total noob in chip building (OK, I know there are circuits and transistors printed on silicon wafers, but nothing more).

I would like to know how the function of a little "black cockroach" (I mean a chip) is defined, so that later on, after some training, I would know which kind I need for a specific purpose (to use it in the motion controller of a robotic hand, for example), etc.

Also, I would like to know how to program it totally from scratch... I mean deeper than assembly, because I would like to make an "assembly-like" language using Greek characters as native symbols, not Latin as all chips use now.

I suppose I need to build a keyboard that outputs Greek characters as native symbols, and not as coded Latin symbols that tell the software to print out the Greek character. For example, if I type the English V key, the keyboard sends a code like "C#002" to the software, which then decodes it into the omega "Ω" character...
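Side note on how this works today: a keyboard only sends scancodes, and the OS keymap turns them into characters. In Unicode, Greek letters are already first-class characters with their own code points, no Latin stand-in is involved at the character level. A quick sketch in Python:

```python
# A keyboard sends scancodes; the OS keymap maps them to characters.
# Greek letters have their own Unicode code points, independent of any
# Latin key that happens to produce them on a given keyboard layout.
omega = "Ω"
print(hex(ord(omega)))              # code point U+03A9
print(omega.encode("utf-8").hex())  # the raw bytes software actually stores
```

So the "directly Greek" part is a keymap and encoding question, separate from the chip itself.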

I want something directly Greek.

So I am asking for references on the above that will help me take the first steps. Then I will be able to study and research more sophisticated things, like moving from the "assembly-like" low-level language (which uses Greek characters as native symbols) to a high-level language, let's say a "C++-like" language that would use Greek syntax and symbols natively.

NOTE: By "reference" I mean that I would like you to name any book/s you think fit what I want, or point me to some links, and say a few words about them to help me :)

I thank you in advance for your help.

Yes, I am aware of the general knowledge... I know how things work now and the basics of writing and compiling code. I know what source code is, what binary is, what an assembler is, a compiler, a GUI, etc., etc.

I just don't know the root fundamentals of those things... Like, I know how to bake, and that the oven uses electricity to heat up some metallic resistors, but I don't know how to make this myself, or why the electricity heats the resistors (in reality I know why :P )

To cut a long story short, I am not aware of one fundamental yet sophisticated thing...

How do you program a chip to understand

AC 0F 00
FF 11 E0

and make it into 0110001101101

If I knew that, I could translate the 0110001101101 into "my assembly" language, sort of... :P

Does nobody know anything about this?

Really, nobody has any idea what I am talking about? :O

PLEASE help.... :(

To give a short answer :)

When the program is compiled, what really happens is that it's saved in binary form, i.e. 0101010, etc.

When the program is executed, the binary is read instruction by instruction. On fixed-width architectures (many RISC chips, for example) each instruction is a fixed 4 bytes; on x86 the instructions are variable length, from 1 up to 15 bytes. The format of those bytes is known as an instruction (i.e. an assembly instruction). Take the instruction "add eax, ebx": in binary it turns into a code along the lines of first the add (the opcode), then the source and destination registers.
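To make that concrete, here is a sketch of encoding such an instruction. The 4-byte format below is invented for illustration, it is not real x86 encoding (real x86 encodes `add eax, ebx` differently and more compactly). It also answers the "AC 0F 00 into 0110..." question: hex is just shorthand for binary, each hex digit is exactly four bits.

```python
# Toy 4-byte instruction format (invented, not a real ISA):
#   byte 0: opcode   byte 1: destination reg   byte 2: source reg   byte 3: unused
OPCODES = {"add": 0x01}
REGS = {"eax": 0x00, "ebx": 0x03}

def encode(mnemonic, dst, src):
    return bytes([OPCODES[mnemonic], REGS[dst], REGS[src], 0x00])

word = encode("add", "eax", "ebx")
print(word.hex())                                    # hex view of the bytes
print(format(int.from_bytes(word, "big"), "032b"))   # the same bytes as binary
```

The two printed lines are the same instruction; hex and binary are just two spellings of the same bits.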

What happens inside the CPU is that the opcode gets read, and based on this, the data is routed to the specific unit. For add, that would be the ALU: the rest of the instruction supplies the binary representations of eax and ebx, the CPU looks up those registers and feeds their values to the ALU, which does the binary add (using logic gates).

When the add instruction is done, the next instruction is read and executed. Repeat that, and the program is pretty much running.

Simplified, it's just:

Read instruction
--> Read opcode and send to the specific unit
Process instruction at the unit
--> The units are built from transistors arranged into logic gates.
goto Start;

So, pretty much, your question about programming so that the chip understands comes down to programming in its specific assembly language, or using a hex editor and programming byte by byte.

If you're interested in knowing how an assembler works, just look at the source of an open-source one. And if you want to write your own, it's far more practical to build an assembler or compiler for an existing architecture than to build a new architecture.
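And your Greek-symbols idea fits right here: an assembler is, at its core, a translation table from mnemonics to opcode bytes, and nothing stops the mnemonics from being Greek. A minimal sketch, where the Greek mnemonics and register names are invented purely for illustration, mapping onto a toy 4-byte format:

```python
# The smallest possible assembler: mnemonics (here Greek, chosen arbitrarily
# for illustration) are just table lookups that produce opcode bytes.
MNEMONICS = {"ΠΡΟΣΘΕΣΕ": 0x01, "ΣΤΑΜΑΤΑ": 0xFF}   # invented "add", "halt"
REGISTERS = {"αξ": 0x00, "βξ": 0x03}              # invented register names

def assemble(source):
    out = bytearray()
    for line in source.strip().splitlines():
        parts = line.replace(",", " ").split()
        op = MNEMONICS[parts[0]]
        args = [REGISTERS[a] for a in parts[1:]]
        out += bytes([op] + args + [0] * (3 - len(args)))  # pad to 4 bytes
    return bytes(out)

code = assemble("ΠΡΟΣΘΕΣΕ αξ, βξ\nΣΤΑΜΑΤΑ")
print(code.hex())
```

The chip never sees the Greek (or Latin) text, only the bytes; the character set of the mnemonics is entirely the assembler's business.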

papajo, your questions are somewhat disjoint, and there is no forum around that can answer them all. What you need to do is get some fundamentals in first: basic electricity. Google for basic electricity tutorials. You will also need to learn some things about Boolean algebra: AND, OR, XOR, De Morgan's laws, and truth tables. Then an intro to digital electronics and interfacing. These areas are what was covered in fundamental computer design. When you know exactly what you want the robotic hand to do, you can research manufacturers' data sheets to choose a "chip or controller" to use. Learning how to program those components is still another area of study.
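To give a taste of that Boolean-algebra groundwork: a half adder is the smallest piece of an ALU's add, built from just two gates. Here it is as a truth-table-checkable sketch (of the logic only, not of real hardware):

```python
# A half adder from basic gates: the building block of binary addition.
def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

# Print its truth table: every input combination and the resulting bits.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(a, b, "->", s, c)
```

Chain full adders built from these and you have the circuit that executes the binary add mentioned earlier.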
So you see, there is a lot more one needs to learn before even asking the types of questions you are asking. It is not easy - I got my computer science degree in 1968 when there were no "microchips". It took most of my degree work plus experience to be able to do the things I outlined above. Don't get discouraged - take it a step at a time.