Hi,
The question is simple: how did they make the very first compiler?


By hand.

What they did was write out the code of the compiler in some slightly higher-level language, then manually go through that code and translate it into machine code themselves. Once that was done, they could use the resulting compiler to compile any future sources.

New compilers would then be 'bootstrapped' using old compilers by writing their code in a language for which a compiler already existed.
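To make the idea concrete, here's a rough sketch in Python (purely illustrative - the tiny 'PRINT n' language and the fake opcodes are invented for this example and don't come from any real machine) of what a hand-made 'stage 0' compiler amounts to, and why having one lets the next compiler be written in the new language itself:

# Toy sketch only: in reality the 'stage 0' translation was done on paper;
# here a Python function stands in for that manual work.
def stage0_compile(source):
    """Translate tiny 'PRINT n' source lines into fake machine instructions."""
    code = []
    for line in source.splitlines():
        if line.strip():
            op, arg = line.split()          # every statement is 'PRINT <number>'
            code.append(("OUT", int(arg)))  # pretend ("OUT", n) is a machine opcode
    return code

def run(machine_code):
    """Stand-in for the machine executing the compiled program."""
    for op, arg in machine_code:
        print(arg)

# Once stage 0 exists, the source of the *next* compiler can be written in the
# new language and fed through stage 0 - that step is the 'bootstrap'.
run(stage0_compile("PRINT 1\nPRINT 2"))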

What Labdabeta said is correct, but a bit skimpy, so I'll give you a bit more detail.

For the first five to ten years of programmable electronic computers, programming was done in machine code, and even that can be a bit of an overstatement - several of the first-generation computers had no I/O as we would understand it today, so entering a program meant wiring the opcodes up on a patch panel (think of an old-time phone operator's switchboard and you have the idea). While this was quickly supplanted by punch-card and paper-tape readers, it still didn't change the process of writing a program much.

Initially, the machine codes were written down for entry in a purely numeric format - sometimes decimal, but more often either octal or hexadecimal - but it was soon realized that this wasn't practical, so a notation was developed for writing the opcodes as word abbreviations. This went by various names, but the one that ended up sticking was 'assembly' code, from the fact that the programmer would 'assemble' the program with it. It was used for keeping track of the code while writing it; at the time, however, it was still converted to the machine opcodes by hand, then entered onto a punched deck or tape for reading into the machine. Because the computers of the time were based on very unreliable technologies - vacuum tubes, mercury delay lines, and the like - running a program was extremely expensive and often exceeded the Mean Time Between Failures of the system, so programs had to be kept short anyway, and automating the transformation of the mnemonics into machine code was seen as a waste of money and time until even as late as 1958.
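To picture what that hand translation involved, here's a small Python sketch; the mnemonics and opcode numbers are invented for the example, not taken from any real machine:

# Hand-assembling was essentially a table lookup done on paper.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0x0F}

def hand_assemble(lines):
    """Turn 'LOAD 10'-style lines into (opcode, operand) pairs."""
    machine_code = []
    for line in lines:
        parts = line.split()
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append((OPCODES[parts[0]], operand))
    return machine_code

print(hand_assemble(["LOAD 10", "ADD 11", "STORE 12", "HALT"]))
# -> [(1, 10), (2, 11), (3, 12), (15, 0)]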

Still, the idea of automating parts of program creation was being looked into fairly early on. By 1950, most computers were running a simple monitor program - a predecessor of modern operating systems - which could select specific cards out of a stack and run them. Another common automation was a floating-point interpreter - a sort of virtual machine for selecting and performing FP calculations, which were difficult to write correctly and needed to be shared by many programs. The last of these utilities, a tool for pulling sub-routines out of a library of procedures in a card stack, was basically what we call a linker today, but at the time it was called a 'compiler', as its job was to compile a group of procedures for the program.

Not long after this, the idea of automatic assembly of programs began to take root. Initially there was a good deal of resistance, as it seemed inefficient to spend expensive computer time on what was seen as clerical work, but the advantages slowly came to the fore as computer hardware improved. By 1955, more and more programs were being automatically assembled, and assemblers began handling details such as address calculations and providing such luxuries as labels, character constants, and simple macros.
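Here's a rough sketch of what those luxuries bought you: a toy two-pass assembler in Python (again with a made-up instruction set) that records label addresses on the first pass and patches jumps on the second - exactly the address bookkeeping programmers had been doing by hand:

OPCODES = {"LOAD": 0x01, "ADD": 0x02, "JUMP": 0x04, "HALT": 0x0F}

def assemble(lines):
    labels, stripped = {}, []
    # Pass 1: note the address of every label, keep only the real instructions.
    for line in lines:
        if line.endswith(":"):
            labels[line[:-1]] = len(stripped)
        else:
            stripped.append(line)
    # Pass 2: emit opcodes, replacing label operands with their addresses.
    code = []
    for line in stripped:
        parts = line.split()
        operand = parts[1] if len(parts) > 1 else "0"
        value = labels[operand] if operand in labels else int(operand)
        code.append((OPCODES[parts[0]], value))
    return code

print(assemble(["start:", "LOAD 10", "ADD 11", "JUMP start", "HALT"]))
# -> [(1, 10), (2, 11), (4, 0), (15, 0)]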

Still, several programmers were beginning to consider taking this even further, with languages that were not a direct analog of the machine code. Experiments began as early as 1952 with tools such as Flow-Matic and Speedcode, which were basically 'extended' assembly languages that allowed simple arithmetic expressions to be written in something close to familiar mathematical notation. This eventually led to a new idea: a language based entirely on expressing mathematical formulae, the FORmula TRANslator, or FORTRAN. It would become the first high-level language to see production use.
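To see why 'formula translation' was such a step up, here's a toy Python example that breaks an arithmetic formula into single-operation steps. It cheats by leaning on Python's own parser (the ast module), so it only hints at the parsing work the FORTRAN team had to invent from scratch:

import ast

def translate(formula):
    """Turn an infix formula into a list of single-operation 'temp = a op b' steps."""
    steps, counter = [], 0
    ops = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}
    def emit(node):
        nonlocal counter
        if isinstance(node, ast.Constant):
            return str(node.value)
        if isinstance(node, ast.Name):
            return node.id
        if isinstance(node, ast.BinOp):
            left, right = emit(node.left), emit(node.right)
            counter += 1
            steps.append(f"t{counter} = {left} {ops[type(node.op)]} {right}")
            return f"t{counter}"
        raise ValueError("unsupported expression")
    emit(ast.parse(formula, mode="eval").body)
    return steps

for step in translate("a * b + c / 2"):
    print(step)   # t1 = a * b ; t2 = c / 2 ; t3 = t1 + t2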

Under the leadership of John Backus, the FORTRAN I compiler project (the largest programming project to date) began in 1954, but would not be completed until 1958. This was in part because the developers were under significant pressure to make sure the compiled programs would perform as well as assembly-language equivalents, and so had to wring every ounce of optimization they could out of the generated code; but mostly it was because the principles of program design were still in their infancy, and the methods for parsing and code generation were still experimental and far less effective than they would later become. While the language they created would later be seen as primitive and inelegant, it was a great leap forward in software technology, and led to the discovery of many of the principles later compilers would use.

About halfway through the testing of that first Fortran compiler, in 1956, an MIT researcher named John McCarthy, who had done some of the development work, began experimenting with using an early version of the compiler to write a library of Fortran routines for manipulating lists of symbols rather than numbers; his goal was to explore the idea of artificial intelligence. While he finished the library, called FLPL, in 1957, he soon realized that Fortran was poorly suited to what he was aiming at. He began experimenting with another, more abstract notation, one based on Alonzo Church's lambda calculus. He called this notation LISP, for LISt Processing, and it was radically different from Fortran, though at the time it still resembled conventional mathematical notation. This would become known as m-expression LISP:

cons[1; 2; 3]
print['this; 'is; 'a; 'list; 'of; 'symbols]

At first, this was again just a way of describing the programs, but he soon concluded that a LISP compiler was called for. Work began on this, using the early version of LISP as the design notation and hand-coding the results into assembly, when things took an unexpected turn. In early 1958, one of McCarthy's grad students, Stephen 'Slug' Russell (who would later go on to write the original Spacewar! video game), had been studying Gödel's theorem and the lambda calculus while working on an improved version of the LISP compiler, when he realized that he could use LISP lists to represent LISP programs (in a manner similar to Gödel numbering). It struck him immediately that he could write a 'universal function' in LISP describing LISP itself in a compact and simple way, and that if he then re-wrote that function in assembly, he would have a LISP interpreter that could be used directly, without needing a compiler at all. He worked out a notation for representing LISP programs as LISP lists, where the first symbol in the list would be treated as the name of the function to be performed, leading to the prefix-ordered 'symbolic expressions', or s-expressions:

(cons 1 2 3)
(print 'this 'is 'a 'list 'of 'symbols)

The advantages of this new notation quickly became clear, and s-expressions would be synonymous with the Lisp family of languages from then on. It would also lead to one of the first attempts at a self-hosting language - that is, a language whose compiler is written in the language itself.
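Russell's 'universal function' idea - the program is just a list, and a single interpreter function walks it - can be sketched in a few lines. The following toy (in Python, handling only quote, cons, and print so that it mirrors the examples above) is nowhere near the real LISP eval, which also handled variables, conditionals, and function definition:

def evaluate(expr):
    """A miniature 'universal function': the program is data, and the first
    element of each list names the operation to perform."""
    if not isinstance(expr, list):              # atoms evaluate to themselves here
        return expr
    op, *args = expr
    if op == "quote":                           # (quote x) -> x, unevaluated
        return args[0]
    args = [evaluate(a) for a in args]          # evaluate the arguments first
    if op == "cons":                            # build a list from its arguments
        return list(args)
    if op == "print":
        print(*args)
        return None
    raise ValueError(f"unknown operator: {op}")

print(evaluate(["cons", 1, 2, 3]))              # -> [1, 2, 3]
evaluate(["print", ["quote", "this"], ["quote", "is"], ["quote", "a"], ["quote", "list"]])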

At around the same time, two new language projects were started, inspired by the new-found success of Fortran: COBOL, a language for writing programs for business data processing, and Algol-58, which was meant to standardize the notation for publishing algorithms. These would play pretty major roles as well: COBOL would be the most widely used high-level language from the early 1960s to the mid-1990s, and while Algol itself would never be widely used, it would lead to major breakthroughs in language design, compiler design, and the notation for describing languages, and would be the basis for other languages' syntax, including C and Pascal.

If you want more, you can check out some of these links:


Thank you guys so much! :)
