What is Computer Engineering? Best Explanation
For artificial intelligence to succeed, we need two things: intelligence and an artifact. The
computer has been unanimously acclaimed as the artifact with the best chance of demonstrating
intelligence. The modern digital electronic computer was invented independently and almost
simultaneously by scientists in three countries embattled in World War II. The first operational
modern computer was the Heath Robinson,
built in 1940 by Alan Turing's team for the single
purpose of deciphering German messages.
When the Germans switched to a more sophisticated
code, the electromechanical relays in the Robinson proved to be too slow, and a new machine
called the Colossus was built from vacuum tubes. It was completed in 1943, and by the end of
the war, ten Colossus machines were in everyday use.
The first operational programmable computer was the Z-3, the invention of Konrad Zuse
in Germany in 1941. Zuse invented floating-point numbers for the Z-3, and went on in 1945 to
develop Plankalkül, the first high-level programming language. Although Zuse received some
support from the Third Reich to apply his machine to aircraft design, the military hierarchy did
not attach as much importance to computing as did its counterpart in Britain.
In the United States, the first electronic computer, the ABC, was assembled by John
Atanasoff and his graduate student Clifford Berry between 1940 and 1942 at Iowa State University.
The project received little support and was abandoned after Atanasoff became involved in military
research in Washington. Two other computer projects were started as secret military research:
the Mark I, II, and III computers were developed at Harvard by a team under Howard Aiken; and
the ENIAC was developed at the University of Pennsylvania by a team including John Mauchly
and John Eckert. ENIAC was the first general-purpose, electronic, digital computer. One of its
first applications was computing artillery firing tables. A successor, the EDVAC, followed John
von Neumann's suggestion to use a stored program, so that technicians would not have to scurry
about changing patch cords to run a new program.
But perhaps the most critical breakthrough was the IBM 701, built in 1952 by Nathaniel
Rochester and his group. This was the first computer to yield a profit for its manufacturer. IBM
went on to become one of the world's largest corporations, and sales of computers have grown to $150 billion/year. In the United States, the computer industry (including software and services) now accounts for about 10% of the gross national product.
Each generation of computer hardware has brought an increase in speed and capacity, and a decrease in price.
Computer engineering has been remarkably successful, regularly doubling performance every two years, with no immediate end in sight for this rate of increase. Massively parallel machines promise to add several more zeros to the overall throughput achievable.
Of course, there were calculating devices before the electronic computer. The abacus is roughly 7000 years old. In the mid-17th century, Blaise Pascal built a mechanical adding and subtracting machine called the Pascaline. Leibniz improved on this in 1694, building a
mechanical device that multiplied by doing repeated addition. Progress stalled for over a century
until Charles Babbage (1792-1871) dreamed that logarithm tables could be computed by machine.
He designed a machine for this task, but never completed the project. Instead, he turned to the
design of the Analytical Engine, for which Babbage invented the ideas of addressable memory,
stored programs, and conditional jumps. Although the idea of programmable machines was
not new (in 1805, Joseph Marie Jacquard invented a loom that could be programmed using
punched cards), Babbage's machine was the first artifact possessing the characteristics necessary
for universal computation. Babbage's colleague Ada Lovelace, daughter of the poet Lord Byron,
wrote programs for the Analytical Engine and even speculated that the machine could play chess
or compose music.
Lovelace was the world's first programmer, and the first of many to endure
massive cost overruns and to have an ambitious project ultimately abandoned. Babbage's basic
design was proven viable by Doron Swade and his colleagues, who built a working model using
only the mechanical techniques available in Babbage's time (Swade, 1993). Babbage had the
right idea, but lacked the organizational skills to get his machine built.
AI also owes a debt to the software side of computer science, which has supplied the
operating systems, programming languages, and tools needed to write modern programs (and
papers about them). But this is one area where the debt has been repaid: work in AI has pioneered
many ideas that have made their way back to "mainstream" computer science, including time
sharing, interactive interpreters, the linked list data type, automatic storage management, and
some of the key concepts of object-oriented programming and integrated program development
environments with graphical user interfaces.
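To make one of those borrowed ideas concrete, here is a minimal sketch of the Lisp-style linked list the paragraph refers to, built from "cons cells" that each hold a value and a pointer to the rest of the list. The names used here (Cons, cons, to_pylist) are illustrative choices for this sketch, not anything taken from the original text or from a particular library.

# Minimal sketch of a Lisp-style singly linked list built from cons cells.
# All names here are illustrative, chosen only for this example.

class Cons:
    """One cell: a value (car) and a reference to the rest of the list (cdr)."""
    def __init__(self, car, cdr=None):
        self.car = car
        self.cdr = cdr

def cons(value, rest=None):
    """Prepend a value to an existing list (or start a new one)."""
    return Cons(value, rest)

def to_pylist(cell):
    """Walk the chain of cells and collect the values into a Python list."""
    out = []
    while cell is not None:
        out.append(cell.car)
        cell = cell.cdr
    return out

# Build the list (1 2 3) by consing onto the front, as early Lisp programs did.
lst = cons(1, cons(2, cons(3)))
print(to_pylist(lst))  # [1, 2, 3]

Because each cell only points forward, lists can share tails and grow by prepending in constant time, which is what made the structure so convenient for the early symbolic AI programs mentioned above.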