History of Computers
VIRGEN MILAGROSA UNIVERSITY FOUNDATION
SAN CARLOS CITY, PANGASINAN, PHILIPPINES
COLLEGE OF COMPUTER SCIENCE

HISTORY OF COMPUTERS

Submitted by:
Sudheera W. Semasinghe
15-05561-1461

Submitted to:
Efren Y. Ignacio, BSCE, CPS, MBA
Dean, College of Computer Science
CONTENTS
1. Introduction
2. Computers before the 20th Century
2.1. Abacus
2.2. Other Computing Tools Used before the 20th Century
2.3. The Difference Engine and the Analytical Engine
3. Electromechanical Computers and Five Generations
of Modern Computers
3.1 Electromechanical Computers
3.2. First Generation Computers
3.3. Second Generation Computers
3.4. Third Generation Computers
3.5. Fourth Generation Computers
3.6. Fifth Generation Computers
References
1. Introduction
A computer is a general purpose device that can be programmed to carry out a set
of arithmetic or logical operations automatically. Since a sequence of operations
can be readily changed, the computer can solve more than one kind of problem.
Conventionally, a computer consists of at least one
processing element, typically a central processing unit
(CPU), and some form of memory. The processing
element carries out arithmetic and logic operations, and
a sequencing and control unit can change the order of
operations in response to stored information. Peripheral
devices allow information to be retrieved from an
external source, and the result of operations saved
and retrieved.
Before 1935, a computer was a person who
performed arithmetic calculations. Between 1935
and 1945 the definition referred to a machine, rather
than a person. The modern machine definition is
based on Von Neumann's concepts: a device that
accepts input, processes data, stores data, and
produces output.
We have gone from the vacuum tube to the transistor,
to the microchip. Then the microchip started talking to
the modem. Now we exchange text, sound, photos
and movies in a digital environment. There are various types of computers in
use today, such as desktop computers, notebook computers, palmtop computers,
PDAs and so on. A computer is a common piece of electronic equipment that can
be found almost everywhere in the world. Nowadays, computers are used in many
fields such as communication, education, healthcare, security, media,
entertainment, engineering, business and trade. In this report, “History of
Computers”, we discuss the various computing devices used by man from the
early stages to the present day. It begins with the story of the abacus, and
runs through Babbage’s Analytical Engine, Alan Turing’s Turing machine, ENIAC,
EDVAC, the IBM 650, Apple computers, up to modern supercomputers.
[Images: a modern tablet PC; a modern desktop computer; a modern laptop computer]
2. Computers before the 20th Century
Before the 20th century, most calculations were done by humans. Early
mechanical tools that helped humans with digital calculations were called
"calculating machines", sold under proprietary names, or even, as they are
now, calculators. The machine operator was called the computer.
2.1. Abacus
The abacus (plural abaci or abacuses), also called a
counting frame, is a calculating tool that was in use
in Europe, China and Russia, centuries before the
adoption of the written Hindu–Arabic numeral
system and is still used by merchants, traders and
clerks in some parts of Eastern Europe, Russia, China
and Africa.
The abacus was used by the Chinese since about 500 BC for the simplest of
calculations: addition, subtraction, multiplication, and division, as well as fractions
and square roots. A Chinese abacus is made up of a wood frame divided into two
parts separated by a beam, with an upper deck of two rows of beads and a lower
deck of five rows of beads. A series of vertical rods allows the wooden beads to
slide freely. The abacus as we know it today did not appear in China until about
1200 A.D. Over time the abacus traveled to Japan and evolved into what it is called
today: the soroban. A soroban is made up of a wooden frame divided into two parts
separated by a beam, with an upper deck of one row of beads and a lower deck of four
rows of beads.
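The place-value logic behind the beads can be made concrete in code. The sketch below (function names are my own, not part of any historical source) models one soroban rod, where the single "heaven" bead counts as 5 when pushed toward the beam and each of the four "earth" beads counts as 1:

```python
# A minimal sketch of soroban arithmetic: one "heaven" bead is worth 5
# when set toward the beam; each of the four "earth" beads is worth 1.
def rod_value(heaven_set: bool, earth_beads: int) -> int:
    """Value of a single soroban rod (0-9)."""
    if not 0 <= earth_beads <= 4:
        raise ValueError("a soroban rod has only four earth beads")
    return (5 if heaven_set else 0) + earth_beads

def read_soroban(rods) -> int:
    """Read a whole number from rods listed most-significant first."""
    total = 0
    for heaven_set, earth_beads in rods:
        total = total * 10 + rod_value(heaven_set, earth_beads)
    return total

# The digits 7, 0, 3 across three rods represent the number 703.
print(read_soroban([(True, 2), (False, 0), (False, 3)]))  # → 703
```

Each rod is one decimal digit, so a frame of rods reads like positional notation, which is why the abacus handled addition and subtraction so naturally.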
2.2. Other Computing Tools Used before the 20th Century
In the 5th century BC in ancient India, the grammarian Pāṇini formulated the
grammar of Sanskrit in 3959 rules known as the Ashtadhyayi which was highly
systematized and technical. Panini used metarules, transformations and recursions.
In the 3rd century BC, Archimedes used the mechanical principle of balance to
calculate mathematical problems, such as the number of grains of sand in the
universe (The Sand Reckoner), which also required a recursive notation for numbers
(e.g., the myriad myriad).
[Image: an ancient abacus]
The Antikythera mechanism is believed to be the earliest
known mechanical analog computer. It was designed to
calculate astronomical positions. It was discovered in
1901 in the Antikythera wreck off the Greek island of
Antikythera, between Kythera and Crete, and has been
dated to circa 100 BC.
Mechanical analog computer devices appeared again a
thousand years later in the medieval Islamic world and
were developed by Muslim astronomers, such as the
mechanical geared astrolabe by Abū Rayhān al-Bīrūnī, and the torquetum by Jabir
ibn Aflah. According to Simon Singh, Muslim mathematicians also made important
advances in cryptography, such as the development of cryptanalysis and frequency
analysis by Alkindus. Programmable machines were also invented by Muslim
engineers, such as the automatic flute player by the Banū Mūsā brothers, and Al-
Jazari's humanoid robots and castle clock, which is considered to be the first
programmable analog computer.
During the Middle Ages, several European philosophers made attempts to produce
analog computer devices. Influenced by the Arabs and Scholasticism, Majorcan
philosopher Ramon Llull (1232–1315) devoted a great part of his life to defining and
designing several logical machines that, by combining simple and undeniable
philosophical truths, could produce all possible knowledge. These machines were
never actually built, as they were more of a thought experiment to produce new
knowledge in systematic ways; although they could make simple logical operations,
they still needed a human being for the interpretation of results. Moreover, they
lacked a versatile architecture, each machine serving only very concrete purposes.
In spite of this, Llull's work had a strong influence on Gottfried Leibniz (early 18th
century), who developed his ideas further, and built several calculating tools using
them.
Indeed, when John Napier discovered logarithms for computational purposes in the
early 17th century, there followed a period of considerable progress by inventors
and scientists in making calculating tools. The apex of this early era of formal
computing can be seen in the difference engine and its successor the analytical
engine (which was never completely constructed but was designed in detail), both
by Charles Babbage.
[Image: the Antikythera mechanism]
2.3. The Difference Engine and the Analytical Engine
Charles Babbage's Difference Engine would be able to compute tables of numbers,
such as logarithm tables. Babbage obtained government funding for this project due to the
importance of numeric tables in ocean navigation. By promoting their commercial
and military navies, the British government had managed to become the earth's
greatest empire. But in that time frame the British government was publishing a
seven volume set of navigation tables which came with a companion volume of
corrections which showed that the set had over 1000 numerical errors. It was
hoped that Babbage's machine could eliminate errors in these types of tables. But
construction of Babbage's Difference Engine proved exceedingly difficult and the
project soon became the most expensive government funded project up to that
point in English history. Ten years later the device was still nowhere near complete,
acrimony abounded between all involved, and funding dried up. The device was
never finished.
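The mathematical trick the Difference Engine mechanized is the method of finite differences: once the initial differences of a polynomial are known, every further table entry can be produced by additions alone, with no multiplication. The sketch below (function names are my own) illustrates the idea with f(x) = x² + x + 41, a polynomial Babbage himself used in demonstrations:

```python
# A minimal sketch of the method of differences behind the Difference
# Engine: seed a table with a few values, extract the leading differences,
# then extend the table using only addition.
def difference_table_row(values):
    """Initial value and leading differences from the first few values."""
    row = list(values)
    diffs = [row[0]]
    while len(row) > 1:
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

def tabulate(diffs, count):
    """Extend the table by `count` entries using only addition."""
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # propagate each difference upward: one addition per column
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

f = lambda x: x * x + x + 41                # f(x) = x^2 + x + 41
seed = [f(x) for x in range(3)]             # 41, 43, 47
print(tabulate(difference_table_row(seed), 6))  # → [41, 43, 47, 53, 61, 71]
```

Each column of the engine held one difference, and a full turn of the mechanism performed exactly the cascade of additions in the inner loop, which is why the machine could print whole tables without ever multiplying.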
The analytical engine invented by Charles Babbage combined concepts from his
work and that of others to create a device that if constructed as
designed would have possessed many properties of a modern
electronic computer. These properties include such features as an
internal "scratch memory" equivalent to RAM, multiple forms of
output including a bell, a graph-plotter, and simple printer, and a
programmable input-output "hard" memory of punch cards which
it could modify as well as read. The key advancement which
Babbage's devices possessed beyond those created before his
was that each component of the device was independent of the
rest of the machine, much like the components of a modern
electronic computer. This was a fundamental shift in thought; previous
computational devices served only a single purpose, but had to be at best
disassembled and reconfigured to solve a new problem. Babbage's devices could be
reprogrammed to solve new problems by the entry of new data, and act upon
previous calculations within the same series of instructions. Ada Lovelace took this
concept one step further, by creating a program for the analytical engine to
calculate Bernoulli numbers, a complex calculation requiring a recursive algorithm.
This is considered to be the first example of a true computer program, a series of
instructions that act upon data not known in full until the program is run.
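Lovelace's program computed Bernoulli numbers, which are naturally recursive: each one is defined in terms of all the earlier ones. The sketch below is not her actual program for the analytical engine, just a minimal modern rendering of the same recursive idea, using the standard recurrence for Bernoulli numbers with exact rational arithmetic:

```python
# Not Lovelace's program -- a modern sketch of the recursive calculation
# her Note G described. Uses the recurrence
#   sum_{k=0}^{m} C(m+1, k) * B_k = 0   (with B_0 = 1)
from fractions import Fraction
from math import comb

def bernoulli(m: int) -> Fraction:
    """The m-th Bernoulli number as an exact fraction."""
    if m == 0:
        return Fraction(1)
    acc = Fraction(0)
    for k in range(m):                 # depends on every earlier B_k
        acc += comb(m + 1, k) * bernoulli(k)
    return -acc / (m + 1)

print([str(bernoulli(n)) for n in range(7)])
# → ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```

The point of the example is the dependency structure: B_6 cannot be produced without first producing B_0 through B_5, which is exactly the sense in which the program "acts upon previous calculations within the same series of instructions".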
[Image: Charles Babbage, the father of the computer]
3. Electromechanical Computers and Five Generations of Modern Computers
3.1 Electromechanical Computers
As the technology for realizing a computer was being honed by the business
machine companies in the early 20th century, the theoretical foundations were
being laid in academia. During the 1930s two important strains of computer-related
research were being pursued in the United States at two universities in Cambridge,
Massachusetts. One strain produced the Differential Analyzer, the other a series of
devices ending with the Harvard Mark IV.
In 1930 an engineer named Vannevar Bush at the Massachusetts Institute of
Technology (MIT) developed the first modern analog computer which was known
as Differential Analyzer. While Bush was working on analog computing at MIT,
across town Harvard professor Howard Aiken was working with digital devices for
calculation. From 1939 to 1944 Aiken, in collaboration with IBM, developed his first
fully functional computer, known as the Harvard Mark I. The machine, like
Babbage’s, was huge: more than 50 feet (15 metres) long, weighing five tons, and
consisting of about 750,000 separate parts; it was mostly mechanical.
Alan Turing, while a mathematics student at the University of Cambridge, was
inspired by German mathematician David Hilbert’s formalist program, which sought
to demonstrate that any mathematical problem can potentially be solved by an
algorithm—that is, by a purely mechanical process. In 1936 Turing designed such
an abstract machine, which became known as the Turing machine.
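A Turing machine is just a tape, a read/write head, and a finite table of rules of the form (state, symbol) → (write, move, next state). The simulator below is a minimal sketch of that idea; the example machine and its rule table are illustrative inventions of mine, not Turing's:

```python
# A minimal sketch of a Turing machine: a tape, a head, and a finite
# rule table mapping (state, symbol) -> (write, move, next state).
def run_turing_machine(rules, tape, state="start", steps=1000):
    tape = dict(enumerate(tape))       # sparse tape; blank cells read "_"
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, "_") for i in cells).strip("_")

# An example rule table: invert a binary string, halting at the blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(invert, "10110"))  # → 01001
```

Despite its simplicity, this model captures Turing's insight: any computation that can be carried out by a purely mechanical process can be expressed as such a rule table.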
3.2. First Generation Computers
It was generally believed that the first electronic digital
computers were the Colossus, built in England in 1943, and the
ENIAC, built in the United States in 1945. However, the first
special-purpose electronic computer may actually have been
invented by John Vincent Atanasoff, a physicist and
mathematician at Iowa State College (now Iowa State
University), during 1937–42. The distinguishing feature of the
first generation computers was the use of electronic valves
(vacuum tubes).
[Image: a vacuum tube, as used in first generation computers]
The first in a series of important code-breaking machines, Colossus, also known as
the Colossus Mark I, was built under the direction of Thomas Flowers and delivered in
December 1943 to the code-breaking operation at Bletchley Park, a government
research centre north of London. It employed approximately 1,800 vacuum tubes
for computations.
In Germany, Konrad Zuse began construction of the Z4 in
1943 with funding from the Air Ministry. Z4 used
electromechanical relays, in part because of the difficulty
in acquiring the roughly 2,000 necessary vacuum tubes in
wartime Germany.
The UNIVAC was the first commercially produced electronic computer; it was
delivered to its first customer, the U.S. Census Bureau, in 1951. Punched
cards were used in first generation computers for data input.
3.3. Second Generation Computers
Transistors replaced vacuum tubes and ushered in the second generation of
computers. The transistor was invented in 1947 by American physicists John
Bardeen, Walter Brattain, and William Shockley, but did not see widespread use in
computers until the late 1950s. The transistor was far superior to
the vacuum tube, allowing computers to become smaller, faster,
cheaper, more energy-efficient and more reliable than their first-
generation predecessors.
Though the transistor still generated a great deal of heat that
subjected the computer to damage, it was a vast improvement
over the vacuum tube. Second-generation computers still relied
on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary
machine language to symbolic, or assembly, languages,
which allowed programmers to specify instructions in
words. High-level programming languages were also being
developed at this time, such as early versions of COBOL
and FORTRAN. These computers stored their instructions in memory, which moved
from magnetic drum to magnetic core technology. The first computers of this
generation were developed for the atomic energy industry.
[Images: part of a UNIVAC computer; a transistor, as used in second generation
computers; the IBM 7090, a second generation computer]
3.4. Third Generation Computers
Third generation computers were developed during the period 1964 to 1971.
They emerged with the development of the integrated circuit (IC), whose
invention was the greatest achievement of the period. The IC was invented by
Robert Noyce and Jack Kilby in 1958–59. An IC is a single component containing
a number of transistors. Transistors were miniaturized and placed on silicon
semiconductor chips, which drastically increased the speed and efficiency of
computers.
Keyboards and monitors were developed during the third generation. Third
generation computers interfaced with an operating system, which allowed the
device to run many different applications at one time, with a central program
that monitored the memory. High-level languages (FORTRAN II to IV, COBOL,
PASCAL, PL/1, BASIC, ALGOL 68, etc.) were used during this generation.
The IBM System/360 and the DEC PDP-8 are examples of third generation
computers.
3.5. Fourth Generation Computers
The fourth generation spans the period from 1971 to the present. Fourth
generation computers used Very Large Scale Integration (VLSI) circuits. VLSI
circuits, with about 5,000 transistors and other circuit elements and their
associated circuits on a single chip (the microprocessor), made the
microcomputers of the fourth generation possible.
The Intel 4004, developed in 1971, was the first microprocessor. Fourth
generation computers became more powerful, compact, reliable, and affordable.
As a result, they gave rise to the personal computer (PC) revolution. In this
generation, time sharing, real time systems, networks, and distributed
operating systems came into use. High-level languages like C, C++, dBASE,
etc., were used in this generation. Examples of fourth generation computers
are the Apple Macintosh, IBM PCs, etc.
[Images: an integrated circuit; the IBM 370, a third generation computer; a
microprocessor; an Apple Macintosh computer in the 1980s]
[Image: a modern fourth generation computer released in the early 2010s]
Supercomputers are the most powerful computers available today. A
supercomputer is focused on tasks involving intense numerical calculation,
such as weather forecasting, fluid dynamics, nuclear simulations, theoretical
astrophysics, and other complex scientific computations. A supercomputer is at
the front line of current processing capacity, particularly speed of
calculation. The term itself is rather fluid: the speed of today's
supercomputers tends to become typical of tomorrow's ordinary computers.
3.6. Fifth Generation Computers
The Fifth Generation Computer Systems (FGCS) was an initiative by Japan's Ministry
of International Trade and Industry, begun in 1982, to create a computer using
massively parallel computing/processing. It was to be the result of a massive
government/industry research project in Japan during the 1980s. It aimed to create
an "epoch-making computer" with supercomputer-like performance and to provide
a platform for future developments in artificial intelligence. There was also an
unrelated Russian project that used the same name.
Artificial intelligence (AI) is the intelligence exhibited by machines or
software. It is also the name of the academic field that studies how to create
computers and computer software capable of intelligent behavior.
References
Latest Technomanias Blog
http://latesttechnomanias.blogspot.com/2010/06/third-generation-computers.html
Mason.GMU.edu
http://mason.gmu.edu/~montecin/computer-hist-web.htm
UCMAS Mental Math Schools Canada
http://ucmas.ca/our-programs/how-does-it-work/what-is-an-abacus/
Wikipedia, The Free Encyclopedia
https://en.wikipedia.org/wiki/Abacus
Carnegie Mellon School of Computer Science
https://www.cs.cmu.edu/~fga
ComputerScienceLab.com
http://www.computersciencelab.com/ComputerHistory/HistoryPt2.htm
Encyclopedia Britannica
http://www.britannica.com/technology/computer/History-of-computing
Tutorialspoint.com
http://www.tutorialspoint.com/computer_fundamentals/computer_second_generation.htm
History of Computer Blog
http://histryofcomputr.blogspot.com/2010/12/fifth-generation-computer-1991-present.html
iiNet Australia
www.iinet.net.au
ComputerHistory.org
www.computerhistory.org