THE EVOLUTION OF HUMANITY'S GREATEST INVENTION, THE
COMPUTER, AND ITS FUTURE
Fernando Alcoforado*
This article presents how the computer, humanity's greatest invention, evolved and what its future is likely to be. The computer is humanity's greatest invention because the worldwide computer network made possible the use of the Internet, the technology that most changed the world with the advent of the information society. Some credit Charles Babbage with creating, in the 19th century, an analytical engine that, roughly speaking, can be compared with the modern computer, with memory and programs. For this invention, Babbage is considered the "father of informatics". Although many of Babbage's concepts are used today, the formalization of the components of a general-purpose machine, and the new abstractions behind it, were only consolidated from the 1930s onwards, thanks to John von Neumann, one of the ENIAC developers, and Alan Turing. The first large-scale electronic computer, built without mechanical or hybrid parts, appeared only in 1945, after World War II [2]. IBM began developing mainframe computers in 1952, starting with a machine based on vacuum tubes, soon replaced by the 7000 series, which already used transistors. In 1964, the IBM 360 appeared and enjoyed immense commercial success until the early 1980s [1]. Mainframe computers were large machines that performed calculations and stored information, and their use served scientific, commercial and governmental purposes.
In the 1970s, the dominance of mainframes began to be challenged by the emergence of
microprocessors. The 4004 chip, introduced by Intel in 1971, was a central processing
unit and the first commercially available microprocessor. Innovations greatly facilitated
the task of developing and manufacturing smaller computers - then called minicomputers
- which could also use peripherals (disks, printers, monitors) produced by third parties.
Early minicomputers cost one-tenth the price of a mainframe and took up only a fraction
of the space required. In 1974, Intel launched the 8080 microprocessor, which gave rise to the first microcomputers; the chip was successively improved, and the 8088 version came to be used by most microcomputer manufacturers. Microprocessors changed the
way computers were designed. It was no longer necessary to produce the entire system,
including the processor, terminals, and software such as the compiler and operating
system. The development of the Apple II in 1977 by the young Steve Jobs and Steve Wozniak showed that new technologies radically simplified the development process and equipment assembly. The cost of a microcomputer-based system represented only a fraction of the prices charged by manufacturers of mainframes and
minicomputers, thus allowing the development of servers. Interconnected in local
networks, microcomputers promoted the diffusion of informatics [1].
The existence of the computer provided the conditions for the advent of the Internet,
which is undoubtedly one of the greatest inventions of the twentieth century, whose
development took place in 1965, when Lawrence G. Roberts, in Massachusetts, and Thomas Merrill, in California, connected two computers over a low-speed dial-up telephone line. The experiment was a success and is regarded as the event that created
the first WAN (Wide Area Network) in history. The Internet story continued in 1966
when Lawrence G. Roberts joined DARPA (Defense Advanced Research Projects
Agency) and created the plan for the ARPANET (Advanced Research Projects Agency
Network) to develop the first packet-switched network. Although the first prototype of a
decentralized packet-switched network had already been designed by the United
Kingdom's National Physical Laboratory (NPL) in 1968, it only gained visibility in 1969,
when a computer at the University of California (UCLA) successfully connected to
another from the Stanford Research Institute (SRI). The connection was so successful
that, months later, four American universities were already interconnected. Thus was born
the ARPANET. In 1970, the ARPANET was consolidated with hundreds of connected
computers [2]. In 1995, a new revolution was started with the commercial use of the
Internet [1].
Technological innovations in microprocessors have multiplied digital storage capacity
and the development of broadband has allowed companies to develop new products and
services. Concerns about the limitations of computational resources were overcome, allowing greater focus on users' needs through more attractive and functional applications, which brought more and more uses to personal computers. Netscape was the first company to promote Internet browsing, but it was surpassed by Microsoft, which integrated its own browser into the Windows operating system, a move that generated a prolonged legal dispute in Europe. The commercial development of the Internet showed
that it was possible to create new business models based no longer on the sale of hardware
and software licensing, but on the ability to communicate between different devices and
on the creation of virtual communities. One of the most significant impacts of the
emergence of the Internet was the popularization of electronic commerce [1].
At the beginning of the 21st century, cloud computing emerged. The development of
Web 2.0 and complementary technologies such as smartphones and tablets,
communication-oriented chips and the development of wired and wireless broadband
infrastructure resulted in a new revolution in the sector. Cloud computing symbolizes the
tendency to place all available infrastructure and information digitally on the Internet,
including application software, search engines, communication networks, providers,
storage centers and data processing. The Internet Protocol (IP) constitutes the universal
language that allows the standardization of packets of different media and supports
indistinct voice, data and image traffic. The cloud concept is very important because it
allows computing to become a public utility, as information assets are non-rival and can
be used simultaneously by unlimited users. This model offers great advantages for users,
although it also presents risks. The main advantage is the possibility of using available hardware and software resources more efficiently, reducing idle capacity in data storage and processing through the sharing of computers and servers interconnected by the Internet. The infrastructure is accessed by terminals and mobile devices that
connect the cloud to the user. The risks are mainly associated with the security and
confidentiality of data stored in the cloud [1].
One of the main characteristics of contemporary society is the large-scale use of
information technology. The informational or information technology revolution spread
from the 1970s and 1980s, gaining intensity in the 1990s with the spread of the Internet,
that is, network communication through computers. Why call this process a revolution?
Because computerization penetrated society just like electricity that reconfigured life in
cities. The computer, icon of information technology, connected in a network is changing
people's relationship with time and space. Informational networks make it possible to
expand the ability to think in an unimaginable way. The new technological revolution has
expanded human intelligence. We are talking about a technology that makes it possible to increase the storage, processing and analysis of information, performing billions of operations on vast quantities of data every second: the computer [2].
Current computers are electronic because they are built from transistors on electronic chips. This imposes limitations, since there will come a time when it is no longer possible to further shrink one of the smallest and most important components of processors, the transistor. In 1965, the American chemist Gordon Earle Moore predicted that, every 18 months, the number of transistors used in electronic chips would double as their size was reduced. In 2017, the American technology company IBM managed to produce a chip the size of a fingernail with approximately 30 billion 5-nanometer transistors (1 nanometer = 10⁻⁹ m). With this, the company showed that, even if not perfectly accurate, Moore's prediction remains valid today, but it will reach its limit sooner than we imagine: the problem arises when the transistor can be shrunk no further, for it is in this small device that all information is read, interpreted and processed [3].
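The doubling rule stated above can be sketched in a few lines of Python. This is only an illustration of the arithmetic: the function name and the starting figures are examples, not data from the source.

```python
# Illustrative sketch of the doubling rule stated above: the transistor
# count doubles every 18 months. Inputs below are example figures only.
def transistors(initial: int, years: float, doubling_months: float = 18) -> int:
    """Project a transistor count forward under exponential doubling."""
    doublings = years * 12 / doubling_months
    return round(initial * 2 ** doublings)

# After 15 years there are ten 18-month periods, so the count
# multiplies by 2**10 = 1024:
print(transistors(1_000_000, years=15))  # 1024000000
```

Varying `doubling_months` shows how sensitive such projections are to the assumed period, which is one reason the prediction is "not perfectly accurate".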
At very small scales, Physics is no longer as predictable as it is in macroscopic systems; matter begins to behave probabilistically, governed by the laws of Quantum Physics. This means that one of the alternatives for the future is the quantum computer, a machine capable of manipulating information stored in quantum systems, such as electron spins (the magnetic moment of electrons), the energy levels of atoms and even photon polarization. In these computers, the fundamental units of information, called "quantum bits", are used to solve calculations or simulations that would require impractical processing times on electronic computers such as those in use today [3].
Quantum computers work with a logic quite different from that present in electronic
computers. Quantum bits can have the values 0 and 1 simultaneously, as a result of a
quantum phenomenon called quantum superposition. These values represent the binary
code of computers and are, in a way, the language understood by machines. Quantum
computers have proven to be the newest answer from Physics and Computing to the limited capacity of electronic computers, whose processing speed and capacity are closely tied to the size of their components, making continued miniaturization inevitable [3].
Quantum computers will not serve the same purposes as electronic computers. One of the
possible uses of quantum computers is factoring large numbers in order to discover new
prime numbers. It should be noted that factoring can be described as the decomposition
of a value into different multiplicative factors, that is, if we multiply all the elements of a
factorization, the result must be equal to the value of the factored number. Even for today's
most powerful supercomputers, this is a difficult and time-consuming task. In theory,
quantum computers could do it much faster. Quantum computers are good at working
with many variables simultaneously, unlike current computers, which have many
limitations for carrying out this type of task. In this way, it is expected that quantum
computers can be used to simulate extremely complex systems, such as biological,
meteorological, astronomical, molecular systems, etc [3].
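As a concrete illustration of the definition above, here is a classical trial-division factorization in Python. It is a sketch that works for small numbers only; cryptographic-scale numbers are far beyond this approach, which is precisely where quantum factoring is expected to help.

```python
# Classical factorization by trial division. Multiplying the returned
# factors back together recovers the original number, as defined above.
def factorize(n: int) -> list:
    """Return the prime factors of n (n >= 2), smallest first."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # any leftover n is itself prime
    return factors

print(factorize(8051))  # [83, 97]
assert 83 * 97 == 8051  # the factorization reconstructs the number
```

The running time of this loop grows exponentially with the number of digits of n, which is why factoring large numbers is "difficult and time-consuming" even for supercomputers.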
The ease with which quantum computers deal with complex systems is related to the nature of quantum bits. An electronic bit can only take the value 0 or 1, while a quantum bit can hold both values at the same time. Thus, a single quantum bit is, in this sense, equivalent to 2 electronic bits. This means that, with only 10 quantum bits, we would have a computer spanning 1024 states (2^10 = 1024), while most home computers today work with 64-bit systems [3].
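The 2^n arithmetic above can be checked directly. The snippet below is an illustration of the counting, not a quantum simulation; it also verifies that an equal superposition over those states carries total probability 1.

```python
import math

def basis_states(n_qubits: int) -> int:
    """Number of classical basis states that n qubits can superpose."""
    return 2 ** n_qubits

n = 10
states = basis_states(n)           # 2**10 = 1024
amplitude = 1 / math.sqrt(states)  # equal-superposition amplitude
total_probability = states * amplitude ** 2  # probabilities must sum to 1
print(states, round(total_probability, 9))  # 1024 1.0
```

The exponential growth of `basis_states` with `n_qubits` is what makes simulating quantum systems on classical hardware so costly.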
Despite representing a significant leap forward in relation to classical computers,
quantum computers also have their limitations. The quantum behavior of bits is only
achieved under very sensitive conditions. Therefore, it is necessary to keep them at very
low temperatures, close to absolute zero, using sophisticated liquid nitrogen or helium
refrigeration systems. Any variations in these temperature conditions, however small they
may be, may harm or even interrupt its proper functioning. Other factors, such as external
magnetic fields and electromagnetic waves emitted by nearby devices, can interfere with
the quantum behavior of extremely sensitive particles used to store information, such as
electrons and atoms [3].
Canadian company D-Wave claims to have produced the first commercial quantum
computer. In 2017, the company put on sale a quantum computer named 2000Q, which reportedly features 2,000 quantum bits. Acquiring it, however, requires around 15 million dollars. The company divides opinion in the scientific community, as there are groups of physicists and computer scientists who believe the machine is not 100% quantum, but rather a hybrid computer capable of using quantum and electronic bits simultaneously [3].
With a conventional classical computer, if you had to perform 100 different calculations,
you would have to process them one at a time, whereas with a quantum computer, you
could perform them all at once. The current situation where we are forced to use classical
computers for calculations will change dramatically. Supercomputers, the highest class of classical computers, are so big that they take up a large room. The reason is that, in effect, 100 calculators are lined up to do 100 different calculations at once; in a real supercomputer, over 100,000 smaller computers are lined up. With the birth of quantum computers, this will no longer be necessary. That does not mean, however, that supercomputers will become unnecessary: they will continue to serve different purposes, just as smartphones and personal computers do today [4].
There are fields in which quantum computers have a great advantage over classical
computers, for example in the areas of chemistry and biotechnology. The reactions of
materials, in principle, involve quantum effects. A quantum computer that uses quantum
phenomena themselves would allow calculations that could easily incorporate quantum
effects and would be very effective in developing materials such as catalysts and
polymers. This can lead to the development of new drugs that were previously unfeasible,
thus contributing to the improvement of people's health. Additionally, in the area of
finance, for example, as formulas for options trading are similar to those for quantum
phenomena, it is expected that calculations can be performed efficiently on quantum
computers [4].
Quantum computers can be divided into several types, depending on how the smallest
unit, the qubit (a superposition of 0s and 1s), is created. The most advanced type is the
superconducting type. This method makes a qubit using a superconducting circuit with
an ultra-low temperature element, and many IT and other companies are developing this
type of computer. Ion trap and cold atom types, which have been gaining ground recently, use electrons in trapped atoms to make qubits, and their operation is stable, so future growth is
expected. There is also the silicon type, in which an "electron box" called a quantum dot, containing just one electron, is fabricated on a silicon semiconductor chip to create a qubit. In addition, another type, called the "photonic quantum type", a quantum computer that uses light, is also being studied [4].
The Hitachi company is developing a silicon-type quantum computer. The silicon type allows very small qubits to be made, so many qubits can be packed into a small space, which is where Hitachi's accumulated semiconductor technologies can be leveraged. To obtain computational power superior to that of classical computers, it is necessary to use a large number of qubits, and the silicon type has the advantage that such a large number of qubits can easily fit onto a semiconductor chip. Because the qubits are so small, it is hard to see what is really going on: a photograph of a quantum computer shows a big device, but most of it is a cooling system that creates a low-temperature environment to keep the electrons relaxed and trapped in the quantum dots, while the main circuit is very small [4].
In addition to the quantum computer, Artificial Intelligence (AI) can reinvent computers
in three ways, according to the MIT Technology Review. Artificial Intelligence is
changing the way we think about computing. Computers have not changed fundamentally in 40 or 50 years: they have become smaller and faster, but they are still mere boxes with processors that carry out human instructions. AI is changing this reality in at least three ways: 1) the
way computers are produced; 2) the way computers are programmed; and, 3) how
computers are used. Ultimately, this is a phenomenon that will change how computers
function. The core of computing is shifting from number crunching to decision making
[5].
The MIT article reports that the first change concerns how computers, and the chips that control them, are made. The deep learning models that make today's AI applications work require a different approach, because they need a large number of less precise calculations to be performed at the same time. This means that a new type of chip is needed, one that can move data as quickly as possible and ensure the data is available whenever needed. When deep learning arrived on the scene about a decade ago, there were already specialized chips very good at this: graphics processing units (GPUs), designed to redraw an entire screen of pixels dozens of times per second [5].
The second change concerns how computers are told what to do. For the last 40 years computers have been programmed; for the next 40 they will be trained. Traditionally, for a computer to do something like recognize speech or identify objects in an image, programmers first had to write rules for it. With machine learning, programmers no longer dictate the rules. Instead, they create a neural network in which the computer learns these rules on its own. The next big advances are expected in areas such as molecular simulation, training computers to manipulate the properties of matter in ways that could transform energy use, food production, manufacturing and medicine. Deep learning already has an impressive track record: two of its biggest advances so far, making computers behave as if they understand human language and recognize what is in an image, are changing the way we use them [5].
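To make the contrast between programming and training concrete, here is a minimal sketch in Python: instead of hand-writing the rule for the logical AND function, a tiny perceptron learns it from four labeled examples. This toy is a stand-in for the neural networks discussed above; all names, parameters and data are illustrative.

```python
# "Training instead of programming": no rule for AND is written here;
# a tiny perceptron infers one from labeled examples.
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights and bias for a 2-input linear classifier."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred                 # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x1          # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn the logical AND function from its four examples:
data = [(0, 0), (0, 1), (1, 0), (1, 1)]
target = [0, 0, 0, 1]
w, b = train_perceptron(data, target)
learned = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for x1, x2 in data]
print(learned)  # [0, 0, 0, 1]
```

The programmer supplies examples and a learning procedure; the rule itself ends up encoded in the learned numbers `w` and `b`.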
The third change concerns the fact that a computer no longer needs a keyboard or screen
for humans to interact with. Anything can become a computer. In fact, most household
products, from toothbrushes to light switches and doorbells, already have a smart version.
As they proliferate, however, so does our desire to waste less time telling them what to
do. It is like they should be able to figure out what we need without our interference. This
is the shift from crunching numbers to decision making as a driver of this new era of
computing that envisions computers that tell humans what we need to know and when we
need to know it and that help humans when you need them. Now, machines are interacting
with people and becoming more and more integrated into our lives. Computers are already
out of their boxes [5].
REFERENCES
1. TIGRE, Paulo Bastos e NORONHA, Vitor Branco. Do mainframe à nuvem:
inovações, estrutura industrial e modelos de negócios nas tecnologias da informação e da
comunicação. Available on the website
<https://www.scielo.br/j/rausp/a/8mCzNXtRWZJzZPnnrHSq6Bv/>.
2. ALCOFORADO, Fernando. A escalada da ciência e da tecnologia ao longo da
história e sua contribuição ao progresso e à sobrevivência da humanidade. Curitiba:
Editora CRV, 2022.
3. MUNDO EDUCAÇÃO. Computador quântico. Available on the website
<https://mundoeducacao.uol.com.br/fisica/computador-quantico.htm>.
4. KIDO, Yuzuru. The Present and Future of “Quantum Computers”. Available on the
website <https://social-innovation.hitachi/en/article/quantum-
computing/?utm_campaign=sns&utm_source=li&utm_medium=en_quantum-
computing_230>.
5. MIT Technology Review. Como a Inteligência Artificial está reinventando o que
os computadores são. Available on the website <https://mittechreview.com.br/como-a-
inteligencia-artificial-esta-reinventando-o-que-os-computadores-sao/>.
* Fernando Alcoforado, awarded the medal of Engineering Merit of the CONFEA / CREA System, member
of the Bahia Academy of Education, of the SBPC- Brazilian Society for the Progress of Science and of
IPB- Polytechnic Institute of Bahia, engineer and doctor in Territorial Planning and Regional Development
from the University of Barcelona, college professor (Engineering, Economy and Administration) and
consultant in the areas of strategic planning, business planning, regional planning, urban planning and
energy systems, was Advisor to the Vice President of Engineering and Technology at LIGHT S.A. Electric
power distribution company from Rio de Janeiro, Strategic Planning Coordinator of CEPED- Bahia
Research and Development Center, Undersecretary of Energy of the State of Bahia, Secretary of Planning
of Salvador, is the author of the books Globalização (Editora Nobel, São Paulo, 1997), De Collor a FHC-
O Brasil e a Nova (Des)ordem Mundial (Editora Nobel, São Paulo, 1998), Um Projeto para o Brasil
(Editora Nobel, São Paulo, 2000), Os condicionantes do desenvolvimento do Estado da Bahia (Tese de
doutorado. Universidade de Barcelona,http://www.tesisenred.net/handle/10803/1944, 2003), Globalização
e Desenvolvimento (Editora Nobel, São Paulo, 2006), Bahia- Desenvolvimento do Século XVI ao Século
XX e Objetivos Estratégicos na Era Contemporânea (EGBA, Salvador, 2008), The Necessary Conditions
of the Economic and Social Development- The Case of the State of Bahia (VDM Verlag Dr. Müller
Aktiengesellschaft & Co. KG, Saarbrücken, Germany, 2010), Aquecimento Global e Catástrofe Planetária
(Viena- Editora e Gráfica, Santa Cruz do Rio Pardo, São Paulo, 2010), Amazônia Sustentável- Para o
progresso do Brasil e combate ao aquecimento global (Viena- Editora e Gráfica, Santa Cruz do Rio Pardo,
São Paulo, 2011), Os Fatores Condicionantes do Desenvolvimento Econômico e Social (Editora CRV,
Curitiba, 2012), Energia no Mundo e no Brasil- Energia e Mudança Climática Catastrófica no Século XXI
(Editora CRV, Curitiba, 2015), As Grandes Revoluções Científicas, Econômicas e Sociais que Mudaram o
Mundo (Editora CRV, Curitiba, 2016), A Invenção de um novo Brasil (Editora CRV, Curitiba,
2017), Esquerda x Direita e a sua convergência (Associação Baiana de Imprensa, Salvador, 2018), Como
inventar o futuro para mudar o mundo (Editora CRV, Curitiba, 2019), A humanidade ameaçada e as
estratégias para sua sobrevivência (Editora Dialética, São Paulo, 2021), A escalada da ciência e da
tecnologia e sua contribuição ao progresso e à sobrevivência da humanidade (Editora CRV, Curitiba,
2022), a chapter in the book Flood Handbook (CRC Press, Boca Raton, Florida United States, 2022) and
How to protect human beings from threats to their existence and avoid the extinction of humanity (Generis
Publishing, Europe, Republic of Moldova, Chișinău, 2023).