Luddite - is the Singularity near?

A Brief History Of Computers

"I think there is a world market for maybe five computers"
Thomas J. Watson (CEO of IBM), 1943

Roots

I guess that since humans have fingers, they started to count and compute with them,
and since they have had tools, they started to carve numbers into bones.

Across different cultures and eras there have been different kinds of
number systems to compute with.

Our global civilization mostly uses the Hindu-Arabic numerals with the decimal
number system, based on 10, while our computers commonly use the binary number
system, based on 2, the famous 0s and 1s. But other cultures have used other
systems: the Maya a base of 20, Babylon a base of 60, and the Chinese a base
of 16, the hexadecimal system, which is also used in computer science.
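
As a minimal illustration (a quick Python sketch of my own, not part of the
historical record), the same value can be written in all three bases with the
built-in conversions:

    n = 2018
    print(bin(n))  # 0b11111100010 -> binary, base 2
    print(n)       # 2018          -> decimal, base 10
    print(hex(n))  # 0x7e2         -> hexadecimal, base 16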

The first computing devices were mechanical helpers,
like the abacus, Napier's bones or the slide rule.
They did not perform computations on their own, but were used to represent
numbers and apply the arithmetic operations to them:
addition, subtraction, multiplication and division.

Mechanical Computers

The first mechanical computing machine is considered to be the
Antikythera Mechanism, found in a Greek ship that sank around 70 BC.

But it is actually not a computer,
because it does not perform computations;
it is an analog, astronomical clock,
a sun and moon calendar that shows solar and lunar eclipses.

In the 17th century the first mechanical calculating machines were proposed and built.

Wilhelm Schickard designed a prototype in 1623, which was never fully functional.

The Pascaline, designed by Blaise Pascal in 1642,
was the first operational and commercially available mechanical calculator,
able to perform the four basic arithmetic operations.

In 1672 the German mathematician Gottfried Wilhelm Leibniz invented
the stepped cylinder, used in his never fully functional Stepped Reckoner.

With the Curta, the use of mechanical calculators lived on,
up to the advent of portable electronic calculators in the early 1970s.

Programmable Computers

The punch card for programming a machine was introduced by Joseph Marie Jacquard
in 1804 with his automated weaving loom, the Jacquard Loom,
for producing textiles with complex patterns.

In 1837 Charles Babbage (considered the father of the computer)
was the first to describe a programmable, mechanical computer,
the Analytical Engine.

Ada Lovelace (considered the mother of programming) worked with Babbage and
was the first person to publish a computer algorithm, the computation of
the Bernoulli numbers.
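
Her program was written for a machine that was never built; as a rough modern
sketch (my own, not hers), the Bernoulli numbers can be computed with the
standard recurrence B_m = -1/(m+1) * sum of C(m+1, k) * B_k over k < m:

    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
        B = [Fraction(0)] * (n + 1)
        B[0] = Fraction(1)
        for m in range(1, n + 1):
            B[m] = -Fraction(1, m + 1) * sum(comb(m + 1, k) * B[k] for k in range(m))
        return B

    print(bernoulli(8))  # B_1 = -1/2, B_2 = 1/6, the odd ones beyond B_1 are 0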

Babbage was ahead of his time, as he described all the parts a modern computer
has, CPU, memory and input/output, but he was not able to realize his machine,
due to a lack of funds and the limited engineering capabilities of his era.

About a century later, Konrad Zuse's Z3, built in 1941,
is considered to be the first binary, freely programmable computer.

It used 600 telephone relays for computation and 1400 relays for memory,
a keyboard and punched tape as input, lamps as output,
and it operated at about 5 Hertz.

Mainframes

Zuse's machines mark the advent of the first mainframes, used by the military
and by science during and after WWII.

The Colossus Mark I (1943), ENIAC (1946) and IBM 650 (1953), for example, used
vacuum tubes instead of relays and were replaced more and more by
transistor-based computers in the 1960s.

Home Computers

With small chips, integrated circuits, it became possible to build smaller and
reasonably priced home computers. IBM and other big players underestimated this
market, so Atari, Apple, Commodore, Sinclair, etc. started the Home Computer
Revolution: one computer for every home.

Some early models came as self-assembly kits, like the Altair 8800 (1974),
or with built-in TV output, like the Apple I (1976),
or as fully assembled video game consoles, like the Atari VCS (1977),
followed by more powerful machines with a graphical user interface,
like the Apple Macintosh (1984) or the Commodore Amiga 1000 (1985).

Personal Computers

IBM started the Personal Computer era in 1981 with the 5150.
Third-party developers were able to provide operating systems, like
Microsoft DOS, or hardware extensions for the standardized expansion slots,
like hard drives, video cards, sound cards, etc.
Soon other companies created clones of the IBM PC, the famous "PC compatibles".

Gaming was already an important selling point in the Home Computer era.
The early PC graphics standards like CGA and EGA were not really able to compete
with the graphics generated by the Denise chip in a Commodore Amiga 500,
but with the rise of the SVGA standard (1989) and the compute power of the Intel
486 CPU (1989), game studios were able to build games with superior 3D graphics,
like Wolfenstein 3D (1992), Comanche (1992) or Strike Commander (1993).

The race for higher display resolutions and more detailed 3D graphics
continues to this day.

With operating systems based on graphical user interfaces in the 1990s,
like OS/2, X11 or Windows 95,
PCs finally replaced the home computers.

Another recipe for the success of the PC may be that there have been multiple
CPU vendors for the same architecture (x86), like Intel, AMD, VIA or IBM.

Internet of Things

The Internet was originally designed to connect military institutions in a
redundant way, so that if one network element failed, the rest would still be
operable.

The available bandwidth evolves like compute power, exponentially.
At first mainly text was transmitted, like emails (1970s) or newsgroups (1980s),
followed by images and audio via Gopher (1991) or the World Wide Web (1989),
and finally Full HD video via streaming platforms like YouTube or Netflix.

In the late 1990s, mobile phones like the Nokia Communicator,
MP3 audio players and PDAs, Personal Digital Assistants, like the Palm Pilot,
marked the rise of the smart devices.

Their functions were all united in the smartphone, which, together with mobile,
high-bandwidth internet, is still on its triumphal march across the globe.

I am not able to portray the current state of computer and internet usage,
it is simply too omnipresent: from word processing to AI research,
from fake news to the dark net, from botnets of webcams to data leaks in toys...

The next thing

But I can guess what the next step will be: integrated devices, the BCI, the
Brain-Computer Interface, connected via the Internet to a real kind of Matrix.

It seems only logical to conclude that we will connect with machines directly,
implant chips, or develop non-invasive scanners, so the next bandwidth demand
will be brainwaves, in all kinds of forms.

 

On Peak Human

One of the early Peak Human prophets was Malthus.
In his 1798 book, 'An Essay on the Principle of Population',
he postulated that the human population grows exponentially,
but food production only linearly,
so population growth will fluctuate around an upper limit.
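
A toy calculation (with made-up numbers, purely for illustration) shows why the
two growth laws must collide sooner or later:

    population = 1.0   # arbitrary units
    food = 4.0         # start with a comfortable surplus
    for year in range(0, 201, 25):
        print(f"year {year:3d}: population {population:6.0f}, food {food:4.0f}")
        population *= 2.0   # geometric (exponential) growth, doubling every 25 years
        food += 2.0         # arithmetic (linear) growth, a fixed step per generation
    # no matter how large the initial surplus, the doubling series overtakes it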

Later, Paul R. Ehrlich predicted in his book 'The Population Bomb' (1968)
that we would reach such a limit in the 1980s.

Meadows et al. concur in 'Limits to Growth: The 30-Year Update' (2004)
that we already reached an upper limit in the 1980s.

In 2015, Emmott concluded in his film 'Ten Billion' that we have already passed
the upper bound.

UN predictions say we may hit 9 billion humans by 2050,
so the exponential population growth rate is already declining,
but the effects of a wasteful economy pop up in many corners.

Now, in 2018, we are about 7.6 billion humans, and I say Malthus et al.
were right.

It is not about how many people Earth can feed,
but how many people can live in a comfortable but sustainable manner.

What does Peak Human mean for the Technological Singularity?

The advent of computers was driven by the exponential population growth in
the 20th century. All the groundbreaking work was done in that century.

When we face a decline in population growth,
we also have to face a decline in the development of new technologies.

Because it is not only about developing new technologies,
but also about maintaining the old knowledge.

This is where AI steps in: mankind's population growth is slowing,
but the whole AI sector is growing and expanding.

Therefore the question is: will AI be able to compensate for the decline?

Time will tell.

I guess the major uncertainty is
how Moore's Law will live on beyond 2021,
when 4 nm transistor production is reached,
which some scientists consider a physical and economic barrier.

I predict that by the time we hit the 8 billion humans mark,
we will have developed another groundbreaking technology,
similar to the advent of the transistor, the integrated circuit and the microchip.

So, considering the uncertainty of Peak Human vs. the Rise of AI,
I give +-0 points for the Singularity to take off.

 

More Moore

If we cannot shrink the transistor size any further, what other options do we
have to increase compute power?

3D packaging

The ITRS report suggests going into the third dimension and building cubic
chips. The more layers are built, the more integrated cooling will be necessary.

Memory Wall

Currently a main-memory access takes much longer than a CPU compute cycle;
with faster memory techniques or higher bandwidth this gap can be closed.
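
A back-of-the-envelope calculation with assumed ballpark figures (a ~3 GHz core
and ~100 ns DRAM latency, not measured values) illustrates the size of the gap:

    cpu_clock_hz = 3e9        # ~3 GHz core -> one cycle is ~0.33 ns
    dram_latency_s = 100e-9   # ~100 ns for a DRAM access on a cache miss
    cycles_lost = dram_latency_s * cpu_clock_hz
    print(f"~{cycles_lost:.0f} cycles stalled per main-memory access")  # ~300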

Memristor

The Memristor is an electronic component proposed in 1971.
It can be used for non-volatile memory devices and alternative, neuromorphic
compute architectures.

Photonics

Using light for computation sounds quite attractive,
but the base element, the photonic transistor, has yet to be developed.

Quantum Computing

Really, I do not have a clue how these thingies work,
somehow via quantum effects like superposition and entanglement,
but people say they are going to rock when they are ready...

Considering so much room for research,
I give +1 points for the Singularity to take off.

The End of Moore's Law?

Moore's Law, the heartbeat of computer evolution,
is the observation that every two years the number of transistors
on integrated circuits doubles. Gordon Moore, co-founder of Intel, proposed
a doubling every year in 1965 and a doubling every two years in 1975.

In practice this results in a doubling of the compute power of computer chips
roughly every two years.

The doubling of the transistor count is achieved by shrinking their size.
The Intel 8080 chip of 1974 was clocked at 2 MHz, had about 6,000 transistors
and was produced in a 6 micrometer process.
Nowadays processors have billions of transistors and use a 14 or 10 nanometer
process.
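
A rough sanity check, assuming the 8080's roughly 6,000 transistors of 1974 as
the starting point and one doubling per two years:

    transistors = 6_000                # Intel 8080, 1974
    for year in range(1974, 2018, 2):  # 22 two-year steps up to 2018
        transistors *= 2               # one doubling per step
    print(f"{transistors:,}")          # ~25 billion, the same order of
                                       # magnitude as the largest 2018 chips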

Less well known is Moore's Second Law, the observation that the investment
costs for the fabs, the fabrication plants, also grow exponentially.

The last ITRS report, from 2015, predicts that transistor shrinking will hit
such an economic wall in 2021, and that alternative techniques will have to be
used to keep Moore's Law alive.

Considering this news,
I give -1 points for the Singularity to take off.
