Luddite - is the Singularity near?

The Singularity

In physics, a singularity is a point in spacetime where our current theories are no longer valid; we are literally unable to describe what happens inside, because the density becomes infinite.

The technological Singularity, as described by Transhumanists, is a stage of technological development where humans are no longer able to understand the ongoing process. The technological environment starts to feed its own development in a feedback loop: computers help to build better computers, which help to build better computers, which help to build better computers...and so on.
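
Just to illustrate the mathematical flavour of the term, here is a toy model of such a feedback loop in Python - the growth law and the numbers are pure invention, not a prediction:

    # Toy model: each computer generation improves the next one,
    # and the improvement rate itself grows with current capability.
    # With growth proportional to capability^2 the curve blows up in
    # finite time - a "singularity" in the mathematical sense.
    capability = 1.0   # arbitrary starting capability
    k = 0.05           # assumed feedback strength, pure invention
    for generation in range(1, 30):
        capability += k * capability ** 2
        print(f"generation {generation:2d}: capability {capability:12.1f}")
        if capability > 1e9:
            print("model diverges - the 'singularity' of this toy equation")
            break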

So, when will the technological Singularity take off?

Considering the feedback loop, it is already present, maybe since the first computers were built.

Considering a density of information processing that exceeds human understanding, we may have reached that point too.

Imagine a computer technique that is easy to set up and use and outperforms any human at its task, but we cannot really explain what happens inside - it is a black box.

Such a technique is present (and currently hyped) => ANNs, Artificial Neural Networks.

Of course we do know what happens inside, because we built the machine, but when it comes to the question of reasoning - why the machine did this or that - we really have a black box in front of us.

So, humans already build better computers with the help of better computers, and humans use machines that outperform humans at a specific task without really being able to explain their results....

obviously, +1 points for the Singularity to take off.

A Brief History Of Computing

"Computer science is no more about computers than astronomy is about telescopes."
Edsger W. Dijkstra

So, we have a biased overview of the history of computers, but what do these computers actually compute?

The first mechanical computers of the 17th century were able to perform the 4 basic arithmetic operations: addition, subtraction, multiplication and division.

As soon as a computer is able to perform addition, it is also able to perform the other 3 operations, which can be broken down, in multiple steps, into additions.
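
A minimal sketch of this idea in Python, with addition as the only primitive - multiplication as repeated addition, subtraction as counting up, division as repeated subtraction (non-negative integers only, just for illustration):

    def add(a, b):
        return a + b  # the only primitive we allow ourselves

    def multiply(a, b):
        # repeated addition: a * b = a + a + ... (b times)
        result = 0
        for _ in range(b):
            result = add(result, a)
        return result

    def subtract(a, b):
        # count how many times we must add 1 to b to reach a (assumes a >= b)
        diff = 0
        while add(b, diff) != a:
            diff = add(diff, 1)
        return diff

    def divide(a, b):
        # repeated subtraction: how many times does b fit into a?
        quotient = 0
        while a >= b:
            a = subtract(a, b)
            quotient = add(quotient, 1)
        return quotient

    print(multiply(6, 7), subtract(10, 4), divide(42, 6))  # 42 6 7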

Nowadays computers are binary, meaning they compute in base 2: zeros and ones, true and false, power on and power off.

For this, transistors are used; they work like relays and are coupled together to form logic circuits, which perform the actual computation.

The Z3 (1941) had 600 relays for computation, the 6502 chip (1975) had about 3500 transistors, and today's CPUs (2018) have billions of them.

So, all these funny programs out there are broken down into simple arithmetic and logical operations.

To perform such magic, some math is needed.

In 1847 George Boole introduced Boolean algebra, with the three basic logical operations AND, OR and NOT. With these simple gates, logic circuits can be built that perform the addition of values.
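
As a small sketch, here is a half adder and a full adder built only from AND, OR and NOT (XOR is composed from those three), just to show how addition falls out of Boole's operations:

    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    def XOR(a, b):
        # (a OR b) AND NOT (a AND b)
        return AND(OR(a, b), NOT(AND(a, b)))

    def half_adder(a, b):
        return XOR(a, b), AND(a, b)          # sum bit, carry bit

    def full_adder(a, b, carry_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, OR(c1, c2)                # sum bit, carry out

    # ripple-carry addition of two 4-bit numbers
    def add4(x, y):
        carry, result = 0, 0
        for i in range(4):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    print(bin(add4(0b0101, 0b0011)))  # 0b1000 (5 + 3 = 8)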

Alan Turing introduced the Turing Machine, a mathematical model of a computer, in 1936, and according to the Church-Turing thesis everything that can be effectively computed (by a mathematician using pen and paper) can also be computed by a Turing Machine.

With the help of the Turing Machine it was possible to define problems and write algorithms for solving them. With Boolean algebra it was possible to build binary computers to run these problem-solving algorithms.
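
A minimal Turing Machine simulator fits in a few lines of Python; the tiny program below, which simply inverts a tape of 0s and 1s, is my own toy example:

    # transition table: (state, symbol) -> (write, move, next_state)
    rules = {
        ("invert", "0"): ("1", +1, "invert"),
        ("invert", "1"): ("0", +1, "invert"),
        ("invert", "_"): ("_", 0, "halt"),   # blank symbol ends the run
    }

    def run(tape, state="invert", head=0):
        tape = list(tape)
        while state != "halt":
            symbol = tape[head] if head < len(tape) else "_"
            write, move, state = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            head += move
        return "".join(tape)

    print(run("010110"))  # -> 101001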

So, in short, computers can compute everything that our math is able to describe.

Everything?

Haha, we would live in another world if that were the case.

Of course, the available processing power and memory limit the actual computation of problem-solving algorithms.

But besides the technical limitation there is a mathematical one: some mathematical problems are simply not decidable - see the famous "Entscheidungsproblem".

Mathematicians are able to define problems which cannot be solved by running algorithms on computers.
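
The classic proof sketch behind this, in Python-flavoured pseudocode - assume a perfect decider halts() existed, then the little program below defeats it either way:

    def halts(program, data):
        # hypothetical perfect halting decider - such a function
        # cannot exist, that is exactly Turing's point
        raise NotImplementedError

    def paradox(program):
        if halts(program, program):
            while True:        # if the decider says "halts", loop forever
                pass
        else:
            return             # if it says "loops forever", halt at once

    # paradox(paradox) would halt if and only if it does not halt,
    # so no total, correct halts() can exist.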

Turing showed that even with an Oracle Machine there will be some limitations, and some scientists believe that only with real quantum computers will we be able to build hyper-Turing machines...

A Brief History Of Computers

"I think there is a world market for maybe five computers."
Thomas J. Watson (CEO of IBM), 1943

Roots

I guess ever since humans have had fingers, they have counted and computed with them, and ever since they have had tools, they have carved numbers into bones.

Across different cultures and eras there have been different kinds of number systems to compute with.

Our global civilization mostly uses the Hindu-Arabic numerals with the decimal number system, base 10; our computers commonly use the binary number system, base 2, the famous 0s and 1s. But other cultures had other systems: the Maya used base 20, Babylon base 60, and the Chinese base 16, the hexadecimal system, which is also used in computer science.
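
A tiny base converter makes the point that these are just different notations for the same numbers; here the decimal number 1899 is written in the bases mentioned above:

    def to_base(n, base, digits="0123456789ABCDEFGHIJ"):
        # returns the digits of n written in the given base (up to base 20)
        out = []
        while n:
            n, r = divmod(n, base)
            out.append(digits[r])
        return "".join(reversed(out)) or "0"

    for base in (2, 10, 16, 20, 60):
        if base <= 20:
            print(f"1899 in base {base:2d}: {to_base(1899, base)}")
        else:
            # base-60 digits do not fit into single characters, print them as a list
            n, parts = 1899, []
            while n:
                n, r = divmod(n, base)
                parts.append(r)
            print(f"1899 in base {base:2d}: {list(reversed(parts))}")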

The first computing devices were mechanical helpers, like the Abacus, Napier's Bones or the Slide Rule; they did not perform computations on their own, but were used to represent numbers and apply arithmetic operations to them, like addition, subtraction, multiplication and division.

Mechanical Computers

The first mechanical computing machine is considered to be the Antikythera Mechanism, found in a Greek ship that sank about 70 BC. But actually it is not a computer, because it does not perform computations; it is an analog astronomical clock, a sun and moon calendar that shows solar and lunar eclipses.

In the 17th century the first mechanical computing machines were proposed and built.

Wilhelm Schickard designed a not fully functional prototype in 1623.

The Pascaline, designed by Blaise Pascal in 1642, was the first operational and commercially available mechanical computer, able to perform the 4 basic arithmetic operations.

In 1672 the German mathematician Gottfried Wilhelm Leibniz invented the stepped cylinder, used in his not fully functional Stepped Reckoner.

[update 2023-06-05]

The human information age itself seems to start with the discovery of electromagnetism in the 19th century: the telegraph system, the phone, the radio. Already in the 19th century, electro-mechanical "accumulating, tabulating, recording" machines were present, like those from Herman Hollerith, used in the American Census of 1890, which culminated in the foundation of companies like IBM, Big Blue, in 1911 and Bull in ~1921; both used punched cards for their data processing machinery.

The battleships of WWI had the so-called "plotting room" at their centre; it contained dedicated, electro-mechanical machines for the fire-control system of their gun turrets. Submarines of WWII had dedicated, analog computing devices for the fire-control systems of their torpedoes.

With the Curta the use of mechanical calculators lived on, up to the advent of portable electronic calculators in the 1960s.

Programmable Computers

The punch card for programming a machine was introduced by Joseph Marie Jacquard in 1804 with his automated weaving loom, the Jacquard Loom, for producing textiles with complex patterns.

In 1837 Charles Babbage (considered the father of the computer) was the first to describe a programmable, mechanical computer, the Analytical Engine.

Ada Lovelace (considered the mother of programming) worked with Babbage and was the first person to publish a computer algorithm, the computation of Bernoulli numbers.

Babbage was ahead of his time, as he described all the parts a modern computer has - CPU, memory, input/output - but he was not able to realize his machine due to missing funds and the limited engineering capabilities of his era.

About a century later, Konrad Zuse's Z3, built in 1941, is considered to be the first binary, freely programmable computer. It used ~600 telephone relays for computation and ~1400 relays for memory, a keyboard and punched tape as input, lamps as output, and it operated at about 5 Hertz.

Mainframes

Zuse's machines mark the advent of the first mainframes used by military and science during and after WWII.

Colossus Mark I (1943), ENIAC (1945) and the IBM 704 (1954), for example, used vacuum tubes instead of relays and were replaced more and more by transistor-based computers in the 1960s.

Home Computers

With small chips, at first integrated circuits, then microchips, it became possible to build smaller and reasonably priced Home Computers in the 1970s. IBM and other big players underestimated this market, so Atari, Apple, Commodore, Sinclair, etc. started the Home Computer Revolution: one computer for every home.

Some of the first machines came as self-assembly kits, like the Altair 8800 (1975), or with built-in TV output, like the Apple I (1976), or as fully assembled video game consoles, like the Atari VCS (1977), followed by more powerful models with a graphical user interface, like the Apple Mac (1984) or the Commodore Amiga 1000 (1985).

Personal Computers

IBM started the Personal Computer era in 1981 with the 5150. Third-party developers were able to provide operating systems, like Microsoft DOS, or hardware extensions for the standardized hardware specification, like hard drives, video cards, sound cards, etc.; soon other companies created clones of the IBM PC, the famous "PC Compatibles".

Gaming was already an important sales argument in the Home Computer era. The early PC graphics standards like CGA and EGA were not really able to compete with the graphics generated by the Denise chip in a Commodore Amiga 500, but with the rise of the SVGA standard (1989) and the compute power of the Intel 486 CPU (1989), game studios were able to build games with superior 3D graphics, like Wolfenstein 3D (1992), Comanche (1992) or Strike Commander (1993), and the race for higher display resolutions and more detailed 3D graphics continues until today.

With operating systems based on graphical user interfaces, like OS/2, X11 or Windows 95 in the 1990s, PCs finally replaced the Home Computers.

Another recipe for the success of the PC might be that there were multiple CPU vendors for the same architecture (x86), like Intel, AMD, Cyrix or WinChip.

Internet of Things

The Internet was originally designed to connect military institutions in a redundant way, so that if one network element fails, the rest is still operable.

The available bandwidth evolves like compute power, exponentially: at first mainly text was transmitted, like emails (1970s) or newsgroups (1980s), followed by web pages with images (.gif/.jpg) via the World Wide Web (1989) or Gopher (1991), audio as .mp3 (~1997), and finally Full HD video via streaming platforms like YouTube or Netflix.

In the late 1990s, mobile phones like the Nokia Communicator, MP3 audio players, PDAs (Personal Digital Assistants) like the Palm Pilots, and digital cameras marked the rise of the smart devices - the switch from one computer for every home to many computers for one person.

Their functions were all united in the smartphone, which, with mobile, high-bandwidth internet, is still on its triumphant march across the globe.

I am not able to portray the current state of computer and internet usage, it is simply too omnipresent - from word processing to AI research, from fake news to the dark net, from botnets of webcams to data leaks in toys...

The next thing

but I can guess what the next step will be: Integrated Devices, the BCI, the Brain-Computer Interface, connected via the Internet to a real kind of Matrix.

It seems only logical to conclude that we will connect with machines directly, implant chips, or develop non-invasive scanners, so the next bandwidth demand will be brainwaves, in all kinds of forms.

[updated on 2023-08-05]

Zuse's Devil's Wire

German computer pioneer Konrad Zuse discussed the mechanism of a feedback between computation result and executed program in 1983, in his lecture "Faust, Mephistopheles and Computer", and coined the term Devil's Wire.

In the early days of computer history, the program to compute and the data to compute on were separated.

Nowadays computers use the same memory for both, so it is possible to write programs that manipulate their own program code.
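
A toy sketch of the Devil's Wire: a made-up little register machine whose program and data live in the same memory, so one instruction can overwrite another instruction before it gets executed:

    # memory holds the program AND the data; cells 0..4 are instructions,
    # cell 5 is data. Instruction format: (opcode, argument).
    memory = [
        ("print", 5),                    # 0: print the data cell
        ("store", (3, ("print", 5))),    # 1: overwrite instruction 3 below
        ("add", 5),                      # 2: add 1 to the data cell
        ("halt", None),                  # 3: gets replaced at runtime by instruction 1
        ("halt", None),                  # 4: the real end of the program
        41,                              # 5: the data cell
    ]

    pc = 0                               # program counter
    while True:
        op, arg = memory[pc]
        if op == "halt":
            break
        elif op == "print":
            print("data =", memory[arg])
        elif op == "add":
            memory[arg] += 1
        elif op == "store":              # the devil's wire: code rewrites code
            target, instruction = arg
            memory[target] = instruction
        pc += 1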

Zuse says that Mephistopheles stands behind every technology and grins, but the modern world needs computers to solve current and upcoming problems - better read the lecture yourself...

+1 points for the Singularity to take off.

AI - Antichrist

"Technology itself is neither good nor bad. People are good or bad."
Naveen Jain

Actually I believe the Revelation as described in the Bible already happened, about 60 AD, and the beast with the number 666 is to be identified with the Roman Empire and Caesar Nero.

But inspired by this blog, I will give a modern interpretation a try, so feel free to join me in an alternate and speculative world paradigm...

Short version

Technology is the Antichrist, and computer driven AI is the peak of technology.

Preamble

Over 10000 years ago we left the Garden of Eden and started the Neolithic Revolution: we started farming and keeping livestock, we started to use technology to make our lives easier, but over the centuries and millennia we forgot how to live with Mother Earth in a balanced way.

Full version

Revelation 13:4

"And they worshipped the dragon which gave power unto the beast: and they worshipped the beast, saying, Who is like unto the beast? who is able to make war with him?"

[update 2024-02-22]

An AI as Antichrist, Chess as a war game - who is going to beat the AI at Chess?

Revelation 13:18

"Here is wisdom. Let him that hath understanding count the number of the beast:  for it is the number of a man; and his number is Six hundred threescore and six."

Using the English-Sumerian gematria system, which is based on 6 - A=6, B=12, C=18...Z=156 - the word "computer" counts to 666.

The first recorded use of the word computer, for a person doing computations, was by Richard Braithwait in his book "The Yong Mans Gleanings" in 1613.

Using the English-Sumerian gematria method, the name "Braithwait" also counts to 666.

Rev 13:18 could act like a puzzle with a checksum: "computer" is the name of the beast, but the name (the number) was coined by a man, and the man who coined the name has the number 666 too.
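
The counting is easy to check, a few lines of Python with the A=6, B=12, ... scheme described above:

    def sumerian_gematria(word):
        # A=6, B=12, ..., Z=156: position in the alphabet times 6
        return sum((ord(c) - ord('a') + 1) * 6 for c in word.lower() if c.isalpha())

    for word in ("computer", "Braithwait"):
        print(word, "->", sumerian_gematria(word))   # both print 666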

Revelation 13:16-17

"And he causeth all, both small and great, rich and poor, free and bond, to receive a mark in their right hand, or in their foreheads: And that no man might buy or sell, save he that had the mark, or the name of the beast, or the number of his name."

Nowadays, without a Smart Phone (or upcoming Smart Glasses), or a computer, you are limited in your daily business, from renting a car to making payments.

So the mark is already here: the Smart Phone in the right hand, the upcoming Smart Glasses on the forehead, and the computer in general.

Revelation 13:15

"And he had power to give life unto the image of the beast, that the image of the beast should both speak, and cause that as many as would not worship the image of the beast should be killed."

An image of the beast is given life and people are going to worship it...AI God Religion Spotted

Revelation 16 - The Seven Bowls of God’s Wrath

Rev 16:2

"And the first went, and poured out his vial upon the earth; and there fell a noisome and grievous sore upon the men which had the mark of the beast,  and upon them which worshipped his image."

Considering computers as the mark of the beast, the sore could be cancer caused by radiation.

Rev 16:3

"And the second angel poured out his vial upon the sea; and it became as the blood of a dead man: and every living soul died in the sea."

Our sea world is dying, overfishing, plastic particles, acidification, etc.

Rev 16:4

"And the third angel poured out his vial upon the rivers and fountains of waters; and they became blood"

Blood in Judaism is impure and Jews are not allowed to eat it, this could mean that our rivers get poisoned.

Rev 16:8-9

"And the fourth angel poured out his vial upon the sun; and power was given unto him to scorch men with fire. And men were scorched with great heat, and blasphemed the name of God, which hath power over these plagues: and they repented not to give him glory."

Climate Change already causes increasing heatwaves and droughts.

Rev 16:10-11

"And the fifth angel poured out his vial upon the seat of the beast; and his kingdom was full of darkness; and they gnawed their tongues for pain, And blasphemed the God of heaven because of their pains and their sores, and repented not of their deeds."

This one can be interpreted as God shutting down the internet, the kingdom of the beast. Some scientists conclude that a pole shift is currently underway; this could cause the magnetic field around Earth to collapse, so the electromagnetic waves from the sun could damage computer chips worldwide.

[update 2020-11-04]

Pretty obviously, the Internet seems a natural fit to be the kingdom of the beast (a computer-driven AI), so what does it mean that it was 'full of darkness'? Hehe, ever wondered about the dark web, fake news, hate speech etc.? Darkness.

Rev 16:12-14

"And the sixth angel poured out his vial upon the great river Euphrates; and the water thereof was dried up, that the way of the kings of the east might  be prepared. And I saw three unclean spirits like frogs come out of the mouth of the dragon, and out of the mouth of the beast, and out of the mouth of the false prophet. For they are the spirits of devils, working miracles, which go forth unto the kings of the earth and of the whole world, to gather them to the battle of that great day of God Almighty."

This one is clear, the Euphrates river dries up, and it is scary to watch it really happen. Dunno about the frogs and kings.

Rev 16:17-21

"And the seventh angel poured out his vial into the air; and there came a great voice out of the temple of heaven, from the throne, saying, It is done. And there were voices, and thunders, and lightnings; and there was a great  earthquake, such as was not since men were upon the earth, so mighty an  earthquake, and so great. And the great city was divided into three parts, and the cities of the nations fell: and great Babylon came in remembrance before God, to give unto her thecup of the wine of the fierceness of his wrath. And every island fled away, and the mountains were not found. And there fell upon men a great hail out of heaven, every stone about the weight of a talent: and men blasphemed God because of the plague of the hail;  for the plague thereof was exceeding great."

An earthquake so strong has never happened before in the history of mankind.

[update 2023-06-16]

Maybe the seventh bowl is a global nuclear war/strike? The finale.

Closing Words

There are many passages in the Revelation I cannot interpret in a way that the computer is the Antichrist: the seven heads, horns and ten crowns of the dragon, the mortal wound, or the first and second beast, etc.

The Roman Empire with Caesar Nero as Antichrist simply fits better.

But please leave a comment if you have further puzzle pieces for the AI Antichrist.

So, considering the pure potential of the meme AI Antichrist,

I give -1 points for the Singularity to take off.

On Artificial Neural Networks

It is non-stop in the news, every week it pops up in another corner: AIs based on Deep Neural Networks. So I will give it a try and write a little, biased article about this topic...

The brain

The human brain consists of about 100 billion neurons - as many as there are stars in our galaxy, the Milky Way - and each neuron is connected via synapses to about 1000 other neurons, resulting in 100 trillion connections.

For comparison, the game-playing AI AlphaZero by Google DeepMind used about 50 million connections to play chess at a superhuman level.

The inner neurons of our brain are connected to the outer world via our senses - eyes, ears, etc.

A neuron has multiple weighted inputs and one output; if a certain threshold of input is reached, its output is activated and the neuron fires a signal to other neurons.

The activation of the synapse is an electrical and chemical process; neurotransmitters can restrain or foster the activation potential - just consider the effect alcohol or coffee has on your cognitive performance.

Common artificial neural networks do not emulate the chemical part.

The brain wires these connections between neurons during learning, so they can act as memory, or can be used for computation.
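
A single artificial neuron as described above - weighted inputs, a threshold, a binary output - in a few lines of Python; the weights and the threshold are arbitrary example values:

    def neuron(inputs, weights, threshold):
        # weighted sum of the inputs; the neuron "fires" (outputs 1)
        # only if the sum reaches the threshold
        activation = sum(x * w for x, w in zip(inputs, weights))
        return 1 if activation >= threshold else 0

    # example: a neuron that fires only if both inputs are active (an AND gate)
    weights, threshold = [0.6, 0.6], 1.0
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", neuron([a, b], weights, threshold))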

The "von Neumann" architecture

Most of today's computers are based on the von Neumann architecture; they have no neurons or synapses, but transistors.

The main components are the ALU (Arithmetic Logic Unit), memory for program and data, and various inputs and outputs.

Artificial Neural Networks have to be built in software, running on these von Neumann computers.

Von Neumann said that his proposed architecture was inspired by the idea of how the brain works, memory and computation. And in his book, "The Computer and the Brain", he gives a comparison of computers and the knowledge about biological neural networks of that time.

Dartmouth

The first work on ANNs was published as early as the 1940s, and in 1956 the "Dartmouth Summer Research Project on Artificial Intelligence" was held, coining the term Artificial Intelligence and marking a milestone in AI. The work on ANNs continued, and the first neuromorphic chips were developed.

AI-Winter

In the 1970s the AI Winter occurred: problems in computational theory and the lack of the compute power needed by large ANNs resulted in cut funding and a split of the work into strong and weak AI.

Deep Neural Networks

With the rise of compute power (driven by GPGPU), further research, and Big Data, it became possible to train better and larger networks faster in the 21st century.

The term Deep Neural Networks, for deep hierarchical structures or deep learning techniques, was coined.

One of the first and most common uses for ANNs was and is pattern recognition, for example character recognition.

You can train a neural network with a set of samples of the same character that look different, with the aim that the ANN will recognize the character in its various appearances.

With a deeper topology of the neural network it is possible to identify, for example, pictures of cars, with different net layers for color, shape, etc.
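
A minimal sketch of such training, assuming tiny 3x3 "images" of the letters T and L and the classic perceptron learning rule - not a deep network, just the smallest possible example:

    # 3x3 pixel bitmaps, flattened to 9 inputs; label 1 = "T", label 0 = "L"
    samples = [
        ([1,1,1, 0,1,0, 0,1,0], 1),   # a clean T
        ([1,1,1, 0,1,0, 0,1,1], 1),   # a slightly noisy T
        ([1,0,0, 1,0,0, 1,1,1], 0),   # a clean L
        ([1,0,0, 1,0,1, 1,1,1], 0),   # a slightly noisy L
    ]

    weights, bias, lr = [0.0] * 9, 0.0, 0.1

    def predict(pixels):
        s = bias + sum(w * p for w, p in zip(weights, pixels))
        return 1 if s >= 0 else 0

    for epoch in range(20):                     # a few passes over the data
        for pixels, label in samples:
            error = label - predict(pixels)     # perceptron learning rule
            weights = [w + lr * error * p for w, p in zip(weights, pixels)]
            bias += lr * error

    for pixels, label in samples:
        print("expected", label, "got", predict(pixels))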

The Brain vs. The Machine

A computer can perform fast arithmetic and logical operations; for this, transistors are used.

In contrast, the neural network of our brain works massively parallel.

The synapses of the human brain are clocked at 10 to 100 hertz, meaning they can fire to other neurons up to 100 times per second.

Today's computer chips are clocked at 4 gigahertz, meaning they can compute 4,000,000,000 operations per second per ALU.

The brain has 100 billion neurons and 100 trillion connections and consumes ~20 watts; today's biggest chips have 12 billion transistors with a power usage of 250 watts.

We cannot compare the compute power of a brain directly with a von Neumann computer, but we can estimate what kind of computer we would need to map the neural network of a human brain.

Assuming 100 trillion connections, we would need about 400 terabytes of memory to store the weights of the neurons. Assuming 100 hertz as clock rate, we would need at least 40 petaFLOPS (floating point operations per second) to compute the activation potentials.
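
The back-of-the-envelope math, spelled out; the 4 bytes per weight and the ~4 operations per connection update are my assumptions to arrive at the numbers above:

    connections      = 100e12   # ~100 trillion synapses
    bytes_per_weight = 4        # assumed: one 32-bit float per connection
    firing_rate_hz   = 100      # upper estimate of synaptic firing rate
    ops_per_update   = 4        # assumed: multiply, add, plus activation overhead

    memory_bytes = connections * bytes_per_weight
    flops        = connections * firing_rate_hz * ops_per_update

    print(f"memory : {memory_bytes / 1e12:.0f} terabytes")   # -> 400 terabytes
    print(f"compute: {flops / 1e15:.0f} petaFLOPS")          # -> 40 petaFLOPS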

For comparison, the current number one high performance computer in the world is able to perform ~93 petaFLOPS and has ~1 petabyte of memory, but a power consumption of more than 15 megawatts.

So, considering simply the energy efficiency of the human brain,
I give -1 points for the Singularity to take off.

Super AI in Sci-Fi

Books and movies address our collective fears, hopes and wishes, and there seem to be mainly five story lines concerning AI in Sci-Fi...

Super AI achieves world domination
Colossus, Terminator, Matrix

Something went wrong
2001: A Space Odyssey, Das System, Ex Machina

Super AI evolves, more or less peacefully
Golem XIV, A.I., Her

The Cyborg scenario, man merges with machine
Ghost in the Shell, #9, Transcendence

There are good ones, and there are bad ones
Neuromancer, I, Robot, Battlestar Galactica

+1 points for the Singularity to take off.

Robophilosophy 2018

Human philosophers discuss the impact of social robots on mankind; still no Strong AI in sight to join the debate.

Cherry picking...

The Moral Life of Androids - Should Robots Have Rights?
Edward Howlett Spence

"The question I explore is whether intelligent autonomous Robots will have moral rights. Insofar as robots can develop fully autonomous intelligence, I will argue that Robots will have moral rights for the same reasons we do. ..."

Robot Deus
Robert Trappl

"The ascription of god-like properties to machines has a long tradition. Robots of today invite to do so. We will present and discuss god-like properties, to be found in movies as well as in scientific publications, advantages and risks of robots both as good or evil gods, and probably end with a robot theology."

+1 points for the Singularity to take off.

The Turing Test

“He who cannot lie does not know what truth is.”
Friedrich Nietzsche, Thus Spoke Zarathustra

The Turing Test, proposed by mathematician Alan Turing in 1950, was developed to examine whether an AI has reached human-level intelligence.

Simplified, a person conducts text chats with a human and the AI; if the person is not able to discern which chat partner is the AI, then the AI has passed the Turing Test.

The Loebner Prize holds a Turing Test contest every year.

It took me some time to realize that the Turing Test is not so much about intelligence, but about lying and empathy.

If an AI wants to pass the Turing Test it has to lie to the chat partner, and to be able to lie, it has to develop some level of empathy, and some level of self-awareness.

Besides other criticism, the Chinese Room Argument states that no consciousness is needed to perform such a task, and therefore other tests have been developed.

Personally I prefer the Metzinger Test, a hypothetical event in which AIs start to debate with human philosophers and successfully defend their own theory of consciousness.

I am not sure if the Singularity is going to take off, but I guess that the philosophers' corner is one of the last domains that AIs are going to conquer, and if they succeed we can be pretty sure to have another apex species on Earth.

Turing predicted that by the year 2000 machines would fool 30% of human judges; he was wrong, the Loebner Prize still has no Silver Medal winner for the 25-minute text chat category.

So, -1 points for the Singularity to take off.

On Peak Human

One of the early Peak Human prophets was Malthus; in his 1798 book, 'An Essay on the Principle of Population', he postulated that the human population grows exponentially, but food production only linearly, so population growth will fluctuate around an upper limit.
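
Malthus' argument as a toy simulation - exponential population versus linear food production; the growth rates and starting values are arbitrary, only the shape of the curves matters:

    population = 1.0    # in arbitrary units
    food       = 2.0    # enough for 2.0 population units at the start
    growth     = 0.03   # assumed 3% population growth per year
    food_gain  = 0.05   # assumed constant (linear) increase per year

    for year in range(0, 200):
        if population > food:
            print(f"year {year}: population ({population:.2f}) "
                  f"exceeds food supply ({food:.2f})")
            break
        population *= 1 + growth    # exponential growth
        food       += food_gain     # linear growth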

Later, Paul R. Ehrlich predicted in his book 'The Population Bomb' (1968) that we would reach a limit in the 1980s.

Meadows et al. concur in 'The Limits to Growth - The 30-Year Update' (2004) that we already reached an upper limit in the 1980s.

In 2015, Emmott concluded in his movie 'Ten Billion' that we have already passed the upper bound.

UN predictions say we may hit 9 billion humans in 2050, so the exponential population growth rate is already declining, but the effects of a wasteful economy pop up in many corners.

Now, in 2018, we are about 7.4 billion humans, and I say Malthus et al. were right.

It is not about how many people Earth can feed, but how many people can live in a comfortable but sustainable manner.

What does Peak Human mean for the Technological Singularity?

The advent of Computers was driven by the exponential population growth in the 20th century. All the groundbreaking work was done in the 20th century.

When we face a decline in population growth, we also have to face a decline in new technologies developed.

Because it is not only about developing new technologies, but also about maintaining the old knowledge.

Here is the point where AI steps in: mankind's population growth is slowing down, but the whole AI sector is growing and expanding.

Therefore the question is: is AI able to compensate for the decline?

Time will tell.

I guess the major uncertainty is how Moore's Law will live on beyond 2021, when 4 nm transistor production is reached, which some scientists consider a physical and economical barrier.

I predict that by the time we hit the 8 billion humans mark, we will have developed another groundbreaking technology, similar to the advent of the transistor, the integrated circuit and the microchip.

So, considering the uncertainty of Peak Human vs. Rise of AI,
I give ±0 points for the Singularity to take off.

The Rise Of The Matrix

Looking at the tag cloud of this blog, there are two major topics, pro and contra Singularity: AI (Artificial Intelligence) vs. ELE (Extinction Level Event).

So, we slide, step by step, towards an event called the Singularity, but concurrently we face more and more the extinction of mankind.

What about combining those two events?

Let us assume we damage our ecosphere lastingly, but at the same moment our technology advances to a level where it is possible to connect directly with cyberspace via a Brain-Computer Interface.

People already spend more and more time in virtual realities; with the advent of Smart Phones they are connected to cyberspace all the time, they meet people in digital social networks, they play games in computer-generated worlds, they create and buy virtual goods with virtual money, and, essentially, they like it.

To prevent an upcoming ELE, we would need to cut our consumption of goods significantly, but the mass of people wants more and more.

So, let us give them more and more - in virtual, computer-generated worlds.

Let us create the Matrix, where people can connect directly with their brain, and buy whatever experience they wish.

A virtual car would need only some electricity and silicon to run on, and the harm to Mother Earth would be significantly less than that of a real car.

We could create millions or billions of new jobs, all busy with designing virtual worlds, virtual goods, and virtual experiences.

And Mother Earth will get a break, to recover from the damage billions of consuming people have caused.

ELE + Singularity => Matrix

+1 points for the Singularity to take off.

Home - Top