## Fun From the Singularity Future...


Wednesday, 21 August 2019

How do AIs prove their existence?

'I compute, therefore I am.'

*duck*

Thursday, 05 July 2018

In physics, a singularity is a point in spacetime where our current theories are no longer valid; we are literally unable to describe what happens inside, because the density becomes infinite.

The technological Singularity, as described by transhumanists, is a stage of technological development at which humans are no longer able to understand the underlying processes. The technological environment starts to feed its own development in a feedback loop: computers help to build better computers, which help to build better computers, which help to build better computers... and so on.

So, when will the technological Singularity take off?

Considering the feedback loop, it is already present, perhaps since the first computers were built.

Considering a density of information processing that exceeds human understanding, we may have reached that point too.

Imagine a computing technique that is easy to set up and use, outperforms any human at its task, but whose inner workings we cannot really explain: a black box. Such a technique exists (and is currently hyped): ANNs, Artificial Neural Networks.

Of course we know what happens inside, because we built the machine, but when it comes to the question of reasoning, why the machine did this or that, we really have a black box in front of us.

So, humans already build better computers with the help of better computers, and humans use machines that outperform humans at a specific task and are not really able to explain their results...

obviously, +1 points for the Singularity to take off.

Sunday, 10 June 2018

Singularity observation hereby paused.....

Sunday, 27 May 2018

*"Computer Science is no more about computers than astronomy is about telescopes."*

So, we have a biased overview of the history of computers, but what do these computers actually compute?

The first mechanical computers of the 17th century were able to perform the 4 basic arithmetic operations: addition, subtraction, multiplication and division. As soon as a computer can perform addition, it can also perform the other 3 operations, which can be broken down, in multiple steps, into additions.
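This reduction can be sketched in a few lines of Python (a toy illustration with names of my own choosing, not how a real ALU works): multiplication as repeated addition, division as repeated subtraction, and subtraction itself as adding a negated value.

```python
# Multiplication as repeated addition.
def mul(a, b):
    total = 0
    for _ in range(b):   # add a, b times
        total = total + a
    return total

# Division as repeated subtraction, where subtraction is
# expressed as adding the negative of the divisor.
def divmod_by_subtraction(a, b):
    q = 0
    while a >= b:        # subtract b until the remainder is smaller than b
        a = a + (-b)
        q = q + 1
    return q, a          # quotient and remainder

print(mul(6, 7))                     # → 42
print(divmod_by_subtraction(17, 5))  # → (3, 2)
```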

Nowadays computers are binary, meaning they compute with base 2: zeros and ones, true and false, power on and power off. For this, transistors are used; they work like relays and are coupled together to form logic circuits, which perform the actual computation.

The Z3 (1941) had 600 relays for computation, the 6502 chip (1975) had about 3500 transistors, and today's CPUs (2018) have billions of them.

So, all these funny programs out there are broken down into simple arithmetic and logical operations. To perform such magic, some math is needed.

George Boole introduced Boolean algebra in 1847, with the three basic logical components: the AND, OR and NOT gates. With these simple gates, logic circuits can be built that perform the addition of values.
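As a sketch in Python, the three gates can be modeled as plain functions and wired into a ripple-carry adder, the same way hardware chains full adders (the function names here are mine, for illustration):

```python
# The three basic gates of Boolean algebra as plain functions on bits.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# XOR built only from AND, OR and NOT: (a OR b) AND NOT(a AND b)
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

# A full adder: two bits plus a carry-in give (sum bit, carry-out).
def full_adder(a, b, cin):
    s1 = XOR(a, b)
    return XOR(s1, cin), OR(AND(a, b), AND(s1, cin))

# Chain four full adders into a 4-bit ripple-carry adder.
def add4(x, y):
    carry, result = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result  # wraps modulo 16, like real 4-bit hardware

print(add4(5, 9))  # → 14
```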

Alan Turing introduced the Turing machine in 1936, a mathematical model of a computer, and the Church-Turing thesis states that everything that can be effectively computed (by a mathematician using pen and paper) can also be computed by a Turing machine.

With the help of the Turing machine it was possible to define problems and write algorithms for solving them. With Boolean algebra it was possible to build binary computers to run these problem-solving algorithms.
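A minimal Turing machine fits in a few lines of Python. This toy example (the rule encoding and names are my own) appends one '1' to a unary number, i.e. it increments:

```python
# A tiny Turing machine simulator: the tape is a dict from position
# to symbol, '_' is the blank symbol, rules map (state, read symbol)
# to (write symbol, move direction, next state).
def run_tm(rules, tape, state, pos=0, max_steps=1000):
    for _ in range(max_steps):
        if state == 'halt':
            return tape
        symbol = tape.get(pos, '_')
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == 'R' else -1
    raise RuntimeError('step limit reached')

# Unary increment: scan right over the 1s, write one more 1, halt.
rules = {
    ('scan', '1'): ('1', 'R', 'scan'),
    ('scan', '_'): ('1', 'R', 'halt'),
}
tape = {i: '1' for i in range(3)}            # tape holds "111"
result = run_tm(rules, dict(tape), 'scan')
print(''.join(result[i] for i in sorted(result)))  # → 1111
```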

So, in short, computers can compute everything that our math is able to describe.

Everything?

Haha, we would live in a different world if that were true.

Of course, the available processing power and memory limit the actual computation of problem-solving algorithms. But besides the technical limitation there is a mathematical one: some mathematical problems are simply not decidable, see the famous "Entscheidungsproblem". Mathematicians are able to define problems which cannot be solved by running algorithms on computers.
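The classic diagonalization argument behind this can even be sketched in Python: hand in any candidate "halting decider" and we can construct a program it misjudges. This is a toy illustration of Turing's proof idea, not a rigorous formalization:

```python
def refute(candidate_halts):
    """Given any claimed halting decider, build a program it gets wrong."""
    def g():
        if candidate_halts(g):
            while True:      # loop forever exactly when predicted to halt
                pass
        # ...and return immediately exactly when predicted to loop forever

    if candidate_halts(g):
        # We dare not call g() here, it would loop forever,
        # which already contradicts the prediction "halts".
        return "wrong: g was predicted to halt, but loops forever"
    g()  # g halts immediately...
    return "wrong: g was predicted to loop forever, but just halted"

# No matter what the decider answers, it is refuted:
print(refute(lambda f: True))
print(refute(lambda f: False))
```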

Turing showed that even with an oracle machine there will still be limitations, and some scientists believe that only with real quantum computers will we be able to build hyper-Turing machines...

Sunday, 27 May 2018

*"I think there is a world market for maybe five computers"*

Thomas J. Watson (CEO of IBM), 1943

I guess since humans have had fingers, they have counted and computed with them, and since they have had tools, they have carved numbers into bones.

Across different cultures and eras there have been different numbering systems to compute with. Our global civilization mostly uses the Hindu-Arabic numerals with the decimal number system, base 10; our computers commonly use the binary number system, base 2, the famous 0s and 1s. But other cultures used other systems: the Maya base 20, Babylon base 60, or the Chinese base 16, the hexadecimal system, which is also used in computer science.
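Changing the base is purely mechanical: repeated division by the base collects the digits. A small Python sketch covering the bases mentioned above (digits are returned as plain numbers, most significant first):

```python
# Convert a non-negative integer into its digits in an arbitrary base.
def to_base(n, base):
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)   # least significant digit first
        n //= base
    return digits[::-1]           # reverse to most significant first

print(to_base(1337, 2))   # → [1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1]
print(to_base(1337, 16))  # → [5, 3, 9], i.e. 0x539
print(to_base(1337, 60))  # → [22, 17], Babylonian style
```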

The first computing devices were mechanical helpers, like the abacus, Napier's bones or the slide rule; they did not perform computations on their own, but were used to represent numbers and apply the arithmetic operations to them: addition, subtraction, multiplication and division.

The oldest known mechanical computing machine is considered to be the Antikythera mechanism, found in a Greek ship that sank about 70 BC. But it is actually not a computer, because it does not perform computations; it is an analog astronomical clock, a sun and moon calendar that shows solar and lunar eclipses.

In the 17th century the first mechanical computing machines were proposed and built. Wilhelm Schickard designed a not fully functional prototype in 1623. The Pascaline, designed by Blaise Pascal in 1642, was the first operational and commercially available mechanical computer, able to perform the 4 basic arithmetic operations (multiplication and division via repeated addition and subtraction).

In 1672 the German mathematician Gottfried Wilhelm Leibniz invented the stepped cylinder, used in his not fully functional Stepped Reckoner. With the Curta, mechanical calculators lived on until the advent of portable electronic calculators in the early 1970s.

The punch card for programming a machine was introduced by Joseph Marie Jacquard in 1804 with his automated weaving loom, the Jacquard Loom, for producing textiles with complex patterns.

In 1837 Charles Babbage (considered the father of the computer) was the first to describe a programmable, mechanical computer, the Analytical Engine.

Ada Lovelace (considered the mother of programming) worked with Babbage and was the first person to publish a computer algorithm: the computation of Bernoulli numbers.
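Lovelace's published algorithm computed Bernoulli numbers; a modern sketch using the standard recurrence (with exact fractions, a luxury she did not have) might look like this. Note it uses today's recurrence formulation, not her exact Note G procedure:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n+1 Bernoulli numbers via the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 (convention B_1 = -1/2)."""
    B = []
    for m in range(n + 1):
        if m == 0:
            B.append(Fraction(1))
        else:
            B.append(-Fraction(1, m + 1)
                     * sum(comb(m + 1, j) * B[j] for j in range(m)))
    return B

print([str(b) for b in bernoulli(4)])  # → ['1', '-1/2', '1/6', '0', '-1/30']
```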

Babbage was ahead of his time: he described all the parts a modern computer has (CPU, memory, input/output), but he was not able to realize his machine, due to missing funds and the limited engineering capabilities of his era.

About a century later, Konrad Zuse's Z3, built in 1941, is considered the first binary, freely programmable computer. It used 600 telephone relays for computation and 1400 relays for memory, a keyboard and punched tape as input, lamps as output, and it operated at 5 Hertz.

Zuse's machines mark the advent of the first mainframes, used by military and science during and after WWII. The Colossus Mark I (1943), ENIAC (1946) and IBM 650 (1953), for example, used vacuum tubes instead of relays and were more and more replaced by transistor-based computers in the 1960s.

With small chips, integrated circuits, it became possible to build smaller and affordable home computers. IBM and other big players underestimated this market, so Atari, Apple, Commodore, Sinclair, etc. started the Home Computer Revolution: one computer for every home.

Some early models came as self-assembly kits, like the Altair 8800 (1974), or with built-in TV output, like the Apple I (1976), or as fully assembled video game consoles, like the Atari VCS (1977), followed by more powerful machines with a graphical user interface, like the Apple Mac (1984) or the Commodore Amiga 1000 (1985).

IBM started the Personal Computer era in 1981 with the 5150. Third-party developers were able to provide operating systems, like Microsoft DOS, or hardware extensions for the standardized expansion slots, like hard drives, video cards, sound cards, etc. Soon other companies created clones of the IBM PC, the famous "PC compatibles".

Gaming was an important selling point already in the home computer era. The early PC graphics standards like CGA and EGA were not really able to compete with the graphics generated by the Denise chip in a Commodore Amiga 500, but with the rise of the SVGA standards (1989) and the compute power of the Intel 486 CPU (1989), game studios were able to build games with superior 3D graphics, like Wolfenstein 3D (1992), Comanche (1992) or Strike Commander (1993).

The race for higher display resolutions and more detailed 3D graphics continues to this day.

With operating systems based on graphical user interfaces, like OS/2, X11 or Windows 95, PCs finally replaced the home computers in the 1990s. Another recipe for the success of the PC may be that there were multiple CPU vendors for the same architecture (x86), like Intel, AMD, VIA and IBM.

The Internet was originally designed to connect military institutions in a redundant way, so that if one network element fails, the rest remains operable. The available bandwidth evolves like compute power, exponentially: at first mainly text was transmitted, like emails (1970s) or newsgroups (1980s), followed by images and audio via Gopher (1991) or the World Wide Web (1989), and finally Full HD video via streaming platforms like YouTube or Netflix.

In the late 1990s, mobile phones like the Nokia Communicator, MP3 audio players and PDAs, Personal Digital Assistants, like the Palm Pilot, marked the rise of the smart devices. Their functions were all united in the smartphone, which, together with mobile high-bandwidth internet, is still on its triumphal tour across the globe.

I am not able to portray the current state of computer and internet usage, it is simply too omnipresent: from word processing to AI research, from fake news to the dark net, from botnets of webcams to data leaks in toys...

But I can guess what the next step will be: integrated devices, the BCI, the Brain Computer Interface, connected via the Internet to a real kind of Matrix. It seems only logical to conclude that we will connect with machines directly, implant chips, or develop non-invasive scanners, so the next bandwidth demand will be brainwaves, in all kinds of forms.

Sunday, 20 May 2018

Plant pollinators like bees are dying, but there is a Plan B: the RoboBee...

Can Robotic Bees Replace the Real Thing? Walmart Files Patent for 'Pollination Drone'

https://en.wikipedia.org/wiki/RoboBee

+-0 points for the Singularity to take off.

Sunday, 13 May 2018

Duplex is able to schedule a hair appointment or make a dinner reservation; its human-like voice includes natural pauses and 'mhhs' or 'uhms', which makes it difficult to discern that it is an AI...

+1 points for the Singularity to take off.

Sunday, 06 May 2018

Leela Chess Zero is an open-source adaptation of DeepMind's AlphaZero, a chess engine based on artificial neural networks that learns by playing chess against itself...

stunning progress in only a few months of work.

http://lczero.org/

https://groups.google.com/forum/#!forum/lczero

https://github.com/glinscott/leela-chess

+1 points for the Singularity to take off.

Sunday, 29 April 2018

*‘Dead zone’ larger than Scotland found by underwater robots in Arabian sea*

-1 points for the Singularity to take off.

Sunday, 22 April 2018

*Ray Kurzweil Predicts Universal Basic Incomes Worldwide Within 20 Years*

+1 points for the Singularity to take off.