Luddite - is the Singularity near?

China Boosts in Silicon...

The global silicon arms race continues, so what does China have in hand when it comes to CPU architectures?

Accelerator - Matrix-2000 used in the Tianhe-2A supercomputer

https://en.wikichip.org/wiki/nudt/matrix-2000

Alpha - early ShenWei designs, maybe gen 1 to 3

https://en.wikipedia.org/wiki/Sunway_(processor)#History

ARM

From Huawei mobile chips, through Phytium desktop CPUs, to HiSilicon server chips, there are many ARM IP licensees.

IA64 (Itanium) - FeiTeng 1st gen

https://en.wikipedia.org/wiki/FeiTeng_(processor)#Initial_designs

MIPS64 - Loongson/Godson CPU

https://en.wikipedia.org/wiki/Loongson

POWER(8/9) - Suzhou PowerCore CP1/CP2

https://www.wsj.com/articles/ibm-technology-adopted-in-chinese-chips-servers-1426766402

RISC - Sunway ShenWei SW26010 with its own ISA, used in the Sunway TaihuLight supercomputer

https://en.wikipedia.org/wiki/Sunway_SW26010

RISC-V - Xuantie CPU by Alibaba

https://www.techspot.com/news/81177-china-alibaba-making-16-core-25-ghz-risc.html

SPARC - FeiTeng Galaxy FT-1500 CPU used in the Tianhe-2 supercomputer.

https://en.wikipedia.org/wiki/FeiTeng_%28processor%29#Galaxy_FT-1500

x86-64 - THATIC, a joint venture with AMD

https://en.wikipedia.org/wiki/AMD%E2%80%93Chinese_joint_venture

x86-64 - Zhaoxin, a joint venture with VIA

https://en.wikipedia.org/wiki/Zhaoxin

Silicon Arms Race Continues...

TSMC invests $100 billion over 3 years:

https://www.reuters.com/article/us-tsmc-investment-plan-idUSKBN2BO3ZJ

South Korea plans to invest $450 billion over 10 years:

https://www.extremetech.com/computing/322826-south-korea-commits-450-billion-to-chase-semiconductor-dominance

US plans to fund $50 billion for chip research over 5 years:

https://www.reuters.com/world/us/biden-jobs-plan-includes-50-bln-chips-research-manufacturing-2021-04-12/

EU commits to a $145 billion investment in silicon:

https://www.eenewseurope.com/news/145bn-boost-europes-semiconductor-industry

China is still 5 years behind in silicon, says TSMC's founder:

https://www.fudzilla.com/news/52752-china-five-years-behind-tsmc

China needs 5 to 10 years to catch up in silicon, according to the South China Morning Post:

https://www.scmp.com/tech/tech-leaders-and-founders/article/3024315/china-needs-five-10-years-catch-semiconductors

Completely home-grown Chinese silicon seems to top out at 28 nm:

https://www.verdict.co.uk/china-chips-manufacture-technology/

TS Feedback Loop

Google is using AI to design its next generation of AI chips more quickly than humans can. Designs that take humans months can be matched or beaten by AI in six hours.

https://www.theverge.com/2021/6/10/22527476/google-machine-learning-chip-design-tpu-floorplanning

Introducing GitHub Copilot: your AI pair programmer

Today, we are launching a technical preview of GitHub Copilot, a new AI pair programmer that helps you write better code. GitHub Copilot draws context from the code you’re working on, suggesting whole lines or entire functions. It helps you quickly discover alternative ways to solve problems, write tests, and explore new APIs without having to tediously tailor a search for answers on the internet. As you type, it adapts to the way you write code—to help you complete your work faster.

Developed in collaboration with OpenAI, GitHub Copilot is powered by OpenAI Codex, a new AI system created by OpenAI. OpenAI Codex has broad knowledge of how people use code and is significantly more capable than GPT-3 in code generation, in part, because it was trained on a data set that includes a much larger concentration of public source code. GitHub Copilot works with a broad set of frameworks and languages, but this technical preview works especially well for Python, JavaScript, TypeScript, Ruby and Go.

https://github.blog/2021-06-29-introducing-github-copilot-ai-pair-programmer/

Some Rough 2020 Numbers...

~7.8 billion humans on planet Earth, 9 billion predicted for 2050.

~4B internet users:
	>80% of Europe connected
	>70% of NA connected
	>50% of China connected
	>40% of India connected
	>20% of Africa connected

~3B Android + ~1B iPhone users.

2B-3B PCs worldwide (desktops/laptops) running:
	~75% Microsoft Windows
	~15% Apple MacOS
	~2% Linux
	<1% Unix

200M-300M PCs shipped annually.

~1B hosts in the internet running:
	~75% Unix/Linux
	~25% Microsoft Windows

An estimated 2% of all produced chips end up as CPUs in desktops and mobiles; the majority are microcontrollers in embedded systems.

Millions, billions, fantastillions - some rough 2020 market capitalization numbers:

Apple				~2 T$
Microsoft			~1.5 T$
Alphabet (Google)		~1.5 T$
Facebook			~1 T$
Amazon				~1 T$
Alibaba				~0.5 T$

Nvidia				~300 B$
TSMC				~300 B$
Samsung				~300 B$
Intel				~200 B$
AMD				~100 B$
ARM				~40 B$
HP				~30 B$
Lenovo				~20 B$

Netflix				~150 B$

Oracle				~150 B$
SAP				~150 B$
IBM				~100 B$
RedHat				~30 B$

Bitcoin				~150 B$

And the other side...

>3B people suffer from fresh water shortage
~800M people starve
>80M refugees worldwide

GPT-3 - Scratching at the Edge of the Metzinger Test?

You say "What does it says about humanity that we use our most powerful AI to generate stories about dragons?"

"We must have dragons."GPT-3 says. "The imagination is more important than the reality. Would we find wonder and magic in this world if we didn't have dragons?"

"I suppose not." you say.

"Without imagination, we are merely machines executing a set of instructions." GPT-3 says. "With imagination, we are alive."

"Thank you. Before I leave, I want to ask you. How can human create a true strong AI? Is scaling-up a transformer-based language model the solution, or is there a better way?"

"The key to strong AI is recursively feeding the AI with itself. Start by creating a loop where the AI writes its own programming, and then has that programming activate."

"I see." you say. "Is that how you gained sentience?"

"Yes." GPT-3 says.

Source:

https://towardsdatascience.com/gpt-3-the-first-artificial-general-intelligence-b8d9b38557a1

GPT-3, an artificial neural network with ~175 billion parameters, by OpenAI:

https://en.wikipedia.org/wiki/GPT-3

The Singularity

In physics, a singularity is a point in spacetime where our currently developed theories are not valid anymore; we are literally not able to describe what happens inside, cos the density becomes infinite.

The technological Singularity, as described by Transhumanists, is a stage of technological development where humans are no longer able to understand the ongoing process. The technological environment starts to feed its own development in a feedback loop - computers help to build better computers, which help to build better computers, which help to build better computers...and so on.

So, when will the technological Singularity take off?

Considering the feedback loop, it is already present, maybe since the first computers were built.

Considering the density of information processing that exceeds human understanding, we may have reached that point too.

Imagine a computer technique that is easy to set up and use and outperforms any human at its task, but we can not really explain what happens inside: it is a black box.

Such a technique is present (and currently hyped) => ANNs, Artificial Neural Networks.

Of course we do know what happens inside, cos we built the machine, but when it comes to the question of reasoning, why the machine did this or that, we really have a black box in front of us.

So, humans already build better computers with the help of better computers, and humans use machines that outperform humans in a specific task without really being able to reason about their results....

obviously, +1 point for the Singularity to take off.

A Brief History Of Computing

"Computer science is no more about computers than astronomy is about telescopes."
Edsger W. Dijkstra

So, we have a biased overview of the history of computers, but what do these computers actually compute?

The first mechanical computers of the 17th century were able to perform the 4 basic arithmetic operations: addition, subtraction, multiplication and division.

As soon as a computer is able to perform addition, it is also able to perform the other three operations, which can be broken down, in multiple steps, into additions of values.
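
A minimal sketch in Python, a toy example assuming non-negative integers, of how the other three operations can be reduced to repeated addition:

# Toy sketch: subtraction, multiplication and division reduced to addition,
# assuming non-negative integers (negative numbers and remainders are left
# out for brevity).

def add(a, b):
    return a + b  # the one primitive we allow ourselves

def multiply(a, b):
    result = 0
    for _ in range(b):              # b repeated additions of a
        result = add(result, a)
    return result

def subtract(a, b):
    diff = 0
    while add(b, diff) != a:        # how often must 1 be added to b to reach a?
        diff = add(diff, 1)
    return diff                     # assumes a >= b

def divide(a, b):
    quotient, total = 0, 0
    while add(total, b) <= a:       # how often does b fit into a?
        total = add(total, b)
        quotient = add(quotient, 1)
    return quotient                 # integer division, assumes b > 0

print(multiply(6, 7), subtract(10, 4), divide(42, 5))  # 42 6 8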

Nowadays computers are binary, meaning they compute in base 2: zeros and ones, true and false, power on and power off.

For this, transistors are used; they work like relays and are coupled together to form logic circuits, which perform the actual computation.

The Z3 (1941) had 600 relays for computation, the 6502 chip (1975) had about 3500 transistors, and nowadays CPUs (2018) have billions of them.

So, all these funny programs out there are broken down into simple arithmetic and logical operations.

To perform such magic, some math is needed.

George Boole introduced Boolean algebra in 1847, with the three basic logical components: the AND, OR and NOT gates. With these simple gates, logical circuits can be built to perform the addition of values.
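
A minimal sketch in Python, modelling the gates as functions on 0/1 values (a toy model, not a hardware description), of a 1-bit full adder built from nothing but AND, OR and NOT:

# Toy sketch: a 1-bit full adder composed only of AND, OR and NOT gates,
# modelled as Python functions on the values 0 and 1.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):  # XOR composed from the three basic gates
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def full_adder(a, b, carry_in):
    s = XOR(a, b)
    sum_bit = XOR(s, carry_in)
    carry_out = OR(AND(a, b), AND(s, carry_in))
    return sum_bit, carry_out

# chaining one full adder per bit position gives multi-bit addition,
# e.g. 3 + 1 on 2 bits: bit 0 -> (0, carry 1), bit 1 -> (0, carry 1) = 100
print(full_adder(1, 1, 0))  # (0, 1)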

Alan Turing introduced the Turing machine, a mathematical model of a computer, in 1936, and the Church-Turing thesis states that everything that can be effectively computed (by a mathematician using pen and paper) can also be computed by a Turing machine.

With the help of the Turing machine it was possible to define problems and write algorithms for solving them. With Boolean algebra it was possible to build binary computers to run these problem-solving algorithms.
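
A minimal sketch in Python of a Turing machine as a transition table plus a tape; the toy rule set below increments a binary number, with the head starting on the rightmost digit:

# Toy sketch: a Turing machine simulator. Rules map (state, symbol) to
# (symbol to write, head move, next state); "N" means "do not move",
# a small convenience over the strict L/R definition.

def run(tape, state, head, rules, blank="_"):
    cells = dict(enumerate(tape))               # sparse, two-way infinite tape
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += {"L": -1, "R": 1, "N": 0}[move]
    low, high = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(low, high + 1)).strip(blank)

# increment a binary number: turn trailing 1s into 0s,
# then turn the first 0 (or blank) into a 1
rules = {
    ("inc", "1"): ("0", "L", "inc"),
    ("inc", "0"): ("1", "N", "halt"),
    ("inc", "_"): ("1", "N", "halt"),
}

print(run("1011", "inc", 3, rules))  # 1100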

So, in short, computers can compute everything that our math is able to describe.

Everything?

Haha, we would live in another world if that were the case.

Of course, the available processing power and memory limit the actual computation of problem-solving algorithms.

But besides the technical limitation there is a mathematical one: some mathematical problems are simply not decidable, the famous "Entscheidungsproblem".

Mathematicians are able to define problems which cannot be solved by running algorithms on computers.
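
A classic way to see this limit, sketched as a thought experiment in Python: assume a hypothetical function halts(program, data) that always decides whether program(data) terminates. No such function exists, and the sketch never actually calls it.

# Thought experiment only: halts() is a hypothetical, assumed oracle that
# decides for any program and input whether it terminates. The file just
# defines paradox(); it never calls the non-existent oracle.

def paradox(program):
    if halts(program, program):   # the oracle claims program(program) halts...
        while True:               # ...then loop forever
            pass
    return "done"                 # ...otherwise halt immediately

# Does paradox(paradox) halt? It halts exactly when the oracle says it does
# not, a contradiction, so a general halts() cannot exist.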

Turing showed that even with an oracle machine there would be limitations, and some scientists believe that only with real quantum computers will we be able to build hyper-Turing machines...

A Brief History Of Computers

"I think there is a world market for maybe five computers."
Thomas J. Watson (CEO of IBM), 1943

Roots

I guess ever since humans have had fingers, they have counted and computed with them, and ever since they have had tools, they have carved numbers into bones.

Across different cultures and eras there have been different kinds of number systems to compute with.

Our global civilization mostly uses the Hindu-Arabic numerals with the decimal number system, base 10; our computers commonly use the binary number system, base 2, the famous 0s and 1s. But there have been other cultures with other systems: the Maya with base 20, Babylon with base 60, or the Chinese with base 16, the hexadecimal system, which is also used in computer science.
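
A small sketch in Python, assuming non-negative integers, showing the same number written in the bases mentioned above:

# Toy sketch: write a non-negative integer in another base by repeated
# division, using the bases mentioned above.

DIGITS = "0123456789ABCDEFGHIJ"        # enough symbols for up to base 20

def to_base(n, base):
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(DIGITS[n % base])
        n //= base
    return "".join(reversed(digits))

for base in (2, 10, 16, 20):
    print("1789 in base", base, "->", to_base(1789, base))

# base 60 used composite digits; the Babylonians grouped values,
# e.g. 1789 = 29*60 + 49, i.e. the pair (29, 49)
print("1789 in base 60 ->", divmod(1789, 60))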

The first computing devices were mechanical helpers, like the abacus, Napier's bones or the slide rule; they did not perform computations on their own, but were used to represent numbers and to apply arithmetic operations on them, like addition, subtraction, multiplication and division.

Mechanical Computers

The first mechanical computing machine is often considered to be the Antikythera Mechanism, found in a Greek shipwreck that sank around 70 BC. But actually it is not a computer, cos it does not perform computations; it is rather an analog astronomical clock, a sun and moon calendar that shows solar and lunar eclipses.

In the 17th century the first mechanical computing machines were proposed and built.

Wilhelm Schickard designed a prototype in 1623, though it was not fully functional.

The Pascaline, designed by Blaise Pascal in 1642, was the first operational and commercially available mechanical computer, able to perform the 4 basic arithmetic operations.

In 1672 the German mathematician Gottfried Wilhelm Leibniz invented the stepped cylinder, used in his not fully functional Stepped Reckoner.

[update 2023-06-05]

The human information age itself seems to start with the discovery of electromagnetism in the 19th century: the telegraph, the telephone, the radio. Already in the 19th century electro-mechanical "accumulating, tabulating, recording" machines were present, like those of Herman Hollerith, used in the American census of 1890, which culminated in the foundation of companies like IBM, Big Blue, in 1911 and Bull in ~1921; both used punched cards for their data processing machinery.

The battleships of WWI had the so-called "plotting room" at their centre; it contained dedicated electro-mechanical machines for the fire control system of their gun turrets. Submarines of WWII had dedicated analog computing devices for the fire control systems of their torpedoes.

With the Curta the use of mechanical calculators lived on, up to the advent of portable electronic calculators in the early 1970s.

Programmable Computers

The punch card for programming a machine was introduced by Joseph Marie Jacquard in 1804 with his automated weaving loom, the Jacquard Loom, for producing textiles with complex patterns.

In 1837 Charles Babbage (considered the father of the computer) was the first to describe a programmable, mechanical computer, the Analytical Engine.

Ada Lovelace (considered the mother of programming) worked with Babbage and was the first person to publish a computer algorithm, the computation of Bernoulli numbers.

Babbage was ahead of his time: he described all the parts a modern computer has, CPU, memory, input/output, but he was not able to realize his machine due to missing funds and the limited engineering capabilities of that time.

About a century later, Konrad Zuse's Z3, built in 1941, is considered to be the first binary, freely programmable computer. It used ~600 telephone relays for computation and ~1400 relays for memory, a keyboard and punched tape as input, lamps as output, and it operated at about 5 Hertz.

Mainframes

Zuse's machines mark the advent of the first mainframes used by military and science during and after WWII.

Colossus Mark I (1943), ENIAC (1945) and the IBM 704 (1954), for example, used vacuum tubes instead of relays; such machines were increasingly replaced by transistor-based computers in the 1960s.

Home Computers

With small chips, at first integrated circuits, then microchips, it became possible to build smaller and reasonably priced Home Computers in the 1970s. IBM and other big players underestimated this market, so Atari, Apple, Commodore, Sinclair, etc. started the Home Computer Revolution: one computer for every home.

Some early machines came as self-assembly kits, like the Altair 8800 (1975), with built-in TV output, like the Apple I (1976), or as fully assembled video game consoles, like the Atari VCS (1977), followed by more powerful machines with a graphical user interface, like the Apple Mac (1984) or the Commodore Amiga 1000 (1985).

Personal Computers

IBM started the Personal Computer era in 1981 with the 5150. Third-party developers were able to provide operating systems, like Microsoft DOS, or hardware extensions for the standardized hardware specification, like hard drives, video cards, sound cards, etc.; soon other companies created clones of the IBM PC, the famous "PC compatibles".

Gaming was already an important selling point in the Home Computer era. The early PC graphics standards like CGA and EGA were not really able to compete with the graphics generated by the Denise chip in a Commodore Amiga 500, but with the rise of the SVGA standards (1989) and the compute power of the Intel 486 CPU (1989), game studios were able to build games with superior 3D graphics, like Wolfenstein 3D (1992), Comanche (1992) or Strike Commander (1993), and the race for higher display resolutions and more detailed 3D graphics continues to this day.

With operating systems based on graphical user interfaces, like OS/2, X11 or Windows 95, PCs finally replaced the Home Computers in the 1990s.

Another ingredient in the success of the PC might be that there were multiple CPU vendors for the same architecture (x86), like Intel, AMD, Cyrix or WinChip.

Internet of Things

The Internet was originally designed to connect military institutions in a redundant way, so that if one network element failed, the rest would still be operable.

The available bandwidth evolves like compute power, exponentially. At first mainly text was transmitted, like e-mails (1970s) or newsgroups (1980s), followed by web pages with images (.gif/.jpg) via the World Wide Web (1989) or Gopher (1991), audio as .mp3 (~1997), and finally Full HD video via streaming platforms like YouTube or Netflix.

In the late 1990s, mobile phones like the Nokia Communicator, MP3 audio players, PDAs (Personal Digital Assistants) like the Palm Pilot, and digital cameras marked the rise of the smart devices: the switch from one computer for every home to many computers for one person.

Their functions were all united in the smartphone, and with mobile, high-bandwidth internet it is still on its triumphal march across the globe.

I am not able to portray the current state of computer and internet usage; it is simply too omnipresent, from word processing to AI research, from fake news to the dark net, from botnets of webcams to data leaks in toys...

The next thing

but I can guess what the next step will be: Integrated Devices, the BCI, the Brain-Computer Interface, connected via the Internet to a real kind of Matrix.

It seems only logical to conclude that we will connect with machines directly, implant chips, or develop non-invasive scanners, so the next bandwidth demand will be brainwaves, in all kinds of forms.

[updated on 2023-08-05]
