"I think there is a world market for maybe five computers"
Thomas J. Watson (CEO of IBM), 1943

Roots

I guess that since humans have had fingers, they have counted and computed with them,
and since they have had tools, they have carved numbers into bones.

Across different cultures and eras there have been different kinds of
number systems to compute with.

Our global civilization mostly uses the Hindu-Arabic numerals with the decimal
number system, based on 10, while our computers commonly use the binary number
system, based on 2, the famous 0s and 1s. But other cultures had other systems:
the Maya used base 20, Babylon base 60, and the Chinese base 16, the hexadecimal
system, which is also used in computer science.
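
As a small aside, converting a number between such bases only takes repeated
division with remainder. A minimal Python sketch (my own illustration, not part
of the historical record; the helper name to_base is made up):

    # represent a non-negative integer in a given base by repeated division,
    # collecting the remainders as digits from least to most significant
    DIGITS = "0123456789ABCDEFGHIJ"  # digit symbols for bases up to 20

    def to_base(n, base):
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            digits.append(DIGITS[n % base])
            n //= base
        return "".join(reversed(digits))

    print(to_base(2025, 2))   # binary:       11111101001
    print(to_base(2025, 16))  # hexadecimal:  7E9
    print(to_base(2025, 20))  # base 20:      515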

The first computing devices were mechanical helpers,
like the Abacus, Napier's Bones, or the Slide Rule.
They did not perform computations on their own, but were used to represent
numbers and apply the arithmetic operations to them:
addition, subtraction, multiplication, and division.

Mechanical Computers

The first mechanical computing machine is considered to be the
Antikythera Mechanism, found in a Greek ship that sank around 70 BC.

But it is actually not a computer,
because it does not perform computations;
it is an analog astronomical clock,
a sun and moon calendar that predicts solar and lunar eclipses.

In the 17th century, the first mechanical calculating machines were proposed and built.

Wilhelm Schickard designed a first, not fully functional prototype in 1623.

The Pascaline, designed by Blaise Pascal in 1642,
was the first operational and commercially available mechanical calculator,
able to perform addition and subtraction directly,
and multiplication and division by repeated operations.

In 1672 the German mathematician Gottfried Wilhelm Leibniz invented
the stepped cylinder (Leibniz wheel), used in his not fully functional
Stepped Reckoner.

With the Curta, the use of mechanical calculators lived on,
up to the advent of electronic pocket calculators in the early 1970s.

Programmable Computers

The punch card for programming a machine was introduced by Joseph Marie Jacquard
in 1804 with his automated weaving loom, the Jacquard Loom,
for producing textiles with complex patterns.

In 1837 Charles Babbage (considered the father of the computer)
was the first to describe a programmable, mechanical computer,
the Analytical Engine.

Ada Lovelace (considered the mother of programming) worked with Babbage and
was the first person to publish a computer algorithm, the computation of
Bernoulli numbers.
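
Her table in Note G described how the Analytical Engine would compute these
numbers. As a loose, modern illustration only (not her program, and using the
standard recurrence over binomial coefficients rather than her formulation),
a few lines of Python can produce the same values with exact fractions:

    # illustrative sketch, not Lovelace's program: Bernoulli numbers B_0..B_n
    # via the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 (convention B_1 = -1/2)
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        B = [Fraction(0)] * (n + 1)
        B[0] = Fraction(1)
        for m in range(1, n + 1):
            B[m] = -sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1)
        return B

    print(bernoulli(8))  # ..., B_2 = 1/6, B_4 = -1/30, B_6 = 1/42, B_8 = -1/30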

Babbage was ahead of his time: he described all the parts a modern computer has,
CPU, memory, and input/output, but he was not able to realize his machine,
due to a lack of funds and the limited engineering capabilities of his era.

About a century later, Konrad Zuse's Z3, built in 1941,
is considered to be the first binary, freely programmable computer.

It used 600 telephone relays for computation and 1400 relays for memory,
a keyboard and punched tape as input, lamps as output,
and it operated at about 5 Hertz.

Mainframes

Zuse's machines mark the advent of the first mainframes, used by the military
and by science during and after WWII.

The Colossus Mark I (1943), the ENIAC (1946), and the IBM 650 (1953), for example,
used vacuum tubes instead of relays and were increasingly replaced by
transistor-based computers in the 1960s.

Home Computers

With small chips, integrated circuits, it became possible to build smaller and
affordable Home Computers. IBM and the other big players underestimated this market,
so Atari, Apple, Commodore, Sinclair, etc. started the Home Computer Revolution:
one computer for every home.

Some of the first models came as self-assembly kits, like the Altair 8800 (1974),
or with built-in TV output, like the Apple I (1976),
or as fully assembled video game consoles, like the Atari VCS (1977),
followed by more powerful machines with a graphical user interface,
like the Apple Macintosh (1984) or the Commodore Amiga 1000 (1985).

Personal Computers

With the 5150, IBM started the Personal Computer era in 1981.
Third-party developers were able to provide operating systems, like
Microsoft's DOS, or hardware extensions for the standardized expansion slots,
like hard drives, video cards, sound cards, etc.
Soon other companies created clones of the IBM PC, the famous "PC compatibles".

Gaming was an important selling point already in the Home Computer era.
The early PC graphics standards like CGA and EGA were not really able to compete
with the graphics generated by the Denise chip in a Commodore Amiga 500,
but with the rise of the SVGA standards (1989) and the compute power of the Intel
486 CPU (1989), game studios were able to build games with superior 3D graphics,
like Wolfenstein 3D (1992), Comanche (1992), or Strike Commander (1993).

The race for higher display resolutions and more detailed 3D graphics
continues to this day.

With operating systems and environments based on graphical user interfaces,
like OS/2, X11, and Windows 95, PCs finally replaced the Home Computers
in the 1990s.

Another ingredient in the success of the PC may be that there were multiple
CPU vendors for the same architecture (x86), like Intel, AMD, VIA, or IBM.

Internet of Things

The Internet was originally designed to connect military institutions in a
redundant way, so that if one network element failed, the rest would still be
operable.

The available bandwidth evolves like compute power, exponentially.
At first mainly text was transmitted, like email (1970s) or newsgroups (1980s),
followed by images and audio via Gopher (1991) or the World Wide Web (1989),
and finally Full HD video via streaming platforms like YouTube or Netflix.

In the late 1990s, mobile phones like the Nokia Communicator,
MP3 audio players, and PDAs, Personal Digital Assistants, like the Palm Pilot,
marked the rise of the smart devices.

Their functions were all united in the smartphone, and with mobile,
high-bandwidth internet, it is still on its triumphant tour across the globe.

I am not able to portray the current state of computer and internet usage;
it is simply too omnipresent, from word processing to AI research,
from fake news to the dark net, from botnets of webcams to data leaks in toys...

The next thing

But I can guess what the next step will be: integrated devices, the BCI, the
Brain-Computer Interface, connected via the Internet to a real kind of Matrix.

It seems only logical to conclude that we will connect with machines directly,
implant chips, or develop non-invasive scanners, so the next bandwidth demand
will be brainwaves, in all kinds of forms.