It is non-stop in the news, every week it pops up in another corner:
AIs based on Deep Neural Networks. So I will give it a try and write a little,
biased article about this topic...

The brain

The human brain consists of about 100 billion neurons,
as many as there are stars in our galaxy, the Milky Way,
and each neuron is connected via synapses to about 1000 other neurons,
resulting in 100 trillion connections.

For comparison,
the game-playing AI AlphaZero by Google DeepMind used about 50 million
connections to play chess at a superhuman level.

The inner neurons of our brain are connected to the outer world
via our senses: eyes, ears, etc.

A neuron has multiple weighted inputs and one output;
if a certain input threshold is reached, its output is activated and
the neuron fires a signal to other neurons.
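
A minimal sketch of such an artificial neuron in Python, with made-up
weights and a plain step threshold (all numbers are just for illustration):

    # A single artificial neuron: weighted sum of the inputs, then a threshold.
    def neuron(inputs, weights, threshold=1.0):
        activation = sum(x * w for x, w in zip(inputs, weights))
        return 1 if activation >= threshold else 0   # fire, or stay silent

    # Three inputs from other neurons, each with its own weight.
    print(neuron([1, 0, 1], [0.4, 0.9, 0.7]))   # 1.1 >= 1.0 -> fires (1)
    print(neuron([1, 0, 0], [0.4, 0.9, 0.7]))   # 0.4 <  1.0 -> silent (0)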

The activation of a synapse is an electrical and chemical process;
neurotransmitters can restrain or foster the activation potential,
just consider the effect alcohol or coffee has on your cognitive performance.

Common artificial neural networks do not emulate the chemical part.

The brain wires these connections between neurons during learning,
so they can act as memory or be used for computation.
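
One classic, very simplified model of this wiring is the Hebbian rule
("neurons that fire together wire together"); the rule and the numbers below
are only an illustration, not how the networks in this article are trained:

    # Hebbian-style sketch: a connection gets stronger whenever the two
    # neurons it links are active at the same time (illustrative values only).
    def hebbian_update(weight, pre_active, post_active, learning_rate=0.1):
        return weight + learning_rate * pre_active * post_active

    w = 0.2
    for pre, post in [(1, 1), (1, 1), (0, 1), (1, 0)]:   # observed activity
        w = hebbian_update(w, pre, post)
    print(round(w, 2))   # 0.4, grown only in the two steps where both fired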

The "Von Neumann" architecture

Most of today's computers are based on the Von Neumann architecture;
they have no neurons or synapses, but transistors.

The main components are the ALU (Arithmetic Logic Unit),
memory for program and data,
and various inputs and outputs.
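
As a toy illustration of these components, here is a sketch in Python of a
made-up machine: one memory holds both program and data, and a tiny ALU
executes one instruction per step (the instruction set is invented here):

    # Toy Von Neumann machine: program and data share one memory, the ALU
    # executes one instruction at a time (instruction set invented here).
    memory = {"a": 2, "b": 3, "result": 0}                         # data
    program = [("load", "a"), ("add", "b"), ("store", "result")]   # program

    accumulator = 0
    for op, addr in program:                  # fetch, decode, execute
        if op == "load":
            accumulator = memory[addr]
        elif op == "add":
            accumulator += memory[addr]       # the ALU at work
        elif op == "store":
            memory[addr] = accumulator

    print(memory["result"])   # 5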

Artificial Neural Networks have to be built in software, running on these
Von Neumann computers.

Von Neumann said that his proposed architecture was inspired by the idea of how the
brain works: memory and computation.
In his book "The Computer and the Brain", he gives a comparison of
computers and the knowledge about biological neural networks of that time.

Dartmouth

The first work on ANNs was published already in the 1940s,
and in 1956 the "Dartmouth Summer Research Project on Artificial Intelligence"
was held, coining the term Artificial Intelligence and marking one milestone
in AI. The work on ANNs continued, and the first neuromorphic chips were developed.

AI-Winter

In the 1970s the AI-Winter occurred: problems in computational theory and the
lack of compute power needed by large ANNs resulted in funding cuts
and a split of the work into strong and weak AI.

Deep Neural Networks

With the rise of compute power (driven by GPGPU), further research, and Big Data,
it became possible to train faster, better, and larger networks in the 21st century.

The term Deep Neural Networks was coined for deep hierarchical structures
and deep learning techniques.

One of the first and most common uses for ANNs was, and still is, pattern recognition,
for example character recognition.

You can train a neural network with a set of differently looking samples of the
same character, with the aim that the ANN will recognize that character in
various appearances.
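
A rough sketch of that idea in Python: a perceptron-style trainer sees a few
differently drawn variants of an 'X' and an 'O' on a 3x3 pixel grid and learns
weights that separate them (the drawings, learning rate, and number of passes
are all invented for illustration):

    # Tiny perceptron sketch: learn to tell 'X' from 'O' on a 3x3 grid.
    x_variants = [
        [1,0,1, 0,1,0, 1,0,1],   # a clean X
        [1,0,1, 0,1,0, 1,0,0],   # a sloppy X, one corner missing
    ]
    o_variants = [
        [1,1,1, 1,0,1, 1,1,1],   # a clean O
        [0,1,1, 1,0,1, 1,1,1],   # a sloppy O, one corner missing
    ]
    samples = [(p, 1) for p in x_variants] + [(p, 0) for p in o_variants]

    weights = [0.0] * 9
    bias = 0.0
    for _ in range(20):                       # a few passes over the samples
        for pixels, label in samples:
            out = 1 if sum(w * p for w, p in zip(weights, pixels)) + bias > 0 else 0
            error = label - out               # adjust weights towards the label
            weights = [w + 0.1 * error * p for w, p in zip(weights, pixels)]
            bias += 0.1 * error

    test = [1,0,1, 0,1,0, 0,0,1]              # yet another sloppy X
    out = 1 if sum(w * p for w, p in zip(weights, test)) + bias > 0 else 0
    print("X" if out == 1 else "O")           # prints "X"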

With a deeper topology of the neural network, it is possible to identify, for
example, pictures of cars, with different network layers responding to color, shape, etc.
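
A sketch of how such a deeper topology looks in code; the layer sizes and the
random weights are purely illustrative (a real network for car pictures would
be much larger and trained, not random):

    import random

    # Sketch of a deeper topology: the input is passed through several layers,
    # each layer computing weighted sums followed by a simple non-linearity.
    def layer(inputs, weights):
        return [max(0.0, sum(w * x for w, x in zip(row, inputs)))   # ReLU
                for row in weights]

    random.seed(0)
    sizes = [16, 8, 4, 2]      # e.g. pixels -> edges -> shapes -> car / no car
    weights = [[[random.uniform(-1, 1) for _ in range(m)] for _ in range(n)]
               for m, n in zip(sizes, sizes[1:])]

    activations = [random.random() for _ in range(sizes[0])]   # a fake input
    for w in weights:          # forward pass, layer by layer
        activations = layer(activations, w)
    print(activations)         # two numbers, one per output class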

The Brain vs. The Machine

A computer can perform fast arithmetic and logical operations;
that is what its transistors are used for.

In contrast, the neural network of our brain works massively parallel.

The synapses of the human brain are clocked at 10 to 100 Hertz,
which means they can fire signals to other neurons up to 100 times per second.

Today's computer chips are clocked at 4 Gigahertz,
which means they can compute 4,000,000,000 operations per second per ALU.

The brain has 100 billion neurons, 100 trillion connections, and consumes ~20 Watts;
today's biggest chips have 12 billion transistors with a power consumption of 250 Watts.

We can not compare the compute power of a brain directly with a Von Neumann
computer, but we can estimate what kind of computer we would need to map
the neural network of a human brain.

Assuming 100 trillion connections and 4 bytes per weight, we would need about
400 Terabytes of memory to store the weights of the neurons.
Assuming 100 Hertz as clock rate and a few floating point operations per
connection, we would need at least 40 Peta FLOPS
(floating point operations per second) to compute the activation potentials.
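
The same back-of-the-envelope calculation in Python; the 4 bytes per weight and
the roughly 4 floating point operations per connection and update are
assumptions chosen to reproduce the numbers above:

    # Back-of-the-envelope estimate; bytes per weight and FLOPs per connection
    # are assumptions, the connection count and clock rate are from the text.
    connections = 100e12          # 100 trillion synapses
    bytes_per_weight = 4
    clock_hz = 100                # update rate per connection
    flops_per_connection = 4

    memory_tb = connections * bytes_per_weight / 1e12
    compute_pflops = connections * clock_hz * flops_per_connection / 1e15

    print(memory_tb, "Terabytes")     # 400.0 Terabytes
    print(compute_pflops, "PFLOPS")   # 40.0 PFLOPS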

For comparison, the current number one high performance computer in the world
is able to perform ~93 Peta FLOPS and has ~1 Petabyte of memory,
but a power consumption of more than 15 Megawatts.

So, considering simply the energy efficiency of the human brain,
I give -1 points for the Singularity to take off.