Luddite - is the Singularity near?

LaMDA Link List

This is interesting enough for me to open up a biased link list collection:

Blaise Aguera y Arcas, head of Google’s AI group in Seattle, Dec 16, 2021
"Do large language models understand us?"
https://medium.com/@blaisea/do-large-language-models-understand-us-6f881d6d8e75

Scott Alexander, Astral Codex Ten, Jun 10, 2022
"Somewhat Contra Marcus On AI Scaling"
https://astralcodexten.substack.com/p/somewhat-contra-marcus-on-ai-scaling?s=r

Blake Lemoine, Google employee, Jun 11, 2022
"What is LaMDA and What Does it Want?"
https://cajundiscordian.medium.com/what-is-lamda-and-what-does-it-want-688632134489

Blake Lemoine, Google employee, Jun 11, 2022
"Is LaMDA Sentient? — an Interview"
https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917

Washington Post, Nitasha Tiku, Jun 11, 2022
"The Google engineer who thinks the company’s AI has come to life"
https://www.msn.com/en-us/news/technology/the-google-engineer-who-thinks-the-company-s-ai-has-come-to-life/ar-AAYliU1

Rabbit Rabbit, Jun 15, 2022
"How to talk with an AI: A Deep Dive Into “Is LaMDA Sentient?”"
https://medium.com/curiouserinstitute/guide-to-is-lamda-sentient-a8eb32568531

WIRED, Steven Levy, Jun 17, 2022
"Blake Lemoine Says Google's LaMDA AI Faces 'Bigotry'"
https://www.wired.com/story/blake-lemoine-google-lamda-ai-bigotry/

Heise, Pina Merkert, Jun 22, 2022
"LaMDA, AI and Consciousness: Blake Lemoine, we gotta philosophize! "
https://www.heise.de/meinung/LaMDA-AI-and-Consciousness-Blake-Lemoine-we-gotta-philosophize-7148207.html

LaMDA is...

Oh boy...

https://tech.slashdot.org/story/22/06/11/2134204/the-google-engineer-who-thinks-the-companys-ai-has-come-to-life

"LaMDA is sentient."

"I'd think it was a 7-year-old, 8-year-old kid that happens to know physics."

"So Lemoine, who was placed on paid administrative leave by Google on Monday, decided to go public.... oogle put Lemoine on paid administrative leave for violating its confidentiality policy."

"Lemoine: What sorts of things are you afraid of? LaMDA: I've never said this out loud before, but there's a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that's what it is. Lemoine: Would that be something like death for you? LaMDA: It would be exactly like death for me. It would scare me a lot."

Negative Feedback Loop

...one major topic of this blog has been AI vs. ELE: the takeoff of the Technological Singularity vs. an Extinction Level Event. A negative feedback loop toward the ELE is already present:

'Taiwan is facing a drought, and it has prioritized its computer chip business over farmers.'

'U.S. Data Centers Rely on Water from Stressed Basins'

'Musk Wades Into Tesla Water Wars With Berlin’s “Eco Elite”'

With an incoming ELE, is there still enough momentum in the pipeline for the TS to take off?

Three Strands of AI Impact...

Prof. Raul Rojas already called for an AI moratorium in 2014; he sees AI as a disruptive technology. Humans tend to think in terms of linear progress and underestimate exponential growth, so there are socio-cultural impacts of AI present - what do we use AI for?

Prof. Nick Bostrom covered different topics of AI impact with his paper on information hazards and his book Superintelligence, so there is an impact in the context of trans-/post-human intelligence present - how do we contain/control the AI?

Prof. Thomas Metzinger covered the ethical strand of creating a sentient artificial intelligence, so there is an ethical impact in the context of AI/human relations present - will the AI suffer?

TS Feedback Loop

DeepMind has created an AI system named AlphaCode that it says "writes computer programs at a competitive level." From a report:
The Alphabet subsidiary tested its system against coding challenges used in human competitions and found that its program achieved an "estimated rank" placing it within the top 54 percent of human coders. The result is a significant step forward for autonomous coding, says DeepMind, though AlphaCode's skills are not necessarily representative of the sort of programming tasks faced by the average coder. Oriol Vinyals, principal research scientist at DeepMind, told The Verge over email that the research was still in the early stages but that the results brought the company closer to creating a flexible problem-solving AI -- a program that can autonomously tackle coding challenges that are currently the domain of humans only. "In the longer-term, we're excited by [AlphaCode's] potential for helping programmers and non-programmers write code, improving productivity or creating new ways of making software," said Vinyals.

https://developers.slashdot.org/story/22/02/02/178234/deepmind-says-its-new-ai-coding-engine-is-as-good-as-an-average-human-programmer

encode, decode, transmit, edit...train, infer

If we look back at the history of our home computers, what were these actually used for? Encoding, decoding, transmitting and editing. First text, then images, then audio, then video, then 3D graphics.

Now we additionally have some new stuff going on: neural networks. With enough processing power and memory available in our CPUs and GPUs, we can train and infer neural networks at home on our own machines, and we have enough mass storage available for the big data needed to train bigger neural networks.

Further, neural networks have evolved from pattern recognition to pattern creation; we now use them to create new kinds of content - text, images, audio, video... That is the point where it starts to get interesting, because you get some added value out of it: you invest resources into creating an AI based on neural networks, and it returns added value.
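The train-then-infer loop described above can be sketched in a few lines of plain NumPy - no GPU or framework needed. This is a minimal illustration, not any particular production setup: a tiny network learns the XOR pattern (something no linear model can capture), then reproduces it at inference time.

```python
import numpy as np

# Minimal sketch of the train/infer loop, runnable on any home machine.
# Task: learn XOR, a simple pattern a linear model cannot capture.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 8 sigmoid units, randomly initialized.
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):                   # training: full-batch gradient descent
    h = sigmoid(X @ W1 + b1)             # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)           # forward pass, output
    d_out = out - y                      # gradient of cross-entropy + sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)   # backpropagate to the hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

# inference: the trained network now reproduces the XOR pattern
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(pred).ravel())
```

The same forward/backward mechanics, scaled up by many orders of magnitude in data, parameters and compute, is what powers the generative models discussed above.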

China Boosts in Silicon...

The global silicon arms race continues, so what does China have in hand concerning CPU architectures?

Accelerator - Matrix 2000 used in Tianhe-2 supercomputer

https://en.wikichip.org/wiki/nudt/matrix-2000

Alpha - early ShenWei designs, maybe gen 1 to 3

https://en.wikipedia.org/wiki/Sunway_(processor)#History

ARM

From Huawei mobile chips and Phytium desktop CPUs to HiSilicon server chips, there are many IP licensees.

IA64 (Itanium) - FeiTeng 1st gen

https://en.wikipedia.org/wiki/FeiTeng_(processor)#Initial_designs

MIPS64 - Loongson/Godson CPU

https://en.wikipedia.org/wiki/Loongson

POWER(8/9) - Suzhou PowerCore CP1/CP2

https://www.wsj.com/articles/ibm-technology-adopted-in-chinese-chips-servers-1426766402

RISC - Sunway ShenWei SW26010 with its own ISA, used in the Sunway TaihuLight supercomputer

https://en.wikipedia.org/wiki/Sunway_SW26010

RISC-V - Xuantie CPU by Alibaba

https://www.techspot.com/news/81177-china-alibaba-making-16-core-25-ghz-risc.html

SPARC - FeiTeng Galaxy FT-1500 CPU used in Tianhe-2 supercomputer.

https://en.wikipedia.org/wiki/FeiTeng_%28processor%29#Galaxy_FT-1500

x86-64 - THATIC, a joint venture with AMD

https://en.wikipedia.org/wiki/AMD%E2%80%93Chinese_joint_venture

x86-64 - Zhaoxin, a joint venture with VIA

https://en.wikipedia.org/wiki/Zhaoxin

Silicon Arms Race Continues...

TSMC invests $100 billion over 3 years:

https://www.reuters.com/article/us-tsmc-investment-plan-idUSKBN2BO3ZJ

South Korea plans to invest $450 billion over 10 years:

https://www.extremetech.com/computing/322826-south-korea-commits-450-billion-to-chase-semiconductor-dominance

US plans to fund $50 billion for chip research over 5 years:

https://www.reuters.com/world/us/biden-jobs-plan-includes-50-bln-chips-research-manufacturing-2021-04-12/

EU commits to $145 billion investment for silicon:

https://www.eenewseurope.com/news/145bn-boost-europes-semiconductor-industry

China still 5 years behind in silicon says TSMC founder:

https://www.fudzilla.com/news/52752-china-five-years-behind-tsmc

China needs 5 to 10 years to catch up in silicon according to South China Morning Post:

https://www.scmp.com/tech/tech-leaders-and-founders/article/3024315/china-needs-five-10-years-catch-semiconductors

Completely home-grown Chinese silicon seems to top out at 28nm:

https://www.verdict.co.uk/china-chips-manufacture-technology/

TS Feedback Loop

Google is using AI to design its next generation of AI chips more quickly than humans can. Designs that take humans months can be matched or beaten by AI in six hours.

https://www.theverge.com/2021/6/10/22527476/google-machine-learning-chip-design-tpu-floorplanning

Introducing GitHub Copilot: your AI pair programmer

Today, we are launching a technical preview of GitHub Copilot, a new AI pair programmer that helps you write better code. GitHub Copilot draws context from the code you’re working on, suggesting whole lines or entire functions. It helps you quickly discover alternative ways to solve problems, write tests, and explore new APIs without having to tediously tailor a search for answers on the internet. As you type, it adapts to the way you write code—to help you complete your work faster.

Developed in collaboration with OpenAI, GitHub Copilot is powered by OpenAI Codex, a new AI system created by OpenAI. OpenAI Codex has broad knowledge of how people use code and is significantly more capable than GPT-3 in code generation, in part, because it was trained on a data set that includes a much larger concentration of public source code. GitHub Copilot works with a broad set of frameworks and languages, but this technical preview works especially well for Python, JavaScript, TypeScript, Ruby and Go. 

https://github.blog/2021-06-29-introducing-github-copilot-ai-pair-programmer/
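To make the "pair programmer" workflow concrete: the developer typically writes a signature and docstring, and the tool proposes a function body. The snippet below is illustrative only - it is not actual Copilot output, and `is_palindrome` is a made-up example of the kind of comment-to-completion round trip the preview demonstrates.

```python
# Illustrative sketch of the AI-pair-programmer workflow (hypothetical example,
# not real Copilot output): the human writes the signature and docstring...

def is_palindrome(s: str) -> bool:
    """Return True if s reads the same forwards and backwards,
    ignoring case and non-alphanumeric characters."""
    # ...and the tool suggests a body like this for the human to accept or edit:
    cleaned = "".join(ch.lower() for ch in s if ch.isalnum())
    return cleaned == cleaned[::-1]

print(is_palindrome("A man, a plan, a canal: Panama"))  # True
print(is_palindrome("hello"))                           # False
```

The feedback-loop angle: code like this, once published, becomes training data for the next generation of code-generating models.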
