AI predictions

** '''Extinction doctrine:''' Humanity will lose control of ASI, leading to extinction or permanent disempowerment
** '''Replacement doctrine:''' AI will automate human tasks, but without fundamentally reshaping or ending civilization
* 2025-09: Sean ÓhÉigeartaigh: [https://www.cambridge.org/core/journals/cambridge-prisms-extinction/article/extinction-of-the-human-species-what-could-cause-it-and-how-likely-is-it-to-occur/D8816A79BEF5A4C30A3E44FD8D768622 Extinction of the human species: What could cause it and how likely is it to occur?]
 
==Intelligence Explosion==
 
Capability Scaling

Scaling Laws

See: Scaling Laws

AGI Achievable

AGI Definition

Progress Models

[Image: AI impact models]

Economic and Political

Job Loss

National Security

AI Manhattan Project

Near-term

Overall

Surveys of Opinions/Predictions

Bad Outcomes

Intelligence Explosion

Superintelligence

Long-range/Philosophy

Psychology

Positives & Optimism

Science & Technology Improvements

Social

Plans

Philosophy

Research

Alignment

Strategic/Technical

Strategic/Policy

Restriction

See Also