AI extinction risk – Existential risk from artificial general intelligence
AI extinction risk refers to the risk of human extinction that might be caused by artificial general intelligence (AGI). AGI is a hypothetical form of AI capable of matching or outperforming humans at virtually any intellectual task. Such a system could potentially go on to become superintelligent, and thus pose a risk to humanity's continued existence. There are a number of ways in which such a risk could arise.