AI doomsday worries many Americans. So does apocalypse from climate change, nukes, war, and more

Taylor Orth, Director of Survey Data Journalism
Carl Bialik, U.S. Politics Editor and Vice President of Data Science
April 14, 2023, 9:16 PM GMT+0

Nearly half of Americans are very or somewhat concerned about the possibility that artificial intelligence, or AI, will cause the end of the human race on Earth, according to new YouGov polling. That makes AI a perceived major threat to humanity, though the poll also finds four other potential causes of extinction that concern even more Americans.

The latest polling builds on a three-question YouGov poll of 20,810 U.S. adults conducted earlier this month. That poll's third question asked, "How concerned, if at all, are you about the possibility that AI will cause the end of the human race on Earth?" According to that poll, 19% of U.S. adults say they are very concerned, and 27% say they are somewhat concerned. The question followed two previous questions on the poll asking about the likelihood that AI overtakes human intelligence and about support for a possible pause in certain AI developments. The second of the two questions mentioned fears of the “profound risks to society and humanity” posed by AI.

The newer poll asked more or less the same question about AI concern, but with important modifications, and got much the same result. Incorporating some of the feedback the first poll received on social media and from the media, the second poll, of 1,000 U.S. adult citizens, differed in the following ways, largely in which questions were asked earlier in each poll:

  • It did not ask about AI intelligence rising above humans', and asked about a pause in AI research after, not before, asking about AI as a potential cause for humanity's end.
  • It asked first about likelihood of humanity ending in the next 10, 100, 1,000, or one million years — and about level of concern for that happening, regardless of cause.
  • It asked about the likelihood of AI ending humanity, and only asked the followup question about concern among people who did not say it was "impossible" for AI to cause the end of the human race.
  • It asked about AI alongside eight other potential causes of extinction, a list devised from a variety of sources including a previous poll's open-ended question about what might cause extinction. (This idea also was mentioned by a Twitter user.)
  • This poll was of U.S. adult citizens rather than of U.S. adults. The two populations overlap heavily but not entirely.

The purpose of these changes was to avoid forcing respondents to take as a given that AI could end humanity, and to avoid skewing their thinking about AI negatively by first asking about the pause and mentioning fears of profound risks.

Even with all those changes, results on concern over AI's potential to end humanity were almost identical to the first poll: 18% of U.S. adult citizens are very concerned and 28% are somewhat concerned about AI ending the human race; 10% think it's impossible. (Another poll asking the same question, conducted by Conjointly, got similar results.)

That sounds like a lot of concern about AI, and it is. But some context is helpful: among the nine potential causes of extinction we asked about, AI ranks fifth in the share of Americans at least somewhat concerned about it as an extinction threat, behind four threats that loom larger. About two-thirds (66%) of Americans are at least somewhat concerned about nuclear weapons ending humanity, and the same proportion say the same about world war; 53% say so about a pandemic and 52% about climate change. For concern, AI ranks just ahead of an act of God (42%) and an asteroid impact (37%), and raises far more concern than a global inability to have children (31%) or an alien invasion (25%).

One in 10 Americans is at least somewhat concerned about all nine potential causes of human extinction; 17% don't have that level of concern about any of them.

Potential extinction threats are ranked somewhat differently by likelihood than by concern. Nuclear weapons, world war, and an act of God are seen as at least somewhat likely to wipe out humanity by the most Americans (70%, 67%, and 62%, respectively). A global inability to have children and an alien invasion are seen that way by the fewest Americans: 36% and 25%, respectively. AI, at 44%, ranks just ahead of those as a likely threat.

People who use AI tools more often have greater levels of concern about AI. Among people who never use AI tools, just 37% are at least somewhat concerned about AI ending humanity. Among people who use them somewhat or very often, 63% are at least somewhat concerned. People who never use AI also are less concerned about some other possible causes of extinction, though both groups are almost equally concerned about nuclear weapons ending the human race: 69% are at least somewhat concerned among people who never use AI tools, 70% among ones who use them at least somewhat often.

The backdrop to such pervasive concern about specific humanity-ending events is a widespread belief that humanity has a real chance of ending in the next millennium, and widespread concern about that possibility. About half (51%) of Americans say the end of the human race is at least somewhat likely to occur within the next millennium, and 18% say so about the next decade. Many of the rest are unsure: 19% about the next millennium and even 16% about the next decade.

A question early in the poll, before any specific possible causes of humanity's end were broached, asked about the threat of extinction generally. It found that 43% of Americans are at least somewhat concerned about the possibility of the end of the human race on Earth. That's not quite as high as the level of concern for several specific causes, including AI — suggesting that even with the screening question to ensure only people who consider it plausible are asked, the mention of a possible cause of extinction raises the likelihood that people will say they're concerned. (It's an example of a cognitive bias known as the conjunction fallacy.)

Our first poll found a strong age effect: Older Americans are less concerned about extinction. The second poll confirmed that for AI: Adults under 30 are twice as likely as those 65 and older to be at least somewhat concerned about AI ending humanity (60% vs. 30%). While older Americans are less concerned than younger Americans about some other threats to humanity, that isn't true for all of them. On some other causes polled — world war, nuclear weapons — there is no difference in likelihood of concern by age. On others — climate change, asteroid impact, alien invasion, act of God, inability to have children — the difference in concern among age groups is about as big as for AI.

Differences by age in likelihood of concern about specific threats to humanity are partly a result of a difference in belief that extinction of any kind is imminent or likely. While 30% of adults under 30 think humanity is at least somewhat likely to end in the next decade, just 13% of people 65 and older do. The differences narrow at longer time horizons.

But age differences also are a function of different levels of concern about extinction: The age gap on concern about extinction is greater than the gap on the likelihood of extinction. Adults under 30 are more than twice as likely as Americans 65 and older to be at least somewhat concerned about the possibility of the end of the human race on Earth (62% vs. 29%).

Our latest poll also tested three ways of asking about a proposed pause in certain kinds of AI development. The original poll asked, "More than 1,000 technology leaders recently signed an open letter calling on researchers to pause development of certain large-scale AI systems for at least six months world-wide, citing fears of the 'profound risks to society and humanity.' Would you support or oppose a six-month pause on some kinds of AI development?" About two-thirds (69%) of Americans said they would strongly or somewhat support a pause, while 13% said they would strongly or somewhat oppose one.

To test whether the wording of the original poll influenced the result — perhaps by citing only the tech leaders who support a pause, and their fears — we randomly assigned respondents in the second poll to one of three wordings of the question. One was a simplified version of the question, without arguments from either side: "Would you support or oppose a six-month pause on some kinds of AI development?" It found that 57% support and 22% oppose a pause — slightly softer support than was found by the first poll, which was conducted shortly after the release of the open letter. The second wording was the original wording, and it found that 61% support a pause and 19% oppose it. And the third version added balance to the initial framing, citing the arguments of both supporters and opponents of a pause: "More than 1,000 technology leaders recently signed an open letter calling on researchers to pause development of certain large-scale AI systems for at least six months world-wide, citing fears of the 'profound risks to society and humanity.' Critics said the letter overstated risks and threatens AI research that could 'create enormous social and economic benefits across the economy and society.' Would you support or oppose a six-month pause on some kinds of AI development?" This question found that 60% support and 20% oppose. Question wording seems to have little effect here, at least for the wording tested.

On all three wordings, people who use AI tools somewhat or very often are at least as likely as Americans overall to support a pause on AI development — but also more likely to oppose one and much less likely to be unsure.

Related:

See the results for this YouGov poll

Methodology: This poll was conducted online on April 7-11, 2023 among 1,000 U.S. adult citizens. Respondents were selected from YouGov’s opt-in panel using sample matching. A random sample (stratified by gender, age, race, education, geographic region, and voter registration) was selected from the 2019 American Community Survey. The sample was weighted according to gender, age, race, education, 2020 election turnout and presidential vote, baseline party identification, and current voter registration status. Demographic weighting targets come from the 2019 American Community Survey. Baseline party identification is the respondent’s most recent answer given prior to March 15, 2022, and is weighted to the estimated distribution at that time (33% Democratic, 28% Republican). The margin of error for the overall sample is approximately 3%.
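The roughly 3% figure is consistent with the textbook margin-of-error formula for a simple random sample of 1,000 respondents. The sketch below assumes the standard 95% confidence calculation with the worst-case proportion of 0.5; YouGov's reported margin also reflects design effects from its weighting, which this back-of-the-envelope version ignores.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion p with sample size n,
    assuming simple random sampling (no design effect)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst-case proportion (p = 0.5) for a sample of 1,000:
moe = margin_of_error(1000)
print(f"{moe:.1%}")  # about 3.1%, matching the reported ~3%
```

Weighted opt-in panels like this one typically have a somewhat larger effective margin than this unadjusted estimate.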

Image: Adobe Stock (Techtopia Art)