Is a Terminator-style AI apocalypse inevitable? Here's what shocking research says

Artificial intelligence (AI) has captured the world's attention, especially since ChatGPT launched last November. Since then, generative AI has developed rapidly, with new advancements announced every day. Technologies such as virtual assistants, speech and voice recognition, and machine learning are ingrained in the programs we use daily, helping us without our even noticing. Take the film 'Her', for example, in which an AI virtual assistant helps Joaquin Phoenix's character combat loneliness. However, it's not all sunshine and rainbows: arguments have been made about the consequences of AI's meteoric rise, especially if it is not kept in check. The current risks of AI include technology taking over jobs, making humans lazy and even swaying elections. This is already evident: Suumit Shah, the founder of an e-commerce company, replaced 90 percent of his support staff with an AI customer service assistant. The future, however, could hold something much more sinister.

While The Terminator may have been an unrealistic 80s sci-fi film about cyborg assassins and doomsday, the threat posed by Skynet could yet become real, with sentient AI posing existential risks and potentially bringing about an apocalypse.

But will it ever happen? Extinction risk experts, superforecasters and researchers might have the answer.

Predicting the AI apocalypse

A report by The Economist on a study led by Ezra Karger, an economist at the Federal Reserve Bank of Chicago, and Philip Tetlock, a political scientist at the University of Pennsylvania, sheds light on the possibility of an apocalypse brought about by artificial intelligence. In the paper, published on July 10, the researchers surveyed two groups of people.

One group comprised experts on nuclear war, AI extinction and bioweapons; the other consisted of 'superforecasters', people who make predictions based on statistical data. Both were asked to estimate the probability of a catastrophe or apocalypse brought about by events such as AI takeovers, bioweapon attacks and nuclear wars. Shockingly, the gloomiest estimates came from the domain experts, who put the chance of a catastrophe by 2100 at over 20 percent and the chance of extinction at 6 percent. The superforecasters, by contrast, gave these events a 9 percent and a 1 percent chance of happening, respectively.

But will it be caused by AI? The experts think it is plausible, putting the chance of a catastrophe caused by the technology at 12 percent, and the chance of it wiping out our species altogether at 3 percent. Superforecasters, on the other hand, envision a smaller impact, giving an AI-sparked catastrophe just a 2.1 percent chance and AI-caused human extinction a mere 0.38 percent.

What about the near future?

While AI-driven global extinction may remain speculation about the distant future, the technology poses risks that threaten us much sooner. As it advances, it could put jobs at risk, and it is already doing so. According to "The Future of Jobs Report 2020" by the World Economic Forum, AI could replace as many as 85 million jobs worldwide by 2025. Geoffrey Hinton, the "godfather" of AI, even quit his position at Google, citing the risks posed by the technology, including its impact on jobs.

But will artificial intelligence bring doom the way Skynet did? For now, we can only hope to incorporate this technology into our lives without letting it actually take over.
