
Christopher Nolan says 'Developing AI technology more dangerous than nuclear weapons'

Updated on: 23 July,2023 05:27 PM IST  |  Los Angeles
IANS |

Christopher Nolan, much like other big Hollywood figures such as James Cameron, Simon Pegg, and Tom Cruise, has spoken about the increasing use of artificial intelligence in both movies and in real life, and has warned at length about its dangers.


Christopher Nolan via Instagram



With his massive biopic ‘Oppenheimer’, which deals with the creation of nuclear weapons, currently ruling cinemas, Nolan has said that AI is even more dangerous than nukes.


As reported by Aceshowbiz, the ‘Interstellar’ director told 'The Guardian': "To look at the international control of nuclear weapons and feel that the same principles could be applied to something that doesn't require massive industrial processes - it's a bit tricky."


He added: “International surveillance of nuclear weapons is possible because nuclear weapons are very difficult to build. Oppenheimer spent $2 billion and used thousands of people across America to build those first bombs. It's reassuringly difficult to make nuclear weapons and so it's relatively easy to spot when a country is doing that. I don't believe any of that applies to AI.”

Nolan went on to say that the increasingly intimate relationship between AI and weaponry underlines the need for corporate accountability and greater scrutiny.

He further said that the very thought of people producing or using such technology without truly understanding its implications is "absolutely terrifying … because as AI systems go into the defense infrastructure, ultimately they’ll be in charge of nuclear weapons."

During a special screening of ‘Oppenheimer’ on July 20, the director spoke to a group of scientists working in the field of AI, many of whom have questioned their own work.

Many of these scientists and researchers have described the developments in their own field as their personal ‘Oppenheimer’ moment as they ponder the possible outcomes of such advancements in AI technology.

‘The Dark Knight’ director also said that while the need for global accountability in AI control is becoming increasingly important with advancing weapons technology and systems of control such as surveillance, "the United Nations has become a very diminished force" in enforcing it.

The director added that he hoped audiences watching ‘Oppenheimer’ would come away with a better understanding of the prospects of controlling weapons systems and artificial intelligence.

This story has been sourced from a third-party syndicated feed.

"Exciting news! Mid-day is now on WhatsApp Channels Subscribe today by clicking the link and stay updated with the latest news!" Click here!

Register for FREE
to continue reading !

This is not a paywall.
However, your registration helps us understand your preferences better and enables us to provide insightful and credible journalism for all our readers.

Mid-Day Web Stories

Mid-Day Web Stories

This website uses cookie or similar technologies, to enhance your browsing experience and provide personalised recommendations. By continuing to use our website, you agree to our Privacy Policy and Cookie Policy. OK