"We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4," reads an open letter organized by the Future of Life Institute and endorsed by over 27,000 signatories, including Elon Musk and Apple co-founder Steve Wozniak.
Since the publication of the letter on March 22, the White House has summoned the leaders of the nation's top artificial intelligence companies for a discussion about regulation. Senate majority leader Chuck Schumer (D–N.Y.) is "taking early steps toward legislation to regulate artificial intelligence technology," according to reporting from Axios. Sam Altman, CEO of OpenAI, the company responsible for ChatGPT, has said that optimizing artificial intelligence for the good of society "requires partnership with government and regulation."
But economist Robin Hanson worries that too much of today's fear of artificial intelligence is a more generalized "future fear" that will imperil technological progress likely to benefit humanity.
"Most people have always feared change. And if they had really understood what changes were coming, most probably would have voted against most changes we've seen," Hanson wrote in a recent post on the topic. "For example, fifty years ago the public thought they saw a nuclear energy future with unusual vividness, and basically voted no."
Join Reason's Zach Weissmueller this Thursday at 1 p.m. Eastern for a discussion of the risks and rewards of A.I. His guests are Hanson, an associate professor at George Mason University and a research associate at Oxford's Future of Humanity Institute, and Jaan Tallinn, a tech investor who was part of the software team behind Skype and a co-founder of the Future of Life Institute, which organized and published the open letter calling for a pause.
Watch and leave questions and comments on the YouTube video above or on Reason's Facebook page.
- Producer: Adam Sullivan