Some of the most prominent executives in the tech landscape have been called to Washington by Senate Majority Leader Chuck Schumer to discuss the regulation of artificial intelligence. The meeting, set to take place Sept. 13, is closed to both the media and the public; senators will only be able to submit written questions to the participants.
The participants include Elon Musk, Bill Gates, OpenAI CEO Sam Altman, Mark Zuckerberg, Google's Chief Executive Sundar Pichai and Nvidia's CEO Jensen Huang, among others. This first meeting has already faced widespread criticism for tilting toward the businessmen set to benefit from the technology, rather than the researchers, ethicists and scientists who could help ensure safe regulation free of commercial self-interest.
"Innovation must apply to both sides of the equation, innovating so we can move the advantages of AI forward, but innovating so we can deal with the problems that AI might create and lessen them as much as we can," Schumer said last week, noting that the push to regulate AI will be "one of the hardest things" Congress has possibly ever undertaken.
The series of forums, Schumer said, will "give our committees the knowledge base and thought insights to draft the right kind of policies."
But not everyone is on board with the way Schumer has decided to go about gaining this "knowledge base."
“These tech billionaires want to lobby Congress behind closed doors with no questions asked. That’s just plain wrong,” Sen. Elizabeth Warren, D-Mass., told NBC News, adding that this group of people should not have a forum to shape regulation to ensure that they "are the ones who continue to dominate and make money."
An ongoing debate
The conversation about how to regulate AI has been ongoing for months, intensifying after a public Senate hearing in May that featured testimony from Altman, the man behind ChatGPT.
Altman, laying out a regulatory plan that included an oversight agency, independent auditors and deployment standards, said that "regulatory intervention by governments will be critical to mitigate the risks of increasingly powerful models."
The core focus for Altman and some of his peers is ensuring regulation doesn't stifle innovation, even as they cite extinction-level threats as their biggest concern when it comes to AI.
But many experts have been openly critical of the so-called extinction risk, panning it as an effort by tech companies to divert people's attention away from the current harms being exacerbated by the technology, such as increased fraud, discrimination, misinformation, job loss, copyright infringement and environmental impacts.
One of the biggest threats to AI research, meanwhile, is that the companies behind these models have not been transparent about what data they used during their training processes, so researchers are not easily able to understand how powerful these models actually are.
For Gary Marcus, a leading AI researcher who testified at the hearing in May, real transparency — not just as a talking point — is a vital path on the road to AI safety. But he remains concerned about Big Tech's seat at the table.
"Putting it bluntly: if we have the right regulation; things could go well," he wrote in June. "If we have the wrong regulation, things could badly. If big tech writes the rules, without outside input, we are unlikely to wind up with the right rules."
"The big tech companies' preferred plan boils down to ‘trust us,'" Marcus said at the hearing. "The current systems are not transparent, they do not protect our privacy and they continue to perpetuate bias. Even their makers don't entirely understand how they work. Most of all, we cannot remotely guarantee that they're safe."