A year ago, artificial intelligence -- to many people -- was little more than the behind-the-scenes force powering social media algorithms. That changed when ChatGPT launched in November 2022.
In less than a year, AI has become a dominant force, impacting just about every sector and sending stocks like Nvidia soaring.
Microsoft (MSFT) -- the latest industry leader to reaffirm its responsible approach to the technology -- laid out a series of proposals May 25 with the goal of answering a simple question.
"How do we best govern AI?"
Though AI presents a host of opportunities for humanity, from enhanced national security to greater individual productivity, it necessitates certain guardrails, Microsoft vice chair and president Brad Smith wrote.
Smith laid out a five-pronged approach to what regulation could look like: build on existing government-led AI safety frameworks; mandate "safety brakes" for AI involved with "critical infrastructure"; build a broad legal framework; promote transparency; and enhance public-private partnerships.
AI that is, or might eventually be, in control of infrastructure like electric grids and traffic flows must include safety brakes that keep "human oversight, resilience and robustness top of mind," Smith said.
A component of his suggested legal framework involves the creation of a new licensing and regulatory agency, something that OpenAI has discussed repeatedly.
"As technological change accelerates, the work to govern AI responsibly must keep pace with it," Smith wrote. "We’re on a collective journey to forge a responsible future for artificial intelligence."
Microsoft-backed ChatGPT creator OpenAI similarly laid out a proposal for AI governance on May 22; OpenAI's suggestions, however, are much more focused on preparing regulation for the hypothetical creation of artificial superintelligence or AGI, an eventuality that Smith's proposal didn't mention.
“Don’t ask what computers can do," Smith said, "ask what they should do.”