What about outside the EU?
GDPR, the EU’s data protection regulation, is the bloc’s most famous technology export, copied everywhere from California to India.
The EU’s approach to regulating the highest-risk AI is one that most developed countries broadly agree on. If Europeans can create a coherent way to regulate the technology, it could serve as a template for other countries that wish to do the same.
“U.S. companies, in their compliance with the EU AI Act, will ultimately also raise the bar for American consumers in terms of transparency and accountability,” said Marc Rotenberg, director of the Center for Artificial Intelligence and Digital Policy, a nonprofit that tracks AI policy.
The Biden administration is also closely watching the bill. The United States is home to some of the world’s largest AI labs, such as Google AI, Meta, and OpenAI, and leads in several global rankings of AI research, so the White House wants to know how any regulations would apply to those companies. For now, influential U.S. government figures such as national security adviser Jake Sullivan, Commerce Secretary Gina Raimondo, and Lynne Parker, who leads the White House AI effort, have welcomed European efforts to regulate AI.
“This is in stark contrast to how the U.S. viewed the development of GDPR, when Americans said GDPR would end the Internet, eclipse the sun, and end life on Earth as we know it,” Rotenberg said.
Despite some inevitable caution, the United States has good reason to welcome the legislation. It is anxious about China’s growing influence in tech. For the United States, the official position is that maintaining Western dominance in tech is a matter of whether “democratic values” prevail. It wants to keep the EU, a “like-minded ally,” close.
What is the biggest challenge?
Currently, certain requirements of the act are technically impossible to comply with. The first draft of the bill required that datasets be error-free and that humans be able to “fully understand” how AI systems work. But the datasets used to train AI systems are so large that manually checking them for errors would take thousands of hours of human work, assuming such a thing could be verified at all. And today’s neural networks are so complex that even their creators don’t fully understand how they reach their conclusions.
Tech companies are also deeply uncomfortable with requirements that would give outside auditors or regulators access to their source code and algorithms in order to enforce the law.