The EU's AI Act Could Have a Chilling Effect on Open Source Efforts, Experts Warn
Proposed E.U. rules could limit the type of research that produces cutting-edge AI tools like GPT-3, experts warn in a new study. From a report: The nonpartisan think tank Brookings this week published a piece decrying the bloc's regulation of open source AI, arguing it would create legal liability for general-purpose AI systems while simultaneously undermining their development. Under the E.U.'s draft AI Act, open source developers would have to adhere to guidelines for risk management, data governance, technical documentation and transparency, as well as standards of accuracy and cybersecurity.

If a company were to deploy an open source AI system that led to some disastrous outcome, the author asserts, it's not inconceivable the company could attempt to deflect responsibility by suing the open source developers on which they built their product. "This could further concentrate power over the future of AI in large technology companies and prevent research that is critical to the public's understanding of AI," Alex Engler, the analyst at Brookings who published the piece, wrote. "In the end, the [E.U.'s] attempt to regulate open-source could create a convoluted set of requirements that endangers open-source AI contributors, likely without improving use of general-purpose AI."

In 2021, the European Commission -- the E.U.'s politically independent executive arm -- released the text of the AI Act, which aims to promote "trustworthy AI" deployment in the E.U. As they solicit input from industry ahead of a vote this fall, E.U. institutions are seeking to make amendments to the regulations that attempt to balance innovation with accountability. But according to some experts, the AI Act as written would impose onerous requirements on open efforts to develop AI systems. The legislation contains carve-outs for some categories of open source AI, like those exclusively used for research and with controls to prevent misuse.
But as Engler notes, it'd be difficult -- if not impossible -- to prevent these projects from making their way into commercial systems, where they could be abused by malicious actors.