OECD members, including U.S., back rules to make AI safer
PARIS–The evolution of artificial intelligence is driving advances in technology but raising questions over ethics, the head of the OECD said on Wednesday, as more than 40 nations backed a set of principles meant to improve transparency around AI.
The principles, endorsed by the United States, call for AI systems to be fair, transparent and accountable and are the first of their kind, said Angel Gurria, head of the Organization for Economic Co-operation and Development.
The principles call on companies to disclose enough about how their systems work for people to understand their results and be able to challenge them.
The principles also state that AI should not only be used to benefit people, but should uphold the rule of law, human rights, democratic values and diversity.
“While AI is driving optimism, it is also fueling anxieties and ethical concerns,” Gurria said at a ministerial meeting of the group’s members.
“There are questions around the trustworthiness, the robustness of AI systems, including the dangers of codifying or reinforcing existing biases related to gender or race, or infringing on human rights and privacy,” he added.
The principles are backed by the OECD’s 36 members as well as Argentina, Brazil, Colombia, Costa Rica, Peru and Romania, the OECD said. They are to be put to the Group of 20 nations at a summit in Japan later this year.
The guidelines are not legally binding, but are intended to influence legislation as governments increasingly face pressure to lay down rules for what AI technology can and cannot do.
Although U.S. President Donald Trump’s administration has tended to shun international agreements, Washington gave its backing to the principles.
Last week the White House snubbed a push by a gathering of world leaders in Paris for stronger measures against hate speech on social media.
The OECD has set policy guidelines in the past that over time became international standards on issues such as privacy law and consumer protection.