Global framework to support safer AI use

A new OECD framework promotes the safe use of AI. | Photo: Black Jack 3D (iStock)

A global framework has been developed to help companies better share their progress in implementing the safe use of artificial intelligence (AI).

The framework, developed by the Organisation for Economic Co-operation and Development (OECD), allows businesses to report on their efforts to promote safe, secure, and trustworthy AI.

OECD Secretary-General Mathias Cormann said, for the first time, companies would be able to provide comparable information on their AI risk management actions and practices, such as risk assessment, incident reporting and information sharing mechanisms.

Mr Cormann said this would foster trust and accountability in the development of advanced AI systems.

He said some of the world’s largest developers of advanced AI systems had contributed to the initiative and were instrumental in its pilot phase, testing its features, and ensuring its effectiveness.

“Leading AI developers, including Amazon, Anthropic, Fujitsu, Google, KDDI CORPORATION, Microsoft, NEC Corporation, NTT, OpenAI, Preferred Networks Inc., Rakuten Group Inc., Salesforce, and Softbank Corp., have already pledged to complete the inaugural framework.”

Mr Cormann said the OECD was committed to promoting transparency, comparable reporting and co-operation among global stakeholders, ultimately building trust in AI systems.

“Enabling companies to share their practices and demonstrate their focus on safety, accountability, and transparency will contribute to the responsible development, deployment and use of advanced AI systems.”

He said that by aligning the reporting framework with multiple risk management systems, including the Hiroshima AI Process Code of Conduct, the OECD aimed to promote interoperability and consistency across international AI governance mechanisms.

Learn more about the G7 reporting framework – Hiroshima AI Process (HAIP) international code of conduct for organizations developing advanced AI systems.