Dr Ntokozo Mahlangu On Why Boards Must Govern AI, Not Just Adopt It

Dr Ntokozo Mahlangu, Risk Management Specialist and a Strategic Advisory Board member of The DaVinci Institute, has published a thought-provoking op-ed examining a growing governance gap in South African organisations: the inability of boards to fully explain the decisions made by artificial intelligence systems.

Writing at a time when AI is already embedded in sectors such as banking, insurance, and telecommunications, Dr Mahlangu argues that the challenge is no longer technological adoption, but governance. As algorithms increasingly influence decisions on credit, pricing, hiring, and risk, boards are being confronted with a critical question: who is accountable when decisions are made by systems that are not easily understood?

From Adoption to Accountability

In his article, “If your board cannot explain AI decisions, it is not governing them,” Dr Mahlangu reflects on how governance debates often emerge only after failure. Referencing cases such as Steinhoff International and Eskom, he highlights how weaknesses in oversight and accountability tend to surface only once systems break down.

He suggests that a similar risk is now forming with artificial intelligence. While organisations are rapidly adopting AI for efficiency and innovation, the governance of these systems remains underdeveloped. This creates a disconnect between decision-making and accountability, where outcomes are produced without clear visibility into how they were reached.

The Risk of Opaque Decision-Making

Dr Mahlangu points to global examples such as Apple Card and Amazon, where AI-driven systems were found to produce biased or unfair outcomes. These cases, while international, illustrate risks that are equally relevant in South Africa, including bias, lack of transparency, and weak accountability structures.

As AI systems rely on complex datasets and machine learning models, decision-making processes become harder to trace. What appears to be more precise and data-driven can, in reality, obscure where judgment is applied and where responsibility lies.

Strengthening Governance in an AI Era

With South Africa accelerating digital adoption and enforcing regulations such as the Protection of Personal Information Act, expectations around data governance are rising. However, Dr Mahlangu notes that many boards still treat AI as a strategic capability rather than a governance priority.

He argues that this must change. Effective oversight now requires boards to move beyond evaluating outcomes and develop a working understanding of how AI systems operate. This includes interrogating the data used, assessing bias and fairness, understanding model limitations, and ensuring mechanisms for human intervention and auditability.

Reasserting Accountability

Grounded in South Africa’s governance tradition shaped by the King Reports on Corporate Governance, the article reinforces that ethical leadership and accountability remain central, even in a technologically advanced environment.

Ultimately, Dr Mahlangu’s message is clear: governance cannot be outsourced to algorithms.

If organisations cannot explain the decisions made in their name, they cannot claim to govern them. And if governance fails to extend to AI systems, it is only a matter of time before the consequences become visible. Read the full article in The Times.

