Who Watches the Algorithms Guiding South Africa’s National Dialogue?

As South Africa’s National Dialogue unfolds, promising to be the most inclusive participatory process since 1994, there’s a new voice in the room: technology. AI tools now summarise thousands of inputs. Chatbots guide people through submissions. Algorithms decide what gets categorised, flagged, or “cleaned up.” But here’s the question we must ask: When machines mediate democracy, who holds them accountable?

The Algorithmic Middleman: Powerful, Silent, Unchecked
We are witnessing the rise of the invisible facilitator, a set of software tools quietly replacing moderators, scribes, and human note-takers. They’re faster, yes. But they also decide what counts as insight.
From automated transcription tools to AI-powered keyword grouping systems, these technologies frame how input is interpreted and prioritised.
"A digital platform without accountability is not neutral.
It’s power hiding behind code.” V20MM
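
To see what “power hiding behind code” looks like in practice, consider a minimal sketch of the kind of keyword-grouping pipeline these platforms often rely on. Everything here (the library, the stopword list, the number of clusters) is our own illustrative assumption; the Dialogue’s actual tooling has not been published.

    # Illustrative sketch only: the Dialogue's real pipeline is not public.
    # The library (scikit-learn) and all parameters are our assumptions.
    from sklearn.cluster import KMeans
    from sklearn.feature_extraction.text import TfidfVectorizer

    submissions = [
        "We need jobs and fair wages in our province.",
        "Service delivery has collapsed in our municipality.",
        "Fix the roads and the water supply where we live.",
        "Our clinic has no nurses and no medicine.",
    ]

    # An English stopword list and TF-IDF weighting quietly decide
    # which words "count" as signal and which are discarded as noise.
    matrix = TfidfVectorizer(stop_words="english").fit_transform(submissions)

    # The operator picks how many "themes" the public is allowed to
    # have; nobody outside the system sees this choice being made.
    themes = KMeans(n_clusters=2, n_init=10).fit_predict(matrix)
    print(list(zip(submissions, themes)))

Two configuration choices, the stopword list and the cluster count, already shape what the final report will say was “heard”.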
What Happens When AI Gets Culture Wrong?
Most AI systems used today are trained on English-dominant, Global North datasets.

That means they often:

  • Misinterpret or erase idioms and proverbs from African languages
  • Prioritise submissions that use formal, policy-friendly language
  • Flag passion or emotional tone as “non-substantive”
  • Summarise complex oral inputs into generic terms
“If the Dialogue’s AI can’t understand a Xhosa elder’s proverb, was she ever really included?”

This isn’t just a technical oversight; it’s cultural erasure by design.
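
A toy filter shows how this erasure happens without anyone intending it. The vocabulary and rule below are invented for illustration, not quoted from any real platform.

    # Hypothetical "substance" filter; vocabulary and rule are invented.
    POLICY_TERMS = {"policy", "budget", "employment", "wages", "housing"}

    def looks_substantive(text: str) -> bool:
        # Scores overlap with an English policy vocabulary. Anything
        # outside that vocabulary scores zero, however profound.
        words = {w.strip(".,?!").lower() for w in text.split()}
        return len(words & POLICY_TERMS) > 0

    print(looks_substantive("We demand an employment policy for young people."))  # True
    print(looks_substantive("Umntu ngumntu ngabantu."))  # False: the proverb never makes the record

The elder’s wisdom is not rejected; it is simply never seen.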

Four Accountability Gaps We Must Name

As the Dialogue leans more heavily on tech, we risk institutionalising bias without even realising it.
1. Black Box Summarisation

AI summaries are rarely transparent. Participants can’t see how their input was interpreted—only the output. 
2. Filtered Participation

Moderation bots remove “duplicate” or “off-topic” content, but who defines what’s repetitive, and what’s resistance? 
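
A sketch of our own (the threshold is an invented assumption, not any platform’s published rule) shows why: surface similarity cannot tell spam from solidarity.

    # Toy deduplication pass; the threshold is an invented assumption.
    from difflib import SequenceMatcher

    def near_duplicate(a: str, b: str, threshold: float = 0.8) -> bool:
        # Surface-text similarity only: it cannot tell copy-paste spam
        # from a coordinated campaign built around a shared slogan.
        return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

    incoming = [
        "Scrap the tolls on our road. Signed, Ward 4.",
        "Scrap the tolls on our road. Signed, Ward 7.",
        "Scrap the tolls on our road. Signed, Ward 9.",
    ]

    kept = []
    for sub in incoming:
        if not any(near_duplicate(sub, seen) for seen in kept):
            kept.append(sub)
    print(kept)  # three wards spoke; the record shows one voice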
3. Categorisation Errors

A woman describing her journey as a domestic worker and a mother might be boxed under “Family Issues” instead of “Labour Rights”. 
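
The sketch below, with categories and keywords we have invented for illustration, shows how that boxing happens in a single-label design.

    # Invented categories and keywords, for illustration only.
    CATEGORIES = {
        "Family Issues": ["mother", "children", "home"],
        "Labour Rights": ["worker", "wages", "employer"],
    }

    def categorise(text: str) -> str:
        # Single-label design: the FIRST match wins, and every other
        # dimension of the person's experience is discarded.
        lowered = text.lower()
        for category, keywords in CATEGORIES.items():
            if any(kw in lowered for kw in keywords):
                return category
        return "Uncategorised"

    print(categorise(
        "As a domestic worker and a mother, my wages do not cover transport."
    ))  # "Family Issues" -- the labour-rights claim vanishes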
4. Silent Terms and Conditions

Who owns the submissions? Who stores the data? For how long? Most platforms don’t say, and most participants don’t know.


Who Audits the Algorithm?

We wouldn’t hold a national election without ballot observers.

So why are we trusting unobserved algorithms to structure the People’s Dialogue?

What’s Needed:

  • Independent, public audits of the AI tools that summarise, filter, and categorise submissions
  • Transparent criteria for what moderation bots may remove as “duplicate” or “off-topic”
  • A way for every participant to see how their own input was interpreted and categorised
  • Testing and tuning for South Africa’s official languages, not English alone
  • Clear, published terms on who owns submissions, where the data lives, and for how long
  • Human review before any AI summary enters the official record

These are not luxuries. They’re the democratic firewalls of our time.
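
What might one of these firewalls look like in code? Here is one possible shape for a per-submission transparency record; the field names are our own invention, a sketch rather than a standard.

    # Sketch of a per-submission transparency record. All field
    # names are our own invention, not an existing standard.
    import hashlib
    import json
    from datetime import datetime, timezone

    def audit_record(submission: str, summary: str, model_id: str) -> dict:
        return {
            "received_at": datetime.now(timezone.utc).isoformat(),
            # Hashing the original lets anyone later verify exactly
            # what text the summariser was given.
            "submission_sha256": hashlib.sha256(submission.encode("utf-8")).hexdigest(),
            "model_id": model_id,      # which model, which version
            "summary": summary,        # what the machine claimed was said
            "human_reviewed": False,   # flipped only after sign-off
        }

    print(json.dumps(audit_record("original input", "AI summary", "summariser-v1"), indent=2))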

A Just Dialogue Needs Just Tech

We support innovation. But justice must come first.

The National Dialogue’s credibility depends not only on who speaks, but on how their voice is captured, processed, and preserved.

If AI systems continue to serve as silent editors, unseen translators, and invisible gatekeepers without public oversight, we risk replicating the very exclusions we claim to dismantle.

Final Word: No Algorithms About Us, Without Us

South Africa’s next chapter must not be written by biased code and unseen hands.

Let us demand tools that serve participation, not manipulate it.

Let us insist on tech that’s transparent, accountable, and rooted in context.

Let us remember: The struggle for justice includes the software now.