Let me ask you something: when was the last time you wrote a message, be it a WhatsApp text, an email, or even a Facebook post, without second-guessing your words?

Without relying on autocorrect, predictive text, or some AI-powered suggestion to “polish” your thoughts? If you are like most people today, chances are you have already surrendered a piece of your voice to the machine.

Now, I am not here to bash technology. I have spent my career studying its ethical implications, especially in the African context.

I believe in innovation. I believe in progress. But I also believe in people, and right now, people are being quietly disempowered by design.

Let us talk about WhatsApp’s new feature called “Writing Help.” It sounds innocent enough. Meta (the company behind WhatsApp) says, “Sometimes you know what you want to say, but just need a little help with how to say it”. Harmless, right?

Wrong. This is not just a tool. It is a subtle shift in how we communicate, how we think, and ultimately, how we define ourselves. And for Zimbabweans, especially our youth, it is a shift we cannot afford to ignore.


More than just words

Writing is not just about putting words on a screen. It is about thinking. It is about clarity. It is about expressing your truth, your emotions, your ideas.

In fact, psychologists and educators have long argued that writing is one of the most powerful tools for developing critical thinking. When a teenager in Mbare writes a heartfelt message to a friend, or when a vendor in Mutare drafts a complaint to the council, they are not just communicating; they are practising self-awareness, agency, and creativity.

But what happens when AI starts doing the writing for them? They stop thinking. They stop feeling. They stop owning their voice.

Our children are at risk

Let us be honest: our children and teenagers are growing up in a digital world. Many of them have never known life without smartphones, social media, or instant messaging.

For them, texting is not just a tool; it is a way of life. Now imagine this generation relying on AI to write their messages, emails, essays, and even job applications.

Imagine a 16-year-old in Chitungwiza who never learns how to express frustration, joy, or love in their own words because an app is always ready to do it for them.

This is not empowerment. This is erosion. And it is happening quietly, subtly, under the guise of “productivity” and “efficiency”.

Disempowerment by design

WhatsApp is not alone. Google has embedded AI writing tools into Gmail, Google Docs, and even Google Sheets. Microsoft is doing the same with Outlook and Word. These features are marketed as helpful assistants, ways to save time, reduce errors, and sound more “professional”. But let us pause and ask: who defines professionalism? Who decides what “better” writing looks like?

When a rural teacher in Gokwe writes a passionate email to the Ministry of Primary and Secondary Education, and AI suggests changing their tone to be more “neutral”, whose voice is being prioritised? Whose story is being sanitised? This is what I mean by disempowerment by design.

The tools are not neutral. They are shaped by global tech companies, often based in Silicon Valley, with little understanding of our cultural nuances, our linguistic richness, or our social realities.

Our voices matter

Zimbabwe is a nation of storytellers. From the mbira songs of our ancestors to the spoken word poetry of today’s youth, we have always valued expression.

Our languages, Shona, Ndebele, Tonga, and many others, carry deep wisdom, humour, and emotion. But AI tools are not trained on our stories.

They are trained on Western data, Western grammar, and Western norms. They do not understand our idioms, our metaphors, or our cultural references. So when AI starts “helping” us write, it is not just correcting our grammar; it is erasing our identity.

The everyday impact

Let us bring this closer to home. Imagine a young woman in Bulawayo applying for a job via email. She writes a passionate cover letter, but AI suggests she “tone down” her enthusiasm and “restructure” her sentences.

She accepts the suggestions, thinking it will help her chances. But in doing so, she loses the very spark that made her unique. Or consider a WhatsApp group of community organisers in Epworth.

They are planning a clean-up campaign and want to rally support. One member writes a fiery message, but AI suggests softening the language.

The message loses its urgency. Fewer people show up. These are not hypothetical scenarios. They are real consequences of letting machines shape our words and, by extension, our actions.

What can we do?

So, what is the way forward? Do we throw away our phones? Ban AI tools? Of course not! But we must be intentional. We must be aware. And we must teach our children and ourselves to use these tools wisely. Here are a few suggestions:

Teach writing as thinking: In schools, let us emphasise writing not just as a skill, but as a form of self-expression and critical thought. Let students write in their own voices, without AI interference.

Promote digital literacy: Let us educate people about how AI tools work, what biases they carry, and how to use them without losing authenticity.

Celebrate local languages: Encourage writing in Shona, Ndebele, and other indigenous languages. AI tools struggle with these languages, which makes them a powerful space for genuine expression.

Challenge the tech giants: As a nation, we must demand that global tech companies respect our linguistic and cultural diversity. We need African data, African voices, and African ethics in the development of these tools.

Keep your voice

Dear reader, your voice matters. Whether you are a kombi driver texting your cousin, a nurse writing a report, or a student drafting an essay, your words are yours.

Do not let a machine take that away. AI can be a tool. But it must never become the author of your life. Let us raise a generation of Zimbabweans who write boldly, think deeply, and speak truthfully. Let us protect our voices, our stories, and our minds. Because in the end, the most powerful message is the one that comes from the heart, not from an algorithm.

Dr Sagomba is a doctor of philosophy specialising in AI, ethics and policy; an AI governance and policy consultant; an ethics of war and peace research consultant; a political philosopher; and a chartered marketer. Contact: esagomba@gmail.com / LinkedIn: Dr. Evans Sagomba (MSc Marketing)(FCIM)(MPhil)(PhD) / X: @esagomba