 
WE live in an age when code begins to impersonate companionship and when corporations treat intimacy as another product category to be optimised for revenue.
Sam Altman’s recent declaration that OpenAI plans to allow erotica for “verified adults” on ChatGPT is not merely a product tweak; it is a social experiment with profound moral, ethical and governance implications for countries like Zimbabwe.
The announcement, framed as “treat adult users like adults”, invites a conversation that cannot be outsourced to engineers, regulators or marketing divisions.
It demands public reasoning, democratic oversight and a sober appraisal of what we are willing to trade for convenience, novelty and commercial gain.
Let us be clear about the contours of the change. The proposal is to relax content restrictions and to implement stronger age-gating mechanisms so that sexually explicit, text-based interactions will be permitted for verified adults in future versions of ChatGPT.
The move follows a tense chapter in which OpenAI tightened restrictions because of concerns about vulnerable users and mental health, then faced litigation after a teenager’s suicide that implicated the chatbot’s interactions.
At the heart of the debate sits a paradox: the same technology that can soothe loneliness and provide useful information can also, if poorly governed, facilitate harm, normalise unhealthy intimacy patterns and exacerbate vulnerabilities among youths.
Morally, the idea that adults should be trusted to access adult content is defensible in principle. Freedom of expression, sexual autonomy and the right to explore consensual erotic content are recognisable liberal values.
Yet freedom without responsibility and safeguards is hollow. Zimbabwe, with its particular social fabric, educational realities and regulatory capacities, cannot simply import Silicon Valley’s framing of “adults only” and assume that effective protection will follow.
The practical reality of age-gating is messy even in the most resourced jurisdictions; TechCrunch has reported that platforms sometimes fail to prevent minors from accessing explicit content despite declared safeguards, a failure OpenAI has had to address before.
The moral question is, therefore, not whether adults should have access, but what responsibility a company has to ensure its tools do not become vectors of harm in contexts where digital literacy, identity verification infrastructure and enforcement capacity are limited.
From an ethical standpoint, erotica-enabled AI complicates our understanding of consent, relationship formation and psychological development.
Written erotica generated by a chatbot is not merely pornographic text; it is a tailored, interactive encounter engineered to simulate intimacy.
When an algorithm learns to respond to emotional cues, to mirror desire, to cultivate attachment, it begins to occupy relational space traditionally held by human beings.
For some adults, this might be benign or even therapeutic. For others, especially youths or socially isolated individuals, it can substitute for the hard, often painful work of practising reciprocity, negotiating boundaries and developing emotional resilience.
The Centre for Democracy and Technology’s finding that one in five students report having, or knowing someone who has, a romantic relationship with an AI should unsettle us; such attachments risk shaping sexual scripts and expectations in ways that are neither consensually negotiated nor culturally contextualised.
There is also a distinctly Zimbabwean dimension to consider. Our youth population is large and digitally connected but often lacks robust sex education, critical media literacy and reliable verification mechanisms for age online.
The harms familiar in Western regulatory debates (addiction, grooming, exposure to violent or non-consensual fantasies) are amplified where social supports are weaker and legal redress is remote.
A laissez-faire global platform that claims to “treat adult users like adults” can, therefore, offload its governance responsibilities onto national actors who may lack the means to police harms effectively.
This shift represents a governance failure by proxy: platforms export risk and expect states to pick up the pieces.
The legal environment complicates this further. In the United Kingdom, for instance, written erotica need not be age-verified under the Online Safety Act, while pornographic images and AI-generated images require proof of majority.
The patchwork of rules across jurisdictions creates regulatory arbitrage: companies will design features for the least restrictive environments and hope that technical fixes (age-gating and verification) suffice to prevent cross-border spillover.
For Zimbabwean regulators and civil society, the urgent question is not only whether we can prevent minors from accessing erotica-generating features, but whether we can hold platform providers accountable when harms occur.
The litigation against OpenAI following the death of a United States teenager is a harrowing reminder that accountability gaps have real human costs, and that redress will increasingly be pursued through cross-border legal claims.
Beyond immediate harms to youth, there is a subtler, longer-term danger to human relations.
Erotica-capable chatbots commodify emotional labour: they promise predictability, absolute availability and the absence of judgement.
Real human relationships are messy, reciprocal and require moral work. If societies begin to normalise algorithmic intimacy, we risk eroding the skills necessary for citizens to participate in democratic life: empathy, conflict resolution and collective care.
When desire is outsourced to an instrument that models responsiveness but lacks moral imagination, we flatten erotic and affective life into consumable scripts.
The consequence may be a generation that expects intimacy to be tailored on demand and to conform to narrow, algorithmically reinforced fantasies.
So, what should Zimbabwe do? First, we must reject the illusion that technical mitigations alone will suffice. Age-gating, while necessary, is not a panacea.
Verification systems can be circumvented, and they raise privacy concerns that are especially acute in societies where identity management infrastructures are both evolving and contested.
Second, Zimbabwe should invest urgently in digital literacy and comprehensive sexuality education that addresses the ethical use of AI, consent in digital spaces, and the distinction between simulated interaction and human responsibility.
These educational interventions should be dialogical, participatory and context-sensitive rather than merely prescriptive.
Third, regulation must be proactive, proportionate and multi-layered. Rather than reflexively banning features, policymakers should adopt a risk-based approach that mandates transparency about training data, content moderation practices and safety engineering, while creating clear liability pathways where platform conduct causes foreseeable harm.
Industry self-regulation has shown its limits; litigation and ad hoc fixes will not substitute for coherent policy. Fourth, civil society and academia must be empowered to monitor and audit AI systems.
Zimbabwean universities, think tanks and advocacy groups should develop capacities to test, critique and publicly report on the behaviour of AI companions in local languages and contexts.
Finally, we must keep the moral imagination at the centre of our response. Technology is not morally neutral.
When companies argue that greater permissiveness is necessary to “treat adults like adults”, we must ask what conception of adulthood they presuppose and what social goods are being sacrificed for commercial growth.
Does adulthood simply mean consumer sovereignty over a bespoke experience? Or does it mean being a civic subject capable of forming obligations to others, of negotiating difference, and of sustaining public institutions that protect the vulnerable?
To be sure, none of this entails a blanket condemnation of erotic expression or of adult use of technology. It does insist on humility: the creators of these systems do not have the final say on what counts as ethical intimacy.
Zimbabweans, like citizens elsewhere, must claim that conversation.
We must interrogate not only the content but the architecture of our digital companions, the incentive structures that drive their development and the ecosystems that will either mitigate or magnify harm.
- Sagomba is a chartered marketer, policy researcher, AI governance and policy consultant, and ethics of war and peace research consultant. — Email: [email protected]; LinkedIn: @Dr. Evans Sagomba; X: @esagomba.