THE government of Zimbabwe, like many others, is battling to contain and mitigate the impact of Covid-19 on an already fragile economy and on constrained social welfare services for vulnerable and marginalised groups.
Otto Saki &
These include orphans, the elderly, the disabled and the unemployed. The number of those considered vulnerable cannot be properly established, as Zimbabwe’s social welfare systems have suffered neglect and under-resourcing over time.
Despite a trail of unsatisfactory performance by President Emmerson Mnangagwa’s administration, it was encouraging to learn that some form of social safety net will be implemented, though the pronouncement lacked detail.
In his post-cabinet briefing on April 21, 2020, Finance minister Mthuli Ncube suggested that the government, and we quote, “used a sophisticated algorithm to select beneficiaries of the ZW$180 Covid-19 pocket money”.
A social media report also elaborated that the Finance minister claimed that “they looked at how much money is in your bank account, mobile wallet, and using your cell phone number, figured out where you really stay”.
The authenticity of the report is not the issue, and if it ever becomes so, the authorities have the capacity to track those that spread misinformation, as already evidenced by the arrest of a citizen for forwarding a statement which the state alleges to be false.
What is alarming is what appears to be a combined operation involving excessive use of personal information by public and private actors: government and mobile network operators. Ncube’s statement thus raises several concerns around data protection, surveillance and the right to privacy.
Providing economic relief or “pocket money” is a commendable step; what is missing are the details. And as is often said, the devil is in the details. If indeed the Finance minister and other stakeholders used a sophisticated algorithm, then there are several fundamental implications about which all Zimbabweans should be extremely concerned.
Zimbabwe’s legal framework and safeguards for protecting personally identifiable information (PII) are insufficient. Several laws offer limited protection, specific to a particular use or industry. A comprehensive data protection bill, proposed in February 2019 following Cabinet’s approval to repeal the Access to Information and Protection of Privacy Act (Aippa), is long overdue.
Not so long ago, PII was limited to a few identifiers, such as your identity number, mailing address or landline. With technology, this has rapidly expanded to include internet protocol (IP) addresses, social media posts, digital images, geolocation, biometric and even behavioural data, especially in some digital authoritarian states. Mandatory registration of SIM cards is another medium that has resulted in the collection of personal information by mobile network operators.
The collection, processing and storage of PII by any entity, public or private, should be confined to the specific purpose for which it was collected. If a mobile network operator collects geolocation data to improve or provide services to a customer, that data should not be put to other purposes. Using balances in mobile money accounts to determine eligibility for Covid-19 pocket money would equally go beyond the defined purposes.
Such conduct amounts to a violation of consumer protection laws, and also leaves consumers wondering what other purposes service providers have used their information for.
From the collected PII, one can easily build digital identities, which enable decisions on service provision, access to welfare and other public services. Globally, there is a rush to use technology to solve human and systemic failures in providing welfare. Digital welfare states are emerging. The World Bank and its ilk have argued that digital identities “create huge savings for citizens, governments, and businesses by reducing transaction costs, increasing efficiency, and driving innovation in service delivery, particularly to the poorest and most disadvantaged groups in society”.
This sounds noble, but an ineffective manual process will never translate into an effective digital process. The data and identities are built from information gathered manually, with all its inherent biases.
To develop a sophisticated algorithm, as the minister purports, suggests that information was “pulled” from different sources. Yet there are no frameworks that allow for data federation, whereby one central data store links to one or more remote data sources. These remote sources would feed bioinformatic, numerical analysis or general computational algorithms generated by different individuals. This sophistication is bound to confuse, cause chaos, and exclude, minimise or discriminate against individuals who ordinarily should be included.
There are many unknowns with this sophistication. Of particular concern is the absence of safeguards to mitigate abuse, to ensure that those who deserve this pocket money are not excluded, and that PII and/or digital identities are not driving data economies at the expense of people’s rights. Social protection systems are intended to advance the enjoyment of other rights. These systems should respect the dignity and privacy of the vulnerable, poor and marginalised.
The authorities should promptly disclose the science and policy behind the provision of welfare and the algorithms used; otherwise the sophistication means precious little.
Saki and Simanje are lawyers with keen interest in technology and legal issues and are writing in their personal capacities.