CVA response – COVID-19, privacy, carrots and open banking
Since the onset of the COVID-19 pandemic, there has been heightened interest in the use of technology for humanitarian assistance: both to help fight the spread of the virus through a plethora of tracing apps, and to reach people hit hardest economically through remote registration and authentication tools such as contactless biometrics. In this blog post we explore the implications of some of these developments for individual rights to privacy and data protection. Over the coming weeks we will build on this discussion through several blogs and webinars. Stay tuned.
Some days it is hard to remember what life was like pre-COVID-19, but in humanitarian response terms, we are still very much in the first phase of this marathon. With alarming projections of increasing poverty and risk of famine, humanitarian caseloads are expanding at scale; identifying those most in need and responding fast are at the top of the priority list.
For many, digital-first remote CVA responses have proved to be one of the few options available to reach people in need fast. Whilst the nuts and bolts of CVA responses to COVID-19 may not be hugely different from other responses, the increased use of digital tools puts even more onus on organisations to be responsible in their data management practices. A recent global meeting of Cash Working Group coordinators flagged several challenges in this area. The two issues at the top of the list were: (i) the impact of existing Know Your Customer regulations on the ability to reach those without formal identification, and (ii) the safe handling and sharing of vast amounts of sensitive data with financial service providers and/or between responding agencies. As these challenges are worked through and solutions identified, the individual right to privacy, a fundamental right, should not be forgotten.
Now may be a good moment to talk about carrots.
Amos reflects: “My son hated carrots. He’d remove them from his plate, sometimes throw them away, or hide them in random places when we weren’t looking. The weird thing was, he had never tried them. Some older cousin or friend had told him carrots were gross and that was that. Nothing we said was going to change that. But then everything changed. I don’t even remember what or how or if a bribe was involved. But he got to try one. And he loved it. He loved it so much that carrots are now a staple food in our house. He has them with almost every meal and will complain if there are none in the house”.
Going remote is somewhat similar. Going remote and going digital are often seen as two sides of the same coin, each dependent on the other. People and companies resisted both for years, never trying them, basing their decisions on assumptions, rumours and the bad experiences of others (while ignoring all the good ones). Then COVID-19 arrived and lockdowns were introduced. Suddenly, everyone was working remotely. The longer it goes on, the more we hear of people and companies loving it. Remote working is likely to feature much more in the future than it did in the past.
There are likely to be important parallels with remote delivery of cash programmes. Most of our cash delivery already happens through remote means; we just don't think about it that way. Banks, mobile money and mobile vouchers all involve an element of "remote" in their design. This is the easy part of going digital and going remote. Registration, targeting and monitoring, the hard part of CVA, remain the hard part to get right. In fact, they have become even harder.
Registration often involves sensitive information, which we don't want shouted across a 2-metre physical distancing gap. It can be done with SMS and digital forms, but doing so requires literacy (linguistic and digital), access to a device, and a form of identity to verify you are who you say you are, and not all vulnerable people have all three.
Targeting tends to be based on some aspect of demographics collected through the registration process or by segmenting a list given to us by a local leader or government official. But most of those lists need to be verified, which is a challenge remotely. The error of exclusion in this crisis can be a lot more costly than the error of inclusion.
And monitoring. While it is fairly easy to monitor whether the funds arrived in the account they were supposed to, it is very difficult to know if they were used as intended – unless, of course, we follow the money. However, it is hard to follow the money without following the person. And this highlights a dilemma.
Millions of people have been infected and thousands die every day; now is not the time to worry about [privacy, informed consent, precise targeting, insert your own], runs the argument driven by the urgency of this response. But we also hear about the need to build back better: an urgent need to focus on trust building over functionality, and on avoiding the normalisation of digital surveillance over and above individual rights to privacy and protection.
Many countries have adopted or are considering tracing apps of various sorts. The discussion is framed in the language of tracking the virus, but to track the virus you need to track people. CVA is no different. To monitor how the cash or voucher was used, we track people.
The surveillance issue is all about people, privacy and power. Is tracking people to monitor a project an infringement of their privacy? That is the subject of long, complex debates and has different legal implications depending on the jurisdiction you are in, some of which are subject to rapid change. Where the power lies is easier to agree on: unless the monitoring is "opt in", like some contact tracing apps and most fitness apps, the organisation doing the tracking holds more power than the person being tracked. The real risk of harm from misuse of a tracked individual's data should be weighed against how easy tracking becomes when we go digital and remote.
Some of the early adaptations we hear about in light of COVID-19 – such as disabling the use of biometrics for verification in Bangladesh and other contexts due to transmission risks, or innovative uses of cardless ATM withdrawals in Ecuador – are good news from a privacy and individual data protection angle, even if the primary driver for these adaptations is the speed (and safety) of the response, not the privacy implications of the technology used. But we need to do more. If it is possible to continue effective provision of aid without organisations retaining control of sensitive individual data now, could this COVID-19 triggered practice become the new norm?
If privacy and power do not provide enough motivation to build back better, should we focus on another "P": portability? Data portability would protect organisations from vendor lock-in and would protect those we seek to serve from having their data locked into a single organisation. It would mean that affected people's data, an extension of their physical person, can move with them; they can choose who to share it with in order to access an organisation's services. It is a different framing of the ideas of being "people centric", of accountability, of keeping the "customer" at the centre, one that focuses more on the services we can provide, putting the power and the choice back into recipients' hands. If the banks can do "open banking", then we can do "open aid".
This idea itself is not new – but what is new is the opportunity presented by the COVID-19 response to think about long-term implications of going remote and digital in CVA. Let us evaluate, as we go, which of our data management practices are best left in the pre-COVID-19 era and which we should collectively re-engage with.
The choice is ours.