
Collecting and using data – coffee shop lessons!

No fingerprint ID – no coffee! How much data would you volunteer in exchange for a cuppa? And what if the stakes were higher, and your ID was needed in exchange for some much-needed humanitarian assistance? After jumping through hoops to get her daily brew, the CALP Network’s director shares a few thoughts about where the lines can, and should, be drawn.

17 November 2021 — By Karen Peachey

No fingerprint ID - no coffee!

I love a good cup of coffee. But in recent months, I’ve found that getting this delicious brew at some Nairobi coffee shops has become more complicated.

Coffee shop 1 is in an office block. At the entrance to the block there is a reception area where, to my surprise, they wanted my fingerprints and other details before I could pass the security gate. I asked why this was needed and how the data would be stored, but didn’t receive a good reply. But as I’d arranged a meeting there, I proceeded – against my better judgement – to register and go in.

Coffee shop 2 is in an office compound with a security desk where, once again, I was asked for ID information (albeit no biometrics this time) to be logged into a computer. Again, I asked what the information was for; again, there was no good reply. This time, I opted to walk away and skipped the coffee.

At Coffee shop 3, happily, there were no entry barriers so I went in. But the shop had just introduced QR codes that needed to be scanned to get an e-menu (this was not a COVID measure). I asked about data use and requested a paper menu but received neither. Happily, I knew what coffee I wanted and went ahead with my order.

I’m sure many of you have had similar experiences in different contexts. So, what does this tell us about the collection and use of personal data?

Coffee shop lessons: 

  1. In each case, I had questions but the people I asked didn’t have answers.
  2. In the first case, I ended up feeling somewhat cornered by the context and gave my data away. I have no idea how the data will be used or whether it will ever be deleted.
  3. In the second, it was a case of give your data or don’t get your coffee. I opted to walk away. Not much of a choice.
  4. In the third, I went into the shop and knew what I wanted. But if I had wanted to see the menu, I would have had to download the QR app and scan the code. What would have happened if I didn’t have a phone, wasn’t tech savvy, or my eyesight made using small screens difficult?
  5. In all cases, I have to ask: was the solution right for the problem, and was the data really needed?

These three coffee shop experiences got me thinking about how data management is a daily experience, yet, for many, issues of data protection easily end up feeling rather abstract, or driven by rules rather than understanding.

Before I go on, let me make clear – I’m not anti-tech or anti-data – I love it! Both are essential if we are to address growing humanitarian needs more effectively. But for all of us – on both personal and professional levels – there are important questions to ask and choices to be made.

What’s more, if we are in a position where we make decisions that impact vulnerable people – we have a huge responsibility to dig deep and ensure we understand the implications of those decisions both now and in the longer term.

As we reflect on the use of different technologies and the growing collection of data, we need to weigh up the pros and cons from the user perspective as well as from the perspective of organisational efficiency.

Some things to think about:


The way we think about data matters

  1. When someone first said to me ‘think about data as an extension of yourself’ – it changed my perspective fundamentally. Once we see data as part of a person, we engage with it differently and can start to shift from thinking about data as a compliance issue to thinking about it as protecting people.

More data more risk

2. We must constantly ask what data we really need and why. Every new bit of data we collect brings new risks. As such, data minimisation has to be a central tenet of responsible data management.

3. We need to understand the concept of ‘do no harm’ in the digital space. We need to examine, very practically, the risks and potential harms of collecting data and determine what level of risk is acceptable. Wherever possible, we need to explore these questions with the people whose data will be collected.

Decision making when there is no choice

4. There has been much discussion about informed consent and its limitations. Without meaningful choice, what does the notion of informed consent really mean? Going back to the conversation in the first coffee shop: “I need to scan your fingerprint.” “I’d rather not.” “Then you can’t come in.”

Pushing the coffee to one side: when we collect data, we need to determine the lawful basis under which it will be collected. And even if there is a legal basis, that doesn’t mean it’s ethical or without risk.

Weighing up user and organisational benefits

5. Many technical developments have clear organisational benefits, but the risks and benefits for users are often less clear. Just think about the coffee shop QR code example. As humanitarians, we need to work with affected communities to explore issues, examine problems and look at options before making decisions about innovations. That dialogue then needs to continue so we can understand and address emerging issues – a point highlighted by Innocent Tshilombo, who draws attention to the unintended consequences experienced when vouchers were introduced as part of the assistance package he received.

6. As we weigh up the pros and cons, we need to think about different user population groups. Are there specific barriers for people with poor eyesight, who can’t read, who struggle to use the technology, or people who are excluded from services for any other reason?

Some data hangs around

7. Many of us, humanitarians especially, think and plan in short timelines. But many of the technologies we use have long-term implications, and the data we collect today could be around for a lifetime. What seems safe today might not be tomorrow – as innovations from WhatsApp to blockchain have shown. We need to examine our safeguards more critically and ask: ‘What happens if the safeguards we have now are gone tomorrow – what’s the risk then?’

In summary

We need to use new technology to increase the reach, efficiency and effectiveness of humanitarian aid – including cash and voucher assistance. But we need to use it ethically and responsibly – capitalising on all that technology has to offer while ensuring we don’t unconsciously treat humanitarian spaces as an unregulated playground, or simply get carried away with a new gizmo that seems to offer the perfect solution.

While it can feel like an impossible task to tackle all these issues and weigh up the trade-offs that need to be made, there is excellent guidance available. The CALP Network’s Data Responsibility Toolkit, published earlier this year, outlines the legal and ethical implications that we need to think about. It offers a ‘gold standard’ that organisations can aspire to – and if you want more, it directs you to multiple references so you can dig deeper.

Finally, let’s go back to Nairobi coffee shops … what do I want?

A fabulous cup of coffee, easy access, a great customer experience and choice.  No more, no less. And I definitely don’t want my data taken when there is no good reason to do so and when I don’t know how it will be stored or used.

Anyone for a coffee?
