In the last article about Data Protection Impact Assessments (DPIAs), we explored what a DPIA should include, based on the requirements laid down in the (UK) GDPR. Here we’ll discuss how to identify data-related risks using your imagination.
In brief, a DPIA should cover:
(a) a description of the planned processing and its purpose(s);
(b) an assessment of the necessity and proportionality of the processing in relation to the purposes;
(c) an assessment of the risks to the rights and freedoms of the data subjects; and
(d) the measures envisaged to address the risks.
To carry out part (c), the risk assessment, most effectively, I believe we need to add a magic ingredient – a little imagination!
We all have biases and naturally see the world through our own lens, which can mean we miss the potential impacts of a project or process on people with different lenses, shaped by different life experiences and situations.
So how do we overcome those biases and see the processing through other lenses?
By being contrary, by being awkward, argumentative even!
Play devil’s advocate to yourself, and think differently; think about the processing through the eyes of someone whose characteristics and life experiences are different to your own.
How to challenge your thinking to help identify risks
Here are three ways to challenge your thinking and unearth risks that might not occur to you immediately.
1. Try some role-playing and imagine how people with different characteristics might be impacted by the planned processing, how they might perceive it, or even how they could misuse it:
If you’re a man, could a woman feel differently about the processing, or experience a different impact due to her gender?
If you’re white British, how might people of different ethnicities react to the processing?
If you’re near retirement age, how might it feel different for a recent school-leaver?
If you’re cisgender, how could a transgender person feel about the processing?
And don’t forget to consider how someone who doesn’t have the same good intentions as you might be able to misuse certain data. How might the data you want to collect be of value to a stalker, a fraudster, a blackmailer, anyone with ulterior motives?
(Did Apple really not realise AirTags could be abused by stalkers, or did they choose to overlook that risk…?)
2. Another way to identify potential impacts is to think about different life experiences and how they could affect a person’s view of the processing:
Pregnancy is an example I come back to a lot, because for many people it’s a reason to celebrate, but for others it’s linked to fear, anxiety, illness or worse. Imagine you’ve had several pregnancy losses previously, or the pregnancy is unplanned and unwanted. You wouldn’t be as keen to disclose your pregnancy as those for whom it’s a happy event.
This type of thought experiment can be adapted to any topic or life experience. Whatever your experience, or the “norm”, might be, try thinking about the opposite experience:
Data about children – hopefully you were lucky enough to have kind and loving parents, but consider if a child’s parents are the opposite, and don’t have their child’s best interests at heart.
Data about health, ethnicity, religion, sexuality – perhaps you’re happy to discuss these topics openly, but think about the impact that processing these types of data could have on someone who has a hidden disability, or a mental health diagnosis which is often misunderstood, someone who has experienced racial or religious discrimination, or who has reasons for wanting to keep their sexuality private.
Even if your planned processing is designed to have a positive impact, it might not be perceived that way by people who have previously experienced harassment or discrimination because of their personal circumstances. They may have, understandably, lost trust in organisations wanting to collect or process information about their lives.
3. Last but not least, take a step back and consider whether you are relying on stereotypes or sweeping generalisations in your thinking about the processing:
It can be tempting to rely on simple generalisations when thinking about a group of people, such as tenants or customers, people over or under a certain age, men or women.
Look at your thinking with a critical eye. If you find yourself thinking along the lines of “all [insert the type of person here] need this communication because they have trouble with [issue you are hoping to help with]”, then challenge yourself. Is that really true of ALL of those people, or are you risking upsetting some people by applying such a generalisation?
Let me know how you get on!
It can be uncomfortable, but it’s definitely worthwhile, and eye-opening, to unleash your imagination in this way. Once you’ve identified the types of scenarios that could play out, you can move on to comparing them to your organisation’s risk appetite, and then proposing and agreeing appropriate control measures.
I’d love to hear your experiences and thoughts!
Clare Paterson, CP Data Protection Director
Clare draws on over 20 years of experience in risk management and quality assurance, including ten years in data protection, to provide clear and practical advice and training.
Don’t tell everyone (shh!) but Clare’s favourite sector is social housing, having worked in a large housing association for 12 years, although she loves to support all values-led organisations.
If you have any questions about data protection, either about Data Protection Impact Assessments (DPIAs) or anything else related to personal data, book a free call!