RISCS deputy director Lizzie Coles-Kemp has worked in infosecurity since 1990. In 2007, she began a project to look at underserved communities such as refugees, migrants, long-term unemployed, and families separated by prison, none of whose interests are typically considered when we talk about cyber security.
Coles-Kemp worked with online services to discover what these groups disclosed and their reasoning for sharing what they did. She found two conflicting themes: first, their very real need to protect themselves and their feeling that they were under attack; second, their strong need to engage and maintain social relationships, which meant that being online was itself part of their safety and security. How they managed this tension to retain their autonomy while under siege was Coles-Kemp’s line of inquiry.
In the Visualisation and Other Methods of Expression (VOME) project, Coles-Kemp worked from the ground up with communities in a participatory way, listening without judgement to stories about public service design, what “user-centric” really means, and their notions of empowerment. In this project, Coles-Kemp was “catapulted” into a situation where she had to think hard about what security was and whose security was under discussion.
In a Guardian article written in the aftermath of 9/11, the writer Salman Rushdie advocated defeating terrorism by living without fear (or at least as if we did) and by, internationally, making friends rather than enemies. Coles-Kemp finds, however, that her graduate students are more comfortable with the simplicity of C-I-A (confidentiality, integrity, availability) and the way it enables them to avoid questions of whose security is at stake and how digital security contributes to an individual’s security. Coles-Kemp’s challenge, therefore, is “how to bridge my liminal communities and the world of digital security”.
These tensions play out on a “minute-by-minute” basis in security practices that, as a system, can be perceived as hostile to these communities. For example, in the context of online welfare systems: to whose benefit is an order to do 13 job searches? Is it more secure for the unemployed person to simply follow the order, or to game the system by working with others to crowd-source job searches, making space for “fiddle work” – that is, work paid for with cash in hand, avoiding tax and record-keeping?
Coles-Kemp’s experiences have made her acutely aware of the many different forms of security and the way they conflict. With welfare benefits, for example, “security” typically means protecting the public purse from fraud, yet competing securities – such as an individual’s mental health – are also relevant. Sociologists and security theorists such as Bill McSweeney and Anthony Giddens argue that we need to broaden how we think about security. Both, for example, write of negative versus positive security – “freedom from” versus “freedom to”. This collision is apparent in Tom Sorell’s earlier talk framing philosophical theories: you can argue about contractualism and utilitarianism, but you can see Kant’s theories come up in everyday practices. What interests Coles-Kemp is the intersections between these liminal communities, the technology they engage with, and the state. Those intersections, she says, explain a lot of security behaviours.
The ethical tensions Coles-Kemp is dealing with are basic:
- What is security?
- Whose security?
- Whose order of values?
- What counts as a security issue? Is it fraud and the public purse, or my mental health and financial security?
- Who puts that forward?
- How do these different issues converge?
- For example, if large parts of the population feel they’re under attack when they’re claiming benefits, when do the interests of the state and those of individuals start to collide?
Coles-Kemp suggests that what’s needed is to reframe and restructure security issues so that they benefit people.
As people progressively achieve their various aspirations – housing, employment, better quality of life – different types of information sharing apply and different types of security come under attack. Even a couple of years ago, this approach would have been disconnected from discussions of digital security. Coles-Kemp advocates looking at these areas of security both as individual subjects and as part of a greater whole that encompasses both individual security and technical security.
Conducting field research that explores such intersections creates many tensions for researchers. First and foremost are the many stakeholders to whom the researchers are responsible, whose interests often conflict. These include:
- Participants. In the case of groups with low resilience, researcher engagement can have significant impact, and careful thought is essential to ensure that impact is positive.
- Gatekeepers who manage access to the participant community. The research conducted can reflect back on the gatekeepers, so it is important that this reflection also has positive impact.
- Institutions such as government, universities, and funders. There is always the potential for tension between researchers and these stakeholders, and it is therefore important for researchers to understand these stakeholders’ positions, enter into constructive dialogue, and understand what they regard as a return on their investment when engaging with this type of field research.
- The research community. The researchers need to feed back into the wider community and build a body of knowledge.
- Society at large. When projects are ultimately paid for through public money, assessing their societal impact is important. This impact may not be immediate but may instead be generated over the medium to long term.
- The researchers themselves and their families. Researcher safety and well-being is an important consideration for any researcher and will often influence the choice and design of field research.
Responding to, and positively contributing to, all of the above communities is a tall order, and it is essential to be pragmatic about the impact you are able to achieve. “You are never going to win,” Coles-Kemp observed. “You will always disbenefit some, but not others.”
The research Coles-Kemp undertakes is “messy”. As a result, it takes a long time to find the focus of her inquiries. However, she feels that messiness is important because it helps foster understanding, and contributes to public policy debate by identifying where the barriers and challenges lie for implementing security technologies, which interventions are best used, the strengths within the environment, and where in the architecture those are likely to fit. Her work helps identify the conditions under which a security approach is likely to be successful.
“If we had had this conversation [about the intersection between individual security and digital security] with the Department for Work and Pensions five years ago, what effect might it have had on how Universal Credit went?” Coles-Kemp asked.
Coles-Kemp’s methodology and methods reflect the need to weave ethics through study design. When she first began working with underserved communities in 2008, she began with a “theoretical and abstract” approach. Once she discovered how ill-suited these approaches were, she engaged clowns to help the communities show the researchers why they did what they did. The result was improved engagement and connections, which the researchers were then able to square with scientific principles of abstraction.
Coles-Kemp offered as an example her collaboration with Rikke Bjerg Jensen, where, together, they have studied newcomers in immigrant and migrant communities in Denmark and Sweden. Jensen and Coles-Kemp posed questions and made statements; the community replied with collages carrying a meta-narrative. This, she said, was a story of empowering the community: taking notes was not empowering, but using collages that participants had the right to access and edit allowed many stories to emerge, such as the family dynamics that make the mobile phone central to the resettlement experience.
Among the ethical considerations for that study was information from the gatekeepers that working on home rebuilding and the role of digital technology could provoke trauma. After much thought about the research theme, Jensen and Coles-Kemp moved to talking about phones, because they had observed that phones were central in every migrant or refugee setting. Phones could be discussed instrumentally, but they also provided a gateway for those who wanted to think more about how phones are used to create connections between old and new homes.
For recruitment and participation, the project followed standard procedure. Here, the university ethics process proved to be a roadblock: the form, once translated, looked utterly straightforward to the researchers, but it proved disruptive. Participants couldn’t understand its point. “They felt they were already in the room because they trusted us.” Confused, some thought the form was the actual research. “Ethical for whom?” Coles-Kemp asked. The “ethical” requirement was not designed for their participants.
Exchange is crucial. Participants have many reasons for coming to these sessions: they want to be part of something, they want language practice, they find discussing the issues interesting, they want an hour off class. Researchers have to deliver a return for participants’ time, and do so every time; return visits depend on this.
The speed of engagement, interaction, and feedback cycles has to be geared to the participants. Gatekeepers also require support.
Coles-Kemp finds many liminal organisations within industry and government that are digitally excluded because they cannot do the work they need to do; they feel they have no voice.
Coles-Kemp calls research like hers and the DAPM project, which Monica Whitty described earlier, “carrier-bag research”, referring to the science fiction writer Ursula K. Le Guin’s 1989 essay using that term to talk about narratives that don’t fit the classic “hero” narrative. Technology, Coles-Kemp said, is frequently a hero narrative: “Do we sweep in to fix the problem, or see it as a form of utility?” she asked. “If everyday security wants a carrier bag, what are the technical implications of foregrounding the superhero narrative?”
This is the question for the wider community: for whom are we doing this research?
This talk was presented at the November 24, 2017 RISCS meeting, “Ethics, Cybersecurity, and Data Science”, hosted by the Interdisciplinary Ethics Research Group at the University of Warwick.