Debi Ashenden attributes her career move into cyber security to a misunderstanding at a job interview.
At the time, she was looking for somewhere to finish the dissertation for her master’s degree in computer science while gaining some practical credibility, which she felt was lacking from a CV that included a first degree in English literature and a master’s in Victorian literature. At the job interview, “I said I was interested in how people get access to information,” Ashenden explains. She had been working as a community development officer, helping young people find information about travel, education, and jobs; what she meant was opening up access. The interviewer, who was considering her for a placement at what was then the Defence Evaluation and Research Agency – now QinetiQ – interpreted it as the opposite: how to prevent people from getting access to information. Seeing a kindred spirit, he replied, “So are we.” Oops.
Still, it not only worked out, but led to a lengthy career. At the time, DERA had just learned it was going to be privatised. As a result, the organisation wanted someone who could bridge the gap between its “deep techies” and the outside world of consultants and private-sector partners the newly spun-off organisation would have to work with. “They wanted a translator,” she says, “and with my background I seemed to fit that profile.” This has been the thread tying together her whole career: taking knowledge and understanding from one community and introducing it to another.
“It’s particularly satisfying to bridge communities,” she says.
Such near-accidental beginnings are common among those who have 20-plus-year careers in cyber security for the obvious reason that the field is so new as a recognised discipline. At DERA, Ashenden worked alongside researchers in electronic warfare; in-house arguments were ongoing about whether cyber security was a subfield within electronic warfare – or vice versa.
Commercialisation cost DERA its university-like qualities, and while Ashenden enjoyed moving into consultancy and working with banks and insurance houses, she wanted to do more research. She accepted a job offer from Cranfield University at the UK Defence Academy and, because of her prior experience in a defence establishment, found it comfortable. There, she did some teaching – “Military students are so enthusiastic and questioning” – and met RISCS director Angela Sasse for the first time. Sasse had funding from what was then the Department of Trade and Industry to write a report on Human Vulnerabilities in Security Systems. That project gathered together a group of researchers and introduced Ashenden to RISCS deputy director Lizzie Coles-Kemp (Royal Holloway). “We realised we had a lot in common in the way we got into academia and the way that we think about cyber security.” Shortly afterwards, EPSRC funded a “sandpit”, and both Coles-Kemp and Ashenden were accepted. That work led to the VOME project.
“That was a fascinating experience,” Ashenden says, “because on paper neither of us had the track record to win something that size. We didn’t know what we shouldn’t do, so we kept doing things until someone said we couldn’t.”
The project gave Ashenden the opportunity to find her feet. VOME involved working with young people to make participatory videos; for example, a youth group in Newham made a music video about online identity, how they saw it, and how they valued it. The project also made a trading-card game around privacy. “It grounded where I saw my research fitting, and enabled me to know where I didn’t fit,” she says. She enjoyed community-based research, though she was equally keen to help security practitioners do their work better.
And then HMRC lost two CDs containing data on all the households receiving child benefit in the UK. This 2007 incident prompted a sea change: it was many years before security practitioners stopped citing it as a poster child for the lack of security awareness. Clearly, the practice of security had to change, but few believed it was really possible. Because Ashenden had a good network from the work she’d already done, she and Sasse teamed up on the paper CISOs and Organisational Culture: Their Own Worst Enemy (PDF). This research found that CISOs didn’t believe they had the skills to effect the necessary cultural change. As a result, rather than going out and engaging with employees, many security practitioners preferred to buy security awareness training packages.
This led to work on another paper, Can We Sell Security Like Soap? A New Approach to Behaviour Change, with Darren Lawrence (Cranfield University). This work sought to discover whether the technique known as “social marketing”, widely used in health care campaigns, could be applied to cyber security. The technique, Ashenden says, “has a nice framework that’s easy to step through with people who are not social or behavioural scientists.” A key difference from health care, however, is that in cyber security it’s difficult to identify non-divisible actions, and even when you can, the actions you’ve identified may not be significant enough to warrant a behaviour change programme.
Currently, through the ESRC CREST project, Ashenden and Lawrence have been working to improve the relationship between security practitioners and the rest of their organisation. This effort has led to the Security Dialogues (PDF). Ashenden wanted to include both security practitioners and software developers in the workshops she created for this work, but “I couldn’t get the developers to attend. They didn’t see the point.” The researchers sought to create a “safe space” where the participants would be comfortable; it helped that Ashenden had worked with security practitioners for a long time and knew about the problems they faced. Ashenden continues to work closely with Lawrence.
More recently, software developers have begun to express interest: as organisations move to cloud infrastructures, they also push towards continuous integration and continuous delivery. Speeding up cycle times means that security has to be included throughout. Among the newly interesting topics are what secure code is, what secure development looks like, the risk perceptions of software developers, and the practice of code review. Ashenden is now exploring these topics through the cSALSA project.
In 2016, Ashenden moved to the University of Portsmouth, her first time working at a mainstream university, since Cranfield is defence-based and entirely postgraduate. Going forward, Ashenden says, “I continue to be interested in the notion of how to support security practitioners and developers. I’m also keen on the idea of building better dialogues and engagements, and, increasingly, on finding ways to foster security dialogues with those working in AI and machine learning.”