The third and final RISCS Community Meeting of 2018 took place at London’s Friends House on Thursday 18th October. The day covered two themes: ‘Incentives in Cyber Security’ and ‘Realising impact from research’.

Incentives: Motivating Cyber Security

The aim of the first session was to explore the spectrum of levers available to influence better cyber security behaviours. A series of invited talks from cyber security experts across government, industry and academia provided a solid foundation from which to explore this complex space further. Click on a speaker below to view their presentation.

Emma Green, Head of the Cyber Security Incentives and Regulation Team, DCMS
Will Jones, Senior Lecturer in International Relations at the School of Politics and Philosophy, RHUL
Gareth Wharton, Cyber CEO, Hiscox Insurance
Vicki Gavin, DPO and Head of Information Security Assurance, Northview Group

Based on his work with refugees, Will Jones, a researcher at Royal Holloway, discussed market design, a form of economic engineering that to the best of his knowledge has never been used in cyber security. In his experience, the most efficient way to help refugees is to ask them what they want and need, and to respect their choices. It’s best to begin with their actual market behaviour and design your service to suit; a design that works with their priorities will inevitably be more effective.

Will defined a “market” as any situation where agents interact with reference to a finite resource. Markets rely on institutions; market design begins with what people or institutions want and designs a market to produce that effect. Goals may include volume (as many transactions as possible), stability, fairness, efficiency, or safety. Among the examples Will provided was the live kidney exchange developed by Stanford professor Alvin Roth, who won the Nobel prize in economics for the idea: kidneys from willing donors who do not match the relatives they want to help are traded to other families in the same situation, and an element of money ensures that everyone can afford the treatment. The exchange creates giant chains of trades that would be impossible for individuals to calculate, and has enabled more than 2,000 transplants in less than a decade within the US alone. It is now expanding internationally.
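
To make the idea concrete, the chains Will described can be thought of as cycles in a directed compatibility graph, where an edge from pair A to pair B means A’s donor can donate to B’s patient. Below is a minimal, purely illustrative sketch in Python; the pairs and compatibility data are hypothetical, and real exchanges such as Roth’s use optimisation over much longer cycles and altruist-initiated chains.

```python
# Toy kidney-exchange matcher: finds disjoint donation cycles in a
# compatibility graph. Purely illustrative; real exchanges use integer
# programming over long cycles and chains.
from itertools import permutations

# Hypothetical incompatible patient-donor pairs.
# compatible[a] is the set of pairs whose patient a's donor can give to.
compatible = {
    "A": {"B"},
    "B": {"A", "C"},
    "C": {"D"},
    "D": {"C"},
}

def find_cycles(compatible, max_len=3):
    """Greedily select disjoint donation cycles of pairs, shortest first."""
    matched, cycles = set(), []
    for length in range(2, max_len + 1):
        for cycle in permutations(compatible, length):
            if any(pair in matched for pair in cycle):
                continue  # each pair can trade only once
            # Every donor must match the next patient around the cycle.
            if all(cycle[(i + 1) % length] in compatible[cycle[i]]
                   for i in range(length)):
                cycles.append(cycle)
                matched.update(cycle)
    return cycles

print(find_cycles(compatible))  # [('A', 'B'), ('C', 'D')]
```

Even this toy version shows why a central clearinghouse matters: no individual pair could discover the ('C', 'D') trade on its own, and at realistic scale the search has to be done algorithmically.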

Gareth Wharton, Cyber CEO of Hiscox Insurance, runs a line of business that covers hacking, ransomware, extortion, and other types of attack. The many uncertainties surrounding cyber security make this type of insurance difficult to price. Potential customers often think they don’t need it because they won’t be targeted, because they don’t hold valuable or personal information, or because claims won’t be paid. Hiscox is trying to offer clients training to help them defend themselves against the biggest attacks. The service is tailored: if people score 80% or more on a diagnostic test, they can skip that module. The trainers all recommended using a system of incentives. Hiscox considered several options, such as Spotify vouchers or anti-virus software, and finally decided that a discount made the most sense: companies where 80% of staff have completed the training gain a drop in the excess on the policy.
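
As a rough sketch of how the two thresholds interact: the 80% figures come from the talk, but the module names, excess amounts, and function names below are hypothetical.

```python
# Illustrative sketch of the two 80% thresholds Wharton described.
# The pass mark and completion rate come from the talk; the module
# names, figures, and function names are hypothetical.

PASS_MARK = 0.80          # diagnostic score at which a module can be skipped
COMPLETION_TARGET = 0.80  # share of staff trained to earn the excess discount

def modules_to_take(diagnostic_scores):
    """Return only the modules the employee scored under the pass mark on."""
    return [module for module, score in diagnostic_scores.items()
            if score < PASS_MARK]

def policy_excess(staff_completed, staff_total, base_excess, discounted_excess):
    """Drop the policy excess once enough staff have completed training."""
    if staff_total and staff_completed / staff_total >= COMPLETION_TARGET:
        return discounted_excess
    return base_excess

print(modules_to_take({"phishing": 0.90, "passwords": 0.60}))  # ['passwords']
print(policy_excess(85, 100, base_excess=10_000, discounted_excess=5_000))  # 5000
```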

Vicki Gavin, the data protection officer and head of information security for the Northview Group, discussed the methods she used to engage staff in cyber security training at her former job at The Economist. If, she said, you treat people as though they are “stupid users”, they will live down to that expectation. No one is born a professional in cyber security; everyone learns it, and therefore everyone *can* learn it.

Vicki believes there are five crucial elements to selling training: provide an answer to “what’s in it for me?”; pick a single focus, a small bite that suits the learner’s pace; use variety in messages and methods and appeal to all the senses; engage trainees with feedback and make them think; and be compelling – tell a story, use oral histories and some drama and whimsy. For example, she ran a raffle in which a ticket was given out for each phishing email submitted; the prize was a “security blanket” – that is, a homemade quilt. She typically ran each effort for five weeks because research shows that it takes three weeks to form a new habit; by the end of five weeks, everyone was expert at identifying phishing emails and in the habit of sending them to the security team. One of her most effective initiatives was to have the information security manager write a personal note to all staff when he himself was hacked, to show that it can happen to the best. Finally, when it was time to remind everyone of the basics, she put eight different reminders into “misfortune cookies” and distributed them to all staff around the globe. The real benefit, she said, was that it got everyone talking to each other about which fortunes they’d found and what they meant.

Panel: Understanding incentives

Following the talks, a panel session was held to inspire further discussion. We were honoured to be joined by panel members Pamela Briggs (Northumbria), Robert Carolina (Royal Holloway), Jonathan Cave (Warwick) and Evie Monnington-Taylor (Behavioural Insights Team). The panel set out to consider what we know about incentives, how well we are applying what we know, and what is missing.

Jonathan Cave, Senior Teaching Fellow in Economics, University of Warwick
Pamela Briggs, Professor and Chair in Applied Psychology, Northumbria University
Robert Carolina, Executive Director of the Institute for Cyber Security Innovation, RHUL
Evie Monnington-Taylor, Senior Advisor, Behavioural Insights Team

Panellist Rob Carolina raised the issue that individuals and organisations are influenced by different considerations. Compliance with regulations is important to businesses, but the fuzziness of those regulations leads to a problem, especially in the US: some organisations achieve “compliance” by hiring lawyers to argue that what they are doing is compliant.

“If you want to influence boards, follow the money,” he said. For that reason, the fining powers incorporated into GDPR are escalating organisations’ interest in following rules whose basic substance differs little from requirements that have been in force for 20 years under previous data protection legislation. In that sense, GDPR has had the biggest impact on security in UK industry even though security is not its focus.

In response, Evie Monnington-Taylor wondered how to build on this new interest in order to change organisations’ behaviour. If, for example, a company has suffered a breach, would it make sense for the ICO or another government agency to treat it as a teachable moment: to look for businesses with a similar profile and advise them on how to avoid falling prey to the same problem?

Monnington-Taylor noted the problem for both organisations and individuals of the lack of feedback loops and the paucity of information: neither has any real way to tell how secure they are, either on their own or in relation to others. Worse, perverse incentives are created by the fact that someone can use the same bad password across dozens of sites for three years and have nothing happen. In security, unlike some other areas, it’s difficult to monitor people’s behaviour, but punishment is useless if it’s not both swift and certain. Lacking that, the only option is to make it easy for people to comply and hope they will. Sanctions solve little when the right behaviour is so difficult to undertake.

Individuals are what most interests Pam Briggs. In a recent paper, she studied the cultural makeup of groups within organisations and found that at a certain group size members begin to care about each other, and that caring itself acts as an incentive. Because this factor has generally been overlooked in cyber security work, organisations typically adopt policies at either the organisation-wide level or the individual level, and so fail to understand behaviour at the group level.

Briggs also picked up on Carolina’s mention of compliance culture. Time pressure in many organisations means that if you measure compliance by whether security training gets completed, you quickly find that groups begin swapping answers to the test questions in order to game the training.

Jonathan Cave pointed out that discussions of incentives often – dangerously – ignore the fact that some of the most active markets are in cyber security exploits and breaches. Users at all levels are the front line of defence, the “eyes and ears of the system”. A second large gap surrounds the problem that what we tell people they need to do may not be accurate. Unlike in health care, where there are clear guidelines and a departure from them can create a natural experiment others can learn from, in security the threat is not fixed, and therefore prescribing behaviour is dangerous. In the light of this uncertainty, if we punish people unduly for failing to foresee risks that could not have been foreseen, or reward them for simply getting lucky, that will become the culture.

These points all refer to the fact that there is no expectation that the software industry will accept liability for its products. This situation will have to change as the Internet of Things continues to grow and real-world physical damage, including loss of life, becomes an increasing possibility. What will security look like if we’re looking for a flaw like Heartbleed in a device implanted in a human body, as opposed to a $500 piece of equipment distributed through the supply chain?

At this point participants raised the issue of developers and vendors. RISCS is already investigating ways to help developers through the Why Johnny Can’t Write Secure Software and Motivating Jenny projects. Companies must bear some responsibility too, however: granted, the Heartbleed bug found in OpenSSL in 2014 was a mistake made by a single person, but dozens of billion-dollar companies relied on that software without apparently ever reviewing it or thinking through the business risk of encountering such a problem.

It is essential, a commenter said, to take account of the entire ecosystem rather than focusing individually on social or technical issues.

-Wendy M. Grossman
