Pointing fingers solves nothing

To build organizations with cultures that reinforce security, we need to turn from awareness training to a holistic approach that takes human performance into account. In this post, we look at performance shaping factors as part of the root cause of poor security decisions, and suggest four key delivery domains for improved cybersecurity performance in organizations: leadership, integrating security in work processes, getting help when needed, and finally, delivering training and content.

This is a blog post about what most people call “security awareness”. This term is terrible; being aware that security exists doesn’t really help much. I’ve called it “pointing fingers solves nothing” because a lot of what we do to build security awareness has little to no effect. Sometimes the activities we introduce can even make us more likely to get hacked!

We want organizational cultures that make us less vulnerable to cyber threats. Phishing your own employees and forcing them to click through e-learning modules about hovering over links in e-mails will not give us that.

What do we actually want to achieve?

Cybersecurity has to support the business in reaching its goals. All companies have a purpose; there is a reason they exist. Why should people working at a cinema care about cybersecurity, for example? Let us start with a hypothetical statement of why you started a cinema:

We love film. We want everyone to be able to come here and experience the magic of the big screen, the smell of popcorn, and the feeling that this is the only world that exists.

Mr. Moon (Movie Theater Entrepreneur)

What does the desire to share the love of films have to do with cybersecurity? Everything!

Running a cinema exposes you to a lot of business risks. Because of all the connected technologies we use to run our businesses, a cyber attack can disturb almost any business, including a cinema. It could stop ticket sales, and the ability to check tickets. It could cost so much money that the cinema goes bankrupt, for example through ransomware. It could lead to liability issues if a personal data breach occurs and the data was not protected as required by law. In other words: there are many reasons for cinema entrepreneurs to care about cybersecurity!

An “awareness program” should make the cinema more resilient to cyber attacks. We want to reach a state where the following would be true:

  • We know how to integrate security into our work
  • We know how information security helps us deliver on our true purpose
  • We know how to get help with security when we need it

Knowing when and how to get help is a key cybersecurity capability

Design principles for awareness programs

We have concluded that we want security to be a natural part of how we work, and that people should be motivated to follow the expected practices. We also know from research that the reason people click on a phishing e-mail or postpone updating their smartphone is not a lack of knowledge, but rather a lack of motivation to prioritize security over short-term productivity. There can be many reasons for this, ranging from lack of situational awareness to stress and lack of time.

From human factors engineering, we know that our performance at work depends on many factors. Some of these factors can significantly degrade our capability to make the right decisions, even when we have the knowledge required to make them. According to the SPAR-H methodology for human reliability analysis, the following performance shaping factors (PSFs) can greatly influence our ability to make good decisions:

  • Available time
  • Stress/stressors
  • Task complexity
  • Experience and training
  • Procedures
  • Human-machine interface
  • Fitness for duty
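As a side note, SPAR-H combines these factors quantitatively: a nominal human error probability is multiplied by a multiplier for each PSF, with a correction applied when several PSFs degrade performance at once. The sketch below illustrates the idea only; the multiplier values are invented for the example, not taken from the official SPAR-H tables.

```python
# Illustrative sketch of the SPAR-H idea: a nominal human error
# probability (NHEP) is scaled by PSF multipliers. The multiplier
# values below are invented for illustration -- consult the SPAR-H
# documentation for the real tables.

NHEP_DIAGNOSIS = 0.01  # nominal error probability for a diagnosis task


def adjusted_hep(nhep, multipliers):
    """Combine PSF multipliers with the nominal error probability.

    When three or more PSFs degrade performance, SPAR-H applies a
    correction so the result stays a valid probability (<= 1).
    """
    composite = 1.0
    for m in multipliers.values():
        composite *= m
    degrading = sum(1 for m in multipliers.values() if m > 1)
    if degrading >= 3:
        return (nhep * composite) / (nhep * (composite - 1) + 1)
    return min(nhep * composite, 1.0)


# A stressed employee with barely adequate time and a complex task:
psfs = {"available_time": 10, "stress": 2, "complexity": 2}
print(round(adjusted_hep(NHEP_DIAGNOSIS, psfs), 3))
```

The point of the sketch is the shape of the model, not the numbers: even modest degradations in time, stress and complexity multiply together, which is why knowledge alone does not guarantee good security decisions.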

It is thus clear that telling people to avoid clicking suspicious links in e-mails from strangers will not be enough to improve the cybersecurity performance of the organization. If we want our program to actually make our organization less likely to see severe consequences from cyber attacks, we need to do more. To guide us in making such a program, I suggest the following 7 design principles for building security cultures:

  1. Management must show that security is a priority
  2. Motivation before knowledge
  3. Policies are available and understandable
  4. Culture optimizing for human reliability
  5. Do’s before don’ts
  6. Trust your own paranoia – report suspicious observations
  7. Talk the walk – keep security on the agenda

Based on these principles, we collect activities into four delivery domains for cybersecurity awareness:

  1. Leadership
  2. Work integration
  3. Access to help
  4. Training and content

The traditional “awareness” practices we all know, such as threat briefs, e-learning and simulated phishing campaigns, fit into the fourth domain. Those activities can help us build cyber resilience, but they depend on the three other domains supporting the training content.

Delivery domain 1 – Leadership

Leaders play a very important role in building a security-aware organizational culture. The most important part of their responsibility is to motivate people to follow security practices. When leaders respect security policies, and make this visible, it inspires and nudges others to follow those practices too. Leaders should also share how security helps support the purpose of the organization. Sharing the vision is perhaps the most important internally facing job of senior management, and connecting security to that vision is an important part of the job. Without security, the vision is much more likely to never materialize; it will remain a dream.

Further, leaders should draw on relevant security stories to drive motivation for good practice. When a competitor is hit with ransomware, the leader should draw focus to it internally. When the organization was subject to a targeted attack, but the attack never managed to cause any harm thanks to good security controls, that is also worth sharing: the security work we do every day is what allows us to keep delivering services and products to customers.

The leadership wheel; building motivation for security is a continuous process

Delivery domain 2 – work integration

Integrating security practices into how we deliver work is perhaps the most important deliberate action an organization can take. The key tool we need to make this a reality is threat modeling. We draw up the business process in a flowchart, and then start to think like an attacker: how could cyber attacks disturb or exploit our business process? Then we build the necessary controls into the process. Finally, we need to monitor whether the security controls are working as intended, and improve where we see gaps. This way, security moves from something we focus on whenever we read about ransomware in the news, to something we do every day as part of our normal jobs.

Let’s take an example. At our cinema, a key business process is selling tickets to our movies. We operate in an old-fashioned way, and the only way to buy tickets is to go to the ticket booth at the entrance of the cinema and buy your ticket.

How can cyber attacks disturb ticket sales over the counter?

Let’s outline what is needed to buy a ticket:

  • A computer connected to a database showing available tickets
  • Network to send confirmation of ticket purchase to the buyer
  • Printer to print paper tickets
  • A payment solution to accept credit card payments, and perhaps also cash

There are many cyber attacks that could create problems here: a ransomware attack removing the ability to operate the ticket inventory, for example, or a DDoS attack stopping the system from sending ticket confirmations. And if the computer used by the seller is also used for other things, such as e-mail and internet browsing, there are even more possibilities for attack. We can integrate some security controls into this process:

  • Use only a hardened computer for the ticket sales
  • Set up ticket inventory systems that are less vulnerable to common attacks, e.g. use a software-as-a-service solution with good security. Choosing software tools with good security posture is always a good idea.
  • Provide training to the sales personnel on common threats that could affect ticket sales, including phishing, avoiding shadow IT, and how to report potential security incidents
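The threat-modeling walk-through above can be captured in a very small data model: process steps, the threats we brainstorm against each step, and the control (if any) we attach to each threat. Everything below is a hypothetical illustration of the cinema example, not a real tool; the names and threats are invented.

```python
from dataclasses import dataclass, field


@dataclass
class Step:
    """One step in a business process, with brainstormed threats.

    Each threat is a (description, control) pair; control is None
    until we have decided how to mitigate the threat.
    """
    name: str
    threats: list = field(default_factory=list)

    def uncovered(self):
        # Threats that still have no control mapped to them.
        return [t for t, control in self.threats if control is None]


# Hypothetical model of the ticket-sales process described above.
ticket_sale = Step("Sell ticket at the booth")
ticket_sale.threats = [
    ("Ransomware locks the ticket inventory", "Hardened, single-purpose PC"),
    ("Phishing e-mail on the sales computer", "Staff training + reporting"),
    ("DDoS stops ticket confirmations", None),  # gap -> needs a control
]

for threat, control in ticket_sale.threats:
    print(f"{threat}: {control or 'GAP - needs a control'}")
```

Walking every business process through a structure like this makes the monitoring step concrete: the `uncovered` list is exactly the gap list we need to close and re-check over time.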

By going through every business process like this, and looking at how we can improve the cybersecurity for each process, we help make security a part of the process, a part of how we do business. And as we know, consistency beats bursts of effort, every time.

Consistency beats motivational bursts every time. Make security a part of how we do work every day, and focus on continuous improvement. That’s how we beat the bad guys, again and again.

Delivery domain 3 – access to help

Delivery domain 3 is about access to help. We don’t build security alone; we do it together. There are two different types of help you need to make available:

  • Help to prepare, so that our workflows and our knowledge are good enough. Software developers may need help from security specialists to develop threat models or improve architectures. IT departments may need help designing and setting up security tools to detect and stop attacks. These are things we do before we are attacked; they reduce the probability of a successful attack, and help us manage attacks when they happen.
  • Help during an active attack. We need to know who to call to get help kicking the cyber adversaries out and reestablishing our business capabilities.

You may have the necessary competence in your organization to both build solid security architectures (help type 1) and to respond to incidents (help type 2). If not, you may want to hire consultants to help you design the required security controls. You may also want to contract with a service provider that offers managed detection and response, where the service provider will take care of monitoring your systems and responding to attacks. You could also sign up for an incident response retainer; then you have an on-call team you can call when the cyber villains are inside your systems and causing harm.

Delivery domain 4 – training and content

Our final domain is where the content lives. This is where you provide e-learning, run phishing simulations, and write blog posts.

About 50% of the effort spent on the “knowledge part” of awareness training should be focused on baseline security: the security aspects that everyone in the organization needs to know. Some typical examples of useful topics include the following:

  • Social engineering and phishing: typical social engineering attacks and how to avoid getting tricked
  • Policies and requirements: what are the rules and requirements we need to follow?
  • Reporting and getting help: how do we report a security incident, and what happens then?
  • Threats and key controls: why do we have the controls we do and how do they help us stop attacks?
  • Shadow IT: why we should only use approved tools and systems

Simulated phishing attacks are commonly used as part of training. Their effect is questionable if done the way most organizations do them: send out a collection of phishing e-mails and track who clicks them or provides credentials on a fake login page. Everyone can be tricked if the attack is credible enough, and this can quickly turn into a blame game that erodes trust in the organization.

Simulated phishing can be effective for providing more practical insight into how social engineering works. In other words, if it is used as part of training, and not primarily as a measurement, it can be good. It is important to avoid “pointing fingers”, and to remember that our ability to make good decisions is shaped less by knowledge than by performance shaping factors. If you see that too many people are falling for phishing campaigns, consider what the cause could be.

E-learning can be a good way to provide content to a large population, and to manage the fact that people join and leave organizations all the time. E-learning content should be easy to consume, and delivered in small enough chunks to avoid becoming a drain on people’s time.

In addition to the baseline training we have discussed here, people who are likely to be targeted with specific attacks, or whose jobs increase the chance of severe consequences from cyber attacks, should get specific training relevant to their roles. For example, workers in the finance department with authority to pay invoices should get training in spotting fake invoices and typical fraud schemes related to business payments.

The last part should close the circle by helping management provide motivation for security. Are there recent incidents managers should know about? Managers should also get security metrics that provide insight into the performance of the organization, both for communication to the people in the organization, and to know whether the resources they are investing in security are actually bringing the desired benefit.

tl;dr – key takeaways for security awareness pushers

The most important takeaway from this post is that people’s performance when making security decisions is shaped both by knowledge and by performance shaping factors. Building a strong security culture should optimize for good security decisions. This means we need to take knowledge, leadership, and the working environment into account. We have suggested 7 design principles to help build awareness programs that work. The principles are:

  1. Management must show that security is a priority
  2. Motivation before knowledge
  3. Policies are available and understandable
  4. Culture optimizing for human reliability
  5. Do’s before don’ts
  6. Trust your own paranoia – report suspicious observations
  7. Talk the walk – keep security on the agenda

Based on these principles, we suggested that awareness programs consider four delivery domains: Leadership, Work Integration, Access to Help, and Training & Content.

Can cybersecurity culture be measured, and how can it drive national policy?

Background

NorSIS has studied what they term cybersecurity culture in Norway. The purpose of their study has been to help design effective cybersecurity practices and to understand what security regulations Norwegians will typically accept.

The study sets out to measure culture, a concept that does not easily lend itself to quantification or simple KPIs. The attempt is based on a survey sent to a group of people representative of the Norwegian population.

The key insights sought by the study are summarized in 4 research questions:

  1. What characterizes the Norwegian cybersecurity culture?
  2. To what degree does cybersecurity education influence behaviors and awareness?
  3. How do Norwegians relate and react to cyber risks?
  4. To what degree do individuals take responsibility for the safety and security of cyberspace?

 

Thanks to Bjarte Malmedal for sending me a nice hardcopy of the report he wrote with Hanne Eggen Røislien – you should follow him on Twitter for insightful security discussions!

 

The cultural dimension

NorSIS does not fall into the trap of reducing culture to behaviors alone, but attempts to treat the cultural dimension as a set of norms, beliefs and practices influenced in various ways. They define 8 core issues that influence the cybercultural fabric of society:

  • Collectivism
  • Governance and control
  • Trust
  • Risk perception
  • Techno-optimism and digitalization
  • Competence
  • Interest
  • Behaviors

The discussion of these core issues that follows is sensible and logical. The authors then summarize some results from their questionnaires, mapping answers to the 8 core issues. For example, they report that only 18% of the respondents say they have little interest in IT and technology.
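A side note on the survey numbers: percentages like the 18% above are point estimates from a sample, and when trying to quantify something as soft as culture it helps to attach a margin of error. The sketch below computes a Wilson score interval; the sample size is assumed purely for illustration, as the report’s actual number of respondents may differ.

```python
import math


def wilson_interval(p_hat, n, z=1.96):
    """Approximate 95% Wilson score interval for a survey proportion.

    p_hat: observed proportion (e.g. 0.18 for 18%)
    n:     number of respondents
    z:     normal quantile (1.96 for a 95% interval)
    """
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(
        p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)
    )
    return centre - half, centre + half


# 18% reporting little interest in IT, with an assumed n = 1500:
low, high = wilson_interval(0.18, 1500)
print(f"18% -> 95% CI roughly [{low:.1%}, {high:.1%}]")
```

With a sample in the low thousands the uncertainty is only a couple of percentage points either way, so the headline figures are meaningful even if the exact decimals are not.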

Competence and learning

Surprisingly, the report states that 59% of respondents have received cybersecurity training at some point in the last 2 years (without specifying further what this entails). They also look into how people prefer to learn about security.

The authors take the perspective that many children are not receiving the cybersecurity guidance they need, because only about half of the adult population has received cybersecurity training.

The report also states that training is unlikely to relate the security of cyberspace as a whole to the security of individual devices.

Risk perception

A key finding in the report is that 7 out of 10 respondents think they expose themselves to threats online. They associate this risk exposure with external factors rather than their own actions. Further, 6 out of 10 people feel confident about their own ability to identify what is and isn’t safe to do online.

The highest fear factors are found in online banking and in using online government services. This is perhaps because it is during these activities that users interact with their most sensitive data.

Behavioral patterns

Most people report that they think about how safe a website is before using it; only 18% say they don’t think about this. The ability to actually assess this most likely varies, yet 61% report feeling competent to do such assessments.

Another interesting finding is that people report deliberately breaking security rules at work: 14% in the private sector and 8% in the public sector, with men reporting this more often than women.

Risk-taking behaviors should be expected in any large group of people, and the self-reported numbers are reasonable when compared to other studies about motivation and willingness to follow corporate norms.

Study conclusions

The report draws some main conclusions based on the data gathered. One is about education, where the authors feel confident that positive security behaviors correlate with security education. They argue that it should be a government responsibility to educate the population about security, e.g. by making it part of the school curriculum.

Regarding the surveillance–privacy tension in cybersecurity governance, the authors conclude that people mostly support giving the police the authority and tools to fight cybercrime, but they do not believe they will get any help by going to the police. Only 13% of victims of cybercrime file a police report.

They further propose policies for government action: primarily strengthening security education in the school system, and giving law enforcement more tools to fight cybercrime.

My thoughts on this

This report is an interesting piece of work, in many aspects confirming with data the assumptions security professionals tend to make about people in general, and about the “typical user”.

The research questions asked at the outset of the report are perhaps implicitly answered through data and interpretations of those data. I will try to add my own impressions, based on the report and on my personal experience from the corporate world.

What characterizes the Norwegian cybersecurity culture?

Norwegians are tech savvy – in the sense that they use technology. The report indicates that a lot of people are confident about their own use of technology, and most people believe they can assess what is safe and not safe to do online. When the report drills down into some behavioral aspects, there are issues that may paint a somewhat different picture.

  • People still reuse the same password across many services, although many report sounder practices. It is not unlikely that this self-reporting is skewed because people answer what they know they should be doing, instead of what they are actually doing.
  • People feel at risk when using online services, but still most people do not back up their data more often than once a month, 15% report they never back up data, and 10% say they don’t even know. If the “correct answer” bias is affecting the results here, the situation is likely worse in practice. Think about the question “how often do you check the oil in your car?”. Most people would like to say they do this regularly, like every month – but we all know that is not true.

The question actually asked about backup was how often people back up data that is important to them. I suspect a lot of people have never thought about what data is important. Is it the pictures of the grandchildren? Is it your financial documents, insurance papers, etc.? Is it the recipe collection you keep in Microsoft OneNote? Most people will never have thought about this. A lot of people also believe nothing bad can happen as long as they store their files in the cloud. Beliefs are thus often formed without the competence needed to make informed decisions about value and risk.

My conclusion is that Norwegians feel quite confident about their own security practices, without necessarily having very good practices. Overconfidence is often a sign of insufficient know-how, which is probably the case for the population as a whole.

To what degree does cybersecurity education influence behaviors and awareness?

The effectiveness of cybersecurity education is a big area of debate, especially in the corporate world, and it has also been discussed at length in academia. You can read my take on awareness training and when it actually works here: https://safecontrols.blog/2017/02/16/when-does-cybersecurity-awareness-training-actually-work/

Awareness training is often about practices – knowing what to do. Then comes motivation and habituation: how do you turn theory into practice, how do you turn a conscious effort into habit and second nature? I think two important things are at play here that we tend to underestimate: building on a feeling of responsibility for the collective good (which is also one of the 8 core issues of cybersecurity culture as defined in the NorSIS report), and creating skills that lower the effort barrier for secure practices. People who feel the use of IT is difficult are unlikely to change their existing habits before the “difficulty barrier” has been reduced.

This is where schools can play a role, as NorSIS suggests – but that is also a major challenge given the current state of affairs, at least in Norwegian schools. I have been running an after-school coding activity for elementary school pupils for a couple of years (mostly based on Scratch, and some Python). What is very visible in those sessions is that children’s technical know-how correlates to a large degree with their socio-economic backgrounds. Many teachers also lack the know-how, and perhaps the interest, to be an equalizing factor when it comes to technology, although political efforts do exist to make technology a more central topic in schools. In this regard Norway currently lags behind similar nations, like Sweden or the United Kingdom, where IT plays a bigger and more fundamental role in education.

How do Norwegians relate and react to cyber risks?

People worry about cyber risks, and they worry more the older they get. Another interesting aspect is that people are worried about being subject to online credit card fraud, whereas using debit or credit cards online is one of the behaviors with the lowest perceived risk scores in the study. Further, using online banking is seen as a low-risk activity – which correlates well with banks being seen as “secure”.

Ironically, “using email” is perceived as only slightly more risky than using online banking – in spite of social engineering through e-mail having been the primary initial attack vector for 30 years, and still going strong.

The authors also conclude that having received cybersecurity education does not necessarily change how people perceive online risks, and that this is at odds with how many security professionals view the effects of awareness training. This does not come as a surprise: changing feelings by transferring facts is not likely to be a good strategy, and risk perception at the personal level is typically based on feelings, as the report also correctly states. Changing risk perception requires continuity, leadership and the challenging of assumptions among peers – it requires the evolution of culture, and that is a slow beast to move. Training is only one of many levers to pull.

To what degree do individuals take responsibility for the safety and security of cyberspace?

Creating botnets would be really hard if all devices were patched and hardened, and all users careful to avoid social engineering schemes. This is not something most people think about when they dismiss the prompt to update their iOS version for the n’th time.

Most people probably don’t realize that it is the collective security of all connected devices combined that makes up the security landscape of the internet as a whole. Further, it is easy to fall into the thinking trap that “there are so many computers that my actions have no impact” – more or less like the “my vote doesn’t count” attitude among voters who stay at home on election day.

NorSIS sees education as a possible medicine, and that is definitely part of the story. Perhaps that educational content should be distributed across many different curricula – languages, social sciences, IT, mathematics – to help form consensus about why individual actions count for the safety of the many.

Summary of the summary

  • The NorSIS report on Norwegian cybersecurity culture is an ambitious project trying to highlight how society as a whole deals with security practices, beliefs, education and perceptions
  • The report indicates that interest and motivation are key drivers of positive security behaviors, and of know-how
  • There is an indication that education works in driving good behaviors. Security training seems to be less effective in changing risk perception. This should not be surprising given what we know about change processes in corporate environments: transfer of know-how is not enough to change attitudes and norms.
  • There is a clear recommendation to increase security competence through the educational system. This seems well-founded and is something all nations should consider.