As someone who focuses primarily on the human aspects of security and on implementing security awareness programs, I often surprise people by being neither upset nor surprised when the inevitable human failing occurs. The reason is that I have concluded that most awareness programs are simply very bad, and that, like all security countermeasures, awareness will inevitably fail on occasion.
I have to admit that it is frustrating to argue with people who claim that awareness is always bad. They contend that because there will always be some failing, an awareness program is not worth the effort in the first place. Of course, I vehemently disagree. However, to debate these people and address their points, at least in the eyes of decision makers, you need to understand the foundation of their arguments and accept the premises that are true.
Three years ago, I wrote a similar article on the failings of awareness programs. In the intervening years, I have reviewed dozens of other programs, investigated incidents, watched vendor marketing campaigns, listened to the hype, and heard about thousands of data breaches. While I try to refrain from repeating the same points, there may be some repetition, but there is also refinement. My intent is to highlight the points most relevant to the apparently poor current state of awareness.
In the coming months, I will delve into some of these failings as separate articles, as they can be complicated subjects to address. For now, just consider that they do present specific issues that you might need to address.
Failing to promote behaviors defined in governance
This is probably the greatest deficiency in awareness programs. Too many of them focus on telling people what not to do. Awareness should instead focus on instilling good security-related behaviors, and those behaviors should be defined in formal procedures and guidelines. In other words, a security awareness program should be the promotion of behaviors defined in governance.
Security policies and procedures commonly sit on the shelf, except when auditors ask to see them to confirm they exist. Whether people realize it or not, governance ensures that a security program is not an accident but a purposeful, well-defined activity. A security awareness program should likewise be a purposeful activity.
Relying on fear
As opposed to the positive promotion of procedures and guidelines, many awareness efforts attempt to scare people. I assume the thought is that if people are scared, they will stop and think an action through. That is a gross mistake.
I remember being both horrified and amused when a CIO touted to me how great an awareness video was. His words: “I was scared to check my email after watching the training.” That is completely wrong.
Consider that checking email is a critical business function. People should never be “afraid” to perform a critical business function; they should be confident in performing it safely. While “motivation” is important to encourage people to behave appropriately, and a bit of fear can provide some motivation, fear can also paralyze people. It will eventually backfire, too: when nothing happens after a period of time, it becomes the equivalent of crying wolf.
The hacker mentality: Tell people not to do that
While somewhat similar to relying on fear, many awareness programs rely on telling people how a hacker hacked them, and then telling them not to fall victim to the same trick. For example, they describe how a hacker asked for a password over the phone, and then instruct people never to give out their passwords over the telephone.
I cannot overstate this: just because a person knows how to break something does not mean they know how to fix it. Just because you can step on a light bulb and break it does not mean you can then repair it. While this analogy applies to hacking computers, it especially applies to hacking people.
When you tell people specifically what not to do, with specific examples, a hacker who tries other tactics will likely succeed. For example, during one social engineering test, I called people and asked for their passwords. When a person refused to disclose their password, because they had been told never to give out passwords, I simply walked them through modifying the Registry on their computer instead.
Improving security awareness is infinitely more complicated than telling people what not to do. Again, it is about promoting the behaviors dictated by governance, which requires integrating behavioral science principles. Awareness is far more akin to marketing than to a career in technology. A technologist can learn behavioral science, but they first have to acknowledge that understanding behavioral science is more important than understanding the underlying technology.
Failing to consider successes
Awareness failures can be devastating. However, they are relatively rare when you consider all of the actions users take on a regular basis. Awareness successes are less noticeable, but they happen constantly. Consider how many spam and phishing emails are never opened.
Every time a user takes the appropriate action, it is a success. Again, it is easy to focus on the failures, and they can be bad. However, when you look at awareness from a cost/benefit perspective, you need to consider how bad things would be if every potential user failing did occur. No security countermeasure is perfect, and awareness is no exception. Unlike many technical tools, however, awareness creates no record of the attacks it blocks.
Bad technical security
No single user action should be able to cause a devastating loss to an organization. For example, users should not have permission to install software on systems, so ransomware should not be able to install itself if a user opens a malicious file. Storage devices should be encrypted, so the loss of a device does not necessarily create a compromise. Web filters should stop people from visiting unsafe websites.
Poor technical security enables the inevitable user failing to become a serious incident. While better user awareness reduces the need for technical countermeasures to kick in, there must be defense in depth to prevent significant loss from non-technical, as well as technical, failings.
Focus on CBT and phishing
Computer-based training (CBT) is a single method of conveying information; it is not an awareness program unto itself. People have different learning styles, and many will not respond well to CBT. Consider how many people prefer reading a book to watching a movie. Just as important, even people who appreciate CBT might not appreciate the style of a particular CBT. Even among those who prefer movies to books, some prefer dramas and others comedies. Some CBTs try to be informative, some comical, some scary. In a large organization, you must assume that even well-created CBT will be effective for only a minority of employees.
Phishing simulations, even assuming they are effective in reducing phishing incidents, only train people to be less susceptible to phishing attacks. While phishing is a significant problem, simulations do nothing to help people with password security, physical security, safe web browsing, or countless other awareness topics.
I want to be clear that both CBT and phishing simulations can have a part in an effective awareness program, but they are only a part of an effective program.
Treating awareness as a casual activity
While treating awareness as a casual activity might be considered akin to a check-the-box mentality, some organizations sincerely want to do more than check the box. They take pride in the CBTs they choose. They might also put on some presentations. However, despite the intent to do more, they still do not commit sufficient resources, and the awareness program is more of a hobby than a concerted effort to improve security behaviors. In short, the program becomes a collection of activities that, while individually engaging, are disjointed and ineffective.
Assuming users have common sense
I previously wrote about the fact that security professionals assume users have common sense with regard to most security issues. As I stated at the time, there can be no common sense without common knowledge. If you assume that users have more knowledge than they do, you will fail to address basic issues. For example, while it is true that most people know the underlying principles of phishing, you cannot assume everyone does. Even when people know about phishing, you cannot assume the depth of their knowledge.
If there are any particular topics you want me to delve into further, please feel free to send me a message. At a minimum, you need to examine the points made in this article and determine whether they apply to your organization. Admittedly, none of these issues can be corrected immediately. Like all security deficiencies, poor awareness programs require a concerted effort to fix.
The underlying problem is that security awareness programs are more difficult to implement than most security professionals want to acknowledge. Awareness is a separate discipline that requires the appropriate knowledge, skills, and abilities (KSAs) to implement a program properly. Without those KSAs in place, or even the awareness that such a set of KSAs exists, security awareness programs will continue to suck.
Ira Winkler, CISSP is president of Secure Mentem.
This story, "9 reasons why your security awareness program sucks" was originally published by CSO.