If it is everyone’s job to ensure online safety at work, that means everyone needs more and better training in how to do it. One of those on the front lines of that effort is Lance Spitzner, director at SANS Security Awareness.
Spitzner, a security awareness trainer for more than 20 years, spoke to us about how to improve the security posture of what he calls the human operating system.
He said that for Security Awareness Month, given that most awareness officers are part-time, SANS has created the National Security Awareness Month Planning Matrix and Toolkit, which provides an activity or training for every single day this month. “People can download and use the resources,” he said.
An edited transcript of our conversation follows:
The theme for this week is, “It’s everyone’s job to ensure online safety at work.” But while it’s everyone’s job, different people have very different roles. What are those different roles, and do any of them require specialized awareness training?
I’m actually a big, big fan of Smokey the Bear’s approach to awareness. I’m not a fan of saying, “Awareness is everyone’s job.” I’m a fan of, “Awareness is YOUR job.” My concern with the term “everyone” is that I hear, “Ooh, security is everyone’s job? Well then, I don’t have to worry about it because everyone else worries about it.”
So I take Smokey’s approach. It is everyone’s job, yes, but there’s this baseline of secure behaviors that everyone should and needs to exhibit. The problem is, technology alone can no longer secure an organization. Bad guys have developed a myriad of attack methods that bypass technology – firewalls, antivirus, email filters. Or they just pick up the phone. So, we need to make sure that everyone has the consistent, common baseline of secure behaviors.
In addition, certain roles are higher risk – people with privileged access, accounts payable, human resources, or those who handle highly sensitive information. They do require additional or specialized training.
It has become a cliché that, “People are the weakest link in the security chain,” along with its corollary, “You can’t patch stupid (or clueless or careless).” But you’ve been disputing that for a long time. Tell us why you hate those slogans.
Ultimately, people are not the weakest link. They are the primary attack vector for bad guys because we have invested so much in securing technology that it's really hard for the bad guys to hack it.
However, we’ve done nothing to secure the human, which means it’s really easy for the bad guys to attack the human element. We’ve created our own problem. So the whole reason I really detest, “Humans are the weakest link,” or, “You can’t patch stupid,” is that it implies that it’s their fault. It’s not. People are the primary target. Whether or not they are the weakest link is up to you and your organization.
If you go beyond just technology and invest in the human element, you’re going to have massive returns because now, not only technology but the human operating system is secure. As long as we continue to ignore the human side of cybersecurity, we are going to continue to lose this battle.
What do you think is the weakest link and why? And what can/should be done about it?
It’s not so much about the weakest link, it’s about what assets are the most vulnerable in our organization. Right now, that is the human operating system, simply because, as I said, we have done so little to help it. Cybersecurity is still really confusing.
If we want to secure the human element, we have to do two things. First, make cybersecurity simple. A perfect example of a behavior we have gotten horribly wrong is passwords. We bombard people with constantly changing, highly confusing and difficult behaviors: complex passwords requiring upper case, lower case, symbols and numbers, changed every 90 days, never written down, with a unique password for every account.
Second, we have to communicate that in their terms, not ours. More than 80% of security awareness professionals have highly technical backgrounds. That's great – they understand the problem – but it's also a weakness, because they're often poor at communicating the solution.
The challenge is to make it simpler, with simpler behaviors and communicate it to people in their terms.
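The contrast Spitzner draws between complicated and simple password guidance can be illustrated with a short sketch. The snippet below is hypothetical and not a SANS tool: it compares a legacy composition-rule check against a simpler length-based passphrase check of the kind he is advocating, with the 16-character threshold chosen purely for illustration.

```python
import re

# Hypothetical illustration of "complex" legacy password rules:
# upper case, lower case, digit, symbol, minimum 8 characters.
def meets_legacy_policy(password: str) -> bool:
    return (
        len(password) >= 8
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[0-9]", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

# Simpler, behavior-friendly alternative: a long passphrase.
# The 16-character minimum is an assumption for illustration only.
def meets_passphrase_policy(passphrase: str) -> bool:
    return len(passphrase) >= 16

if __name__ == "__main__":
    print(meets_legacy_policy("Tr0ub4dor&3"))                     # True, but hard to remember
    print(meets_passphrase_policy("time for tea and biscuits"))   # True, and easy to remember
```

The point of the sketch is the behavior being asked of people: one rule is easy to explain in plain language, the other is a checklist most users will work around.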
You’ve said that humans are just another type of operating system. How so, given that you can’t program a human to do the exact same thing every time in a given situation?
The similarity is, operating systems store, process and transfer information. As a result, that's where the bad guys used to go. Today, people store, process and transfer information, so the bad guys are going after people instead.
Many people have said computers are very predictable, and people are not. That’s why people are vulnerable. But I would argue that is why people can be your greatest strength. Technology is very predictable, which means the bad guys can easily get around it. Every time we buy technology and deploy it, the bad guys figure out a way to get around it six months later, because that technology always behaves the same.
What makes people so powerful is their ability to adapt. You can teach people what to look for, and then when they see an attack that you’ve never talked about, they’ll quickly detect it and stop it.
For example, at one organization where I rolled out an awareness program, the first thing we taught everyone was how to spot and stop a phish. The very next day, they got hit with a targeted phone call attack. Even though we had never talked about phone call attacks in the training program, the individual quickly figured out something didn't sound right, stopped it and then reported it. So I would argue that what makes humans so powerful is that they're adaptable.
I’m in no way saying technology is bad. You absolutely start there and you need it. But we really have to address both layers – the technical and the human.
What role can/should technology play in security awareness?
From a security awareness perspective, the question is how we use technology to reach people, educate them and inform them. There are so many different ways to play this – online training, gamification, interactive training – but you can also track behaviors, with phishing simulations for example, so that when somebody does something wrong you can let them know what they could have done right.
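As a rough illustration of the "track behaviors and follow up" idea, the sketch below records hypothetical phishing-simulation results and flags who should receive follow-up training. The data structure and field names are assumptions for this example, not any particular vendor's API.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class SimulationResult:
    # Hypothetical record of one employee's response to a simulated phish.
    employee: str
    clicked_link: bool
    reported_phish: bool

def needs_followup(results: list[SimulationResult]) -> list[str]:
    """Return employees who clicked the simulated phish without reporting it."""
    return [r.employee for r in results if r.clicked_link and not r.reported_phish]

if __name__ == "__main__":
    results = [
        SimulationResult("alice", clicked_link=False, reported_phish=True),
        SimulationResult("bob", clicked_link=True, reported_phish=False),
    ]
    print(needs_followup(results))  # ['bob'] gets a short "what to look for" refresher
```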
Are the risks different, and therefore should awareness training be different, for different sized companies, or those in different verticals? Or, put another way, does every training program need to be different, or are the basic principles the same no matter where you work?
It’s a little of both. I tend to see organizations share some of the same risks, like social engineering attacks – phishing or phone calls. So people need to be trained on the most common clues of social engineering. Passwords are another very common risk I see, and again, managing that risk requires the same behaviors everywhere. However, organizations may have unique risks, such as international travel, cloud, working remotely, maybe social media or browsers. So there is a baseline of human risks that all organizations share – the top three are passwords, social engineering and accidents. After that, it may depend on the size, the industry and the tolerance for risk.
Does an organization need specialized awareness training for Bring-Your-Own-Device employees?
To be honest, no. I’ve seen a dramatic drop in the need for training on BYOD. That’s because most organizations – especially large ones – have some kind of MDM (mobile device management) software installed. Because that is such an effective technical control, it really helps manage the risk. The exception tends to be small organizations that don’t have the resources to invest in MDM. For them it is a risk, and they do need training.
What are the five most important takeaways employees should get from an awareness training session?
One: Cybersecurity is your responsibility. Not everyone else’s, not just the security team’s. Two: Technology alone will not protect you. Bad guys can bypass antivirus and firewalls. You are the key to defending yourself and the organization. Then focus on the three core human risks most organizations share: social engineering (which includes phishing), passwords, and accidents.
I write about software security – and insecurity – personal privacy, and Big Data. I have written for CSO Online, the Sophos Naked Security blog and now for Synopsys.