Assume that it’s time for Bob’s performance review.
Bob’s boss says he’s a great addition to the team. Easy to work with!
And the sales numbers? Hot mama, Bob’s smokin’! Mr. Bob surely has worked himself toward a big, fat raise!
Or not. Bob would have gotten a raise, that is, but he got fooled by a phishing email and unwittingly invited the bad guys in through the front door, torpedoing Widget Industries Ltd’s multimillion-dollar investment in security systems.
Fiction! But can you imagine if this were really the way employees were assessed? They answer a phishing scam email, they trigger a major security breach, and then they’re held accountable?
This is an approach that big companies might actually think of adopting, according to Dave Clemente, a research associate in the field of security who works at Chatham House, a London-based think tank on international affairs.
Speaking to Business Reporter, Clemente suggested that reprimands, at the very least, might help companies whose employees undo millions of dollars of security expenditures by doing something as simple as opening a bad email:
Even if it’s innocent, you can spend millions on firewalls and one of your employees can undo that by opening a dodgy email. ... One idea would be to encourage employees to be more careful. You could have a system where, if you open two or three of them [phishing emails], you get a reprimand.
I think people would comply, particularly if your behaviour regarding cyber security was linked to your annual assessment.
Of course, beyond the misdeeds of Bob and his ilk are the security disasters that companies manage to bring down a bit more systematically onto their own heads, particularly when jumping on the bandwagon for new trends and technologies without first figuring out the security implications, Clemente says:
For bigger companies, one problem is efficiency drives which push companies into insecure behaviour, like moving into the cloud or doing BYOD [Bring Your Own Device] before you realise the security implications, because everyone else is doing it. It’s done as a reaction to what other people are doing and done without being integrated into the company’s technology strategy.
Moving data to the cloud can be particularly tempting to small firms with limited resources that struggle with the burden of dealing with cyberthreats, Clemente noted.
It’s not such a bad idea, given that cloud services can have a decent amount of security, he said, but the downside is that small businesses lose control over data stored in someone else’s hands.
If we move toward holding employees accountable for goofy clicking, should C-level types likewise be held accountable for security fiascos that erupt out of their jumping on technology bandwagons such as BYOD and cloud services?
Call me a liberal weenie, but I’d suggest that decent training might produce better effects than whipping employees.
It all reminds me of a July 2012 article by Immunity Inc. CEO Dave Aitel in which he discussed whether security training might be futile.
Aitel said at the time that in spite of a conscientious approach to security training, his clients still have, on average, a click-through rate for client-side attacks of at least 5 to 10 percent.
Even the training software his clients use has “glaring flaws,” he said, including SQL injection and cross-site scripting – the two most common vulnerabilities in OWASP’s Top 10 list of application security risks.
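For readers who haven't seen SQL injection in action, here's a minimal sketch of why it tops OWASP's list. This is purely illustrative (a throwaway in-memory sqlite3 database, not the training software in question): building a query by string concatenation lets a crafted input rewrite the query itself, while a parameterized query treats the same input as inert data.

```python
import sqlite3

# Throwaway in-memory database with a single illustrative record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('bob', 'staff')")

def find_user_unsafe(name):
    # VULNERABLE: user-controlled input is spliced straight into the SQL.
    query = "SELECT role FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Parameterized query: the driver passes the value as data, not SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"           # classic injection string
print(find_user_unsafe(payload))  # matches every row: [('staff',)]
print(find_user_safe(payload))    # matches nothing: []
```

The injected `' OR '1'='1` turns the unsafe query's WHERE clause into a condition that is always true, dumping the whole table; the parameterized version simply looks for a user literally named `' OR '1'='1` and finds none.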
What’s the answer? Reprimands? Performance assessments that take people to task for security snafus?
I’d say no. I’d suggest that better training might be the way to go.
After all, there are scads of training success stories, many of them posted in reply to Aitel’s PCWorld article.
What do you think? Should we put scam-clicking employees in stocks and toss tuna sandwiches at them, or is there a better way to improve security?
Image of Employee Shouting courtesy of Shutterstock.
As an InfoSec professional I have to say yes, but only if they've had the initial training and follow-up reinforcement training, and also if they are habitual offenders. Nothing burns me up more than to see people who DO know better commit an offense against corporate security because it's easier for them to do their job, or they don't think anybody is watching, or because they think that somebody else is getting away with it.
Why spend the money on security if all you do is say "don't do it again" or some system sends a report that nobody does anything about?
I have to agree with wrap2tyt.
We need to educate the user, reinforce those lessons and let all users know they will be held accountable (in some way) for their mistakes.
They didn't have the answer, but I would say yes if they had proper training and the breach wasn't a zero-day exploit.
The responsibility for the security of any organization will ultimately rest upon the employees themselves. The argument may be made that tighter technology strictures could be implemented, yet these invariably impact production. In like fashion, one might seek to place a portion of blame upon inadequate training. Yet there is no training available that magically cures laziness or unwitting/willful ignorance. Should an employee demonstrate the lack of understanding necessary to prevent intrusion/loss, then they really shouldn't be in a position where it is possible for them to do so – lest they suffer the repercussions of said actions. Security is everyone's responsibility – not a singular department's burden to bear.
"Yet there is no training available that magically cures laziness or unwitting/willful ignorance."… and in that case the company can always say "you signed an acceptable use policy, by signing that document YOU stated that YOU have read AND understand the company policy regarding specific use of company assets that might include but may not be restricted to computers\apps, mobile devices, Internet usage, network resources, email…" I could go on, but people need to be, and should be, held accountable for their actions, regardless of who they are.
One company I worked for stopped requiring the employee to read the AUP; instead there were annual required viewings of the CHRO, CIO and CSO reading the AUP. The employee signed that he\she had viewed\understood the video and its contents and had 15 work days to question any part of the training.
I loved it.
I would like to add that they should only be accountable if they were informed and it was clear to them what the expected behavior was. Also, security goals should be tied to annual incentives. This way they know ahead of time that if they are negligent it may cost them something. It also rewards and encourages positive behavior.
I agree with Ken and Wizard completely, but the devil is in the details:
Who sets security goals? The employee's manager? IT? A compliance department, if there is one? What are these goals? Completion of training programs? With or without a test/assessment of some sort?
Who assesses an employee's security? Would a non-technical manager know enough to do so? Would IT have the authority to do so? How do you deem one person adequate in their practices while another is not?
Who gets to assess the C-level executives and what penalties can be given for inadequate security performance? What about the board members?
Gavin
…and that's where we differ. I don't see how it matters at all who wrote or decided what the company security policy is. The employee has to abide by it. First, I'd say stop being so concerned about what the C-level folks are getting or doing that you are not getting or allowed to do; it's not your business. Second, if the employee has signed that he\she has read and understood the policy and has not gone on record as being opposed to or challenging the policy, what's the problem? And even having gone on record does not excuse breaking the rule… does it?
The information security policy exists to support the business goal\mission, not to make it easier for someone to do their job or browse the web during lunchtime.
Third, "who assesses an employee's security" – what does that even mean? If you mean who monitors the security of the organization, that would be the CSO or his department. Who dishes out the punishment? That would be up to HR and legal, not IT Security. IT Security does not write the policy; again, the policy is based on what the business needs are, and MANAGEMENT OWNS THE POLICY! They may act like they don't, but they do, because if security wrote and owned the policy and dished out the punishment, who would be using the assets?
Everyone in an organisation is accountable for their own actions. Security is both a corporate and personal responsibility at work. In the home environment, it is the person's own responsibility together with the person/people who administer the systems (both hardware and software).
If people have had appropriate training then they are deemed to know and understand both what is a risk and what behaviour is expected of them. If they don't apply their knowledge and some 'common sense' by clicking on inappropriate links then they need to know what the consequences are for them and the business.
I work to the ideas of 'Think about it first' and then 'If in doubt, don't!' That is safer than blindly clicking just because …! OK, find out more by asking those who could know, then you will know better next time – if there is one.
" but I'd suggest that decent training might produce better effects than whipping employees"
Agreed. Plenty of emails are obviously junk and should be deleted, but spammers are getting better and better at disguising their phishing emails. You have to teach your employees what to look out for, how to handle those phishing emails and really show them what could happen if they let it slip through.
Associating accountability with risk is certainly one of the few ways of creating an incentive to be more security conscious. Many systemic security problems tend to languish because there's no governance framework that identifies who's supposed to own up to them.
The downside to any punitive approach towards security is that employees will be less inclined to fess up or report when they've made a boo-boo. Even in more lenient organizations, I've had to wade through the morphing stories and tangles of lies just to find out that someone simply made a mistake. One investigation into a catastrophic failure took months to resolve and the culprit turned out to be an employee who was too afraid to admit his error — it was an error that could have been mitigated easily yet the fear of reprimand prolonged and exacerbated the damage. By that point, there was no choice but to fire him. Ironically, he had fulfilled his own worst fears.
Neither carrot nor stick works on its own – you need both. I've given a fair number of security awareness sessions and more recently I've included a simple quiz at the end to test attentiveness and comprehension, and it's clear that there are some people who just don't take it in. For them, a bit of stick might be needed, even if only "unless you get at least 6/10 you're gonna come back on the next awareness session!"
Though I've never had the opportunity to try it, I think the idea of phishing your own staff and either a gentle reprimand for those who fall for it (less gentle the 2nd or 3rd time!) or a small prize for those who detect it (or both) is an idea well worth exploring. It's all very well saying don't open dodgy attachments or click on links, but unless you've seen a few and know it's a real danger (it'll never happen to me), it's not going to result in a change in behaviour.
Dave Clemente wrote "One idea would be to encourage employees to be more careful. You could have a system where, if you open two or three of them [phishing emails], you get a reprimand."
Uhhh, nothing wrong with opening a phishing email, only responding to it. In fact you sometimes can't tell from the sender/subject that it IS a phishing email until you open it, see the bad grammar, misspellings, and foreign usage, and hover the pointer over bogus links.
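That hover-the-pointer check can even be automated: one of the strongest phishing tells is an HTML link whose visible text shows one address while the underlying href points somewhere else. Here's a rough sketch using only Python's standard library – the class name and sample HTML are made up for the example, and a real mail filter would need many more heuristics than this one:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkMismatchChecker(HTMLParser):
    """Flags <a> tags whose visible text looks like a URL on a
    different host than the real href -- a common phishing tell."""

    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = None
        self.text = ""
        self.suspicious = []  # (shown text, real href) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href", "")
            self.text = ""

    def handle_data(self, data):
        if self.in_link:
            self.text += data

    def handle_endtag(self, tag):
        if tag == "a":
            shown = urlparse(self.text.strip()).netloc
            real = urlparse(self.href).netloc
            # The visible text claims one host; the link goes elsewhere.
            if shown and real and shown != real:
                self.suspicious.append((self.text.strip(), self.href))
            self.in_link = False

# Hypothetical phishing snippet: the text says "mybank", the href doesn't.
html = '<a href="http://evil.example.net/login">http://www.mybank.com/login</a>'
checker = LinkMismatchChecker()
checker.feed(html)
print(checker.suspicious)
# [('http://www.mybank.com/login', 'http://evil.example.net/login')]
```

Of course this only catches the crudest lures – which is exactly why the human-side training the article discusses still matters.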