A soldier was made to sign a non-disclosure agreement by the US Army after pointing out a security flaw which allowed accounts on shared PCs to be accessed without proper authentication.
The trivial login issue, which seems to allow soldiers to operate shared PCs with the access rights of the previous user, was exposed last week in a report on BuzzFeed, and has since been confirmed by senior US Army staff.
Army staff authenticate on shared computers on bases and in the field using Common Access Card (CAC) smart ID cards. When a session is complete, the user removes the card from the reader and the session should be terminated. In practice, however, the logoff process is often slow and can easily be cancelled by the next user, who can then continue to use the system under the previous user’s account.
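The article doesn’t say how the Army’s shared workstations are configured, but on Windows the standard mitigation for exactly this scenario is the “Interactive logon: Smart card removal behavior” policy, which can lock or log off the session the moment the card leaves the reader. Purely as an illustrative sketch – assuming Windows machines, administrative rights, and the stock ScRemoveOption registry value, none of which is confirmed about the Army’s actual setup:

```python
# Illustrative only: force an immediate logoff (or lock) when a smart card
# is removed, via the standard Windows "smart card removal behavior" setting.
# Assumes Windows, admin rights, and that the Smart Card Removal Policy
# service (SCPolicySvc) is running -- nothing here is known about the
# Army systems described above.
import winreg

WINLOGON_KEY = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"

# ScRemoveOption values: "0" = no action, "1" = lock workstation,
# "2" = force logoff, "3" = disconnect a remote session
def set_card_removal_action(action: str = "2") -> None:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, WINLOGON_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "ScRemoveOption", 0, winreg.REG_SZ, action)

if __name__ == "__main__":
    set_card_removal_action("2")  # force logoff as soon as the CAC is pulled
```

In a managed environment this would normally be pushed out via Group Policy rather than set per machine; the point is simply that a technical control for the behaviour exists.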
The issue itself is not hugely serious, although it’s not difficult to imagine a rogue member of staff exploiting it to access information they should not have, or to carry out actions unmonitored – something which should be a high priority in US defense and intelligence circles, given the many high-profile failures to keep control of sensitive data in recent years.
The way the problem was dealt with, on the other hand, could serve as a textbook example of how not to handle security problems.
The issue has been known for over two years, and one Army lieutenant who spotted it ran into all manner of trouble when he tried to report it to senior staff. Having been told that the problem was too tricky to fix, he was then allegedly made to sign a non-disclosure agreement and warned he could face imprisonment if he broke it.
Others who pointed out the flaw to superiors were met with silence and inaction.
A statement issued by senior Army IT security staff after the problem appeared in the news has advised soldiers to be more careful when logging out of shared PCs.
It really shouldn’t be beyond the abilities of IT staff to fix a problem like this, especially within a two-year time frame.
Admittedly, army funds, like any budget, are not unlimited, and rolling out a fix to machines scattered all over the world might be quite a task, but the problem should at the very least be noted and added to the requirements for any future redesign or upgrade.
Responding to helpful bug reports with enforced vows of silence and threats of jail is no way to encourage people to be open about problems they spot.
More advanced, specialised vulnerability research may be restricted to dedicated experts, but the everyday users of a system are an invaluable resource for spotting simple, easily exploited security holes.
Encouraging people to take more care and have responsibility for their own security clearly has some value. In an institution which relies heavily on discipline this approach may provide a powerful check on violators, but in normal situations it should only be part of the solution, not the only layer of protection.
Rules for accessing secure systems should be backed up with technical controls too; even the army can’t trust everyone it employs, as it now knows to its cost.
In business settings, this approach to dealing with IT issues would be inexcusable. But then, most businesses don’t have the threat of 30-year prison sentences to dangle over potential data miscreants.
Image of army figures courtesy of Shutterstock.
Guess what? That’s been going on in the military for more years than you know. I was in the Navy in the 1980s. There were so many things I found wrong in our daily operations, so I worked out solutions to them. I wrote a nice report with my findings and solutions, and submitted it to my superiors. Nothing was ever said to me about it, nor was anything done. I was “highly recommended” to sign a non-disclosure agreement.
I followed the link to the article explaining that the login flaw has been “confirmed by senior US Army staff”. It confirms everything in this Naked Security article, including the story of one lieutenant who reported the problem to middle-ranking officers. “Keep quiet, or face jail time, he was told. Another soldier, who went to his superiors and even Congress, got no results.” Another was told that fixing the problem is “impractical”.
That's typical of all state operations, whose very nature is non-proprietary authority without proprietary responsibility. If the command (authority) structure refuses to take responsibility for its decisions, that forces those under its authority to take responsibility for problems they didn't create and have no authority to fix.
It's contrary to human nature to take the blame unjustly. If no one owns the problem, no one has responsibility for it. Everyone will end up passing the buck, and the problem will remain completely unaddressed. That sort of condition is a disaster waiting to happen. To the extent that national security depends on such an upside-down system…well, connect the dots for yourself.
OK wow, you have got to admit that’s some crazy stuff, man, a real snafu!
And the Air Force solved this problem by telling airmen to log off the computer when they leave the workstation…or simply locking the computer before removing the CAC…
“If no one owns the problem, no one has responsibility for it. Everyone will end up passing the buck, and the problem will remain completely unaddressed.”
Well said… not only in the military but anywhere.
I have a somewhat simple solution: require that when a person logs off, they don’t haul their worthless tail out of the chair until they confirm that they are logged off. Something akin to leaving a secured building – make sure that the door closes behind you and is locked.
Oh come on – this is obviously blown out of proportion.
Person finds security flaw, reports it to their employer (the army) and has to sign a NDA promising not to share the flaw while it is evaluated.
What a conspiracy!
This only applies to unclassified systems. Short of getting the staff duty roster, this is low threat.