A soldier was made to sign a non-disclosure agreement by the US Army after pointing out a security flaw which allowed accounts on shared PCs to be accessed without proper authentication.
The trivial login issue, which seems to allow soldiers to operate shared PCs with the access rights of the previous user, was exposed last week in a report on BuzzFeed, and has since been confirmed by senior US Army staff.
Army staff authenticate on shared computers on bases and in the field using Common Access Card (CAC) smart ID cards. On completing a session the card is removed from the reader and the session should be terminated. However, it appears that the logoff process is often slow and can easily be cancelled by the next user, who can then continue to access the system under the previous user's account.
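The core of the flaw is that the logoff is deferred and cancellable rather than immediate. A session manager that locks synchronously the moment the card leaves the reader closes that window. Here is a minimal, hypothetical sketch of the difference (the names and structure are illustrative only; the Army's actual smart-card middleware is not public):

```python
class Session:
    """Toy model of a shared-PC login session tied to a smart card."""

    def __init__(self, user):
        self.user = user
        self.locked = False

    def on_card_removed(self):
        # Lock synchronously, before any further input is processed.
        # A slow or cancellable logoff leaves a window in which the
        # next user simply inherits this session - the reported flaw.
        self.locked = True

    def handle_input(self, user):
        if self.locked or user != self.user:
            raise PermissionError("authenticate with your own card first")
        return f"{user} acting as {self.user}"


session = Session("alice")
session.on_card_removed()        # card pulled: session locks at once
try:
    session.handle_input("bob")  # the next user cannot cancel the lock
except PermissionError:
    print("locked")
```

The design point is simply that the lock happens unconditionally in the card-removal handler, with no grace period for anyone at the keyboard to interrupt.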
The issue itself is not hugely serious in isolation, but it's not difficult to imagine a rogue member of staff exploiting it to access information they should not have, or to carry out actions unmonitored. Preventing that should be a high priority in US defense and intelligence circles, given their many high-profile failures to keep control of data in recent years.
The way the problem was handled, on the other hand, could serve as a textbook example of how not to respond to a security report.
The issue has been known about for over two years; one Army lieutenant who spotted it faced all manner of trouble when he tried to report it to senior staff. After being told the problem was too tricky to fix, he was allegedly made to sign a non-disclosure agreement and warned he could face imprisonment if he broke it.
Others who pointed out the flaw to their superiors were met with silence and inaction.
A statement issued by senior Army IT security staff after the problem appeared in the news has advised soldiers to be more careful when logging out of shared PCs.
It really shouldn't be beyond the abilities of IT staff to fix a problem like this, especially within a two-year time frame.
Admittedly, like any budget, Army funds are not unlimited, and rolling out a fix to machines scattered all over the world would be quite a task; but the problem should at the very least be noted and added to the requirements for any future redesign or upgrade.
Responding to helpful bug reports with enforced vows of silence and threats of jail is no way to encourage people to be open about problems they spot.
More advanced, specialised vulnerability research may be restricted to dedicated experts, but the everyday users of a system are an invaluable resource for spotting simple, easily exploited security holes.
Encouraging people to take more care and have responsibility for their own security clearly has some value. In an institution which relies heavily on discipline this approach may provide a powerful check on violators, but in normal situations it should only be part of the solution, not the only layer of protection.
Rules for accessing secure systems should be backed up with technical controls too; even the Army can't trust everyone it employs, as it now knows to its cost.
In business settings, this approach to dealing with IT issues would be inexcusable. But then, most businesses don't have the threat of 30-year prison sentences to dangle over potential data miscreants.