The US alone has 210,000 cybersecurity jobs going unfilled, according to one recent estimate.
What’s more, the world’s next generation of programmers and IT pros are going to need a deep understanding of security, even if they aren’t “cybersecurity specialists.”
So, are they learning the security skills and mindsets they need? Not in the top universities in the United States, argues CloudPassage.
The topline claim from the company’s survey of 122 leading university computer science programs: US Universities Get “F” For Cybersecurity Education.
That’s a startling claim, so it’s worth exploring and reflecting on CloudPassage’s survey in a bit more detail.
CloudPassage’s research firm began by identifying the top 122 computer and information science programs in the United States, drawing on widely used lists from US News, Business Insider, and QS World. Next, it set standards for grading those programs. How many undergraduate courses in cybersecurity do they offer? How many are required for a student to earn a major in the field?
So, for example, to earn an “A,” a university would need to offer at least three courses in cybersecurity, and require computer or information science majors to take at least two. Not one of the nation’s top 50 programs met that standard; one school outside that group that did was the University of Alabama.
Conversely, eight of the top 50 universities offered and required no undergraduate courses in cybersecurity, thereby earning an “F” from CloudPassage. And no fewer than 28 of the top 50 programs earned miserable “D”s, by offering no more than three cybersecurity courses while still requiring none.
CloudPassage didn’t discriminate in handing out these awful grades: “D”s and “F”s showed up at Ivy League schools, legendary engineering and technical universities, highly respected public and private universities, you name it.
A smaller number of institutions did shine in CloudPassage’s survey – including Rochester Institute of Technology and Tuskegee University, each offering 10 security courses; DePaul with nine; and the University of Maryland with eight. (Fear the turtle! Sorry, inside joke there.)
So, what exactly does this mean? That’s harder to say. As CloudPassage CEO Robert Thomas says:
We […] need to train developers, at the very earliest stage of their education, to bake security into all new code. It’s not good enough to tack cybersecurity on as an afterthought anymore. This is especially true as more smart devices become Internet accessible and therefore potential avenues for threats.
And there’s the rub. It’s significant and troubling if top students can earn undergraduate degrees in computer and information science without ever taking security into account. But the research doesn’t answer another question: is security “baked into” the other courses they’re taking?
Do they learn cryptography and cryptanalysis in ways they’ll be able to use? Do their networking courses address access control, or firewalls, or secure protocol design, or penetration testing? Do their programming courses teach best practices for designing and writing more secure code, and testing security? Do their operating system courses discuss privilege control? Are their senior coding projects judged on security as well as other aspects of quality?
If so, they may be learning a good deal of cybersecurity, even though their transcripts never use the word.
Admittedly, that’s a big “if.” But it’s an important question, even if it’s harder to answer. So, too, is another question: How good are the cybersecurity courses that do exist?
Those questions aren’t answered by CloudPassage’s study. But maybe someone else will try to answer them in the future.
If nothing else, those “Ds” and “Fs” will get the attention of a whole lot of university deans and department heads. Which can only be a good thing for all of us.