The Failure of Cybersecurity Education
Several years ago, I was interviewing for an assistant professor position at the college where I had been teaching as an adjunct. Part of the process was a face-to-face interview with the provost, during which he asked what I would change about the current program. My response, which likely cost me the position, was simply that we needed to provide more classes before putting our stamp of approval on graduates. As I watched him recoil in horror at the thought, I realized that was the wrong answer.
Since that time, I’ve had a lot of opportunity to think about this concept: first as that same adjunct teaching undergrads, and later back in the private sector, leading teams of techs and engineers in various consulting roles and in C-level positions at startups. The one thing I’ve found almost universally is that no one truly does cybersecurity education well.
I will interject here to say that this piece grew out of a discussion with my daughter, who is an information security manager at a mid-sized company. We were both lamenting the quality of graduates and how many basic concepts they still needed to be taught once they reached the “real world”. That conversation eventually led to my dusting off some notes and engaging in this thought experiment on how a cybersecurity program should be run.