What Cybersecurity Gets Wrong

Cybercrime cost Americans $10.3 billion in 2022, according to the FBI’s IC3 report. The average cost of a breach globally was $4.45 million in 2023, per IBM’s Cost of a Data Breach report. And the total cost may be as much as $8 trillion around the world this year.

Cybersecurity is clearly getting something wrong.

Cybercriminals, of course, are constantly adapting, so we can’t lay all the blame at the feet of the industry that is supposed to defend us from them. At the same time, though, many organizations are falling behind — floundering as they try to address the latest threats or simply ignoring them and hoping for the best.

Even proactive CISOs are limited by budgetary constraints and operational inefficiencies. Bristling arsenals of security solutions are deployed in a haphazard perimeter, leaving plenty of gaps for motivated hackers to exploit. And staffing shortages don’t help. ISC2’s 2022 workforce study indicates a need for 3.4 million more cybersecurity professionals.

Here, InformationWeek addresses these and other challenges for effective cybersecurity, with insights from Joseph Williams, global partner at Infosys Consulting, Neil Jones, director of cybersecurity evangelism at Egnyte, and Zulfikar Ramzan, chief scientist for Aura.

Skill Deficits and Hiring Problems

In addition to the shortage of manpower, a 2021 report compiled by the Information Systems Security Association (ISSA) and industry analyst firm Enterprise Strategy Group (ESG) notes that 57% of the 500 professionals they surveyed believed that a skills shortage was negatively impacting their organization. So, not only are there not enough workers to manage necessary tasks, but the workers that are available do not offer the full breadth of necessary skills.

Joseph Williams, Infosys Consulting

Williams observes, “There is a lot of turnover in cybersecurity staff. Tribal knowledge and institutional knowledge — all of that is painfully fluid. If you get three of your top penetration testers leaving, rebuilding that expertise is actually not trivial.”

Recruiting replacements and expanding staff presents a number of hurdles in and of itself — the ISSA/ESG report found that survey respondents were concerned about a lack of competitive compensation and about human resources departments unable to locate suitable candidates.

Further, Williams notes, the integration of cybersecurity staff with various skill sets can be quite difficult. “I have silos of expertise,” he says. “I don’t have very many people who actually can knit it together. I see very few fleet commanders, people who can actually manage more than just a ship, that can manage a task force. Once you find one, the truth of the matter is they’re jumping off to startups to make the big time instead of hanging around in corporations.”

Lack of Communication Between Leadership Teams

As the authors of a recent article in the Harvard Business Review suggest, there is an additional disconnect: between the CISO and the board. Of the board members surveyed, only 69% felt they were on the same page as their CISOs — and fewer than half even interact with them on a regular basis. This, says Jones, is often the result of “an executive team who doesn’t take cybersecurity seriously and views IT security spending as a project cost, rather than as an investment in brand protection.”

“Cybersecurity never gets the investment it deserves, because it doesn’t generate revenue,” Williams adds. “It doesn’t create value. It protects value. So one of the problems is under-investment. It’s hard to build the workforce you really need.”

Regular training for executives may help to close this communication gap. Recruiting executives who are already familiar with the current threat landscape may likewise reinforce the notion that cybersecurity matters in the C-suite, even when the CISO isn’t in the room.

Conversely, CISOs need to cultivate communication skills that allow them to convey this importance in easily understood terms. People tune out when they don’t understand what they’re hearing. And executives may be particularly resistant: nearly 30% of CEOs score high in narcissistic personality traits, according to one study, suggesting that many may be reluctant to admit their own ignorance and ask for more detailed information.

This is not insignificant given the fact that these executives themselves may represent vulnerabilities to the organization. “We’ve seen executives or board members at big Fortune 500 companies practice very bad cyber hygiene,” Williams confides.

Overconfidence and Negligence

Hubris has contributed to more than a few cyber breaches. Organizations invest in suites of expensive tools that promise to safeguard data and assume that they will do just that. But whether or not these are the right tools, arrayed in the proper positions, is another question entirely.

Organizations should avoid “reliance on disparate, cobbled-together cybersecurity solutions that don’t provide comprehensive protection against cyber-attackers and malicious insiders,” Jones advises.

Williams uses the metaphor of the Maginot Line, a fortification erected by France along its eastern border in the 1930s to deter a German invasion. While impenetrable at its strongest points, it had not been reinforced near Belgium, allowing Germany to circumvent it. Similarly, impenetrable firewalls do no good if they do not surround the entirety of an organization’s assets or if they are not properly manned and monitored.

“The gaps are in the management side,” he says. “Can I get somebody who can take a look at our entire vulnerability landscape, where I’ve got 11 tools and figure out where the gaps are?”

“You end up with an environment where effectively you just don’t know what’s happening,” he continues. “You cannot be certain that you’ve covered it. You do the best you can. But no matter how good you think you are, tomorrow, you’re not that good anymore.”

The Cybersecurity and Infrastructure Security Agency (CISA) has begun cataloging a set of particularly bad practices to avoid, including the use of end-of-life software, default passwords, and single-factor authentication.

Jones emphasizes the use of end-of-life software in particular. “IT security solutions that are currently deployed should all be evaluated and categorized as follows: keep, upgrade or decommission,” he suggests.
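Jones’s keep/upgrade/decommission triage can be sketched as a simple audit script. The following is a minimal illustration in Python, assuming a hypothetical inventory of deployed solutions with vendor end-of-life dates; the product names, dates, and one-year planning horizon are all invented for the example, not taken from any real catalog:

```python
from datetime import date

# Hypothetical inventory: each deployed solution mapped to its vendor
# end-of-life date (None means the product is still actively supported).
inventory = {
    "legacy-av-agent":  date(2021, 6, 30),   # already past end-of-life
    "edge-firewall-os": date(2026, 1, 15),   # supported well beyond the horizon
    "siem-platform":    None,                # fully supported
}

def categorize(eol, today=None, horizon_days=365):
    """Classify a solution as 'keep', 'upgrade', or 'decommission'."""
    today = today or date.today()
    if eol is None:
        return "keep"
    if eol < today:
        return "decommission"              # end-of-life software must go
    if (eol - today).days <= horizon_days:
        return "upgrade"                   # EOL falls inside the planning horizon
    return "keep"

for name, eol in inventory.items():
    print(f"{name}: {categorize(eol, today=date(2024, 1, 1))}")
```

In practice the inventory would come from an asset-management system rather than a hard-coded dictionary, but the triage logic is the same.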

No Proactive Measures

Further, regular threat hunting, penetration testing and patching need to be conducted. Certain vulnerabilities, known as zero-day vulnerabilities, have not yet been identified or corrected by the software provider. But N-day vulnerabilities have — and they are easily remedied by a regular patching schedule. Companies brutalized by the WannaCry attack had failed to patch an N-day vulnerability in a Microsoft product.

Neil Jones, Egnyte

“I always recommend that organizations focus on the most significant vulnerabilities that are likely to impact their infrastructure first, with a goal of testing and deploying patches for those vulnerabilities as soon as possible,” urges Jones. “High-impact vulnerabilities are the most likely to be acted upon by cyberattackers and they can have a debilitating impact on company productivity and brand reputation if and when an attack occurs.”
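One way to operationalize Jones’s advice is to rank the vulnerability backlog by severity before scheduling patch windows. The sketch below sorts a hypothetical list by CVSS base score, with internet-facing exposure as an additional weighting factor — an assumption of this example, not something Jones prescribes. The CVE identifiers and scores are invented for illustration:

```python
# Hypothetical backlog: (CVE id, CVSS base score, internet-facing?).
# The IDs and scores below are illustrative, not real advisories.
backlog = [
    ("CVE-0000-1111", 9.8, True),
    ("CVE-0000-2222", 5.3, False),
    ("CVE-0000-3333", 7.5, True),
    ("CVE-0000-4444", 9.1, False),
]

def patch_priority(vulns):
    """Sort highest-risk first: internet-facing hosts outrank internal-only
    hosts, and CVSS score breaks ties within each exposure tier."""
    return sorted(vulns, key=lambda v: (v[2], v[1]), reverse=True)

for cve, score, exposed in patch_priority(backlog):
    print(f"{cve}  cvss={score}  internet-facing={exposed}")
```

Real programs would also factor in whether an exploit is known to be in the wild (for example, via CISA’s Known Exploited Vulnerabilities catalog) rather than relying on the base score alone.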

While threat hunting, penetration testing, and patching are essential cyber hygiene tasks, they should not be considered sufficient in and of themselves. Once they are completed, the system needs to be reviewed again for vulnerabilities that opened during the process.

Employees with administrative privileges often turn off various protections — viewing them as an inconvenience or an obstacle to getting work done, or disabling them as part of security exercises — after which they may never be reinstated.

A regular backup schedule also needs to be established, especially for particularly crucial data sets. In the event of a destructive attack, data can be restored in short order. If it is not backed up, it is gone forever. All equipment should be inventoried, too. It is shockingly common for organizations not to know where each piece of equipment in their organization is located and who uses it — making it impossible to map all attack surfaces.
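A regular backup schedule can start as something as small as a timestamped archive job run on a scheduler. Below is a minimal Python sketch, assuming local source and destination directories; a production scheme would add encryption, integrity checks, offsite replication, and retention policies, none of which are shown here:

```python
import tarfile
from datetime import datetime
from pathlib import Path

def backup(source_dir: str, dest_dir: str) -> Path:
    """Create a timestamped gzip tarball of source_dir inside dest_dir."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = dest / f"{Path(source_dir).name}-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        # arcname keeps paths inside the archive relative, so a restore
        # does not recreate the absolute directory tree of the source host.
        tar.add(source_dir, arcname=Path(source_dir).name)
    return archive
```

Wired to cron or a task scheduler, a job like this ensures that crucial data sets can be restored in short order after a destructive attack.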

Conflating regulatory compliance and safety is an additional mistake. While such regulations as the GDPR and guidelines like the NIST Cybersecurity Framework offer useful parameters, they do not offer nearly the specificity required for each industry and even for each individual business.

These warnings apply equally to SMEs: While cyberattacks against large corporations make the headlines, roughly 43% of attacks target smaller businesses.

No Established Recovery Plan

Unfortunately, hackers sometimes get past even the most scrupulous cyber-hygiene practices. Organizations need to prepare for this eventuality by having a solid recovery plan. Nearly 40% don’t have one.

“I’ll hire [an outside provider] when there’s a problem. Then I’ll pay this big fee to recover,” Williams says of the typical mentality. “It’s not a sustainable model. Frameworks are going to have to change from just prevention to full business continuity. How do we maintain the business while we’re under attack? The military industrial complex is already there. Almost no one else is.”

An ounce of prevention may be worth a pound of cure, but you still need to have some measure of the latter on hand in case disaster strikes. And building a resilient organization is nearly impossible if overconfident executives believe they are immune from it in the first place.

Human Error

Human error accounts for up to 95% of cybersecurity incidents, according to the World Economic Forum. Verizon’s data breach report puts it at 85%. However, only 67% of board members surveyed by the Harvard Business Review believed that human error was a primary cause of breaches.

These errors take many forms, from falling for phishing scams to using obvious passwords and single-factor authentication to leaving back doors open in coding. Alert fatigue may lead analysts to ignore real security issues when overly enthusiastic programs flag benign events on a regular basis.

Many of these errors are ultimately attributable to a failure to create a culture of security within an organization by instituting regular training and testing. But even then, people are remarkably susceptible to the manipulations of bad actors. The psychological literature has shown that hackers are often high in Dark Triad personality traits: Machiavellianism, narcissism, and psychopathy. People with these traits are skilled manipulators and typically feel little or no guilt in leveraging that skill to get what they want. This is the social engineering aspect of cyberattacks — gaining the trust of the user to steer them toward dangerous actions like sharing passwords.

Intriguingly, high narcissism scores in users correlated with susceptibility to phishing attacks as well. So did high neuroticism, one of the Big Five personality traits. Another study found that procrastination, impulsivity, a predisposition to risk-taking, and a failure to think about the future all led to increased susceptibility to scams.

Using tests that measure such qualities may help determine cybersecurity risk among various subsets of employees, allowing for more targeted training and correction of the problem. Even informal surveys may highlight disparities in cybersecurity knowledge, which can then be addressed. Other tests designed to measure specific cybersecurity skills may be helpful as well.

One study that surveyed employees at two IT companies found that the sales team was most lacking in cybersecurity knowledge. While some members were eager to learn more, they found access to that knowledge to be lacking. They encountered challenges in finding the time to seek it out while meeting their other responsibilities, underscoring the need for more structured programs.

Zulfikar Ramzan, Aura

“Employers need to do more than just mandate an annual cybersecurity awareness program and prepare employees to identify and react appropriately to increasingly sophisticated cyber threats — in their personal lives and at work, as remote work and shared devices blur these lines and create more opportunities for hackers and fraudsters to take advantage,” Ramzan exhorts.

How exactly to create an effective cybersecurity culture is a matter of debate. Some advocate for radical transparency, urging employees to report any potentially damaging action free of consequence so that it can be remediated. This, they argue, will deter them from concealing mistakes. However, it may also lead to lax behavior and the assumption that someone else will always be there to clean up the mess if they screw up.

Others have suggested systems of penalization for cybersecurity errors — fines for example — and rewards for consistent cyber-hygiene practices.

“I’m not a big believer in consequence, but I believe in accountability,” Williams says. At Infosys, employees are sent fake phishing emails and if they click on them, they are sent to additional training courses. Research has borne out the efficacy of this approach.

“End-users should complete cybersecurity training at least quarterly, with a focus on gamifying the process to make it more fun and interesting,” Jones adds.

Even steps as seemingly arbitrary as constantly changing the format of security warnings can lead users to pay more attention to them. When the same warning pops up multiple times in the same format and turns out to be a false alarm, users become habituated. If warnings constantly change form — so-called polymorphic warnings — users are more likely to pay attention to them.

While these massive cultural changes may seem intimidating — and potentially costly — getting a handle on them may mean the difference between smooth sailing and falling victim to digital pirates. So, clear the decks, secure your cargo, get your crew in shape, and chart a safe course. They’re waiting for you.
