You Do Care About Cybersecurity, But That Is Not the Problem

By Philip R. Reitinger
Thursday, March 15, 2018, 2:00 PM

In “Who Cares About Cybersecurity?” Paul Rosenzweig conducted a public opinion survey on personal implementation of cybersecurity practices and concluded: “We don’t care [about cybersecurity]. At least not on a personal level. That’s the only reasonable interpretation of the data on the general lack of uptake for personal private security measures among the general population.”

I’m going to be unreasonable: We do care about cybersecurity, but caring is not enough.

As Lawfare’s survey indicates, adoption of voluntary cybersecurity measures is low. The survey shows that use of encryption on personal devices, anonymous browsers and password managers is less than 20 percent. But other data show that this might be a reflection of how difficult or inconvenient it is to set up and use these measures, as opposed to a lack of concern about consequences.

The Global Cyber Alliance conducted a survey in the fall of 2017. In a poll of 1,000 U.S. consumers:

  • Only half said they can determine whether a website is legitimate and safe;

  • 35 percent have stopped an online purchase because of security fears;

  • 27 percent said the fear of online scams causes excessive worry; and

  • Only 16 percent fear a burglar more than a hacker stealing personal information.

The results were similar, if a bit less dire, in the Global Cyber Alliance’s survey of 611 U.K. consumers.

People are afraid, and they are willing to take steps, such as canceling an online purchase, to prevent fraud or identity theft. So why don’t they do more?

I posit that—in the case of device encryption, anonymous browsers and password managers—people may understand the risk, but they don’t understand the solution or its effectiveness. And even if they did, it is still prohibitively inconvenient to implement protective procedures. If a consumer doesn’t really understand the risk, what action to take, or whether that action is effective, implementation of solutions is sure to be low. Using an anonymous browser such as the Tor browser, for example, requires downloading software, deciding to use the browser when you are concerned and then putting up with slower browsing.

I agree with Rosenzweig that a low “return on investment” explains why consumers are not implementing optional security measures, but the nuance I would add is that the low return on investment results more directly from a high required investment, driven by the uncertainty and difficulty of taking action, than from a low return.

Therefore, instead of raising awareness of risk or making individuals personally liable for security problems in order to increase perception of “return,” a better approach is to lower the required level of investment. Provide security by default, and give people at least baseline, but preferably enterprise-class, security automatically when they use a service. There are, of course, costs to this approach, but those costs are beyond the scope of this post and, in any event, aren’t so prohibitive as to make this solution unviable. There are good examples, such as blocking access to malicious sites through DNS as is done by Quad9 and similar services. Building in security by default will take investment from both government and industry.
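To illustrate how little the security-by-default approach can demand of the user, pointing a device’s DNS settings at Quad9 is roughly a one-line change. This is a minimal sketch using Quad9’s published resolver addresses; the exact configuration file and method vary by operating system:

```
# /etc/resolv.conf (Linux example): resolve DNS through Quad9,
# which refuses to resolve known-malicious domains
nameserver 9.9.9.9          # Quad9 primary resolver
nameserver 149.112.112.112  # Quad9 secondary resolver
```

Once set, every application on the device inherits the malicious-site blocking automatically; the user makes no per-site security decisions.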

As Rosenzweig says, a decade of government efforts to raise awareness has been insufficient. Awareness alone does not work at scale; it fails too often. While increased awareness may raise costs for attackers, they can overcome it with automation, turning a small success rate into a steady stream of successful intrusions. The solution is not that “we need to think of ways in which government intervention can ‘nudge’ the general population in the right direction.” Instead, industry should stop asking consumers to make security decisions for which they are ill-equipped, especially when implementing those decisions is burdensome. As Microsoft discovered decades ago, asking a consumer whether she wants to run a process does not add value: if she doesn’t understand what the process is, she will almost always click “yes.” Industry also needs to position bad security decisions so that they are, to use technical jargon, really hard to make. Save liability for inexplicably bad decisions that actors are equipped to make, decisions that don’t happen by default, such as corporations failing to meet basic and clear security standards.

This requires a paradigm shift: Don’t teach people to farm. Sell them food.