By Christopher Paul, PhD
1. We read a lot about cybercrime and the vulnerabilities all organizations face. How would you characterize the threat?
We do read a lot about cybercrime and the risks to individuals and organizations from stolen data and illicit network access. These remain significant aspects of the contemporary security environment. But in my research, I focus on other cyber-enabled threats, most recently the threat of manipulation through deception and disinformation.
Traditional cybersecurity warns of “social engineering,” the use of deceptive messages or communications (through any media: email, social media, by phone, in person) to get people to divulge sensitive information that can then be used for fraudulent purposes (unwarranted access, identity theft, etc.). However, the same concept can be used not just to trick people into sharing sensitive information, but to manipulate their perceptions, beliefs, attitudes, intentions, and ultimately… behavior.
Disinformation campaigns are a form of social engineering and seek to intentionally change people’s attitudes and behaviors over the long term. The risk is different from the economic or other loss threatened by the disclosure or loss of sensitive or proprietary data: the risk is that people end up doing things that they otherwise wouldn’t have done or espousing beliefs or supporting policies (or candidates) that they wouldn’t have, had they only been exposed to open discussion and truth-based reasoned argumentation.
The range of attitudes and behaviors that could potentially be affected in this way is quite broad and could cover everything from product choice in an otherwise fair and competitive marketplace, to openness to political compromise, to voting preference in referenda about a nation’s future economic direction (as was the case in the Brexit vote).
2. What are the biggest cyber threats facing organizations in 2019?
While a host of traditional cyber threats remain critical, certainly one of the biggest threats is from manipulation through cyber-enabled disinformation campaigns. Although Russia is the poster child for aggressive use of disinformation, they are not the only players. Numerous nations have used various forms of state-sponsored internet trolling to mobilize publics against dissidents or specific policies, or to intimidate or undermine political opponents. And, having observed Russia’s success in using propaganda that is not in any way committed to objective reality, many other actors (nation-states or otherwise) are beginning to explore similar approaches.
I have long assumed (as I believe most communication professionals have) that the foundation of effective influence is truth and credibility. Well, the cat is out of the bag: as Miriam Matthews and I describe in our paper on Russian propaganda, the truth may be influential, but falsehoods can be highly influential, too, and untruth is certainly not a barrier to successful influence!
3. Are private enterprises more vulnerable than military or government agencies? Do you distinguish between the two in terms of the threats?
Considering disinformation campaigns, the vulnerabilities of private organizations and government organizations share some similarities and some differences.
All personnel, whether private citizens or government or military personnel, are potentially vulnerable to disinformation. Government personnel may be slightly less vulnerable, as they may be more likely to have received information about the threat, and restrictions on government systems access to certain parts of the internet may make government employees less likely to be exposed to disinformation, at least while they are at work.
4. I’ve read that the average dwell-time—the average time before a company detects a cyber breach—is more than 200 days, highlighting this as an area where companies do not do well. Many companies, large and small, are not proactively looking for cyber breaches, and only when they detect smoke do they realize the company has been breached. What’s your advice to private sector companies in terms of preparedness?
I’ll speak to preparedness and awareness of attacks, not with respect to cyber breaches, but with respect to disinformation campaigns.
Some companies are very attentive to their brands and to what others are saying about them. I’ve heard that Gatorade, for example, has a social media command center where they watch what is being said about or related to Gatorade and are prepared to intervene with corrected information, challenges to false assertions, or offers of customer support where they feel it is needed. While many organizations may not have that level of resources to dedicate to watching for disinformation about them, some kind of attention is a good idea.
Further, organizations should do more than just watch for possible threats: they should prepare to respond to them, and warn their personnel about possible attempts at manipulation and about ongoing broader disinformation campaigns that have been detected. The Swedish Civil Contingencies Agency has actually published a helpful Handbook for Communicators on this very topic.
5. Strategic Summit delegates often have responsibility for internal communication. How can managers best communicate the importance of cybersecurity to employees?
The best single piece of advice I can think of is “put a why with the what.” Don’t just tell your personnel what you want them to do, tell them why they need to do it. And, don’t just tell them about the threat (the why), tell them what to do to protect themselves (and the company) from that threat.
Too often, security rules are imposed without context and don’t come with enough information for people to understand the threat, the security measure that is supposed to stop it, the risk of failure to comply with the security measure, or how adherence to the rule protects against the threat. As a result, many security rules are viewed only as burdens rather than as something of value. Because of that, personnel treat security policies the way they treat administrative burdens: things to be avoided, short-cut, or worked around.
It is much easier to get better compliance with security measures when everyone understands why they are necessary and how they contribute to security.
Christopher Paul is a senior social scientist at the RAND Corporation and professor at the Pardee RAND Graduate School. He is also a member of the adjunct faculty in the Center for Economic Development in the Heinz College at Carnegie Mellon University.