Analyzing the Perceptions of Cybersecurity Risks in Civil Society Organizations: A Survey Approach

April 5, 2022

This post describes our work presented at the CHI 2020 Networked Privacy Workshop, a collaboration by a team of researchers affiliated with the UC Berkeley Center for Long-Term Cybersecurity (CLTC) and the International Computer Science Institute (ICSI). You can find more information about this project in my previous blog post published on CLTC’s Medium channel.

In recent years, there has been an increase in research examining online security and privacy behaviors of underrepresented and vulnerable communities, revealing nuanced and group-specific concerns and practices. One such group consists of employees working for civil society organizations (CSOs), which include advocacy groups, humanitarian organizations, labor unions, indigenous peoples movements, faith-based organizations, community groups, professional associations, foundations, think tanks, charitable organizations, and other non-governmental and not-for-profit organizations [1]. 

Compared to other sectors, civil society operates in elevated-risk contexts: CSOs are often targeted for political or ideological reasons by state-sponsored actors [2], political opponents [3], hate groups [4], and radicalized individuals [5]. Whereas attacks against for-profit organizations mostly result in financial losses [6], attacks against individuals working for politically vulnerable CSOs often carry greater ramifications, including, in severe cases, threats to freedom of expression, liberty, and even life [7, 8, 9, 10].

As part of my research at the Berkeley Lab for Usable and Experimental Security (BLUES) [11], my colleagues and I conducted an anonymous online survey with 102 CSO employees in Fall 2019 to collect information about their risk perceptions and strategies to mitigate cybersecurity threats. We surveyed employees at a broad range of organizations based in the US, some of which could be classified as high-risk and others as low-risk, in order to compare their average perceived risks. 

In this blog post, I discuss the methodology we used for our study and the challenges we faced with our approach. We hope that this discussion of methodological challenges will benefit other researchers and practitioners working to understand the complex threat landscape facing civil society.

Methodology

The goal of our study is to better understand cybersecurity concerns and practices in CSOs in order to improve their resilience against cybersecurity attacks. Based on our personal experience working with CSOs and prior work with journalists [12], activists [13], and humanitarian workers [14], we identified the following seven threats: phishing, malware, online harassment, online reputation attack, physical device compromise, surveillance, and attacks on online services. We designed and executed a survey among employees at CSOs to assess the respondents' risk perceptions of each of these seven threats and to collect information on their self-reported risk mitigation strategies for one specific threat chosen at random. Additionally, the survey presented a list of strategies that correspond to best practices in mitigating each threat and asked participants to report their level of familiarity with each strategy and whether they had made use of it.
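To make the between-subjects element of this design concrete, here is a minimal Python sketch of how one of the seven threats could be assigned to each respondent at random. The threat names come from the list above, but the function, the session-token seeding, and the rest of the snippet are illustrative assumptions rather than our actual survey implementation.

```python
import random

# The seven threats covered by the survey (from the list above); the
# assignment logic itself is a hypothetical sketch, not the code we used.
THREATS = [
    "phishing",
    "malware",
    "online harassment",
    "online reputation attack",
    "physical device compromise",
    "surveillance",
    "attacks on online services",
]

def assign_threat(session_token: str) -> str:
    """Pick one threat for the follow-up mitigation questions.

    Seeding on an anonymous session token keeps the assignment stable if
    the respondent reloads the page, without storing any identifiers.
    """
    rng = random.Random(session_token)
    return rng.choice(THREATS)

# Example: this respondent answers mitigation questions about one randomly
# chosen threat, while the risk-perception items still cover all seven.
print(assign_threat("anonymous-session-42"))
```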

In addition to following standard questionnaire design guidelines, such as designing scales to measure the constructs, minimizing survey response time, and protecting the confidentiality of responses, we addressed several considerations that were specific to employees at CSOs: establishing trust, preserving anonymity, and recruitment challenges.

Establishing trust. While communicating research risks is essential in any study that involves human subjects, it is especially important when surveying vulnerable populations. More specifically, revealing information about the current security practices and priorities of a CSO could place its employees and the organization as a whole at heightened risk. While our survey was completely anonymous and did not collect identifiers of any kind, we also had to ensure that our respondents felt safe enough to provide information related to our research goals. In addition to communicating our commitment to anonymization in the consent form, we highlighted the anonymity of the survey again in a separate location within the survey itself, in order to establish trust with respondents and relieve their concerns. This also increased the chance that respondents were aware of the safeguards employed, even if some of them did not read the consent form in its entirety.

Using anonymity-preserving incentive strategies. We wanted to provide incentives to respondents as compensation for their time and to increase participation rates in the survey. Because we did not assign any identifiers to the survey participants, we could not follow up with them to provide direct compensation. Instead, at the end of the survey, respondents could select one of three charities, among which we will proportionally divide our compensation budget at the end of the study. Research has shown that material incentives (either monetary or non-monetary) [15] and sharing result summaries [16] increase response and completion rates for online surveys without affecting the quality of responses [17, 18].
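As a rough illustration of this proportional split, the following short Python sketch divides a donation budget according to respondents' charity choices. The charity names, vote counts, and budget figure are hypothetical placeholders, not numbers from our study.

```python
from collections import Counter

def split_budget(choices: list[str], budget_usd: float) -> dict[str, float]:
    """Divide a fixed donation budget across charities in proportion to
    how many respondents selected each one."""
    tally = Counter(choices)
    total = sum(tally.values())
    return {charity: round(budget_usd * count / total, 2)
            for charity, count in tally.items()}

# Hypothetical example: three charities, ten recorded choices, a $500 budget.
votes = ["Charity A"] * 5 + ["Charity B"] * 3 + ["Charity C"] * 2
print(split_budget(votes, 500.0))
# -> {'Charity A': 250.0, 'Charity B': 150.0, 'Charity C': 100.0}
```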

Using a trusted intermediary for participant recruitment. In order to reach our target population, we partnered with TechSoup, a nonprofit that coordinates an international network of other nonprofits and provides them with technical support, training, and tools. We distributed our survey in one of TechSoup’s periodic newsletters, which allowed us to leverage their large reach among nonprofits and their existing connection to our target audience. The survey was promoted via a banner ad that included an anonymous link to our survey. The content and format of the banner had to accommodate the conventions established in TechSoup’s newsletters; recruitment text with a stronger call to action might have attracted more attention, but this compromise was well worth it to gain direct access to our target population. Additionally, we are able to disseminate key findings back through TechSoup’s platform so that respondents, who participated anonymously, can review and learn from the results of the study.

Challenges

After analyzing the responses and the feedback from respondents, we identified several issues with our first iteration of the survey, including the survey length, incorrect terminology, non-applicable questions, and the design of the banner ad.

Length of the survey. By far the most common feedback we received was that the survey was too long and/or contained repetitive questions (mentioned by 41% of respondents who provided feedback). Our survey had a median completion time of 19.8 minutes, which is longer than the recommended 10 minutes for online surveys [19].

To address this challenge, we suggest narrowing the scope of the research questions to reduce the number of constructs measured by the survey, or using a between-subjects approach, i.e., presenting only a subset of questions to each participant when the expected sample size is large enough (as sketched below). We also suggest providing feedback to respondents as they complete the survey, for instance by displaying their current progress and explaining that although questions might seem repetitive, they in fact measure different constructs.
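One way to implement the "subset of questions per participant" idea is to randomly assign each respondent a few blocks of the questionnaire rather than the whole instrument. The Python sketch below illustrates this; the block names and question IDs are made up for the example and do not correspond to our actual survey modules.

```python
import random

# Illustrative question blocks; the names and question IDs are placeholders.
QUESTION_BLOCKS = {
    "risk_perception": ["q1", "q2", "q3"],
    "mitigation_strategies": ["q4", "q5", "q6"],
    "organizational_policies": ["q7", "q8"],
    "prior_incidents": ["q9", "q10"],
}

def questions_for_respondent(n_blocks: int = 2) -> list[str]:
    """Show each respondent a random subset of blocks, shortening the
    survey at the cost of needing a larger overall sample."""
    chosen = random.sample(list(QUESTION_BLOCKS), n_blocks)
    return [q for block in chosen for q in QUESTION_BLOCKS[block]]

print(questions_for_respondent())
```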

Incorrect terminology. Another issue we discovered was the use of the word ‘employee’ throughout the survey to refer to the participant. At the beginning of the survey, we clarified that we make no distinction between different capacities of involvement with the organization and use the word ‘employee’ only for brevity. Nevertheless, 12% of respondents who provided feedback mentioned that they felt confused when responding to questions because their organization had few or no employees and was composed mostly of volunteers.

To avoid this issue, it is important to remember that individuals engage with CSOs in different capacities, including as employees, contractors, and volunteers. When referring to the respondent directly, one approach is to use neutral phrasing that encompasses anyone working at a CSO (e.g., “as someone working for a civil society organization, consider the following…”). For questions about the organization itself, we recommend exhaustively listing the different options to avoid any confusion (e.g., “How many individuals (including employees, volunteers, contractors, etc.) currently work for your organization?”).

Non-applicable questions. Our aim was to survey a broad range of CSOs, regardless of the cause they support, their position in the sector, or their size, which meant that some questions had to be broad enough to cover all types of organizations. As a result, 17% of respondents who provided feedback mentioned that they were unable to answer some of the questions because they did not apply to them. For instance, some respondents mentioned that their organization was too small to have information security policies, or that they did not report to the IT department because they were the IT department.

To address this problem, we recommend including a ‘Not Applicable’ option for each question, alongside an optional free-text box that can be used to provide additional comments and clarifications. 

Conclusion

We applied survey-based methods to understand cybersecurity concerns and practices of CSO employees, including their perceived risks of different security and privacy threats, and their self-reported mitigation strategies. 

The design of our preliminary survey accounted for the unique requirements of our target population by establishing trust with respondents, using anonymity-preserving incentive strategies, and distributing the survey with the help of TechSoup. Nonetheless, by carefully examining our methods and the feedback we received from respondents, we uncovered several issues with our methodology.

I hope that this discussion of challenges will benefit anyone planning or already designing a study using quantitative methods. If you have any questions or would like to learn more about this work, do not hesitate to reach out to me!

Acknowledgments

I would like to thank the Citizen Clinic at the Center for Long-Term Cybersecurity (CLTC) and the members of the Berkeley Lab for Usable and Experimental Security (BLUES) for their support and for providing expert input and review of our survey instruments. I also thank TechSoup for their collaboration and for providing access to their network of nonprofits during this research project. This research is sponsored by funding from the CLTC at UC Berkeley.

References

[1] World Bank Group, 2020. Civil Society Policy Forum
[2] Lipton, E., Sanger, D.E. and Shane, S., 2016. The perfect weapon: How Russian cyberpower invaded the US. The New York Times, 13. 
[3] Scott-Railton, J., Marczak, B., Guarnieri, C. and Crete-Nishihata, M., 2017. Bitter Sweet: Supporters of Mexico’s Soda Tax Targeted With NSO Exploit Links.
[4] Brandom, R., 2016. Anonymous groups attacked Black Lives Matter website for six months. The Verge. 
[5] Glaser, A., 2020. Bail organizations, thrust into the national spotlight, are targeted by online trolls. NBC News. 
[6] Accenture and Ponemon Institute, 2019. Ninth Annual Cost of Cybercrime Study.
[7] Marczak, W.R., Scott-Railton, J., Marquis-Boire, M. and Paxson, V., 2014. When governments hack opponents: A look at actors and technology. In 23rd USENIX Security Symposium (USENIX Security 14) (pp. 511-525).
[8] Marczak, B., Scott-Railton, J. and McKune, S., 2015. Hacking team reloaded? US-based Ethiopian journalists again targeted with spyware. Citizen Lab, 9.
[9] Crete-Nishihata, M., Dalek, J. and Deibert, R., 2014. Communities @ Risk: Targeted Digital Threats Against Civil Society. Citizen Lab, Munk Centre for International Studies, University of Toronto.
[10] Deibert, R.J., Rohozinski, R., Manchanda, A., Villeneuve, N. and Walton, G.M.F., 2009. Tracking GhostNet: Investigating a cyber espionage network.
[11] Berkeley Lab for Usable and Experimental Security (BLUES)
[12] McGregor, S.E., Roesner, F. and Caine, K., 2016. Individual versus organizational computer security and privacy concerns in journalism. Proceedings on Privacy Enhancing Technologies, 2016(4), pp.418-435.
[13] Marczak, W.R. and Paxson, V., 2017. Social Engineering Attacks on Government Opponents: Target Perspectives. Proceedings on Privacy Enhancing Technologies, 2017(2), pp.172-185.
[14] Le Blond, S., Cuevas, A., Troncoso-Pastoriza, J.R., Jovanovic, P., Ford, B. and Hubaux, J.P., 2018, May. On enforcing the digital immunity of a large humanitarian organization. In 2018 IEEE Symposium on Security and Privacy (SP) (pp. 424-440). IEEE.
[15] Göritz, A.S., 2006. Incentives in web studies: Methodological issues and a review. International Journal of Internet Science, 1(1), pp.58-70.
[16] Dillman, D.A., 2011. Mail and Internet surveys: The tailored design method--2007 Update with new Internet, visual, and mixed-mode guide. John Wiley & Sons.
[17] Göritz, A.S., 2004. The impact of material incentives on response quantity, response quality, sample composition, survey outcome and cost in online access panels. International Journal of Market Research, 46(3), pp.327-345.
[18] Sánchez-Fernández, J., Muñoz-Leiva, F., Montoro-Ríos, F.J. and Ibáñez-Zapata, J.Á., 2010. An analysis of the effect of pre-incentives and post-incentives based on draws on response to web surveys. Quality & Quantity, 44(2), pp.357-373.
[19] Revilla, M. and Ochoa, C., 2017. Ideal and maximum length for a web survey. International Journal of Market Research, 59(5), pp.557-565.