Abstract

While many cybersecurity culture studies have focused on identifying and measuring an organization's cybersecurity culture—assumptions, values, behaviors, and artifacts—less research has focused on how cybersecurity culture is enacted in the daily workplace in ways that lead to cultural change. In this paper, I approach cybersecurity culture as a meaning-making activity, or practice, within an organization. Organizational theory on narrative practices—including storytelling, sensemaking, and sensegiving—provides a conceptual framework for understanding cultural meaning-making practices, as well as how those practices shape decision-making and organizational actions. Using ethnographic observation and interview data, I conducted a narrative analysis of interdisciplinary communication between IT and Facilities professionals working with Internet of Things vendors and their associated risks. The findings demonstrate that storytelling, sensegiving, and sensemaking practices were key to developing an emerging narrative that shaped professional and organizational decision-making to improve cybersecurity. The results suggest that a narrative approach to cybersecurity culture can illuminate practices of cultural meaning-making and organizational decision-making, and that organizations should provide resources for IT and Facilities professionals to engage in interdisciplinary work to create a more robust cybersecurity culture in Facilities departments.

Introduction

Research on cybersecurity culture in organizations often focuses on identifying what cybersecurity culture is, the components of cybersecurity culture (e.g. assumptions, values, behaviors, artifacts), and finding ways to measure those components in order to design educational programs that improve cybersecurity culture in organizations [1]. These studies often approach cybersecurity culture with certain assumptions: that technology users are the “problem” and that organizational cultural change is a top-down affair, where leaders and managers develop programs and policies to correct weak cybersecurity behaviors and poor security knowledge. These studies also primarily focus on an organization's cybersecurity culture as it pertains to traditional IT technologies and threats, such as spamming and phishing campaigns directed at employees.

A growing body of literature has begun to approach cybersecurity culture through a different lens: identifying the multiplicity of cybersecurity cultures, how cultural understandings of cybersecurity and risk are constructed in specific contexts, and the implications of these cultural constructions. Some of this literature has adopted Weick's concept of sensemaking, a meaning-making activity that people use to make sense of ambiguities or conflicts in their work lives [2]. However, there are still gaps in how we understand an organization's cybersecurity culture, particularly as it pertains to its enactment in the daily lives of employees in ways that lead to determining risk and instigating cultural change. Furthermore, there is a growing need to understand cybersecurity culture(s) due to the increase of Internet of Things (IoT) devices in the built environment, as these devices are designed and maintained by stakeholders that include not only IT professionals, but also Facilities managers, capital project managers, and many others.

This article approaches cybersecurity culture in organizations by focusing on how culture is enacted through narrative communication practices. In this paper, I define cybersecurity culture as a meaning-making activity, or practice, in which different cybersecurity values, assumptions, and artifacts are created, debated, negotiated, and maintained by members of an organization. The research questions this study answers are as follows:

  • How do Facilities and IT workers use narrative practices to construct cybersecurity values, assumptions, and artifacts?

  • How do these narrative practices shape organizational decision-making and change organizational culture?

This approach shifts away from a perspective that treats stories as an outcome or artifact of culture toward a narrative paradigm in which stories are also creators of culture [3]. Specifically, this article focuses on narrative practices surrounding events with IoT vendors that create “risk.” In the teams we studied, “vendor stories” were commonly told in interdisciplinary meetings between IT and Facilities workers managing and planning smart infrastructures on a large higher educational campus. “Risk,” as a concept, is a useful entrée into understanding negotiated cultural meaning-making, as different professional disciplines tend to ascribe different meanings to “risk” according to their own work needs.

In this paper, I demonstrate how, in the groups we observed, members used narrative practices of storytelling, sensemaking, and sensegiving to negotiate and debate the risks of individual vendors. These activities linked to an overarching narrative within the interdisciplinary groups: that untrusted vendors carried cybersecurity and operational risks that were increasingly unmanageable and that “something needed to be done about it.” This overarching narrative not only helped establish a multivalent concept of risk that could be shared across disciplinary divides, but also led the interdisciplinary group to take actions to change the culture of the Facilities department in ways that incorporated their own cybersecurity values. Thus, narrative practices of storytelling, sensemaking, and sensegiving are not only mechanisms for negotiating meaning, but also shape decision-making in ways that instigate cultural change.

In the following, I discuss past approaches to cybersecurity culture that have used an integration theoretical perspective, as well as emerging approaches using sensemaking theory. I then discuss the conceptual framework for this article, which draws from organizational studies literature on narrative practices that occur in conversation, including the use of storytelling, sensemaking, and sensegiving. Next, I discuss the organizational and institutional context in which these interdisciplinary groups and their vendor stories were embedded. This is followed by two vignettes. The first demonstrates how storytelling, sensemaking, and sensegiving were used to negotiate vendor risk as well as to connect to an overarching narrative about the problem with vendors and the need for change. The second vignette details a group's encounter with a vendor and their use of narrative practices both to manage the vendor and to use the story of the vendor as a catalyst for cultural change within the Facilities department. This is followed by a discussion of the overarching cultural narrative of vendors and their associations with different types of risk. I conclude with a short discussion of the importance of using ethnography and a narrative conceptual approach to cybersecurity culture, how this research adds to the literature, and this article's limitations.

Related works

Cybersecurity culture has been considered an “ill-defined problem” [4], as well as an important area for future study [1]. Prior research on cybersecurity culture, as well as risk culture, has primarily adopted a specific definition of organizational culture that focuses on defining and identifying what culture is rather than how culture is made. Reasons for this oversight are closely linked to the theoretical and methodological approaches that many researchers have adopted.

Many researchers studying cybersecurity culture adopt Schein's definition of organizational culture [1,5]. Schein's view [6] is what organizational theorist Joanne Martin calls an integration perspective on organizational culture [7,8]. This perspective seeks organization-wide consensus among all organizational members, with all cultural manifestations consistently reinforcing the same themes and ambiguity excluded from cultural definitions and descriptions [7,8]. In addition, organizational cultural change is often viewed predominantly as a management or leadership activity intended to shape employees [6,8]. Adopting this theoretical perspective has led cybersecurity studies to view cybersecurity culture as an organization-wide consensus on specific cybersecurity beliefs, values, assumptions, and artifacts, which, in turn, has led to research that tries both to define the components of a cybersecurity culture and to measure the strength of cybersecurity culture within an organization (i.e. worker cybersecurity knowledge and potentially risky computer activities). Through these activities, researchers can then identify cybersecurity education and training that organizational leadership can use to develop a stronger cybersecurity culture.

For example, Da Veiga [9] adopts Schein's definition of organizational culture and uses it as the framework for a survey-based Cybersecurity Culture Research Methodology (CSeCRM) designed to identify and measure an organization's cybersecurity culture. This approach, however, implicitly dismisses cultural inconsistencies in an organization, including organizational subcultures and cultural ambiguities that may include different definitions of risk that could be of value for ensuring an organization's cybersecurity. (Da Veiga and coauthors do consider the importance of organizational subcultures and different subcultural groupings in an organization in their work on information security culture. See [10,11].) Surveys such as the CSeCRM may also fall short of improving managerial attempts to design effective cybersecurity training and education, as the diversity of cultural meanings of terms such as risk or security may not be fully understood or may be interpreted differently across the organization.

Huang and Pearlson's cybersecurity culture model is similar to Da Veiga's but does include group-level factors that focus on interaction and co-creation (e.g. interdependent collaboration, teamwork perception), as well as external environmental impacts [12]. They then test the model by conducting a case study using structured interviews and document collection. However, their work still predominantly takes an integration perspective: treating cybersecurity culture as an object to be found and pursuing organization-wide cybersecurity cultural consensus through management-driven cultural change [12].

In a related stream of research, Corradini and Nardelli used comparable methods and theoretical frameworks to study risk culture and risk perception in order to design cybersecurity awareness programs [13]. They adopted a two-stage survey on individual risk perceptions to identify shared values, beliefs, knowledge, and understandings of risk. As with Da Veiga, the focus is on identifying shared cybersecurity cultural consensus using surveys [13].

Quigley, Burns, and Stallard take a different approach to risk perception through a psychological lens [14]. They view the meanings of risk as something constructed in an individual's mind based on confirmation bias, fear, and familiarity [14]. They conduct a rhetorical analysis of cybersecurity publications to identify how risk is constructed through stylistic devices, appeals, and argumentation that create narratives of risk [14]. While their conceptual framework is centered on psychological theories of perception, there is still the implicit view that shared cultural meaning must be identified; cultural construction is treated as an individual endeavor that can be found in texts, ignoring how cultural meaning is constructed through daily interaction and practice.

Despite the predominance of integration perspectives in cybersecurity culture research and a strong reliance on surveys, interviews, and textual analysis to identify cybersecurity culture and cultural perceptions of risk, there is also emerging work drawing on organizational literature that centers ambiguity and sensemaking. For example, Lakshmi et al. use Weick's concept of sensemaking [2] to bring to the forefront the importance of negotiations and interactive meaning-making during cybersecurity incidents [15]. Likewise, Watkins et al. adopt sensemaking theory in their study of journalists' cybersecurity practices in news organizations [16]. This work brings into focus the fragmented nature of cybersecurity sensemaking among professionals who lacked cybersecurity training or backgrounds in IT, which can lead to ad hoc and not always well-informed cybersecurity practices [16]. In these examples, cultural ambiguity and practices of meaning-making are highlighted.

There is also qualitative cybersecurity research investigating security narratives, workplace culture, and effectiveness. Busse, Seifert, and Smith's study of two IT security firms' “security” narratives found that one firm's “narrative uncertainty” around the term “IT security” provided opportunities to frequently discuss tensions between different worker narratives [17]. These narrative discussions were viewed as beneficial for promoting greater creativity and mutual education in the IT security department [17]. In contrast, the second firm they investigated, which had general consensus around worker “IT security” narratives, had greater conflict between worker “security narratives” and the firm's security guidelines, which were seen as contradictory [17]. Busse, Seifert, and Smith's findings call into question whether narrative consensus around specific cultural terms, such as security (or risk), is required, or even beneficial, depending upon an organization's cultural history(s) and worker dynamics [17]. Their approach helps us better understand cybersecurity cultural meaning-making and provides an opportunity to think about the role of narrative in cybersecurity culture.

Therefore, while there has been important work to better understand organizations' cybersecurity cultures and perceptions of risk, much of it has investigated cybersecurity culture as an object within an organization to be identified. In addition, studies using textual analysis, surveys, and interviews often collect “snapshots” of an individual's, organization's, or institution's perceptions of what their cybersecurity culture is and how risk is understood at a given time, not how those perceptions were constructed in the past or may change in the future. Those that do use textual analysis to trace cultural constructions of cybersecurity over time have focused on the development of public rhetoric [18] rather than the development of cybersecurity culture within an organization.

Thus, there is a need for research on cybersecurity within organizations that approaches cybersecurity culture (as well as risk culture) by adapting theory flexible enough to capture how workers negotiate meanings and shape cybersecurity culture and cybersecurity decision-making over time. Doing so is particularly beneficial for understanding the emerging cybersecurity culture(s) in Facilities departments, which are responsible for the cybersecurity of IoT devices and device systems in the built environment. IoT devices are often new to many Facilities workers, who frequently do not have an IT background. Likewise, IT cybersecurity specialists are often new to built environment technologies and do not always fully understand the operational needs and risk cultures of Facilities workers. Therefore, it is likely that Facilities and IT will have vastly different cultural perceptions of what cybersecurity and risk mean within their organizations. Organizational theory that focuses on cultural ambiguities, conflicts, and emerging cultural practices can inform researchers about the nuances of cybersecurity culture within organizations and help identify how to manage cybersecurity culture within different parts of an organization.

Conceptual framework

This study uses a conceptual framework that centers narrative and storytelling as practices for engaging in cybersecurity cultural meaning-making and cultural change. The concepts delineated below come from organizational literature on narrative practices used to negotiate and develop shared cultural assumptions, values, and behaviors. These practices not only shape how others interpret a story, but also shape the types of decisions and actions that an individual or group can take to instigate cultural change. Therefore, this conceptual framework provides a means of investigating cybersecurity culture as an activity and of identifying when specific narrative practices are at play: moments of negotiated and shared meaning-making and decision-making that shape cybersecurity culture within organizations.

Studies of narrative practices and their function have a long history in organizational research. The distinction between narrative and story has been widely contested, with sometimes competing definitions as well as researchers using the terms interchangeably [19]. For example, Czarniawska defines a story as an “emplotted narrative” and a narrative as a chronological account [20]. However, many researchers have moved away from definitions of narrative that require chronological storytelling. For this paper, I borrow from the work of Boje and of Hawkins and Saleem, which views a story as reflections on past experiences and a narrative as the addition of coherence and plot: the cognitive framework used to make sense of specific events, challenges, or other aspects of organizational life [19,21]. These definitions of story and narrative help us better understand how narrative works as a practice consisting of storytelling, sensemaking, and sensegiving in social and organizational contexts.

Storytelling is not an unbiased or neutral endeavor. Storytellers use fragmented stories, also known as antenarratives [22], to generate meaning by connecting to other stories or narratives. Boje defines antenarratives as fragmented, unplotted, and incoherent stories told or performed in conversation that are both a “speculation and a bet” [22]. These storytelling practices are polyphonic and embedded in community practices [23], and can be used to negotiate the meaning of a story through “glossing” [22], which shifts or exaggerates meaning in new ways. Antenarratives can also be evoked to co-construct meaning through discontinuity, tensions, and editing [24]. Discontinuity allows meaning to be made through ambiguous connections between stories, tensions allow multiple perspectives to compete in a search for coherence, and editing is where storytellers use “slippages,” “gaps,” and “linkages” to modify, omit, or emphasize parts of a story [24]. The choices a storyteller makes are connected to their audience: storytellers attend to their listeners' “paralinguistic and kinesic cues” during their performances, reacting to these cues to shift the way they tell their story for the greatest effect [25].

Organizational research on narrative has also often focused on the social and communication practices of sensemaking and sensegiving in the production and exchange of stories and narratives. Sensemaking is a subjective, retrospective, social activity that attempts to give meaning to ambiguous or equivocal events [2,26]. Sensemaking is more than simple meaning-making, “but the active authoring of the situations in which reflexive actors are embedded and are attempting to comprehend” [26, p.267]. Thus, sensemaking is an interactive, communicative practice where the invention of meaning and interpretation of an event or a story is linked to one's personal and professional social reality. As a practice of authored meaning-making, sensemaking can also present itself in the use of nonstructural small stories [23,27,28] or antenarratives [22].

While sensemaking is about “making sense” of ambiguities, sensegiving is about conveying the meaning of a story to an audience: influencing organizational members when there are gaps in their sensemaking [29]. In this way, sensegiving is about influence [29] and is often connected to organizational change [30] or to strategizing a narrative for a particular audience [31]. Sensegiving, like sensemaking, is often found in the creation of narratives and stories and is situated and dependent upon the relationship between the storyteller and audience [25].

Storytelling, sensemaking, and sensegiving are all part of group narrative practices that shape, maintain, and negotiate a shared culture, which in turn help to reaffirm goals, group roles, and shared concerns [23]. However, narratives can also shape and constrain decision-making, organizational arrangements, and future actions. For example, Abolafia's work demonstrated how individual sensemaking shaped collaborative policy decision-making and eventual sensegiving to an outside audience [31]. Likewise, Law's theory of narratives as “modes of ordering” [32] and Doolin's empirical analysis of “clinical leadership” narratives [33] suggest that narratives can structure organizational relations and reinforce specific social orders. In addition, Deuten and Rip have found that narratives can become “infrastructure” with the agency to enable or constrain decision-making and future actions [34]. Therefore, while storytelling, sensemaking, and sensegiving can help us better understand the mechanisms of negotiated meaning-making, thinking of these practices as part of a larger narrative process provides insight into how they create both constraints and affordances surrounding the range of possible interpretations of a story, as well as what decisions and actions can be taken.

Methodology

This study uses ethnographic data and a narrative analytical approach to understand how storytelling and narrative practices were used in vendor stories to negotiate meaning and create organizational change. Ethnographic research is useful for uncovering the complexities of what would generally be considered mundane practices, such as daily workplace activities and the negotiations and debates around everyday experiences [35]. The benefit of ethnographic research lies in its ability to observe human behaviors and events in context and then produce thick descriptions [36] that describe and interpret their meaning in great detail. This approach is useful when trying to better understand how meaning-making emerges from situated actions and daily dialogue. It is also key for studying the production of narratives in organizations [37].

Narrative analysis in ethnography is also a common analytical approach to better understanding the meaning of key events for ethnographic participants. It is particularly useful for examining complex, subjective meaning-making practices and how meaning is negotiated in participant interactions [38]. In organizations, narrative analysis can help us to see how storytelling is embedded in both micro-level relationships between the storyteller and the audience, as well as macro-level social, cultural, and organizational processes beyond the moment of storytelling interaction [28]. Thus, narrative analysis not only connects stories to their immediate storytelling context, but to the organizational context in which the storyteller is situated.

I use data from a larger 3-year research study on interdisciplinary communication and coordination between IT and Facilities professionals to improve IoT cybersecurity. The project consisted of 119 hours of ethnographic observations of interdisciplinary work meetings that included IT and Facilities staff from a higher educational institution involved in planning, deploying, and managing IoT devices and systems. Our observations also included project team meetings for two new construction projects using IoT systems. Our research team met weekly to discuss ongoing ethnographic findings and to work iteratively on our thematic analysis of IT and Facilities collaboration.

Because COVID-19 limited our ability to conduct in-person observations, most of our research consisted of virtual ethnography: observing work meetings on Zoom. Meetings were not recorded, but one or two team members attended as observers and took in-depth notes on meeting conversations. Our study also included 42 expert interviews with Facilities and IT professionals and five case studies of higher education institutions in North America on their collaborative IT and Facilities cybersecurity practices.

For this paper, I focus on a subset of ethnographic data consisting of stories and narratives about vendors and their relationship to risk: what I call “vendor stories.” This subset consisted of field notes from fifteen separate meetings that occurred during an 18-month period between 2020 and 2022. Each meeting was approximately 1 hour in length.

After identifying vendor stories in our ethnographic data set, I conducted a thematic analysis that identified how each vendor was characterized, the type of risk associated with the vendor, whether there were any calls to action to mitigate those risks, and the professional role and disciplinary background of the storyteller or speaker. Themes were generated through iterative coding of the ethnographic case study based on “grounded theory” [39]. A thematic analysis was also conducted on the expert interview data using the same technique and then compared with the ethnographic data. Any vendor theme that did not occur in both datasets was eliminated. This analysis indicated a broad narrative, extending beyond our single ethnographic case, about types of vendors, their characteristics, and their relationship to risk. I then selected two vendor stories from this dataset and conducted a close narrative analysis of the field notes, identifying occurrences of storytelling, sensemaking, and sensegiving and how these activities linked to the broader narrative about vendors and risk.

The 3-year study was approved by our Institutional Review Board as minimal risk, with a waiver of consent for any subjects whose contact information we could not obtain. We were given permission to join the meetings by Facilities leadership. Before we began observing at any of our field sites, a member of our research team emailed meeting participants an information statement about the study. Our first day at each field site consisted of introducing the project and providing copies of the information statement. When a new participant joined a meeting after our initial first day, we contacted them separately by email with our reason for attending meetings and the information statement. The information statement provided contact information for research team personnel and informed participants that anyone wanting to opt out of the research should contact one of the researchers. For anyone who opted out, no information about them, their dialogue, or their activities was collected in the study.

For interviews, all subjects were provided with an information statement and we used an oral consent process. Only the research team had access to the data, and the names and organizations in our dataset were coded after research completion. All subjects and organizations in this paper have been given pseudonyms. (For more information on our methodology, please see [40].)

The setting and vendor stories

In the higher educational institution we observed, the increasing number of IoT devices (e.g. smart lighting, smart energy meters, automated building facades) being integrated into the built environment had brought specific challenges for the campus's central IT and Facilities professionals. Challenges included an increasing number of IoT devices that required long-term maintenance (including firmware updates); a lack of skill sets in both IT and Facilities to maintain IoT devices specific to the built environment, or operational technology (OT) systems; and a general lack of efficient coordination between the IT and Facilities departments tasked with managing the cyberinfrastructure required for the increase in smart building systems.

This led one of the campus's cybersecurity specialists and their Facilities colleagues to create collaborative interdisciplinary working groups among the campus staff involved in the security, management, operations, and repair of IoT devices and device systems. These groups included central IT staff (e.g. specialists in network infrastructure and cybersecurity), members of different Facilities units (e.g. utilities workers, sustainability advisors, building maintenance and repair technicians), and “trusted” vendors working as consultants on specific types of networked technologies being used in campus buildings (e.g. electrical engineering consultants). The meetings were organized around specific topics of interest related to IoT technologies and systems on campus and were spaces where participants could share work stories, expertise, and ongoing challenges, and engage in technological and policy planning and decision-making for future IoT infrastructure. The participants in these meetings frequently used storytelling, sensemaking, and sensegiving during discussions, creating a shared understanding between workers around types of IoT risk, what creates risk, and how to mitigate risk in a smart built environment.

In the following, I focus on two stories that took place in two different collaborative interdisciplinary meetings. The first is the Future Infrastructure Team (FIT), which focused on critical infrastructure systems networked in the built environment. These meetings focused both on planning for the future of campus infrastructure and on managing issues and challenges currently being experienced in old buildings and in new constructions on campus. Infrastructure systems frequently under discussion included different IT-related networks (both public and private), lighting systems, Heating, Ventilation, and Air Conditioning (HVAC) systems, shading systems, energy management systems, and building access systems. Challenges concerned infrastructure cybersecurity (the security of devices and networks); device management; the security, management, and maintenance of public and private network infrastructure; and data and network infrastructure planning.

The second meeting I focus on is the Metering Innovation Group (MIG). This group focused solely on networked energy meters to ensure that building energy data could be efficiently and effectively collected across campus. This data was important not only for ensuring that buildings were running at top efficiency, but also for meeting state compliance requirements regarding energy use and for sharing with researchers, students, and other on-campus stakeholders who needed the data for their own work or research. The group also focused extensively on planning the future of campus-wide energy management systems that could be easily managed by the Facilities team of technicians and maintenance staff, the Facilities department's own IT staff (who stored meter IP addresses and subnet information), and staff in central IT, who managed the servers and firewall policies for the private Facilities network. While this group was not specifically about cybersecurity risks related to external threats (e.g. hacking), it did have a cybersecurity specialist as a regular member to provide input on security risks related to metering networks, potential loss of data or data integrity, and potential adverse impacts on campus operations.

From the moment our research team began our observations in FIT and MIG meetings, vendors played a dominant role in worker stories about workplace challenges and risk. Vendors were, more often than not, the antagonists of these stories. Part of this tension between vendors and Facilities and IT staff was the sheer number of vendors whose equipment and applications were being used on campus. Having such a large and diverse set of vendors to track and manage made it difficult to keep up with IoT system operations, management, maintenance, and security while working with devices and applications that had different technical needs and required different technical skill sets. In addition, Facilities and IT personnel were often not made aware of IoT devices that were already being procured for building projects or had already been installed.

For example, during one of the first meetings we observed, a question was posed to John, a Facilities energy conservation specialist, about how IoT generally entered campus. John replied that it most often occurred due to the replacement or renewal of building systems. He then added:

“Lighting controls, for example, are being purchased right now [for a project]. Just found out about this from contracts. And it's like, yet another vendor.”

One worker joked in reply, “Is that one of our existing six [vendors]?”

A second worker added, “We got six in housing.”

A third worker added, “We got six on this floor.”

The group chuckled together throughout this exchange.

This snippet of dialogue highlighted the ongoing tension between the university's IT and Facilities workers and vendors through John's fragmented storytelling about discovering a new vendor and the meaning he attached to the story through the phrase “yet another vendor,” indicating frustration. Other members of the group responded using joking replies that increasingly emphasized the sheer number of vendors that Facilities and IT have to know about and manage. John's fragmented story and the joking replies that followed are part of the emerging “vendor problem” narrative.

For many in these working groups, IoT risk was both cybersecurity risk and operational risk; the two were always closely entwined. If an IoT device was a security risk, it could potentially be breached and provide threat actors access to different campus networks, including those that were historically intended to be closed, or “on an island.” These security risks could then lead to operational risks: devices might not function properly, leading to downtime, repair, and increased costs. In turn, IoT operational risks meant that devices might not be installed or function properly, which could lead not only to repair and potentially costly rework, but also to the device being vulnerable to security threats and therefore becoming a cybersecurity risk. While the IT specialists in the working groups we observed often raised concerns about security risks and Facilities specialists focused on operational risks, the two were often discussed in tandem and as two halves of a singular problem: IoT risk. Because of the length of time many in these groups had spent with one another, Facilities workers came to associate IoT risks with both security and operational risks, and IT workers likewise understood that security risks were operational risks.

In the following vignettes, I demonstrate how IT and Facilities workers used storytelling, sensemaking, and sensegiving to negotiate and debate vendors and their risks, as well as to link to broader narratives of vendors and risk that shaped decision-making to pursue policy and cultural changes. These activities were the “narrative building blocks” of a larger shared cultural narrative amongst these groups that recurred throughout our time observing this higher educational institution: that vendors carry multivalent meanings of risk, and that IT and Facilities workers need to mitigate this risk in their daily work and by instigating broader organizational change.

Vendor stories and negotiations of risk

Interpreting vendors and their risk level was not always a matter of consensus. Often, sensemaking around vendor risk began with the sharing of fragmented personal stories, or antenarratives, about worker experiences with vendors. Workers would then engage in negotiated sensemaking and sensegiving both to assign risk to a vendor and to link the meaning of these stories to a shared narrative of vendor risk and the need to do something about it. These interactive communication practices helped determine what “type” of vendor workers were dealing with, the types of risk associated with the vendor, and how to mitigate specific vendor risks. We can see these practices at work in the following vignette, in which a Facilities worker in a FIT meeting shares his experience during a vendor training session about a lighting system in a newly constructed building. The vignette shows how storytelling and negotiated sensemaking and sensegiving connect to a larger shared narrative about vendor risk on campus and the need for organizational change.

During a FIT meeting on Zoom, the cybersecurity strategist from the IT division of campus, William, prompted a worker from Facilities' IT unit, James, to present to the group what he learned about the new lighting control system. James shared a PDF file he received from the lighting control vendor during an in-person training session the vendor held to teach Facilities personnel about the new system. The most interesting thing about this system, according to James, is that it has four lighting control modules with unique IP addresses that control distinct zones of the building.

James is optimistic about the system and particularly about the 500+ page manual that the vendor provided. He feels confident that he has all the specific information that the university needs. He believes this system is preferable because it is compatible with the Facilities network and the company is local, so technical support will be readily available. James tells the group that there will be another training on the software later in the year. Some of the deliverables that will be provided by the vendor include a floor plan locating and listing each sensor and device. This floor plan will be clickable and interactive.

Here, at the beginning of the vignette, we find James sharing with the group a story about his experiences during a vendor training session, including presenting the documentation provided to him during the training. However, James' story is not a neutral retelling; it is already engaged in acts of sensegiving for his audience. He frames the vendor in a positive light, as one that can potentially be trusted. His evidence for this interpretation is that the vendor provided extensive training, extensive information on the system, and is a local company that will be able to easily provide technical support. His storytelling is an attempt to give sense to others that the vendor is not going to be unresponsive, unreliable, or lacking the knowledge and skills needed to ensure the system will run successfully. This vendor is portrayed as an example of a trusted vendor.

Despite James' sensegiving, the rest of the group begins a process of negotiated sensemaking around the training materials and the technology the vendor is providing. During this process, specific members of the group attempt to make sense of James' story by testing whether the lighting controls vendor might actually display risky vendor characteristics they have encountered before.

The group's discussion during and after James' presentation touches on various technical topics, but the primary concern is that each lighting control module in the system has a Wi-Fi adapter. This means a technician within 20–50 ft of the modules can work wirelessly to make adjustments, change schedules, etc.

William, an employee from the CISO office, asks Amanda, a network infrastructure specialist from central IT, to give some feedback about the Wi-Fi adapter on the system. Specifically, he wants to know if this creates a dual-homed device (a device that sits between a trusted network and an untrusted, publicly accessible one).

Amanda responds that she is not sure and that “we [the university] haven't had Wi-Fi for the Facilities network yet, so that would have to change.”

There is a general concern amongst team members about how the short-range Wi-Fi capability could become a security vulnerability. Ben, an electrical engineer in Facilities, suggests leaving the Wi-Fi adapter unplugged “unless we need it.” Jacob, an IT business specialist in Facilities, states that the short range of the Wi-Fi doesn't mean it's not a vulnerability.

William notes that it's important to enact a “turn off everything we don't need” protocol. He then uses this to segue to a more general conversation about vendor protocols. He says that there is a need to respond to the fact that “vendors turn on everything and we need to make our policy to turn it off unless it's being used.”

In this section of the vignette, we see the negotiation process of sensemaking and sensegiving taking place. William first tries to make sense of whether the technology will create a network vulnerability and asks Amanda to provide her expertise as to whether the lighting system could become a dual-homed device: a potential security risk. Amanda's response is ambiguous, highlighting that this would alter the closed Facilities network, but it remains unclear whether the system will be a security vulnerability. The Wi-Fi also raises concerns, and several members of the team suggest controlling the system by not using the Wi-Fi adapter. William uses this opportunity to connect the current lighting system vendor story to a larger narrative about vendors on campus being a security risk. In particular, he uses the lighting vendor security discussion to insert a fragmented story reflecting his past experiences with vendors that “turn on everything” in order to give sense to vendor risk and to connect to a larger narrative: that there needs to be a broader institutional response to control these types of vendors and their security risks.
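In security terms, the “turn off everything we don't need” protocol William describes is an allowlist-based hardening policy: any service a vendor ships enabled by default is disabled unless it is explicitly required. As a minimal, purely illustrative sketch (not drawn from our field site; the device names, service names, and allowlist below are hypothetical), such a policy could be operationalized as a simple audit comparing each device's enabled services against an approved list:

```python
# Illustrative sketch of an allowlist-based "turn off everything we don't
# need" audit for IoT devices. All device and service names here are
# hypothetical examples, not data from the field site described in this paper.

# Services explicitly approved for use on the Facilities network.
ALLOWED_SERVICES = {"bacnet", "https_mgmt"}

# Hypothetical inventory: each device and the services the vendor shipped enabled.
device_inventory = {
    "lighting-module-01": {"bacnet", "https_mgmt", "wifi_adapter", "telnet"},
    "lighting-module-02": {"bacnet", "https_mgmt", "wifi_adapter"},
}

def audit_device(enabled_services: set) -> list:
    """Return the enabled services that the policy says should be turned off."""
    return sorted(enabled_services - ALLOWED_SERVICES)

for device, services in device_inventory.items():
    to_disable = audit_device(services)
    if to_disable:
        print(f"{device}: disable {', '.join(to_disable)}")
    else:
        print(f"{device}: compliant")
```

The design point mirrors the group's reasoning: rather than debating each vendor default individually, anything not on the approved list is presumed unnecessary and switched off.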

In the next segment of the vignette, the conversation returns to James' story as other characteristics of the vendor surface through continued negotiated sensemaking and sensegiving.

The conversation then moves back to this particular instance of the new lighting controls. This package was already purchased from the vendor, and now there is some concern that the university is being forced into a system where they will have to retrain people. Ben notes that “it's gonna kill us” to have to retrain this much.

James continues to be optimistic about the situation and mentions again that the vendor is a local company, which is good for technical support. James adds, “They say you don't have to worry about firmware updates, and it should be good to go for ten years.”

Ben asks, “How are we supposed to know if something needs fixing?”

James responds, “The vendor said they would let us know.”

Members of the team begin to express skepticism about whether the vendor will really let them know, whether the vendor or the university will be able to fix problems, and whether the components are proprietary.

In this segment, negotiated sensemaking surrounding vendor risk continues. Ben tries to make sense of the vendor by implying that the vendor's system will have a negative impact in the form of an increased workload for the staff. James continues to engage in sensegiving to provide an optimistic interpretation of the vendor as a trusted party that can mitigate operational risk: the vendor won't be unresponsive when needed and can be depended upon to provide technical support. Ben then engages in more sensemaking, trying to better understand whether the vendor will truly be responsive when support is needed. However, James' attempts to interpret the vendor in a positive light do not satisfy the group, and there remains cynicism and now an overarching negative interpretation of the vendor as untrustworthy. The implications of an untrusted vendor are high, as it means a higher workload for Facilities staff, as well as the risk that devices might not receive firmware updates in time or might lack the components necessary for repair, leading to IoT operational and security risks.

However, after this initial cynicism, Ben appears to consider James' interpretation of the story through a fragmented story that links the lighting vendor to trusted vendors he has encountered in the past.

Ben then says that in regard to expiring software or hardware and proprietary components [with IoT systems], he has had success with some vendors coming to him preemptively to say “this [technology] is going to be obsolete soon.”

Here, Ben is again engaged in sensemaking, considering an alternative interpretation of the vendor's risk based on his own positive work experiences with some vendors. The group then links James' story and the ensuing dialogue of sensemaking and sensegiving to a larger narrative: the campus's need for a broad strategy to manage and control lighting vendors to improve network security.

John, an energy manager in Facilities, notes that lighting system procurement is especially difficult because there are some large distributors and “you can't get away from them” because they have monopolized the market. He asks the group why they don't manage lighting systems the same way as the meter systems and move toward a campus-wide networked system. This would mean that the university would “have more control.” He adds that this would require their team to ask campus procurement to issue a Request for Proposals [RFP] responsive to a campus-wide network that would force vendors to be compatible. However, a complication here is that different protocols in different lighting systems make a campus-wide solution more complicated for staff to maintain.

Ben adds that it “becomes a terrible thing to troubleshoot.”

William jokes around about how some systems “fly low under the radar” and “sneak in.”

Jacob adds that he has seen this as well on another campus where he used to work.

William adds that the “challenge with the Facilities network being one big thing is that then it's all exposed.”

What's evident in this part of the conversation is that despite James' best sensegiving efforts to frame the lighting controls vendor as trusted, and Ben's fragmented sensemaking story indicating that some vendors were honest about outdated technology, the conversation moves back into the larger narrative of the need for campus-wide vendor control, supported by more fragmented stories about vendor systems that “sneak in” and the assertion that the Facilities network still remains vulnerable. In this way, James' storytelling and sensegiving about the lighting control vendor became subsumed into the larger narrative as yet another case highlighting the need for the group to find a way to control the risks raised by vendors. At this point in the meeting, the conversation on the lighting controls vendor ends and the group moves to other topics.

This vignette demonstrates how sensemaking and sensegiving play out in negotiation between IT and Facilities personnel and how these communication practices link to the broader narrative of vendors to determine a vendor's associated risks (both operational and security). James' storytelling acts as the initial communicative action that leads the group to test the potential trustworthiness of the vendor through questions designed to make sense of the vendor, and by establishing linkages to past experiences told through fragmented stories, often presented by William, the cybersecurity specialist, as “facts” in the workplace (e.g. “vendors turn on everything,” systems “fly low under the radar” and “sneak in”). The interaction sustains a shared narrative of vendor risk within the group while also allowing for negotiation and debate around what individual vendors mean for the group and where they fit in the narrative in terms of their potential operational or security risks.

From negotiating risk to pursuing cultural change

In the collaborative meetings, there was ongoing concern about how to make broader organizational and policy changes in the Facilities organization given the increasing number of IoT devices, data, and associated risks. Many strategies for organizational change centered on how to persuasively communicate IoT risk and risk mitigation to executive leadership in the Facilities department, as well as to the Facilities unit responsible for capital projects and the IoT procurement decisions that occur during new construction and major building renovations. Discussions of these strategies focused on changing the cultural values and assumptions that Facilities' top leadership held around IoT systems (e.g. values of efficiency, innovation, cost savings) to include the IoT risks that Facilities and IT workers were experiencing in their everyday work. These risks included the high costs of long-term IoT device maintenance, higher maintenance workloads, unclear data ownership and management with vendors, and the ways these risks lead to cybersecurity vulnerabilities. A perceived obstacle to shifting leadership's cultural views was vendors and their sales pitches.

This vignette centers on the MIG, made up of IT and Facilities middle managers who had been working with two electrical engineering vendors (ENGCo and ElectricCo) to develop a long-term plan for energy metering infrastructure. Here, narrative practices such as storytelling and sensemaking were used not only to interpret vendor risk, but also to lead this interdisciplinary collaborative working group to strategize how to use vendor interaction as an opportunity to give sense to Facilities leadership about the broad range of vendor risk and the importance of their own data infrastructure plan. The hoped-for outcome was to shift Facilities leadership's cultural assumption that IoT vendors provide positive opportunities toward assumptions aligned with the interdisciplinary group's broader narrative concerning vendor risk and the need to ensure data ownership.

During the MIG meeting on campus, John, the energy manager in Facilities, informs the group that they have a surprise meeting set up for today, which he describes as the result of a “guerilla activity” that occurred yesterday when John met with members of the executive leadership in the Facilities department. John explains to the group that one member of Facilities leadership had a past work associate who is now “leading a startup company” called AutoDATA. John was tasked with setting up a meeting between AutoDATA and MIG to hear AutoDATA's “sales pitch” for a cloud-based data management system for Facilities' assets.

Miles, a consultant from a large engineering firm, ENGCo, notes that for AutoDATA's system to work, AutoDATA would require a BIM (Building Information Model) for each location. John tells Miles that AutoDATA had said BIM was not needed, but Miles says they would still require a BIM model. Miles begins reading about the company on their website and tells the group that the website claims setting up AutoDATA's service is “zero effort.” He then snorts, making it clear that he doesn't believe it's zero effort.

In this first segment, John is engaged in storytelling about his meeting with Facilities leadership and their request that his team meet with the start-up company. Just as James engaged in sensegiving about his experience with the lighting controls vendor, John engages in sensegiving for his audience, describing the Facilities leadership's request to arrange a meeting with AutoDATA as a “guerilla activity,” framing the request as a surprise. John's description of the company as a “startup” giving a “pitch” also aligns with language the groups we observed typically used to describe vendors that act more like snake oil salesmen, marketing devices that are often too good to be true. These salesman-type vendors were also noted to occasionally pitch their products in shifty ways (e.g. one story about a vendor sales pitch at the university involved a vendor hacking a device, informing the university that the device was not secure, and then trying to sell a security solution).

After hearing the story, Miles engaged in sensemaking about the vendor from his own professional knowledge and experience as a vendor himself: bringing up potential obstacles to AutoDATA's service, such as the need for a BIM model in every location, even asserting that this would be needed after John noted that AutoDATA had said it was not. (In the building industry, large, older owner organizations, such as federal government agencies and higher education campuses, often cannot have a BIM model for every location, much less a complete and accurate model for every building.) Miles also found the company website and read their marketing materials, which claimed their service is “zero effort” for the client: a claim that did not resonate with Miles (made clear through his snort). As a trusted vendor in the MIG group, Miles' comments hold weight, as he is seen as someone with professional authority who is highly knowledgeable about the university's long-term technical and organizational needs for energy systems and data management solutions.

The meeting with AutoDATA then begins via Zoom with an exchange of organizational narratives: the story of MIG and the story of AutoDATA.

John provides the two AutoDATA representatives with a history of the MIG, including that the group is seven years old and interdisciplinary, noting that Jacob from Facilities and William from IT are their IT specialists. He then states that the installation of smart electrical meters on campus was part of the American Recovery and Reinvestment Act (ARRA) funding. The team had mostly focused on data pipeline activity and was now focusing on data collection tools for analysis and business use cases of the data set. Most of their work so far has been manual and in pilot form. He adds that the group would be most interested in data acquisition and control rather than starting from scratch.

The AutoDATA representatives begin with a slide presentation on the future of building automation as a means to achieve better energy performance. AutoDATA's service focuses on creating a virtual model of the building systems themselves, with output that pinpoints the locations of Facilities issues requiring maintenance. The representatives use case studies where AutoDATA has been applied successfully to illustrate their service.

Here, John's narrative was focused on the past and present: the history of MIG, their current focus on energy data, and their work to date. The narrative is framed in a way that emphasizes their current goals of institutional data ownership and control: a key long-term goal intended to mitigate the vendor risks associated with loss of data and data quality. John also notes that the team is not interested in “starting from scratch,” indicating hesitancy to start with a new technology solution. AutoDATA's narrative, in contrast, is future-focused and technosolutionist, emphasizing the benefits of building automation for stronger energy performance and efficient maintenance. Discussion of, or even allusion to, risk was nonexistent.

After the presentation, MIG team members are invited to ask questions. Initial questions concern how AutoDATA collects data from different types of databases, how they would collect data points from the utilities plant on campus, how they use the data, and who would claim ownership of data.

John then explains, “We're in the process of purchasing a new server and a data management program, DATASoft, to replace an older one. I think the one case study [that the AutoDATA team presented in their PowerPoint presentation] could be a good model for doing analysis. One of the reasons we're fixated on owning data is we use analysis as a way of raising the capability of our in-house resources and building the competency of our operational and maintenance staff. So, while we appreciate the speed of how an automated tool can help us achieve solutions, we want to make sure that human learning moves along with machine learning. So, we want people to see the data to make sure they understand it, know what to do with it, and are competent with it. Can you speak to how you all meet that challenge?”

The response from AutoDATA is that they want to provide transparency to technicians as well, not just to verify that the data is correct but to help technicians learn from the data. The issue, in their telling, is not that people cannot do this work, but that computers can identify issues so that technicians can focus only on the data that concerns their particular issue, rather than sorting through the multitude of data being collected by every device.

Here, John's story of MIG's current activities of purchasing a new server and DATASoft—a decision made only after extensive discussion across multiple collaborative meetings with a broad range of IT, Facilities, and trusted vendor consultants—is an attempt to give sense to AutoDATA that automation is less important to MIG. Rather, MIG values data ownership and control and frames these values as institutionally important for the opportunity they provide to build resources for managing data and to improve the competence and skill sets of the work staff. AutoDATA counters John's sensegiving, stating that they too want “to help technicians learn from the data,” but that their service will bring focus to the data (and thereby improve efficiency in maintenance).

John then tells a story about how he and a building electrician were in a campus building that was having an electrical fault in which fuses kept overloading on motor fans. John noted, “Nothing seemed to help. I went to Geoff, our energy engineer, to see what was up.” Geoff was able to quickly identify that the issue was with an air handling unit that was, through a sequence of events, “blowing the fuse.” John added, “So we really see the power of what you guys were doing, but it's important to us that we don't get derailed from this careful set of work processes and training of our technical and vendor resources to build something we can manage and operate. So, scale isn't the secret to all of this, respectfully, but we are concerned we will leave our ability to manage and maintain the system behind if we do it. We will have a whole bunch of data but will not be able to take care of it and it will exceed our capacity. We are interested in exploring further with you how we might be able to do that and to see how your process fits in.”

Here, John uses the story of Geoff as a sensegiving mechanism to illustrate MIG's wariness of the new vendor's services. John's story emphasizes the power of people and their expertise to identify device issues requiring repair or maintenance. The story is used to highlight that their concerns are less about data and automation and more about developing the processes and training that will fit their institution's needs. The conversation ends with negotiations between AutoDATA and the MIG group about what a pilot project would look like at the university, the need for AutoDATA to partner on it with the group's trusted vendor, ENGCo, and next steps for pursuing the pilot.

Two weeks later, the meeting with AutoDATA becomes its own fragmented story that the MIG group gives sense to by linking to the broader narrative surrounding untrusted vendors.

John notes that he is getting other vendors “knocking on the door,” and that one he sees as “immature” is AutoDATA. John sees AutoDATA as an opportunity to get everyone [i.e. Facilities leadership] to see the necessity of the new server and DATASoft.

William adds, “I get hit up by these kinds of guys too and if there is some commonality—these solutions push costs onto us.” William requests that they all convene and talk “before pulling a trigger on this.”

Here, we see John and William, the cybersecurity specialist, exchanging fragmented stories of vendors that bring more risk than benefit to campus. John links a fragmented story of other vendors “knocking at his door” with the story of AutoDATA and its “immaturity” as a vendor. John then uses this as a catalyst for the MIG group to take action to get leadership to agree with their own server plan. William adds his own fragmented story about the large number of vendors that contact him and gives sense to it by linking it to the often-unforeseen risk of high long-term costs of new products and services, emphasizing the need for the group to discuss the AutoDATA situation together before a final decision is made on the AutoDATA pilot.

The conversation then moves to a negotiated sensemaking: trying to work out the remaining ambiguities surrounding AutoDATA's intentions and their implications for university decision-making about data infrastructure planning.

Julian, a representative in the group from another trusted vendor, ElectricCo, states that AutoDATA “took no time to take a shot at DATASoft” and that this is a concern. If this is what AutoDATA is saying to those at the level of Facilities leadership, Julian argues, leadership might not understand what such claims really mean. They might hear from AutoDATA that DATASoft “is not real software, not supportable, updateable,” and those messages, sent to the top level of the Facilities hierarchy, could be detrimental to the university building a legitimate data infrastructure.

Jacob, from Facilities' IT division, counters that he does not think this was a threat. His understanding is that the solution AutoDATA was offering required existing data and that their focus was on new buildings.

To this, William suggests they charge for their data, because AutoDATA and other vendors have a strong desire to get onto a campus. He adds, “these guys get value out of our data. I'm not trying to be obstructionist, but our data is valuable, and it has a brand, and we should utilize that.”

Much like the sensemaking about the lighting vendor in the first vignette, these sensemaking practices allow for debate about AutoDATA's real intentions and produce linkages between AutoDATA and several risks commonly associated with vendors in the broader vendor narrative: vendors have high unforeseen long-term costs, they do not understand the university's infrastructure history or long-term plans, they have the power to shape leadership's beliefs and assumptions about risk in ways that could disrupt the group's work, and they are highly motivated by the accumulation of valuable and potentially profitable data. As with the earlier vignette, despite the majority of the group associating the vendor with negative risks, the meaning of risk with this vendor is still being negotiated. Jacob's use of sensemaking frames AutoDATA's presentation as nonthreatening and of minimal risk to the university's valuable data.

The conversation then turns to strategically using storytelling to mitigate the potential risk of AutoDATA and similar vendors going to leadership about their new products: to bring the interdisciplinary groups' narrative of vendors and risk to Facilities leadership.

John adds that he promised Facilities leadership that he would provide them with a briefing on AutoDATA. He asks William and Jacob to help. Miles also offers to help, adding that vendors like AutoDATA are “a dime a dozen. They all use the same open-source code,” and that this company is just one of many.

William sees this as a communication issue: vendors are going to keep knocking on senior leadership's doors, and the team should have a strategy for dealing with it.

Winston, another ENGCo representative, shares an ENGCo-branded presentation slide deck that begins to give these conflicting and coexisting concerns a visual form.

John suggests additions to the slides, such as information on building metering, the difference between Information Technology and Operational Technology, and IoT. He then suggests creating a single presentation on these topics for their senior leadership in Facilities, seeing a need for language that will help leadership respond when these vendors reach out.

It is here that we see the MIG members move from a strategy for giving sense to Facilities leadership about AutoDATA to a decision to strategize how to bring the broader narrative of vendor risk to Facilities leadership. This activity is viewed as a potential mechanism for actual organizational change: bringing the collaborative group's narrative of vendor risk to leadership in a way that is palatable and that will create a shared language surrounding different technologies (including IoT) and their risks to Facilities.

After this meeting, Julian, Miles, William, and John continued to discuss how to engage AutoDATA in other MIG meetings. The team then went on to prepare recommendations for engaging with vendors, which they would present to Facilities leadership in a future meeting. The purpose of the recommendations was, as John put it, to “ensure we successfully communicate clearly to them the criteria for engaging with vendors, and that the framing is reasonable, and what the desired outcomes of MIG are.”

Several months after these events, it was clear that Facilities leadership was quickly gaining interest in vendor cybersecurity and operational risks. William started a new interdisciplinary group specifically on smart campus infrastructure that consisted of many members from MIG and FIT. Part of the group's task was to provide a deliverable to the head of Facilities as it was noted that leadership had become “a little spooked about cybersecurity” and wanted the working group to come up with “some kind of action items.”

The group's purpose was to develop a smart campus framework that aligned with existing university values, priorities, and objectives. The group had been tasked with producing deliverables for Facilities leadership to help them better understand the risks and opportunities of IoT systems on campus. This provided yet another opportunity to give sense to leadership about IoT cybersecurity, operational, and cost risks, and to confront the vendor sensegiving that occurred when vendors contacted Facilities leadership and pitched their technologies. As a cybersecurity specialist once noted in a meeting:

The big picture challenge is describing that [risk] to leadership. The technology that's there, the data: so much of what they get is vendors banging on, ‘you're going to have this thing and you'll have it automatically.’ And often that's not true. So how do you communicate with leadership when you're up against vendors.

John had also made headway with the capital projects unit of Facilities: he had been able to add specifications requiring that several trusted vendors vet other vendors' products in building projects earlier in the design process. These specifications were the beginning of new policies in the Facilities organization for mitigating operational and data risks with vendors. Each of these activities was an attempt to give sense about vendor risk across the Facilities department and to generate a new culture in Facilities consisting of new values and assumptions pertaining to vendor risk: to have MIG and FIT narratives of vendor risk embraced by leadership and across the Facilities organization.

Discussion: cultural constructions of vendor risk

A conceptual framework focused on narrative provides a means of investigating how narrative practices of storytelling, sensemaking, and sensegiving construct and shape cybersecurity culture. In the two vignettes, Facilities and IT workers used narrative practices to construct cybersecurity values, assumptions, and artifacts. These narrative practices produced cultural constructions of vendor risk, while also shaping decisions about how to improve cybersecurity and decrease risk. Together, these practices generated and sustained a narrative of vendors and their associated risks (see Table 1). Workers also used storytelling, sensemaking, and sensegiving to assign their own values and interpretations to a specific vendor story to determine where a vendor fit in the narrative, as seen in the debates surrounding the lighting vendor in the FIT meeting and the early meetings where MIG team members discussed AutoDATA.

Table 1. Characteristics of vendor types and their risks.

Trusted vendors
Characteristics: Knowledge of institution. Ongoing, long-term relationships. Engaged in long-term institutional decision-making. Reputation and authority.
Relationship to risk: Can be used to mitigate risk. Control untrusted vendors. Control access to network and data for institution.
Example from field note data: He gives an example of campus elevators as something they could control centrally. They could have specialized expertise in management and maintenance of those systems, as well as have special vendors that they manage them with across campus.

Untrusted vendors
Characteristics: Unknown to the institution. Set unrealistic expectations about technology. Provide IoT solutions without plans for long-term maintenance and security.
Relationship to risk: Raise security risk. Raise operational risk. Raise miscommunication of risk.
Example from field note data: William added, “if the notion was the devices have to come out in a year, what will happen is they'll be left and the device is just floating in the building and the network.”

Rogue vendors
Characteristics: Poke “holes” in closed network infrastructures, often without informing the institution.
Relationship to risk: Raise security risks.
Example from field note data: William says that there is a need to respond to the fact that “vendors turn on everything. And we need to make our policy to turn off [everything] unless it's being used.”

Uneducated vendors
Characteristics: Do not understand, know, or have technical knowledge about the security requirements for device installation and use.
Relationship to risk: Raise security risks.
Example from field note data: The building automated access manager notes that panels must be hardened to not be susceptible to hacking. “We rely on the vendor and the technician they send to do the job. Some understand the procedures, but others don't.”

Unresponsive vendors
Characteristics: Do not always inform the institution when devices should be updated or if there is a malfunction.
Relationship to risk: Raise security risks. Raise operational risks.
Example from field note data: Ben asks, “how are we supposed to know if it needs fixing?” James responds, “The vendor said they would let us know.” There is a lot of skepticism expressed in the group around whether the vendor will really let them know.

Salesman vendors
Characteristics: Market a new device or solution. Device sounds “too good to be true.”
Relationship to risk: Raise security and operational risk through miscommunication of risk to other stakeholders.
Example from field note data: William asked, “How did we find out about the device?” Miles replied, “A vendor told us about it. This vendor said he had IoT security solutions. His means of selling was to hack you and then sell you something.”

The narrative itself organized vendor characteristics into values centered on trust. Trusted vendors, such as ENGCo or ElectricCo, were viewed as having long-term knowledge of the institution and as engaged in long-term working relationships with IT and/or Facilities professionals. They were also often discussed in meetings as part of the solution for generating organizational change. Other applications of trusted vendor interpretations were seen in stories about professionals or firms that had strong technical knowledge, were available for troubleshooting (as demonstrated in James' assertion that the lighting vendor could be trusted), or were generally not a threat to future infrastructure planning.

In contrast, untrusted vendors were often framed in sensemaking and sensegiving as businesses or individuals who raised security and operational risks for the university. Storytelling between IT and Facilities workers created and drew upon cultural conceptions of risk that carried specific assumptions about what untrusted vendors were like. When workers used sensemaking or sensegiving in their storytelling, they often pulled from the narrative that untrusted vendors were unknown to the institution, often set unrealistic expectations about technology—usually making big promises about cost savings while ignoring the costs of upkeep of devices and systems—and often provided IoT solutions without clear plans for long-term IoT management, maintenance, and security. In the stories and conversations that I observed, different “subtypes” of untrusted vendors were associated with different types of risk.

In addition, a conceptual framework focused on narrative practices of sensemaking, sensegiving, and storytelling highlights how these practices shaped organizational decision-making and worked toward changing Facilities leadership's cultural perceptions of risk. Knowing that Facilities leadership was not aware of the cybersecurity and operational risks that were mounting at the university due to the increase of IoT (and IoT vendors), working groups were consciously (and unconsciously) using storytelling, sensemaking, and sensegiving to construct a new risk culture that they hoped would bubble up to the top levels of the Facilities organization. A part of this risk culture was a cybersecurity culture, which was being promoted intentionally not only by the cybersecurity strategist, but also by members within Facilities management. Much of this communication work centered on the narrative the Facilities and IT workers had constructed of the untrusted vendor as the antagonist in the story of Facilities workers living in a world of rapidly changing technology that was now falling under their purview on campus.

This was perhaps best seen in the MIG meeting where participants used storytelling, sensemaking, and sensegiving to cast the vendor as a type of snake oil salesman knocking on leadership's doors. This type of vendor was, perhaps, the greatest antagonist for campus workers, as they were not only associated with operational and security risks but also seen as having the potential and means to miscommunicate risk, or remain silent on risk, to other institutional stakeholders with greater decision-making power. Miscommunication to leadership was a critical issue because leadership was responsible for making final campus infrastructure decisions and was not well-versed in cybersecurity or operational risks. It was also critical because it could shape leadership's beliefs about risk and security: vendors were potential cultural influencers. These salesman vendors were also noted in stories for their marketing campaigns, often characterized in ways that indicated dishonesty, such as William's suggestion that AutoDATA was primarily interested in selling a product for the purpose of owning valuable data that the company could monetize. It was this type of vendor, and shared stories, sensemaking, and sensegiving about similar daily vendor experiences, that instigated many of the attempts to push Facilities leadership to adopt a new narrative on vendors that included their associated cybersecurity and operational risks to the campus.

A conceptual framework focused on narrative practices also benefits cybersecurity cultural research in that a very different view of cybersecurity culture comes to light. First, it becomes clear that the meanings of concepts such as risk perception are more than an individual endeavor, as assumed by Quigley, Burns, and Stallard [14]. Instead, cybersecurity cultural meaning-making is a dialogical practice—something done through interaction centered on stories of past experiences, future concerns, and the sensemaking and sensegiving that occurs between a storyteller and their audience.

A narrative conceptual framework also allows us to shift from viewing cybersecurity culture as an object of organization-wide consensus about cybersecurity practices, as seen in Da Veiga [9], Huang and Pearlson [12], and Corradini and Nardelli [13]. What we find, instead, is a diversity of meanings surrounding conceptions of risk, as well as other cybersecurity values and concerns. In my observations, Facilities workers also valued how “risk” was viewed from a cybersecurity perspective, as strong cybersecurity of IoT had the potential to minimize their operational risks and meet policy requirements. This aligns with Busse, Seifert, and Smith's [17] study of security narratives, which found that narrative consensus around cultural terms (such as risk and security) may not be required and that tensions between narratives could even be beneficial.

I also arrived at findings similar to those of Lakshmi et al. [15], who highlighted the importance of sensemaking negotiation and interactive meaning-making for improving cybersecurity. I suggest that a narrative conceptual framework that identifies multiple narrative practices provides researchers with more conceptual tools for understanding how cybersecurity narratives and culture emerge, are maintained, and shape organizational decisions.

Limitations

Limitations to this research are inherent to the method of ethnographic research and a narrative approach. This paper focuses on only one university, making it difficult to generalize about the dynamics of storytelling and narrative beyond the institution under investigation. A narrative approach is also an interpretivist lens with a very different approach to reliability and validity. As multiple interpretations of a narrative may be valid, the narrative must be grounded in the collected data with a careful eye to context [41]. While we followed our research subjects as much as we could in the online and offline workplace, we were not able to see all communications surrounding vendors that may have occurred in meetings we were not invited to, such as offshoot meetings between select individuals within different interdisciplinary groups around a specific task. It is possible that other narratives about vendors were at play in these settings.

Conclusion

This research adds to the current literature on cybersecurity culture in several distinct ways. First, this study builds on the work of Busse, Seifert, and Smith [17] and Lakshmi et al. [15] by adding other key narrative concepts from organizational studies, including sensegiving, fragmented or short stories, and a view of narrative as a process of negotiated cultural meaning-making. In addition, ethnographic methods, in conjunction with a narrative lens, shift our understanding of cybersecurity culture: they provide a methodological and conceptual paradigm for studying cybersecurity culture as a set of discursive practices that are situated and emergent, performative and interactive, and centered on ambiguities, conflicts, and negotiations. What is of interest is not where cybersecurity culture is located but how culture is made; this focus helps researchers better understand the discursive mechanisms and work practices that can lead to cybersecurity cultural change.

Second, this approach can also identify how cybersecurity culture is not always top-down, shaped only by IT managers or CISOs, but can be shaped from the bottom up, through middle management and workers in the field. This may particularly be the case for cyberinfrastructures in the built environment as these systems are governed or maintained by organizations without IT histories, with different conceptions of risk, and with different types of technologies. Future researchers could adopt ethnographic methods and organizational theory that focuses on daily communication and work practices to better understand not just what an organization's cybersecurity culture looks like at a moment in time, but how it is lived in the everyday workplace.

Third, a narrative and ethnographic approach demonstrates the power of interdisciplinary communication and collaboration to generate cybersecurity cultures and produce a shared risk culture where different professionals from different disciplines can find alignment in cultural meanings and goals for change. This is a particularly important finding for those large organizations with Facilities departments that are currently struggling to manage the increasing number of IoT devices in the built environment and to ensure those devices are both secure and operational. Planning the future of smart infrastructure in these types of organizations or institutions should include interdisciplinary relationship-building and collaborative work to determine the needs of both IT and Facilities to better mitigate risk. External vendors could be one solution, particularly when they are viewed as “trusted” by internal organizational members.

This aligns with Evripidou et al.'s [42] recommendation that cybersecurity solutions vendors could help organizations in the early stages of their security cultural development by helping those working in OT organizations break down communication and knowledge barriers. In addition, we suggest that managers and leaders in IT who are concerned about the security of built environment systems should work with Facilities department leadership and workers to find pathways for building interdisciplinary relationships between IT and Facilities. This requires that an organization provide its IT and Facilities departments with the resources needed, in terms of man-hours and space, for IT and Facilities professionals to engage in interdisciplinary dialogue. Such dialogue can help IT and Facilities workers and leaders negotiate the meanings of terms such as risk and cybersecurity, and develop behavioral and management solutions to improve cybersecurity in our built environment.

Finally, this research adds to the cybersecurity culture literature by focusing on the concept of risk as it relates to an increasingly smart built environment. The smart built environment is an integration of IT and OT that must be managed by both IT and Facilities workers. The two sets of professionals come from distinctly different engineering traditions and hold different expectations about how to operate and maintain technology and about the risks associated with different technologies. While an IT professional may view a hacked smart HVAC system as a cybersecurity risk related to data theft or as a potential weak point where a threat could access enterprise networks, a Facilities professional may view the risk of the same HVAC system through an operational or safety lens. Even when cybersecurity risk is a concern for Facilities, the potential risks may be less about data privacy and more about how security risks affect building operations or create liability and reputational risks.

Despite these differences, the working groups we observed found ways for both types of risk to merge and intersect. Cybersecurity practitioners should consider that cybersecurity cultural concepts, such as risk or data security, will have multivalent meanings—not just within IT departments, as Lakshmi et al. found [15], but across an organization. Learning more about the different meanings and values surrounding concepts such as risk in other disciplines and organizational departments can be useful for finding mutual alignment in meanings and for making stronger arguments for the need for improved cybersecurity practices, procedures, and policies in other organizational departments. Future research could also investigate how interdisciplinary groups or teams in organizations may affect cybersecurity culture and whether there is strength in having diverse cultural meanings for cybersecurity and risk.

Author contributions

Laura D. Osburn (Conceptualization [lead], Data curation [equal], Formal analysis [lead], Investigation [lead], Methodology [lead], Writing—original draft [lead], Writing—review & editing [lead])

Conflict of interest

None declared.

Funding

This material is based upon work supported by the National Science Foundation (NSF #1932769, “SaTC: CORE: Medium: Knowledge Work and Coordination to Improve O&M and IT Collaboration to Keep Our Buildings Smart AND Secure.”)

References

1. Uchendu B, Nurse JRC, Bada M et al. Developing a cyber security culture: current practices and future needs. Comput Secur. 2021;109:1–38.
2. Weick KE. Sensemaking in Organizations. Thousand Oaks: Sage Publications, 1995.
3. Weick KE, Browning LD. Argument and narration in organizational communication. J Manag. 1986;12:243–59.
4. Gcaza N, von Solms R. Cybersecurity culture: an ill-defined problem. In: Bishop M, Futcher L, Miloslavskaya N et al. (eds.) Information Security Education for a Global Digital Society. Cham: Springer International Publishing, 2017, 98–109.
5. Reegård K, Blackett C, Katta V. The concept of cybersecurity culture. In: Proceedings of the 29th European Safety and Reliability Conference (ESREL 2019), Hannover, DE. Singapore: Research Publishing Services, 2019, 4036–43.
6. Schein EH. Organizational Culture and Leadership. San Francisco: John Wiley & Sons, 2010.
7. Martin J. Organizational Culture: Mapping the Terrain. Thousand Oaks: Sage Publications, 2002.
8. Martin J. Cultures in Organizations: Three Perspectives. New York: Oxford University Press, 1992.
9. Da Veiga A. A cybersecurity culture research philosophy and approach to develop a valid and reliable measuring instrument. In: Proceedings of 2016 SAI Computing Conference (SAI), 13–15 July, London, UK, 2016, 1006–15.
10. Da Veiga A, Martins N. Defining and identifying dominant information security cultures and subcultures. Comput Secur. 2017;70:72–94.
11. Da Veiga A, Astakhova LV, Botha A et al. Defining organisational information security culture—perspectives from academia and industry. Comput Secur. 2020;92:101713.
12. Huang K, Pearlson K. For what technology can't fix: building a model of organizational cybersecurity culture. In: Proceedings of the 52nd Hawaii International Conference on System Sciences. Manoa, HI: HICSS, 2019, 6398–407.
13. Corradini I, Nardelli E. Building organizational risk culture in cyber security: the role of human factors. In: Ahram TZ, Nicholson D (eds.) Advances in Human Factors in Cybersecurity. Vol. 782. Cham: Springer International Publishing, 2019, 193–202.
14. Quigley K, Burns C, Stallard K. ‘Cyber Gurus’: a rhetorical analysis of the language of cybersecurity specialists and the implications for security policy and critical infrastructure protection. Gov Inf Q. 2015;32:108–17.
15. Lakshmi R, Naseer H, Maynard S et al. Sensemaking in cybersecurity incident response: the interplay of organizations, technology and individuals. A Virtual AIS Conference, Marrakesh, Morocco, 2021.
16. Watkins EA, Al-Ameen MN, Roesner F et al. Creative and set in their ways: challenges of security sensemaking in newsrooms. In: Proceedings of the 7th USENIX Workshop on Free and Open Communications on the Internet. USENIX Association, 2017.
17. Busse K, Seifert J, Smith M. Exploring the security narrative in the work context. J Cybersecur. 2020;6:1–12.
18. Bernal V. The cultural construction of cybersecurity: digital threats and dangerous rhetoric. Anthropol Q. 2021;94:611–38.
19. Hawkins MA, Saleem FZ. The omnipresent personal narrative: story formulation and the interplay among narratives. J Organ Change Manag. 2012;25:204–19.
20. Czarniawska-Joerges B. Narratives in Social Science Research. Thousand Oaks: Sage Publications, 2004.
21. Boje DM. Narrative Methods for Organizational and Communication Research. Thousand Oaks: Sage Publications, 2001.
22. Boje DM. The storytelling organization: a study of story performance in an office-supply firm. Adm Sci Q. 1991;36:106–26.
23. Georgakopoulou A. Small Stories, Interaction and Identities. Philadelphia: John Benjamins Pub. Co, 2007.
24. Humle DM, Pedersen AR. Fragmented work stories: developing an antenarrative approach by discontinuity, tensions and editing. Manag Learn. 2015;46:582–97.
25. Greatbatch D, Clark T. The situated production of stories. In: Organization, Interaction and Practice: Studies in Ethnomethodology and Conversation Analysis. Cambridge: Cambridge University Press, 2010, 96–118.
26. Brown AD, Colville I, Pye A. Making sense of sensemaking in organization studies. Org Stud. 2015;36:265–77.
27. Bamberg M, Georgakopoulou A. Small stories as a new perspective in narrative and identity analysis. Text Talk Interdiscip J Lang Discourse Commun Stud. 2008;28:377–96.
28. de Fina A, Georgakopoulou A. Introduction: narrative analysis in the shift from texts to practices. Text Talk Interdiscip J Lang Discourse Commun Stud. 2008;28:275–81.
29. Robert K, Ola L. Reflexive sensegiving: an open-ended process of influencing the sensemaking of others during organizational change. Eur Manag J. 2021;39:476–86.
30. Prior DD, Keränen J, Koskela S. Sensemaking, sensegiving and absorptive capacity in complex procurements. J Bus Res. 2018;88:79–90.
31. Abolafia MY. Narrative construction as sensemaking: how a Central Bank thinks. Org Stud. 2010;31:349–67.
32. Law J. Organization, narrative and strategy. In: Hassard J, Parker M (eds.) Towards a New Theory of Organizations. London: Routledge, 1994, 248–68.
33. Doolin B. Narratives of change: discourse, technology and organization. Organization. 2003;10:751–70.
34. Deuten JJ, Rip A. Narrative infrastructure in product creation processes. Organization. 2000;7:69–93.
35. Smith V. Ethnographies of work and the work of ethnographers. In: Atkinson P, Coffey A, Delamont S et al. (eds.) Handbook of Ethnography. Thousand Oaks: Sage Publications, 2014, 220–33.
36. Geertz C. The Interpretation of Cultures: Selected Essays. New York: Basic Books, 1973.
37. Czarniawska B. A Narrative Approach to Organization Studies. Thousand Oaks: Sage Publications, 1998.
38. Cortazzi M. Narrative analysis in ethnography. In: Atkinson P, Coffey A, Delamont S et al. (eds.) Handbook of Ethnography. Thousand Oaks: Sage Publications, 2014, 384–94.
39. Strauss AC, Corbin JM. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Thousand Oaks: Sage Publications, 1990.
40. Osburn L, Snider M, Dossick CS. Doing online ethnography: reflections on methods and challenges in hybrid environments. In: Embracing Ethnography: Doing Contextualised Construction Research. New York: Routledge, 2024.
41. Webster L, Mertova P. Using Narrative Inquiry as a Research Method. New York: Routledge, 2007.
42. Evripidou S, Ani ED, Hailes S, Watson MDJ. Exploring the security culture of operational technology (OT) organisations: the role of external consultancy in overcoming organisational barriers. In: Proceedings of the Nineteenth Symposium on Usable Privacy and Security (SOUPS), Anaheim, CA: USENIX Association, 2022, 113–29.
