Motivating cybersecurity behaviors: a beyond reasoned action conceptualization

Verlin B. Hinsz (North Dakota State University, Fargo, North Dakota, USA)

Organizational Cybersecurity Journal: Practice, Process and People

ISSN: 2635-0270

Article publication date: 22 November 2024

Abstract

Purpose

A critical issue in organizations concerned with cybersecurity is how to motivate personnel to engage in safety and security behaviors to counter potential threats. For these organizations to be effective, they must rely upon their members who are motivated to engage in behaviors to assure various forms of cybersecurity.

Design/methodology/approach

A conceptualization is described outlining the factors and processes involved in motivating cybersecurity behaviors. The theoretical starting point is the reasoned action approach (Fishbein and Ajzen, 2010), which provides a strong and parsimonious basis for considering the processes and factors that predict safety and security behaviors (intentions, perceived behavioral control, subjective norms, attitude toward the behavior and beliefs).

Findings

The conceptualization presented goes beyond the reasoned action approach to consider factors involved in cybersecurity behaviors that might not be reasoned (work routines and habits and motivating emotions). This more integrated conceptualization describes how personal factors such as anticipated affect, attitude toward the process and personal norms can be seen as contributing to motivated behavior.

Originality/value

The beyond reasoned action conceptualization is of value to organizations for which motivated safety and security behaviors contribute to their effectiveness, with the conceptualization providing practical recommendations for enhancing cyber safety and cybersecurity. A research agenda based on this beyond reasoned action conceptualization articulates numerous avenues for further investigation.

Citation

Hinsz, V.B. (2024), "Motivating cybersecurity behaviors: a beyond reasoned action conceptualization", Organizational Cybersecurity Journal: Practice, Process and People, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/OCJ-08-2023-0015

Publisher

Emerald Publishing Limited

Copyright © 2024, Verlin B. Hinsz

License

Published in Organizational Cybersecurity Journal: Practice, Process and People. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) license. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this license may be seen at http://creativecommons.org/licences/by/4.0/legalcode


In the State of the State Address (2022), the Governor of North Dakota (population ∼ 775,000) mentioned that in one year there were 4.5 billion attacks against the state government’s network, which includes higher education, K-12 education, and all other state functions. Moreover, the state information technology team had to respond to over 50,000 cyberattack incidents that were not thwarted by automated defense systems. This announcement highlights cybersecurity issues in modern organizations and, in this case, the risks one sparsely populated state faces from cyberattacks. The organizational employee can be seen as the first line of defense for cybersecurity (cf. Hinsz and Nickell, 2009; Sasse and Flechais, 2005). Surprisingly, it is also these employees who may pose the greatest vulnerability for the organization in terms of cybersecurity (e.g. insider threat; Predd et al., 2008). Like the Governor of North Dakota, other leaders, managers, directors, and supervisors in these organizations implore members to take training, utilize technology, and follow procedures to avoid cyberattacks and reduce the associated costs. This article describes a conceptualization of organizational members’ motivation for cybersecurity behavior.

This paper relies upon the central organizational construct of motivation (cf. Kanfer, 1990) to articulate a conceptualization of motivated behavior to explain cybersecurity related actions and responses. However, the paper provides only a limited perspective of motivation by focusing on the processes that contribute to behavioral intentions. As a conceptualization of motivated cybersecurity behavior, this article does not review some specific features of cybersecurity behavior (see Zaccaro et al., 2019a for several). Moreover, a comprehensive review of theoretical formulations of cybersecurity behavior is beyond the scope of this article (see Burns et al., 2018, for just one example examining expectancy theory).

Initially, this paper highlights unique features of cybersecurity behavior. Then I provide a preliminary conceptualization of motivated cybersecurity behavior based on the theory of planned behavior (Ajzen, 1991) and the theory of reasoned action (Fishbein and Ajzen, 2010). I will then go beyond this reasoned action approach to describe a number of additional components for a broader conceptualization of motivated cybersecurity behavior. I conclude with a set of research questions that serve as a research agenda for further consideration of motivated cybersecurity behavior.

Cybersecurity behavior

The events and behaviors associated with cyberspace go by various names. I will use cybersecurity behavior as a generic, umbrella term for a class of activities, actions and responses associated with protection, safety, and security in the cyber realm. Conceptually, this article follows Dalal et al. (2022) for terminology that outlines the domain of cybersecurity in organizations. Importantly, this article focuses on individuals’ motivation to engage in behaviors that impact cybersecurity. These behaviors involve actions and inactions related to protection, prevention, deterrence, promotion, threats, vulnerability, mitigation, risk, cyber technology, etc. (see Posey et al., 2013, for a taxonomic outline of cybersecurity behaviors).

Cybersecurity is an exemplar of a general class of safety and security behaviors (Carr et al., 2021; Dalal et al., 2022). There is an emerging emphasis on developing a more general perspective of the dynamics and contexts that surround safety and security behaviors affecting human lives and activities (Hodgetts et al., 2023). Similarly, safety and security behaviors by employees have emerged as important concerns for organizations that provide goods and services (Bitzer et al., 2009; Cooke and Winner, 2007; Hinsz and Nickell, 2015; Hinsz et al., 2007; Proctor and Chen, 2015). Many organizations rely upon their members to engage in behaviors to assure various forms of safety and security, such as retailers (e.g. shoplifting), transportation (e.g. TSA agents), military installations (e.g. sentry duty), schools (e.g. active shooters), food service (e.g. food contamination), commodity production (e.g. oil rig accidents), public events (e.g. marathons), corporate innovation (e.g. industrial espionage), health care (e.g. patient safety), financial services (e.g. identity theft), and national security (e.g. document security). Consequently, for these organizations to be effective, organization members need to be motivated to perform the behaviors that result in safety and security. This paper proposes a conceptualization to explain the factors and processes of motivated behaviors that lead to safety and security in organizational settings. In particular, the conceptualization directs attention to how organizational members can be motivated to engage in cybersecurity behaviors.

Cybersecurity behavior warrants special consideration with respect to the criterion problem (Austin and Villanova, 1992; Dalal et al., 2022). Safety and security are qualitatively different from other classes of behavior associated with performance (e.g. quantity, quality). The core content of safety and security behaviors can be seen as protection from threats. Organizations can perceive threats in a variety of ways and their members are often deployed to deter these threats or to protect the organization from exploitation of its vulnerabilities. Building on the view that safety and security involve protection from threats, safety and security behaviors can be seen as novel in a number of ways for considerations of assessing performance. (1) The potential negative outcomes associated with engaging in safety and security behaviors generally occur rarely; however, they have severe consequences when they do occur. (2) The negative outcomes can arise for a number of reasons (e.g. technological); however, the main behavioral point of focus is the person who acts or fails to act in the situation. (3) Depending on the circumstance, if a person acts in an unsafe or insecure fashion, the threat may or may not materialize. (4) Alternatively, if a person fails to act appropriately in a situation, that too may or may not lead to exposure to threats. (5) Importantly, even if a person/agent acts inappropriately in a situation, this does not mean that a negative outcome will occur. Consequently, the actions and inactions of an organization’s members impact the threats to an organization, and there is limited correspondence between negative outcomes and safety and security behaviors. Thus, a critical concern for safety and security in many organizational contexts such as cybersecurity is to motivate personnel to engage in the appropriate cybersecurity behaviors and avoid inappropriate behaviors.
Hence, organizations and their leaders have to motivate all these appropriate behaviors and extinguish inappropriate behaviors, even when what is desired is inaction (e.g. not opening a risky link to a website).

Because safety and security behaviors are important for many organizations to achieve their goals (Zaccaro et al., 2019a), it may be very useful to focus on understanding the predictors of safety and security behaviors to motivate those behaviors (Hinsz and Nickell, 2015). This paper presents a conceptualization for motivated safety and security behaviors with specific relation to cybersecurity (e.g. threat deterrence, breach mitigation, protection from vulnerabilities). Hopefully, this conceptualization provides an enhanced understanding of the factors and processes involved in motivating safety and security behaviors such as cybersecurity.

A conceptual beginning

For years, my colleagues and I have been working in the realm of motivated safety and security behaviors, examining elements of safety and security from the domain of food safety (e.g. Hinsz et al., 2007; Nickell et al., 2005; Park et al., 2015). From this research, we conclude that an effective way of fostering safety and security is to motivate the human who is responsible for safety and security behaviors. As a starting point for considering the motivation of these behaviors, the theory of planned behavior (Ajzen, 1991) and its precursor, the theory of reasoned action (Ajzen and Fishbein, 1980; Fishbein and Ajzen, 1975) are used. Others have also noted the utility of these theories for application to cybersecurity-related behavior (e.g. Burns et al., 2018; Dalal et al., 2022; Pfleeger and Caputo, 2012). Moreover, researchers have discussed how desired cybersecurity outcomes result from behaviors of individuals who are motivated to engage in those cybersecurity-related behaviors (Burns et al., 2018; Posey et al., 2013).

Although not generally considered as such, the theory of planned behavior is a theory of motivated behavior (Ajzen, 1991) because it outlines the processes and factors that contribute to the prediction of goal-directed or purposeful behavior (see Figure 1). Because the broader conceptualization being proposed builds upon this theory, I will first describe the relevant components of the theory of planned behavior (from right to left in Figure 1).

Safety and security behaviors

Conceptually, the negative outcomes associated with safety and security risks can be influenced by the behaviors of organizational members. As part of their duties, tasks, and responsibilities, these members engage in safety and security behaviors or make sure that they don’t engage in behaviors that expose the organization to safety and security risks. Although other factors also play a role in the occurrence of the negative outcomes associated with safety and security (e.g. technological failures, improper protocols), it is the safety and security behaviors that are under the control of, and influenced by, personnel. There is a constellation of behaviors appropriate in a context that can promote safety and security. These behaviors are more or less effective depending on the conditions that exist in the situation, but motivation is key to getting desired behaviors to occur. Most organizations select for and train the desired behaviors, and attempt to restrict inappropriate behaviors. Yet, the conditions resulting in vulnerabilities to threats or breakdowns in protocols can still exist or emerge. Thus, motivation is essential for maintaining safety and security. In many situations involving motivation, a complex arrangement of factors and processes can impact desired and undesired safety and security behaviors.

Behavioral intentions

The intentions that people have to behave in certain ways are key to the motivation of those behaviors. Intentions are people’s judgments about the likelihood that they will or will not engage in the behavior. These intentions are considered the immediate precursors of the behaviors. People behave in specific ways because they intend, desire, want, or are willing to. That is, people determine and intend to engage in behaviors that they choose. Consequently, safety and security outcomes are influenced by the behaviors of individuals. For example, individuals in organizations can have intentions to comply with firm policies regarding cybersecurity and intentions to protect cyber technology from threats in those firms (Burns et al., 2018). There is a rich literature showing that intentions are key predictors of behavior in general (e.g. Fishbein and Ajzen, 1975, 2010; Triandis, 1977) and of safety behaviors in particular (e.g. Hinsz and Nickell, 2015). According to the theory of planned behavior, intentions are predicted by attitudes, subjective norms, and perceived behavioral control (see Figure 1).

Perceived behavioral control

Perceived behavioral control reflects the degree to which people believe they have control over, and the capability to engage in, behaviors such as those associated with cybersecurity. Because of differences in the way people might be able to perform a behavior, perceived behavioral control predicts some behaviors better than others. Because safety and security behaviors can be directly as well as indirectly related to important outcomes, we expect that people’s perception that they can control safety and security behaviors will be predictive of those behaviors. There are consistent findings that perceived behavioral control predicts behavior and seemingly has an impact on behavior independent of that of intentions (Hinsz and Nickell, 2015; Hinsz et al., 2007; Nickell and Hinsz, 2023; Nickell et al., 2005). Importantly, perceived behavioral control can theoretically influence intentions, so this link is represented in Figure 1 with a dashed line. Thus, perceived behavioral control is important for organizational personnel to feel effective about engaging in cyber safety and security behaviors.

Attitude toward the behavior

Attitude toward the behavior reflects personal opinions (e.g. positive-negative; favorable-unfavorable) about engaging in the behavior. As such, it is a summary of the general reactions the person has toward engaging in the behavior. Numerous considerations of cybersecurity have relied upon the attitude concept to help explain and predict cybersecurity-related behaviors (e.g. Dalal et al., 2022; Pfleeger and Caputo, 2012; Zaccaro et al., 2019b). A long history of research supports the role of attitudes in the prediction of intentions (e.g. Fishbein and Ajzen, 2010; Triandis, 1977). This research demonstrates that when attitudes and behaviors are assessed at corresponding levels, with sufficient specificity, and with high quality measures, attitudes are quite predictive of the corresponding behaviors. Within the theory of planned behavior, attitudes predict intentions, which then predict behavior. Yet, this prediction is enhanced if factors such as subjective norms are used to complement attitudes.

Subjective norms

Subjective norms reflect the person’s belief that important others think the person should or should not engage in the behavior. That is, subjective norms capture the degree to which the person believes that others who are important to the person approve or disapprove of the person engaging in the behavior. Not only do people do what they want to do (i.e. behave according to their attitudes) but they also do what they believe others want them to do (i.e. behave according to their subjective norms). Cybersecurity behaviors are often affected by these normative influences (cf. Dreibelbis et al., 2018; Pfleeger and Caputo, 2012). Although intentions are differentially predicted by attitudes and subjective norms, research indicates that properly assessed subjective norms make significant contributions to the prediction of intentions (e.g. Hinsz and Nickell, 2015; Hinsz et al., 2007; Nickell and Hinsz, 2023; Nickell et al., 2005). Thus, it is important to recognize the impact that subjective norms will have on safety and security behaviors.

Prediction of intentions

According to the theory of planned behavior, behavioral intentions are expected to be predicted by attitudes toward the behavior, subjective norms, and perhaps perceived behavioral control (Figure 1). Our previous research consistently showed that attitude toward the behavior and subjective norm each made a strong and unique contribution to the prediction of food safety intentions (Hinsz and Nickell, 2015; Hinsz et al., 2007; Nickell and Hinsz, 2023; Nickell et al., 2005). Consequently, our research provides evidence that the theory of planned behavior offers at least an adequate basis for understanding safety and security behaviors.
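The additive structure of this prediction can be sketched in a few lines of code. The weights and scale values below are hypothetical illustrations only, not estimates from the studies cited; in practice the weights would be estimated empirically (e.g. via regression) and would differ across behaviors and populations.

```python
# Illustrative sketch of the theory of planned behavior's prediction of
# intentions as a weighted combination of the three predictors.
# All weights and scores are hypothetical, not values from the cited research.

def predict_intention(attitude, subjective_norm, perceived_control,
                      w_att=0.45, w_norm=0.30, w_pbc=0.15):
    """Intention as a weighted sum of attitude toward the behavior,
    subjective norm, and perceived behavioral control.

    Inputs are assumed to be on a common scale, e.g. 1 (low) to 7 (high).
    """
    return (w_att * attitude
            + w_norm * subjective_norm
            + w_pbc * perceived_control)

# A hypothetical employee with a favorable attitude toward regularly
# updating passwords (6), moderate subjective norms (5), and high
# perceived behavioral control (7):
intention = predict_intention(attitude=6, subjective_norm=5, perceived_control=7)
print(round(intention, 2))  # 5.25
```

The sketch makes the theory's compensatory logic concrete: a weak subjective norm can be offset by a strongly favorable attitude, and vice versa, with the relative weights varying by behavior.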

Beliefs

It is possible to go beyond the previously listed factors of the theory of planned behavior to consider how the influences of those factors arise. In particular, beliefs are the foundation for these predictors of intentions. Beliefs are implicated in numerous formulations of the development and precursors of cybersecurity behaviors (Burns et al., 2018; Dalal et al., 2022; Zaccaro et al., 2019b). Beliefs also underlie other general concepts associated with cybersecurity (e.g. trust; Pfleeger and Caputo, 2012; safety climate; Dalal et al., 2022; Nickell and Hinsz, 2011). These beliefs reflect the associations individuals have about how particular outcomes, perceptions of control, and normative referents impact behaviors. These beliefs can be stated as the individuals’ subjective probability about the degree of association between the attribute and the potential outcomes (e.g. if I change my password, I will reduce risks of identity theft).

Behavioral beliefs reflect the person’s associations regarding cognitive aspects of engaging in the behavior (e.g. outcomes that are beneficial to the person). These beliefs are the positive and negative outcomes that come to mind for individuals when they think about engaging in the behavior. These behavioral beliefs reflect the core notions that are the foundation for a person’s attitude toward engaging in the behavior. For many operators in organizations, they perceive no benefits or losses for using the cyber technologies that would reduce threats, so they have indifferent attitudes and intentions to use the technology (cf. Dreibelbis et al., 2018).

Normative beliefs underlie the social forces reflecting individuals’ beliefs about how others think about their engaging in the behaviors. For organizational members, the subjective norm highlights their beliefs that people important to them (e.g. supervisor; co-workers) expect them to engage in the behavior as well as their motivation to comply with the wishes of these people. As Pfleeger and Caputo (2012) highlight, when individuals notice that co-workers do not react to a potential cybersecurity threat, then via a diffusion of responsibility, the individuals perceive that they have no responsibility to respond.

Control beliefs reflect two components of these relationships between some outcome and individuals’ perceived control over that outcome. First, control beliefs indicate perceptions the person has of control over the situation so that they can behave in a way such that desired outcomes occur. Second, control beliefs reflect the association between their capability to engage in the appropriate behavior so that the desired outcomes occur (cf. self-efficacy; Bandura, 1997). These control beliefs are critical for understanding the ways that individuals perceive they can and should interact with specific cyber technologies and the effectiveness of doing so to achieve cybersecurity. Research shows that if people believe that their behavior can influence outcomes, they are more likely to engage in those behaviors. Similarly, if they believe that their actions have no influence on the outcomes, then they are less likely to pursue those actions.

Given the complicated nature of cybersecurity behaviors, the beliefs that organizational members have are likely to contribute greatly to understanding why individuals would or would not engage in cybersecurity behaviors. Nickell and Hinsz (2023) demonstrated that behavioral beliefs did predict attitudes toward engaging in safety behaviors, that control beliefs predicted the workers’ perceived behavioral control, and that normative beliefs predict subjective norms for safety behaviors (see also Nickell et al., 2005).
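The belief-based foundations described above follow an expectancy-value logic in the reasoned action tradition: each component is built from the sum of belief strengths multiplied by the corresponding evaluations. A minimal sketch, with entirely hypothetical belief items and numbers, illustrates the computation for behavioral beliefs underlying an attitude:

```python
# Sketch of the expectancy-value structure linking beliefs to the components
# of the theory of planned behavior. The belief items and numbers below are
# hypothetical illustrations, not elicited data.

def expectancy_value(strengths, evaluations):
    """Sum over beliefs of (belief strength x outcome evaluation)."""
    return sum(b * e for b, e in zip(strengths, evaluations))

# Behavioral beliefs about regularly changing one's password:
# strength    = subjective probability the outcome follows the behavior (0-1)
# evaluation  = how good or bad that outcome is (-3 to +3)
belief_strengths = [0.8, 0.6, 0.9]  # e.g. "reduces identity theft risk",
                                    # "protects the organization",
                                    # "takes extra time"
outcome_evals = [3, 2, -1]          # the last outcome is evaluated negatively

attitude_estimate = expectancy_value(belief_strengths, outcome_evals)
print(round(attitude_estimate, 2))  # 2.7
```

Normative and control beliefs would be combined the same way (strength times motivation to comply, or strength times perceived power of the control factor), which is why interventions that change individual beliefs can shift attitudes, subjective norms, and perceived behavioral control.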

An important premise of the theory of planned behavior and the theory of reasoned action is that background or external factors not included in the theory have their impact on behavior primarily through their impact on the beliefs that people have about engaging in the behavior (Ajzen, 1991; Ajzen and Fishbein, 1980). This premise has important implications for the consideration of many variables often thought to be important for understanding behaviors such as cybersecurity. That is, background factors such as age, gender, social status, or personality traits are predicted to have an impact on behavior through the person’s beliefs and the components (attitudes, subjective norm and perceived behavioral control) that result from beliefs. Our previous research has found support for this premise. Nickell and Hinsz (2023) found a gender effect that emerged for attitude toward the behavior. This gender effect was mediated by the behavioral beliefs, as would be predicted. Additionally, Hinsz and Nickell (2015) found that although job satisfaction and organizational commitment did predict intentions and behaviors, these relationships disappeared once attitudes, subjective norms and perceived behavioral control were included in the predictions (see Ajzen, 2011, for similar findings). Thus, evidence about beliefs mediating the influence of background factors on attitudes, subjective norms and perceived behavioral control is consistent with the theory of planned behavior.

Toward a more comprehensive conceptualization

From the foundation of the theory of planned behavior and the theory of reasoned action (now integrated into the reasoned action approach, Fishbein and Ajzen, 2010), a more comprehensive conceptualization of motivated safety and security behavior can be developed (see Figure 2). This conceptualization is based on decades of prior research and theorizing, in particular that of the reasoned action approach, however, it is also informed by the theorizing of Triandis (1977). Additionally, other recent perspectives on motivated behavior and the work of other researchers and theorists contribute to, and are integrated in, the conceptualization.

A premise of the reasoned action approach (Fishbein and Ajzen, 2010) is that the key components of attitude toward the behavior, perceived social norms, and perceived behavioral control are sufficient as theoretical constructs to account for most types of behaviors that are reasoned and mostly under volitional control (i.e. sufficiency principle). This is a parsimoniously desirable and empirically viable premise. However, there have been a number of challenges to the reasoned action approach that deserve further scrutiny (Fishbein and Ajzen, 2010), particularly for their relevance to safety and security behaviors. Given research on work routines discussed directly below, Triandis’ (1977) model of interpersonal behavior may provide a more comprehensive explanation of behaviors and insights about the nature of factors that contribute to behaviors such as those related to cybersecurity. The conceptualization described here resonates with many aspects of Triandis’ model, although it deviates from it as well.

The aim is to establish a conceptualization for understanding and predicting safety and security behaviors that (1) augments the reasoned action approach, (2) goes beyond the reasoned action approach in terms of the sufficiency principle, and (3) reaffirms the correspondence principle (Ajzen and Fishbein, 1980; Fishbein and Ajzen, 1975) that dictates the kinds of factors that have demonstrable relationships with behaviors such as those that involve safety and security (e.g. emotions and traits that correspond with the behavior being examined). This approach is labeled the Beyond Reasoned Action Conceptualization, which can be a strong foundation for understanding motivated cybersecurity behaviors. The intent is not to undermine the reasoned action approach, but to go beyond it to consider other kinds of factors that can foster a deeper understanding of specific kinds of behaviors (e.g. cybersecurity).

There are a number of issues that help to describe the ways in which the conceptualization goes beyond the reasoned action approach. First, the beyond reasoned action conceptualization entertains the notion that certain factors that influence behavior are not “reasoned” and might operate outside of what can be considered reasoned processes. For cybersecurity behavior, some factors can contribute to the prediction of behavior (e.g. work routines and habits; motivating emotions) but not be considered reasoned per se. Second, some factors reflect unique features of the nature of work for organizations that rely upon members’ security behaviors (e.g. personal norms, attitude toward the process) that might not be captured by the components of the reasoned action approach. Third, some factors in the beyond reasoned action conceptualization may be fundamental aspects of motivated behavior that might not be reasoned or may be uniquely tied to safety and security behaviors (e.g. a motivating emotion such as fear). Fourth, the conceptualization includes factors that have emerged in the literature that deserve further scrutiny (e.g. anticipated affect; attitude toward the process). Thus, in many ways, the conceptualization offered for motivating cybersecurity behaviors differs from the reasoned action approach, but is generalizable to the larger class of organizational behavior concerned with safety and security.

Work routines and habits

Behavioral psychology has established the relationship between habits and behavior (e.g. Hull, 1943; Triandis, 1977), which has again received attention with habit’s role in everyday life (e.g. Wood, 2019). Safety and security in organizations are often maintained because related behaviors are habitual and take on the nature of routines in that they are repeated in response to situational cues (Hinsz et al., 2007; Ohly et al., 2017; see Vance et al., 2012, for cybersecurity). Observations of safety and security behaviors in a variety of organizational settings show that if workers reliably perform appropriate behaviors that are repetitive, then safety and security are better protected (e.g. inserting the CAC card prior to initiating work on a computer).

According to Triandis’ approach, behavior can be predicted by people’s intentions in conjunction with the frequency they behaved similarly in the past, which is a traditional measure of habits. Although there are merits for considering how work habits can influence workers’ behaviors, the literature shows conceptual, theoretical, and methodological shortcomings for the habit construct and its assessment in terms of the frequency of past behavior (e.g. Ajzen, 2002; Ouellette and Wood, 1998; Verplanken, 2006; Verplanken and Aarts, 1999; Verplanken and Orbell, 2003). Consequently, some have considered the habit construct from the perspective of behavior that falls along an automatic–controlled continuum (Ronis et al., 1989) and in terms of conditions of automaticity (i.e. efficiency, lack of awareness, uncontrollability, and unintentionality; Bargh, 1994). With repeated performance of safety and security behaviors, the behavior might become more automatic, demonstrating habit strength (Neal et al., 2006), or alternatively, be part of a controlled routine (Ohly et al., 2017; Weiss and Ilgen, 1985).

A work routines perspective considers some repeated behaviors to occur with conscious control and reason (Bamberg et al., 2003; Hinsz et al., 2007; Ohly et al., 2017). Hinsz et al. (2007) proposed that habit strength might better predict certain behaviors (e.g. deleting messages with lures to an embedded malicious URL, i.e. spear phishing) whereas work routines might better predict other behaviors (e.g. following dual authentication protocols for accessing specific resources). Hinsz et al. (2007) found that a work routine measure added significantly to the prediction of behavior by incorporating the notion that safety and security behavior can be repetitive in nature, yet be performed with conscious awareness, which is likely for many cybersecurity behaviors (Jajodia et al., 2010). Consequently, notions of work routines and habits are included in the beyond reasoned action conceptualization for motivated safety and security behaviors and have received some attention for cybersecurity (Dalal et al., 2022).

Motivating emotions

For decades, emotions were rarely considered in models of motivated behavior in organizations (cf. Lord et al., 2002). However, research suggests that if properly assessed, emotions can impact the motivation for safety and security (e.g. fear in reaction to perceived threats). In general, emotions can be fast acting, non-conscious patterns of responding that are provoked by situational variables or cues (Lord and Kanfer, 2002). Not all behaviors are likely to be substantially influenced by emotions. Consequently, emotions do not fit with the reasoned action approach, which generally focuses on factors under volitional control (e.g. intentions) that predict behavior. Nevertheless, when considering a more comprehensive approach to the prediction of safety and security behaviors, there is merit in exploring the role of specific emotions that may influence behaviors that correspond with the specific emotion.

Cybersecurity breaches are useful examples for the consideration of the impact of motivating emotions on safety and security behaviors because a specific emotion is related to threats: fear. Fear is a motivating emotion that propels behavior to protect against a threat. When presented with a cue (e.g. screen indicating a computer is infected) that is perceived as threatening, people have a fast, generally non-conscious emotional reaction that we can call fear. Moreover, fear would be a motivating emotion that should lead a person to behave in ways to reduce or eliminate the perceived threat such as for cybersecurity behavior (Pfleeger and Caputo, 2012). As a consequence, individuals who are more likely to experience fear should be more motivated to behave in ways that protect against threats. Consequently, for cybersecurity, those who have a sensitivity toward threats should be more likely to behave in a protective fashion with cyber technologies and would be more attentive to cybersecurity (i.e. direct effect of motivating emotion on behavior in Figure 2). Moreover, if the operator already has an intention toward cybersecurity behaviors, then a greater threat sensitivity should facilitate the existing intention (i.e. moderating effect of motivating emotion in Figure 2). These two effects of fear show how a motivating emotion adds unique variance to the prediction of specific behaviors such as cybersecurity.
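The two hypothesized effects of a motivating emotion, a direct effect on behavior and a moderating effect on the intention-behavior link (Figure 2), correspond to a main effect and an interaction term in a statistical model. A minimal sketch with purely illustrative coefficients (not estimates from any cited study):

```python
# Sketch of the two hypothesized effects of a motivating emotion such as fear
# (per Figure 2): a direct (main) effect of threat sensitivity on behavior,
# plus a moderating (interaction) effect on the intention-behavior link.
# All coefficients are illustrative assumptions.

def behavior_likelihood(intention, threat_sensitivity,
                        b_int=0.50, b_fear=0.20, b_interact=0.10):
    """Predicted likelihood of protective behavior.

    b_fear captures the direct effect of the motivating emotion;
    b_interact captures its facilitation of an existing intention.
    """
    return (b_int * intention
            + b_fear * threat_sensitivity
            + b_interact * intention * threat_sensitivity)

# Same intention (5 on a 1-7 scale), different threat sensitivity:
low_fear = behavior_likelihood(intention=5, threat_sensitivity=1)
high_fear = behavior_likelihood(intention=5, threat_sensitivity=4)
print(low_fear < high_fear)  # True: fear adds to the prediction
```

With the interaction term, the intention-behavior slope is steeper for more threat-sensitive individuals, which is exactly the moderating effect described above; dropping `b_interact` would leave only the direct effect.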

Anticipated affect

Emotions reflect how affective influences can contribute to the prediction of safety and security behaviors. Historically, affect has been a problem for researchers attempting to predict behavior. Fishbein and Ajzen (1975) suggested that affect was incorporated into the attitude toward the behavior. Triandis (1977) proposed that affect made a unique contribution to the prediction of intentions. Although this is a simplification of the differences of opinion, the reasoned action approach tended to focus on the cognitive aspects of attitudes while Triandis’ approach focused on the more affective aspects of attitudes. Research has found support for both perspectives, so affect could play a role in motivating cybersecurity behaviors. As part of a belief elicitation procedure for a sample of food processing workers who provided poultry products for school lunch programs (Nickell and Hinsz, 2009), one of the workers remarked that “I would feel really bad if I did something and one of the children got sick or died.” Perhaps the reason why many measures of affect did not add to the prediction of intention was that they assessed the affect associated with engaging in the behavior, whereas the factor motivating behavior was anticipated affect.

Anticipated affect reflects the positive or negative feelings people have when thinking about the results of engaging or not engaging in the behaviors. This anticipated affect can clearly influence intentions toward safety and security behaviors. For example, cybersecurity personnel may anticipate regret if they act in a way that puts security at risk, or anticipate joy at having done their part in thwarting a threat. Research on anticipated affect has examined anticipated regret as playing a role in the choices people make (Mellers and McGraw, 2001). Similarly, motivated safety and security behavior involves people’s choices about the actions they would take, so anticipated affect could play a role in motivated safety and security behavior. This anticipated affect could add independently and significantly to the prediction of intentions to engage in cybersecurity behaviors.

Anticipated affect has gained more attention in efforts to understand behavior (Ajzen and Sheikh, 2013; Richard et al., 1996; Sandberg and Conner, 2008). Sandberg and Conner (2008) conducted a meta-analysis that examined the unique contribution of anticipated affect to intentions and behavior and found that anticipated affect added a unique 7% contribution to the prediction of intentions. Similarly, we found that both positive anticipated affect (e.g. happy; r = 0.42, p < 0.001) and negative anticipated affect (e.g. sad; r = 0.34, p < 0.001) related to safe food practices (Betts and Hinsz, 2012). Given the potential contribution anticipated affect might make to the prediction of cybersecurity intentions (e.g. following appropriate protocols, allowing virus infections), it is included in this beyond reasoned action conceptualization.
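The incremental-validity question raised by these findings (does anticipated affect explain variance in intentions beyond the reasoned action components?) is conventionally tested with hierarchical regression, comparing R² before and after the new predictor is entered. A minimal sketch using simulated data and hypothetical variable names, not the measures from the cited studies:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Hypothetical standardized reasoned-action predictors
attitude = rng.normal(size=n)
norm = rng.normal(size=n)
pbc = rng.normal(size=n)
# Anticipated affect, correlated with attitude but carrying unique variance
affect = 0.3 * attitude + rng.normal(size=n)

# Simulated intention in which anticipated affect has a unique weight
intention = 0.5 * attitude + 0.3 * norm + 0.2 * pbc + 0.4 * affect + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an ordinary least squares fit with an intercept column."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Step 1: reasoned-action components only; Step 2: add anticipated affect
r2_base = r_squared(np.column_stack([attitude, norm, pbc]), intention)
r2_full = r_squared(np.column_stack([attitude, norm, pbc, affect]), intention)
delta_r2 = r2_full - r2_base  # unique contribution of anticipated affect
print(f"R2 base={r2_base:.3f}, full={r2_full:.3f}, delta={delta_r2:.3f}")
```

A nonzero ΔR² in step 2 (tested for significance with an F change test in practice) is the pattern that would support including anticipated affect in the conceptualization.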

Attitude toward the process

One notion that has resurfaced with recent work on motivated behavior is the value that a person gains from doing things in a way that fits with how they like to do things (Higgins, 2006). That is, if people engage in behaviors in ways that feel right for them (e.g. engaging in protection behaviors; Posey et al., 2013), then there is an added value derived from engaging in the behavior. This added value would enhance the attitude a person has for engaging in the behavior. Consequently, in addition to the attitude toward the behavior or outcome that is associated with behavioral intention models, the attitude toward the process implies that people’s attitudes might include their attitude toward engaging in the processes by which the behavior or outcomes arise. Most typical approaches to assessing attitudes do not include this added value notion because it is not seen as an outcome associated with engaging in the behavior (e.g. making cyber activities secure). Nevertheless, the notion of an attitude toward the process was proposed in a theory of goal pursuit (Bagozzi and Warshaw, 1990). Bagozzi and Warshaw tested whether this attitude toward the process added to the predictions of the theory of planned behavior (see also Hinsz and Ployhart, 1998). Although not a strong predictor, the attitude toward the process did make a unique contribution to an overall attitude measure.

With the potential value added from the behavior fitting with the individual, it seems reasonable that an attitude toward the process component would add to the prediction of motivated safety and security behaviors. Park et al. (2015) predicted that people who were more prevention-focused (cf. Higgins, 1997) were more likely to experience this value from fit when pursuing safety behaviors. Workers who experienced the added value from fit enjoyed safety tasks more, were more satisfied and involved with their jobs, and perceived that they were more effective. Similarly, when individuals are asked to perform tasks that protect against cyber threats, the design of those tasks (Parker et al., 2019) and the technology utilized in those tasks (Coovert et al., 2019; Pfleeger and Caputo, 2012) can frustrate or facilitate the individuals’ desire to do the tasks in a certain way. Consequently, based on this conceptual development and the empirical evidence, the attitude toward the process component is included in the conceptualization for the prediction of intentions toward safety and security behaviors.

Perceived social norms

When Fishbein and Ajzen (2010) updated their theoretical formulations into the reasoned action approach, one change they made was to expand the subjective norms construct to include descriptive norms in addition to the typical focus on injunctive norms (i.e. subjective norms). Considerable evidence had accumulated over prior decades related to a social norms approach (e.g. Cialdini et al., 1999) showing that both injunctive norms (what I believe I am expected to do) and descriptive norms (what I believe others generally do) influenced behavior change in a variety of settings. Fishbein and Ajzen (2010) combined these two types of norms into the perceived social norms component of the reasoned action approach.

Other types of norms might also play a role in perceived social norms (e.g. Godin et al., 2005). In particular, individuals in certain organizations might perceive a norm to engage in the behavior because they perceive they have an obligation or duty to do so. This norm is not injunctive in that it is not in reference to others. Moreover, it is not descriptive because these individuals feel compelled to perform the behavior regardless of what others may do. In this case, we might consider this a personal norm. For safety and security personnel (e.g. military, government agents, workers in high security fields) and likely those involved in cybersecurity, the motivation to engage in safety and security behaviors may be influenced by this personal norm of responsibility, obligation, and duty to perform the safety and security behaviors. Because of training, self-selection, or indoctrination, people in such organizations carry norms with them into situations and generally act according to the personal norms. It is easy to speculate that individuals pursuing cybersecurity would also have these personal norms of duty. Consequently, the beyond reasoned action conceptualization includes these personal aspects of norms for the prediction of motivated cybersecurity behaviors.

Background factors

The reasoned action approach also has a way of accounting for background or external factors that are often considered to influence behavior and related intentions (e.g. age, gender, social status, personality traits; Ajzen and Fishbein, 1980; Fishbein and Ajzen, 2010). These background factors might contribute to predicting motivated cybersecurity behaviors. As a function of the sufficiency principle, the reasoned action approach assumes that background factors do not need to be identified with separate components. Rather, the existing components of the reasoned action approach, if measured appropriately, are sufficient to account for the variance in the predicted behavior. That is, these background factors have their impact on behavior through the beliefs that individuals hold (e.g. Fishbein and Ajzen, 2010). Although the reasoned action approach to background factors limits the consideration for these variables, it does provide for parsimony in the components needed to predict behavior.

The accumulated literature on the reasoned action approach provides much support for the view that the specified components are sufficient to explain and predict behavior (see Fishbein and Ajzen, 2010, for an extensive summary). However, researchers report variability in the findings that arises as a function of the specific behaviors examined. Similarly, are the components of the reasoned action approach sufficient to explain the role of background factors for safety and security behaviors? Given the points raised earlier about the relative unique nature of safety and security, perhaps some specific background factors (e.g. trait conscientiousness, organizational climate of safety) have such diffuse and powerful influences that they contribute to the prediction of safety and security behaviors beyond that provided by the reasoned action components.

There is value in examining whether background factors add to the prediction of safety and security behaviors. If they do not, then we have learned that the reasoned action components are sufficient to account for the impact of the background factors, and attention can be focused on the existing components. However, if background factors do add to the prediction of safety and security behaviors, then we learn that these variables can be used to diagnose critical issues for motivating safety and security, as well as targeted for training (e.g. Brummel et al., 2019), selection (e.g. Mueller-Hanson and Garza, 2019), and placement of organizational members to improve effectiveness (cf. Jose et al., 2019). It is possible to consider background variables that may be important for understanding and predicting cybersecurity behaviors. In particular, the impact of background factors is discussed here with one situational factor (organizational climate) and one dispositional variable (trait conscientiousness). These background factors were chosen because they are likely to relate to safety and security behaviors and because each has a large literature relating it to employee performance in organizations.

Organizational climate of safety and security

Research on organizations has come to recognize that through a variety of processes, organizations convey information to their members about the organization (its values, goals, rituals, norms, expectations, etc.). This information disseminated to the members of the organization is often pervasive and persuasive, covering topics such as policies, procedures, and practices (Zohar, 2003). As a consequence, members of the organization often hold a set of beliefs and expectations about the organization and what it means to be a member of the organization. This set of beliefs that members of an organization have about the organization can be considered its climate (Rentsch, 1990).

Research on organizational climate indicates that it can have powerful and broad influences on organizational members, their actions on behalf of the organization, and various aspects of organizational functioning and effectiveness (Ashkanasy et al., 2000; Parker et al., 2003). Researchers have mentioned the development of a climate of cybersecurity and safety (Dalal et al., 2022) and a security culture in organizations (Dreibelbis et al., 2018; Pfleeger and Caputo, 2012) as ways to promote cybersecurity as well as to help an organization be more resilient in response to cyberattacks (Dreibelbis et al., 2018).

An organizational climate has the potential to impact members engaging in cybersecurity behaviors. If the cybersecurity climate is weak, members are less likely to follow security expectations, norms and regulations. Yet, if the climate is strong, then similar workers would be more motivated to pursue cybersecurity, and to do so diligently, even in the face of obstacles. In this way, an organizational climate of safety and security can have a profound influence on members’ motivation toward cybersecurity.

If an organizational climate of safety and security reflects members’ beliefs about safety and security behaviors, then an organizational climate should impact behaviors through the reasoned action components of attitude toward the behavior, perceived social norm, and perceived behavioral control (see Figure 2). Suggesting that organizational climate has its effects through these components does not imply that it is unimportant. Rather, the way organizational climate has its profound effects can be traced to the beliefs organizational members hold, and behavior is influenced by these beliefs through the reasoned action components. If an organization is troubled by insufficient safety and security, then this might appear in a measure of the organizational climate, and change might be brought about by influencing the set of beliefs that are reflected in the organizational climate. Alternatively, because of the diffuse impact of climate on an organization, an organizational climate of safety and security might add to the prediction of safety and security behaviors beyond that accounted for by the reasoned action components. That is, an organizational climate of safety and security might have an additional direct effect on safety and security behaviors (see the dotted line in Figure 2). Regardless of which pattern emerges, investigating these possibilities will yield a better understanding of how organizational climate relates to the motivation of safety and security behaviors.

Trait conscientiousness

Trait conscientiousness reflects a number of underlying stable characteristics of people (e.g. careful, thorough, methodical, persevering, vigilant, organized, reliable) that can contribute to their behavior at work (Roberts and Hogan, 2001). Trait conscientiousness is one of the few traits shown to be a consistent predictor of job performance across many occupations (Barrick and Mount, 1991; Barrick et al., 2001). Yet, little research has focused on the relationship between conscientiousness and safety behavior (cf. Wallace and Chen, 2006). Given that its underlying personality structure relates to being motivated and attentive in the pursuit of tasks, trait conscientiousness may be an important influence on cybersecurity behaviors (Dalal et al., 2022), in particular for the selection of people for cybersecurity jobs as well as for identifying individuals who may be risks for insider threat (Dreibelbis et al., 2018).

Like other background factors identified above, trait conscientiousness might have a direct effect on safety and security behaviors (see Figure 2, dotted line). Alternatively, trait conscientiousness might moderate the relationship between intentions and safety and security behaviors, in that intentions are more likely to relate to behaviors for individuals who are more conscientious. Both the direct and moderating relationships would be inconsistent with the reasoned action approach, which proposes that trait conscientiousness has its impact through people’s beliefs. It is possible that conscientiousness has an additional impact on the prediction of motivated safety and security behaviors over and above that of the reasoned action components because, as a personality trait, conscientiousness contributes to behavior through non-reasoned processes. Consequently, because trait conscientiousness involves underlying behavioral dispositions that are closely related to features of safety and security behaviors, it might have an added impact on the prediction of motivated cybersecurity behavior.
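The moderation hypothesis described above (intentions translate into behavior more strongly for more conscientious individuals) is conventionally tested by entering a product term in a moderated regression. A brief sketch on simulated data, with hypothetical standardized measures rather than any cited study's instruments:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500

# Hypothetical standardized measures of intention and trait conscientiousness
intention = rng.normal(size=n)
consc = rng.normal(size=n)

# Simulated behavior: the intention-behavior link strengthens with
# conscientiousness (the moderating effect hypothesized in the text)
behavior = 0.4 * intention + 0.1 * consc + 0.3 * intention * consc + rng.normal(size=n)

# Moderated regression: intercept, main effects, and the product term
X = np.column_stack([np.ones(n), intention, consc, intention * consc])
beta, *_ = np.linalg.lstsq(X, behavior, rcond=None)
b_interaction = beta[3]  # should recover roughly the simulated 0.3
print(f"interaction coefficient = {b_interaction:.2f}")
```

A reliably nonzero interaction coefficient would indicate moderation, whereas a nonzero main effect with a null interaction would instead suggest the direct-effect pattern (the dotted line in Figure 2).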

A research agenda

One outgrowth of the beyond reasoned action conceptualization is the illumination of a number of different research questions. Although cybersecurity behavior has attracted much attention, research questions about cybersecurity behavior continue to emerge. The conceptualization offered here provides additional avenues for future research. A number of these are framed as research questions below and are offered to spur additional research on understanding how to motivate cybersecurity behavior. The research questions often have embedded hypotheses about the ways that factors and processes described in the conceptualizations impact safety and security behaviors as well as the subsequent outcomes. In general, these questions reflect concerns about how well the reasoned action approach and the beyond reasoned action conceptualization account for the patterns of responses of people who have cybersecurity as a specific responsibility. Other issues concern the ways in which the proposed factors relate to cybersecurity behaviors and behavioral intentions. Readers can refer to Figure 2 to see a depiction of the relationships stated or implied by the research questions.

RQ1.

Does the theory of planned behavior account for variability in cybersecurity behavior and behavioral intentions? Although there is established research addressing this question (e.g. Burns and Roberts, 2013; Sommestad et al., 2019), will future research replicate these earlier findings for instances and organizations involved in different forms of cybersecurity behavior?

RQ2.

Does a measure of habits and work routines enhance the prediction of cybersecurity behavior (cf. Hagger et al., 2023; Hinsz et al., 2007) or can the components of the reasoned action approach account for variance in cybersecurity behavior without including habits and work routines?

RQ3.

Does a motivating emotion such as fear (a) contribute directly to the prediction of cybersecurity behaviors, (b) moderate (facilitate) the relationship between intentions and behavior, or (c) operate as a background factor, such that the relationship between fear and cybersecurity behavior is accounted for by the components of the reasoned action approach?

RQ4.

Does anticipated affect enhance the prediction of intentions over and above that of the reasoned action components, or does a good measure of attitude toward the behavior account for the variance associated with anticipated affect?

RQ5.

Does a measure of attitude toward the process of engaging in cybersecurity behaviors make a unique contribution to the predictions of cybersecurity behavioral intentions in comparison to the predictive capability of the attitude toward the behavior alone? And how does the attitude toward the process measure relate to the other concepts being considered for their relationships with behavioral intentions?

RQ6.

Do descriptive norms augment the impact of injunctive norms in the prediction of intentions (as proposed by the reasoned action approach)? And, in the case of cybersecurity, does a measure of personal norms add to the perceived social norms construct in predicting intentions?

RQ7.

Will the components of the reasoned action approach explain and predict the relationships between an organizational climate of cybersecurity, and cybersecurity behaviors and behavioral intentions? Or, is the organizational climate of cybersecurity sufficiently robust that it makes a unique contribution to the prediction of cybersecurity behaviors and behavioral intentions?

RQ8.

Does conscientiousness, as a personality trait with features relevant for safety and security, make an independent contribution to the prediction of cybersecurity behaviors (e.g. Gratian et al., 2018; Kennison and Chan-Tin, 2020; Shappie et al., 2020)? Does conscientiousness facilitate the relationship between cybersecurity intentions and behaviors? Or do the components of the reasoned action approach account for the relationship between conscientiousness and cybersecurity behaviors as predicted?

RQ9.

A key question to be addressed is whether the beyond reasoned action conceptualization proposed here provides an effective model of the factors and processes hypothesized to have an impact on cybersecurity behaviors.

RQ10.

With regard to measures of outcomes (e.g. inspection reports; supervisor ratings), does the beyond reasoned action conceptualization account for substantial variance in the outcome measures associated with cybersecurity behaviors?

Some conclusions

Although employees may be a prime vulnerability for organizations’ cybersecurity (e.g. insider threats), these same employees can be part of organizations’ cybersecurity firewall (Pfleeger et al., 2014). A premise of this article is that by utilizing reasoned action conceptualizations, it is possible to gain an understanding of how to motivate cybersecurity behaviors of these employees. Moreover, cybersecurity is an example of the class of safety and security behaviors which are critical for many modern organizations. Thus, it is important to understand the key factors that contribute to the motivation of safety and security behaviors. Although cybersecurity technologies are likely to play an important role in cybersecurity, for most organizations it is crucial that their personnel are also motivated to perform cyber safety and cybersecurity behaviors. The proposed reasoned action conceptualizations highlight a number of factors and processes that are predicted to impact cybersecurity. In particular, similar to other approaches to cybersecurity behavior (e.g. Wiederhold, 2014), the reasoned action approach highlights the roles that personal dispositions (attitudes) and social influences (social norms) have for predicting cybersecurity behavior. A major contribution of this paper is the presentation of a conceptualization that goes beyond the reasoned action approach to consider factors (1) that are not “reasoned” (e.g. routines and habits, motivating emotions), (2) that may have unique impact due to their correspondence with safety and security behaviors (e.g. organizational climate of safety; trait conscientiousness), or (3) that reflect emerging notions that deserve further scrutiny for their capability to predict cybersecurity behavioral intentions (e.g. anticipated affect, attitude toward the process, personal norms).

Cybersecurity behaviors are difficult to assess because it is not always clear whether a specific behavior contributes to security, puts cyber technology at risk, or is unrelated to cybersecurity. Often an expert would have to observe the behavior directly to discern how it contributes to cybersecurity safety. Moreover, cybersecurity behaviors often go unnoticed, are generally not quantifiable, and are distinct from quality because they do not improve outcomes but instead protect against negative outcomes, which are rare and only tenuously related to cybersecurity behaviors. For these reasons, like other safety and security behaviors, unique conceptualizations and approaches for cybersecurity behavior are likely required. A motivating cybersecurity behavior approach was described here based on going beyond the reasoned action approach. Because this paper is limited in its intended scope, there are a number of limitations to the approach pursued here. For instance, a more cognitive approach that focuses on motivating the awareness, attention, and vigilance of cybersecurity personnel for security risks and threats might be useful, particularly in conjunction with a motivating cybersecurity behavior approach. Such a cognitive approach is a clear avenue for further conceptualizations for understanding and enhancing cybersecurity in organizations.

From the conceptualizations, it is clear that interventions will have the greatest impact by targeting the beliefs associated with personal and social norms, perceived behavioral control, and attitudes toward cybersecurity behaviors. These interventions can be the traditional approaches of organizations such as training (Brummel et al., 2019) as well as those involving persuasive appeals and attitude change (Fishbein and Ajzen, 1975). It is via belief formation and change that intentions regarding cybersecurity will be changed – which will result in the desired cybersecurity behavior change (Pfleeger and Caputo, 2012). Belief-based interventions for cybersecurity change will also provide the basis for developing trust (Pfleeger and Caputo, 2012), augment the effects of selection (Mueller-Hanson and Garza, 2019), and facilitate training (Brummel et al., 2019). Moreover, by changing employees’ beliefs about cybersecurity, there can be a more effective integration of the human with technology for pursuing cybersecurity behaviors. In these ways, by utilizing reasoned action conceptualizations, it is possible to articulate how behavioral and organizational sciences can contribute to cybersecurity (Dreibelbis et al., 2018).

Figures

Figure 1. The theory of planned behavior

Figure 2. A beyond reasoned action conceptualization for safety and security behaviors

References

Ajzen, I. (1991), “The theory of planned behavior”, Organizational Behavior and Human Decision Processes, Vol. 50 No. 2, pp. 179-211, doi: 10.1016/0749-5978(91)90020-T.

Ajzen, I. (2002), “Residual effects of past on later behavior: habituation and reasoned action principles”, Personality and Social Psychology Review, Vol. 6 No. 2, pp. 107-122, doi: 10.1207/S15327957PSPR0602_02.

Ajzen, I. (2011), “Job satisfaction, effort, and performance: a reasoned action perspective”, Contemporary Economics, Vol. 5 No. 4, pp. 32-43, doi: 10.5709/ce.1897-9254.26.

Ajzen, I. and Fishbein, M. (1980), Understanding Attitudes and Predicting Social Behavior, Prentice-Hall, Englewood Cliffs, NJ.

Ajzen, I. and Sheikh, S. (2013), “Action versus inaction: anticipated affect in the theory of planned behavior”, Journal of Applied Social Psychology, Vol. 43 No. 1, pp. 155-162, doi: 10.1111/j.1559-1816.2012.00989.x.

Ashkanasy, N.M., Wilderom, C.P.M. and Peterson, M.F. (2000), Handbook of Organizational Culture and Climate, Sage, Thousand Oaks, CA.

Austin, J.T. and Villanova, P. (1992), “The criterion problem: 1917–1992”, Journal of Applied Psychology, Vol. 77 No. 6, pp. 836-874, doi: 10.1037/0021-9010.77.6.836.

Bagozzi, R.P. and Warshaw, P.R. (1990), “Trying to consume”, Journal of Consumer Research, Vol. 17 No. 2, pp. 127-140, doi: 10.1086/208543.

Bamberg, S., Ajzen, I. and Schmidt, P. (2003), “Choice of travel mode in the theory of planned behavior: the roles of past behavior, habit, and reasoned action”, Basic and Applied Social Psychology, Vol. 25 No. 3, pp. 175-187, doi: 10.1207/S15324834BASP2503_01.

Bandura, A. (1997), Self-efficacy: The Exercise of Control, W. H. Freeman, New York.

Bargh, J.A. (1994), “The four horsemen of automaticity: awareness, intention, efficiency, and control in social cognition”, in Wyer, R.S. Jr and Srull, T.K. (Eds), Handbook of Social Cognition: Basic Processes; Applications, Lawrence Erlbaum Associates, pp. 1-40.

Barrick, M.R. and Mount, M.K. (1991), “The big five personality dimensions and job performance: a meta-analysis”, Personnel Psychology, Vol. 44 No. 1, pp. 1-26, doi: 10.1111/j.1744-6570.1991.tb00688.x.

Barrick, M.R., Mount, M.K. and Judge, T.A. (2001), “Personality and performance at the beginning of the new millennium: what do we know and where do we go next?”, International Journal of Selection and Assessment, Vol. 9 Nos 1-2, pp. 9-30, doi: 10.1111/1468-2389.00160.

Betts, K.R. and Hinsz, V.B. (2012), “Anticipated affect predicts food safety practices”, Presented at the Annual Meeting of the Society for Personality and Social Psychology, San Diego, CA.

Bitzer, E.G., Chen, P.Y. and Johnston, R.G. (2009), “Security in organizations: expanding the frontier of industrial-organizational psychology”, in Hodgkinson, G.P. and Ford, J.K. (Eds), International Review of Industrial and Organizational Psychology, John Wiley & Sons, Vol. 24, pp. 131-150, doi: 10.1002/9780470745267.ch4.

Brummel, B.J., Hale, J. and Mol, M.J. (2019), “Training cyber security personnel”, in Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (Eds), Psychosocial Dynamics of Cyber Security, Routledge, pp. 217-239.

Burns, S. and Roberts, L. (2013), “Applying the Theory of Planned Behaviour to predicting online safety behaviour”, Crime Prevention and Community Safety, Vol. 15 No. 1, pp. 48-64, doi: 10.1057/cpcs.2012.13.

Burns, A.J., Roberts, T.L., Posey, C., Bennett, R.J. and Courtney, J.F. (2018), “Intentions to comply versus intentions to protect: a VIE theory approach to understanding the influence of insiders' awareness of organizational SETA efforts”, Decision Sciences, Vol. 49 No. 6, pp. 1187-1228, doi: 10.1111/deci.12304.

Carr, S.C., Hopner, V., Hakim, M.A., Hodgetts, D.J., Chamberlain, K., Nelson, N., Ball, R. and Jones, H. (2021), “Scaling the security staircase”, Political Psychology, Vol. 42 No. 4, pp. 575-595, doi: 10.1111/pops.12715.

Cialdini, R.B., Bator, R.J. and Guadagno, R.E. (1999), “Normative influences in organizations”, in Thompson, L.L., Levine, J.M. and Messick, D.M. (Eds), Shared Cognition in Organization: The Management of Knowledge, Lawrence Erlbaum Associates, pp. 195-211, doi: 10.4324/9781410603227-9.

Cooke, N.J. and Winner, J.L. (2007), “Human factors of homeland security”, Reviews of Human Factors and Ergonomics, Vol. 3 No. 1, pp. 79-110, doi: 10.1518/155723408X299843.

Coovert, M.D., Dreibelbis, R. and Borum, R. (2019), “Factors influencing the human–technology interface for effective cyber security performance”, in Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (Eds), Psychosocial Dynamics of Cyber Security, Routledge, pp. 267-290.

Dalal, R.S., Howard, D.J., Bennett, R.J., Posey, C., Zaccaro, S.J. and Brummel, B.J. (2022), “Organizational science and cybersecurity: abundant opportunities for research at the interface”, Journal of Business and Psychology, Vol. 37, pp. 1-29, doi: 10.1007/s10869-021-09732-9.

Dreibelbis, R.C., Martin, J., Coovert, M.D. and Dorsey, D.W. (2018), “The looming cybersecurity crisis and what it means for the practice of industrial and organizational psychology”, Industrial and Organizational Psychology, Vol. 11 No. 2, pp. 346-365, doi: 10.1017/iop.2018.3.

Fishbein, M. and Ajzen, I. (1975), Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research, Addison-Wesley, Reading, MA.

Fishbein, M. and Ajzen, I. (2010), Predicting and Changing Behavior: The Reasoned Action Approach, Psychology Press, New York, doi: 10.4324/9780203838020.

Godin, G., Conner, M. and Sheeran, P. (2005), “Bridging the intention-behaviour ‘gap’: the role of moral norm”, British Journal of Social Psychology, Vol. 44 No. 4, pp. 497-512, doi: 10.1348/014466604X17452.

Gratian, M., Bandi, S., Cukier, M., Dykstra, J. and Ginther, A. (2018), “Correlating human traits and cyber security behavior intentions”, Computers and Security, Vol. 73, pp. 345-358, doi: 10.1016/j.cose.2017.11.015.

Hagger, M.S., Hamilton, K., Phipps, D.J., Protogerou, C., Zhang, C.-Q., Girelli, L., Mallia, L. and Lucidi, F. (2023), “Effects of habit and intention on behavior: meta-analysis and test of key moderators”, Motivation Science, Vol. 9 No. 2, pp. 73-94, doi: 10.1037/mot0000294.

Higgins, E.T. (1997), “Beyond pleasure and pain”, American Psychologist, Vol. 52 No. 12, pp. 1280-1300, doi: 10.1037/0003-066X.52.12.1280.

Higgins, E.T. (2006), “Value from hedonic experience and engagement”, Psychological Review, Vol. 113 No. 3, pp. 439-460, doi: 10.1037/0033-295X.113.3.439.

Hinsz, V.B. and Nickell, G.S. (2009), “Food processing workers can be the first line of defense against intentional contamination”, in Bellinghouse, V.C. (Ed.), Food Processing: Methods, Techniques, and Trends, Nova Science, Hauppauge, NY, pp. 323-330.

Hinsz, V.B. and Nickell, G.S. (2015), “The prediction of food safety intentions and behavior with job attitudes and the reasoned action approach”, Journal of Work and Organizational Psychology, Vol. 31 No. 2, pp. 91-100, doi: 10.1016/j.rpto.2015.03.001.

Hinsz, V.B. and Ployhart, R.E. (1998), “Trying, intentions, and the processes by which goals influence performance: an empirical test of the theory of goal pursuit”, Journal of Applied Social Psychology, Vol. 28 No. 12, pp. 1051-1066, doi: 10.1111/j.1559-1816.1998.tb01667.x.

Hinsz, V.B., Nickell, G.S. and Park, E.S. (2007), “The role of work habits in the motivation of food safety behaviors”, Journal of Experimental Psychology: Applied, Vol. 13 No. 2, pp. 105-114, doi: 10.1037/1076-898X.13.2.105.

Hodgetts, D., Hopner, V., Carr, S., Bar-Tal, D., Liu, J.H., Saner, R., Yiu, L., Horgan, J., Searle, R.H., Massola, G., Hakim, M.A., Marai, L., King, P. and Moghaddam, F. (2023), “Human security psychology: a linking construct for an eclectic discipline”, Review of General Psychology, Vol. 27 No. 2, pp. 177-193, doi: 10.1177/10892680221109124.

Hull, C.L. (1943), Principles of Behavior: An Introduction to Behavior Theory, Appleton-Century-Crofts, New York.

Jajodia, S., Liu, P., Swarup, V. and Wang, C. (2010), Cyber Situational Awareness: Issues and Research, Springer, New York.

Jose, I., LaPort, K. and Trippe, D.M. (2019), “Requisite attributes for cyber security personnel and teams: cyber risk mitigation through talent management”, in Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (Eds), Psychosocial Dynamics of Cyber Security, Routledge, pp. 167-193.

Kanfer, R. (1990), “Motivation theory and industrial and organizational psychology”, in Dunnette, M.D. and Hough, L.M. (Eds), Handbook of Industrial and Organizational Psychology, 2nd ed., Vol. 1, Consulting Psychologists Press, pp. 75-170.

Kennison, S.M. and Chan-Tin, E. (2020), “Taking risks with cybersecurity: using knowledge and personal characteristics to predict self-reported cybersecurity behaviors”, Frontiers in Psychology, Vol. 11, 546546, doi: 10.3389/fpsyg.2020.546546.

Lord, R.G. and Kanfer, R. (2002), “Emotions and organizational behavior”, in Lord, R.G., Klimoski, R. and Kanfer, R. (Eds), Emotions in the Workplace: Understanding the Structure and Role of Emotions in Organizational Behavior, Jossey-Bass, pp. 5-19.

Lord, R.G., Klimoski, R.J. and Kanfer, R. (2002), Emotions in the Workplace: Understanding the Structure and Role of Emotions in Organizational Behavior, Jossey-Bass, San Francisco, CA.

Mellers, B.A. and McGraw, A.P. (2001), “Anticipated emotions as guides to choice”, Current Directions in Psychological Science, Vol. 10 No. 6, pp. 210-214, doi: 10.1111/1467-8721.00151.

Mueller-Hanson, R. and Garza, M. (2019), “Selection and staffing of cyber security positions”, in Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (Eds), Psychosocial Dynamics of Cyber Security, Routledge, pp. 194-216.

Neal, D.T., Wood, W. and Quinn, J.M. (2006), “Habits – a repeat performance”, Current Directions in Psychological Science, Vol. 15 No. 4, pp. 198-202, doi: 10.1111/j.1467-8721.2006.00435.x.

Nickell, G.S. and Hinsz, V.B. (2009), “Eliciting food production workers' beliefs about factors that contribute to potential contamination”, in Bellinghouse, V.C. (Ed.), Food Processing: Methods, Techniques, and Trends, Nova Science, pp. 565-573.

Nickell, G.S. and Hinsz, V.B. (2011), “Having a conscientious personality helps an organizational climate of food safety predict food safety behavior”, in Walsch, M.B. (Ed.), Food Supplies and Food Safety: Production, Conservation and Population Impact, Nova Science, pp. 189-198.

Nickell, G.S. and Hinsz, V.B. (2023), “Applying the theory of planned behavior to understand workers' production of safe food”, Journal of Work and Organizational Psychology, Vol. 39 No. 2, pp. 89-100, doi: 10.5093/jwop2023a10.

Nickell, G.S., Hinsz, V.B. and Park, E.S. (2005), “Using normative information to encourage food processing workers to keep food clean”, in Maunsell, B. and Bolton, D.J. (Eds), Food Safety Risk Communication: The Message and Motivational Strategies, Teagasc – The National Food Centre, pp. 99-109.

Ohly, S., Göritz, A.S. and Schmitt, A. (2017), “The power of routinized task behavior for energy at work”, Journal of Vocational Behavior, Vol. 103, Part B, pp. 132-142, doi: 10.1016/j.jvb.2017.08.008.

Ouellette, J.A. and Wood, W. (1998), “Habit and intention in everyday life: the multiple processes by which past behavior predicts future behavior”, Psychological Bulletin, Vol. 124 No. 1, pp. 54-74, doi: 10.1037/0033-2909.124.1.54.

Park, E.S., Hinsz, V.B. and Nickell, G.S. (2015), “Regulatory fit theory at work: prevention focus' primacy in safe food production”, Journal of Applied Social Psychology, Vol. 45 No. 7, pp. 363-373, doi: 10.1111/jasp.12302.

Parker, C.P., Baltes, B.B., Young, S.A., Huff, J.W., Altmann, R.A., Lacost, H.A. and Roberts, J.E. (2003), “Relationships between psychological climate perceptions and work outcomes: a meta-analytic review”, Journal of Organizational Behavior, Vol. 24 No. 4, pp. 389-416, doi: 10.1002/job.198.

Parker, S.K., Winslow, C.J. and Tetrick, L.E. (2019), “Designing meaningful, healthy, and effective cyber security work”, in Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (Eds), Psychosocial Dynamics of Cyber Security, Routledge, pp. 240-266.

Pfleeger, S.L. and Caputo, D.D. (2012), “Leveraging behavioral science to mitigate cybersecurity risk”, Computers and Security, Vol. 31 No. 4, pp. 597-611, doi: 10.1016/j.cose.2011.12.010.

Pfleeger, S.L., Sasse, M.A. and Furnham, A. (2014), “From weakest link to security hero: transforming staff security behavior”, Journal of Homeland Security and Emergency Management, Vol. 11 No. 4, pp. 489-510, doi: 10.1515/jhsem-2014-0035.

Posey, C., Roberts, T.L., Lowry, P.B., Bennett, R.J. and Courtney, J.F. (2013), “Insiders' protection of organizational information assets: development of a systematics-based taxonomy and theory of diversity for protection-motivated behaviors”, MIS Quarterly, Vol. 37 No. 4, pp. 1189-1210, doi: 10.25300/MISQ/2013/37.4.09.

Predd, J., Pfleeger, S.L., Hunker, J. and Bulford, C. (2008), “Insiders behaving badly”, IEEE Security and Privacy, Vol. 6 No. 4, pp. 66-70, doi: 10.1109/MSP.2008.87.

Proctor, R.W. and Chen, J. (2015), “The role of human factors/ergonomics in the science of security: decision making and action selection in cyberspace”, Human Factors, Vol. 57 No. 5, pp. 721-727, doi: 10.1177/0018720815585906.

Rentsch, J.R. (1990), “Climate and culture: interaction and qualitative differences in organizational meanings”, Journal of Applied Psychology, Vol. 75 No. 6, pp. 668-681, doi: 10.1037/0021-9010.75.6.668.

Richard, R., van der Pligt, J. and de Vries, N. (1996), “Anticipated affect and behavioral choice”, Basic and Applied Social Psychology, Vol. 18 No. 2, pp. 111-129, doi: 10.1207/s15324834basp1802_1.

Roberts, B.W. and Hogan, R. (2001), Personality Psychology in the Workplace, American Psychological Association, doi: 10.1037/10434-000.

Ronis, D.L., Yates, J.F. and Kirscht, J.P. (1989), “Attitudes, decisions, and habits as determinants of repeated behavior”, in Pratkanis, A.R., Breckler, S.J. and Greenwald, A.G. (Eds), Attitude Structure and Functions, Erlbaum, pp. 213-239.

Sandberg, T. and Conner, M. (2008), “Anticipated regret as an additional predictor in the theory of planned behavior: a meta-analysis”, British Journal of Social Psychology, Vol. 47 No. 4, pp. 589-606, doi: 10.1348/014466607X258704.

Sasse, M.A. and Flechais, I. (2005), “Usable security: why do we need it? How do we get it?”, in Cranor, L.F. and Garfinkel, S. (Eds), Security and Usability: Designing Secure Systems that People Can Use, O'Reilly Media, pp. 13-30.

Shappie, A.T., Dawson, C.A. and Debb, S.M. (2020), “Personality as a predictor of cybersecurity behavior”, Psychology of Popular Media, Vol. 9 No. 4, pp. 475-480, doi: 10.1037/ppm0000247.

Sommestad, T., Karlzén, H. and Hallberg, J. (2019), “The theory of planned behavior and information security policy compliance”, Journal of Computer Information Systems, Vol. 59 No. 4, pp. 344-353, doi: 10.1080/08874417.2017.1368421.

State of the State Address (2022), “Governor Doug Burgum 2022 state of the state address”, available at: https://www.governor.nd.gov/news/burgum-outlines-efforts-address-workforce-grow-economy-2022-state-state-address (accessed 17 February 2022).

Triandis, H.C. (1977), Interpersonal Behavior, Brooks/Cole, Monterey, CA.

Vance, A., Siponen, M. and Pahnila, S. (2012), “Motivating IS security compliance: insights from habit and protection motivation theory”, Information and Management, Vol. 49 Nos 3-4, pp. 190-198, doi: 10.1016/j.im.2012.04.002.

Verplanken, B. (2006), “Beyond frequency: habit as a mental construct”, British Journal of Social Psychology, Vol. 45 No. 3, pp. 639-656, doi: 10.1348/014466605X49122.

Verplanken, B. and Aarts, H. (1999), “Habit, attitude, and planned behavior: is habit an empty construct or an interesting case of automaticity?”, European Review of Social Psychology, Vol. 10 No. 1, pp. 101-134, doi: 10.1080/14792779943000035.

Verplanken, B. and Orbell, S. (2003), “Reflections on past behavior: a self-report index of habit strength”, Journal of Applied Social Psychology, Vol. 33 No. 6, pp. 1313-1330, doi: 10.1111/j.1559-1816.2003.tb01951.x.

Wallace, C. and Chen, G. (2006), “A multilevel integration of personality, climate, self-regulation, and performance”, Personnel Psychology, Vol. 59 No. 3, pp. 529-557, doi: 10.1111/j.1744-6570.2006.00046.x.

Weiss, H.M. and Ilgen, D.R. (1985), “Routinized behavior in organizations”, Journal of Behavioral Economics, Vol. 14 No. 1, pp. 57-67, doi: 10.1016/0090-5720(85)90005-1.

Wiederhold, B.K. (2014), “The role of psychology in enhancing cybersecurity”, Cyberpsychology, Behavior, and Social Networking, Vol. 17 No. 3, pp. 131-132, doi: 10.1089/cyber.2014.1502.

Wood, W. (2019), Good Habits, Bad Habits, Picador, New York.

Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (2019a), Psychosocial Dynamics of Cyber Security, Routledge, New York.

Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (2019b), “The psychosocial dynamics of cyber security: an overview”, in Zaccaro, S.J., Dalal, R.S., Tetrick, L.E. and Steinke, J.A. (Eds), Psychosocial Dynamics of Cyber Security, Routledge, pp. 1-12.

Zohar, D. (2003), “Safety climate: conceptual and measurement issues”, in Quick, J.C. and Tetrick, L.E. (Eds), Handbook of Occupational Health Psychology, American Psychological Association, pp. 123-142, doi: 10.1037/10474-006.

Corresponding author

Verlin B. Hinsz can be contacted at: verlin.hinsz@ndsu.edu
