Carol S. Saunders is affiliated with the University of South Florida. She has received the
LEO Award from the Association of Information Systems (AIS) and the Lifetime Achieve-
ment Award from the Organizational Communication & Information Systems Division of
the Academy of Management. She served or is serving on numerous editorial boards,
including a three-year term as Editor-in-Chief of MIS Quarterly. Her articles appear in top-
ranked management, information systems, computer science, and communication journals.
She currently is the AIS Vice President of Publications.
EMOTIONAL AND
COGNITIVE OVERLOAD
The Dark Side of Information
Technology
Typeset in Bembo
by Taylor & Francis Books
CONTENTS
List of Illustrations vi
Acknowledgements vii
Glossary 134
References 142
Index 166
ILLUSTRATIONS
Figures
1.1 The blender approach to understanding overload 5
3.1 Emotional-Cognitive Model of Overload (ECOM) 45
5.1 Information Technology Dark Side Diamond 77
5.2 Work-life balance continuum (adapted from Sarker, Xiao, Sarker & Ahuja, 2012) 84
Tables
1.1 Comparison of brain overload in blenders and people 7
3.1 Summary of issues in processing and output for expert versus non-expert 50
5.1 Summary of the Information Technology dark side diamond 97
6.1 Operationalization of IT-related overload with item loadings 108
6.2 Operationalization of memories of past cognitive and emotional overload with item loadings 109
Boxes
3.1 Chris and Alix 37
3.2 Example application of the Emotional-Cognitive Overload Model 51
5.1 Anna and David 76
ACKNOWLEDGEMENTS
Charles Darwin (1871), a naturalist best known for his contributions to the science
of evolution, wrote, “It has often been said that no animal uses any tool” (p. 51).
Darwin challenged this 19th-century statement through his own observations
and those of his colleagues. For example, Darwin noted that Asian elephants would
repel flies by waving a branch in their trunks. Interestingly, the elephants would
first fashion the branch into a tool by removing side branches or shortening the
stem. Earlier, Savage and Wyman (1843–1844) reported that chimpanzees in their
natural habitat use stones to crack fruits. They also devise sticks for hunting
prosimians. Later, Köhler (1917/1925) observed that great apes restructure their
environment to reach food. Thus, wild animals adapt tools to make them more efficient
and use them to enhance their chances of survival. It is indeed more efficient for
the elephant to have the right tool for chasing flies away than to rely on the length
of its trunk. Yet, animals do not exhibit the full scope of intelligence observable in
humans. Evolutionary research has related the use of tools with the development of
hominid brains (Wrangham, 1994; Carvalho, Cunha, Sousa, & Matsuzawa, 2008;
Sanz & Morgan, 2013). Our early hominid ancestors, such as Ardipithecus, were
capable of making simple tools (Panger, Brooks, Richmond, & Wood, 2002;
Roche, Blumenschine, & Shea, 2009). Neanderthals displayed their abilities in
handling complex Paleolithic tools for their survival. Through evolution, the better
early hominids designed and handled complex tools, the smarter and fitter they
became. Early hominids’ use of tools, like ours today, was goal-driven and made it
possible to accumulate exogenous resources and conserve endogenous ones.

In today’s digital age, however, the design and use of ‘digital tools’ such as the
smartphone is causing some
concern. The popular press is full of revelations of this First World problem. For
example, Tristan Harris, a former product manager and design ethicist at Google,
recently declared war on smartphones. He stated in an interview with Rachel Metz
for the MIT Technology Review:
It’s so invisible what we’re doing to ourselves.… It’s like a public health crisis.
It’s like cigarettes, except because we’re given so many benefits, people can’t
actually see and admit the erosion of human thought that’s occurring at the
same time.
(Cited in Metz, 2017)
Research has demonstrated that even the absence of a smartphone in one’s pocket
can be a cause for concern. Specifically, phone owners have been reporting
‘phantom vibration syndrome’. In this syndrome, the phone owner is so used to
receiving messages that her body perceives that the phone is vibrating and deli-
vering information even when it is not (Drouin, Kaiser, & Miller, 2012). Nicholas
Carr (2017), in his article “How Smartphones Hijack Our Minds”, reported
research documenting the addictive nature of the smartphone and its weakening
effect on the brain. People are becoming too dependent on their smartphone, and
their ability to think and make sound judgements is decreasing. Carr concluded
from his readings that when a smartphone’s proximity increases, brainpower
decreases. In a similar vein, Hancock (2014) now muses over whether current
technology engenders stupidity rather than cures it.
The smartphone is not the only Information Technology (IT) that has a dark
side. The popular press is full of accounts about the dark side of other types of IT:
information overload, email fatigue, iDisorders, technostress, or social media junk-
ies to name just a few. Though clearly these advanced technologies have many
wonderful uses, their dire consequences for users’ behaviour and stress are generating
societal concern. However, IT itself is not the problem. Rather it is how IT is
actually used that can lead to good or dire consequences. When it is not used well,
the dark side of IT is unveiled.
We are particularly concerned with two ‘dark side’ challenges: IT-related over-
load and IT addiction. We define IT-related overload as the state of being challenged
in processing information used in IT-related activities. Rather than focus on the
amount (i.e., input) or symptoms (i.e., output) of overload, we seek to unlock the
black box of the mind and focus on mental processes. That is, we are concerned
with a form of brain overload, or the inability to adequately process input and handle
the associated brain load. We define brain load as the emotional and cognitive
efforts required by individuals to appraise and process inputs using the resources
available to them. Further, we define IT addiction as the state of being challenged in
balancing IT usage mindfully so as to preserve one’s resources.
When used well, we view Information Technologies as powerful tools. In parti-
cular, we view them as exogenous resources – digital tools that may require our
Information Technology’s dark side 3
Brain overload
The dark side of IT has exponentially increased in the last half-century as a result of
the introduction of new digital tools such as the Internet, email, smartphones, and
Social Networking Systems (SNSs). Indeed, since the commercialization of the Inter-
net skyrocketed shortly after the introduction of web browsers, we find ourselves
increasingly inundated with information in the form of requests, advertising, pop-
ups, new apps, emails, or text messages delivered by various technologies. We are
deluged with information that is continuously being pushed at us by others or
pulled by us from the Internet and a myriad of other technologies because we feel
compelled to seek additional information or social contact. We face the challenge
of dealing with the huge amount of information that is omnipresent in our world.
“Never in history has the human brain been asked to track so many data points”
(Hallowell, 2005, p.58). The consequences are serious in today’s information-rich
environment. In First World countries, “contemporary society suffers from infor-
mation constipation. The steps from information to knowledge and from knowl-
edge to wisdom, and thence to insight and understanding, are held captive to the
nominal insufficiency of processing capacity” (Hancock, 2014, p.450). Managers
and employees who suffer cognitively from overload may end up making an
increasing number of errors and poor decisions while trying to process dizzying
amounts of data (Hallowell, 2005). They may also suffer emotionally from the
overload, IT addiction, and workplace stress. For example, employees working in
high-technology industries have been found to demonstrate psychosomatic symp-
toms and reduced productivity related to high mental demands (Arnetz & Wiholm,
1997; Tarafdar, Tu, Ragu-Nathan, & Ragu-Nathan, 2007). One estimate places
the cost of information overload due to “lowered productivity and throttled
innovation” at $900 billion a year (Powers, 2010, p.62).
We believe that ‘brain overload’ is a better term to describe the phenomenon
more commonly called ‘information overload’. Processing the information that
Information Technologies deliver is brain-related and heavily reliant upon available
resources. Therefore, brain overload is a function of the brain (e.g., processor) and
not information (e.g., input). While the consequences of brain overload have been
reported frequently in the literature, they systematically have been attributed to
situations characterized by too much data, information, or connectivity. The focus
has been on the input and the output rather than on the cognitive processes (i.e.,
black box).
More than four decades ago, Simon (1971) pointed out the challenges of processing
so much information and the need for attention resources to do so. He wrote,

What information consumes is rather obvious: it consumes the attention of its
recipients. Hence a wealth of information creates a poverty of attention.
(Simon, 1971)

Indeed, there has always been a lot of data in the world. Not many of us have read
all the books in a library. Libraries are not blamed for causing information overload
– technologies, especially email, are.
Hi there, Thanks for your mail, which I regrettably will not read since I’m working
away from the office. I’ll be back, however, on the 4th of May fully charged. So if your
email is still relevant after then, please send it again or otherwise it’ll end up in the heap
of mails that I’ll unlikely respond to. Even better, if the matter is urgent, give me a call
at +XXXXXXX. Have a good one – Corey
PS – join the fight against email fatigue and let others know that email, while
helpful, shouldn’t be a substitute for face-to-face or telephone communication. Together,
we can make the world a less stressful place.
Not everyone is afraid of brain overload in today’s digital world. In fact, some
people enjoy it and impatiently wait for the next tweet or text. They appreciate
the high-speed connections that allow them to leverage a vast range of information
in accomplishing a phenomenal amount of work. Slow connections leave them
bored and annoyed. These individuals might even suffer from a form of IT addiction
that compels them to stay connected for fear of losing out.
To better understand the role of the brain in processing information, we propose
a model based on cognitive theories of memory (Atkinson & Shiffrin, 1968;
Bower, 1981). In particular, we draw on both the emotional and cognitive aspects
of the brain and consider the resources necessary to fuel its processing of inputs.
We introduce our model, the Emotional-Cognitive Overload Model (ECOM),
using the metaphor of a blender.
Blender metaphor
We use the commonplace blender to explain the brain overload phenomenon.
With a blender, we normally pour in the ingredients that need to be processed and
push the button to mix/blend. This is State 1 in Figure 1.1. If the ingredients are
hard to blend or if we want a smoother consistency of blended materials, we turn
the knob to liquefy rather than blend. That is, we call on the blender’s greater
processing capabilities. For simplicity’s sake, we assume that processing abilities are
similar for most blenders. State 2 in Figure 1.1 occurs when the blender cannot handle
the processing. Finally, if there are too many ingredients for one batch in the
blender, we can blend some of them, pour that into a separate container, and then
process the remainder in another batch. If we do not process in batches, there will
be an overflow condition, which is what is happening in State 3 in Figure 1.1.

Once the brain has processed an input, it is moved to the person’s memory,
where past emotions and lifelong
experiences are organized and stored. Cognitivist theories help explain how
incoming events are coded, specific memories are constructed, memories become
consolidated so they can be appropriately associated with one another, and per-
sonality traits are encapsulated. (Personality traits are representative of the way
individuals think and behave in certain contexts.)
At this point, it also is important to understand that each individual, in a unique
way, compares each input to what is stored in memory. Only the pertinent infor-
mation then undergoes cognitive processing. By pertinence, or relevance, we mean
that a new input matches the information stored in memory. Pertinence is critical
at the starting point of our blender metaphor. In other words, pertinent informa-
tion makes sense because it fits cognitively with what is stored in the individual’s
memory. The memory uses pertinence to accept or reject inputs, therefore con-
trolling the brain load. The concept of pertinence means that not all information
that is received is processed. The idea that not all information is processed is very
different from that promulgated in much of the literature on overload. Our model
is about improving information processing and not about blaming the dizzying
amount of information that is received or the connectivity that delivers it.
This ‘amount illusion’ sees information as pouring in and relates brain overload
primarily to the amount of input. Little is said about the capability of individuals to
process the information. If one assumes that the problem people are dealing with is
too much information or too many social connections, the solution is to find ways
of filtering out what is extraneous and only allowing the needed information into
the mind for processing. This has happened to the extent that it is suggested that
technology be used as a filter or to handle email, time spent on social media, and so
on. However, in this scenario, individuals do not look for ways of improving the
processing and sparing their resources.

Pilots, for example, have been found to experience an underload situation on a
long boring flight, with their actions to elude boredom
ultimately resulting in errors. Similarly, anaesthesiologists – whose work is increas-
ingly supported by technology – when underloaded, have been found to focus
their attention on things other than their patients. When demands for their atten-
tion decrease, they have been found reading (Slagel & Weinger, 2009) or surfing
online (Saunders, Rutkowski, Pluyter, & Spanjers, 2016). Hospital administrators
are noticing their bored anaesthesiologists and are replacing many of them with
less expensive monitoring technology.
Individual differences
Once inputs have been selected for processing on the basis of their pertinence, they
are processed and stored in the person’s memory. The stored memories evolve as
individuals attempt to make sense of their own world. Each person’s memories are
very different from those of others.
Processing incoming inputs involves a certain level of effort, which calls upon
mental and physiological resources. Resources can reduce an individual’s brain load
by making the processing more efficient. Overall, resources are treated as the fuel
that runs the processing. Each person’s pool of resources is different from that of
others and depends upon how exhausted the person is. The level of resources
needed to process inputs can be compared to the different power levels in blenders.
Emotions distinguish individuals from blenders. Emotions can either help or
hinder processing of brain load. For example, memory of emotional reactions to
financial information has been found to be better than recall of the actual numbers
involved (Rose, Roberts, & Rose, 2004). Experience is encoded with a tag called a
valence, a positive or negative emotional tag attached to the events and concepts
that were activated during the prior experience. An input is congruent when its
emotional tag, or valence, matches that stored with a related item in memory.
Where there is a mismatch between the valence of the input and what is stored in
memory, processing becomes less efficient
and challenges the individual’s pool of resources. He will, for example, focus more of
his scarce attentional resources in order to understand and solve the problem.
Chunking abilities
The attentional resources of the brain are rather limited (Kahneman, 1973; Neisser,
1976). The brain can only hold seven, plus or minus two, items at a time (Miller,
1956a). Individuals become overloaded when they have to deal with more input
items than they can handle. Thus, they must learn to focus their attention and
handle input efficiently. As noted by Miller (1956b), but often omitted in the lit-
erature, the only way to efficiently process the input and to extend the amount of
information that can be processed is by chunking. Chunking occurs when individual
items are combined into blocks called chunks. How the items are organized into
chunks determines recall. In addition to its role in processing of information,
multiple features, dismayed by its multiple crashes, and unable to get the IT sup-
port that she needed.
We were asked by a large Dutch bank to investigate the possible adoption of an
innovative TV banking system that would eventually replace its current one. Most
customers were reluctant to adopt the new system. We believe that this reluctance
could be explained by IT-related ECO created from both information overload
and too many requests to use IT. To test this premise, we conducted a survey of
Dutch participants aged 16 or older; 1,857 responded from a total sample of 2,538
(Rutkowski & Saunders, 2010). We found that almost two-thirds of the partici-
pants (61%) were concerned about being cognitively overloaded with too much
information when they use new Information Technologies. Just over two-fifths
(42%) felt cognitively and emotionally overloaded with requests to use new Infor-
mation Technologies.
We concluded that requests to use new technologies can also create brain
overload conditions. Further, brain overload can be caused not only by being asked
to use too many technologies, but also by failing to intentionally forget some part
of what we have already learned (Rutkowski, Saunders, & Hatton, 2013). For
example, when the smartphone was introduced, one had to forget how to use a
traditional camera. Indeed, we now look at a screen to adjust a picture instead of
looking directly through the camera viewfinder.
Old technologies with which we are familiar may be very similar to new ones,
but different enough to be confusing. Brain overload is created when individuals
try to match the new functionalities of the software or services with the technology
they already know. If it differs, they may need to intentionally forget how they used to
interface with the old technology. Intentionally forgetting is cognitively taxing and
also contributes to feelings of burnout and rejection toward new technologies.
Overload with IT requests is similar to a component of technostress that is dis-
cussed commonly in the popular press.
Technostressed Mary
Recently, one of our young doctoral students, Mary, came up with an interesting
new strategy. Mary stated:
I decided to remove the email application on my smartphone. I cannot cope with the
constant pop-ups. They were driving me crazy. I will never be able to finish my dis-
sertation that way. Would you please send me a phone text message when I need to
check important updates for my dissertation during the weekend? I have to focus if I am
ever going to finish my PhD.
Mary’s strategy is twofold: deleting the email application from her phone and
asking us to inform her of the relevance of our emails. This meant that we would
have to send one email AND a text message in order for her to access important
messages, multiplying the technologies we use (e.g., computers and smartphones).
In order to spare some of her resources, Mary was asking to dig into our pool. Doing
so was her way of dealing with brain overload from messages delivered by technol-
ogy. We gladly accepted this somewhat self-centred request, relieving her of some of
the “growing pains with information overload” (Rutkowski & Saunders, 2010).
Technostress
Technostress, or the type of stress experienced in organizations by technology
end users as a result of their inability to cope with the demands of organiza-
tional computer usage (Tarafdar, Tu, & Ragu-Nathan, 2010), is another dark
side of IT. This stress may be induced by a surfeit of information delivered by
IT. It may also be the result of “application multitasking, constant connectivity,
information overload, frequent system upgrades and consequent uncertainty,
continual relearning and consequent job-related insecurities, and technical pro-
blems associated with the organizational use of ICT [Information and Com-
munications Technology]” (Tarafdar et al., 2010, pp.304–305). Unlike our
ECOM approach, this has not been discussed in relation to emotions, cognitions,
or resources.
The term ‘technostress’ is interesting but confusing as it seems to suggest that
technologies bring on the stress. According to our model, the technology is not
to blame. Rather, we argue that the stress is created by a lack of available
resources or impulse control. Some individuals may never have experienced
technostress even when juggling many technologies. Mary, however, ended up upset
just from hearing the constant ‘beep’ of her phone whenever an email landed. We
believe her
stress is more symptomatic of a lack of resources than it is a function of the
information received. It arrives at a moment in time when she needs to leverage
her pool of resources to the maximum in order to finish a very relevant task –
completing her PhD.
Tarafdar and colleagues have identified five major creators, or components, of
technostress: techno-overload, techno-invasion, techno-complexity, techno-
insecurity, and techno-uncertainty (e.g., Tarafdar et al., 2007; Ragu-Nathan, Tar-
afdar, Ragu-Nathan, & Tu, 2008; Tarafdar et al., 2010). At the organizational level,
technostress has been found to lead to increased role stress and reduced productivity,
end-user performance, and end-user satisfaction. These findings are discussed in
greater detail in Chapter 5.
Interestingly, technostress has been strongly related to compulsive behaviours
(Lee, Chang, Lin, & Cheng, 2014), which are often associated with addiction.
Further, drug addiction has been found to display the same underlying symptoms
as SNS or Internet addiction (both types of IT addiction) (Goeders, 2003). In
particular, “SNS addiction incorporates the experience of the ‘classic’ addiction
symptoms, namely mood modification, salience, tolerance, withdrawal symptoms,
conflict, and relapse” (Kuss & Griffiths, 2011, p.3530). Brooks, Longstreet, and
Califf (2017) found technostress to be strongly and positively related to Internet
addiction.
Addictive IT behaviours
There is indeed another IT-related challenge associated with having ‘too much’
that is reaching epic proportions: too much Internet and mobile phone con-
nectivity. People in all generations are staying connected too long, and this hyper-
connectivity often leads to a range of dysfunctional behaviours including IT
addiction, excessive media multitasking, and Pathological Internet Use. Pathological
Internet Use (PIU) has four elements: (1) excessive Internet use, often associated
with a loss of sense of time or a neglect of basic drives; (2) withdrawal, including
feelings of anger, depression, and tension when the Internet is not accessible; (3) tol-
erance, including the need for better computer equipment, more software, or more
hours of use; and (4) adverse consequences, including arguments, lying, poor
school or vocational achievement, social isolation, and fatigue (Block, 2008, as
cited in Spada, 2014, p.4).
Hyperconnectivity is being reported among all age groups. Tweens (children in
the 8–12 age range) and teens (children in the 13–18 age range) are averaging over
4.5 hours and 6 hours a day, respectively, on the Internet. A quarter of the teens in
a recent survey reported reaching for their phones within five minutes of waking
up (Ipsos MediaCT & Wikia, 2013). They are texting and emailing so much that
employers of young adults accuse them of having difficulty starting and ending
conversations and being nervous when making phone calls (Colbert, Yee, &
George, 2016). And older adults (commonly called ‘silver surfers’) are also taking
advantage of access to the Internet and smartphones so that they can be in a state of
constant communication with others (Colbert et al., 2016). One study even
reported that it is parents, not teenagers or tweens, who spend the most time in
front of screens (Molina, 2017).
The challenge to ‘unplug’ is spawning new opportunities for the tourism
industry as tour operators are advocating device-free vacations. For example,
Intrepid Travel, an adventure travel company, now offers “Digital Detox Trips” in
which the participants pledge not to bring along any digital devices and must resort
to paper notebooks to record their impressions (Glusac, 2016). Renaissance Pitts-
burgh’s family detox package trades digital devices for board games and cards
during the family’s stay. Further, digital detox retreats have sprung up with offers to
disconnect, for a price; and resorts offer an ‘iPhone crèche’ where you can leave
your mobile devices. In the private sphere, the negative impacts of IT-related
overload have been linked to the exponential use of Information Technologies.
State legislatures are now providing motivation to unplug in other ways. In
Hawaii, ‘smartphone zombies’, or pedestrians so distracted by what’s on their
phones that they are oblivious when crossing streets, are fined. Further, 47 states
and the District of Columbia have banned texting while driving (Molina, 2017).
In the Net Generation, hyperconnectivity is manifesting a number of new
behaviours. Net Geners are people born after 1980; this includes the groups called
Millennials and Generation Y. Net Geners have now developed the skill of
‘phubbing’ during conversations, which means that they can maintain eye contact
while also texting. However, the eye contact may not be as meaningful as they
think, because just having the phone in sight likely reduces their conversation
partners’ perception of closeness, trust, and relationship quality (Colbert et al.,
2016). Another task that Net Geners may not be as good at performing as they
think they are is media multitasking. Media multitasking entails checking mobile
phone content as often as every 30 seconds, or even more frequently (Rosen,
Carrier, & Cheever, 2013), an activity which carries high switching costs as multitaskers shift
frequently from one task to another. This may explain why younger users of
mobile phones are significantly more likely than older users to experience overload
from information and communication messages delivered by their phones (Saunders,
Wiener, Klett, & Sprenger, 2017).
Some claim that such heavy use of smartphones can lead to a particular type of
addiction called mobile email addiction. Symptoms of this addiction are that the
mobile phone user becomes preoccupied with using the smartphone, has difficulty
in controlling or quitting the behaviour, and gets angry or frustrated when inter-
rupted (Turel & Serenko, 2010). Attention deficit hyperactivity disorder (ADHD),
depression, and social phobia as well as hostility have been identified as symptoms
of Internet addiction in adolescents (Yen, Ko, Yen, Wu, & Yang, 2007).
Mobile email addiction is viewed as one form of Internet addiction. Kandell
(1998) defined Internet addiction as psychological dependence on the Internet. The
dependence is characterized by: (1) an increasing investment of resources in Inter-
net-related activities; (2) unpleasant feelings (e.g., anxiety, depression, emptiness)
when offline; (3) an increasing tolerance to the effects of being online; and (4)
denial of the problematic behaviours (Kandell, 1998, p.11). In short, Internet
addicts find it hard to unplug from the Internet, and they suffer from withdrawal
upon doing so (Davis, 2001).
Among American psychologists and psychiatrists, there is no recognition of IT
addiction (i.e., Internet, SNS, or mobile email addictions) or stress. That is, no
form of technology addiction or technostress is listed in the current version of the
Diagnostic and Statistical Manual of Mental Disorders (DSM-5), which contains a formal
list of mental disorders. This is because many believe that the term ‘addiction’
should only be used in respect to chemical substances (Turel & Serenko, 2010) or
when the person has a physiological dependence on some stimulus, which is
usually a substance (Davis, 2001). Others believe that a common set of symptoms
and diagnosis criteria are missing (Turel & Serenko, 2010). Hence, in this book we
use the term Pathological Internet Use to describe the behaviours described in the
literature as IT addiction. As we discuss in Chapter 4, the lack of control con-
sciously exerted by the brain during information processing contributes heavily to
IT addiction. These behaviours can be specific or general. They are considered
specific when a person is dependent on a particular function of the Internet such as
online auction services, sexual material/services, or gambling. They are considered
general when the Internet is overused in such cases where people waste time
online without a clear objective. But whether it is called IT addiction, Internet
addiction, specific PIU, or general PIU, it is a force to be dealt with in our society.
In the rest of this book, we will tell you why. In addressing this force as a society,
we can reap the benefits of technology while staving off its harmful effects.
In the mid-1800s, Ignaz Semmelweis was puzzled when he noted that the
number of deaths caused by puerperal (childbed) fever was more than three
times higher in the obstetrical clinic he was supervising than in another comparable
obstetrical clinic in the same hospital, Vienna General Hospital. Mothers
begged to be sent to the clinic that was not run by Semmelweis. In his investi-
gation, he observed that the two clinics shared the same climate and that his
clinic had far fewer patients. The first step in solving the mortality rate mystery
was when Semmelweis read the pathology report of a doctor who had died after
being infected by an accidental poke from a student’s scalpel while he had been
performing an autopsy in Vienna General Hospital. Semmelweis realized that the
doctor’s autopsy displayed a pathology similar to that of the women who were
dying from puerperal fever in his clinic. He quickly linked the cadaveric con-
tamination to puerperal fever. To fend off the pathology, he proposed having all
doctors in his clinic use a chlorine handwash. That practice reduced mortality
rates to less than 1 per cent in his clinic. The handwashing practice was at odds
with the established scientific and medical thinking of Semmelweis’ time, and he
could not explain why it worked so well. It took decades for the new scientific
practice, first introduced by Semmelweis in 1847, to be accepted. It was not until
Louis Pasteur developed germ theory and Joseph Lister confirmed Pasteur’s
theory that an explanation of the benefits of Semmelweis’ hygienic practices was
discovered (Wikipedia, 2017).
Kant introduced the notion of mental representation and schemata in the first
chapter of his Critique of Pure Reason (1781–1787/2003). Kant described schemata as
a form of analysis interposed between the sensory data and the abstract a priori
categories in the mind. Schemata are dual: one part is rules (i.e., logic) and the
other is empirical perception (i.e., image). Kant wrote, “This representation of a
universal procedure of the imagination in providing an image for a concept, I
entitle the schema of this concept” (A140).
Later, Diderot (1818–1819) conceptualized the mind as a metaphor– the soul’s
vessel. He argued that when the material dispositions of the brain are inadequate,
the mind is not able to navigate the body vessel. In fact, he considered the mind
to be a material entity (i.e., the brain) that, when it functions adequately, controls
the body.
The work of functionalists surely fed one of the most famous disputes in psychology:
the James–Cannon controversy on emotion (1884–1929). Cannon (1914,
1927, 1929) stated that brain activity causes both an emotional experience (e.g.,
fear) and peripheral responses (e.g., sweating), which is the central view on emo-
tion even today. James (1884, 1890, 1894) favoured a peripheral view in which
bodily responses must occur before the feeling of fear. The debate still animates
research in psychology and neuroscience (Ekman, 1984; Cobos, Sanchez, Garcia,
Vera, & Vila, 2002). Both the central and peripheral views are still present in
research on emotions and cognition, and a plethora of definitions for the concept
of emotion are actively circulating in the scientific community. In this book, we
use Scherer’s (1994) definition of emotion: the “intelligent interface that mediates
between input and output” (p.127). This means adopting, or daring to adopt, a
central view. We distinguish emotions from primary drives such as hunger (Tom-
kins, 1984) and from feelings, or the subjective experience of emotion. We consider
emotion as having a specific intentional object (Frijda, 1986), such as a ‘loved one’
or a ‘feared one’.
Interestingly, this line of reasoning smoothly shifts the mind-body supervenience
problem toward a new problem, that of cognition-emotion supervenience. In psychology,
cognition-emotion supervenience is also referred to as the ‘interplay of affect and
cognition’ or, more commonly, ‘feeling and thinking’. Obviously, the solutions
proposed in solving the controversy (i.e., central versus peripheral) have shifted as a
function of the dominant paradigm.
Cognitive revolution
Other researchers have observed shortcomings with the behaviourist approach.
Simon (1980) claimed that behaviourists did not solve important questions regard-
ing the complexity of the human mind. Lashley (1929) criticized the S-R scheme
as too simplistic. He stressed the importance of understanding the brain by focusing
on complex mental problems, especially problem-solving. While, according to the
literature, the cognitivist school is deemed to have emerged as a paradigmatic
revolution in the 1950s, Knapp and Robertson (1986) stated that “the conditions
so often regarded as necessary before cognitive psychology could develop were
present years earlier” (p.14). For example, Moore (1938) conducted research to
“throw light on the problem of how knowledge gets into the mind” (p.v). Thus,
the cognitive sciences were forming even before the 1950s. For example, the idea
of the cognitive map and spatial representation was introduced when researchers
began studying the paths of rats searching for food in labyrinths (Tolman, 1948).
The functionalist notion of intention is embedded in the very core of the mental
processes of problem-solving and decision-making.
Cognitivists challenge the S-R scheme and focus their research on how the mind
deals with information. Cognition refers to the metamorphosis that a stimulus (e.g.,
information) goes through while being processed by the human mind. Neisser
synaptic biochemical and electrical mechanisms that support the body’s activities.
Neurons’ neurosecretory cells synthesize and release neurohormones (e.g., dopa-
mine and oxytocin) that circulate through the blood and serve as biochemical
messengers.
Limbic system
The limbic system is a complex collection of structures that is commonly referred to
as the emotional brain or archaic brain. In a nutshell, it includes the amygdala, hip-
pocampus, thalamus, hypothalamus, basal ganglia, and cingulate gyrus. These
structures have been studied extensively in order to understand emotion as well as
memory. As LeDoux (1998) reported, the “limbic system itself has been a moving
target… [with] [m]ountains of data on the role of limbic area in emotion… but
there is still little understanding of how our emotions might be the product of the
limbic system” (p.158). LeDoux (1992) demonstrated that the amygdala is a locus
of synaptic plasticity underlying learned fear. Research has focused on the pathways
between sensory input to the amygdala, and on intercellular signalling mechanisms.
Authors have speculated that this part of the limbic system modulates explicit (i.e.,
declarative) memories formed in other systems (Packard, Cahill, & McGaugh,
1994). Scoville and Milner (1957) demonstrated that damage to the hippocampus
leads to a deficit in Long-Term Memory (LTM). The hypothalamus links the ner-
vous and endocrinal systems via the hypophysis. The limbic system communicates
through secretion of neurohormones and transmitters that control basic bodily
homeostatic states such as hunger, thirst, mood, and fatigue. The limbic system,
particularly the hypothalamus, is involved in social attachment behaviour through the
action of the neurohormone oxytocin. Oxytocin is also commonly called the ‘love’ or
‘cuddle’ hormone. It is a key biological parameter in understanding reproductive
behaviours, attachment to offspring, and thus survival of the species.
The limbic system also plays a role in substance addiction through dopaminergic
projection to the nervous system. Neurohormones such as dopamine are heavily
involved in the brain reward system (BRS) mechanism, which is a complex cerebral circuit engaging
specific neuronal pathways that are modulated by cortical oversight systems affili-
ated with emotion, memory, judgment, and decision-making (Makris, Oscar-
Berman, Jaffin, Hodge, Kennedy, Caviness et al., 2008). The major component of
BRS is the mesocorticolimbic reward circuit (Heimer & Van Hoesen, 2006). In
animals and humans, the BRS is responsive to positive and negative reinforcement.
Behaviourists have demonstrated that reinforcement increases the probability of a
subsequent response. When abused, drugs activate the BRS and are as addictive as
natural reinforcers such as food (Volkow & Wise, 2005).
Interestingly, researchers have found that the limbic system is tightly connected
to the prefrontal cortex (PFC) and therefore involved in many brain functions,
such as emotion, LTM, and motivation. Damasio (1994) demonstrated that ana-
tomic damage to part of the limbic system leads to inability to use affective feed-
back in judgment and decision-making. A traumatic brain injury in part of the
Prefrontal cortex
The prefrontal cortex (PFC) plays a key role when someone is dealing with information
and in decision-making (Ernst & Paulus, 2005). In particular, the PFC helps
us detect errors or recover from disruptions (Rowe, Maughan, Moran, Ford,
Briskman, & Goodman, 2010). The PFC is involved in the central executive
control system that can be broadly divided into cognitive components such as
mental set-shifting, inhibition, information updating, working memory (WM),
response monitoring, and temporal coding (Szczepanski & Knight, 2014). These
activities have proven crucial in effective decision-making. Conversely, damage to the
PFC results in impaired recollection- and familiarity-based recognition, failure to
exhibit memory advantages for novel stimuli, poor affect, socially inappropriate
decision-making, failure to use emotion in making decisions, and defective social and
moral reasoning relating to the ability to experience cognitive and emotional empa-
thy. The PFC has extensive reciprocal connections with nearly all cortical and sub-
cortical structures. (For a review, see Szczepanski & Knight, 2014.)
Advances in cognitive neurosciences and understanding of neurocognition
(brain/mind) systems (Tulving, 2002) have relied heavily on identifying biological
processes that support cognition and behaviour. This scientific practice focuses on
the neural connections in the brain that are involved in mental processes. Parts of
the brain, such as the limbic system and the PFC, play an important role in
understanding emotion and cognition. Cognitivists modelling memories have
relied heavily on advances in the field of neurophysiology. For example, the
development of functional neuroimaging techniques (e.g., positron emission tomo-
graphy and functional magnetic resonance imaging) has helped researchers under-
stand how these parts of the brain function as well as their impact on cognition and
behaviours. Despite such technical progress, there is still no comprehensive biological
map addressing the broader mind-body supervenience problem.
the cognitive system. It deals with one sensory channel at a time as it determines
what information is recognized. Broadbent’s all-or-nothing model explains the
bottleneck effect that occurs before pattern recognition. However, the model
does not account for what is known as the ‘cocktail party situation’ – that is,
when a person can be immersed in a discussion at a party and still hear her name
being mentioned in another conversation. If the stimulus is not analysed, as
Broadbent proposed, how can its relevance be detected? Nevertheless,
the Filter Model of Attention was of extreme relevance in the evolution of the
conceptualization of memory and attentional processes in the history of cognitive
psychology. It inspired researchers such as Treisman (1964) to investigate atten-
tion selection as a function of information content, and its threshold in activating
hierarchical awareness.
Deutsch and Deutsch (1963) suggested a model in which pertinence is the
key to the selection of attention. Based on this Pertinence Model, Norman
(1969) stipulated that all signals are initially analysed and then passed on to an
attenuator before further processing. However, the Pertinence Model is not
economical in terms of the cognitive system’s total load. Furthermore, it
has failed under certain experimental laboratory conditions (Treisman &
Riley, 1969).
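The contrast between Broadbent’s all-or-nothing filter and the attenuation account that followed Treisman can be caricatured in a few lines of illustrative Python. The channel names, the pertinence set, and the pass-through logic below are our own assumptions for the sketch, not part of either original model:

```python
# Illustrative contrast between an all-or-nothing filter (Broadbent) and an
# attenuator (after Treisman). The pertinence set is a stand-in for items
# with a low detection threshold, such as one's own name.
PERTINENT = {"anna"}

def broadbent_filter(channels: dict[str, list[str]], attended: str) -> list[str]:
    """All-or-nothing: only the attended channel passes on to recognition."""
    return channels[attended]

def treisman_attenuator(channels: dict[str, list[str]], attended: str) -> list[str]:
    """Unattended channels are attenuated, not blocked: highly pertinent items
    still break through -- the 'cocktail party situation'."""
    heard = list(channels[attended])
    for name, words in channels.items():
        if name != attended:
            heard += [w for w in words if w in PERTINENT]
    return heard

party = {"conversation_a": ["project", "deadline"],
         "conversation_b": ["weather", "anna"]}
print(broadbent_filter(party, "conversation_a"))     # ['project', 'deadline']
print(treisman_attenuator(party, "conversation_a"))  # ['project', 'deadline', 'anna']
```

In the sketch, only the attenuator lets the listener’s name surface from the unattended conversation, which is exactly the case the all-or-nothing model cannot account for.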
The conceptualizations of human memory and attentional resources as limited
and embedded have their roots in the pivotal article by Miller (1956a), “The
Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for
Processing Information”. Miller (1956a) quantified the mind’s limited capacity and
stated in the unitization hypothesis that the only way to increase the amount of
information being processed is “by organizing the stimulus input simultaneously
into several dimensions and successively into a sequence of chunks [so that] we
manage to break … [the] information bottleneck” (p.95). How the items are
organized into chunks determines recall. For example, memorizing and recalling
the letters ‘UAKESUU’ is harder than memorizing and recalling the same letters
introduced as ‘USA UK EU’, because they are grouped together into acronyms
that have associations with terms stored in our memory. Thus, memory limits can
be overcome by encoding items into chunks before transferring them to schemata,
forming mental representations. Mandler (1967) extended the unitization hypoth-
esis by proposing the existence of “superchunks”. The cognitive system’s ability to
overcome its structural limitations opened the way for the two major con-
ceptualizations of memory architecture: the Modal Model and the Full Working
Memory Model.
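Miller’s unitization hypothesis can be made concrete with a small sketch. The greedy matcher and the chunk inventory below are our own simplification, not Miller’s procedure: letters that match chunks already stored as schemata occupy one memory slot per chunk, so the same seven letters cost either seven slots or three.

```python
# Illustrative sketch of chunking: the chunk inventory and the capacity
# limit are assumptions for illustration only.
KNOWN_CHUNKS = {"USA", "UK", "EU"}   # schemata already stored in the LTM
CAPACITY = 7                          # Miller's "magical number seven"

def items_to_hold(letters: str) -> int:
    """Greedily match known chunks; unmatched letters cost one slot each."""
    i, slots = 0, 0
    while i < len(letters):
        for size in (3, 2):                  # try the longest chunks first
            chunk = letters[i:i + size]
            if chunk in KNOWN_CHUNKS:
                i += len(chunk)
                break
        else:
            i += 1                           # a lone letter fills one slot
        slots += 1
    return slots

print(items_to_hold("UAKESUU"))  # 7 slots: no chunk matches, at the capacity limit
print(items_to_hold("USAUKEU"))  # 3 slots: USA + UK + EU, well under the limit
```

The seven scrambled letters fill the entire 7 ± 2 span, whereas the same letters grouped into three familiar acronyms leave most of it free.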
Modal Model
The most common representation of the structure of the human memory archi-
tecture is the Modal Model (Atkinson & Shiffrin, 1968). The model combines the
short-term storage and attentional system into a single limited-capacity memory:
the Short-Term Memory (STM). The model was developed to represent the capacity
of each basic memory store in terms of time and load and is based on the
assumption of the existence of two distinct structural components, as proposed by
Broadbent: LTM and STM. Incoming sensory information initially enters the
sensory memory (SM) store, where it quickly decays and is lost.
The STM receives the selected inputs both from the SM and the LTM stores. The
model suggests that the way an input is processed depends on the particular
executive control processes that the individual activates in the STM (e.g., rehear-
sing, searching, deciding, or coding) and on matching with the information held in
the LTM.
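As an illustration only, the flow described above can be caricatured in code. The store names follow the text, but the capacity limit, method names, and decay rule are our assumptions, not Atkinson and Shiffrin’s specification: sensory traces decay unless attended, attended items enter a capacity-limited STM, and an executive control process such as rehearsal moves content into the LTM.

```python
# Toy sketch of the Modal Model's three stores (SM -> STM -> LTM).
class ModalModel:
    STM_CAPACITY = 7  # limited-capacity short-term store

    def __init__(self) -> None:
        self.stm: list[str] = []
        self.ltm: set[str] = set()

    def perceive(self, sensory_input: list[str], attended: set[str]) -> None:
        """Unattended sensory traces decay; attended items enter the STM."""
        for item in sensory_input:
            if item in attended and len(self.stm) < self.STM_CAPACITY:
                self.stm.append(item)
            # everything else simply decays in the sensory store

    def rehearse(self) -> None:
        """An executive control process transfers STM content to the LTM."""
        self.ltm.update(self.stm)

    def recognize(self, item: str) -> bool:
        """Processing depends on a match with information held in a store."""
        return item in self.stm or item in self.ltm

m = ModalModel()
m.perceive(["phone", "rain", "email"], attended={"phone", "email"})
m.rehearse()
print(m.recognize("email"))  # True: rehearsed into the LTM
print(m.recognize("rain"))   # False: decayed in the sensory store
```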
The Long-Term Memory is permanent memory that is partitioned into two types
of memory: Explicit (i.e., declarative, conscious) Memory and Implicit (i.e., non-
declarative, non-conscious) Memory. Explicit Memory is a brain construct that refers
to the conscious recollection of factual information, previous experiences, and con-
cepts. It is subdivided into the Semantic Memory that acts as a mental thesaurus and the
Episodic Memory that stores personal experiences (PFC and limbic system) (Tulving,
1972, 1983). Implicit memory is not a brain system construct since it is non-conscious. It
refers to a heterogeneous collection of abilities (Squire & Alvarez, 1995).
Working memory
Norman (1968) argued that the two stores in the Modal Model actually con-
stitute a single storage mechanism: the working memory. The WM’s role is to
activate traces leading to temporal versus permanent change in the cognitive
system itself. Later on, Baddeley and Hitch (1974) focused on the STM store and
developed the multicomponent model of the WM. In their model, the central
executive control system directs, selects, and orchestrates the flow of information
so as to overcome limited structural capacity, thus accounting for the cocktail
party situation.
The key distinction between these models concerns the STM: it is clearly limited
in the Modal Model and more flexible in the Full WM model. A strength of the
Full WM model is that it helps explain information processing in task-switching
contexts. Also, the central executive control system is directly responsible for
coordinating the information used to perform planning activities and make decisions
(Baddeley & Logie, 1999).
Later, Baddeley (2000) added the idea of an episodic buffer, which is similar in
function to Tulving’s (1972) episodic memory. Schemata guide the way information
is encoded and retrieved from the LTM based on the activation of the associated
cognitive network (Bower, 1981).
Bower’s model suggests that links between affect and thinking are neither
motivationally based… nor are they the result of merely incidental, blind
associations, as conditioning theories imply. Instead, Bower (1981) proposed
that affect, cognition and attitudes are integrally linked within an associated
network of mental representations.
(Cited in Forgas, 2003, p.599)
are often overlooked (Bargh & Ferguson, 2000). Above all, both schools consider
that “mental and behavioural processes… can proceed without the intervention of
conscious deliberation and choice” (Bargh & Ferguson, 2000, p.925). In addition,
the study of emotion and feelings initially was perceived as a curse by both paradigms.
According to cognitivist Neisser (1967) and his colleagues, the science of
‘computer-like operation’ was not about emotion. Only later did behaviourists consider
personality traits to be antecedents of behaviour (Zajonc, 1980). Finally, both
paradigms initially addressed emotion and differences in personal disposition as
nuisance variables that needed to be controlled or even ignored.
Eventually, both paradigms evolved toward a greater consideration of emotion
through the common concept of association. Behaviourists pair stimuli together
through conditioning, whereas cognitivists match stimuli to mental representations
through information processing (Skinner, 1985). Both also address emotions and
affect through concepts such as positive and negative nodes or the BRS. Further-
more, the consideration of personality disposition evolved for both. Interestingly,
both approaches even aimed to expel all vocabulary relating to mentalism– the ter-
minology of the mind, used particularly in psychoanalysis (Gardner, 1987)– from
their scientific practices.
work surely is applicable in terms of the existence of the unconscious as well as the
control it may exert in repressing emotional arousal when processing information
(Nisbett & Ross, 1980; Norman, 1980). Boden (1977) stated that cognitivists “have
to acknowledge that while theorizing purely on the verbal level and lacking any of
the rich conceptual instruments of an artificial intelligence programmer, Freud was
occupied with exactly the same problems as the present-day cognitive psycholo-
gist” (in Wegman, 1985, p.9). Therefore, cognitivists converge with Freud’s pos-
tulation that human beings are scarcely aware of how their higher-order cognitive
processes determine their behaviours (Nisbett & Wilson, 1977).
Processing inputs– meaning and sensory– involves a certain level of effort that calls
upon attentional, cognitive, emotional, and physical resources. How attentional
resources are allocated remains a key question for cognitivists. Most memory the-
oreticians think that the cognitive system has limited attentional resources (Kah-
neman, 1973; Neisser, 1976; Kahneman & Treisman, 1984). However, the debate
about the limitations of cognition is ongoing (Winograd & Neisser, 1992). We
speculate that this is because cognitivists have mostly conceptualized the super-
venience of cognition over emotion in problem-solving and decision-making
activities. We consider IT-related overload to be the state of being challenged in
processing information delivered by IT or in imposing control over IT-related
activities. This challenge is related to both emotional and cognitive resources.
Although the amount of information is commonly blamed, we argue that the
culprit is actually inadequate processing resources. The theoretical underpinnings
of ‘too much information’ or ‘too much connectivity’ are mostly supported by a
surfeit interpretation of Miller’s “Magical Number Seven” article. Too often,
researchers fail to reference his later work on the unitization hypothesis and
ignore the importance of reloading the schemata in the LTM in order to chunk
information (Sweller, 1988). Also, too little attention has been given to the
individual’s resource pool. We assume that multiple resources influence the
explicit strategy exerted to counter the limitation of attentional resources during
information processing and decision-making. Information Technology has
become one of these resources, whether it is used mindfully or not. The alloca-
tion of resources must be carefully and consciously reviewed when dealing with
IT-related overload.
Coping mindfully triggers the brain’s reward system and is therefore rewarding. It is a metacognitive
activity. How shall we stop a flood of emails without missing relevant information?
Do we want parts of the information we just deleted? Are we so focused on staying
at the top of the pecking order that we are willing to exhaust our resources by
answering all emails? Should we rely on organizational policy in doing so? Each
individual has his own answers and therefore his own strategies (see Chapter 4).
Extreme anger might stimulate someone to decide to delete all her emails, effi-
ciently dumping part of the problem without any immediate consequences. That
could be an extremely wise decision, even if it is perceived as irrational. As Frank
(1988) stated, “Many actions, purposely taken with full knowledge of their con-
sequences are irrational” (cited in LeDoux, 1998, p.36). Corey came up with a
much more subtle approach in his auto-reply (see Chapter 1).
Information overload is either conceptualized in terms of external variables such
as task requirements or personality antecedents (e.g., behaviourist, organizational
psychology view) or as the interaction of the task and the human information
processing capability or resources (e.g., cognitivist, neuroscientist view). Informa-
tion overload is therefore either conceptualized from an input perspective in the
context of personality-environment fit/misfit, regarding the demands imposed by
the task and personality, or in terms of IPC. As discussed in Chapter 1, we argue
that the term ‘brain overload’ better describes the phenomenon more commonly
called ‘information overload’. That is, brain overload focuses on the brain (e.g.,
processor) and not on the amount of input (i.e., data or information).
Interestingly, the same applies to IT addiction. It is either conceptualized as the
result of external factors such as the amount of time spent connected to IT (Korac-
Kakabadse, Kouzmin, & Korac-Kakabadse, 2001) or antecedents such as narcissism
(Buffardi & Campbell, 2008) rather than being viewed in terms of the mental
processing and heuristics stored and activated in memory linked to the goal of the
behaviour itself. Jacoby (1984) wrote that consumers use cognitive strategies to
limit the amount of information used as part of their decision-making process,
“stopping far short of overloading themselves” (p.434). Similarly, some users make
use of mental strategies and do not experience an ounce of IT addiction, whereas
others cannot stop without experiencing withdrawal (Davis, 2001; Caplan, 2003).
Our claim is that the previous paradigms have their own limitations in under-
standing a complex set of problems. Chapter 3 presents in detail the Emotional-
Cognitive Overload Model (ECOM) that we have developed, along with a review
of the management information systems overload literature. The chapters that
follow reflect on related issues.
Conclusion
Neuroscience research has demonstrated that the brain areas required for cognition
and emotion are highly interconnected (Ghashghaei & Barbas, 2002). Emotion
fosters the prioritization and organization of our behaviour (Barrett & Campos,
1987; Lazarus, 1994), providing input for information processing, judgment, and
decision-making (Frijda, 1986, 1994). In the popular and scientific literature, we
can also find discussions on and evidence of the existence of a ‘second brain’. Also
known as the mind-gut connection, the second brain relates to the gut feelings
transmitted via the stomach, oesophagus, small intestine, and colon to the CNS
(Gershon, 1999). This is exemplified in the “miracle on the Hudson”. An Airbus
A320 made an unpowered emergency landing in the Hudson River near New
York City after both engines failed because of bird strikes. Master of the Guild of
Air Pilots and Air Navigators Rick Peacock-Edwards said, “To have safely exe-
cuted this emergency ditching and evacuation, with the loss of no lives, is a heroic
and unique aviation achievement.” The captain, a former US Air Force fighter pilot,
clearly used his gut feelings and expertise to execute that manoeuvre, saving 155
lives (Rutkowski, 2016).
Behaviourists such as Zajonc (1980) have conceptualized emotion and cognition
as independent sources of effects in information processing. Emotion is thought to
have temporal priority over cognitive processes. This peripheral conceptualization,
or mind-gut connection, is surely worthwhile considering. Cognitivists on the
other hand have argued that cognitive processes are necessary for the processing,
elicitation, and experience of the emotions. Their approach is consistent with
neuroscience research demonstrating that emotion and cognition are two
sides of the same coin. Newell, Rosenbloom, and Laird (1989) stated that:
Although there are still many open controversies, one bit of scientific evidence
with considerable agreement is that the human nervous system requires energy to
keep its homeostatic and higher brain functions in balance. The nervous system
requires the consumption of endogenous and extraneous resources (Kahneman,
1973; Hobfoll, 1989). Not surprisingly, for its own functioning, the brain receives
20 per cent of the blood, oxygen, and calories supplied to the body. Thus, for the
sake of our discussion, energy is critical. The balance between energy deployed and
energy provided may be the key to understanding part of the supervenience puzzle of IT-related overload.
3
INDIVIDUAL DIFFERENCES IN
EXPERIENCING IT-RELATED
OVERLOAD
Overload situations such as those described above are all too frequent in today’s
workplace. An understanding of overload using the Emotional-Cognitive Over-
load Model (ECOM) that we present in this chapter could allow managers, like the
two described above, to actively manage their business lives instead of merely
reacting to problems. To build the ECOM we draw not only from the cognitivist
theories introduced in the last chapter, but also from the rich management information
systems (MIS) literature on overload.
MIS is tightly linked to the computational paradigm of cognition, which mani-
fested itself in the research on cognitive styles. Cognitive styles served as a basis for
MIS and decision support systems (DSS) design in the 1970s and 1980s. The
research on cognitive styles was in line with the so-called cognitive revolution and
von Neumann’s (1958) work on the computer and the brain. Cognitive style has
been studied as a constraint in implementing operations research proposals (Huys-
mans, 1970) and as an important characteristic in project teams (White, 1984;
White & Leifer, 1986). Another cognitive topic, hemispherical specialization
(Robey & Taggart, 1982), was also studied by MIS researchers at that time. The
“Minnesota experiments” (Dickson, Senn, & Chervany, 1977) influenced the field
of MIS and DSS by promulgating experimental methods that are core to the cog-
nitivist paradigm. Why did such cognitive research vanish from the constellation of
MIS research? Many attribute this to Huber’s (1983) negative critique of the
appropriateness of cognitive style in MIS and DSS design. More recently, Myers-
Briggs Type Indicators focusing on sensation, intuition, feeling, and thinking
(Barkhi, 2002) have been reintroduced. However, cognition and emotion, or
thinking and feeling, are scarcely on the MIS research map.
Earlier, Mason and Mitroff (1973) clearly stated that “what information is for
one type of person will definitively not be information for another” and that
the job of MIS designers “is not to get (or force) all types to conform to one but
give each type the kind of information he is psychologically attuned to and will use
most effectively” (p.478). Obviously, the MIS discipline was sidetracked from this
goal. While we are still aiming at serving users through effective design, we end up
facing unexpected consequences of IT-related overload. This chapter focuses on
what past research, particularly from the MIS literature, can tell us about this
overload situation and what insights our model, the ECOM, can add.
Definitions of overload
In our reading of the literature, we found that many researchers of overload do not
define the term. Perhaps the authors of these studies think that the concept of
overload is so straightforward that it does not need definition. On the contrary, we
find the concept of overload to be quite complex. Fortunately, more recent
research has tended to offer definitions to clarify the type of overload that is being
studied, though these definitions can differ.
In particular, overload is often defined in terms of input (i.e., electronic junk,
data smog, avalanche of data, informational load), output (i.e., information fatigue
syndrome, analysis paralysis, mental stress, technostress), or a combination of both.
It may be reduced to the ‘number of inputs’ (e.g., amount of data, ideas, messages,
emails) generated by IT usage, such as groupware tools. When overload is defined,
the focus is typically on information or communication overload, or having more
information or communication than can be assimilated, processed, or observed.
Overload is also described as a paradox – for example, “we are not receiving
enough information, too much information is thrown at us” (Koeniger & Jano-
witz, 1995, p.5)– a situation with time pressure– for example, “too many things to
do at once” (Grise & Gallupe, 1999–2000, p.161)– a consequence of lack of
structure and organization in a system (Hiltz & Turoff, 1985), or a symptom of a
failure to create “high-quality” information for management use (Simpson &
Prusak, 1995, p.413).
A few researchers view overload as multidimensional (i.e., qualitative or quanti-
tative) and link it to the performance of a task or role (Ahuja & Thatcher, 2005;
Tarafdar et al., 2007). Quantitative overload is defined as “an individual’s perception
that they cannot perform a task because they lack critical resources” (Ahuja &
Thatcher, 2005, p.435). Qualitative overload is defined as the situation where
“employees perceive assigned work as exceeding their capability or skill levels”
(Ahuja & Thatcher, 2005, p.436) or where there is “a lack of knowledge pressure”
(Pennington, Kelton, & DeVries, 2006, p.26). There is a common thread of inca-
pacity in all of these multidimensional views of overload.
More recently, Karr-Wisniewski and Lu (2010) introduced the concept of
“technology overload” with three dimensions: information overload, communica-
tion overload, and system feature overload. Information overload is by far the most
common type of overload in the literature. It occurs “when an individual’s infor-
mation processing capabilities are exceeded by the information processing require-
ments” (Karr-Wisniewski & Lu, 2010, p.1062). Communication overload is the state
when an individual is unable to process the information that is received from
another person or process (Karr-Wisniewski & Lu, 2010). It is important because it
focuses on how technology can be used to transmit messages. System feature overload is
the state that occurs when the technology an individual has to use to complete a task
is too complex for the task and for the individual (Karr-Wisniewski & Lu, 2010).
Technology overload is very closely aligned conceptually with IT-related overload,
and as we discuss in Chapter 6, Karr-Wisniewski and Lu’s technology overload scales
have been adapted to measure IT-related overload (Saunders et al., 2017).
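Each of these three dimensions can be paraphrased as a simple threshold comparison. The sketch below is our own toy operationalization of that idea, not Karr-Wisniewski and Lu’s validated instrument; the parameter names and numbers are assumptions for illustration:

```python
# Toy operationalization of the three technology-overload dimensions as
# threshold checks (our framing, not the authors' survey instrument).
def technology_overload(info_requirements: float, info_capability: float,
                        messages_received: int, messages_processable: int,
                        features_provided: int, features_needed: int) -> list[str]:
    """Return the overload dimensions present in a hypothetical work situation."""
    dimensions = []
    if info_requirements > info_capability:       # information overload
        dimensions.append("information")
    if messages_received > messages_processable:  # communication overload
        dimensions.append("communication")
    if features_provided > features_needed:       # system feature overload
        dimensions.append("system feature")
    return dimensions

# e.g., a flood of messages and an over-featured tool, while information
# processing itself keeps up:
print(technology_overload(5, 8, 120, 40, 60, 10))  # ['communication', 'system feature']
```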
Information processing
Information processing (IP) is the way information is selected, encoded, and activated
in human memory. We find that the overload literature, especially the MIS over-
load literature, tends to be superficial in its application of cognitivist models and
could benefit from cognitivist views of pertinence and emotions involved in IP. In
the MIS field, information is defined as “data endowed with relevance and purpose”
(Pearlson, Saunders, & Galletta, 2016, p.11).
Not all MIS research has misapplied cognitivist models. To their credit, Grise
and Gallupe (1999–2000), Jones, Ravid, and Rafaeli (2004), Kock (2000), Minas,
Potter, Dennis, Bartelt, and Bae (2014), Paul and Nazareth (2010), and Schultz and
Vandenbosch (1998) have provided a slightly more nuanced view of individuals’
IP. Interestingly, all but Kock (2000) study cognition and IP in a group context.
None of these actually measure overload, but they do explore how individuals
working in group contexts process information. Further, none actually distinguish
group-level overload from individual-level overload.
While the overload literature, particularly the MIS literature, does not delve
deeply into IP, a few business disciplines have started considering the actual cog-
nitive processes involved in dealing with information overload in greater detail. For
example, a number of accounting articles employ a computational approach to
human problem-solving that is premised upon bounded rationality and limited
human Information Processing Capacity (IPC).
In the limited-IPC perspective, overload is understood as a computational
excess of information that leads decision makers to employ compensatory deci-
sion rules, such as chunking heuristics that simplify or reduce the information
search (Payne, 1976). In a different vein, Revsine (1970)
introduced the concepts of schemata and mapping involved in processing financial
data. He found that the abstract structures for processing this data vary as a function
of the number of dimensional units in the data and the information combinations.
He also concluded that adding financial data makes IP more difficult. As noted in
Chapter 1, Rose et al. (2004) reported that the recall of numerical data increases as
information or cognitive load decreases, while affective responses are relatively
unaffected by load. Affective reactions to financial information appear to have
greater persistence in Long-Term Memory (LTM). Decision makers recall affective
responses to numerical data more accurately than the actual data. Further, decision
makers’ reliance on affective responses decreases as the information load or cognitive
load decreases. Finally, load (rather than overload) is frequently discussed in the accounting
literature (e.g., Snowball, 1980; Iselin, 1988, 1993; Simnett, 1996; Tuttle & Burton,
1999; Swain & Haka, 2000; Rosman, Biggs, Graham, & Bible, 2007).
In another example– this time from the marketing literature– Jacoby (1984)
demonstrated that consumers chunk to reduce information load and avoid cogni-
tive overload. Malhotra (1984) concluded that “overload could occur by way of
the imposed information load exceeding the processing capacity of the consumer,
and/or by producing dysfunctional consequences on decision making” (p.439).
Further, Daniels (2008) suggested the need to consider affect in IP.
While authors in accounting and marketing provide interesting elements to
better understand overload, they frame overload in terms of a computational view
of human memory. They employ highly mathematical abstractions with decision
rules that they manipulate in lab experiments to better grasp the exact nature of the
heuristics used to overcome the limitation of the 'magical number seven, plus or minus two' (Miller, 1956b).
Interestingly, there have been controversial results from lab experiments studying
consumer behaviour (Jacoby, Speller, & Kohn-Berning, 1975; Malhotra, Jain, &
42 IT-related overload and individuals
Lagakos, 1982; Jacoby, 1984; Malhotra, 1984). On the one hand, Malhotra et al.
(1982) purported that “consumers are capable of processing fairly large amounts of
information. Yet the capacity of consumers to absorb and process information is
not unlimited” (p.35). On the other hand, Jacoby (1984) concluded that consumers
use cognitive strategies to limit the amount of information entering into their
decision making, “stopping far short of overloading themselves” (p.434).
With a few exceptions (e.g., Cook, 1993; Grise & Gallupe, 1999–2000; Chang
& Ley, 2006), the MIS literature has looked at the phenomenon of overload
without considering the point at which a load becomes overload. This is in dra-
matic contrast to a considerable body of the work in cognitive psychology on
overload, where it has mostly been studied in relation to learning under 'cognitive
load', also referred to as 'mental load' (Chandler & Sweller, 1991). Chandler and
Sweller (1991) define cognitive load as “the manner in which cognitive resources are
focused and used during learning and problem solving” (p.294). In the cognitive
psychology literature, cognitive overload is a construct that represents the symptoms
that occur when cognitive load overwhelms cognitive resources required for
chunking. The information stored in the LTM in the form of cognitive schemata
has to be (re)loaded into the Short-Term Memory (STM) to allow chunking of
the information (Sweller, Van Merrienboer, & Paas, 1998; Paas, Renkl, & Sweller,
2004). Multiple strategies exist to deal with these situations of insufficient and/or
exhausted cognitive resources. The most common strategy is to increase mental
effort so that information can be chunked meaningfully (Sweller, 1988).
FIGURE 3.1 Emotional-Cognitive Model of Overload (ECOM)
[Figure summary: INPUT (information, requests to use Information Technology, and tasks) is filtered on the basis of PERTINENCE before entering Short-Term Memory (STM), where chunking of information (7 ± 2 limit) occurs after reloading information held in the memory stores of Long-Term Memory (LTM). The LTM contains the mental framework (prior experience of Emotional-Cognitive Overload stored in Episodic Memory, subject to updating). Mental load draws on the pool of resources (Information Processing Capacities and physiological resources) to produce OUTPUT. When resources are overwhelmed, EMOTIONAL OR COGNITIVE OVERLOAD results. Emotional symptoms: stress, burnout, distractibility, frustration, inner frenzy, impatience. Cognitive symptoms: dumping part of the problem, more errors, lower performance, shedding tasks, mental confusion, poorer decisions.]
Bower, 1981; Tulving, 2002). More specifically, it focuses on processes that take place
in the STM and the LTM. In this section we present a detailed discussion of inputs,
processes, and outputs of ECOM. In particular, we focus on processes when we
describe memory architecture and emotion. We specifically address the organization
of LTM and the role of individuals’ schemata. We also discuss filtering, chunking
processes, and the role of expertise as part of the pool of resources. Finally, we look at
critical problems that must be addressed in future research on IT-related overload:
debunking the 'Amount Illusion' and addressing 'Contingency Boundedness'.
Inputs
The ECOM focuses on two types of inputs: information and requests to use
Information Technologies.
Information
By far the most frequently discussed type of input in the overload literature is
information. Our focus is on information that is sent to (versus sought by) the
individual. Increasingly, information is delivered through IT.
Input in relation to overload typically has been studied in terms of the amount
of information that is needed to create overload, but not the point at which ‘load’
becomes ‘brain overload’. Since many overload researchers do not consider indi-
vidual differences in processing information, they implicitly assume that there is a
common brain overload point for all. This approach of finding a common overload
point is in dramatic contrast to a considerable body of research in cognitive psy-
chology. Further, previous research on overload assumes that when there is too
much information, a bottleneck is created at the filter and brain overload occurs.
We have adopted a more nuanced view of the filter. Our ECOM assumes that
sensory inputs are filtered by the human memory on the basis of their pertinence
to the individual. Only pertinent information is cognitively processed. When
individuals cannot select pertinent information, it becomes a problem of IP
(O’Reilly, 1980). Similarly, Sutcliffe and Weick (2008) observed that overload
occurs because of individuals’ “inability to make sense of demands, capabilities and
context as well as the data” (p.62). Thus, individuals can filter out and reject those
inputs that are not pertinent before they are ever subjected to a deeper level of
processing. Consequently, because of the ability to select pertinent information, the
amount of information that creates Emotional-Cognitive Overload (ECO) varies by individual.
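The filtering idea above can be captured in a toy sketch (purely illustrative; the pertinence scores, thresholds, and function name are our own assumptions, not part of the ECOM itself). The point it makes is the one in the text: the same input stream yields a different processing load for different individuals.

```python
# Toy sketch of pertinence-based filtering (illustrative only; the
# numeric scores and per-person thresholds are invented for this
# example and are not drawn from the ECOM).

def select_pertinent(inputs, threshold):
    """Keep only inputs whose pertinence score clears this individual's
    threshold; the rest are filtered out before any deeper processing."""
    return [item for item, score in inputs if score >= threshold]

# The same stream of inputs...
stream = [("urgent email", 0.9), ("newsletter", 0.2),
          ("meeting invite", 0.6), ("ad banner", 0.1)]

# ...filtered by two individuals with different pertinence thresholds.
novice = select_pertinent(stream, threshold=0.5)  # passes more items on
expert = select_pertinent(stream, threshold=0.8)  # passes fewer items on
```

Because each individual's threshold differs, the amount of information that eventually overwhelms cognitive resources differs too, which is exactly why no single overload point holds for everyone.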
Processes
Once inputs have been selected by individuals for processing on the basis of their
pertinence, they are processed, embedded with emotional valences (Bower, 1981,
1991, 2001), and stored in the more permanent part of the LTM. The processes
include cognitive and emotional processing, which are so intertwined they cannot
really be studied separately. Chapter 6 addresses this issue in greater detail.
Role of emotions
Emotion can either help or hinder the processing of pertinent inputs. To go back
to an example used earlier, people have been found to remember their emotional
reactions to financial information better than the actual numbers (Rose et al.,
2004). An emotional valence, which may reflect either a positive or negative
emotion, is attached to events and concepts that are activated in association with
the prior experience (Bower, 1981).
When the valence of an input matches the valence of a related experience stored
in an individual’s LTM, it is said to be congruent. If the information is not con-
gruent, the individual must strain to process it. Processing is especially challenging
when the individual’s resources are limited. It is then that efficiency in matching
the stimulus to the mental framework becomes more critical. Even with efficient
matching processes, the additional processing strain due to incongruence may lead
to brain load so great that it cannot be processed successfully with the individual’s
cognitive resources. Thus, information lacking congruence with an individual’s
mental framework is more likely to create ECO.
In addition to the inefficiency in processing incongruent, mismatched valences,
there are several reasons why individuals may not be able to process information
load adequately: they may be exhausted and/or lack the proper resources; they
may lack expertise or experience; they may lack time. Multiple coping strategies
exist to deal with these situations of insufficient and/or exhausted resources. The
most common strategy is to make the mental effort more efficient by chunking the
stimulus information into meaningful chunks or superchunks (see Miller, 1956b;
Mandler, 1967; Sweller, 1988). For example, individuals can usually remember
their four-digit PINs. However, if they try to remember a ten-digit US
phone number (as opposed to just checking it on their phones), they probably
break the number up into chunks of three or four: the three-digit area code, the
three-digit exchange code, and a four-digit number. Research has found that
chunking also speeds up the retrieval of information from LTM (Logan, 2004).
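The 3-3-4 phone-number chunking described above can be sketched in a few lines (the function name and error handling are ours, added only for illustration):

```python
# Chunking a ten-digit US phone number into the familiar 3-3-4
# pattern (area code, exchange code, line number), as a small
# illustration of the chunking strategy described in the text.

def chunk_us_number(digits):
    """Split a ten-digit string into area code, exchange, and line number."""
    if len(digits) != 10 or not digits.isdigit():
        raise ValueError("expected exactly ten digits")
    return [digits[:3], digits[3:6], digits[6:]]

# Ten unrelated digits strain the 7 +/- 2 limit of STM;
# three meaningful chunks are far easier to hold and recall.
print(chunk_us_number("8135551234"))  # ['813', '555', '1234']
```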
Role of resources
Research in psychology supports the idea that processing all inputs involves both a
certain level of mental effort and the resources needed to accomplish this effort.
Endogenous cognitive and emotional resources can make the mental effort more
efficient and thereby reduce an individual’s brain load. Cognitive and emotional
resources are combined into the individual’s pool of resources (Kahneman, 1973;
Hobfoll, 1989, 2002, 2011). The individual’s pool of resources may include cog-
nitive ability, personality traits, and physiological resources required to maintain
homeostasis (see Chapter 2).
One personality trait that is of particular interest in situations leading to brain
overload is Need for Cognition. Need for Cognition (NFC) has been defined as “a
need to structure relevant situations in meaningful, integrated ways. It is the need
to understand and make reasonable the experiential world” (Cohen, Stotland, &
Wolfe, 1955, p.291). NFC demonstrates a stable dispositional tendency to engage
in and enjoy effortful cognitive activities (Cacioppo & Petty, 1982; Haugtvedt,
Petty, Cacioppo, & Steidley, 1988; Cacioppo, Petty, Feinstein, & Jarvis, 1996).
Tam and Ho (2005) showed that NFC plays “a pivotal role in influencing a
user’s level of elaboration and choice outcome”, adding that “NFC is a moderator
that induced objective processing of personalized offers” (p.288). Similarly, Alju-
khadar, Senecal, and Daoust (2012) found NFC moderated the relationship
between information overload and decision strategies for online purchasing. In
particular, individuals with low NFC were more likely to use online agents and
accept their recommendations than were individuals with high NFC. NFC also
plays a role in processing information. People with high NFC are assumed to make
more resources available in order to focus their attention. They use systematic rules
to better process information by carefully evaluating more alternatives. In contrast,
people with low NFC turn to heuristics to cope with the high cognitive demands
in memory and make decisions based on peripheral cues (Aljukhadar et al., 2012).
Further, people with a high NFC tend to recall more information than those with
low NFC because they typically apply more mental effort in thinking about and
elaborating on the information when they are processing it (Cacioppo et al., 1996).
Therefore they are more likely than low-NFC individuals to be motivated to exert
additional effort in information acquisition, reasoning, and problem-solving
(Cacioppo et al., 1996), especially when they view the context as relevant
(Haugtvedt et al., 1988). This was supported in our survey of almost 2,000 Dutch
participants who were invited to evaluate the ECO from information delivered by
a ‘video contact’ technology. This technology was designed to allow users to dis-
cuss health or financial matters with specialists online. The results demonstrated
that IT-related overload is negatively related to NFC, but positively related to
memories of past situations of emotional and cognitive overload (Rutkowski,
Saunders, Weiner, & Smeulders, 2013). Consequently, NFC is a personality trait
that serves as an important resource in facilitating cognitive processing.
with the non-expert, the expert’s success is tagged positively in memory, while
their failure is tagged negatively in memory. Both positive and negative valences
are encoded in the mental framework as PECOs. These relationships are summar-
ized in the propositions in Table 3.1.
TABLE 3.1 Summary of issues in processing and output for expert versus non-expert
Note: *For the non-expert with low brain load, it is possible that the individual will experience Emotional-
Cognitive Overload where there is high incongruence.
Amount Illusion
As we have noted, most literature on overload faults the amount of information as
creating the situation of overload. In particular, information overload is considered
to be based only on the amount of information that is received (e.g., Chervany &
Dickson, 1974; Chewning & Harrell, 1990; Berghel, 1997; Allen & Shoard, 2005).
Some studies measuring information overload assume it occurs when individuals
are faced with an increasing number of alternatives, when they are faced with an
increasing number of dimensions of information available per alternative (Payne,
1976; Cook, 1993; Swain & Haka, 2000), or when they have to process varying
numbers of cues or data items (Chervany & Dickson, 1974; Iselin, 1988; Chewn-
ing & Harrell, 1990). We think this is the wrong way to think about overload. We
use both theoretical arguments and empirical results to demonstrate that overload is
not just about the amount of information.
The ECOM argues that overload is not just about amount. It is premised
on the important role that individual differences play in creating overload experi-
ences. In particular, people have different cognitive and emotional resources and
these are expended differently across individuals, or even within individuals: some
have more cognitive abilities, which makes it easier for them to process incoming
stimuli; some have personality traits, such as Need for Cognition, that influence the
way they process information or other inputs; and some may have expended
considerable resources on earlier IP and so become overloaded in processing additional
stimuli when their resources are depleted. Further, they have encoded in their
schemata negative experiences about earlier failure in processing information or using
new technologies, which create incongruences with the valences of new stimuli that
they are asked to process.
Initial tests of ECOM support the idea that overload is not just about amount. In
fact, in a study of Germans using mobile technologies, the amount of information
was not significantly related to IT-related overload (Saunders et al., 2017). The
same was true in the Dutch study, mentioned above, in which people were asked
to adopt video contact technologies designed to provide information about their
health or their banking transactions and accounts (Rutkowski, Saunders, Weiner et
al., 2013). What was important here was PECO with IT (both its emotional and
cognitive components), which is key in estimating how individuals intend to respond to
requests to use new technologies (Rutkowski, Saunders, Weiner et al., 2013;
Saunders et al., 2017).
Contingency Boundedness
In developing theoretical models, it is important to consider boundaries (Bacharach,
1989). In deciding which boundaries to consider for ECOM, we seek to answer the
‘where’ and ‘when’ questions (Whetten, 1989). We draw on the literature to con-
sider the organizational context in answering the ‘where’ question. We expand the
previous discussion of temporal context in answering the ‘when’ question.
Organizational context
How organizations are designed impacts and shapes their information processing
requirements (Galbraith, 1974; Tushman & Nadler, 1978; Schick, Gordon, &
Haka, 1990). Schick and his colleagues focus on organizational-level information
overload and neatly assume that individuals’ information processing strategies are a
given. They view information overload as having organizational structure deter-
minants and argue that it is up to organizations to set the appropriate information
processing time for work completion. When the actual time exceeds the allotted
time for a work task, information overload occurs. Tushman and Nadler (1978),
who built on the work of Galbraith, defined information processing as the “gathering,
interpreting, and synthesis of information in the context of organizational decision
making” (p.614). For Galbraith (1974) and Tushman and Nadler (1978), informa-
tion overload occurs when the individual’s IPC cannot handle the information
processing requirements of a task. To deal with overload, Galbraith (1974) sug-
gested creating formal structures, rules, and regulations so that information can be
processed more effectively. Doing so encourages the coordination of information
provision across units and consequently reduces uncertainty. This improved coor-
dination can reduce information processing requirements and positively influence
an individual’s IPC. If rules, regulations, and formal structures prove inadequate,
Galbraith (1974) suggested either reducing the amount of information to be pro-
cessed by creating slack resources or self-contained units, or increasing the capacity
to process information by using vertical information systems or lateral relationships.
It has also been shown that overload is reduced by the organizational redesign of
interactions (Sparrow, 1999) or branch facilities (Meier, 1963). In contrast, changes
in organizational design, such as disintermediation or centralization, might increase
information processing requirements (Schneider, 1987). O’Reilly (1980) demon-
strated empirically that organizational characteristics can cause overload and that
information overload negatively impacts organizational performance.
Temporal context
In many situations, people would not get overloaded if they were just given
enough time to process the pertinent inputs that they receive. Thus, time is an
important boundary in the ECOM. Important aspects of this temporal context are
time constraints, individual perceptions of time, and task-switching.
Time constraints
The importance of time in relation to overload cannot be overstressed. That is why
many researchers talk about how not having enough time to process and under-
stand inputs (i.e., information) can result in overload (e.g., Galbraith, 1974; Schultz
& Vandenbosch, 1998; Kock, 2000; Farhoomand & Drury, 2002; Ahuja &
Thatcher, 2005; Ahuja, Chudoba, Kacmar, McKnight, & George, 2007; Paul &
Nazareth, 2010). In fact, Kock (2000) and Schick et al. (1990) argued that overload
is more about time pressures than it is about the amount of information.
homogeneity– whether all units of time are the same as in minutes or hours
(homogeneous) or whether some are qualitatively different from others (epochal);
nature of flow– like a river or speeding arrow (linear) or seasonal or repeating in
some other way (cyclical);
direction of flow– one irreversible direction, as in past, present, future (unidirec-
tional), or as in mathematics and physics with positive and negative values for
the flow (bidirectional);
objectivity– based on fact or quantifiable (objective), open to greater interpreta-
tion and based on personal feelings, emotions, and perceptions (subjective), or
based on people’s shared perceptions and interpretations (intersubjective);
time orientation– short term or long term;
chronicity– preferring to do one thing at a time (monochronic) or preferring to
do multiple things at a time (polychronic).
Switching back and forth across tasks requires a certain reaction time; that is, a
“reaction time switch cost” (Wylie & Allport, 2000). The switching can take place
in a matter of milliseconds. Net Geners who think they are ‘media multitasking’
are actually task-switching across parallel tasks. Oulasvirta, Rattenbury, Ma, and
Raita (2012) found that many individuals interrupted other tasks to check their
smartphones so often that the behaviours could be considered a trigger for habitual
use. Such repeated interruptions take away from the time needed to complete a
task, and they expend extra resources in recovering from them (Speier et al., 1999).
Conclusion
We started this chapter with a situation that may be familiar to many. There has
been a lot of research about overload that can be useful in understanding this
situation. However, we believe that a cognitivist perspective on overload can not
only add further clarity to our understanding of this phenomenon, but also inform
and guide future research on overload. To that end we proposed the Emotional-
Cognitive Overload Model, which incorporates key cognitivist concepts: (1)
memory architecture and schemata; (2) pertinence; and (3) emotions. To illustrate
these concepts, we described how the ECOM could be used to shed further light
on the opening situation. We then debunked the Amount Illusion and stressed the
importance of Contingency Boundedness.
4
INFORMATION TECHNOLOGY AS A
RESOURCE
From the Bright to the Dark Side of Addiction
At the turn of the millennium, data gathered from monitoring login behaviour
showed that the system was part of what we could characterize as a ‘healthy rou-
tine’ for the parents. Parents mostly logged in to the system to monitor their
babies’ feeding times. On weekends, system usage was low since parents visited
their newborns in person. The parents reported a certain ‘feeling of control’ in
being able to monitor their infants. On average, system use dropped after the first
few days but picked up toward the end of the period of hospitalization to a level
exceeding that of the initial use. This was because when parents were informed
that their newborn would soon be discharged, they wanted to make sure that the
baby was doing well and would not be kept longer. Both parents used the system
in most cases (82%), and they were enthusiastic about the possibility of using it to
complement their hospital visits. They particularly enjoyed the fact that they had
constant access: “I could see my newborn all day, and that was great.” Fathers used the
system from their workplaces (19%).
Maybe unwisely from a cybersecurity perspective, the majority of the parents
had been willing to let others access the system, sharing their login information not
only with other family members, including sisters and brothers (56%) and grand-
parents (48%), but also with their best friends (33%) and close work colleagues
(22%). One family had about 40 different OBS users. Another family extended the
login information as far as Brazil. Worldwide networked communication that
monitored the baby was thus established across the family's whole social network.
This produced a feeling of closeness amongst the family members concerned.
Overall, parents commented favourably on the system. As one father wrote:
For us online baby has been very important. Our oldest daughter (1½ years) could not
be with her newborn sister because of her age and the fact that she has a disease herself.
Thanks to OBS she could be with her sister every day.
I found it so hard to get discharged from the hospital after giving birth because I could not
be with my daughter the entire day. I was relieved that I could be with her through the
Internet.
as much as I have been preparing mentally myself that my twins will probably be born
prematurely, and remain in the intensive care unit, I felt hopeless and lonely when
returning home, besides the support of my husband. I knew all [would] be well though.
It was an overwhelming feeling. Being able to monitor their progress day by day was a
tremendous relief to my pain. I could go back to some of my work routine, just being
patient and waiting for them to finally be home.
She was thankful for our work on the system as it helped her to cope with the separation.
60 Information Technology as a resource
The streaming was not always available as there were times when the camera
was switched off by staff. Just under a fifth (19%) of the parents had encountered a
blue screen at the login phase, indicating that the camera was not in operation.
This was not viewed as a concern for the majority of parents: “We were not anxious
when we saw the blue screen. We knew that the nurses were taking care of our baby and we
respected their decision to switch off the camera.”
In pediatrics, physicians and nurses see daily the effects of social deprivation on
newborns and their parents. The OBS reduced these effects to some extent by
enabling social contact between parents and their newborns and, thus, providing a
new form of technological socio-cognitive resource to cope with the difficult time
of separation. The system reduced the parents’ anxiety and, moreover, added
communication opportunities for those in difficult family circumstances (Spanjers &
Rutkowski, 2005; Spanjers et al., 2007). According to the nurses, the system gave
parents the feeling of greater control in their relationships with their babies, and it
also meant that parents were more relaxed when visiting the neonatal wards.
As noted above, with the exception of a few extreme cases, the pattern of parent
usage was pretty healthy. All the parents ‘loved’ the OBS, and most claimed they
were “addicted to the system”. None of the parents claimed brain overload. How-
ever, not all impacts of the system were positive. Some parents displayed extreme
behaviours: 22 per cent of the mothers reported a form of anxiety watching their
baby online and 13 per cent reported problems in disconnecting. As one mother
told us, “It was extremely hard disconnecting from the system, turning the PC off.” We
witnessed anxiety in some parents, such as when they called the wards too often or
showed signs of panic if the screen was under “blue mode too long”. Some mothers,
isolated in their home, stayed with the blue screen as if their life depended on it
and called the wards incessantly.
With the migration from connecting via analogue or ISDN phone lines (first
generation of the OBS) to access via Internet technology (second generation), we
did not see a tremendous shift in the pattern of usage. Indeed, neither the fre-
quency of connections to the system nor the duration of connections differed sig-
nificantly between the first generation (n = 29,663 records) and the second
generation (n = 21,067 records). Interestingly, from 2003 our data became less rich
as parents simply connected to the system in the early morning and stayed logged
in the whole day. The average connection time increased from 5 minutes to 50.
We assumed, in these cases, that it was not likely the parents had been sitting in
front of their PCs the whole time, but rather they had been quickly checking the
status of their babies in the middle of their other activities. Maybe our assumption
was incorrect.
In the year 2006, a mobile phone company funded a project to extend the use
of the OBS to mobile phones. We embraced the funding, as we could not have
foreseen the impact that this level of connectivity would later have on parental
usage patterns. However, three years later, Dutch national TV news highlighted
the use of the OBS in one hospital (SBS6, 2009). The piece included an interview
with the parents of one baby in a participating ward, and they explained how
wonderful the system was – nothing surprising there. Yet the broadcast was
somehow disturbing, as the OBS had effectively become a resource-consuming
‘monster’ for these parents. They explained that they slept with the mobile phone
unlocked between them in bed. The father, a primary school teacher, projected the
live stream on the wall of his classroom while teaching. Although the pupils thought
this was "funny" and seemed not to be disturbed, they claimed it sometimes interfered
with their learning of math.
Overall, while we were very excited about the OBS as an efficient technological
socio-cognitive resource, we were concerned by some of the parents’ usage pat-
terns. On one hand, the system allowed parents to share the difficult early arrival of
their child and to cope with that stressful situation. On the other hand, a few
parents mindlessly used the technological resource – behaviour that can be com-
pared to IT addiction (i.e., Internet addiction). When the OBS was first installed,
we observed parents who were unable to properly use the resource to cope with
their actual separation. Interestingly, it appears that Internet hyperconnectivity and
greater mobility have in fact increased parents’ satisfaction with the technology while
simultaneously raising questions regarding context and appropriateness of usage.
Are the parents’ behaviours roughly equivalent to IT addiction? It is a slippery
slope. Indeed, variations in cognitive styles and personality impact stress relative to
technology differently (Moreland, 1993). Earlier work addressed technophobia,
which is the struggle to accept computer technology, versus technophilia, which is a
form of overidentification with technology that leads to a dissolution of human-
technology boundaries (Brod, 1984). Both have been related to technostress in that
they modify internal belief systems that technology should always be available and
create stress when it is not. We demonstrated in Chapter 3 that individuals are not
equal when dealing with information overload. We argue here that IT addiction
reflects a mindless use of IT as a socio-cognitive resource. The phenomenon is
rooted at the individual information processing level, at which a form of control or
self-regulation is required to avoid excessive use of IT. We speculate that more
emotional and cognitive resources are required to impose control on system usage
when the emotional brain is hijacked. Moreover, the content of the information
transmitted via technology is core to the human brain and its primary drive: social
attachment. Would the OBS have been that successful if it had not been newborns
being streamed? Would people project a distant relative in class or at the office?
Maybe. Additionally, we argue that with the improvement of technological design
and features, some systems are extremely immersive and they greatly increase the
sense of social presence. The users can scarcely apply the necessary controls to use
such technological resources wisely.
Missing Out’ in regard to their child. We also explain how the OBS example can
be used to understand strategies and technology use for social contact.
“missing a minute of the great opportunity” to watch their newborn. One mother made
a disturbing comment illustrating FOMO:
I can still remember the day my husband forgot the laptop at the first floor of our house
and went to work. I crawled up the stairs even though I was forbidden to move … to be
able to see my baby … I could not resist … the OBS is a marvelous technology.
This FOMO was rooted in the mother’s involvement with her newborn, the
extremely personal relevance of the child displayed on the screen and the associated
anxiety relative to the infant’s condition. FOMO cannot be regarded as a mere
arousal experience in the S-R (i.e., Stimulus-Response) tradition. The ‘philia’ is
not technological; in fact, the technology is a useful socio-cognitive resource that
supports parents in a difficult time. Rather, the problem is related to the highly
pertinent nature of the information supported by the system. The cognitive
approach supported by Bower’s work (1981, 1991, 2001) informs us that indivi-
duals are more likely to process information that is affectively and cognitively
congruent with the mental schemata stored in the Long-Term Memory (LTM).
For a mother with a baby in hospital, there can be little more important than
viewing her child and, though online, feeling her presence. How can one resist a
system that provides extended social presence? Why even try?
on the appraisal and reappraisal of situations. Therefore, what is stressful for one
person may be less so for another.
Coping requires the expenditure of resources to solve problems embedded in
stressful situations. Newell and Simon (1972) underlined the importance of heur-
istics and strategies in problem-solving. Being confronted with a stressful situation
is, in itself, a problem. Much information has to be processed in order to reinstitute
homeostasis after one faces a stressful situation. Homeostasis involves both explicit
and implicit processes. Efficiency in returning to a state of balance depends on the
individual’s pool of resources. There are many possible combinations of resources,
whether mental frameworks or physiological factors that fuel the organism;
thus, as we demonstrated in Chapter 2, individuals have different pools of resour-
ces. Situations and psychological states are appraised differently as a function of the
resources available. Some are more or less efficient, but all are highly dependent on
the appraisal of the situation (Lazarus & Smith, 1989). For example, some indivi-
duals make sense of a negative situation by reorganizing their schemata while
coping with the situation. They focus on the problem itself (i.e., planful problem-
solving) or on the emotion (i.e., positive reappraisal) (Lazarus & Folkman, 1984;
Park & Folkman, 1997). Combining both cognition and emotion appears to be an
efficient way of improving the emotional state.
We argue that technology such as the OBS is being used as an extraneous
resource. When it is used as a component of the individual’s pool of resources, its
use varies across parents. The narratives of some mothers reflected both emotion
and cognition (i.e., relief and being in control; happiness and empowerment) when
using the OBS. Such narratives are emotional-cognitive in essence. We could
speculate, for example, that the father who displayed his newborn in the class he
was teaching was, in fact, using the technology efficiently. Indeed, the father
enacted his self-efficacy (Bandura, 1977), having no doubt that he could cogni-
tively perform well as a teacher while being emotionally stabilized by the social
presence of his newborn on the screen.
Information processing coping strategies use schematic models that are based on
experiences in a given culture or family, often activated unconsciously (Barnard,
1985). Sometimes, not expressing, or even repressing, the arousal is a way of
coping with pain by avoiding the activation of the emotional network in the mind.
It requires a form of control over one’s expression of emotion (Gross, 2007). The
individual exerts control by shuffling the negative experience in Episodic Memory
to a less accessible place. Such strategies may be an unconscious effort to deactivate
negatively valenced schemata and cut off the pain. The traumatic event is therefore
consciously forgotten or pushed away. Research has demonstrated that reactivating the
memory of an emotional experience rekindles response components of that experi-
ence, such as mental images, associated feelings, bodily sensations, and physiological
arousal (e.g., Pennebaker & Beall, 1986; Rimé, Noël, & Philippot, 1991; Schaefer &
Philippot, 2005). This approach represents supervenience of cognition on emotion.
Similarly, it is indeed well known in psychoanalysis that reactivating ‘suppressed’
emotions through verbalization, or the expression of one’s own thoughts, may help
Information Technology as a resource 65
Based on this study, one of the authors was invited on a TV show to provide a
scientific interpretation of the online jokes circulating after the event of 9/11. The
shock was tremendous, and surely some jokes were inappropriate according to
social norms prevailing in our society. Still, people shared the jokes on the Internet.
Humour may sometimes be the only efficient weapon in the face of violence or
adversity. In terms of mind-body supervenience, the body expresses itself first and
this response is regulated by what Freud refers to as the external world, a world of
cognition and mental images. Obviously cognition and emotion are interrelated
when processing information and solving problems. They are interrelated in the
same way the peripheral and central nervous systems are connected in the brain.
One shall not laugh at death, right? Yet jokes about 9/11 transmitted over the
Internet were a way of dealing with the pain of this event. Similarly, when the
father in the OBS study projected the streaming of his child in his classroom, was
he reducing his stress? Could this be part of the bright side of using IT as a socio-
cognitive resource? Perhaps our society is changing what is culturally acceptable in
terms of the use of technology. A reconsideration of the way individuals deploy
their coping strategies may be a by-product of our digital world.
for well-being. First, we concluded that the OBS was not a cause of IT addiction.
Rather, it provided a powerful continuous connection that activated the BRS,
which consequently started a cycle of wanting even more connectivity. The con-
nection between mother and child through the OBS diminished a tremendous
biological feeling of loneliness. Second, we observed that parents who suffered
from an underlying pathology, such as anxiety disorders, tended to suffer more
severely from any interruption to the service delivery. We therefore concluded
that IT addiction in the case of the OBS also was related to iDisorders and Pathological
Internet Use (PIU).
Mood disorders
Individuals who experience problematic Internet use have been reported to display
high rates of depression symptoms (Young & Rogers, 1998). Longitudinal studies
have shown that greater use of the Internet was associated with increased signs of
loneliness and depression (see Kraut et al., 1998). Excessive use of technologies
(e.g., online chat, video gaming, emailing) may cause depression that is transmittable
through “emotional contagion” via SNSs (Hancock, Gee, Ciaccio, & Mae-Hwah
Lin, 2008; Moreno, Jelenchick, Egan, Cox, Young, Gannon, & Becker, 2011).
Indeed, Rosen, Whaling et al. (2013) reported that, aside from the work of Davila,
Hershenberg, Feinstein, Gorman, Bhatia, and Starr (2012), most authors in the
field converge on the negative impact of technologies on mood disorders, parti-
cularly depression. SNSs have been associated with increases in loneliness and
depression (O’Keeffe & Clarke-Pearson, 2011) or, in contrast, with decreases as a
result of social bonding. Further readings on the topic confirm the importance of
both the individual’s pool of resources for information processing and the memory
in understanding IT addiction. Caplan (2007) made this point clear in his research
to determine the cognitive predictors of the negative outcome of Internet use. He
differentiated dispositional from situational loneliness. Dispositional loneliness is
part of an individual’s mental framework in memory. It is a personality trait that is,
in fact, the expression of a form of social anxiety that arises from the desire to
create a positive impression of oneself to others. Socially anxious people are highly
motivated to seek low-risk communicative encounters (Schlenker & Leavy, 1982).
They tend to perceive their self-presentation to be more effective online than in
face-to-face contexts. Situational loneliness also addresses resources, but from a social
contact perspective. It may be a function of relocating to other cities or countries,
travelling extensively, or having too little time for social encounters. Caplan
(2007) demonstrated that in some studies on IT addiction, results classically
attributed to situational loneliness actually should have been attributed to dis-
positional loneliness. In these cases, social anxiety was a confounding variable. In
the behaviourist tradition, both types of loneliness could be treated as ante-
cedents. However, social anxiety has been shown to be at the root of PIU. As
Bower (1991) demonstrated, anxiety disorders are rooted in a defective organi-
zation of the mental schemata. Interestingly, socially anxious people may benefit
emotionally and cognitively from Internet use. However, in doing so, they may
end up even more isolated from the real world. In other words, social phobia
may lead to greater technophilia.
Personality disorders
The Big Five personality traits have been found to predict extent of social media
use. Further, narcissism has been linked to higher usage (Ryan & Xenos, 2011) and
has been used extensively in studies to understand Internet addiction (i.e., IT
addiction). Narcissism refers to a fundamental absorption in the self and the
constant need to validate one’s existence. It reflects a grandiose, inflated, self-
centred self-concept that suppresses low self-esteem based on defective attachment
in childhood (Cohen & Clark, 1984). The term is rooted in psychoanalytic vocabulary
and is the result of a maladaptive defence mechanism of the ego (i.e., self).
Individuals suffering from Narcissistic Personality Disorder (NPD) have a very poor
image of the self that is unconsciously rooted in memory. Narcissists either feel
they can never fulfil the requirements imposed by their parents (i.e., over-
investment) or have experienced physical or psychological abandonment (i.e.,
underinvestment). They build strong unconscious defence mechanisms to enhance
their image at ‘any cost’. Doing so helps them cope with low self-esteem as well as
anxiety disorders. Narcissism is in the official classification of personality disorders
(i.e., the DSM 5). It occurs on a spectrum from mild to severe (i.e., psychopathy).
Narcissism is characterized by a poor ability to empathize and decode the emotions
of others, leading to antisocial behaviour and deception.
In the literature on IT addiction, narcissism serves both as an antecedent and a
symptom. In particular, narcissists find SNSs appealing, and their use of SNSs
increases their narcissistic behaviours (Bergman, Fearrington, Davenport, & Berg-
man, 2011). For example, Facebook has been found to attract users with a narcis-
sistic personality (Mehdizadeh, 2010; Ryan & Xenos, 2011), and narcissism predicts
higher levels of social activity in online communities (Buffardi & Campbell, 2008).
Also, individuals scoring higher on the narcissism scale and lower in measures of
self-esteem spend more time and report more self-promotional content on Face-
book (Mehdizadeh, 2010) as well as on other SNSs. Indeed SNSs encourage nar-
cissistic behaviour that is directed towards self-promotion (DeWall, Buffardi,
Bonser, & Campbell, 2011) and entitlement/exhibitionism (Carpenter, 2012).
They have been criticized for ‘producing’ a narcissistic generation. For example,
Bergman et al. (2011) argued that there has been an increase in narcissism due to
the values held by the Net Generation. In order to better understand IT addiction
and societal impact, it is worthwhile taking a closer look at this specific personality
disorder. In addition to narcissism, extended time on the Internet has been related
to more antisocial behaviour among Chinese students (Ma, Li, & Pow, 2011) as
well as attention deficit (Yen et al., 2007).
Specific PIU
Specific PIU involves an overuse or abuse of content-specific functions of the
Internet (e.g., gambling, online porn) and is cast as one of the many possible
manifestations of a broader behavioural disorder. Davis (2001) argued that in
the absence of the technology, the behavioural disorder would likely be manifested
in some alternative way. Davis’ approach is congruent with the emotional-cognitive
model of memory. He speculated that abusive Internet usage (i.e., stimulus) does not
cause depression or dysfunctional addictive behaviour (i.e., symptoms). Rather, it
predisposes the individual to develop maladaptive usage through a pre-existing
pathology. Davis proposed an interesting alternative explanation of the relation
between iDisorders and IT addiction, as previously presented. His proposal
differentiates between antecedents and symptoms and emphasizes the key role of
information processing. As Bower (1991) argued, such pathologies as social anxiety are
indeed directly involved in the way information is processed in memory.
Additionally, the construct of PIU provides insights about the maladaptive use of
technological resources such as the OBS or SNSs. It can be used to explain why
results previously attributed to loneliness as a predictor for IT addiction should in
fact be attributed to a deeper cognitive level; for instance, to an anxiety disorder
(e.g., social anxiety). Also, in the case of the OBS, for some mothers we had to
confiscate the system, or interrupt the streaming of their newborns. They
would jeopardize their own well-being – for example, by crawling up stairs to
monitor their child – or they would seriously disturb the functioning of the neo-
natal wards with recurrent phone calls. They would burst into tears in panic at any
sign of the blue screen (i.e., no connection). The goal in halting the streaming was
to provide mental and emotional rest to the over-connected and stressed parents.
At the time, we considered developing a test to grant access to the system
based on favourable results in relation to state versus trait anxiety disorders.
The test could also inform how parents would use coping strategies as
resources. We did not pursue such testing as we considered it unethical to restrain
parents’ viewing of their babies a priori on the basis of trait anxiety disorders. However,
pathological use became a strong indicator of the need to provide proper support
to mothers in distress.
Finally, specific PIU is useful in explaining why narcissists are drawn to SNSs
and are over-connected, based on frequency of social media posts and ‘likes’
(Mehdizadeh, 2010; Ryan & Xenos, 2011). SNSs are surely bringing sunshine to
their lives on a regular basis. Also, and contrary to what is often depicted in the
literature on IT addiction (Korac-Kakabadse et al., 2001), the time spent on the
Internet or being connected is not in itself a symptom of addiction (Junco, 2013;
Rosen, Whaling et al., 2013).
General PIU
General PIU is conceptualized as a multidimensional pathological overuse of the
Internet due to the unique communicative context of the Internet itself. The misuse
of the technology results in personal and professional consequences associated with
the experience of being online and its unique social context. General PIU occurs
when an individual develops problems due to the interpersonal contexts available
online (see Caplan, 2002, 2003, 2010). For example, some OBS mothers reported
feeling guilty about disconnecting, or experiencing discomfort handling the system.
The possibility of experiencing a form of interpersonal contact with their newborn
was key in understanding their excessive usage. As mentioned above, the love
hormone (i.e., oxytocin) was hijacking their brain, leading to over-connectivity.
It may be worthwhile studying the BRS in conjunction with general PIU (Chou &
Hsiao, 2000). We argue that a primary function of the OBS or SNS technology is to
link us to loved ones. It is a socio-technological resource. In doing so, the brain may be
inundated by oxytocin. Also, when SNSs activate the brain through extrinsic rewards
such as status, number of likes, or positive comments, it may become more difficult to
disconnect while the brain is so ‘rewarded’. Further, self-disclosure, in the same way as
food or sex, can activate the intrinsic BRS as primary reward (Tamir & Mitchell, 2012).
Brain activation and a particular focus on attachment are surely interesting constructs
when considering the abuse of media such as Facebook or Instagram.
The relationships of depression, narcissism, or loneliness with the various forms
of IT addiction appear to present a chicken-and-egg problem. What is evident is
that whatever the symptoms may be, they are exacerbated when there is no control
imposed by the person on IT usage. Recently, Caplan (2010) updated Davis’ cog-
nitive-behavioural model and presented studies that have found empirical support for
the general PIU model. Caplan (2010) suggested that
preference for online social interaction and use of the Internet for mood reg-
ulation, predict deficient self-regulation of Internet use (i.e., compulsive
Internet use and a cognitive preoccupation with the Internet). In turn, defi-
cient self-regulation was a significant predictor of the extent to which one’s
Internet use led to negative outcomes.
(p. 1089)
difficulty disconnecting. That is, some parents could foresee their inability to use
the technology mindfully. They anticipated extra stress if they were to view their
newborns non-stop and, thus, opted not to adopt the OBS. They consciously took
the time to reflect on the OBS (i.e., accessing subconscious higher-order cognitive
processes) and exerted cognitive control. When individuals are aware of some of
their emotional and cognitive weaknesses, they may act to avoid PIU. Were the
decisions of these parents based on previous experience of addictive BlackBerry
use? Were their actions a form of repression of their emotions? Were they simply
well aware of some underlying anxiety pathology or subconscious processes that
would interfere with their IT use? We do not know. However, these questions
offer intriguing avenues for future research.
Conclusion
To conclude, when control becomes increasingly difficult to impose mindfully on
the brain, and when the information delivered is too pertinent to be filtered,
individuals may end up suffering from IT addiction. Additionally, as Davis (2001)
and Caplan (2007, 2010) have demonstrated, personality disorders are cognitive
precursors of IT addiction and, subsequently, IT addiction exacerbates such dis-
orders. In Chapter 1, we reported Nicholas Carr’s and Tristan Harris’ latest ‘war’
against the smartphone. Hopefully, this chapter illuminates IT use as a much more
complex cognitive and emotional phenomenon. Bashing technologies such as the
smartphone may be an easy route to take. In Chapter 7, we focus on the bright
side of IT. Technologies can be fabulous resources. It is for the user to apply them
mindfully and with moderation. As is the case for drugs, it is a civic responsibility
of manufacturers to warn about the consequences of excessive usage of their devices
and to help develop rules and guidelines for safe and effective use.
5
DARK SIDE OF INFORMATION
TECHNOLOGY AT THE
ORGANIZATIONAL LEVEL
The American poet Robert Frost once noted that “by working faithfully
eight hours a day, you may eventually get to be boss and work twelve hours a
day”. Today, Robert Frost’s words seem prescient. We live in a world where
working long hours is becoming the norm – at least in some First World
countries. Anna no doubt wishes that she could confine her workday to 12
hours.
In the two decades from 1980 to 2000, the time spent on work increased in the
USA, Canada, Britain, Japan, and Australia. For example, American full-time
workers put in an average of 50 hours a week in 2015 (Isidore & Luhby, 2015)
compared to an average of 40 hours a week in 1973 (Porter & Kakabadse, 2006).
In this chapter we explore the dark side of these longer workdays and weeks at the
organizational level. In explaining these dark side challenges, we focus on what
technology can do to you at the organizational level and not on what technology
can do for you (Gutek, 1983, p.163).
FIGURE 5.1 The Information Technology Dark Side Diamond (labels: work, including work tasks, work overload, and work-life balance; technology, including automation and robotics; people impacts)
information and the capacity to process that information. Overload occurs (and
it occurs quite often) when the information processing requirements exceed
the organization’s IPC. The way to address such overload is to structure the
organization appropriately. That is, IPC is partly attributable to organizational
design (Schick et al., 1990).
Schick et al. (1990) also considered organizational IPC when they suggested
ways of reducing information load by decreasing the amount of information pro-
cessing related to interactions. In particular, they suggested that processing can be
made more time-efficient by changing the structure to facilitate information flows
as well as by relying on standard operating procedures, rules, regulations, and
computer-based information systems. Based on their review of the overload
literature, Klausegger et al. (2007) concluded that adaptation methods such as these can
help reduce overload.
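This framing lends itself to a simple formalization: overload occurs whenever the information load per unit of time exceeds the processing capacity available for that period. A minimal sketch, with purely illustrative figures:

```python
# Minimal sketch of Schick et al.'s (1990) overload condition:
# information load is "the amount of data to be processed per unit of
# time", and overload occurs when it exceeds information processing
# capacity (IPC). All numbers here are illustrative, not empirical.

def is_overloaded(items_received: int, hours: float,
                  ipc_per_hour: float) -> bool:
    """True when the load per hour exceeds processing capacity."""
    return items_received / hours > ipc_per_hour

# A unit receiving 400 messages over an 8-hour day (50/hour) with a
# capacity of 40/hour is overloaded; 300 messages (37.5/hour) is not.
print(is_overloaded(400, 8.0, 40.0))   # → True
print(is_overloaded(300, 8.0, 40.0))   # → False
```

The structural remedies Schick and colleagues propose all act on one side of this inequality: rules and standard procedures lower the load to be processed, while redesigned information flows raise effective capacity.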
Collaboration
Organizational design can be especially important in establishing effective structures
for promoting collaboration in today’s organizations. There is a growing emphasis
on matrix-based structures and teams, especially cross-functional and global teams
like Anna’s in the example at the start of this chapter. And consistent with the
growth in matrix- or team-based organizational structures, there are an increasing
number of technologies aimed at supporting and enhancing the connectivity of
organizational members. Investments in technologies such as Enterprise Social Soft-
ware, collaboration tools, and social media tools are generally viewed positively. But
is this really always borne out by their implementation?
Anna and her husband, David, would say, “Definitely not!” Cross and colleagues
(i.e., Cross & Gray, 2013; Cross, Rebele, & Grant, 2016) would say, “Not always.”
They have argued that the new emphasis on collaboration and the use of tech-
nology to support collaboration in organizations has resulted in collaborative
activities, such as attending meetings or answering colleagues’ questions, occupying
around 80 per cent of the time of managers and their employees. Further, “20% to
35% of value-added collaborations come from only 3% to 5% of employees” (Cross
et al., 2016, p.74). These “stars” are likely to have exponentially higher numbers of
messages compared to other employees (Oldroyd & Morris, 2012). It is these
highly valued employees, or stars, who are experiencing collaboration overload, or the
situation when employees interact so much with other employees that they cannot
get their own work done during normal work hours (Cross & Gray, 2013). Col-
laboration overload can lead to consequences typically found with IT-related overload:
employee stress, employee burnout, turnover of valued employees, and inefficiencies in
decision-making and execution. It can also impair overall organizational performance
(Oldroyd & Morris, 2012).
Collaboration overload can be diagnosed using Organizational Network Analy-
sis. To deal with collaboration overload, structural changes, among others, may be
implemented that formally assign decision rights for routine decisions to more
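A minimal sketch of how such a network analysis might flag collaboration ‘stars’ from an interaction log (the log, the names, and the threshold are our own illustrative assumptions, not data from the studies cited above):

```python
from collections import Counter

# Minimal sketch of an Organizational Network Analysis degree count.
# The interaction log and the 1.5x-average threshold are illustrative
# assumptions, not taken from Cross & Gray (2013).
interactions = [
    ("ana", "ben"), ("ana", "cho"), ("ana", "dev"), ("ana", "eli"),
    ("ben", "cho"), ("dev", "eli"),
]

degree = Counter()
for a, b in interactions:
    degree[a] += 1          # each tie adds one contact for both parties
    degree[b] += 1

avg = sum(degree.values()) / len(degree)
# Flag employees whose connectivity is well above the average degree.
stars = sorted(p for p, d in degree.items() if d > 1.5 * avg)
print(stars)                # → ['ana']
```

In practice an analysis of this kind would weight ties by message volume or meeting hours, but even a raw degree count makes the skewed distribution of collaborative demands visible.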
Communication
Organizational structure affects communication flows. Communication flows carry
information needed to complete tasks and reduce uncertainty. Yet these very
communications can contribute to perceptions of overload on the part of some
employees.
In an interesting study of the communication patterns of 79 employees of a large
international workstations/servers firm, Barley, Meyerson, and Grodal (2011)
found that even though their time was chewed up by multiple communication
media, it was only email that served as a symbol of overload. Because of their
asynchronicity, emails had a tendency to batch up, especially (1) early in the
morning, due to receiving email sent the previous night by global partners, and (2)
at the end of the working day, as employees did not have the opportunity to
respond to all the emails received throughout the day. And because of company
norms about short response times and their own desire not to miss anything
(nowadays called Fear of Missing Out), many employees experienced feelings of
loss of control over their email and anxiety about their inability to deal with it in a
timely manner. This was not the case with synchronous face-to-face meetings or
teleconferences even though these tended to consume huge amounts of the
employees’ time.
Because of its unique contribution to feelings of overload, it is important to
establish organizational norms regarding email in order to reduce the perceptions of
overload. In the tech company above as well as in Anna’s company, easing up on
the norm for immediate response could have reduced the stress levels of some
employees. Further, organizations could create the communication norm of tech-
nology-free meetings to reduce task-switching and make meetings more effective
(Colbert et al., 2016).
We also explored organizational norms for communication when we tracked the
email pollution of members of a team within a large multinational (see Rutkowski
& van Genuchten, 2008). The data showed that many of the unwanted emails
originated from within the organization. One could call it ‘internal spamming’.
Based on the data, three policy guidelines were proposed to the team to raise
pollution awareness: (1) no more use of the ‘reply to all’ button; (2) no more
persons in ‘cc’ than in ‘sent to’; (3) no more email fights. Immediately after
implementing a norm about limiting the number of recipients on outgoing messages,
email pollution was reduced by 27 per cent. The goal of this norm was to make
employees actively decide who really needed to receive a particular email. This
allowed the team members to spare their resources to focus attention on more
important emails from customers, to complete other tasks that were more impor-
tant than responding to internal emails, and to meet face-to-face to resolve critical
issues. Following the publication of this study, we received supportive emails from
software developers and managers. It seems some have taken our recommendations
literally. For example, Andrew Cawood, Chief Information Officer of the global
information and measurement firm Nielsen, explained in a memo to 35,000
employees that his strategy to eliminate bureaucracy and inefficiency was to disable
the ‘reply to all’ button on their screens.
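The first two guidelines are mechanical enough to be checked automatically. A hypothetical sketch of such a checker follows; the function and parameter names are our own illustration, not part of the original study, and the third guideline (‘no more email fights’) requires human judgement:

```python
# Hypothetical checker for the first two email pollution guidelines.
# Names and structure are illustrative assumptions; guideline 3
# ('no more email fights') is not mechanically checkable.

def norm_violations(to: list[str], cc: list[str],
                    reply_to_all: bool) -> list[str]:
    problems = []
    if reply_to_all:                     # guideline 1
        problems.append("avoid 'reply to all'")
    if len(cc) > len(to):                # guideline 2
        problems.append("more persons in cc than in to")
    return problems

print(norm_violations(to=["ben"], cc=["cho", "dev"], reply_to_all=True))
# → ["avoid 'reply to all'", 'more persons in cc than in to']
```

A client plug-in running a check like this before sending would force the sender to decide actively who really needs the message, which is exactly the behaviour the norm was designed to produce.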
Organizations should also establish norms about social media use. For example,
in order to distribute the responses more evenly across experts, they could encou-
rage collaboration stars to send targeted requests for information to social media
discussion groups (Cross & Gray, 2013). Another organizational strategy for dealing
with overload related to email and social media is to include intelligent agents for
filtering inputs (Jackson & Farzaneh, 2012).
Organizations are not doing such a great job in alleviating their employees’
overload. Universities and businesses seldom have written policies regarding how
emails should be processed or the speed at which this should be done; nor do they
have policies about the use of social media platforms. However, social norms such as
those encountered by Barley et al. (2011) clearly are in place. Organizations should
work to ensure that communication flows are effective by avoiding conditions
of overload.
Work
Highly related to organizational design is the design of work and the identification
of work tasks. As we demonstrate next, the design of work is also related to
information overload, work-life balance, and the challenges that employees
experience in seeking a work-life balance.
People used to talk about jobs. As far back as the Industrial Revolution, when
people talked about jobs they meant a discrete task or set of tasks with a well-
defined beginning and end (Bridges, 1994). However, in the mid 20th century the
concept of a ‘job’ morphed into the concept of ‘work’, or into an
In this new world of ‘work’, what needs to be done might never cease. With a
broad definition of what is to be done in the workplace coupled with modern
Information Technologies that allow some, if not many, employees like Anna to
Work tasks
The concept of overload has been linked to the concept of tasks for decades.
Tushman and Nadler (1978) viewed overload from the perspective of the inade-
quacy of organizations’ IPC to deal with uncertainty in the tasks of organizational
work units. In the literature, task characteristics that affect information overload are
task complexity, task novelty, task interruption, and task-switching (Jackson &
Farzaneh, 2012). Task characteristics such as task novelty, complexity, and
interdependence create uncertainty that can be relieved upon receiving the necessary
information. Further, interruption of complex tasks has been linked to overload
(Speier et al., 1999).
Schick et al. (1990) offered a rather interesting perspective on how work can
lead to information overload. They consider work to be a function of IPC and
information load, which is “the amount of data to be processed per unit of time”
(Schick et al., 1990, p.203). In Schick and colleagues’ view, the organization
decides the amount of time or resources (i.e., capacity) it should take for an
individual to complete a task and the number of tasks an individual is expected
to do in a period of time. That is, the organization
decides on the time an employee has available to complete a task, and if there is
not enough time to complete the task, information overload can occur. Schick and
colleagues propose various organizational strategies for reducing overload including
providing employees with fewer or simpler tasks to perform, expanding the
workforce in order to divide the work across more employees, or making more
time (a valuable resource) available to employees. The reality is that tasks the orga-
nization assigns often are not what employees spend their time on. One recent
survey reports that employees spend less than half their time on the tasks for which
they were hired (Van Knippenberg, Dahlander, Haas, & George, 2015).
workday. This work overload (sometimes called role overload when employees feel
like they have too much to do in their various roles in light of available time and
resources) is significantly and positively related to work-family conflict (Bolino &
Turnley, 2005; Ahuja et al., 2007). Work-family conflict occurs when the time and
energy demands of one set of roles (i.e., work or family) make it difficult to fulfil
the demands of another. It appears that women are more likely to experience role
overload, whereas men are more likely to experience work-life conflict (Duxbury
& Higgins, 2001). In particular, working mothers appear more challenged in bal-
ancing their different loads than do fathers. Further, work-family conflict (inter-
ference) is greater for married couples than for workers who are single. The
spillover of work demands into family life increased in a large sample of British
workers in the period from 1992 to 2000 (White, Hill, McGovern, Mills, &
Smeaton, 2003). Similarly, in their longitudinal study, Duxbury and Higgins
reported a sharp increase in role overload in Canada in the decade between 1991
and 2001, suggesting that the Canadian workers surveyed were prioritizing work
over family when they were at home. They noted that role overload was highest
for married couples in the ‘sandwich’ group, who bear the burden of taking care of
younger children and older parents. Interestingly, in a British study, dual-earner
couples reported lower levels of work-family conflict, or spillover of work into
home life, than did single-earner couples (White et al., 2003). This may be because
the dual-earner couples may have more resources available to them by paying for
domestic services or childcare, or it may be due to one of the partners, especially
the woman, taking a less demanding job.
Work-life balance is said to occur when the levels of work-family conflict are
acceptable. In particular we define work-life balance as the degree to which indivi-
duals can satisfactorily harmonize the temporal, emotional, and behavioural
demands of work and family life that are levied on them (Sarker, Sarker, & Jana,
2010). A proper work-life balance requires deploying one’s resources mindfully.
One way of thinking about how employees perceive the relationship between
work life and family life is on a continuum, as demonstrated in Figure 5.2 (see
Sarker, Xiao, Sarker, & Ahuja, 2012).
On one end of the continuum is the compartmentalized perspective in which
work life is totally separated from family life. At the other end of the continuum is
the encompassing perspective in which the “individual’s life is completely encom-
passed within his/her work domain, and the success in the work domain equates to
success in the personal life domain” (Sarker et al., 2012, p.148). This means that
work demands are always prioritized over the demands of the family. This seems to
be the case with Anna. Somewhere in the middle, to varying degrees, is the over-
lapping perspective. Though the work domain and family/personal domain may be
physically and temporally separated, emotional and behavioural overlaps likely still
exist. Individuals who hold this perspective may accept these overlaps, but they still
face varying degrees of work-family conflict in trying to establish a level of work-life
balance that they find to be suitable. Though these individuals may allow some spil-
lover between their work and family roles, they usually have a ‘zone of intolerance’ as
to what constitutes a viable intrusion of work into their family life (Sarker et al., 2010). Information Technology blurs boundaries between both domains.

84 Dark side of IT at the organizational level

FIGURE 5.2 Work-life balance continuum (adapted from Sarker, Xiao, Sarker & Ahuja, 2012).
Source: Adapted from Sarker et al. (2012)
Barley et al. (2011) illustrated these work-family relationships in a discussion
based upon the inability of employees to handle all of their email during normal
working hours. Their working hours had been filled with meetings or tele-
conferences. As a result, nearly 60 per cent of the employees in their study handled
work-related email from home. Those who had not sent work-related emails from
home explained that doing so would have likely led to family conflict. These
individuals held a compartmentalized perspective. Some of those who had
answered their emails from home noted that email made the boundary between
work and home more permeable (see also Duxbury & Higgins, 2001). To varying
degrees, these individuals held an overlapping perspective. A Harvard Business Review article (Groysberg & Abrahams, 2014) noted that several executives did not
think it was possible to compete in the global marketplace and still achieve work-
life balance. In fact, one executive illustrated an encompassing perspective when he
stated that it was an impossibility to have “a great family life, hobbies, and an
amazing career” (Groysberg & Abrahams, 2014, p.66).
The negative impacts of role overload and work-family conflict have bled over
into organizations. Canadian respondents who experienced considerable overload
and work-family conflict were less committed to the organizations for which they
worked and less satisfied with their work; in addition, they were more stressed with
their work, were absent from work more often, reported greater intentions to quit,
experienced burnout more often, and used the health system more (Duxbury &
Higgins, 2001). Further, Indian systems developers on globally distributed teams,
who were struggling due to flexible scheduling, were more likely to say that they
were thinking about leaving the organization (Sarker, Ahuja, & Sarker, 2018).
Some claim that organizations have become more aware of work-life balance
issues and that they have made progress in implementing programmes and initia-
tives that mitigate them (Duxbury & Higgins, 2001). Ways that organizations could
reduce role overload and work-family conflict in order to help employees realize
more work-life balance include the following: establishing ‘people management’
practices that encourage a focus on output vis-à-vis hours worked, more supportive
leaders, etc.; allowing employees more control over when and where they work,
such as is possible with flexitime and telecommuting programmes; letting employ-
ees refuse to work overtime without this hurting their careers; offering employee
and family assistance programmes; providing a limited number of annual paid leave
days to take care of parents or children; introducing initiatives such as self-directed
work teams or information sharing between management and employees to
heighten employees’ control over their work; offering company-sponsored nur-
series; or making work teams responsible for finding ways to deal with work-life
balance issues (Duxbury & Higgins, 2001; White et al., 2003). While all these
suggestions make sense, their contribution to creating a favourable balance between
work and family life needs to be carefully studied. Not all may be fruitful. For
example, in one study flexible work hour practices did not affect the work-family
conflict of men, though it did significantly reduce the conflict for women (White
et al., 2003). In fact, most men chose to use the additional time that was made
available in the flexible work hours option to work even more. Further, flexible
scheduling was found to be positively (and significantly) related to work-life conflict for the Indian systems developers on globally distributed teams mentioned
above. Possibly they found it more difficult to navigate the conflicting demands of
work and family life (Sarker et al., 2018).
When it came to balancing work and family life, the flexible and family-friendly policies
adopted by businesses were nearly as beneficial as policies to regulate working
hours. Duxbury and Higgins (2001) urged stronger action, arguing for legislation
that would prohibit employees from working overtime and/or give them the right
to time off instead of overtime pay. The French government has probably taken
the most dramatic steps by passing legislation which “requires companies with
more than 50 employees to establish hours when staff should not send or answer
emails” (Morris, 2017). In particular, organizations must negotiate with employees
about the employees’ right to “switch off” (Agence France-Presse, 2016). The law
is intended to ensure that employees are fairly paid for work, have flexibility in
working outside of normal work hours, and are less subject to burnout. Such leg-
islation can definitely change the work-life relationship perspective from encom-
passing to lower degrees of overlap by protecting employees’ private time, but it
may create additional work conflicts for those on highly interdependent distributed
teams spread around the globe.
People impacts
When looking at the dark side of IT at the organizational level, our intent in this
chapter is to report the impact of dysfunctional behaviours and briefly discuss how
they may be alleviated. IT addiction can come in many forms: computer addiction,
cyber-related addictions (Kuss & Griffiths, 2011), Internet addiction, or even addic-
tion to certain types of IT such as SNS, email, or mobile technology– not to forget
mobile email addiction (Turel & Serenko, 2010). In this chapter we focus on
addictions that have been studied in the workplace.
Technostress
Perhaps the most widely studied type of dysfunctional overload behaviour at the organizational level (other than information overload) is
technostress. It has been related directly to reduced satisfaction and productivity/
performance (Tarafdar et al., 2007; Ragu-Nathan et al., 2008; Tarafdar et al., 2010).
For example, using new technologies requires employees to update their IT skills.
Doing so takes away time from completing assigned work tasks. If they cannot get
the technology to work, they need to troubleshoot and seek technical assistance–
all of which means that their IT-enabled work must wait (Tarafdar et al., 2007).
Technostress also has been linked to decreased innovation in work tasks when using
IT, increased dissatisfaction with the IT that is used, and reduced commitment to the
organization (Tarafdar et al., 2007). In addition, technostress is linked to role stress,
which in turn is thought to create role overload (Tarafdar et al., 2007; Tarafdar et al.,
2011). It has even been studied in relation to social media (Brooks et al., 2017) and
stressful information security requirements (D’Arcy, Herath, & Shoss, 2014).
Tarafdar and colleagues (e.g., Tarafdar et al., 2007; Ragu-Nathan et al., 2008;
Tarafdar et al., 2010; Tarafdar et al., 2011; D’Arcy, Gupta, Tarafdar, & Turel, 2014)
IT addiction
To our knowledge, only a few studies have addressed Pathological Internet Use
within organizations (e.g., Yellowlees & Marks, 2007; Turel & Serenko, 2010).
However, there are quite a few studies on IT addiction, and the number is grow-
ing rapidly. Determining its exact incidence in organizations is virtually impossible
since different criteria are used to assess whether or not someone is addicted,
“including (i) tolerance, (ii) withdrawal, (iii) increased use, (iv) loss of control, (v)
extended recovery period, (vi) sacrificing social, occupational, and recreational
activities, and (vii) continued use despite negative consequences” (Kuss & Griffiths,
2011, p.3540). Figures range from 6 to 17 per cent for mobile email addiction and
from 3 to 80 per cent for Internet addiction (Yellowlees & Marks, 2007). Admit-
tedly, many of the studies making these estimates may have been flawed (Yellowlees
& Marks, 2007).
Though we may not know the exact incidence, stories abound that attest to the
presence of IT addiction. One story is told by Karaiskos, Tzavellas, Balta, and
Paparrigopoulos (2010) about a 24-year-old woman who used social media so
much that her behaviour interfered significantly with both her private life and her
professional life. She was fired from work because she used Facebook for at least
five hours a day to repeatedly check her account. When she went to a psychiatric
clinic to get help for her social media addiction, she used her smartphone to access
Facebook. As if things were not bad enough, she developed insomnia and anxiety
symptoms. Some media addicts may neglect family and home duties; others may
use the medium as a “mental safe haven” or be totally preoccupied with it (Turel
& Serenko, 2010, p.42). Media addicts are all around. You likely have gone out to
dinner with smartphone addicts who could not pull themselves away from their
phones to talk with you when you were sitting at the other side of the table.
People who suffer from various IT addictions may suffer from mood swings,
feelings of work overload, and work-family conflicts (Turel & Serenko, 2010).
Consequently, they may be less satisfied with their work and more subject to
voluntary turnover. Employees spend considerable time checking their social media
inboxes (e.g., Facebook, Twitter, and LinkedIn) throughout the workday. In a
survey of 168,000 employees, 43 per cent of the respondents confessed that
checking their social media at work hurt their work productivity (Brooks et al.,
2017). Further, organizations have been sued by their employees because they
developed mobile email addiction or became “BlackBerry addicts” (Turel &
Serenko, 2010, p.43).
Smartphone users consult their phones nearly 30,000 times annually (or approximately five times per waking
hour). A 2015 Gallup survey reported that iPhone owners “couldn’t imagine life
without the device” (Carr, 2017). When their phone rings or beeps while working
on a challenging task, they become distracted and their work gets sloppy, even if they
do not check or answer their phone. In other words, their cognitive resources are
diverted (Carr, 2017). Someone in each organization, most likely human resource
professionals, should be monitoring technology use to ensure that organizational
policies do not promote the dark side of IT use (Porter & Kakabadse, 2006).
Finally, in terms of the work, work demands and work resources, as well as
personal resources, have been found to be predictors of two dimensions of tech-
nostress: techno-strain and techno-addiction (Salanova et al., 2013). Salanova et al.
(2013) found that work demands have negative impacts on both dimensions. It
appears that the more work demands are placed on employees, the more work
resources are required (e.g., social support, mental and emotional competences).
Some work demands, such as work overload and frequent software updates, are
predictors of both techno-strain and techno-addiction. Other demands such as
emotional overload and obstacles hindering effective IT use are specific predictors
of techno-strain. In terms of personal resources, mental competences predict
techno-strain while emotional resources predict techno-addiction.
Technology
Information Technologies originally were designed and built to serve workers and
make business operations more efficient. However, all too often workers are being replaced by IT. As we have reported, IT-related overload and IT addiction cause
humans to malfunction and exhaust their resources. Rather than applying work
design and adapting organizational structures to augment employees’ resources,
organizations are increasingly turning to automation and robots as the desired solution for enhancing organizational IPC. A new robot-human supervenience problem has emerged. Little thought is given nowadays to the consequences of this technocentric view. Nobel Prize winner Paul Krugman recently published a
column titled “Sympathy for the Luddites” in The New York Times. He showed his
sympathy for Luddites when he described the pain of late 18th-century British
cloth workers who saw their jobs being taken over by machines. He concluded:
Automation
Huge productivity gains have been provided to organizations through enhanced
computational capabilities and the associated automation of work (Hancock, 2013).
In the 1970s it was hoped that the strides in productivity achieved through auto-
mation would lead to a world in which 30- or maybe even 20-hour work weeks
would become a reality. However, the benefits of automation have not been
shared in an equitable fashion to make this dream come true for the ‘collective
good’. Rather, a disproportionate few have enjoyed the benefits of automation
(Hancock, 2013). And there is a growing number of individuals who are suffering
due to automation, when technology takes the place of their work.
Let us focus on one highly educated group of professionals who have suf-
fered greatly from the introduction of Health Information Technology (HIT).
Our research has focused on one group of such individuals: the Medical
Doctor of Anaesthesiology (MDA). In particular, our research found that HIT
may fully or partially substitute for MDAs in the operating room (OR)
(Medina, Verhulst, & Rutkowski, 2015). Clearly HIT has brought some
advantages for the healthcare industry (e.g., Kim & Michelman, 1990; Christensen, Bohmer, & Kenagy, 2000; Goldschmidt, 2005; Kaplan & Porter, 2011;
Romanow, Cho, & Straub, 2012). For example, patients’ physiological reac-
tions to medication can now be monitored in real time by medical software in
the OR (Modell, 2005). However, the downside of HIT is that MDAs may
ultimately be replaced by HIT.
Robots
Organizations are increasingly turning to robots to perform work tasks more effi-
ciently, more accurately, and in a cost-saving way. A robot is “a reprogrammable,
multifunctional manipulator designed to move material, parts, tools, or specialized
devices through variable programmed motions for the performance of tasks”
(Hamilton & Hancock, 1986, p.70). The term came from Capek’s 1921 play RUR:
Rossum’s Universal Robots and appears to be derived from the Austro-Hungarian
Empire term, robota, which means ‘vassal’ or ‘worker’ (Schaefer, Adams, Cook,
Bardwell-Owens, & Hancock, 2015). The first robot traces back to the 4th century BC, when Archytas of Tarentum developed a mechanical bird, referred to as “pigeon”,
that was powered by steam. Leonardo da Vinci developed a robot using a knight’s
suit of armour and an internal cable mechanism in 1495. Robots were first put to
work in the General Motors (GM) assembly line in 1961. George Devol’s indus-
trial robotic arm was the first robot installed by GM. MIT’s John McCarthy and
Marvin Minsky contributed to the science of robotics in their Artificial Intelligence
Laboratory in 1959. Also in 1959, Stanford built the first robot to know and react
to its own actions. Stanford scientists spent the decade of the 1970s building a cart
that could follow a line or be controlled by a computer. About the same time,
business organizations also were conducting robotics research. For example, Honda
began research on collaborative robots in 1986 (see Schaefer et al., 2015, for details
on key historical achievements in the evolution of robot design). Given the big
role that automobile manufacturers played in the early days of industrial robots, it is
not surprising that automobile manufacturers made 40 per cent of the worldwide
purchases of robots in 2013 (Goodman, 2015). The Occupational Safety and
Health Administration has tallied 33 robot-related deaths in the USA alone, and more are likely
as robots become more prevalent (Goodman, 2015).
When delivering all of the information it has gathered would increase the human’s workload and create situations of overload, the robot should deceive the human
into believing that the information that the robot has delivered is all the informa-
tion that has been collected. Hancock et al. (2011) view such actions on the part of
a robot as deception by omission. However, this deception on the robot’s part can
help avoid costly mistakes that might follow from humans who are cognitively
overloaded.
not perform the test again for fun, the collision did not leave a bruise. However, as
robots become more collaborative, the operations they carry out are becoming
more complex, and ensuring the safety of teammates is challenging to say the least.
Another challenge that has emerged since Hamilton and Hancock first started
writing about robotic safety is the risk to the robots, and thus to the humans that
they serve, from malicious hacking (Goodman, 2015).
Manufacturing organizations are not the only organizations benefiting from
advances in robotic technologies. Robotic advances are being seen in pharmacol-
ogy (Hemmerling & Taddei, 2011; Hemmerling, Taddei, Wehbe, Zaouter, Cyr, &
Morse, 2012). They also are being used to perform surgeries. The da Vinci® Sur-
gical System, a minimally invasive robotic surgery system, allows greater precision
in cutting and offers better views of the patient’s surgery site, especially because
there is less blood to obscure the vision. However, innovative robotic technologies
have killed hospital patients in recent years (Sharkey & Sharkey, 2013), though the
number of injuries recorded is far less for surgical robots than industrial ones
(Goodman, 2015). As one example of a fatal injury, a Chicago man died in 2007
when a surgeon punctured his spleen while operating a $1.8 million da Vinci ‘hands-on’ surgical robotic system for the first time on a living person. Saunders et al. (2016) pointed to five hazards that are at the heart of such tragedies: (1) overloaded or
underloaded OR professionals; (2) inadequate training of surgeons on the robotic
systems; (3) inadequate training for the healthcare professionals on the surgical team;
(4) the complexity of HIT; and (5) overconfident surgeons. We argued that adequate
training of healthcare professionals on the technology as well as certification of mastery
of use of the technology would go a long way.
Saunders et al. (2016) suggested that the MDAs on the surgical teams are
underloaded and, hence, may be bored. Yet there are a number of reasons why
members of the surgical team may be overloaded. Alarms in the OR may be one.
It turns out that the number-one health hazard in the OR for three years in a row
(2013–2015) was the overwhelming number of alarms (ECRI Institute, 2012,
2013, 2014). Surgical team members may become overloaded by alarms that are
constantly going off and may not pay adequate attention to each one. To pay
attention and respond appropriately to these alarms increases mental load and may
drain attentional resources (Tollner, Riley, Matthews, & Shockley, 2005). Further,
the complexity of the robotic systems along with their associated new medical
procedures may intensify team members’ mental strain and stress, adding to their
mental load (Ayyagari, Gover, & Purvis, 2011; Tarafdar et al., 2007). Moreover,
surgeons may become distracted as they perform complex surgical movements
under time pressure. They must remember the proper sequence of steps in a given
procedure as they converse with surgical team members about instruments and
the patient’s status (Tollner et al., 2005). Not surprisingly, the surgeons may be
especially overloaded (Zheng, Cassera, Martinec, Spaun, & Swanstrom, 2010).
Sergeeva, Huysman, and Faraj (2016) spent 102 hours observing operations that
used the da Vinci robotic system. They observed the challenges surgeons face when
learning to operate the system. Surgeons have to unlearn how they performed
surgeries, using their senses, in the past. In particular, they relied on the sense of
touch inside the patient’s body. Further, they need to unlearn being in the centre
of things since they now sit remotely in the OR– away from the patient and in
front of a technologically advanced 3D camera. The surgeons must learn to operate
the robotic arms from a distance by relying heavily on the clear and detailed 3D
images. But surgeons are not the only ones who need to be retrained. This applies
to the whole team. For example, it is the scrub nurse who actually inserts or
changes precision instruments in the robotic arm and takes tissues out of the
patient’s body. MDAs must learn to anticipate sudden movement from the robotic
arms that tend to be hovering over the patient. In fact, the MDAs that Sergeeva
et al. (2016) observed made a metal shield to protect the patient’s face from any
sudden movements by the robot. In one emergency situation, the MDA had to
crawl under the table on which the patient was resting in the OR in order to
reinsert a breathing tube that had fallen out of the patient’s mouth. In summary,
operating with the robotic system can be dangerous, and the entire team needs to
be trained to provide coordinated responses to unanticipated problems.
Conclusion
In this chapter we have described the dark side of IT from four perspectives:
organizational design and structure, work, people impacts, and technology. Each
has the potential to cause serious damage at the organizational level as well as at
individual and even societal levels. Further, the damage can be compounded when
these four factors interact with one another. Table 5.1 summarizes the key issues
and interrelationships.
TABLE 5.1 Summary of the Information Technology dark side diamond (continued)

Factor: People impacts
Description: Technostress: a type of stress experienced in organizations by technology end users as a result of their inability to cope with the demands of organizational computer usage. Pathological Internet Use: the consequences of problematic cognition coupled with behaviour that intensifies or maintains maladaptive response. IT addiction: the state of being challenged in balancing IT usage mindfully so as to preserve one’s resources; includes Internet, mobile email, and SNS addictions.
Issues: Performance/productivity; satisfaction with work; voluntary intention to quit the organization.
Examples of interrelationships with other factors: Organizational design and structure: Work can be designed with flexibility (i.e., job sharing) and norms about using communication media. Work: Organizations can establish and enforce policies to promote work-life balance (i.e., flexitime, freedom to turn down overtime without repercussions). Technology: Policies and norms can be created to reduce technostress and IT addiction.

Factor: Technology
Description: Though technology is found in myriad forms, our dark side focus is on automation and robots. Robot: a reprogrammable multifunctional manipulator designed to move material, parts, tools, or specialized devices through variable programmed motions for the performance of tasks. Automation: automatic operation of an apparatus, process, or system performed by IT to take the place of some aspect of human performance.
Issues: People’s work being replaced by robots or automation; robot safety.
Examples of interrelationships with other factors: Organizational design and structure: Organizational work needs to be redesigned to reflect differences in task performance by robots or in response to automation. Work: With automation and the introduction of robots, humans (individuals and teams) need training when they are placed in charge of robots and automated systems to make sure they operate smoothly. When they are replaced by automated systems or robots, people need to be retrained for other jobs. People impacts: Workers experience techno-insecurity from fear of losing their jobs to automation or robots and techno-complexity from having to learn and adapt to new technologies.
6
MEASURES OF IT-RELATED OVERLOAD
In the business world, there is a well-known adage: ‘If you can’t measure it, you
can’t manage it.’ Measurement is critical to understanding new discoveries and
therefore for advancing science. Some early examples of measurement based on
triangulation are given below.
In 1800 Britain was faced with a colossal problem. It was trying to rule a sub-
continent without having a clue as to what it looked like. So it commissioned
Colonel William Lambton to undertake the Great Trigonometrical Survey (more
commonly known as the Survey of India) to ascertain precisely where places were
located in the colony. In 1808 Lambton started surveying the southernmost tip of
India. He used a triangulation technique which incorporated trigonometry, chains,
metallic bars, monuments, and theodolites to survey the land. When Lambton died in 1823, George Everest took over the mission. Everest, in turn, retired in 1843 and was replaced by Andrew Scott Waugh. It took decades to measure India. In the
process, Waugh came upon the world’s tallest mountain, which he named in 1856
in honour of Everest (Arbesman, 2013).
A way of measuring this massive area of land was essential for its governance by
the British Empire. Almost a century earlier, the French had undertaken a similar
arduous mission involving triangulation. Charles Marie de La Condamine took part in the French geodesic expedition (1735–1744) to triangulate distances through the Andes in order to settle the question of the earth’s circumference. Although
their project had been carefully planned, they encountered many mishaps (see
Bryson, 2003).
Much earlier, the ancient Greeks had sought to understand, and therefore measure, the earth. Hipparchus of Nicaea (150 BC) used triangulation to
work out the distance of the moon from the earth. Once Hipparchus knew the
length of one side of a triangle as well as the values of both corner angles at the
baseline, pointing at the moon allowed him to determine all the other dimensions.
His approach, triangulation, is used extensively in navigation and military strategy.
Based on the principles of geometry, triangulation increases the accuracy of
observations.
The term triangulation is also applied to a research strategy that uses a multitrait-multimethod approach, or convergent validation. It uses multiple reference points
(often more than three) to measure phenomena (Jick, 1979). Its goal is to ensure that
the results of research are not the product of methodological artefact (Campbell &
Fiske, 1959). According to Denzin (1978), triangulation can be applied within-method
by using multiple comparison groups or multiple scales/indices for the same construct.
Triangulation within-method aims mostly at assessing construct reliability (i.e., internal
consistency) and validity. For example, two different scales could be used to measure
overload. Triangulation can also be applied between-methods by using diverse
methods for cross-validation; for example, using semi-structured interviews and phy-
siological measurements to capture overload. Between-methods triangulation is mostly
concerned with the generalization of constructs (i.e., external validity).
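As a hypothetical sketch of within-method triangulation (our own, with invented scores), overload could be measured with two different self-report scales; convergent validation then amounts to checking that scores on the two scales correlate strongly:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Five respondents answer two different overload scales (1-7 Likert items,
# averaged per scale); high convergence suggests both scales tap the same
# construct rather than a methodological artefact (Campbell & Fiske, 1959).
scale_a = [2, 5, 6, 3, 7]
scale_b = [3, 5, 7, 2, 6]

print(round(pearson(scale_a, scale_b), 2))  # 0.88: evidence of convergence
```

Between-methods triangulation would instead pair a self-report scale with a different method altogether, such as semi-structured interviews or a physiological measure.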
Triangulation may provide a new lens for viewing and therefore understanding
IT-related overload. Triangulation makes it possible to “capture a more complete,
holistic and contextual” portrayal of the phenomenon (Jick, 1979, p.603). Our
earlier methods to measure IT-related overload were not always successful, but we
learned a lot during the painful process. In the next sections we want to share with
you what we learned. We believe that triangulated measurement of IT-related
overload can expand the knowledge of this phenomenon. However, we also
recognize that measurement can be a double-edged sword. If the measurement is
inaccurate, it can create a spurious understanding of IT-related overload. Therefore– like our predecessors William Lambton, de La Condamine, and Hipparchus of Nicaea– we use triangulation to measure IT-related overload. In
this chapter, we start with a philosophical discussion of theoretical systems and
measurements. We describe the use of self-report measures, which is where most
overload researchers have started. We then suggest approaches to complement the
self-report measures. These include physiological measures such as thermal imagery, heat flux, and galvanic skin response (GSR, a measure of electrodermal activity), as well as physiological measures of energy expenditure in Metabolic Equivalents of Task (METs). We conclude with studies that employ some of these
measurement approaches and suggestions for moving forward on the neuroscience
measurement frontier.
research form scientific languages with their own set of concepts, conventions,
codes, and rules, providing ways to look at the world through particular lenses.
Theory may also be viewed as a system in which constructs are related to each
other by propositions (Bacharach, 1989). Constructs are defined as “terms which,
though not observational either directly or indirectly, may be applied or even
defined on the basis of the observables” (Kaplan, 1964, p.55). Indeed, constructs
are operationalized into configurations of variables. A variable is an observable entity
which is capable of assuming two or more values (Schwab, 1980). Hypotheses
relate variables to one another and are typically tested using statistical methods.
Thus, both propositions and hypotheses belong to theoretical systems and are
statements of relationships. Operationalizing constructs into observables relates to
‘measurement’. Measurement is critical to the advancement of science. Sinan Aral
suggested that “revolutions in science have often been preceded by revolutions in
measurement” (Kitchin, 2014, p.1). Science may be considered a language since
it evolves constantly not only through paradigm shifts, but also with new instru-
ments and technologies.
(i.e., qualitative). They later applied the methods of natural sciences (e.g., sta-
tistics), characterized as objective (i.e., quantitative); for example, in measuring
learning performance. Popper (1959) stated that the terms ‘objective’ and
‘subjective’ are “heavily burdened with a heritage of contradictory usages and
of inconclusive and interminable discussions” (p.44). The contradictory usage
refers to the term ‘subjective’ being somehow tainted, as it applies to our
“feeling of conviction” (which has varying degrees) (Popper, 1959, p.46).
However, as Kant (1781–1787/2003) emphasized, “objective reasons too may
serve as subjective causes of judging” (in Popper, 1959, p.45). That is, objective
measures sometimes can lead to tainted conclusions. The method used to
measure objective versus subjective phenomena is not a certification of ‘objec-
tivity’ in the broader sense of the term.
In this chapter, we define the term objective as focusing on the object, the
material in World 1 (Popper, 1978). That is, the brain and its architecture sup-
port information processing. The brain is therefore a mere material oper-
ationalization of the mind in World 1. The functioning of some part of the
brain, such as the prefrontal cortex (PFC), enables information processing and
therefore mental processes such as decision-making in World 2. It is deemed
‘object’ in the sense that it has universal properties and functions, just as a chair has
four legs and can be used to sit on. As we reported in Chapter 2, anatomic brain
damage, particularly of the PFC, leads to the inability to use affective feedback
in making judgments and decisions (Damasio, 1994). Also, individuals suffering
from Narcissistic Personality Disorder have less brain matter in areas that overlap
with the areas associated with empathy (i.e., left anterior insula, rostral and
median cingulate cortex, as well as part of the PFC) (Schulze, Dziobek, Vater,
Heekeren, Bajbouj, Renneberg et al., 2013; Nenadic, Güllmar, Dietzek, Lang-
bein, Steinke, & Gader, 2015).
We define the term subjective as focusing on the subject, the psychological states
of subjective experiences (Popper, 1978). Phenomenological sociology (Weber,
1949) introduced a clear focus on the World 2 of subjective meanings. Weber
underlined that social action should be studied through interpretive means, referred
to as Verstehen. In sociology, understanding the subjective meaning and purpose
attached to the actions of individuals is necessary to understand social actions
(Calhoun, 2002). The focus is on the subjective meaning that humans (i.e., subjects)
attach to their actions and interactions within specific social contexts.
In the late 1800s, Sir Francis Galton (1892) was inspired by the work of Darwin
(1859) on natural selection. Galton demonstrated that objective tests could provide
meaningful scores. He was the first to use statistical methods to study individual
differences and the inheritance of intelligence. To do so, he introduced the use of
questionnaires and surveys for collecting the data that he needed for his anthropo-
metric studies. He is commonly referred to as the father of psychometrics (see Kaplan
& Saccuzzo, 2010).
In the early 1900s, Binet and Simon (1904) introduced the first standardized
Intelligence Quotient (IQ) test (referred to as the Binet-Simon). The Binet-Simon
of overload, let alone measure it. In philosophical terms, there are subjective
versus objective constructs of overload, and these require different measure-
ments. Further, while overload has been studied, underload has not been as
widely researched.
Objective measurements
Other research measures the objective experience of overload. The basic premise
in this research has been to use proxy measures of demands on the brain (i.e.,
brain load or mental load) using the amount of information or alternatives. This
makes it possible to test for bottleneck effects of observables on higher activities
such as decision-making or learning. Researchers mostly use questionnaires in the
context of laboratory experiments. Their research designs operationalize the
amount of information in terms of the actual input delivered to the participant.
Observables relate to message complexity, such as the number of words in the
body of an online message, the number of words in non-indented lines in the body
of a message, the number of lines according to header fields, the number of lines
excluding those of attachments, the number of indented lines excluding those of
attachments, and the number of contributors to ‘reply’ messages (Jones et al.,
2004). Overload is also operationalized by manipulating the number of alternatives
in order to create bottleneck effects (Chervany & Dickson, 1974; Payne, 1976;
Jacoby, 1984; Cook, 1993) or by manipulating interruptions and task complexity
while using a technology (Speier et al., 1999). Still other operationalization
employs observables such as the numbers of ideas, idea diversity, time, and task
domain (Grise & Gallupe, 1999–2000).
In discussing objective measurements of cognitive load, we would be remiss if
we did not mention the work of Sweller and his colleagues (see Sweller et al.,
1998; Paas et al., 2004). Sweller (1988) was among the first to measure cognitive
load in the context of learning. He proposed four types of cognitive load: intrinsic,
extraneous, germane, and informational. He and his colleagues devised various
measurements for most of them. For example, in researching the inherent com-
plexity determined by the interaction between the nature of the material and the
expertise of the learner (i.e., intrinsic cognitive load), he measured cognitive abil-
ities and skills. Sweller and colleagues noted the technology features when studying
extraneous cognitive load and measured the amount of information when studying
information load. Relevant to our research, they found that different technology
features differentially impacted extraneous cognitive load depending upon the
modalities involved (e.g., visual, auditory, tactile). Splitting attention between
technologies (multitasking) increases cognitive load, especially when different
modalities are engaged.
1 You cannot process the number of requests you receive to use new Internet communication tools.
2 You cannot handle the number of requests you receive to use new Internet communication tools.
3 You cannot cope with the number of requests you receive to use new Internet communication tools.
4 You are overwhelmed by the effort it takes to handle the number of requests you receive to use new Internet communication tools.
5 You feel pretty irritated by the number of requests you receive to use new Internet communication tools.
6 You feel emotionally pressured by the number of requests you receive to use new Internet communication tools.
7 You feel confused by the number of requests you receive to use new Internet communication tools.
Source: Saunders et al. (2017). * Based on the instrument validation process, the item was removed.
Hobfoll, 1989). One of these is the social desirability bias in which the respondent
tries to anticipate what is considered the most appropriate response. Another is
the fundamental attribution error (Ross, 1977), also referred to as self-serving attribution bias
(Forsyth, 1980) (see Chapter 3). From a socio-cognitivist perspective, confusion
about information load (i.e., the amount of information) and the so-called infor-
mation overload should be understood as the result of a cognitive distortion. Even
time spent on an activity may be difficult to measure using self-report measures.
TABLE 6.2 Operationalization of memories of past cognitive and emotional overload with item loadings

Memories of past cognitive overload
1 I could not process the amount of information delivered when I started using a computer. (.853)
2 I could not handle the excessive amount of information provided when I started using a computer. (.859)
3 I had problems adopting computers and learning how to use them. (.797)
4 I was overwhelmed by the effort it took to learn using computers. (.829)

Memories of past emotional overload
1 I felt emotionally pressured when I first used a computer. (.866)
2 I felt confused when I first used a computer. (.797)
3 I felt frustrated when I first used a computer. (.848)
4 I felt happy when I first used a computer. R, *

Source: Saunders et al. (2017). * Based on the instrument validation process, the item was removed.
For example, Junco (2013) compared actual versus self-reported times for use of
different technologies among 45 university students over a period of a month. The
self-reported time (hours and minutes) they believed they had spent accessing Facebook,
Twitter, and email was compared to the actual time, which had been monitored
with software installed on their computers (i.e., an objective measure). Although
correlations between self-reports and objective software measures of the actual time
spent accessing technologies were significant and reasonably high (e.g., the correla-
tions ranged from .587 for Facebook access to .628 for email access), the self-report
estimates were quite different from the objective software ones. For example, while
users self-reported an average of 145 minutes per day accessing Facebook on their
computer, the actual average time, according to the monitoring software, was 26
minutes per day.
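Junco's finding illustrates a general property of self-report data that is easy to demonstrate: a correlation coefficient measures rank-order agreement, not agreement in magnitude, so self-reports can correlate highly with logged use while overestimating it several-fold. The sketch below makes this concrete with invented numbers (they are illustrative only, not Junco's data):

```python
import math

# Invented paired observations: minutes/day of Facebook use per student.
self_report = [200, 145, 90, 60, 30]   # what students estimated
logged      = [40,  26,  18, 10,  5]   # what monitoring software recorded

def pearson(x, y):
    """Pearson product-moment correlation between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(self_report, logged)
# Ratio of mean self-reported time to mean logged time: how much
# the estimates overstate actual use on average.
mean_ratio = (sum(self_report) / len(self_report)) / (sum(logged) / len(logged))
```

Here the correlation is very high (r above .99) even though the self-reports overstate actual use roughly fivefold, which is exactly the pattern Junco reported.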
Individuals who experience IT-related overload may attribute the state of
excessive emotional and cognitive overload to external factors such as too much
information and time constraints, rather than to internal ones such as their own
limited information processing capacity or previous failed coping strategies
(Ross, 1977). This distortion is self-serving attribution bias, which refers to the
tendency to ascribe success to dispositional attributes and failure to situational
ones (Taylor & Koivumaki, 1976). This bias explains reports in the literature
about the difficulty of measuring overload (Bettman et al., 1990; Jacoby, 1984;
Malhotra, 1984) and the importance of using multiple scales to assess its existence
and potential behavioural consequences. Since self-reports reflect respondents’ sub-
jective experiences and interpretations, they lack objectivity (Donaldson & Grant-
Vallone, 2002). Therefore, the validated subjective measures of IT-related overload
should be complemented with objective measurement techniques that focus on the
brain and the body.
Physiological markers
The use of self-report scales to measure stress and IT-related overload is probably
even more problematic in the operating room (OR) than in laboratory experiments or many field
settings. Surgeons are unlikely to admit, or even recognize, that they are experi-
encing stress (Sexton, Thomas, & Helmreich, 2000; Yule, Flin, Paterson-Brown, &
Maran, 2006; Arora, Hull, Sevdalis, Tierney, Nestel, Woloshynowych et al., 2010).
It is therefore desirable to complement the subjective measures of overload with
objective measurements such as physiological markers. Physiological data collected
using psychophysiological tools are unobtrusive and not susceptible to social
desirability bias.
We used the Thermoview 8300 infrared thermal imaging camera as an objective
measure of IT-related overload. As presented in Chapter 2, the Peripheral Nervous
System controls vital centres, such as the cardiac regulation and vasomotor centres.
The latter regulates homeostatic processes such as body temperature and blood
flow. Homeostasis relates to the physiological efforts to keep the body balanced
and in equilibrium. Potential threats to our bodily homeostasis generate stress
(Levine, 2005). Congruently, research has demonstrated that stress exposure results
stressful event in the role of the navigator. One participant said, “As navigator, it is dif-
ficult in this new situation to get the image focus if you have to think for another person [i.e.
surgeon].” None reported a stressful event when playing the role of anesthetist.
Additional instruments
Additional instruments and, therefore, observables are available to measure IT-
related overload. For example, early work used pupil dilatation and eye movements
as observables of mental workload during decision-making tasks (Hess & Polt,
1964; Bradshaw, 1968) as well as emotional stimulation through audio stimuli
(Partala & Surakka, 2003). Eye-tracking is commonly applied in the domains of
human-computer interaction and design usability. Minas et al. (2014) used elec-
troencephalography, electrodermal activity, and facial electromyography to suggest
that confirmation bias in information processing during online team discussions is a
better explanation than overload for their results.
Functional neuroimaging techniques (e.g., Positron Emission Tomography and
Functional Magnetic Resonance Imaging, or fMRI) also could be employed to
explore the portions of the brain that are activated while performing tasks
(Dimoka, 2011; Dimoka, Pavlou, & Davis, 2011). Dimoka, Banker, Benbasat,
Davis, Dennis, Gefen et al. (2012) reported that “while self-reports may not be able
to capture unconscious processes that are unavailable to introspection, neurophy-
siological tools can capture unconscious processes with direct responses from the
human body” (pp.680–681). Using functional neuroimaging techniques would be
most useful in measuring prior experience of IT-related overload. Indeed, patterns
of neuronal activities should be observed mainly in the PFC and the ‘emotional
brain’ (e.g., limbic system). Empirical regularity studies of the Episodic Memory
have demonstrated a pattern of activities in the left PFC when encoding information
into the Episodic Memory and in the right PFC when retrieving it. This is
known as the Hemispheric Encoding/Retrieval Asymmetry Model (see Kapur,
Craik, Tulving, Wilson, Houle, & Brown, 1994; Kapur, Craik, Jones, Brown,
Houle, & Tulving 1995). Such studies could provide evidence of neural correlates
to validate the self-report scales of prior experience of overload with IT. Indeed,
fMRI is a fascinating tool to study IT-related overload. Using fMRI, Jaeggi,
Buschkuehl, Etienne, Ozdoba, Perrig, and Nirkko (2007) demonstrated that parti-
cipants’ level of performance under conditions of task overload triggers different
activation increases in cortical areas. Congruent with the results of our research, the
participants who performed poorly as surgeons expended additional mental resources
with increasing difficulty, while the brains of the high-performing participants
‘kept cool’ in terms of activation changes and associated frontal temperatures (Pluyter
et al., 2014).
Overload is often conceptualized by default as a supervenience of cognition over
emotion: the brain activity (cognitive overload) is assumed to generate an
emotional response (stress/mental strain) that manifests itself bodily. However, for
the top surgeons, the body did ‘warm up’ while thinking harder without this being
linked at all to frustration. If we had used fMRI, as suggested by Jaeggi et al.
(2007), we probably would have found that the top surgeons’ brains remained
‘cool’. We were indeed dealing with a few cool-headed surgical trainees who were
cognitively absorbed.
Over this past decade of research, we have found personality dispositions such as
anxiety (i.e., neuroticism), intellectual engagement (i.e., Need for Cognition), and
cognitive absorption (i.e., openness/intellect) to be key in understanding IT-related
overload. For example, openness/intellect is linked to a larger bandwidth of
information processing. Mandler (1967) described such bandwidth in terms of
superchunking. These traits involve the PFC, particularly the working memory,
abstract reasoning, and the control of attention (DeYoung, Peterson, & Higgins,
2005). Recently, DeYoung, Hirsh, Shane, Papademetris, Rajeevan, and Gray
(2010) identified brain structures in their explanatory model of the Big Five. They
aimed at specifying the biological systems that linked the psychological mechanisms
underlying the traits of the Big Five. Different measurements represent similar
constructs, but point to different results when addressing overload. This is surely
related to the fact that the brain structure is composed of complementary, but also
conflicting, functions. All are highly interconnected, just as emotion and cognition
are, and just as the mind and the body are.
Conclusion
To conclude, we think we have a better understanding of the phenomena of IT-
related overload now that we have developed a triangulation approach using both
subjective and objective measurements of overload. However, the landscape is
huge and filled with conceptual and measurement traps. We have learned that
emotional and cognitive overload may depend on the individual’s pool of resour-
ces. Some individuals make more use of either their cognitive aspects (e.g., mental
schemata, personality factors, skills, superchunks) or their emotional aspects (i.e.,
positive arousal, stress, mental strain, anxiety); it is mostly a combination of both
with individuals having their own preferred strategies. In our triangulation efforts,
we learned in a controlled setting how rich the pool of resources is, and how dif-
ficult it is to measure concepts such as overload. Much more research using trian-
gulation will be required to understand IT-related overload as it relates to the
functioning of the human pool of resources. Operationalization of observables and
instruments to gather measurements will be necessary to fully explore this fasci-
nating pool of resources: It is the very same pool that makes us so human and so
different from one another.
7
LEVERAGING THE POSITIVE SIDE OF IT
Bryson (2003) translated Newton’s law of universal gravitation into more readable
language:
if you double the distance between two objects, the attraction between them
becomes four times weaker. This can be expressed with the formula F = G(Mm/r²),
which is of course way beyond anything that most of us could make
practical use of, but at least we can appreciate that it is elegantly compact.
(pp.73–74)
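Bryson’s “four times weaker” follows directly from the inverse-square term in the formula: doubling the distance multiplies the denominator by four.

```latex
F(r) = G\,\frac{Mm}{r^{2}},
\qquad
F(2r) = G\,\frac{Mm}{(2r)^{2}}
      = \frac{1}{4}\,G\,\frac{Mm}{r^{2}}
      = \frac{F(r)}{4}.
```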
Gamification
of the myriad forms of smart technology is smart farming (Wolfert et al., 2017). Big
data and algorithms are affecting the entire food supply chain by providing pre-
dictive insights to farming operations on aspects such as yield models and feed
intake models and by driving real-time operational decisions.
and information may quickly become outdated and dumped in cyberspace, where
individuals lack the ability to simply delete what is no longer pertinent.
Ultimately it is up to individuals to be aware of how their personal information
is being used and to manage this use. However, sometimes this is not possible
because their personal data is stored but not accessible to them. To be more cau-
tious about requests for personal information, Spiekermann-Hoff and Novotny
(2015) suggested the use of software agents that could compare the privacy preferences
stored in the individual’s browser with companies’ privacy policies and
alert the individual if there is a discrepancy. Probably the best-known software
agent solution for individual privacy is W3C’s Platform for Privacy Preferences
Project. Another alternative for sharing personal information in return for services
is to include a level-of-service option. To get a minimum level of service, indivi-
duals need disclose only a minimum level of personal information. To get highly
personalized services with hopefully secure information, a small fee would be
charged. Spiekermann-Hoff and Novotny (2015), among others, have emphasized
the importance of government’s role in setting standards and establishing regulations
to protect individuals’ privacy rights in regard to their personal information.
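The software-agent idea described above can be sketched in a few lines. This is a deliberately minimal illustration of the comparison-and-alert logic, not a description of any actual product; all field names and thresholds below are invented for the example.

```python
# A user's privacy preferences, as they might be stored in the browser.
# All keys and values here are hypothetical.
USER_PREFERENCES = {
    "share_with_third_parties": False,
    "retain_data_days": 90,
    "allow_profiling": False,
}

def check_policy(company_policy: dict) -> list[str]:
    """Compare a company's stated policy against the user's preferences
    and return a human-readable alert for each discrepancy."""
    alerts = []
    if company_policy.get("share_with_third_parties") and not USER_PREFERENCES["share_with_third_parties"]:
        alerts.append("Policy shares data with third parties; you opted out.")
    if company_policy.get("retain_data_days", 0) > USER_PREFERENCES["retain_data_days"]:
        alerts.append("Policy retains data longer than your %d-day limit."
                      % USER_PREFERENCES["retain_data_days"])
    if company_policy.get("allow_profiling") and not USER_PREFERENCES["allow_profiling"]:
        alerts.append("Policy permits behavioural profiling; you opted out.")
    return alerts

# A policy that violates two of the three preferences.
alerts = check_policy({"share_with_third_parties": True,
                       "retain_data_days": 365,
                       "allow_profiling": False})
```

The agent surfaces only the mismatches, so the individual reviews two alerts instead of reading the whole policy, which is the resource-sparing point of the proposal.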
Robotics
can learn to behave in the way its owner wants it to after it is stroked for displaying
a desirable behaviour. Thus, by moving its head and legs, making seal sounds, and
responding to light, sound, and touch, PARO appears to be alive. Research sug-
gests that dementia patients are comforted by PARO’s presence. PARO can help
human caregivers when their emotional resources run low: it is reliable,
trustworthy, and does not suffer from burnout or impatience (Johnston, 2015).
Surely, such difficult questions create juicy business for lawyers and ethicists.
These ethical issues are interesting from a cognitivist perspective. Kohlberg
(1976) proposed the process of moral development based on Piaget’s (1951) stages
of cognitive development in children. Kohlberg’s process of moral development,
which continues throughout the individual’s lifetime, has three levels with two
stages each:
For man, when perfected, is the best of animals, but, when separated from law
and justice, he is the worst of all; since armed injustice is the more dangerous,
and he is equipped at birth with arms, meant to be used by intelligence and
virtue, which he may use for the worst ends.
(Book 1/2)
Brain enhancements
adequate” (p.3). Such statements suggest that incoming information cannot get
through the filter. For example, Outlook’s email filtering facility uses rules that
typically focus on filtering out the emails of certain message senders. However, this
function is perceived either to be too hard to use or useless (Tagg et al., 2009). It
likely is useless because the filter focuses on the wrong thing: the sender of the
message rather than the message’s pertinence to that recipient. The problem is not
at the filtering point, but rather in the brain that does not have the resources to
process the pertinent information which passes through the filter.
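The contrast drawn above, filtering on who sent a message versus whether the message is pertinent to the recipient, can be sketched as follows. This is a naive illustration (keyword matching stands in for any real relevance model, and all names are invented), not a description of Outlook's actual rule engine:

```python
# Sender-based filtering: what the text says typical email rules do.
# Mail from listed senders is dropped regardless of content.
BLOCKED_SENDERS = {"newsletter@example.com"}

def sender_filter(msg: dict) -> bool:
    """Keep the message only if the sender is not blocked."""
    return msg["sender"] not in BLOCKED_SENDERS

# Relevance-based filtering: what the text argues is actually needed.
# A message is kept only if it appears pertinent to this recipient.
MY_TOPICS = {"budget", "deadline", "surgery schedule"}

def relevance_filter(msg: dict) -> bool:
    """Keep the message only if its body mentions a topic I care about."""
    body = msg["body"].lower()
    return any(topic in body for topic in MY_TOPICS)

msg = {"sender": "colleague@example.com",
       "body": "Reminder: the budget deadline is Friday."}
spam = {"sender": "newsletter@example.com",
        "body": "Weekly digest of everything."}
```

A sender rule passes everything from an unlisted address, pertinent or not; a relevance rule at least approximates the judgment the overloaded brain would otherwise have to make itself.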
Another contributor to the myopic view on overload in the current overload
literature is the focus on the limited capacity of the Short-Term Memory and the
neglect of the emotional nature of the signals attached to input delivered by IT
and their effects on information processing. The role of cognitive schemata in
overcoming the limitations of the cognitive architecture, and its consequences
for the decision process, has been considered. We all have been dealing
with overload since birth, and in the process we have developed multiple coping
strategies to deal with situations of insufficient and/or exhausted cognitive resour-
ces. In the 1970s, a simple way to conserve our cognitive resources was to apply
noncompensatory decision rules to simplify or reduce information search (Payne,
1976). Disabling the app or technology, as technostressed Mary did in Chapter 1, is
another ‘old-school’ option.
Recent developments to counter brain overload include the development of
brain chip implants and biochemical enhancements. Nanotechnology, Biotechnol-
ogy, Information Technology, and Cognitive Science have led to radical
enhancements of human abilities. For example, data can be transmitted directly to
the brain when the person with the brain implant is at rest. This drastically
increases the amount of information that can be assimilated each day. Just imagine
how wonderful it would be to wake up and find that you have been infused with
volumes of new information. However, it is not quite that easy. The information
cannot be fully integrated for 30 to 90 days as neurons grow around the chip to
accommodate the new data. Further, the newly chipped person would have to
spend nine full months in the classroom learning how to think in a different (but
mindful) way, process the new data, and categorize it meaningfully into information
(Hamlett, Cobb, & Guston, 2013).
Warren, Leff, Athanasiou, Kennard, and Darzi, (2009) presented an ethical per-
spective on the use of neurocognitive enhancement to improve the function of the
central executive control system in surgeons. Warren et al. (2009) reported that
most research focuses on Transcranial Magnetic Stimulation, brain-machine interfaces,
and neurosurgical implants of devices and tissue, which are designed mostly for
alleviating pathologies such as paralysis (Liepert, 2005; Birbaumer & Cohen, 2007).
The use of such technology is highly desirable in these cases. Warren et al. (2009)
also considered the use of psychopharmacological enhancement on surgeons in
their ethical analysis. Indeed, the administration of drugs has already proven effec-
tive in enhancing pilots’ performance on complex tasks in emergency situations
(Yesavage, Mumenthaler, Taylor, Friedman, O’Hara, Sheikh et al., 2002). There is
Helping ourselves
More than ever we are sensitive about the quality of the food we eat: its origin, its
calories, and its effect on and in our bodies. We read labels on food cans and
packages and look carefully at what we have on our plates. Virtually no one who
could afford to do otherwise would recommend a junk food diet to their child.
However, when considering IT and information junk, we are far from being
responsible. In a world filled with ‘fake news’ and obvious liars who hold public
office, we need to try to be more discerning about the sources of our information
and the relevance of the incoming data for making our lives better and more pro-
ductive. We need to consider mindfully how we are using our technology: Do we
spend too much time viewing screens? Are we becoming addicted to the tech-
nology? Is it changing our life for the worse? Can we use the technology more
efficiently and effectively?
We are constantly faced with the technologies of social saturation, whose potential
for expression and connection can overpopulate the self (i.e., ego)
(Gergen, 1991). Denouncing and rejecting the technologies of social saturation have
become common practices. Going ‘cold turkey’ on SNS or email applications on
our smartphones may indeed be a mindful solution. This is an efficient coping
strategy for sparing one’s resources. In technostressed Mary’s case (Chapter 1), dis-
abling her email work app sounds more like a desperate move to save her work-life
balance. Her mental resources are partially exhausted. Neither the technology nor
the designers are to be blamed. The lack of organizational policies or norms related
to email overload is the culprit. Mary is learning the hard way that, in Frost’s pre-
diction, working 8 hours ultimately leads to working 12 hours (Chapter 5).
Rejecting technology when overloaded is a short-term solution.
‘Going to war’ with technology and warning about the consequences of exces-
sive smartphone usage are salutary acts for software designers such as Justin
Rosenstein (see above) or Tristan Harris (Chapter 1). Rosenstein declared that he
had removed the Facebook app from his smartphone (Lewis, 2017). He had assessed
the impact of his own design on the “attention economy” to be too dangerous.
Rejecting his own technological design (the ‘like’ button) was a path of least resis-
tance. We also did not foresee the dark side of the Online Baby System (Chapter 4).
When technophiles are worried, it may give us more reason to be as well.
It is important to remember that IT-related overload is more than merely input
or output. It is an emotional and cognitive experience encoded in our brains that
helps us decide on the adoption of the next technology. These encoded memories
of past uses of technology can be positive or negative. If we had a bad experience
learning a new technology in the past, then we may be unwilling to try a new
technology now even though it may be very beneficial to us. On the other hand,
we may remember past experiences with new technologies as so easy or pleasant
that we think that we are capable of handling any new technology that is thrown
our way. The end result might be that we spend an inordinate amount of time
adjusting to a technology that is really not helpful to us, or we may try to master so
many new functions (such as those available on our smartphones) that we experi-
ence stress or become addicted. Either way, the result is undesirable, but avoidable
were we to think mindfully about our use of the technology. In some cases,
while learning a new technology may be unpleasant, it may be necessary to stave
off the alternative of being replaced by a machine.
Mindful use of IT may mean that we consider the extent to which we are
choosing instant gratification over a better life. For example, we might choose to
spend hours mesmerized by the screen in front of us rather than taking the time to
develop lasting relationships offline, or to do a better job of completing an assigned
task. Or we may choose to give up valuable personal information in return for
something that is of little worth in the long run.
option on their email systems, or delay installing the latest version of software until
a time when it is really needed. Other organizations are acting responsibly when
they institute flexible work programmes or when they negotiate with employees
about after-hours communications or required versus optional overtime. Still other
organizations are respectful not only of corporate assets, but also of the personal
information of their customers and employees and do not use it to their detriment
or sell it to third-party vendors who will use it detrimentally. Another avenue of
responsibility relates to organizational use of online games. Overusing games can
lead to IT addiction or employee stress from being monitored, thus diminishing
the benefits of their use. While work substitution and automation may make
undeniable economic sense, responsible organizations will ensure that those
employees whose jobs are eliminated are trained to survive in the new digital world.
As we all move forward, organizations should be mindful about how technology can
be used for everyone’s long-term well-being.
Conclusion
I, however, believe that there is at least one philosophical problem in which all
thinking men are interested. It is the problem of cosmology: the problem of
understanding the world—including ourselves, and our knowledge, as part of the world.
All science is cosmology, I believe. (p.15)
In cosmology, the term ‘big bang’, coined by Hoyle in 1952, is in fact an interesting
metaphor since it gives the impression that the universe emerged from a ‘big’
explosion: ‘BANG’. While cosmologists note that the big bang was not an
explosion in the conventional sense of the term, they concede that it was a sudden
expansion on a colossal scale (Bryson, 2003). Just as the language of science evolves
constantly, so do technologies. Advances in fields such as particle physics,
astrophysics, and quantum mechanics inform the standard model of cosmology,
with its language of dark matter, dark energy, and black holes. Cosmologists also try to
understand the primordial Big Bang-Genesis supervenience problem, in which the
scientific explanation of the beginning of the world supervenes on the story of
creation in the Bible. The primary drive, or intrinsic intellectual reward, of a physical
cosmologist is obviously beyond the sky! When considering and pondering the
bright and dark sides of IT, we also may be studying what Popper
refers to as “cosmology”.
Revolutionary research
An issue in cosmology is measurement. In Chapter 6, we talked about the
challenges of measuring IT-related overload. Researchers will run into similar
challenges in measuring variables in studies of new technologies. Perhaps neuroscience
approaches and equipment like thermal imaging cameras will become more refined
and more affordable to assist in this endeavour.
Griffiths (2015) described a second cognitive revolution, driven by computation and
big data, that offers an alternative to merely analyzing behaviour, especially in small-
scale laboratory studies. For example, rather than using big data gathered from online
websites merely for collaborative filtering, the same data could be used to understand
how human minds work. Collaborative filtering predicts individuals’ purchases from the
similarity of their behaviour to that of others. A cognitive approach would instead consider
the way preferences, semantic representations, and categorization combine into
complex models of human cognition. Computation using big data would then
allow researchers to formulate more precise hypotheses about how exactly the
mind works and about the consequences of these cognitions for behaviour. This
knowledge could extend cognitive science and inform the research of computer scientists
trying to understand how individuals use new technologies.
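The mechanism Griffiths contrasts with cognitive modelling, collaborative filtering, can be sketched in a few lines of Python. The example below is purely illustrative (the users, items, and binary purchase histories are invented): it scores an unseen item for one user by weighting other users’ behaviour on that item by how similar their overall behaviour is to that user’s.

```python
import math

# Invented purchase histories (1 = bought the item, 0 = did not).
purchases = {
    "ann":  [1, 1, 0, 1, 0],
    "ben":  [1, 1, 0, 0, 0],
    "cara": [0, 0, 1, 0, 1],
}

def cosine(u, v):
    """Cosine similarity between two purchase vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def predict(user, item):
    """Score `item` for `user`: a similarity-weighted average of what
    the other users did with that item (similarities are non-negative
    here because the toy data is binary)."""
    others = [(cosine(purchases[user], purchases[o]), purchases[o][item])
              for o in purchases if o != user]
    total = sum(sim for sim, _ in others)
    return sum(sim * bought for sim, bought in others) / total if total else 0.0
```

On this toy data, ann’s history closely resembles ben’s, so ann’s purchase of item 3 drives `predict("ben", 3)` toward 1, while the dissimilar cara contributes nothing. The point of the contrast in the text is that such a model exploits the covariation in behaviour without saying anything about why preferences covary, which is precisely what a cognitive model would add.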
In addition to revolutionizing the way that cognitive research is conducted, big
data could also influence research methodologies across a range of disciplines (e.g.,
information systems, management, computer science, psychology, and
communications). Gone will be surveys of 100 to 500 respondents. In fact, even the use
of surveys of thousands of participants may be on its way out. How can data gathered
Theory will still be very helpful in the world of big data, but the “age-old search
for causality” may be abandoned in a world of research where “mere correlations
suffice” (Sax, 2016, p. 26).
foundations from which to tackle big data. We should not be content to surf on
the wave of new technologies built and designed for the profit of a few. The race
against the machine, or the robot, is not lost! It is just starting. We are now more
knowledgeable and aware of the dark side of IT. Shall we apply a form of mindful
optimism?
GLOSSARY
Second brain relates to the gut-feelings transmitted via the stomach, esophagus,
small intestine, and colon to the CNS; also known as the mind-gut
connection.
Self-serving attribution bias the tendency to attribute success to dispositional
attributes and failure to situational ones; fundamental attribution error.
Semantic Memory part of explicit memory in Long-Term Memory that acts as a
mental thesaurus.
Short-Term Memory (STM) short-term storage and attentional system in the
form of a single limited-capacity memory; limited to 7 ± 2 bits of
information.
Social Networking System (SNS) social media systems such as Facebook,
LinkedIn, and Instagram.
Social presence the degree to which a medium allows an individual to establish a
personal connection with others that resembles face-to-face interaction.
Subjective focusing on the subject, the psychological states of subjective
experiences; Popper’s World 2.
Superchunk comprised of first-order chunks which are combined in levels so that
they require less effort to store in memory and also make the information
easier to remember.
Supervenience the ontological relation that occurs when upper-level system
properties are determined by their lower-level properties.
System feature overload the state that occurs when the technology an individual
has to use to complete a task is too complex for the task and for the individual.
Technology of social saturation reflects the media potential for expression and
connection to overpopulate the self (i.e., ego).
Technophilia a form of overidentification with technology that leads to a
dissolution of human-technology boundaries.
Technophobia the struggle to accept computer technology.
Technostress type of stress experienced in organizations by technology end users
as a result of their inability to cope with the demands of organizational
computer usage.
Triangulation a research strategy that uses a multitrait-multimethod approach, or
convergent validation, with the goal of ensuring that the results of research are
not the product of methodological artefact (i.e., observation is cross-verified
from two or more sources).
Valence a positive or negative emotional tag attached to events and concepts that
were activated in association with prior experience of the related emotional
tag.
Variable an observable entity which is capable of assuming two or more values.
Work an “ongoing, often unending stream of meaningful activities that [allow] the
worker to fulfil a distinct role” (Pearlson et al., 2016, p.77).
Work-family conflict situation that occurs when the time and energy demands
of one set of roles (i.e., work or family) make it difficult to fulfill the demands
of another.
REFERENCES
Abaker, I., Hashem, T., Chang, V., Anuar, N.B., Adewole, K., Yaqoob, I., Gani, A.,
Ahmed, E., & Chiroma, H. (2016). The role of big data in smart city. International Journal
of Information Management, 36(5), 748–758.
Agence France-Presse (2016). French workers win legal right to avoid checking work email
out-of-hours. The Guardian (December 31). Available at: https://www.theguardian.
com/money/2016/dec/31/french-workers-win-legal-right-to-avoid-checking-work-ema
il-out-of-hours (accessed December 5, 2017).
Ahuja, M.K., Chudoba, K.M., Kacmar, C.J., McKnight, D.H., & George, J.F. (2007). IT
road warriors: Balancing work-family conflict, job autonomy, and work overload to
mitigate turnover intentions. MIS Quarterly, 31(1), 1–17.
Ahuja, M.K., & Thatcher, J.B. (2005). Moving beyond intentions and toward the theory of
trying: Effects of work environment and gender on post-adoption information technol-
ogy use. MIS Quarterly, 29(3), 427–459.
Aljukhadar, M., Senecal, S., & Daoust, C.E. (2012). Using recommendation agents to cope
with information overload. International Journal of Electronic Commerce, 17(2), 41–70.
Allen, D.K., & Shoard, M. (2005). Spreading the load: Mobile information and commu-
nications technologies and their effect on information overload. Information Research, 10
(2), article 227.
Anderson, J.R., & Bower, G.H. (1973). Human associative memory. Washington, DC:
Winston.
Arbesman, S. (Ed.) (2013). The half-life of facts: Why everything we know has an expiration date.
New York: Penguin.
Aristotle (350 BCE). Politics. Translated by Benjamin Jowett (Public domain). Available at:
http://pinkmonkey.com/dl/library1/gp017.pdf (accessed December 13, 2017).
Arnetz, B.B., & Wiholm, C. (1997). Stress: Psychophysiological symptoms in modern offi-
ces. Journal of Psychosomatic Research, 43(1), 35–42.
Arora, S., Hull, L., Sevdalis, N., Tierney, T., Nestel, D., Woloshynowych, M., Darzi, A. &
Kneebone, R. (2010). Factors compromising safety in surgery: Stressful events in the
operating room. The American Journal of Surgery, 199(1), 60–65.
Berguer, R., Smith, W.D., & Chung, Y.H. (2001). Performing laparoscopic surgery is sig-
nificantly more stressful for the surgeon than open surgery. Surgical Endoscopy, 15(10),
1204–1207.
Bettman, J.R., Johnson, E., & Payne, J. (1990). A componential analysis of cognitive effort
and choice. Organizational Behavior and Human Decision Processes, 45(1), 111–139.
Binet, A. & Simon, T. (1904). Méthodes nouvelles pour le diagnostic du niveau intellectuel
des anormaux. L’année psychologique, 11, 191–244.
Birbaumer, N., & Cohen, L.G. (2007). Brain-computer interfaces: Communication and
restoration of movement in paralysis. The Journal of Physiology, 579(3), 621–636.
Block, J.J. (2008). Issues for DSM-V: Internet addiction. The American Journal of Psychiatry,
165(3), 306–307.
Bluedorn, A.C. (Ed.) (2002). The human organization of time: Temporal realities and experience.
Stanford, CA: Stanford University Press.
Boden, M.A. (Ed.) (1977). Artificial intelligence and natural man. New York: Basic Books.
Bolino, M.C., & Turnley, W.H. (2005). The personal costs of citizenship behavior: The
relationship between individual initiative and role overload, job stress, and work-family
conflict. Journal of Applied Psychology, 90(4), 740–748.
Bouzida, N., Bendada, A., & Maldague, X. (2009). Visualization of human body thermo-
regulation by infrared imaging. Journal of Thermal Biology, 34(3), 120–126.
Bower, G.H. (1981). Mood and memory. American Psychologist, 36(2), 139–148.
Bower, G.H. (1991). Mood congruity of social judgments. In J.P. Forgas (Ed.), Emotion and
social judgments (pp.31–54). Oxford: Pergamon Press.
Bower, G.H. (2001). Mood as a resource in processing self-relevant information. In J.P.
Forgas (Ed.), Handbook of affect and social cognition (pp.256–272). Mahwah, NJ: Lawrence
Erlbaum.
Bradshaw, J.L. (1968). Load and pupillary changes in continuous processing tasks. British
Journal of Psychology, 59(3), 265–271.
Broadbent, D. (1958). Perception and communication. London: Pergamon Press.
Bridges, W. (1994). Job shift: How to prosper in a workplace without jobs. Reading, MA:
Addison-Wesley.
Brod, C. (1984). Technostress: The human cost of the computer revolution. Reading, MA:
Addison-Wesley.
Brooks, S., Longstreet, P., & Califf, C. (2017). Social media induced technostress and its
impact on Internet addiction: A distraction-conflict theory perspective. Association of
Information System Transactions on Human-Computer Interaction, 9(2), 99–122.
Brynjolfsson, E., & McAfee, A. (2011). Race against the machine: How the digital revolution is
accelerating innovation, driving productivity, and irreversibly transforming employment and the
economy. Lexington, MA: Digital Frontier Press.
Bryson, B. (2003). A short history of nearly everything. New York: Broadway Books.
Buffardi, L.E., & Campbell, W.K. (2008). Narcissism and social networking web sites.
Personality and Social Psychology Bulletin, 34(10), 1303–1314.
Bureau of Labor Statistics (2017). Occupational outlook handbook. Available at: https://www.
bls.gov/ooh/transportation-and-material-moving/ (accessed December 30, 2017).
Cacioppo, J.T., & Petty, R.E. (1982). The need for cognition. Journal of Personality and Social
Psychology, 42(1), 116–131.
Cacioppo, J.T., Petty, R.E., Feinstein, J.A., & Jarvis, B.G. (1996). Dispositional differences
in cognitive motivation: The life and times of individuals varying in need for cognition.
Psychological Bulletin, 119(2), 197–253.
Calhoun, C.J. (2002). Classical sociological theory. Malden, MA: Blackwell.
Cao, C.G., Zhou, M., Jones, D.B., & Schwaitzberg, S.D. (2007). Can surgeons think and
operate with haptics at the same time? Journal of Gastrointestinal Surgery, 11(11), 1564–1569.
Campbell, D.T., & Fiske, D.W. (1959). Convergent and discriminant validation by the
multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81–105.
Cannon, W.B. (1914). The interrelations of emotions as suggested by recent psychological
researches. American Journal of Psychology, 25(2), 256–282.
Cannon, W.B. (1927). The James-Lange theory of emotions: A critical examination and an
alternative theory. American Journal of Psychology, 39, 106–124.
Cannon, W.B. (Ed.) (1929). Bodily changes in pain, hunger, fear and rage. New York: D.
Appleton and Co.
Cannon, W.B. (Ed.) (1932). The wisdom of the body. New York: Norton.
Caplan, S.E. (2002). Problematic Internet use and psychosocial well-being: Development of
a theory-based cognitive-behavioral measurement instrument. Computers in Human Beha-
vior, 18(5), 553–575.
Caplan, S.E. (2003). Preference for online social interaction: A theory of problematic
Internet use and psychosocial well-being. Communication Research, 30(6), 625–648.
Caplan, S.E. (2007). Relations among loneliness, social anxiety, and problematic Internet
use. CyberPsychology & Behavior, 10(2), 234–242.
Caplan, S.E. (2010). Theory and measurement of generalized problematic Internet use: A
two-step approach. Computers in Human Behavior, 26(5), 1089–1097.
Carlson, N., & Buskist, W. (1997). Psychology: The science of behavior. Boston, MA: Allyn and
Bacon.
Carpenter, C.J. (2012). Narcissism on Facebook: Self-promotional and anti-social behavior.
Personality and Individual Differences, 52(4), 482–486.
Carr, N. (2017). How smartphones hijack our minds. Wall Street Journal (October 6). Avail-
able at: https://www.wsj.com/articles/how-smartphones-hijack-our-minds-1507307811
(accessed October 26, 2017).
Carvalho, S., Cunha, E., Sousa, C., & Matsuzawa, T. (2008). Chaînes opératoires and
resource exploitation strategies in chimpanzee (Pan troglodytes) nut cracking. Journal of
Human Evolution, 55(1), 148–163.
Cellan-Jones, R. (2014). Stephen Hawking warns artificial intelligence could end mankind.
The BBC (December 2). Available at: www.bbc.com/news/technology-30290540
(accessed June 12, 2017).
Chambers, A.B., & Nagel, D.C. (1985). Pilots of the future: Human or computer? Com-
munications of the ACM, 28(11), 74–87.
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the format of instruction.
Cognition and Instruction, 8(4), 293–332.
Chang, S.L., & Ley, K. (2006). A learning strategy to compensate for cognitive overload in
online learning: Learner use of printed online material. Journal of Interactive Online Learning,
5(1), 104–117.
Chen, Y.C., Shang, R.A., & Kao, C.Y. (2009). The effects of information overload on
consumers’ subjective state towards buying decision in the Internet shopping environ-
ment. Electronic Commerce Research and Applications, 8(1), 48–58.
Chervany, N., & Dickson, G. (1974). An experimental evaluation of information in a pro-
duction environment. Management Science, 20(10), 1335–1344.
Chewning, E.G., & Harrell, A.M. (1990). The effect of information load on decision
makers’ cue utilization levels and decision quality in a financial distress decision task.
Accounting, Organizations and Society, 15(6), 527–542.
Chialastri, A. (2012). Automation in aviation. In F. Kongoli (Ed.), Aviation (pp.79–102).
Rijeka, Croatia: InTech.
Chou, C., & Hsiao, M.C. (2000). Internet addiction, usage, and gratifications: The Taiwan
college students’ case. Computers and Education, 35(1), 65–80.
Christensen, C.M., Bohmer, R., & Kenagy, J. (2000). Will disruptive innovations cure
health care? Harvard Business Review, 78(5), 102–112,199.
Clark, P.A. (1985). A review of the theories of time and structure for organizational sociol-
ogy. Research in the Sociology of Organizations, 4, 35–97.
Cobos, P., Sanchez, M., Garcia, C., Vera, M.N., & Vila, J. (2002). Revisiting the James
versus Cannon debate on emotion: Startle and autonomic modulation in patients with
spinal cord injuries. Biological Psychology, 61(3), 251–269.
Cohen, A.R., Stotland, E., & Wolfe, D.M. (1955). An experimental investigation of need
for cognition. Journal of Abnormal and Social Psychology, 51(2), 291–294.
Cohen, K.N., & Clark, J.A. (1984). Transitional object attachments in early childhood and per-
sonality characteristics in later life. Journal of Personality and Social Psychology, 46(1), 106–111.
Colbert, A., Yee, N., & George, G. (2016). The digital workforce and the workplace of the
future. Academy of Management Journal, 59(3), 731–739.
Cook, G.J. (1993). An empirical investigation of information search strategies with implica-
tions for decision support system design. Decision Sciences, 24(3), 683–699.
Craik, F.I.M., & Lockhart, R.S. (1972). Levels of processing: A framework for memory
research. Journal of Verbal Learning and Verbal Behavior, 11(6), 671–684.
Cross, R., & Gray, P. (2013). Where has the time gone? California Management Review, 56(1),
50–66.
Cross, R., Rebele, R., & Grant, A. (2016). Collaborative overload. Harvard Business Review,
94(1), 74–79.
Csikszentmihalyi, M. (1975). Beyond boredom and anxiety. San Francisco, CA: Jossey-Bass.
Csikszentmihalyi, M., & Csikszentmihalyi, I. (1988). Optimal experience. Cambridge, UK:
Cambridge University Press.
Dhar, R. (1996). The effect of decision strategy on deciding to defer choice. Journal of
Behavioral Decision Making, 9(4), 265–281.
Damasio, A.R. (1994). Descartes’ error: Emotion, reason and the human brain. New York: Putnam.
Damasio, A., & Van Hoesen, G.W. (1983). Emotional disturbances associated with focal
lesions of the limbic frontal lobe. In K. Heilman & P. Satz (Eds), The neuropsychology of
human emotion: Recent advances (pp.85–110). New York: The Guilford Press.
Daniels, K. (2008). Affect and information processing. In G.P. Hodgkinson and W.H. Star-
buck (Eds), Oxford handbook of organizational decision-making (pp.325–341). Oxford:
Blackwell.
D’Arcy, J., Gupta, A., Tarafdar, M., & Turel, O. (2014). Reflecting on the “dark side” of
information technology use. Communications of the Association for Information Systems, 35,
article 5.
D’Arcy, J., Herath, T., & Shoss, M.K. (2014). Understanding employee responses to stressful
information security requirements: A coping perspective. Journal of Management Information
Systems, 31(2), 285–318.
Darwin, C. (Ed.) (1859). On the origin of species by means of natural selection, or the preservation of
favoured races in the struggle for life. New York: D. Appleton and Co.
Darwin, C. (Ed.) (1871). The descent of man, and selection in relation to sex. London: Murray.
Davila, J., Hershenberg, R., Feinstein, B.A., Gorman, K., Bhatia, V., & Starr, L.R. (2012).
Frequency and quality of social networking among young adults: Associations with
depressive symptoms, rumination, and co-rumination. Psychology of Popular Media Culture,
1(2), 72–86.
Davis, R.A. (2001). A cognitive-behavioral model of pathological Internet use. Computers in
Human Behavior, 17(2), 187–195.
Davis, R.A., Flett, G.L., & Besser, A. (2002). Validation of a new scale for measuring pro-
blematic Internet use: Implications for pre-employment screening. CyberPsychology and
Behavior, 5(4), 331–345.
Dawson, M.E., Schell, A.M., & Filion, D.L. (1990). The electrodermal response system. In
J.T. Cacioppo & L.G. Tassinary (Eds), Principles of psychophysiology: Physical, social and
inferential elements (pp.295–324). Cambridge: Cambridge University Press.
Denning, P.J. (1982). Electronic junk. Communications of the ACM, 25(3), 163–165.
Denzin, N.K. (1978). The research act (2nd ed.). New York: McGraw-Hill.
Descartes, R. (1644). Principles of philosophy. Amsterdam: Louis Elzevir.
Deutsch, J., & Deutsch, D. (1963). Attention: Some theoretical considerations. Psychological
Review, 70(1), 80–90.
DeWall, N.C., Buffardi, E.L., Bonser, I., & Campbell, W.K. (2011). Narcissism and implicit
attention seeking: Evidence from linguistic analyses of social networking and online pre-
sentation. Personality and Individual Differences, 51(1), 57–62.
DeYoung, C.G., Hirsh, J.B., Shane, M.S., Papademetris, X., Rajeevan, N., & Gray, J.R.
(2010). Testing predictions from personality neuroscience: Brain structure and the Big
Five. Psychological Science, 21(6), 820–828.
DeYoung, C.G., Peterson, J.B., & Higgins, D.M. (2005). Sources of openness/intellect:
Cognitive and neuropsychological correlates of the fifth factor of personality. Journal of
Personality, 73(4), 825–858.
Dickson, G.W., Senn, J.A., & Chervany, N.L. (1977). Research in management information
systems: The Minnesota experiments. Management Science, 23(9), 913–924.
Diderot, D. (1818–1819). Oeuvres de Denis Diderot (11 vols). Paris: J.L.J. Brière.
Dimoka, A. (2011). Brain mapping of psychological processes with psychometric scales: An
fMRI method for social neuroscience. NeuroImage, 54(Suppl. 1), S263–S271.
Dimoka, A., Banker, R.D., Benbasat, I., Davis, F.D., Dennis, A.R., Gefen, D., Gupta, A.,
Ischebeck, A., Kenning, P., Pavlou, P., Müller-Putz, G., Riedl, R., vom Brocke, J., &
Weber, B. (2012). On the use of neurophysiological tools in IS research: Developing a
research agenda for NeuroIS. MIS Quarterly, 36(3), 679–702.
Dimoka, A., Pavlou, P.A., & Davis, F. (2011). NeuroIS: The potential of cognitive neu-
roscience for information systems research. Information Systems Research, 22(4), 687–702.
Donaldson, S.I., & Grant-Vallone, E.J. (2002). Understanding self-report bias in organiza-
tional behavior research. Journal of Business and Psychology, 17(2), 245–260.
Dossey, L. (2014). FOMO, digital dementia, and our dangerous experiment. Explore: The
Journal of Science and Healing, 10(2), 69–73.
Drouin, M., Kaiser, D.H., & Miller, D.A. (2012). Phantom vibrations among under-
graduates: Prevalence and associated psychological characteristics. Computers in Human
Behavior, 28(4), 1490–1496.
Dunahoo, C.L., Hobfoll, S.E., Monnier, J., Hulsizer, M.R., & Johnson, R. (1998). There’s
more than rugged individualism in coping. Part 1: Even the Lone Ranger had Tonto.
Anxiety, Stress & Coping, 11(2), 137–165.
Duxbury, L.E., & Higgins, C.A. (2001). Work-life balance in the new millennium: Where are we?
Where do we need to go? Ottawa, ON: Canadian Policy Research Networks.
Ebbinghaus, H. (1913/1885). Memory: A contribution to experimental psychology. New York:
Columbia Teachers’ College.
ECRI Institute (2012). Top 10 health technology hazards for 2013. Health Devices, 41(11),
1–23. Available at: www.ecri.org/2013hazards (accessed December 13, 2017).
ECRI Institute (2013). Top 10 health technology hazards for 2014. Health Devices, 42(11),
354–380. Available at: www.healthit.gov/facas/sites/faca/files/STF_Top_Ten_Tech_Hazards_
2014-06-13.pdf (accessed December 13, 2017).
ECRI Institute (2014). Top 10 health technology hazards for 2015. Health Devices, 43(11),
1–31. Available at: https://www.ecri.org/press/Pages/ECRI-Institute-Announces-Top-
10-Health-Technology-Hazards-for-2015.aspx (accessed December 13, 2017).
148 References
Edmunds, A. & Morris, A. (2000). The problem of information overload in business orga-
nisations: A review of the literature. International Journal of Information Management, 20(1),
17–28.
Ekman, P. (1984). Expression and the nature of emotion. In K.R. Scherer & E. Ekman
(Eds), Approaches to emotion (pp.319–344). Hillsdale, NJ: Lawrence Erlbaum.
Eppler, M.J., & Mengis, J. (2004). The concept of information overload: A review of lit-
erature from organization science, accounting, marketing, MIS, and related disciplines.
The Information Society, 20(5), 325–344.
Ernst, M., & Paulus, M.P. (2005). Neurobiology of decision making: A selective review
from a neurocognitive and clinical perspective. Biological Psychiatry, 58(8), 597–604.
Eysenck, M.W., & Eysenck, H.J. (1980). Mischel and the concept of personality. British
Journal of Psychology, 71(2), 191–204.
Farhoomand, A.F., & Drury, D.H. (2002). Managerial information overload. Communications
of the ACM, 45(10), 127–131.
Festinger, L., & Carlsmith, J.M. (1959). Cognitive consequences of forced compliance. The
Journal of Abnormal and Social Psychology, 58(2), 203–210.
Fleeson, W. (2001). Towards a structure- and process-integrated view of personality:
Traits as density distributions of states. Journal of Personality and Social Psychology, 80(6),
1011–1027.
Foehr, U.G. (2006). Media multitasking among American youth: Prevalence, predictors, and pairings.
Menlo Park, CA: The Henry J. Kaiser Family Foundation.
Folkman, S., & Lazarus, R.S. (1988). The relationship between coping and emotion:
Implications for theory and research. Social Science & Medicine, 26(3), 309–317.
Forbes, B.C. (1921). Why do so many men never amount to anything? [Interview with
Thomas Edison]. American Magazine, 91, 10–11,85–86,89.
Forgas, J.P. (2003). Affective influences on attitudes and judgments. In R.J. Davidson, K.R.
Scherer, & H.Goldsmith (Eds), Handbook of affective sciences (pp.596–618). Oxford: Oxford
University Press.
Forsyth, D.R. (1980). The functions of attributions. Social Psychology Quarterly, 43(2),
184–189.
Frank, R.H. (Ed.) (1988). Passions within reasons: The strategic role of the emotions. New York:
Norton.
Fredette, J., Marom, R., Steinert, K., & Witters, L. (2012). The promise and peril of
hyperconnectivity for organizations and societies. In S. Dutta & B. Bilbao-Osorio (Eds),
The global information technology report 2012: Living in a hyperconnected world (pp.113–119).
Geneva, Switzerland: World Economic Forum.
Freedman, D.A. (2010). Statistical models and causal inference: A dialogue with the social
sciences. Edited by D. Collier, J.S. Sekhon, & P.B. Stark. New York: Cambridge University
Press.
Freud, S. (1894). The neuro-psychoses of defence. In J. Strachey (Ed.), The standard edition of
the complete works of Sigmund Freud (Vol.3, pp.41–61). London: Hogarth.
Freud, S. (Ed.) (1927). The ego and the id. London: Hogarth.
Frey, C., & Osborne, M. (2013). The future of employment: How susceptible are jobs to compu-
terisation? Oxford: University of Oxford.
Frijda, N.H. (1986). The emotions. New York: Cambridge University Press.
Frijda, N.H. (1994). Emotions are functional most of the time. In P. Ekman and R.J.
Davidson (Eds), The nature of emotion: Fundamental questions (pp.112–122). New York:
Oxford University Press.
Galbraith, J.R. (1974). Organization design: An information processing view. Interfaces, 4(3),
28–36.
Galton, F. (1892). Hereditary genius: An inquiry into its laws and consequences. London: Mac-
Millan and Co.
Gardner, H. (Ed.) (1987). The mind’s new science: A history of the cognitive revolution. New
York: Basic Books.
Gergen, K.J. (1991). The saturated self: Dilemmas of identity in contemporary life. New York:
Basic Books.
Gershon, M. (1999). The second brain: A groundbreaking new understanding of nervous disorders of
the stomach and intestine. New York: Harper Perennial.
Ghashghaei, H.T., & Barbas, H. (2002). Pathways for emotion: Interactions of prefrontal and
anterior temporal pathways in the amygdala of the rhesus monkey. Neuroscience, 115(4),
1261–1279.
Glusac, E. (2016). The challenge to unplug. The New York Times (Sept 18), p.2.
Goeders, N.E. (2003). The impact of stress on addiction. European Neuropsychopharmacology,
13(6), 435–441.
Goldin, C., & Katz, L. (2009). The race between education and technology: The evolution
of US educational wage differentials, 1890 to 2005. In The race between education and tech-
nology (pp.287–323). Cambridge, MA: Harvard University Press. [Working paper version
available at: www.nber.org/papers/w12984]
Goldschmidt, P.G. (2005). HIT and MIS: Implications of health information technology and
MIS. Communications of the ACM, 48(10), 68–74.
Golson, J. (2016). Tesla driver killed in crash with Autopilot active, NHTSA investigating.
The Verge (June 30). Available at: https://www.theverge.com/2016/6/30/12072408/tesla
-autopilot-car-crash-death-autonomous-model-s (accessed December 13, 2017).
Goodman, M. (2015). Future crimes: Everything is connected, everyone is vulnerable and what we
can do about it. New York: Doubleday.
Goonetilleke, R.S. & Luximon, Y. (2010). The relationship between monochronicity,
polychronicity and individual characteristics. Behaviour and Information Technology, 29(2),
187–198.
Goos, M., & Manning, A. (2007). Lousy and lovely jobs: The rising polarization of work in
Britain. The Review of Economics and Statistics, 89(1), 118–133.
Gretzel, U., Sigala, M., Xiang, Z., & Koo, C. (2015). Smart tourism: Foundations and
developments. Electronic Markets, 25(3), 179–188.
Griffiths, T.L. (2015). Manifesto for a new (computational) cognitive revolution. Cognition,
135, 21–23.
Grinbaum, A., & Groves, C. (2013). What is responsible about responsible innovation?
Understanding the ethical issues. In R Owen, J. Bessant & M. Heintz (Eds), Responsible
innovation: Managing the responsible emergence of science and innovation in society (pp.119–139).
Chichester, UK: Wiley & Sons.
Grise, M.-L., & Gallupe, R.B. (1999–2000). Information overload: Addressing the
productivity paradox in face-to-face electronic meetings. Journal of Management Information
Systems, 16(3), 157–185.
Gross, J.J. (Ed.) (2007). Handbook of emotion regulation. New York: Guilford.
Groysberg, B., & Abrahams, R. (2014). Manage your work, manage your life. Harvard Business
Review, 92(3), 58–66.
Gutek, B.A. (1983). Changing boundaries. In J. Zimmerman (Ed.), The technological woman:
Interfacing with tomorrow (pp.157–172). New York: Praeger.
Hall, E.T. (1983). The dance of life. New York: Anchor.
Hallowell, E.M. (2005). Overloaded circuits: Why smart people underperform. Harvard
Business Review, 83(1), 54–62.
Hamilton, J.E., & Hancock, P.A. (1986). Robotics safety: Exclusion guarding for industrial
operations. Journal of Occupational Accidents, 8(1–2), 69–78.
Hamlett, P., Cobb, M.D., & Guston, D.H. (2013). National citizens’ technology forum:
Nanotechnologies and human enhancement. In S.A. Hays, J.S. Robert, C.A. Miller, & I.
Bennett (Eds), Nanotechnology, the brain, and the future (pp.265–283). Dordrecht: Springer
Netherlands.
Hancock, J., Gee, K., Ciaccio, K., & Mae-Hwah Lin, J. (2008). I’m sad you’re sad: Emo-
tional contagion in CMC. Proceedings of the 2008 ACM Conference on Computer Supported
Cooperative Work (pp.295–298). New York: ACM.
Hancock, P.A. (2013). Automation: how much is too much? Ergonomics, 57(3), 449–454.
Hancock, P.A., Billings, D.R., & Schaefer, K. E. (2011). Can you trust your robot? Ergo-
nomics in Design, 19(3), 24–29.
Hassard, J. (1996). Images of time in work and organization. In S.R. Clegg, C. Hardy, & W.
R. Nord (Eds), Handbook of Organization Studies, (pp.581–598). London: Sage.
Haugtvedt, C.P., Petty, R.E., Cacioppo, J.T., & Steidley, T. (1988). Personality and ad
effectiveness: Exploring the utility of need for cognition. Advances in Consumer Research,
15, 209–212.
Heider, F. (1946). Attitudes and cognitive organization. The Journal of Psychology, 21(1),
107–112.
Heimer, L., & Van Hoesen, G.W. (2006). The limbic lobe and its output channels: Impli-
cations for emotional functions and adaptive behavior. Neuroscience Biobehavior Review, 30
(2), 126–147.
Hemmerling, T.M., & Taddei, R. (2011). Robotic anesthesia: A vision for the future of
anesthesia. Translational Medicine, 1(1), 1–20.
Hemmerling, T.M., Taddei, R., Wehbe, M., Zaouter, C., Cyr, S., & Morse, J. (2012). First
robotic tracheal intubations in humans using the Kepler intubation system. British Journal
of Anaesthesia, 108(6), 1011–1016.
Hess, E.H., & Polt, J.M. (1964). Pupil size in relation to mental activity during simple
problem solving. Science, 143(3611), 1190–1192.
Hiltz, S.R., & Turoff, M. (1985). Structuring computer-mediated communication systems to
avoid information overload. Communications of the ACM, 28(7), 680–688.
Hobfoll, S.E. (1989). Conservation of resources: A new attempt at conceptualizing stress.
American Psychologist, 44(3), 513–524.
Hobfoll, S.E. (2002). Social and psychological resources and adaptation. Review of General
Psychology, 6(4), 307–324.
Hobfoll, S.E. (2011). Conservation of resource caravans and engaged settings. Journal of
Occupational and Organizational Psychology, 84(1), 116–122.
Hobfoll, S.E., & Freedy, J. (1993). Conservation of resources: A general stress theory applied
to burnout. In W.B. Schaufeli, C. Maslach, & T. Marek (Eds), Professional burnout: Recent
developments in theory and practice (pp.115–133). Washington, DC: Routledge.
Huber, G.P. (1983). Cognitive style as a basis for MIS and DSS designs: Much ado about
nothing? Management Science, 29(5), 567–579.
Hull, C.L. (Ed.) (1943). Principles of behavior: An introduction to behavior theory. Oxford, UK:
D. Appleton-Century.
Huysmans, J.H. (1970). The effectiveness of the cognitive-style constraint in implementing
operations research proposals. Management Science, 17(1), 92–104.
Ipsos MediaCT & Wikia (2013). Generation Z: A look at the technology and media habits of today’s teens. Available at: www.wikia.com/Generation_Z:_A_Look_at_the_Technology_and_Media_Habits_of_Today%E2%80%99s_Teens (accessed September 20, 2017).
Iselin, E.R. (1988). The effects of information load and information diversity on
decision quality in a structured decision task. Accounting, Organizations and Society, 13
(2), 147–164.
References 151
Iselin, E.R. (1993). The effects of the information and data properties of financial ratios and statements on managerial decision quality. Journal of Business Finance & Accounting, 20(2), 249–266.
Isidore, C., & Luhby, T. (2015). Turns out Americans work really hard… but some want to
work harder. CNN Money (July 9). Available at: http://money.cnn.com/2015/07/09/
news/economy/americans-work-bush/index.html (accessed October 26, 2017).
Jackson, T.W., & Farzaneh, P. (2012). Theory-based model of factors affecting information
overload. International Journal of Information Management, 32(6), 523–532.
Jacoby, J. (1984). Perspectives on information overload. The Journal of Consumer Research,
10(4), 432–435.
Jacoby, J., Speller, D., & Kohn-Berning, C. (1975). Constructive criticism and programmatic
research: Reply to Russo. Journal of Consumer Research, 2(2), 154–156.
Jaeggi, S.M., Buschkuehl, M., Etienne, A., Ozdoba, C., Perrig, W.J., & Nirkko, A.C.
(2007). On how high performers keep cool brains in situations of cognitive overload.
Cognitive, Affective & Behavioral Neuroscience, 7(2), 75–89.
Jakimowicz, J., & Cuschieri, A. (2005). Time for evidence-based minimal access surgery
training: Simulate or sink. Surgical Endoscopy, 19(12), 1521–1522.
James, W. (1884). What is an emotion? Mind, 9(34), 188–205.
James, W. (Ed.) (1890). The principles of psychology. New York: Holt.
James, W. (1894). The physical basis of emotion. Psychological Review, 1, 516–529.
Jaques, E. (1982). The form of time. New York: Crane Russak.
Jasperson, J., Carter, P.E., & Zmud, R.W. (2005). A comprehensive conceptualization of
post-adoptive behaviors associated with information technology enabled work systems.
MIS Quarterly, 29(3), 525–557.
Jick, T.D. (1979). Mixing qualitative and quantitative methods: Triangulation in action.
Administrative Science Quarterly, 24(4), 602–611.
Johnston, A. (2015). Robotic seals comfort dementia patients but raise ethical issues. KALW Local Public Radio (August 17). Available at: http://kalw.org/post/robotic-seals-comfort-dementia-patients-raise-ethical-concerns#stream/0 (accessed December 13, 2017).
Jones, E.E., & Harris, V.A. (1967). The attribution of attitudes. Journal of Experimental Social
Psychology, 3(1), 1–24.
Jones, Q., Ravid, G., & Rafaeli, S. (2004). Information overload and the message dynamics
of online interaction spaces: A theoretical model and empirical exploration. Information
Systems Research, 15(2), 194–210.
Junco, R. (2013). Comparing actual and self-reported measures of Facebook use. Computers
in Human Behavior, 29(3), 626–631.
Jutai, J.W., & Hare, R.D. (1983). Psychopathy and selective attention during performance of
a complex perceptual-motor task. Psychophysiology, 20(2), 146–151.
Kahneman, D. (Ed.) (1973). Attention and effort. Englewood Cliffs, NJ: Prentice Hall.
Kahneman, D., & Treisman, A. (1984). Changing views of attention and automaticity. In R.
Parasuraman and D.R. Davies (Eds), Varieties of attention (pp.28–61). Orlando, FL: Academic
Press.
Kandell, J.J. (1998). Internet addiction on campus: The vulnerability of college students.
Cyberpsychology & Behavior, 1(1), 11–17.
Kant, I. (1781–1787/2003). Critique of pure reason. Translated by Norman Kemp Smith. Basingstoke, UK: Palgrave Macmillan.
Kaplan, A. (1964). The conduct of inquiry. San Francisco, CA: Chandler.
Kaplan, R., & Porter, M. (2011). How to solve the cost crisis in health care. Harvard Business
Review, 89(9), 47–64.
Kaplan, R.M., & Saccuzzo, D.P. (2010). Psychological testing: Principles, applications, and issues
(8th ed.). Belmont, CA: Wadsworth, Cengage Learning.
Kapur, S., Craik, F.I.M., Jones, C., Brown, G.M., Houle, S., & Tulving, E. (1995). Functional role of the prefrontal cortex in retrieval of memories: A PET study. NeuroReport, 6(14), 1880–1884.
Kapur, S., Craik, F.I.M., Tulving, E., Wilson, A.A., Houle, S., & Brown, G.M. (1994).
Neuroanatomical correlates of encoding in episodic memory: Levels of processing effect.
Proceedings of the National Academy of Sciences of the United States of America, 91(6), 2008–
2011.
Karaiskos, D., Tzavellas, E., Balta, G., & Paparrigopoulos, T. (2010). P02-232 – Social network addiction: A new clinical disorder? European Psychiatry, 25(Suppl. 1), 855.
Karr-Wisniewski, P., & Lu, Y. (2010). When more is too much: Operationalizing technol-
ogy overload and exploring its impact on knowledge worker productivity. Computers in
Human Behavior, 26(5), 1061–1072.
Kellogg, R.T. (1990). The psychology of writing. New York: Oxford.
Kim, K.K., & Michelman, J.E. (1990). An examination of factors for the strategic use of information systems in the healthcare industry. MIS Quarterly, 14(2), 201–215.
Kitchin, R. (2014). Big data, new epistemologies and paradigm shifts. Big Data & Society, 1(1), 1–12.
Klaus, M.H., & Kennel, J.H. (1985). Parent-infant bonding. St Louis, MO: Mosby.
Klausegger, C., Sinkovics, R.R., & “Joy” Zou, H. (2007). Information overload: A cross-
national investigation of influence factors and effects. Marketing Intelligence & Planning, 25
(7), 691–718.
Klingberg, T. (2009). The overflowing brain: Information overload and the limits of working memory.
Oxford: Oxford University Press.
Kluver, H., & Bucy, P.C. (1937). Psychic blindness and other symptoms following bilateral
temporal lobectomy in rhesus monkeys. American Journal of Physiology, 119(2), 352–353.
Knapp, T.J., & Robertson, L.C. (Eds) (1986). Approaches to cognition: Contrasts and con-
troversies. Hillsdale, NJ: Lawrence Erlbaum.
Kock, N. (2000). Information overload and worker performance: A process-centered view.
Knowledge and Process Management, 7(4), 256–264.
Koeniger, P., & Janowitz, K. (1995). Drowning in information, but thirsty for knowledge.
International Journal of Information Management, 15(1), 5–16.
Kohlberg, L. (1976). Moral stages and moralization: The cognitive-developmental approach.
In T. Lickona (Ed.), Moral development and behavior: Theory, research and social issues (pp.31–
53). New York: Holt, Rinehart, Winston.
Köhler, W. (1917/1925). The mentality of apes. New York: Humanities Press.
König, C.J., & Waller, M.J. (2010). Time for reflection: A critical examination of poly-
chronicity. Human Performance, 23(2), 173–190.
Korac-Kakabadse, N., Kouzmin, A., & Korac-Kakabadse, A. (2001). Emerging impact of
online over-connectivity. Paper presented at the 9th European Conference on Informa-
tion Systems. Bled, Slovenia, 27–29 June.
Kraut, R., Patterson, M., Lundmark, V., Kiesler, S., Mukopadhyay, T., & Scherlis, W.
(1998). Internet paradox: A social technology that reduces social involvement and psy-
chological well-being? American Psychologist, 53(9), 1017–1031.
Krugman, P. (2013). Sympathy for the Luddites. The New York Times (June 13). Available at:
www.nytimes.com/2013/06/14/opinion/krugman-sympathy-for-the-luddites.html?_r=0
(accessed November 5, 2017).
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago, IL: The University of Chi-
cago Press.
Kuss, D.J., & Griffiths, M.D. (2011). Online social networking and addiction: A review of
the psychological literature. International Journal of Environmental Research and Public Health,
8(9), 3528–3552.
Lanzetta, J.T., & Orr, S.P. (1980). Influence of facial expressions on the classical condition-
ing of fear. Journal of Personality and Social Psychology, 39(6), 1081–1087.
Lanzetta, J.T., & Orr, S.P. (1986). Excitatory strength of expressive faces: Effects of happy
and fear expressions and context on the extinction of a conditioned fear response. Journal
of Personality and Social Psychology, 50(1), 190–194.
Lashley, K.S. (Ed.) (1929). Brain mechanisms and intelligence. Chicago, IL: University of Chicago Press.
Lazarus, R.S. (1994). Emotion and adaptation. Oxford: Oxford University Press.
Lazarus, R.S., & Folkman, S. (1984). Stress, appraisal, and coping. New York: Springer.
Lazarus, R.S., & Smith, C.A. (1989). Knowledge and appraisal in the cognition-emotion
relationship. Cognition and Emotion, 2(4), 281–300.
Leavitt, H.J. (1958). Managerial psychology. Chicago, IL: University of Chicago Press.
LeDoux, J. (1992). Emotion and the amygdala. In J.P. Aggleton (Ed.), The amygdala: Neu-
robiological aspects of emotion, memory, and mental dysfunction (pp.339–351). New York:
Wiley-Liss.
LeDoux, J. (Ed.) (1998). The emotional brain: The mysterious underpinnings of emotional life.
London: Clays Ltd.
Lee, A.R., Son, S.M., & Kim, K.K. (2016). Information and communication technology
overload and social networking service fatigue: A stress perspective. Computers in Human
Behavior, 55, 51–61.
Lee, Y.K., Chang, C.T., Lin, Y., & Cheng, Z.H. (2014). The dark side of smartphone
usage: Psychological traits, compulsive behavior and technostress. Computers in Human
Behavior, 31, 373–383.
Levine, S. (2005). Developmental determinants of sensitivity and resistance to stress. Psychoneuroendocrinology, 30(10), 939–946.
Lewis, P. (2017). Our minds can be hijacked: The tech insiders who fear a smartphone dys-
topia. The Guardian (October 6). Available at: https://www.theguardian.com/technology/
2017/oct/05/smartphone-addiction-silicon-valley-dystopia (accessed September 9, 2017).
Liden, G.B., Wolowicz, M., Stivoric, J., Teller, A., Kasabach, C., Vishnubhatla, S., Pelletier,
R., Farringdon, J., & Boehmke, S. (2002). Characterization and implications of the sen-
sors incorporated into the SenseWear™ armband for energy expenditure and activity
detection. Available at: www.bodymedia.com/site/docs/papers/Sensors.pdf (accessed
September 12, 2017).
Liepert, J. (2005). Transcranial magnetic stimulation in neurorehabilitation. Acta Neurochir-
urgica Supplementum, 93, 71–74.
Liu, D., Santhanam, R., & Webster, J. (2017). Towards meaningful engagement: A frame-
work for design and research of gamified information systems. MIS Quarterly, 41(4),
1011–1034.
Logan, G.D. (2004). Working memory, task switching, and executive control in the task
span procedure. Journal of Experimental Psychology: General, 133(2), 218–236.
Ma, H., Li, S., & Pow, J. (2011). The relation of Internet use to prosocial and antisocial
behavior in Chinese adolescents. Cyberpsychology, Behavior and Social Networking, 14(3),
123–130.
Mackenzie, K.D. (Ed.) (1976). A theory of group structures: Basic theory (Vol.1). New York:
Gordon & Breach.
Makris, N., Oscar-Berman, M., Jaffin, S.K., Hodge, S.M., Kennedy, D.N., Caviness, V.S.,
Marinkovic, K., Breiter, H.C., Gasic, G.P., & Harris, G.J. (2008). Decreased volume of
the brain reward system in alcoholism. Biological Psychiatry, 64(3), 192–202.
Mascarenhas, Y. (2017). Stephen Hawking: AI could “develop a will of its own” in conflict
with ours that “could destroy us”. International Business Times (November 8). Available at:
www.ibtimes.co.uk/stephen-hawking-ai-could-develop-will-its-own-conflict-ours-that-
could-destroy-us-1646352 (accessed November 11, 2017).
Malhotra, N.K. (1984). Reflections on the information overload paradigm in consumer decision making. The Journal of Consumer Research, 10(4), 436–440.
Malhotra, N.K., Jain, A.K., & Lagakos, S.W. (1982). The information overload controversy: An alternative viewpoint. Journal of Marketing, 46(2), 27–37.
Maslach, C., & Jackson, S.E. (1981). The measurement of experienced burnout. Journal of Organizational Behavior, 2(2), 99–113.
Mandler, G. (1967). Organization and memory. The Psychology of Learning and Motivation, 1,
327–372.
Mason, R.O., & Mitroff, I.I. (1973). A program for research on management information systems. Management Science, 19(5), 475–487.
Medina, H., Verhulst, M., & Rutkowski, A.F. (2015). Is it health IT? Task complexity and
work substitution. Paper presented at the 2015 Americas Conference on Information
Systems, Puerto Rico, August 13–15.
Mehdizadeh, S. (2010). Self-presentation 2.0: Narcissism and self-esteem on Facebook.
Cyberpsychology, Behavior and Social Networking, 13(4), 357–364.
Meier, R.L. (1963). Communications overload: Proposals from the study of a university
library. Administrative Science Quarterly, 7(4), 521–544.
Menninger, K. (Ed.) (1963). The vital balance: The life process in mental health and illness. New
York: Viking.
Metz, R. (2017). Smartphones are weapons of mass manipulation, and this guy is declaring war on them. MIT Technology Review (October 19). Available at: https://www.technologyreview.com/s/609104/smartphones-are-weapons-of-mass-manipulation-and-this-guy-is-declaring-war-on-them/ (accessed November 27, 2017).
Miller, D.T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82(2), 213–225.
Miller, G.A. (1956a). The magical number seven, plus or minus two: Some limits on our
capacity for processing information. The Psychological Review, 63(2), 81–97.
Miller, G.A. (1956b). Human memory and the storage of information. IRE Transactions on Information Theory, 2(3), 129–137.
Miller, N.E. (1980). Applications of learning and biofeedback to psychiatry and medicine. In
H.I. Kaplan, A.M. Freedman, & B.J. Sadock (Eds), Comprehensive textbook of psychiatry (3rd
ed., pp.468–484). Baltimore, MD: Williams and Wilkins.
Minas, R.K., Potter, R.F., Dennis, A.R., Bartelt, V., & Bae, S. (2014). Putting on the
thinking cap: Using NeuroIS to understand information processing biases in virtual teams.
Journal of Management Information Systems, 30(4), 49–82.
Modell, J.H. (2005). Assessing the past and shaping the future of anesthesiology: The 43rd
Rovenstine Lecture. Anesthesiology, 102(5), 1050–1057.
Mohan, G. (2013). Facebook is a bummer, study says. Los Angeles Times (August 14).
Available at: www.latimes.com/science/sciencenow/la-sci-sn-facebook-bummer-20130814-
story.html (accessed November 30, 2017).
Molina, B. (2017). Do smartphones keep us in or out of touch? USA Today (August 8), 1B, 2B.
Monetta, L., & Joanette, Y. (2003). Specificity of the right hemisphere’s contribution to
verbal communication: The cognitive resources hypothesis. Journal of Medical Speech Lan-
guage Pathology, 11(4), 203–211.
Montgomery, K.C. (2015). Youth and surveillance in the Facebook era: Policy interventions
and social implications. Telecommunications Policy, 39(9), 771–786.
Pluyter, J.R., Rutkowski, A.-F., & Jakimowicz, J.J. (2014). Immersive training: Breaking the
bubble and measuring the heat. Surgical Endoscopy, 28(5), 1545–1554.
Pluyter, J.R., Rutkowski, A.-F., Jakimowicz, J.J., & Saunders, C.S. (2012). Measuring users’
mental strain when performing technology based surgical tasks on a surgical simulator
using thermal imaging technology. Proceedings of the 45th Hawaii International Conference on
System Sciences (pp.2920–2926). Washington, DC: IEEE Computer Society.
Ponce de León, M.S., Golovanova, L., Doronichev, V., Romanova, G., Akazawa, T., Kondo, O., Ishida, H., & Zollikofer, C.P.E. (2008). Neanderthal brain size at birth provides insights into the evolution of human life history. Proceedings of the National Academy of Sciences of the United States of America, 105(37), 13764–13768.
Popper, K. (Ed.) (1959). The logic of scientific discovery. London: Routledge.
Popper, K. (1978). Three worlds: The Tanner lecture on human values. Delivered at The
University of Michigan (April 7). Available at: https://tannerlectures.utah.edu/_docum
ents/a-to-z/p/popper80.pdf (accessed September 29, 2017).
Porter, G., & Kakabadse, N.K. (2006). HRM perspectives on addiction to technology and
work. Journal of Management Development, 25(6), 535–560.
Powers, W. (Ed.) (2010). Hamlet’s BlackBerry. New York: HarperCollins.
Pratarelli, M.E., Browne, B.L., & Johnson, K. (1999). The bits and bytes of computer/
Internet addiction: A factor analytic approach. Behavior Research Methods, Instruments, and
Computers, 31(2), 305–314.
Puri, C., Olson, L., Pavlidis, I., Levine, J., & Starren, J. (2005). StressCam: Non-
contact measurement of users’ emotional states through thermal imaging. Paper
presented at the Conference on Human Factors in Computing Systems, Portland,
OR, April 2–7.
Ragu-Nathan, T.S., Tarafdar, M., Ragu-Nathan, B.S., & Tu, Q. (2008). The consequences
of technostress for end users in organizations: Conceptual development and empirical
validation. Information Systems Research, 19(4), 417–433.
Revelle, W. (1994). Individual differences in personality and motivation: Non-cognitive determinants of cognitive performance. In A. Baddeley & L. Weiskrantz (Eds), Attention: Selection, awareness, and control: A tribute to Donald Broadbent (pp.346–373). Oxford: Clarendon Press.
Revsine, L. (1970). Data expansion and conceptual structure. Accounting Review, 45(4), 704–
712.
Reyes, M.L., Lee, J.D., Liang, Y., Hoffman, J.D., & Huang, R.W. (2009). Capturing driver
response to in-vehicle human-machine interface technologies using facial thermography.
Proceedings of the International Driving Symposium on Human Factors in Driver Assessment,
Training and Vehicle Design, 5, 536–542.
Riger, S. (1993). What’s wrong with empowerment? American Journal of Community Psy-
chology, 21(3), 279–292.
Rimé, B. (2009). Emotion elicits the social sharing of emotion: Theory and empirical
review. Emotion Review, 1(1), 60–85.
Rimé, B., Noël, P., & Philippot, P. (1991). Episode émotionnel, réminiscences cognitives et réminiscences sociales [Emotional episode, cognitive reminiscences and social reminiscences]. Cahiers Internationaux de Psychologie Sociale, 11, 93–104.
Robey, D., & Taggart, W.M. (1982). Human information processing in information and
decision support systems. MIS Quarterly, 6(2), 62–73.
Roche, H., Blumenschine, R.J., & Shea, J.J. (2009). Origins and adaptations of early Homo:
What archeology tells us. In F.E. Grine, J.G. Fleagle, & R.E. Leakey (Eds), The first
humans: Origin and early evolution of the genus Homo (pp.135–147). Dordrecht: Springer.
Roche, S.M., & McConkey, K.M. (1990). Absorption: Nature, assessment, and correlates.
Journal of Personality and Social Psychology, 59(1), 91–101.
Romanow, D., Cho, S., & Straub, D. (2012). Riding the wave: Past trends and future
directions for health IT research. MIS Quarterly, 36(3), 3–10.
Rose, J.M., Roberts, F.D., & Rose, A.M. (2004). Affective responses to financial data and
multimedia: The effects of information load on cognitive load. International Journal of
Accounting Information Systems, 5(1), 5–24.
Rosen, L.D., Carrier, M.L., & Cheever, N.A. (2013). Facebook and texting made me do
it: Media-induced task-switching while studying. Computers in Human Behavior, 29(3),
948–958.
Rosen, L.D., Cheever, N.A., & Carrier, L.M. (Eds) (2012). iDisorder: Understanding our
obsession with technology and overcoming its hold on us. New York: Palgrave Macmillan.
Rosen, L.D., Whaling, K., Rab, S., Carrier, L.M., & Cheever, N.A. (2013). Is Facebook
creating “iDisorders”? The link between clinical symptoms of psychiatric disorders and
technology use, attitudes and anxiety. Computers in Human Behavior, 29(3), 1243–1254.
Rosman, A., Biggs, S., Graham, L., & Bible, L. (2007). Successful audit workpaper review
strategies in electronic environments. Journal of Accounting, Auditing & Finance, 22(1),
57–83.
Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribu-
tion process. Advances in Experimental Social Psychology, 10, 173–220.
Rowe, R., Maughan, B., Moran, P., Ford, T., Briskman, J., & Goodman, R. (2010). The
role of callous and unemotional traits in the diagnosis of conduct disorder. Journal of Child
Psychology and Psychiatry, 51(6), 688–695.
Rutkowski, A.F. (2016). Work substitution: A neo-Luddite look at software growth. IEEE
Software, 33(3), 101–104.
Rutkowski, A.F., Rijsman, J.B., & Gergen, M. (2004). Paradoxical laughter at a victim as
communication with a non-victim. International Review of Social Psychology, 17(4), 5–11.
Rutkowski, A.F., & Saunders, C. (2010). Growing pains with information overload. IEEE
Computer, 43(6), 94–96.
Rutkowski, A.F., Saunders, C., & Hatton, L. (2013). The generational impact of software.
IEEE Software, 30(3), 87–89.
Rutkowski, A., Saunders, C., Wiener, M., & Smeulders, R. (2013). Intended usage of a
healthcare communication technology: Focusing on the role of IT-related overload.
Paper presented at the 34th International Conference on Information Systems, Milan,
Italy, December 15–18.
Rutkowski, A.F., & van Genuchten, M. (2008). No more reply-to-all. IEEE Computer,
41(7), 95–96.
Ryan, T., & Xenos, S. (2011). Who uses Facebook? An investigation into the relationship
between the Big Five, shyness, narcissism, loneliness, and Facebook usage. Computers in
Human Behavior, 27(5), 1658–1664.
Salanova, M., Llorens, S., & Cifre, E. (2013). The dark side of technologies: Technostress
among users of information and communication technologies. International Journal of Psy-
chology, 48(3), 422–436.
Sanz, C.M., & Morgan, D.B. (2013). Ecological and social correlates of chimpanzee tool use. Philosophical Transactions of the Royal Society B: Biological Sciences, 368(1630). doi:10.1098/rstb.2012.0416
Sarker, S., Ahuja, M., & Sarker, S. (2018). Work-life conflict of globally distributed software
development personnel: An empirical investigation using border theory. Information Sys-
tems Research, 29(1), 103–126.
Sarker, S., Sarker, S., & Jana, D. (2010). The impact of the nature of globally distributed
work arrangement on work–life conflict and valence: The Indian GSD professionals’
perspective. European Journal of Information Systems, 19(2), 209–222.
Sarker, S., Xiao, X., Sarker, S., & Ahuja, M. (2012). Managing employees’ use of mobile
technologies to minimize work-life balance impacts. MIS Quarterly Executive, 11(4),
143–157.
Saunders, C., Rutkowski, A.F., Pluyter, J., & Spanjers, R. (2016). Health information
technologies: From hazardous to the dark side. Journal of the Association for Information Sci-
ence and Technology, 67(7), 1767–1772.
Saunders, C.S., Van Slyke, C., & Vogel, D. (2004). My time or yours? Managing time
visions in global virtual teams. Academy of Management Executive, 18(1), 19–31.
Saunders, C., Wiener, M., Klett, S., & Sprenger, S. (2017). The impact of mental repre-
sentations on ICT-related overload in the use of mobile phones. Journal of Management
Information Systems, 34(3), 803–825.
Savage, T.S., & Wyman, J. (1843–1844). Observations on the external characters and habits of Troglodytes niger, Geoff., and on its organization. Boston Journal of Natural History, 4, 362–386.
Sax, M. (2016). Big data: Finders keepers, losers weepers? Ethics and Information Technology,
18(1), 25–31.
SBS6 (2009). Baby Mobile. (Dutch national TV news, October 4.)
Schachter, S. (Ed.) (1959). The psychology of affiliation. Stanford, CA: Stanford University Press.
Schaefer, A., & Philippot, P. (2005). Selective effects of emotion on the phenomenal char-
acteristics of autobiographical memories. Memory, 13(2), 148–160.
Schaefer, K.E., Adams, J.K., Cook, J.G., Bardwell-Owens, A., & Hancock, P.A. (2015). The
future of robotic design: Trends from the history of media representations. Ergonomics in
Design, 23(1), 13–19.
Schechner, S. (2017). Meet your new boss: An algorithm. The Wall Street Journal (December
10). Available at: https://www.wsj.com/articles/meet-your-new-boss-an-algorithm-
1512910800 (accessed December 14, 2017).
Scherer, K.R. (1994). Emotion serves to decouple stimulus and response. In P. Ekman & R.
J. Davidson (Eds), The nature of emotion: Fundamental questions (pp.127–130). New York:
Oxford University Press.
Schick, A.G., Gordon, L.A., & Haka, S. (1990). Information overload: A temporal approach.
Accounting, Organizations and Society, 15(3), 199–220.
Schijven, M.P., & Bemelman, W.A. (2011). Problems and pitfalls in modern competency-
based laparoscopic training. Surgical Endoscopy, 25(7), 2159–2163.
Schijven, M., & Jakimowicz, J. (2003). The learning curve on the Xitact LS 500 laparoscopy
simulator: Profiles of performance. Surgical Endoscopy, 18(1), 121–127.
Schlenker, B.R., & Leary, M.R. (1982). Social anxiety and self-presentation: A conceptualization and model. Psychological Bulletin, 92(3), 641–669.
Schlotz, W., Hellhammer, J., Schulz, P., & Stone, A.A. (2004). Perceived work overload
and chronic worrying predict weekend-weekday differences in the cortisol awakening
response. Psychosomatic Medicine, 66(2), 207–214.
Schneider, S.C. (1987). Information overload: Causes and consequences. Human Systems
Management, 7(2), 143–153.
Schneider, W., & Fisk, A.D. (1982). Concurrent automatic and controlled visual search: Can
processing occur without resource cost? Journal of Experimental Psychology: Learning,
Memory, and Cognition, 8(4), 261–278.
Schroeder, R. (2014). Big data and the brave new world of social media research. Big Data &
Society, 1(2), 1–11.
Schultze, U., & Vandenbosch, B. (1998). Information overload in a groupware environment: Now you see it, now you don’t. Journal of Organizational Computing and Electronic Commerce, 8(2), 127–148.
Schulze, L., Dziobek, I., Vater, A., Heekeren, H.R., Bajbouj, M., Renneberg, B., Heuser, I., & Roepke, S. (2013). Gray matter abnormalities in patients with narcissistic personality disorder. Journal of Psychiatric Research, 47(10), 1363–1369.
Schwab, D.P. (1980). Construct validity in organizational behavior. In B.M. Staw & L.L.
Cummings (Eds), Research in organizational behavior (Vol.2, pp.3–43). Greenwich, CT:
JAI Press.
Schwarz, N. (1990). Feelings as information: Informational and motivational functions
of affective states. In E.T. Higgins & R. Sorrentino (Eds), Handbook of motivation and
cognition: Foundations of social behavior (Vol.2, pp.527–561). New York: Guilford
Press.
Scoville, W.B., & Milner, B.J. (1957). Loss of recent memory after bilateral hippocampal
lesions. Journal of Neurology, Neurosurgery and Psychiatry, 20(1), 11–21.
Sergeeva, A., Huysman, M.H., & Faraj, S.A. (2016). Material enactment of work practices: Zooming in on the practice of surgery with the Da Vinci robot. Paper presented at the IFIP WG 8.2 Working Conference, Dublin, Ireland, December 9–10.
Sexton, J.B., Thomas, E.J., & Helmreich, R.L. (2000). Error, stress, and teamwork in med-
icine and aviation: Cross sectional surveys. British Medical Journal, 320(7237), 745–749.
Seymour, N.E. (2008). VR to OR: A review of the evidence that virtual reality simulation
improves operating room performance. World Journal of Surgery, 32(2), 182–188.
Shapira, N.A., Goldsmith, T.D., Keck, P.E., Khosla, U.M., & McElroy, S.L. (2000). Psychiatric features of individuals with problematic Internet use. Journal of Affective Disorders, 57(1–3), 267–272.
Sharkey, N., & Sharkey, A. (2013). Robotic surgery: On the cutting edge of ethics. Com-
puter, 46(1), 56–64.
Shirom, A., Nirel, N., & Vinokur, A.D. (2006). Overload, autonomy, and burnout as
predictors of physicians’ quality of care. Journal of Occupational Health Psychology, 11(4),
328–342.
Shiv, B., & Fedorikhin, A. (1999). Heart and mind in conflict: Interplay of affect and cognition in consumer decision making. Journal of Consumer Research, 26(3), 278–292.
Simnett, R. (1996). The effect of information selection, information processing and task complexity on predictive accuracy of auditors. Accounting, Organizations and Society, 21(7–8), 699–719.
Simon, H.A. (1971). Designing organizations for an information-rich world. In M. Green-
berger (Ed.), Computers, communications, and the public interest (pp.37–72). Baltimore, MD:
The Johns Hopkins Press.
Simon, H.A. (1980). The behavioral and social sciences. Science, 209, 71–77.
Simon, H.A., & Newell, A. (1971). Human problem solving: The state of the theory in
1970. American Psychologist, 26(2), 145–159.
Simpson, C.W., & Prusak, L. (1995). Troubles with information overload: Moving from quantity to quality in information provision. International Journal of Information Management, 15(6), 413–425.
Skinner, B.F. (1935). The generic nature of the concepts of stimulus and response. Journal of
General Psychology, 12(1), 40–65.
Skinner, B.F. (1985). Cognitive science and behaviourism. British Journal of Psychology, 76(3),
291–301.
Slagle, J.M., & Weinger, M.B. (2009). Effects of intraoperative reading on vigilance and workload during anesthesia care in an academic medical center. Anesthesiology, 110(2), 275–283.
Snowball, D. (1980). Some effects of accounting expertise and information load: An
empirical study. Accounting, Organizations, and Society, 5(3), 323–338.
Spada, M.M. (2014). An overview of problematic Internet use. Addictive Behaviors, 39(1),
3–6.
Spanjers, R.W.L. (2012). Be patient: A longitudinal study on adoption and diffusion of IT-inno-
vation in Dutch Healthcare. Doctoral dissertation, Tilburg University.
Spanjers, R., & Rutkowski, A.F. (2005). The Telebaby® case. In J. Tan (Ed.), E-health care
information systems: An introduction for students and professionals (pp.27–36). San Francisco,
CA: Jossey-Bass.
Spanjers, R., Rutkowski, A.F., & Feuth, S. (2003). Telebaby: Live videostreaming from a
neonatal ward using Internet. Paper presented at the 9th Americas Conference on Infor-
mation Systems, Tampa, FL, August 4–6.
Spanjers, R., Rutkowski, A.F., & Van Genuchten, M. (2007). BabyMobile, virtual baby visit
at the hospital using UMTS. Paper presented at the 11th International Association of
Science and Technology for Development International Conference on Internet and
Multimedia Systems and Applications, Honolulu, HI, August 20–22.
Sparrow, P.R. (1999). Strategy and cognition: Understanding the role of management
knowledge structures, organizational memory and information overload. Creativity and
Innovation Management, 8(2), 140–148.
Speier, C., Valacich, J.S., & Vessey, I. (1999). The influence of task interruption on indi-
vidual decision making: An information overload perspective. Decision Sciences, 30(2),
337–360.
Spiekermann-Hoff, S., & Novotny, A. (2015). A vision for global privacy bridges: Technical
and legal measures for international data markets. Computer Law and Security Review, 31(2),
181–200.
Spink, A. (2004). Multitasking information behavior and information task switching: An
exploratory study. Journal of Documentation, 60(4), 336–351.
Spink, A., Cole, C., & Waller, M. (2008). Multitasking behavior. Annual Review of Informa-
tion Science and Technology, 42, 93–118.
Spitzer, M. (Ed.) (2012). Digitale Demenz: Wie wir uns und unsere Kinder um den Verstand bringen [Digital dementia: How we are driving ourselves and our children out of our minds]. München: Droemer Knaur Verlag.
Squire, L.R., & Alvarez, P. (1995). Retrograde amnesia and memory consolidation: A neu-
robiological perspective. Current Opinion in Neurobiology, 5(2), 169–177.
Stahl, J.E., Egan, M.T., Goldman, J.M., Tenney, D., Wiklund, R.A., Sandberg, W.S.,
Gazelle, S., & Rattner, D.W. (2005). Introducing new technology into the operating
room: Measuring the impact on job performance and satisfaction. Surgery, 137(5),
518–526.
Stemberger, J., Allison, R.S., & Schnell, T. (2010). Thermal imaging as a way to classify
cognitive workload. Paper presented at the Canadian Conference on Computer and
Robot Vision, Ottawa, Canada, May 31–June 2.
Strasburger, V.C., & Hogan, M.J. (2013). Children, adolescents, and the media. Pediatrics, 132(5), 958–961.
Streufert, S., & Streufert, S.C. (Eds) (1978). Behavior in the complex environment. New York: Wiley.
Subrahmanyam, K., Kraut, R., Greenfield, P., & Gross, E. (2000). The impact of home
computer use on children’s activities and development. The Future of Children – Children
and Computer Technology, 10(2), 123–144.
Sutcliffe, K.M., & Weick, K.E. (2008). Information overload revisited. In G.P. Hodgkinson
& W.H. Starbuck (Eds), The Oxford handbook of organizational decision making (pp.56–75).
Oxford: Oxford University Press.
Swain, M.R., & Haka, S.F. (2000). Effects of information load on capital budgeting deci-
sions. Behavioral Research in Accounting, 12(1), 171–198.
162 References
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive
Science, 12(2), 257–285.
Sweller, J., Van Merrienboer, J., & Paas, F. (1998). Cognitive architecture and instructional
design. Educational Psychology Review, 10(3), 251–296.
Sykes, K., & Macnaghten, P. (2013). Opening up dialogue and debate. In R. Owen, J. Bessant, & M. Heintz (Eds), Responsible innovation: Managing the responsible emergence of science and innovation in society (pp.85–107). Chichester: John Wiley.
Szczepanski, S.M., & Knight, R.T. (2014). Insights into human behavior from lesions to the
prefrontal cortex. Neuron, 83(5), 1002–1018.
Tagg, R., Gandhi, P., & Srinivasan Kumaar, R. (2009). Recognizing work priorities and
tasks in incoming messages through personal ontologies supplemented by lexical clues.
Paper presented at the 17th European Conference on Information Systems, Verona, Italy,
8–10 June.
Tam, K.Y., & Ho, S.Y. (2005). Web personalization as a persuasion strategy: An elaboration
likelihood model perspective. Information Systems Research, 16(3), 271–291.
Tamir, D.I., & Mitchell, J.P. (2012). Disclosing information about the self is intrinsically
rewarding. Proceedings of the National Academy of Sciences, 109(21), 8038–8043.
Tarafdar, M., Beath, C.M., & Ross, J.W. (2017). Enterprise cognitive computing applications.
Working Paper No. 420. Cambridge, MA: MIT Center for Information Systems
Research (CISR).
Tarafdar, M., Pullins, E.B., & Ragu-Nathan, T.S. (2015). Technostress: Negative effect on
performance and possible mitigations. Information Systems Journal, 25(2), 103–132.
Tarafdar, M., Tu, Q., & Ragu-Nathan, T.S. (2010). Impact of technostress on end-user
satisfaction and performance. Journal of Management Information Systems, 27(3), 303–334.
Tarafdar, M., Tu, Q., Ragu-Nathan, B.S., & Ragu-Nathan, T.S. (2007). The impact of
technostress on role stress and productivity. Journal of Management Information Systems,
24(1), 301–328.
Tarafdar, M., Tu, Q., Ragu-Nathan, T.S., & Ragu-Nathan, B.S. (2011). Crossing to the
dark side: Examining creators, outcomes, and inhibitors of technostress. Communications of
the ACM, 54(9), 113–120.
Taylor, S.E., & Koivumaki, J.H. (1976). The perception of self and others: Acquaintance-
ship, affect, and actor-observer differences. Journal of Personality and Social Psychology, 33(4):
403–408.
Teasdale, J.D. (1993). Selective effects of emotion on information processing. In A. Baddeley & L. Weiskrantz (Eds), Attention: Selection, awareness, and control: A tribute to Donald Broadbent (pp.374–389). Oxford: Clarendon Press.
Tellegen, A. (1981). Practicing the two disciplines of relaxation and enlightenment:
Comment on “Role of the feedback signal in electromyography biofeedback: The
relevance of attention” by Qualls and Sheehan. Journal of Experimental Psychology:
General, 110(2), 217–226.
Tennant, M. (1988). Psychology and adult learning. London: Routledge.
Terman, L.M. (1916). The uses of intelligence tests. In L.M. Terman (Ed.), The measurement
of intelligence: An explanation of and a complete guide for the use of the Stanford revision and
extension of the Binet-Simon Intelligence Scale (pp.3–21). Boston: Houghton Mifflin.
Thatcher, A., & Goolam, S. (2005). Development and psychometric properties of the Pro-
blematic Internet Use Questionnaire. South African Journal of Psychology, 35(4), 793–809.
Tingley, K. (2017). Learning to love our robot co-workers. The New York Times Magazine
(February 23), 30–32,58,63. Available at: https://www.nytimes.com/2017/02/23/maga
zine/learning-to-love-our-robot-co-workers.html (accessed November 6, 2017).
Toffler, A. (1970). Future shock. New York: Random House.
Tollner, A.M., Riley, M.A., Matthews, G., & Shockley, K.D. (2005). Divided attention
during adaptation to visual-motor rotation in an endoscopic surgery simulator. Cognition,
Technology and Work, 7(1), 6–13.
Tolman, E.C. (1948). Cognitive maps in rats and men. Psychological Review, 55(4), 189–208.
Tomkins, S.S. (1984). Affect theory. In K.R. Scherer and P. Ekman (Eds), Approaches to
emotion (pp.163–195). Hillsdale, NJ: Erlbaum.
Treisman, A. (1964). Selective attention in man. British Medical Bulletin, 20(1), 12–16.
Treisman, A., & Riley, J. (1969). Is selective attention selective perception or selective
response? A further test. Journal of Experimental Psychology, 79(1), 27–34.
Trompenaars, F., & Hampden-Turner, C. (2011). Riding the waves of culture: Understanding diversity in global business. New York: Nicholas Brealey Publishing.
Tsai, H.Y., Compeau, D., & Haggerty, N. (2007). Of races to run and battles to be won:
Technical skill updating, stress, and coping of IT professionals. Human Resource Manage-
ment, 46(3), 395–409.
Tulving, E. (1962). Subjective organization in free recall of unrelated words. Psychological Review, 69(4), 344–354.
Tulving, E. (1972). Episodic and semantic memory. In E. Tulving & W. Donaldson (Eds),
Organization of memory (pp.381–403). New York: Academic Press.
Tulving, E. (1983). Elements of episodic memory. New York: Oxford University Press.
Tulving, E. (2002). Episodic memory: From mind to brain. Annual Review of Psychology, 53, 1–25.
Turel, O., & Serenko, A. (2010). Is mobile email addiction overlooked? Communications of
the ACM, 53(5), 41–43.
Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. New York: Basic Books.
Tushman, M.L., & Nadler, D.A. (1978). Information processing as an integrating concept in
organizational design. Academy of Management Review, 3(3), 613–624.
Tuttle, B., & Burton, F.G. (1999). The effects of a modest incentive on information over-
load in an investment analysis task. Accounting, Organizations and Society, 24(8), 673–687.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and
probability. Cognitive Psychology, 5(2), 207–232.
Ursin, H. (1980). Personality, activation and somatic health. In S. Levine & H. Ursin (Eds),
Coping and Health (NATO Conference Series III: Human factors, pp.259–280). New
York: Plenum.
Vaillant, G.E. (1977). Adaptation to life. Boston, MA: Little Brown.
Van Knippenberg, D., Dahlander, L., Haas, M.R., & George, G. (2015). Information,
attention, and decision making. Academy of Management Journal, 58(3), 649–657.
Vatsyayan, S.H. (1981). A sense of time: An exploration of time in theory, experience, and art. Delhi: Oxford University Press.
Vezyridis, P., & Timmons, S. (2015). On the adoption of personal health records: Some problematic issues for patient empowerment. Ethics and Information Technology, 17(2), 113–124.
Vinkers, C.H., Penning, R., Hellhammer, J., Verster, J.C., Klaessens, J.H., Olivier, B., & Kalkman, C.J. (2013). The effect of stress on core and peripheral temperature. Stress, 16(5), 520–530.
Volkow, N.D., & Wise, R.A. (2005). How can drug addiction help us understand obesity?
Nature Neuroscience, 8(5), 555–560.
Volkskrant (2001). Big mother. (Dutch national newspaper, August 9.)
von Neumann, J. (1958). The computer and the brain. New Haven, CT: Yale University Press.
Waller, M.J., Conte, J.M., Gibson, G., & Carpenter, A. (2001). The impact of individual time perception on team performance under deadline conditions. Academy of Management Review, 26(4), 586–600.
Young, K.S. (1998). Caught in the Net: How to recognize the signs of Internet addiction and a winning strategy for recovery. New York: John Wiley.
Young, K.S. (1999). The evaluation and treatment of Internet addiction. In L. VandeCreek & T. Jackson (Eds), Innovations in clinical practice: A source book (Vol. 17, pp.19–31). Sarasota, FL: Professional Resource Press.
Young, K.S., & Rogers, R.C. (1998). The relationship between depression and Internet
addiction. CyberPsychology & Behavior, 1(1), 25–28.
Yule, S., Flin, R., Paterson-Brown, S., & Maran, N. (2006). Non-technical skills for sur-
geons in the operating room: A review of the literature. Surgery, 139(2), 140–149.
Zajonc, R.B. (1980). Feeling and thinking: Preferences need no inferences. American Psy-
chologist, 35(2), 151–175.
Zheng, B., Cassera, M.A., Martinec, D.V., Spaun, G.O., & Swanstrom, L.L. (2010). Mea-
suring mental workload during the performance of advanced laparoscopic tasks. Surgical
Endoscopy, 24(1), 45–50.
INDEX
accidents, industrial and military 92, 94, 121
Attention Deficit Hyperactivity Disorder (ADHD) 14
algorithms 91, 120–122, 124, 125, 128
Amazon 123
American Academy of Pediatrics 74
amount illusion 7, 8, 45, 49, 52, 53, 57
Amsterdam 121
amygdala 25, 29
anaesthesiology 9, 90, 91
anxiety 14, 29, 30, 32, 43, 60, 62, 63, 67–71, 74, 75, 80, 88, 107, 116, 117
apps 3, 74, 108, 120, 121, 126–129
Aral, Sinan 101
Archytas of Tarentum 92
Ardipithecus 1
Aristotle 18, 31, 125
Artificial Intelligence (AI) 15, 23, 33, 92, 93, 119, 122
Artificial Intelligence Laboratory (MIT) 92
Asimov, Isaac 93, 124
associative models 21, 23, 29, 74, 118
automation 78, 90–92, 96, 98, 111, 130
autonomic nervous system 24, 112, 113
auto-reply 4, 34
Barcelona 121
behaviourism 20–25, 30–35, 69, 74
big bang 131
big data 119–122, 128, 130–133
‘Big Five’ personality traits 31, 70, 116
biotechnology 126
BlackBerry 75, 88
blender metaphor 5–8
bottlenecks 7, 27, 42, 44, 46, 105
Bower, G.H. 29, 30, 43, 63, 69, 71, 107, 118
brain chip implants 126
brain injuries 25
brain load 2, 4, 8–10, 44, 46–51, 105, 110
brain overload 2, 3, 5–8, 11, 12, 18, 22, 23, 26, 34, 35, 43, 44, 46, 47, 51, 60, 126
Brain Reward System (BRS) 21, 25, 31–33, 66, 68, 72, 119, 127
Brisbane, Australia 121
Broadbent, D. 23, 26–28, 32, 42
Brookstone 121
Bryson, Bill 99, 100, 107, 118, 131
Buckinghamshire, Penn 130, 131
burnout 11, 43, 45, 51, 52, 79, 84, 86, 97, 124
Cannon, W.B. 20, 24
Capek, Karel 92
Caplan, S.E. 69, 72, 75
Carr, Nicholas 2, 89
Cawood, Andrew 81
Center for Internet Addiction 68
Central Nervous System (CNS) 24, 35
cerebral circuit 25
cerebral cortex 24
chunking 6, 7, 9, 10, 23, 27, 32, 34, 41, 42, 44, 45, 47, 50
cognitive absorption 110, 116
cognitive load 41, 42, 80, 105, 106
cognitive overload 45, 1–133
cognitivism 8, 20, 22, 23, 24, 26, 30–35, 38, 40–44, 57, 65, 110, 124
collaborative filtering 131
communication overload 39, 40, 105, 108
computationist models 23
congruence 9, 22, 23, 29, 32, 46, 49, 52, 63, 71, 111, 112, 114, 116
constructs 28, 42, 71, 72, 100, 101, 103–106, 108, 109, 112, 116, 132
contingency boundedness 45, 52, 53, 57
Damasio, A. 25, 26, 30, 44, 102, 110
Darwin, Charles 1, 102
Da Vinci® Surgical System 95
Davis, R.A. 71, 72, 75
deadlines 55
decision support systems (DSS) 38
de La Condamine, Charles Marie 99, 100
depression 13, 14, 52, 68, 69, 71–73
Descartes, René 18, 101
deskilling 91
Devol, George 92
Diagnostic and Statistical Manual of Mental Disorders (DSM) 14, 70, 71
Diderot 19, 101
digital footprint 122
Disneyland 120
driverless cars 91
Ebbinghaus, H. 19
Edison, Thomas 15
ego 32, 65, 70, 73, 74, 119, 128
email overload 4, 128
Emotional-Cognitive Overload (ECO) 6, 11, 44, 46, 48, 49, 52
Emotional-Cognitive Overload Model (ECOM) 5, 6, 12, 15, 35, 38, 44–46, 50–55, 57
emotions 8, 9, 12, 19, 20, 25, 29, 31–33, 35, 40, 43, 44, 46, 55, 57, 63–65, 67, 70, 75, 93, 118, 124, 127
Enterprise Cognitive Computing (ECC) 76, 77, 91, 92
episodic memory 28, 29, 45, 48, 50, 64, 116
ethics 2, 67, 122, 124–127
Everest, George 99
evolution 1, 15, 22, 27, 92, 119
experts 4, 6, 8, 10, 31, 32, 35, 45, 47–52, 81, 91, 106, 111, 113
explicit memory 28
eye-tracking 115
Facebook 56, 67, 68, 70, 72–74, 88, 109, 118, 129
facial temperature 112
fatigue 2, 4, 13, 25, 39, 43, 55, 111, 112
Fear of Missing Out (FOMO) 62, 63, 66, 80
Federal Aviation Administration 92
filtering 7, 8, 26, 27, 40, 42, 44–46, 73, 75, 81, 125, 126, 131
Fisher, Sir Ronald Aylmer 132
flow 23, 28, 37, 55, 61, 62, 111, 112
Folkman, S. 65
France 76, 86, 99, 130
Freud, Sigmund 32, 33, 65, 66
Frost, Robert 77, 128
full working memory model 27
functionalism 19, 20, 22, 63
Functional Magnetic Resonance Imaging (fMRI) 115, 116
functional neuroimaging techniques 26, 115
Galbraith, J.R. 54, 78
Galton, Sir Francis 102
galvanic skin response (GSR) 100, 113, 114
gamification 119, 120, 128
Gardner, H. 18, 19, 23, 31, 101
General Motors (GM) 92, 94
germ theory 17, 18
Great Trigonometrical Survey 99
Harris, Tristan 2, 75, 128
Hawking, Stephen 119
Health Information Technology (HIT) 90, 91, 95, 120, 124
hemispherical specialization 38
hemispheric encoding 116
heuristics 23, 32, 35, 41, 47, 48, 63, 64
Hipparchus of Nicaea 100
homeostasis 24, 25, 36, 47, 64, 66, 111, 112
Honda 92
Hoyle, Fred 131
hyperconnectivity 13, 61, 62, 119
id 65
iDisorders 2, 62, 68, 69, 71, 73
implicit memory 28
individual differences 7, 9, 31, 37, 44, 46, 52, 53, 55, 102–104
industrial revolution 81
information processing (IP) 40, 41, 43, 44, 46, 49, 53
information processing capacity (IPC) 31, 32, 34, 39–41, 43, 44, 54, 56, 78, 79, 82, 89
Instagram 67, 72, 73
interactive cognitive subsystems (ICS) 30
International Space Station 123
Internet 3, 12–14, 58–61, 66–72, 74, 86–88, 98, 106, 107, 119, 121, 132
Internet of Things 121, 132
introspection 19, 102, 115
IQ tests 103
IT addiction 1–3, 5, 7, 12–15, 22, 35, 61, 62, 66–72, 74, 75, 77, 78, 86–89, 98, 128–130
IT-related overload 1, 2, 4, 13, 15, 20, 33, 34, 36–57, 77–79, 82, 87, 89, 99–117, 128, 129, 131
Jacoby, J. 35, 41, 42, 105, 109
James, W. 19, 20, 63
Kant, Immanuel 18, 19, 23, 32, 101, 102
Kiva Systems 123
Knight Rider 93
Kohlberg, L. 124, 125
Köhler, W. 1
Krugman, Paul 89
Kuhn, T.S. 18
Lambton, William 99, 100
Laparoscopic Surgical Skills (LSS) 113–115
laughter 65–66
Laws of Robotics (Asimov) 93
Lazarus, R.S. 65
Leavitt, H.J. 77
Leonardo da Vinci 92
‘Like’ buttons 118, 119, 129
limbic system 24–26, 28, 32, 66, 116
LinkedIn 73, 88
Lister, Joseph 17
logic theory 23
loneliness 59, 67–69, 71–73
long-term memory (LTM) 25, 28, 29, 34, 41, 42, 45–49, 63, 69, 74
Luddites 89, 90
lying 13, 22
mindfulness 2, 4, 15, 34, 74, 75, 83, 98, 122, 125, 126, 128–130, 132, 133
mind-gut 19, 32, 35, 110
minimally invasive surgery 110
Minnesota experiments 38
Mitroff, J. 38, 43
mobile mindset study 67
modal model 27–29
Mohan, Geoffrey 73
monochronicity 55
multitasking 12–14, 50, 56, 57, 67, 73, 91, 106
Myers-Briggs personality types 38
nanotechnology 126
narcissism 31, 35, 68, 70, 72, 73, 102, 127
natural selection 102
Neanderthals 1, 15
need for cognition (NFC) 47, 48, 53, 107, 116
neo-Luddites 89
net generation 13, 14, 56, 57, 70
Netherlands 11, 48, 53, 58, 60, 76, 91, 121, 127
neurohormones 25
neuroticism 31, 116
Newton, Sir Isaac 118
Nexus A.I. 121
Nielsen 81
non-experts 10, 48–50, 111
Norman, D.A. 27, 28
objective time 55
objectivity 47, 55, 101–105, 109, 111, 116
Occupational Safety and Health Administration 93
online baby system (OBS) 58–64, 66–68, 71, 72, 74, 75
organizational design 54, 77–79, 81, 85, 88, 96–98, 129
Outlook 126
over-connectivity 31, 58, 68, 72
oxytocin 24, 25, 66, 72
peripheral nervous system (PNS) 24
personality traits and disorders 8, 22, 24, 31, 32, 34, 47, 48, 53, 61, 67–70, 75, 102, 103, 110, 116, 117
pertinence 6–10, 27, 29, 37, 40, 42–46, 48, 49, 52, 54, 57, 63, 75, 92, 93, 119, 121, 123, 126
phantom vibration syndrome 2, 73
phenomenological sociology 102
pilots 8, 35, 126
Platform for Privacy Preferences 123
polychronicity 55, 56
Popper, K. 18, 101, 102, 107, 130
positron emission tomography 26, 115
prefrontal cortex (PFC) 25, 26, 28, 32, 102, 115, 116
prior experience of ECO (PECO) 48–50, 53
privacy 73, 90, 122, 123, 127, 130
psychoanalysis 31, 32, 64, 70
psychometrics 102–105, 107, 113
puerperal fever 17
qualitative overload 39, 43, 104, 105
quantum computing 122
requests to use IT 11
Revelle, W. 31
robots 78, 89, 90, 92–96, 98, 119, 123–125, 128, 129, 133
Rosenstein, Justin 118, 119, 128
Royal Dutch Shell 121
Savage, T.S. 1
Scherer, K.R. 20
schemata 19, 23, 26–29, 34, 41–45, 48, 49, 53, 57, 63–65, 69, 101, 107, 117, 126
scientific revolution 18
second brain 35, 110
self-driving 91, 124
self-serving attribution bias 49, 108, 109
semantic memory 28, 29, 48
Semmelweis, Ignaz 17, 18
SenseWear BodyMedia system 113
sensory memory 27, 28
Seoul 121
September 11 attacks (9/11) 66
seven (magical number) 9, 27, 34, 40, 41
short-term memory (STM) 28, 42, 45
Simbionix LAP Mentor 113
smart farming 122
smartphones 2, 3, 10, 11, 13, 14, 37, 51, 57, 58, 67, 73, 75, 88, 91, 121, 128, 129, 132
social networking systems (SNS) 3, 12, 14, 66, 69–74, 86, 98, 128
social phobia 14, 69
Songdo City 122
South Africa 94
South Korea 122, 130
Spitzer, M. 73
Stanford-Binet Intelligence Scale (SBIS) 103
Star Trek: The Next Generation 93
Steve Jobs schools 73
stimulus–response (S–R) 21, 22, 24, 63
subjective time 55
subjectivity 20, 30, 55, 101, 102, 104, 109, 111, 114, 116
superchunking 10, 27, 32, 47, 116, 117
supervenience 18–20, 23, 24, 26, 32, 34, 36, 64, 66, 73, 89, 101, 107, 110, 116, 119, 131
suppressed emotions 64
Survey of India see Great Trigonometrical Survey
Sweller, J. 34, 42, 47, 105, 106
task-switching 28, 52, 54, 56, 57, 67, 80, 82, 87
technophilia 61, 69, 73, 129
technophobia 61
techno-strain 89
technostress 2, 11, 12, 14, 39, 50, 61, 78, 86, 87, 89, 97, 98, 126, 128
Tesla 124
Thermoview 8300 camera 111
three worlds theory (Popper) 101
time management 74
tools 1–3, 15, 39, 79, 91, 92, 98, 106, 111, 115, 116, 119
transcranial magnetic stimulation 126
trauma 25, 63–65
triangulation 99, 100, 110, 112–117
Turkle, S. 68
Twitter 5, 73, 88, 109
Uber 120, 121
underload 7–9, 49, 95, 104
United States of America (USA) 27, 77, 91–94, 122, 130
valence 7, 9, 21, 29, 30, 43, 46, 50–52, 64, 65, 74
Vallor, Shannon 124
Watson, J.B. 20
Waugh, Andrew Scott 99
withdrawal 12–14, 35, 61, 66, 68, 88
work-family conflict 16, 83–85, 87, 88, 97, 129
working day 77, 82, 83, 88
working memory (WM) 26–30
work-life balance 16, 76, 78, 81–86, 98, 128, 132
Work-Life Balance campaign 85
Wundt, Wilhelm 19
Wyman, J. 1
Young, Kimberly 68