
EMOTIONAL AND COGNITIVE OVERLOAD

We live in a world of limitless information. With technology advancing at an astonishingly fast pace, we are challenged to adapt to robotics and automated systems that threaten to replace us. Both at home and at work, an endless range of devices and Information Technology (IT) systems place demands upon our attention that human beings have never experienced before, but are our brains capable of processing it all?
In this important new book, an in-depth view is taken of IT’s under-studied dark side and its dire consequences on individuals, organizations, and society. With theoretical underpinnings from the fields of cognitive psychology, management, and information systems, the idea of brain overload is defined and explored, from its impact on our decision-making and memory to how we may cope with the resultant ‘technostress’. Discussing the negative consequences of technology on work substitution, technologically induced work-family conflicts, and organizational design, as well as the initiatives set up to combat these, the authors go on to propose measurement approaches for capturing the entangled aspects of IT-related overload. Concluding on an upbeat note, the book’s final chapter explores emerging technologies that can illuminate our world when mindfully managed.
Designed to better equip humans for dealing with new technologies, supported by case studies, and exploring the idea of ‘IT addiction’, the book concludes by asking how IT processes may aid rather than hinder our cognitive functioning. This is essential reading for anyone interested in how we function in the digital age.

Anne-Françoise Rutkowski is Professor in Management of Information at Tilburg University. Her research interests include information overload, decision-making, emotion, and the materiality of algorithms. Her background is in psychology. Her research has been published in Decision Support Systems, IEEE Computer, IEEE Software, Journal of Surgical Endoscopy, and MIS Quarterly.

Carol S. Saunders is affiliated with the University of South Florida. She has received the LEO Award from the Association for Information Systems (AIS) and the Lifetime Achievement Award from the Organizational Communication & Information Systems Division of the Academy of Management. She has served or is serving on numerous editorial boards, including a three-year term as Editor-in-Chief of MIS Quarterly. Her articles appear in top-ranked management, information systems, computer science, and communication journals. She currently is the AIS Vice President of Publications.
EMOTIONAL AND COGNITIVE OVERLOAD
The Dark Side of Information Technology

Anne-Françoise Rutkowski and Carol S. Saunders


First published 2019
by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
and by Routledge
711 Third Avenue, New York, NY 10017
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2019 Anne-Françoise Rutkowski and Carol S. Saunders
The right of Anne-Françoise Rutkowski and Carol S. Saunders to be identified as
authors of this work has been asserted by them in accordance with sections 77
and 78 of the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or
utilised in any form or by any electronic, mechanical, or other means, now
known or hereafter invented, including photocopying and recording, or in any
information storage or retrieval system, without permission in writing from the
publishers.
Trademark notice: Product or corporate names may be trademarks or registered
trademarks, and are used only for identification and explanation without intent to
infringe.
British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library
Library of Congress Cataloging in Publication Data
A catalog record has been requested for this book

ISBN: 978-1-138-05333-5 (hbk)
ISBN: 978-1-138-05335-9 (pbk)
ISBN: 978-1-315-16727-5 (ebk)

Typeset in Bembo
by Taylor & Francis Books
CONTENTS

List of Illustrations vi
Acknowledgements vii

1 Information Technology’s Dark Side: IT-related Overload and IT Addiction 1
2 The Brain and Paradigms of the Mind 17
3 Individual Differences in Experiencing IT-related Overload 37
4 Information Technology as a Resource: From the Bright to the Dark Side of Addiction 58
5 Dark Side of Information Technology at the Organizational Level 76
6 Measures of IT-related Overload 99
7 Leveraging the Positive Side of IT 118

Glossary 134
References 142
Index 166
ILLUSTRATIONS

Figures
1.1 The blender approach to understanding overload 5
3.1 Emotional-Cognitive Model of Overload (ECOM) 45
5.1 Information Technology Dark Side Diamond 77
5.2 Work-life balance continuum (adapted from Sarker, Xiao, Sarker & Ahuja, 2012) 84

Tables
1.1 Comparison of brain overload in blenders and people 7
3.1 Summary of issues in processing and output for expert versus non-expert 50
5.1 Summary of the Information Technology dark side diamond 97
6.1 Operationalization of IT-related overload with item loadings 108
6.2 Operationalization of memories of past cognitive and emotional overload with item loadings 109
Boxes
3.1 Chris and Alix 37
3.2 Example application of the Emotional-Cognitive Overload Model 51
5.1 Anna and David 76
ACKNOWLEDGEMENTS

The acknowledgement section is a tangible way of showing my gratitude to all my co-authors referred to in this book. Especially, Carol, thank you for being my friend and the strongest link in my chain of publication. It was another great adventure writing this book together. Also, Michiel, thank you for being the powerful link in both my publications and life. A special note to Les Wold for reviewing some of the physiological jargon. A token of my gratitude goes to my colleague and friend Piet Ribbers, who has provided me with continuous support during the last 20 years. Lauren, Louis: I hope one day you will forget about technologies for a few days… only. In one of my magic Mary Poppins bags, you will find five old paperback copies of Marcel Pagnol’s work. Overload yourself, read them… ALL… to my mother, Marie-Françoise, ton château en papier, and to my father, Wlodzimierz, who taught me all that matters.
I would like to thank the Schöller Foundation for recognizing me as a Fellow in
2012. The award came at a low point in my academic career and served as
important validation for my work on the negative consequences of overload. My
only regret is that Frau Schöller, who made this award possible, will not be able to
see this fruit of her generosity. I remember fondly our discussion over coffee in her
office in Nürnberg. I also want to thank my wise counselor, lifelong sweetheart,
chief cheerleader/supporter and best friend, Rusty. He spent many hours discussing
the topics covered in this book with me and editing two chapters. Finally, I would
like to thank my very supportive family: Kristin, Russell, and Janel.
1
INFORMATION TECHNOLOGY’S DARK SIDE
IT-related Overload and IT Addiction

Charles Darwin (1871) – a naturalist best known for his contributions to the science of evolution – wrote, “It has often been said that no animal uses any tool” (p. 51). Darwin challenged this 19th-century statement through his own observations and those of his colleagues. For example, Darwin noted that Asian elephants would repel flies by waving a branch in their trunks. Interestingly, the elephants would first fashion the branch into a tool by removing side branches or shortening the stem. Earlier, Savage and Wyman (1843–1844) reported that chimpanzees in their natural habitat use stones to crack fruits. They also devise sticks for hunting prosimians. Later, Köhler (1917/1925) observed that great apes restructure their environment to reach food. Thus, wild animals adapt tools to make them more efficient and use them to enhance their chances of survival. It is indeed more efficient for the elephant to have the right tool for chasing flies away than to rely on the length of its trunk. Yet, animals do not exhibit the full scope of intelligence observable in humans. Evolutionary research has related the use of tools to the development of hominid brains (Wrangham, 1994; Carvalho, Cunha, Sousa, & Matsuzawa, 2008; Sanz & Morgan, 2013). Our early hominid ancestors, such as Ardipithecus, were capable of making simple tools (Panger, Brooks, Richmond, & Wood, 2002; Roche, Blumenschine, & Shea, 2009). Neanderthals displayed their abilities in handling complex Paleolithic tools for their survival. Through evolution, the better early hominids designed and handled complex tools, the smarter and fitter they became. Early hominids’ use of tools, like ours today, was goal-driven and made it possible to accumulate exogenous resources and conserve endogenous ones.

Misused tools or valuable resources?


Like humans, tools have evolved over time. They provide capabilities that undoubtedly were never imagined by our early ancestors. However, in our digital age, the design and use of ‘digital tools’ such as the smartphone is causing some concern. The popular press is full of revelations of this First World problem. For example, Tristan Harris, a former product manager and design ethicist at Google, recently declared war on smartphones. He stated in an interview with Rachel Metz for the MIT Technology Review:

It’s so invisible what we’re doing to ourselves.… It’s like a public health crisis.
It’s like cigarettes, except because we’re given so many benefits, people can’t
actually see and admit the erosion of human thought that’s occurring at the
same time.
(Cited in Metz, 2017)

Research has demonstrated that even the absence of a smartphone in one’s pocket can be a cause for concern. Specifically, phone owners have been reporting ‘phantom vibration syndrome’. In this syndrome, the phone owner is so used to receiving messages that her body perceives that the phone is vibrating and delivering information even when it is not (Drouin, Kaiser, & Miller, 2012). Nicholas Carr (2017), in his article “How Smartphones Hijack Our Minds”, reported research denouncing the addictive nature of the smartphone and its weakening effect on the brain. People are becoming too dependent on their smartphones, and their ability to think and make sound judgements is decreasing. Carr concluded from his readings that as a smartphone’s proximity increases, brainpower decreases. In a similar vein, Hancock (2014) now muses over whether current technology engenders stupidity rather than whether it can cure it.
The smartphone is not the only Information Technology (IT) that has a dark side. The popular press is full of accounts about the dark side of other types of IT: information overload, email fatigue, iDisorders, technostress, or social media junkies, to name just a few. Though clearly these advanced technologies have many wonderful uses, their dire consequences on users’ behaviour and stress are generating societal concern. However, IT itself is not the problem. Rather, it is how IT is actually used that can lead to good or dire consequences. When it is not used well, the dark side of IT is unveiled.
We are particularly concerned with two ‘dark side’ challenges: IT-related overload and IT addiction. We define IT-related overload as the state of being challenged in processing information used in IT-related activities. Rather than focus on the amount (i.e., input) or symptoms (i.e., output) of overload, we seek to unlock the black box of the mind and focus on mental processes. That is, we are concerned with a form of brain overload, or the inability to adequately process input and handle the associated brain load. We define brain load as the emotional and cognitive efforts required by individuals to appraise and process inputs using the resources available to them. Further, we define IT addiction as the state of being challenged in balancing IT usage mindfully so as to preserve one’s resources.
When used well, we view Information Technologies as powerful tools. In particular, we view them as exogenous resources – digital tools that may require our endogenous brain resources. Resources are defined as “objects, personal characteristics, conditions and energies that are valued by individuals or that serve as a means of attainment of other resources” (Hobfoll, 1989, p.516). They may be endogenous physical, emotional, or cognitive energy (Hobfoll & Freedy, 1993). Some are temporal. The resources affect each other, exist as a resource pool (Kahneman, 1973), and are necessary for cognitive processing (Monetta & Joanette, 2003). Both endogenous and exogenous resources are necessary to battle the dark side of IT.

Brain overload
The dark side of IT has increased exponentially in the last half-century as a result of the introduction of new digital tools such as the Internet, email, smartphones, and Social Networking Systems (SNSs). Indeed, since the commercialization of the Internet skyrocketed shortly after the introduction of web browsers, we find ourselves increasingly inundated with information in the form of requests, advertising, pop-ups, new apps, emails, or text messages delivered by various technologies. We are deluged with information that is continuously being pushed at us by others or pulled by us from the Internet and a myriad of other technologies because we feel compelled to seek additional information or social contact. We face the challenge of dealing with the huge amount of information that is omnipresent in our world. “Never in history has the human brain been asked to track so many data points” (Hallowell, 2005, p.58). The consequences are serious in today’s information-rich environment. In First World countries, “contemporary society suffers from information constipation. The steps from information to knowledge and from knowledge to wisdom, and thence to insight and understanding, are held captive to the nominal insufficiency of processing capacity” (Hancock, 2014, p.450). Managers and employees who suffer cognitively from overload may end up making an increasing number of errors and poor decisions while trying to process dizzying amounts of data (Hallowell, 2005). They may also suffer emotionally from the overload, IT addiction, and workplace stress. For example, employees working in high-technology industries have been found to demonstrate psychosomatic symptoms and reduced productivity related to high mental demands (Arnetz & Wiholm, 1997; Tarafdar, Tu, Ragu-Nathan, & Ragu-Nathan, 2007). One estimate places the cost of information overload due to “lowered productivity and throttled innovation” at $900 billion a year (Powers, 2010, p.62).
We believe that ‘brain overload’ is a better term to describe the phenomenon more commonly called ‘information overload’. Processing the information that Information Technologies deliver is brain-related and heavily reliant upon available resources. Therefore, brain overload is a function of the brain (i.e., the processor) and not of the information (i.e., the input). While the consequences of brain overload have been reported frequently in the literature, they systematically have been attributed to situations characterized by too much data, information, or connectivity. The focus has been on the input and the output rather than on the cognitive processes (i.e., the black box).

More than four decades ago, Simon (1971) pointed out the challenges of processing so much information and the need for attention resources to do so. He wrote,

What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that may consume it.
(Simon, 1971, pp.40–41)

Indeed, there has always been a lot of data in the world. Not many of us have read
all the books in a library. Libraries are not blamed for causing information overload
– technologies, especially email, are.

Using resources mindfully


Recently, this automatic email reply arrived in one of our inboxes:

Hi there, Thanks for your mail, which I regrettably will not read since I’m working
away from the office. I’ll be back, however, on the 4th of May fully charged. So if your
email is still relevant after then, please send it again or otherwise it’ll end up in the heap
of mails that I’ll unlikely respond to. Even better, if the matter is urgent, give me a call
at +XXXXXXX. Have a good one – Corey
PS – join the fight against email fatigue and let others know that email, while
helpful, shouldn’t be a substitute for face-to-face or telephone communication. Together,
we can make the world a less stressful place.

In the digital workplace, managers show signs of overload from communications delivered by email and other technologies. Some respond as Corey does in the automatic reply above. In fact, Corey is sharing his coping strategy for curbing email overload in this automatic reply. Consequently, he is using the auto-reply option in a mindful way, sparing his resources. Research from psychology supports the idea that processing inputs such as incoming email messages involves a certain level of resources. Expending endogenous resources can reduce an individual’s brain load and increase his processing efficiency. In addition to each individual’s endogenous pool of resources are exogenous ones. Time is a common exogenous resource that all too often proves inadequate. Indeed, Corey apparently lacks enough time to read all the emails in his inbox upon his return to the office. He warns email senders that their message simply may not be read unless it is re-sent at a later time. Also, Corey kindly urges the senders to question the relevance of the content of their messages over time. He is expertly building healthy boundaries for handling a flood of emails. In other words, he is ensuring that he has adequate resources for solving his IT-related overload equation. He does provide the option of giving him a phone call or meeting him face-to-face.

Not everyone is afraid of brain overload in today’s digital world. In fact, some people enjoy it and impatiently wait for the next tweet or text. They appreciate the high-speed connections that allow them to leverage a vast range of information in accomplishing a phenomenal amount of work. Slow connections leave them bored and annoyed. These individuals might even suffer from a form of IT addiction that compels them to stay connected for fear of missing out.
To better understand the role of the brain in processing information, we propose
a model based on cognitive theories of memory (Atkinson & Shiffrin, 1968;
Bower, 1981). In particular, we draw on both the emotional and cognitive aspects
of the brain and consider the resources necessary to fuel its processing of inputs.
We introduce our model, the Emotional-Cognitive Overload Model (ECOM),
using the metaphor of a blender.

Blender metaphor
We use the commonplace blender to explain the brain overload phenomenon. With a blender, we normally pour in the ingredients that need to be processed and push the button to mix/blend. This is State 1 in Figure 1.1. If the ingredients are hard to blend or if we want a smoother consistency of blended materials, we turn the knob to liquefy rather than blend. That is, we call on the blender’s greater processing capabilities. For simplicity’s sake, we assume that processing abilities are similar for most blenders. State 2 in Figure 1.1 is when the blender cannot handle the processing. Finally, if there are too many ingredients for one batch in the blender, we can blend some of them, pour that into a separate container, and then process the remainder in another batch. If we do not process in batches, there will be an overflow condition, which is what is happening in State 3 in Figure 1.1.

FIGURE 1.1 The blender approach to understanding overload (State 1: normal processing; State 2: inability to process well; State 3: overflow from too much to process)
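For readers who think in code, the three states can be rendered as a toy simulation. The capacity and effort numbers, and all of the names below, are our illustrative assumptions rather than part of the authors’ model.

```python
# Toy sketch of the blender metaphor: a fixed-capacity processor
# classifies a load as State 1 (normal processing), State 2 (cannot
# process well), or State 3 (overflow), and shows how batching
# avoids overflow. The numeric capacities are illustrative only.

CAPACITY = 5    # how many ingredient units fit in one batch
MAX_EFFORT = 3  # highest "knob" setting available

def blender_state(units, toughness):
    """Return the state label for a single, unbatched load."""
    if units > CAPACITY:
        return "State 3: overflow"        # too much to hold at once
    if toughness > MAX_EFFORT:
        return "State 2: cannot process"  # knob cannot go high enough
    return "State 1: normal processing"

def process_in_batches(units):
    """Split an oversized load into batches that each fit the container."""
    batches = []
    while units > 0:
        batch = min(units, CAPACITY)
        batches.append(batch)
        units -= batch
    return batches

print(blender_state(4, toughness=2))   # within capacity and effort: State 1
print(blender_state(4, toughness=9))   # too tough to blend: State 2
print(blender_state(12, toughness=2))  # too much at once: State 3
print(process_in_batches(12))          # batching avoids the overflow: [5, 5, 2]
```

The point of the sketch is simply that overflow (State 3) is not a property of the ingredients alone but of the load relative to the container, which is why batching resolves it.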



The material to be processed represents information, and the blender is used to represent the brain’s memory processes. Even though we only have one brain, it is organized in a way that allows us to process input in batches. In State 1 the information to be processed is limited enough or easy enough that it can be processed without difficulty. Consequently, no overload occurs. However, in State 2 the person’s pool of resources is inadequate for processing the information. The person may lack expertise in processing the information, lack interest in or time for solving the problem, or be too exhausted because of a lack of physiological resources. As a result, the person must either call upon a higher level of cognitive ability than usual to be able to process the information or adapt to lower levels of performance by learning to live with an increased number of errors, reduced information integration, and impaired decision-making (Bettman, Johnson, & Payne, 1990; Shiv & Fedorikhin, 1999). Of course, while blenders may be relatively similar in their processing abilities, they may vary slightly in terms of power or capacity. Individuals, on the other hand, definitely have very different cognitive abilities and stored memories in the brain that are used to process information. More precisely, they may each have a very different pool of resources from which to draw. We suggest that some individuals process information better than others. They have better cognitive abilities. In State 3, the information processing needs to be made more efficient. One way to do this is to chunk the information, which is like processing the information in batches. However, at some point, the amount of information or the ability to process it exceeds an individual’s resources. This leads to a state of emotional and cognitive overload.
What we have not addressed so far is when to start the blender. We argue that there must be some relevant (pertinent) input that starts the blending process – such as the desire to have a fruit smoothie or a frozen daiquiri. It is unlikely that an individual would start the blending process if the request is to blend cod liver oil with jello, or some other ghastly concoction. Similarly, before an individual starts processing information, there must be some pertinent input to motivate the processing, and it must be perceived positively. In our blender example, the individual can remember how good the smoothie or frozen daiquiri tasted in the past and is motivated by this positive memory. Furthermore, the smoothies this person so enjoys making and drinking may be loaded with sugars or alcohol, consumption of which is addictive to the brain. This addiction may also motivate the person to start the blender.
Clearly our blender metaphor is quite simplistic when it comes to explaining
overload and viewing smoothies as a form of addiction. We hope to remedy this
with a more complex model presented in Chapter 3, following a discussion of
models in Chapter 2. In the ECOM, Emotional-Cognitive Overload (ECO) is defined
as the negative emotional and cognitive consequences of brain overload. In Table
1.1 we continue our blender metaphor by highlighting key aspects of information
processing that are an important part of the ECOM but which are not usually
elaborated upon in overload research.

TABLE 1.1 Comparison of brain overload in blenders and people

Pertinent input
Blender state: We will not turn the blender on unless we want to concoct something tasty, like a smoothie with brain-rewarding sugar.
Person state: The information will not be processed unless it is perceived to be pertinent. In information processing, the valence may be positive or negative. When information is extremely pertinent, the person may exhaust all resources to process the information.

Processing
Blender state: No overflow. The container can hold all of the ingredients and process them. The blender may be used seldom or never.
Person state: No overload. The person can process all of the information perceived as pertinent. If the individual’s resources are not used fully, the person may be underloaded, or bored, and may decide to use resources for other activities.

Individual differences
Blender state: The blender can do different types of blending. Some blending is very coarse. If the material is to be smoother, a higher level of processing is needed. The knob can be turned to a different position indicating more intense blending. Relatively little difference is assumed in the power of blenders.
Person state: Individuals have markedly different pools of resources. Some people have the resources to easily process a limited amount of information. Others have a larger pool of resources. As the information-processing requirements increase, they can exert greater effort and invoke higher levels of their resources. They may experience the processing to be great fun and quite challenging. Others who do not have the needed resources experience overload when they are not able to process all the information.

Chunking abilities
Blender state: The blender cannot hold all of the ingredients. The ingredients will overflow the container unless they are processed in batches.
Person state: The resource requirements are great. Overload will occur unless there is chunking.

Pertinence and ‘Amount Illusion’


In our anecdote, Corey would process all the emails that he receives after returning to the office – even if only giving them a quick glance to see if they are relevant to him or not. In contrast, our model reflects the assumption that individuals do not process all information that they receive. Instead, the information and other input that they receive is filtered, but filtered in a different way from that typically portrayed in the popular and academic press. We argue that the focus should not be on bottlenecks that are created by brain funnels filled with too much information. Likewise, we do not support the widely held assumption that IT addiction is commonly related to too much connectivity. Rather, we suggest that it is time to look at the processes that individuals use to deal with the deluge of information or social connections with which they are presented. When individuals receive an input, it is moved to the person’s memory, where past emotions and lifelong experiences are organized and stored. Cognitivist theories help explain how incoming events are coded, specific memories are constructed, memories become consolidated so they can be appropriately associated with one another, and personality traits are encapsulated. (Personality traits are representative of the way individuals think and behave in certain contexts.)
At this point, it also is important to understand that each individual, in a unique way, compares each input to what is stored in memory. Only the pertinent information then undergoes cognitive processing. By pertinence, or relevance, we mean that a new input matches the information stored in memory. Pertinence is critical at the starting point of our blender metaphor. In other words, pertinent information makes sense because it fits cognitively with what is stored in the individual’s memory. The memory uses pertinence to accept or reject inputs, therefore controlling the brain load. The concept of pertinence means that not all information that is received is processed. The idea that not all information is processed is very different from that promulgated in much of the literature on overload. Our model is about improving information processing and not about blaming the dizzying amount of information that is received or the connectivity that delivers it.
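The pertinence idea can be expressed as a minimal filter: only inputs that match something already in memory are admitted for processing. The keyword-overlap matching rule and the example memory contents below are illustrative assumptions; the book treats matching as a property of each individual’s unique memory.

```python
# Minimal sketch of pertinence-based filtering: an input is processed
# only if it matches something already stored in memory. The overlap
# rule and sample memory contents are illustrative assumptions only.

memory = {"smoothie", "daiquiri", "fruit", "email", "deadline"}

def is_pertinent(input_words, stored=memory):
    """An input is pertinent if it shares at least one item with memory."""
    return bool(set(input_words) & set(stored))

def admit_for_processing(inputs, stored=memory):
    """Return only the inputs memory accepts; the rest never load the brain."""
    return [words for words in inputs if is_pertinent(words, stored)]

incoming = [
    {"fruit", "smoothie", "recipe"},   # matches memory, so it is processed
    {"cod", "liver", "oil", "jello"},  # no match, rejected at no brain load
]
print(len(admit_for_processing(incoming)))  # 1: only one input is admitted
```

Note that the filter controls brain load at the gate: the rejected input costs nothing downstream, which is the sense in which pertinence, not sheer amount, governs overload.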
This ‘amount illusion’ sees information as pouring in and relates brain overload primarily to the amount of input. Little is said about the capability of individuals to process the information. If one assumes that the problem people are dealing with is too much information or too many social connections, the solution is to find ways of filtering out what is extraneous and only allowing the needed information into the mind for processing. This assumption has taken hold to the extent that technology itself is suggested as a filter, or as a means of handling email, time spent on social media, and so on. However, in this scenario, individuals do not look for ways of improving the processing and sparing their resources.

Processing – the conditions of no overload and underload


Up to this point we have spent a lot of time talking about brain overload.
There are, however, many occasions when a person does not experience
overload (i.e., normal processing takes place). It could be that the person does
not have much to process and the brain load is relatively slight. It could also be
that the brain load is great, but the person is able to handle it successfully. This
is often the case with experts.
When the brain load is too slight, underload may occur. Surprisingly, underload may lead to negative consequences just as overload does. For example, the experienced pilots of an Airbus A320 who overshot their plane’s destination and forgot to land appeared to have been suffering from underload. In the hopes of dodging boredom, they started ‘playing’ on their laptops to keep their minds and attention busy (Rutkowski, 2016). They claimed that they lost track of time and location because they were absorbed in exploring the new monthly crew flight scheduling system on their laptops. It may be, though, that their expertise led to an underload situation on a long boring flight, with their actions to elude boredom ultimately resulting in errors. Similarly, anaesthesiologists – whose work is increasingly supported by technology – when underloaded, have been found to focus their attention on things other than their patients. When demands for their attention decrease, they have been found reading (Slagel & Weinger, 2009) or surfing online (Saunders, Rutkowski, Pluyter, & Spanjers, 2016). Hospital administrators are noticing their bored anaesthesiologists and are replacing many of them with less expensive monitoring technology.

Individual differences
Once inputs have been selected for processing on the basis of their pertinence, they
are processed and stored in the person’s memory. The stored memories evolve as
individuals attempt to make sense of their own world. Each person’s memories are
very different from those of others.
Processing incoming inputs involves a certain level of effort, which calls upon mental and physiological resources. Resources can reduce an individual’s brain load by making the processing more efficient. Overall, resources are treated as the fuel that runs the processing. Each person’s pool of resources is different from that of others and depends upon how exhausted the person is. The level of resources needed to process inputs can be compared to the different power levels in blenders.
Emotions distinguish individuals from blenders. Emotions can either help or hinder processing of brain load. For example, memory of emotional reactions to financial information has been found to be better than recall of the actual numbers involved (Rose, Roberts, & Rose, 2004). Experience is encoded with a tag called a valence. A valence may be a positive or negative emotional tag attached to events and concepts that were activated in association with prior experience of the related emotional tag. An input is congruent when its emotional tag, or valence, matches that stored with a related item in memory. Where there is a mismatch between the valence of the input and what is stored in memory, processing becomes less efficient and challenges the individual’s pool of resources. The individual will, for example, focus more of his scarce attentional resources in order to understand and solve the problem.

Chunking abilities
The attentional resources of the brain are rather limited (Kahneman, 1973; Neisser,
1976). The brain can only hold seven, plus or minus two, items at a time (Miller,
1956a). Individuals become overloaded when they have to deal with more input
items than they can handle. Thus, they must learn to focus their attention and
handle input efficiently. As noted by Miller (1956b), but often omitted in the lit-
erature, the only way to efficiently process the input and to extend the amount of
information that can be processed is by chunking. Chunking occurs when individual
items are combined into blocks called chunks. How the items are organized into
chunks determines recall.

10 Information Technology’s dark side

In addition to its role in processing of information, chunking also can involve
converting a sequence of actions into an automatic routine. Construction of an
increasing number of interrelated complex chunks
increases expertise and therefore speeds information processing and decreases
overload by more efficiently dealing with attentional resource constraints. Some
chunking is simple, such as automatically putting toothpaste on a toothbrush before
inserting it into the mouth. Other chunks, such as those used in debugging a
computer program, are more complex and emerge over time through habit, building expertise.
Especially good chunkers can combine chunks into superchunks. Superchunks
comprise first-order chunks which are combined in levels so that they require less
effort to store in memory and also make the information easier to remember
(Mandler, 1967). Experts are particularly good superchunkers.
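As a rough illustration (a toy sketch, not a cognitive model), chunking can be pictured as grouping a flat stream of items into nested blocks so that fewer units must be held at once; the digit string and chunk sizes below are arbitrary choices:

```python
# Toy sketch of chunking: a 12-digit sequence exceeds Miller's 7 +/- 2 span,
# but grouped into 3-digit chunks it fits comfortably; chunks of chunks give
# 'superchunks'. Chunk sizes and digits are arbitrary illustrative choices.

def chunk(items, size):
    """Group a flat sequence into blocks ('chunks') of the given size."""
    return [items[i:i + size] for i in range(0, len(items), size)]

digits = "198419621956"               # 12 individual items: beyond the 7 +/- 2 span
first_order = chunk(digits, 3)        # 4 first-order chunks: within the span
superchunks = chunk(first_order, 2)   # chunks of chunks, as experts form them

print(first_order)   # ['198', '419', '621', '956']
print(superchunks)   # [['198', '419'], ['621', '956']]
```

The point of the sketch is only that the number of top-level units shrinks at each level, which is what lets a fixed-capacity store hold more information.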
Experts are distinguished from non-experts in that they have performed a set of
activities so many times that it has been converted into superchunks. The set of
activities thus becomes automatic and can be completed by the expert with ease.
As a result of repetition of similar activities, experts, compared to non-experts, are
better aware of what information they need to complete the activities. Thus, they
are better than non-experts at distinguishing which inputs represent pertinent
information and which can be ignored without any further processing. This ability
to prioritize inputs as a function of their pertinence means that experts are better
able to process inputs efficiently and successfully (Sutcliffe & Weick, 2008). Fur-
ther, they can more easily store and retrieve memories associated with their
expertise. While they may process the same number of chunks as non-experts,
their chunks are bigger and contain more information. Experts require less effort
and fewer mental resources in processing the brain load created from the incoming
inputs. Thus, even with high brain load, experts may not experience emotional or
cognitive overload at all, because their cognitive processes are highly automatized.
Further, when they do experience high brain load, they may be able to handle the
load successfully and, consequently, tag their stored memories positively. The
successful resolution of high brain load conditions can enhance their self-image,
allowing them to see themselves as ‘super-experts’.

Overload from requests to use new IT


There is another type of input that can create overload but that has not, to our
knowledge, been discussed by other researchers. This type of overload emanates
from requests to use new Information Technologies. Individuals are not only
swamped with large quantities of information delivered by IT; they are also
deluged with promises of capabilities delivered by devices and applications such as
smartphones, iPads, and new software. Further, they are often forced to adopt new
versions of software even though they are satisfied with the older versions whose
functions they have finally mastered. Tarafdar, Tu, Ragu-Nathan, and Ragu-
Nathan (2011) related the story of a university secretary who found it so difficult to
use a new student-management application that it drove her to early retirement.
The use of the software was mandated, but she was never able to master its multiple
features, was dismayed by its multiple crashes, and was unable to get the IT support
that she needed.
We were asked by a large Dutch bank to investigate the possible adoption of an
innovative TV banking system that would eventually replace its current one. Most
customers were reluctant to adopt the new system. We believe that this reluctance
could be explained by IT-related ECO (emotional and cognitive overload) created
from both information overload and too many requests to use IT. To test this
premise, we conducted a survey of
Dutch participants aged 16 or older; 1,857 responded from a total sample of 2,538
(Rutkowski & Saunders, 2010). We found that almost two-thirds of the partici-
pants (61%) were concerned about being cognitively overloaded with too much
information when they use new Information Technologies. Just over two-fifths
(42%) felt cognitively and emotionally overloaded with requests to use new Infor-
mation Technologies.
We concluded that requests to use new technologies can also create brain
overload conditions. Further, brain overload can be caused not only by being asked
to use too many technologies, but also by failing to intentionally forget some part
of what we have already learned (Rutkowski, Saunders, & Hatton, 2013). For
example, when the smartphone was introduced, one had to forget how to use a
traditional camera. Indeed, we now look at a screen to adjust a picture instead of
looking directly through the camera viewfinder.
Old technologies with which we are familiar may be very similar to new ones,
but different enough to be confusing. Brain overload is created when individuals
try to match the new functionalities of the software or services with the technology
they already know. If the new technology differs, they may intentionally forget how
they used to interface with the old one. Intentional forgetting is cognitively taxing and
also contributes to feelings of burnout and rejection toward new technologies.
Overload with IT requests is similar to a component of technostress that is
commonly discussed in the popular press.

Technostressed Mary
Recently, one of our young doctoral students, Mary, came up with an interesting
new strategy. Mary stated:

I decided to remove the email application on my smartphone. I cannot cope with the
constant pop-ups. They were driving me crazy. I will never be able to finish my dis-
sertation that way. Would you please send me a phone text message when I need to
check important updates for my dissertation during the weekend? I have to focus if I am
ever going to finish my PhD.

Mary’s strategy is twofold: deleting the email application from her phone and
asking us to inform her of the relevance of our emails. This meant that we would
have to send one email AND a text message in order for her to access important
messages, multiplying the technologies we use (e.g., computers and smartphones).

In order to spare some of her resources, Mary was asking to dig into our pool. Doing
so was her way of dealing with brain overload from messages delivered by technol-
ogy. We gladly accepted this somewhat self-centred request, relieving her of some of
the “growing pains with information overload” (Rutkowski & Saunders, 2010).

Technostress
Technostress, or the type of stress experienced in organizations by technology
end users as a result of their inability to cope with the demands of organiza-
tional computer usage (Tarafdar, Tu, & Ragu-Nathan, 2010), is another dark
side of IT. This stress may be induced by a surfeit of information delivered by
IT. It may also be the result of “application multitasking, constant connectivity,
information overload, frequent system upgrades and consequent uncertainty,
continual relearning and consequent job-related insecurities, and technical pro-
blems associated with the organizational use of ICT [Information and Com-
munications Technology]” (Tarafdar et al., 2010, pp.304–305). Unlike our
Emotional-Cognitive Overload Model (ECOM) approach, technostress has not been
discussed in relation to emotions, cognitions, or resources.
The term ‘technostress’ is interesting but confusing as it seems to suggest that
technologies bring on the stress. According to our model, the technology is not
to blame. Rather, we argue that the stress is created by a lack of available
resources or impulse control. Some individuals may never have experienced
technostress even when juggling many technologies. Mary ended up upset just
hearing the constant ‘beep’ of her phone whenever an email landed. We believe her
stress is more symptomatic of a lack of resources than it is a function of the
information received. It arrives at a moment in time when she needs to leverage
her pool of resources to the maximum in order to finish a very relevant task –
completing her PhD.
Tarafdar and colleagues have identified five major creators, or components, of
technostress: techno-overload, techno-invasion, techno-complexity, techno-
insecurity, and techno-uncertainty (e.g., Tarafdar et al., 2007; Ragu-Nathan, Tar-
afdar, Ragu-Nathan, & Tu, 2008; Tarafdar et al., 2010). At the organizational level,
technostress has been found to lead to increased role stress and reduced productivity,
end-user performance, and end-user satisfaction. These findings are discussed in
greater detail in Chapter 5.
Interestingly, technostress has been strongly related to compulsive behaviours
(Lee, Chang, Lin, & Cheng, 2014), which are often associated with addiction.
Further, drug addiction has been found to display the same underlying symptoms
as SNS (social networking site) or Internet addiction (both types of IT addiction) (Goeders, 2003). In
particular, “SNS addiction incorporates the experience of the ‘classic’ addiction
symptoms, namely mood modification, salience, tolerance, withdrawal symptoms,
conflict, and relapse” (Kuss & Griffiths, 2011, p.3530). Brooks, Longstreet, and
Califf (2017) found technostress to be strongly and positively related to Internet
addiction.

Addictive IT behaviours
There is indeed another IT-related challenge associated with having ‘too much’
that is reaching epic proportions: too much Internet and mobile phone con-
nectivity. People in all generations are staying connected too long, and this hyper-
connectivity often leads to a range of dysfunctional behaviours including IT
addiction, excessive media multitasking, and Pathological Internet Use. Pathological
Internet Use (PIU) has four elements: (1) excessive Internet use, often associated
with a loss of sense of time or a neglect of basic drives; (2) withdrawal, including
feelings of anger, depression, and tension when the Internet is not accessible; (3) tol-
erance, including the need for better computer equipment, more software, or more
hours of use; and (4) adverse consequences, including arguments, lying, poor
school or vocational achievement, social isolation, and fatigue (Block, 2008, cited in
Spada, 2014, p.4).
Hyperconnectivity is being reported among all age groups. Tweens (children in
the 8–12 age range) and teens (children in the 13–18 age range) are averaging over
4.5 hours and 6 hours a day, respectively, on the Internet. A quarter of the teens in
a recent survey reported reaching for their phones within five minutes of waking
up (Ipsos MediaCT & Wikia, 2013). They are texting and emailing so much that
employers of young adults accuse them of having difficulty starting and ending
conversations and being nervous when making phone calls (Colbert, Yee, &
George, 2016). And older adults (commonly called ‘silver surfers’) are also taking
advantage of access to the Internet and smartphones so that they can be in a state of
constant communication with others (Colbert et al., 2016). One study even
reported that it is parents, not teenagers or tweens, who spend the most time in
front of screens (Molina, 2017).
The challenge to ‘unplug’ is spawning new opportunities for the tourism
industry as tour operators are advocating device-free vacations. For example,
Intrepid Travel, an adventure travel company, now offers “Digital Detox Trips” in
which the participants pledge not to bring along any digital devices and must resort
to paper notebooks to record their impressions (Glusac, 2016). Renaissance Pitts-
burgh’s family detox package trades digital devices for board games and cards
during the family’s stay. Further, digital detox retreats have sprung up with offers to
disconnect, for a price; and resorts offer an ‘iPhone crèche’ where you can leave
your mobile devices. In the private sphere, the negative impacts of IT-related
overload have been linked to the exponential growth in the use of Information
Technologies.
State legislatures are now providing motivation to unplug in other ways. In
Hawaii, ‘smartphone zombies’, or pedestrians so distracted by what’s on their
phones that they are oblivious when crossing streets, are fined. Further, 47 states
and the District of Columbia have banned texting while driving (Molina, 2017).
In the Net Generation, hyperconnectivity manifests in a number of new
behaviours. Net Geners are people born after 1980; this includes the groups called
Millennials and Generation Y. Net Geners have now developed the skill of
‘phubbing’ during conversations, which means that they can maintain eye contact
while also texting. However, the eye contact may not be as meaningful as they
think, because just having the phone in sight likely reduces their conversation
partners’ perception of closeness, trust, and relationship quality (Colbert et al.,
2016). Another task that Net Geners may not be as good at performing as they
think they are is media multitasking. Media multitasking entails checking mobile
phone content as often as every 30 seconds, or even more frequently (Rosen, Carrier,
& Cheever, 2013), an activity which incurs high switching costs as multitaskers shift
frequently from one task to another. This may explain why younger users of
mobile phones are significantly more likely than older users to experience overload
from information and communication messages delivered by their phones (Saunders,
Wiener, Klett, & Sprenger, 2017).
Some claim that such heavy use of smartphones can lead to a particular type of
addiction called mobile email addiction. Symptoms of this addiction are that the
mobile phone user becomes preoccupied with using the smartphone, has difficulty
in controlling or quitting the behaviour, and gets angry or frustrated when inter-
rupted (Turel & Serenko, 2010). Attention deficit hyperactivity disorder (ADHD),
depression, and social phobia as well as hostility have been identified as symptoms
of Internet addiction in adolescents (Yen, Ko, Yen, Wu, & Yang, 2007).
Mobile email addiction is viewed as one form of Internet addiction. Kandell
(1998) defined Internet addiction as psychological dependence on the Internet. The
dependence is characterized by: (1) an increasing investment of resources in Inter-
net-related activities; (2) unpleasant feelings (e.g., anxiety, depression, emptiness)
when offline; (3) an increasing tolerance to the effects of being online; and (4)
denial of the problematic behaviours (Kandell, 1998, p.11). In short, Internet
addicts find it hard to unplug from the Internet, and they suffer from withdrawal
upon doing so (Davis, 2001).
Among American psychologists and psychiatrists, there is no recognition of IT
addiction (i.e., Internet, SNS, or mobile email addictions) or stress. That is, no
form of technology addiction or technostress is listed in the current version of the
Diagnostic and Statistical Manual of Mental Disorders (DSM-5), which contains a formal
list of mental disorders. This is because many believe that the term ‘addiction’
should only be used in respect to chemical substances (Turel & Serenko, 2010) or
when the person has a physiological dependence on some stimulus, which is
usually a substance (Davis, 2001). Others believe that a common set of symptoms
and diagnosis criteria are missing (Turel & Serenko, 2010). Hence, in this book we
use the term Pathological Internet Use to refer to the behaviours described in the
literature as IT addiction. As we discuss in Chapter 4, the lack of control con-
sciously exerted by the brain during information processing contributes heavily to
IT addiction. These behaviours can be specific or general. They are considered
specific when a person is dependent on a particular function of the Internet such as
online auction services, sexual material/services, or gambling. They are considered
general when the Internet is overused in such cases where people waste time
online without a clear objective. But whether it is called IT addiction, Internet
addiction, specific PIU, or general PIU, it is a force to be dealt with in our society.
In the rest of this book, we will tell you why. In addressing this force as a society,
we can reap the benefits of technology while staving off its harmful effects.

What’s coming next? A sneak preview


The following six chapters dive into the details of scientific practices borrowed from
philosophy, behavioural and cognitive psychology, neurophysiology, and artificial
intelligence to deepen our understanding of the dark side of IT. In this book we
address three main questions: (1) Why do some individuals experience IT-related
overload while others do not? (2) Why do some individuals experience IT addiction
while others do not? and (3) What are the consequences of the dark side of IT?
Thomas Edison once said, “Results! Why, man, I have gotten a lot of results. I
know several thousand things that won’t work” (Forbes, 1921, p.89). We found
this to be the case in our painful attempts to measure IT-related overload, which
we describe in Chapter 6. In this book we present results we have collected in our
own research and draw on data collected by others to support our arguments.
Some studies suggest ways to tackle IT-related overload that are likely to work
well, while others indicate approaches that are less convincing. We realize that we
may not have understood the IT-related problem fully. Still, we believe we are
getting closer every day. Our Emotional-Cognitive Overload Model (ECOM) in
Chapter 3 is based on cognitive theories of memory architecture that are intro-
duced in Chapter 2. We also build on the work of other scientists in order to
better understand the impact of emotion and pools of resources on IT-related
overload and IT addiction. The ECOM suggests that not every person experiences
overload or addiction in the same way – if at all.
Our book is not about bashing IT. Our interest in the dark side is triggered
by our wishing to better understand the possible effects of IT, both positive
and negative. Now is the right time to take a serious look at ‘responsible’ IT
use and the consequences of its mindless use. In Chapter 7 we suggest some
ways of acting responsibly in relation to IT. We recognize that Information
Technologies were originally built to serve humanity. However, it seems that
way too many people are being held hostage by various forms of IT: They
suffer from IT-related overload or IT addiction (or both). But blindly bashing
technology or imposing rules and policy without a deeper understanding of the
phenomena would be futile; it would only deepen misunderstanding of the role of
technology.
The brain and availability of resources are key in understanding the phenom-
enon. Evolutionary theorists determined that Neanderthals had brains of similar size
to modern humans, sometimes even larger (Ponce de León, Golovanova,
Doronichev, Romanova, Akazawa, Kondo et al., 2008). Through evolution, tools
have served as resources enhancing our efficiency in our natural environment. Tool
use has been one of hominids’ main competitive advantages over other species and a
driver of brain development. However, if not used mindfully, new digital tools may
have the inverse effect on our brain (Chapters 3 and 4).
Hyperconnected managers and employees may suffer from work-family conflict or
jeopardize their work-life balance (Chapter 5).
Grinbaum and Groves (2013) emphasized that any innovation ultimately creates
new social practices and institutions, transforming our day-to-day interaction with
the world and each other. In relation to IT innovation, those new practices and
institutions are key to handling its dark side. The future is bright because it is ours
to shape by building the needed institutions and social practices.
2
THE BRAIN AND PARADIGMS OF THE MIND

In the mid 1800s, Ignaz Semmelweis was puzzled when he noted that the
number of deaths caused by puerperal (childbed) fever was more than three
times higher in the obstetrical clinic he was supervising than in another compar-
able obstetrical clinic in the same hospital, Vienna General Hospital. Mothers
begged to be sent to the clinic that was not run by Semmelweis. In his investi-
gation, he observed that the two clinics shared the same climate and that his
clinic had far fewer patients. The first step in solving the mortality rate mystery
was when Semmelweis read the pathology report of a doctor who had died after
being infected by an accidental poke from a student’s scalpel while he had been
performing an autopsy in Vienna General Hospital. Semmelweis realized that the
doctor’s autopsy displayed a pathology similar to that of the women who were
dying from puerperal fever in his clinic. He quickly linked the cadaveric con-
tamination to puerperal fever. To fend off the pathology, he proposed having all
doctors in his clinic use a chlorine handwash. That practice reduced mortality
rates to less than 1 per cent in his clinic. The handwashing practice was at odds
with the established scientific and medical thinking of Semmelweis’ time, and he
could not explain why it worked so well. It took decades for the new scientific
practice, first introduced by Semmelweis in 1847, to be accepted. It was not until
Louis Pasteur developed germ theory and Joseph Lister confirmed Pasteur’s
theory that an explanation of the benefits of Semmelweis’ hygienic practices was
discovered (Wikipedia, 2017).

Revolutions and paradigms


In 1847, Semmelweis was bucking the science of his time. By science, we mean
the organized school of thought relating to specific kinds of tradition in scientific
practice, which include the combination of accepted laws, theories, applications, and
18 The brain and paradigms of the mind

instrumentation. Traditions of scientific research are therefore scientific languages
with their own sets of concepts, conventions, codes, and rules that provide a way
to look at the world through different lenses (Popper, 1959). As Kuhn (1962)
informed us, “If science is the constellation of facts, theories and methods… then
scientists are the men who, successfully or not, have striven to contribute one or
another element to that particular constellation” (p.1). In some cases, a scientist’s
contribution may result in a paradigm shift, or a fundamental change in scientific
practices. By requiring physicians to wash their hands, Semmelweis introduced a
paradigmatic shift in medical practices. The world view of medicine in 1847 did
not include hygiene, germ theory, or handwashing practices. Medicine was ripe for
a scientific revolution, or a dramatic change in science.
In this chapter, we present the main paradigm shifts that relate to the
cognitive revolution – “the mind’s new science” (Gardner, 1987) – that stems from
the philosophical concept of mind-body supervenience. We then apply the lens
offered by this scientific revolution to understand brain overload and IT
(Information Technology) addiction.

Mind-body supervenience in philosophy: a precursor of paradigm shifts


In philosophy, supervenience is the ontological relation that occurs when upper-level
system properties are determined by their lower-level properties, making them
hierarchical in nature. For example, software supervenes on hardware because
software applications cannot be run without some form of hardware. The mind-
software applications cannot be run without some form of hardware. The mind-
body supervenience dispute has been going on since before the birth of Christ.
The dispute centres around what is in charge (the mind or the body) and how it
operates. Multiple answers have been provided, depending on the scientific
practice being deployed.
The mind-body supervenience problem is central to philosophy. Mind-body
supervenience holds that mental phenomena must be anchored in some type of
physical system. In the Scholastic-Aristotelian tradition, the mind is conceptualized
as a tabula rasa (viz. blank slate) at birth. From an epistemological perspective,
human mental content is therefore built from bodily perceptions and experiences.
Descartes (1644), the father of modern Western philosophy, rejected this view and
stated “Cogito, ergo sum” (I think, therefore I am). He conceptualized the mind as a
thinking substance distinct from the corporeal substance of the body.
Later, the philosopher Kant (1781–1787/2003) criticized Descartes’ rationalist
theories of the mind. Kant stated:

The “I think” must be able to accompany all my representations: for otherwise
something would be represented in me that could not be thought at all, which
is as much as to say that the representation would either be impossible or else
at least would be nothing to me.
(B131–B132)

Kant introduced the notion of mental representation and schemata in the first
chapter of his Critique of Pure Reason (1781–1787/2003). Kant described schemata as
a form of analysis in interposition between the sensory data and the abstract a priori
categories in the mind. Schemata are dual: one part is rules (i.e., logic) and the
other is empirical perception (i.e., image). Kant wrote, “This representation of a
universal procedure of the imagination in providing an image for a concept, I
entitle the schema of this concept” (A140).
Later, Diderot (1818–1819) conceptualized the mind through a metaphor – the
soul’s vessel. He argued that when the material dispositions of the brain are inadequate,
the mind is not able to navigate the body vessel. In fact, he considered the mind
to be a material entity (i.e., the brain) that, when it functions adequately, controls
the body.

A history of paradigm shifts


The extensive debates regarding mind-body supervenience in philosophy have
spawned discussions regarding the supervenience of cognition and emotions, or
of mind-gut feelings, in psychology. The primary paradigm shift from philoso-
phy to psychology occurred when Wundt challenged one of Kant’s postulates
that mental processes could not be empirically investigated. In 1879, Wundt
founded the first laboratory of psychology, at the University of Leipzig, there-
fore distinguishing psychology as a separate scientific practice from philosophy
and biology. Wundt defined psychology as “the study of conscious experience
as experience” (Gardner, 1987, p.102). Generally referred to as the father of
experimental psychology, Wundt applied the method of introspection, which
involves attending to one’s physiological sensations and reporting thoughts or
images as objectively as possible. The introspection method focuses on the
sensation rather than on the stimulus.
A second major paradigm shift in psychology occurred when Ebbinghaus (1885/
1913) applied natural science methods to capture internal mental processes.
Ebbinghaus was the first scientist to establish memory as a proxy for studying the
human mind. In particular, he measured learning and forgetting performance
curves to study the effects of internal processes. He measured performance as the
retention of bits – that is, single elements of information (e.g., the letter ‘A’ or
the number ‘5’) – during learning.
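Ebbinghaus’s forgetting curves are often summarized today as an exponential decay of retention over time. The functional form and the strength parameter in the sketch below are a common modern simplification, not values taken from his original data:

```python
# Common modern summary of an Ebbinghaus-style forgetting curve:
# retention R(t) = exp(-t / S), where S is a memory 'strength' parameter.
# The exponential form and the strength value are illustrative assumptions.
import math

def retention(t_hours, strength=20.0):
    """Fraction of learned material retained t_hours after learning."""
    return math.exp(-t_hours / strength)

for t in (0, 1, 24, 48):
    print(f"after {t:>2} h: {retention(t):.2f}")
```

Retention starts at 1.0 immediately after learning and falls off steeply at first, which is the qualitative shape Ebbinghaus observed in his performance curves.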
While progress in psychology was being made in Europe, James (1890) took an
“American pragmatic approach.… He suggested that psychological mechanisms
exist because they are useful and help individuals to survive and carry out impor-
tant activities of living” (Gardner, 1987, p.108). This third paradigm shift, which
focuses on survival, is referred to as functionalism. According to functionalists,
internal mental processes provide people with the means (i.e., intent) and ends (i.e.,
goal) to adapt in order to survive in the environment. Through this shift, func-
tionalism highlighted the importance of mental dispositions and purpose, a word
derived from the old French porpos meaning aim or intention.

The work of functionalists surely fed one of the most famous disputes in psy-
chology: the James–Cannon controversy on emotion (1884–1929). Cannon (1914,
1927, 1929) stated that brain activity causes both an emotional experience (e.g.,
fear) and peripheral responses (e.g., sweating), which is the central view on emo-
tion even today. James (1884, 1890, 1894) favoured a peripheral view in which
bodily responses must occur before the feeling of fear. The debate still animates
research in psychology and neuroscience (Ekman, 1984; Cobos, Sanchez, Garcia,
Vera, & Vila, 2002). Both the central and peripheral views are still present in
research on emotions and cognition, and a plethora of definitions for the concept
of emotion are actively circulating in the scientific community. In this book, we
use Scherer’s (1994) definition of emotion: the “intelligent interface that mediates
between input and output” (p.127). This means adopting, or daring to adopt, a
central view. We distinguish emotions from primary drives such as hunger (Tom-
kins, 1984) and from feelings, or the subjective experience of emotion. We consider
emotion as having a specific intentional object (Frijda, 1986), such as a ‘loved one’
or a ‘feared one’.
Interestingly, this line of reasoning smoothly shifts the mind-body supervenience
problem toward a new problem, that of cognition-emotion supervenience. In psychology,
cognition-emotion supervenience is also referred to as the ‘interplay of affect and
cognition’ or, more commonly, ‘feeling and thinking’. Obviously, the solutions
proposed in solving the controversy (i.e., central versus peripheral) have shifted as a
function of the dominant paradigm.

Behaviourists, cognitivists, and their revolutions


The numerous and extended debates on mind-body supervenience in philosophy
have varied according to the two dominant paradigms: behaviourism and cogniti-
vism. Behaviourists mostly focus on modelling human behaviour, starting from an
animal’s point of view. Typically, they are concerned with stimulus and ante-
cedents. In contrast, cognitivists try to open the black box of the mind. They try to
study conscious mental processes and operations. In this section, we point to
the key contributions of these two paradigms and the revolutions they prompted.
We explore their distinctive scientific practices, tentatively reconciling them in the
chapters that follow as we seek to understand IT-related overload.

Stimulus and antecedents


Watson (1913) launched the behaviourist revolution with a strong focus on the
consequences of the activation of the physiological system on animal behaviour.
Indeed, behaviourism is defined as the study of the effects of the environment on the
observable behaviour of individuals without invoking hypothetical events or
aspects of cognition that occur within the mind (Carlson & Buskist, 1997). The
underlying focus of behaviourism is on patterns of observable behaviours provoked
exclusively by stimuli. Behaviourist research identifies the determinants or
The brain and paradigms of the mind 21

antecedents of individuals’ behaviour while ignoring their mental processes.
Antecedents – from the Latin antecedens (viz. that which goes before) – precede the observed
behaviour and are hypothetically governed by a set of natural or social laws. The
behaviourist school is responsible for tremendous progress in experimental psy-
chology through the operationalization of the Stimulus-Response (S-R) scheme.
Behaviourists developed mechanistic models, where environmental conditions are
changed to alter the probability of certain behaviours occurring and researchers use
statistical approaches in their laboratories (see the work of Pavlov, 1927; Skinner,
1935; Hull, 1943). Behaviourists accept the biological nature of organisms and
discovered laws of behaviour.
Interestingly, emotion was often at the core of most behaviourist experiments
during the early 20th century, though only implicitly. For example, Pavlov (1927)
conditioned defensive responses in animals. He presented subjects with a neutral
stimulus (e.g., a tone) in parallel with another stimulus (e.g., an electric shock). In
so doing, he demonstrated that the neutral stimulus when used on its own would
acquire the affective properties of the other stimulus. He noted it was possible to
pair the neutral stimulus to another stimulus such that the neutral stimulus would
be valenced positively (i.e., as reward) or negatively (i.e., as punishment).
In the classical behaviourist paradigm of associative learning, antecedents are sti-
muli that cue behaviour. They are also perceptive and depend on the environment.
Consider Pavlov’s example. The bodily perception of the tone became associated
in the animal’s mind with the electric shock, cueing defensive behaviours. Later,
the tone presented alone directly cued the defensive behaviour without the deliv-
ery of the electric shock. Thus, in order to treat maladaptive behaviour, such as the
subject running away or acting aggressively when it hears a bell, behaviourists
propose replacing one behaviour with a healthier one until the original behaviour
is extinguished. That could be done, for example, by presenting the animal with a
positive reward each time the bell rings. Over time, the defensive behaviour is
extinguished. The behaviourist approach is therefore heavily concerned with
rewards (negative and positive) related to the biological centre of reward, which is
today referred to as the Brain Reward System (BRS).
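The associative mechanics just described can be given a formal sketch. The Rescorla–Wagner rule below is a later, standard formalization of conditioning – our choice of illustration, not a model the chapter attributes to Pavlov or Skinner – and the learning rate of 0.3 is an assumption:

```python
# Rescorla-Wagner rule: associative strength v moves toward the outcome
# actually experienced on each trial (a prediction-error update).
# The learning rate (0.3) is an illustrative assumption.

def rescorla_wagner(trials, v=0.0, alpha=0.3):
    """Return the associative strength after each trial.

    Each trial is True when the unconditioned stimulus (e.g., a shock)
    accompanies the neutral stimulus (e.g., a tone), False otherwise.
    """
    history = []
    for us_present in trials:
        target = 1.0 if us_present else 0.0
        v += alpha * (target - v)      # error-driven adjustment
        history.append(v)
    return history

# Acquisition: tone repeatedly paired with shock; extinction: tone alone.
acquisition = rescorla_wagner([True] * 10)
extinction = rescorla_wagner([False] * 10, v=acquisition[-1])
```

After ten pairings the tone’s associative strength approaches 1; presenting the tone alone then drives it back toward 0, mirroring the extinction procedure described above.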
Behaviourist approaches based on associative learning have been extensively used
to explain addictive behaviour – repeated and compulsive in nature, affecting indivi-
duals and their surroundings. This mostly originated with Skinner’s early efforts to
understand the rewarding effects of certain stimuli on laboratory rats placed in an
empty box. The rats could ‘decide’ to either push on a pedal to get food (i.e., a
reinforcer) or push on another one to receive electric stimulation in the BRS.
Skinner (1935) observed the frequency with which the animals performed a
behaviour to get a reward. The behaviour (i.e., pushing on one or the other pedal)
became a behavioural antecedent through association. Some rats went as far as
dying from exhaustion, preferring the stimulation to the food. This study provided
the first evidence of brain addiction, which is a form of self-injurious behaviour. It
is a fascinating phenomenon as it does not contribute to the survival of the
organism. On the contrary, it may even lead to its extinction, as in the case of
exhausted rats. Interestingly, there is no evidence of rats suffering from brain
addiction outside of the laboratory, where the environment is not constructed to
facilitate the addiction. That is, in the real world, the rats need to actively search
for potent reinforcers such as food.
With the evolution of behavioural paradigms, antecedents have been used
extensively as predictors of behaviour. Congruently, in the organizational beha-
viour and management information systems literature, antecedents are mostly
deemed causal, behavioural (e.g., maladaptive usage), environmental (e.g., working
pressure and technological features), and psychological (e.g., personality) when
researchers consider brain overload or IT addiction. That is, an antecedent is any
stimulus that in association with another, impacts behaviour. Let us take lying as an
example. Applying the behaviourist paradigm, in the child’s mind, lying may be
associated with an efficient means to manipulate his environment. The child’s goal
is to either avoid punishment or obtain rewards. Thus, applying the behaviourist
paradigm to extinguish the lying, the child should be positively reinforced when he
tells the truth and his lies should be completely ignored (i.e., not reinforced).
Pathological lying in adults occurs when the behaviour has been strongly rein-
forced in childhood as a functional behavioural antecedent. For example, under a
situation of time pressure at work, absenteeism could be a form of deceptive
behaviour used to avoid a high workload.
Mental processes were definitely not a concern when behaviourism was at its
peak. Behaviourists refused to deal with the black box of the mind; they simply
excluded mental operations and processing from their studies.

Cognitive revolution
Other researchers have observed shortcomings with the behaviourist approach.
Simon (1980) claimed that behaviourists did not solve important questions regard-
ing the complexity of the human mind. Lashley (1929) criticized the S-R scheme
as too simplistic. He stressed the importance of understanding the brain by focusing
on complex mental problems, especially problem-solving. While, according to the
literature, the cognitivist school is deemed to have emerged as a paradigmatic
revolution in the 1950s, Knapp and Robertson (1986) stated that “the conditions
so often regarded as necessary before cognitive psychology could develop were
present years earlier” (p.14). For example, Moore (1938) conducted research to
“throw light on the problem of how knowledge gets into the mind” (p.v). Thus
cognitive sciences were forming even earlier than the 1950s. For example, the idea
of the cognitive map and spatial representation was introduced when researchers
began studying the paths of rats searching for food in labyrinths (Tolman, 1948).
The functionalist notion of intention is embedded in the very core of the mental
processes of problem-solving and decision-making.
Cognitivists challenge the S-R scheme and focus their research on how the mind
deals with information. Cognition refers to the metamorphosis that a stimulus (e.g.,
information) goes through while being processed by the human mind. Neisser
(1967) wrote, “Such terms as sensation, perception, imagery, retention, recall,
problem-solving and thinking, among others refer to hypothetical stages or
aspects of cognition” (p.4). Cognitivist theoreticians are the precursors of the
developers of artificial intelligence, computer science, and neuroscience (Gardner,
1987). They deal with the inner processing of information, working on infor-
mation theories, flow, and processing. A non-exhaustive list of core topics in the
cognitive sciences includes perceptual interpretation, information categorization,
evaluation, judgment, problem-solving, decision-making, and learning. Cogni-
tive scientists have programmed digital computers to perform problem-solving
tasks that were challenging to the everyday human (Newell, Shaw, & Simon,
1957). Miller (1956a, 1956b), Broadbent (1958), and Newell and Simon (1972)
were instrumental in providing key insights for understanding memory, attention
selection, information processing, and therefore brain overload. Newell et al.
(1957) designed and implemented a class of information- or list-processing lan-
guages that incorporate basic information processes. They wrote programs such as
Logic Theorist to run on a computer designed to solve difficult problems (Newell
et al., 1957). They accounted for the ‘behaviourist magic’ that occurs inside the
human mind preceding behaviour. Their key contribution was to establish com-
plex problem-solving processes and empirically test heuristics, or the operational
path taken to solve a problem expeditiously. They stated that “the programmability
of the theories is the guarantor of their operationality” (Simon & Newell, 1971, p.
148). Cognitivists reverse-engineered the mind and developed new models of
memory architecture following two main approaches: computationist and associative.
Computationist models focus on formal operations using symbols to be computed
during information processing. Scientists such as Miller (1956a, 1956b) and
Broadbent (1958) demonstrated that memory capacity is limited to 7 ± 2 bits of
information. Furthermore, they contributed to the theorization of the process of
attention by viewing it as a limited cognitive resource. Bits of information are
organized into chunks; that is, assemblies that bind a larger number of elements
(i.e., information). Based on their relevance (i.e., congruity) with other elements in
memory, they are structured in the form of schemata. All schemata are combined
to form a set of mental representations.
Associative models (Anderson & Bower, 1973) focus on the activation of
mental representations through node activations. These mental representations
are stored and later activated when the brain needs to process new elements of
information (bits or chunks). To solve problems, the individual consciously acts
upon these mental representations, which are fused with their stored personal
history and mental thesaurus. Interestingly, Kant (1781–1787/2003) used his
understanding of mind-body supervenience to convey the idea that “a schema
is directly activated in terms of sensory experience and yet can be plausibly
thought to provide an interpretation of the experience itself” (Gardner, 1987,
p.58). Schemata are enriched through personal experiences that build nets of
representations explicitly in the memory. Schemata are required for current and
future problem-solving.
As we explained previously, debates on mind-body supervenience have been
waged throughout the history of psychology. The associated theories, concepts, and
methods used to understand the thoughts and actions of human beings have varied
according to the dominant paradigms: behaviourism and cognitivism. In the next
section, we detail the rich tenets of cognitive theories and associated terminology.

Brain, memory architecture, and emotion


The structure of the human brain has evolved over millions of years in response to
our need to survive in hostile environments (Heider, 1946). The human brain has
shown the potential to adapt based on its experiences, processing information and
making decisions while primarily focusing on survival. This evolved nervous system
promotes fitness both directly and indirectly. For example, it directly helps us avoid
harm by signalling when to remove a hand from a hot surface to prevent a burn.
Such reflexes are deemed to be peripheral, rigid, and non-reflective when they
connect one type of stimulus to one type of response, as observed in the S-R tra-
dition. They promote fitness indirectly by modifying biological parameters, such as
the peptide hormones and neuropeptides involved in reproductive behaviours and
the attachment to offspring (e.g., oxytocin). Panksepp, Knutson, and Burgdorf
(2002) argued on the other hand that the human emotional system is proactive
because it anticipates fitness-relevant stimuli based on experience. Therefore, we
learn to avoid putting our hands into a fire after having been burned a few times.

Human nervous system


The human nervous system is divided into the Peripheral Nervous System (PNS) and
the Central Nervous System (CNS). The PNS and CNS act in concert to process
information. The PNS consists of spinal and cranial nerves, the Autonomic
Nervous System, and ganglia, which are clusters of nerve cell bodies located
outside the CNS. Information about perceived bodily changes is conveyed as electrical
signals carried via the nervous network to the CNS. The CNS is divided into the
brain (containing about 1 trillion cells) and the spinal cord. The CNS consists of
three main functional components: the sensory system, the motor system, and
higher brain functions (e.g., the hypothalamus, subcortical, and cortical areas). The
hypothalamus is a portion of the brain that is particularly concerned with
homeostasis. Homeostats – the mechanisms that regulate body temperature, thirst,
and sleep – are energy-consuming physiological processes (Cannon, 1932). The
hypothalamus, a
part of the limbic system, is located below the thalamus. While the cortical areas
are involved in personality, creativity, thinking, judgment, and mental processing
among others, the subcortical areas are involved in consciousness and attention
processes. All three elements are involved in motivation, emotion, learning, and
memory. The limbic system is where the subcortical structure meets the cerebral
cortex, the highest level of neuronal organization and function and the uppermost
region of the CNS. Neurons are connected to one another through complex
synaptic biochemical and electrical mechanisms that support the body’s activities.
Neurons’ neurosecretory cells synthesize and release neurohormones (e.g., dopa-
mine and oxytocin) that circulate through the blood and serve as biochemical
messengers.

Limbic system
The limbic system is a complex collection of structures that is commonly referred to
as the emotional brain or archaic brain. In a nutshell, it includes the amygdala, hip-
pocampus, thalamus, hypothalamus, basal ganglia, and cingulate gyrus. These
structures have been studied extensively in order to understand emotion as well as
memory. As LeDoux (1998) reported, the “limbic system itself has been a moving
target… [with] [m]ountains of data on the role of limbic area in emotion… but
there is still little understanding of how our emotions might be the product of the
limbic system” (p.158). LeDoux (1992) demonstrated that the amygdala is a locus
of synaptic plasticity underlying learned fear. Research has focused on the pathways
between sensory input to the amygdala, and on intercellular signalling mechanisms.
Authors have speculated that this part of the limbic system modulates explicit (i.e.,
declarative) memories formed in other systems (Packard, Cahill, & McGaugh,
1994). Scoville and Milner (1957) demonstrated that damage to the hippocampus
leads to a deficit in Long-Term Memory (LTM). The hypothalamus links the ner-
vous and endocrinal systems via the hypophysis. The limbic system communicates
through secretion of neurohormones and transmitters that control basic bodily
homeostatic states such as hunger, thirst, mood, and fatigue. The limbic system,
particularly the hypothalamus, is involved in social attachment behaviour through the
action of the neurohormone oxytocin. Oxytocin is also commonly called the ‘love’ or
‘cuddle’ hormone. It is a key biological parameter in understanding reproductive
behaviours, attachment to offspring, and thus survival of the species.
The limbic system also plays a role in substance addiction through dopaminergic
projection to the nervous system. Neurohormones such as dopamine are heavily
involved in the BRS mechanism, which is a complex cerebral circuit engaging
specific neuronal pathways that are modulated by cortical oversight systems affili-
ated with emotion, memory, judgment, and decision-making (Makris, Oscar-
Berman, Jaffin, Hodge, Kennedy, Caviness et al., 2008). The major component of
BRS is the mesocorticolimbic reward circuit (Heimer & Van Hoesen, 2006). In
animals and humans, the BRS is responsive to positive and negative reinforcement.
Behaviourists have demonstrated that reinforcement increases the probability of a
subsequent response. When abused, drugs activate the BRS and are as addictive as
natural reinforcers such as food (Volkow & Wise, 2005).
Interestingly, researchers have found that the limbic system is tightly connected
to the prefrontal cortex (PFC) and therefore involved in many brain functions,
such as emotion, LTM, and motivation. Damasio (1994) demonstrated that ana-
tomic damage to part of the limbic system leads to inability to use affective feed-
back in judgment and decision-making. A traumatic brain injury in part of the
limbic system leads to impaired emotional reactions to punishment or reward in
monkeys (Kluver & Bucy, 1937) and causes emotional and behavioural dis-
turbances (Damasio & Van Hoesen, 1983).

Prefrontal cortex
The prefrontal cortex (PFC) plays a key role when someone is dealing with informa-
tion and in decision-making (Ernst & Paulus, 2005). In particular, the PFC helps
us detect errors or recover from disruptions (Rowe, Maughan, Moran, Ford,
Briskman, & Goodman, 2010). The PFC is involved in the central executive
control system that can be broadly divided into cognitive components such as
mental set-shifting, inhibition, information updating, working memory (WM),
response monitoring, and temporal coding (Szczepanski & Knight, 2014). These
activities have proven crucial in effective decision-making. In turn, damage to the
PFC results in impaired recollection- and familiarity-based recognition, failure to
exhibit memory advantages for novel stimuli, poor affect, socially inappropriate
decision-making, failure to use emotion in making decisions, and defective social and
moral reasoning relating to the ability to experience cognitive and emotional empa-
thy. The PFC has extensive reciprocal connections with nearly all cortical and sub-
cortical structures. (For a review, see Szczepanski & Knight, 2014.)
Advances in cognitive neurosciences and understanding of neurocognition
(brain/mind) systems (Tulving, 2002) have relied heavily on identifying biological
processes that support cognition and behaviour. This scientific practice focuses on
the neural connections in the brain that are involved in mental processes. Parts of
the brain, such as the limbic system and the PFC, play an important role in
understanding emotion and cognition. Cognitivists modelling memories have
relied heavily on advances in the field of neurophysiology. For example, the
development of functional neuroimaging techniques (e.g., positron emission tomo-
graphy and functional magnetic resonance imaging) has helped researchers under-
stand how these parts of the brain function as well as their impact on cognition and
behaviours. Despite such technical progress, there is still no comprehensive biological
map addressing the broader mind-body supervenience problem.

Information processing, models of memory, and cognitive


schemata: an overview

Attention and filters


The concepts of attention and filters are critical to the study of information pro-
cessing and brain overload. Attention is defined as “the process of allocating
resources to a stimulus or attributes of a stimulus” (Basil, 1994, p.180). Broadbent
(1958) is best known for developing a popular model of information processing.
This Filter Model of Attention describes a sieve, or filter, that selectively accepts or
rejects information signals. The filter reduces the information processing load on
the cognitive system. It deals with one sensory channel at a time as it determines
what information is recognized. Broadbent’s all-or-nothing model explains the
bottleneck effect that occurs before pattern recognition. However, the model
does not account for what is known as the ‘cocktail party situation’ – that is,
when a person can be immersed in a discussion at a party and still hear her name
being mentioned in another conversation. If the stimulus is not analysed, as
Broadbent proposed, how can its relevance be demonstrated? Nevertheless,
the Filter Model of Attention was of extreme relevance in the evolution of the
conceptualization of memory and attentional processes in the history of cognitive
psychology. It inspired researchers such as Treisman (1964) to investigate atten-
tion selection as a function of information content, and its threshold in activating
hierarchical awareness.
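Broadbent’s all-or-nothing selection can be sketched as a gate that discards every unattended channel before any analysis takes place. The channel names and contents below are our illustrative assumptions:

```python
# A toy early-selection filter in the spirit of Broadbent (1958): only the
# attended sensory channel passes on to pattern recognition; every other
# channel is dropped before its content is analysed.

def broadbent_filter(channels, attended):
    """All-or-nothing selection: pass through only the attended channel."""
    return channels.get(attended, [])

channels = {
    "left-ear": ["your", "name", "is", "mentioned"],
    "right-ear": ["party", "chatter"],
}
print(broadbent_filter(channels, "right-ear"))  # ['party', 'chatter']
```

Because the left-ear words are never analysed at all, this gate cannot reproduce the cocktail party situation – which is exactly the objection raised above.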
Deutsch and Deutsch (1963) suggested a model in which pertinence is the
key to the selection of attention. Based on this Pertinence Model, Norman
(1969) stipulated that all signals are initially analysed and then passed on to an
attenuator before further processing. However, the Pertinence Model is not
economical in terms of the cognitive system’s total load. Furthermore, it
has failed under certain experimental laboratory conditions (Treisman &
Riley, 1969).
The conceptualizations of human memory and attentional resources as limited
and embedded have their roots in the pivotal article by Miller (1956a), “The
Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for
Processing Information”. Miller (1956a) quantified the mind’s limited capacity and
stated in the unitization hypothesis that the only way to increase the amount of
information being processed is “by organizing the stimulus input simultaneously
into several dimensions and successively into a sequence of chunks [so that] we
manage to break … [the] information bottleneck” (p.95). How the items are
organized into chunks determines recall. For example, memorizing and recalling
the letters ‘UAKESUU’ is harder than memorizing and recalling the same letters
introduced as ‘USA UK EU’, because they are grouped together into acronyms
that have associations with terms stored in our memory. Thus, memory limits can
be overcome by encoding items into chunks before transferring them to schemata,
forming mental representations. Mandler (1967) extended the unitization hypoth-
esis by proposing the existence of “superchunks”. The cognitive system’s ability to
overcome its structural limitations opened the way for the two major con-
ceptualizations of memory architecture: the Modal Model and the Full Working
Memory Model.

Models of memory architecture


Our discussion of memory architecture models includes a description of the Modal
Model (three-store model) for short-term, long-term, and sensory memory. We
also explore the roles of the central executive control system and schemata as they
relate to WM.
Modal Model
The most common representation of the structure of the human memory archi-
tecture is the Modal Model (Atkinson & Shiffrin, 1968). The model combines the
short-term storage and attentional system into a single limited-capacity memory:
the Short-Term Memory (STM). The model was developed to represent the capacity
of each basic memory store in terms of time and load and is based on the
assumption of the existence of two distinct structural components, as proposed by
Broadbent: LTM and STM. Incoming sensory information that initially enters the
sensory memory (SM) store soon decays and then is lost in a short period of time.
The STM receives the selected inputs both from the SM and the LTM stores. The
model suggests that the way an input is processed depends on the particular
executive control processes that the individual activates in the STM (e.g., rehear-
sing, searching, deciding, or coding) and on matching with the information held in
the LTM.
The Long-Term Memory is permanent memory that is partitioned into two types
of memory: Explicit (i.e., declarative, conscious) Memory and Implicit (i.e., non-
declarative, non-conscious) Memory. Explicit Memory is a brain construct that refers
to the conscious recollection of factual information, previous experiences, and con-
cepts. It is subdivided into the Semantic Memory that acts as a mental thesaurus and the
Episodic Memory that stores personal experiences (PFC and limbic system) (Tulving,
1972, 1983). Implicit memory is not a brain system construct since it is non-conscious. It
refers to a heterogeneous collection of abilities (Squire & Alvarez, 1995).
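The flows between the three stores can be caricatured as a data structure. The sketch below is our simplification; the capacity of three and the rehearsal rule are illustrative assumptions, not the model’s empirical parameters:

```python
# A toy rendering of the Modal Model's stores (Atkinson & Shiffrin, 1968):
# sensory memory decays quickly, the STM has a hard capacity that displaces
# old items, and rehearsal is the control process that consolidates into LTM.
from collections import deque

class ModalMemory:
    def __init__(self, stm_capacity=3):
        self.sensory = []                       # overwritten on each exposure
        self.stm = deque(maxlen=stm_capacity)   # oldest item displaced first
        self.ltm = set()                        # effectively permanent store

    def sense(self, stimuli):
        self.sensory = list(stimuli)            # rapid decay: replaces store

    def attend(self, item):
        if item in self.sensory:                # selection moves item to STM
            self.stm.append(item)

    def rehearse(self, item):
        if item in self.stm:                    # only items still held in STM
            self.ltm.add(item)                  # can be consolidated into LTM

mem = ModalMemory()
mem.sense(["tone", "light", "shock", "smell"])
for stimulus in ["tone", "light", "shock", "smell"]:
    mem.attend(stimulus)
mem.rehearse("tone")    # displaced by the capacity limit; never reaches LTM
mem.rehearse("smell")   # still held in STM, so it is consolidated
```

After attending four items with capacity three, ‘tone’ has already been displaced, so rehearsing it fails while rehearsing ‘smell’ succeeds – the loss of unrehearsed material from a limited STM.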

Working memory
Norman (1968) argued that the two stores in the Modal Model actually con-
stitute a single storage mechanism: the working memory. The WM’s role is to
activate traces leading to temporal versus permanent change in the cognitive
system itself. Later on, Baddeley and Hitch (1974) focused on the STM store and
developed the multicomponent model of the WM. In their model, the central
executive control system directs, selects, and orchestrates the flow of information
so as to overcome limited structural capacity, thus accounting for the cocktail
party situation.
The distinction between these models focuses on the STM: it is clearly limited
in the Modal Model and more flexible in the Full WM model. A strength of the
Full WM model is that it helps explain information processing in task-switching
contexts. Also, the central executive control system is directly responsible for
coordinating the information used to perform planning activities and make decisions
(Baddeley & Logie, 1999).
Later, Baddeley (2000) added the idea of an episodic buffer, which is similar in
function to Tulving’s (1972) episodic memory. Schemata guide the way information
is encoded and retrieved from the LTM based on the activation of the associated
cognitive network (Bower, 1981).
Emotions and cognition as interdependent in information processing


To understand the interdependence of emotions and cognition, we introduce
Bower’s Associative Model of Emotional Memory (1981) and the Interacting
Cognitive Subsystem (Barnard, 1985). Each approach to emotions and cognition
was principally built using either the Modal Model or WM.

Bower’s Associative Model of Emotional Memory


Bower’s Associative Model of Emotional Memory focuses on both emotions
and cognition. Bower (1981) argued that cognitive processes are necessary for
eliciting and experiencing emotions. His model is based on the information
processing view of cognition as an associated network that activates nodes
representing specific concepts, events, and clusters in the LTM. Each emotion
node is connected to another node representing valenced events (i.e., episodic
memory) and/or valenced concepts (i.e., semantic memory) that have been
activated previously in association with past experiences of the related emo-
tions. Memory activation spreads from emotions and from concepts held in the
memory. Thus, the cognitive schemata are reinforced through a feedback loop
as a function of past experiences. Bower (1991) wrote that “about six (plus or
minus a few) basic emotion nodes are biologically wired into the brain, and…
a number of innate as well as learned environmental situations can turn on a
particular emotion node” (p.32).
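Node activation in such a network can be sketched as spreading activation over a graph. The nodes, links, and the 0.5-per-hop decay below are our invented example, not parameters from Bower (1981):

```python
# Spreading activation over a toy associative network: activating an
# emotion node primes the valenced events linked to it, with activation
# attenuating at each hop. Node names and the decay rate are invented.

NETWORK = {
    "fear": ["threat-event", "dark-street"],   # emotion node and its links
    "threat-event": ["dark-street"],
    "dark-street": [],
    "joy": ["party-event"],
    "party-event": [],
}

def spread(start, decay=0.5, depth=2):
    """Propagate activation outward from one node, attenuating per hop."""
    activation = {start: 1.0}
    frontier = [start]
    for _ in range(depth):
        nxt = []
        for node in frontier:
            for neighbour in NETWORK[node]:
                gained = activation[node] * decay
                if gained > activation.get(neighbour, 0.0):
                    activation[neighbour] = gained
                    nxt.append(neighbour)
        frontier = nxt
    return activation

# 'joy' and 'party-event' receive no activation from 'fear': the network
# primes only affect-congruent material.
print(spread("fear"))
```

Activating ‘fear’ raises the fear-linked memories while the ‘joy’ cluster stays dormant, mirroring the affect-congruent priming described next.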
Information processing in memory has proven to be affect-congruent, and its
retrieval is affect-dependent. For example, Bower (1981) demonstrated that
anxious people recall threatening material better than non-threatening material. In
other words, individuals are more likely to process information that is affectively, as
well as cognitively, congruent (i.e., pertinent). By congruent we mean that the
cognitive schemata (i.e., mental representations) in LTM match emotionally and
cognitively with the information being processed. The match includes auto-
biographical events, concepts, and affects. When the information is congruent, it
selectively and automatically primes associated cognitive schemata in memory.
Also, information that is encoded in a similar affective state is easier to retrieve
through reactivation of the associated cognitive schemata. Bower’s model predicts
the affect-congruity phenomenon: the perceptual threshold for affect-congruent
material is lower than it is for affect-incongruent material. Again, anxious
individuals recall threatening material better than non-threatening material (Bower,
1981). Like pertinent material, congruent material is deemed to match based on
the actual schemata stored in the mind. Bower (2001) clearly established that affect
is a resource used by human memory to process information. It presents a selection
bias for self-relevant information. At the same time, emotional information is pro-
cessed in parallel by brain systems (i.e., the amygdala and hippocampus) responsible
for identifying emotional aspects as well as the non-emotional, conceptual, or
semantic aspects of information (LeDoux, 1992).
Bower’s model suggests that links between affect and thinking are neither
motivationally based… nor are they the result of merely incidental, blind
associations, as conditioning theories imply. Instead, Bower (1981) proposed
that affect, cognition and attitudes are integrally linked within an associated
network of mental representations.
(Cited in Forgas, 2003, p.599)

Bower’s model is supported by findings from neuroscience. For example, LeDoux


(1998) emphasized that “conscious memories can make us tense and anxious and we
need to account for this as well” (p.203). That is, the mind can biologically activate a
bodily reaction when recalling memories. Also, neuroscientists have demonstrated
that emotionally tagged signals assist with decision-making processes (LeDoux, 1992;
Damasio, 1994).

Interacting Cognitive Subsystem


Barnard looked at affect-based judgement differently than Bower. Barnard (1985)
proposed a computational approach to model Bower’s associated network. His
model, the Interacting Cognitive Subsystem (ICS), presents eight subsystems of the
mental architecture, similar to Baddeley’s (1986) full WM model. ICS offers a
schematic perspective on the activation of emotion in information processing under
prototypical features of emotion-eliciting situations. The related schematic models
are based on experiences in a given culture or family. The ICS model proposes that
different histories develop as a result of different cognitive-affective routines. The
model operates at the abstract level of meanings, which “enables the concept of an
emotion to be invoked without the experience of it” (Barnard, Duke, Byrne, &
Davidson, 2007, p.1173). This explains why highly personal emotional signals
better match arousal and valenced memory recognition tasks than low personal
signals. ICS proposes that affect-biased judgement occurs at the schematic level
(i.e., abstract model of experience), whereas Bower focuses on specific conceptual
levels. (See Teasdale, 1993, for a comparison of these models.) Both models provide
an interesting explanation of subjective organization of memory (Tulving, 1972).
Emotion and affect have been studied extensively to determine how they influence
cognition and behaviour. Interestingly, Schwarz (1990) indicated that when won-
dering about how we feel regarding a certain target, we may mistake feelings
experienced due to a pre-existing state for a reaction to a target. This may explain
the common saying that ‘you only have one chance to make a first impression’.

Behaviourist and cognitivist: enemy brothers


The scientific practices used to understand the thoughts and actions of human
beings have varied according to the dominant paradigms of behaviourism and
cognitivism. Both have contributed to a better understanding of the mind and
behaviours. However, fundamental similarities between cognitivists and behaviourists
are often overlooked (Bargh & Ferguson, 2000). Above all, both schools consider
that “mental and behavioural processes… can proceed without the intervention of
conscious deliberation and choice” (Bargh & Ferguson, 2000, p.925). In addition,
the study of emotion and feelings initially was perceived as a curse by both para-
digms. According to cognitivist Neisser (1967) and his colleagues, the science of
‘computer-like operation’ was not about emotion. Only later did behaviourists consider
personality traits to be antecedents of behaviour (Zajonc, 1980). Finally, both
paradigms initially addressed emotion and differences in personal disposition as
nuisance variables that needed to be controlled or even ignored.
Eventually, both paradigms evolved toward a greater consideration of emotion
through the common concept of association. Behaviourists pair stimuli together
through conditioning, whereas cognitivists match stimuli to mental representations
through information processing (Skinner, 1985). Both also address emotions and
affect through concepts such as positive and negative nodes or the BRS. Further-
more, the consideration of personality disposition evolved for both. Interestingly,
both approaches even aimed to expel all vocabulary relating to mentalism – the
terminology of the mind, used particularly in psychoanalysis (Gardner, 1987) – from
their scientific practices.

Nurture versus nature debate of individual differences


As we discussed earlier, behaviourists, in the Scholastic-Aristotelian tradition, con-
sider the mind to be a tabula rasa. Thus, their paradigm surely favours a nurture
perspective on the mind, where environmental characteristics shape and predict
behaviours. Consequently, personality is built from environmental and perceptual
experiences and consists of observable behaviours. Here, we define a personality trait
as the tendency to manifest particular patterns of cognition, emotion, motivation,
and behaviour in response to a variety of eliciting stimuli (Fleeson, 2001). Each of
the well-known Big Five personality traits (conscientiousness, agreeableness, neu-
roticism, openness, and extroversion) is therefore another form of behavioural
antecedent. From a nurture perspective, a spoiled or abused child may turn into a
pathological liar – a common trait observed in Narcissistic Personality Disorder.
Skinner’s laboratory rats may have been nurtured to be the very first rats addicted
to electricity or, more precisely, ‘pedal over-connectivity’. Scores on personality
tests have been significantly correlated to patterns of cognition, emotion, motivation,
and behaviour (Eysenck & Eysenck, 1980; Fleeson, 2001).
In contrast, cognitivists see the mind as a pre-existing built-in architecture that
has universal properties. They predominantly support a nature perspective of the
mind. Cognitivists regard personal differences as grounded in information processing
capacity (IPC), that is, the resources deployed to cope with the limited nature of
the memory system. Revelle (1994) argued that “personality effects can be under-
stood in terms of differences in the way and in the rate at which parameters of the
cognitive control system are adjusted to cope with changes in a constantly varying
world” (p.347). Cognitivists predominantly use cognitive style and expertise instead
of personality traits. Cognitive style is “an individual’s characteristic and consistent
approach to organizing and processing information and experience” (Tennant,
1988, p.3). It is a way of processing information that favours certain strategies and
heuristics above others. Expertise is a domain-specific form of heuristic style;
that is, it is derived from prior experience within a certain domain
of information. Such expertise is critical because the more frequently an individual
processes a domain of information, the less attentional resources are required to do
so in future (Schneider & Fisk, 1982). Broadbent (1958) supported the idea of
differences in cognitive processing when he noted that “some individuals show
larger decrements from prolonged work than others do” (p.140). Some individuals
may even be better at chunking bits of information than others – superchunkers
(Mandler, 1967). Their IPC enables them to be especially good at processing
information. Congruently, cognitivists turn mostly to aptitude tests that measure
higher psychological processes such as quantitative reasoning, visual-spatial proces-
sing, working memory, and fluid reasoning. (We discuss such measurements in
detail in Chapter 6.)
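The claim that frequently processing a domain of information consumes progressively fewer attentional resources (Schneider & Fisk, 1982) is often summarized by a power function of practice. The sketch below is purely illustrative; the function name and parameter values are invented for exposition, not empirical estimates:

```python
def processing_cost(trials, initial_cost=10.0, learning_rate=0.4):
    """Illustrative power function of practice: the attentional cost of
    processing an item falls as a power of prior exposures (trials >= 1)."""
    return initial_cost * trials ** (-learning_rate)

# A novice (first exposure) pays the full attentional cost; a frequent
# processor (1000th exposure) pays only a fraction of it.
novice = processing_cost(1)      # 10.0
expert = processing_cost(1000)   # ~0.63
```

The design point is only qualitative: under any such curve, the expert's residual cost leaves more of the attentional pool free for other demands.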
The human memory relies on different structures and associations diffusely dis-
tributed throughout the brain. Neuroscience research has linked specific structures,
such as the limbic system, PFC, or BRS, to specific abilities. While there is
extensive evidence that cognition and emotion are interconnected, the role of the
brain in emotion – more precisely, emotion-cognition supervenience – has not been
clearly established. Emotion is thought to have temporal priority over cognitive
processes. This peripheral view, or mind-gut connection, is surely worthwhile
considering. In contrast, cognitivists with a central view argue that cognitive
processes are necessary for the processing, elicitation, and experience of emotions.
Their approach is supported by neuroscience research which demonstrated that
emotion and cognition are two sides of the same coin.

A common enemy: mentalism


As we previously explained, debates on mind-body supervenience have been
waged throughout the history of psychology. Interestingly, both sides want to ban
all vocabulary relating to mentalism, or unconscious (non-declarative) processes.
The behaviourist demonstrations of classical and operant conditionings of fear
appeared as an explicit (i.e., final) answer to the psychoanalytic unconscious con-
ceptualizations of phobia and anxiety-neurosis (Freud, 1894, 1927). Interestingly,
Freud’s approach to supervenience, referred to as ego psychology, was the first to
tightly combine the mind and body. Freud (1927) speculated that “the ego is first
and foremost a body-ego; it is not merely a surface entity, but it is itself the pro-
jection on the surface” (p.31). Like Kant, Freud considered the interposition
between sensory data and abstract a priori categories in the mind. As a medical
practitioner, Freud went further than Kant. He observed how the ego (i.e., mind)
affected his patients’ bodily reactions. He linked mind and body, or emotion (when
perceived as physiological) to the mind (cognition). As cognitivists suggest, Freud’s
work surely is applicable in terms of the existence of the unconscious as well as the
control it may exert in repressing emotional arousal when processing information
(Nisbett & Ross, 1980; Norman, 1980). Boden (1977) stated that cognitivists “have
to acknowledge that while theorizing purely on the verbal level and lacking any of
the rich conceptual instruments of an artificial intelligence programmer, Freud was
occupied with exactly the same problems as the present-day cognitive psycholo-
gist” (in Wegman, 1985, p.9). Therefore, cognitivists converge with Freud’s pos-
tulation that human beings are scarcely aware of how their higher-order cognitive
processes determine their behaviours (Nisbett & Wilson, 1977).

Consumption of resources: are we all equally affected by IT-related overload?

While individuals are expected to base their decisions on rational arguments, formal
logic, and principles, they are often inefficient in processing information and sol-
ving problems (Tversky & Kahneman, 1973). Research has demonstrated that logic
and ‘gut feelings’ coexist side by side in the human mind to process information
and make decisions. Individuals process information with the support of their
memories, which include some rules of logic and the emotions attached to their
experiences. Conscious memories are embedded with emotional cues to help us
make sense of our environments (Frijda, 1986, 1994). Those memories are built
with information content, organized contextually, and coloured by emotional
experiences. Those memories are used to extract the relevance and purpose of data
and to make sense of the available information. Paying attention requires mental
effort and expends mental energy (Kahneman, 1973). As Wickens (1980) empha-
sized, people can only attend to information to the extent that their mental capa-
city is available. That is, they need to have a variety of resources including mental
capacity and attentional resources.
As noted in Chapter 1, resources are “objects, personal characteristics, conditions
and energies” that have value or can be used to acquire other resources (Hobfoll,
1989, p.516). They may be individually possessed physical, emotional, or cognitive
energy (Hobfoll & Freedy, 1993). Resources are also time-based. They affect each
other, exist as a resource pool (Kahneman, 1973), and are necessary for cognitive
processing (Monetta & Joanette, 2003). Hobfoll (2002, 2011) adopted an inte-
grated view of resources that considers them broadly as aggregates rather than
focusing on a specific resource. Individuals act in ways to conserve these resources
(Hobfoll, 1989; Hobfoll & Freedy, 1993). Interestingly, as we presented in Chapter
1, efficient mental strategies are required in order to spare resources. Developing
coping strategies entails drawing from the available resource pool, but doing so also
consumes energy. Additionally, as the behaviourists and the BRS mechanism
revealed, some ‘curious’ behaviours characterized as addictive are surely exhausting
resources and are, therefore, maladaptive.
Kahneman (1973) postulated that limits in information processing arise because
meaning processing requires considerably more resources than sensory processing.
Processing inputs – meaning and sensory – involves a certain level of effort that calls
upon attentional, cognitive, emotional, and physical resources. How attentional
resources are allocated remains a key question for cognitivists. Most memory the-
oreticians think that the cognitive system has limited attentional resources (Kah-
neman, 1973; Neisser, 1976; Kahneman & Treisman, 1984). However, the debate
about the limitations of cognition is ongoing (Winograd & Neisser, 1992). We
speculate that this is because cognitivists have mostly conceptualized the super-
venience of cognition over emotion in problem-solving and decision-making
activities. We consider IT-related overload to be the state of being challenged in
processing information delivered by IT or in imposing control over IT-related
activities. This challenge is related to both emotional and cognitive resources.
Although the amount of information is commonly blamed, we argue that the
culprit is actually inadequate processing resources. The theoretical underpinnings
of ‘too much information’ or ‘too much connectivity’ are mostly supported by a
surfeit interpretation of Miller’s “Magical Number Seven” article. Too often,
researchers fail to reference his later work on the unitization hypothesis and
ignore the importance of reloading schemata from the LTM in order to chunk
information (Sweller, 1988). Also, too little attention has been given to the
individual’s resource pool. We assume that multiple resources influence the
explicit strategy exerted to counter the limitation of attentional resources during
information processing and decision-making. Information Technology has
become one of these resources, whether it is used mindfully or not. The alloca-
tion of resources must be carefully and consciously reviewed when dealing with
IT-related overload.
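Kahneman's (1973) resource-pool idea can be sketched as a toy allocation routine: concurrent demands draw on a shared, finite pool, and whatever cannot be served is the overload condition discussed above. The task names, demand units, and capacity value below are hypothetical illustrations, not measured quantities:

```python
def allocate_attention(capacity, demands):
    """Toy sketch of a shared attentional resource pool: tasks are served
    in priority order until the pool is exhausted. A non-empty `unserved`
    list corresponds to the overload condition discussed in the text."""
    served, unserved = [], []
    remaining = capacity
    for task, demand in demands:  # assumed already in priority order
        if demand <= remaining:
            remaining -= demand
            served.append(task)
        else:
            unserved.append(task)
    return served, unserved

served, dropped = allocate_attention(
    capacity=10,
    demands=[("write report", 6), ("answer email", 3), ("monitor chat", 4)],
)
# → report and email are served; chat monitoring is dropped (overload)
```

The sketch makes one point only: overload is a property of the ratio between total demand and the available pool, not of any single task or message.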
Coping mindfully triggers the brain’s reward system and is therefore rewarding. It is a metacognitive
activity. How shall we stop a flood of emails without missing relevant information?
Will we later need part of the information we just deleted? Are we so focused on staying
at the top of the pecking order that we are willing to exhaust our resources by
answering all emails? Should we rely on organizational policy in doing so? Each
individual has his own answers and therefore his own strategies (see Chapter 4).
Extreme anger might stimulate someone to decide to delete all her emails, effi-
ciently dumping part of the problem without any immediate consequences. That
could be an extremely wise decision, even if it is perceived as irrational. As Frank
(1988) stated, “Many actions, purposely taken with full knowledge of their con-
sequences are irrational” (cited in LeDoux, 1998, p.36). Corey came up with a
much more subtle approach in his auto-reply (see Chapter 1).
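Between the extremes of deleting everything and answering everything, many users settle on explicit rules. The following is a minimal, hypothetical rule-based triage in the spirit of the strategies discussed above; the sender addresses and keywords are invented for illustration:

```python
def triage(messages, vip_senders, keywords):
    """Hedged sketch of a rule-based coping strategy: keep mail from key
    senders or on key topics, defer the rest rather than processing all
    of it. Rules and thresholds are illustrative, not prescriptive."""
    keep, defer = [], []
    for msg in messages:
        subject = msg["subject"].lower()
        if msg["sender"] in vip_senders or any(k in subject for k in keywords):
            keep.append(msg)
        else:
            defer.append(msg)
    return keep, defer

inbox = [
    {"sender": "alix@corp.example", "subject": "Budget deadline"},
    {"sender": "newsletter@vendor.example", "subject": "Weekly digest"},
]
keep, defer = triage(inbox, vip_senders={"alix@corp.example"},
                     keywords={"deadline"})
```

Such a filter spares resources precisely because the costly relevance judgement is made once, when the rules are written, instead of once per message.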
Information overload is either conceptualized in terms of external variables such
as task requirements or personality antecedents (e.g., behaviourist, organizational
psychology view) or as the interaction of the task and the human information
processing capability or resources (e.g., cognitivist, neuroscientist view). Informa-
tion overload is therefore either conceptualized from an input perspective in the
context of personality-environment fit/misfit, regarding the demands imposed by
the task and personality, or in terms of IPC. As discussed in Chapter 1, we argue
that the term ‘brain overload’ better describes the phenomenon more commonly
called ‘information overload’. That is, brain overload focuses on the brain (i.e., the
processor) and not on the amount of input (i.e., data or information).
Interestingly, the same applies to IT addiction. It is either conceptualized as the
result of external factors such as the amount of time spent connected to IT (Korac-
Kakabadse, Kouzmin, & Korac-Kakabadse, 2001) or antecedents such as narcissism
(Buffardi & Campbell, 2008) rather than being viewed in terms of the mental
processing and heuristics stored and activated in memory linked to the goal of the
behaviour itself. Jacoby (1984) wrote that consumers use cognitive strategies to
limit the amount of information used as part of their decision-making process,
“stopping far short of overloading themselves” (p.434). Similarly, some users make
use of mental strategies and do not experience an ounce of IT addiction, whereas
others cannot stop without experiencing withdrawal (Davis, 2001; Caplan, 2003).
Our claim is that the previous paradigms have their own limitations in under-
standing a complex set of problems. Chapter 3 presents in detail the Emotional-
Cognitive Overload Model (ECOM) that we have developed, along with a review
of the management information systems overload literature. The chapters that
follow reflect on related issues.

Conclusion
Neuroscience research has demonstrated that the brain areas required for cognition
and emotion are highly interconnected (Ghashghaei & Barbas, 2002). Emotion
fosters the prioritization and organization of our behaviour (Barrett & Campos,
1987; Lazarus, 1994), providing input for information processing, judgment, and
decision-making (Frijda, 1986, 1994). In the popular and scientific literature, we
can also find discussions on and evidence of the existence of a ‘second brain’. Also
known as the mind-gut connection, the second brain relates to the gut feelings
transmitted via the stomach, oesophagus, small intestine, and colon to the CNS
(Gershon, 1999). This is exemplified in the “miracle on the Hudson”. An Airbus
A320 made an unpowered emergency landing in the Hudson River near New
York City after both engines failed because of bird strikes. Master of the Guild of
Air Pilots and Air Navigators Rick Peacock-Edwards said, “To have safely exe-
cuted this emergency ditching and evacuation, with the loss of no lives, is a heroic
and unique aviation achievement.” The pilot, a former US Air Force fighter pilot,
clearly used his gut feelings and expertise to execute that manoeuvre, saving 155
lives (Rutkowski, 2016).
Behaviourists such as Zajonc (1980) have conceptualized emotion and cognition
as independent sources of effects in information processing. Emotion is thought to
have temporal priority over cognitive processes. This peripheral conceptualization,
or mind-gut connection, is surely worthwhile considering. Cognitivists on the
other hand have argued that cognitive processes are necessary for the processing,
elicitation, and experience of the emotions. Their approach is correlated with
neuroscience research which demonstrates that emotion and cognition are two
sides of the same coin. Newell, Rosenbloom, and Laird (1989) stated that:
no satisfactory integration yet exists of these phenomena into cognitive science.
But the mammalian system is clearly constructed as an emotional system,
and we need to understand in what way this shapes the architecture, if indeed
it does so at all. (p.127)

Although there are still many open controversies, one bit of scientific evidence
with considerable agreement is that the human nervous system requires energy to
keep its homeostatic and higher brain functions in balance. The nervous system
requires the consumption of endogenous and extraneous resources (Kahneman,
1973; Hobfoll, 1989). Not surprisingly, for its own functioning, the brain receives
20 per cent of the blood, oxygen, and calories supplied to the body. Thus, for the
sake of our discussion, energy is critical. The balance between energy deployed and
energy provided may be the key to understanding a part of the supervenience puzzle of IT-related overload.
3
INDIVIDUAL DIFFERENCES IN
EXPERIENCING IT-RELATED
OVERLOAD

Consider the plight of the two managers in Box 3.1.

BOX 3.1 CHRIS AND ALIX


Chris, a vice president of marketing, just arrived at the office with his favourite
latte from Coffee Amor. As he was walking to work from the subway, he had a
thought about a new strategic initiative. He wanted to write it down before he
forgot it so that he could talk about it with his boss. Just as he set down his latte,
his office phone started ringing and his smartphone buzzed to indicate the arri-
val of a text, followed by another buzz and yet a third buzz. He answered the
phone and one of his direct reports ducked into his office to hand him the
monthly marketing report. While speaking on the phone, he sat down and star-
ted answering the texts. As he responded to the first text, he heard a buzzer…
He wondered, “What was that for? Is it my Fitbit alerting me that I’ve been sit-
ting too long?” No, it was an alert that he was to meet with his boss, Alix, in 15
minutes. He checked his phone to read yet another text. Another notice about
the meeting went off. His latte was ice-cold by now. Chris went into Alix’s office.
She finished writing an email on the screen as she started talking to him. He
started to tell her about his new idea – what he could remember of it – but he
could see that she was not paying any attention because she was checking texts
on her phone while nodding her head in seeming agreement with what he was
saying. Then his smartphone rang and interrupted what little flow of conversa-
tion there was. Chris apologized for the phone interruption and told Alix, “I’m
feeling very frustrated. I can’t seem to get anything meaningful done at work.”
Chris could not process all of the pertinent information that was flying at
him – and Alix wasn’t doing all that well at it either.
38 IT-related overload and individuals

Overload situations such as those described above are all too frequent in today’s
workplace. An understanding of overload using the Emotional-Cognitive Over-
load Model (ECOM) that we present in this chapter could allow managers, like the
two described above, to actively manage their business lives instead of merely
reacting to problems. To build the ECOM we draw not only from the cognitivist
theories introduced in the last chapter, but also from the rich management information
systems (MIS) literature on overload.
MIS is tightly linked to the computational paradigm of cognition, which mani-
fested itself in the research on cognitive styles. Cognitive styles served as a basis for
MIS and decision support systems (DSS) design in the 1970s and 1980s. The
research on cognitive styles was in line with the so-called cognitive revolution and
von Neumann’s (1958) work on the computer and the brain. Cognitive style has
been studied as a constraint in implementing operations research proposals (Huys-
mans, 1970) and as an important characteristic in project teams (White, 1984;
White & Leifer, 1986). Another cognitive topic, hemispherical specialization
(Robey & Taggart, 1982), was also studied by MIS researchers at that time. The
“Minnesota experiments” (Dickson, Senn, & Chervany, 1977) influenced the field
of MIS and DSS by promulgating experimental methods that are core to the cog-
nitivist paradigm. Why did such cognitive research vanish from the constellation of
MIS research? Many attribute this to Huber’s (1983) negative critique of the
appropriateness of cognitive style in MIS and DSS design. More recently, Myers-
Briggs Type Indicators focusing on sensation, intuition, feeling, and thinking
(Barkhi, 2002) have been reintroduced. However, cognition and emotion, or
thinking and feeling, are scarcely on the MIS research map.
Earlier, Mason and Mitroff (1973) clearly stated that “what information is for
one type of person will definitively not be information for another” and that
the job of MIS designers “is not to get (or force) all types to conform to one but
give each type the kind of information he is psychologically attuned to and will use
most effectively” (p.478). Obviously, the MIS discipline was sidetracked from this
goal. While we are still aiming at serving users through effective design, we end up
facing unexpected consequences of IT-related overload. This chapter focuses on
what past research, particularly from the MIS literature, can tell us about this
overload situation and what insights our model, the ECOM, can add.

What the literature says about overload


In their review of information overload across MIS, accounting, management, and
marketing, Eppler and Mengis (2004) underlined the lack of definition, analysis,
and measurement of overload in these disciplines. Consider, for example, the dis-
cipline of MIS. Eppler and Mengis (2004) stated that “the focus of MIS researchers
has been to propose effective counter-measures, and not to study the root causes of
the problem or its contextual factors” (p.339). The lack of systematic research in
the MIS field on overload caused by Information Technology (IT) is indeed sur-
prising given that information systems are designed and developed to provide
information. However, MIS researchers, most notably Chervany and Dickson
(1974), were among the first to consider the concept of overload after the term was
first coined by Alvin Toffler (1970). Further, MIS researchers have contributed in
other ways to the understanding of overload: the role of interruptions (Speier,
Valacich, & Vessey, 1999); the role of schema and memory (Kock, 2000); the
concept that high information load can increase an individual’s information pro-
cessing capacity (IPC) up to a point (Schultz & Vandenbosch, 1998; Grise & Gal-
lupe, 1999–2000); the possible transitory nature of overload (Schultz &
Vandenbosch, 1998); the impact of overload on information search strategies
(Speier et al., 1999); and the multiple dimensions of overload and their link to task
performance (Ahuja & Thatcher, 2005; Tarafdar et al., 2007). Perhaps the greatest
contribution to overload made by the MIS researchers to date is their “focus on
solutions and the effects of new information technology on the individual, the
group, and the organization” (Eppler & Mengis, 2004, p.341).

Definitions of overload
In our reading of the literature, we found that many researchers of overload do not
define the term. Perhaps the authors of these studies think that the concept of
overload is so straightforward that it does not need definition. On the contrary, we
find the concept of overload to be quite complex. Fortunately, more recent
research has tended to offer definitions to clarify the type of overload that is being
studied, though these definitions can differ.
In particular, overload is often defined in terms of input (i.e., electronic junk,
data smog, avalanche of data, informational load), output (i.e., information fatigue
syndrome, analysis paralysis, mental stress, technostress), or a combination of both.
It may be reduced to the ‘number of inputs’ (e.g., amount of data, ideas, messages,
emails) generated by IT usage, such as groupware tools. When overload is defined,
the focus is typically on information or communication overload, or having more
information or communication than can be assimilated, processed, or observed.
Overload is also described as a paradox – for example, “we are not receiving
enough information, too much information is thrown at us” (Koeniger & Jano-
witz, 1995, p.5) – a situation with time pressure – for example, “too many things to
do at once” (Grise & Gallupe, 1999–2000, p.161) – a consequence of lack of
structure and organization in a system (Hiltz & Turoff, 1985), or a symptom of a
failure to create “high-quality” information for management use (Simpson &
Prusak, 1995, p.413).
A few researchers view overload as multidimensional (i.e., qualitative or quanti-
tative) and link it to the performance of a task or role (Ahuja & Thatcher, 2005;
Tarafdar et al., 2007). Quantitative overload is defined as “an individual’s perception
that they cannot perform a task because they lack critical resources” (Ahuja &
Thatcher, 2005, p.435). Qualitative overload is defined as the situation where
“employees perceive assigned work as exceeding their capability or skill levels”
(Ahuja & Thatcher, 2005, p.436) or where there is “a lack of knowledge pressure”
(Pennington, Kelton, & DeVries, 2006, p.26). There is a common thread of inca-
pacity in all of these multidimensional views of overload.
More recently, Karr-Wisniewski and Lu (2010) introduced the concept of
“technology overload” with three dimensions: information overload, communica-
tion overload, and system feature overload. Information overload is by far the most
common type of overload in the literature. It occurs “when an individual’s infor-
mation processing capabilities are exceeded by the information processing require-
ments” (Karr-Wisniewski & Lu, 2010, p.1062). Communication overload is the state
when an individual is unable to process the information that is received from
another person or process (Karr-Wisniewski & Lu, 2010). It is important because it
focuses on how technology can be used to transmit messages. System feature overload is
the state that occurs when the technology an individual has to use to complete a task
is too complex for the task and for the individual (Karr-Wisniewski & Lu, 2010).
Technology overload is very closely aligned conceptually with IT-related overload,
and as we discuss in Chapter 6, Karr-Wisniewski and Lu’s technology overload scales
have been adapted to measure IT-related overload (Saunders et al., 2017).
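When scales of this kind are administered, subscale scores are typically computed by averaging the Likert items belonging to each dimension. The sketch below assumes a hypothetical item-to-dimension mapping and invented responses; it is not the published Karr-Wisniewski and Lu (2010) or Saunders et al. (2017) instrument:

```python
def subscale_means(responses, subscales):
    """Average Likert responses item by item into per-dimension scores.
    The item groupings passed in are assumed, not the published scales."""
    return {
        name: sum(responses[item] for item in items) / len(items)
        for name, items in subscales.items()
    }

# Hypothetical two-item groupings for the three overload dimensions.
subscales = {
    "information_overload": ["io1", "io2"],
    "communication_overload": ["co1", "co2"],
    "system_feature_overload": ["sf1", "sf2"],
}
responses = {"io1": 5, "io2": 4, "co1": 2, "co2": 3, "sf1": 4, "sf2": 4}
scores = subscale_means(responses, subscales)
# e.g. scores["information_overload"] == 4.5
```

Scoring the dimensions separately matters because, as the definitions above make clear, a respondent can score high on system feature overload while reporting little information overload.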

Information processing
Information processing (IP) is the way information is selected, encoded, and activated
in human memory. We find that the overload literature, especially the MIS over-
load literature, tends to be superficial in its application of cognitivist models and
could benefit from cognitivist views of pertinence and emotions involved in IP. In
the MIS field, information is defined as “data endowed with relevance and purpose”
(Pearlson, Saunders, & Galletta, 2016, p.11).

(Mis)application of cognitivist models


Although they did not specifically code for it in their research, Eppler and Mengis
(2004) noted that IPC influences overload. Even though a number of papers
implicitly or explicitly talk about information overload in terms of the inability to
process information, the discussion is often quite superficial or reduced to a listing
of mechanisms that are part of IP itself; for example, “capacity interference”
between signals (Speier et al., 1999), the need to filter out unnecessary information
(Allen & Shoard, 2005), the need to filter and condense data (Chervany & Dick-
son, 1974), “subconscious” and “instantaneous” IP (Simpson & Prusak, 1995,
p.419), or “filtering”, “screening”, and “omitting” information (Hiltz & Turoff,
1985). Some authors refer to Broadbent’s (1958) Filter Model of Attention, Miller’s
(1956a) pivotal article, “The Magical Number Seven”, Newell and Simon’s (1972)
computational model of human problem-solving, or derived applied models
(Streufert & Streufert, 1978). We believe these models can be very useful in more
fully exploring the underlying processes and their interface with the architecture of
human memory. However, we are concerned that in much of the research on
overload, these models are only partially applied or they are misapplied.
Not all MIS research has misapplied cognitivist models. To their credit, Grise
and Gallupe (1999–2000), Jones, Ravid, and Rafaeli (2004), Kock (2000), Minas,
Potter, Dennis, Bartelt, and Bae (2014), Paul and Nazareth (2010), and Schultz and
Vandenbosch (1998) have provided a slightly more nuanced view of individuals’
IP. Interestingly, all but Kock (2000) study cognition and IP in a group context.
None of these actually measure overload, but they do explore how individuals
working in group contexts process information. Further, none actually distinguish
group-level overload from individual-level overload.
While the overload literature, particularly the MIS literature, does not delve
deeply into IP, a few business disciplines have started considering the actual cog-
nitive processes involved in dealing with information overload in greater detail. For
example, a number of accounting articles employ a computational approach of
human problem-solving that is premised upon bounded rationality and limited
human IPC.
Overload in the limited IPC perspective is understood as the computational
excess of information, which leads decision makers to employ simplifying decision
rules involving, for example, chunking heuristics or a
reduction of information search (Payne, 1976). In a different vein, Revsine (1970)
introduced the concepts of schemata and mapping involved in processing financial
data. He found that the abstract structures for processing this data vary as a function
of the number of dimensional units in the data and the information combinations.
He also concluded that adding financial data makes IP more difficult. As noted in
Chapter 1, Rose et al. (2004) reported that the recall of numerical data increases as
information or cognitive load decreases, while affective responses are relatively
unaffected by load. Affective reactions to financial information appear to have
greater persistence in Long-Term Memory (LTM). Decision makers recall affective
responses to numerical data more accurately than the actual data. Further, decision
makers’ reliance on affective responses decreases as the information load or cognitive
load decreases. Finally, load (vis-à-vis overload) is frequently discussed in the accounting
literature (e.g., Snowball, 1980; Iselin, 1988, 1993; Simnett, 1996; Tuttle & Burton,
1999; Swain & Haka, 2000; Rosman, Biggs, Graham, & Bible, 2007).
In another example – this time from the marketing literature – Jacoby (1984)
demonstrated that consumers chunk to reduce information load and avoid cogni-
tive overload. Malhotra (1984) concluded that “overload could occur by way of
the imposed information load exceeding the processing capacity of the consumer,
and/or by producing dysfunctional consequences on decision making” (p.439).
Further, Daniels (2008) suggested the need to consider affect in IP.
While authors in accounting and marketing provide interesting elements to
better understand overload, they frame overload in terms of a computational view
of human memory. They employ highly mathematical abstractions with decision
rules that they manipulate in lab experiments to better grasp the exact nature of the
heuristics used to overcome the limitation of the magical seven plus or minus two.
Interestingly, there have been conflicting results from lab experiments studying
consumer behaviour (Jacoby, Speller, & Kohn-Berning, 1975; Malhotra, Jain, &
42 IT-related overload and individuals

Lagakos, 1982; Jacoby, 1984; Malhotra, 1984). On the one hand, Malhotra et al.
(1982) contended that “consumers are capable of processing fairly large amounts of
information. Yet the capacity of consumers to absorb and process information is
not unlimited” (p.35). On the other hand, Jacoby (1984) concluded that consumers
use cognitive strategies to limit the amount of information entering into their
decision making, “stopping far short of overloading themselves” (p.434).
With a few exceptions (e.g., Cook, 1993; Grise & Gallupe, 1999–2000; Chang
& Ley, 2006), the MIS literature has looked at the phenomenon of overload
without considering the point at which a load becomes overload. This is in dra-
matic contrast to a considerable body of the work in cognitive psychology on
overload, where it has mostly been studied in relation to learning under ‘cognitive
load’, also referred to as ‘mental load’ (Chandler & Sweller, 1991). Chandler and
Sweller (1991) define cognitive load as “the manner in which cognitive resources are
focused and used during learning and problem solving” (p.294). In the cognitive
psychology literature, cognitive overload is a construct that represents the symptoms
that occur when cognitive load overwhelms cognitive resources required for
chunking. The information stored in the LTM in the form of cognitive schemata
has to be (re)loaded into the Short-Term Memory (STM) to allow chunking of
the information (Sweller, Van Merrienboer, & Paas, 1998; Paas, Renkl, & Sweller,
2004). Multiple strategies exist to deal with these situations of insufficient and/or
exhausted cognitive resources. The most common strategy is to increase mental
effort so that information can be chunked meaningfully (Sweller, 1988).

Pertinence from the cognitivist perspective


Much research based on the cognitivist models described in Chapter 2 auto-
matically assumes that a bottleneck is created because not enough of the informa-
tion requiring attention can be filtered so that the really important items can be
processed (e.g., Hiltz & Turoff, 1985; Allen & Shoard, 2005). A basic underlying
assumption of this research appears to be that bottlenecks cannot be avoided. This
research typically does not take into account the role of the personal pertinence of
information as a factor in dealing with overload. Cognitivists since Broadbent
(1958) have moved away from this ‘all-or-none’ conceptualization of filtering.
They suggest that pertinence is actually a form of matching based on pattern
recognition and activation in memory (Treisman & Riley, 1969). This process can
fend off overload by only allowing pertinent information to pass through the filter.
Not all conceptions of pertinence in overload research are in agreement with the
cognitivist perspective that we adopt in this book. Some overload researchers view
pertinence as being embedded in the stimulus. For example, lack of information
pertinence is seen as a lack of filters or in terms of information usefulness (O’Reilly,
1980; Koeniger & Janowitz, 1995; Schultz & Vandenbosch, 1998; Pennington et
al., 2006; Lee, Son, & Kim, 2016). Pertinence may also be viewed as whatever
information the individual considers to be vital (Farhoomand & Drury, 2002).
However, Tulving (1962) and, later, Craik and Lockhart (1972) experimentally
demonstrated that information retrievability from memory is related to the way
that individuals encode the information. Cognitivists claim that pertinence depends
on the schemata encoded in memory, rather than just being a characteristic of the
information itself. What is pertinent for one person may have no importance for
another, as Mason and Mitroff (1973) noted.

Do not forget about emotions in information processing


The role of cognitive styles and positive and negative valence, either as a part of
the processing of information or the information itself, is sorely lacking in over-
load research. Emotions and affects are only indirectly addressed when consider-
ing overload output. For example, Edmunds and Morris (2000) have mentioned
a “feeling” of overload and other authors have addressed situations of stress,
anxiety, fatigue, burnout, or frustration resulting from information overload
(Hiltz & Turoff, 1985; Farhoomand & Drury, 2002; Ahuja & Thatcher, 2005;
Lee et al., 2016). However, as Bower (1981, 1991, 2001) suggested, feeling and
thinking are integrally linked within an associated network of mental repre-
sentations. Interestingly, Schultz and Vandenbosch (1998), when suggesting that
information overload may best be considered a temporary phenomenon relying
on IPC, wrote that “people select cues based on their personal cognitive structure
and biases” (p. 131). This approach, though not explicitly grounded in cognitive
psychology, demonstrates the importance of cognitive schemata during IP to
decide on the pertinence of a stimulus to the individual. In fact, to be processed,
the pertinent stimuli must be an emotional as well as a cognitive match with the
individual’s schemata.

Where cognitivist research can inform overload research


Strides have been made in the existing literature in terms of understanding
overload and introducing information processing concepts. This can help explain
certain aspects of the situation described at the start of the chapter. In particular,
this demonstrates how modern technology delivers information to be processed
(e.g., Denning, 1982; Hiltz & Turoff, 1985; Schultz & Vandenbosch, 1998;
Speier et al., 1999; Edmunds & Morris, 2000; Nelson, 2001; Farhoomand &
Drury, 2002; Jones et al., 2004), which can lead to situations where the indivi-
duals experience frustration (as Chris did). It aptly illustrates how interruptions
exacerbate the overload situation (Speier et al., 1999) and how Chris and Alix
might be experiencing qualitative overload in terms of their managerial roles
(Ahuja & Thatcher, 2005; Tarafdar et al., 2007).
All in all, overload research in business disciplines has been rather skimpy when
it comes to explaining “a neurological phenomenon … caused by brain overload”
(Hallowell, 2005, p.55). Here is where the cognitivist perspective could come in
handy. It could help address these limitations in much of the overload literature in
the business disciplines:
• Muddy conceptualizations of overload. Many writings on overload, especially earlier
studies, did not define what was meant by overload, overlooked its complexity,
or failed to clearly distinguish it as input (i.e., causes) or output (i.e., symptoms).
• Superficial application of information processing models. Although IP models are
mentioned in a number of the writings on overload, the discussion is often
quite superficial – reduced to only listing cognitive mechanisms – or misapplied.
• Individual differences in IPC. Relatively little attention has been given to the
inward experience of the mind in dealing with information. As such, it is often
not recognized that an individual’s pool of resources (e.g., different mental
schemata, time) – and, therefore, IPC – influence perceptions of overload. For
example, overload researchers cite Miller (1956a) but do not consider one of
his key approaches to IP: the unitization hypothesis, which states that the use
of chunks can help avoid information bottlenecks (p.95).
• The role of pertinence and emotion/affect in filtering and processing information. In the
past, overload research mentioning IP has focused only on rational cognitive
aspects and not the role that emotions play in the process. As we discussed in
Chapter 2, researchers implicitly consider cognition to supervene emotion to the
extent that emotion is viewed as a curse (Neisser, 1967). However, cognitivists
(Bower, 1981; Barnard, 1985) and neuroscientists (Damasio, 1994; LeDoux,
1998) who have been studying the interplay between affect and cognition
have demonstrated that only pertinent stimuli that are both an emotional and a
cognitive match with prior encoded schemata are processed further.

Emotional-Cognitive Overload Model (ECOM)


Prior overload research has not fully mined the theories of cognitive psychology. In
particular, we believe that overload research should more fully address: (1) memory
architecture and schemata; (2) pertinence; and (3) emotion. These elements are impor-
tant in understanding how individuals draw on their pools of resources and, therefore,
process information differently. Consequently, we propose the Emotional-Cognitive
Overload Model (ECOM), presented in Figure 3.1.
Our model is basically an input-process-output model with a strong focus on
process. Inputs are most typically pieces of information but may also include other
stimuli such as requests to use new Information Technologies. Our model focuses
on Emotional-Cognitive Overload (i.e., brain overload) as an output. We define
Emotional-Cognitive Overload (ECO) as the negative emotional and cognitive mani-
festations resulting from the inability to process a pertinent stimulus and the cor-
responding emotional and cognitive failures to resolve the situation of high brain
load. That is, when individuals suffer from ECO, their resources are insufficient for
handling the brain load that is created from an incoming stimulus.
The inputs ‘causing’ overload and the outputs of overload have often been discussed
in the overload literature. A contribution of ECOM is that it opens the black box of
IP using psychological theories of emotion and cognition (Atkinson & Shiffrin,
1968; Bower, 1981; Tulving, 2002).

[Figure: an input-process-output diagram. Input (information, requests to use
Information Technology, and tasks) is filtered on the basis of pertinence during
the cognitive processing of mental load: chunking of information occurs in
Short-Term Memory (STM, 7 +/- 2) after reloading information held in the memory
stores in Long-Term Memory (LTM), whose mental framework (including prior
experience of Emotional-Cognitive Overload stored in Episodic Memory) matches
the input in STM with memories stored in LTM and is updated in turn; the process
draws on the pool of resources (Information Processing Capacities and
physiological resources). The output is emotional or cognitive overload:
emotional symptoms (stress, burnout, distractibility, frustration, inner frenzy,
impatience) and cognitive symptoms (dumping part of the problem, more errors,
lower performance, shedding tasks, mental confusion, poorer decisions).]
FIGURE 3.1 Emotional-Cognitive Model of Overload (ECOM)

More specifically, it focuses on processes that take place
in the STM and the LTM. In this section we present a detailed discussion of inputs,
processes, and outputs of ECOM. In particular, we focus on processes when we
describe memory architecture and emotion. We specifically address the organization
of LTM and the role of individuals’ schemata. We also discuss filtering, chunking
processes, and the role of expertise as part of the pool of resources. Finally, we look at
critical problems that must be addressed in future research on IT-related overload:
debunking the ‘Amount Illusion’, and ‘Contingency Boundedness’.

Inputs
The ECOM focuses on two types of inputs: information and requests to use
Information Technologies.

Information
By far the most frequently discussed type of input in the overload literature is
information. Our focus is on information that is sent to (versus sought by) the
individual. Increasingly, information is delivered through IT.
Input in relation to overload typically has been studied in terms of the amount
of information that is needed to create overload, but not the point at which ‘load’
becomes ‘brain overload’. Since many overload researchers do not consider indi-
vidual differences in processing information, they implicitly assume that there is a
common brain overload point for all. This approach of finding a common overload
point is in dramatic contrast to a considerable body of research in cognitive psy-
chology. Further, previous research on overload assumes that when there is too
much information, a bottleneck is created at the filter and brain overload occurs.
We have adopted a more nuanced view of the filter. Our ECOM assumes that
sensory inputs are filtered by the human memory on the basis of their pertinence
to the individual. Only pertinent information is cognitively processed. When
individuals cannot select pertinent information, it becomes a problem of IP
(O’Reilly, 1980). Similarly, Sutcliffe and Weick (2008) observed that overload
occurs because of individuals’ “inability to make sense of demands, capabilities and
context as well as the data” (p.62). Thus, individuals can filter out and reject those
inputs that are not pertinent before they are ever subjected to a deeper level of
processing. Consequently, because of the ability to select pertinent information, the
amount of information that creates ECO varies by individual.
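This logic can be sketched in a few lines of Python (a toy illustration only: the schemata, stimuli, and capacity values below are invented for the example and are not drawn from the research discussed here):

```python
# Toy model of pertinence-based filtering: the same inbox can overload
# one individual and not another, because only stimuli matching a
# person's schemata are subjected to deeper processing.

def filter_pertinent(stimuli, schemata):
    """Only stimuli that match the individual's encoded schemata pass the filter."""
    return [s for s in stimuli if s in schemata]

def creates_overload(stimuli, schemata, capacity):
    """Overload depends on the pertinent load, not the raw amount of input."""
    return len(filter_pertinent(stimuli, schemata)) > capacity

inbox = ["budget report", "sports news", "audit deadline", "gossip", "tax filing"]

# Two hypothetical individuals receive the same five stimuli.
finance_expert = {"budget report", "audit deadline", "tax filing"}
generalist = {"budget report"}

print(creates_overload(inbox, finance_expert, capacity=2))  # True: 3 pertinent items
print(creates_overload(inbox, generalist, capacity=2))      # False: 1 pertinent item
```

The point of the sketch is that the raw amount (five items) is the same in both cases; what varies is the pertinent load relative to the individual's capacity.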

Processes
Once inputs have been selected by individuals for processing on the basis of their
pertinence, they are processed, embedded with emotional valences (Bower, 1981,
1991, 2001), and stored in the more permanent part of the LTM. The processes
include cognitive and emotional processing, which are so intertwined they cannot
really be studied separately. Chapter 6 addresses this issue in greater detail.

Role of emotions
Emotion can either help or hinder the processing of pertinent inputs. To go back
to an example used earlier, people have been found to remember their emotional
reactions to financial information better than the actual numbers (Rose et al.,
2004). An emotional valence, which may reflect either a positive or negative
emotion, is attached to events and concepts that are activated in association with
the prior experience (Bower, 1981).
When the valence of an input matches the valence of a related experience stored
in an individual’s LTM, it is said to be congruent. If the information is not con-
gruent, the individual must strain to process it. Processing is especially challenging
when the individual’s resources are limited. It is then that efficiency in matching
the stimulus to the mental framework becomes more critical. Even with efficient
matching processes, the additional processing strain due to incongruence may lead
to brain load so great that it cannot be processed successfully with the individual’s
cognitive resources. Thus, information lacking congruence with an individual’s
mental framework is more likely to create ECO.
In addition to the inefficiency in processing incongruent, mismatched valences,
there are several reasons why individuals may not be able to process information
load adequately: they may be exhausted and/or lack the proper resources; they
may lack expertise or experience; they may lack time. Multiple coping strategies
exist to deal with these situations of insufficient and/or exhausted resources. The
most common strategy is to make the mental effort more efficient by chunking the
stimulus information into meaningful chunks or superchunks (see Miller, 1956b;
Mandler, 1967; Sweller, 1988). For example, individuals can usually remember
their four-digit PIN numbers. However, if they try to remember a ten-digit US
phone number (as opposed to just checking it on their phones), they probably
break the number up into chunks of three or four: the three-digit area code, the
three-digit exchange code, and a four-digit number. Research has found that
chunking also speeds up the retrieval of information from LTM (Logan, 2004).
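The phone-number example can be made concrete with a minimal sketch: regrouping ten digits into the familiar 3-3-4 pattern reduces ten units in STM to three chunks, comfortably inside Miller's seven plus or minus two:

```python
# Minimal sketch of the chunking strategy described above.

def chunk_phone_number(digits: str) -> list:
    """Split a ten-digit US number into area code, exchange code, and line number."""
    if len(digits) != 10 or not digits.isdigit():
        raise ValueError("expected exactly ten digits")
    return [digits[:3], digits[3:6], digits[6:]]

number = "2125551234"
chunks = chunk_phone_number(number)

print(chunks)        # ['212', '555', '1234']
print(len(number))   # 10 units to hold if unchunked
print(len(chunks))   # 3 chunks to hold after chunking
```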

Role of resources
Research in psychology supports the idea that processing all inputs involves both a
certain level of mental effort and the resources needed to accomplish this effort.
Endogenous cognitive and emotional resources can make the mental effort more
efficient and thereby reduce an individual’s brain load. Cognitive and emotional
resources are combined into the individual’s pool of resources (Kahneman, 1973;
Hobfoll, 1989, 2002, 2011). The individual’s pool of resources may include cog-
nitive ability, personality traits, and physiological resources required to maintain
homeostasis (see Chapter 2).
One personality trait that is of particular interest in situations leading to brain
overload is Need for Cognition. Need for Cognition (NFC) has been defined as “a
need to structure relevant situations in meaningful, integrated ways. It is the need
to understand and make reasonable the experiential world” (Cohen, Stotland, &
Wolfe, 1955, p.291). NFC demonstrates a stable dispositional tendency to engage
in and enjoy effortful cognitive activities (Cacioppo & Petty, 1982; Haugtvedt,
Petty, Cacioppo, & Steidley, 1988; Cacioppo, Petty, Feinstein, & Jarvis, 1996).
Tam and Ho (2005) showed that NFC plays “a pivotal role in influencing a
user’s level of elaboration and choice outcome”, adding that “NFC is a moderator
that induced objective processing of personalized offers” (p.288). Similarly, Alju-
khadar, Senecal, and Daoust (2012) found NFC moderated the relationship
between information overload and decision strategies for online purchasing. In
particular, individuals with low NFC were more likely to use online agents and
accept their recommendations than were individuals with high NFC. NFC also
plays a role in processing information. People with high NFC are assumed to make
more resources available in order to focus their attention. They use systematic rules
to better process information by carefully evaluating more alternatives. In contrast,
people with low NFC turn to heuristics to cope with the high cognitive demands
in memory and make decisions based on peripheral cues (Aljukhadar et al., 2012).
Further, people with a high NFC tend to recall more information than those with
low NFC because they typically apply more mental effort in thinking about and
elaborating on the information when they are processing it (Cacioppo et al., 1996).
Therefore they are more likely than low-NFC individuals to be motivated to exert
additional effort in information acquisition, reasoning, and problem-solving
(Cacioppo et al., 1996), especially when they view the context as relevant
(Haugtvedt et al., 1988). This was supported in our survey of almost 2,000 Dutch
participants who were invited to evaluate the ECO from information delivered by
a ‘video contact’ technology. This technology was designed to allow users to dis-
cuss health or financial matters with specialists online. The results demonstrated
that IT-related overload is negatively related to NFC, but positively related to
memories of past situations of emotional and cognitive overload (Rutkowski,
Saunders, Weiner, & Smeulders, 2013). Consequently, NFC is a personality trait
that serves as an important resource in facilitating cognitive processing.

Role of prior experience and expertise


Brain load also varies as a function of experience or expertise, another form of
endogenous resources. A relatively high brain load that cannot be reduced with
mental effort, either because the person is unwilling or unable to exert the neces-
sary level of mental effort, triggers the emergence of ECO that feeds into the
Episodic Memory (see Chapter 2). This creates a new experience; namely, of a
prior experience of ECO related to some inputs. We define prior experience of ECO
(PECO) as a part of the mental framework that is associated with concepts stored
in the Semantic Memory. It results from the encoding of ECO in the Episodic
Memory. We argue that PECO remains part of the schemata stored in the LTM. It
therefore can permanently affect an individual’s decision-making when confronted
with new inputs.
Expertise relates to heuristics and cognitive abilities (see Chapter 2). A heuristic is
a way of processing information which favours certain strategies above others.
Expertise is an acquired form of heuristic style that is domain-specific. It is derived
from prior experience within a domain of information. It is critical because the
more frequently an individual processes information within a particular domain,
the fewer attentional resources are required to process further information in that
domain (Schneider & Fisk, 1982). Sutcliffe and Weick (2008) have addressed the
role of expertise through their sense-making lens. The sense-making lens suggests
that when people are fully engaged, in a state they call the “ready-to-hand mode
of engagement” (Sutcliffe & Weick, 2008, p.64), they are less likely to experience
information overload. It is only when an individual stands back and observes a
project that overload is experienced in the “present-at-hand mode”. The more
expert individuals have an experientially learned sense of salience (i.e., pertinence),
the better they can process incoming information and subsequently prioritize their
actions. They are more likely to be in a ready-to-hand mode. As their expertise
develops, individuals become more solidly engaged and, consequently, experience
fewer occasions of overload. In contrast, non-experts do not know what to observe
or respond to, nor can they remain focused and in a ready-to-hand functioning
mode like the experts. Sutcliffe and Weick’s (2008) concern is that overload is
viewed in terms of representation and confrontation activities in IP, rather than
construction and interpretation activities. In contrast to the Amount Illusion view,
their approach to overload focuses on interpretation and the creation of meaning
around the pertinent information.
It is likely that most individuals can process low brain loads. The only exception
might be a low brain load with high incongruence. Non-experts may feel quite
pleased with themselves when they actually handle brain load successfully, even if
the brain load was low. In this case the experience is encoded as a positive
experience. However, it would be encoded as a negative experience of ECO if the
low load was handled unsuccessfully. In contrast, the expert may successfully
handle a low brain load but feel quite bored with his achievement. This was the
case in a study of enlisted navy personnel who performed better in situations of
underload, but were less satisfied (O’Reilly, 1980), and with the surgeons that we
discuss in Chapter 6.
The picture is much more complex for conditions of high brain load. Non-
experts likely experience ECO and are not able to process the brain load successfully,
especially when their personal mental framework and the input are incongruent. In
the situations where non-experts are unsuccessful, a PECO is encoded in their LTM,
and they likely manifest a well-known cognitive bias called the fundamental attribution
error (Ross, 1977) or self-serving attribution bias (Forsyth, 1980). In a nutshell, success
is attributed to internal factors (i.e., efficient endogenous resources such as intellective
aptitude), whereas failure is attributed to external ones (i.e., exogenous resources such
as too little time). Self-serving bias allows unsuccessful non-experts to comfort
themselves by rationalizing that they are overloaded because of exogenous factors
such as the time allowed to process the information or the description of the task
(Jones & Harris, 1967; Miller & Ross, 1975). The situation reflects the need for non-
experts to gather a lot of information – so much so that they cannot process it, or
even select it. It reflects a bulimia of hyper-consumption in an attempt not
to miss any important information. In our digital world, where so much information
is being thrown at us, individuals may experience ECO when they lack the expertise
to detect and select the important information. In the event that they do successfully
handle the overload, non-experts tag a positive experience of ECO in their sche-
mata. As their expertise increases, they may attribute the success to their own actions
(Miller & Ross, 1975).
The expert should be able to handle low brain loads and, very likely, high brain
loads successfully. The expert may even find situations with high brain load to be
challenging and arousing. Unlike the non-expert, the expert is better able to process
stimuli successfully (Sutcliffe & Weick, 2008) and to prioritize them as a function
of their pertinence. While the expert may recognize that certain incoming
stimuli are not pertinent, the non-expert may think they are pertinent and thus
increase his brain load as he tries to process them. In contrast, even where brain
load is high, the expert may not experience ECO because he processes the load
automatically. But even experts may experience ECO when congruence is low and
the load is high. Experts, however, learn while processing the input. As is the case
with the non-expert, the expert’s success is tagged positively in memory, while
their failure is tagged negatively in memory. Both positive and negative valences
are encoded in the mental framework as PECOs. These relationships are summar-
ized in the propositions in Table 3.1.

Processing requests to use Information Technologies


While the ECOM can be applied to information, it can also be applied to inputs in
the form of requests for using new IT. Upon receiving such a request, PECO with
IT, encoded in the Episodic Memory, is activated to appraise the new input. The
request to use new IT has a higher chance of being turned down if previous
experience with IT has been negative. In contrast, an individual is more likely to
use new IT if prior experience has been good. The user may be able to chunk
more easily if he is comfortable with using the new technology. This is supported
by Jasperson, Carter, and Zmud (2005) who argue that individuals engage in
metacognitive activities regarding their most recent post-adoptive experiences
when confronted with a decision about whether to use a new technology. This
type of overload is similar to the techno-complexity stressor of technostress.

TABLE 3.1 Summary of issues in processing and output for expert versus non-expert

Expert, high brain load: Will not experience Emotional-Cognitive Overload:
successfully handles the high brain load; challenged; aroused.

Expert, low brain load: Will not experience Emotional-Cognitive Overload:
successfully handles the low brain load; bored; may decide to use resources to
multitask.

Non-expert, high brain load: Will experience Emotional-Cognitive Overload: if the
high brain load is handled successfully, this experience will be encoded as a
positive prior experience when facing Emotional-Cognitive Overload in the future;
challenged/self-worth increased because of growing expertise. Or, if the high
brain load is not handled successfully, this experience will be encoded as a
negative prior experience when facing Emotional-Cognitive Overload in the future;
will blame the amount of information and may experience exhaustion or
frustration; may apply wrong coping strategies and become even more exhausted.

Non-expert, low brain load: Will not experience Emotional-Cognitive Overload:*
pleased with achievement of successfully handling low brain load; may decide to
use resources to multitask.

Note: *For the non-expert with low brain load, it is possible that the individual will experience Emotional-
Cognitive Overload where there is high incongruence.
Outputs: emotional-cognitive IT-related overload


Typically, overload is described in general terms. It has been associated with
numerous cognitive and emotional symptoms. Overload can be associated with
short-term emotional consequences such as frustration (Wickens, 1992; Chen,
Shang, & Kao, 2009), distractibility, inner frenzy, and impatience, representing the
emotional side of the cognitively overwhelming effort (Hallowell, 2005). Longer-
term emotional consequences from sustained cognitive effort can mentally exhaust
the person and cause chronic stress (Schlotz, Hellhammer, Schulz, & Stone, 2004)
similar to burnout symptoms (Maslach & Jackson, 1981).
Overload also is associated with negative cognitive consequences that result from
the mental strain and exhaustion of a person’s resources. This may occur when an
individual faces too great an information load and lacks either the cognitive cap-
abilities to handle the mental effort required (e.g., expertise) or the necessary exo-
genous resources. The individual may choose short-term cognitive strategies that
directly and negatively impact on the output side of the equation. One such strat-
egy is to economize mental effort (Kellogg, 1990) through dumping part of, or the
entire, problem – by shedding tasks, deferring choice (Dhar, 1996), or reverting to
previously learned conventions. Another strategy involves accepting lower levels of
performance – by living with an increased number of errors, reduced information
integration, and impaired decision-making (Bettman et al., 1990; Shiv & Fedor-
ikhin, 1999). When individuals apply these strategies, overload becomes a “tem-
porary affliction that they remedy almost immediately” (Schultz & Vandenbosch,
1998, p.132).

Applying the Emotional-Cognitive Overload Model


Let us return to the situation at the start of the chapter. Box 3.2 describes how the
ECOM applies to this situation.

BOX 3.2 EXAMPLE APPLICATION OF THE EMOTIONAL-COGNITIVE OVERLOAD MODEL
Consider the example with Chris and his CEO, Alix, at the start of this chapter.
Chris describes the many input stimuli that he and his boss are receiving (e.g.,
phone calls, emails, texts, Fitbit alerts, people knocking on the door) and
complains to Alix about the negative consequences on his work (e.g., not get-
ting anything accomplished, frustration). Is he experiencing information over-
load (from too much information) or brain overload (from the brain’s inability
to process input stimuli)? According to Hallowell (2005), Chris is experiencing
brain overload, and we agree. Indeed, it is likely that in the past Chris had
emotional and cognitive experiences of overload and felt frustrated when he
could not solve the situations of high brain load. Now as he receives texts on
his smartphone, or experiences a similar input, a negative valence is attached to
it, therefore activating an anticipation of possible frustration. He is unable to
process all of the pertinent input stimuli and, hence, feels the emotional con-
sequence of frustration. Frequent repetition over time can even lead to the
more serious condition of ECO, which can result in burnout or depression fol-
lowing the feeling of being helpless to deal with the situation. There is a lack of
congruence, and since the emotional valences of his personal mental frame-
work are tagged negatively, he is likely to feel overloaded and averse to
undertaking the activity. However, does Chris (and most of us in the same
situation) blame his own brain or the situation, characterized by the large
amounts of information that he is receiving? There is a high probability that
Chris is displaying self-serving attribution bias and placing blame on his job or
not having enough time to get everything done rather than on not having
enough intellectual ability or other endogenous resources. To help him deal
more effectively with the situation, he could act more like an expert and pre-
serve his cognitive resources by developing a strategy for handling texts or by
turning off his phone. That way, he would be able to better direct his attention
and make better use of his cognitive resources.

Problems to be addressed in future research on IT-related overload


Despite the increasing body of research on information overload, two critical
problems need to be addressed to better understand the phenomenon of IT-related
overload: (1) the Amount Illusion and (2) Contingency Boundedness. The
first problem, Amount Illusion, relates to the assumption that overload is based
primarily on the simple volume or amount of information. This assumption is
frequently applied in overload research. While some people have the ability to
cope with very large amounts of information without experiencing information
overload (Tsai, Compeau, & Hagerty, 2007), others experience overload when
dealing with relatively small amounts (Rutkowski & Saunders, 2010). This sug-
gests that amount alone is not sufficient to explain information overload. We
demonstrate how the ECOM can be used to understand how people experience
overload differently. The second problem, Contingency Boundedness, needs to
be addressed by exploring nuances in the experience of IT-related overload as a
function of context and exogenous resources provided to the individual. Much
past research has studied overload in controlled experimental settings (e.g.,
Chervany & Dickson, 1974; Payne, 1976; Speier et al., 1999). Consequently,
the amount of information has been overemphasized and the impact of con-
textual and temporal differences on individuals’ experiences of IT-related over-
load has not been adequately reported. Therefore, we discuss how organizational
context impacts perceptions of overload. This conversation is extended in
Chapter 5. We also use the lens of time in trying to understand IT-related
overload. More specifically, we focus on three aspects of time: (1) the severity
of time constraints in various professional settings; (2) individual differences in
the perception of time; and (3) task-switching.

Amount Illusion
As we have noted, most literature on overload faults the amount of information as
creating the situation of overload. In particular, information overload is considered
to be based only on the amount of information that is received (e.g., Chervany &
Dickson, 1974; Chewning & Harrell, 1990; Berghel, 1997; Allen & Shoard, 2005).
Some studies measuring information overload assume it occurs when individuals
are faced with an increasing number of alternatives, when they are faced with an
increasing number of dimensions of information available per alternative (Payne,
1976; Cook, 1993; Swain & Haka, 2000), or when they have to process varying
numbers of cues or data items (Chervany & Dickson, 1974; Iselin, 1988; Chewn-
ing & Harrell, 1990). We think this is the wrong way to think about overload. We
use both theoretical arguments and empirical results to demonstrate that overload is
not just about the amount of information.
The ECOM argues that overload is not just about amount. It is premised
on the important role that individual differences play in creating overload experi-
ences. In particular, people have different cognitive and emotional resources and
these are expended differently across individuals, or even within individuals: some
have more cognitive abilities, which makes it easier for them to process incoming
stimuli; some have personality traits, such as Need for Cognition, that influence the
way they process information or other inputs; and some may have expended
considerable resources on earlier IP and so become overloaded in processing additional
stimuli when their resources are depleted. Further, they have encoded in their
schemata negative experiences about earlier failure in processing information or using
new technologies, which create incongruences with the valences of new stimuli that
they are asked to process.
Initial tests of ECOM support the idea that overload is not just about amount. In
fact, in a study of Germans using mobile technologies, the amount of information
was not significantly related to IT-related overload (Saunders et al., 2017). The
same was true in the Dutch study, mentioned above, in which people were asked
to adopt video contact technologies designed to provide information about their
health or their banking transactions and accounts (Rutkowski, Saunders, Weiner et
al., 2013). What was important here was PECO with IT (emotional and cognitive
components); this is key in estimating how individuals intend to respond to
requests to use new technologies (Rutkowski, Saunders, Weiner et al., 2013;
Saunders et al., 2017).

Contingency Boundedness
In developing theoretical models, it is important to consider boundaries (Bacharach,
1989). In deciding which boundaries to consider for ECOM, we seek to answer the
‘where’ and ‘when’ questions (Whetten, 1989). We draw on the literature to con-
sider the organizational context in answering the ‘where’ question. We expand the
previous discussion of temporal context in answering the ‘when’ question.

Organizational context
How organizations are designed impacts and shapes their information processing
requirements (Galbraith, 1974; Tushman & Nadler, 1978; Schick, Gordon, &
Haka, 1990). Schick and his colleagues focus on organizational-level information
overload and neatly assume that individuals’ information processing strategies are a
given. They view information overload as having organizational structure deter-
minants and argue that it is up to organizations to set the appropriate information
processing time for work completion. When the actual time exceeds the allotted
time for a work task, information overload occurs. Tushman and Nadler (1978),
who built on the work of Galbraith, defined information processing as the “gathering,
interpreting, and synthesis of information in the context of organizational decision
making” (p.614). For Galbraith (1974) and Tushman and Nadler (1978), informa-
tion overload occurs when the individual’s IPC cannot handle the information
processing requirements of a task. To deal with overload, Galbraith (1974) sug-
gested creating formal structures, rules, and regulations so that information can be
processed more effectively. Doing so encourages the coordination of information
provision across units and consequently reduces uncertainty. This improved coor-
dination can reduce information processing requirements and positively influence
an individual’s IPC. If rules, regulations, and formal structures prove inadequate,
Galbraith (1974) suggested either reducing the amount of information to be pro-
cessed by creating slack resources or self-contained units, or increasing the capacity
to process information by using vertical information systems or lateral relationships.
It has also been shown that overload is reduced by the organizational redesign of
interactions (Sparrow, 1999) or branch facilities (Meier, 1963). In contrast, changes
in organizational design, such as disintermediation or centralization, might increase
information processing requirements (Schneider, 1987). O’Reilly (1980) demon-
strated empirically that organizational characteristics can cause overload and that
information overload negatively impacts organizational performance.

Temporal context
In many situations, people would not get overloaded if they were just given
enough time to process the pertinent inputs that they receive. Thus, time is an
important boundary in the ECOM. Important aspects of this temporal context are
time constraints, individual perceptions of time, and task-switching.

Time constraints
The importance of time in relation to overload cannot be overstressed. That is why
many researchers talk about how not having enough time to process and under-
stand inputs (i.e., information) can result in overload (e.g., Galbraith, 1974; Schultz
& Vandenbosch, 1998; Kock, 2000; Farhoomand & Drury, 2002; Ahuja &
Thatcher, 2005; Ahuja, Chudoba, Kacmar, McKnight, & George, 2007; Paul &
Nazareth, 2010). In fact, Kock (2000) and Schick et al. (1990) argued that overload
is more about time pressures than it is about the amount of information.

Individual differences in perceptions of time


Individuals have different perceptions of time (e.g., Vatsyayan, 1981; Jaques, 1982;
Clark, 1985; Hassard, 1996; Trompenaars & Hampden-Turner, 2011) or time
visions (Saunders, Van Slyke, & Vogel, 2004). These perceptions of time are mul-
tidimensional and may include the following:

 homogeneity: whether all units of time are the same, as in minutes or hours
(homogeneous), or whether some are qualitatively different from others (epochal);
 nature of flow: like a river or speeding arrow (linear), or seasonal or repeating in
some other way (cyclical);
 direction of flow: one irreversible direction, as in past, present, future (unidirectional),
or, as in mathematics and physics, with positive and negative values for
the flow (bidirectional);
 objectivity: based on fact or quantifiable (objective), open to greater interpretation
and based on personal feelings, emotions, and perceptions (subjective), or
based on people’s shared perceptions and interpretations (intersubjective);
 time orientation: short term or long term;
 chronicity: preferring to do one thing at a time (monochronic) or preferring to
do multiple things at a time (polychronic).

Some of these dimensions might be particularly useful in understanding the
temporal context of ECOM. For instance, psychologists often map ‘objective’ time
to ‘subjective’ time to determine conditions (e.g., fatigue, mental disorders, etc.)
that distort or otherwise affect an individual’s estimation of time. Short-term frus-
trations from overload might be associated with occasions when an individual
subjectively views the time available to process inputs as shorter than it really
(objectively) is. Further, different time visions may be based upon individual traits
such as one’s personal or culturally invoked sense of time urgency (i.e., the fre-
quent concern with the passage of time) (Waller, Conte, Gibson, & Carpenter,
2001) or concern with deadlines. If someone is not very concerned about meeting
deadlines, that individual is not as likely to feel the pressure of an impending
deadline as a time constraint, and hence is less likely to feel overloaded than
someone who is very concerned about deadlines.
Polychronicity relates to the chronicity dimension and reflects a person’s preference
for working on multiple tasks over a period of time. Polychronicity is
defined as “the extent to which people (1) prefer to be engaged in two or more
tasks or events simultaneously and are also actually engaged and (2) believe their
preference is the best way to do things” (Bluedorn, 2002, p.51). At the other end
of the continuum from high polychronicity is monochronicity, or the extent to
which people prefer to do one thing at a time (König & Waller, 2010). It was first
described as a cultural phenomenon by Hall (1983) in his study of Native Amer-
icans. More recently it has been perceived as being transmitted by the cultures in
which an individual is embedded (e.g., national, organizational), but varying across
individuals (Bluedorn, 2002). Both Hall and Bluedorn look at task-switching over
relatively long periods, such as hours or days. In fact, Bluedorn notes that poly-
chronicity is more associated with the sequencing of multiple tasks than with the
speed of their execution. Because of their possible preference for multitasking, it
has been suggested that polychronic individuals may be less prone to experiencing
IT-related overload. This is corroborated in our survey of 1,004 Germans, which
finds that polychronicity has a significant negative relationship with IT-related
overload (Saunders et al., 2017).

Task-switching (a better way of thinking about multitasking)


Polychronicity has been considered a proxy for multitasking (König & Waller,
2010) or as related to multitasking through common elements (Bluedorn,
2002). Many individuals believe that they can multitask, or perform multiple
tasks at the same time. They feel that in this way they can process information
more effectively and make better use of their time since they can easily switch
from task to task (Goonetilleke & Luximon, 2010). More specifically, it is
claimed that people in the Net Generation are adept at multitasking, especially
when it comes to media. They have been shown to have a preference for using
two different electronic or digital devices at the same time (Foehr, 2006;
Wallis, 2006). For example, they might chat with their friends on Facebook
while watching a movie on another medium.
However, it has been demonstrated that people cannot actually process two or
more tasks simultaneously, because doing so would require that the tasks use the
same cortical area in the brain (Klingberg, 2009). That is, the human brain does not
have the resources to process multiple tasks simultaneously. When individuals have
to do more than one task within a given time period, performance in some or all
of the tasks may be affected as they allocate the available attentional resources
among the tasks at hand (Spink, 2004; Spink, Cole, & Waller, 2008). Although
access to technology has increased, we can hardly assume the same regarding indi-
viduals’ IPC. Debate rages about whether an individual actively engaged in one
task is only passively processing other task(s), or whether that individual is equally
engaged in both (all) tasks and hence is truly multitasking. Additionally, level of
performance in handling the multiple tasks may depend on whether or not one of
the tasks is automatic, such as brushing one’s teeth or walking. Individuals typically
can simultaneously walk and carry out other tasks such as looking at their environment
or talking with someone else. Multitasking involves a level of mental
engagement that is represented in the literature as either simultaneous or sequential.
In both cases, switching between tasks, even for a very short time, is required.
Hence, we believe the most appropriate term for this activity is task-switching, not
multitasking.

Switching back and forth across tasks requires a certain reaction time; that is, a
“reaction time switch cost” (Wylie & Allport, 2000). The switching can take place
in a matter of milliseconds. Net Geners who think they are ‘media multitasking’
are actually task-switching across parallel tasks. Oulasvirta, Rattenbury, Ma, and
Raita (2012) found that many individuals interrupted other tasks to check their
smartphones so often that the behaviours could be considered a trigger for habitual
use. Such repeated interruptions take away from the time needed to complete a
task, and they expend extra resources in recovering from them (Speier et al., 1999).
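The reaction time switch cost lends itself to a simple illustration: compare mean reaction times on trials where the task changes against trials where it repeats. The sketch below uses hypothetical reaction times, not data from the studies cited:

```python
import statistics

# Hypothetical reaction times (ms), labelled by whether the trial
# repeats the previous task or switches to a different one.
trials = [
    ("repeat", 420), ("repeat", 410), ("switch", 540),
    ("repeat", 430), ("switch", 555), ("repeat", 415),
    ("switch", 530), ("repeat", 425),
]

def switch_cost(trials):
    """Mean RT on switch trials minus mean RT on repeat trials."""
    switch = [rt for kind, rt in trials if kind == "switch"]
    repeat = [rt for kind, rt in trials if kind == "repeat"]
    return statistics.mean(switch) - statistics.mean(repeat)

# A positive result is the "reaction time switch cost": the extra
# milliseconds expended each time attention is redirected to a new task.
print(round(switch_cost(trials)))
```

Even a cost of a fraction of a second per switch accumulates quickly when, as Oulasvirta et al. (2012) observed, smartphone checks interrupt other tasks many times an hour.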

Conclusion
We started this chapter with a situation that may be familiar to many. There has
been a lot of research about overload that can be useful in understanding this
situation. However, we believe that a cognitivist perspective on overload can not
only add further clarity to our understanding of this phenomenon, but also inform
and guide future research on overload. To that end we proposed the Emotional-
Cognitive Overload Model, which incorporates key cognitivist concepts: (1)
memory architecture and schemata; (2) pertinence; and (3) emotions. To illustrate
these concepts, we described how the ECOM could be used to shed further light
on the opening situation. We then debunked the Amount Illusion and stressed the
importance of Contingency Boundedness.
4
INFORMATION TECHNOLOGY AS A
RESOURCE
From the Bright to the Dark Side of Addiction

The Online Baby System


At the end of the 1990s, way before there was any buzz about IT (Information
Technology) addictions associated with over-connectivity, we were involved in the
development of the Online Baby System (OBS) in the Netherlands (Spanjers,
2012). The system allowed parents to virtually visit their hospitalized premature
infants in neonatal wards using live video streaming over the Internet (Spanjers,
Rutkowski, & Feuth, 2003; Spanjers & Rutkowski, 2005; Spanjers, Rutkowski, &
van Genuchten, 2007). The OBS was designed first as an in-house point-to-point
closed camera circuit, visually connecting the mother in her hospital room with her
baby in the neonatal ward. It evolved into an Internet-based system using a camera
with a special Internet Protocol address, with mobile capabilities (laptop). It was
accessible via analogue or ISDN phone connections. Improvements in the tech-
nology have progressively made the system more accessible and flexible for users to
the point where it is now available on smartphones. The OBS was a key element
in the wider hospital corporate campaign on innovation. It received national media
coverage and was often referred to as “big mother” (Volkskrant, 2001) or, later,
“Baby Mobile” (SBS6, 2009). As a result of efforts by the Dutch Foundation of
Parents of Incubator Children, the presence of an OBS is now a quality criterion
for hospital perinatal centre care (Neokeurmerk). Although the OBS does not
directly contribute to infants’ medical care, physicians and nurses acknowledge its
importance in hospital policy (Spanjers et al., 2003). Through the years of the
project, we ran interviews and focus groups with the medical staff and parents. We
also conducted surveys and studied the login behaviours of users. The results of
parent interviews and surveys indicated that 100 per cent of them would recommend
it to other parents; 78 per cent judged it to add value to the healthcare services; and
85 per cent used it daily.

At the turn of the millennium, data gathered from monitoring login behaviour
showed that the system was part of what we could characterize as a ‘healthy rou-
tine’ for the parents. Parents mostly logged in to the system to monitor their
babies’ feeding times. On weekends, system usage was low since parents visited
their newborns in person. The parents reported a certain ‘feeling of control’ in
being able to monitor their infants. On average, system use dropped after the first
few days but picked up toward the end of the period of hospitalization to a level
exceeding that of the initial use. This was because when parents were informed
that their newborn would soon be discharged, they wanted to make sure that the
baby was doing well and would not be kept longer. Both parents used the system
in most cases (82%), and they were enthusiastic about the possibility of using it to
complement their hospital visits. They particularly enjoyed the fact that they had
constant access: “I could see my newborn all day, and that was great.” Fathers used the
system from their workplaces (19%).
Maybe unwisely from a cybersecurity perspective, the majority of the parents
had been willing to let others access the system, sharing their login information not
only with other family members, including sisters and brothers (56%) and grand-
parents (48%) but also with their best friends (33%) and close work colleagues
(22%). One family had about 40 different OBS users. Another family extended the
login information as far as Brazil. Worldwide networked communication that
monitored the baby was established in the family’s whole social network. This
produced a feeling of closeness amongst the family members concerned.
Overall, parents commented favourably on the system. As one father wrote:

For us online baby has been very important. Our oldest daughter (1½ years) could not
be with her newborn sister because of her age and the fact that she has a disease herself.
Thanks to OBS she could be with her sister every day.

A mother declared of the OBS:

I found it so hard to get discharged from the hospital after giving birth because I could not
be with my daughter the entire day. I was relieved that I could be with her through the
Internet.

Another mother, an executive business woman, explained:

as much as I have been preparing mentally myself that my twins will probably be born
prematurely, and remain in the intensive care unit, I felt hopeless and lonely when
returning home, besides the support of my husband. I knew all [would] be well though.
It was an overwhelming feeling. Being able to monitor their progress day by day was a
tremendous relief to my pain. I could go back to some of my work routine, just being
patient and waiting for them to finally be home.

She was thankful for our work on the system as it helped her to cope with the separation.

The streaming was not always available as there were times when the camera
was switched off by staff. Just under a fifth (19%) of the parents had encountered a
blue screen at the login phase, indicating that the camera was not in operation.
This was not viewed as a concern for the majority of parents: “We were not anxious
when we saw the blue screen. We knew that the nurses were taking care of our baby and we
respected their decision to switch off the camera.”
In pediatrics, physicians and nurses see daily the effects of social deprivation on
newborns and their parents. The OBS reduced these effects to some extent by
enabling social contact between parents and their newborns and, thus, providing a
new form of technological socio-cognitive resource to cope with the difficult time
of separation. The system reduced the parents’ anxiety and, moreover, added
communication opportunities for those in difficult family circumstances (Spanjers &
Rutkowski, 2005; Spanjers et al., 2007). According to the nurses, the system gave
parents the feeling of greater control in their relationships with their babies, and it
also meant that parents were more relaxed when visiting the neonatal wards.
As noted above, with the exception of a few extreme cases, the pattern of parent
usage was pretty healthy. All the parents ‘loved’ the OBS, and most claimed they
were “addicted to the system”. None of the parents claimed brain overload. How-
ever, not all impacts of the system were positive. Some parents displayed extreme
behaviours: 22 per cent of the mothers reported a form of anxiety watching their
baby online and 13 per cent reported problems in disconnecting. As one mother
told us, “It was extremely hard disconnecting from the system, turning the PC off.” We
witnessed anxiety in some parents, such as when they called the wards too often or
showed signs of panic if the screen was under “blue mode too long”. Some mothers,
isolated in their home, stayed with the blue screen as if their life depended on it
and called the wards incessantly.
With the migration from connecting via analogue or ISDN phone lines (first
generation of the OBS) to access via Internet technology (second generation), we
did not see a tremendous shift in the pattern of usage. Indeed, neither the fre-
quency of connections to the system nor the duration of connections differed sig-
nificantly between the first generation (n = 29,663 records) and the second
generation (n = 21,067 records). Interestingly, from 2003 our data became less rich
as parents simply connected to the system in the early morning and stayed logged
in the whole day. The average connection time increased from 5 minutes to 50.
We assumed, in these cases, that it was not likely the parents had been sitting in
front of their PCs the whole time, but rather they had been quickly checking the
status of their babies in the middle of their other activities. Maybe our assumption
was incorrect.
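The generational comparison reported here amounts to a two-sample test on the login records. A minimal sketch of such a check, using hypothetical connection durations rather than the actual OBS logs:

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic: difference in means scaled by the pooled
    standard error, allowing unequal variances and sample sizes."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    standard_error = math.sqrt(var_a / len(sample_a) + var_b / len(sample_b))
    return (mean_a - mean_b) / standard_error

# Hypothetical connection durations in minutes (NOT the real OBS records).
generation_1 = [4.0, 5.5, 5.0, 4.5, 6.0, 5.0, 4.8, 5.2]
generation_2 = [5.0, 4.7, 5.3, 4.9, 5.6, 4.4, 5.1, 5.5]

t = welch_t(generation_1, generation_2)
# An |t| well below roughly 2 would be consistent with finding no
# significant difference in usage between the two system generations.
print(round(abs(t), 2))
```

The same comparison can be run on connection frequency per family; with samples of the size logged here (tens of thousands of records), even small differences would register as significant, which makes the null result more striking.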
In the year 2006, a mobile phone company funded a project to extend the use
of the OBS to mobile phones. We embraced the funding; we could not have
foreseen the impact that this level of connectivity would later have on parental
usage patterns. However, three years later, Dutch national TV news highlighted
the use of the OBS in one hospital (SBS6, 2009). The piece included an interview
with the parents of one baby in a participating ward, and they explained how
wonderful the system was – nothing surprising there. Yet the broadcast was
somehow disturbing, as the OBS had effectively become a resource-consuming
‘monster’ for these parents. They explained that they slept with the mobile phone
unlocked between them in bed. The father, a primary school teacher, projected the
live stream on the wall of his classroom while teaching. Although the pupils thought
this was “funny” and seemed not to be disturbed, they claimed it sometimes interfered
with their learning of Math.
Overall, while we were very excited about the OBS as an efficient technological
socio-cognitive resource, we were concerned by some of the parents’ usage pat-
terns. On one hand, the system allowed parents to share the difficult early arrival of
their child and to cope with that stressful situation. On the other hand, a few
parents mindlessly used the technological resource – behaviour that can be com-
pared to IT addiction (i.e., Internet addiction). When the OBS was first installed,
we observed parents who were unable to properly use the resource to cope with
their actual separation. Interestingly, it appears that Internet hyperconnectivity and
greater mobility have in fact increased parents’ satisfaction with the technology while
simultaneously raising questions regarding context and appropriateness of usage.
Are the parents’ behaviours roughly equivalent to IT addiction? It is a slippery
slope. Indeed, variations in cognitive styles and personality impact stress relative to
technology differently (Moreland, 1993). Earlier work addressed technophobia,
which is the struggle to accept computer technology, versus technophilia, which is a
form of overidentification with technology that leads to a dissolution of human-
technology boundaries (Brod, 1984). Both have been related to technostress in that
they modify internal belief systems that technology should always be available and
create stress when it is not. We demonstrated in Chapter 3 that individuals are not
equal when dealing with information overload. We argue here that IT addiction
reflects a mindless use of IT as a socio-cognitive resource. The phenomena is
rooted at the individual information processing level in which a form of control or
self-regulation is required to avoid excessive use of IT. We speculate that more
emotional and cognitive resources are required to impose control on system usage
when the emotional brain is hijacked. Moreover, the content of the information
transmitted via technology is core to the human brain and its primary drive: social
attachment. Would the OBS have been that successful if it had not been newborns
being streamed? Would people project a distant relative in class or at the office?
Maybe. Additionally, we argue that with the improvement of technological design
and features, some systems are extremely immersive and they greatly increase the
sense of social presence. The users can scarcely apply the necessary controls to use
such technological resources wisely.

Technology as a socio-cognitive resource


In this section we build upon the concept of resources developed in the preceding
two chapters. We describe how the OBS creates conditions of flow and social
presence for parents using the system. However, some also experience ‘Fear of
Missing Out’ in regard to their child. We also explain how the OBS example can
be used to understand strategies and technology use for social contact.

Flow, social presence, and fear of missing out


The OBS video streaming technology was surely innovative at the time when it
was introduced, but it was not the ease of use or perceived enjoyment of the
system that explained its tremendous success. Such factors are often used to
understand technology use. IT addiction was not yet on the research agenda.
Rather, researchers studied the importance of flow, perceived enjoyment, invol-
vement, and social presence. Flow has been defined as “the holistic sensation that
people feel when they act with total involvement” (Csikszentmihalyi, 1975, p. 36).
According to Barki and Hartwick (1989), involvement reflects the psychological
importance and personal relevance of an object or an event. While doing their
grocery shopping or going into town, mums can now ‘stay connected’ to their
baby via OBS. They can do so even while sleeping. There is no doubt the mothers
are in a state of flow as they use technology to be involved with their newborns.
The parents reported a state of flow or deep mental immersion with their new-
borns that was mediated through the system. Interestingly, research has demon-
strated that while in a state of flow, individuals lack attentional resources
(Csikszentmihalyi & Csikszentmihalyi, 1988).
The OBS supported a sense of social presence through its immersive design.
Social presence is the degree to which a medium allows an individual to establish a
personal connection with others that resembles face-to-face interaction (Walter,
Ortbach, & Niehaves, 2013). Although not physically with the infants, parents
experienced social presence in that they felt connected to their babies; one descri-
bed the experience as “being with her”. Walter et al. (2013) developed a framework
of social presence based on human computer interaction in online communities.
This includes emotional and evaluative presence. Emotional social presence refers
to the social presence occurring when warmth and closeness is experienced. Eva-
luative social presence refers to the ability of humans to see and know that they are
present and can influence situations. The OBS encouraged both emotional and
evaluative social presence. The parents reported feelings of happiness and being
together with their children as they watched them move and monitored their
every breath from a distance. The hyperconnectivity observed in the OBS case is,
therefore, not synonymous with an addiction to a medium. Rather, it is an indication
of social presence.
The mothers experienced social presence as well as a state of flow through the
technology. Their resources were fully focused on the ‘object of their affection’, in
some cases to the point where the mother displayed a new iDisorder dubbed Fear
Of Missing Out (FOMO). FOMO is defined as “a form of social anxiety—a com-
pulsive concern that one might miss an opportunity for social interaction, a novel
experience, or some other satisfying event aroused by posts seen on social media
sites” (Dossey, 2014, p. 69). Some mothers reported they would want to avoid
“missing a minute of the great opportunity” to watch their newborn. One mother made
a disturbing comment illustrating FOMO:

I can still remember the day my husband forgot the laptop at the first floor of our house
and went to work. I crawled up the stairs even though I was forbidden to move … to be
able to see my baby … I could not resist … the OBS is a marvelous technology.

This FOMO was rooted in the mother’s involvement with her newborn, the
extremely personal relevance of the child displayed on the screen and the associated
anxiety relative to the infant’s condition. FOMO cannot be regarded as a mere
arousal experience in the S-R (i.e., Stimulus-Response) tradition. The ‘philia’ is
not technological; in fact, the technology is a useful socio-cognitive resource that
supports parents in a difficult time. Rather, the problem is related to the highly
pertinent nature of the information supported by the system. The cognitive
approach supported by Bower’s work (1981, 1991, 2001) informs us that indivi-
duals are more likely to process information that is affectively and cognitively
congruent with the mental schemata stored in the Long-Term Memory (LTM).
For a mother with a baby in hospital, there can be little more important than
viewing her child and, though online, feeling her presence. How can one resist a
system that provides extended social presence? Why even try?

Resources: coping strategies and emotions


Coping strategies are frequently displayed by parents who leave their newborn at
the hospital. They experience traumatic stress and enter an anxiety state when
separated from the child (Klaus & Kennel, 1985). Emotional and cognitive coping
strategies are activated in the mind when processing a negative situation (Lazarus &
Folkman, 1984; Folkman & Lazarus, 1988; Lazarus, 1994). As described in Chapter
2, the LTM serves as a base that contains different cognitive-affective heuristics and
strategies required to process information (Barnard, 1985). The main cognitive
strategies in stressful situations are based on individuals’ cognitive styles, and there-
fore on the emotional and cognitive resources that are available to them to process
the stressful information.
Coping is generally defined as a response to a distressing emotion, with the
function of reducing tension. From the phylogenetic perspective, it contributes to
the survival of the individual in the face of threatening situations (Miller, 1980;
Ursin, 1980). This functionalist perspective focuses on behavioural responses
exclusively. Indeed, from a functionalist perspective, psychological mechanisms
exist to help individuals carry out important survival activities (James, 1894). The
seminal work of Lazarus and Folkman (1984) demonstrated the importance of
cognitive activities in coping with distressing emotions. Folkman and Lazarus
(1988) defined coping in terms of “cognitive and behavioural efforts to manage
specific external and/or internal demands that are appraised as taxing or exceeding
the resources of the person” (p. 310). Coping strategies are dynamic and vary based
on the appraisal and reappraisal of situations. Therefore, what is stressful for one
person may be less so for another.
Coping requires the expenditure of resources to solve problems embedded in
stressful situations. Newell and Simon (1972) underlined the importance of heur-
istics and strategies in problem-solving. Being confronted with a stressful situation
is, in itself, a problem. Much information has to be processed in order to reinstitute
homeostasis after one faces a stressful situation. Homeostasis involves both explicit
and implicit processes. Efficiency in returning to a state of balance depends on the
individual’s pool of resources. There are many possible combinations of resources,
whether it be mental frameworks or physiological factors to fuel the organism;
thus, as we demonstrated in Chapter 2, individuals have different pools of resour-
ces. Situations and psychological states are appraised differently as a function of the
resources available. Some are more or less efficient, but all are highly dependent on
the appraisal of the situation (Lazarus & Smith, 1989). For example, some indivi-
duals make sense of a negative situation by reorganizing their schemata while
coping with the situation. They focus on the problem itself (i.e., planful problem-
solving) or on the emotion (i.e., positive reappraisal) (Lazarus & Folkman, 1984;
Park & Folkman, 1997). Combining both cognition and emotion appears to be an
efficient way of improving the emotional state.
We argue that technology such as the OBS is being used as an extraneous
resource. When it is used as a component of the individual’s pool of resources, its
use varies across parents. The narratives of some mothers reflected both emotion
and cognition (i.e., relief and being in control; happiness and empowerment) when
using the OBS. Such narratives are emotional-cognitive in essence. We could
speculate, for example, that the father who displayed his newborn in the class he
was teaching was, in fact, using the technology efficiently. Indeed, the father
enacted his self-efficacy (Bandura, 1977), having no doubt that he could cogni-
tively perform well as a teacher while being emotionally stabilized by the social
presence of his newborn on the screen.
Information processing coping strategies use schematic models that are based on
experiences in a given culture or family, often activated unconsciously (Barnard,
1985). Sometimes, not expressing, or even repressing, the arousal is a way of
coping with pain by avoiding the activation of the emotional network in the mind.
It requires a form of control over one’s expression of emotion (Gross, 2007). The
individual exerts control by shuffling the negative experience in Episodic Memory
to a less accessible place. Such strategies may be an unconscious effort to deactivate
negatively valenced schemata and cut off the pain. The traumatic event is therefore
consciously forgotten or pushed away. Research has demonstrated that reactivating the
memory of an emotional experience rekindles response components of that experi-
ence, such as mental images, associated feelings, bodily sensations, and physiological
arousal (e.g., Pennebaker & Beall, 1986; Rimé, Noël, & Philippot, 1991; Schaefer &
Philippot, 2005). This approach represents supervenience of cognition on emotion.
Similarly, it is indeed well known in psychoanalysis that reactivating ‘suppressed’
emotions through verbalization, or the expression of one’s own thoughts, may help
in healing associated bodily symptoms. Freud’s approach, referred to as the ego psychology model (which holds that mental processes are organized around the Freudian ideas of the ego and the Id), tightly linked mind and body, or emotion (when perceived as physiological) to the mind (cognition) (see Chapter 2). Ego
refers to “the idea that in every individual there is a coherent organization of
mental processes” (Freud, 1927, p. 15). According to the ego psychology model,
coping is a cognitive process (i.e., an ego process) such as denial, repression, sup-
pression, intellectualization, or problem-solving applied to reduce negative emo-
tions (Menninger, 1963; Vaillant, 1977). This approach focuses on the quality of
the process and the reality of its deployment.

Social resource: the need for social contact in sharing of emotion


In Western cultures, independence and assertiveness are rewarded (Riger, 1993).
The ‘lone ranger’ or ‘man against the elements’ reflects a dominant mentality in
our Western society. This mentality promulgates the notion that ‘real men’ cannot
express their emotions without being categorized as weak (Dunahoo, Hobfoll,
Monnier, Hulsizer, & Johnson, 1998). In a similar way, expressing one’s emotions
was not the forte of Freud’s patients at the turn of the 20th century. However,
Rimé (2009) demonstrated that the social sharing of emotion is a strong healing
mechanism in processing traumatic life events. Social cognitivists studying emotion
consider the nature of emotion at the social level and demonstrate the importance
of sharing emotions. After natural disasters, catastrophes, or life-change events, people
have a tendency to talk about their experiences and disclose their feelings. Indeed,
searching for social contact and opening up after a trauma can support the reorgani-
zation of the mental schemata in the mind. It reduces their associated negative
valences and potential activation, which consequently decreases mental rumination,
or contemplation, by ‘giving it a place’ and ‘having a good laugh about it’.
Original work by Schachter (1959) demonstrated that being exposed to an
emotional condition elicits a person’s motivation to seek social contact. Similarly, research has shown that when exposed vicariously to a stressful event via video,
participants release their stress by laughing out loud (Rutkowski, Rijsman, &
Gergen, 2004). Laughter is something to which we can all relate. Most of us laugh
in uncomfortable situations. However, in our study, laughter occurred only when
the participants were exposed to a social contact (i.e., another person) following
the stressful video clip. When left alone after the film, none of the participants
laughed out loud. This phenomenon, named “paradoxical laughter at a victim”
(Rutkowski et al., 2004), demonstrated the social nature of laughter as an indivi-
dual outlet to cope with stress. In stressful situations, one needs to first appraise the
situation, as Folkman and Lazarus (1988) have demonstrated, but also validate one’s
stressful emotional response (i.e., laughter) as socially appropriate with another
person. Our research clearly established that laughter is a phylogenetically archaic defence mechanism (rooted in the Peripheral Nervous System) with a strong social and cultural component.
Based on this study, one of the authors was invited on a TV show to provide a
scientific interpretation of the online jokes circulating after the event of 9/11. The
shock was tremendous, and surely some jokes were inappropriate according to
social norms prevailing in our society. Still, people shared the jokes on the Internet.
Humour may sometimes be the only efficient weapon in the face of violence or
adversity. In terms of mind-body supervenience, the body expresses itself first and
this response is regulated by what Freud refers to as the external world, a world of
cognition and mental images. Obviously cognition and emotion are interrelated
when processing information and solving problems. They are interrelated in the
same way the peripheral and central nervous systems are connected in the brain.
One shall not laugh at death, right? Yet jokes about 9/11 transmitted over the
Internet were a way of dealing with the pain of this event. Similarly, when the
father in the OBS study projected the streaming of his child in his classroom, was
he reducing his stress? Could this be part of the bright side of using IT as a socio-
cognitive resource? Perhaps our society is changing what is culturally acceptable in
terms of the use of technology. A reconsideration of the way individuals deploy
their coping strategies may be a by-product of our digital world.

When technology hijacks the pool of resources


The bright side of the OBS was evident when the system was used wisely. How-
ever, there was a paradox in that the technology was sometimes used not as an
extraneous socio-cognitive resource but, instead, was allowed to commandeer
control of the emotional brain. We have described how some parents started using
it in a compulsive way as if it were a type of drug. They experienced withdrawal
symptoms and suffered from FOMO. When used compulsively, the OBS may
draw so heavily from an individual’s pool of resources that the pool is depleted.
Control mechanisms have to be exerted in order to make the best decisions
regarding usage of technological resources such as the OBS or, more generally,
Social Networking Systems (SNSs). However, is it always possible or easy to apply
such control? Obviously, unconscious processes play a big part in the way that
humans deploy their pool of resources to process information and make decisions.
As we presented in Chapter 2, the limbic system – and particularly the ‘love’
hormone, oxytocin – plays a central role that is tightly linked to the behaviour of
social bonding and attachment. The love hormone hijacks the brain of any healthy
mother separated from her newborn. Indeed, mothers even used the OBS to sti-
mulate lactation. Watching the baby facilitated the mother’s production of milk,
enabling her to nourish her infant. This observation confirmed the physiological
reality of the ‘feeling of being together’ (i.e., social presence). We could speculate
that watching one’s own baby through the OBS aroused the Brain Reward System
(BRS). As we have described, the BRS is concerned with homeostatic state and
reproductive functions, including parental investment. Considering the love hor-
mone and the BRS as physiological indicators of OBS success is a first step in
understanding any relationship between the OBS and IT addiction.

Conclusion: the Online Baby System and IT addiction


On the bright side, the OBS provided parents a technological resource that helped
them draw on an extended network of social contacts. We found out that the OBS
encouraged the social sharing of emotion. Indeed, parents created online commu-
nities around their newborn by sharing login information. At that time, there was
no Facebook application where people could ‘like’ posts about the newborn, nor
was there the opportunity to share pictures on Instagram. Neither was there any
chat function to enable discussion around the system. Rather, parents shared the
live streaming of their children via the OBS. They called to share updates about
their infants’ health. They needed to share not only the pain, but also the joy of
their newborns growing steadily. Most of all, the OBS allowed parents to experi-
ence social presence with their newborns; they were involved; they felt together.
No wonder they thought the OBS was a marvellous technology. Indeed, today, IT
enriches our pools of resources to cope with our everyday lives and improve our
well-being through connectivity. Individuals of virtually all ages use the Internet to
fulfil their curiosity, realize business advantages, connect to one another across
continents, and share their emotions. Social media allows them to stay connected
to their loved ones – to share pictures; to express joy and pain; to feel less lonely
when experiencing pain; and to share their happy moments.
In their article, Rosen, Whaling, Rab, Carrier, and Cheever (2013) reported the
results of the Mobile Mindset Study. They found that 58 per cent of adults and 68
per cent of young adults checked their phones more than once an hour. The vast
majority (73 per cent) panicked when they misplaced their devices; 14 per cent felt desperate
and 7 per cent became physically sick when their smartphone was missing. These
findings are in line with what we observed in the OBS study. As we reported,
mothers experienced anxiety at the idea of disconnecting from the video streaming.
Some went as far as sleeping with the mobile phone, with Internet access on full mode.
On the dark side, the system could be related to Pathological Internet Use (PIU)
(which focuses on personality disorders) or IT addiction (which focuses on technol-
ogy usage) (Shapira, Goldsmith, Keck, Khosla, & McElroy, 2000). However, we
were unable to actually measure the extent of PIU or IT addiction based on their
time online with the OBS. Before the turn of the millennium, we could easily
measure the ‘time spent’ on the system through login data. It was the classic way of
measuring media and technology usage (Kraut, Patterson, Lundmark, Kiesler,
Mukopadhyay, & Scherlis, 1998; Subrahmanyam, Kraut, Greenfield, & Gross, 2000).
With the advent of Wi-Fi-enabled mobile devices, it became extremely challenging
to monitor the behaviour of the OBS parents. Nowadays, measuring actual usage
time has proven problematic, and most research does not take into account users’
preferences for task-switching/multitasking. To solve this problem and properly
assess IT usage, new measurement scales have been developed and validated (see the
Media and Technology Usage and Attitudes Scale by Rosen, Whaling et al., 2013).
Intensively using technology as a socio-cognitive resource by staying over-connected should not be judged ethically or socially. We can, however, address the consequences for well-being. First, we concluded that the OBS was not a cause of IT addiction.
Rather, it provided a powerful continuous connection that activated the BRS,
which consequently started a cycle of wanting even more connectivity. The con-
nection between mother and child through the OBS diminished a tremendous
biological feeling of loneliness. Second, we observed that parents who suffered
from an underlying pathology, such as anxiety disorders, tended to suffer more
severely from any interruption to the service delivery. We therefore concluded
that IT addiction in the case of the OBS was also related to iDisorders and Pathological Internet Use (PIU).

iDisorders versus Pathological Internet Use


In her book Alone Together, Turkle (2011) noted that “the ties we form through
the Internet are not, in the end, the ties that bind. But they are the ties that pre-
occupy. We text each other at family dinners, while we jog, while we drive”
(p. 280). As presented in Chapter 1, the dark side of technology is under scrutiny.
Research on the phenomena of various types of IT addiction emerged in the 1990s with the expansion of widespread Internet connections in First World homes.
Most research in the field of Internet and other IT addictions defines the phe-
nomena in terms of personality antecedents and the amount of Internet usage.
Over-connectivity, or being connected to the extent that the individual may experience one or more of the various forms of IT addiction, is most often blamed. Publications on “iDisorder” (Rosen, Cheever, & Carrier, 2012) or
“Facebook depression” (O’Keeffe & Clarke-Pearson, 2011) are flourishing. An
iDisorder is defined as “the negative relationship between technology usage and
psychological health” (Rosen, Whaling et al., 2013, p. 1243). Yet, as we have
noted, some individuals disconnect without experiencing any withdrawal syn-
dromes, anxiety disorders, or depression (Shapira et al., 2000). Research mostly
focuses on the impact of technology usage on mood disorders (e.g., depression
and social anxiety) and personality disorders (e.g., narcissistic and antisocial personality disorders).
Kimberly Young (1998) was the first to describe excessive and problematic Internet use as a legitimate clinical disorder, coining the term “Internet addiction disorder” (Murali & George, 2007, p. 24). She established the first Center for
Internet Addiction in 1995. The traditional stimulus of ‘too much time on
Internet’ is often associated directly with symptoms detrimental to the person,
such as jeopardizing one’s health, damaging social relationships, or experiencing
withdrawal symptoms, sleeping disorders, or impaired work performance (Young,
1998, 1999). However, as we emphasized in the discussion of the OBS, it is not
possible to assess when ‘too much time’ becomes Internet addiction. Frequency
or time spent on the Internet may be a confounding variable in itself. Young
(1999) concluded that problematic Internet use and iDisorders are extremely
nuanced and may appear somewhat confusing. Most studies of technology use
focus on the negative impact on human decision-making. The individual’s pool
of resources is seldom investigated through an information processing lens. We believe the time is right to adopt a cognitive-behavioural focus on information
processing. In particular, we suggest considering the role of schemata held in the
LTM when trying to understand PIU.

iDisorders: confounding factors?

Mood disorders
Individuals who experience problematic Internet use have been reported to display
high rates of depression symptoms (Young & Rogers, 1998). Longitudinal studies
have shown that greater use of the Internet was associated with increased signs of
loneliness and depression (see Kraut et al., 1998). Excessive use of technologies (e.g., online chat, video gaming, emailing) may cause depression that is transmittable through “emotional contagion” via SNSs (Hancock, Gee, Ciaccio, & Mae-Hwah
Lin, 2008; Moreno, Jelenchick, Egan, Cox, Young, Gannon, & Becker, 2011).
Indeed, Rosen, Whaling et al. (2013) reported that, aside from the work of Davila,
Hershenberg, Feinstein, Gorman, Bhatia, and Starr (2012), most authors in the
field converge on the negative impact of technologies on mood disorders, parti-
cularly depression. SNSs have been associated with increases in loneliness and
depression (O’Keeffe & Clarke-Pearson, 2011) or, in contrast, with decreases as a
result of social bonding. Further readings on the topic confirm the importance of
both the individual’s pool of resources for information processing and the memory
in understanding IT addiction. Caplan (2007) made this point clear in his research
to determine the cognitive predictors of the negative outcome of Internet use. He
differentiated dispositional from situational loneliness. Dispositional loneliness is
part of an individual’s mental framework in memory. It is a personality trait that is,
in fact, the expression of a form of social anxiety that arises from the desire to
create a positive impression of oneself on others. Socially anxious people are highly
motivated to seek low-risk communicative encounters (Schlenker & Leavy, 1982).
They tend to perceive their self-presentation online to be greater than in face-to-
face contexts. Situational loneliness also addresses resources, but from a social
contact perspective. It may be a function of relocating to other cities or countries,
travelling extensively, or having too little time for social encounters. Caplan
(2007) demonstrated that in some studies on IT addiction, results classically
attributed to situational loneliness actually should have been attributed to dis-
positional loneliness. In these cases, social anxiety was a confounding variable. In
the behaviourist tradition, both types of loneliness could be treated as ante-
cedents. However, social anxiety has been shown to be at the root of PIU. As
Bower (1991) demonstrated, anxiety disorders are rooted in a defective organi-
zation of the mental schemata. Interestingly, socially anxious people may benefit
emotionally and cognitively from Internet use. However, in doing so, they may
end up even more isolated from the real world. In other words, social phobia
may lead to greater technophilia.

Personality disorders
The Big Five personality traits have been found to predict extent of social media
use. Further, narcissism has been linked to higher usage (Ryan & Xenos, 2011) and
has been used extensively in studies to understand Internet addiction (i.e., IT
addiction). Narcissism refers to a fundamental absorption with the self and the
constant need to validate one’s existence. It reflects a grandiose, inflated, self-
centred self-concept that suppresses low self-esteem based on defective attachment
in childhood (Cohen & Clark, 1984). The term is rooted in psychoanalytic terminology and is the result of a maladaptive defence mechanism of the ego (i.e., self).
Individuals suffering from Narcissistic Personality Disorder (NPD) have a very poor
image of the self that is unconsciously rooted in memory. Narcissists either feel
they can never fulfil the requirements imposed by their parents (i.e., over-
investment) or have experienced physical or psychological abandonment (i.e.,
underinvestment). They build strong unconscious defence mechanisms to enhance
their image at ‘any cost’. Doing so helps them cope with low self-esteem as well as
anxiety disorders. Narcissism is in the official classification of personality disorders
(i.e., the DSM-5). It occurs on a spectrum from mild to severe (i.e., psychopathy).
Narcissism is characterized by a poor ability to empathize and decode the emotions
of others, leading to antisocial behaviour and deception.
In the literature on IT addiction, narcissism serves both as an antecedent and a
symptom. In particular, narcissists find SNSs appealing, and their use of SNSs
increases their narcissistic behaviours (Bergman, Fearrington, Davenport, & Berg-
man, 2011). For example, Facebook has been found to attract users with a narcis-
sistic personality (Mehdizadeh, 2010; Ryan & Xenos, 2011), and narcissism predicts
higher levels of social activity in online communities (Buffardi & Campbell, 2008).
Also, individuals scoring higher on the narcissism scale and lower in measures of
self-esteem spend more time and report more self-promotional content on Face-
book (Mehdizadeh, 2010) as well as on other SNSs. Indeed SNSs encourage nar-
cissistic behaviour that is directed towards self-promotion (DeWall, Buffardi,
Bonser, & Campbell, 2011) and entitlement/exhibitionism (Carpenter, 2012).
They have been criticized for ‘producing’ a narcissistic generation. For example,
Bergman et al. (2011) argued that there has been an increase in narcissism due to
the values held by the Net Generation. In order to better understand IT addiction
and societal impact, it is worthwhile taking a closer look at this specific personality
disorder. In addition to narcissism, extended time on the Internet has been related
to more antisocial behaviour among Chinese students (Ma, Li, & Pow, 2011) as
well as attention deficit (Yen et al., 2007).

Pathological Internet Use: A resource pool view


Pathological Internet Use (PIU) is a type of IT addiction defined as the consequences
of problematic cognition coupled with behaviour that intensifies or maintains
a maladaptive response (Davis, 2001). In the official classification of psychiatric
disorders (DSM-5), PIU is related to physiological dependence, as with a substance addiction, which leads to withdrawal symptoms. Such dependency may occur
when a person has exhausted available resources to cope with problems that they
have to solve (Young, 1999; Davis, 2001; Davis, Flett, & Besser, 2002). Research
has demonstrated that time spent connected or using the technology is a cognitive
precursor rather than a symptom of PIU (Thatcher & Goolam, 2005). However,
some users make use of their resources consciously and do not experience IT
addiction. They use IT extensively, but happily (Pratarelli, Browne, & Johnson,
1999). The cognitive-behavioural approach proposed by Davis (2001) is very
helpful in understanding PIU. Davis conceptualized PIU as a distinct pattern of
Internet-related cognition and behaviour that results in negative outcomes for the
individual. He proposes two distinct forms of PIU: specific and general.

Specific PIU
Specific PIU involves an overuse or abuse of content-specific functions of the
Internet (e.g., gambling, online porn) and is cast as one of the many possible
manifestations of a broader behavioural disorder. Davis (2001) argued that in
the absence of the technology, the behavioural disorder would likely be manifested
in some alternative way. Davis’ approach is congruent with the emotional-cognitive
model of memory. He speculated that abusive Internet usage (i.e., stimulus) does not
cause depression or dysfunctional addictive behaviour (i.e., symptoms). Rather, it
predisposes the individual to develop maladaptive usage through a pre-existing
pathology. Davis proposed an interesting alternative explanation of the relation between iDisorders and IT addiction, as previously presented. His proposal differentiates between antecedents and symptoms and emphasizes the key role of information processing. As Bower (1991) argued, such pathologies as social anxiety are
indeed directly involved in the way information is processed in memory.
Additionally, the construct of PIU provides insights about the maladaptive use of
technological resources such as the OBS or SNSs. It can be used to explain why
results previously attributed to loneliness as a predictor for IT addiction should in
fact be attributed to a deeper cognitive level; for instance, to an anxiety disorder
(e.g., social anxiety). Also, in the case of the OBS, it happened that for some mothers we had to cut off, or interrupt, the streaming of their newborns. They
would jeopardize their own well-being – for example, by crawling up stairs to
monitor their child – or they would seriously disturb the functioning of the neo-
natal wards with recurrent phone calls. They would burst into tears in panic at any
sign of the blue screen (i.e., no connection). The goal in halting the streaming was
to provide mental and emotional rest to the over-connected and stressed parents.
At the time, we considered developing a test to grant access to the system based on obtaining favourable results in relation to state versus trait anxiety disorders. The test could also inform how parents would use coping strategies as resources. We did not pursue such testing as we considered it unethical to restrain parents’ viewing of their babies a priori based on trait anxiety disorders. However,
pathological use became a strong indicator of the need to provide proper support
to mothers in distress.
Finally, specific PIU is useful in explaining why narcissists are drawn to SNSs
and are over-connected, based on frequency of social media posts and ‘likes’
(Mehdizadeh, 2010; Ryan & Xenos, 2011). SNSs are surely bringing sunshine to
their lives on a regular basis. Also, and contrary to what is often depicted in the
literature on IT addiction (Korac-Kakabadse et al., 2001), the time spent on the
Internet or being connected is not in itself a symptom of addiction (Junco, 2013;
Rosen, Whaling et al., 2013).

General PIU
General PIU is conceptualized as a multidimensional pathological overuse of the
Internet due to the unique communicative context of the Internet itself. The misuse
of the technology results in personal and professional consequences associated with
the experience of being online and its unique social context. General PIU occurs
when an individual develops problems due to the interpersonal contexts available
online (see Caplan, 2002, 2003, 2010). For example, some OBS mothers reported
feeling guilty about disconnecting, or experiencing discomfort handling the system.
The possibility of experiencing a form of interpersonal contact with their newborn
was key in understanding their excessive usage. As mentioned above, the love
hormone (i.e., oxytocin) was hijacking their brain, leading to over-connectivity.
It may be worthwhile studying the BRS in conjunction with general PIU (Chou &
Hsiao, 2000). We argue that a primary function of the OBS or SNS technology is to
link us to loved ones. It is a socio-technological resource. In doing so, the brain may be
inundated by oxytocin. Also, when SNSs activate the brain through extrinsic rewards
such as status, number of likes, or positive comments, it may become more difficult to
disconnect while the brain is so ‘rewarded’. Further, self-disclosure, in the same way as
food or sex, can activate the intrinsic BRS as a primary reward (Tamir & Mitchell, 2012).
Brain activation and a particular focus on attachment are surely interesting constructs
when considering the abuse of media such as Facebook or Instagram.
The relationships of depression, narcissism, or loneliness with the various forms
of IT addiction appear to present a chicken-and-egg problem. What is evident is
that whatever the symptoms may be, they are exacerbated when there is no control
imposed by the person on IT usage. Recently, Caplan (2010) updated Davis’ cog-
nitive-behavioural model and presented studies that have found empirical support for
the general PIU model. Caplan (2010) suggested that

preference for online social interaction and use of the Internet for mood reg-
ulation, predict deficient self-regulation of Internet use (i.e., compulsive
Internet use and a cognitive preoccupation with the Internet). In turn, defi-
cient self-regulation was a significant predictor of the extent to which one’s
Internet use led to negative outcomes.
(p. 1089)

The supervenience of the Information Technology-ego problem

Information Technology-ego fusion


We live in the era of technophilia in which overidentification with technology
leads to a dissolution of human-technology boundaries (Brod, 1984). While tech-
nologies such as SNSs are efficient socio-cognitive resources, the literature on
iDisorders, as well as the headlines of magazines, suggest a fusion more so than a
dissolution. The latest research on the ‘phantom vibration syndrome’ – the body’s perception that the phone is vibrating and delivering information when, in fact, it is not – shows how the fusion has already occurred for a large number of people (Drouin et al., 2012). In this respect, there is no supervenience problem. How
many of us have witnessed couples using their smartphones to communicate with
others or to multitask while they are together? Thus, “Cogito, ergo sum” (I think,
therefore I am) can be updated to “Twitto, ergo sum” (I tweet, therefore I am).
SNSs allow marketing of the self through mirroring of our social identities.
Such mirrors promote narcissism. Social media provides new technological
resources that support the primary drive to belong and the desire for interpersonal
attachment (Baumeister & Leary, 1995). Facebook, Instagram, LinkedIn, and
Twitter are based on network externality (i.e., the number of followers). They
function as extrinsic and intrinsic gratification mirrors by rewarding posts and
pictures. Such mirroring may boost the self-esteem of the insecure, feed fat the
narcissistic ego, and be trivial to others.
While some degree of narcissism is healthy, individuals suffering from NPD
show serious brain malfunctioning. They have less brain matter in areas that over-
lap with the areas associated with empathy (i.e., left anterior insula, rostral and
median cingulate cortex as well as part of the prefrontal cortex). It is hard to
believe that SNS users are all extreme narcissists. Research in psychiatry stipulates
that persons suffering from severe forms of NPD represent only one per cent of the
total population. If SNS use were determined by severe NPD, this would seriously
reduce the market value of Facebook or other SNSs. And if SNSs have
become the root of a narcissist epidemic in younger generations (Bergman et al.,
2011), our society is at risk.
Another concern about SNSs, which are social by definition, is that they hijack
our reward system as well as our sense of identity. Individuals expose their auto-
biographical memory online, after applying a social filter – or not (i.e., privacy
settings). If the fusion were to fail and the mirror used for self-reflection were to
crack, one might experience depression, loneliness, or aggressiveness. According to
Geoffrey Mohan (2013), “Facebook is a bummer that makes us feel worse about
our lives.” That may be a big problem for younger generations. Spitzer (2012), a
German psychiatrist, went as far as characterizing children’s iPad usage within the
new Steve Jobs Schools as child abuse. In his book Digitale Demenz, he recom-
mended exposing children as little as possible to digital media since it would lead to
them getting fat, dumb, aggressive, lonely, sick, and unhappy.

Using technological resources mindfully


On the bright side, one could speculate that SNSs may be an efficient technological
resource to reinforce a positive image of the self through social contact when one is
suffering from low self-esteem or social anxiety. This socio-cognitive resource allows
connectivity and social sharing of emotion. To manage the technology, individuals
must draw on coping strategies from their pool of resources. To avoid addiction,
they could adopt a usage plan for all media similar to that recommended by the
American Academy of Pediatrics (Strasburger & Hogan, 2013).

Treatment of Pathological Internet Use


The majority of the proposed treatments for IT addictions are behaviourist in their
approach. Participation in support groups (O’Reilly, 1996) and time management
techniques (e.g., addressing the frequency of use of the technology) are the pre-
dominant approaches (Young, 1998). Such methods have been classically used to
cope with other forms of non-technological addictions. Behaviourists often pro-
mote using associative learning to replace a maladaptive behaviour with one that is
healthier. Such treatment would be particularly efficient regarding general PIU. An
approach that would be more appropriate for treating specific PIU is cognitive
behavioural therapy, which proposes activating information stored in the LTM.
Positively valenced nodes are reactivated in order to help the negatively valenced
nodes fade away. In order to shortcut mental rumination of negative elements,
individuals activate brain processes to favour the recall of positive information and
reorganize cognition through emotion.

Knowing when to say no to the ego!


When the brain is hyper-activated in simulated IT environments, it becomes more
and more difficult to apply conscious mental effort to explicitly reduce usage, or
even acknowledge the existence of PIU. When the information is relevant, one
can barely resist continuing IT usage. As we discussed in relation to the OBS, in
situations where the brain is naturally hijacked by the love hormone, many more
resources have to be employed to make good decisions regarding IT usage. Some
OBS parents lacked the cognitive and emotional resources to make wise decisions
about their use of the OBS. However, when newborns went to their homes, all
use of the OBS ceased.
Some individuals show a form of ‘arousal immunity’ to SNSs or apps. A few
parents refused to use the OBS. They did not claim that they had no time to learn
how to use the system, nor that they had too many requests to use technology,
with the proliferation of apps like WhatsApp, Facebook, and so on. Rather, some
parents explained that they would not be able to confront the hospitalization of
their baby online without experiencing serious psychological damage. They
imposed control on their own usage. Others said they would have too much

difficulty disconnecting. That is, some parents could foresee their inability to use
the technology mindfully. They anticipated extra stress if they were to view their
newborns non-stop and, thus, opted not to adopt the OBS. They consciously took
the time to reflect on the OBS (i.e., accessing subconscious high-order cognitive
processes) and exerted cognitive control. When individuals are aware of some of
their emotional and cognitive weaknesses, they may act to avoid PIU. Were the
decisions of these parents based on previous experience of addictive BlackBerry
use? Were their actions a form of repression of their emotions? Were they simply
well aware of some underlying anxiety pathology or subconscious processes that
would interfere with their IT use? We do not know. However, these questions
offer intriguing avenues for future research.

Conclusion
To conclude, when control becomes ever more difficult to impose mindfully on the
brain and when the information delivered is too pertinent to be filtered,
individuals may end up suffering from IT addiction. Additionally, as Davis (2001)
and Caplan (2007, 2010) have demonstrated, personality disorders are cognitive
precursors of IT addiction and, subsequently, IT addiction exacerbates such dis-
orders. In Chapter 1, we reported Nicholas Carr’s and Tristan Harris’ latest ‘war’
against the smartphone. Hopefully, this chapter illuminates IT use as a much more
complex cognitive and emotional phenomenon. Bashing technologies such as the
smartphone may be an easy route to take. In Chapter 7, we focus on the bright
side of IT. Technologies can be fabulous resources. It is for the user to apply them
mindfully and with moderation. As is the case for drugs, it is a civic responsibility
of manufacturers to warn about the consequences of excessive usage of their devices
and to help develop rules and guidelines for safe and effective use.
5
DARK SIDE OF INFORMATION
TECHNOLOGY AT THE
ORGANIZATIONAL LEVEL

In this chapter we explore the negative impact of Information Technology (IT) at
the organizational level. Box 5.1 describes a scenario in which IT is having a
negative effect on Anna and her husband, David.

BOX 5.1 ANNA AND DAVID


Anna moans to herself, “Oh no! Not again!” Her husband, David, who was
trying unsuccessfully to sleep next to her, shouts, “Turn that thing off!” It is
3:30a.m. in Orlando, and this is the fourth time that night that her mobile is
ringing. She knows that when she picks it up, she will end up trying to calm a
panicked teammate in Sydney. And she won’t be able to go back to sleep until
she answers a string of texts and emails. She has seven teammates: two in the
Netherlands, one in Australia, one in France, one in India, and two in Germany.
Her colleagues are busy testing a new state-of-the-art Enterprise Cognitive
Computing (ECC) application for call centres that is going to be rolled out in
two weeks.… Or at least that is the hope her manager is clinging to. She is
pretty certain that the rollout won’t happen as planned, but she is too tired to
even care. That said, she doesn’t want to let down teammates who depend on
her. In her company it is standard policy to communicate whenever there is a
problem – no matter what the time. It is also company policy to respond as
soon as possible to emails and texts. As much as she loves her work, she
doesn’t know how much longer she can live without sleep or time to enjoy her
family. Trying to fall back asleep, she finds herself wondering when a new
automated assistant will come to her rescue and restore her work-life balance
by warning her colleagues that it is the middle of the night for her. Obviously,
they all keep ignoring that fact as they are under pressure themselves. It has
never crossed her mind that, if successful, the ECC application would cost
thousands of call centre workers their jobs.

The American poet Robert Frost once noted that “by working faithfully
eight hours a day, you may eventually get to be boss and work twelve hours a
day”. Today, Robert Frost’s words seem prescient. We live in a world where
working long hours is becoming the norm – at least in some First World
countries. Anna no doubt wishes that she could confine her workday to 12
hours.
In the two decades from 1980 to 2000, the time spent on work increased in the
USA, Canada, Britain, Japan, and Australia. For example, American full-time
workers put in an average of 50 hours a week in 2015 (Isidore & Luhby, 2015)
compared to an average of 40 hours a week in 1973 (Porter & Kakabadse, 2006).
In this chapter we explore the dark side of these longer workdays and weeks at the
organizational level. In explaining these dark side challenges, we focus on what
technology can do to you at the organizational level and not on what technology
can do for you (Gutek, 1983, p.163).

Information Technology dark side diamond


Six decades ago, Leavitt (1958) proposed a diamond with critical success
factors for management change. His focus was on structure, managerial tasks,
people, and technology. In this chapter we use a similar diamond to illustrate four
important factors to consider when trying to understand the dark side of IT in
organizations and what it can do to workers (see Figure 5.1). The four factors that
we discuss are organizational design and structure, work, people impacts, and
technology. All of these are interrelated when it comes to understanding IT-related
overload and IT addiction in organizations.

FIGURE 5.1 Information Technology Dark Side Diamond: organizational design and structure (information processing capacity, collaboration, communication); work (work tasks, work overload, work-life balance); technology (automation, robotics); and people impacts (impact of IT-related overload (technostress) and impact of IT addiction)



Organizational design and structure


The first factor in considering the dark side of IT in organizations is organiza-
tional design and structure. Researchers in organizational theory have investigated
the relationships organizational design and structure have with overload to a lesser
extent than researchers in other disciplines, including other management dis-
ciplines (Klausegger, Sinkovics, & Zou, 2007). That does not diminish the
importance of this factor. Organizational structure relates to the pattern of interac-
tions or the network of relationships that exist among organizational members
and units (Mackenzie, 1976). Organizational design is the process whereby the
organizational structure is made to fit with specific characteristics both inside and
outside the organizational system (Tushman & Nadler, 1978). Specific character-
istics may include information processing capacity, collaboration networks, and
communication flows.

Information processing capacity


As we noted in Chapter 3, early proponents of organizational design suggested
formal organizational structures, rules, and regulations to help process information
(e.g., Galbraith, 1974; Tushman & Nadler, 1978). Their focus was on information
processing requirements and information processing capacity (IPC), or the organiza-
tion’s ability to process the information needed to execute tasks, reduce uncer-
tainty, resolve technical exceptions, and provide adequate coordination for the
completion of organizational tasks.
Tushman and Nadler (1978) viewed information as a scarce resource which
is needed to deal with uncertainty. To help get the necessary information to
the right places in the organization, Galbraith (1974) suggested reducing the
amount of information to be processed by creating slack resources or self-
contained units, or increasing the capacity to process information by using
vertical information systems or lateral relationships. The self-contained units
are separate on the organization chart and do not communicate with one
another. In essence, they are organized on the basis of output. The informa-
tion systems that Galbraith (1974) proposed were not necessarily computer-
based. However, Galbraith’s writings recognized the importance of systematic
ways of conveying information, with vertical information systems being one
such way. Another way of transmitting relevant information across the orga-
nization is to have formal positions designed to communicate with and liaise
across two or more units.
Like Galbraith, Tushman and Nadler (1978) viewed organizations as infor-
mation processing systems; according to them, “a basic function of the orga-
nization’s structure is to create the most appropriate configuration of work
units to facilitate the collection, processing and distribution of information”
(p.614). The organization’s tasks create uncertainty for the work units per-
forming them. The greater the uncertainty, the greater is the need for

information and the capacity to process that information. Overload occurs (and
it occurs quite often) when the information processing requirements exceed
the organization’s IPC. The way to address such overload is to structure the
organization appropriately. That is, IPC is partly attributable to organizational
design (Schick et al., 1990).
Schick et al. (1990) also considered organizational IPC when they suggested
ways of reducing information load by decreasing the amount of information pro-
cessing related to interactions. In particular, they suggested that processing can be
made more time-efficient by changing the structure to facilitate information flows
as well as by relying on standard operating procedures, rules, regulations, and
computer-based information systems. Based on their review of the overload
literature, Klausegger et al. (2007) concluded that adaptation methods such as
these can help reduce overload.

Collaboration
Organizational design can be especially important in establishing effective structures
for promoting collaboration in today’s organizations. There is a growing emphasis
on matrix-based structures and teams, especially cross-functional and global teams
like Anna’s in the example at the start of this chapter. And consistent with the
growth in matrix- or team-based organizational structures, there are an increasing
number of technologies aimed at supporting and enhancing the connectivity of
organizational members. Investments in technologies such as Enterprise Social Soft-
ware, collaboration tools, and social media tools are generally viewed positively. But
is this really always borne out by their implementation?
Anna and her husband, David, would say, “Definitely not!” Cross and colleagues
(i.e., Cross & Gray, 2013; Cross, Rebele, & Grant, 2016) would say, “Not always.”
They have argued that the new emphasis on collaboration and the use of tech-
nology to support collaboration in organizations has resulted in collaborative
activities, such as attending meetings or answering colleagues’ questions, occupying
around 80 per cent of the time of managers and their employees. Further, “20% to
35% of value-added collaborations come from only 3% to 5% of employees” (Cross
et al., 2016, p.74). These “stars” are likely to have exponentially higher numbers of
messages compared to other employees (Oldroyd & Morris, 2012). It is these
highly valued employees, or stars, who are experiencing collaboration overload, or the
situation when employees interact so much with other employees that they cannot
get their own work done during normal work hours (Cross & Gray, 2013). Col-
laboration overload can lead to consequences typically found with IT-related overload:
employee stress, employee burnout, turnover of valued employees, and inefficiencies in
decision-making and execution. It can also impair overall organizational performance
(Oldroyd & Morris, 2012).
Collaboration overload can be diagnosed using Organizational Network Analy-
sis. To deal with collaboration overload, structural changes, among others, may be
implemented that formally assign decision rights for routine decisions to more

appropriate employees in collaboration networks, change spans of control to
achieve optimum cognitive load balance, and designate people to serve as buffers
(i.e., ‘utility players’ or ‘go-to people’) to highly valued, but overworked, colla-
borators (Cross et al., 2016). Organizations could also charge employees and teams
from other departments for any time that the star devotes to their problems, or the
star’s knowledge could be codified into existing operations (Oldroyd & Morris,
2012) or knowledge management systems.
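The kind of Organizational Network Analysis described above can be sketched in a few lines of code using degree centrality, a basic network measure; the names, collaboration ties, and the 0.75 threshold below are purely illustrative assumptions, not data from Cross and colleagues.

```python
# Illustrative sketch: flagging potential 'collaboration stars' with degree
# centrality, a basic Organizational Network Analysis measure. All names,
# ties, and the threshold are hypothetical.
from collections import defaultdict

# Each tie represents an observed collaboration (e.g., recurring meetings).
ties = [("Anna", "Ben"), ("Anna", "Chen"), ("Anna", "Dita"),
        ("Anna", "Emil"), ("Ben", "Chen"), ("Dita", "Emil")]

# Build an undirected adjacency list.
network = defaultdict(set)
for a, b in ties:
    network[a].add(b)
    network[b].add(a)

# Degree centrality: the fraction of other members each person is tied to.
n = len(network)
centrality = {person: len(peers) / (n - 1) for person, peers in network.items()}

# Members tied to most of the network are candidates for collaboration overload.
stars = sorted(person for person, c in centrality.items() if c >= 0.75)
print(stars)  # ['Anna']
```

A real diagnosis would of course draw on richer data (meeting hours, message volumes) and weighted ties, but the structural idea is the same: overload concentrates on the most central nodes.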

Communication
Organizational structure affects communication flows. Communication flows carry
information needed to complete tasks and reduce uncertainty. Yet these very
communications can contribute to perceptions of overload on the part of some
employees.
In an interesting study of the communication patterns of 79 employees of a large
international workstations/servers firm, Barley, Meyerson, and Grodal (2011)
found that even though their time was chewed up by multiple communication
media, it was only email that served as a symbol of overload. Because of their
asynchronicity, emails had a tendency to batch up, especially (1) early in the
morning, due to receiving email sent the previous night by global partners, and (2)
at the end of the working day, as employees did not have the opportunity to
respond to all the emails received throughout the day. And because of company
norms about short response times and their own desire not to miss anything
(nowadays called Fear of Missing Out), many employees experienced feelings of
loss of control over their email and anxiety about their inability to deal with it in a
timely manner. This was not the case with synchronous face-to-face meetings or
teleconferences even though these tended to consume huge amounts of the
employees’ time.
Because of its unique contribution to feelings of overload, it is important to
establish organizational norms regarding email in order to reduce the perceptions of
overload. In the tech company above as well as in Anna’s company, easing up on
the norm for immediate response could have reduced the stress levels of some
employees. Further, organizations could create the communication norm of tech-
nology-free meetings to reduce task-switching and make meetings more effective
(Colbert et al., 2016).
We also explored organizational norms for communication when we tracked the
email pollution of members of a team within a large multinational (see Rutkowski
& van Genuchten, 2008). The data showed that many of the unwanted emails
originated from within the organization. One could call it ‘internal spamming’.
Based on the data, three policy guidelines were proposed to the team to impact
pollution awareness: (1) no more using the ‘reply to all’ button; (2) no more per-
sons in ‘cc’ than in ‘sent to’; (3) no more email fights. Immediately after imple-
menting a norm about limiting the number of recipients on outgoing messages,
email pollution was reduced by 27 per cent. The goal of this norm was to make

employees actively decide who really needed to receive a particular email. This
allowed the team members to spare their resources to focus attention on more
important emails from customers, to complete other tasks that were more impor-
tant than responding to internal emails, and to meet face-to-face to resolve critical
issues. Following the publication of this study, we received supportive emails from
software developers and managers. It seems some have taken our recommendations
literally. For example, Andrew Cawood, Chief Information Officer of the global
information and measurement firm Nielsen, explained in a memo to 35,000
employees that his strategy to eliminate bureaucracy and inefficiency was to disable
the ‘reply to all’ button on their screens.
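As a toy illustration, the first two of these guidelines could be enforced by a pre-send check in a mail client; the function and the addresses below are hypothetical, not part of the published study.

```python
# Hypothetical pre-send check for the first two email-pollution guidelines:
# discourage 'reply to all', and allow no more recipients in 'cc' than in 'to'.

def pollution_warnings(to: list[str], cc: list[str], is_reply_all: bool) -> list[str]:
    warnings = []
    if is_reply_all:
        warnings.append("Avoid 'reply to all': address only those who need this message.")
    if len(cc) > len(to):
        warnings.append("More recipients in 'cc' than in 'to': trim the cc list.")
    return warnings

# A reply-to-all message with a bloated cc list triggers both warnings.
msgs = pollution_warnings(to=["anna@example.com"],
                          cc=["ben@example.com", "chen@example.com"],
                          is_reply_all=True)
print(len(msgs))  # 2
```

The point of such a check is not to block messages but, as in the study, to make senders actively decide who really needs to receive a particular email.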
Organizations should also establish norms about social media use. For example,
in order to distribute the responses more evenly across experts, they could encou-
rage collaboration stars to send targeted requests for information to social media
discussion groups (Cross & Gray, 2013). Another organizational strategy for dealing
with overload related to email and social media is to include intelligent agents for
filtering inputs (Jackson & Farzaneh, 2012).
Organizations are not doing such a great job in alleviating their employees’
overload. Universities and businesses seldom have written policies regarding how
emails should be processed or the speed at which this should be done; nor do they
have policies about the use of social media platforms. However, social norms such as
those encountered by Barley et al. (2011) clearly are in place. Organizations should
work to ensure that communication flows are effective by avoiding conditions
of overload.

Work
Highly related to organizational design is the design of work and the identification
of work tasks. As we demonstrate next, the design of work is also related to
information overload, work-life balance, and the challenges that employees
experience in seeking a work-life balance.
People used to talk about jobs. As far back as the Industrial Revolution, when
people talked about jobs they meant a discreet task or set of tasks with a well-
defined beginning and end (Bridges, 1994). However, in the mid 20th century the
concept of a ‘job’ morphed into the concept of ‘work’, or into an

ongoing, often unending stream of meaningful activities that allowed the
worker to fulfil a distinct role. More recently, organizations are moving away
from organization structures built around particular jobs to settings in which a
person’s work is defined in terms of what needs to be done.
(Pearlson et al., 2016, p.77)

In this new world of ‘work’, what needs to be done might never cease. With a
broad definition of what is to be done in the workplace coupled with modern
Information Technologies that allow some, if not many, employees like Anna to

be accessible 24 hours a day, 7 days a week, is it surprising that many individuals
are now suffering from IT-related overload? It should be remembered that “there
is a distinct difference between using technology to increase the work hours in a
day versus using it to liberate employees from the rigid structure of traditional
work environments” (Porter & Kakabadse, 2006, p.557).

Work tasks
The concept of overload has been linked to the concept of tasks for decades.
Tushman and Nadler (1978) viewed overload from the perspective of the inade-
quacy of organizations’ IPC to deal with uncertainty in the tasks of organizational
work units. In the literature, task characteristics that affect information overload are
task complexity, task novelty, task interruption, and task-switching (Jackson &
Farzaneh, 2012). Task characteristics such as task novelty, complexity, and inter-
dependence create uncertainty that can be relieved upon receiving the necessary
information. Further, interruption of complex tasks has been linked to overload
(Speier et al., 1999).
Schick et al. (1990) offered a rather interesting perspective on how work can
lead to information overload. They consider work to be a function of IPC and
information load, which is “the amount of data to be processed per unit of time”
(Schick et al., 1990, p.203). In Schick and colleagues’ view, the organization
decides the amount of time and resources (i.e., capacity) it should take for an
individual to complete a task, and the number of tasks that an individual is
expected to do in a period of time. That is, the organization
decides on the time an employee has available to complete a task, and if there is
not enough time to complete the task, information overload can occur. Schick and
colleagues propose various organizational strategies for reducing overload including
providing employees with fewer or simpler tasks to perform, expanding the
workforce in order to divide the work across more employees, or making more
time (a valuable resource) available to employees. The reality is that tasks the orga-
nization assigns often are not what employees spend their time on. One recent
survey reports that employees spend less than half their time on the tasks for which
they were hired (Van Knippenberg, Dahlander, Haas, & George, 2015).
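Schick and colleagues’ definitions lend themselves to a simple quantitative sketch; the units (items per hour) and the numbers below are illustrative assumptions, not figures from their study.

```python
# Illustrative sketch of Schick et al.'s (1990) load/capacity relation.
# Information load is the amount of data to be processed per unit of time;
# overload occurs when load exceeds information processing capacity (IPC).
# Units (items per hour) and numbers are hypothetical.

def information_load(items_to_process: int, hours_available: float) -> float:
    """Data to be processed per unit of time."""
    return items_to_process / hours_available

def is_overloaded(load: float, ipc: float) -> bool:
    """Overload: load exceeds processing capacity."""
    return load > ipc

# 120 items in an 8-hour day yields a load of 15 items/hour; against a
# capacity of 10 items/hour the employee is overloaded.
load = information_load(120, 8)
print(load, is_overloaded(load, ipc=10.0))  # 15.0 True
```

The organizational remedies in the text map directly onto the variables: assigning fewer or simpler tasks lowers items_to_process, making more time available raises hours_available, and expanding the workforce divides items_to_process across more employees.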

Work overload and work-life balance


Schick and colleagues (1990) assumed that the organization could make the
resource of time more available to the employees during the conventional work-
day, or it could expect them to work longer hours in the workplace (overtime) or
at home. In fact, modern Information Technologies have virtually “eliminated the
conventional workday and have made time and distance immaterial to the execution
of many organizational tasks” (Ragu-Nathan et al., 2008, p.418).
Often the reason that employees work at home is that they are experiencing
work overload: they have too much work to do within the designated conventional

workday. This work overload (sometimes called role overload when employees feel
like they have too much to do in their various roles in light of available time and
resources) is significantly and positively related to work-family conflict (Bolino &
Turnley, 2005; Ahuja et al., 2007). Work-family conflict occurs when the time and
energy demands of one set of roles (i.e., work or family) make it difficult to fulfil
the demands of another. It appears that women are more likely to experience role
overload, whereas men are more likely to experience work-life conflict (Duxbury
& Higgins, 2001). In particular, working mothers appear more challenged in bal-
ancing their different loads than do fathers. Further, work-family conflict (inter-
ference) is greater for married couples than for workers who are single. The
spillover of work demands into family life increased in a large sample of British
workers in the period from 1992 to 2000 (White, Hill, McGovern, Mills, &
Smeaton, 2003). Similarly, in their longitudinal study, Duxbury and Higgins
reported a sharp increase in role overload in Canada in the decade between 1991
and 2001, suggesting that the Canadian workers surveyed were prioritizing work
over family when they were at home. They noted that role overload was highest
for married couples in the ‘sandwich’ group, who bear the burden of taking care of
younger children and older parents. Interestingly, in a British study, dual-earner
couples reported lower levels of work-family conflict, or spillover of work into
home life, than did single-earner couples (White et al., 2003). This may be because
the dual-earner couples may have more resources available to them by paying for
domestic services or childcare, or it may be due to one of the partners, especially
the woman, taking a less demanding job.
Work-life balance is said to occur when the levels of work-family conflict are
acceptable. In particular we define work-life balance as the degree to which indivi-
duals can satisfactorily harmonize the temporal, emotional, and behavioural
demands of work and family life that are levied on them (Sarker, Sarker, & Jana,
2010). A proper work-life balance requires deploying one’s resources mindfully.
One way of thinking about how employees perceive the relationship between
work life and family life is on a continuum, as demonstrated in Figure 5.2 (see
Sarker, Xiao, Sarker, & Ahuja, 2012).
On one end of the continuum is the compartmentalized perspective in which
work life is totally separated from family life. At the other end of the continuum is
the encompassing perspective in which the “individual’s life is completely encom-
passed within his/her work domain, and the success in the work domain equates to
success in the personal life domain” (Sarker et al., 2012, p.148). This means that
work demands are always prioritized over the demands of the family. This seems to
be the case with Anna. Somewhere in the middle, to varying degrees, is the over-
lapping perspective. Though the work domain and family/personal domain may be
physically and temporally separated, emotional and behavioural overlaps likely still
exist. Individuals who hold this perspective may accept these overlaps, but they still
face varying degrees of work-family conflict in trying to establish a level of work-life
balance that they find to be suitable. Though these individuals may allow some spil-
lover between their work and family roles, they usually have a ‘zone of intolerance’ as

FIGURE 5.2 Work-life balance continuum, from compartmentalized (life and work separate) through overlapping to encompassing (life subsumed within work) (adapted from Sarker, Xiao, Sarker, & Ahuja, 2012)

to what constitutes a viable intrusion of work into their family life (Sarker et al., 2010).
Information Technology blurs boundaries between both domains.
Barley et al. (2011) illustrated these work-family relationships in a discussion
based upon the inability of employees to handle all of their email during normal
working hours. Their working hours had been filled with meetings or tele-
conferences. As a result, nearly 60 per cent of the employees in their study handled
work-related email from home. Those that had not sent work-related emails from
home explained that doing so would have likely led to family conflict. These
individuals held a compartmentalized perspective. Some of those who had
answered their emails from home noted that email made the boundary between
work and home more permeable (see also Duxbury & Higgins, 2001). To varying
degrees, these individuals held an overlapping perspective. An article in the Harvard
Business Review (Groysberg & Abrahams, 2014) noted that several executives did not
think it was possible to compete in the global marketplace and still achieve work-
life balance. In fact, one executive illustrated an encompassing perspective when he
stated that it was an impossibility to have “a great family life, hobbies, and an
amazing career” (Groysberg & Abrahams, 2014, p.66).
The negative impacts of role overload and work-family conflict have bled over
into organizations. Canadian respondents who experienced considerable overload
and work-family conflict were less committed to the organizations for which they
worked and less satisfied with their work; in addition, they were more stressed with
their work, were absent from work more often, reported greater intentions to quit,
experienced burnout more often, and used the health system more (Duxbury &
Higgins, 2001). Further, Indian systems developers on globally distributed teams,
who were struggling due to flexible scheduling, were more likely to say that they
were thinking about leaving the organization (Sarker, Ahuja, & Sarker, 2018).
Some claim that organizations have become more aware of work-life balance
issues and that they have made progress in implementing programmes and initia-
tives that mitigate them (Duxbury & Higgins, 2001). Ways that organizations could
reduce role overload and work-family conflict in order to help employees realize
more work-life balance include the following: establishing ‘people management’
practices that encourage a focus on output rather than hours worked, more supportive
leaders, etc.; allowing employees more control over when and where they work,
such as is possible with flexitime and telecommuting programmes; letting employ-
ees refuse to work overtime without this hurting their careers; offering employee
and family assistance programmes; providing a limited number of annual paid leave
days to take care of parents or children; introducing initiatives such as self-directed
work teams or information sharing between management and employees to
heighten employees’ control over their work; offering company-sponsored nur-
series; or making work teams responsible for finding ways to deal with work-life
balance issues (Duxbury & Higgins, 2001; White et al., 2003). While all these
suggestions make sense, their contribution to creating a favourable balance between
work and family life needs to be carefully studied. Not all may be fruitful. For
example, in one study flexible work hour practices did not affect the work-family
conflict of men, though they did significantly reduce the conflict for women (White
et al., 2003). In fact, most men chose to use the additional time that was made
available in the flexible work hours option to work even more. Further, flexible
scheduling was significantly associated with greater work-life conflict for the
Indian systems developers on globally distributed teams mentioned above. Possibly
they found it more difficult to navigate the conflicting demands of work and
family life (Sarker et al., 2018).

Interplay of work with other dark side diamond factors


Organizational design and structure can impact work-life balance. For example, the
increased organizational use of groups and teams over time has increased work-
family conflict among a large group of British workers (White et al., 2003). This is
especially true when the tasks of the work team are highly interdependent, prob-
ably due to increases in complexity and the need for coordination that this creates
(Sarker et al., 2018). It is likely that high task interdependence creates a strain as
people try to achieve a work-life balance in the face of uncertain schedules. The
demands created by working in groups are exacerbated by Information
Technologies that deliver requests from multiple team members outside of con-
ventional work hours. Further, across First World countries, technology has helped
blur the boundary between work and home, making work-life balance hard to
achieve. Technology enables work across time and place to the extent that people
work longer and at a faster pace (Sarker et al., 2010). Technology allows the delivery
of synchronous communications at any time. Depending on organizational norms
and communication policies, the worker may be expected to respond immediately
despite this being at the cost of much work-family conflict and personal strain.
Work-life balance is a topic that has garnered the attention of governments as
well as businesses. The motivation of the White et al. (2003) study was to assess the
impact of the New Labour government’s Work-Life Balance Campaign in 2000.
The Work-Life Balance Campaign was designed to encourage employers to
implement practices that would enhance work-life balance, such as ‘family-
friendly’ policies and flexible work hours. White and colleagues found that when it
came to balancing work and family life, the flexible and family-friendly policies
adopted by businesses were nearly as beneficial as policies to regulate working
hours. Duxbury and Higgins (2001) urged stronger action, arguing for legislation
that would prohibit employees from working overtime and/or give them the right
to time off instead of overtime pay. The French government has probably taken
the most dramatic steps by passing legislation which “requires companies with
more than 50 employees to establish hours when staff should not send or answer
emails” (Morris, 2017). In particular, organizations must negotiate with employees
about the employees’ right to “switch off” (Agence France-Presse, 2016). The law
is intended to ensure that employees are fairly paid for work, have flexibility in
working outside of normal work hours, and are less subject to burnout. Such leg-
islation can definitely change the work-life relationship perspective from encom-
passing to lower degrees of overlap by protecting employees’ private time, but it
may create additional work conflicts for those on highly interdependent distributed
teams spread around the globe.

People impacts
When looking at the dark side of IT at the organizational level, our intent in this
chapter is to report the impact of dysfunctional behaviours and briefly discuss how
they may be alleviated. IT addiction can come in many forms: computer addiction,
cyber-related addictions (Kuss & Griffiths, 2011), Internet addiction, or even addic-
tion to certain types of IT such as SNS, email, or mobile technology – not to forget
mobile email addiction (Turel & Serenko, 2010). In this chapter we focus on
addictions that have been studied in the workplace.

Technostress
Perhaps the most widely studied type of dysfunctional overload behaviour at the
organizational level (other than information overload) is technostress. It has
been related directly to reduced satisfaction and productivity/
performance (Tarafdar et al., 2007; Ragu-Nathan et al., 2008; Tarafdar et al., 2010).
For example, using new technologies requires employees to update their IT skills.
Doing so takes away time from completing assigned work tasks. If they cannot get
the technology to work, they need to troubleshoot and seek technical assistance –
all of which means that their IT-enabled work must wait (Tarafdar et al., 2007).
Technostress also has been linked to decreased innovation in work tasks when using
IT, increased dissatisfaction with the IT that is used, and reduced commitment to the
organization (Tarafdar et al., 2007). In addition, technostress is linked to role stress,
which in turn is thought to create role overload (Tarafdar et al., 2007; Tarafdar et al.,
2011). It has even been studied in relation to social media (Brooks et al., 2017) and
stressful information security requirements (D’Arcy, Herath, & Shoss, 2014).
Tarafdar and colleagues (e.g., Tarafdar et al., 2007; Ragu-Nathan et al., 2008;
Tarafdar et al., 2010; Tarafdar et al., 2011; D’Arcy, Gupta, Tarafdar, & Turel, 2014)
declared that there are five technostress creators or components: techno-overload,
techno-invasion, techno-complexity, techno-insecurity, and techno-uncertainty.
Techno-overload relates to situations where the technology forces its users to work
faster, longer, and harder. The technology users must process simultaneous streams
of information, resulting in information overload, interruptions, and task-switch-
ing. Techno-overload is akin to IT-related overload. Techno-invasion is the
situation where employees feel that they can be reached through the technology
anytime, anywhere. That is, employees feel tethered to the technology at work,
home, and play. Work-family conflicts often result as a consequence of always being
connected to the technology. Techno-complexity is the situation where IT users feel
intimidated about learning and adapting to new technologies. Further, they do not
know enough about the technology to do their jobs well. Techno-insecurity is the
situation where employees feel threatened that they will lose their jobs to others who
are more tech-savvy. Finally, techno-uncertainty refers to contexts where users feel
unsettled by constant upgrades and related software and hardware changes. This is
somewhat related to IT-related overload from requests to use new IT.
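To make the five creators concrete, the scoring logic of a creator-based technostress survey can be sketched as follows. This is an illustrative toy, not the published instrument of Ragu-Nathan et al. (2008): the three items per creator, the ratings, and the 1–5 Likert scale are our assumptions.

```python
# Illustrative sketch only: scoring the five technostress creators from
# Likert-style items. Item counts, ratings, and the 1-5 scale are assumed
# for illustration; they are not the published survey instrument.

CREATORS = ["techno-overload", "techno-invasion", "techno-complexity",
            "techno-insecurity", "techno-uncertainty"]

def technostress_profile(responses):
    """responses maps each creator to a list of 1-5 ratings; returns the
    per-creator means plus an overall mean across the five creators."""
    profile = {c: sum(r) / len(r) for c, r in responses.items()}
    profile["overall"] = sum(profile[c] for c in CREATORS) / len(CREATORS)
    return profile

sample = {
    "techno-overload":    [4, 5, 4],  # forced to work faster, longer, harder
    "techno-invasion":    [5, 4, 5],  # reachable anytime, anywhere
    "techno-complexity":  [2, 3, 2],  # intimidated by new technologies
    "techno-insecurity":  [1, 2, 2],  # fear of losing the job to tech-savvier peers
    "techno-uncertainty": [3, 3, 4],  # unsettled by constant upgrades
}
print(technostress_profile(sample))
```

In this hypothetical profile, invasion and overload dominate, which matches the pattern described above for employees who feel tethered to their devices.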
The technostress (i.e., techno-overload, techno-complexity, and techno-uncer-
tainty) concept has even been applied to the domain of Information Security
Policy (ISP). Employees increasingly commit many volitional ISP violations such as
sharing their passwords or failing to log off when leaving their workstation. They
claim that there are just too many security requirements and that these are “con-
straining, inconvenient and difficult to understand” (D’Arcy, Herath et al., 2014,
p.286). D’Arcy, Herath et al. (2014) argued that security requirements increase the
employees’ workload, making it difficult for them to complete their tasks within
the necessary time frame. For example, scheduled security maintenance can disrupt
the employees’ work schedule and lead them to feel stressed. In addition, many
security requirements demand time and effort to learn and understand. Often they are
couched in jargon that is hard for employees to understand. Further, stress is created
by continual updates to work-related security requirements. Employees need to
expend their resources (time and effort) to keep up and adjust to the changes. Because
of the added stress, employees feel less motivated to comply with security requirements.
Clearly the various technostress creators are related to work overload and work-
family conflicts. Technostress has also been linked to organizational structure. In
particular, employees have been found to experience greater technostress in cen-
tralized organizations (Wang, Shu, & Tu, 2008). Further, as we discuss next,
technostress has been linked to addiction (Salanova, Llorens, & Cifre, 2013; Lee
et al., 2014; Brooks et al., 2017). For example, Tarafdar, Pullins, and Ragu-Nathan
(2015) found that of the 3,100 employees they surveyed, 46 per cent exhibited
medium to high symptoms of addiction to the technology that stressed them out.

IT addiction
To our knowledge, only a few studies have addressed Pathological Internet Use
within organizations (e.g., Yellowlees & Marks, 2007; Turel & Serenko, 2010).
However, there are quite a few studies on IT addiction, and the number is grow-
ing rapidly. Determining its exact incidence in organizations is virtually impossible
since different criteria are used to assess whether or not someone is addicted,
“including (i) intolerance, (ii) withdrawal, (iii) increased use, (iv) loss of control, (v)
extended recovery period, (vi) sacrificing social, occupational, and recreational
activities, and (vii) continued use despite negative consequences” (Kuss & Griffiths,
2011, p.3540). Figures range from 6 to 17 per cent for mobile email addiction and
from 3 to 80 per cent for Internet addiction (Yellowlees & Marks, 2007). Admit-
tedly, many of the studies making these estimates may have been flawed (Yellowlees
& Marks, 2007).
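The criterion-counting logic behind such prevalence estimates can be sketched in a few lines; the criteria labels paraphrase the list quoted above, and the cut-off values are hypothetical. Varying the cut-off shows why published incidence figures diverge so widely.

```python
# Toy illustration of why incidence estimates diverge: the same respondent is
# classified differently depending on which cut-off a study applies to the
# criteria (paraphrased from Kuss & Griffiths, 2011). Cut-offs are hypothetical.

CRITERIA = {"tolerance", "withdrawal", "increased use", "loss of control",
            "extended recovery period", "sacrificing activities",
            "continued use despite negative consequences"}

def is_addicted(endorsed, cutoff):
    """endorsed: the set of criteria a respondent reports experiencing."""
    return len(endorsed & CRITERIA) >= cutoff

respondent = {"withdrawal", "increased use", "loss of control"}
print(is_addicted(respondent, cutoff=3))  # a lenient study counts this person
print(is_addicted(respondent, cutoff=5))  # a strict study does not
```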
Though we may not know the exact incidence, stories abound that attest to the
presence of IT addiction. One story is told by Karaiskos, Tzavellas, Balta, and
Paparrigopoulos (2010) about a 24-year-old woman who used social media so
much that her behaviour interfered significantly with both her private life and her
professional life. She was fired from work because she used Facebook for at least
five hours a day to repeatedly check her account. When she went to a psychiatric
clinic to get help for her social media addiction, she used her smartphone to access
Facebook. As if things were not bad enough, she developed insomnia and anxiety
symptoms. Some media addicts may neglect family and home duties; others may
use the medium as a “mental safe haven” or be totally preoccupied with it (Turel
& Serenko, 2010, p.42). Media addicts are all around. You likely have gone out to
dinner with smartphone addicts who could not pull themselves away from their
phones to talk with you when you were sitting at the other side of the table.
People who suffer from various IT addictions may suffer from mood swings,
feelings of work overload, and work-family conflicts (Turel & Serenko, 2010).
Consequently, they may be less satisfied with their work and more subject to
voluntary turnover. Employees spend considerable time checking their social media
inboxes (i.e., Facebook, Twitter, and LinkedIn) throughout the workday. In a
survey of 168,000 employees, 43 per cent of the respondents confessed that
checking their social media at work hurt their work productivity (Brooks et al.,
2017). Further, organizations have been sued by their employees because they
developed mobile email addiction or became “BlackBerry addicts” (Turel &
Serenko, 2010, p.43).

Interplay of people impacts with other dark side diamond factors


As part of organizational design and structure, organizations may design work to
offer options of job sharing and telecommuting and/or have policies about the
online activities of their employees. However, even if organizations have policies
about online usage, they do not appear to be enforcing them. Consider, for
example, the finding that employees were spending an average of three hours a
week online for personal purposes a decade ago (Yellowlees & Marks, 2007). The
number is undoubtedly much higher now. Since Yellowlees and Marks’ article,
smartphones have exploded onto the scene, and it is estimated that their owners
consult them nearly 30,000 times annually (or approximately five times per waking
hour). A 2015 Gallup survey reported that iPhone owners “couldn’t imagine life
without the device” (Carr, 2017). When their phone rings or beeps while working
on a challenging task, they become distracted and their work gets sloppy, even if they
do not check or answer their phone. In other words, their cognitive resources are
diverted (Carr, 2017). Someone in each organization, most likely human resource
professionals, should be monitoring technology use to ensure that organizational
policies do not promote the dark side of IT use (Porter & Kakabadse, 2006).
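The usage arithmetic cited above is easy to verify; the figure of 16 waking hours per day is our assumption.

```python
# Sanity check of the smartphone usage arithmetic: 30,000 consultations a
# year works out to roughly five per waking hour, assuming 16 waking hours
# per day (our assumption).
consultations_per_year = 30_000
days_per_year = 365
waking_hours_per_day = 16

per_day = consultations_per_year / days_per_year      # about 82 per day
per_waking_hour = per_day / waking_hours_per_day      # about 5 per waking hour
print(round(per_day), round(per_waking_hour, 1))
```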
Finally, in terms of the work, work demands and work resources, as well as
personal resources, have been found to be predictors of two dimensions of tech-
nostress: techno-strain and techno-addiction (Salanova et al., 2013). Salanova et al.
(2013) found that work demands have negative impacts on both dimensions. It
appears that the more work demands are placed on employees, the more work
resources are required (e.g., social support, mental and emotional competences).
Some work demands, such as work overload and frequent software updates, are
predictors of both techno-strain and techno-addiction. Other demands such as
emotional overload and obstacles hindering effective IT use are specific predictors
of techno-strain. In terms of personal resources, mental competences predict
techno-strain while emotional resources predict techno-addiction.

Technology
Information Technologies originally were designed and built to serve workers and
make business operations more efficient. However, all too often workers are being
substituted by IT. As we have reported, IT-related overload and IT addiction cause
humans to malfunction and exhaust their resources. Rather than applying work
design and adapting organizational structures to augment employees’ resources,
organizations are increasingly turning to automation and robots as the desired
solution to enhance organizational IPC. A new robot-human supervenience problem
has emerged. Little thought is given nowadays to the consequences of this
technocentric view. Nobel Prize winner Paul Krugman published a
column titled “Sympathy for the Luddites” in The New York Times. He showed his
sympathy for Luddites when he described the pain of late 18th-century British
cloth workers who saw their jobs being taken over by machines. He concluded:

Today, however, a much darker picture of the effects of technology on labour
is emerging. In this picture, highly educated workers are as likely as less edu-
cated workers to find themselves displaced and devalued, and pushing for
more education may create as many problems as it solves.
(Krugman, 2013, p.A27)

We consider ourselves to be ‘neo-Luddites’, even though the term ‘Luddites’
typically is used pejoratively. It refers to people who dislike technological innova-
tions because they perceive them as potentially threatening to their employment
and/or imposing restrictions regarding personal freedoms such as privacy. In dis-
cussing Luddites, Sykes and Macnaghten (2013) shared their hope that in 200 years
“we are more capable as a society of considering new technologies in a mature
way, and thinking through some of the benefits, and also the potential inequity
and downsides, in advance of their application” (pp.85–86). Doing so would make
it possible to build a world in which values are equitably shared “rather than just
giving more money and power to the already powerful” (p.86). In this section we
consider what can happen when organizations (and professions) do not consider the
dark side of automation and robots in a mature way.
Large-scale work substitution using automation and robots within organizations
is not inconceivable. The impact of new technologies on employment and the
polarization of wages is being debated in academic circles (Goldin & Katz, 2009).
For example, the decline in the number of jobs has been attributed to the ‘routi-
nization’ of tasks that can be processed by machines instead of humans possessing
an average level of education (Goos & Manning, 2007). In addition to – or perhaps
because of – savings in labour costs, IT offers a competitive advantage when it
comes to conducting structured and repetitive tasks (Brynjolfsson & McAfee,
2011). But it is not only routine tasks that are in danger. Even non-routine cog-
nitive tasks such as complex decision-making can now be automated (Frey &
Osborne, 2013).

Automation
Huge productivity gains have been provided to organizations through enhanced
computational capabilities and the associated automation of work (Hancock, 2013).
In the 1970s it was hoped that the strides in productivity achieved through auto-
mation would lead to a world in which 30- or maybe even 20-hour work weeks
would become a reality. However, the benefits of automation have not been
shared in an equitable fashion to make this dream come true for the ‘collective
good’. Rather, a disproportionate few have enjoyed the benefits of automation
(Hancock, 2013). And there is a growing number of individuals who are suffering
due to automation, when technology takes the place of their work.
Let us focus on one highly educated group of professionals who have suf-
fered greatly from the introduction of Health Information Technology (HIT).
Our research has focused on one group of such individuals: the Medical
Doctor of Anaesthesiology (MDA). In particular, our research found that HIT
may fully or partially substitute for MDAs in the operating room (OR)
(Medina, Verhulst, & Rutkowski, 2015). Clearly HIT has brought some
advantages for the healthcare industry (e.g., Kim & Michelman, 1990; Chris-
tensen, Bohmer, & Kenagy, 2000; Goldschmidt, 2005; Kaplan & Porter, 2011;
Romanow, Cho, & Straub, 2012). For example, patients’ physiological reac-
tions to medication can now be monitored in real time by medical software in
the OR (Modell, 2005). However, the downside of HIT is that MDAs may
ultimately be replaced by HIT.

The combination of economic challenges and the introduction of efficient HIT
in the OR is accelerating the phenomenon of work substitution of MDAs. Over
the last quarter-century, integrated HITs have increasingly supported the MDA
workload within healthcare organizations. Now MDAs are able to monitor brain
activity, vital signs, and administered medication using a single machine. Even
unstructured and complex tasks are being performed by HITs in a more financially
advantageous and often more accurate manner than if humans were performing
them (Autor & Dorn, 2013; Frey & Osborne, 2013). The future could see MDAs’
expertise being pushed out of the OR as their added value, derived from years of
education (i.e., 11 years of higher education), becomes too expensive for hospital
administrators. No doubt the road to full substitution will be long and fraught with
obstacles. Nonetheless, the march of automation will gain ground in anaes-
thesiology as it has gained ground in almost every single sector of the economy
(Chambers & Nagel, 1985; Chialastri, 2012).
In our study (see Medina et al., 2015), we sought to answer the following
research questions: Has the profession of anaesthesiology lost its value in the OR?
If so, why? We found that Dutch MDAs spent much of their time in ORs per-
forming psychomotor and planning tasks that could be performed by HITs. Much
of the MDAs’ OR time was spent task switching, not only because their work
routinely involves multiple tasks, but also because efforts to cut costs had translated
into their simultaneously supervising the care of patients in two ORs, the holding
bay, and the recovery room. Further, MDAs must process information received
from patient data management systems and their smartphones. Their use of mon-
itoring and mobile technology in the OR has created the false impression that
MDAs can be ‘absent’, or substituted by HIT. Another concern is that the MDAs
themselves do not realize how deadly their multitasking can potentially be. An
MDA supervising several locations and relying on snapshots of data fed to them by
the HIT may not get the full picture of what is happening with the patient in
surgery. This may ultimately result in anaesthesia-related patient injuries because
the MDAs are emotionally and cognitively overloaded. In sum, it is time for
MDAs, hospital administrators, and society to explore not only how automation
can be used to substitute for the work of MDAs, but what aspects of the work of
MDAs should be automated and how.
Moreover, deskilling of other professions should also be questioned. For exam-
ple, how will driverless cars affect the more than 4 million people who make their living by
driving (e.g., as taxi drivers, chauffeurs, bus drivers, delivery truck drivers, heavy
and tractor trailer truck drivers, etc.) in the USA alone (Bureau of Labor Statistics,
2017)? Dire forecasts say there is a 90 per cent certainty that their jobs will be
replaced by self-driving vehicles by the year 2023 (Goodman, 2015).
Or consider the bleak future for law clerks now that Enterprise Cognitive
Computing applications have entered the legal world. Enterprise Cognitive Comput-
ing (ECC) applications are software that use tools such as “natural language pro-
cessing, image recognition, intelligent search, and decision analysis to adapt their
underlying computational and modelling algorithms or processing based on
exposure to new data… to enable an organization’s business processes” (Tarafdar,
Beath, & Ross, 2017, p.3). The vendor, ROSS Intelligence, has developed an
ECC application that helps lawyers obtain pertinent reference cases sooner by
searching case law to identify legal precedents. The knowledge base of cases must
be continuously updated in order to identify the most useful cases. This legal dis-
covery ECC application can now do virtually all of the work previously performed
by law clerks. Further, many repetitive legal tasks such as drafting wills, trusts, and
residential real estate closing documents have been automated (NPR All Tech
Considered, 2017). At this time, lawyers themselves have not been substituted by
the ECC application (or other automated systems) since they provide the crea-
tivity and reasoning to win cases based on the case material provided by the ECC
(Tarafdar et al., 2017).
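To give a flavour of what even the simplest ‘intelligent search’ over case law involves, here is a toy relevance ranking based on keyword overlap. The cases and query are invented, and production systems such as ROSS rely on natural language processing far beyond this bag-of-words sketch.

```python
# Toy precedent search: rank cases by Jaccard overlap between query terms and
# case summaries. Cases and query are invented; real ECC applications use NLP
# far beyond bag-of-words matching.

def jaccard(a, b):
    a, b = set(a.lower().split()), set(b.lower().split())
    return len(a & b) / len(a | b)

cases = {
    "Case A": "employer liable for overtime email work at home",
    "Case B": "patent dispute over robotic assembly arm",
    "Case C": "employee burnout claim after unpaid overtime email",
}

query = "unpaid overtime email claim"
ranked = sorted(cases, key=lambda c: jaccard(query, cases[c]), reverse=True)
print(ranked)  # most relevant precedent first
```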
As dire as work substitution due to automation can be for the people whose
work is substituted, automation has another dark side. In particular, work perfor-
mance can be diminished and lives can be lost when workers suffer from automation
addiction, or when people rely on digital systems too heavily to perform their work
adequately. The Federal Aviation Administration attributed 51 accidents and the
loss of hundreds of lives to automation addiction in the USA (Nicolelis, 2017). The
brain’s use of technology needs to be carefully considered to prevent situations of
automation addiction.

Robots
Organizations are increasingly turning to robots to perform work tasks more effi-
ciently, more accurately, and in a cost-saving way. A robot is “a reprogrammable,
multifunctional manipulator designed to move material, parts, tools, or specialized
devices through variable programmed motions for the performance of a variety of tasks”
(Hamilton & Hancock, 1986, p.70). The term came from Capek’s 1921 play RUR:
Rossum’s Universal Robots and appears to be derived from the Austro-Hungarian
Empire term, robota, which means ‘vassal’ or ‘worker’ (Schaefer, Adams, Cook,
Bardwell-Owens, & Hancock, 2015). The first robot traces back to the 4th century
BC, when Archytas of Tarentum developed a mechanical bird, referred to as “pigeon”,
that was powered by steam. Leonardo da Vinci developed a robot using a knight’s
suit of armour and an internal cable mechanism in 1495. Robots were first put to
work in the General Motors (GM) assembly line in 1961. George Devol’s indus-
trial robotic arm was the first robot installed by GM. MIT’s John McCarthy and
Marvin Minsky contributed to the science of robotics in their Artificial Intelligence
Laboratory in 1959. Also in 1959, Stanford built the first robot to know and react
to its own actions. Stanford scientists spent the decade of the 1970s building a cart
that could follow a line or be controlled by a computer. About the same time,
business organizations also were conducting robotics research. For example, Honda
began research on collaborative robots in 1986 (see Schaefer et al., 2015, for details
on key historical achievements in the evolution of robot design). Given the big
role that automobile manufacturers played in the early days of industrial robots, it is
not surprising that automobile manufacturers made 40 per cent of the worldwide
purchases of robots in 2013 (Goodman, 2015). The Occupational Safety and
Health Administration has tallied 33 robot-related deaths in the USA alone, and more are likely
as robots become more prevalent (Goodman, 2015).

Robot design features


For the most part, the robots in industry do not look or act like the robots that are
popularized in film and fiction, and on which most people base their view of
robots. The portrayal of robots in fictional media suggests that robots should be
humanlike in appearance, behaviour, and cognitive capability (Schaefer et al.,
2015). Typically, robots in fictional media have been portrayed as metallic versions
of humans with some specific features emphasized and others removed. For
example, many fictional robots have two legs, two arms, a torso, and a head. This
was the case with the robot GORT in the 1951 film The Day the Earth Stood Still.
However, since he did not have any emotions, he did not need facial features such
as eyes, eyebrows, nose, and mouth. The importance of behavioural features has to
do with the robot’s intention and purpose. Many fictionalized robots are pro-
grammed to follow Isaac Asimov’s original Three Laws of Robotics: “(1) A robot
must not harm a human or allow a human to be harmed. (2) A robot must obey a
human unless orders conflict with the first law. (3) A robot must protect itself from
harm unless this violates the first two laws” (Asimov, 1942, cited in Hancock,
Billings, & Schaefer, 2011, p.24). While many fictionalized robots are portrayed as
protectors of humans who have only good intentions, some robots have been
portrayed as ‘evil’, such as Lore in Star Trek: The Next Generation or KARR
(Knight Automated Roving Robot) in the TV series Knight Rider (Schaefer et al.,
2015). More recently, greater emphasis has been placed on robots’ anthro-
pomorphic cognitive capabilities, which are based on artificial intelligence. The
cognitive capabilities are an important way of integrating robots into society.
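Read as a decision procedure, Asimov’s laws impose a strict priority ordering on a robot’s candidate actions. A minimal sketch, with purely hypothetical action flags (real robot ethics is, of course, nothing this tidy):

```python
# Sketch of Asimov's Three Laws as a strict priority ordering: prefer actions
# that violate no law; otherwise violate the lowest-priority law possible.
# The action flags are hypothetical, purely for illustration.

def law_violated(action):
    """Return the rank of the highest-priority law the action violates
    (1 = First Law, 2 = Second, 3 = Third), or None if it violates none."""
    if action.get("harms_human"):
        return 1
    if action.get("disobeys_human_order"):
        return 2
    if action.get("harms_robot"):
        return 3
    return None

def choose(actions):
    # Law-abiding actions sort first; among violators, prefer the
    # highest-numbered (least important) law.
    return min(actions, key=lambda a: (law_violated(a) is not None,
                                       -(law_violated(a) or 0)))

candidates = [
    {"harms_human": True},           # push a bystander out of the way
    {"disobeys_human_order": True},  # ignore the operator's command
    {"harms_robot": True},           # absorb the impact itself
]
print(choose(candidates))  # the Third Law yields: the robot sacrifices itself
```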
Besides anthropomorphism, two other design features that have gained impor-
tance over time in fiction and in the workplace are the matching of form with
function and human-robot integration. The physical form must match the func-
tional design. This translates into making companion robots more human- or
animal-like, whereas industrial robots may emphasize only the underlying function
for which they were designed. As robots become more integrated into society and
the workplace, they must prove to be more reliable, predictable, and safe (Schaefer
et al., 2015). Increasingly, robots are becoming part of industrial teams. This places
a twist on Asimov’s laws: Robots must not only inspire trust within humans that
they will do their jobs, but they must also actively promote the “interests of a
larger organizational entity”: their work team (Hancock et al., 2011, p.26).
Ironically, Hancock and colleagues (2011) provided an example of how decep-
tion on the part of robots can lead to beneficial results. Even though the robots
may be able to process massive amounts of data, only a few bits of that information
may be pertinent to the human’s decision. Rather than increase the human’s
workload and create situations of overload, the robot should deceive the human
into believing that the information that the robot has delivered is all the informa-
tion that has been collected. Hancock et al. (2011) view such actions on the part of
a robot as deception by omission. However, this deception on the robot’s part can
help avoid costly mistakes that might follow from humans who are cognitively
overloaded.

Another dark side of robot technology: safety


As we have noted, GM pioneered the use of robots as early as 1961 in its die-
casting, welding, and painting operations. By 1980, it had 300 robots on the fac-
tory floor (Hamilton & Hancock, 1986). The number of robots working away at
GM has mushroomed to at least 30,000 (Tingley, 2017). Over 30 years ago,
Hamilton and Hancock indicated that any repetitive task is a candidate for robotics.
Companies are incentivized to substitute human labour with robots when they
hear that they can get a high return on investment within a short payback period
of a few years. Plus, the tolerances that robotics can achieve on repetitive tasks
often exceed those of the robots’ human counterparts. These statistics seem to be
borne out in a recent New York Times article: “The robot’s price tag was $35,000,
and within two months, it paid for itself by quadrupling the efficiency of the press
and eliminating scrap” (Tingley, 2017, p.32).
Safety has long been a major dark side of using robots in manufacturing orga-
nizations. Japanese industries first started using robots in 1969, and the number
doing so nearly doubled each year between 1975 and 1981 (Hamilton & Hancock,
1986). But just as the number of robots grew, so did the safety issues. A 1983
report of industrial robots published by the Japanese Ministry of Labour specified
48 recorded accidents involving human workers and robots: 2 resulted in lost-time
injuries, 7 resulted in minor injuries, and 37 were near misses. This record con-
stitutes a high injury ratio when compared to most common occupational
situations. Unfortunately, robots have been involved in fatalities in several
industrial and military accidents. For example, a US car factory employee was
asphyxiated by a robot that mistook him for an auto part when the employee was
cleaning the robot’s cage, and nine South African soldiers were shot to death
during a training exercise when a robotic anti-aircraft gun suffered from a computer
glitch (Goodman, 2015).
To avoid such calamities in the future, Hamilton and Hancock (1986) recommended
proximity sensing systems to prevent the robot from colliding with
humans in any way. Further, robots that are capable of inflicting harm on humans
should be exclusion guarded to prevent humans from entering locations where
they are operating. These recommendations appear to have been implemented
with some robots. For example, the owner of Dynamics, a manufacturing plant
that produces moulds for the mass production of small plastic and metal parts,
tested a robot’s safety by letting the robot collide with a part of his body. The
robot was installed after passing the owner’s safety test. Though the owner would
not perform the test again for fun, the collision did not leave a bruise. However, as
robots become more collaborative, the operations they carry out are becoming
more complex, and ensuring the safety of teammates is challenging to say the least.
Another challenge that has emerged since Hamilton and Hancock first started
writing about robotic safety is the risk to the robots, and thus to the humans that
they serve, from malicious hacking (Goodman, 2015).
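The proximity-sensing idea can be illustrated with a toy speed governor: halt inside an exclusion zone, slow down in a warning zone, run at full speed otherwise. The thresholds and the linear ramp here are our own assumptions for illustration, not any published safety standard or vendor logic:

```python
def safe_speed(distance_m, stop_at=0.5, slow_at=2.0, full_speed=1.0):
    """Scale a robot's speed by the distance to the nearest detected person.

    Inside `stop_at` metres the robot halts; beyond `slow_at` metres it runs
    at `full_speed`; in between, speed ramps up linearly with distance.
    """
    if distance_m <= stop_at:
        return 0.0  # exclusion zone: hard stop
    if distance_m >= slow_at:
        return full_speed
    return full_speed * (distance_m - stop_at) / (slow_at - stop_at)

print(safe_speed(3.0), safe_speed(1.25), safe_speed(0.3))  # → 1.0 0.5 0.0
```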
Manufacturing organizations are not the only organizations benefiting from
advances in robotic technologies. Robotic advances are being seen in pharmacol-
ogy (Hemmerling & Taddei, 2011; Hemmerling, Taddei, Wehbe, Zaouter, Cyr, &
Morse, 2012). They also are being used to perform surgeries. The da Vinci® Sur-
gical System, a minimally invasive robotic surgery system, allows greater precision
in cutting and offers better views of the patient’s surgery site, especially because
there is less blood to obscure the vision. However, innovative robotic technologies
have killed hospital patients in recent years (Sharkey & Sharkey, 2013), though the
number of injuries recorded is far less for surgical robots than industrial ones
(Goodman, 2015). As one example of a fatal injury, a Chicago man died in 2007
when a surgeon punctured his spleen while operating a $1.8 million da Vinci
‘hands-on’ surgical robotic system for the first time on a living person. Saunders
et al. (2016) pointed to five hazards that are at the heart of such tragedies: (1) overloaded or
underloaded OR professionals; (2) inadequate training of surgeons on the robotic
systems; (3) inadequate training for the healthcare professionals on the surgical team;
(4) the complexity of HIT; and (5) overconfident surgeons. We argued that adequate
training of healthcare professionals on the technology, as well as certification of
mastery of its use, would go a long way towards preventing such tragedies.
Saunders et al. (2016) suggested that the MDAs on the surgical teams are
underloaded and, hence, may be bored. Yet there are a number of reasons why
members of the surgical team may be overloaded. Alarms in the OR may be one.
It turns out that the number-one health hazard in the OR for three years in a row
(2013–2015) was the overwhelming number of alarms (ECRI Institute, 2012,
2013, 2014). Surgical team members may become overloaded by alarms that are
constantly going off and may not pay adequate attention to each one. To pay
attention and respond appropriately to these alarms increases mental load and may
drain attentional resources (Tollner, Riley, Matthews, & Shockley, 2005). Further,
the complexity of the robotic systems along with their associated new medical
procedures may intensify team members’ mental strain and stress, adding to their
mental load (Ayyagari, Grover, & Purvis, 2011; Tarafdar et al., 2007). Moreover,
surgeons may become distracted as they perform complex surgical movements
under time pressure. They must remember the proper sequence of steps in a given
procedure as they converse with surgical team members about instruments and
the patient’s status (Tollner et al., 2005). Not surprisingly, the surgeons may be
especially overloaded (Zheng, Cassera, Martinec, Spaun, & Swanstrom, 2010).
Sergeeva, Huysman, and Faraj (2016) spent 102 hours observing operations that
used the da Vinci robotic system. They observed the challenges surgeons face when
learning to operate the system. Surgeons have to unlearn how they performed
surgeries, using their senses, in the past. In particular, they relied on the sense of
touch inside the patient’s body. Further, they need to unlearn being in the centre
of things since they now sit remotely in the OR – away from the patient and in
front of a technologically advanced 3D camera. The surgeons must learn to operate
the robotic arms from a distance by relying heavily on the clear and detailed 3D
images. But surgeons are not the only ones who need to be retrained. This applies
to the whole team. For example, it is the scrub nurse who actually inserts or
changes precision instruments in the robotic arm and takes tissues out of the
patient’s body. MDAs must learn to anticipate sudden movement from the robotic
arms that tend to be hovering over the patient. In fact, the MDAs that Sergeeva
et al. (2016) observed made a metal shield to protect the patient’s face from any
sudden movements by the robot. In one emergency situation, the MDA had to
crawl under the table on which the patient was resting in the OR in order to
reinsert a breathing tube that had fallen out of the patient’s mouth. In summary,
operating with the robotic system can be dangerous, and the entire team needs to
be trained to provide coordinated responses to unanticipated problems.

Interplay of technology with dark side diamond factors


When automation and robots are brought into organizations, the structure of teams
needs to be carefully evaluated and, where necessary, adapted. Further, the division
of labour within teams needs to be modified, and team members need to be
trained to operate the technology, to monitor it to see that it is working correctly,
and to update it when necessary. In terms of people impacts, employees may
experience techno-complexity when they are compelled to operate the robots or
use the system.

Conclusion
In this chapter we have described the dark side of IT from four perspectives:
organizational design and structure, work, people impacts, and technology. Each
has the potential to cause serious damage at the organizational level as well as at
individual and even societal levels. Further, the damage can be compounded when
these four factors interact with one another. Table 5.1 summarizes the key issues
and interrelationships.
TABLE 5.1 Summary of the Information Technology dark side diamond

Organizational design and structure
Description: Organizational structure is the pattern of interactions, or the network of relationships, that exists among organizational members and units. Organizational design is the process whereby organizational structure is made to fit with specific characteristics both inside and outside the organizational system.
Issues: Information processing capacity; collaboration; communication.
Interrelationships with other factors: Work: consider changes in the nature of work; divide collaborative work among members; designate support people for stars; design organizations to facilitate information flows. People impacts: behavioural changes may occur in regard to collaboration. Technology: leverage technology to help collaboration; establish norms and policies for emails and social media.

Work
Description: Work is an “ongoing, often unending, stream of meaningful activities that allow the worker to fulfil a distinct role” (Pearlson et al., 2016, p.77).
Issues: Work overload; role overload; work-family conflict; work-life balance; work-family relationships; organizational performance; individual burnout; individual dissatisfaction with work.
Interrelationships with other factors: Organizational design and structure: implement organizational policies and norms to reduce work-family conflict; support legislation to reduce work-family conflict; design tasks to reduce overload. People impacts: work demands and resources (work and personal) predict technostress, burnout, and intention to leave the organization. Technology: IT delivers messages at all times of the day and night, making it difficult to separate family life from work.

People impacts
Description: Technostress is a type of stress experienced in organizations by technology end users as a result of their inability to cope with the demands of organizational computer usage. Pathological Internet Use comprises the consequences of problematic cognition coupled with behaviour that intensifies or maintains a maladaptive response. IT addiction is the state of being challenged in balancing IT usage mindfully so as to preserve one’s resources; it includes Internet, mobile email, and SNS addictions.
Issues: Performance/productivity; satisfaction with work; voluntary intention to quit the organization.
Interrelationships with other factors: Organizational design and structure: work can be designed with flexibility (e.g., job sharing) and norms about using communication media. Work: organizations can establish and enforce policies to promote work-life balance (e.g., flexitime, freedom to turn down overtime without repercussions). Technology: policies and norms can be created to reduce technostress and IT addiction.

Technology
Description: Though technology is found in myriad forms, our dark side focus is on automation and robots. A robot is a reprogrammable multifunctional manipulator designed to move material, parts, tools, or specialized devices through variable programmed motions for the performance of tasks. Automation is the automatic operation of an apparatus, process, or system performed by IT to take the place of some aspect of human performance.
Issues: People’s work being replaced by robots or automation; robot safety.
Interrelationships with other factors: Organizational design and structure: organizational work needs to be redesigned to reflect differences in task performance by robots or in response to automation. Work: with automation and the introduction of robots, humans (individuals and teams) need training when they are placed in charge of robots and automated systems to make sure they operate smoothly; when they are replaced by automated systems or robots, people need to be retrained for other jobs. People impacts: workers experience techno-insecurity from fear of losing their jobs to automation or robots and techno-complexity from having to learn and adapt to new technologies.
6
MEASURES OF IT-RELATED OVERLOAD

In the business world, there is a well-known adage: ‘If you can’t measure it, you
can’t manage it.’ Measurement is critical to understanding new discoveries and
therefore for advancing science. Some early examples of measurement based on
triangulation are given below.
In 1800 Britain was faced with a colossal problem. It was trying to rule a sub-
continent without having a clue as to what it looked like. So it commissioned
Colonel William Lambton to undertake the Great Trigonometrical Survey (more
commonly known as the Survey of India) to ascertain precisely where places were
located in the colony. In 1808 Lambton started surveying the southernmost tip of
India. He used a triangulation technique which incorporated trigonometry, chains,
metallic bars, monuments, and theodolites to survey the land. When Lambton
died, George Everest took over the mission. Everest was, in turn, succeeded
upon his retirement in 1843 by Andrew Scott Waugh. It took decades to measure India. In the
process, Waugh came upon the world’s tallest mountain, which he named in 1856
in honour of Everest (Arbesman, 2013).
A way of measuring this massive area of land was essential for its governance by
the British Empire. Almost a century earlier, the French had undertaken a similar
arduous mission involving triangulation. Charles Marie de La Condamine took
part in the French geodesic project (1735–1744) to triangulate the distance through
the Andes in order to settle the question of the earth’s circumference. Although
their project had been carefully planned, they encountered many mishaps (see
Bryson, 2003).

Triangulation: the three amigos


In A Short History of Nearly Everything, Bryson (2003) described the 18th century as
a period when scientists were infected with a powerful desire to understand, and
therefore measure, the earth. Hipparchus of Nicaea (150 BC) used triangulation to
work out the distance of the moon from the earth. Once Hipparchus knew the
length of one side of a triangle as well as the values of both corner angles at the
baseline, pointing at the moon allowed him to determine all the other dimensions.
His approach, triangulation, is used extensively in navigation and military strategy.
Based on the principles of geometry, triangulation increases the accuracy of
observations.
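Hipparchus’ reasoning can be reproduced with the law of sines: knowing one side of a triangle (the baseline) and the two sight angles at its ends fixes every other dimension. A small sketch with made-up survey numbers:

```python
import math

def triangulate(baseline, alpha_deg, beta_deg):
    """Perpendicular distance to a target sighted from both ends of a baseline.

    alpha_deg and beta_deg are the angles (in degrees) between the baseline
    and the lines of sight to the target, one at each end of the baseline.
    """
    alpha, beta = math.radians(alpha_deg), math.radians(beta_deg)
    gamma = math.pi - alpha - beta                      # third angle of the triangle
    side = baseline * math.sin(beta) / math.sin(gamma)  # law of sines
    return side * math.sin(alpha)                       # drop a perpendicular

# A 100 m baseline with 60-degree sight angles at both ends:
print(round(triangulate(100, 60, 60), 1))  # → 86.6
```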
The term triangulation is also applied to a research strategy that uses a multitrait-
multimethod approach, or convergent validation. It uses multiple reference points
(often more than three) to measure phenomena (Jick, 1979). Its goal is to ensure that
the results of research are not the product of methodological artefact (Campbell &
Fiske, 1959). According to Denzin (1978), triangulation can be applied within-method
by using multiple comparison groups or multiple scales/indices for the same construct.
Triangulation within-method aims mostly at assessing construct reliability (i.e., internal
consistency) and validity. For example, two different scales could be used to measure
overload. Triangulation can also be applied between-methods by using diverse
methods for cross-validation; for example, using semi-structured interviews and phy-
siological measurements to capture overload. Between-methods triangulation is mostly
concerned with the generalization of constructs (i.e., external validity).
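Within-method triangulation of this kind is often checked by correlating the two scales: a high correlation between independent measures of the same construct supports convergent validity. A stdlib-only sketch with hypothetical respondent scores (the data are invented):

```python
def pearson_r(x, y):
    """Pearson correlation between two lists of scale scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical overload scores for six respondents on two different scales:
scale_a = [2, 4, 5, 3, 6, 7]
scale_b = [3, 4, 6, 3, 5, 7]
print(round(pearson_r(scale_a, scale_b), 2))  # → 0.92
```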
Triangulation may provide a new lens for viewing and therefore understanding
IT-related overload. Triangulation makes it possible to “capture a more complete,
holistic and conceptual” portrayal of the phenomenon (Jick, 1979, p.603). Our
earlier methods to measure IT-related overload were not always successful, but we
learned a lot during the painful process. In the next sections we want to share with
you what we learned. We believe that triangulated measurement of IT-related
overload can expand the knowledge of this phenomenon. However, we also
recognize that measurement can be a double-edged sword: if the measurement is
inaccurate, it can create a spurious understanding of IT-related overload. There-
fore – like our predecessors William Lambton, de La Condamine, and Hipparchus of
Nicaea – we employ triangulation to measure IT-related overload. In
this chapter, we start with a philosophical discussion of theoretical systems and
measurements. We describe the use of self-report measures, which is where most
overload researchers have started. We then suggest approaches to complement the
self-report measures. These include physiological measures such as thermal imagery
as well as measures of electrodermal activity such as heat flux, galvanic skin
response (GSR), and physiological measures of energy expenditure in Metabolic
Equivalent of Tasks (METs). We conclude with studies that employ some of these
measurement approaches and suggestions for moving forward on the neuroscience
measurement frontier.

Theoretical systems and measurements


As we discussed in Chapter 2, science practices are organized in schools of thought
in relation to specific kinds of scientific achievements. Traditions of scientific
research form scientific languages with their own set of concepts, conventions,
codes, and rules, providing ways to look at the world through particular lenses.
Theory may also be viewed as a system in which constructs are related to each
other by propositions (Bacharach, 1989). Constructs are defined as “terms which,
though not observational either directly or indirectly, may be applied or even
defined on the basis of the observables” (Kaplan, 1964, p.55). Indeed, constructs
are operationalized into configurations of variables. A variable is an observable entity
which is capable of assuming two or more values (Schwab, 1980). Hypotheses
relate variables to one another and are typically tested using statistical methods.
Thus, both propositions and hypotheses belong to theoretical systems and are
statements of relationships. Operationalizing constructs into observables relates to
‘measurement’. Measurement is critical to the advancement of science. Sinan Aral
suggested that “revolutions in science have often been preceded by revolutions in
measurement” (Kitchin, 2014, p.1). Science may be considered a language since
it evolves constantly not only through paradigm shifts, but also with new instru-
ments and technologies.

Popper’s three worlds


Popper’s (1978) approach to science considers the existence of three worlds: World
1 is the world of physical bodies such as stones, stars, animals, or radiation; World 2
is the world of mental or psychological states or processes, or of subjective experi-
ences; World 3 consists of the products of the human mind, such as languages, tales
and stories, and scientific conjectures or theories.
The mind-body supervenience problem can be related to Popper’s three worlds.
The construct ‘mind’ can be perceived as a mere object in World 1 (i.e., brain),
composed of tissues and neurons whose reaction to stimuli can be studied as
impacting the body. The ‘mind’ is also defined as a thinking substance (Descartes,
1644), as mental representation or schemata (Kant, 1781–1787/2003), or as a meta-
phor – the soul’s vessel (Diderot, 1818–1819) – and is thus part of World 2. In phi-
losophical terms, theoretical products of the human mind exist in World 3. They are
produced to make sense not only of the so-called brain (the material, physiological
components of World 1), but also of the psychological processes and subjective
experiences in World 2. The philosophical debate around the mind-body relationship
relied heavily on the ‘meanings of words’, which in fact form theories in World 3. In the
philosophical tradition, they are not subject to objective measurements. As we have
discussed, a series of paradigm shifts established cognitive psychology as the new sci-
ence of the mind, operationalizing abstract philosophical supervenience problems to
produce measurable observations and tests (Gardner, 1987).

Objective versus subjective measurements


Early scientists operationalized the mind as internal mental processes using the
method of introspection (i.e., verbal report), which is characterized as subjective
(i.e., qualitative). They later applied the methods of natural sciences (e.g., sta-
tistics), characterized as objective (i.e., quantitative); for example, in measuring
learning performance. Popper (1959) stated that the terms ‘objective’ and
‘subjective’ are “heavily burdened with a heritage of contradictory usages and
of inconclusive and interminable discussions” (p.44). The contradictory usage
refers to the term ‘subjective’ being somehow tainted, as it applies to our
“feeling of conviction” (which has varying degrees) (Popper, 1959, p.46).
However, as Kant (1781–1787/2003) emphasized, “objective reasons too may
serve as subjective causes of judging” (in Popper, 1959, p.45). That is, objective
measures sometimes can lead to tainted conclusions. The method used to
measure objective versus subjective phenomena is not a certification of ‘objec-
tivity’ in the broader sense of the term.
In this chapter, we define the term objective as focusing on the object, the
material in World 1 (Popper, 1978). That is, the brain and its architecture sup-
port information processing. The brain is therefore a mere material oper-
ationalization of the mind in World 1. The functioning of some part of the
brain, such as the prefrontal cortex (PFC), enables information processing and
therefore mental processes such as decision-making in World 2. It is deemed
‘object’ in the sense that it has universal properties and functions, just as a chair has
four legs and can be used to sit on. As we reported in Chapter 2, anatomic brain
damage, particularly of the PFC, leads to the inability to use affective feedback
in making judgments and decisions (Damasio, 1994). Also, individuals suffering
from Narcissistic Personality Disorder have less brain matter in areas that overlap
with the areas associated with empathy (i.e., left anterior insula, rostral and
median cingulate cortex, as well as part of the PFC) (Schulze, Dziobek, Vater,
Heekeren, Bajbouj, Renneberg et al., 2013; Nenadic, Güllmar, Dietzek, Lang-
bein, Steinke, & Gader, 2015).
We define the term subjective as focusing on the subject, the psychological states
of subjective experiences (Popper, 1978). Phenomenological sociology (Weber,
1949) introduced a clear focus on the World 2 of subjective meanings. Weber
underlined that social action should be studied through interpretive means, referred
to as Verstehen. In sociology, understanding the subjective meaning and purpose
attached to the actions of individuals is necessary to understand social actions
(Calhoun, 2002). The focus is on the subjective meaning that humans (i.e., subjects)
attach to their actions and interactions within specific social contexts.
In the late 1800s, Sir Francis Galton (1892) was inspired by the work of Darwin
(1859) on natural selection. Galton demonstrated that objective tests could provide
meaningful scores. He was the first to use statistical methods to study individual
differences and the inheritance of intelligence. To do so, he introduced the use of
questionnaires and surveys for collecting the data that he needed for his anthropo-
metric studies. He is commonly referred to as the father of psychometrics (see Kaplan
& Saccuzzo, 2010).
In the early 1900s, Binet and Simon (1904) introduced the first standardized
Intelligence Quotient (IQ) test (referred to as the Binet-Simon). The Binet-Simon
was revised and validated by Terman (1916), a psychologist at Stanford. In clinical
psychology it is referred to as the Stanford-Binet Intelligence Scale (SBIS). The
SBIS is concerned with five factors: knowledge, quantitative reasoning, visual-spatial
processing, working memory, and fluid reasoning. It has verbal and non-verbal
components. It is mostly used to assess intelligence and is time-dependent. That is,
not only performance but also the time to complete the test is taken into account
when evaluating an individual’s IQ.
Tests like the Stanford-Binet are norm-referenced, which means that an
individual’s score is interpreted relative to a reference group. They are broad sets of tests
and are mostly predictive of aptitude, achievement, creativity, personality,
neuropsychological functions, and behaviours. Neuropsychological tests measure
cognitive, sensory, perceptual, or motor functions, while behavioural tests focus
on behaviours and their antecedents and consequences. The field of psycho-
metrics is concerned with the objective measurement of skills and knowledge
and of abilities, as well as of attitudes and personality traits. In other words,
psychometrics was developed to objectively assess, through the methods of
natural sciences, particularly statistics, whether a specific observable variable
accurately measures a given construct. Therefore, the main focus is on the
construction and validation of instruments such as scales, scorecards, or even
protocols to ensure their reliability and validity. Psychometric properties are
highly desirable when addressing individual differences. They are used exten-
sively in clinical and organizational psychology. The original postulation was to
objectively determine the ‘fit’ of people with their environment in order to
predict their performance. Such standardized tests are used heavily in many
different types of organization (e.g., military, aviation, astronautics) for recruit-
ment and training purposes.

Observables and measures of IT-related overload


Most prior research considered overload to be a construct measurable indirectly
through observables such as a large amount of information or too little proces-
sing capacity or resources. Much of this research captures observables using self-
report (questionnaire) measures as well as verbal protocol reports of overload
either in laboratory or field settings. Most researchers employ a computational
approach which focuses on human information processing abilities when inter-
acting with a task. In the fields of management and organizational behaviour,
overload is theorized from an input perspective (i.e., amount of information) in
the context of person-environment fit/misfit regarding the demands made
by an organizational task. In the organizational literature, overload is assumed to
exhaust the resources an organization has available for its employees to process
the information needed for decision-making. Therefore, it is operationalized
through the measurement of observables such as amount or increase of infor-
mation, dimensions, alternatives, task complexity, design features, and techno-
logical features. Usually scholars do not consider the subjective experience of
overload at all, let alone measure it. In philosophical terms, there are subjective
versus objective constructs of overload, and these require different measure-
ments. Further, while overload has been studied, underload has not been as
widely researched.

Subjective experience of overload


Most of the previous research attempting to measure the subjective experience of
overload asked the participants to report their experiences of overload in the work
context. Data were captured using large field surveys or verbal protocols.
Researchers typically used one, two, or a set of straightforward questions. For
example, in one field survey researchers asked managers: “What does information
overload mean to you?” or “How often do you experience it?” (Farhoomand &
Drury, 2002, p.128). In his large field survey of navy personnel about the amount
of information they used in decision-making, O’Reilly (1980) asked questions such
as: “In a typical work week, approximately how often do you have less than the
amount of information you could consistently handle for making the best possible
work-related decision?” or “Is the total amount of information you receive in a
typical work week enough to meet the information requirements of your job?”
(p.689). Notably, as we reported in Chapter 4, the enlisted navy personnel per-
formed better in situations of underload, but were less satisfied (O’Reilly, 1980).
Sometimes semi-structured interviews were used, such as in the context of the
police force when researchers asked about “the volume of messages sent and
received” and how overloaded the participants perceived themselves to be (Allen &
Shoard, 2005).
Overload due to technology has been captured in survey questions on partici-
pants’ subjective evaluation of the impacts of the introduction of a new technology
on the nature and amount of information received and the additional work that it
created (Schultz & Vandenbosch, 1998): “Lotus Notes results in info overload”,
“Because of the introduction of Lotus Notes I take more work home” (p.142), etc.
Other early researchers asked participants: “Thinking back over your experiences
with the system, how frequently have you felt overloaded with information?”
(Hiltz & Turoff, 1985, p.683). Straightforward questions provided interesting
results – particularly in laboratory settings. However, most operationalizations were
unidimensional and the scales did not display desirable psychometric properties.
More recently, overload has been conceived as multidimensional and more
attention has been paid to the psychometric properties in its operationalization.
These operationalizations use similar observables such as amount of information,
workload, resources, and design features, but they include multiple items to capture
different dimensions. Researchers focus on task requirements and individual dif-
ferences. For example, Ahuja and Thatcher (2005) used five items to measure the
quantitative and qualitative dimensions of work overload, including resources. Two
items represented the dimension of quantitative overload – for example, “I never
have enough time to do what is expected of me at work”; three items represented
qualitative overload – for example, “To be successful on my job requires more IT
skills than I currently have” (Ahuja & Thatcher, 2005, p.459). Qualitative
overload is also measured using another scale of three items (Pennington et al.,
2006); for example, “The quality of work expected was too difficult” (p.35). Tar-
afdar et al. (2007) also measured qualitative and quantitative overload, but they did
so in terms of role overload, or “when the requirements from an individual’s role
exceed his or her capacity in terms of the level of difficulty or the amount of
work” (Tarafdar et al., 2007, p.307). Interestingly, the five items relating to techno-
overload are similar to a well-known construct in psychology, forced compliance
(Festinger & Carlsmith, 1959), and reflect both time pressure and complexity. Two
sample items from the techno-overload scale are: “I am forced by this technology
to work much faster” and “I have a higher workload because of increased tech-
nology complexity” (Tarafdar et al., 2007, p.314). Karr-Wisniewski and Lu (2010)
combined three dimensions in their scales: information overload, communication
overload, and system feature overload. Their scales display sound psychometric
properties.
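The internal-consistency coefficients such scales report (Cronbach’s alpha) can be computed directly from raw item responses. A stdlib-only sketch; the Likert responses are invented for illustration:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: `items` is a list of item-score lists (one per item).

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k, n = len(items), len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three invented seven-point Likert items answered by four respondents:
responses = [[1, 3, 5, 7], [2, 3, 5, 6], [1, 4, 5, 7]]
print(round(cronbach_alpha(responses), 2))  # → 0.98
```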

Objective measurements
Other research measures the objective experience of overload. The basic pre-
mise in this research has been to use proxy measures of demands on the brain (i.e.,
brain load or mental load) using the amount of information or alternatives. This
makes it possible to test for bottleneck effects of observables on higher activities
such as decision-making or learning. Researchers mostly use questionnaires in the
context of laboratory experiments. Their research designs operationalize the
amount of information in terms of the actual input delivered to the participant.
Observables relate to message complexity, such as the number of words in the
body of an online message, the number of words in non-indented lines in the body
of a message, the number of lines according to header fields, the number of lines
excluding those of attachments, the number of indented lines excluding those of
attachments, and the number of contributors to ‘reply’ messages (Jones et al.,
2004). Overload is also operationalized by manipulating the number of alternatives
in order to create bottleneck effects (Chervany & Dickson, 1974; Payne, 1976;
Jacoby, 1984; Cook, 1993) or by manipulating interruptions and task complexity
while using a technology (Speier et al., 1999). Still other operationalizations
employ observables such as the numbers of ideas, idea diversity, time, and task
domain (Grise & Gallupe, 1999–2000).
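Counts of this kind are easy to extract mechanically. A minimal sketch in Python (the function, field names, and the indentation rule are our own simplification, not Jones et al.'s exact coding scheme):

```python
def message_observables(body: str) -> dict:
    """Count simple complexity observables for one message body (cf. Jones et al., 2004)."""
    lines = body.splitlines()
    # Treat lines starting with whitespace or '>' as indented/quoted material
    non_indented = [ln for ln in lines if ln and not ln.startswith((" ", "\t", ">"))]
    return {
        "n_words": sum(len(ln.split()) for ln in lines),
        "n_lines": len(lines),
        "n_non_indented_lines": len(non_indented),
        "n_words_non_indented": sum(len(ln.split()) for ln in non_indented),
    }

msg = "Please review the draft.\n> quoted reply text\nThanks,\nAna"
print(message_observables(msg))
```

Such counts can then serve as the independent variable when testing for bottleneck effects on decision-making or learning.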
In discussing objective measurements of cognitive load, we would be remiss if
we did not mention the work of Sweller and his colleagues (see Sweller et al.,
1998; Paas et al., 2004). Sweller (1988) was among the first to measure cognitive
load in the context of learning. He proposed four types of cognitive load: intrinsic,
extraneous, germane, and informational. He and his colleagues devised various
measurements for most of them. For example, in researching the inherent com-
plexity determined by the interaction between the nature of the material and the
expertise of the learner (i.e., intrinsic cognitive load), he measured cognitive abil-
ities and skills. Sweller and colleagues noted the technology features when studying
extraneous cognitive load and measured the amount of information when studying
information load. Relevant to our research, they found that different technology
features differentially impacted extraneous cognitive load depending upon the
modalities involved (e.g., visual, auditory, tactile). Splitting attention between
technologies (multitasking) increases cognitive load, especially when different
modalities are engaged.

Cognitive versus emotional overload


To the best of our knowledge, we were among the first to try to measure both
cognitive overload and emotional overload. We built sets of items separating the
emotional and cognitive dimensions instead of measuring them as a single con-
struct. The items below were adapted from Rutkowski and Saunders (2010, p.94).
Cognitive overload (Cronbach’s alpha=.914) was composed of four items and
rated on a seven-point Likert scale from 1 (not at all) to 7 (very much):

 You cannot process the number of requests you receive to use new Internet
communication tools.
 You cannot handle the number of requests you receive to use new Internet
communication tools.
 You cannot cope with the number of requests you receive to use new Internet
communication tools.
 You are overwhelmed by the effort it takes to handle the number of requests
you receive to use new Internet communication tools.

Emotional overload (Cronbach’s alpha=.917) was composed of three items and rated on a seven-point Likert scale from 1 (not at all) to 7 (very much):

 You feel pretty irritated by the number of requests you receive to use new
Internet communication tools.
 You feel emotionally pressured by the number of requests you receive to use
new Internet communication tools.
 You feel confused by the number of requests you receive to use new Internet
communication tools.
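Reliability coefficients like the Cronbach's alpha values reported above can be computed directly from an item-response matrix. A minimal sketch with invented Likert responses (not the actual study data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents x 4 items on a 1-7 Likert scale
responses = np.array([
    [6, 7, 6, 7],
    [2, 1, 2, 2],
    [5, 5, 6, 5],
    [3, 3, 2, 3],
    [7, 6, 7, 7],
    [4, 4, 4, 5],
])
print(round(cronbach_alpha(responses), 3))  # high internal consistency
```

The coefficient approaches 1 when the items move together, as they should for a unidimensional scale.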

While we could separate overload into emotional overload (e.g., frustration, stress, etc.) and cognitive overload (e.g., inability to process the information,
making mistakes, not completing all requested calculations, etc.) using factor ana-
lysis, we were unable to separate the dimensions in our subsequent Partial Least
Squares analysis and Structural Equation Modelling. However, based on a large
panel survey (n=1,868), we identified some of the elements characteristic of the
pool of resources (Rutkowski, Saunders, Wiener et al., 2013). Prior experience of
overload in general (emotional and cognitive) was positively correlated (r=.502, p=.0001) with anxiety, and negatively correlated (r=-.560, p=.0001) with Need for Cognition. Prior experience of overload with technology is very different for a ‘geek’
who has a high level of Need for Cognition than it is for most technology users,
especially when they are anxious (Tsai et al., 2007; Rutkowski, Saunders, Wiener et
al., 2013). It is striking that we could conclude from this research that highly
educated participants suffered significantly from prior experience of overload with
technology.
Saunders et al. (2017) recognized the challenges in teasing out the two separate
types of overload in their study of overload from mobile phones. Instead, we used
Karr-Wisniewski and Lu’s (2010) three technology dimensions and adapted the
scales to ensure that each dimension had a cognitive and an emotional component.
Further, the items were written in terms of ‘I’ versus ‘you’. The Saunders et al.
(2017) scale is multidimensional and displays good psychometric properties. It is
displayed in Table 6.1.
As challenging as it appears, measuring IT-related overload is a ‘piece of cake’
compared to measuring the Emotional and Cognitive Memories of Overload from
Technology embedded in an individual’s schemata. The individual may not even
be aware of the structure of the schemata in Long-Term Memory, or the complex
blend of emotional tags that are embedded in his/her mental framework (Bower,
2001). From our research we learned two things about self-report measures of
Emotional and Cognitive Memories of Overload from Technology. First, although
the schemata are always evolving, items for Emotional and Cognitive Memories of
Overload need to be stated in the past tense. Second, whereas IT-related overload
can be associated with a particular type of technology such as mobile phones
(Saunders et al., 2017) or video contact equipment (Rutkowski & Saunders, 2010),
the memories are associated more generally with other technologies like computers
or the Internet. This is because overload situations were tagged and embedded in
their Long-Term Memory in the past when individuals first struggled with new
technologies. Hence, when using self-reports of overload with one technology,
the individuals’ previous memories of overload most likely are not for the same
technology. Table 6.2 presents the measurement of past cognitive and emotional
memories used in Saunders et al. (2017).
From a scientific perspective, it is worthwhile to aim at falsifying any theoretical
construction to assess its strength (Popper, 1959). “There is actually a certain value
in not finding anything” (Bryson, 2003, p.57). It surely motivates researchers to
explore further. We therefore continue our expedition to understand the emotion-
cognition supervenience problem regarding IT-related overload. As many of our
predecessors did, we equipped ourselves with new instruments. Indeed, while self-
report measures may be relatively easy to develop and use, the individuals surveyed
cannot adequately convey the extent to which they experience overload using self-
report items since they might not be fully aware of how overloaded they are.
Further, self-reports are subject to a range of errors and biases that might threaten the reliability and validity of empirical studies (e.g., Campbell & Fiske, 1959; Hobfoll, 1989). One of these is the social desirability bias, in which the respondent tries to anticipate what is considered the most appropriate response. Another is the fundamental attribution error (Ross, 1977), also referred to as self-serving attribution bias (Forsyth, 1980) (see Chapter 3). From a socio-cognitivist perspective, confusion between information load (i.e., the amount of information) and so-called information overload should be understood as the result of a cognitive distortion. Even time spent on an activity may be difficult to measure using self-report measures.

TABLE 6.1 Operationalization of IT-related overload with item loadings

Construct # Item (loading)

Communication overload
1 I think that in a less connected environment, my attention would be less divided allowing me to be more productive. (.756)
2 I often find myself overwhelmed because my mobile phone has allowed too many other people to have access to my time. (.829)
3 The availability of mobile communication has created more of an interruption than it has improved communications.*
4 I feel overwhelmed by the amount of mobile communications that I need to deal with. (.800)

Feature overload
1 I am often distracted by the features of and software apps on my mobile phone.*
2 The features and software apps on my mobile phone are often more complex than the tasks I have to complete using these features or apps. (.726)
3 It bothers me that my mobile phone has so many features that I can’t learn them all. (.760)
4 I feel stressed when I try to learn how to use new features or software apps on my mobile phone. (.837)

Information overload
1 I am often distracted by the excessive amount of information I receive through my mobile phone. (.740)
2 I feel overwhelmed by the amount of information delivered through my mobile phone on a daily basis. (.832)
3 The amount of information I receive through my mobile phone increases the likelihood that I make mistakes. (.739)
4 I am concerned that the amount of information that I receive through my mobile phone prevents me from processing the most important pieces of information. (.733)
5 I am good at managing the amount of information that I receive through my mobile phone.R, *
6 I feel stressed by the amount of information I receive through my mobile phone. (.829)
7 I often feel pressured to deal with everything delivered by my mobile phone. (.808)

Source: Saunders et al. (2017) * Based on the instrument validation process, the item was removed.

TABLE 6.2 Operationalization of memories of past cognitive and emotional overload with item loadings

Construct # Item (loading)

Memories of past cognitive overload
1 I could not process the amount of information delivered when I started using a computer. (.853)
2 I could not handle the excessive amount of information provided when I started using a computer. (.859)
3 I had problems adopting computers and learning how to use them. (.797)
4 I was overwhelmed by the effort it took to learn using computers. (.829)

Memories of past emotional overload
1 I felt emotionally pressured when I first used a computer. (.866)
2 I felt confused when I first used a computer. (.797)
3 I felt frustrated when I first used a computer. (.848)
4 I felt happy when I first used a computer.R, *

Source: Saunders et al. (2017) * Based on the instrument validation process, the item was removed.

For example, Junco (2013) compared actual versus self-reported times for use of
different technologies among 45 university students over a period of a month. The
self-reported time (hours and minutes) they felt they had accessed Facebook,
Twitter, and email was compared to the actual time, which had been monitored
with software installed on their computers (i.e., an objective measure). Although
correlations between self-reports and objective software measures of the actual time
spent accessing technologies were significant and reasonably high (e.g., the correla-
tions ranged from .587 for Facebook access to .628 for email access), the self-report
estimates were quite different from the objective software ones. For example, while
users self-reported an average of 145 minutes per day accessing Facebook on their
computer, the actual average time, according to the monitoring software, was 26
minutes per day.
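Junco's pattern, self-reported use that correlates with logged use yet greatly overshoots it, is easy to reproduce with invented numbers (these minutes are illustrative, not Junco's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily Facebook minutes: self-reported vs. software-logged
self_report = [150, 120, 200, 90, 160]
logged = [30, 20, 45, 15, 35]
print(round(pearson_r(self_report, logged), 3))   # high correlation
print(sum(self_report) / 5, sum(logged) / 5)      # yet very different averages
```

A strong correlation thus says nothing about the absolute accuracy of the estimates, which is precisely the discrepancy Junco observed.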
Individuals who experience IT-related overload may attribute the state of
excessive emotional and cognitive overload to external factors such as too much
information and time constraints, rather than to internal ones such as their own
limited information processing capacity or previous failed coping strategies
(Ross, 1977). This distortion is the self-serving attribution bias, which refers to the tendency to ascribe success to dispositional attributes and failure to situational
ones (Taylor & Koivumaki, 1976). This bias explains reports in the literature
about the difficulty of measuring overload (Bettman et al., 1990; Jacoby, 1984;
Malhotra, 1984) and the importance of using multiple scales to assess its existence
and potential behavioural consequences. Since self-reports reflect respondents’ sub-
jective experiences and interpretations, they lack objectivity (Donaldson & Grant-
Vallone, 2002). Therefore, the validated subjective measures of IT-related overload
should be complemented with objective measurement techniques that focus on the
brain and the body.

Triangulation and technologies


The development of neurosciences and neurophysiological measurement has con-
tributed greatly to understanding the mind. Most cognitivists applied their research
in mapping brain structures to their models of memory (Tulving, 2002). However,
most of their findings are based on patients with impaired or damaged brain func-
tion (Damasio, 1994). The second brain, or mind-gut connection (Gershon, 1999),
is also being investigated and has had some significant coverage in the press.
Technological advances are being made in the field, but they are not yet at the
point of resolving the emotion-cognition supervenience problem.

Code red: IT-related overload in the operating room


In this section, as an illustration of the triangulation of IT-related overload, we present
research we conducted in the context of surgeon training in the operating room
(OR). We believe the context is exceptional in the sense that it allows us not only
to triangulate measures, but also to limit variation in the attentional resources of
surgeons in training on minimally invasive surgery. Minimally invasive surgery is a
type of surgery in which the surgeon performs the operation through small inci-
sions and with a camera inside the body. Working in such a demanding environ-
ment requires cognitive abilities and skills that surely are not all that common. For
example, in an earlier research study we demonstrated, using self-report scales, that
surgeons have a generally high level of cognitive absorption (see Pluyter, Buzink,
Rutkowski, & Jakimowicz, 2010). Cognitive absorption is defined as “a state of
receptivity or openness … to undergo whatever experiential events, sensory or
imaginal, that may occur, with a tendency to dwell on, rather than go beyond, the
experiences themselves and the objects they represent” (Tellegen, 1981, p.222).
Cognitive absorption is an intrinsic dimension of personality that precedes deep
involvement and attention focus (Roche & McConkey, 1990). As we reported in
Chapter 4, individuals lack attentional resources when immersed (Csikszentmihalyi
& Csikszentmihalyi, 1988). We systematically distracted the trainees while they
were training on a simulator, and we observed that the surgeon trainees with the
highest levels of absorption reported less irritation when disturbed than those with
lower levels of cognitive absorption. This was an interesting result as the task error
score was significantly higher under distracting conditions than under the baseline
conditions (Pluyter et al., 2010). This opened our eyes to the way surgeons focus
their attentional resources. Are they using extra resources to ‘block out’ the dis-
tractions or are they so absorbed that they neither notice them nor experience any
associated stress?
Research has demonstrated that performing minimally invasive surgery is sig-
nificantly more stressful and requires more concentration for the surgeon than
performing open surgery (Berguer, Smith, & Chung, 2001). Surgeons have to
process much more information than in traditional open surgery, and consequently
their brain load and stress levels increase (Berguer et al., 2001). The high brain load
negatively impacts performance in the OR (Stahl, Egan, Goldman, Tenney, Wiklund, Sandberg et al., 2005). Surgeons find minimal access surgery to be challenging and cognitively demanding (Jakimowicz & Cuschieri, 2005), especially when
they are novices (i.e., non-experts). Indeed, novice surgeons need to allocate more
resources than expert surgeons (Zheng et al., 2010). Similarly, Cao, Zhou, Jones,
and Schwaitzberg (2007) demonstrated that novices, compared to expert surgeons,
have less spare cognitive capacity and perform more slowly and with more errors.
By experimentally manipulating the degree of mental load using arithmetic pro-
blems, they proved that training with the simulator provides better feedback to
trainees and significantly counters the effect of the induced mental load. Medical
students have not yet built up spare resources for unanticipated situations that
might arise during surgery, as they have not trained enough to automate their
responses. The more resources they are required to use, the more emotionally and
cognitively overloaded they are likely to feel. Shirom, Nirel, and Vinokur (2006)
related physical fatigue, emotional exhaustion, and cognitive weariness with the
quality of care provided by medical doctors.
As we mentioned previously, resources are expended when individuals per-
forming complex tasks have to attend to interruptions (Speier et al., 1999). In
addition, it is hypothesized that an increase in work overload is related to an
increase in technological complexity (Ayyagari et al., 2011) and the number of new
technologies that must be mastered for the role. The increased work overload leads
to increased strain (Ayyagari et al., 2011). Also as previously noted, professionals
experience role-related overload when they feel technically overloaded for a vari-
ety of reasons and are not able to meet the technical demands of the job (Tarafdar
et al., 2011).

Physiological markers
The use of self-report scales to measure stress and IT-related overload is probably
even more problematic in the OR than in laboratory experiments or many field
settings. Surgeons are unlikely to admit, or even recognize, that they are experi-
encing stress (Sexton, Thomas, & Helmreich, 2000; Yule, Flin, Paterson-Brown, &
Maran, 2006; Arora, Hull, Sevdalis, Tierney, Nestel, Woloshynowych et al., 2010).
It is therefore desirable to complement the subjective measures of overload with
objective measurements such as physiological markers. Physiological data collection using psychophysiological tools is relatively unobtrusive, and the resulting data are not susceptible to social desirability bias.
We used the Thermoview 8300 infrared thermal imaging camera as an objective
measure of IT-related overload. As presented in Chapter 2, the Peripheral Nervous
System controls vital centres, such as the cardiac regulation and vasomotor centres.
The latter regulates homeostatic processes such as body temperature and blood
flow. Homeostasis relates to the physiological efforts to keep the body balanced
and in equilibrium. Potential threats to our bodily homeostasis generate stress
(Levine, 2005). Congruently, research has demonstrated that stress exposure results
in changes in skin temperature and facial temperature (Vinkers, Penning, Hellhammer, Verster, Klaessens, Olivier et al., 2013). Also, the body expends energy
resources as it attempts to maintain its equilibrium. For example, heat loss to the
environment is balanced by heat generated through metabolism (Bouzida, Bend-
ada, & Maldague, 2009). The thermal imaging camera can measure facial skin
temperatures. Facial temperatures have been correlated to blood flow (Pavlidis &
Levine, 2002). Or and Duffy (2007) reported that frontal forehead temperature (which we subsequently refer to as ‘frontal temperature’) should remain fairly stable. In
contrast, the skin around the eyes is less thick and therefore, when vasoconstriction
occurs, periorbital temperature variation is easier to capture (Reyes, Lee, Liang,
Hoffman, & Huang, 2009). Mental strain is referred to in thermal imaging litera-
ture as “cognitive workload” (Stemberger, Allison, & Schnell, 2010), mental
workload and mental demand (Or & Duffy, 2007), mental effort (Reyes et al.,
2009), frustration (Puri, Olson, Pavlidis, Levine, & Starren, 2005), or stress (Pavli-
dis, Dowdall, Sun, Puri, Fei, & Garbey, 2007). In fact, these represent cognitive
and emotional manifestations of overload (Rutkowski & Saunders, 2010).
Mental strain is associated with temperature changes in specific areas of the
human face due to activity alteration of the Autonomic Nervous System (Or &
Duffy, 2007). The activity triggers subtle increases in periorbital temperatures in
skin directly surrounding the orbit (Stemberger et al., 2010). Excessive levels of
mental strain measured by the thermal imaging camera are associated with impaired
performance and low perceived ease of use of the technology under study (Reyes
et al., 2009). Researchers in the field measure both frontal and periorbital tem-
peratures without disentangling them at the construct level. For example, frontal
temperature has been used to infer psychological states (Pavlidis et al., 2007) and to
assess emotional states of stress (Puri et al., 2005).
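In practice, such markers are obtained by averaging pixel temperatures over facial regions of interest in each thermal frame. A minimal sketch (the toy frame and ROI coordinates are ours, not the Thermoview 8300's actual output format):

```python
import numpy as np

def roi_mean_temp(frame: np.ndarray, roi: tuple) -> float:
    """Mean temperature (deg C) inside a rectangular region of interest.

    frame: 2-D array of per-pixel temperatures from a thermal camera.
    roi:   (row_start, row_stop, col_start, col_stop); coordinates are illustrative.
    """
    r0, r1, c0, c1 = roi
    return float(frame[r0:r1, c0:c1].mean())

# Toy 'thermal frame': uniform 34.0 C face with a warmer periorbital patch
frame = np.full((120, 160), 34.0)
frame[40:60, 50:110] += 1.5                      # periorbital warming under strain
frontal = roi_mean_temp(frame, (5, 35, 40, 120))
periorbital = roi_mean_temp(frame, (40, 60, 50, 110))
print(frontal, periorbital)  # 34.0 35.5
```

Tracking the two ROI means separately over a session is what allows frontal and periorbital temperatures to be related to different constructs.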

Relevant triangulation findings


Before entering the OR, we tested our triangulation approach in the laboratory
(Pluyter, 2012; Pluyter, Rutkowski, Jakimowicz, & Saunders, 2012). We focused
on two zones: frontal versus periorbital. We were very satisfied with the results in
our sample of 122 participants, as the pattern was similar for the frontal temperature
markers and responses on the self-report scales of cognitive overload. We con-
cluded that the frontal thermal imaging temperature can be used to complement
self-report measures of cognitive overload. Interestingly, the periorbital tempera-
ture increased between runs and was not congruent with the self-reported cogni-
tive overload. Possibly it represents the individual’s need to expend additional
resources to achieve homeostasis, rather than indicating cognitive overload. It could
be related, therefore, to the more instantaneous effort required to cope with stress
and fatigue, which would be more representative of emotional overload. In our
study, the challenge was not to just collect the data, but to determine how to map
the thermal patterns to different mental activities: cognitive and emotional. Based
on this study we tentatively assumed that frontal temperatures measure cognitive
overload, while periorbital temperatures measure emotional overload. We found
the first seeds of evidence that cognitive overload precedes emotional overload. The latter is linked to the need to draw on more resources. However,
the results of a Pearson correlation test showed that periorbital and frontal tem-
peratures were significantly correlated (r=.562, p=.008). As we discussed earlier,
the brain areas required for cognition and emotion are highly interconnected
(Ghashghaei & Barbas, 2002). Did we find evidence for a self-report bias?
So we ploughed on, as good adventurers would. We added more instruments to
our arsenal. First, we used the SenseWear BodyMedia System to capture three physiological measures: heat flux, galvanic skin response (GSR), and energy expenditure in Metabolic Equivalents of Task (METs).
We triangulated emotional overload measured through periorbital temperature,
heat flux, GSR, energy expense in METs, and the self-report measure of stress.
Indeed, stress is represented by a myriad of physiological consequences. Such
changes in body temperature associated with task performance reflect processes of
the Autonomic Nervous System’s energy regulation and mobilization. The body
dissipates heat in many forms. Heat flux represents the amount of heat being
emitted by the body through heat convection (Liden, Wolowicz, Stivoric, Teller,
Kasabach, Vishnubhatla et al., 2002). Galvanic skin response has been linked in
previous research to cognitive engagement (Pecchinenda & Smith, 1996). Elevated
GSR associated with task performance may reflect a process related to ‘energy
regulation’ or ‘energy mobilization’. In particular, it might reflect ‘an effortful
allocation of attentional resources’ to the task. Earlier research has proven that
electrodermal activity is also an indicator of emotional learning (Lanzetta & Orr,
1980, 1986) and associated with problem-solving tasks (Jutai & Hare, 1983). The
literature suggests that such physiological responses might also reflect the level of
effortful allocation of resources (Dawson, Schell, & Filion, 1990). As previously
explained, stress arises when an individual appraises the demands placed by the
environment as exceeding his resources and threatening his well-being (Ayyagari
et al., 2011). In order to cope, the individual draws on resources to recover his
balance and therefore becomes ‘warmer’.
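By the standard convention, one MET corresponds to roughly 1 kcal per kilogram of body weight per hour, so MET readings translate directly into energy expenditure:

```python
def energy_kcal(met: float, weight_kg: float, minutes: float) -> float:
    """Energy expended: 1 MET ~= 1 kcal per kg of body weight per hour."""
    return met * weight_kg * (minutes / 60.0)

# A 70 kg trainee averaging 1.5 METs over a 30-minute simulator run
print(energy_kcal(1.5, 70, 30))  # 52.5 kcal
```

This is why MET scores can be read as a marker of how many physiological resources a participant burned during a run.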
Second, we triangulated cognitive overload as average frontal temperature and
self-reported cognitive overload with IT. The scores collected by the SenseWear
BodyMedia System were measured for each role played by each participant. Also
both periorbital temperature and frontal temperature were measured at baseline
and after each of the role-playing episodes. We used a virtual reality simulator
platform (Simbionix LAP Mentor™) in this study. The advantage of this simula-
tion technology is that it computes Laparoscopic Surgical Skills (LSS) scores to
assess users’ surgical skills and aptitude in the psychometric tradition. The LSS
performance scores are based on predefined benchmarks (i.e., criterion-referenced)
that are validated by experts as standards for proficient surgical skills (Schijven &
Jakimowicz, 2003; Seymour, 2008; Schijven & Bemelman, 2011). The participants,
in teams of three, met at the integrated laparoscopic OR equipped with the Karl
Storz OR1™ system for the four-and-a-half-hour crew (team) training simulation
(Pluyter, Rutkowski, & Jakimowicz, 2014). Participants were assigned to one of
three roles using a reduced Latin-square randomized control design for each run:
surgeon, laparoscope navigator, and anesthetist. The statistical analyses revealed that
the participants experienced the most immersion when playing the role of the
surgeon.
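The rotation can be sketched as a cyclic 3×3 Latin square in which every participant plays every role exactly once across three runs (a simplification; the study's reduced randomized design may have differed in detail):

```python
ROLES = ["surgeon", "laparoscope navigator", "anesthetist"]

def latin_square_schedule(participants):
    """Cyclic 3x3 Latin square: each participant plays each role once over three runs."""
    k = len(ROLES)
    assert len(participants) == k
    return [{p: ROLES[(i + run) % k] for i, p in enumerate(participants)}
            for run in range(k)]

for run, assignment in enumerate(latin_square_schedule(["P1", "P2", "P3"]), start=1):
    print(f"Run {run}: {assignment}")
```

The design ensures that role effects and individual differences are not confounded across runs.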

Triangulating emotional overload


We observed five relevant findings in our triangulation of emotional overload.
First, the standard deviation of heat flux during the procedure was correlated with
self-reported stress levels and is therefore an interesting marker of stress.
Second, we could not conclude whether or not the variation in periorbital
temperature corresponds to frustration or to stress since the difference is not sig-
nificantly different per LSS performance criterion. Indeed, one would expect when
considering the supremacy of cognition over emotion that failing at the tasks
would have increased stress when playing the role of surgeon. It did not.
Third, high-performing participants did not report more or less stress in the self-
report scales than low-performing participants. High-performing participants did
differ on physiological measures of stress.
Fourth, participants who excelled reported less associated physical strain (e.g.,
muscular tension in the neck or shoulder) than those who failed. We speculated
that this is because they have better skills encoded in memory and, therefore, a
richer resource pool when considering their information processing abilities. They
did not need to burn more physiological resources to perform well. It may represent
the ‘easy’ deployment of encoded motor and intellectual skills. If it were to represent
sustained stress only, it should have been correlated with physiological energy
expenditure. Congruently, as expected, significant relations were not found between LSS performance score and the GSR or MET measures of emotional overload.
Fifth, we could conclude that the results regarding emotional overload were highly
significant when considering the role played rather than performance. There was a
significant increase in periorbital temperature from the maximum at baseline for each
of the three roles. The participants in the role of surgeon recorded a significantly
higher self-reported stress and periorbital temperature than those taking anesthetist or
navigator roles. Interestingly, the greatest difference in temperature over time was
reported for the role of navigator, then the role of anesthetist, and finally the role of
surgeon. That is, participants in the surgeon role drew on their physiological resources
at a higher, but constant, level. This finding is supported by body temperature results
that exhibited a similar pattern for the maximum, average, or standard deviation
energy expenditure in METs. Congruently, regarding subjective measurements of
verbal report, the participants claimed that the role of surgeon was the most stressful
role, followed by that of navigator, and then anesthetist. They made similar claims
regarding the allocation of physical resources. Often, bleeding situations were cited as
stressful when role playing the surgeon. One referred to stress related to “the fact that
bile kept on leaking and I couldn’t manage to stop it”. A smaller percentage reported a
stressful event in the role of the navigator. One participant said, “As navigator, it is dif-
ficult in this new situation to get the image focus if you have to think for another person [i.e.
surgeon].” None reported a stressful event when playing the role of anesthetist.
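The heat-flux marker in the first finding above is simply a dispersion statistic over the sensor's time series. A toy sketch with invented readings:

```python
import statistics

def heat_flux_variability(samples):
    """Sample standard deviation of a heat-flux series (in W/m^2) during a procedure."""
    return statistics.stdev(samples)

calm = [98, 100, 99, 101, 100, 102]      # steady readings
stressed = [80, 120, 95, 140, 70, 130]   # strongly fluctuating readings
print(heat_flux_variability(calm), heat_flux_variability(stressed))
```

The more the readings swing around their mean, the larger the standard deviation, which is the quantity that correlated with self-reported stress.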

Triangulating cognitive overload


Contrary to the results on periorbital temperature, we learned first that the average
frontal temperature during the procedure significantly varied as a function of the LSS
performance score. Particularly, the average frontal temperature of the participants
who showed poor LSS performance scores was significantly lower, while this was
higher for the participants who excelled at the surgical tasks. We conclude that the
higher average frontal temperature may be a good physiological marker of deep
involvement and attentional focus. One may expect the variation on the average
frontal temperature to be a sign of positive mental strain (see Pluyter et al., 2014).
Second, at the role level, there was no significant difference regarding self-report
measures of cognitive overload with IT, and the results displayed a pattern different
from that for periorbital temperature. In addition, the average frontal temperature
of participants in the anesthetist role was significantly lower than when in the sur-
geon role. There were no significant differences in the average frontal temperature
of participants role playing the navigator versus surgeon or navigator versus anes-
thetist. Hence, the triangulation results are more straightforward for cognitive
overload than for emotional overload.

Additional instruments
Additional instruments and, therefore, observables are available to measure IT-
related overload. For example, early work used pupil dilatation and eye movements
as observables of mental workload during decision-making tasks (Hess & Polt,
1964; Bradshaw, 1968) as well as emotional stimulation through audio stimuli
(Partala & Surakka, 2003). Eye-tracking is commonly applied in the domains of
human-computer interaction and design usability. Minas et al. (2014) used elec-
troencephalography, electrodermal activity, and facial electromyography to suggest
that confirmation bias in information processing during online team discussions is a
better explanation than overload for their results.
Functional neuroimaging techniques (e.g., Positron Emission Tomography and
Functional Magnetic Resonance Imaging, or fMRI) also could be employed to
explore the portions of the brain that are activated while performing tasks
(Dimoka, 2011; Dimoka, Pavlou, & Davis, 2011). Dimoka, Banker, Benbasat,
Davis, Dennis, Gefen et al. (2012) reported that “while self-reports may not be able
to capture unconscious processes that are unavailable to introspection, neurophy-
siological tools can capture unconscious processes with direct responses from the
human body” (pp.680–681). Using functional neuroimaging techniques would be
most useful in measuring prior experience of IT-related overload. Indeed, patterns
of neuronal activities should be observed mainly in the PFC and the ‘emotional
brain’ (e.g., limbic system). Empirical regularity studies of the Episodic Memory
have demonstrated a pattern of activities in the left PFC when encoding informa-
tion into the Episodic Memory and in the right PFC when retrieving. This is
known as the Hemispheric Encoding/Retrieval Asymmetry Model (see Kapur,
Craik, Tulving, Wilson, Houle, & Brown, 1994; Kapur, Craik, Jones, Brown,
Houle, & Tulving 1995). Such studies could provide evidence of neural correlates
to validate the self-report scales of prior experience of overload with IT. Indeed,
fMRI is a fascinating tool to study IT-related overload. Using fMRI, Jaeggi,
Buschkuehl, Etienne, Ozdoba, Perrig, and Nirkko (2007) demonstrated that parti-
cipants’ level of performance under conditions of task overload triggers different
activation increases in cortical areas. Congruent with the results of our research, the
participants who performed poorly as a surgeon expended additional mental resour-
ces with increasing difficulty, while the brains of the high-performing participants
‘kept cool’ in terms of activation changes and associated frontal temperatures (Pluyter
et al., 2014).
Overload is often conceptualized by default as a supervenience of cognition over
emotion by considering that the brain activity (cognitive overload) generated an
emotional response (stress/mental strain) that manifested itself bodily. However, for
the top surgeons, the body did ‘warm up’ while thinking harder without this being
linked at all to frustration. If we had used fMRI, as suggested by Jaeggi et al.
(2007), we probably would have found that the top surgeons’ brains remained
‘cool’. We were indeed dealing with a few cool-headed surgical trainees who were
cognitively absorbed.
Over this past decade of research, we have found personality dispositions such as
anxiety (i.e., neuroticism), intellectual engagement (i.e., Need for Cognition), and
cognitive absorption (i.e., openness/intellect) to be key in understanding IT-related
overload. For example, openness/intellect is linked to a larger bandwidth of
information processing. Mandler (1967) described such bandwidth in terms of
superchunking. These traits involve the PFC, particularly the working memory,
abstract reasoning, and the control of attention (DeYoung, Peterson, & Higgins,
2005). Recently, DeYoung, Hirsh, Shane, Papademetris, Rajeevan, and Gray
(2010) identified brain structures in their explanatory model of the Big Five. They
aimed at specifying the biological systems that linked the psychological mechanisms
underlying the traits of the Big Five. Different measurements represent similar
constructs, but point to different results when addressing overload. This is surely
related to the fact that the brain structure is composed of complementary, but also
conflicting, functions. All are highly interconnected, just as emotion and cognition
are, and just as the mind and the body are.

Conclusion
To conclude, we think we have a better understanding of the phenomena of IT-
related overload now that we have developed a triangulation approach using both
subjective and objective measurements of overload. However, the landscape is
huge and filled with conceptual and measurement traps. We have learned that
emotional and cognitive overload may depend on the individual’s pool of resour-
ces. Some individuals make more use of either their cognitive aspects (e.g., mental
schemata, personality factors, skills, superchunks) or their emotional aspects (i.e.,
positive arousal, stress, mental strain, anxiety); it is mostly a combination of both
with individuals having their own preferred strategies. In our triangulation efforts,
we learned in a controlled setting how rich the pool of resources is, and how dif-
ficult it is to measure concepts such as overload. Much more research using trian-
gulation will be required to understand IT-related overload as it relates to the
functioning of the human pool of resources. Operationalization of observables and
instruments to gather measurements will be necessary to fully explore this fasci-
nating pool of resources: It is the very same pool that makes us so human and so
different from one another.
7
LEVERAGING THE POSITIVE SIDE OF IT

Bryson (2003) translated Newton’s law of universal gravitation into more readable
language:

if you double the distance between two objects, the attraction between them
becomes four times weaker. This can be expressed with the formula F=G
(Mm/r2) which is of course way beyond anything that most of us could make
practical use of, but at least we can appreciate that it is elegantly compact.
(pp.73–74)

From a practical perspective, Bryson’s expression of the universal gravitational law
could apply to our conceptualization of technology as a resource. Could it be that
technology as a resource is reducing the ‘friction of distance’ between individuals
to such an extent that it strengthens their attraction to one another? Indeed, that
is what technology is doing. It reduces the psychological distance between indivi-
duals and appears to do the same with geographical distance. Technology has made
it possible to share emotions as well as knowledge even when individuals are not
together physically.
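Bryson’s observation can even be checked with a few lines of code; the masses and distances below are arbitrary, chosen only to show the inverse-square relationship:

```python
# Newton's law of universal gravitation: F = G * M * m / r**2
G = 6.674e-11  # gravitational constant (N·m²/kg²)

def gravitational_force(M, m, r):
    """Attractive force between two masses M and m separated by distance r."""
    return G * M * m / r ** 2

# Doubling the distance makes the attraction four times weaker.
f_near = gravitational_force(1000.0, 10.0, 2.0)
f_far = gravitational_force(1000.0, 10.0, 4.0)
print(f_near / f_far)  # ratio of forces: 4.0
```

Doubling r from 2.0 to 4.0 weakens the force by a factor of exactly four, just as Bryson describes.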
One way of conveying emotions online is through use of the ‘like’ button. In
fact, Facebook users convey their emotions 2.7 billion times daily when using this
button (Montgomery, 2015). Justin Rosenstein, a Silicon Valley software engineer
and the designer of the ‘like’ button on Facebook, told Lewis (2017) in an inter-
view for The Guardian that “it is very common for humans to develop things with
the best of intentions that have unintended, negative consequences”. In designing
the ‘like’ button on Facebook, Rosenstein surely did not realize that he was not
only supporting Facebook in applying Newton’s law of universal gravitation but
also pushing Facebook followers’ brain buttons, activating their associative net-
works of emotion (Bower, 1981, 2001). That is, the ‘like’ button tags the
pertinence of related information as well as the users’ preferences and, therefore,
the significance that we give to one another’s thoughts and actions. Rosenstein
clearly criticized increasing user engagement through the ‘like’ button. He recog-
nized it as a way of harvesting user preferences in order to market and sell products.
This is known as the ‘big data’ phenomenon.
Stephen Hawking displayed similar concerns about Artificial Intelligence’s
(AI’s) unintended consequences. He told Cellan-Jones (2014) of the BBC that
the full development of AI could spell the end of the human race. This is a
new human-robot supervenience problem. Shortly before he died, Stephen
Hawking urged designers to employ best practices and effective management
when creating AI (Mascarenhas, 2017).
Information Technology (IT) has been recognized as an important catalyst for
organizational efficiency, progress, and social transformation. Larger Internet
bandwidths offer companies and governmental institutions efficient and low-cost
synchronous and asynchronous telecommunication and social networking tools.
These technologies support collaboration, sharing, and exchange of information to
conduct business, public or private. Information Technology is an amazing
resource. However, as we emphasize in this book, it should be used wisely by
individuals, organizations, and societies.
Going ‘technocentric’ with the exponential development of automated and
data-driven decision-making systems portrays the bright side of IT. However, there
is little doubt that Information Technologies overload us through hyperconnec-
tivity. They saturate the self and affect brain functioning (e.g., the Brain Reward
System, or BRS). In the long run they may even fully replace us. Nonetheless, we
must remember that as early hominids evolved by designing and handling complex
tools, they became smarter and fitter. They learned to accumulate exogenous
resources and conserve endogenous ones (see Chapter 1).
In this book we address the evolution of relevant supervenience problems.
Supervenience relates to many philosophical questions exploring ontological
relations between properties of systems. We have built on the mind-body
supervenience issue to demonstrate the rise of scientific practices (see Chapter 2)
and the difficulty of disentangling emotion and cognition while dealing with
information overload (Chapter 6). We also introduced the Information Tech-
nology-ego supervenience problem (Chapter 5), demonstrating that in our digital
world a form of fusion is occurring such that the boundary between technology
and the ego is blurring.
Our book has generally focused on the dark side of IT. Here, we lighten up and
focus on the many beneficial aspects of IT. There are too many to cover ade-
quately in this chapter; thus, we focus on a handful of newer Information Tech-
nologies that offer the potential to substantially brighten our futures: gamification,
algorithms and big data, robotics, and brain enhancements. For
each one, we concentrate on a challenge (or two) that it presents. We conclude the
chapter with a discussion of initiatives and policies which attempt to ensure our
bright future.

Gamification

The bright side of games


Gamification, or the application of game design principles and elements in non-game
contexts (Colbert et al., 2016), is used by organizations to increase employee learning
capacity, motivation, and performance. Klingberg (2009) suggested that playing
games can help improve the working memory of frequent gamers. Further, online
games are a popular strategy for motivating individuals, by providing clear goals and
frequent real-time feedback, as well as for making monotonous tasks more enjoyable
(Liu, Santhanam, & Webster, 2017). For example, Uber employs gamification
embedded in mobile apps to incentivize its workers to ‘hit the road’ and work longer
hours (Wiener & Cram, 2017). Online games also offer the opportunity to study skill
acquisition with data from thousands of players (Griffiths, 2015).
Gamified systems include gamification objects and gamification mechanics (Liu
et al., 2017). Examples of gamification objects include leader boards with status infor-
mation, virtual coaches to provide tips and narratives, rewards such as badges for users
achieving in-app activity milestones, and activity streams showing the recent activities
of users and their friends. Gamification mechanics include conferring rewards, social
networking, giving kudos, forming teams, and providing cash incentives. To date, a
full understanding of gamification is lacking. A framework for the design and research
of gamified information systems was proposed by Liu et al. (2017). It appears that to
realize the full motivational impact, games may need to be designed to allow users to
customize the gaming experience to respond to their individualized needs. Further,
the impact of various rewards needs to be assessed.
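To make the distinction concrete, here is a minimal sketch of a gamified system; the class, the milestone thresholds, and the badge names are our own invention, not taken from Liu et al. (2017):

```python
# A toy gamified system: users earn points for activities (a mechanic)
# and receive badges at milestones (objects shown on a leader board).
BADGE_MILESTONES = {10: "Bronze", 50: "Silver", 100: "Gold"}  # hypothetical thresholds

class GamifiedUser:
    def __init__(self, name):
        self.name = name
        self.points = 0
        self.badges = []

    def log_activity(self, points):
        """Confer a reward (points) and check whether a milestone badge is earned."""
        before = self.points
        self.points += points
        for threshold, badge in BADGE_MILESTONES.items():
            if before < threshold <= self.points:
                self.badges.append(badge)

def leader_board(users):
    """A gamification object: users ranked by points, highest first."""
    return sorted(users, key=lambda u: u.points, reverse=True)

alice = GamifiedUser("Alice")
alice.log_activity(8)
alice.log_activity(5)   # crosses the 10-point milestone
print(alice.badges)     # ['Bronze']
```

Even this toy version shows why design matters: the same milestone logic that awards a badge can just as easily drive the ‘electronic whip’ discussed below.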

A challenge from playing games: negative effects on employees


There is concern that game-playing activities may lead to forms of addiction or to
monitoring situations that negatively affect employees (Liu et al., 2017). For
example, Disneyland and Paradise Pier Hotels displayed efficiency numbers of their
housekeepers on leader boards. The quickest employees had their efficiency num-
bers displayed in green, while the slowest had their numbers displayed in red for
others to see. Instead of motivating them, many employees felt they were being
controlled by an “electronic whip” (Liu et al., 2017, p.1012).

Algorithms and big data

The bright side of algorithms and big data


One result of information overload, and recently of the ‘big data’ phenomenon is
that ‘algorithms’ are becoming more commonplace in many types of decision-
making. In contrast to humans, whose attribution processes are subject to distortion
(i.e., cognitive errors) such as confirmation bias (Ross, 1977), algorithms are said to
be unemotional and therefore safe to use in decision-making activities. Consequently,
“in an increasingly quantitative business world, managers are asked
to deliver more data-driven decisions – precisely the sort at which machines
excel” (Schechner, 2017). In 2017, algorithms embedded in software fuelled
the $11.5 billion human resources and workforce management software
market. The software is used across a wide range of companies: Uber Tech-
nologies is using algorithms to allocate tasks among its self-employed workers;
Royal Dutch Shell is using its machine-learning software to match workers
with projects; Brookstone and the University of Colorado at Boulder use
algorithmic software to assess and respond to vacation requests sans humans;
Nexus A.I. is using algorithmic software to assign employees to teams based
on their profiles and backgrounds (Schechner, 2017). All these examples rely
heavily on algorithms and machine learning.
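A greedy skill-matching sketch conveys the flavour of such task-allocation software; the worker profiles, skills, and the overlap rule are purely illustrative assumptions, not the actual algorithms of any company named above:

```python
# A toy skill-matching allocator: each project goes to the available
# worker whose skill set overlaps the project's needs the most.
workers = {  # hypothetical profiles
    "Ada":   {"python", "statistics", "ml"},
    "Grace": {"logistics", "scheduling"},
    "Alan":  {"ml", "optimization"},
}

def allocate(projects, workers):
    """Greedy assignment: best skill overlap wins; each worker is used once."""
    available = dict(workers)
    assignment = {}
    for project, needed in projects.items():
        best = max(available, key=lambda w: len(available[w] & needed))
        assignment[project] = best
        del available[best]
    return assignment

projects = {"demand-forecasting": {"statistics", "ml"},
            "fleet-routing": {"logistics", "optimization"}}
print(allocate(projects, workers))
# {'demand-forecasting': 'Ada', 'fleet-routing': 'Grace'}
```

Note that even this trivial rule embeds value judgments (greedy order, tie-breaking), which is precisely why the mindful use of such algorithms matters.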
Big data (i.e., great in volume, velocity, variety, and veracity) is being applied to solve
interesting problems and improve society in numerous ways. To counter the over-
load imposed by this growing corpus of big data, IT is being proposed as a way of
focusing our attention and selecting what is pertinent to us. This is the basic premise of
Enterprise Cognitive Computing, which we introduced in Chapter 5. These systems can
make sense of massive volumes of data. The algorithms they employ can use machine
learning to interpret new data in light of past data and to adjust existing models on the
basis of the new data (Tarafdar et al., 2017). In doing so they can stimulate operational
excellence, ‘delight’ customers, and create a better working environment for employees.
Algorithms and big data are also at the core of ‘smart’ applications. Sensors ran-
ging from simple feedback mechanisms, such as temperature thermostats, to deep
learning algorithms are used to capture and measure data (Wolfert, Ge, Verdouw,
& Bogaardt, 2017). For example, many smart cities apply IT to multiple facets of
everyday life using sensors embedded in roads, bridges, tunnels, power grids, hos-
pitals, water systems, and so on. These interconnected autonomous sensors form
the Internet of Things, which allows information sharing across platforms to enrich
the lives of city dwellers and transform their nations’ economies (Abaker, Hashem,
Chang, Anuar, Adewole, Yaqoob et al., 2016). Big data from healthcare can be
used to predict epidemics and diseases, to find cures, and to improve the quality of
life. Big data from transportation systems (tunnels, bridges, roads, railways, etc.) can
minimize traffic congestion by providing alternative routes, reduce accidents, and
optimize shipping movements (Abaker et al., 2016). Smart tourism has grown out
of many of the smart cities initiatives (Gretzel, Sigala, Xiang, & Koo, 2015). This
considers not only the residents but also tourists visiting the cities. Information
Technology is integrated into the cities’ physical infrastructure to improve services.
For example, in Barcelona tourists can use a smartphone app to find bicycles made
available for environmentally friendly transportation; tourists in Brisbane can use
their smartphones to home in on information from beacons placed at 100 points of
interest; Amsterdam has tourist signs that translate themselves into different lan-
guages; and Seoul has invested heavily in free Wi-Fi throughout the city and
smartphones that are made available for tourist use (Gretzel et al., 2015). Another
of the myriad forms of smart technology is smart farming (Wolfert et al., 2017). Big
data and algorithms are affecting the entire food supply chain by providing pre-
dictive insights to farming operations on aspects such as yield models and feed
intake models and by driving real-time operational decisions.
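The ‘simple feedback mechanism’ of a temperature thermostat mentioned above can be sketched in a few lines; the setpoint, dead band, and sensor readings are arbitrary illustrative values:

```python
# A thermostat as a sensor-driven feedback loop with hysteresis:
# heat switches on below (setpoint - band) and off above (setpoint + band).
def thermostat_step(temperature, heating_on, setpoint=20.0, band=0.5):
    """Return the new heater state for one sensor reading."""
    if temperature < setpoint - band:
        return True
    if temperature > setpoint + band:
        return False
    return heating_on  # inside the dead band: keep the current state

readings = [18.9, 19.6, 20.2, 20.8, 20.1]
state = False
states = []
for t in readings:
    state = thermostat_step(t, state)
    states.append(state)
print(states)  # [True, True, True, False, False]
```

Smart-city sensing differs from this loop mainly in scale, not in kind: many such feedback loops, networked and feeding shared data platforms.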

Using algorithms and big data mindfully


Of course, there are many challenges in using algorithms and big data mindfully. It
is conceivable that machines can operate semi-autonomously over long periods and
possibly even autonomously if they can apply machine learning successfully.
However, if it gets to the point that humans cannot understand the underlying
algorithms enough to control the machines (and that day is not far off), we could
be in real trouble.
More immediately, we want to discuss the challenge of balancing individuals’
right to privacy with the wonderful benefits that their data can provide to them-
selves and others. For example, in the context of smart tourism, the location-based
services that are of value to tourists can also make them vulnerable. The digital
footprint of smart tourists who are subjects of ubiquitous always-on data collection is
massive and may be captured mindlessly, just because it can be (Gretzel et al., 2015).
These tourists may be identifiable when they do not wish to be. The capture of
personal healthcare data may be an even more sensitive issue, but very beneficial
when it comes to research and predictive purposes as well as the containment of
healthcare costs (Vezyridis & Timmons, 2015).
Businesses claim that governmental regulations about privacy and uses of big data
inhibit the innovative application of big data and the algorithms that use it
(Schroeder, 2014). But not all societies are equally concerned about privacy. For
example, Zestfinance’s algorithms and smart technologies developed in the USA
are showcased in South Korea’s new Songdo City, the world’s first greenfield
smart city. The US regulations about the use of big data and concerns about priv-
acy have prohibited these technologies from being fully applied to create such cities
in the USA.
A related issue is the governance of data in terms of who owns it. This concern
is probably most acute within the healthcare industry as it is populated by third-
party data brokers who are uninterested in maintaining either the privacy of the
data or its veracity. That is, they adopt a “finders keepers” ethic which assumes that
the data they purchase has been acquired with the individuals’ consent and it is
theirs to do with as they wish (Sax, 2016). Unfortunately, even if individuals are
not identifiable from the insights derived from their personal data, they may still be
detrimentally affected by ensuing discriminatory actions. Discriminatory targeting
practices might not be based on personal data, but rather insights from aggregated
data or institutionalized discrimination.
Further, developments in AI and quantum computing facilitate the compulsive
collecting or hoarding of data that is so commonplace today. Individuals, govern-
ments, and organizations all must determine what data to store, especially since data
and information may quickly become outdated and dumped in cyberspace, where
individuals lack the ability to simply delete what is no longer pertinent.
Ultimately it is up to individuals to be aware of how their personal information
is being used and to manage this use. However, sometimes this is not possible
because their personal data is stored but not accessible to them. To be more cau-
tious about requests for personal information, Spiekermann-Hoff and Novotny
(2015) suggested the use of software agents that could compare the privacy pre-
ferences stored in the individual’s browser with companies’ privacy policies and
alert the individual if there is a discrepancy. Probably the best-known software
agent solution for individual privacy is W3C’s Platform for Privacy Preferences
Project. Another alternative for sharing personal information in return for services
is to include a level-of-service option. To get a minimum level of service, indivi-
duals need disclose only a minimum level of personal information. To get highly
personalized services with hopefully secure information, a small fee would be
charged. Spiekermann-Hoff and Novotny (2015), among others, have emphasized
the importance of government’s role in setting standards and establishing regulations
to protect individuals’ privacy rights in regard to their personal information.
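The agent idea can be sketched as a straightforward comparison of stored preferences against a site’s declared policy; the preference fields and policy format below are our own illustration, not the authors’ design or W3C’s P3P format:

```python
# A toy privacy agent: flag any policy practice the stored preferences disallow.
user_preferences = {          # hypothetical preferences stored in the browser
    "share_with_third_parties": False,
    "retain_data_days": 30,
    "use_for_marketing": False,
}

def check_policy(preferences, policy):
    """Return a list of discrepancies between preferences and a site policy."""
    alerts = []
    if policy.get("share_with_third_parties") and not preferences["share_with_third_parties"]:
        alerts.append("policy shares data with third parties")
    if policy.get("retain_data_days", 0) > preferences["retain_data_days"]:
        alerts.append("policy retains data longer than preferred")
    if policy.get("use_for_marketing") and not preferences["use_for_marketing"]:
        alerts.append("policy uses data for marketing")
    return alerts

site_policy = {"share_with_third_parties": True, "retain_data_days": 365,
               "use_for_marketing": False}
print(check_policy(user_preferences, site_policy))
# ['policy shares data with third parties', 'policy retains data longer than preferred']
```

An empty list would mean the policy respects the individual’s preferences; any alert would prompt the user before personal information is disclosed.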

Robotics

The bright side of robots


Certainly the phenomenon of work substitution by robots, as discussed in Chapter
5, creates a dark place in human society. The potential to cause the death of
patients during robotic surgery illustrates an even darker side of using robots
(Sharkey & Sharkey, 2013).
That said, there are many wonderful uses of robotics technology. Industrial robots
are being used to perform tasks that are “dangerous, dirty, or dull”. They can move
heavy objects repeatedly and reliably with an accuracy within 0.006 of an inch.
Their cost is dropping drastically and the market is growing, forecast to reach $37
billion by 2018 (Goodman, 2015). Currently, many robots can be found on auto-
mobile assembly lines, often working alongside humans. Amazon uses over 10,000
Kiva Systems robots to fetch items in its vast warehouses and bring them to human
employees to be packaged and then handed over to other robots for shipping
operations (Goodman, 2015). Robots also can be found taking care of the elderly,
performing laparoscopic surgery, detonating bombs (a truly dangerous task), driving
cars, and working on the International Space Station (Goodman, 2015).
Consider, for example, the World’s Most Therapeutic Robot according to the
Guinness World Records: PARO (Paro Robots, n.d.). It is an outgrowth of an
initiative undertaken by the Japanese government to develop robots to emotionally
and physically support its burgeoning elderly population. Approximately 25 per
cent of Japanese citizens are over 65 years of age. PARO looks like a cuddly white
baby harp seal and weighs about six pounds. It has sensors for touch, light, sound,
temperature, and posture, allowing it to perceive people and its environment. It
can learn to behave in the way its owner wants it to after it is stroked for displaying
a desirable behaviour. Thus, by moving its head and legs, making seal sounds, and
responding to light, sound, and touch, PARO appears to be alive. Research sug-
gests that dementia patients are comforted by PARO’s presence. PARO can help
human caregivers when their emotional resources run low: it is reliable,
trustworthy, and does not suffer from burnout or impatience (Johnston, 2015).
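PARO’s stroke-based learning can be caricatured as simple reinforcement; the behaviours, weights, and learning rate below are invented for illustration and do not describe PARO’s actual software:

```python
# A toy version of PARO's reinforcement learning: behaviours that are
# followed by stroking (a reward) are strengthened, and others fade.
weights = {"move_head": 1.0, "seal_sound": 1.0, "wave_flipper": 1.0}

def reinforce(behaviour, stroked, rate=0.2):
    """Strengthen a rewarded behaviour; let an unrewarded one decay."""
    if stroked:
        weights[behaviour] += rate
    else:
        weights[behaviour] = max(0.1, weights[behaviour] - rate)

# The owner strokes the seal only when it makes its seal sound.
for behaviour in ["move_head", "seal_sound", "wave_flipper", "seal_sound"]:
    reinforce(behaviour, stroked=(behaviour == "seal_sound"))

print(max(weights, key=weights.get))  # 'seal_sound'
```

Over time the rewarded behaviour dominates, which is how the robot comes to ‘behave in the way its owner wants it to’.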

Ethical challenges related to robots


As cuddly as PARO seems, some ethical questions have been raised by ethicist and
philosophy professor Shannon Vallor about PARO replacing caregivers: “My ques-
tion is what happens to us, what happens to our moral character and our virtues in
a world where we increasingly have more and more opportunities to transfer our
responsibilities for caring for others, to robots?” (in Johnston, 2015).
As robots become ever more prevalent and smarter, more ethical questions will
be raised, concomitant with discussions of legal and policy issues. The infamous
Tesla incident in which the driver in a self-driving car was killed has created some
controversy in that industry (Golson, 2016). Will the robot be blamed? It is not
‘human’ and therefore its emotions will not be at stake. The chance is much
greater that the car company will be blamed. Of course, Tesla claims that the driver
is responsible for the vehicle even when it is on autopilot. Or will the software
developers who wrote the driving and navigation software for the car company be
held responsible? After all, it is they who have applied ethical rules to the software.
Did they embed Asimov’s Three Laws of Robotics into the software? If so, the death
of a human suggests that these rules were not followed. Even more nuanced ethical
decisions might need to be reflected in the software. For example,

when it’s clear an autonomous vehicle is about to become involved in an
unavoidable collision, should its crash-optimization algorithm cause it to hit a
telephone pole (killing the passenger), the motor cyclist to the left, the Chevy
on the right, or the pedestrian straight ahead?
(Goodman, 2015, p.299)

Surely, such difficult questions create juicy business for lawyers and ethicists.
These ethical issues are interesting from a cognitivist perspective. Kohlberg
(1976) proposed the process of moral development based on Piaget’s (1951) stages
of cognitive development in children. Kohlberg’s process of moral development,
which continues throughout the individual’s lifetime, has three levels with two
stages each:

•  Level A – Pre-moral (Stage I: punishment and obedience orientation; Stage II:
   naïve instrumental hedonism);
•  Level B – Morality of conventional role conformity (Stage III: good boy rule of
   maintaining good relationships; Stage IV: authority-maintained morality); and
•  Level C – Morality of self-accepted moral principles (Stage V: morality of con-
   tract, or individual rights and democratically accepted law; Stage VI: morality of
   individual principles of conscience).

Kohlberg’s work is fascinating in the sense that he demonstrated that ethics is
embedded not in behaviour itself, but rather in the reasoning to sustain the beha-
viour. In layman’s terms, consider the dilemma as to whether or not one should
steal a rare and expensive drug to save one’s spouse. Stage I individuals typically
argue: “Yes, I should steal the drug because she is my wife, and she will be mad at
me for not stealing it.” Or perhaps some Stage I individuals reason: “No, because
stealing is bad and I will end up in prison.” In contrast, Stage V individuals argue
either: “Yes, I shall steal the drug, because human life is worth more than any
pharmaceutical company’s profits, and it is unacceptable to live in a society that
favours profit over a human life”; or “Frankly, no, I should not steal it because
there are rules in the society and one should not steal for one’s own benefit. Doing
so would alter a precarious balance between right and wrong. I believe that such
situations are not realistic and that medical providers will find a way to deliver the
drug. Also, someone may need it more.”
One may ask whether the idea of ‘ethical’ robots is too far-fetched. If a human is
in command, he will be blamed for making the wrong decision. If the robot is in
command, the software developer likely will be blamed. Some believe that having
the most perfect algorithms is always superior to human information processing;
others say that it is just about enforcing more legal regulations on developers. In
resolving ethical issues like these and in developing laws to provide guidance,
Kohlberg’s concept of moral development may prove helpful. As noted by Aristotle
(350 BCE):

For man, when perfected, is the best of animals, but, when separated from law
and justice, he is the worst of all; since armed injustice is the more dangerous,
and he is equipped at birth with arms, meant to be used by intelligence and
virtue, which he may use for the worst ends.
(Book 1/2)

Brain enhancements

Using brain enhancements mindfully


For decades, researchers have been involved in developing and evaluating ways of
enhancing brain capabilities. Some of these approaches have relied upon mis-
conceptions about the role of filters in processing incoming data. Tagg, Gandhi,
and Srinivasan Kumaar (2009), referencing Clay Shirky, noted that “Information
Overload has been with us for so long that it should be seen as a fact of life, rather
than a new problem; instead, the real problem is that our filters are no longer
adequate” (p.3). Such statements suggest that incoming information cannot get
through the filter. For example, Outlook’s email filtering facility uses rules that
typically focus on filtering out the emails of certain message senders. However, this
function is perceived either to be too hard to use or useless (Tagg et al., 2009). It
likely is useless because the filter focuses on the wrong thing: the sender of the
message rather than the message’s pertinence to that recipient. The problem is not
at the filtering point, but rather in the brain that does not have the resources to
process the pertinent information which passes through the filter.
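The contrast between filtering on the sender and filtering on pertinence can be sketched as follows; the keyword-counting ‘pertinence’ score is our own toy stand-in, not how Outlook or any real mail client works:

```python
# Sender-based rule vs. a pertinence-based filter.
def sender_filter(message, blocked_senders):
    """Classic rule: drop mail from listed senders, keep everything else."""
    return message["sender"] not in blocked_senders

def pertinence_filter(message, interests, threshold=1):
    """Keep a message only if its body touches enough of the reader's interests."""
    body = message["body"].lower()
    score = sum(1 for topic in interests if topic in body)
    return score >= threshold

msg = {"sender": "newsletter@example.com",
       "body": "Deadline moved: the overload study review is due Friday."}

print(sender_filter(msg, blocked_senders={"spam@example.com"}))    # True: sender OK
print(pertinence_filter(msg, interests={"overload", "deadline"}))  # True: pertinent
```

The first function asks who sent the message; the second asks whether the message matters to this reader, which is the point the paragraph above makes.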
Another contributor to the myopic view in the current overload literature is its
focus on the limited capacity of the Short-Term Memory and its neglect of the
emotional nature of the signals attached to input delivered by IT, and of their
effects on information processing. The role of cognitive schemata in overcoming
the limitations of the cognitive system’s architecture, and the consequences for the
decision process, have already been considered.
with overload since birth, and in the process we have developed multiple coping
strategies to deal with situations of insufficient and/or exhausted cognitive resour-
ces. In the 1970s, a simple way to conserve our cognitive resources was to apply
noncompensatory decision rules to simplify or reduce information search (Payne,
1976). Disabling the app or technology, as technostressed Mary did in Chapter 1, is
another ‘old-school’ option.
Recent developments to counter brain overload include the development of
brain chip implants and biochemical enhancements. Nanotechnology, Biotechnol-
ogy, Information Technology, and Cognitive Science have led to radical
enhancements of human abilities. For example, data can be transmitted directly to
the brain when the person with the brain implant is at rest. This drastically
increases the amount of information that can be assimilated each day. Just imagine
how wonderful it would be to wake up and find that you have been infused with
volumes of new information. However, it is not quite that easy. The information
cannot be fully integrated for 30 to 90 days as neurons grow around the chip to
accommodate the new data. Further, the newly chipped person would have to
spend nine full months in the classroom learning how to think in a different (but
mindful) way, process the new data, and categorize it meaningfully into information
(Hamlett, Cobb, & Guston, 2013).
Warren, Leff, Athanasiou, Kennard, and Darzi (2009) presented an ethical per-
spective on the use of neurocognitive enhancement to improve the function of the
central executive control system in surgeons. Warren et al. (2009) reported that
most research on Transcranial Magnetic Stimulation, brain-machine interfaces,
and neurosurgical implants of devices and tissue is designed mostly for alleviating
pathologies, such as paralysis (Liepert, 2005; Birbaumer & Cohen, 2007).
The use of such technology is highly desirable in these cases. Warren et al. (2009)
also considered the use of psychopharmacological enhancement on surgeons in
their ethical analysis. Indeed, the administration of drugs has already proven effec-
tive in enhancing pilots’ performance on complex tasks in emergency situations
(Yesavage, Mumenthaler, Taylor, Friedman, O’Hara, Sheikh et al., 2002). There is
an ongoing push in our society for higher levels of performance unhampered by
emotions. Both the pharmaceutical and technological industries no doubt are salivat-
ing at the possibility of a growing market for technology that enables its purchasers to
be ‘superhuman’.
Biochemical and technical enhancements to information processing capacities
have been tested already. Militaries have experimented with ways of enhancing
soldiers’ cognitive functions for more than 100 years (Moreno, 2012). The
highly desirable enhanced functions are part of what Kahneman (1973) called
the “pool” of resources. The enriched pool of resources can create a ‘super-
human’ with sustained alertness (e.g., in the early use of cocaine, caffeine and
nicotine, methamphetamine, modafinil). In fact, military superiority can be
gained by vanquishing the need to sleep and prolonging periods of alertness.
Some students erroneously think that they can achieve this superhuman status
by using drugs to overcome sleep, pull all-nighters, and increase their attention
span (Babcock & Byrne, 2000). While not a perfect solution, such biochemical
enhancements by the military and struggling students alike definitely stimulate
the Central Nervous System and affect cognition, albeit not always in the
desired way.
In one of our university courses in the Netherlands, students were asked to
develop innovative apps. Most of the 23 teams came up with an app to enhance
their physical appearance through sport or special dietary services. A few teams
stepped away from that narcissistic use of technology and thought of apps to
support socialization among the elderly. Interestingly, one group came up with
an app designed to enhance the user’s brain and make them more alert. The
students wanted the app to help them learn better, as well as to activate their
hedonic BRS. When we discussed with them the ethical aspects of such apps, it
was clear that they could foresee the issues related to its use. They had thought of
medical regulations and privacy settings. They also claimed that individuals could
be free to use the apps or not, and they could select different thresholds of acti-
vation to match their own requirements. Further, they could share their profiles
and simulation activities with their friends so that they could all be on the same
wavelength.

A challenge from using brain enhancements


Using such neurocognitive enhancements, especially on healthy people, is ethi-
cally questionable. What about students who reject such technologies to increase
their brain capacity? Will they see their chances for success on the job market
dwindle in this new society? Though there may be numerous challenges in using
brain enhancements, the one we choose to consider here is the bodily harm they
create. Most notably, having a continuously processing chip in one’s brain
makes it difficult to sleep. At first the individual with the chip tosses and turns
during the night and experiences repetitive dreams. Over time they can suffer
from the loss of sleep.
A bright new world


We have touched on only a few ways that IT has brightened our world: gami-
fication, robotics, algorithms and big data, and brain enhancement. For each of
these, we focused on a challenge that could diminish contributions. We could, of
course, discuss other challenges, but then our book would be ending on a dark
note. Instead we would like to end with a discussion of ways that these tech-
nologies, as well as the technologies that promote IT-related overload and IT
addiction, could be managed to highlight their bright sides. Given the extent of
the technological explosion, now is the time to develop strategies to promote
‘responsible’ IT use. This includes strategies that we can use as individuals and as
researchers; it also includes ways that organizations and governments can become
more responsible in regard to IT. Below are some suggestions about how to
innovate mindfully.

Helping ourselves
More than ever we are sensitive about the quality of the food we eat: its origin, its
calories, and its effect on and in our bodies. We read labels on food cans and
packages and look carefully at what we have on our plates. Virtually no one who
could afford to do otherwise would recommend a junk food diet to their child.
However, when considering IT and information junk, we are far from being
responsible. In a world filled with ‘fake news’ and obvious liars who hold public
office, we need to try to be more discerning about the sources of our information
and the relevance of the incoming data for making our lives better and more pro-
ductive. We need to consider mindfully how we are using our technology: Do we
spend too much time viewing screens? Are we becoming addicted to the tech-
nology? Is it changing our life for the worse? Can we use the technology more
efficiently and effectively?
We are constantly faced with the technologies of social saturation, which reflect the
media’s potential for expression and connection to overpopulate the self (i.e., the ego)
(Gergen, 1991). Denouncing and rejecting the technologies of social saturation have
become common practices. Going ‘cold turkey’ on SNS or email applications on
our smartphones may indeed be a mindful solution. This is an efficient coping
strategy for sparing one’s resources. In technostressed Mary’s case (Chapter 1), dis-
abling her email work app sounds more like a desperate move to save her work-life
balance. Her mental resources are partially exhausted. Neither the technology nor
the designers are to be blamed. The lack of organizational policies or norms related
to email overload is the culprit. Mary is learning the hard way that, in Frost’s pre-
diction, working 8 hours ultimately leads to working 12 hours (Chapter 5).
Rejecting technology when overloaded is a short-term solution.
‘Going to war’ with technology and warning about the consequences of exces-
sive smartphone usage are salutary acts for software designers such as Justin
Rosenstein (see above) or Tristan Harris (Chapter 1). Rosenstein declared that he
had removed the Facebook app from his smartphone (Lewis, 2017). He had assessed
the impact of his own design on the “attention economy” to be too dangerous.
Rejecting his own technological design (the ‘like’ button) was a path of least resis-
tance. We also did not foresee the dark side of the Online Baby System (Chapter 4).
When technophiles are worried, it may give the rest of us reason to worry as well.
It is important to remember that IT-related overload is more than merely input
or output. It is an emotional and cognitive experience encoded in our brains that
helps us decide on the adoption of the next technology. These encoded memories
of past uses of technology can be positive or negative. If we had a bad experience
learning a new technology in the past, then we may be unwilling to try a new
technology now even though it may be very beneficial to us. On the other hand,
we may remember past experiences with new technologies as so easy or pleasant
that we think that we are capable of handling any new technology that is thrown
our way. The end result might be that we spend an inordinate amount of time
adjusting to a technology that is really not helpful to us, or we may try to master so
many new functions (such as those available on our smartphones) that we experi-
ence stress or become addicted. Either way, the result is undesirable, but avoidable
were we to think mindfully about our use of the technology. In some cases,
while learning a new technology may be unpleasant, it may be necessary to stave
off the alternative of being replaced by a machine.
Mindful use of IT may mean that we consider the extent to which we are
choosing instant gratification over a better life. For example, we might choose to
spend hours mesmerized by the screen in front of us rather than taking the time to
develop lasting relationships offline, or to do a better job of completing an assigned
task. Or we may choose to give up valuable personal information in return for
something that is of little worth in the long run.

Organizations’ role in a bright world


When IT-related overload and IT addiction occur at work, individual performance
suffers and, consequently, so does organizational performance. Our global society
feels the harmful impact from corporate boardrooms (Rutkowski & van Genuch-
ten, 2008) to hospital operating rooms (Tollner et al., 2005). In Chapter 5 we
discussed how organizations may create situations of stress, overload, and even
addiction for their employees in manifold ways: norms encouraging fast turnaround
to emails and texts, organizational designs promoting overload conditions, situa-
tions promoting collaboration overload, global virtual teams that ‘follow the sun’
on a 24-hour basis, assigned work tasks with heavy information loads that are not
easily processed, situations leading to work-family conflicts, frequent software
updates, substitution of workers with robots, etc.
More organizations need to promote ‘responsible IT’ proactively. Some
are already acting responsibly, for instance when they establish norms about
response times to electronic communications in terms of hours rather than minutes,
give their employees smartphone-free nights, discourage the ‘reply to all’ email
option on their email systems, or delay installing the latest version of software until
a time when it is really needed. Other organizations are acting responsibly when
they institute flexible work programmes or when they negotiate with employees
about after-hours communications or required versus optional overtime. Still other
organizations are respectful not only of corporate assets, but also of the personal
information of their customers and employees and do not use it to their detriment
or sell it to third-party vendors who will use it detrimentally. Another avenue of
responsibility relates to organizational use of online games. Overusing games can
lead to IT addiction or employee stress from being monitored, thus diminishing
the benefits of their use. While work substitution and automation may make
undeniable economic sense, responsible organizations will ensure that those
employees whose jobs are eliminated are trained to survive in the new digital world.
As we all move forward, organizations should be mindful about how technology can
be used for everyone’s long-term well-being.

Governments’ role in the bright new world


Governments face the challenge of trying to satisfy multiple stakeholders. Because
so many of the problems they face are quite complex, it is not clear which actions
will benefit the various constituencies. For example, businesses claim that too much
regulation about big data will stifle innovation (Schroeder, 2014). However, indi-
vidual consumers and a growing number of academicians are screaming for more
regulation to protect individual privacy. To make things even more complicated,
values about IT and its uses vary across countries and regions (Schroeder, 2014).
For example, the French government passed legislation to discourage after-hours
work-related communications. However, such legislation may create another form
of conflict when the employee feels compelled to communicate with global virtual
teammates who are working when he/she is at home. Koreans, who do not seem
to value privacy as much as US citizens, are able to innovate with smart city
technology, developed in the USA, to a much greater extent than US companies,
which must adhere to regulations designed to protect the privacy of US citizens.
Governments also need to consider how they can use IT responsibly. With
smart governance, governments can be more accessible to their constituents and
can provide more sustainable services. Of course, not all governments (e.g., those
in developing countries) have the technological infrastructure or resources to
become smart or to adequately serve their citizens (Fredette, Marom, Steinert, &
Witters, 2012).

Conclusion

We’re all cosmologists


In the preface of his (1959) seminal book The Logic of Scientific Discovery, signed at
Penn, Buckinghamshire, Karl Popper wrote:
I, however, believe that there is at least one philosophical problem in which all
thinking men are interested. It is the problem of cosmology: the problem of
understanding the world—including ourselves, and our knowledge, as part of the world.
All science is cosmology, I believe. (p.15)

In cosmology, the term ‘big bang’, coined by Hoyle in 1952, is in fact an inter-
esting metaphor since it gives the impression the universe emerged from a ‘big’
explosion – ‘BANG’. While cosmologists note that the big bang was not an
explosion in the conventional sense of the term, they concede that it was sudden
expansion on a colossal scale (Bryson, 2003). Just as the language of science evolves
constantly, so do technologies. Technological advances, such as in particle physics,
astrophysics, or quantum mechanics, provide standard models of cosmology using
the language of dark matter, dark energy, and black holes. Cosmologists also try to
understand the primordial Big Bang-Genesis supervenience problem in which the
scientific explanation of the beginning of the world supervenes on the story of
creation in the Bible. The primary drive, or intrinsic intellectual reward, of a physical
cosmologist is obviously beyond the sky! When considering and pondering the
bright and dark sides of IT, we also may be studying what Popper
refers to as “cosmology”.

Revolutionary research
An issue in cosmology is measurement. In Chapter 6, we talked about the chal-
lenges of measuring IT-related overload. Researchers will run into similar chal-
lenges for measuring variables in studies of new technologies. Perhaps neuroscience
approaches and equipment like thermal imaging cameras will become more refined
and more affordable to assist in this endeavour.
Griffiths (2015) described a second cognitive revolution using computation and
big data, which offers an alternative to merely analyzing behaviour, especially in
small-scale laboratory studies. For example, instead of merely applying collaborative
filtering to big data gathered from online websites, those data could be used to
understand how human minds work. Collaborative filtering predicts individuals’
purchases based on the similarity of their behaviour to that of others. A cognitive approach would consider the
way preferences, semantic representation, and categorization combine into com-
plex models of human cognition. Consequently, computation using big data would
allow researchers to formulate more precise hypotheses about how exactly the
mind works and the consequences of these cognitions on behaviour. This knowl-
edge could extend cognitive science and inform the research of computer scientists
trying to understand how individuals are using new technologies.
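To make collaborative filtering concrete, the sketch below shows the user-based variant in miniature. It is our illustration, not an example from Griffiths (2015), and the ratings data are invented for the purpose: ratings from users whose past behaviour resembles ours are weighted by similarity to predict an unseen rating.

```python
import math

# Invented toy user-item ratings; 0 means the item has not been rated yet.
ratings = {
    "ana":  [5, 3, 0, 1],
    "ben":  [4, 0, 0, 1],
    "carl": [1, 1, 0, 5],
    "dina": [1, 0, 0, 4],
}

def cosine(u, v):
    """Cosine similarity computed over co-rated items only."""
    pairs = [(a, b) for a, b in zip(u, v) if a and b]
    if not pairs:
        return 0.0
    dot = sum(a * b for a, b in pairs)
    nu = math.sqrt(sum(a * a for a, _ in pairs))
    nv = math.sqrt(sum(b * b for _, b in pairs))
    return dot / (nu * nv)

def predict(user, item):
    """Similarity-weighted average of other users' ratings for the item."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or r[item] == 0:
            continue
        s = cosine(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else 0.0
```

Here `predict("ben", 1)` leans towards ana’s rating, because ben’s past behaviour resembles ana’s far more than carl’s: the prediction is driven by behavioural similarity alone, with no model of why the users like what they like, which is precisely the contrast with the cognitive approach described above.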
In addition to revolutionizing the way that cognitive research is conducted, big
data could also influence research methodologies across a range of disciplines (e.g.,
information systems, management, computer science, psychology, and
communications). Gone will be surveys of 100 to 500 respondents. In fact, the use of
surveys of thousands of participants may be on its way out. How can data gathered
from hundreds or even thousands of respondents compare with millions of data
points analyzed with sophisticated data analytics? Of course, the data gathered from
website visits, GPS systems on smartphones, or sensors on the Internet of Things
may not represent good operationalization of certain constructs.
Sadly, something else that may be on its way out is theory. A danger in using big
data is that correlations may become ‘king’ and decision makers may rely on the
correlations without understanding why they may have been obtained. A quote by
statistician Sir Ronald Aylmer Fisher (1958) demonstrates the need to look beyond
mere correlations in understanding the world around us:

If… we choose a group of social phenomena with no antecedent knowledge
of the causation or absence of causation among them, then the calculation of
correlation coefficients, total or partial, will not advance us a step toward
evaluating the importance of the causes at work.
(In Freedman, 2010, p.56)

Theory will still be very helpful in the world of big data – but the “age-old search
for causality” may be abandoned in a world of research where “mere correlations
suffice” (Sax, 2016, p.26).

Mastering the two-headed dragon


In our digital society, we live in the company of a fairy-tale ‘two-headed dragon’
with a fierce appetite. He feeds his belly with a massive amount of data and
digests it with the help of new technologies that can categorize, organize, and
make sense of it all. One head of the dragon is always hungry for more data:
“Since collecting more data always translates itself into more potential new
insights waiting to be extracted from the data, data hungriness is a structural
condition of the big data world we have come to inhabit” (Sax, 2016, p.25). The
other head worries about not having the appropriate technology to support the
prodigious digestion process. This dragon can be quite friendly and helpful at
times. However, he can also be quite dangerous. When we users are under his
protection, we can be served, be connected to the world, be smarter, and be
protected from malware, viruses, or pirates. Without his protection, we lack
the information and connectivity to ‘be’: be productive; be efficient; be saved; be
taken seriously; be liked; or even be loved. Thus, to remain protected, we need
to master the dragon.
In future years, our challenge as researchers will be to develop and refine theory
to help us harness IT. We need to mindfully research the consequences of Infor-
mation Technologies, for poor and rich alike, in the workplace and in our private sphere.
As researchers, we should act as lords, and not as slaves, of markets and industry.
We ought to study the consequences of the usage of IT on our lives, and in par-
ticular our work-life balance, from multiple theoretical angles. Knowledge and
intellectual freedom should benefit our society by building solid theoretical
foundations from which to tackle big data. We should not be content to surf on
the wave of new technologies built and designed for the profit of a few. The race
against the machine, or the robot, is not lost! It is just starting. We are now more
knowledgeable and aware of the dark side of IT. Shall we apply a form of mindful
optimism?
GLOSSARY

Addictive behaviour repeated and compulsive in nature, affecting individuals and
their surroundings.
Affect congruity phenomena the perceptual threshold for affect-congruent
attentional biases being higher than it is for affect-incongruent material.
Amount Illusion the assumption that overload is based primarily on the simple
volume or amount of information.
Antecedent that which precedes the observed behaviour and is hypothetically
governed by a set of natural or social laws; stimulus that cues behaviour.
Associative models focus on the activation of mental representations through
node activations.
Attention “the process of allocating resources to a stimulus or attributes of a sti-
mulus” (Basil, 1994, p.180).
Automation automatic operation of an apparatus, process, or system performed by
Information Technology to take the place of some aspect of human
performance.
Automation addiction when people rely on digital systems too heavily to per-
form their work adequately.
Behaviourism the study of the effects of the environment on the observable
behaviour of individuals without consulting hypothetical events or aspects of
cognition that occur within the mind.
Brain load the emotional and cognitive efforts required by individuals to
appraise and process inputs using the resources available to them; mental
load.
Brain overload the inability to adequately process input and handle the associated
brain load.
Brain Reward System (BRS) complex cerebral circuit engaging specific neuro-
nal pathways that are modulated by cortical oversight systems affiliated with
emotion, memory, judgment, and decision making; responsive to positive and
negative reinforcement in animals and humans.
Central Nervous System (CNS) part of the nervous system that is divided into
the brain (containing about 1 trillion cells) and the spinal cord; consists of three
main functional components: the sensory system, the motor system, and
higher brain functions (e.g., the hypothalamus, sub-cortical, and cortical areas).
Cerebral cortex represents the highest level of neuronal organization and func-
tion; the uppermost region of the Central Nervous System.
Chunk the assemblage of a number of smaller units, also called bits, into a single larger
element of information.
Cognition refers to the metamorphosis that a stimulus (e.g., information) goes
through while being processed by the human mind; “such terms as sensation,
perception, imagery, retention, recall, problem-solving and thinking, among
others, refer to hypothetical stages or aspects of cognition” (Neisser, 1967,
p.4).
Cognition-emotion supervenience type of supervenience which is referred to
in psychology as the ‘interplay of affect and cognition’ or, more commonly,
‘feeling and thinking’.
Cognitive absorption “a state of receptivity or openness… to undergo whatever
experiential events, sensory or imaginal, that may occur, with a tendency to
dwell on, rather than go beyond, the experiences themselves and the objects
they represent” (Tellegen, 1981, p.222).
Cognitive dissonance a state of mental discomfort experienced by a person who
simultaneously holds two or more contradictory views or perspectives, or is
confronted with information that conflicts with his cognition.
Cognitive load “the manner in which cognitive resources are focused and used
during learning and problem solving” (Chandler & Sweller, 1991, p.294).
Cognitive overload a construct that represents the symptoms that occur when
cognitive load overwhelms cognitive resources required for chunking.
Cognitive style “an individual’s characteristic and consistent approach to orga-
nizing and processing information and experience” (Tennant, 1988, p.3); may
be analytical or heuristic (e.g., sensation, intuitive, thinking, feeling).
Cognitivists scientists who focus their research on how the mind deals with
information; their research is designed to open up the black box of the mind.
Collaboration overload the situation when employees interact so much with
other employees that they cannot get their own work done during normal
work hours.
Collaborative filtering type of analysis that predicts individuals’ purchases based
on the similarity of their behaviour to that of others.
Communication overload the state when an individual is unable to process the
information that is received from another person or process; focuses on how
technology can be used to transmit messages.
Computationist models focus on formal operations using symbols to be com-
puted during information processing.
Congruent the valence of an input matches the valence of a related experience
stored in an individual’s Long-Term Memory.
Construct term which “though not observational either directly or indirectly, may
be applied or even defined on the basis of the observables” (Kaplan, 1964,
p.55).
Contingency Boundedness the experience of IT-related overload as a function
of context and exogenous resources provided to the individual.
Coping a response to a distressing emotion, with the function of tension reduc-
tion; a cognitive process such as denial, repression, suppression, intellectuali-
zation, or problem-solving applied to reduce negative emotions.
Ego “the idea that in every individual there is a coherent organization of mental
processes” (Freud, 1927, p.15).
Ego psychology model model that explains mental processes are organized based
on the Freudian idea of the ego and the Id.
Emotion “intelligent interface that mediates between input and output” (Scherer,
1994, p.127) and is related to affect.
Emotional brain complex collection of structures that make up the limbic
system; commonly referred to as the archaic brain.
Emotional-Cognitive Overload (ECO) the negative emotional and cognitive
consequences of high brain load; occurs when an individual’s personal
resources are insufficient for handling the brain load that is created from an
incoming stimulus; brain overload.
Enterprise Cognitive Computing (ECC) applications software that use tools
such as “natural language processing, image recognition, intelligent search, and
decision analysis to adapt their underlying computational and modelling algo-
rithms or processing based on exposure to new data…. to enable an organi-
zation’s business processes” (Tarafdar, Beath, & Ross, 2017, p.3).
Episodic Memory part of explicit memory in Long-Term Memory that stores
personal experiences.
Expertise heuristic and cognitive abilities that are domain-specific; derived from
prior experience within a certain domain of information.
Explicit Memory declarative, conscious part of Long-Term Memory; a brain
construct that refers to the conscious recollection of factual information, pre-
vious experiences, and concepts. It is subdivided into the Semantic Memory
and the Episodic Memory.
Fear of Missing Out (FOMO) “a form of social anxiety—a compulsive concern
that one might miss an opportunity for social interaction, a novel experience,
or some other satisfying event aroused by posts seen on social media sites”
(Dossey, 2014, p.69).
Feelings the subjective experience of emotion.
Filter Model of Attention describes a sieve, or filter, that selectively accepts or
rejects information signals.
Flow “the holistic sensation that people feel when they act with total involve-
ment” (Csikszentmihalyi, 1975, p.36).
Gamification the application of game design principles and elements in non-game
contexts.
Heuristics the operational path taken to solve a problem expeditiously.
Hyperconnectivity staying connected too long to the Internet, social media,
smartphones, and other Information Technologies.
Hypothalamus a part of the limbic system; located below the thalamus.
iDisorder “the negative relationship between technology usage and psychological
health” (Rosen, Whaling et al., 2013, p.1243).
Implicit Memory non-declarative, non-conscious part of Long-Term Memory.
Information “data endowed with relevance and purpose” (Pearlson, Saunders, &
Galletta, 2016, p.11).
Information load “the amount of data to be processed per unit of time” (Schick
et al., 1990, p.203).
Information overload occurs “when an individual’s information processing cap-
abilities are exceeded by the information processing requirements” (Karr-
Wisniewski & Lu, 2010, p.1062).
Information processing (IP) the way information is selected, encoded, and
activated in human memory; the “gathering, interpreting, and synthesis of
information in the context of organizational decision making” (Tushman &
Nadler, 1978, p.614).
Information processing capacity (IPC) resources that are deployed to cope
with the limited nature of the memory system; in organizations, IPC is the
organization’s ability to process the information needed to execute tasks,
reduce uncertainty, resolve technical exceptions, and provide adequate coor-
dination for the completion of organizational tasks.
Internet addiction psychological dependence on the Internet; type of IT addiction.
Introspection attending to one’s physiological sensations and reporting thoughts
or images as objectively as possible.
Involvement reflects the psychological importance and personal relevance of an
object or an event.
IT addiction the state of being challenged in balancing IT usage mindfully so as
to preserve one’s resources; includes Internet, mobile email, and SNS
addictions.
IT-related overload the state of being challenged in processing information used
in IT-related activities (or requests to use IT); type of brain overload.
Limbic system a complex collection of structures that is commonly referred to as
the emotional brain or archaic brain; includes the amygdala, hippocampus,
thalamus, hypothalamus, basal ganglia, and cingulate gyrus.
Long-Term Memory (LTM) More permanent part of memory; comprised of
explicit (i.e., declarative, conscious) memory and implicit (i.e., non-declarative,
non-conscious) memory.
Mechanistic models representations where the environmental conditions are
changed to alter the probability of certain behaviours occurring and researchers
use statistical approaches in their laboratories.
Mental representations compositions of schemata that are stored and later
activated when the brain needs to process new elements of information
(bits or chunks); to solve problems, individuals act upon these mental
representations, which are fused with their stored personal histories and
mental thesaurus.
Mentalism the terminology of the mind, used particularly in psychoanalysis.
Mind-body supervenience holds that mental phenomenon must be anchored in
some type of physical system.
Minimally invasive surgery a type of surgery in which the surgeon performs the
operation through small incisions and with a camera inside the body.
Monochronicity the extent to which people prefer to do one thing at a time.
Multitasking performing multiple tasks at the same time; shifting frequently from
one task to another (i.e., task-switching).
Narcissism a fundamental absorption towards the self and the constant need to
validate one’s existence; reflects a grandiose, inflated, self-centered self-concept
that suppresses low self-esteem based on defective attachment in childhood.
Need for Cognition (NFC) “a need to structure relevant situations in mean-
ingful, integrated ways. It is the need to understand and make reasonable the
experiential world” (Cohen, Stotland, & Wolfe, 1955, p.291).
Net Geners people born after 1980; includes the groups called Millennials and
Generation Y.
Objective focusing on the object, the material in Popper’s World 1; measured
quantitatively.
Organizational design process whereby organizational structure is made to fit
with specific characteristics both inside and outside the organizational system.
Organizational structure the pattern of interactions or the network of relation-
ships that exist among organizational members and units.
Over-connectivity the situation of being connected to the extent that the indi-
vidual may experience one or more of the various forms of IT addiction.
Oxytocin neurohormone that enhances brain reward system responses; affects
social attachment behaviour; commonly called the ‘love’ or ‘cuddle’ hormone.
Paradigm shift a fundamental change in scientific practices.
Pathological Internet Use (PIU) the consequences of problematic cognition
coupled with behaviour that intensifies or maintains maladaptive response;
has four elements (1) excessive Internet use, often associated with a loss of
sense of time or a neglect of basic drives; (2) withdrawal, including feelings
of anger, depression, and tension when the Internet is not accessible; (3) toler-
ance, including the need for better computer equipment, more software, or
more hours of use; and (4) adverse consequences, including arguments, lying,
poor school or vocational achievement, social isolation, and fatigue; type of
IT addiction.
Peripheral Nervous System (PNS) part of the nervous system composed of
spinal and cranial nerves, the Autonomic Nervous System, and ganglia, which
are sensory receptor organs scattered throughout the body.
Personality trait the tendency to manifest particular patterns of cognition,
emotion, motivation, and behaviour in response to a variety of eliciting
stimuli.
Pertinence relevance; a new input matches the information stored in memory.
Polychronicity “the extent to which people (1) prefer to be engaged in two or
more tasks or events simultaneously and are also actually engaged and (2)
believe their preference is the best way to do things” (Bluedorn, 2002, p.51).
Prefrontal cortex (PFC) plays a key role when dealing with information and
decision-making; serves attentional process in detecting errors or recovering
from disruptions.
Prior experience of Emotional-Cognitive Overload (PECO) part of the
mental framework that is associated with concepts stored in the Semantic
Memory; results from the encoding of Emotional-Cognitive Overload in the
Episodic Memory.
Process of attention attentional process, supported by the Prefrontal cortex,
that allows for detecting errors and recovering from disruptions; draws on
limited attentional resources.
Psychometrics field of research concerned with the objective measurement of
skills and knowledge, abilities, attitudes, and personality traits.
Qualitative overload the situation in which employees perceive that their work
roles, or the work they are assigned, exceed their capability or skill levels,
their knowledge, or their capacity in terms of difficulty or amount of work,
creating pressure on them.
Quantitative overload “an individual’s perception that they cannot perform a
task because they lack critical resources” (Ahuja & Thatcher, 2005, p.435).
Resources “objects, personal characteristics, conditions and energies that are
valued by individuals or that serve as a means of attainment of other resources”
(Hobfoll, 1989, p.516); exist as a resource pool; necessary for cognitive pro-
cessing; may be extraneous, or endogenous physical, emotional, or cognitive
energy; serve as tools.
Robot “a reprogrammable multifunctional manipulator designed to move mate-
rial, parts, tools, or specialized devices through variable programmed motions
for the performance of tasks” (Hamilton & Hancock, 1986, p.70).
Role overload situation that occurs when employees feel like they have too much
to do in their various roles in light of available time and resources.
Schemata a form of analysis interposed between sensory data and the abstract
a priori categories of the mind; dual in nature: one part is rules (i.e., logic)
and the other is empirical perception (i.e., image); enriched through personal
experiences that build nets of representations explicitly in memory; required
for current and future problem-solving.
Scientific practice traditions in science including the combination of accepted
laws, theories, applications, and instrumentation.
Scientific revolution dramatic change in a science which is characterized by
paradigm shift.
Second brain relates to the gut feelings transmitted via the stomach, esophagus,
small intestine, and colon to the CNS; also known as the mind-gut
connection.
Self-serving attribution bias the tendency to attribute success to dispositional
attributes and failure to situational ones; a form of the fundamental attribution error.
Semantic Memory part of explicit memory in Long-Term Memory that acts as a
mental thesaurus.
Short-Term Memory (STM) short-term storage and attentional system in the
form of a single limited-capacity memory; limited to 7 ± 2 chunks of
information.
Social Networking System (SNS) social media systems such as Facebook,
LinkedIn, and Instagram.
Social presence the degree to which a medium allows an individual to establish a
personal connection with others that resembles face-to-face interaction.
Subjective focusing on the subject, the psychological states of subjective experi-
ences; Popper’s World 2.
Superchunk composed of first-order chunks combined in levels so that they
require less effort to store in memory and make the information easier to
remember.
Supervenience the ontological relation that occurs when upper-level system
properties are determined by their lower-level properties.
System feature overload the state that occurs when the technology an individual
has to use to complete a task is too complex for the task and for the individual.
Technology of social saturation reflects the media potential for expression and
connection to overpopulate the self (i.e., ego).
Technophilia a form of overidentification with technology that leads to a dis-
solution of human-technology boundaries.
Technophobia the struggle to accept computer technology.
Technostress type of stress experienced in organizations by technology end users
as a result of their inability to cope with the demands of organizational com-
puter usage.
Triangulation a research strategy that uses a multitrait-multimethod approach,
or convergent validation, with the goal of ensuring that the results of research
are not the product of methodological artefact (i.e., observation is cross-verified
from two or more sources).
Valence a positive or negative emotional tag attached to events and concepts,
activated in association with prior experience of the related emotional
tag.
Variable an observable entity which is capable of assuming two or more values.
Work an “ongoing, often unending stream of meaningful activities that [allow] the
worker to fulfil a distinct role” (Pearlson et al., 2016, p.77).
Work-family conflict situation that occurs when the time and energy demands
of one set of roles (i.e., work or family) makes it difficult to fulfill the demands
of another.
Work-life balance the degree to which individuals can satisfactorily harmonize
the temporal, emotional, and behavioural demands of work and family life that
are levied on them.
Work overload situation of having too much work to do within the designated
conventional workday.
Working memory (WM) a single storage mechanism whose role is to activate
traces, leading to temporary rather than permanent change in the cognitive
system itself.
REFERENCES

Hashem, I.A.T., Chang, V., Anuar, N.B., Adewole, K., Yaqoob, I., Gani, A.,
Ahmed, E., & Chiroma, H. (2016). The role of big data in smart city. International Journal
of Information Management, 36(5), 748–758.
Agence France-Presse (2016). French workers win legal right to avoid checking work email
out-of-hours. The Guardian (December 31). Available at: https://www.theguardian.
com/money/2016/dec/31/french-workers-win-legal-right-to-avoid-checking-work-ema
il-out-of-hours (accessed December 5, 2017).
Ahuja, M.K., Chudoba, K.M., Kacmar, C.J., McKnight, D.H., & George, J.F. (2007). IT
road warriors: Balancing work-family conflict, job autonomy, and work overload to
mitigate turnover intentions. MIS Quarterly, 31(1), 1–17.
Ahuja, M.K., & Thatcher, J.B. (2005). Moving beyond intentions and toward the theory of
trying: Effects of work environment and gender on post-adoption information technol-
ogy use. MIS Quarterly, 29(3), 427–459.
Aljukhadar, M., Senecal, S., & Daoust, C.E. (2012). Using recommendation agents to cope
with information overload. International Journal of Electronic Commerce, 17(2), 41–70.
Allen, D.K., & Shoard, M. (2005). Spreading the load: Mobile information and commu-
nications technologies and their effect on information overload. Information Research, 10
(2), article 227.
Anderson, J.R., & Bower, G.H. (1973). Human associative memory. Washington, DC:
Winston.
Arbesman, S. (Ed.) (2013). The half-life of facts: Why everything we know has an expiration date.
New York: Penguin.
Aristotle (350 BCE). Politics. Translated by Benjamin Jowett (Public domain). Available at:
http://pinkmonkey.com/dl/library1/gp017.pdf (accessed December 13, 2017).
Arnetz, B.B., & Wiholm, C. (1997). Stress: Psychophysiological symptoms in modern offi-
ces. Journal of Psychosomatic Research, 43(1), 35–42.
Arora, S., Hull, L., Sevdalis, N., Tierney, T., Nestel, D., Woloshynowych, M., Darzi, A. &
Kneebone, R. (2010). Factors compromising safety in surgery: Stressful events in the
operating room. The American Journal of Surgery, 199(1), 60–65.
Asimov, I. (1942). Runaround. Astounding Science Fiction, 29(3), 94–103. [Republished in
Robot Visions by Isaac Asimov, Penguin, 1991.]
Atkinson, R.C., & Shiffrin, R.M. (1968). Human memory: A proposed system and its
control processes. In K.W. Spence & J.T. Spence (Eds), The psychology of learning and
motivation (Vol.2, pp.89–195). New York: Academic Press.
Autor, D.H., & Dorn, D. (2013). The growth of low-skill service jobs and the polarization
of the US labour market. American Economic Review, 103(5), 1553–1597.
Ayyagari, R., Grover, V., & Purvis, R.L. (2011). Technostress: Technology antecedents and
implications. MIS Quarterly, 35(4), 831–858.
Babcock, Q., & Byrne, T. (2000). Student perceptions of methylphenidate abuse at a public
liberal arts college. Journal of American College Health, 49(3), 143–145.
Bacharach, S.B. (1989). Organizational theories: Some criteria for evaluation. Academy of
Management Review, 14(4), 496–515.
Baddeley, A.D. (Ed.) (1986). Working memory. Oxford: Oxford University Press.
Baddeley, A.D. (2000). The episodic buffer: A new component of working memory? Trends
in Cognitive Science, 4(11), 417–423.
Baddeley, A.D., & Hitch, G. (1974). Working memory. In G.H. Bower (Ed.), The psychol-
ogy of learning and motivation: Advances in research and theory (Vol. 8, pp.47–89). New York:
Academic Press.
Baddeley, A.D., & Logie, R.H. (1999). Working memory: The multiple component model.
In A. Miyake & P. Shah (Eds), Models of working memory (pp.28–61). New York: Cam-
bridge University Press.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psycholo-
gical Review, 84(2), 191–215.
Bargh, J.A., & Ferguson, M.J. (2000). Beyond behaviorism: On the automaticity of higher
mental processes. Psychological Bulletin, 126(6), 925–945.
Barkhi, R. (2002). Cognitive style may mitigate the impact of communication mode. Infor-
mation and Management, 39(8), 677–688.
Barki, H. & Hartwick, J. (1989). Rethinking the concept of user involvement. MIS Quar-
terly, 13(1), 53–63.
Barley, S.R., Meyerson, D.E., & Grodal, S. (2011). E-mail as a source and symbol of stress.
Organization Science, 22(4), 887–906.
Barnard, P.J. (1985). Interactive cognitive subsystems: A psycholinguistic approach to short-
term memory. In A. Ellis (Ed.), Progress in the psychology of language (Vol. 2, pp.197–258).
London: Lawrence Erlbaum.
Barnard, P.J., Duke, D.J., Byrne, R.W., & Davidson, I. (2007). Differentiation in cogni-
tive and emotional meanings: An evolutionary analysis. Cognition and Emotion, 21(6),
1155–1183.
Barrett, K., & Campos, J. (1987). Perspectives on emotional development II: A function-
alist approach to emotions. In J. Osofsky (Ed.), Handbook of infant development (2nd ed.,
pp.555–578). New York: Wiley.
Basil, M.D. (1994). Multiple resource theory I: Application to television viewing. Commu-
nication Research, 21(2), 177–207.
Baumeister, R.F., & Leary, M.R. (1995). The need to belong: Desire for interpersonal
attachments as a fundamental human motivation. Psychological Bulletin, 117(3), 497–529.
Berghel, H. (1997). Cyberspace 2000: Dealing with information overload. Communications of
the ACM, 40(2), 19–26.
Bergman, S.M., Fearrington, M.E., Davenport, S.W., & Bergman, J.Z. (2011). Millennials,
narcissism, and social networking: What narcissists do on social networking sites and why.
Personality and Individual Differences, 50(5), 706–711.
Berguer, R., Smith, W.D., & Chung, Y.H. (2001). Performing laparoscopic surgery is sig-
nificantly more stressful for the surgeon than open surgery. Surgical Endoscopy, 15(10),
1204–1207.
Bettman, J.R., Johnson, E., & Payne, J. (1990). A componential analysis of cognitive effort
and choice. Organizational Behavior and Human Decision Processes, 45(1), 111–139.
Binet, A. & Simon, T. (1904). Méthodes nouvelles pour le diagnostic du niveau intellectuel
des anormaux. L’année psychologique, 11, 191–244.
Birbaumer, N., & Cohen, L.G. (2007). Brain-computer interfaces: Communication and
restoration of movement in paralysis. The Journal of Physiology, 579(3), 621–636.
Block, J.J. (2008). Issues for DSM-V: Internet addiction. The American Journal of Psychiatry,
165(3), 306–307.
Bluedorn, A.C. (Ed.) (2002). The human organization of time: Temporal realities and experience.
Stanford, CA: Stanford University Press.
Boden, M.A. (Ed.) (1977). Artificial intelligence and natural man. New York: Basic Books.
Bolino, M.C., & Turnley, W.H. (2005). The personal costs of citizenship behavior: The
relationship between individual initiative and role overload, job stress, and work-family
conflict. Journal of Applied Psychology, 90(4), 740–748.
Bouzida, N., Bendada, A., & Maldague, X. (2009). Visualization of human body thermo-
regulation by infrared imaging. Journal of Thermal Biology, 34(3), 120–126.
Bower, G.H. (1981). Mood and memory. American Psychologist, 36(2), 129–148.
Bower, G.H. (1991). Mood congruity of social judgments. In J.P. Forgas (Ed.), Emotion and
social judgments (pp.31–54). Oxford: Pergamon Press.
Bower, G.H. (2001). Mood as a resource in processing self-relevant information. In J.P.
Forgas (Ed.), Handbook of affect and social cognition (pp.256–272). Mahwah, NJ: Lawrence
Erlbaum.
Bradshaw, J.L. (1968). Load and pupillary changes in continuous processing tasks. British
Journal of Psychology, 59(3), 265–271.
Bridges, W. (1994). Job shift: How to prosper in a workplace without jobs. Reading, MA: Addison-Wesley.
Broadbent, D. (1958). Perception and communication. London: Pergamon Press.
Brod, C. (1984). Technostress: The human cost of the computer revolution. Reading, MA: Addison-Wesley.
Brooks, S., Longstreet, P., & Califf, C. (2017). Social media induced technostress and its
impact on Internet addiction: A distraction-conflict theory perspective. AIS Transactions
on Human-Computer Interaction, 9(2), 99–122.
Brynjolfsson, E., & McAfee, A. (2011). Race against the machine: How the digital revolution is
accelerating innovation, driving productivity, and irreversibly transforming employment and the
economy. Lexington, MA: Digital Frontier Press.
Bryson, B. (2003). A short history of nearly everything. New York: Broadway Books.
Buffardi, L.E., & Campbell, W.K. (2008). Narcissism and social networking web sites. Per-
sonality and Social Psychology Bulletin, 34(10), 1303–1314.
Bureau of Labor Statistics (2017). Occupational outlook handbook. https://www.bls.gov/
ooh/transportation-and-material-moving/ (accessed December 30, 2017).
Cacioppo, J.T., & Petty, R.E. (1982). The need for cognition. Journal of Personality and Social
Psychology, 42(1), 116–131.
Cacioppo, J.T., Petty, R.E., Feinstein, J.A., & Jarvis, B.G. (1996). Dispositional differences
in cognitive motivation: The life and times of individuals varying in need for cognition.
Psychological Bulletin, 119(2), 197–253.
Calhoun, C.J. (2002). Classical sociological theory. Malden, MA: Blackwell.
Campbell, D.T., & Fiske, D.W. (1959). Convergent and discriminant validation by the
multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81–105.
Cao, C.G., Zhou, M., Jones, D.B., & Schwaitzberg, S.D. (2007). Can surgeons think and
operate with haptics at the same time? Journal of Gastrointestinal Surgery, 11(11), 1564–1569.
Cannon, W.B. (1914). The interrelations of emotions as suggested by recent psychological
researches. American Journal of Psychology, 25(2), 256–282.
Cannon, W.B. (1927). The James/Lange theory of emotion: A critical examination and an
alternative theory. American Journal of Psychology, 39, 106–124.
Cannon, W.B. (Ed.) (1929). Bodily changes in pain, hunger, fear and rage. New York: D.
Appleton and Co.
Cannon, W.B. (Ed.) (1932). The wisdom of the body. New York: Norton.
Caplan, S.E. (2002). Problematic Internet use and psychosocial well-being: Development of
a theory-based cognitive-behavioral measurement instrument. Computers in Human Beha-
vior, 18(5), 553–575.
Caplan, S.E. (2003). Preference for online social interaction: A theory of problematic
Internet use and psychosocial well-being. Communication Research, 30(6), 625–648.
Caplan, S.E. (2007). Relations among loneliness, social anxiety, and problematic Internet
use. Cyberpsychology Behavior, 10(2) 234–242.
Caplan, S.E. (2010). Theory and measurement of generalized problematic Internet use: A
two-step approach. Computers in Human Behavior, 26(5), 1089–1097.
Carlson, N., & Buskist, W. (1997). Psychology: The science of behavior. Boston, MA: Allyn and
Bacon.
Carpenter, C.J. (2012). Narcissism on Facebook: Self-promotional and anti-social behavior.
Personality and Individual Differences, 52(4), 482–486.
Carr, N. (2017). How smartphones hijack our minds. Wall Street Journal (October 6). Avail-
able at: https://www.wsj.com/articles/how-smartphones-hijack-our-minds-1507307811
(accessed October 26, 2017).
Carvalho, S., Cunha, E., Sousa, C., & Matsuzawa, T. (2008). Chaînes opératoires and
resource exploitation strategies in chimpanzee (Pan troglodytes) nut cracking. Journal of
Human Evolution, 55(1), 148–163.
Cellan-Jones, R. (2014). Stephen Hawking warns artificial intelligence could end mankind.
The BBC (December 2). Available at: www.bbc.com/news/technology-30290540
(accessed June 12, 2017).
Chambers, A.B., & Nagel, D.C. (1985). Pilots of the future: Human or computer? Com-
munications of the ACM, 28(11), 74–87.
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the format of instruction.
Cognition and Instruction, 8(4), 293–332.
Chang, S.L., & Ley, K. (2006). A learning strategy to compensate for cognitive overload in
online learning: Learner use of printed online material. Journal of Interactive Online Learning,
5(1), 104–117.
Chen, Y.C., Shang, R.A., & Kao, C.Y. (2009). The effects of information overload on
consumers’ subjective state towards buying decision in the Internet shopping environ-
ment. Electronic Commerce Research and Applications, 8(1), 48–58.
Chervany, N., & Dickson, G. (1974). An experimental evaluation of information in a pro-
duction environment. Management Science, 20(10), 1335–1344.
Chewning, E.G., & Harrell, A.M. (1990). The effect of information load on decision
makers’ cue utilization levels and decision quality in a financial distress decision task.
Accounting, Organizations and Society, 15(6), 527–542.
Chialastri, A. (2012). Automation in aviation. In F. Kongoli (Ed.), Aviation (pp.79–102).
Rijeka, Croatia: InTech.
Chou, C., & Hsiao, M.C. (2000). Internet addiction, usage, and gratifications: The Taiwan
college students’ case. Computers and Education, 35(1), 65–80.
Christensen, C.M., Bohmer, R., & Kenagy, J. (2000). Will disruptive innovations cure
health care? Harvard Business Review, 78(5), 102–112, 199.
Clark, P.A. (1985). A review of the theories of time and structure for organizational sociol-
ogy. Research in the Sociology of Organizations, 4, 35–97.
Cobos, P., Sanchez, M., Garcia, C., Vera, M.N., & Vila, J. (2002). Revisiting the James
versus Cannon debate on emotion: Startle and autonomic modulation in patients with
spinal cord injuries. Biological Psychology, 61(3), 251–269.
Cohen, A.R., Stotland, E., & Wolfe, D.M. (1955). An experimental investigation of need
for cognition. Journal of Abnormal and Social Psychology, 51(2), 291–294.
Cohen, K.N., & Clark, J.A. (1984). Transitional object attachments in early childhood and per-
sonality characteristics in later life. Journal of Personality and Social Psychology, 46(1), 106–111.
Colbert, A., Yee, N., & George, G. (2016). The digital workforce and the workplace of the
future. Academy of Management Journal, 59(3), 731–739.
Cook, G.J. (1993). An empirical investigation of information search strategies with implica-
tions for decision support system design. Decision Sciences, 24(3), 683–699.
Craik, F.M.I., & Lockhart, R.S. (1972). Level of processing: A framework for memory
research. Journal of Verbal Learning and Verbal Behavior, 11(6), 671–684.
Cross, R., & Gray, P. (2013). Where has the time gone? California Management Review, 56(1),
50–66.
Cross, R., Rebele, R., & Grant, A. (2016). Collaborative overload. Harvard Business Review,
94(1), 74–79.
Csikszentmihalyi, M. (1975). Beyond boredom and anxiety. San Francisco, CA: Jossey-Bass.
Csikszentmihalyi, M., & Csikszentmihalyi, I. (1988). Optimal experience. Cambridge, UK:
Cambridge University Press.
Dhar, R. (1996). The effect of decision strategy on deciding to defer choice. Journal of
Behavioral Decision Making, 9(4), 265–281.
Damasio, A.R. (1994). Descartes’ error: Emotion, reason and the human brain. New York: Putnam.
Damasio, A., & Van Hoesen, G.W. (1983). Emotional disturbances associated with focal
lesions of the limbic frontal lobe. In K. Heilman & P. Satz (Eds), The neuropsychology of
human emotion: Recent advances (pp.85–110). New York: The Guilford Press.
Daniels, K. (2008). Affect and information processing. In G.P. Hodgkinson and W.H. Star-
buck (Eds), Oxford handbook of organizational decision-making (pp.325–341). Oxford:
Blackwell.
D’Arcy, J., Gupta, A., Tarafdar, M., & Turel, O. (2014). Reflecting on the “dark side” of
information technology use. Communications of the Association for Information Systems, 35, article 5.
D’Arcy, J., Herath, T., & Shoss, M.K. (2014). Understanding employee responses to stressful
information security requirements: A coping perspective. Journal of Management Information
Systems, 31(2), 285–318.
Darwin, C. (Ed.) (1859). On the origin of species by means of natural selection, or the preservation of
favoured races in the struggle for life. New York: D. Appleton and Co.
Darwin, C. (Ed.) (1871). The descent of man, and selection in relation to sex. London: Murray.
Davila, J., Hershenberg, R., Feinstein, B.A., Gorman, K., Bhatia, V., & Starr, L.R. (2012).
Frequency and quality of social networking among young adults: Associations with
depressive symptoms, rumination, and co-rumination. Psychology of Popular Media Culture,
1(2), 72–86.
Davis, R.A. (2001). A cognitive-behavioral model of pathological Internet use. Computers in
Human Behavior, 17(2), 187–195.
Davis, R.A., Flett, G.L., & Besser, A. (2002). Validation of a new scale for measuring pro-
blematic Internet use: Implications for pre-employment screening. CyberPsychology and
Behavior, 5(4), 331–345.
Dawson, M.E., Schell, A.M., & Filion, D.L. (1990). The electrodermal response system. In
J.T. Cacioppo & L.G. Tassinary (Eds), Principles of psychophysiology: Physical, social and
inferential elements (pp.295–324). Cambridge: Cambridge University Press.
Denning, P.J. (1982). Electronic junk. Communications of the ACM, 25(3), 163–165.
Denzin, N.K. (1978). The research act (2nd ed.). New York: McGraw-Hill.
Descartes, R. (1644). Principles of philosophy. Amsterdam: Louis Elzevir.
Deutsch, J., & Deutsch, D. (1963). Attention: Some theoretical considerations. Psychological
Review, 70(1), 80–90.
DeWall, N.C., Buffardi, E.L., Bonser, I., & Campbell, W.K. (2011). Narcissism and implicit
attention seeking: Evidence from linguistic analyses of social networking and online pre-
sentation. Personality and Individual Differences, 51(1), 57–62.
DeYoung, C.G., Hirsh, J.B., Shane, M.S., Papademetris, X., Rajeevan, N., & Gray, J.R.
(2010). Testing predictions from personality neuroscience: Brain structure and the Big
Five. Psychological Science, 21(6), 820–828.
DeYoung, C.G., Peterson, J.B., & Higgins, D.M. (2005). Sources of openness/intellect:
Cognitive and neuropsychological correlates of the fifth factor of personality. Journal of
Personality, 73(4), 825–858.
Dickson, G.W., Senn, J.A., & Chervany, N.L. (1977). Research in management information
systems: The Minnesota experiments. Management Science, 23(9), 913–924.
Diderot, D. (1818–1819). Oeuvres de Denis Diderot (11 vols). Paris: J.L.J. Brière.
Dimoka, A. (2011). Brain mapping of psychological processes with psychometric scales: An
fMRI method for social neuroscience. NeuroImage, 54(Suppl. 1), S263–S271.
Dimoka, A., Banker, R.D., Benbasat, I., Davis, F.D., Dennis, A.R., Gefen, D., Gupta, A.,
Ischebeck, A., Kenning, P., Pavlou, P., Müller-Putz, G., Riedl, R., vom Brocke, J., &
Weber, B. (2012). On the use of neurophysiological tools in IS research: Developing a
research agenda for NeuroIS. MIS Quarterly, 36(3), 679–702.
Dimoka, A., Pavlou, P.A., & Davis, F. (2011). NeuroIS: The potential of cognitive neu-
roscience for information systems research. Information Systems Research, 22(4), 687–702.
Donaldson, S.I., & Grant-Vallone, E.J. (2002). Understanding self-report bias in organiza-
tional behavior research. Journal of Business and Psychology, 17(2), 245–260.
Dossey, L. (2014). FOMO, digital dementia, and our dangerous experiment. Explore: The
Journal of Science and Healing, 10(2), 69–73.
Drouin, M., Kaiser, D.H., & Miller, D.A. (2012). Phantom vibrations among under-
graduates: Prevalence and associated psychological characteristics. Computers in Human
Behavior, 28(4), 1490–1496.
Dunahoo, C.L., Hobfoll, S.E., Monnier, J., Hulsizer, M.R., & Johnson, R. (1998). There’s
more than rugged individualism in coping. Part 1: Even the Lone Ranger had Tonto.
Anxiety, Stress & Coping, 11(2), 137–165.
Duxbury, L.E., & Higgins, C.A. (2001). Work-life balance in the new millennium: Where are we?
Where do we need to go? Ottawa, ON: Canadian Policy Research Networks.
Ebbinghaus, H. (1913/1885). Memory: A contribution to experimental psychology. New York:
Columbia Teachers’ College.
ECRI Institute (2012). Top 10 health technology hazards for 2013. Health Devices, 41(11),
1–23. Available at: www.ecri.org/2013hazards (accessed December 13, 2017).
ECRI Institute (2013). Top 10 health technology hazards for 2014. Health Devices, 42(11),
354–380. Available at: www.healthit.gov/facas/sites/faca/files/STF_Top_Ten_Tech_Hazards_
2014-06-13.pdf (accessed December 13, 2017).
ECRI Institute (2014). Top 10 health technology hazards for 2015. Health Devices, 43(11),
1–31. Available at: https://www.ecri.org/press/Pages/ECRI-Institute-Announces-Top-
10-Health-Technology-Hazards-for-2015.aspx (accessed December 13, 2017).
Edmunds, A. & Morris, A. (2000). The problem of information overload in business orga-
nisations: A review of the literature. International Journal of Information Management, 20(1),
17–28.
Ekman, P. (1984). Expression and the nature of emotion. In K.R. Scherer & E. Ekman
(Eds), Approaches to emotion (pp.319–344). Hillsdale, NJ: Lawrence Erlbaum.
Eppler, M.J., & Mengis, J. (2004). The concept of information overload: A review of lit-
erature from organization science, accounting, marketing, MIS, and related disciplines.
The Information Society, 20(5), 325–344.
Ernst, M., & Paulus, M.P. (2005). Neurobiology of decision making: A selective review
from a neurocognitive and clinical perspective. Biological Psychiatry, 58(8), 597–604.
Eysenck, M.W., & Eysenck, H.J. (1980). Mischel and the concept of personality. British
Journal of Psychology, 71(2), 191–204.
Farhoomand, A.F., & Drury, D.H. (2002). Managerial information overload. Communications
of the ACM, 45(10), 127–131.
Festinger, L., & Carlsmith, J.M. (1959). Cognitive consequences of forced compliance. The
Journal of Abnormal and Social Psychology, 58(2), 203–210.
Fleeson, W. (2001). Towards a structure- and process-integrated view of personality:
Traits as density distributions of states. Journal of Personality and Social Psychology, 80(6),
1011–1027.
Foehr, U.G. (2006). Media multitasking among American youth: Prevalence, predictors, and pairings.
Menlo Park, CA: The Henry J. Kaiser Family Foundation.
Folkman, S., & Lazarus, R.S. (1988). The relationship between coping and emotion:
Implications for theory and research. Social Science & Medicine, 26(3), 309–317.
Forbes, B.C. (1921). Why do so many men never amount to anything? [Interview with
Thomas Edison]. American Magazine, 91, 10–11, 85–86, 89.
Forgas, J.P. (2003). Affective influences on attitudes and judgments. In R.J. Davidson, K.R.
Scherer, & H.Goldsmith (Eds), Handbook of affective sciences (pp.596–618). Oxford: Oxford
University Press.
Forsyth, D.R. (1980). The functions of attributions. Social Psychology Quarterly, 43(2),
184–189.
Frank, R.H. (Ed.) (1988). Passions within reason: The strategic role of the emotions. New York:
Norton.
Fredette, J., Marom, R., Steinert, K., & Witters, L. (2012). The promise and peril of
hyperconnectivity for organizations and societies. In S. Dutta & B. Bilbao-Osorio (Eds),
The global information technology report 2012: Living in a hyperconnected world (pp.113–119).
Geneva, Switzerland: World Economic Forum.
Freedman, D.A. (2010). Statistical models and causal inference: A dialogue with the social sci-
ences. Edited by D. Collier, J.S. Sekhon, & P.B. Stark. New York: Cambridge University
Press.
Freud, S. (1894). The neuro-psychoses of defence. In J. Strachey (Ed.), The standard edition of
the complete works of Sigmund Freud (Vol.3, pp.41–61). London: Hogarth.
Freud, S. (Ed.) (1927). The ego and the id. London: Hogarth.
Frey, C., & Osborne, M. (2013). The future of employment: How susceptible are jobs to compu-
terisation? Oxford: University of Oxford.
Frijda, N.H. (1986). The emotions. New York: Cambridge University Press.
Frijda, N.H. (1994). Emotions are functional most of the time. In P. Ekman and R.J.
Davidson (Eds), The nature of emotion: Fundamental questions (pp.112–122). New York:
Oxford University Press.
Galbraith, J.R. (1974). Organization design: An information processing view. Interfaces, 4(3),
28–36.
Galton, F. (1892). Hereditary genius: An inquiry into its laws and consequences. London:
Macmillan and Co.
Gardner, H. (Ed.) (1987). The mind’s new science: A history of the cognitive revolution. New
York: Basic Books.
Gergen, K.J. (1991). The saturated self: Dilemmas of identity in contemporary life. New York:
Basic Books.
Gershon, M. (1999). The second brain: A groundbreaking new understanding of nervous disorders of
the stomach and intestine. New York: Harper Perennial.
Ghashghaei, H.T., & Barbas, H. (2002). Pathways for emotion: Interactions of prefrontal and
anterior temporal pathways in the amygdala of the rhesus monkey. Neuroscience, 115(4),
1261–1279.
Glusac, E. (2016). The challenge to unplug. The New York Times (September 18), p.2.
Goeders, N.E. (2003). The impact of stress on addiction. European Neuropsychopharmacology,
13(6), 435–441.
Goldin, C., & Katz, L. (2009). The race between education and technology: The evolution
of US educational wage differentials, 1890 to 2005. In The race between education and tech-
nology (pp.287–323). Cambridge, MA: Harvard University Press. [Working paper version
available at: www.nber.org/papers/w12984]
Goldschmidt, P.G. (2005). HIT and MIS: Implications of health information technology and
MIS. Communications of the ACM, 48(10), 68–74.
Golson, J. (2016). Tesla driver killed in crash with Autopilot active, NHSTA investigating.
The Verge (June 30). Available at: https://www.theverge.com/2016/6/30/12072408/tesla
-autopilot-car-crash-death-autonomous-model-s (accessed December 13, 2017).
Goodman, M. (2015). Future crimes: Everything is connected, everyone is vulnerable and what we
can do about it. New York: Doubleday.
Goonetilleke, R.S. & Luximon, Y. (2010). The relationship between monochronicity,
polychronicity and individual characteristics. Behaviour and Information Technology, 29(2),
187–198.
Goos, M., & Manning, A. (2007). Lousy and lovely jobs: The rising polarization of work in
Britain. The Review of Economics and Statistics, 89(1), 118–133.
Gretzel, U., Sigala, M., Xiang, Z., & Koo, C. (2015). Smart tourism: Foundations and
developments. Electronic Markets, 25(3), 179–188.
Griffiths, T.L. (2015). Manifesto for a new (computational) cognitive revolution. Cognition,
135, 21–23.
Grinbaum, A., & Groves, C. (2013). What is responsible about responsible innovation?
Understanding the ethical issues. In R Owen, J. Bessant & M. Heintz (Eds), Responsible
innovation: Managing the responsible emergence of science and innovation in society (pp.119–139).
Chichester, UK: Wiley & Sons.
Grise, M.-L., & Gallupe, R.B. (1999–2000). Information overload: Addressing productivity
paradox in face-to-face electronic meetings. Journal of Management Information Systems, 16
(3), 157–185.
Gross, J.J. (Ed.) (2007). Handbook of emotion regulation. New York: Guilford.
Groysberg, B., & Abrahams, R. (2014). Manage your work, manage your life. Harvard
Business Review, 92(3), 58–66.
Gutek, B.A. (1983). Changing boundaries. In J. Zimmerman (Ed.), The technological woman:
Interfacing with tomorrow (pp.157–172). New York: Praeger.
Hall, E.T. (1983). The dance of life. New York: Anchor.
Hallowell, E.M. (2005). Overloaded circuits: Why smart people underperform. Harvard
Business Review, 83(1), 54–62.
Hamilton, J.E., & Hancock, P.A. (1986). Robotics safety: Exclusion guarding for industrial
operations. Journal of Occupational Accidents, 8(1–2), 69–78.
150 References

Hamlett, P., Cobb, M.D., & Guston, D.H. (2013). National citizens’ technology forum:
Nanotechnologies and human enhancement. In S.A. Hays, J.S. Robert, C.A. Miller, & I.
Bennett (Eds), Nanotechnology, the brain, and the future (pp.265–283). Dordrecht: Springer
Netherlands.
Hancock, J., Gee, K., Ciaccio, K., & Mae-Hwah Lin, J. (2008). I’m sad you’re sad: Emo-
tional contagion in CMC. Proceedings of the 2008 ACM Conference on Computer Supported
Cooperative Work (pp.295–298). New York: ACM.
Hancock, P.A. (2013). Automation: how much is too much? Ergonomics, 57(3), 449–454.
Hancock, P.A., Billings, D.R., & Schaefer, K. E. (2011). Can you trust your robot? Ergo-
nomics in Design, 19(3), 24–29.
Hassard, J. (1996). Images of time in work and organization. In S.R. Clegg, C. Hardy, & W.
R. Nord (Eds), Handbook of Organization Studies, (pp.581–598). London: Sage.
Haugtvedt, C.P., Petty, R.E., Cacioppo, J.T., & Steidley, T. (1988). Personality and ad
effectiveness: Exploring the utility of need for cognition. Advances in Consumer Research,
15, 209–212.
Heider, F. (1946). Attitudes and cognitive organization. The Journal of Psychology, 21(1),
107–112.
Heimer, L., & Van Hoesen, G.W. (2006). The limbic lobe and its output channels: Impli-
cations for emotional functions and adaptive behavior. Neuroscience Biobehavior Review, 30
(2), 126–147.
Hemmerling, T.M., & Taddei, R. (2011). Robotic anesthesia: A vision for the future of
anesthesia. Translational Medicine, 1(1), 1–20.
Hemmerling, T.M., Taddei, R., Wehbe, M., Zaouter, C., Cyr, S., & Morse, J. (2012). First
robotic tracheal intubations in humans using the Kepler intubation system. British Journal
of Anaesthesia, 108(6), 1011–1016.
Hess, E.H., & Polt, J.M. (1964). Pupil size in relation to mental activity during simple pro-
blem solving. Science, 143(3611), 1190–1192.
Hiltz, R.S., & Turoff, M. (1985). Structuring computer-mediated communication systems to
avoid information overload. Communications of the ACM, 28(7), 680–688.
Hobfoll, S.E. (1989). Conservation of resources: A new attempt at conceptualizing stress.
American Psychologist, 44(3), 513–524.
Hobfoll, S.E. (2002). Social and psychological resources and adaptation. Review of General
Psychology, 6(4), 307–324.
Hobfoll, S.E. (2011). Conservation of resource caravans and engaged settings. Journal of
Occupational and Organizational Psychology, 84(1), 116–122.
Hobfoll, S.E., & Freedy, J. (1993). Conservation of resources: A general stress theory applied
to burnout. In W.B. Schaufeli, C. Maslach, & T. Marek (Eds), Professional burnout: Recent
developments in theory and practice (pp.115–133). Washington, DC: Routledge.
Huber, G.P. (1983). Cognitive style as a basis for MIS and DSS designs: Much ado about
nothing? Management Science, 29(5), 567–579.
Hull, C.L. (Ed.) (1943). Principles of behavior: An introduction to behavior theory. Oxford, UK:
D. Appleton-Century.
Huysmans, J.H. (1970). The effectiveness of the cognitive-style constraint in implementing
operations research proposals. Management Science, 17(1), 92–104.
Ipsos MediaCT & Wikia (2013). Generation Z: A look at the technology and media habits
of today’s teens. Available at: www.wikia.com/Generation_Z:_A_Look_at_the_Techno
logy_and_Media_Habits_of_Today%E2%80%99s_Teens (accessed September 20, 2017).
Iselin, E.R. (1988). The effects of information load and information diversity on
decision quality in a structured decision task. Accounting, Organizations and Society, 13
(2), 147–164.
Iselin, E.R. (1993). The effects of the information and data properties of financial ratios and
statements on managerial decision quality. Journal of Business Finance & Accounting, 20(2),
249–266.
Isidore, C., & Luhby, T. (2015). Turns out Americans work really hard… but some want to
work harder. CNN Money (July 9). Available at: http://money.cnn.com/2015/07/09/
news/economy/americans-work-bush/index.html (accessed October 26, 2017).
Jackson, T.W., & Farzaneh, P. (2012). Theory-based model of factors affecting information
overload. International Journal of Information Management, 32(6), 523–532.
Jacoby, J. (1984). Perspectives on information overload. The Journal of Consumer Research,
10(4), 432–435.
Jacoby, J., Speller, D., & Kohn-Berning, C. (1975). Constructive criticism and programmatic
research: Reply to Russo. Journal of Consumer Research, 2(2), 154–156.
Jaeggi, S.M., Buschkuehl, M., Etienne, A., Ozdoba, C., Perrig, W.J., & Nirkko, A.C.
(2007). On how high performers keep cool brains in situations of cognitive overload.
Cognitive, Affective & Behavioral Neuroscience, 7(2), 75–89.
Jakimowicz, J., & Cuschieri, A. (2005). Time for evidence-based minimal access surgery
training: Simulate or sink. Surgical Endoscopy, 19(12), 1521–1522.
Jaques, E. (1982). The form of time. New York: Crane Russak.
James, W. (1884). What is an emotion? Mind, 9(34), 188–205.
James, W. (Ed.) (1890). The principles of psychology. New York: Holt.
James, W. (1894). The physical basis of emotion. Psychological Review, 1, 516–529.
Jasperson, J., Carter, P.E., & Zmud, R.W. (2005). A comprehensive conceptualization of
post-adoptive behaviors associated with information technology enabled work systems.
MIS Quarterly, 29(3), 525–557.
Jick, T.D. (1979). Mixing qualitative and quantitative methods: Triangulation in action.
Administrative Science Quarterly, 24(4), 602–611.
Johnston, A. (2015). Robotic seals comfort dementia patients but raise ethical issues. KALW
Local Public Radio (August 17). Available at: http://kalw.org/post/robotic-seals-comfort-
dementia-patients-raise-ethical-concerns#stream/0 (accessed December 13, 2017).
Jones, E.E., & Harris, V.A. (1967). The attribution of attitudes. Journal of Experimental Social
Psychology, 3(1), 1–24.
Jones, Q., Ravid, G., & Rafaeli, S. (2004). Information overload and the message dynamics
of online interaction spaces: A theoretical model and empirical exploration. Information
Systems Research, 15(2), 194–210.
Junco, R. (2013). Comparing actual and self-reported measures of Facebook use. Computers
in Human Behavior, 29(3), 626–631.
Jutai, J.W., & Hare, R.D. (1983). Psychopathy and selective attention during performance of
a complex perceptual-motor task. Psychophysiology, 20(2), 146–151.
Kahneman, D. (Ed.) (1973). Attention and effort. Englewood Cliffs, NJ: Prentice Hall.
Kahneman, D., & Treisman, A. (1984). Changing views of attention and automaticity. In R.
Parasuraman and D.R. Davies (Eds), Varieties of attention (pp.28–61). Orlando, FL: Academic
Press.
Kandell, J.J. (1998). Internet addiction on campus: The vulnerability of college students.
Cyberpsychology & Behavior, 1(1), 11–17.
Kant, I. (1781–1787/2003). Critique of pure reason. Translated by Norman Kemp Smith.
Basingstoke, UK: Palgrave Macmillan.
Kaplan, A. (1964). The conduct of inquiry. San Francisco, CA: Chandler.
Kaplan, R., & Porter, M. (2011). How to solve the cost crisis in health care. Harvard Business
Review, 89(9), 47–64.
Kaplan, R.M., & Saccuzzo, D.P. (2010). Psychological testing: Principles, applications, and issues
(8th ed.). Belmont, CA: Wadsworth, Cengage Learning.
Kapur, S., Craik, F.I.M., Jones, C., Brown, G.M., Houle, S., & Tulving, E. (1995).
Functional role of the prefrontal cortex in retrieval of memories: A PET study. Neu-
roReport, 6(14), 1880–1884.
Kapur, S., Craik, F.I.M., Tulving, E., Wilson, A.A., Houle, S., & Brown, G.M. (1994).
Neuroanatomical correlates of encoding in episodic memory: Levels of processing effect.
Proceedings of the National Academy of Sciences of the United States of America, 91(6), 2008–
2011.
Karaiskos, D., Tzavellas, E., Balta, G., & Paparrigopoulos, T. (2010). P02-232 – Social
network addiction: A new clinical disorder? European Psychiatry, 25(Suppl. 1), 855.
Karr-Wisniewski, P., & Lu, Y. (2010). When more is too much: Operationalizing technol-
ogy overload and exploring its impact on knowledge worker productivity. Computers in
Human Behavior, 26(5), 1061–1072.
Kellogg, R.T. (1990). The psychology of writing. New York: Oxford.
Kim, K.K., & Michelman, J. E. (1990). An examination of factors for the strategic use of
information systems in the healthcare industry. Management Information Systems, 14(2),
201–215.
Kitchin, R. (2014). Big data, new epistemologies and paradigm shifts. Big Data & Society, 1(1),
1–12.
Klaus, M.H., & Kennell, J.H. (1985). Parent-infant bonding. St Louis, MO: Mosby Press.
Klausegger, C., Sinkovics, R.R., & “Joy” Zou, H. (2007). Information overload: A cross-
national investigation of influence factors and effects. Marketing Intelligence & Planning, 25
(7), 691–718.
Klingberg, T. (2009). The overflowing brain: Information overload and the limits of working memory.
Oxford: Oxford University Press.
Kluver, H., & Bucy, P.C. (1937). Psychic blindness and other symptoms following bilateral
temporal lobectomy in rhesus monkeys. American Journal of Physiology, 119(2), 352–353.
Knapp, T.J., & Robertson, L.C. (Eds) (1986). Approaches to cognition: Contrasts and con-
troversies. Hillsdale, NJ: Lawrence Erlbaum.
Kock, N. (2000). Information overload and worker performance: A process-centered view.
Knowledge and Process Management, 7(4), 256–264.
Koeniger, P., & Janowitz, K. (1995). Drowning in information, but thirsty for knowledge.
International Journal of Information Management, 15(1), 5–16.
Kohlberg, L. (1976). Moral stages and moralization: The cognitive-developmental approach.
In T. Lickona (Ed.), Moral development and behavior: Theory, research and social issues (pp.31–
53). New York: Holt, Rinehart, Winston.
Köhler, W. (1917/1925). The mentality of apes. New York: Humanities Press.
König, C.J., & Waller, M.J. (2010). Time for reflection: A critical examination of poly-
chronicity. Human Performance, 23(2), 173–190.
Korac-Kakabadse, N., Kouzmin, A., & Korac-Kakabadse, A. (2001). Emerging impact of
online over-connectivity. Paper presented at the 9th European Conference on Informa-
tion Systems. Bled, Slovenia, 27–29 June.
Kraut, R., Patterson, M., Lundmark, V., Kiesler, S., Mukopadhyay, T., & Scherlis, W.
(1998). Internet paradox: A social technology that reduces social involvement and psy-
chological well-being? American Psychologist, 53(9), 1017–1031.
Krugman, P. (2013). Sympathy for the Luddites. The New York Times (June 13). Available at:
www.nytimes.com/2013/06/14/opinion/krugman-sympathy-for-the-luddites.html?_r=0
(accessed November 5, 2017).
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago, IL: The University of Chi-
cago Press.
Kuss, D.J., & Griffiths, M.D. (2011). Online social networking and addiction: A review of
the psychological literature. International Journal of Environmental Research and Public Health,
8(9), 3528–3552.
Lanzetta, J.T., & Orr, S.P. (1980). Influence of facial expressions on the classical condition-
ing of fear. Journal of Personality and Social Psychology, 39(6), 1081–1087.
Lanzetta, J.T., & Orr, S.P. (1986). Excitatory strength of expressive faces: Effects of happy
and fear expressions and context on the extinction of a conditioned fear response. Journal
of Personality and Social Psychology, 50(1), 190–194.
Lashley, K.S. (Ed.) (1929). Brain mechanisms and intelligence. Chicago, IL: Chicago University Press.
Lazarus, R.S. (1994). Emotion and adaptation. Oxford: Oxford University Press.
Lazarus, R.S., & Folkman, S. (1984). Stress, appraisal, and coping. New York: Springer.
Lazarus, R.S., & Smith, C.A. (1989). Knowledge and appraisal in the cognition-emotion
relationship. Cognition and Emotion, 2(4), 281–300.
Leavitt, H.J. (1958). Managerial psychology. Chicago, IL: University of Chicago Press.
LeDoux, J. (1992). Emotion and the amygdala. In J.P. Aggleton (Ed.), The amygdala: Neu-
robiological aspects of emotion, memory, and mental dysfunction (pp.339–351). New York:
Wiley-Liss.
LeDoux, J. (Ed.) (1998). The emotional brain: The mysterious underpinnings of emotional life.
London: Clays Ltd.
Lee, A.R., Son, S.M., & Kim, K.K. (2016). Information and communication technology
overload and social networking service fatigue: A stress perspective. Computers in Human
Behavior, 55, 51–61.
Lee, Y.K., Chang, C.T., Lin, Y., & Cheng, Z.H. (2014). The dark side of smartphone
usage: Psychological traits, compulsive behavior and technostress. Computers in Human
Behavior, 31, 373–383.
Levine, S. (2005). Developmental determinants of sensitivity and resistance to stress.
Psychoneuroendocrinology, 30(10), 939–946.
Lewis, P. (2017). Our minds can be hijacked: The tech insiders who fear a smartphone dys-
topia. The Guardian (October 6). Available at: https://www.theguardian.com/technology/
2017/oct/05/smartphone-addiction-silicon-valley-dystopia (accessed September 9, 2017).
Liden, G.B., Wolowicz, M., Stivoric, J., Teller, A., Kasabach, C., Vishnubhatla, S., Pelletier,
R., Farringdon, J., & Boehmke, S. (2002). Characterization and implications of the sen-
sors incorporated into the SenseWear™ armband for energy expenditure and activity
detection. Available at: www.bodymedia.com/site/docs/papers/Sensors.pdf (accessed
September 12, 2017).
Liepert, J. (2005). Transcranial magnetic stimulation in neurorehabilitation. Acta Neurochir-
urgica Supplementum, 93, 71–74.
Liu, D., Santhanam, R., & Webster, J. (2017). Towards meaningful engagement: A frame-
work for design and research of gamified information systems. MIS Quarterly, 41(4),
1011–1034.
Logan, G.D. (2004). Working memory, task switching, and executive control in the task
span procedure. Journal of Experimental Psychology: General, 133(2), 218–236.
Ma, H., Li, S., & Pow, J. (2011). The relation of Internet use to prosocial and antisocial
behavior in Chinese adolescents. Cyberpsychology, Behavior and Social Networking, 14(3),
123–130.
Mackenzie, K.D. (Ed.) (1976). A theory of group structures: Basic theory (Vol.1). New York:
Gordon & Breach.
Makris, N., Oscar-Berman, M., Jaffin, S.K., Hodge, S.M., Kennedy, D.N., Caviness, V.S.,
Marinkovic, K., Breiter, H.C., Gasic, G.P., & Harris, G.J. (2008). Decreased volume of
the brain reward system in alcoholism. Biological Psychiatry, 64(3), 192–202.
Malhotra, N.K. (1984). Reflections on the information overload paradigm in consumer
decision making. The Journal of Consumer Research, 10(4), 436–440.
Malhotra, N.K., Jain, A.K., & Lagakos, S.W. (1982). The information overload controversy:
An alternative viewpoint. Journal of Marketing, 46(2), 27–37.
Mandler, G. (1967). Organization and memory. The Psychology of Learning and Motivation, 1,
327–372.
Mascarenhas, Y. (2017). Stephen Hawking: AI could “develop a will of its own” in conflict
with ours that “could destroy us”. International Business Times (November 8). Available at:
www.ibtimes.co.uk/stephen-hawking-ai-could-develop-will-its-own-conflict-ours-that-
could-destroy-us-1646352 (accessed November 11, 2017).
Maslach, C., & Jackson, S.E. (1981). The measurement of experienced burnout. Journal of
Organizational Behavior, 2(2), 99–113.
Mason, R.O., & Mitroff, J. (1973). A program for research on Management Information
Systems. Management Science, 19(5), 475–487.
Medina, H., Verhulst, M., & Rutkowski, A.F. (2015). Is it health IT? Task complexity and
work substitution. Paper presented at the 2015 Americas Conference on Information
Systems, Puerto Rico, August 13–15.
Mehdizadeh, S. (2010). Self-presentation 2.0: Narcissism and self-esteem on Facebook.
Cyberpsychology, Behavior and Social Networking, 13(4), 357–364.
Meier, R.L. (1963). Communications overload: Proposals from the study of a university
library. Administrative Science Quarterly, 7(4), 521–544.
Menninger, K. (Ed.) (1963). The vital balance: The life process in mental health and illness. New
York: Viking.
Metz, R. (2017). Smartphones are weapons of mass manipulation, and this guy is declaring
war on them. MIT technology review (October 19). Available at: https://www.technolo
gyreview.com/s/609104/smartphones-are-weapons-of-mass-manipulation-and-this-
guy-is-declaring-war-on-them/ (accessed November 27, 2017).
Miller, D.T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or
fiction? Psychological Bulletin, 82(2), 213–225.
Miller, G.A. (1956a). The magical number seven, plus or minus two: Some limits on our
capacity for processing information. The Psychological Review, 63(2), 81–97.
Miller, G.A. (1956b). Human memory and the storage of information. IRE Transactions on
Information Theory, 2(3), 129–137.
Miller, N.E. (1980). Applications of learning and biofeedback to psychiatry and medicine. In
H.I. Kaplan, A.M. Freedman, & B.J. Sadock (Eds), Comprehensive textbook of psychiatry (3rd
ed., pp.468–484). Baltimore, MD: Williams and Wilkins.
Minas, R.K., Potter, R.F., Dennis, A.R., Bartelt, V., & Bae, S. (2014). Putting on the
thinking cap: Using NeuroIS to understand information processing biases in virtual teams.
Journal of Management Information Systems, 30(4), 49–82.
Modell, J.H. (2005). Assessing the past and shaping the future of anesthesiology: The 43rd
Rovenstine Lecture. Anesthesiology, 102(5), 1050–1057.
Mohan, G. (2013). Facebook is a bummer, study says. Los Angeles Times (August 14).
Available at: www.latimes.com/science/sciencenow/la-sci-sn-facebook-bummer-20130814-
story.html (accessed November 30, 2017).
Molina, B. (2017). Do smartphones keep us in or out of touch? USA Today (August 8), 1B, 2B.
Monetta, L., & Joanette, Y. (2003). Specificity of the right hemisphere’s contribution to
verbal communication: The cognitive resources hypothesis. Journal of Medical Speech Lan-
guage Pathology, 11(4), 203–211.
Montgomery, K.C. (2015). Youth and surveillance in the Facebook era: Policy interventions
and social implications. Telecommunications Policy, 39(9), 771–786.
Moore, T.V. (1938). Cognitive psychology. Philadelphia, PA: Lippincott.
Moreland, V. (1993). Techno-stress and personality type. On-line, 17(4), 59–62.
Moreno, J.D. (2012). Mind wars: Brain science and the military in the twenty-first century. New
York: Bellevue Literary Press.
Moreno, M.A., Jelenchick, L.A., Egan, K.G., Cox, E., Young, H., Gannon, K.E., &
Becker, T. (2011). Feeling bad on Facebook: Depression disclosures by college students
on a social networking site. Depression and Anxiety, 28(6), 447–455.
Morris, D.Z. (2017). New French law bars work email after hours. Fortune (January 1).
Available at: http://fortune.com/2017/01/01/french-right-to-disconnect-law/ (accessed
October 27, 2017).
Murali, V., & George, S. (2007). Lost online: An overview of Internet addiction. Advances in
Psychiatric Treatment, 13(1), 24–30.
Neisser, U. (Ed.) (1967). Cognitive psychology. Englewood Cliffs, NJ: Prentice Hall.
Neisser, U. (Ed.) (1976). Cognition and reality: Principles and implications of cognitive psychology.
New York: W.H. Freeman/Times Books/Henry Holt & Co.
Nelson, M.R. (2001). We have the information you want, but getting it will cost you:
Being held hostage by information overload. Available at: www.acm.org/crossroads/
xrds1-1/mnelson.html (accessed January 1, 2009).
Nenadic, I., Güllmar, D., Dietzek, M., Langbein, K., Steinke, J., & Gaser, C. (2015). Brain
structure in narcissistic personality disorder: A VBM and DTI pilot study. Psychiatry
Research: Neuroimaging, 231(2), 184–186.
Newell, A., Rosenbloom, P.S., & Laird, J.E. (1989). Symbolic architectures for cognition. In M.
Posner (Ed.), Foundations of cognitive science (pp.93–131). Cambridge, MA: MIT Bradford Books.
Newell, A., Shaw, J.C., & Simon, H.A. (1957). Empirical explorations of the logic theory
machine: A case study in heuristics. Proceedings of the Western Joint Computer Conference
(pp.218–230). New York: The Institute of Radio Engineers.
Newell, A., & Simon, H.A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice Hall.
Nicolelis, M. (2017). The troubled marriage of brains and computers. Wall Street Journal
(October 19). Available at: https://www.wsj.com/articles/the-troubled-marriage-of-bra
ins-and-computers-1508418316 (accessed November 5, 2017).
Nisbett, R.E., & Ross, L. (1980). Human inference: Strategies and shortcomings of social judgement.
Englewood Cliffs, NJ: Prentice-Hall.
Nisbett, R., & Wilson, T. (1977). Telling more than we can know: Verbal reports on
mental processes. Psychological Review, 84(3), 231–259.
Norman, D.A. (1968). Toward a theory of memory and attention. Psychological Review, 75
(6), 522–536.
Norman, D.A. (Ed.) (1969). Memory and attention. New York: Wiley.
Norman, D.A. (1980). Twelve issues for cognitive sciences, Cognitive Science, 4(1), 1–32.
NPR All Tech Considered (2017). From Post-it notes to algorithms: How automation is
changing legal work (November 7). Available at: https://www.npr.org/sections/alltech
considered/2017/11/07/561631927/from-post-it-notes-to-algorithms-how-automatio
n-is-changing-legal-work (accessed December 5, 2017).
O’Keeffe, G., & Clarke-Pearson, K. (2011). Clinical report: The impact of social media on
children, adolescents, and families. Pediatrics, 127(4), 800–804.
Oldroyd, J.B., & Morris, S.S. (2012). Catching falling stars: A human resource response to
social capital’s detrimental effect of information overload on star employees. Academy of
Management Review, 37(3), 396–418.
Or, C.K.L., & Duffy, V.G. (2007). Development of a facial skin temperature-based metho-
dology for non-intrusive mental workload measurement. Occupational Ergonomics, 7(2),
83–94.
O’Reilly, C. (1980). Individuals and information overload in organizations: Is more neces-
sarily better? The Academy of Management Journal, 23(4), 684–696.
O’Reilly, M. (1996). Internet addiction. A new disorder enters the medical lexicon. Cana-
dian Medical Association Journal, 154(12), 1882–1883.
Oulasvirta, A., Rattenbury, T., Ma, L., & Raita, E. (2012). Habits make smartphone use
more pervasive. Personal and Ubiquitous Computing, 16(1), 105–114.
Paas, F., Renkl, A., & Sweller, J. (2004). Cognitive load theory: Instructional implications of
the interaction between information structures and cognitive architecture. Instructional
Science, 32(1), 1–8.
Packard, M.G., Cahill, L., & McGaugh, J.L. (1994). Amygdala modulation of hippo-
campal-dependent and caudate nucleus-dependent memory processes. Proceedings of the
National Academy of Sciences of the United States of America, 91(18), 8477–8481.
Panger, M.A., Brooks, A., Richmond, B.G., & Wood, B. (2002). Older than the Oldowan?
Rethinking the emergence of hominin tool use. Evolutionary Anthropology, 11(6), 235–245.
Panksepp, J., Knutson, B., & Burgdorf, J. (2002). The role of brain emotional systems in
addictions: A neuro-evolutionary perspective and new “self-report” animal model.
Addiction, 97(4), 459–469.
Park, C.L., & Folkman, S. (1997). Meaning in the context of stress and coping. Review of
General Psychology, 1(2), 115–144.
Partala, T., & Surakka, V. (2003). Pupil size variation as an indication of affective processing.
International Journal of Human-Computer Studies, 59(1–2), 185–198.
Paro Robots (n.d.). Paro home page. Available at: www.parorobots.com/ (accessed
December 7, 2017).
Paul, S., & Nazareth, D.L. (2010). Input information complexity, perceived time pressure,
and information processing in GSS-based work groups: An experimental investigation
using a decision schema to alleviate information overload conditions. Decision Support
Systems, 49(1), 31–40.
Pavlidis, I., Dowdall, J., Sun, N., Puri, C., Fei, J., & Garbey, M. (2007). Interacting with
human physiology. Computer Vision and Image Understanding, 108(1–2), 150–170.
Pavlidis, I., & Levine, I. (2002). Thermal image analysis for polygraph testing. IEEE Engi-
neering in Medicine and Biology Magazine, 21(6), 56–64.
Pavlov, I.P. (Ed.) (1927). Conditioned reflexes: An investigation of the physiological activity of the
cerebral cortex. Translated by G.V. Anrep. Oxford: Oxford University Press.
Payne, J.W. (1976). Task complexity and contingent processing in decision making: An
information search and protocol analysis. Organizational Behavior and Human Performance,
16(2), 366–387.
Pearlson, K.E., Saunders, C.S., & Galletta, D.F. (Eds) (2016). Managing and using information
systems: A strategic approach. New York: John Wiley & Sons.
Pecchinenda, A., & Smith, C.A. (1996). The affective significance of skin conductance
activity during a difficult problem-solving task. Cognition and Emotion, 10(5), 481–503.
Pennebaker, J.W., & Beall, S. (1986). Confronting a traumatic event: Toward an under-
standing of inhibition and disease. Journal of Abnormal Psychology, 95(3), 274–281.
Pennington, R.R., Kelton, A.S., & DeVries, D.D. (2006). The effects of qualitative overload
on technology acceptance. Journal of Information Systems, 20(2), 25–36.
Piaget, J. (1951). Psychology of intelligence. London: Routledge and Kegan Paul.
Pluyter, J. R. (2012). Designing immersive surgical training against information technology-related
overload in the operating room. PhD Dissertation, Tilburg University.
Pluyter, J.R., Buzink, S.N., Rutkowski, A.-F., & Jakimowicz, J. (2010). Do absorption and
realistic distraction influence performance of component task surgical procedure? Surgical
Endoscopy, 24(4), 902–907.
Pluyter, J.R., Rutkowski, A.-F., & Jakimowicz, J.J. (2014). Immersive training: Breaking the
bubble and measuring the heat. Surgical Endoscopy, 28(5), 1545–1554.
Pluyter, J.R., Rutkowski, A.-F., Jakimowicz, J.J., & Saunders, C.S. (2012). Measuring users’
mental strain when performing technology based surgical tasks on a surgical simulator
using thermal imaging technology. Proceedings of the 45th Hawaii International Conference on
System Sciences (pp.2920–2926). Washington, DC: IEEE Computer Society.
Ponce de León, M.S., Golovanova, L., Doronichev, V., Romanova, G., Akazawa, T.,
Kondo, O., Ishida, H., & Zollikofer, C.P.E. (2008). Neanderthal brain size at birth pro-
vides insights into the evolution of human life history. Proceedings of the National Academy
of Sciences of the United States of America, 105(37), 13764–13768.
Popper, K. (Ed.) (1959). The logic of scientific discovery. London: Routledge.
Popper, K. (1978). Three worlds: The Tanner lecture on human values. Delivered at The
University of Michigan (April 7). Available at: https://tannerlectures.utah.edu/_docum
ents/a-to-z/p/popper80.pdf (accessed September 29, 2017).
Porter, G., & Kakabadse, N.K. (2006). HRM perspectives on addiction to technology and
work. Journal of Management Development, 25(6), 535–560.
Powers, W. (Ed.) (2010). Hamlet’s BlackBerry. New York: HarperCollins.
Pratarelli, M.E., Browne, B.L., & Johnson, K. (1999). The bits and bytes of computer/
Internet addiction: A factor analytic approach. Behavior Research Methods, Instruments, and
Computers, 31(2), 305–314.
Puri, C., Olson, L., Pavlidis, I., Levine, J., & Starren, J. (2005). StressCam: Non-
contact measurement of users’ emotional states through thermal imaging. Paper
presented at the Conference on Human Factors in Computing Systems, Portland,
OR, April 2–7.
Ragu-Nathan, T.S., Tarafdar, M., Ragu-Nathan, B.S., & Tu, Q. (2008). The consequences
of technostress for end users in organizations: Conceptual development and empirical
validation. Information Systems Research, 19(4), 417–433.
Revelle, W. (1994). Individual differences in personality and motivation: Non-cognitive
determinants of cognitive performance. In A. Baddeley & L. Weiskrantz (Eds), Attention:
Selection, awareness and control: A tribute to Donald Broadbent (pp.346–373). Oxford:
Clarendon Press.
Revsine, L. (1970). Data expansion and conceptual structure. Accounting Review, 45(4), 704–
712.
Reyes, M.L., Lee, J.D., Liang, Y., Hoffman, J.D., & Huang, R.W. (2009). Capturing driver
response to in-vehicle human-machine interface technologies using facial thermography.
Proceedings of the International Driving Symposium on Human Factors in Driver Assessment,
Training and Vehicle Design, 5, 536–542.
Riger, S. (1993). What’s wrong with empowerment? American Journal of Community Psy-
chology, 21(3), 279–292.
Rimé, B. (2009). Emotion elicits the social sharing of emotion: Theory and empirical
review. Emotion Review, 1(1), 60–85.
Rimé, B., Noël, P., & Philippot, P. (1991). Episode émotionnel, réminiscences cognitives et
réminiscences sociales. Cahiers Internationaux de Psychologie Sociale, 11, 93–104.
Robey, D., & Taggart, W.M. (1982). Human information processing in information and
decision support systems. MIS Quarterly, 6(2), 62–73.
Roche, H., Blumenschine, R.J., & Shea, J.J. (2009). Origins and adaptations of early Homo:
What archeology tells us. In F.E. Grine, J.G. Fleagle, & R.E. Leakey (Eds), The first
humans: Origin and early evolution of the genus Homo (pp.135–147). Dordrecht: Springer.
Roche, S.M., & McConkey, K.M. (1990). Absorption: Nature, assessment, and correlates.
Journal of Personality and Social Psychology, 59(1), 91–101.
Romanow, D., Cho, S., & Straub, D. (2012). Riding the wave: Past trends and future
directions for health IT research. MIS Quarterly, 36(3), 3–10.
Rose, J.M., Roberts, F.D., & Rose, A.M. (2004). Affective responses to financial data and
multimedia: The effects of information load on cognitive load. International Journal of
Accounting Information Systems, 5(1), 5–24.
Rosen, L.D., Carrier, M.L., & Cheever, N.A. (2013). Facebook and texting made me do
it: Media-induced task-switching while studying. Computers in Human Behavior, 29(3),
948–958.
Rosen, L.D., Cheever, N.A., & Carrier, L.M. (Eds) (2012). iDisorder: Understanding our
obsession with technology and overcoming its hold on us. New York: Palgrave Macmillan.
Rosen, L.D., Whaling, K., Rab, S., Carrier, L.M., & Cheever, N.A. (2013). Is Facebook
creating “iDisorders”? The link between clinical symptoms of psychiatric disorders and
technology use, attitudes and anxiety. Computers in Human Behavior, 29(3), 1243–1254.
Rosman, A., Biggs, S., Graham, L., & Bible, L. (2007). Successful audit workpaper review
strategies in electronic environments. Journal of Accounting, Auditing & Finance, 22(1),
57–83.
Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribu-
tion process. Advances in Experimental Social Psychology, 10, 173–220.
Rowe, R., Maughan, B., Moran, P., Ford, T., Briskman, J., & Goodman, R. (2010). The
role of callous and unemotional traits in the diagnosis of conduct disorder. Journal of Child
Psychology and Psychiatry, 51(6), 688–695.
Rutkowski, A.F. (2016). Work substitution: A neo-Luddite look at software growth. IEEE
Software, 33(3), 101–104.
Rutkowski, A.F., Rijsman, J.B., & Gergen, M. (2004). Paradoxical laughter at a victim as
communication with a non-victim. International Review of Social Psychology, 17(4), 5–11.
Rutkowski, A.F., & Saunders, C. (2010). Growing pains with information overload. IEEE
Computer, 43(6), 94–96.
Rutkowski, A.F., Saunders, C., & Hatton, L. (2013). The generational impact of software.
IEEE Software, 30(3), 87–89.
Rutkowski, A., Saunders, C., Wiener, M., & Smeulders, R. (2013). Intended usage of a
healthcare communication technology: Focusing on the role of IT-related overload.
Paper presented at the 34th International Conference on Information Systems, Milan,
Italy, December 15–18.
Rutkowski, A.F., & van Genuchten, M. (2008). No more reply-to-all. IEEE Computer,
41(7), 95–96.
Ryan, T., & Xenos, S. (2011). Who uses Facebook? An investigation into the relationship
between the Big Five, shyness, narcissism, loneliness, and Facebook usage. Computers in
Human Behavior, 27(5), 1658–1664.
Salanova, M., Llorens, S., & Cifre, E. (2013). The dark side of technologies: Technostress
among users of information and communication technologies. International Journal of Psy-
chology, 48(3), 422–436.
Sanz, C.M., & Morgan, D.B. (2013). Ecological and social correlates of chimpanzee tool
use. Philosophical Transactions of the Royal Society B: Biological Sciences, 368(1630).
doi:10.1098/rstb.2012.0416
Sarker, S., Ahuja, M., & Sarker, S. (2018). Work-life conflict of globally distributed software
development personnel: An empirical investigation using border theory. Information Sys-
tems Research, 29(1), 103–126.
Sarker, S., Sarker, S., & Jana, D. (2010). The impact of the nature of globally distributed
work arrangement on work–life conflict and valence: The Indian GSD professionals’
perspective. European Journal of Information Systems, 19(2), 209–222.
References 159

Sarker, S., Xiao, X., Sarker, S., & Ahuja, M. (2012). Managing employees’ use of mobile
technologies to minimize work-life balance impacts. MIS Quarterly Executive, 11(4),
143–157.
Saunders, C., Rutkowski, A.F., Pluyter, J., & Spanjers, R. (2016). Health information
technologies: From hazardous to the dark side. Journal of the Association for Information Sci-
ence and Technology, 67(7), 1767–1772.
Saunders, C.S., Van Slyke, C., & Vogel, D. (2004). My time or yours? Managing time
visions in global virtual teams. Academy of Management Executive, 18(1), 19–31.
Saunders, C., Wiener, M., Klett, S., & Sprenger, S. (2017). The impact of mental repre-
sentations on ICT-related overload in the use of mobile phones. Journal of Management
Information Systems, 34(3), 803–825.
Savage, T.S., & Wyman, J. (1843–1844). Observations on the external characters and habits
of Troglodytes niger, Geoff.; and on its organization. Boston Journal of Natural History, 4,
362–386.
Sax, M. (2016). Big data: Finders keepers, losers weepers? Ethics and Information Technology,
18(1), 25–31.
SBS6 (2009). Baby Mobile. (Dutch national TV news, October 4.)
Schachter, S. (Ed.) (1959). The psychology of affiliation. Stanford, CA: Stanford University Press.
Schaefer, A., & Philippot, P. (2005). Selective effects of emotion on the phenomenal char-
acteristics of autobiographical memories. Memory, 13(2), 148–160.
Schaefer, K.E., Adams, J.K., Cook, J.G., Bardwell-Owens, A., & Hancock, P.A. (2015). The
future of robotic design: Trends from the history of media representations. Ergonomics in
Design, 23(1), 13–19.
Schechner, S. (2017). Meet your new boss: An algorithm. The Wall Street Journal (December
10). Available at: https://www.wsj.com/articles/meet-your-new-boss-an-algorithm-
1512910800 (accessed December 14, 2017).
Scherer, K.R. (1994). Emotion serves to decouple stimulus and response. In P. Ekman & R.
J. Davidson (Eds), The nature of emotion: Fundamental questions (pp.127–130). New York:
Oxford University Press.
Schick, A.G., Gordon, L.A., & Haka, S. (1990). Information overload: A temporal approach.
Accounting, Organizations and Society, 15(3), 199–220.
Schijven, M.P., & Bemelman, W.A. (2011). Problems and pitfalls in modern competency-
based laparoscopic training. Surgical Endoscopy, 25(7), 2159–2163.
Schijven, M., & Jakimowicz, J. (2003). The learning curve on the Xitact LS 500 laparoscopy
simulator: Profiles of performance. Surgical Endoscopy, 18(1), 121–127.
Schlenker, B.R., & Leary, M.R. (1982). Social anxiety and self-presentation: A con-
ceptualization and model. Psychological Bulletin, 92(3), 641–669.
Schlotz, W., Hellhammer, J., Schulz, P., & Stone, A.A. (2004). Perceived work overload
and chronic worrying predict weekend-weekday differences in the cortisol awakening
response. Psychosomatic Medicine, 66(2), 207–214.
Schneider, S.C. (1987). Information overload: Causes and consequences. Human Systems
Management, 7(2), 143–153.
Schneider, W., & Fisk, A.D. (1982). Concurrent automatic and controlled visual search: Can
processing occur without resource cost? Journal of Experimental Psychology: Learning,
Memory, and Cognition, 8(4), 261–278.
Schroeder, R. (2014). Big data and the brave new world of social media research. Big Data &
Society, 1(2), 1–11.
Schultze, U., & Vandenbosch, B. (1998). Information overload in a groupware environment:
Now you see it, now you don’t. Journal of Organizational Computing and Electronic Com-
merce, 8(2), 127–148.
Schulze, L., Dziobek, I., Vater, A., Heekeren, H.R., Bajbouj, M., Renneberg, B., Heuser,
I., & Roepke, S. (2013). Gray matter abnormalities in patients with narcissistic personality
disorder. Journal of Psychiatric Research, 47(10), 1363–1369.
Schwab, D.P. (1980). Construct validity in organizational behavior. In B.M. Staw & L.L.
Cummings (Eds), Research in organizational behavior (Vol.2, pp.3–43). Greenwich, CT:
JAI Press.
Schwarz, N. (1990). Feelings as information: Informational and motivational functions
of affective states. In E.T. Higgins & R. Sorrentino (Eds), Handbook of motivation and
cognition: Foundations of social behavior (Vol.2, pp.527–561). New York: Guilford
Press.
Scoville, W.B., & Milner, B.J. (1957). Loss of recent memory after bilateral hippocampal
lesions. Journal of Neurology, Neurosurgery and Psychiatry, 20(1), 11–21.
Sergeeva, A., Huysman, M.H., & Faraj, S.A. (2016). Material enactment of work practices:
Zooming in on the practice of surgery with the Da Vinci robot. Paper presented at Ifip
WG8.2 Working Conference, Dublin, Ireland, December 9–10.
Sexton, J.B., Thomas, E.J., & Helmreich, R.L. (2000). Error, stress, and teamwork in med-
icine and aviation: Cross sectional surveys. British Medical Journal, 320(7237), 745–749.
Seymour, N.E. (2008). VR to OR: A review of the evidence that virtual reality simulation
improves operating room performance. World Journal of Surgery, 32(2), 182–188.
Shapira, N.A., Goldsmith, T.D., Keck, P.E., Khosla, U.M., & McElroy, S.L. (2000). Psy-
chiatric features of individuals with problematic Internet use. Journal of Affective Disorders,
57(1–3), 267–272.
Sharkey, N., & Sharkey, A. (2013). Robotic surgery: On the cutting edge of ethics. Com-
puter, 46(1), 56–64.
Shirom, A., Nirel, N., & Vinokur, A.D. (2006). Overload, autonomy, and burnout as
predictors of physicians’ quality of care. Journal of Occupational Health Psychology, 11(4),
328–342.
Shiv, B., & Fedorikhin, A. (1999). Heart and mind in conflict: The interplay of affect and
cognition in consumer decision making. Journal of Consumer Research, 26(3), 278–292.
Simnett, R. (1996). The effect of information selection, information processing and task
complexity on predictive accuracy of auditors. Accounting, Organizations and Society, 21(2),
699–719.
Simon, H.A. (1971). Designing organizations for an information-rich world. In M. Green-
berger (Ed.), Computers, communications, and the public interest (pp.37–72). Baltimore, MD:
The Johns Hopkins Press.
Simon, H.A. (1980). The behavioral and social sciences. Science, 209, 71–77.
Simon, H.A., & Newell, A. (1971). Human problem solving: The state of the theory in
1970. American Psychologist, 26(2), 145–159.
Simpson, C.W., & Prusak, L. (1995). Troubles with information overload: Moving from
quantity to quality in information provision. International Journal of Information Management,
15(6), 413–425.
Skinner, B.F. (1935). The generic nature of the concepts of stimulus and response. Journal of
General Psychology, 12(1), 40–65.
Skinner, B.F. (1985). Cognitive science and behaviourism. British Journal of Psychology, 76(3),
291–301.
Slagel, J.M., & Weinger, M.B. (2009). Effects of intraoperative reading on vigilance and
workload during anaesthesia care in an academic medical center. Anesthesiology, 110(2),
275–283.
Snowball, D. (1980). Some effects of accounting expertise and information load: An
empirical study. Accounting, Organizations, and Society, 5(3), 323–338.
Spada, M.M. (2014). An overview of problematic Internet use. Addictive Behaviors, 39(1),
3–6.
Spanjers, R.W.L. (2012). Be patient: A longitudinal study on adoption and diffusion of IT-
innovation in Dutch healthcare. Doctoral dissertation, Tilburg University.
Spanjers, R., & Rutkowski, A.F. (2005). The Telebaby® case. In J. Tan (Ed.), E-health care
information systems: An introduction for students and professionals (pp.27–36). San Francisco,
CA: Jossey-Bass.
Spanjers, R., Rutkowski, A.F., & Feuth, S. (2003). Telebaby: Live videostreaming from a
neonatal ward using Internet. Paper presented at the 9th Americas Conference on Infor-
mation Systems, Tampa, FL, August 4–6.
Spanjers, R., Rutkowski, A.F., & Van Genuchten, M. (2007). BabyMobile, virtual baby visit
at the hospital using UMTS. Paper presented at the 11th International Association of
Science and Technology for Development International Conference on Internet and
Multimedia Systems and Applications, Honolulu, HI, August 20–22.
Sparrow, P.R. (1999). Strategy and cognition: Understanding the role of management
knowledge structures, organizational memory and information overload. Creativity and
Innovation Management, 8(2), 140–148.
Speier, C., Valacich, J.S., & Vessey, I. (1999). The influence of task interruption on indi-
vidual decision making: An information overload perspective. Decision Sciences, 30(2),
337–360.
Spiekermann-Hoff, S., & Novotny, A. (2015). A vision for global privacy bridges: Technical
and legal measures for international data markets. Computer Law and Security Review, 31(2),
181–200.
Spink, A. (2004). Multitasking information behavior and information task switching: An
exploratory study. Journal of Documentation, 60(4), 336–351.
Spink, A., Cole, C., & Waller, M. (2008). Multitasking behavior. Annual Review of Informa-
tion Science and Technology, 42, 93–118.
Spitzer, M. (Ed.) (2012). Digitale Demenz: Wie wir uns und unsere Kinder um den Verstand
bringen. München: Droemer Knaur Verlag.
Squire, L.R., & Alvarez, P. (1995). Retrograde amnesia and memory consolidation: A neu-
robiological perspective. Current Opinion in Neurobiology, 5(2), 169–177.
Stahl, J.E., Egan, M.T., Goldman, J.M., Tenney, D., Wiklund, R.A., Sandberg, W.S.,
Gazelle, S., & Rattner, D.W. (2005). Introducing new technology into the operating
room: Measuring the impact on job performance and satisfaction. Surgery, 137(5),
518–526.
Stemberger, J., Allison, R.S., & Schnell, T. (2010). Thermal imaging as a way to classify
cognitive workload. Paper presented at the Canadian Conference on Computer and
Robot Vision, Ottawa, Canada, May 31–June 2.
Strasburger, V.C., & Hogan, M.J. (2013). Children, adolescents, and the media. Pediatrics,
132(5), 958–961.
Streufert, S. & Streufert, S.C. (Eds) (1978). Behavior in the complex environment. New York:
Wiley.
Subrahmanyam, K., Kraut, R., Greenfield, P., & Gross, E. (2000). The impact of home
computer use on children’s activities and development. The Future of Children – Children
and Computer Technology, 10(2), 123–144.
Sutcliffe, K.M., & Weick, K.E. (2008). Information overload revisited. In G.P. Hodgkinson
& W.H. Starbuck (Eds), The Oxford handbook of organizational decision making (pp.56–75).
Oxford: Oxford University Press.
Swain, M.R., & Haka, S.F. (2000). Effects of information load on capital budgeting deci-
sions. Behavioral Research in Accounting, 12(1), 171–198.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive
Science, 12(2), 257–285.
Sweller, J., Van Merrienboer, J., & Paas, F. (1998). Cognitive architecture and instructional
design. Educational Psychology Review, 10(3), 251–296.
Sykes, K., & Macnaghten, P. (2013). Opening up dialogue and debate. In R. Owen, J.
Bessant, & M. Heitz (Eds), Responsible innovation: Managing the responsible emergence of science
and innovation in society (pp.85–107). Chichester: John Wiley.
Szczepanski, S.M., & Knight, R.T. (2014). Insights into human behavior from lesions to the
prefrontal cortex. Neuron, 83(5), 1002–1018.
Tagg, R., Gandhi, P., & Srinivasan Kumaar, R. (2009). Recognizing work priorities and
tasks in incoming messages through personal ontologies supplemented by lexical clues.
Paper presented at the 17th European Conference on Information Systems, Verona, Italy,
8–10 June.
Tam, K.Y., & Ho, S.Y. (2005). Web personalization as a persuasion strategy: An elaboration
likelihood model perspective. Information Systems Research, 16(3), 271–291.
Tamir, D.I., & Mitchell, J.P. (2012). Disclosing information about the self is intrinsically
rewarding. Proceedings of the National Academy of Sciences, 109(21), 8038–8043.
Tarafdar, M., Beath, C.M., & Ross, J.W. (2017). Enterprise cognitive computing applications.
Working Paper No. 420. Cambridge, MA: MIT Center for Information Systems
Research (CISR).
Tarafdar, M., Pullins, E.B., & Ragu-Nathan, T.S. (2015). Technostress: Negative effect on
performance and possible mitigations. Information Systems Journal, 25(2), 103–132.
Tarafdar, M., Tu, Q., & Ragu-Nathan, T.S. (2010). Impact of technostress on end-user
satisfaction and performance. Journal of Management Information Systems, 27(3), 303–334.
Tarafdar, M., Tu, Q., Ragu-Nathan, B.S., & Ragu-Nathan, T.S. (2007). The impact of
technostress on role stress and productivity. Journal of Management Information Systems,
24(1), 301–328.
Tarafdar, M., Tu, Q., Ragu-Nathan, T.S., & Ragu-Nathan, B.S. (2011). Crossing to the
dark side: Examining creators, outcomes, and inhibitors of technostress. Communications of
the ACM, 54(9), 113–120.
Taylor, S.E., & Koivumaki, J.H. (1976). The perception of self and others: Acquaintance-
ship, affect, and actor-observer differences. Journal of Personality and Social Psychology, 33(4):
403–408.
Teasdale, J.D. (1993). Selective effects of emotion on information processing. In A. Baddeley
& L. Weiskrantz (Eds), Attention: Selection, awareness, and control: A tribute to Donald Broad-
bent (pp.374–389). Oxford: Clarendon Press.
Tellegen, A. (1981). Practicing the two disciplines of relaxation and enlightenment:
Comment on “Role of the feedback signal in electromyography biofeedback: The
relevance of attention” by Qualls and Sheehan. Journal of Experimental Psychology:
General, 110(2), 217–226.
Tennant, M. (Ed.) (1988). Psychology and adult learning. London: Routledge.
Terman, L.M. (1916). The uses of intelligence tests. In L.M. Terman (Ed.), The measurement
of intelligence: An explanation of and a complete guide for the use of the Stanford revision and
extension of the Binet-Simon Intelligence Scale (pp.3–21). Boston: Houghton Mifflin.
Thatcher, A., & Goolam, S. (2005). Development and psychometric properties of the Pro-
blematic Internet Use Questionnaire. South African Journal of Psychology, 35(4), 793–809.
Tingley, K. (2017). Learning to love our robot co-workers. The New York Times Magazine
(February 23), 30–32,58,63. Available at: https://www.nytimes.com/2017/02/23/maga
zine/learning-to-love-our-robot-co-workers.html (accessed November 6, 2017).
Toffler, A. (1970). Future shock. New York: Random House.
Tollner, A.M., Riley, M.A., Matthews, G., & Shockley, K.D. (2005). Divided attention
during adaptation to visual-motor rotation in an endoscopic surgery simulator. Cognition,
Technology and Work, 7(1), 6–13.
Tolman, E.C. (1948). Cognitive maps in rats and men. Psychological Review, 55(4), 189–208.
Tomkins, S.S. (1984). Affect theory. In K.R. Scherer and P. Ekman (Eds), Approaches to
emotion (pp.163–195). Hillsdale, NJ: Erlbaum.
Treisman, A. (1964). Selective attention in man. British Medical Bulletin, 20(1), 12–16.
Treisman, A., & Riley, J. (1969). Is selective attention selective perception or selective
response? A further test. Journal of Experimental Psychology, 79(1), 27–34.
Trompenaars, F., & Hampden-Turner, C. (Eds) (2011). Riding the waves of culture: Under-
standing diversity in global business. New York: Nicholas Brealey Publishing.
Tsai, H.Y., Compeau, D., & Haggerty, N. (2007). Of races to run and battles to be won:
Technical skill updating, stress, and coping of IT professionals. Human Resource Manage-
ment, 46(3), 395–409.
Tulving, E. (1962). Subjective organization in free recall of “unrelated” words. Psychological
Review, 69(4), 344–354.
Tulving, E. (1972). Episodic and semantic memory. In E. Tulving & W. Donaldson (Eds),
Organization of memory (pp.381–403). New York: Academic Press.
Tulving, E. (Ed.) (1983). Elements of episodic memory. New York: Oxford University Press.
Tulving, E. (2002). Episodic memory: From mind to brain. Annual Review of Psychology, 53, 1–25.
Turel, O., & Serenko, A. (2010). Is mobile email addiction overlooked? Communications of
the ACM, 53(5), 41–43.
Turkle, S. (Ed.) (2011). Alone together: Why we expect more from technology and less from each
other. New York: Basic Books.
Tushman, M.L., & Nadler, D.A. (1978). Information processing as an integrating concept in
organizational design. Academy of Management Review, 3(3), 613–624.
Tuttle, B., & Burton, F.G. (1999). The effects of a modest incentive on information over-
load in an investment analysis task. Accounting, Organizations and Society, 24(8), 673–687.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and
probability. Cognitive Psychology, 5(2), 207–232.
Ursin, H. (1980). Personality, activation and somatic health. In S. Levine & H. Ursin (Eds),
Coping and Health (NATO Conference Series III: Human factors, pp.259–280). New
York: Plenum.
Vaillant, G.E. (1977). Adaptation to life. Boston, MA: Little Brown.
Van Knippenberg, D., Dahlander, L., Haas, M.R., & George, G. (2015). Information,
attention, and decision making. Academy of Management Journal, 58(3), 649–657.
Vatsyayan, S.H. (1981). A sense of time: An exploration of time in theory, experience, and art.
Delhi: Oxford University Press.
Vezyridis, P., & Timmons, S. (2015). On the adoption of personal health records: Some
problematic issues for patient empowerment. Ethics and Information Technology, 17(2),
113–124.
Vinkers, C.H., Penning, R., Hellhammer, J., Verster, J.C., Klaessens, J.H., Olivier, B., &
Kalkman, C.J. (2013). The effect of stress on core and peripheral temperature. Stress,
16(5), 520–530.
Volkow, N.D., & Wise, R.A. (2005). How can drug addiction help us understand obesity?
Nature Neuroscience, 8(5), 555–560.
Volkskrant (2001). Big mother. (Dutch national newspaper, August 9.)
von Neumann, J. (Ed.) (1958). The computer and the brain. New Haven, CT: Yale University
Press.
Waller, M.J., Conte, J.M., Gibson, G., & Carpenter, A. (2001). The impact of
individual time perception on team performance under deadline conditions. Academy of
Management Review, 26(4), 586–600.
Wallis, C. (2006). The multitasking generation. Time Magazine, 167(13), 48–55.
Walter, N., Ortbach, K., & Niehaves, B. (2013). Great to have you here! Understanding and
designing social presence in information systems. Paper presented at the European Con-
ference on Information Systems, Utrecht, Netherlands, June 6–8.
Wang, K., Shu, Q., & Tu, Q. (2008). Technostress under different organizational
environments: An empirical investigation. Computers in Human Behavior, 24(6), 3002–
3013.
Warren, O.J., Leff, D.R., Athanasiou, T., Kennard, C., & Darzi, A. (2009). The neurocog-
nitive enhancement of surgeons: An ethical perspective. Journal of Surgical Research, 152(1),
167–172.
Watson, J.B. (1913). Psychology as the behaviorist views it. Psychological Review, 20(2),
158–177.
Weber, M. (Ed.) (1949). On the methodology of social sciences. Translated by E. Shils. New
York: Free Press.
Wegman, C. (Ed.) (1985). Psychoanalysis and cognitive psychology: A formalization of Freud’s
earliest theory. New York: Academic Press.
Whetten, D.A. (1989). What constitutes a theoretical contribution? Academy of Management
Review, 14(4), 490–495.
White, K.B. (1984). MIS project teams: An investigation of cognitive style implications. MIS
Quarterly, 8(2), 95–101.
White, K.B., & Leifer, R. (1986). Information systems development success: Perspectives
from project team participants. MIS Quarterly, 10(3), 215–223.
White, M., Hill, S., McGovern, P., Mills, C., & Smeaton, D. (2003). High-performance
management practices, working hours and work-life balance. British Journal of Industrial
Relations, 41(2), 175–195.
Wickens, C.D. (1980). The structure of attentional resources. In R. Nickerson (Ed.), Atten-
tion and Performance (Vol. VIII, pp.239–257). Hillsdale, NJ: Lawrence Erlbaum.
Wickens, C.D. (Ed.) (1992). Engineering psychology and human performance. New York:
HarperCollins.
Wiener, M., & Cram, W.A. (2017). Technology-enabled control: Effectiveness, socio-
emotional consequences, and ethical dilemmas. Paper presented at the 23rd Americas
Conference on Information Systems, Boston, MA, August 10–12.
Wikipedia (2017). Ignaz Semmelweis. Available at: https://en.wikipedia.org/wiki/Ignaz_
Semmelweis (accessed December 6, 2017).
Winograd, E., & Neisser, U. (Eds) (1992). Affect and accuracy in recall: Studies of “flashbulb”
memories. New York: Cambridge University Press.
Wolfert, S., Ge, L., Verdouw, C., & Bogaardt, M.J. (2017). Big data in smart farming: A
review. Agricultural Systems, 153, 69–80.
Wrangham, R.W. (Ed.) (1994). Chimpanzee cultures. Cambridge, MA: Harvard University
Press & Chicago Academy of Sciences.
Wylie, G. & Allport, A. (2000). Task switching and the measurement of “switch costs”.
Psychological Research, 63(3–4), 212–233.
Yellowlees, P.M., & Marks, S. (2007). Problematic Internet use or Internet addiction?
Computers in Human Behavior, 23(3), 1447–1453.
Yen, J.Y., Ko, C.H., Yen, C.F., Wu, H.Y., & Yang, M.J. (2007). The comorbid psychiatric
symptoms of Internet addiction: Attention deficit and hyperactivity disorder (ADHD),
depression, social phobia, and hostility. Journal of Adolescent Health, 41(1), 93–98.
Yesavage, J.A., Mumenthaler, M.S., Taylor, J.L., Friedman, L., O’Hara, R., Sheikh, J.,
Tinklenberg, J., & Whitehouse, P.J. (2002). Donepezil and flight simulator performance:
Effects on retention of complex skills. Neurology, 59(1), 123–125.
Young, K.S. (Ed.) (1998). Caught in the Net: How to recognize the signs of Internet addiction and a
winning strategy for recovery. New York: John Wiley.
Young, K.S. (1999). The evaluation and treatment of Internet addiction. In L. VandeCreek
& T. Jackson (Eds), Innovations in clinical practice: A source book (Vol.17, pp.19–31). Sarasota,
FL: Professional Resource Press.
Young, K.S., & Rogers, R.C. (1998). The relationship between depression and Internet
addiction. CyberPsychology & Behavior, 1(1), 25–28.
Yule, S., Flin, R., Paterson-Brown, S., & Maran, N. (2006). Non-technical skills for sur-
geons in the operating room: A review of the literature. Surgery, 139(2), 140–149.
Zajonc, R.B. (1980). Feeling and thinking: Preferences need no inferences. American Psy-
chologist, 35(2), 151–175.
Zheng, B., Cassera, M.A., Martinec, D.V., Spaun, G.O., & Swanstrom, L.L. (2010). Mea-
suring mental workload during the performance of advanced laparoscopic tasks. Surgical
Endoscopy, 24(1), 45–50.
INDEX

accidents, industrial and military 92, 94, 121
Attention Deficit Hyperactivity Disorder (ADHD) 14
algorithms 91, 120–122, 124, 125, 128
Amazon 123
American Academy of Pediatrics 74
amount illusion 7, 8, 45, 49, 52, 53, 57
Amsterdam 121
amygdala 25, 29
anaesthesiology 9, 90, 91
anxiety 14, 29, 30, 32, 43, 60, 62, 63, 67–71, 74, 75, 80, 88, 107, 116, 117
apps 3, 74, 108, 120, 121, 126–129
Aral, Sinan 101
Archytas of Tarentum 92
Ardipithecus 1
Aristotle 18, 31, 125
Artificial Intelligence (AI) 15, 23, 33, 92, 93, 119, 122
Artificial Intelligence Laboratory (MIT) 92
Asimov, Isaac 93, 124
associative models 21, 23, 29, 74, 118
automation 78, 90–92, 96, 98, 111, 130
autonomic nervous system 24, 112, 113
auto-reply 4, 34
Barcelona 121
behaviourism 20–25, 30–35, 69, 74
big bang 131
big data 119–122, 128, 130–133
‘Big Five’ personality traits 31, 70, 116
Biotechnology 126
BlackBerry 75, 88
blender metaphor 5–8
bottlenecks 7, 27, 42, 44, 46, 105
Bower, G.H. 29, 30, 43, 63, 69, 71, 107, 118
brain chip implants 126
brain injuries 25
brain load 2, 4, 8–10, 44, 46–51, 105, 110
brain overload 2, 3, 5–8, 11, 12, 18, 22, 23, 26, 34, 35, 43, 44, 46, 47, 51, 60, 126
Brain Reward System (BRS) 21, 25, 31–33, 66, 68, 72, 119, 127
Brisbane, Australia 121
Broadbent, D. 23, 26–28, 32, 42
Brookstone 121
Bryson, Bill 99, 100, 107, 118, 131
Buckinghamshire, Penn 130, 131
burnout 11, 43, 45, 51, 52, 79, 84, 86, 97, 124
Cannon, W.B. 20, 24
Capek, Karel 92
Caplan, S.E. 69, 72, 75
Carr, Nicholas 2, 89
Cawood, Andrew 81
Center for Internet Addiction 68
Central Nervous System (CNS) 24, 35
cerebral circuit 25
cerebral cortex 24
chunking 6, 7, 9, 10, 23, 27, 32, 34, 41, 42, 44, 45, 47, 50
cognitive absorption 110, 116
cognitive load 41, 42, 80, 105, 106
cognitive overload 45, 1–133
cognitivism 8, 20, 22, 23, 24, 26, 30–35, 38, 40–44, 57, 65, 110, 124
collaborative filtering 131
communication overload 39, 40, 105, 108
computationist models 23
congruence 9, 22, 23, 29, 32, 46, 49, 52, 63, 71, 111, 112, 114, 116
constructs 28, 42, 71, 72, 100, 101, 103–106, 108, 109, 112, 116, 132
contingency boundedness 45, 52, 53, 57
Damasio, A. 25, 26, 30, 44, 102, 110
Darwin, Charles 1, 102
Da Vinci® Surgical System 95
Davis, R.A. 71, 72, 75
deadlines 55
decision support systems (DSS) 38
de la Condamine, Charles Marie 99, 100
depression 13, 14, 52, 68, 69, 71–73
Descartes, René 18, 101
deskilling 91
Devol, George 92
Diagnostic and Statistical Manual of Mental Disorders (DSM) 14, 70, 71
Diderot 19, 101
digital footprint 122
Disneyland 120
driverless cars 91
Ebbinghaus, H. 19
Edison, Thomas 15
ego 32, 65, 70, 73, 74, 119, 128
email overload 4, 128
Emotional-Cognitive Overload (ECO) 6, 11, 44, 46, 48, 49, 52
Emotional-Cognitive Overload Model (ECOM) 5, 6, 12, 15, 35, 38, 44–46, 50–55, 57
emotions 8, 9, 12, 19, 20, 25, 29, 31–33, 35, 40, 43, 44, 46, 55, 57, 63–65, 67, 70, 75, 93, 118, 124, 127
Enterprise Cognitive Computing (ECC) 76, 77, 91, 92
episodic memory 28, 29, 45, 48, 50, 64, 116
ethics 2, 67, 122, 124–127
Everest, George 99
evolution 1, 15, 22, 27, 92, 119
experts 4, 6, 8, 10, 31, 32, 35, 45, 47–52, 81, 91, 106, 111, 113
explicit memory 28
eye-tracking 115
Facebook 56, 67, 68, 70, 72–74, 88, 109, 118, 129
facial temperature 112
fatigue 2, 4, 13, 25, 39, 43, 55, 111, 112
Fear of Missing Out (FOMO) 62, 63, 66, 80
Federal Aviation Administration 92
filtering 7, 8, 26, 27, 40, 42, 44–46, 73, 75, 81, 125, 126, 131
Fisher, Sir Ronald Aylmer 132
flow 23, 28, 37, 55, 61, 62, 111, 112
Folkman, S. 65
France 76, 86, 99, 130
Freud, Sigmund 32, 33, 65, 66
Frost, Robert 77, 128
full working memory model 27
functionalism 19, 20, 22, 63
Functional Magnetic Resonance Imaging (fMRI) 115, 116
functional neuroimaging techniques 26, 115
Galbraith, J.R. 54, 78
Galton, Sir Francis 102
galvanic skin response (GSR) 100, 113, 114
gamification 119, 120, 128
Gardner, H. 18, 19, 23, 31, 101
General Motors (GM) 92, 94
germ theory 17, 18
Great Trigonometrical Survey 99
Harris, Tristan 2, 75, 128
Hawking, Stephen 119
Health Information Technology (HIT) 90, 91, 95, 120, 124
hemispherical specialization 38
hemispheric encoding 116
heuristics 23, 32, 35, 41, 47, 48, 63, 64
Hipparchus of Nicaea 100
homeostasis 24, 25, 36, 47, 64, 66, 111, 112
Honda 92
Hoyle, Fred 131
hyperconnectivity 13, 61, 62, 119
id 65
iDisorders 2, 62, 68, 69, 71, 73
implicit memory 28
individual differences 7, 9, 31, 37, 44, 46, 52, 53, 55, 102–104
industrial revolution 81
information processing (IP) 40, 41, 43, 44, 46, 49, 53
information processing capacity (IPC) 31, 32, 34, 39–41, 43, 44, 54, 56, 78, 79, 82, 89
Instagram 67, 72, 73
interactive cognitive subsystems (ICS) 30
International Space Station 123
Internet 3, 12–14, 58–61, 66–72, 74, 86–88, 98, 106, 107, 119, 121, 132
Internet of Things 121, 132
introspection 19, 102, 115
IQ tests 103
IT addiction 1–3, 5, 7, 12–15, 22, 35, 61, 62, 66–72, 74, 75, 77, 78, 86–89, 98, 128–130
IT-related overload 1, 2, 4, 13, 15, 20, 33, 34, 36–57, 77–79, 82, 87, 89, 99–117, 128, 129, 131
Jacoby, J. 35, 41, 42, 105, 109
James, W. 19, 20, 63
Kant, Immanuel 18, 19, 23, 32, 101, 102
Kiva Systems 123
Knight Rider 93
Kohlberg, L. 124, 125
Köhler, W. 1
Krugman, Paul 89
Kuhn, T.S. 18
Lambton, William 99, 100
Laparoscopic Surgical Skills (LSS) 113–115
laughter 65–66
Laws of Robotics (Asimov) 93
Lazarus, R.S. 65
Leavitt, H.J. 77
Leonardo da Vinci 92
‘Like’ buttons 118, 119, 129
limbic system 24–26, 28, 32, 66, 116
LinkedIn 73, 88
Lister, Joseph 17
logic theory 23
loneliness 59, 67–69, 71–73
long-term memory (LTM) 25, 28, 29, 34, 41, 42, 45–49, 63, 69, 74
Luddites 89, 90
lying 13, 22
management information systems (MIS) 38–42
Mason, R.O. 38, 43
mentalism 31, 32
metabolic equivalent of tasks (METs) 100, 113, 114
Metz, Rachel 2
Miller, G.A. 9, 23, 27, 34, 40, 44
mind-body supervenience 18–20, 23, 24, 26, 32, 66, 101, 119
mindfulness 2, 4, 15, 34, 74, 75, 83, 98, 122, 125, 126, 128–130, 132, 133
mind-gut 19, 32, 35, 110
minimally invasive surgery 110
Minnesota experiments 38
Mitroff, J. 38, 43
mobile mindset study 67
modal model 27–29
Mohan, Geoffrey 73
monochronicity 55
multitasking 12–14, 50, 56, 57, 67, 73, 91, 106
Myers-Briggs personality types 38
nanotechnology 126
narcissism 31, 35, 68, 70, 72, 73, 102, 127
natural selection 102
Neanderthals 1, 15
need for cognition (NFC) 47, 48, 53, 107, 116
neo-Luddites 89
net generation 13, 14, 56, 57, 70
Netherlands 11, 48, 53, 58, 60, 76, 91, 121, 127
neurohormones 25
neuroticism 31, 116
Newton, Sir Isaac 118
Nexus A.I. 121
Nielsen 81
non-experts 10, 48–50, 111
Norman, D.A. 27, 28
objective time 55
objectivity 47, 55, 101–105, 109, 111, 116
Occupational Safety and Health Administration 93
online baby system (OBS) 58–64, 66–68, 71, 72, 74, 75
organizational design 54, 77–79, 81, 85, 88, 96–98, 129
Outlook 126
over-connectivity 31, 58, 68, 72
oxytocin 24, 25, 66, 72
paradigm shifts 18, 19, 101
Paradise Pier hotels 120
PARO 123, 124
Pasteur, Louis 17
pathological internet use (PIU) 13, 14, 67–72, 74, 75
pattern recognition 27, 42
Pavlov, Ivan 21
Peacock-Edwards, Rick 35
Pearson correlation test 113
peripheral nervous system (PNS) 24
personality traits and disorders 8, 22, 24, 31, 32, 34, 47, 48, 53, 61, 67–70, 75, 102, 103, 110, 116, 117
pertinence 6–10, 27, 29, 37, 40, 42–46, 48, 49, 52, 54, 57, 63, 75, 92, 93, 119, 121, 123, 126
phantom vibration syndrome 2, 73
phenomenological sociology 102
pilots 8, 35, 126
Platform for Privacy Preferences 123
polychronicity 55, 56
Popper, K. 18, 101, 102, 107, 130
positron emission tomography 26, 115
prefrontal cortex (PFC) 25, 26, 28, 32, 102, 115, 116
prior experience of ECO (PECO) 48–50, 53
privacy 73, 90, 122, 123, 127, 130
psychoanalysis 31, 32, 64, 70
psychometrics 102–105, 107, 113
Puerperal Fever 17
qualitative overload 39, 43, 104, 105
quantum computing 122
requests to use IT 11
Revelle, W. 31
robots 78, 89, 90, 92–96, 98, 119, 123–125, 128, 129, 133
Rosenstein, Justin 118, 119, 128
Royal Dutch Shell 121
Savage, T.S. 1
schemata 19, 23, 26–29, 34, 41–45, 48, 49, 53, 57, 63–65, 69, 101, 107, 117, 126
Scherer, K.R. 20
scientific revolution 18
second brain 35, 110
self-driving 91, 124
self-serving attribution bias 49, 108, 109
semantic memory 28, 29, 48
Semmelweis, Ignaz 17, 18
SenseWear BodyMedia system 113
sensory memory 27, 28
Seoul 121
September 11 attacks (9/11) 66
seven (magical number) 9, 27, 34, 40, 41
short-term memory (STM) 28, 42, 45
Simbionix LAP Mentor 113
smart farming 122
smartphones 2, 3, 10, 11, 13, 14, 37, 51, 57, 58, 67, 73, 75, 88, 91, 121, 128, 129, 132
social networking systems (SNS) 3, 12, 14, 66, 69–74, 86, 98, 128
social phobia 14, 69
Songdo City 122
South Africa 94
South Korea 122, 130
Spitzer, M. 73
Stanford-Binet Intelligence Scale (SBIS) 103
Star Trek: The Next Generation 93
Steve Jobs schools 73
stimulus–response (S–R) 21, 22, 24, 63
subjective time 55
subjectivity 20, 30, 55, 101, 102, 104, 109, 111, 114, 116
superchunking 10, 27, 32, 47, 116, 117
supervenience 18–20, 23, 24, 26, 32, 34, 36, 64, 66, 73, 89, 101, 107, 110, 116, 119, 131
suppressed emotions 64
Survey of India see Great Trigonometrical Survey
Sweller, J. 34, 42, 47, 105, 106
task-switching 28, 52, 54, 56, 57, 67, 80, 82, 87
technophilia 61, 69, 73, 129
technophobia 61
techno-strain 89
technostress 2, 11, 12, 14, 39, 50, 61, 78, 86, 87, 89, 97, 98, 126, 128
Tesla 124
Thermoview 8300 camera 111
three worlds theory (Popper) 101
time management 74
tools 1–3, 15, 39, 79, 91, 92, 98, 106, 111, 115, 116, 119
transcranial magnetic stimulation 126
trauma 25, 63–65
triangulation 99, 100, 110, 112–117
Turkle, S. 68
Twitter 5, 73, 88, 109
Uber 120, 121
underload 7–9, 49, 95, 104
United States of America (USA) 27, 77, 91–94, 122, 130
valence 7, 9, 21, 29, 30, 43, 46, 50–52, 64, 65, 74
Vallor, Shannon 124
Watson, J.B. 20
Waugh, Andrew Scott 99
withdrawal 12–14, 35, 61, 66, 68, 88
working day 77, 82, 83, 88
work-family conflict 16, 83–85, 87, 88, 97, 129
working memory (WM) 26–30
work-life balance 16, 76, 78, 81–86, 98, 128, 132
Work-Life Balance campaign 85
Wundt, Wilhelm 19
Wyman, J. 1
Young, Kimberly 68
