Article

The Limitations of Decision-Making

Capgemini UK, Forge End, Woking, Surrey GU21 6DB, UK
Submission received: 11 October 2020 / Revised: 20 November 2020 / Accepted: 25 November 2020 / Published: 29 November 2020
(This article belongs to the Special Issue Artificial Intelligence and Decision Support Systems)

Abstract

In a world faced with technological, health and environmental change and uncertainty, decision-making is challenging. In addition, decision-making itself is becoming a collaborative activity between people and artificial intelligence. This paper analyses decision-making as a form of information processing, using the ideas of information evolution. Information evolution studies the effect of selection pressures and change on information processing and the consequent limitations of that processing. The analysis identifies underlying information evolution factors that affect the quality of information used throughout decision-making and, hence, affect the quality of decisions. These factors imply a set of challenges in which the pressures that drive useful trade-offs in a static environment also hinder decision-making of the required quality in times of change. The analysis indicates the information evolution characteristics of a good decision-making approach and establishes the theoretical basis for tools to demonstrate the information evolution limitations of decision-making.

1. Introduction

We live in a time of rapid and unprecedented change. Over recent years, the world has faced major uncertainties caused by threats to the economy (the financial crisis), our health (the COVID-19 pandemic) and the environment (climate change). Authors such as Taleb [1] have pointed out the difficulties that these changes imply for decision-making. But now, the increasing adoption of artificial intelligence (AI) [2] is adding extra complexity as smarter technology is integrated into the decision-making process.
Many papers have discussed the difficulties of change—there is a large body of literature addressed to the study of organisational change (see [3], for example). Decision-making, however, is a form of information processing, so we can apply the ideas of information evolution [4], which studies the effects of change on information processing.
Information processing is not just what computer systems or individual people do. For organisations and teams within them, information processing is a complex combination of activities by individuals, groups and technology (increasingly including AI) subject to a range of incentives and constraints [4]. Information evolution applies to information processing in this wider sense and helps to understand the factors that drive different levels of information quality. Poor information quality translates into poor decisions and it is important to understand how quality can be lost or improved in complex decision-making environments.
This paper analyses these topics using the techniques of information evolution and addresses two questions:
  • How does information evolution apply to decision-making?
  • With its emphasis on information quality, does information evolution analysis have the potential to improve the quality of decision-making?
Section 2 provides a summary of the key ideas and principles of information evolution, derived from [4,5,6], that affect decision-making. These are based on a simple underlying idea—the combinatorial challenge associated with information processing is considerable because of the large number of potential states that may need to be processed. This means that information processing always involves trade-offs and shortcuts that affect the pace, quality and friction of the processing. But when the environment changes, these trade-offs may not apply in the same way. Information processing always has some limitations, but these are exacerbated by changes in the environment.
Decision-making has been widely studied in many disciplines. Section 3 reviews some decision-making approaches in fields ranging from philosophy to business, machine learning to software development. The discussion links these approaches to information evolution.
Section 4 frames decision-making in terms of information evolution. This highlights the importance of information quality in relation to decision-making and the impact on decision-making if the level of quality is not understood. The analysis shows that information evolution generates a set of challenges that will diminish the quality of decision-making unless they are addressed. The forces that drive good trade-offs in a static environment also hinder decision-making of the required quality in times of change.
Section 5 relates these challenges to the underlying factors that drive them. These underlying factors are related to:
  • The nature of external selection pressures (those created by the environment of the organisation);
  • The nature of internal selection pressures (the incentives and constraints generated by the organisation) and how well they match the external selection pressures;
  • The information processing conventions for decision-making that arise as a result of these selection pressures and how well they recognise the quality of the information that they use or produce;
  • The nature of the information structures used in decision-making (where these are information structures used in the wider information processing sense described above, not just data structures embedded in technology);
  • The nature of the decision-making process and its compatibility with the need to improve information quality when the quality is deficient;
  • The human and technological capabilities used in decision-making and how well they are integrated to maintain the level of pace, friction and information quality.
This analysis also highlights the information evolution characteristics of a good decision-making approach—it should be sceptical, discriminating, iterative, capable and mature (where these terms all have specific meanings in terms of information evolution).
The paper concludes that information evolution provides a theoretical basis for analysing the limitations of decision-making as information processing. The analysis identifies ways in which information evolution has the potential to improve decision-making and several avenues for further research are identified.

2. An Overview of Information Evolution

As Christian says [7] “We have seen that all living organisms are informavores. They collect information, process it, and act on it.” As Buonomano [8] says: “the brain is, at its core, a prediction or anticipation machine”. More generally, Interacting Entities (IEs)—like people, animals, organisations, political parties, teams or computer systems—interact with their environments to achieve favourable outcomes either directly or via other entities. To improve the likelihood of favourable outcomes and reduce the likelihood of unfavourable outcomes, they need to be able to connect environment states with potential actions and subsequent outcomes. Information, in a variety of forms of information artefact, enables this connection. The nature of interactions affects the favourability of outcomes—firms try to make profits, people try to find partners, political parties try to get elected. To improve the favourability, IEs need to connect states of the environment (using descriptive information) with expected outcomes (using predictive information) and the actions needed to achieve the outcomes (using prescriptive information). It is this set of connections that gives meaning to information, not just in terms of recognition (what is a “pandemic”?) but in terms of wider connections (what is “pandemic” connected with and what outcomes are related to it?).
The combinatorial challenge associated with these connections is huge [9] so IEs make trade-offs and use shortcuts. The selection pressures of the environment (i.e., everything outside the IE that affects any interaction) determine the outcome of interactions and the effectiveness of the trade-offs. So long as the trade-offs are good enough then the IE will be able to achieve favourable outcomes.
IEs in different environments are subject to different selection pressures and so different information ecosystems develop, each with their own conventions that embed the trade-offs and how information is processed. We call them just “ecosystems” where the meaning is clear. Examples include the separate worlds of Finance Managers, banking systems, mathematicians and graph theorists and, as these examples show, IEs may belong to more than one ecosystem and ecosystems may be nested and more specialised depending on the selection pressures under consideration.
But what is the nature of the connections? Call a slice a contiguous subset of space–time. A slice can correspond to an entity at a point in time (or more properly within a very short interval of time), a fixed piece of space over a fixed period of time or, much more generally, an event that moves through space and time. This definition allows for great flexibility in discussing information. For example, slices are sufficiently general to support a common discussion of nouns and verbs, the past and the future.
Slices are also flexible enough to allow a discussion of physical events (which are slices in themselves) and abstractions (e.g., “four-colouring”) because each of the latter is represented by the set of slices containing the symbolic representations recognised by the corresponding ecosystem (graph theorists in this case). This distinction is examined in much more detail in [5,10,11].
Information corresponding to ecosystem conventions is called content with respect to the ecosystem. Content is structured in terms of chunks and assertions. A chunk specifies a constraint on sets of slices (e.g., “John”, “lives in Rome”, “four-colouring”). An assertion hypothesises a relationship between constraints (e.g., “John lives in Rome”) that may or may not correspond to an actual relationship. Pieces of information are connected in an associative model (for example, Quine’s “field of force whose boundary conditions are experience” [12], the World Wide Web, or Kahneman’s “associative memory” [13]) with the nature of the connections determined by relevant ecosystem conventions and constraints.
The effectiveness of information in supporting favourable outcomes is determined by three measures: pace, friction and quality [5]. Different ecosystem conventions embed different trade-offs between these. Since humans are instinctively poor at assessing information quality [13] we often need to be explicit in analysing it [10,11]. Many ecosystems (e.g., science, the law, software development) have conventions about how quality is addressed involving elements of review or challenge.
Sufficient quality is required to discriminate between options that have different outcomes. This was recognised by Bateson in his definition of information [14] as “the difference that makes a difference”. Generally, this means that the quality required for different purposes can vary significantly between different environment properties. Some distinctions can be ignored but others merit considerable rigour. Floridi [15] characterises quality in this sense as “fitness for purpose”.
Fitness measures how effectively an IE can achieve favourable outcomes in its environment. There are three levels of fitness as follows:
  • level 1 (narrow fitness): associated with a single interaction;
  • level 2 (broad fitness): associated with multiple interactions (of the same or different types) and the consequent need to manage and prioritise resources between the different types—this is the type of fitness linked to specialisation, for example;
  • level 3 (adaptiveness): associated with environment change and the consequent need to adapt.
In this paper, we are most concerned with the relationship between steady-state pressures and environmental change, so we combine levels 1 and 2 into static fitness and contrast this with adaptiveness (level 3).
We can expect adaptiveness to be difficult—change suffers from a general phenomenon called ecosystem inertia [4]. Ecosystem conventions take time to develop and may not keep up with the rate of change in the environment. Existing conventions may not suit the changed environment and may reduce an IE’s chances of favourable outcomes. There are examples of this in many disciplines, for example:
  • Kuhn’s discussion of paradigm shifts in science [16];
  • “change resistance” in organisations—for example [17]: “one of the most baffling and recalcitrant of the problems which business executives face is employee resistance to change”;
  • the “digital divide” [18] as some people find it difficult to keep up with changing digital technology.
Selection pressures may trigger different responses to the combinatorial challenge, each resulting in particular trade-offs and shortcuts. Since change may involve chaotic events or, in a more recent discussion, “Black Swans” [1], it is important to understand the limitations that these trade-offs and shortcuts imply. Information evolution ideas enable an analysis of these types of limitation. This approach has been applied to several diverse topics, including different measures of information (like truth, for example) [6], meaning [10], inference [11], digital transformation [4,19] and artificial intelligence [9]. These different analyses focus on the concept of a viewpoint—the way in which trade-offs and shortcuts manifest themselves in information processing.
Examples of viewpoints are everywhere in everyday life. In a court case, the prosecution and defence come to quite different conclusions based on the same information but different types of reasoning. The same is true of political parties. In businesses, the business processes, technology and roles played by people define the viewpoint of the business to the inputs it receives.
Figure 1 describes viewpoints using a diagramming technique defined in [4] based loosely on semantic nets and linked to the underlying model of information evolution described in Appendix A (and [4,5,6]).
The diagram describes the relationship between the following ideas that are used in subsequent sections:
  • a viewpoint combines descriptive, prescriptive and predictive perspectives in which each perspective is a pattern, corresponding to ecosystem conventions, that constrains the nature of information processing and the way in which it is structured;
  • any viewpoint has a domain of applicability—the set of environment states in which it can be reliably applied;
  • an IE (which could be a person, a team, a computer system or anything else that processes information) implements a viewpoint and its capability determines how the information artefacts are interpreted and the quality that can be achieved.
We can summarise some of the key elements of this discussion (and [4,5,6]) in the set of principles contained in Table 1.

3. Decision-Making Approaches and Information Evolution

In this section, we consider a range of approaches to decision-making. This is not an exhaustive list but highlights some aspects of decision-making and their relationship with information evolution.
In the world of psychology, Kahneman [13] discusses how humans respond to information and the biases that are likely to affect their decision-making. At the heart of the problem, from an information evolution perspective, is that the instinctive approach of people (which Kahneman calls “System 1”; see [20] for a more recent discussion) is blind to information quality. Based on this analysis, Kahneman et al. [21] identify a list of questions that can be used to help avoid bias in business decision-making.
In his discussion of “Black Swan” events, Taleb [1] focuses on the nature of the decision-making ecosystem, especially for rare, high-impact events (Black Swans), considering the impact of poor information quality. As he says: “… to make a decision you need to focus on the consequences (which you can know) rather than the probability (which you can’t know) …”. But, as some of the examples below demonstrate, knowledge of accurate probabilities is often assumed.
In the world of philosophy, there has been extensive debate about decision-making (see, for example, [22]). Concepts like expected utility and causal decision theory have been analysed but these concepts do not address all of the decision-making components included in Figure 1 and Figure 2. Because they rely on probabilities, they are subject to the problems that Taleb discusses.
In the philosophy of science, there has been considerable analysis of what Quine [12] calls “contrary experiences” in science and the decisions that they provoke. Popper’s emphasis on refutation [23] proposes attributes of the decision-making ecosystem itself while Kuhn [16] describes the impact of ecosystem inertia on that ecosystem. In the information evolution approach, these are not mutually incompatible—instead, they are different perspectives of an underlying model.
In his critical appraisal of the AI technique called deep learning, Marcus [24] highlights the limitations of deep learning in an era of change (“deep learning presumes a largely stable world”) when information quality is poor (“in problems where data are limited, deep learning often is not an ideal solution”). Moreover, he highlights an inappropriate set of selection pressures imposed by the machine learning community itself (“life is not a Kaggle competition”), which do not accurately represent real life.
Addressing a similar topic from a different direction, Pearl [25] demonstrates the need for causal models in decision-making as well as data and statistics. Pearl addresses “the unfortunate failure to address causation” in the world of data and statistics and highlights some of the shortcomings.
Although there is considerable academic debate about its merits [26], Maslow’s hierarchy of needs [27] presents a point of view in which decision-making is driven by needs (linked to desired outcomes) and particular needs (e.g., hunger) take precedence over others (e.g., what he calls self-actualisation).
There are numerous examples of the application of mathematics to decision-making in animals (e.g., [28]) as well as humans. Friedkin et al. [29] provide a summary of the application of mathematical techniques to group decision-making and contrast mathematical optimisation against consensus building. As they say: “Flawed decisions may be generated by an unrealistic optimization model or by an informal social process of group deliberation. Successful decisions may be generated by both.”
In politics, there is an emphasis on the creation of specific information ecosystems to tackle complex decisions. For example, in the UK, [30] describes the so-called COBRA Civil Contingencies Committee (the abbreviation was derived from “Cabinet Office Briefing Room A”) as follows: “COBRA […] is convened to handle matters of national emergency or major disruption. Its purpose is to coordinate different departments and agencies in response to such emergencies.” In other words, it involves the creation of a specific ecosystem, separate from normal government ecosystems, to tackle events that they are not designed for.
The idea of planning for various operational contingencies lies at the heart of business continuity thinking [31]. At a more strategic level, Wilkinson and Kupers [32] discuss the importance of scenario planning as a tool to help break the habit “of assuming that the future will look much like the present” and avoiding relying on models based on poor quality information. Business continuity planning and scenario planning, in this sense, are specifically designed to tackle ecosystem inertia.
Of course, decision-making is critical to the world of business in general and the management literature has much advice on decision-making. For example:
  • The design of governance bodies [4] and approval levels places responsibility for different decisions with different bodies. Each body will form its own ecosystem.
  • Lovallo et al. [33] analyse the selection pressures on managers and ask why managers in large hierarchical organisations are so risk-averse. They conclude that “CEOs are evaluated on their long-term performance, but managers at lower levels essentially bet their careers on every decision they make—even if outcomes are negligible to the corporation as a whole”.
  • Wilson and Daugherty [34] discuss the relationship between humans and AI—highlighting the importance of integrating human and AI ecosystems.
Authors have also focused on the specific difficulties of decision-making in times of change:
  • Heifetz et al. [35] discuss changes to the ecosystems required to respond to change.
  • Webb [36] addresses ecosystem inertia (“leadership teams get caught in a cycle of addressing long-term risk with rigid short-term solutions, and in the process, they invite entropy”) and recommends that “deep uncertainty merits deep questions”—these are the questions that challenge ecosystem conventions.
  • Hopper and Spetzler [37] address information quality directly: “the experts […] aren’t asked to provide a comprehensive view […] With exquisite mock precision, they describe these highly specific futures, shrugging off uncertainty on the grounds that the future is ultimately unknowable”.
In their analysis of culture [38], Schein and Schein define culture in the following way:
“The culture of a group can be defined as the accumulated shared learning of that group as it solves its problems of external adaptation and internal integration; which has worked well enough to be considered valid […] This accumulated learning is a pattern or system of beliefs, values, and behavioral norms that come to be taken for granted as basic assumptions and eventually drop out of awareness.”
This definition covers selection pressures (“external adaptation”), ecosystem maturity (“internal integration”), good enough tradeoffs (“worked well enough”) and ecosystem inertia (“taken for granted”, “eventually drop out of awareness”).
In [39], Christian and Griffiths address the impact of time on decision-making: “But in practice, when the clock—or the ticker—is ticking, few aspects of decision-making (or of thinking more generally) are as important as one: when to stop.” In their analysis, they focus on decision-making as a process rather than a single event and analyse the balance between the improvement of information quality and the timing of a decision.
In the world of information technology (IT), the authors of the Agile manifesto [40] describe the priorities of the Agile software development approach in the following way:
  • “Individuals and interactions over processes and tools”—creating a mature information ecosystem;
  • “Customer collaboration over contract negotiation”—developing an integrated ecosystem;
  • “Responding to change over following a plan”—being adaptive.
These priorities determine how decisions about software development are taken.

4. Information Quality and Decision-Making

Section 2 describes the trade-offs and shortcuts that information evolution imposes on information processing. This section introduces a representation of decision-making in terms of information evolution and considers the impact on decisions of the trade-offs and shortcuts. This highlights the importance of information quality in relation to decision-making and the impact on decision-making if the level of quality is not understood. The analysis implies a set of specific decision-making challenges based on the level of information quality supported.
First, consider decision-making in terms of information evolution. Suppose that an IE (any kind of IE) is deciding the state of a target (which could be itself, another IE, or anything else). This means that the decision-making process can include information about any or all of the following:
  • the current state and history of the IE (because this determines what it can do);
  • the current state and history of the target;
  • the current state and history of the environment (where this includes anything else that might be relevant).
Figure 2 shows the generic process by which this information is turned into a decision using the diagram convention described in the Appendix A.
Figure 2 raises several questions. How should the information available be processed—what viewpoint or viewpoints should be used? There may be several viewpoints available and, often, there will be a business-as-usual viewpoint corresponding to the normal process taken. But, in addition, there may be several scenarios identified (where the term “scenario” is taken from scenario planning [32]). Each of these is a hypothesis of the form: if the environment has particular properties (for example, trends of a particular type), then some specific types of outcomes are possible and these will require actions of a particular type. In this case, what information is needed to make the decision and how can the decision-making IE discriminate between potential outcomes? Given the nature of the viewpoint, is a different decision-making IE required to make the decisions? The answers to these questions depend on having sufficiently good quality information (where the components of information quality are discussed in [6]).
But, selection pressures generate an approach to quality that is good enough for a particular environment and that may not be sufficient in a changed environment. The problem is exacerbated, especially in times of change, because people have particular limitations with respect to information quality.
Simon [41] addresses this in his concept of bounded rationality. As described in [42], this is “the theory that when people make decisions, their rationality is limited (bounded) by the difficulty of the decision problem, their own cognitive limitations, and the time available to make the decision.” Simon coined the term “satisficing” to capture the idea of doing what is good enough. Many authors, starting with Ferrero [43] and then Zipf [44], have captured a related idea—the “principle of least effort”—especially as it concerns searching for information.
Kahneman [13] captures another important principle. In relation to his System 1 (the automatic processing of information in humans), he says: “what you see is all there is”. In other words, people instinctively deal with the information in front of them rather than assessing its quality.
The introduction of more technology like deep learning will not necessarily help. As the quotes from Marcus [24] above demonstrate, deep learning is not resilient to changes in the environment.
So, in times of change, an understanding of information quality is important in decision-making but may not be available. The following sections analyse this question starting with an analysis of ecosystem inertia.

4.1. Ecosystem Inertia

For decision-making to be effective, an IE needs to be able to discriminate between options. Discrimination requires each element of Figure 1 and Figure 2 to be of sufficient quality with respect to those differences. Under normal circumstances, when ecosystems have had time to mature, quality is built into the conventions (otherwise the IEs would not succeed). But as Floridi [15] points out, quality is about “fitness for purpose” and when the environment is changing then the conventions do not necessarily apply because they may apply to a different purpose. The need for discrimination often gets lost in the development of ecosystem conventions as the Schein and Schein [38] quotation above demonstrates.
The Ecosystem Inertia Principle (see Table 1) means that ecosystem conventions lag behind the needs of the environment in times of change. So, decision-making in response to change may not have the luxury of the time needed to develop a sufficiently mature ecosystem.
Worse than this, static fitness will create pressure to use existing conventions with minimal change. But a change in the environment may challenge all of the conventions that support the full decision-making process and the information it uses. If existing conventions do not include a suitable assessment of information quality, then it may not even be apparent that the existing conventions are not appropriate.
This means that we have the following general challenge.
Ecosystem Inertia Challenge: Ecosystem inertia means that decision-making IEs may not have time to respond to change before a decision is required. Static fitness will create pressure to use existing conventions with minimal change, but these conventions may not be appropriate, and their limitations may not be recognised.
Ecosystem inertia sets the general framework, but we can be more specific about some of its implications.
Ecosystem inertia implies that it is not necessarily possible to make a rigorous decision based on high-quality information. Information of sufficient quality needed to discriminate accurately may take time to develop. While this is happening, it is important not to close off options that may be needed in the future. But static fitness will encourage a focus only on the initially chosen option with the result that others may be inadvertently closed off.
This leads to the following challenge.
The Open Options Challenge: When quality is not sufficient to discriminate between options the implementation plan should be sufficiently decoupled to minimise the pace and friction required to change course between them when information quality improves. But static fitness will encourage a focus on one option only and limit the implementation of decoupling.
One way to understand the need for change is through the use of scenarios. When faced with a changing environment and complicated decisions, scenarios provide a shortcut for recognising that a business-as-usual viewpoint might not be appropriate and understanding what options should be considered as Wilkinson and Kupers [32] suggest.
In the absence of such scenarios, much more analysis may be required, and this may be time-consuming (it is easier to test an existing hypothesis than to create and then test one). Static fitness pressures will have the following two types of impact here:
  • They will prevent scenarios from being analysed at all in advance of decision-making;
  • They will prevent scenarios from being analysed during or as a result of decision-making.
This leads to the following challenge.
The Scenario Challenge: In the absence of suitable adaptiveness, the scenarios required to help recognise and understand the viewpoints required for complex decision-making may not be available when required. This will reduce the quality of decision-making.
Complex decisions are often heterogeneous in the sense that the evaluation of different options requires different viewpoints that are not directly comparable. So, how can such comparisons be made? One natural approach is to reduce each option to a number (e.g., a financial number for business cases). However, this approach suffers from two underlying issues.
First, as discussed in [6], information quality is not generally a total order and so cannot reliably be reduced to a single number. For example, business cases often consider non-financial benefits (that cannot be reduced to money) as well as financial benefits. Or, in the public sector, the impact of a change on different policies cannot be reduced to a single number.
Secondly, there is a danger of what Hopper and Spetzler [37] call “exquisite mock precision”—providing an appearance of good quality information while masking poor quality underneath.
Another approach is to assume that outcomes occur in a hierarchy of layers. This is the case in “Maslow’s hierarchy of needs” [27] (although there is considerable debate about the concepts—see [26]). In this case, needs lower down the hierarchy trump those above. There are simple cases of this in the animal kingdom [28]—for example, fish heading to spawning grounds will temporarily ignore that objective when encountering a predator.
A more refined approach is to use weightings for different quality attributes and to force the information to be structured in a way that supports the specific types of comparison. In this case, the weightings act as a proxy for selection pressures. But again, we hit quality problems—how can the quality of the weightings be established? They too will not generally conform to a total order so it may not be possible to reduce selection pressures reliably to a series of weightings. Just as before, there is a danger of “exquisite mock precision” [37].
More rigorous analysis may call on AI or computer simulations, but these will suffer from the same problem—they require high-quality information and models, but these will not necessarily be available in a changing environment.
In any comparison process, there may be an analysis of risks, often evaluated in terms of probability and impact [45] and often without a corresponding analysis of the quality of those values. This is a simple version of the concept of expected value initiated by the mathematicians Fermat and Pascal as a basis for comparing different outcomes. It is a sound mathematical technique, so why does it not apply to all decision-making?
There are two problems: it requires numbers and it requires the numbers to be accurate. This is the territory analysed by Taleb [1] who demonstrates that neither of these may be available, especially in cases that may deliver unfavourable outcomes. Where there is the possibility of a large, unfavourable (possibly existential) outcome for which the quality of information is not good then actions that may lead in that direction should be avoided until better quality information is available.
Taleb discusses global issues like the financial crash of 2008. But, at a smaller scale, the same analysis applies to the risk-averse nature of middle managers discussed by Lovallo et al. [33]—their decisions may not be existential for their firm but may be for their careers.
There are two underlying problems here. Good quality information structures that allow heterogeneous options to be compared may not be available and, in any case, the quality of the information needed may not be good enough. As before, it takes time to develop the integrated structures and information of the right quality and the time may not be available.
So, we have another challenge.
Heterogeneous Comparison Challenge: In the case of significant environment change, the level of information quality available for different options may vary and the integrated information structures needed to provide a high-quality decision perspective may not be available.
This challenge opens up the question of information structure and its impact on decision-making.

4.2. Information Structure

The way in which information is structured has an impact on how easily it can be used and the level of quality possible—this is the Information Structure Principle (Table 1). For example, Marcus [24] analyses the limitations of deep learning with respect to the hierarchical structure of natural language and concludes that deep learning, in its current state of development, cannot take advantage of the hierarchical structure of language and draw on its implications.
Information has structure at different levels. Content is expressed in symbolic modelling tools [5] like human languages, types of mathematics, relational databases, graph databases, knowledge graphs or programming languages and each of these conforms to its own set of structures.
But at a deeper level, information represents the connections described in Section 2 (and described much more fully in [4,5,6]). So, in this paper, the term information structure refers to the structure of these connections. This is the structure that the different modelling tools express in their own different ways. It corresponds to the structure contained in the models used in technology architectures [46], for example.
Graph Theory [47] is a natural technique for modelling connections. Box and line diagrams, based on graphs, are widely used in many fields to represent structure. For example, many IT architecture concepts are represented using graph-based diagrams [46].
But, as shown in [4,5], to model information connections generally we also need to model connections between connections and graphs cannot support these. So, in this paper, we use linnets, a generalisation of graphs, described in Appendix A (and also in [4,5,6]). Like graphs, linnets contain vertices and connectors (like edges) but, unlike graphs, connectors can connect other connectors as well as vertices. Linnets arise naturally from a discussion of the process and relationship connections that information provides.
Linnets are labelled using ecosystem content so we can use them to represent the ways in which ecosystem content is connected.
For a complex entity or set of slices, we are likely to be interested in different perspectives, each representing a particular aspect. For example, in representing an organisation we may be interested in perspectives like organisation structure, business processes, IT architecture, performance management and so forth [4]. For each perspective, there will be a set of ecosystem conventions for representing the perspective.
Each perspective uses some form of structure pattern—Table 2 contains some examples.
Appendix A contains a definition of the term information structure, based on linnets, that can be applied to the connection-oriented view of information presented here. Since structures are themselves linnets, we can have different levels of structure.
Just like any information artefact, structures have a level of quality depending on how well they represent the more detailed view. If structures are used as a proxy for the underlying detail then their quality is critical. Since selection pressures apply to structures, we have the following principle.
Structure Level Principle: Selection pressures will drive IEs to process information at a level of structure that is compatible with the level of quality required by selection pressures. The structures that evolve will be constrained by the capabilities of the IEs that use them.
There are two ideas here: structures will be driven by selection pressures and they will be constrained by the processing capabilities of the IEs that use them. Consider the latter first.
It is well understood that technology uses specific data structures to enable the processing required. Types of structure include relational databases and knowledge graphs. Interfaces are described using techniques like the extensible markup language (XML) [48] that apply particular structure patterns.
However, the structure of information is also important to people. Structure affects the workings of Simon’s bounded rationality. Numerous disciplines have analysed the information structures appropriate to people and the thinking goes back a long way. Gallo [49] notes the following about Aristotle’s thinking:
“Brevity is a crucial element in making a persuasive speech. An argument, Aristotle said, should be expressed ‘as compactly and in as few words as possible.’ He also observed that the opening of a person’s speech is the most important since ‘attention slackens everywhere else rather than at the beginning.’”
These examples show that information structure is critical to decision-making for people. We can understand why simple structure patterns like hierarchies and layers work well for people—they are compact structures, easy to navigate and minimise the dependencies (and hence processing) between different parts of the structure. They all have straightforward relationships between the parts of the structure—in particular, they are acyclic in the sense that the relationships do not contain loops (see Appendix A for a definition).
Human-Centric Structure Principle: Under the influence of bounded rationality, people prefer brief decision perspectives with acyclic structures.
The limitations of people are magnified when we consider committees containing people from different ecosystems (perhaps, in organisational terms, representing different organisational functions). In this case, their different ecosystem conventions may use different, incompatible structures until the decision-making ecosystem itself is fully established [4]. For example, software development teams use structure patterns like user stories and software designs but these may not form part of other organisational ecosystems.
Now consider the impact of change and selection pressures on structures. What types of perspective and structure are of most interest if we need to take change into account? The extent and complexity of a change depend on how far it extends—its dependency connectivity. If changing A means that B must also change, then the dependency connectivity relates A and B. Change is easier if the number of dependencies for a change is minimised. The need for “loose coupling” to minimise dependencies is discussed in [4].
Dependency connectivity can form the basis for a dependency perspective containing one or more dependency structures. This is routine practice in the technology architecture profession in which various types of architectural diagrams [46] express elements of the dependency perspective. It is also routine practice in the project management profession in which project dependencies are managed in a dependency log [45].
Dependency connectivity is easiest to manage if the connectivity is acyclic because loose coupling is supported by acyclic dependency structures. This means that adaptiveness pressures encourage the adoption of good quality, acyclic dependency structures aligned, where possible, with ecosystem boundaries (to minimise the additional impact of ecosystem integration [4]).
Just like any form of structure, a dependency structure has a level of quality according to how well it represents the underlying dependency connections. As an information artefact, it may suffer from the same quality issues as any other [6].
Because of the Structure Level Principle above, the nature of this quality is important. In the technology world, it is managed through a concept called “technical debt” [50]. Technical debt is used to describe those instances in which the underlying technical implementation does not meet the overall architecture (the structure).
Static fitness pressures drive a reduction in structure quality because:
  • Conforming to the structure may involve incorporating an additional ecosystem (e.g., architects) in the decision and this may incur additional friction introduced by organisational silos [4];
  • The “simplest” change (that with best pace and friction) may not conform to the structure;
  • There may be additional information artefacts to update;
  • There may be pressures from the external environment (e.g., customers) to act fast.
In other words, unless there is adaptiveness pressure, dependency connectivity will drift away from any dependency structure.
Machine learning models suffer from a similar issue. The concepts of “model drift” and “concept drift” [51] describe how models developed using a particular set of data become less effective as the environment (and the data used for learning) changes. In the absence of adaptiveness pressures to upgrade models, the performance will deteriorate.
These ideas are captured in Figure 3.
We can summarise this in the following challenge.
Structure Evolution Challenge: Without adequate adaptiveness pressure, the quality of information structures will degrade and reduce the quality of human and machine learning decision-making.
Decision-making should be explainable—it should be possible to demonstrate that it is delivering reliable results. For example, organisations may need to explain how business processes related to regulatory compliance are compliant. This is an issue with the current state of deep learning—as Marcus [24] says: “deep learning thus far is not sufficiently transparent”. As people and AI communicate on increasingly complex topics, explainability will be needed to ensure that information and rationale can be communicated effectively between AI and people.
Explainability is also required for ethical purposes. For example, the EU has published a report emphasising the importance of explainability in AI [52].
Relationships between humans and AI are just one (albeit important) set of examples of ecosystem integration. As ecosystems grow their conventions, they often develop their information structures (e.g., human languages, protocols for exchanging information). Usually, these information structures are not initially integrated with other ecosystems (because that takes time and sufficiently strong selection pressures to make it happen). But if the structures are not integrated then they are not mutually explainable—there is no way to exchange information with sufficiently high quality. Think of Medieval French and Quantum Mechanics or the mutual incomprehension often found between technology users and technology specialists. In the latter case, as pointed out in [4], many specialist disciplines (e.g., business analysis, user research) have been established to form the links that integrate the structures and provide mutual explainability.
Again, we have a challenge.
Explainability Challenge: Explainability requires integrated information structures. But the information structures supporting different options for decision-making are unlikely to be integrated when the environment changes and this constrains their mutual explainability.

5. An Information Evolution Approach to Decision-Making

The previous section demonstrates that, in times of change, decision-making faces a set of challenges. Ultimately, these challenges are a result of the Combinatorial Challenge described in Table 1. They are symptoms of the combination of environment change, the forces of information evolution and the decision-making approach. In this section, we show how a decision-making approach based on the principles of information evolution can address the underlying issues and hence mitigate the challenges.

5.1. Information Evolution Approach to Decision-Making

Several of the challenges (the Ecosystem Inertia Challenge generally, but also the Open Options, Scenario and Structure Evolution Challenges) are exacerbated by the application of static fitness pressures. So, these will be mitigated by a balance between static fitness and adaptiveness pressures. This will make it easier for the decision-making IE to select the right viewpoint rather than using the one most readily to hand. This may range from the use of existing conventions to blue-sky research (or anything in between). As discussed in [4], given time and the right capabilities, balanced selection pressures will generate a mature ecosystem in which all of the capabilities used are integrated effectively.
Normally, the decision-making IE will form part of an organisation. In this case, the selection pressures will be shaped, partly at least, by the mechanisms of the organisation (such as governance, performance management, organisation structure and so forth [4]). So, we can distinguish internal selection pressures (applied to the decision-making IE) from external selection pressures (applied to the organisation). Any mismatch between the two will diminish the ability to deal effectively with external selection pressures.
A fully mature IE will recognise any mismatch and will also recognise how well it can deal with particular environment states and when it is appropriate to call on another IE. For example, different governance boards in organisations are empowered (or able) to make particular types of decisions and, in exceptional circumstances, there may be a need to call on an IE designed for exceptions (like COBRA [30]).
A characteristic of ecosystem conventions is that the rationale for the conventions is lost [4,38]. This means that the validity of the conventions with respect to a particular environment state will not be challenged and, when faced with environment change, an inappropriate choice of decision-making viewpoint may be used. As Taleb [1] says: “we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states (i.e., by reducing the space of the unknown)”. To overcome this, the decision-making IE should be sceptical (to reiterate Taleb [1]) and make an explicit assessment of both the validity of the conventions in use and the quality of the information available. This approach is supported by Kahneman et al.’s assessment of bias [21] and analysis in the decision-making literature [53]. Scepticism will:
  • Enable a recognition of ecosystem inertia (the Ecosystem Inertia Challenge);
  • Enable a better understanding of the risks of closing down options (the Open Options challenge);
  • Emphasise the need for scenarios to recognise the need for change in a form that is relatively compatible with static fitness pressures (the Scenario Challenge);
  • Encourage recognition of the need to manage the equivalent of technical debt so that the quality of the structures used for decision-making is understood (the Structure Evolution Challenge).
Recognition of each issue will prompt the need to resolve it, which requires the mature ecosystem mentioned above.
The assessment of quality needs to be aligned with the need to discriminate between options. When faced with the Heterogeneous Comparison Challenge, the decision-making IE must be able to discriminate between options that might cause very different outcomes, especially very negative ones. It is not possible to rely on the prevailing ecosystem conventions for information quality. The descriptive, predictive and prescriptive information needs to be of sufficient quality to enable discrimination—this is the essence of Floridi’s definition of quality as “fitness for purpose” [15].
The decision-making process needs to be able to operate with poor quality information. This means that it should be iterative because that enables:
  • Improvements in the most important elements of information quality (required for the key elements of discrimination) to be prioritised;
  • The information required for discrimination to be tested against the external selection pressures as quickly as possible;
  • Limitations to be understood before too many resources have been expended.
This iterative approach lies at the heart of the Agile and DevOps approach to software development [54].
Iterative decision-making needs to address the Open Options Challenge. The balance between developing information of the right quality and taking action needs to avoid closing down options that may be required when the quality of information improves. This means that the principle of loose coupling should apply to plans. Different types of loose coupling are discussed in [4].
Iterative decision-making also needs to align with the needs of the Structure Evolution, Heterogeneous Comparison and Explainability Challenges. Improving the quality of information iteratively will need to address the different components of Figure 2 and build information structures that can provide structures of suitable quality, compare heterogeneous options and provide an explanation of the rationale. In other words, iterations need to address the specific decision but also the maturity of the ecosystem.
In the absence of good quality information, early iterations should steer away from the rocks and avoid open-ended unfavourable outcomes. This is Taleb’s “Black Swan” territory and, as he says, in this space “no theory or model should be better than just any theory or model” [1]. This will follow from scepticism about the models and an understanding of the environment states in which viewpoints can give a reliable result (which will follow from an understanding of quality).
Finally, the discussion above presupposes that the decision-making IE has access to, and uses, the full range of capabilities needed in Figure 1 and Figure 2. This enables the most appropriate decision-making viewpoint(s) for the state to be chosen and implemented.
In order to improve the quality of decisions, the information evolution analysis shows that a decision-making approach should be sceptical, discriminating, iterative, capable and mature (where these terms have specific information evolution meanings). This is shown in Figure 4.

5.2. Information Evolution Analysis and Further Research

The analysis above demonstrates that some decision-making challenges are a natural result of information evolution and that to mitigate the effect of these challenges the decision-making approach needs specific characteristics. But the analysis raises further questions that will form the subject of further research.
First of all, how can a mature decision-making ecosystem be created to recognise and improve information quality sufficiently through the decision-making process? As the discussion above demonstrates, the different components (e.g., people, teams, technology, AI) may process information using different conventions about acceptable quality which may not apply in the same way to the same set of environmental states. So, how can these be combined to produce reliable decisions when the environment is changing and when information about the nature of that change is unreliable? How can information generated by integrating all the different components (e.g., people and AI, people from different disciplines) meet an acceptable quality threshold, able to discriminate as needed, in these circumstances?
Secondly, the discussion about information structures represents the connected view of information described in [4,5,6]. The nature of these structures has an impact on the pace, friction and quality of the information processing that uses them and, therefore, on the quality of the overall decision-making. In particular, ecosystem interfaces will require conversion between, or integration of, different structures (as discussed in Section 4) represented in different forms (e.g., language, computer databases, deep learning models). So, further research is needed into the types of structure defined in Appendix A and their use in analysing pace, friction and quality.
Finally, how can information evolution be used to provide tools to understand the limitations suggested above and how would these tools apply to instances of decision-making? How can the challenges be diagnosed and improvements made? Kahneman’s dictum “what you see is all there is” [13] suggests an answer—by providing visibility of the underlying causes and their relationships with the challenges and the improvements. If the underlying causes of the challenges and the limitations of existing ecosystem conventions are diagnosed in terms of information evolution then this will also clarify how improvements can be made. This is shown in Figure 5.
Understanding the challenges and potential improvements will follow from an analysis of the following (elements of the middle box in Figure 5):
  • The nature of external selection pressures, potential extreme events and relevant scenarios;
  • How well internal selection pressures match external selection pressures and how static fitness pressures are balanced with adaptiveness;
  • How well the decision-making conventions allow the right viewpoint to be used for decision-making and the selection of the right IE to implement the viewpoint;
  • Whether the right decision-making capabilities (human and technology) are available;
  • Whether the capabilities are integrated effectively and avoid the difficulties of ecosystem boundaries (e.g., organisational silos and limited human/technology interfaces);
  • How well the information structures in use support the decision-making approach required;
  • Whether the nature of the decision-making process matches the needs of the external selection pressures;
  • Whether the pace, friction and quality of the decisions made matches the needs of the external selection pressures and, if not, where the deficiencies are (from answers to the previous questions) and how they can be improved.
Further research will analyse these questions and consider potential information evolution tools to provide visibility of quality deficiencies, their underlying causes and the resulting potential for improvement.

6. Conclusions

The increasing adoption of AI and an increasingly complex environment are adding extra complexity to decision-making. Decision-making is a form of information processing, so we can apply the ideas of information evolution [4], which studies the effects of change on information processing.
But information processing is not just what computer systems or individual people do. For organisations and teams within them, information processing is a complex combination of activities by individuals, groups and technology (increasingly including AI) subject to a range of incentives and constraints [4]. Information evolution applies to information processing in this wider sense and helps to understand the factors that drive different levels of information quality.
Poor information quality translates into poor decisions and, as AI is increasingly incorporated into decision-making, it is important to understand how quality can be lost or improved in complex decision-making environments.
Section 4 and Section 5 show that information evolution provides a theoretical basis for analysing the limitations of decision-making as information processing. Applying information evolution techniques highlights several challenges in decision-making. These challenges mean that, without careful attention to the selection pressures applied to decision-making, static fitness pressures will limit the ability to make good quality decisions in times of change.
The analysis links the challenges to the underlying factors that drive them and their impact on information quality. These underlying factors are related to:
  • The nature of external selection pressures (those created by the environment of the organisation);
  • The nature of internal selection pressures (the incentives and constraints generated by the organisation) and how well they match the external selection pressures;
  • The information processing conventions for decision-making that arise as a result of these selection pressures and how well they recognise the quality of information they use or produce;
  • The nature of the information structures used in decision-making;
  • The nature of the decision-making process and its compatibility with the need to improve information quality when the quality is deficient;
  • The human and technological capabilities used in decision-making and how well they are integrated to maintain the level of pace, friction and information quality.
Information evolution analysis can be used to characterise a decision-making approach that generates good quality decisions—the approach should be sceptical, discriminating, iterative, capable and mature (where these terms have specific information evolution meanings).
Future research is required to address the following questions:
  • How can a mature decision-making ecosystem be created to recognise and improve information quality through the decision-making process sufficiently to discriminate between outcomes in a changing environment?
  • What role do different types of information structure (as defined in Appendix A) and their relationships play in the pace, friction and quality of information?
  • How can information evolution be used to provide tools that provide visibility of quality deficiencies, their underlying causes and the resulting potential for improvement?

Funding

This research received no external funding.

Acknowledgments

The author would like to thank Jay Willis and Chris Budleigh for useful discussions in the early development of this paper. The author would also like to thank the referees for their helpful comments.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

The sections above discuss the structures of information artefacts, based on the underlying model summarised in Section 2 and defined in more detail in [4,5,6]. This Appendix extends those definitions to include information structures and also repeats some material from [4,5,6] to improve the readability of this paper and explain the diagram convention used.
In line with the Information Connection Principle, the model describes connections and is therefore based on graph theory ideas [47]. However, as explained in Section 4, graphs cannot provide the richness needed. Therefore, the model is based on a generalisation of graphs called linnets (“linked nets”).
Pointers
We need to reason both about the structure of an information artefact and also about what is being modelled. To keep these distinct, we use the notion of a pointer. Define a pointer to be a tuple c = (l, v, S) for which:
  • l is a label, written lab (c);
  • S is the set of all possible values of the pointer, written cont (c);
  • v is the value (a member of S, or ø if the pointer has no value).
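As an illustration, the pointer definition can be captured directly in code. The following is a minimal sketch in Python; the class and field names are illustrative assumptions, not taken from the model:

```python
from dataclasses import dataclass
from typing import Any, FrozenSet, Optional

@dataclass(frozen=True)
class Pointer:
    """A pointer c = (l, v, S): a label, a current value and a content set."""
    label: str                    # l = lab(c)
    value: Optional[Any]          # v; None stands in for the empty value ø
    content: FrozenSet[Any]       # S = cont(c), the set of possible values

    def __post_init__(self):
        # v must be a member of S (or ø, represented here by None)
        if self.value is not None and self.value not in self.content:
            raise ValueError("pointer value must be a member of its content set")

# Example: a pointer over the weekdays whose value is not yet determined.
day = Pointer(label="day", value=None,
              content=frozenset({"Mon", "Tue", "Wed", "Thu", "Fri"}))
```

Separating the label from the value keeps reasoning about the structure of an information artefact distinct from reasoning about what is being modelled.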
Linnets
An unrooted linnet C is a tuple (V, E, N) which satisfies the following conditions:
  • There is a set of pointers P with unique labels LP;
  • V ⊆ LP is a finite set that we call vertices;
  • E ⊆ LP is a finite set that we call connectors;
  • V and E are disjoint;
  • N is the set of connections of C, where N ⊆ {(a, e, b): a, b ∈ V ∪ E, e ∈ E, a ≠ e, b ≠ e};
  • if e ∈ E and (a, e, b), (a’, e, b’) ∈ N, then a = a’ and b = b’.
The vertices, connectors and connections of C are known as V(C), E(C) and N(C), respectively.
Define the proximity of a vertex to be 0. If (e1, e, e2) is a connection, then the proximity of e is prox (e) = min (prox (e1), prox (e2)) + 1, where min is undefined if either prox (e1) or prox (e2) is undefined. A connector is rooted if its proximity is defined. A linnet is an unrooted linnet in which all connectors are rooted. So, in a linnet, all connectors connect back to vertices eventually. (We can get around this technicality, if needed, by defining vertices as connections that loop back to themselves, but since we want to draw diagrams with points and lines, the definition here is sufficient.)
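A minimal sketch of these definitions in Python may help to make them concrete. The class below checks the unrooted linnet conditions and computes proximity by fixpoint iteration; all names and the string encoding of labels are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Dict, Set, Tuple

@dataclass
class Linnet:
    """An unrooted linnet (V, E, N); labels are plain strings here."""
    vertices: Set[str]                        # V
    connectors: Set[str]                      # E (disjoint from V)
    connections: Set[Tuple[str, str, str]]    # N: triples (a, e, b)

    def __post_init__(self):
        assert self.vertices.isdisjoint(self.connectors)
        middles = [e for (_, e, _) in self.connections]
        # a connector is the middle of at most one connection
        assert len(middles) == len(set(middles))
        ends = self.vertices | self.connectors
        for (a, e, b) in self.connections:
            assert e in self.connectors and a in ends and b in ends
            assert a != e and b != e

    def proximity(self) -> Dict[str, int]:
        """prox: vertices are 0; a connector gets min of its ends plus 1,
        and only once both of its ends have a defined proximity."""
        prox: Dict[str, int] = {v: 0 for v in self.vertices}
        changed = True
        while changed:
            changed = False
            for (a, e, b) in self.connections:
                if e not in prox and a in prox and b in prox:
                    prox[e] = min(prox[a], prox[b]) + 1
                    changed = True
        return prox

    def is_rooted(self) -> bool:
        """True when every connector has a defined proximity."""
        return self.connectors <= set(self.proximity())
```

In this sketch, a connector joining two vertices receives proximity 1, a connector one of whose ends is such a connector receives 2, and a connector that never connects back to vertices never enters the dictionary, so is_rooted returns False.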
Information Structures
If C and D are both linnets, then D is a structure of C if the following hold:
  • there is a sub-linnet C' of C;
  • there is a function fv from 2^V(C') to V(D) for which V(D) is the range of fv;
  • there is a function fe from 2^E(C') to E(D) for which E(D) is the range of fe;
  • if (v, e, w) ∈ N (D), then v, w ∈ V(D) (so D is graphical).
This means that sets of vertices of C' map to vertices of D, and similarly for connectors.
The definition is careful to allow structures not to be perfect. Or, in terms of information evolution, structures have a level of quality depending on how well they represent the underlying linnet. We need to allow this because technical debt [50], for example, describes the circumstance in which the quality of structures is not perfect.
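Continuing the sketch above (and reusing the illustrative Linnet class), the range and graphical conditions of the structure definition can be checked as follows; the encoding of fv and fe as dictionaries keyed by frozensets of labels is an assumption of this sketch:

```python
from typing import Dict, FrozenSet

def is_structure_of(d: Linnet,
                    f_v: Dict[FrozenSet[str], str],
                    f_e: Dict[FrozenSet[str], str]) -> bool:
    """Check the range and graphical conditions of the structure definition.
    f_v maps sets of vertices of the sub-linnet C' to vertices of D, and
    f_e maps sets of connectors of C' to connectors of D; checking that the
    domains really lie within C' is left out of this sketch."""
    if set(f_v.values()) != d.vertices:     # V(D) is the range of f_v
        return False
    if set(f_e.values()) != d.connectors:   # E(D) is the range of f_e
        return False
    # D is graphical: every connection in D runs between vertices of D
    return all(a in d.vertices and b in d.vertices
               for (a, _, b) in d.connections)
```

Because the maps are from sets of elements, a structure can summarise, merge or omit detail, which is exactly the imperfection (and hence the variable quality) that the definition is designed to allow.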
Acyclic Linnets
If C is a linnet, then a vertex path is a set {(vi, ei, vi+1) ∈ N(C): 0 ≤ i < n, for some n}. A cycle is a vertex path for which v0 = vn (but in which no other vertices are the same).
A linnet C is acyclic if it contains no cycles.
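Because only graphical (vertex-to-vertex) connections contribute to vertex paths, acyclicity can be checked with a standard union-find pass over those connections. The following sketch reuses the illustrative Linnet class above:

```python
def is_acyclic(c: Linnet) -> bool:
    """Union-find over the graphical connections of c: an edge whose two
    ends are already in the same component closes a cycle."""
    parent = {v: v for v in c.vertices}

    def find(x: str) -> str:
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for (a, e, b) in c.connections:
        if a in c.vertices and b in c.vertices:   # vertex-to-vertex only
            root_a, root_b = find(a), find(b)
            if root_a == root_b:
                return False                      # this edge closes a cycle
            parent[root_a] = root_b
    return True
```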
Information Connection Diagrams
Linnets provide the tool for modelling connections, and we can use pointers to differentiate between reasoning about the linnet itself and reasoning about the modelled ecosystem. The pointer labels are used for the definition of the linnet and the values of the pointers are used for the objects we want to model. So, if we are drawing the linnet, the pointer labels usually have little relevance. To express the same concept differently, there are two kinds of ecosystem involved: the modelling ecosystem (about the linnet) and the modelled ecosystem or ecosystems.
As mentioned above, we want to represent patterns as well as content so we will use quotes (“”) for content and represent metamodel concepts without quotes. Under some circumstances, we may want to represent a whole linnet (C) as part of a larger linnet (D). In this case, we can construct a linnet C’ with one additional vertex labelled something like linnet_C with a connector to each of V(C) and E(C). The vertex linnet_C will participate in D, but we can access C through the connectors from linnet_C. We can use a similar mechanism to model lists or sets.
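The construction of linnet_C described above can also be sketched in code, again reusing the illustrative Linnet class; the label scheme for the fresh connectors is an arbitrary assumption of this sketch:

```python
def wrap(c: Linnet, name: str = "linnet_C") -> Linnet:
    """Embed the whole of c as a single vertex: add a fresh vertex `name`
    and a fresh connector from it to each vertex and connector of c."""
    members = c.vertices | c.connectors
    # fresh connector labels; assumes "~" does not occur in existing labels
    fresh = {m: f"{name}~{m}" for m in members}
    return Linnet(
        vertices=c.vertices | {name},
        connectors=c.connectors | set(fresh.values()),
        connections=c.connections | {(name, fresh[m], m) for m in members},
    )
```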
Processes take one outcome to another, so we can represent them as in Figure A1a. Figure A1b shows the assertion “John ran to the shops” in the same form. In this version, “John” is a property of “ran to”, “the shops” is a type of destination and the starting state is unspecified.
Figure A1. Process connections.
We can extend properties (of a slice) to relationships between slices, as above. In this case, a relationship is the outcome of an ecosystem measurement (μ“relationship”) as shown in Figure A2a. The outcome is an information artefact that expresses the value. To simplify, we can write a relationship as a dotted line (as in (b)) or, when the relationship is between two sets of slices, we can write it as in (c).
Figure A2. Relationship connections.
We can also use patterns for connectors. Figure A3 shows two different examples. In Figure A3a the sets on either side of the connector are treated as single entities (so {“IEi”} is in {“Gj”}). In Figure A3b there are (unspecified) connectors between elements of the sets (so elements in {“IEi”} are in elements of {“Gj”}).
Figure A3. Connector patterns.
Where a set will not conveniently fit within braces, we can use a rounded box, as in Figure 2, representing a set whose elements are defined by what is in the box.
Finally, when we want to reference a linnet without including all of the additional connectors required (as discussed above) we can use a square box. This is a mechanism for naming and analysing a particular grouping or subset of a diagram.

References

  1. Taleb, N.N. The Black Swan: The Impact of the Highly Improbable; Random House: New York, NY, USA, 2007.
  2. Dwivedi, Y.K.; Hughes, L.; Ismagilova, E.; Aarts, G.; Coombs, C.; Crick, T.; Duan, Y.; Dwivedi, R.; Edwards, J.; Eirug, A.; et al. Artificial Intelligence (AI): Multidisciplinary Perspectives on Emerging Challenges, Opportunities, and Agenda for Research, Practice and Policy. Int. J. Inf. Manag. 2019.
  3. Kotter, J.P. Leading Change: Why Transformation Efforts Fail. Harvard Business Review, January 2007. Available online: https://hbr.org/2007/01/leading-change-why-transformation-efforts-fail (accessed on 28 November 2020).
  4. Walton, P. Information Evolution and Organisations. Information 2019, 10, 393.
  5. Walton, P. A Model for Information. Information 2014, 5, 479–507.
  6. Walton, P. Measures of Information. Information 2015, 6, 23–48.
  7. Christian, D. Origin Story: A Big History of Everything; Little, Brown and Company: New York, NY, USA, 2018.
  8. Buonomano, D. Your Brain Is a Time Machine: The Neuroscience and Physics of Time; W.W. Norton & Company: New York, NY, USA, 2017.
  9. Walton, P. Artificial Intelligence and the Limitations of Information. Information 2018, 9, 332.
  10. Walton, P. Information and Meaning. Information 2016, 7, 41.
  11. Walton, P. Information and Inference. Information 2017, 8, 61.
  12. Quine, W.V.O. Two Dogmas of Empiricism. Reprinted in From a Logical Point of View, 2nd ed.; Harvard University Press: Cambridge, MA, USA, 1951; pp. 20–46.
  13. Kahneman, D. Thinking, Fast and Slow; Macmillan: London, UK, 2011.
  14. Bateson, G. Steps to an Ecology of Mind; Ballantine Books: New York, NY, USA, 1972.
  15. Floridi, L. The Logic of Information: A Theory of Philosophy as Conceptual Design; Oxford University Press: Oxford, UK, 2019.
  16. Kuhn, T.S. The Structure of Scientific Revolutions, 2nd ed.; University of Chicago Press: Chicago, IL, USA, 1970.
  17. Lawrence, P.R. How to Deal with Resistance to Change. Harvard Business Review, January 1969. Available online: https://kashenterprise.org/wp-content/uploads/2020/05/How-to-Deal-With-Resistance-to-Change.pdf (accessed on 28 November 2020).
  18. Norris, P. Digital Divide: Civic Engagement, Information Poverty and the Internet Worldwide; Cambridge University Press: New York, NY, USA, 2001.
  19. Walton, P. Digital Information and Value. Information 2015, 6, 733–749.
  20. Mercier, H.; Sperber, D. The Enigma of Reason; Harvard University Press: Cambridge, MA, USA, 2017.
  21. Kahneman, D.; Lovallo, D.; Sibony, O. The Big Idea: Before You Make That Big Decision…. Harvard Business Review, June 2011. Available online: https://hbr.org/2011/06/the-big-idea-before-you-make-that-big-decision (accessed on 28 November 2020).
  22. Steele, K.; Stefánsson, H.O. Decision Theory. In The Stanford Encyclopedia of Philosophy, Winter 2016 ed.; Zalta, E.N., Ed. Available online: https://plato.stanford.edu/archives/win2016/entries/decision-theory/ (accessed on 28 November 2020).
  23. Popper, K.R. The Logic of Scientific Discovery; Hutchinson: London, UK, 1959.
  24. Marcus, G. Deep Learning: A Critical Appraisal. arXiv 2018, arXiv:1801.00631.
  25. Pearl, J.; MacKenzie, D. The Book of Why: The New Science of Cause and Effect; Basic Books: New York, NY, USA, 2018.
  26. Wahba, M.A.; Bridwell, L.G. Maslow reconsidered: A review of research on the need hierarchy theory. Organ. Behav. Hum. Perform. 1976, 15, 212–240.
  27. Maslow, A.H. A theory of human motivation. Psychol. Rev. 1943, 50, 370–396.
  28. Willis, J. Modelling Swimming Aquatic Animals in Hydrodynamic Models. Ecol. Model. 2011, 222, 3869–3887.
  29. Friedkin, N.E.; Proskurnikov, A.V.; Mei, W. Mathematical Structures in Group Decision-Making on Resource Allocation Distributions. Sci. Rep. 2019, 9, 1377.
  30. COBRA Explanation. Available online: https://www.instituteforgovernment.org.uk/explainers/cobr-cobra (accessed on 28 November 2020).
  31. Business Continuity Good Practice Guidelines. Available online: https://www.thebci.org/training-qualifications/good-practice-guidelines.html (accessed on 28 November 2020).
  32. Wilkinson, A.; Kupers, R. Living in the Futures. Harvard Business Review, May 2013, 91, 118–127. Available online: https://sgs.salzburgglobal.org/fileadmin/user_upload/Documents/2010-2019/2013/518/SessionDocument_LivingFuture_518.pdf (accessed on 28 November 2020).
  33. Lovallo, D.; Koller, T.; Uhlaner, R.; Kahneman, D. Your Company Is Too Risk-Averse. Harvard Business Review, March–April 2020. Available online: https://hbr.org/2020/03/your-company-is-too-risk-averse (accessed on 28 November 2020).
  34. Wilson, H.J.; Daugherty, P.R. Collaborative Intelligence: Humans and AI Are Joining Forces. Harvard Business Review, July–August 2018. Available online: https://hbr.org/2018/07/collaborative-intelligence-humans-and-ai-are-joining-forces (accessed on 28 November 2020).
  35. Heifetz, R.; Grashow, A.; Linsky, M. Leadership in a (Permanent) Crisis. Harvard Business Review, July–August 2009. Available online: https://hbr.org/2009/07/leadership-in-a-permanent-crisis (accessed on 28 November 2020).
  36. Webb, A. How to Do Strategic Planning Like a Futurist. Harvard Business Review, 2019. Available online: https://hbr.org/2019/07/how-to-do-strategic-planning-like-a-futurist (accessed on 28 November 2020).
  37. Hopper, P.; Spetzler, C. You Can’t Make Good Predictions Without Embracing Uncertainty. Harvard Business Review, 2016. Available online: https://hbr.org/2016/05/you-cant-make-good-predictions-without-embracing-uncertainty (accessed on 28 November 2020).
  38. Schein, E.H.; Schein, P. Organizational Culture and Leadership; John Wiley and Sons, Inc.: Hoboken, NJ, USA, 2017.
  39. Christian, B.; Griffiths, T. Algorithms to Live By: The Computer Science of Human Decisions; Henry Holt and Co.: New York, NY, USA, 2016.
  40. Beck, K.; Beedle, M.; van Bennekum, A.; Cockburn, A.; Cunningham, W.; Fowler, M.; Grenning, J.; Highsmith, J.; Hunt, A.; Jeffries, R.; et al. Manifesto for Agile Software Development. Available online: http://www.agilemanifesto.org (accessed on 5 November 2015).
  41. Simon, H. Bounded Rationality and Organizational Learning. Organ. Sci. 1991, 2, 125–134.
  42. Wade, M.; Noronha, A.; Macaulay, J.; Barber, J. Orchestrating Transformation: How to Deliver Winning Performance with a Connected Approach to Change; International Institute for Management Development: Lausanne, Switzerland, 2019.
  43. Ferrero, G. L’inertie mentale et la loi du moindre effort [Mental inertia and the law of least effort]. Revue Philosophique de la France et de l’Étranger 1894, 37, 169–182.
  44. Zipf, G.K. Human Behavior and the Principle of Least Effort; Addison-Wesley Press: Boston, MA, USA, 1949.
  45. The PRINCE2 Definition. Available online: https://www.prince2.com/uk/what-is-prince2 (accessed on 28 November 2020).
  46. The TOGAF Standard. Available online: https://publications.opengroup.org/standards/togaf (accessed on 28 November 2020).
  47. Harary, F. Graph Theory; Addison-Wesley: Reading, MA, USA, 1969.
  48. The XML Standard. Available online: https://www.w3.org/TR/xml/ (accessed on 28 November 2020).
  49. Gallo, C. The Art of Persuasion Hasn’t Changed in 2000 Years. Harvard Business Review, 2019. Available online: https://hbr.org/2019/07/the-art-of-persuasion-hasnt-changed-in-2000-years (accessed on 28 November 2020).
  50. Allman, E. Managing Technical Debt. Commun. ACM 2012, 55, 50–55.
  51. Sobolewski, P.; Wozniak, M. Concept Drift Detection and Model Selection with Simulated Recurrence and Ensembles of Statistical Detectors. J. Univers. Comput. Sci. 2013, 19, 462–483.
  52. Hamon, R.; Junklewitz, H.; Sanchez, I. Robustness and Explainability of Artificial Intelligence. JRC Technical Report, EU, 2020. Available online: https://publications.jrc.ec.europa.eu/repository/bitstream/JRC119336/dpad_report.pdf (accessed on 28 November 2020).
  53. Pennycook, G.; Cheyne, J.A.; Koehler, D.J.; Fugelsang, J.A. On the belief that beliefs should change according to evidence: Implications for conspiratorial, moral, paranormal, political, religious, and science beliefs. Judgm. Decis. Mak. 2020, 15, 476–498. Available online: http://journal.sjdm.org/20/200414/jdm200414.pdf (accessed on 28 November 2020).
  54. Kim, G.; Debois, P.; Willis, J.; Humble, J. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations; IT Revolution Press: Portland, OR, USA, 2016.
Figure 1. Viewpoints from an information evolution perspective.
Figure 2. An information evolution model of decision-making.
Figure 3. The evolution of information structure.
Figure 4. Information evolution characteristics to improve the quality of decision-making.
Figure 5. Information evolution analysis tools will enable the underlying issues that drive the decision-making challenges to be diagnosed and improvements recommended.
Table 1. Information evolution principles.
Combinatorial Challenge: Information is a response to the combinatorial challenge posed by the environment of an IE and the need to connect states of the environment with actions and outcomes.
Information Connection Principle: Symbolic information represents hypothesised relationships between constrained sets of slices.
Information Ecosystem Principle: Under the influence of selection pressures in the environment, information ecosystems form with their own information processing conventions. The conventions embed tradeoffs and shortcuts that produce good enough pace, friction and quality in interactions with the environment.
Ecosystem Inertia Principle: Ecosystem conventions take time to develop so there is a lag in the response of ecosystems to changes in the environment.
Information Structure Principle: The structure of an information artefact constrains its use (because it affects the pace and friction associated with its use).
Discrimination Principle: IEs require the quality of information used in a viewpoint to be good enough to discriminate between outcomes with different levels of favourability.
Table 2. Some examples of information structure patterns.
Rooted, directed tree: Simple organisation chart.
Layered structure: Layered architecture model [46].
Flow: Business process description.
Acyclic directed graph: Inference (e.g., the structure of the dependencies in mathematical proofs).
General: Other examples that do not conform to the simpler structure patterns above.