Article

Establishing the Foundations to Measure Organizational Agility for Military Organizations

Air Force Institute of Technology, Wright-Patterson AFB, OH 45433, USA
* Author to whom correspondence should be addressed.
Submission received: 1 September 2020 / Revised: 6 October 2020 / Accepted: 5 November 2020 / Published: 11 November 2020

Abstract

There is an ongoing demand for organizations to become more agile in order to prosper amongst their competitors. Many military organizations have declared a renewed focus towards organizational agility. The goal of this research is to isolate the variables needed to measure organizational agility (OA) in military organizations, allowing for the future development of a suitable method to measure OA without the need to interact with outside organizations. This article begins by providing a suitable and formal definition of organizational agility by exploring and analyzing relevant scholarly literature on the subject. Related terms, such as organizational resiliency, flexibility, robustness, versatility, and adaptability, are also explored to examine their definition boundaries and any overlapping areas. Existing methods to measure organizational agility are examined and summarized, and the current limitations to their application are highlighted. Previous studies to find characteristics associated with organizational agility were also examined, and an initial set of 88 organizational agility characteristics was built. Since these included possible redundant or overlapping characteristics, the Q-sort method was employed to discover, analyze, and eliminate redundant items from the dataset, ultimately resulting in 64 unique characteristics. The result is a suitable definition of organizational agility applicable to military organizations and a list of potential associated characteristics that summarizes related research to date. This groundwork establishes the foundation to conduct a multi-organization study to further refine the characteristic list and ultimately develop a method to measure organizational agility.

1. Introduction

Over the last decade, smaller, more efficient, agile organizations have outmaneuvered traditionally established institutions. The pace of change has accelerated throughout the information age, an age in which information is readily available and transformative technologies can topple legacy designs overnight. Although particularly evident in the business sector, this phenomenon has also gained significant momentum in the defense sector. The President, Department of Defense (DoD) executives, Congress, and the service chiefs have all come to the same conclusion: a more agile, flexible, and technologically advanced force is needed to outdo their adversaries [1].
Nation-state militaries expend significant financial resources and are expected to succeed against their opponents, yet they often do not directly engage those opponents for decades at a time. What happens in a sector where innovation and agility are both vitally important, but a timely and consistent feedback mechanism to measure one’s progress is virtually non-existent? Although the true test of a military occurs during a turbulent period of engagement with an opponent, interim methods must be developed to measure each critical organizational trait.

1.1. Literature Review Summary

A literature review, consisting of publications on agility and measurement development, was completed to determine whether a common definition for OA exists. An online search using Google Scholar and EOS.web, an integrated library system designed for special libraries, was initially used to locate and scope the body of relevant work, focusing on terms such as agility, resiliency, and flexibility. Highly cited publications from the core topic area of agility were then reviewed for their relevance and to help shape the remaining searches. Using the referenced sources and bibliographies of those publications, the literature search expanded to cover topics closer to the boundaries of the research area. Based on the initial findings, the focus terms were expanded to also include robustness, versatility, ambidexterity, and adaptability. Continuous efforts were then made to uncover increasingly recent publications, following the academic discovery and advancement in the chronological order in which it originally occurred.
It is important to capture and explain the relevant terms, especially terms that do not have a widely accepted definition or where the reader may arrive with preconceived notions. This article focuses on organizational agility, and thus, an in-depth review of that term is warranted. This article will also explore several related terms that were uncovered during the review (resiliency, flexibility, robustness, versatility, adaptability, and rapidness) in an effort to define related terms that are frequently used in conjunction with, and sometimes errantly in place of, agility [2]. The focus of this section is to provide relevant contextual information on the subject of agility; it is not meant to be an exhaustive ontological framework or to fully define the related terms.

1.2. Defining Organizational Agility

The term organizational agility became a widely discussed and published topic in the fields of business, software development, and manufacturing starting in the late twentieth century. Although the concept was being developed during the same period and some overlap between industries exists, it was largely developed within each specific domain in relative isolation from the other domains. This produced industry-unique definitions and confusion when the term was applied.
The construct of organizational agility has been given several distinct definitions across a large number of publications, many offering their own, often tailored, version. Of those reviewed, 24 publications were found that distinctly attempted to define organizational agility. Table 1 provides a snapshot of the leading definitions that have been published. The goal was to promote or create a definition that encompassed the necessary aspects of the versions already published. This method mirrors the approach previously used by Ryan et al. [2] in their publication on flexibility-related terminology. The method is appropriate because it reflects what language fundamentally is: the majority-accepted method of communication.
It was found that many authors blur the line between capability and capacity and, far too often, mistakenly use the terms interchangeably. Capacity is an ability that exists at present, whereas capability represents a higher-level ability that can be achieved in the future [25]. Each definition in Table 1 was evaluated in its intended context and assessed as representing a capacity, a capability, or both. Of the 24 definitions of organizational agility, 10 were categorized as a capacity, 10 as a future capability, and four as a mix of capacity and capability. A breakdown of the important components was achieved by analyzing the specific words and meaning within these definitions. As shown in Figure 1, the most repeated components of the definition are “rapid response” and “stimuli is external environment.” These are followed closely by “customer driven output,” “environment of uncertainty,” and “opportunistic outcome.”
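The component frequencies reported in Figure 1 were produced by coding each published definition into its constituent elements and tallying how often each element appears across the set. A minimal sketch of that tally is shown below; the definition labels and component codings are illustrative placeholders, not the actual coding data, which was produced manually from the definitions in Table 1.

```python
from collections import Counter

# Placeholder codings for a few definitions; the real coding covered all 24
# definitions in Table 1 and was performed manually.
coded_definitions = {
    "Definition A": ["rapid response", "stimuli is external environment",
                     "opportunistic outcome"],
    "Definition B": ["rapid response", "environment of uncertainty",
                     "customer driven output"],
    "Definition C": ["stimuli is external environment",
                     "environment of uncertainty"],
}

# Tally how often each component appears across the definitions; the full
# tally over all 24 definitions is what Figure 1 reports.
component_counts = Counter(
    component
    for components in coded_definitions.values()
    for component in components
)

for component, count in component_counts.most_common():
    print(f"{component}: {count}")
```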
It is important to note that this method of finding common themes amongst definitions suffers from interpretation errors. Interpretation errors are reduced by evaluating each definition element in the context in which it was originally provided and making logical contextual adjustments, when necessary, to apply it to the new context. Omissions by an author are also an important source of interpretation error; an element may have been purposely omitted because it lacked importance in that context. For instance, if one author describes an item as being externally stimulated and others describe it as internally stimulated, further contextual analysis is required for any version that omits internal/external completely. It may be found that the author purposely omitted the element to mean that it is both internally and externally stimulated, or that their contextual application does not require further delineation, thus meaning one, the other, or neither. Despite these inherent errors, the cumulative effect of these two error sources is considered insignificant after making the contextual adjustments [2].
The definition provided by Teece et al. [23] includes each of the key components described in Figure 1. Therefore, the following definition is suggested and supported for this field of study:
Organizational Agility: “Capacity of an organization to efficiently and effectively redeploy/redirect its resources to value creating and value protecting (and capturing) higher-yield activities as internal and external circumstances warrant” [23].
This definition contains a few “loaded” terms, and thus, it is prudent to provide additional meaning and explanation for key elements of this definition [26].
Efficiently: in a manner that produces desired results with little or no waste.
Effectively: producing a decided, decisive, or desired effect.
Value Creating: increase in the worth of goods or services.
Value Protecting: maintaining the same worth of goods or services.
Higher Yield: increase in production from an investment.
Warrant: to serve as or give adequate ground or reason for something.

1.3. Related Terms

When examining organizational agility, several related terms consistently appear. It is important to determine the degree of commonality, overlap, and uniqueness of these terms. Organizational resiliency is related to organizational agility, and the two terms are often used interchangeably. There are a significant number of publications that address personal resiliency; however, only eight were found that specifically address organizational resiliency. Table 2 provides a snapshot of the leading definitions that have been cited in the literature.
Using the same method as previously described, the key components of organizational resiliency were “response to disruption” (vice opportunity), “recovery outcome” (vice advance), and “reactive” (versus proactive). The definition provided by Lengnick-Hall et al. [34] is the only definition that includes each of these key components. Therefore, the following definition will be applied and supported:
Organizational Resiliency: “Ability to effectively absorb, develop situation-specific responses to, and ultimately engage in transformative activities to capitalize on disruptive surprises that potentially threaten organization survival” [34].
Organizational flexibility, robustness, versatility, and adaptability are constructs that also relate closely to organizational agility [2]. Although the work of Ryan et al. [2] focused specifically on system flexibility rather than organizational flexibility, it is in the same domain (defense) and is still applicable to this discussion. In that work, the authors reviewed over 200 papers and found 21 relevant definitions for flexibility. Their effort culminated in an accepted definition through the breakdown of key elements and the application of a similar democratic method. Their resultant definitions are applied to this research with a single change: the term “system” used in their definitions was expanded to include the organizations that design, develop, manufacture, and operate the specific hardware solution, and was then replaced with the word “organization” to make it applicable to organizations [2].
Organizational Flexibility: “the measure of how easily [an organization’s] capabilities can be modified in response to external change.”
Organizational Robustness: “the measure of how effectively [an organization] can maintain a given set of capabilities in response to external changes after it has been fielded.”
Organizational Versatility: “the measure of how broadly [an organization’s] capabilities extend in terms of foreseeable and unforeseeable sources of change.”
Organizational Adaptability: “the measure of how effectively [an organization] can modify its own capabilities in response to change after it has been fielded.”

1.4. Relationships

The formal definitions put forth in the previous sections do not address the relationships among these terms. Agility and resiliency are both organizational characteristics, each describing an organizational response to a different stimulus. Agility and resiliency share many of the same key definition components. Both require responses to stimuli that may be internally or externally produced and result in an increase in (or restoration of) output capability. In manufacturing, for instance, that may be the number of units produced, the number of different types of units, the individual unit performance, or even an increase in company profit. In the defense sector, this may manifest as speed of production, number of missions supported, a decrease in mission time, an increase in trained soldiers, etc. Where the definitions of agility and resiliency differ is in the type of stimulus. Resiliency is associated with the occurrence of a disruption or issue to the status quo and implies that, if the organization does not respond, the output capability will be reduced. Agility is associated with opportunities: the organization has the opportunity to respond to an event, but failure to do so does not jeopardize the status quo output capability. An organization can possess one, both, or neither of these attributes.
Evaluation of flexibility also shows significant definition overlap with agility and resiliency. Flexibility encompasses the nature of a system (organization) to adapt to change, which is also found in both agility and resiliency. Flexibility differs in that it is determined by the response without a time element. This means that only a single dimension (capability, time, or cost) is required to understand flexibility, while agility and resiliency both require two dimensions (capability and time) to be measured. Any time an organization displays agility or resiliency, it also displays flexibility.

1.5. Organizational Agility Framework

Now that we have a working top-level definition of organizational agility, further analysis and breakdown can be accomplished. According to Teece et al. [23] in their 2016 paper, the framework for organizational agility is a three-step process consisting of sensing, seizing, and transforming, as shown in Figure 2.
Sensing is the identification of technological opportunities and is critical if an organization is ever to capitalize on them. “Generative-sensing capabilities involve undertaking actions to proactively create hypotheses about the future implications of observed events and trends, and testing these hypotheses to grease the pathways for new products, services, and business models” [3]. Scenario planning and what-if analysis (i.e., development planning within the DoD) are typical sensing techniques. Sensing is more than predicting future customer desires; it also includes the synthesis of different ideas, processes, and technologies to form new products that provide value to the consumer. Existing organizations tend to focus on existing ideas and processes, whereas new entrants are often better poised to develop new combinations and technological innovations [35,36]. Within each of those organizations, middle-level management is the most adept at splicing together different ideas and technologies, while executive-level management is better poised to understand changing customer desires [37].
Seizing is the implementation of new systems, processes, or services. It is the first step that requires a sizable expenditure of resources, as investments in development are often required [23]. By this stage, the total amount of uncertainty has been reduced, with a portion converted into quantifiable risk. An organization must be poised to seize opportunity, as “addressing opportunities involves maintaining and improving technological competences and complementary assets and then, when the opportunity is ripe, investing heavily in the particular technologies and designs most likely to…acceptance” [23]. In the business world, this often means holding reserves of cash, equipment, and/or expertise; in the DoD, it manifests as trained personnel, stockpiles of equipment, allies, the budgeting processes, and a decision process that evaluates and welcomes opportunities.
Transforming is the restructuring of an organization to capitalize on a new technology. The newest methodology for doing this is a practice known as “build–measure–learn,” where a minimum viable product is produced, allowing the company to release it, learn from its successes and mistakes, and quickly improve the product [23]. Similarly, the DoD has recently created an acquisitions model with similar characteristics known as rapid prototyping. This, when paired with creating small “startup” units within the organization to manage the new technology, allows an organization to reduce risk when developing new technologies while remaining poised to capitalize on those that succeed. Each transformation has a cost that must be overcome each time an organization attempts to take advantage of an opportunity. This transformation cost represents the non-value-added effort required for the organization to transition from one state to another. For organizations with a high transformation cost, this can be seen as an agility inhibitor.

1.6. Existing Agility Measurement Methods

Despite the desire for organizations to become agile, the ability to measure agility has remained elusive. The difficulty arises when trying to create a measure that is general enough to apply to multiple industries, yet specific enough to capture the important essence of each particular industry [16]. To address this, most measures of agility to date are domain specific. Further, agility joins other important metrics, such as morale, happiness, satisfaction, justice, and quality, in that it is not directly measurable. A latent construct, in which a variable is inferred through a model from other, more readily observed variables, is required [38]. To date, there have been several attempts at measuring agility. A summary of these methods follows.
  • The two-dimensional dichotomy is the most common method used to measure organizational agility. It frequently manifests itself in the form of the magnitude of variety/change and the response time/rate [39]. These variables exist with a degree of dichotomy; the actions required by an organization to increase the magnitude of variety of services or products are often contradictory to a firm’s ability to increase efficiency and reduce its response time [40]. The magnitude of variety/change attempts to capture an organization’s current capability of interest and to quantify its change in that domain. For instance, for a smart phone manufacturer, it may be increased production, greater features on a device, a greater variety of devices produced, or a new method to reduce the cost to produce each item [40]. The response time/rate variable is meant to capture the temporality of the change in a suitable unit of time, such as days, months, per year, or per cycle [40]. Both dimensions are applicable across multiple industries; however, they must be calibrated for their respective industry.
  • First-order models that calculate agility by relying on the magnitude of variety and response rate have been developed by multiple authors [41,42]. These first order models often lack support and applicability across different industries (domains). More specifically, no models have been developed to apply to the defense sector.
  • Agility curves were developed and presented by Singh et al. [39]. The agility curves have significant meaning: two points on the graph can result in the same agility rating, and there is an inherent tradeoff between the magnitude of variety change and the rate of variety change. Both of these notions are aligned with the argument of dichotomy between the dimensions. This model is supported within the academic community; however, it lacks a simple, repeatable method to measure the magnitude of variety and response rate, and the scale can be difficult to determine, which limits its actual implementation.
  • The comprehensive agility measurement tool (CAMT), developed at Old Dominion University [16], has proven to be industry agnostic. The tool relies on ten “agility enablers,” each measured on a scale of 1 to 5, and an analytical hierarchy process (AHP) to ensure that it can be effectively applied to a multitude of industries. Starting from the set of 41 agility enablers found by Kuruppalil [43], the survey administrator selects the ten most relevant factors for the given domain and assesses them utilizing a 5-point Likert scale. After applying a weighted average to each of the ten areas, a weighted agility measure is calculated (a notional version of this calculation is sketched after this list). Although CAMT uses a mathematical model, it is highly subjective due to the administrator’s selection of the ten relevant factors and the weights applied to each agility enabler. The subjectivity required within CAMT has inhibited its overall support and application.
  • The Key Agility Index (KAI) is a method developed by Lomas et al. [44] to measure design process agility by assessing the product development process, on the premise that each product’s development process provides a narrow glimpse of the overall organization’s agility. The index is the ratio of the “time taken to complete change related tasks and time taken to complete the whole project” [44] (also sketched after this list). This method has high internal validity within a domain, but the authors warn against comparisons between different market sectors. Further, the model fails to take into account other factors, such as an effective systems engineering plan; for instance, a product with a poor-quality systems engineering plan will likely require a greater number of changes and greater overall variability in the time required to complete change-related tasks [44].
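The sketch below illustrates the arithmetic behind two of the measures above: a CAMT-style weighted agility score built from Likert-rated enablers with administrator-assigned weights, and the KAI ratio. All enabler names, weights, scores, and task times are assumed, illustrative values rather than data from the cited tools.

```python
# Illustrative arithmetic only; names and numbers below are assumptions.

# CAMT-style weighted agility score: domain-relevant agility enablers rated
# on a 1-5 Likert scale and combined with administrator-assigned weights
# (only four of the ten enablers are listed here for brevity).
enabler_ratings = {
    # enabler name: (weight, Likert score)
    "organizational flexibility": (0.15, 4),
    "rapid prototyping": (0.12, 3),
    "workforce empowerment": (0.10, 4),
    "customer integration": (0.08, 5),
}
weighted_sum = sum(weight * score for weight, score in enabler_ratings.values())
total_weight = sum(weight for weight, _ in enabler_ratings.values())
weighted_agility = weighted_sum / total_weight  # back on the 1-5 scale

# Key Agility Index (KAI): ratio of change-related task time to total
# project time; hours are illustrative placeholders.
change_task_hours = 120.0
total_project_hours = 900.0
kai = change_task_hours / total_project_hours

print(f"weighted agility score: {weighted_agility:.2f} (1-5 scale)")
print(f"KAI: {kai:.2f}")
```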
Each of these methods provides a different approach to measure organizational agility but currently lacks application within the defense sector. Further, there are no measurement methods that tie directly to the definition of OA that we have adopted.

1.7. Research Objectives

The objective of this manuscript is to isolate the variables needed to measure organizational agility (OA) in military organizations, allowing for the future development of a suitable method to measure OA without the need to interact with outside organizations. Once captured, these variables will form the necessary and common foundation needed by researchers to develop a method to measure OA. The OA measure, in turn, will give organizations the metrics needed to evaluate their internal agility and the tools necessary to efficiently and effectively reallocate resources in their ongoing quest to increase OA.

2. Development of a Set of Factors

2.1. Developing a New Organizational Agility Measure

Measures of performance are present in nearly all aspects of life. Their contribution to individual and organizational performance is undeniable, and their mere existence often causes changes in behavior. More specifically, measures of performance provide a means to quantify success and, in turn, contribute to the development of effective incentive structures. When accurately and effectively measured, they can be used to steer performance to achieve higher-level objectives, ultimately changing behaviors. Unfortunately, most fields outside of the financial sector struggle to obtain suitable measures that are valid and reliable [45]. Latent constructs are developed when a variable of interest cannot be observed or measured directly; measurement is instead achieved via a theoretical relationship between that variable of interest and other, more directly measurable indicators, known as factors.
The work completed by Colquitt [46] in summarizing a method to utilize survey research to create a latent construct, and his subsequent application to develop an organizational justice measure, can be similarly applied to develop a measure for organizational agility. Utilizing the assumption that there is a set of factors that can be used to measure organizational agility, the next step is to identify any relevant factors.
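Once candidate indicators (factors) are eventually scored by survey respondents, a common first check on whether they cohere into a single latent construct is an internal-consistency statistic such as Cronbach's alpha. The sketch below shows that calculation on synthetic Likert data; it is a standard technique offered for illustration, not the specific validation procedure used by Colquitt [46] or in this study.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1.0 - item_variances.sum() / total_variance)

# Synthetic example: 6 respondents rating 4 candidate agility indicators (1-5).
responses = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 3, 2, 2],
    [4, 5, 4, 4],
    [3, 3, 3, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```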

2.2. Factors Related to Organizational Agility

Many researchers have attempted to capture the important characteristics of organizational agility. By collecting the sets of characteristics developed by other researchers, a more complete single set of characteristics was created. The process was to collect all prospective characteristics that could be used to measure agility, and then to systematically remove duplicates and non-relevant items. Since agility is highly related to constructs such as flexibility, rapidness, resiliency, and robustness, any characteristics used in their descriptions were also collected.
Utilizing a three-round Delphi study designed to develop the framework for a survey questionnaire on leanness and agility, Kuruppalil [43] identified the top 45 agility indicators for job shops across 14 different domains. Yusuf et al. [11] studied manufacturing agility and found 32 key attributes grouped within 10 different domains, a grouping that was later reduced to seven [11,47]. Research conducted by Lepore et al. [48], which focused on military rapid development projects, found 11 unique attributes utilizing in-person interviews. Table 3 provides a summary of the agility characteristics offered by these publications.
Each of these characteristic sets was created to fully encompass organizational agility, meaning that each set is believed to be comprehensive and complete, albeit in its respective domain. With the sets provided by Kuruppalil [43] and Yusuf et al. [11] both originating in the manufacturing domain, one would expect significant overlap between those sets. Further, the characteristic set provided by Lepore et al. [48] provides a much-needed bridge into the military domain. By combining the three sets into a single set, it is reasonable to believe that (1) the new set will be larger than each of the individual sets, (2) the new set will have a greater chance of encompassing the factors necessary to develop a latent construct, and (3) there will be redundancies within the new set. In most cases when combining datasets, redundancy is relatively easy to identify and eliminate. In this case, however, redundancies are difficult to recognize due to the varied wording used to describe each characteristic. The Q-sort method was therefore used to compare, combine, and reduce redundancies in these sets.
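A minimal sketch of the aggregation step is shown below, assuming placeholder item names: a simple union with light normalization removes only exact duplicates, while differently worded overlaps survive and must be resolved by judgment, which is what motivates the Q-sort described next.

```python
# Minimal sketch of aggregating the three published characteristic sets.
# Item names are placeholders; differently worded overlaps (e.g.,
# "Rapid delivery" vs. "Fast product delivery") survive this step and are
# exactly what the Q-sort rounds are used to resolve.
kuruppalil = {"Rapid prototyping", "Multi skilled people", "Rapid delivery"}
yusuf = {"Rapid prototyping", "Technology awareness", "Fast product delivery"}
lepore = {"Empowered teams", "Rapid delivery", "Streamlined approvals"}

def normalize(item: str) -> str:
    # Light normalization so that exact duplicates differing only in case
    # or spacing are recognized.
    return " ".join(item.lower().split())

combined = {}
for source_set in (kuruppalil, yusuf, lepore):
    for item in source_set:
        combined.setdefault(normalize(item), item)  # keep first wording seen

print(f"{len(combined)} unique items after removing exact duplicates")
```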

2.3. The Q-Sort Method

Q-sort is “a method of assessing reliability and construct validity of questionnaire items being prepared for survey research” [49]. First developed and published by Cattell [50], the Q-sort method was one of six correlation methods (O, P, Q, R, S, and T). It was further refined by Stephenson [51] and Block [52] into the incarnation that is used today. It is an iterative process in which the level of agreement between judges is measured and used to determine overall construct validity [49,50,51,52,53].
The procedure to conduct a Q-sort is as follows:
  • Collect items to be sorted. These items are expected to be a sample from the entire population of items that could be used.
  • Select the number and qualifications of the judges. One of the most useful features of the Q-sort method is the limited experience and training required of the judges to conduct the sorting. Judges should be knowledgeable in the domain specific to the items but do not need any formal experience in the Q-sort method itself. The minimum number of judges is two; however, the benefit of additional judges is often quickly outweighed by the added complexity of calculating Cohen’s Kappa and the level of agreement. For these reasons, two judges are often preferred.
  • Apply a suitable construct in which the judges can sort the items. This construct may be developed in advance or by the judges themselves. It is recommended that the construct include an “other” category for items that are difficult to fit into a single category.
  • Judges sort the items independently. Methods to ensure independence include keeping each judge out of view of the other, sorting via a computer database, or presenting the items to each judge in a different, random order.
  • Calculate Cohen’s Kappa and the agreement ratio. To calculate the agreement ratio, a table that utilizes the number of items placed in each category is constructed. Figure 3 provides a generic setup for two judges (the most common case); a similar three-dimensional model can be created if three judges are used.
Converting Figure 3 into percentages can be done by dividing each table cell by N, resulting in Figure 4.
The agreement ratio is then calculated as:
\[ \text{Agreement Ratio} = \frac{\text{Number of Agreements (diagonal elements)}}{\text{Number of Items Placed }(N)} \]
Cohen’s Kappa is calculated by first determining the chance of agreement and then removing it from the total number of actual agreements. Chance agreements are calculated by multiplying the cross row-column totals, as:
\[ \text{Total Chance Agreement} = \sum_{i} P_{i+}\,P_{+i} \]
From there, the total number of actual agreements is also calculated, as:
\[ \text{Number of Actual Agreements} = \sum_{i} P_{ii} \]
Cohen’s Kappa is then the difference between the number of actual agreements and the total chance agreement, normalized by the maximum possible value of that difference:
\[ \text{Cohen's Kappa} = \frac{\sum_{i} P_{ii} - \sum_{i} P_{i+}\,P_{+i}}{1 - \sum_{i} P_{i+}\,P_{+i}} \]
There is no universally agreed minimum acceptable value for Cohen’s Kappa. Landis and Koch published detailed guidelines in their 1977 work, providing the following recommendations [54]:
  • Almost Perfect Agreement: Kappa ≥ 0.81.
  • Substantial Agreement: 0.61 ≤ Kappa ≤ 0.80.
  • Moderate Agreement: 0.41 ≤ Kappa ≤ 0.60.
  • Fair Agreement: 0.21 ≤ Kappa ≤ 0.40.
  • No to Slight Agreement: Kappa ≤ 0.20.
Using the guidelines from Landis and Koch, a minimum Kappa of 0.61, representing “substantial agreement,” was used.
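The sketch below ties these pieces together for a two-judge sort: it computes the agreement ratio and Cohen's Kappa from a judge-by-judge contingency table of the form shown in Figure 3 and maps the result to a Landis and Koch band. The category counts are illustrative only and are not the values reported in Figures 5 and 6.

```python
import numpy as np

def qsort_agreement(counts: np.ndarray) -> tuple[float, float]:
    """Agreement ratio and Cohen's Kappa from a two-judge contingency table.

    counts[i, j] = number of items Judge 1 placed in category i and
    Judge 2 placed in category j (the layout of Figure 3).
    """
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()               # normalized table (Figure 4)
    p_observed = np.trace(p)                # diagonal = actual agreements
    p_chance = (p.sum(axis=1) * p.sum(axis=0)).sum()  # cross row/column totals
    kappa = (p_observed - p_chance) / (1.0 - p_chance)
    return p_observed, kappa

def landis_koch_band(kappa: float) -> str:
    """Map a Kappa value to the Landis and Koch (1977) agreement band."""
    if kappa >= 0.81:
        return "almost perfect"
    if kappa >= 0.61:
        return "substantial"
    if kappa >= 0.41:
        return "moderate"
    if kappa >= 0.21:
        return "fair"
    return "none to slight"

# Illustrative counts for the sensing / seizing / transforming bins (N = 88).
table = np.array([
    [20,  4,  2],
    [ 3, 25,  3],
    [ 2,  3, 26],
])
ratio, kappa = qsort_agreement(table)
print(f"agreement ratio = {ratio:.2f}, kappa = {kappa:.2f} ({landis_koch_band(kappa)})")
```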

2.4. Applying the Q-Sort Method to Organizational Agility

The Q-sort method was applied to the agility characteristics already described. The ultimate goal was to determine which, if any, characteristics in the set were redundant. In accordance with the recommendations by Ozer [53], two judges were used. Both judges had backgrounds representative of the survey respondents expected later in this research but possessed minimal knowledge of the Q-sort method. The judges were given one hour of training and an expectations briefing. The two-day evaluation was performed in 2019. The procedure required a two-round Q-sort, with each round further delineating and categorizing each characteristic [53].
Round 1. Both judges were given the complete set of items from Table 3 (N = 88) and were asked to categorize each item. Previous research on the OA framework by Teece et al. [23] resulted in three categories of OA characteristics: sensing, seizing, and transforming. These three categories, along with the descriptions provided by Teece et al. [23], were used to form the bins for the first round. A brief description of these categories was given to each judge to better align their meaning with that of the original authors and to reduce any pre-conceived notions. Each item was written on a 3 × 5 index card, and the cards were shuffled (randomized) for each judge to ensure independence. Once both judges were complete, the cards were tallied and the agreement ratio calculated, as shown in Figure 5. The dataset was then normalized (divided by N), and Cohen’s Kappa was calculated to be 0.74, as shown in Figure 6. This met the criterion of 0.61 (“substantial agreement”), and the process advanced to round 2.
Round 2. The categories used in round 1 (sensing, seizing, transforming) were each broken down into subcategories. The judges were allowed to select the subcategories via a discussion and consensus process amongst themselves. Although the judges were allowed between two and five subcategories, each selection resulted in three subcategories. From there, the same process as described in the previous round was repeated. The hierarchical structure and results of round 2 are shown in Figure 7. It is important to note that the first pass through the transforming category produced a Cohen’s Kappa of 0.498. This was significantly lower than the goal of 0.61, so a mediation round occurred. During this mediation round, each judge was given 60 s to discuss the disparate items. Following the time-limited discussion, each judge then re-scored the items in secret. After the second attempt within the transforming category, the Cohen’s Kappa increased to 0.914. The mediation process had been pre-determined and agreed upon by the judges before the start of the sorting; however, extreme caution should be taken when employing such a technique, as it may invalidate the assumption of independence. In this case, it was determined that the breach of independence was preferred over proceeding with a Cohen’s Kappa of 0.498.
At this point, the Q-sort method was complete. A final round of discussions was held to determine which, if any, items were redundant. The judges were given the items one subcategory at a time (of the nine total subcategories) and searched for redundancies. Open discussion and deliberation were encouraged, and both judges had to agree before a redundancy was declared. In most cases, a redundancy involved two items; however, a few three-item redundancies did occur. In total, 24 redundant items were removed from the list.

3. Results and Discussion

In the ongoing effort to identify the characteristics of an agile organization, this research accomplished three important objectives.
First, through the analysis of the available OA definitions, an acceptable, commonly applicable definition was found that can be utilized in the defense sector and for future OA measure development. This definition was found by evaluating 24 previously offered definitions and adjusting them for the relevant context. By disassembling each of them into its basic components, analysis was completed to determine the relevance of each piece in the greater context of OA. A definition that contained the most important components, while purposely omitting contentious items, was selected and supported. This effort culminates in a single, commonly accepted definition that can be used by organizational behaviorists, researchers, and practitioners from here forward.
Second, utilizing three highly researched and distinct sets of OA characteristics, each representing a different domain or industry focus, a larger, more encompassing set was created. Since each of the original characteristic sets found in the literature was the result of an extensive study, each was expected to contain all of the characteristics needed to construct a measure for OA. Further, each of these sets was the result of a different research method and/or domain, thus resulting in different, albeit similar, outcome sets. The aggregation of the characteristic sets, by its very nature, greatly decreased the likelihood that a particular important characteristic was missing, as it would have to have been missing in all three of the original researchers’ lists. Thus, a more complete characteristic set was created.
Third, the aggregated set contained some redundant and overlapping terms. By using the Q-sort method, each characteristic was systematically analyzed against all other items in the set. Twenty-four characteristics were selected for removal, reducing the set by 27%. The reduced set offers significant advantages over the full, aggregated set. During future efforts in this area, researchers can more efficiently focus their attention, and if a survey is used, respondents will be better able to answer the questions and will make fewer errors due to concept overlap. Table 4 contains the final, reduced set of OA characteristics.
Together, these three objectives help in establishing a common understanding of OA. Further, they form the necessary foundation to establish a method to measure, and ultimately improve, OA.

4. Significance

The result is a suitable definition of organizational agility applicable to military organizations and a list of potential associated characteristics that summarizes related research to date. This groundwork establishes the foundation to conduct a multi-organization study to further refine the characteristic list and ultimately develop a method to measure organizational agility. With these results, practitioners can identify the important characteristics related to OA and can refocus internal training and resources to improve their organization in terms of OA. The foundations of OA developed here provide the bridge and re-invigoration needed in the ongoing study of OA and the ultimate goal of fully measuring it.

5. Limitations and Future Work

This research encountered several limitations. First, although the literature review found a significant number of related publications, recency was an ongoing issue, and the review was limited to publications in English; thus, the probability that numerous non-English publications were omitted is quite high. Second, the lists of characteristics used were drawn from the manufacturing domain and from a study of the rapidness of defense acquisitions; no direct study of OA-related characteristics specific to the military sector was found. Third, only two judges with relevant experience were available for the two-day Q-sort. Adding a third or fourth judge would likely have extended the process, which would have caused additional availability issues with the existing judges.
Future work in this arena is envisioned to include (1) further literature searches using additional search techniques, databases, and languages; (2) re-accomplishing the Q-sort with new, and possibly more, judges to compare against the existing results; (3) research to solicit additional OA-related characteristics unique to military organizations; and (4) development of a survey to collect data on OA using the reduced set of characteristics.

6. Conclusions

There is a continuous need for organizations to become agile in order to survive and succeed amongst their peers. A method to accurately measure organizational agility within the DoD has yet to be fully developed. Through a literature review, a suitable and formal definition of organizational agility was identified and supported. An initial set of related characteristics, which can be used to develop a latent construct, was discovered and analyzed. Utilizing the Q-sort method, redundant characteristics were eliminated, resulting in 64 remaining characteristics that will be used to develop the necessary survey questions to continue this research.

Author Contributions

Conceptualization, J.G. and D.J.; methodology, J.E. and J.G.; software, J.G.; validation, J.G., J.E. and D.J.; formal analysis, J.G.; investigation, J.G.; resources, J.G.; data curation, J.G.; writing—original draft preparation, J.G.; writing—review and editing, J.E. and D.J.; visualization, J.G.; supervision, D.J.; project administration, D.J.; funding acquisition, D.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Modigliani, P. Speed and Agility: How Defense Acquisition Can Enable Innovation. In Proceedings of the Thirteenth Annual Acquisition Research Symposium, Monterey, CA, USA, 30 April 2016. [Google Scholar]
  2. Ryan, E.; Jacques, D.; Colombi, J. An Ontological Framework for Clarifying Flexibility-Related Terminology via Literature Survey. Syst. Eng. 2012, 16, 99–110. [Google Scholar] [CrossRef]
  3. Goldman, S.L.; Nagel, R.; Preiss, K. Agile Competitors and Virtual Organizations: Strategies for Enriching the Customer; Wiley: New York, NY, USA, 1994. [Google Scholar]
  4. Gehani, R.R. Time-based management strategic roles. Int. J. Oper. Prod. Manag. 2010, 15, 19–35. [Google Scholar] [CrossRef]
  5. Cho, H.; Jung, M.; Kim, M. Enabling technologies of agile manufacturing and its related activities in Korea. Comput. Ind. Eng. 1996, 30, 323–334. [Google Scholar] [CrossRef]
  6. Morgan, G. Images of Organization, 2nd ed.; Sage Publications: Thousand Oaks, CA, USA, 1997. [Google Scholar]
  7. Dyer, L.; Shafer, R.A. From Human Resource Strategy to Organizational Effectiveness: Lessons from Research on Organizational Agility; CAHRS Working Paper Series; Cornell University: Ithaca, NY, USA, 1998; p. 125. [Google Scholar]
  8. Kidd, P.T. Agile manufacturing: A strategy for the 21st century. In Proceedings of the IEE Colloquium on Agile Manufacturing, Coventry, UK, 26 October 1995. [Google Scholar]
  9. Feng, S.C.; Zhang, C. A Modular Architecture for Rapid Development of CAPP Systems for Agile Manufacturing. IIE Trans. 1998, 30, 893–903. [Google Scholar] [CrossRef]
  10. Sharifi, H.; Zhang, Z. A Methodology for Achieving Agility in Manufacturing Organizations: An introduction. Int. J. Prod. Econ. 1999, 62, 7–22. [Google Scholar] [CrossRef]
  11. Yusuf, Y.Y.; Sarhadi, M.; Gunasekaran, A. Agile manufacturing: The drivers, concepts and attributes. Int. J. Prod. Econ. 1999, 62, 33–43. [Google Scholar] [CrossRef]
  12. Grewal, R.; Tansuhaj, P. Building Organizational Capabilities for Managing Economic Crisis: The Role of Market Orientation and Strategic Flexibility. J. Mark. 2001, 65, 67–80. [Google Scholar] [CrossRef]
  13. Dove, R. Response Ability: The Language, Structure, and Culture of the Agile Enterprise; Wiley: New York, NY, USA, 2002. [Google Scholar]
  14. Alberts, D.S.; Hayes, R.E. Power to the Edge: Command Control in the Information Age; Information Age Transformation Series; CCRP Publication Series: Arlington, VA, USA, 2003. [Google Scholar]
  15. Van Oosterhout, M.; Waarts, E.; van Hillegersberg, J. Change factors requiring agility and implications for IT. Eur. J. Inf. Syst. 2006, 15, 132–145. [Google Scholar] [CrossRef]
  16. Erande, A.S.; Verma, A.K. Measuring agility of organizations—A comprehensive agility measurement tool (CAMT). Int. J. Appl. Manag. Technol. 2008, 6, 31–44. [Google Scholar]
  17. Kosonen, M.; Doz, Y. Fast Strategy: How Strategic Agility Will Help You Stay Ahead of the Game; Pearson Education Limited: Harlow, UK, 2007. [Google Scholar]
  18. Worley, C.G.; Lawler, E.E., III. Effective Organizations Agility and Organization Design: A Diagnostic Framework; CEO Publications: Los Angeles, CA, USA, 2009; Volume 1, pp. 1–37. [Google Scholar]
  19. Lu, Y.; Ramamurthy, K. Understanding the link between information technology capability and organizational agility: An empirical examination. MIS Q. 2011, 35, 931–954. [Google Scholar] [CrossRef] [Green Version]
  20. Weber, Y.; Tarba, S.Y. Strategic Agility: A State of the Art Introduction to the Special Section on Strategic Agility. Calif. Manag. Rev. 2014, 56, 5–12. [Google Scholar] [CrossRef]
  21. Worley, C.G.; Williams, T.D.; Williams, T.; Lawler, E.E., III. The Agility Factor: Building Adaptable Organizations for Superior Performance; Wiley: San Francisco, CA, USA, 2014. [Google Scholar]
  22. Lee, O.K.D.; Sambamurthy, V.; Lim, K.H.; Wei, K.K. How Does IT Ambidexterity Impact Organizational Agility? Inf. Syst. Res. 2015, 26, 398–417. [Google Scholar] [CrossRef]
  23. Teece, D.; Peteraf, M.; Leih, S. Dynamic Capabilities and Organizational Agility: Risk, Uncertainty, and Strategy in the Innovation Economy. Calif. Manag. Rev. 2016, 58, 13–35. [Google Scholar] [CrossRef] [Green Version]
  24. Walter, A.-T. Organizational Agility: Ill-Defined and Somewhat Confusing? A Systematic Literature Review and Conceptualization. Manag. Rev. Q. 2020, 1–49. [Google Scholar] [CrossRef] [Green Version]
  25. Ability, Capability, Capacity and Competence Blog: The Knowledge Economy. Business Process Incubator. Available online: https://www.businessprocessincubator.com/content/ability-capability-capacity-and-competence/#:~:text=A%20Capacity%20is%20the%20ability,be%20aware%20of%20its%20competence (accessed on 24 September 2020).
  26. Merriam-Webster Dictionary. Available online: https://www.merriam-webster.com/dictionary/ (accessed on 27 February 2019).
  27. Wildavsky, A.B. Searching for Safety; Social Philosophy and Policy Center Transaction Books: New Brunswick, NJ, USA, 1998; Volume 10. [Google Scholar]
  28. Home, J.F., III; Orr, J.E. Assessing behaviors that create resilient organizations. Employ. Relat. Today 1997, 24, 29–39. [Google Scholar] [CrossRef]
  29. Bunderson, J.S.; Sutcliffe, K.M. Comparing alternative conceptualizations of functional diversity in management teams: Process and performance effects. Acad. Manag. J. 2002, 45, 875–893. [Google Scholar]
  30. Riolli, L.; Savicki, V. Information system organizational resilience. Omega 2003, 31, 227–233. [Google Scholar] [CrossRef]
  31. Sutcliffe, K.M.; Vogus, T.J. Organizing for Resilience. In Positive Organizational Scholarship; Cameron, K., Quinn, R.E., Eds.; Berrett-Koehler: San Francisco, CA, USA, 2003; Volume 94, p. 110. [Google Scholar]
  32. Gittell, J.H.; Cameron, K.; Lim, S.; Rivas, V. Relationships, Layoffs, and organizational resilience: Airline industry responses to September 11. J. Appl. Behav. Sci. 2006, 42, 300–329. [Google Scholar] [CrossRef]
  33. Vogus, T.J.; Sutcliffe, K.M. Organizational resilience: Towards a theory and research agenda. In Proceedings of the 2007 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2007, Montreal, QC, Canada, 1 December 2007; pp. 3418–3422. [Google Scholar]
  34. Lengnick-Hall, C.A.; Beck, T.E.; Lengnick-Hall, M.L. Developing a capacity for organizational resilience through strategic human resource management. Hum. Resour. Manag. Rev. 2011, 21, 243–255. [Google Scholar] [CrossRef]
  35. Cohen, W.M.; Levinthal, D.A. Absorptive capacity: A new perspective on learning and innovation. Adm. Sci. Q. 1990, 35, 128–152. [Google Scholar] [CrossRef]
  36. Henderson, R.M.; Clark, K.B. Architectural innovation: The reconfiguration of existing product technologies and the failure of established firms. Adm. Sci. Q. 1990, 35, 9–30. [Google Scholar] [CrossRef] [Green Version]
  37. Kendall, F. Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs; Office of the Deputy Assistant Secretary of Defense Systems Engineering: Washington, DC, USA, 2017. [Google Scholar]
  38. Everitt, B. An Introduction to Latent Variable Models; Chapman and Hall: London, UK, 1984. [Google Scholar]
  39. Singh, J.; Sharma, G.; Hill, J.; Schnackenberg, A. Organizational Agility: What it is, What it is not, and Why it Matters. Acad. Manag. Proc. 2018, 2013, 1–40. [Google Scholar] [CrossRef]
  40. March, J.G. Exploration and exploitation in organizational learning. Organ. Sci. 1991, 2, 71–87. [Google Scholar] [CrossRef]
  41. Adler, P.S.; Goldoftas, B.; Levine, D.I. Flexibility Versus Efficiency? A Case Study of Model Changeovers in the Toyota Production System. Organ. Sci. 2008, 10, 43–68. [Google Scholar] [CrossRef]
  42. Bahrami, H. The Emerging Flexible Organization: Perspectives from Silicon Valley. Calif. Manag. Rev. 2012, 34, 33–52. [Google Scholar] [CrossRef]
  43. Kuruppalil, Z. Leanness and Agility in Job Shops: A Framework for a Survey Instrument Developed Using the Delphi Method. Ph.D. Thesis, Indiana State University, Terre Haute, IN, USA, August 2007. [Google Scholar]
  44. Lomas, C.; Wilkinson, J.; Maropoulos, P.; Matthews, P. Measuring Design Process Agility for the Single Company Product Development Process. Int. J. Agil. Manuf. 2006, 9, 105–112. [Google Scholar] [CrossRef] [Green Version]
  45. Skyrme, D.J.; Amidon, D.M. New Measures of Success. J. Bus. Strategy 1998, 19, 20–24. [Google Scholar] [CrossRef]
  46. Colquitt, J.A. On the dimensionality of organizational justice: A construct validation of a measure. J. Appl. Psychol. 2001, 86, 386–400. [Google Scholar] [CrossRef] [Green Version]
  47. Gunasekaran, A.; Yusuf, Y.Y. Agile manufacturing: A taxonomy of strategic and technological imperatives. Int. J. Prod. Res. 2002, 40, 1357–1385. [Google Scholar] [CrossRef]
  48. Lepore, D.F.; Colombi, J.; Wade, J.; Boehm, B.; Majchrzak, A.; Lane, J.A.; Koolmanojwong, S.; Hudson, G.; Hudson, A.; Lawrence, T.; et al. Expedited Systems Engineering for Rapid Capability and Urgent Needs; A013 Final Technical Report SERC-2012-TR-034; Systems Engineering Research Center: Hoboken, NJ, USA, 31 December 2012; pp. 1–144. [Google Scholar]
  49. Nahm, A.Y.; Rao, S.S.; Solis-Galvan, L.E.; Ragu-Nathan, T.S. The Q-Sort Method: Assessing Reliability and Construct Validity of Questionnaire Items at A Pre-Testing Stage. J. Mod. Appl. Stat. Methods 2016, 1, 114–125. [Google Scholar] [CrossRef]
  50. Cattell, R.B. The Description and Measurement of Personality; Measurements and Adjustments Series; World Book Company: Yonkers, NY, USA, 1946. [Google Scholar]
  51. Stephenson, W. The Study of Behavior: Q Technique and Its Methodology; University of Chicago Press: Chicago, IL, USA, 1953. [Google Scholar]
  52. Block, J. The Q-Sort Method in Personality Assessment and Psychiatric Research; Charles C. Thomas Publisher: Springfield, IL, USA, 1961. [Google Scholar]
  53. Ozer, D.J. The Q-Sort Method and the Study of Personality Development. In Studying Lives through Time: Personality and Development; Funder, D.C., Parke, R.D., Tomlinson-Keasey, C., Widaman, K., Eds.; American Psychological Association (APA): Washington, DC, USA, 1993; pp. 147–168. [Google Scholar] [CrossRef]
  54. Landis, J.R.; Koch, G.G. The Measurement of Observer Agreement for Categorical Data. Biometrics 1977, 33, 159–174. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Frequency of definition components of organizational agility.
Figure 2. Foundations of agile organizations [23].
Figure 3. Two-judge agreement during Q-sort.
Figure 4. Normalized two-judge agreement during Q-sort.
Figure 5. Round 1 Q-sort results—agreement ratio.
Figure 6. Round 1 Q-sort results—Cohen’s Kappa.
Figure 7. Hierarchical layout and results of Q-sort.
Table 1. Summary of organizational agility definitions [2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24].

| Year | Author(s) | Definition | Capability/Capacity |
| --- | --- | --- | --- |
| 1995 | Goldman, Nagel & Preiss (Goldman et al., 1995) | Firm’s ability to cope with rapid, relentless, and uncertain changes and thrive in a competitive environment of continually and unpredictably changing opportunities. | XX |
| 1995 | Gehani (Gehani, 1995) | An agile organization can quickly satisfy customer orders; can introduce new products frequently in a timely manner; and can even get in and out of its strategic alliances speedily. | X |
| 1996 | Cho, Jung & Kim (Cho et al., 1996) | Capability of surviving and prospering in a competitive environment of continuous and unpredictable change by reacting quickly and effectively to changing markets, driven by customer-designed products and services. | X |
| 1997 | Morgan (Morgan, 1997) | Internal operations at a level of fluidity and flexibility that matches the degree of turmoil in external environments. | X |
| 1998 | Dyer & Shafer (Dyer & Shafer, 1998) | Capacity to be infinitely adaptable without having to change…necessary core competence for organizations operating in dynamic external environments…develop a built-in capacity to shift, flex, and adjust either alone or with alliance partners, as circumstances change. | XX |
| 1998 | Kidd (Kidd, 1995) | Unites organizational processes and people with advanced technology to meet customer demands for customized high quality products and services in a relatively short timeframe. | X |
| 1998 | Feng & Zhang (Feng and Zhang, 1998) | An agile enterprise could swiftly reconfigure operations, processes, and business relationships, thriving in an environment of continuous and unpredictable change. | X |
| 1999 | Sharifi & Zhang (Sharifi and Zhang, 1999) | The ability to cope with unexpected changes, to survive unprecedented threats of business environment, and to take advantage of changes as opportunities. | X |
| 1999 | Yusuf, Sarhadi & Gunasekaran (Yusuf et al., 1999) | Agility is the successful exploration of competitive bases through the integration of reconfigurable resources and best practices in a knowledge-rich environment to provide customer-driven products and services in a fast changing market environment. | X |
| 2001 | Dove (Dove, 2002) | Providing the potential for an organization to thrive in a continuously changing, unpredictable business environment. | X |
| 2001 | Grewal & Tansuhaj (Grewal & Tansuhaj, 2001) | Organizational ability to manage economic and political risks by promptly responding in a proactive or reactive manner to market threats and opportunities. | X |
| 2003 | Alberts & Hayes (Alberts & Hayes, 2003) | The synergistic combination of robustness, resilience, responsiveness, flexibility, innovation, and adaption. | XX |
| 2006 | Van Oosterhout et al. (Van Oosterhout et al., 2006) | The ability to swiftly and easily change businesses and business processes beyond the normal level of flexibility to effectively manage unpredictable external and internal changes. | X |
| 2008 | Erande & Verma (Erande & Verma, 2008) | Ability to respond to unpredictable changes with quick response and profitability. | X |
| 2008 | Doz & Kosonen (Kosonen & Doz, 2007) | Capacity to continuously adjust and adapt strategic direction in a core business to create value for a company. | X |
| 2009 | Worley & Lawler (Worley & Lawler, 2009) | Dynamic organization design capability that can sense the need for change from both internal and external sources, carry out those changes routinely, and sustain above average performance. | XX |
| 2011 | Tallon & Pinsonneault (Tallon & Pinsonneault, 2011) | Agility is the persistent, systemic variations in an organization’s outputs, structures or processes that are identified, planned, and executed as a deliberate strategy to gain competitive advantage. | X |
| 2011 | Ryan, Jacques & Colombi (Ryan et al., 2012) | The measure of how quickly a system’s capabilities can be modified in response to external change. | X |
| 2011 | Lu & Ramamurthy (Lu and Ramamurthy, 2011) | Firm-wide capability to deal with changes that often arise unexpectedly in business environments via rapid and innovative responses that exploit changes as opportunities to grow and prosper. | X |
| 2014 | Weber & Tarba (Weber & Tarba, 2014) | The ability to remain flexible in the face of new developments. | X |
| 2014 | Worley, Williams, Lawler & O’Toole (Worley et al., 2014) | The capability to make timely, effective, sustained organizational change…a repeatable organizational resource. | X |
| 2015 | Lee, Sambamurthy, Lim & Wei (Lee et al., 2015) | Firm’s ability to simultaneously pursue exploration and exploitation in their management of IT resources and practices. | X |
| 2016 | Teece, Peteraf & Leih (Teece et al., 2016) | Capacity of an organization to efficiently and effectively redeploy/redirect its resources to value creating and value protecting (and capturing) higher-yield activities as internal and external circumstances warrant. | X |
| 2020 | Walter (Walter, 2020) | Organizational Agility is a learned, permanently-available dynamic capability that can be performed to a necessary degree in a quick and efficient fashion, and whenever needed in order to increase business performance in a volatile market environment. | X |
Table 2. Summary of organizational resiliency definitions [27,28,29,30,31,32,33,34].
Year | Author(s) | Definition | Recover | Advance
1988 | Wildavsky (Wildavsky, 1988) | The capacity to cope with unanticipated dangers after they have become manifest. | X
1998 | Horne III & Orr (Horne III & Orr, 1997) | Resilience is a fundamental quality of individuals, groups, organizations, and systems as a whole to respond productively to significant change that disrupts the expected pattern of events without engaging in an extended period of regressive behavior. | X
2002 | Bunderson & Sutcliffe (Bunderson & Sutcliffe, 2002) | Capacity to maintain desirable functions and outcomes in the midst of strain. | X
2003 | Riolli & Savicki (Riolli & Savicki, 2003) | Organizational ability to manage economic and political risks by promptly responding in a proactive or reactive manner to market threats and opportunities. | X | X
2003 | Sutcliffe & Vogus (Sutcliffe & Vogus, 2003) | The ability to absorb strain or change with a minimum of disruption. | X
2006 | Gittell, Cameron, Lim & Rivas (Gittell et al., 2006) | Ability to bounce back from crisis. | X
2007 | Vogus & Sutcliffe (Vogus & Sutcliffe, 2007) | Maintenance of positive adjustment under challenging conditions such that the organization emerges from those conditions strengthened and more resourceful. | X
2011 | Lengnick-Hall, Beck & Lengnick-Hall (Lengnick-Hall et al., 2011) | Ability to effectively absorb, develop situation-specific responses to, and ultimately engage in transformative activities to capitalize on disruptive surprises that potentially threaten organization survival. | X | X
Table 3. Initial (expanded) set of organizational agility characteristics.
Manufacturing | Kuruppalil (1998)
Adaptive evaluation and reward metric | Knowledge management
Capability to quickly adjust business & manufacturing strategies | Knowledge of competitors
Capability to quickly adjust organizational characteristics/design | Mass customization
Concurrent engineering | Multi-skilled people
Concurrent technology | Organization flexibility
Continuous improvement | Proactive customer relationships
Customer and supplier integration | Proactive exploration of new opportunities
Decentralized organization | Product model flexibility capability
Developing unique capabilities & characteristics | Product volume flexibility capability
Development of effective responses to new challenges | Pull production
Effective sensing of changes in the business environment | Quality over product life
Electronic commerce | Quick response to changing regulation/legislation
Employee satisfaction | Rapid adjustment of people capabilities (skills & knowledge)
Empowering workforce with knowledge | Rapid adoption of new methods, techniques, tech & processes
Encouraging innovation | Rapid delivery
Enhancing skill and knowledge by training | Rapid partnership
External integration of information | Rapid prototyping
Fast product development cycle | Reconfigurable production/process technology
Faster manufacturing times | Reconfigurable supply chain and business partnership
Flexible production technology | Responsiveness to market change
Internal integration of information | Team based leadership
Investing in innovation | Virtual enterprising
Investment in appropriate technology
Manufacturing Job Shops | Yusuf, Sarhadi & Gunasekaran (1999)
Concurrent execution of activities | Short development cycle times
Enterprise integration | Continuous improvement
Information accessible to employees | Culture of change
Multi-venturing capabilities | Rapid partnership formation
Developed business practice difficult to copy | Strategic relationship with customers
Empowered individuals working in teams | Close relationship with suppliers
Cross functional teams | Trust-based relationship with customers/suppliers
Teams across company borders | New product introduction
Decentralized decision making | Customer-driven innovations
Technology awareness | Customer satisfaction
Leadership in the use of current technology | Response to changing market requirements
Skill and knowledge enhancing technologies | Learning organization
Flexible production technology | Multi-skilled and flexible people
Quality over product life | Workforce skill upgrade
Products with substantial value-addition | Continuous training and development
First-time right design | Employee satisfaction
DoD “Rapid” Acquisitions | Lepore & Colombi (2012)
Build and Maintain Trust | Right-size the Program–Eliminate Major Program Oversight
Designing out All Risk Takes Forever… Accept Some Risk | Strive for a Defined Set of Stable Requirements Focused on the Warfighter
Incremental Deployment is Part of the Product Plan | The Government Team Leads the Way
Keep an Eye on “Normalization” | Use Mature Technology–Focus on the State of the Possible
Maintain High Levels of Motivation and Expectations | Work to Exploit Maximum Flexibility Allowed
Populate Your Team with Specific Skills and Experience
Table 4. Final (reduced) set of organizational agility characteristics.
Adaptive evaluation and reward metric | Investment in appropriate technology
Build and Maintain Trust | Knowledge management
Capability to quickly adjust business & manufacturing strategies | Knowledge of competitors
Close relationship with suppliers | Leadership in the use of current technology
Concurrent execution of activities | Learning organization
Continuous improvement | Maintain High Levels of Motivation and Expectations
Continuous training and development | Multi-venturing capabilities
Cross functional teams (including intra & inter company borders) | New product introduction
Culture of change | Partnership
Customer and supplier integration | Populate Your Team with Specific Skills and Experience
Decentralized decision making | Proactive customer relationships
Decentralized organization | Proactive exploration of new opportunities
Designing out All Risk Takes Forever… Accept Some Risk | Product Flexibility
Developed business practice difficult to copy | Products with substantial value-addition
Developing unique capabilities & characteristics difficult to copy | Quality over product life
Development of effective responses to new challenges from competitors | Rapid adjustment of people capabilities (skills & knowledge)
Effective sensing of changes in the business environment | Rapid adoption of new methods, techniques, tech & processes
Electronic commerce | Rapid delivery
Employee satisfaction | Rapid partnership formation
Empowered individuals working in teams | Rapid prototyping
Empowering workforce with knowledge | Responsiveness to market change
Encouraging innovation | Right-size the Program–Eliminate Major Program Oversight
Enhancing skill and knowledge by training | Short development cycle times
Enterprise integration | Skill and knowledge enhancing technologies
External integration of information | Strive for a Defined Set of Stable Requirements Focused on the Warfighter
Fast product development cycle | Team based leadership
Faster manufacturing times | Teams across company borders
First-time right design | Technology awareness
Flexible production technology | Trust-based relationship with customers/suppliers
Incremental Deployment is Part of the Product Plan | Use Mature Technology–Focus on the State of the Possible
Information accessible to employees | Virtual enterprising
Internal integration of information | Work to Exploit Maximum Flexibility Allowed
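Tables 3 and 4 bracket the consolidation step in which the expanded pool of candidate characteristics was narrowed to the final set of unique items by identifying and merging redundant or overlapping entries, an exercise that rests on human judgment rather than software. Purely as an illustrative aside, the short Python sketch below shows one way a reviewer could pre-screen such a list for near-duplicate labels before that kind of manual consolidation. The subset of labels is drawn from Table 3, but the label_similarity helper and the THRESHOLD value are hypothetical choices introduced here for demonstration and are not part of the study's method.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Illustrative excerpt of the expanded characteristic list in Table 3
# (a small subset only; the full study worked with the complete pool).
characteristics = [
    "Rapid partnership",
    "Rapid partnership formation",
    "Cross functional teams",
    "Teams across company borders",
    "Continuous improvement",
    "Continuous training and development",
    "Quality over product life",
    "Flexible production technology",
]

def label_similarity(a: str, b: str) -> float:
    """Crude 0-1 similarity between two characteristic labels."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Arbitrary illustrative cut-off, not a value taken from the study.
THRESHOLD = 0.6

# Flag label pairs similar enough to deserve a second look as
# potentially redundant or overlapping characteristics.
candidate_pairs = [
    (a, b, label_similarity(a, b))
    for a, b in combinations(characteristics, 2)
    if label_similarity(a, b) >= THRESHOLD
]

for a, b, score in sorted(candidate_pairs, key=lambda t: t[2], reverse=True):
    print(f"{score:.2f}  {a!r}  <->  {b!r}")
```

Any pairs flagged this way would still require expert review; string similarity alone cannot distinguish a true duplicate (e.g., "Rapid partnership" versus "Rapid partnership formation") from two genuinely distinct characteristics that merely share wording.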