Article

Development of Perceived Complex Problem-Solving Instrument in Domain of Complex Systems

1 Department of Industrial and Systems Engineering, Mississippi State University, Mississippi State, MS 39762, USA
2 Department of Management, Humanities Faculty, Birjand Branch, Islamic Azad University, Birjand 97177-11111, Iran
* Author to whom correspondence should be addressed.
Submission received: 5 June 2021 / Revised: 26 June 2021 / Accepted: 1 July 2021 / Published: 7 July 2021
(This article belongs to the Section Complex Systems)

Abstract

The ability to solve problems in modern complex systems has become a necessity of the 21st century. The purpose of this study is to develop an instrument that measures an individual's perception toward solving complex problems. Based on the literature and existing definitions, a four-stage instrument named perceived complex problem-solving (PCPS) was designed through exploratory and confirmatory stages. The instrument was validated and scaled through different models, and the final model is discussed. After completing the validation and scale development of the PCPS instrument, the final model was introduced to resolve a gap in the literature. The final model of the PCPS instrument quantifies the degree of perception an individual holds in dealing with complex problems and can be utilized in different settings and environments. Further research on the relationship between systems thinking and CPS revealed that individuals with a high level of systems thinking have a better understanding of the characteristics of complex problems and thus a better perception of CPS.

1. Introduction

Modern complex systems deal with more socio-technical dimensions and interact directly with the surrounding environment, and this interaction creates challenges and issues [1]. Managing this turbulent work environment demands a skillset that involves creativity, continuous learning, innovation, and collaboration. Complex problem-solving (CPS) skills have become a necessary competence in today's workforce [2] and are sought in job seekers. This is evident in programs that emphasize finding better approaches and methods for solving problems in complex-system domains, such as The Program for International Student Assessment (PISA) [3,4], The Program for the International Assessment of Adult Competencies (PIAAC) [5], and the O*net job database (the U.S. Department of Labor's Occupational Information Network) [6]. PISA is an assessment by the Organization for Economic Co-operation and Development (OECD) that includes the assessment of students' problem-solving skills and the direct assessment of life competencies that apply across different areas of the school curriculum. PIAAC is an international assessment of adult skills managed by the OECD, currently implemented by 25 countries in Europe, the Americas, and Asia. Although CPS has received attention in the literature, it is still not clearly defined, and the continued divergence in definitions and perspectives will muddle the field and slow the development of methods that can be applied across disciplines [7].
Within the 21st century, modern complex systems still confront challenges marked by a high level of integration, ambiguity, uncertainty, and interdependence between systems and their related elements, blurring the lines between technical, social, political, managerial, and organizational considerations [8,9]. Ackoff [10] claimed that one of the approaches that helps us evaluate and understand the complexities and challenges of third-millennium organizations is a systemic approach or systemic attitude, and he stated that, in dealing with complex systems problems, one should focus on the system as a whole rather than on the parts. In systems theory, problems are studied based on their conditions, requirements, and development; their contributing factors and interrelationships are examined; and appropriate solutions are provided. Therefore, systems thinking is necessary for a more comprehensive and systematic approach to dealing effectively with modern complex systems and their problems and challenges. The study of factors that strengthen CPS skills helps employers hire competent employees and invest in their training.
In the 2000s, there was a belief that systems thinking could be an answer to complex systems problems [11,12,13], and there is convergence around their definitions [14,15]. This belief was later translated into action, as studies appeared showing the significance of systems thinking in the domain of complex systems and in recruiting employees [16]. However, what remained unanswered is the relationship between an individual's systems thinking (ST) and his/her general perception of the different stages of the CPS process; this is a current gap in the literature.
To address the gap and to improve the body of knowledge, the aims of the study are (1) to develop and validate a new perceived complex problem-solving (PCPS) instrument and (2) to investigate the relationship between ST and CPS using the developed instrument. The intent of the study is also to compare the effect of seven different dimensions of systems thinking, discussed later [1], on the performance of CPS.
The contribution of the study has three dimensions. From a methodological dimension, because measurement in this field has relied on simulation methods and has lacked an instrument that is easy to use and grounded in CPS theories, this study develops and validates a new CPS instrument for the literature. Several validity and reliability measures were conducted to establish the instrument's development. From a theoretical dimension, this study is important for academics since it helps bridge the literature gap in the field by providing comparisons and relationships between different systems thinking dimensions and the perception of the CPS stages. From a practical dimension, this study emphasizes the importance of employees who possess high-level ST and CPS skills for dealing with modern complex system problems; it therefore encourages HRM professionals to consider ST and CPS skills as work requirements when recruiting employees and to hold training programs for both experienced managers and newcomers in the organization. This study can also be implemented in educational programs to evaluate and screen students' skillsets and capabilities regarding modern complex system problems.
An overview of CPS and ST is provided next, followed by the research hypotheses, the research methods, and the analyses performed to assess the validity and reliability of the theoretical model. The study concludes with a discussion, implications, limitations, and future research.

2. Background and Hypotheses

2.1. Complex Problem-Solving

2.1.1. Theories in Problem-Solving

Theories of how a problem is solved can be grouped into three families.
(1)
Behaviorist: The behaviorists explain problem-solving through stimulus–response interactions, reflecting their emphasis on trial-and-error learning and habit strength [17].
(2)
Cognitive: As cognitive psychology progressed through the work of the Gestaltists, more focus and attention were devoted to the mental processes of learning and problem-solving. This view holds that solving a problem is done in several steps, each with a specific goal. Wallas and Polya were two cognitive psychologists who identified four stages of problem-solving [18,19]. The four stages of Wallas were preparation, incubation, inspiration, and verification, and the four stages of Polya were (a) understanding the problem, (b) devising a plan, (c) carrying out the plan, and (d) looking back. Polya promoted general problem-solving strategies called heuristics and introduced them as keys to problem-solving expertise and intellectual performance. Heuristic methods were related to concepts such as analogical problem-solving [20,21], symbolic problem-solving [22,23], and abstract thought such as categorization, induction, and generalization [24,25,26].
(3)
Information processing: As differences between experts and novices in solving problems emerged, the information-processing theory of learning evolved. These theories attend to the distinctive way human beings collect and process information to solve their problems, much as computers solve very complex problems. Based on this theory, Newell and Simon worked in artificial intelligence and connected human problem-solving with artificial intelligence [27]. In this theory, problem-solving is related to concepts such as working memory capacity [28,29], reasoning [30,31], long-term memory, and cognitive retrieval of relevant information [32].
Today, the dominant attitude in the study of problem-solving is the theory of information processing, which, like the Gestalt view of problem-solving, has certain assumptions. This theory holds that solving a problem is done in several steps, each with a specific goal. Problem-solving combines evaluating the situation of a given problem with selecting and applying the next specific steps, and this process continues until the problem is solved, just as a computer operates.

2.1.2. Complex Problem-Solving

Modern complex problems are considered ill-defined problems that lack clear paths to an optimal solution [33]. As complexity grows, it becomes difficult for problem solvers to evaluate the performance of the system, since extracting information may itself be difficult. Therefore, the problem solver interacts with the task until he/she obtains information about progression [34] and reduces the gap between the initial state and the goal state by performing non-routine cognitive activities [14,35].
Research on problem-solving began in cognitive psychology with the experimental work of the Gestaltists in Germany (e.g., Dunker, 1935 in [34]), typically with simple laboratory tasks (e.g., the "disk problem", later known as the "Tower of Hanoi" [36], and Dunker's "X-ray" problem [37]), and it was thought that the findings could be generalized to more complex problems. At the beginning of the 1970s, researchers gradually became convinced that the theoretical concepts and empirical findings from simple laboratory tasks could not be generalized to complex real-world problems, and that even under different circumstances the basic CPS processes were different [38]. Since 1975, after global events such as the oil crisis, a new path has opened in the psychology of thinking that addresses complex problems, leading to different reactions in North America and Europe [34]. The two schools that formed do not define problem-solving in the same way, and their divergent definitions led to different measurements of CPS.
(a)
Two major approaches emerged in Europe: in Britain with Donald Broadbent [39] and in Germany with Dietrich Dörner [40,41]. Both approaches focused on complex laboratory tasks based on computer simulation, but they differed somewhat in theoretical objectives and methods. In the British approach, mathematical problems were used in computer-simulated systems to examine cognitive problem-solving processes under conscious and unconscious conditions.
In the German school, Funke and Frensch [34] stated that in simple problem-solving one obstacle must be removed, while in CPS several obstacles require a set of cognitions and prioritization programs to move toward the target situation. Dörner and Funke [42] claimed that Funke and Frensch's definitions did not fully capture the content or the relationship between the simulation and the real world. Therefore, they redefined practical CPS as a collection of self-regulated psychological processes and activities that combine cognitive, motivational, and emotional aspects in a dynamic environment to achieve bricolage rather than perfect or optimal solutions. Complex problems require rich knowledge and collaboration among many people [42]. In PISA 2012, CPS is defined as the individual's capacity for cognitive processing to understand and solve problem situations [3]. PISA 2015 defined collaborative problem-solving and showed that students with a high level of collaborative problem-solving ability could successfully carry out complicated problem-solving tasks with high collaboration complexity [4]. PIAAC defines problem-solving in technology-rich environments [5].
Based on the German school definition, in the early 1980s, Dörner introduced computer-simulated "microworld" scenarios such as Tailorshop [43] and "Lohhausen" [44], with several variables, to allow experimental research on complex problems under controlled conditions [45]. Researchers in this field found that, although the upper limit of complexity is unbounded, the lower limits can be identified [46]. Therefore, they introduced "minimal complex systems" scenarios that consist of a single task or problem [47]. A "multiple complex systems" approach [48] was then introduced in response to the weaknesses of minimal complex systems.
(b)
The CPS definition in the North American approach emphasizes "the study of cognition in complex real-world conditions" [49] (p. 135), and several techniques and tools were developed in this approach. The O*net staff survey, a result of the efforts of the US Department of Labor, has produced several tools for measuring skills, knowledge, and abilities. It assessed the importance of complex problem-solving in different occupations with eight items in the prototype version, later revised to one item [6,50]. Although other tools such as personal problem-solving [51], managerial problem-solving [52], problem-solving styles [53], and social problem-solving [54] were developed in this approach, research toward a general theory for evaluating CPS abilities is still not presented in the North American literature.
Despite much research in this area, the difference between the concept of a "simple problem" and a "complex problem" is still somewhat obscure, but we know that the greater the number of variables and the denser the relationships between them, the greater the complexity of the problem [49,55]. It is still an open question which measurement can best assess CPS or whether various other constructions should be proposed [7]. After an extensive survey of the literature that has attempted to expand the theories, definitions, and related concepts in the domain of problem-solving, such as [18,19,27,31,33,36,42,43,51,56], the lack of a suitable questionnaire to assess recognition of CPS and its process remains a current gap. We used Sternberg's definitions in his book Cognitive Psychology [33] concerning complex, insightful, and ill-structured problems and the processes of solving them, as well as the definitions and problem-solving processes in the prototype version of the O*net questionnaire and its revision [6,50], to design an instrument that assesses an individual's perception of CPS. The perceived problem-solving inventory does not directly assess problem-solving ability, nor does it assess one's function in a hypothetical problem situation; as stated in various sources in Heppner and Petersen [56], individuals act differently in hypothetical situations than in real ones. This inventory evaluates a person's general knowledge about complex problems and the process of solving them. A true perception of CPS supports us in distinguishing it from simple problem-solving. We know that the barriers between a given state and a goal state are complex, change dynamically during problem-solving, and are intransparent. Different aspects of a given state and the goal state are obscure to problem solvers and hard to identify. Solutions are not immediately obvious; they are combinations of activities resulting from the interaction between different solvers and their situation and are not necessarily perfect or optimal. Awareness of these facts helps us perform better and more realistically in passing through the stages of real-world CPS.
In research conducted annually by The National Association of Colleges and Employers, problem-solving ability is one of the most important skills that employers seek on candidates' resumes. For example, the results of this annual survey showed that in 2016 employers ranked problem-solving skills second, after the ability to work in a team, among the attributes they look for in applicants [57]. This skill topped the list in 2017 [58], and in 2020 [59], 91.2% of respondents stated that it was the first skill they looked for in a candidate's resume. Additionally, Mourshed and her colleagues [60] stated in their survey that employers look for students with high problem-solving skills at the entry stage. Another study [61] showed that problem-solving skills lead to job success for new workforce entrants. In annual O*net surveys, the results show that problem sensitivity is among the top 10 job needs across occupations, and the greatest need for CPS is in occupations with the highest demands, financial stakes, and rewards, such as senior executives, lawyers, judges, crisis managers, or surgeons [6].

2.2. Systems Thinking

Numerous studies have linked complex systems and issues to systems thinking (ST) (e.g., [1,16,62,63,64,65]). Several researchers [14,15] stated that the definitions of CPS and ST overlap to some degree. Funke [14] stated that five attributes distinguish complex problems from simple problems: (1) the complexity of the problem situation; (2) the relationships between the variables involved; (3) the dynamics of the situation and developments within the system, and the role of time; (4) partial or complete lack of transparency; and (5) polytely (a Greek term for "many goals") and the possibility of conflict among several goals. Dörner and Funke [42] considered at least three aspects of complex systems: (1) different levels of abstraction, (2) change (potentially unpredictable) over time, and (3) knowledge-richness with many potential strategies. Jaradat [1] introduced the characteristics of complex systems as (1) increasing complexity, (2) ambiguity, (3) high levels of uncertainty, (4) emergence, (5) evolutionary development, (6) interconnectivity, and (7) integration.
According to Checkland [66], ST is the ability to think and speak in a holistic language to understand and deal with complex system problems. Flood and Carson [67] and Richmond [68] define ST as a framework that helps individuals address complex things. Jaradat and his colleagues stated that an individual's systemic thinking capacity can be an effective response to a complex system problem [1,9]. Although some ST tools and techniques have been developed (e.g., [69,70]), Jaradat and his colleagues developed the systems thinking skills preferences (STSP) instrument (with α = 0.91) based on the grounded theory method, the first instrument for evaluating an individual's systemic thinking capacity. It includes seven dimensions: (1) level of complexity, (2) level of independence (autonomy), (3) level of interaction, (4) level of change, (5) level of uncertainty, (6) level of systems worldview (hierarchical view), and (7) level of flexibility (see Figure 1) [1,9]. This instrument was used in data collection to obtain participants' predisposition toward ST skills.

2.3. Hypotheses Development and the Proposed Theoretical Model

In research, ST has been conceptualized in relation to dealing with complex systems and problems. However, there are still gaps in this area.
a.
Although Maani and Maharaj [13] attempted to show the relationship between ST and performance in CPS in a sample of 10 participants, the relationship between ST and the general perception of complex problems and their nontransparent aspects, without specific training in CPS, has not yet been investigated.
b.
Most complex problem-solving research belongs to the German school and is based on computer simulation. In the North American approach, questionnaires were developed on the importance of CPS [6], personal problem-solving [71], problem-solving styles [53], and social problem-solving [54], regardless of the novelty, simplicity, or complexity of the problems and of whether there are single or multiple barriers or goals. Therefore, there is a lack of questionnaires that assess perceived complex problem-solving based on theory and are easy to use for students, administrators, and employees.
In this study, to address these gaps, a questionnaire was developed to assess an individual's perception of CPS, and its validity and reliability were evaluated through factor analysis; in addition, the relationship between systems thinking and perceived complex problem-solving was examined, which enriches the current body of literature.

2.4. The Relationship between Systems Thinking and Complex Problem-Solving

In many studies, ST is considered an appropriate response to complexity because it provides a more holistic view of a problem area [9]. Senge [72] argued that, due to overwhelming complexity, ST is needed more than ever. Richmond [73] described ST as a superior approach to dealing with complexity. Sweeney and Sterman developed a list of systemic thinking features to assess students' capability with complexity [62]. In another study, Keating, Kaufman, and Dreyer examined whether ST in an organization could provide a framework for analyzing and solving complex issues; the results showed that ST can prepare us to solve problems effectively in today's turbulent environment and can be used as a framework for analyzing and solving problems in the management of organizations [12]. Jackson [11], in his study on the effectiveness of ST in solving complex social problems, showed that ST could be used as a coherent method for solving social problems. In another study, in the information and communications technologies sector, Petkov and his colleagues [74] showed that techniques from soft systems and multiple criteria decision making (MCDM) could be effective in particular stages of a CPS intervention. Considering the widespread belief in a connection between ST and complexity, Maani and Maharaj [13] examined the relationship between ST and performance in CPS to substantiate this belief empirically. Based on simulation tests, they showed that a certain type of ST, and more importantly the subject's approach to the problem, is relevant to solving it.
Given the five features of complex problems [14,49], the features of complex systems [1,9,42] (described in the previous section), and the ST skills [1], it is evident that many complex problems can be managed through ST. ST skills help individuals understand the structure of problems, leading to better problem-solving performance under complexity [13] (p. 7). However, what remains neglected in studies is the effect of ST on the general perception of complex problems and their nontransparent aspects. Therefore, this study considers this issue and evaluates the effects of different ST skills on PCPS.

3. Methodology

In this study, after validation of the perceived complex problem-solving (PCPS) instrument, the relationship between ST and PCPS was examined. In other words, we investigated the impact of systems thinking skills preferences (STSP) on the PCPS of managers and students. To measure this relationship, two studies were performed. The first study targeted managers who face high levels of complex system problems in their organizations, and the second targeted students as the prospective future workforce. Two different samples were used to test the construct validity and internal consistency of the theoretical model across samples. Figure 2 shows the research framework.

3.1. Materials

In this study, two questionnaires were used. The Systems Thinking Skills Preferences (STSP) Questionnaire (with α = 0.91), developed by [1,9], with 39 questions, evaluates seven preferential categories/systems skills dimensions (Figure 1) and determines the individual's inclination toward holistic or reductionist thinking. Based on these dimensions, one score determines the total ST score for each individual. Due to the lack of a suitable questionnaire to assess CPS abilities, a questionnaire consisting of nineteen five-point Likert-scale questions was developed and tested for validity and reliability (with α = 0.89). The questionnaire covers four stages of CPS: (1) Problem Identification and Definition (questions 1–5; an example question in this dimension designed for students is "I am often facing unique and new problems in my engineering coursework."), (2) Information Gathering about problems and solutions (questions 6–11; an example question designed for students is "The methods, resources, or people through which information can be collected are not recognized well."), (3) Evaluating Solutions and Developing Approaches (questions 12–16; an example question designed for students is "It is hard to evaluate and assess the strengths and weaknesses of new ideas and solutions."), and (4) Implementation Planning (questions 17–19; an example question designed for students is "It is difficult to present and develop an executive plan for the realization of new ideas."), which together assess PCPS (see Appendix A, Table A1). All items are scored on a five-point Likert scale, ranging from 1 = strongly disagree to 5 = strongly agree. A total score can be calculated as a general index of a person's PCPS.
These questionnaires measure individuals' perception of CPS and determine their ST skills. Demographic factors were added to the proposed theoretical model.
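As a concrete illustration of the scoring scheme described above, the following is a minimal sketch in Python that computes the four stage scores and the total PCPS index. The column names q1..q19 and the stage-to-item mapping follow the question ranges given above; they are illustrative assumptions, not the authors' published scoring code.

```python
import pandas as pd

# Stage-to-item mapping per the question ranges in the text (assumed names)
STAGES = {
    "problem_identification":  [f"q{i}" for i in range(1, 6)],    # items 1-5
    "information_gathering":   [f"q{i}" for i in range(6, 12)],   # items 6-11
    "evaluating_solutions":    [f"q{i}" for i in range(12, 17)],  # items 12-16
    "implementation_planning": [f"q{i}" for i in range(17, 20)],  # items 17-19
}

def score_pcps(responses: pd.DataFrame) -> pd.DataFrame:
    """Compute the four stage scores and the total PCPS index.

    `responses` holds one row per participant with columns q1..q19,
    each a 1-5 Likert rating (1 = strongly disagree, 5 = strongly agree).
    """
    scores = pd.DataFrame(index=responses.index)
    for stage, items in STAGES.items():
        scores[stage] = responses[items].sum(axis=1)
    # The total score serves as a general index of a person's PCPS
    scores["pcps_total"] = scores[list(STAGES)].sum(axis=1)
    return scores
```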

3.2. Sample and Data Collection Procedure

3.2.1. Study 1

Participants

The statistical population of this study comprised managers of the governmental executive organizations in the South Khorasan Province in Iran. There were n = 250 respondents, including 49 females and 201 males; by role, three were CEOs, 46 deputies, and 201 office managers. Respondents answered questions about their age, managerial background, and work experience. The sample characteristics are shown in Table 1.

Procedure

Step 1. The development of a perceived complex problem-solving questionnaire
The initial version of the questionnaire was developed to assess an individual's perception of CPS. To determine its validity and reliability, following [75], the initial version of the PCPS questionnaire was given to 10 experts teaching public administration and management at different universities, and its content validity (the relevance, simplicity, and clarity of each phrase) was evaluated. Questions with CVI > 0.7 were accepted, and reliability was then evaluated among 250 employees, with α = 0.895. All "Cronbach's Alpha if Item Deleted" values were less than the overall Cronbach's Alpha of 0.895, suggesting all questions are reliable.
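The reliability checks above can be illustrated with a short sketch. This is not the authors' analysis script (the study used SPSS); it assumes an experts-by-items table of relevance judgments for the CVI and a respondents-by-items score matrix for Cronbach's alpha.

```python
import pandas as pd

def content_validity_index(expert_ratings: pd.DataFrame) -> pd.Series:
    """Item-level CVI: the proportion of experts rating an item relevant.

    `expert_ratings` is experts x items with 1/0 (or True/False)
    judgments; items with CVI > 0.7 were accepted in this study.
    """
    return expert_ratings.mean(axis=0)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents x items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def alpha_if_item_deleted(items: pd.DataFrame) -> pd.Series:
    """Alpha recomputed with each item removed in turn. Values below the
    overall alpha (0.895 here) suggest every item adds reliability."""
    return pd.Series(
        {col: cronbach_alpha(items.drop(columns=col)) for col in items.columns}
    )
```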
Step 2. The translation of the STSP Questionnaire
Following the literature [76,77], the STSP questionnaire was translated from its original form into the Persian language by a panel of experts, to better accommodate the language used by participants and to obtain a valid analysis. The two versions were then compared, and modifications were made. The instrument was given to a small group of managers, its reliability was evaluated (α = 0.841), and the final survey was produced. All "Cronbach's Alpha if Item Deleted" values were less than the overall Cronbach's Alpha of 0.841, suggesting all questions are reliable.
The Persian versions of the PCPS and ST questionnaires were used in this study. The sample was drawn from seventeen governmental executive organizations of South Khorasan, selected by stratified random sampling. Four hundred fifty paper questionnaires were distributed among CEOs, deputies, and office managers of provincial organizations in the summer of 2020, and 250 questionnaires were returned.

3.2.2. Study 2

Participants

The statistical population of this study was students at Mississippi State University in the United States. Four hundred eighty-one students participated; of the 481 collected responses, 373 were analyzed, with pair-wise deletion used in the data analysis. The sample characteristics are shown in Table 2. The percentages of female and male respondents were 35.9% and 64.1%, respectively, with 67.3% undergraduate and 32.7% graduate students. Their ages ranged from 18 to 60, with a mean of 28.7 years and an SD of 10.0 years; 83.9% were full-time and 16.1% part-time students. Additionally, 9.9% were distance-learning students, and 90.1% were on campus. The mean CGPA was 3.45, with an SD of 0.54, ranging from 2.00 to 4.00. Students had completed an average of 54.6 credit hours in their programs, with an SD of 37.6.

Procedure

A web-based survey was used to collect data for this study, and emails were sent to students in the Fall of 2020–2021. In this study, the original version of the STSP instrument [9] and the English version of the PCPS instrument were used.

4. Data Analysis

4.1. Factor Analysis and Scale Development

The purpose of this study is to bridge the literature gap with regard to an instrument for defining an individual's PCPS. To this end, an individual's perception when faced with modern complex system problems is analyzed. The scale development was conducted in two main stages: the exploratory and confirmatory stages. Other studies have applied similar scale-development frameworks, initiated with a pilot test (gathering expert feedback), followed by meticulous construct-validity work in EFA (exploratory stage), and completed with construct-validity analysis using CFA (confirmatory stage) [78,79,80,81].
In the exploratory stage, to achieve an initial theoretical model of the PCPS instrument, the KMO test, Bartlett's test of Sphericity, and the anti-image correlation matrix were examined to ensure that the sample size was adequate. The EFA framework was then performed, and four factors with eigenvalues greater than one were retained. The reliability for both studies was above 0.8, which is very good. After the EFA procedures, the baseline model of the PCPS instrument, with four main factors/constructs and 19 items, was designed. In the next stage, a confirmatory factor analysis (CFA) procedure was performed, and, by assessing construct validity, uni-dimensionality, discriminant validity, and composite reliability (CR), the final structural model of the PCPS instrument was produced. Finally, it was shown that the final model of the PCPS instrument fits the data well and is able to measure the state of PCPS at the individual level. For more details about the analysis procedures and the development of the instrument, see Appendix B.

4.2. Structural Equation Modeling (SEM)

Study Variables

The following variables are included in the proposed theoretical model (see Figure 3).

Latent Independent Variable

The “Systems Thinking Skills Preferences (STSP)” is an abstract theoretical variable and cannot be directly measured; therefore, we used a latent variable (unobservable variable) to indirectly measure it through the seven observed variables associated with the seven dimensions of the STSP instrument. This latent variable indirectly measures the individuals’ overall systemic skills preferences based on the seven dimensions, which resulted from an extensive systematic review using grounded theory in the domain of complex systems. The seven dimensions are (1) level of Complexity, (2) level of Independence, (3) level of Interaction, (4) level of Change, (5) level of Uncertainty, (6) level of Systems Worldview, and (7) level of Flexibility. Figure 1 indicates the detailed definition of each dimension with a simple description of each.

Latent Dependent Variable

To assess individuals’ PCPS, the study utilized the PCPS instrument with its four stages: (1) Level of Problem Identification and Definition, (2) Level of Information Gathering, (3) Level of Evaluating Solutions and Developing Approaches, and (4) Level of Implementation Planning dimensions. These four dimensions, which are condensed into one latent variable called PCPS, are used as a problem-solving perception indicator for the study’s population.
Before interpreting the results of the study, the proposed theoretical model needs to be validated through the establishment of construct validity. As mentioned, the proposed theoretical model shows the structural relationship between dependent and independent latent variables (that is, STSP and PCPS) through the regression and measurement weights.
The construct validity of the theoretical model is obtained through investigation of the model fit indices. The fit index values indicated that the proposed theoretical model attained construct validity and measured what it was intended to measure; consequently, it is deemed valid for testing the study's hypotheses. The construct-validity analysis was conducted (1) to show that the proposed theoretical model was able to measure what it was intended to measure (i.e., the proposed model fits the data), (2) to show that the associated results of the model are generalizable, and (3) to test the study hypotheses.
To test the study hypotheses, the proposed theoretical model was tested through structural equation modeling using AMOS software version 25.0. The standardized solution for the theoretical model consists of the full structural model used to assess all the relationships among the study’s variables (see Figure 3).
As seen in Figure 3, practitioners/students with high scores on the ST dimensions of Complexity, Independence, Interaction, Change, Uncertainty, Systems Worldview, and Flexibility also have high scores on the four stages of PCPS: (1) Level of Problem Identification and Definition, (2) Level of Information Gathering, (3) Level of Evaluating Solutions and Developing Approaches, and (4) Level of Implementation Planning. For example, a high score on the Level of Problem Identification and Definition dimension indicates a better understanding and definition of problems, and a high score on the Complexity dimension indicates a clear skill preference toward Complexity over Simplicity (see Figure 1). Practitioners with low scores on the seven ST skills-preference dimensions likewise have low scores on the four stages of PCPS.
Since the relationship between the ST Skills Preference and PCPS latent variables is significant, with p-value < 0.001 (t-value = 3.31) and a standardized regression weight of β1 = +0.25 (standard error 0.03) for practitioners in Study 1, and with p-value = 0.013 (t-value = 2.47) and a standardized regression weight of β1 = +0.18 (standard error 0.003) for students in Study 2, the main hypothesis is supported. This indicates that the ST skills preferences of practitioners/students have a positive relationship with their PCPS; in other words, the ST of practitioners/students affects their PCPS.
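The structural model above was estimated in AMOS 25.0. For readers working in open-source tools, the following is a minimal sketch of the same two-latent-variable model in Python using the semopy package (which accepts lavaan-style model syntax); the observed-variable names for the seven STSP dimensions and the four PCPS stages are illustrative assumptions, not the authors' variable names.

```python
from semopy import Model

# Measurement part: each latent variable is indicated by its observed
# dimension scores. Structural part: PCPS is regressed on STSP.
MODEL_DESC = """
STSP =~ complexity + independence + interaction + change + uncertainty + systems_worldview + flexibility
PCPS =~ problem_identification + information_gathering + evaluating_solutions + implementation_planning
PCPS ~ STSP
"""

def fit_structural_model(data):
    """Fit the STSP -> PCPS structural model with maximum likelihood.

    `data` is a pandas DataFrame with one column per observed variable
    named as in MODEL_DESC; returns the parameter-estimate table
    (regression/measurement weights, standard errors, p-values).
    """
    model = Model(MODEL_DESC)
    model.fit(data)
    return model.inspect()
```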

5. Discussion

The competitive environment, rapid change, and the expansion of communication have turned organizations into complex systems with multiple relationships. In such situations, complex challenges and problems arise, and, as a result, the ability to solve complex problems is a necessary competency for individuals and organizations. Therefore, complex problem-solving (CPS) has been considered in numerous international evaluations, both in education and in industry.
In Phase I of the study, the literature on the history, definitions, and process of CPS was reviewed. Most assessments of CPS used computer simulations, and no professional assessment questionnaire existed comparable to questionnaires for personality, critical thinking, or performance. Although several typical problem-solving questionnaires were designed for specific areas regardless of the simplicity or complexity of the problem, no questionnaire based on CPS definitions existed. As a result, to bridge this literature gap, a questionnaire was designed in Phase II. In this phase, based on theories and processes, four main stages were derived, and 32 phrases were written to assess the level of people's general knowledge and understanding of complex problems and the processes needed to solve them. Then, in Phase III, after gathering experts' feedback and ideas, 19 items were chosen, and the PCPS instrument was developed. The content validity of the questionnaire (the relevance, simplicity, and clarity of each item) was evaluated by ten university faculty members and experts, and all 19 questions were accepted with CVI > 0.7. The main purpose of this phase was to determine the capability of the instrument to capture an individual's PCPS.
Along with using the PCPS instrument to gather data, scale development of the instrument began in Phase IV. In the data collection of the two studies, 250 managers and 373 students of different races, genders, educational backgrounds, and occupations participated. The dataset had no missing values and passed the normality-test criteria. Comprehensive scale-development techniques were performed in two stages, the exploratory stage and the confirmatory stage. To shape the initial theoretical model, the dataset was analyzed within the exploratory factor analysis framework, which resulted in the initial theoretical model, called the baseline model. To make the final decision about the number of factors, after checking the eigenvalues and the scree plot, four factors with eigenvalues greater than one were retained: Level of Problem Identification and Definition, Level of Information Gathering, Level of Evaluating Solutions and Developing Approaches, and Level of Implementation Planning.
After attaining the initial theory of the PCPS instrument, the confirmatory stage began, testing the initial theoretical model. In the confirmatory stage, the baseline model was tested and modified through the CFA framework. After completing the six main steps of CFA, the model best fitted to the dataset, called the final model, was retained. The final model consisted of four distinct factors (constructs) and 17 items (questions), which measure different individuals' PCPS. The final model had the best theoretical and logical support along with good construct validity and reliability results, and it will serve as the validated theoretical model for the PCPS instrument, measuring individuals' level of perception of CPS.
The PCPS instrument presented in this study allows a better understanding of individuals' PCPS. The application of this instrument is broad, with usefulness in industry, education, and government, and it will allow management/superiors to identify the strengths and weaknesses of an individual in terms of cognitive thinking. Therefore, for further research in this study, the tool was used to assess the relationship between an individual's systems thinking skills preferences (STSP) and his/her PCPS. Based on the tests, the main hypothesis is supported. This indicates that the STSP of practitioners/students has a positive relationship with their PCPS. In other words, practitioners/students with high scores on the ST dimensions of Complexity, Independence, Interaction, Change, Uncertainty, Systems Worldview, and Flexibility also have high scores on the four stages of PCPS: (1) Level of Problem Identification and Definition, (2) Level of Information Gathering, (3) Level of Evaluating Solutions and Developing Approaches, and (4) Level of Implementation Planning. This finding is consistent with other studies, such as Sweeney and Sterman [62], who developed a list of ST features to assess students' capability with complexity. Keating and his colleagues [12] showed that ST could provide a framework for analyzing and solving complex issues in the management of today's organizations. Maani and Maharaj [13] showed that ST is related to performance in CPS; as they note, ST aids in understanding the structure of a problem, which then leads to better performance.

Future Studies and Limitations

This tool does not directly assess problem-solving ability; rather, it examines individuals' level of perception of complex problems and complex problem-solving processes. The higher a person's score in PCPS, the better their knowledge and understanding of CPS and its process for achieving more effective results. The test does not ask participants about a hypothetical, specific situation, nor is it designed for a specific setting such as management or education, so it can be used in any setting where individuals need to deal with complex problems. To this end, further research could investigate ways of applying the tool in more interactive settings and compare new and old results to further improve the reliability of the instrument.

6. Conclusions

The intent of this study was to test the validity of the PCPS instrument in the domain of complex systems, so two separate studies were conducted and their results reported separately. As the two sample populations were in different settings, cultures, and age ranges, they were not integrated. The results of the two studies showed that the instrument can be used in other populations and settings and can likely remain valid and reliable. In addition, this investigation assessed the connection between systems thinking (ST) and perceived complex problem-solving (PCPS), which enriches the current literature.

Author Contributions

Conceptualization, M.N., A.M. and M.M.; methodology, M.N., A.M. and M.M.; software, M.N.; validation, M.N., A.M., R.J. and M.M.; formal analysis, M.N.; investigation, M.N., A.M. and M.M.; resources, M.N., A.M. and M.M.; data curation, M.N. and A.M.; writing—original draft preparation, M.N. and A.M.; writing—review and editing, M.N., A.M. and R.J.; visualization, M.N. and A.M.; supervision, M.N., R.J. and M.M.; project administration, M.N. and A.M.; funding acquisition, R.J. and A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the MSU HRPP, and approved by the Institutional Review Board of Mississippi State University (protocol code IRB-19-401 and 25 November 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Perceived complex problem-solving questionnaire. All items are rated on a five-point Likert scale from "strongly disagree" to "strongly agree".

Problem Identification and Definition. Identification: identifying the nature of problems and the goal we want to achieve (find out what the problem is). Definition: what information does the problem give us and what does it ask? Redefine the problem.
1. It is difficult to accurately identify and define problems and issues related to my coursework in the major of engineering study [my job function and tasks] [33].
3. It is difficult to accurately identify the different dimensions and aspects of problems related to my coursework in the major of engineering study [my job function and tasks] [33].
4. I am often facing unique and new problems in my engineering coursework [my job function and tasks for managers].
5. Problems in my engineering coursework [my job function and tasks] are hard to predict and expect.
6. Problems in my engineering coursework [my job function and tasks] are difficult to analyze.

Information Gathering about problems and solutions. Knowing how to find information and identify essential information.
7. It is hard to gather the information needed to make decisions in my field of engineering study [my job function and tasks].
8. It is often hard to identify the causes of the problems and issues that I face in my coursework in the major of engineering study [my job function and tasks] [33].
9. The methods, resources, or people through which information can be collected are not recognized well in my coursework in the major of engineering study [my job function and tasks].
10. In most cases, I feel that there is no expertise to gather information about the causes of problems in my coursework [my job function and tasks].
11. It is often hard to identify potential solutions to problems and issues well.
12. It is hard to reorganize the information collected to identify new solutions in my coursework in the major of engineering study [my job function and tasks].

Evaluating Solutions and Developing Approaches. Developing approaches and evaluating the likely success of an option in reaction to the demands of the situation.
13. It is hard to classify the information obtained to evaluate potential solutions in my coursework in the major of engineering study [my job function and tasks].
14. It is often difficult to predict the potential outcomes of the solutions in my coursework in the major of engineering study [my job function and tasks].
15. The problems I face require new solutions and creative ideas in my coursework in the major of engineering study [my job function and tasks].
16. It is hard to evaluate and assess the strengths and weaknesses of new ideas and solutions in my coursework in the major of engineering study [my job function and tasks].
17. In addition to assessing the strengths and weaknesses of new ideas, the possibility of successful implementation is hard to explore in my coursework in the major of engineering study [my job function and tasks].

Implementation Planning. Developing approaches for implementing an idea.
18. In the area of my coursework problems, besides coming up with new ideas and solutions, it is hard to present an executive plan to implement them in my coursework in the major of engineering study [my job function and tasks].
19. It is difficult to present and develop an executive plan for the realization of new ideas in my coursework in the major of engineering study [my job function and tasks].
20. The required competencies and capabilities to develop an executive plan to implement the ideas do not exist in my major of engineering study [my job function and tasks].

The phrase "my coursework in the major of engineering study" was used in students' questionnaires, and the phrase "my job function and tasks" was used in managers' questionnaires.

Appendix B

Appendix B.1. Factor Analysis and Scale Development

Exploratory factor analysis (EFA) procedures were conducted as the dimension-reduction (data-driven) technique using SPSS software, version 26; this shaped the initial theoretical model for the PCPS, called the "baseline model" [82]. CFA, unlike EFA, is a theory-driven technique that requires an a priori theoretical model (for this study, the baseline model resulting from EFA). Confirmatory factor analysis (CFA) procedures served as the confirmatory stage, using AMOS, version 25, to confirm the structure of the baseline model. The CFA provided several analytics, including theory and hypothesis testing through construct validity, evaluation of method effects, examination of the stability of the factor model over participants, and correlations between error terms.

Appendix B.2. Exploratory Stage

In the exploratory stage, factor analysis was performed using SPSS software to determine the initial number of latent factors and the respective items for each latent factor (construct) of the PCPS instrument. The following steps were conducted in the exploratory stage to achieve an initial theoretical model of the PCPS instrument.

Appendix B.2.1. Sample Size Adequacy

The data should be appropriate for the use of factor analysis [83]. To assure sample-size adequacy, three criteria were tested: the KMO test, Bartlett's test of Sphericity, and the anti-image correlation matrix. Adequate results were achieved for the KMO test (Study 1: 0.89 > 0.50; Study 2: 0.88 > 0.50) and the Bartlett test (Study 1: Chi-square (136) = 1821.4, p < 0.001; Study 2: Chi-square (171) = 1876.1, p < 0.001) [84,85]. In the anti-image correlation matrix, high inter-correlations depict the importance of an item to a factor [84]. The matrix showed that almost all items loaded higher than 0.40 on their respective factors and that there was no extreme multicollinearity between the items. These results indicate that the data and sample size are appropriate for factor analysis (the EFA framework).
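As an illustration of these adequacy checks, the sketch below uses the Python factor_analyzer package (the study itself used SPSS). The `items` argument is assumed to be a respondents-by-items DataFrame of the PCPS ratings; note that factor_analyzer does not report the Bartlett degrees of freedom, which for p items equal p(p − 1)/2.

```python
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

def sampling_adequacy(items):
    """Run the KMO and Bartlett checks on a respondents x items matrix.

    Thresholds used in the text: overall KMO > 0.50 and a significant
    Bartlett test.
    """
    chi_square, p_value = calculate_bartlett_sphericity(items)
    kmo_per_item, kmo_overall = calculate_kmo(items)
    p = items.shape[1]
    return {
        "kmo_overall": kmo_overall,
        "kmo_adequate": kmo_overall > 0.50,
        "bartlett_chi_square": chi_square,
        "bartlett_df": p * (p - 1) // 2,  # degrees of freedom for p items
        "bartlett_p": p_value,
    }
```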

Appendix B.2.2. Exploratory Factor Analysis Procedure

To perform the EFA framework, decisions must be made on four criteria: (1) the factor extraction method, (2) the factor rotation method, (3) factor selection, and (4) the choice of association matrix. Principal components analysis, the most frequently used EFA extraction method [84], was chosen as the extraction method. To interpret the meaning of the retained factors, orthogonal (Varimax) rotation was chosen as the factor rotation method.
Factor Selection: To make the final decision about how many factors should be extracted, two criteria were checked. (a) The eigenvalue shows the variance explained by a particular factor out of the total variance [84]; four factors were kept with eigenvalues greater than one, using Kaiser's retention criterion. (b) The scree plot is used to determine the optimal number of extracted factors; all factors on the steep slope should be retained, and the others neglected [84]. Using the scree plot, four factors with eigenvalues greater than one were retained.
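A sketch of this extraction-and-retention step with the Python factor_analyzer package follows. SPSS was used in the study, so this is an illustrative analogue: factor_analyzer's "principal" method stands in for SPSS's principal components extraction, and the `items` matrix name is an assumption.

```python
from factor_analyzer import FactorAnalyzer

def run_efa(items, n_factors=4):
    """Extract factors with varimax rotation and apply Kaiser's criterion.

    `items` is the respondents x items matrix of PCPS ratings.
    """
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax",
                        method="principal")
    fa.fit(items)
    eigenvalues, _ = fa.get_eigenvalues()
    # Kaiser's criterion: retain factors whose eigenvalues exceed one
    n_retained = int((eigenvalues > 1).sum())  # four in both studies here
    return fa.loadings_, eigenvalues, n_retained
```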
These four factors extracted in the EFA measure the four stages of the PCPS instrument: Level of Problem Identification and Definition, Level of Information Gathering, Level of Evaluating Solutions and Developing Approaches, and Level of Implementation Planning. Table A2 shows the factors' operational definitions and respective descriptions.
Table A2. Factors and respective operational definitions.
λ1 (5 items; Problem Identification and Definition). Problem Identification: identifying the nature of problems and the goal we want to achieve (find out what the problem is). Problem Definition: what information does the problem give us, and what does it ask? Redefine the problem.
λ2 (6 items; Information Gathering about problems and solutions). Information Gathering: knowing how to find information and identify essential information.
λ3 (5 items; Evaluating Solutions and Developing Approaches). Evaluating Solutions and Developing Approaches: developing approaches and evaluating the likely success of an option in reaction to the demands of the situation.
λ4 (3 items; Implementation Planning). Implementation Planning: developing approaches to implementing an idea or solution.
Reliability: Cronbach's Alpha was computed and yielded very good results in Studies 1 and 2, with values of 0.92 and 0.89, respectively (alpha greater than 0.8 is very good; greater than 0.9 is excellent) [86].
After completing the EFA procedures, the initial model of the PCPS instrument was designed: the baseline model. The baseline model consisted of four main factors/constructs and 19 items with 19 corresponding loadings. This multi-factor model served as the starting point for the CFA procedures. The confirmatory stage was designed and conducted to test the initial theory from the exploratory stage and, if necessary, to correct the baseline model or construct a new model. The next section provides the confirmatory framework along with a detailed illustration of the final structural model of the PCPS instrument.

Appendix B.3. Confirmatory Stage

Confirmatory factor analysis (CFA) is applied when researchers have clear hypotheses regarding a specific scale or instrument (here, the baseline model from the exploratory stage). CFA can test whether the items relate to the hypothesized latent constructs as expected and whether the model has a sufficient number of latent constructs. If the CFA supports these relationships, the model achieves structural construct validity [87]. The inability of the exploratory stage to clearly explain the relationships between items and their respective latent constructs makes EFA far less suitable for scale development and construct validity [88]; as such, CFA is more powerful and appropriate for theory and scale development [88]. Several software packages may be used to conduct CFA; while any of the major packages would work well, AMOS 25.0 was selected for its ease of use and user interface.

Appendix B.3.1. Confirmatory Factor Analysis Procedure

The CFA application comprises six steps: model specification, model identification, parameter estimation, model fit, and finally model re-specification and comparison with rival models [89]. In this section, the six steps are explained in turn. (1) Model specification formulates a model based on theory and/or previous studies in the field [87]; the initial relationships between variables must be made clear. The initial theoretical model, the baseline model obtained from the exploratory stage, was used in the confirmatory stage. (2) Model identification concerns whether one can derive a unique value for each parameter whose value is unknown [87]. The model was identified by constraining one weight coefficient for each of the four latent constructs (four in total) to equal one. (3) Parameter estimation aims to estimate population parameters by minimizing the difference between the observed and implied models [87]. The maximum likelihood method, a widely used approach, was chosen as the estimation method, seeking the parameter values that best fit the observed data. (4) Construct validity examines the degree to which the proposed model fits the data [87]; to attain construct validity, several model fit indices should achieve their respective thresholds. (5) Model re-specification improves the model fit by applying modifications; any decision regarding model modification must be theoretically defensible [87]. After applying all the aforementioned steps to the theoretical model, the base model for the PCPS instrument was created and verified. For Studies 1 and 2, the following model fit indices were achieved, respectively: Chi-square/DF (1.96 and 2.06), CFI (0.94 and 0.94), GFI (0.91 and 0.92), RMSEA (0.062 and 0.061), and SRMR (0.050 and 0.052), where values of 5.0 and 3.0 are acceptable and good, respectively, for Chi-square/DF; values of 0.90 and 0.95 are acceptable and good, respectively, for CFI and GFI; and values of 0.08 and 0.06 are acceptable and good, respectively, for SRMR and RMSEA [90,91,92,93].
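As an open-source analogue of this AMOS workflow, the sketch below specifies the four-factor baseline measurement model in semopy and computes fit statistics. The q1..q19 item names, grouped by stage according to Table A2 (5, 6, 5, and 3 items), are assumptions.

```python
import semopy

# Four-factor baseline measurement model in lavaan-style syntax; each
# latent construct loads on its stage's items (names assumed).
CFA_DESC = """
identification =~ q1 + q2 + q3 + q4 + q5
gathering =~ q6 + q7 + q8 + q9 + q10 + q11
evaluating =~ q12 + q13 + q14 + q15 + q16
planning =~ q17 + q18 + q19
"""

def fit_cfa(items):
    """Fit the baseline CFA by maximum likelihood and return fit
    statistics (chi-square, DoF, CFI, GFI, RMSEA, among others), to be
    compared against the thresholds listed in the text."""
    model = semopy.Model(CFA_DESC)
    model.fit(items)
    return semopy.calc_stats(model)
```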

Appendix B.3.2. Model Comparison

After construct validity (model fit) had been achieved, the last step of CFA, model comparison, was performed. (6) Model comparison tests whether the number of factors (constructs), and the assignment of observed variables to those factors, is sufficient (the structural model). If a scale is originally posited as containing multiple distinct factors (constructs), the measurement model should be tested directly by comparing its fit with that of more parsimonious nested models, including 1-factor, 2-factor, and 3-factor models. Two models are nested if one can be derived from the other by placing restrictions on it. Since the base model is a 4-factor model, the best 3-factor, 2-factor, and 1-factor models derived from it are all nested within it. (a) The best 3-factor model had one more constraint than the base model: the correlation between the third and fourth factors was constrained to one, treating these two factors as fully dependent on each other, i.e., a single factor. (b) The best 2-factor model had three more constraints than the base model: the covariances among the first, third, and fourth factors were constrained to one, merging these three factors into a single factor. (c) The 1-factor model was derived from the base model by constraining all six covariances among the four factors to one, so that all items loaded on a single factor. The Chi-square difference test was then conducted using Equation (A1), with the results shown in Table A3:
Δχ² = χ² (fewer-factor model) − χ² (more-factor model);  ΔDF = DF (fewer-factor model) − DF (more-factor model)  (A1)
The null and alternative hypotheses for all the following model comparisons using the Chi-square difference test were:
H0,comparison: There is no statistically significant difference between the base (4-factor) model and the fewer-factor model; the additional factor(s) do not significantly improve the fit to the data, and the base model is therefore not preferred over the fewer-factor model.
H1,comparison: There is a statistically significant difference between the base (4-factor) model and the fewer-factor model; the additional factor(s) significantly improve the fit to the data, and the base model is therefore preferred over the fewer-factor model.
Table A3. Comparisons of the base model with nested rival models.

| Study | Comparison between the Base Model and | Δχ² | ΔDF | p-Value | Result | Decision |
|---|---|---|---|---|---|---|
| Study 1 | The best 3-factor model | 82.8 | 1 | <0.001 | Reject H0 | The base model selected |
| Study 1 | The best 2-factor model | 114.0 | 3 | <0.001 | Reject H0 | The base model selected |
| Study 1 | The 1-factor model | 131.5 | 6 | <0.001 | Reject H0 | The base model selected |
| Study 2 | The best 3-factor model | 48.1 | 1 | <0.001 | Reject H0 | The base model selected |
| Study 2 | The best 2-factor model | 68.3 | 3 | <0.001 | Reject H0 | The base model selected |
| Study 2 | The 1-factor model | 103.8 | 6 | <0.001 | Reject H0 | The base model selected |
According to Table A3, the statistical tests of the difference between the base model and, respectively, the best 3-factor, the best 2-factor, and the 1-factor models resulted in rejection of the null hypothesis in both studies. In other words, removing factors significantly worsened the fit to the data, so the base model was preferred to every rival nested model. This result confirmed that four factors are the sufficient number for the PCPS instrument. The base model therefore served as the final model of the PCPS instrument for measuring individuals' perception of CPS in the domain of complex systems.
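The p-values in Table A3 follow directly from Equation (A1) and the chi-square distribution. The short check below reproduces the Study 1 results with SciPy, using the Δχ² and ΔDF values as read from Table A3:

```python
from scipy.stats import chi2

def chi2_difference_p(delta_chi2, delta_df):
    """Right-tail p-value for the nested-model test of Equation (A1)."""
    return chi2.sf(delta_chi2, delta_df)

# Study 1 comparisons against the base (4-factor) model, from Table A3.
for rival, dchi2, ddf in [("best 3-factor", 82.8, 1),
                          ("best 2-factor", 114.0, 3),
                          ("1-factor", 131.5, 6)]:
    print(f"{rival}: p = {chi2_difference_p(dchi2, ddf):.1e}")

# Every p-value falls far below 0.001, so H0 is rejected in all comparisons
# and the 4-factor base model is retained.
```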

Appendix B.4. The Final Model

After conducting the Chi-square difference test to verify the sufficient number of factors for the PCPS instrument, the base model was selected as the final model of the study. Table A4 shows the structure of the final model with the respective factor loadings. The final model consists of four distinct factors (constructs) and 17 items (questions) that measure an individual's PCPS. The validity and reliability properties of the final model are demonstrated below:
Table A4. The final model of the PCPS instrument after the exploratory and confirmatory stages for practitioners and students.

| Factor | Item | Factor Loading (Managers) | Factor Loading (Students) |
|---|---|---|---|
| Problem Identification and Definition | Item 1 | 0.7 | 0.7 |
| | Item 2 | 0.6 | 0.5 |
| | Item 3 | 0.7 | 0.8 |
| | Item 4 | 0.5 | 0.6 |
| Information Gathering | Item 5 | 0.7 | 0.7 |
| | Item 6 | 0.6 | 0.7 |
| | Item 7 | 0.6 | 0.5 |
| | Item 8 | 0.2 | 0.5 |
| | Item 9 | 0.7 | 0.7 |
| | Item 10 | 0.9 | 0.8 |
| Evaluating Solutions and Developing Approaches | Item 11 | 0.7 | 0.8 |
| | Item 12 | 0.7 | 0.6 |
| | Item 13 | 0.6 | 0.6 |
| | Item 14 | 0.6 | 0.6 |
| Implementation Planning | Item 15 | 0.7 | 0.6 |
| | Item 16 | 0.8 | 0.8 |
| | Item 17 | 0.8 | 0.9 |
  • Construct validity: For Studies 1 and 2, respectively, the following model fit indices were achieved: Chi-square/DF (1.96 and 2.06), CFI (0.94 and 0.94), GFI (0.91 and 0.92), RMSEA (0.062 and 0.061), and SRMR (0.050 and 0.052); values of 5.0 and 3.0 are acceptable and good, respectively, for Chi-square/DF; values of 0.90 and 0.95 are acceptable and good, respectively, for CFI and GFI; and values of 0.08 and 0.06 are acceptable and good, respectively, for SRMR and RMSEA [90,91,92,93]. These results suggest that the final model fit the data well and measured what it was intended to measure.
  • Uni-dimensionality: This is achieved when all measuring items have acceptable loadings on their related factor [85]. The sample sizes in this study were between 200 and 400, and according to Field [84] (p. 440), a factor loading greater than 0.4 on one factor demonstrates an acceptable relationship. As shown in Table A4, all items showed acceptable to excellent factor loadings. Therefore, the final model for both studies achieved the uni-dimensionality criterion.
  • Discriminant Validity: A covariance greater than 0.85 between two factors indicates that the factors are redundant or suffer from a serious multicollinearity problem [87]. All covariances between factors in the final model were below 0.85; therefore, the final model exhibited discriminant validity among its factors.
  • Composite Reliability (CR): Indicates the reliability and internal consistency of a latent factor (construct). The final model achieved the CR criterion for all four factors (CR > 0.7 is good; CR > 0.8 is excellent; see Table A5) [94].
Table A5. Composite reliability results for the final model.

| | Problem Identification and Definition | Information Gathering | Evaluating Solutions and Developing Approaches | Implementation Planning |
|---|---|---|---|---|
| Study 1 | 0.71 | 0.78 | 0.75 | 0.80 |
| Study 2 | 0.73 | 0.80 | 0.74 | 0.79 |
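Composite reliability can be recomputed directly from the standardized loadings in Table A4 with the standard formula CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)), which assumes uncorrelated measurement errors; whether the study used exactly this formula is an assumption, though [94] describes the same quantity. A minimal check against Table A5:

```python
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each standardized item's error variance is 1 - loading^2."""
    s = sum(loadings)
    error = sum(1 - lam ** 2 for lam in loadings)
    return s ** 2 / (s ** 2 + error)

# Implementation Planning, Study 1 (Table A4, manager sample): Items 15-17.
print(round(composite_reliability([0.7, 0.8, 0.8]), 2))  # 0.81

# vs. 0.80 in Table A5; the small gap is expected because Table A4
# reports the loadings rounded to one decimal place.
```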
As discussed above, the final model satisfied all criteria of construct validity, uni-dimensionality, discriminant validity, and composite reliability. As a result, the main null hypothesis of the study (H0,main) was supported: there is no statistically significant difference between the final model of the PCPS instrument and the actual data model for measuring the state of PCPS at the individual level; i.e., the final model of the PCPS instrument fits the data well and is able to measure the state of PCPS at the individual level.

References

1. Jaradat, R. Complex system governance requires systems thinking: How to find systems thinkers. Int. J. Syst. Syst. Eng. 2015, 6, 53–70.
2. Mainzer, K. Challenges of Complexity in the 21st Century. An Interdisciplinary Introduction. Eur. Rev. 2009, 17, 219–236.
3. OECD. PISA 2012 Assessment and Analytical Framework; Mathematics, Reading, Science, Problem Solving and Financial Literacy: Paris, France, 2013.
4. OECD. PISA 2015 Results (Volume I); Mathematics, Reading, Science, Problem Solving and Financial Literacy: Paris, France, 2016.
5. OECD. Literacy, Numeracy and Problem Solving in Technology-Rich Environment; Mathematics, Reading, Science, Problem Solving and Financial Literacy: Paris, France, 2012.
6. Hubbard, M.; McCloy, R.; Campbell, J.; Nottingham, J.; Lewis, P.; Rivkin, D.; Levine, J. Revision of O*NET Data Collection Instruments; National O*NET Consortium, Employment Security Commission: Raleigh, NC, USA, 2000.
7. Kyllonen, P.; Carrasco, C.A.; Kell, H.J. Fluid Ability (Gf) and Complex Problem Solving (CPS). J. Intell. 2017, 5, 28.
8. Boardman, J.; Sauser, B. System of Systems—The meaning of of. In Proceedings of the 2006 IEEE/SMC International Conference on System of Systems Engineering, Los Angeles, CA, USA, 24–26 April 2006; p. 6.
9. Jaradat, R.; Keating, C.; Bradley, J. Individual Capacity and Organizational Competency for Systems Thinking. IEEE Syst. J. 2017, 12, 1203–1210.
10. Ackoff, R. ‘Whole-ing’ the parts and righting the wrongs. Syst. Res. 1995, 12, 43–46.
11. Jackson, M. Critical Systems Thinking and Practice. Eur. J. Oper. Res. 2001, 128, 233–244.
12. Keating, C.B.; Kauffmann, P.; Dryer, D. A framework for systemic analysis of complex issues. J. Manag. Dev. 2001, 20, 772–784.
13. Maani, K.; Maharaj, V. Links Between Systems Thinking and Complex Problem Solving: Further Evidence. In Proceedings of the 20th International Conference of the System Dynamics Society, Palermo, Italy, 28 July–1 August 2002.
14. Funke, J. Complex Problem Solving. In Encyclopedia of the Sciences of Learning; Springer: Berlin/Heidelberg, Germany, 2012; Volume 21, pp. 682–685.
15. Stadler, M.; Becker, N.; Gödker, M.; Leutner, D.; Greiff, S. Complex problem solving and intelligence: A meta-analysis. Intelligence 2015, 53, 92–101.
16. Karam, S.; Nagahi, M.; Dayarathna, V.L.; Ma, J.; Jaradat, R.; Hamilton, M. Integrating systems thinking skills with multi-criteria decision-making technology to recruit employee candidates. Expert Syst. Appl. 2020, 160, 113585.
17. Glaser, R.; Chi, M.T.; Farr, M. The Nature of Expertise; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1988.
18. Ormrod, J.E. Human Learning: Principles, Theories, and Educational Applications; Merrill Publishing Co.: Columbus, OH, USA, 1990.
19. Pólya, G. Mathematics and Plausible Reasoning: Induction and Analogy in Mathematics; Princeton University Press: Princeton, NJ, USA, 1990.
20. Gick, M.L.; Holyoak, K.J. Analogical problem solving. Cogn. Psychol. 1980, 2, 306–355.
21. Keane, M.T. Analogical Problem Solving; Halsted Press: New York, NY, USA, 1988.
22. Kaufmann, G. A theory of symbolic representation in problem solving. J. Ment. Imag. 1985, 9, 51–69.
23. McMullen, M.B.; Darling, C.A. Symbolic problem solving: An important piece of the emergent literacy puzzle. Early Child Dev. Care 1996, 121, 25–35.
24. Medin, D.L.; Ross, B.H. The specific character of abstract thought: Categorization, problem solving, and induction. Adv. Psychol. Hum. Intell. 1989, 5, 189–223.
25. Ross, B.H.; Kennedy, P.T. Generalizing from the use of earlier examples in problem solving. J. Exp. Psychol. Learn. Mem. Cogn. 1990, 16, 42.
26. Simon, H.A.; Lea, G. Problem solving and rule induction: A unified view. Knowl. Cogn. 1974, 50, 105–128.
27. Newell, A.; Simon, H.A. Human Problem Solving; Prentice-Hall: Englewood Cliffs, NJ, USA, 1972.
28. Reber, P.J.; Kotovsky, K. Implicit learning in problem solving: The role of working memory capacity. J. Exp. Psychol. Gen. 1997, 126, 178.
29. Wiley, J.; Jarosz, A.F. How working memory capacity affects problem solving. In Psychology of Learning and Motivation; Elsevier: Amsterdam, The Netherlands, 2012; Volume 56, pp. 185–227.
30. Leighton, J.P.; Sternberg, R.J. The Nature of Reasoning; Cambridge University Press: Cambridge, UK, 2004.
31. Sternberg, R.J.; Berg, C.A. Intellectual Development; Cambridge University Press: Cambridge, UK, 1992.
32. Williams, M.D.; Hollan, J.D. The process of retrieval from very long-term memory. Cogn. Sci. 1981, 5, 87–119.
33. Sternberg, R.J.; Sternberg, K. Cognitive Psychology; Nelson Education: Toronto, ON, Canada, 2016.
34. Funke, J.; Frensch, P. Complex problem solving research in North America and Europe: An integrative review. Foreign Psychol. 1995, 5, 42–47.
35. Mayer, R.E.; Wittrock, M.C. Problem solving. In Handbook of Educational Psychology; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 2006; pp. 287–303.
36. Mayer, R.E. Thinking, Problem Solving, Cognition; WH Freeman/Times Books/Henry Holt & Co.: New York, NY, USA, 1992.
37. Ewert, P.H.; Lambert, J.F. Part II: The Effect of Verbal Instructions upon the Formation of a Concept. J. Gen. Psychol. 1932, 6, 400–413.
38. Sternberg, R.J. Expertise in complex problem solving: A comparison of alternative conceptions. Complex Probl. Solving Eur. Perspect. 1995, 1, 295–321.
39. Broadbent, D.E. Levels, hierarchies, and the locus of control. Q. J. Exp. Psychol. 1977, 29, 181–201.
40. Dörner, D. Wie Menschen eine Welt verbessern wollten [How people wanted to improve the world]. Bild. Wiss. 1975, 12, 48–53.
41. Dörner, D.; Wearing, A.J. Complex problem solving: Toward a (computer-simulated) theory. Complex Probl. Solving Eur. Perspect. 1995, 2, 65–99.
42. Dörner, D.; Funke, J. Complex problem solving: What it is and what it is not. Front. Psychol. 2017, 8, 1153.
43. Dörner, D. On the Difficulties People Have in Dealing with Complexity. Simul. Games 1980, 11, 87–106.
44. Dörner, D.; Kreuzig, H.W.; Reither, F.; Stäudel, T. Lohhausen: Vom Umgang mit Unbestimmtheit und Komplexität [Lohhausen: On dealing with uncertainty and complexity]; Huber: Bern, Switzerland, 1983.
45. Brehmer, B.; Dörner, D. Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Comput. Hum. Behav. 1993, 9, 171–184.
46. Greiff, S.; Funke, J. Measuring Complex Problem Solving: The MicroDYN approach. In The Transition to Computer-Based Assessment: New Approaches to Skills Assessment and Implications for Large-Scale Testing; Office for Official Publications of the European Communities: Luxembourg, 2009; pp. 157–163.
47. Funke, J. Analysis of minimal complex systems and complex problem solving require different forms of causal cognition. Front. Psychol. 2014, 5, 739.
48. Greiff, S.; Fischer, A.; Stadler, M.; Wüstenberg, S. Assessing complex problem-solving skills with multiple complex systems. Think. Reason. 2015, 21, 356–382.
49. Funke, J. Complex problem solving: A case for complex cognition? Cogn. Process. 2010, 11, 133–142.
50. Peterson, N.; Mumford, M.D.; Borman, W.; Jeanneret, P.; Fleishman, E. Development of Prototype Occupational Information Network (O*NET) Content Model. Volume I: Report and Volume II: Appendices; Utah Department of Workforce Services: Provo, UT, USA, 1995.
51. Heppner, P.P.; Krauskopf, C.J. An information-processing approach to personal problem solving. Couns. Psychol. 1987, 15, 371–447.
52. Church, H.J.; Smith, V.M.; Schell, B.H. Managerial Problem Solving: A Review of the Literature in Terms of Model Comprehensiveness. Organ. Behav. Teach. Rev. 1989, 13, 90–106.
53. Cassidy, T.; Long, C. Problem-solving style, stress and psychological illness: Development of a multifactorial measure. Br. J. Clin. Psychol. 1996, 35, 265–277.
54. D’Zurilla, T.J.; Maydeu-Olivares, A. Conceptual and methodological issues in social problem-solving assessment. Behav. Ther. 1995, 26, 409–432.
55. Funke, J.; Fischer, A.; Holt, D. When Less Is Less: Solving Multiple Simple Problems Is Not Complex Problem Solving—A comment on Greiff et al. (2015). J. Intell. 2017, 5, 5.
56. Heppner, P.P.; Petersen, C.H. The development and implications of a personal problem-solving inventory. J. Couns. Psychol. 1982, 29, 66–75.
57. National Association of Colleges and Employers (NACE). The Attributes Employers Seek on a Candidate’s Resume. Available online: https://www.naceweb.org/talent-acquisition/candidate-selection/the-attributes-employers-seek-on-a-candidates-resume/ (accessed on 7 December 2016).
58. National Association of Colleges and Employers (NACE). The Key Attributes Employers Seek on Students’ Resumes. Available online: https://www.naceweb.org/about-us/press/2017/the-key-attributes-employers-seek-on-students-resumes/#:~:text=Following%20problem%2Dsolving%20skills%20and,see%20evidence%20of%20on%20resumes (accessed on 30 November 2017).
59. National Association of Colleges and Employers (NACE). Key Attributes Employers Want to See on Students’ Resumes. Available online: https://www.naceweb.org/talent-acquisition/candidate-selection/key-attributes-employers-want-to-see-on-students-resumes/ (accessed on 13 January 2020).
60. Mourshed, M.; Farrell, D.; Barton, D. Education to Employment: Designing a System that Works; McKinsey Center for Government: New York, NY, USA, 2013.
61. Casner-Lotto, J.; Barrington, L. Are They Really Ready to Work? Employers’ Perspectives on the Basic Knowledge and Applied Skills of New Entrants to the 21st Century US Workforce; ERIC: New York, NY, USA, 2006.
62. Sweeney, L.B.; Sterman, J.D. Bathtub dynamics: Initial results of a systems thinking inventory. Syst. Dyn. Rev. 2001, 16, 249–286.
63. Keating, C.B. Emergence in system of systems. In System of Systems Engineering: Innovations for the 21st Century; John Wiley & Sons: New York, NY, USA, 2008; pp. 169–190.
64. Maani, K.; Li, A. Decision-making in complex systems: Relationship between scale of change and performance. Syst. Res. Behav. Sci. 2010, 27, 567–584.
65. Hossain, N.U.I.; Nagahi, M.; Jaradat, R.; Keating, C. Development of a New Instrument to Assess the Performance of Systems Engineers. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Toronto, ON, Canada, 23–25 October 2019. Available online: http://ieomsociety.org/toronto2019/papers/229.pdf (accessed on 31 December 2020).
66. Checkland, P. Systems Thinking, Systems Practice; J. Wiley: Chichester, UK; New York, NY, USA, 1981.
67. Flood, R.L.; Carson, E.R. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science; Springer Science & Business Media: Berlin, Germany, 2013.
68. Richmond, B. The “Thinking” in Systems Thinking: Seven Essential Skills; Pegasus Communications: Arcadia, CA, USA, 2000.
69. Frank, M. Characteristics of engineering systems thinking: A 3D approach for curriculum content. IEEE Trans. Syst. Man Cybern. Part C 2002, 32, 203–214.
70. Hopper, M.; Stave, K.A. Assessing the effectiveness of systems thinking interventions in the classroom. In Proceedings of the 26th International Conference of the System Dynamics Society, Athens, Greece, 20–24 July 2008.
71. Heppner, P.; Baker, C. Applications of the Problem Solving Inventory. Meas. Eval. Couns. Dev. 1997, 29, 229–241.
72. Senge, P.M. The Fifth Discipline: The Art and Practice of the Learning Organization; Doubleday: New York, NY, USA, 1990.
73. Richmond, B. Systems thinking: Critical thinking skills for the 1990s and beyond. Syst. Dyn. Rev. 1993, 9, 113–133.
74. Petkov, D.; Petkova, O.; Andrew, T.; Nepal, T. Mixing Multiple Criteria Decision Making with soft systems thinking techniques for decision support in complex situations. Decis. Support Syst. 2007, 43, 1615–1629.
75. Lawshe, C.H. A Quantitative Approach to Content Validity. Pers. Psychol. 1975, 28, 563–575.
76. van de Vijver, F.; Hambleton, R.K. Translating tests. Eur. Psychol. 1996, 1, 89–99.
77. Solano-Flores, G.; Backhoff, E.; Contreras-Niño, L.Á. Theory of test translation error. Int. J. Test. 2009, 9, 78–91.
78. Ambrose, P.J.; Rai, A.; Ramaprasad, A. Internet usage for information provisioning: Theoretical construct development and empirical validation in the clinical decision-making context. IEEE Trans. Eng. Manag. 2006, 53, 112–129.
79. Kishore, R.; Swinarski, M.E.; Jackson, E.; Rao, H.R. A Quality-Distinction Model of IT Capabilities: Conceptualization and Two-Stage Empirical Validation Using CMMi Processes. IEEE Trans. Eng. Manag. 2012, 59, 457–469.
80. Schoenecker, T.; Swanson, L. Indicators of firm technological capability: Validity and performance implications. IEEE Trans. Eng. Manag. 2002, 49, 36–44.
81. Jae-Nam, L.; Young-Gul, K. Understanding outsourcing partnership: A comparison of three theoretical perspectives. IEEE Trans. Eng. Manag. 2005, 52, 43–58.
82. Jaradat, R.M.; Keating, C.B. Systems thinking capacity: Implications and challenges for complex system governance development. Int. J. Syst. Syst. Eng. 2016, 7, 75–94.
83. Rietveld, T.; Hout, R.V. Statistical Techniques for the Study of Language and Language Behaviour; De Gruyter Mouton: Berlin, Germany, 2011.
84. Field, A. Discovering Statistics Using SPSS for Windows; SAGE Publications: London, UK, 2000.
85. George, D.; Mallery, P. Reliability analysis. In SPSS for Windows, Step by Step: A Simple Guide and Reference; Allyn & Bacon: Boston, MA, USA, 2003; pp. 222–232.
86. Russell, D.W. In Search of Underlying Dimensions: The Use (and Abuse) of Factor Analysis in Personality and Social Psychology Bulletin. Personal. Soc. Psychol. Bull. 2002, 28, 1629–1646.
87. Awang, Z. Analyzing the SEM Structural Model. In A Handbook on SEM, 4th ed.; UiTM Kelantan: Kelantan, Malaysia, 2012.
88. Ahire, S.L.; Devaraj, S. An empirical comparison of statistical construct validation approaches. IEEE Trans. Eng. Manag. 2001, 48, 319–329.
89. Bollen, K.A.; Long, J.S. Testing Structural Equation Models; Sage: Thousand Oaks, CA, USA, 1993.
90. Hair, J.; Anderson, R.; Babin, B.; Black, W. Multivariate Data Analysis: A Global Perspective, 7th ed.; Pearson: Hoboken, NJ, USA, 2010.
91. Meyers, L.A.; Pourbohloul, B.; Newman, M.E.J.; Skowronski, D.M.; Brunham, R.C. Network theory and SARS: Predicting outbreak diversity. J. Theor. Biol. 2005, 232, 71–81.
92. Byrne, B.M. Structural Equation Modeling with AMOS: Basic Concepts, Applications, and Programming (Multivariate Applications Series); Taylor & Francis Group: New York, NY, USA, 2010.
93. Hu, L.T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Modeling 1999, 6, 1–55.
94. Tseng, W.-T.; Dörnyei, Z.; Schmitt, N. A New Approach to Assessing Strategic Learning: The Case of Self-Regulation in Vocabulary Acquisition. Appl. Linguist. 2006, 27, 78–102.
Figure 1. Seven dimensions of the “ST Skills Preferences Instrument” [1].
Figure 2. Research framework.
Figure 3. The full structural model analysis of the proposed theoretical model for both samples of practitioners and students.
Table 1. Sample characteristics (Study 1).

| Variable | Categories | Number (Percentage) |
|---|---|---|
| Gender | Male | 80.4% |
| | Female | 19.6% |
| Age | ≤30 | 1.6% |
| | 31–40 | 36.4% |
| | 41–50 | 50.0% |
| | 51–60 | 10.8% |
| | ≥60 | 1.2% |
| Level of education | High school diploma | 0.0% |
| | Bachelor’s degree | 31.2% |
| | Master’s degree | 56.0% |
| | Ph.D. | 12.8% |
| Major of study in the highest degree | Engineering | 39.2% |
| | Social science | 14.8% |
| | Business/Management | 28.0% |
| | Health-related | 2.0% |
| | Others | 16.0% |
| Work experience (years) | Less than 10 | 8.8% |
| | 11–20 | 48.4% |
| | 21–30 | 36.4% |
| | More than 30 | 6.4% |
| Management experience (years) | Less than 10 | 58.8% |
| | 11–20 | 33.6% |
| | 21–30 | 6.4% |
| | More than 30 | 1.2% |
| Managerial level | CEO | 1.2% |
| | Vice president/Deputy | 18.4% |
| | Office manager | 80.4% |
Table 2. Sample characteristics (Study 2).

| Variable | Categories | Number (Percentage) |
|---|---|---|
| Gender | Male | 63.8% |
| | Female | 36.2% |
| Ethnicity and Race | Asian | 12.3% |
| | African American | 5.0% |
| | Caucasian | 72.7% |
| | Hispanic | 2.3% |
| | Middle Eastern | 2.3% |
| | Multi-racial | 3.1% |
| | Native American | 1.2% |
| | Prefer not to disclose | 1.2% |
| Currently employed (not including co-op/internship) | No | 54.2% |
| | Yes | 45.8% |
| Completed a co-op | No | 83.1% |
| | Yes | 16.9% |
| Completed a professional internship | No | 78.1% |
| | Yes | 21.9% |