
A Minimal Theory of Creative Ability

Department of Psychology, University of Amsterdam, 1018 WS Amsterdam, The Netherlands
Author to whom correspondence should be addressed.
Received: 26 October 2020 / Revised: 4 February 2021 / Accepted: 8 February 2021 / Published: 16 February 2021
(This article belongs to the Special Issue Intelligence and Creativity)


Despite decades of extensive research on creativity, the field still contends with psychometric problems when measuring individual differences in creative ability and people’s potential to achieve real-world outcomes that are both original and useful. We think these seemingly technical issues have a conceptual origin. We therefore propose a minimal theory of creative ability (MTCA) to provide a consistent conceptual framework for investigations of individual differences in creative ability. Building on robust theories and findings in creativity and individual differences research, we argue that a theory of creative ability, at a minimum, must include two facets: intelligence and expertise. The MTCA simply claims that whenever we do something creative, we use most of our cognitive abilities combined with relevant expertise. The MTCA has important implications for creativity theory, measurement, and practice. However, the MTCA is not necessarily true; it is a minimal theory. We discuss and reject several objections to the MTCA.
Keywords: creativity measurement; creative potential; divergent thinking; intelligence; expertise

1. Introduction

The creative act—from a toddler’s first building-block tower to Picasso’s Guernica—is a fascinating achievement of the human cognitive system. Creative ability is generally summarized in a short standard definition: the ability to produce original and useful products (Runco and Jaeger 2012), a definition that applies to all domains of creativity, from humor to the culinary arts and from science to inventions. Creativity is an essential skill the human race needs to solve global challenges such as climate change or privacy in a digital world. Organizations—from cancer research institutes to the fashion industry—are all looking for people with strong creative abilities (Ananiadou and Claro 2009; Casner-Lotto and Barrington 2006; IBM 2010). Creativity is therefore one of the most important 21st century skills educators want to instill in their students (Henriksen et al. 2016). In order to select creative people or track creativity development, good psychometric tests that reliably distinguish varying levels of creative ability (between people or over time) are a necessity. Unfortunately, existing instruments purporting to measure creativity often suffer from conceptual and psychometric shortcomings and do not meet the high requirements needed for personnel selection or for measuring change (e.g., Baer 1994a; Barbot 2019; Montag et al. 2012; Said-Metwaly et al. 2017).
In our view, these seemingly technical issues have a conceptual origin. Creativity research lacks a clear formal model to understand and measure creative ability (Glăveanu 2012). In this paper, we propose a minimal theory of creative ability (MTCA), with creative ability being defined as people’s potential to achieve real-world outcomes that are both original and useful. MTCA describes which facets a theory of creative ability, at a minimum, must include. MTCA builds on earlier proposals for a modest conceptualization of creative ability (e.g., Baer 2012; Ericsson 1999; Runco 2009; Silvia 2015; Simonton 2003b). We reformulate these prior ideas into a more precise and concise form. We propose that, while many different variables are associated with creativity, only intelligence and expertise are essential for explaining and predicting individual differences in real-world creativity. The MTCA has important implications. It offers a parsimonious account for classic phenomena and findings in creativity research. It also provides clear guidelines for measuring creative ability and predicting creative achievements. The MTCA is in essence a minimal theory, and we discuss and dismiss possible variables and phenomena that could falsify it.

2. Positioning the MTCA

The MTCA is a theory grounded in differential psychology, the field Cronbach (1957) describes as the psychology of individual differences. The aim of differential psychology is to determine the nature, magnitude, causes, and consequences of psychological differences between individuals in the general population. With the MTCA, we address how and why people differ in real-world creativity, where the major challenge is measuring creative ability. Note that we focus on creative ability, not the creative process (for different accounts of the creative process, see e.g., Gabora 2017; Hélie and Sun 2010; Kozbelt et al. 2010; Nijstad and Stroebe 2006; Schmidhuber 2010; for a thorough analysis of the study of individual differences versus processes, see Cronbach 1957). The MTCA therefore does not answer questions such as “How does the creative act take place?” or “What happens in the brain during creative problem solving?”. Instead, we focus on questions such as “What are the components of creative ability?”; “How can creative ability be measured?”; “Are individual differences in creative ability stable over time?”; and “Can creative ability be trained?”. We show that the MTCA provides clear answers to each of these questions.
Because the MTCA is concerned with individual differences in creative ability, it is positioned in the Person perspective of creativity. The Person perspective represents one of the four general approaches to creativity research and focuses on which characteristics make a person creative (Rhodes 1961). With the MTCA, we argue that intelligence and expertise are the essential person characteristics that distinguish between people scoring high or low on creative ability. The MTCA is also concerned with the Product perspective of creativity, which focuses on the extent to which ideas, acts, and output are judged, often by relevant experts and stakeholders, to be creative (Amabile 1982; Montag et al. 2012; Simonton 2003c). Real-world creative products or outcomes are the most important criteria that the MTCA strives to predict. These creative outcomes may range from personal and everyday creative outcomes to eminent creative contributions (Boden 2004). For instance, they may refer to a sudden insight into how to solve a Sudoku (mini-c creativity); a new vegetable dish that your toddler enjoys eating (little-c creativity); a newly developed creativity test (Pro-c creativity); or a renowned paradigm-changing scientific theory (Big-C creativity). As such, the MTCA addresses creative ability and potential at all levels, from mini-c to Big-C (Kaufman and Beghetto 2009).
The Process perspective, concerned with the cognitive processes that take place when someone is being creative, is not central to the MTCA. The MTCA only addresses research positioned in the Process perspective inasmuch as it says something about creative abilities. For example, we argue that many cognitive processes are involved in being creative and therefore cognitive ability is related to creative ability (e.g., Kozbelt et al. 2010; Simonton 2003b; Mumford and McIntosh 2017). Similarly, we only address the Press perspective, which investigates the circumstances or states that influence the expression of creative ability, to the extent that it informs us about creative abilities. For instance, environmental stressors, such as noise and cognitive load, make it harder for people to be creative because they hinder the execution of cognitive abilities (Byron et al. 2010; De Dreu et al. 2012). So, the MTCA does not deny that creative processes exist (it actually assumes their existence), nor that there are important situational constraints and triggers that influence how creativity emerges. However, the MTCA simply is not a theory about the creative process or environment, but about individual differences in creative ability (cf. Cronbach 1957; see e.g., Eysenck 1995, for a model in which both individuals and environmental factors interact to express creative achievements).
The MTCA is based on two key assumptions. The first assumption is that in order to be creative we use a wide variety of cognitive functions: functions that are generally assessed in intelligence tests. The second assumption is that creativity relies on the novel combination of existing knowledge and skills, i.e., expertise. Combined, these assumptions lay the groundwork for the MTCA, where intelligence and expertise are essential to understanding why people differ in creative ability. Below, we explore these assumptions further and conclude with a quasi formula for a minimal theory of creative ability.
Assumption 1.
Creative ability requires many different cognitive abilities.
The MTCA is based on the common assumption that we use a wide variety of our cognitive abilities to be creative (and probably tap all our cognitive functions across different tasks and settings). Whatever the challenge is, we use many cognitive abilities, such as those assessed in intelligence tests. For instance, we may use our perceptual abilities, conduct convergent and divergent thinking, search our memory for analogies, make use of our knowledge base, focus our attention, acquire new knowledge, etc. (e.g., Amabile and Pratt 2016; Beaty et al. 2014; Benedek et al. 2014; Kuncel et al. 2004; Newell and Simon 1972; Nijstad and Stroebe 2006; Sternberg et al. 2019).
We consider this assumption rather uncontroversial because most theories of creativity are consistent—or at least not inconsistent—with the idea that many cognitive functions are needed to be creative. For example, process theories of creativity are consistent when they describe the creative process in terms of consecutive stages (e.g., problem identification, idea generation, idea evaluation; see e.g., Basadur et al. 1982; Mumford and McIntosh 2017; Perry-Smith and Mannucci 2017; Wallas 1926) or some sort of cycle (e.g., including repetition and recursion) in which different executive functions play a role (for an overview, see Kaufman and Glăveanu 2019; Kozbelt et al. 2010). This also applies to Amabile’s (1982) componential model and other componential theories that incorporate several creativity-relevant cognitive processes (e.g., breaking cognitive and perceptual sets, remembering accurately). We think typical cognitive process theories such as the Geneplore model (Finke et al. 1992) or Hélie and Sun’s (2010) Explicit–Implicit Interaction theory are also consistent with this assumption, as they involve many cognitive functions. For instance, according to the Geneplore (generate–explore) model, people first retrieve existing elements from memory, form simple associations among these elements, and integrate and transform them. The new ideas that result from these generative processes are then explored for their implications, checked against criteria and constraints, and, if needed, refined (Finke et al. 1992).
Furthermore, theories on creativity that stress the importance of one cognitive function, such as associative thinking (Mednick 1962) or incubation (Sio and Ormerod 2009; Wallas 1926), are not inconsistent with our assumption as long as they acknowledge that other cognitive processes play a role in creativity as well. Inconsistency arises when a theory assumes that only one specific function is essential. This occurs when creative ability is reduced to divergent thinking ability only, for instance, as some psychometric theories of creativity propose (see Kozbelt et al. 2010).
Assumption 2.
Creative ability requires expertise.
We further assume that people apply their cognitive functions to analyze, combine, and integrate existing knowledge and skills to be creative (Simonton 2003b). This is why being creative in any domain also requires expertise in the domain at hand. Thus, the MTCA is based on the assumption that creativity always requires expertise (Baer 2011a, 2011b; Kim 2011a, 2011b; Plucker and Beghetto 2004). As with the “many cognitive abilities” assumption, we are certainly not the first, nor the only researchers to make this claim. A number of creativity theories recognize that creativity manifests in specific domains. For example, Baer and Kaufman (2005) use an amusement park metaphor to describe how domain specific creative abilities (e.g., writing sonnets) are related to broader and broader abilities (e.g., poetry writing, creative writing), where each level has its own requirements in terms of domain specific knowledge (e.g., word meanings) and skills (e.g., spelling, grammar, rhyming). Domain specificity also follows from Amabile’s (1982) componential model, where people use domain relevant skills to be creative and their motivation to be creative may be very specific to tasks within particular domains. Another example stems from Darwinian creativity models that argue that creativity involves a process of random generation and selective retention and elaboration of ideas (e.g., Campbell 1960; Simonton 1999b; but see Gabora 2017 for an alternative evolutionary model). According to these models, people’s brains produce (quasi)random variations of existing ideas that are part of a creator’s knowledge base, and the greater the knowledge base, the more potential new combinations that are truly creative are possible.
The domain specificity of creative output is also reflected in concepts such as Boden’s (2004) H-creativity and Big-C creativity (Kaufman and Beghetto 2009), which typically emerge from applying domain specific knowledge acquired over at least a decade. This is supported by archival evidence (e.g., Simonton 1991). For instance, Hayes (1989) found that eminent composers required at least 8 years of musical study before they wrote a masterwork, and the vast majority required at least 10 years. From the perspective of expertise development, practicing skills, engaging in activities, and enriching the knowledge base in a particular domain provide the necessary building blocks for creative work in that domain (Beghetto and Kaufman 2007; Ericsson et al. 1993; Simonton 2008; Weisberg 1999, 2018).
Although Big-C creative achievements in a particular field often require 10 years of expertise development in that field, there are notable exceptions. For instance, breakthrough inventions may open up new territory for accelerated discovery, sometimes resulting in entirely new domains. Celestial objects and phenomena were suddenly observable with Galileo’s refinement of the telescope, thereby explosively advancing the field of astronomy (Simonton 2012); and because microscopic organisms and cells could suddenly be explored with Van Leeuwenhoek’s single-lensed microscope, an entirely new domain, microbiology, came into being (Simonton 2012). However, even then, people build on their expertise (e.g., knowledge, scientific observation and reporting skills) to make discoveries and to build and advance a domain.

3. A Quasi Formula for the MTCA

In sum, the MTCA assumes that being creative requires a variety of cognitive functions similar to those generally assessed in intelligence tests and expertise relevant to the domain at hand. Although there may be many factors that predict real-world creativity, we argue that only intelligence and expertise are essential for explaining individual differences in real-world creativity. The minimal theory of creative ability (MTCA) can be expressed by the following quasi formula:
C (Creativity) = I (Intelligence) × E (Expertise).
Thus, for a highly intelligent person without any expertise (in, say, culinary arts), C would be zero (in the domain of culinary arts). Furthermore, a resourceful and inventive craftsman without any formal education, perhaps even unable to read, may come up with creative solutions for problems in his or her area of expertise. Yet, creativity always requires some level of intelligent processing. The MTCA simply claims that whenever we are confronted with a creative task, we use a wide variety of cognitive abilities (e.g., our memory, our ability to reason by analogy, our visual–spatial skills, our metacognitive capacities, etc.) in combination with relevant expertise (e.g., experience with carpentry techniques, paint textures, conducting scientific experiments, etc.) to be creative. That is still a lot, but that is all there is. We argue that there is no special creative talent or faculty, nor is there a specialized (brain) area for being creative (e.g., Dietrich and Kanso 2010).
Just as with other theories, we suggest that creative ability constitutes a multiplicative relation of multiple factors. For instance, Eysenck (1995) defined creativity as a product of personality, environmental variables and cognitive ability, including intelligence and knowledge, whereas Simonton (1999a) and Jauk et al. (2013) used the multiplicative relation between multiple factors to explain skewed distributions in creative achievement. In the case of MTCA, the multiplicative relationship is limited to intelligence and expertise. The multiplication sign indicates that I and E both are necessary for creativity, but can also compensate for each other.
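The two defining properties of the quasi formula—necessity of both factors and partial compensation between them—can be illustrated with a toy sketch. This is purely illustrative: the 0–1 scales and the particular numbers are our own assumptions, not validated measures of intelligence or expertise.

```python
# Toy sketch of the MTCA quasi formula C = I × E.
# The 0-1 scales below are illustrative assumptions, not validated measures.

def creative_ability(intelligence: float, expertise: float) -> float:
    """Multiplicative MTCA quasi formula: both factors are necessary."""
    return intelligence * expertise

# Necessity: without expertise in a domain, creative ability in that
# domain is zero, however intelligent the person is.
print(creative_ability(1.0, 0.0))  # 0.0

# Compensation: the same product can arise from different mixes of
# intelligence and expertise.
print(creative_ability(0.9, 0.4) == creative_ability(0.4, 0.9))  # True
```

An additive formula (I + E) would not capture the necessity property, since a zero on one factor could be fully offset by the other; the multiplication sign encodes exactly this asymmetry.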

4. Phenomena Consistent with the MTCA

As a parsimonious framework of creative ability, the MTCA is consistent with a number of recurring phenomena in creativity research.
Phenomenon 1.
Creativity is related to intelligence.
Creativity research is replete with evidence that intelligence is related to performance on commonly used tests of creative ability, such as divergent thinking, as well as to creative achievements (De Dreu et al. 2012; Jauk et al. 2013; Silvia 2015; Sternberg et al. 2019; see Sternberg and O’Hara 2000 for an in-depth discussion of the relationship between intelligence and creativity). For instance, Silvia (2015) estimates the correlation between IQ and creative ability as measured with divergent thinking tests to be r ≈ 0.50. In addition, tests of intelligence and executive functioning are robust predictors of creative eminence and actual creative achievements (Jauk et al. 2013; Karwowski et al. 2016; Silvia 2015; Simonton 2003a). The idea that cognitive abilities are strongly related is not new. Different subtests of IQ tests show a positive manifold of intercorrelations, and different composite scores (factor analytic or sum scores) correlate very strongly (van der Maas et al. 2006). According to this perspective, cognitive abilities that some creativity theories consider special to creativity, such as divergent thinking, are simply part of a bigger construct: intelligence. Although divergent thinking tests are not often part of IQ tests, we see no reason why valid divergent thinking tests with good test–retest reliability (e.g., with more items and automated scoring) should not be included. In fact, there have been attempts to make this happen (Kaufman et al. 2011; Süß and Beauducel 2005). We note that specific creative achievements may require a different balance of cognitive functions (Murphy 2017): writing a poem requires more language abilities than solving a chess puzzle. However, because cognitive abilities are strongly correlated (van der Maas et al. 2006), and because we use a wide variety of cognitive abilities to analyze, combine, and integrate existing knowledge and skills during creative problem solving, it is hardly surprising that general intelligence tests robustly predict real-world creativity.
Phenomenon 2.
Creative achievement is domain specific.
Most creativity tests correlate weakly with one another, i.e., they have low convergent validity. This could partly be due to the low test–retest reliability (i.e., stability) of standardized creativity tests and the resulting error variance (Barbot 2019; Cropley 2000). However, an even more important reason follows from the MTCA: creative achievements build on expertise, and because expertise is domain specific, creative achievements are also domain specific. Whatever the challenge, people simply have to work with the knowledge and skills that they have. An experienced sketch artist with good drawing skills may score higher on the figural subtest of the Torrance Tests of Creative Thinking (TTCT; Torrance 2008) than an experienced creative writer, who, in turn, is more likely to score higher on the verbal subtest of the TTCT and the Remote Associates Test (Mednick 1963). The MTCA thus also explains why many experts excel in their profession but fail to find creative solutions for simple tasks in other domains (Kaufman et al. 2010a). More generally, the expertise component of the MTCA clarifies the substandard construct validity of existing creative ability tests (cf. Baer 2012; Han 2003). A prime example of poor construct validity is that the figural and verbal components of the popular TTCT correlate < 0.10 (Baer 2011a, 2011b). Another example is that creativity scores from different domains (e.g., poetry and mathematics) tend not to correlate, or correlate only weakly (Baer 1994a, 1994b; Runco and Albert 1985; Simonton 2003b).
Domain specificity may also explain the low predictive validity of domain general creativity tests, such as the Alternative Uses Task (AUT; Guilford 1967). Performance on such creativity tests generally correlates weakly with objective indicators of overt creative behaviors (Kim 2008; Zeng et al. 2011). For example, in a meta-analysis, Kim (2008) found a correlation of 0.22 between divergent thinking test scores and creative achievement. Note that for IQ, correlations with external criteria such as job success are reported to vary between 0.27 and 0.61 (for reviews, see Schmidt and Hunter 1998; Sternberg et al. 2001), and predictive values of IQ as high as 0.81 have been reported for educational achievement (Deary et al. 2007).
Phenomenon 3.
Many experts have a few creative achievements; few experts have many.
The distribution of achievements is often highly skewed in the population. For example, Murray (2009) describes this for professional golfers: numerous players have won one or two tournaments, only four have won more than 30 tournaments, and only one player (Jack Nicklaus) has won 71 tournaments (Murray 2009). This distribution is referred to as the Lotka curve and also applies to creative achievements. The curve was first described by Lotka (1926) as a power law function where many authors produce a few publications, but only a few authors produce many publications (e.g., Simonton 2003c). Lotka’s curve has been found to hold for achievements in numerous domains, including chess and music composition (Murray 2009; Simonton 1999c). Many models have been proposed to explain this phenomenon (for overviews, see Den Hartigh et al. 2016; Simonton 1999a). Generally, these models invoke some combination of factors, such as latent ability and number of produced outputs. In the MTCA, creative ability is a combination of intelligence, which is generally assumed to be normally distributed in the general population, and expertise, which is either normally (in very selective samples) or exponentially distributed. Multiplying the normal distribution of intelligence by the skewed distribution of expertise results in a skewed product, which may explain why the distribution of creative achievements is also skewed (Den Hartigh et al. 2016; Simonton 1999a).
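This distributional argument can be demonstrated with a small simulation. The sketch below is ours, not taken from the cited models; the particular choices (intelligence drawn from a Normal(100, 15) distribution, expertise from a unit-rate exponential) are illustrative assumptions, chosen only to show that the product of a normal and an exponential variable is right skewed.

```python
# Illustrative simulation: a normally distributed intelligence factor
# multiplied by an exponentially distributed expertise factor yields a
# right-skewed product, echoing the Lotka curve for achievements.
# Distributions and parameters are assumptions, not fitted to data.
import random
import statistics

random.seed(42)
n = 100_000
# Intelligence: Normal(100, 15), truncated at zero for the product.
intelligence = [max(0.0, random.gauss(100, 15)) for _ in range(n)]
# Expertise: exponential, i.e., many novices and few deep experts.
expertise = [random.expovariate(1.0) for _ in range(n)]
creativity = [i * e for i, e in zip(intelligence, expertise)]

# Right skew: the mean of the product exceeds its median.
print(statistics.mean(creativity) > statistics.median(creativity))  # True
```

The asymmetry in the expertise distribution dominates the product, so most simulated "creators" end up with modest scores while a small tail reaches very high ones, as the Lotka curve describes.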
Phenomenon 4.
Creative achievements across a career follow an inverted U-curve.
Archival studies show how creative achievements unfold over the career of a creator, be it a scientist, composer, or painter (e.g., Ericsson 1999; Kozbelt 2008; Simonton 1997). What these studies tend to show is that the relationship between the number of creative achievements and career age follows an inverted U-shape: the number of creative achievements steeply rises in the early decades of a career, then plateaus, and slowly declines. The increase in creativity early in a career is well explained by expertise development, where the age curve for productivity appears to be a function of career age rather than chronological age (e.g., Khan and Minbashian 2019). Immersion in a domain over time leads to an increase in the knowledge, activities, constraints, skills, and procedures of a particular domain (Kozbelt 2008). By enriching the knowledge base in a particular domain, the creator acquires the necessary building blocks for creative work in that domain (Ericsson et al. 1993; Simonton 2008; Weisberg 1999, 2018). However, assuming that creators continue to develop their expertise across their career, why do their creative achievements not show a similar linear increase? Apart from extraneous factors that are beyond the scope of the MTCA (e.g., successful scientists get managerial positions that limit their time to express their creative ability), the plateauing of creativity can be explained by the S-shaped curve of expertise acquisition: after a period of steep increase, expertise gains level off (Krampe and Charness 2018). That creativity ultimately declines may be caused by various extraneous factors, such as poor health or changes in priorities or interest (Kozbelt 2008), but can also be explained by the fact that cognitive abilities, in particular what is generally considered fluid intelligence, tend to decline during adulthood (Salthouse 2009).
Simonton (1997) presented an endogenous explanation of these growth patterns. In future work, an elaboration of the MTCA formula into such a dynamic model might be of interest.
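The qualitative argument above can be made concrete with a toy sketch: multiplying an S-shaped expertise curve by a slowly declining fluid-ability curve yields an inverted U over career age. The logistic growth rate and linear decline rate below are arbitrary illustrative assumptions, not parameters fitted to archival data.

```python
# Toy sketch of Phenomenon 4 under MTCA-style assumptions: output over a
# career as the product of S-shaped expertise growth and a slow decline
# in fluid ability. All parameters are arbitrary and illustrative.
import math

def expertise(career_year: float) -> float:
    # Logistic (S-shaped) growth: steep early gains that level off.
    return 1.0 / (1.0 + math.exp(-0.4 * (career_year - 10)))

def fluid_ability(career_year: float) -> float:
    # Gradual linear decline across the career (illustrative rate).
    return max(0.0, 1.0 - 0.015 * career_year)

# Product over a 40-year career: rises, peaks mid-career, then declines.
curve = [expertise(t) * fluid_ability(t) for t in range(0, 41)]
peak_year = curve.index(max(curve))
print(0 < peak_year < 40)  # True: the peak is in the interior (inverted U)
```

With these particular rates the product peaks roughly in the second decade of the career; changing the growth or decline parameters shifts the peak but preserves the inverted-U shape, which is the point of the sketch.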
Phenomenon 5.
Limited effects of creativity training.
Many forms of creativity training exist, most of which focus on the enhancement of a specific ability, such as divergent thinking or convergent thinking (Scott et al. 2004). Many studies show that creativity training effects are generally limited to the abilities that are specifically targeted and hardly generalize to real-world creativity (Baer 2012; J. M. Baer 1988; Scott et al. 2004). This lack of transfer makes perfect sense according to the MTCA: people rely not on a single cognitive ability but on many to turn domain specific expertise into real-world creative output. The same studies also show that creativity training effects tend to increase when training is applied to exercises and examples within a relevant domain (Scott et al. 2004). This small but nevertheless robust strengthening effect also makes perfect sense according to the MTCA: real-world creative output builds on domain specific expertise, and relevant examples and exercises during training may help people successfully apply newly acquired skills in their field of expertise.
Phenomenon 6.
Absence of a specific neural basis of creativity.
Decades of research exploring the neural correlates of creativity have failed to show a specific brain area involved in creativity (Dietrich and Kanso 2010). If anything, this work shows diffuse prefrontal activation during performance on creativity tasks (Dietrich and Kanso 2010). In line with this finding, more recent neural models of creativity include a large prefrontal network that is implicated in controlled memory retrieval and central executive processes (Beaty et al. 2016, 2018). This is much in line with the MTCA, which predicts that performance on creative ability tests relies on the many cognitive abilities that underlie intelligence in general.

5. Implications

The MTCA has four important implications for creativity research and practice. First, because it relies on intelligence and expertise, creative ability has both a domain general component (i.e., intelligence) and a domain specific component (i.e., expertise) (e.g., Barbot et al. 2016; Plucker and Beghetto 2004). Because creative abilities depend on domain specific expertise, creative achievements are by definition domain specific. Baer (2012) suggests that, for each domain, we need “mini theories” of creative ability that specify the particular skills, knowledge, and cognitive abilities that are required. Indeed, researchers probably need to specify the domain specific knowledge and skills that are relevant for a particular domain. For example, visuospatial abilities are probably more important in the visual arts and architecture, whereas verbal comprehension abilities may be more relevant for poetry. These specific cognitive abilities may have greater predictive value for creative performance in those domains than an entire intelligence test (cf. Benedek et al. 2014). However, given the positive manifold of cognitive abilities, it may not make much difference exactly which domain general abilities are assessed, as they are all highly correlated.
Second, because expertise and intelligence are essential components of creative ability, assessing individual differences in real-world creative potential should at the very least be done with tests of domain expertise and an IQ test. Creativity is ranked by managers among the most important skills of the 21st century (Ananiadou and Claro 2009; Casner-Lotto and Barrington 2006; IBM 2010), so if one wants to predict whether a job candidate will be creative, one should at least test for expertise and intelligence. The good news, as explained below, is that assessing intelligence and expertise is rather straightforward; both types of tests have good psychometric properties and can easily be administered by an HR team.
Third, training in domain general activities, such as divergent thinking, will not likely transfer to real-world creative achievements. Instead, creative achievements can only be enhanced by improving domain specific expertise (for recent discussions, see Ericsson and Harwell 2019; Macnamara et al. 2014). Whether intelligence can be trained or modified is another debate (Jaeggi et al. 2008), but domain specific expertise can certainly be developed. It is one of the main tasks of our educational system.
Fourth, although the MTCA is primarily concerned with individual differences in creative ability, it has implications for the environmental factors that constrain or facilitate creative achievements. On the one hand, a conducive environment, for example at someone’s work, can facilitate creative achievements (Amabile et al. 1996). On the other hand, situational factors that tax people’s cognitive abilities are expected to diminish people’s capacity for creativity. This is indeed what work on the effects of environmental stressors, such as noise and cognitive load, shows (e.g., Byron et al. 2010; De Dreu et al. 2012).

6. Objections and Limitations

There are several possible objections to, and limitations of, the MTCA. First, we accept the possibility that creativity is associated with other person characteristics such as positive affectivity, intrinsic motivation, or vulnerability to psychopathology (e.g., Amabile et al. 1996; Baas 2019; Baas et al. 2016; Hennessey 2019; Sternberg 2018). We do, however, propose that these characteristics are not essential factors for real-world creativity to emerge. For instance, someone may also achieve a creative feat while being generally grumpy, driven by dreams of glory and fame, and without psychopathological symptoms (Baas 2019; Baas et al. 2016; Byron and Khazanchi 2012). In addition, the correlations between these person characteristics and creative tests and achievements tend to be quite low. For example, the correlations with creativity for depressive mood (r = −0.06; Baas et al. 2019) and positive affectivity (r = 0.08; Baas et al. 2008) are close to zero.
Of all personality characteristics, openness to experience is perhaps the single best predictor of creativity. Decades of research show that the personality trait openness to experience is related to divergent thinking (Gocłowska et al. 2019), distinguishes between scientists and artists who are low or high on creativity (Feist 1999, 2019), and predicts creative achievements across the lifespan beyond intelligence (Feist and Barron 2003). Moreover, the reported correlation of openness to experience with creative achievements is sometimes higher than that of intelligence (e.g., Carson et al. 2005; Jauk et al. 2013). Indeed, openness to experience is included in some creativity models (e.g., Eysenck 1995; Lubart and Guignard 2004). So, shouldn't openness to experience, then, be included in a minimal theory of creativity? Again, we believe that openness to experience may not be essential. First, people can achieve remarkable creative feats by rather closed-mindedly exploring a narrow domain (Nijstad et al. 2010; Zabelina and Beeman 2013). Second, many studies, including the one by Carson et al. (2005), used university student samples. This greatly restricts the possible variance in intelligence of the sample without constraining variance in openness to experience, which, in turn, limits the effect sizes that can be obtained (Hunter and Schmidt 2004). In fact, using a somewhat more representative sample, Jauk et al. (2013) obtained structural equation models in which openness to experience was related to creative achievements only indirectly, through engagement in creative activities. It was intelligence, crucial to turning creative activities into creative achievements, that was the main predictor of creative achievements (Jauk et al. 2013). Third, further complicating matters, openness to experience is moderately related to both intelligence and expertise.
Openness to experience correlates with IQ scores (Harris 2004; Chamorro-Premuzic and Furnham 2006), and in some models of personality structure, intelligence and openness to experience are facets of a common factor (DeYoung 2006). In addition, with a preference for engaging in novel and varied experiences, people high in openness to experience are more likely to develop broad and rich expertise (Silvia and Sanders 2010). Before we can accept openness to experience as a third, crucial factor of creative ability, high-powered studies with a representative sample of the normal population are needed to show that openness to experience significantly adds to the prediction of creative performance above and beyond Intelligence (I) and Expertise (E) and their interaction (I × E; see also the Discussion below).
Second, one may argue that the MTCA relies too much on expertise and intelligence as "can" factors, but misses out on motivation as the "want" factor. According to this perspective, people, with all their cognitive abilities and expertise, achieve nothing if they are unmotivated to put their abilities into practice (Amabile 1983). First, we would like to remind the reader that the MTCA, based in differential psychology, is a theory to better understand individual differences in creative ability; i.e., "can" factors are key. We do not deny that motivational factors, such as persistence, interest, and passion, are important for the expression of creative ability in real-world creative outcomes (e.g., Amabile et al. 1996; Grohman et al. 2017; de Jesus et al. 2013). Furthermore, motivation is required to develop expertise (Ericsson et al. 1993). However, motivation is volatile and easily influenced by environmental factors (e.g., Amabile et al. 1996). As such, rather than being an essential component of creative ability, motivation may be better studied from the Press perspective, a line of research that is critical for understanding how the expression of creative ability is facilitated or constrained by the situation.
Third, the MTCA rests heavily on two concepts, intelligence and expertise, that are themselves not without problems. The concept of intelligence is surrounded by controversies. It is, for instance, unclear whether intelligence is a unitary construct (Jensen 1998), a set of modules (Gardner 2011), or a network of interacting cognitive functions (van der Maas et al. 2006, 2017). However, in all these theories intelligence is a broad construct that captures a wide variety of cognitive functions, and that is also how it is measured. Furthermore, expertise, which refers to the characteristics of highly skilled and knowledgeable people "who are consistently able to exhibit superior performance for representative tasks in a domain" (Ericsson 2018, p. 14), is a research area with its share of debates (for instance, on the role of talent; Colvin 2011). Both concepts certainly require further clarification. However, there is one essential difference between these concepts and creativity: measuring general intelligence or specific expertise is much less problematic than measuring creative abilities such as divergent thinking. Despite many limitations, IQ tests belong to the best tests produced by the field of psychology (Gottfredson 1997). Subtests of IQ tests show a positive manifold of intercorrelations (i.e., high convergent validity). Different composite scores (factor-analytic or sum scores) correlate very strongly. Retest reliabilities of IQ (sub-)test scores are high, and criterion validity (in terms of the prediction of scholastic success) is also high (e.g., Deary et al. 2007). As expertise is domain specific by definition, generally applicable expertise tests do not exist. Yet, domain specific expertise can often be tested with existing instruments, such as college exams or certification exams.
In a famous area of expertise research, chess, van der Maas and Wagenmakers (2005) found that different measures of chess ability show the same positive manifold of correlations that intelligence subtests do. In addition, these measures were excellent predictors of chess performance (Elo ratings). Tests of domain expertise of good quality exist for numerous scholarly domains (e.g., GRE Subject tests; Powers 2004). While objective tests of creativity in artistic domains are problematic, constructing good tests of technical skills in these domains appears far less challenging (e.g., Chan and Zhao 2010; Law and Zentner 2012). Given the narrow set of skills to be assessed in a test of expertise, it is relatively easy to meet the high psychometric standards required for measuring individual differences.
Fourth, expertise and general intelligence are not entirely independent constructs. On the one hand, the definitions of crystallized intelligence and expertise overlap somewhat. On the other hand, we would expect an interaction between expertise and general intelligence because intelligence plays a role in developing expertise (Deary et al. 2007). For example, a minimum level of general intelligence is clearly required to reach high levels of expertise in, say, music composition. However, this developmental dependency is only strong in the early stages of skill acquisition; in later stages the relationship between intelligence and level of expertise breaks down (Krampe and Charness 2018). For example, chess performance (Elo ratings) can be predicted in part by a player’s intelligence, but chess experience remains the strongest predictor of attained skills (Grabner et al. 2007). We note that for many studies that fail to show a significant correlation between intelligence and creative performance or between intelligence and expertise, the sample is restricted to experts (Bayer and Folger 1966; Cole and Cole 1973). The restriction of range in a key variable is an important methodological limitation. This restriction of range can of course also occur in practice. Suppose one wants to hire a full professor. The variation in IQ and expertise of the sample of candidates might be very small, such that these measures are not informative. Luckily, in these very restricted cases, one can consider creative achievement directly. In less restricted samples, for college admission for instance, we expect added value for both IQ and expertise measures.
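The attenuating effect of range restriction on correlations is easy to demonstrate numerically. The following Python sketch is purely illustrative: the population correlation of 0.50 between IQ and a creativity criterion, and the selection cutoff of IQ ≥ 115 (roughly a selective university sample), are assumed values, not estimates from any study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate IQ scores and a creativity criterion that correlates r = 0.50
# with IQ in the full population (parameters are illustrative assumptions).
n = 100_000
iq = rng.normal(100, 15, n)
z_iq = (iq - 100) / 15
criterion = 0.5 * z_iq + np.sqrt(1 - 0.5**2) * rng.normal(0, 1, n)

def corr(x, y):
    """Pearson correlation between two arrays."""
    return np.corrcoef(x, y)[0, 1]

r_full = corr(iq, criterion)

# Restrict the sample to IQ >= 115, mimicking a selective student sample.
mask = iq >= 115
r_restricted = corr(iq[mask], criterion[mask])

print(f"full-range r:  {r_full:.2f}")       # close to 0.50
print(f"restricted r:  {r_restricted:.2f}") # substantially attenuated
```

Under these assumptions, the correlation in the restricted sample drops to roughly half its population value, which is why low IQ–creativity correlations in student or expert samples do not by themselves contradict the MTCA.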
Fifth, the MTCA does not incorporate divergent thinking as a separate critical component of creative ability. We certainly consider divergent thinking part of the MTCA, but as one of the many cognitive abilities that support creativity. Thus, as discussed under Phenomenon 1, it would fall under Intelligence. Divergent thinking tests are not often part of standard IQ tests (Kaufman 2015). However, we do not see why a valid and reliable test of divergent thinking could not be included in intelligence assessment because, in our view, it falls within the positive manifold of cognitive abilities that IQ tests purport to measure (cf. Silvia 2015). Some study results suggest that incorporating divergent thinking may increase the predictive validity of intelligence tests (Kim 2008; Plucker 1999).
Sixth, and finally, it could be argued that the MTCA focuses too much on Pro-c and Big-C creativity (e.g., scientists developing a new creativity test, people who have received a Nobel prize) rather than one's personal and everyday creative insights (mini-c and little-c creativity, e.g., a child who endearingly impersonates a firefighter with imaginary attributes; Boden 2004; Kaufman and Beghetto 2009). Indeed, the creative process hardly differs across these levels of creativity. However, rich and strong expertise within a particular domain may be more critical for Pro-c and Big-C creative achievements within that domain (Kaufman and Beghetto 2009). Yet one also needs some expertise for mini-c and little-c creativity. For example, when impersonating a firefighter, you need to know which attributes belong to a firefighter and use creative problem solving to find original alternatives within your reach. So, for mini-c and little-c creativity, intelligence may weigh more heavily than expertise, but, still, both I and E are required.
A possible limitation is that the MTCA ignores that creative achievements often, and increasingly, emerge in social networks (Perry-Smith and Mannucci 2017). Creators may greatly benefit from the expertise of others, which may provide the missing piece of the puzzle that creators are working on (Hargadon and Bechky 2006; Johnson 2010). As a consequence, person characteristics that facilitate help seeking and giving may be important for creativity to emerge in social settings. Still, the importance of expertise and intelligence is upheld. Once shared, the expertise of others becomes part of one's own expertise. Although people may certainly borrow the expertise of others, they still need the cognitive abilities to process, assimilate, develop, and integrate the knowledge and perspectives that others offer (Nijstad and Stroebe 2006).
Another limitation is that the MTCA primarily focuses on the standard definition of creative achievement, which refers to recognized outcomes that are both novel and useful (e.g., Montag et al. 2012; Runco and Jaeger 2012). However, sometimes other creativity criteria are added, including surprise (Simonton 2012) and aesthetic value and authenticity (Kharkhurin 2014). How flexible is the MTCA in dealing with such additions to the standard definition? Indeed, the MTCA can also help to explain variance in creative outcomes when other dimensions or criteria are considered. For instance, in Kharkhurin's conception of creativity, aesthetic value concerns the content and techniques of an (artistic) work; an outcome is high on aesthetic value when it presents the fundamental truth of nature, strives to arrange expressive elements in a perfect order, expresses the essence of phenomenal reality in an efficient manner, and is sufficiently complex. Using their cognitive abilities, people analyze and explore a particular phenomenon to better capture its essence, and using their expertise, they give content and context to an idea and apply skills and techniques. This is, for instance, what artists do when preparing a work of art and then employing their skills (e.g., painting techniques) and knowledge (e.g., about materials) to create the artwork. Authenticity refers to creative work that expresses a creative person's inner self (Kharkhurin 2014). Obviously, people need self-knowledge (a form of expertise) to make an authentic work. However, this aspect of creativity may not be assessable by independent stakeholders/experts and, as a personal judgment, would fall at the mini-c level of creativity.

7. Discussion

With the Minimal Theory of Creative Ability (MTCA) we contend that intelligence and expertise are the only necessary components of the ability to be creative, the ability to produce novel and useful ideas or products (e.g., Montag et al. 2012; Runco and Jaeger 2012). The MTCA builds on previous theories, models, and findings in creativity research and offers five important contributions. First, the MTCA is carefully cast in an individual differences (as opposed to a process) approach (cf. Cronbach 1957), thereby advancing conceptual clarity. Second, the MTCA is a formal theory that is both simple and falsifiable, paving a path for further rigorous creativity research. Third, we compiled a comprehensive list of six robust phenomena in creativity research that any theory of creative ability should account for, and the MTCA does. Fourth, we discussed six possible objections to the MTCA and refuted them. Fifth, the MTCA has an important implication for measuring creative ability because it reduces creative ability to just two other measurable concepts. We claim that creative ability can best be measured with a combination of an intelligence test and tests of expertise, potentially solving the problems with measuring individual differences in creative ability and predicting real-world creative achievements.
We call the MTCA a minimal theory on purpose. It is not that we believe the MTCA is necessarily true. With a minimal theory, we propose a Popperian challenge: Can we falsify the MTCA? There is a vast amount of creativity research, which may well provide strong evidence against the MTCA. Rejections of the MTCA could involve new research, but also reviews of existing research lines. In all cases we think methodological rigor is required. Creativity research will benefit from strict methodological research practices. This includes using instruments with known psychometric properties, ensuring sufficient study power, drawing representative samples of the normal population, preregistering studies (de Groot 2014; Nosek et al. 2015), and testing theories rather than relying on discovery-based research (Oberauer and Lewandowsky 2019).
We see at least four ways in which the MTCA could be rejected. First, there is possibly more to creativity than the combination of intelligence and expertise. This requires a differential psychological study showing that some stable individual differences factor (e.g., a personality trait, divergent thinking ability, etc.) significantly adds to the prediction of creative performance above and beyond Intelligence (I) and Expertise (E) and their interaction (I × E). However, existing research faces many methodological problems (e.g., the poor test–retest reliability of divergent thinking tests, the restriction of range in intelligence in WEIRD study samples). Thus, showing that individual differences in real-world creativity require more than the combination of intelligence and expertise probably needs new empirical research that adheres to the strict methodological research practices outlined above.
Second, the MTCA could be rejected if we can somehow demonstrate that creativity is domain general. One way is to argue that individuals commonly display creativity in many diverse domains. However, the polymath or Renaissance man is uncommon (Cassandro 1998; Kaufman et al. 2010b; Ma and Uzzi 2018). Indeed, versatility, achieved by developing multiple areas of expertise rather than solely focusing on one domain, may aid creative achievement (Simonton 2000). Furthermore, creativity in very different domains is not ruled out by the MTCA as long as one has expertise in each domain (see also Simonton 2003b). A better way to falsify the MTCA by demonstrating that creative ability is domain general would be to develop a reliable and valid test of domain general creativity with predictive power comparable to IQ tests (i.e., high predictive validity) and also good convergent and discriminant validity. Perhaps two recently published creativity test batteries, the Evaluation of Potential Creativity (Lubart et al. 2011) and the Runco Creativity Assessment Battery (Creativity Testing Services 2011), will meet these high psychometric standards. Similarly, major improvements in creativity assessments that rest on divergent thinking could be used to falsify the MTCA; creativity could then be considered equal to divergent thinking (C = DT) rather than C = I × E, although, as mentioned above, it is very possible that divergent thinking is simply missing from standard IQ tests.
Third, if valid measures of creative ability within domains are available, several tests of the MTCA’s domain specificity assumption are possible. Assessing actual creative performance using panels of expert judges with the Consensual Assessment Technique (Amabile 1982; Baer and McKool 2009; Myszkowski and Storme 2019) may be useful. By using expert ratings to assess creative achievements (C) we can test how much variance is explained by I, E and I × E and whether additional variables significantly help predict C. If we have expert ratings of C in different domains, we could also test whether the residuals of the regression of I, E and I × E on C correlate. This would suggest domain generality. Another research line might concern the threshold hypothesis, which says that the linear relationship between C and I disappears above some level of I (e.g., Barron 1963; Torrance 1962). The evidence on this effect, however, is mixed (Jauk et al. 2013; Karwowski et al. 2016; Preckel et al. 2006). Again, we stress the importance of using a ‘good’ test of C, along with sufficient variance in intelligence. A low correlation between an IQ measure and some measure of C, with a low test–retest reliability, is not a rejection of the MTCA (e.g., Wallach and Kogan 1965), nor is a low correlation between IQ and C when there is restriction of range in IQ scores.
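The regression logic of this test can be sketched in code. The following Python example is a simulation under assumed effect sizes, not an analysis of real data: creativity C is generated exactly as the MTCA predicts (from I, E, and I × E), and a hypothetical third factor O (e.g., openness) is then checked for incremental validity via the change in R².

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated standardized scores: intelligence (I), expertise (E), and a
# creativity criterion (C) generated under the MTCA, C = f(I, E, I*E).
# All effect sizes below are illustrative assumptions.
n = 2_000
I = rng.normal(size=n)
E = 0.3 * I + np.sqrt(1 - 0.3**2) * rng.normal(size=n)  # I and E correlate
C = 0.4 * I + 0.4 * E + 0.2 * I * E + rng.normal(size=n)

def r_squared(predictors, y):
    """R-squared of an OLS regression of y on the predictors (with intercept)."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_mtca = r_squared([I, E, I * E], C)

# A candidate third factor (hypothetical 'openness'), correlated with I
# but with no unique effect on C in this simulation:
O = 0.4 * I + np.sqrt(1 - 0.4**2) * rng.normal(size=n)
r2_plus = r_squared([I, E, I * E, O], C)

print(f"R-squared (I, E, I*E): {r2_mtca:.3f}")
print(f"Delta R-squared for O: {r2_plus - r2_mtca:.4f}")  # near zero here
```

A rejection of the MTCA would correspond to the opposite pattern: a candidate factor producing a substantial and replicable increment in R² beyond I, E, and I × E in a well-powered, unrestricted sample.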
Fourth, if creativity improves for some reason without increasing expertise or intelligence, then we could reject the MTCA. For example, under the MTCA, domain general training of creativity is impossible (see Baer 2012 for a similar point). According to the MTCA, it is only possible to improve creative ability by improving general cognitive functions (such as working memory) or improving domain specific expertise. Thus, the MTCA could be rejected by demonstrating that domain general creativity training improves creativity without improving an aspect of intelligence (Scott et al. 2004). Similarly, the MTCA predicts that creativity improves with age due to increasing expertise. So, if we can show that younger children are more creative than older children (German and Defeyter 2000), that adolescents are more creative than adults (e.g., Stevenson et al. 2014), or that creativity does not improve with age prior to cognitive decline (e.g., Simonton 1977, 1997), then we could reject the MTCA. A related prediction is that creative ability improves, rather than declines, with increased expertise. So, if cultivating expertise consistently leads to less creative performance, such as "habitual" performance or entrenchment, then we would have arguments against the MTCA (Dane 2010; Ford 1996; Weisberg 2018).

8. Conclusions

The MTCA states that creative ability within domains is essentially due to the combined effects of intelligence and expertise. To assess creative ability, one simply needs to measure intelligence and domain specific expertise to determine a person’s ability to be creative within a given field. If MTCA holds, then the construction of valid domain general creativity tests is doomed to fail.

Author Contributions

Conceptualization, H.v.d.M.; writing—original draft preparation, C.S., M.B., H.v.d.M.; writing—review and editing, C.S., M.B., H.v.d.M.; supervision, M.B., H.v.d.M.; project administration, C.S. All authors have read and agreed to the published version of the manuscript.


Funding

This research was supported by the Jacobs Foundation Fellowship 2019–2021 awarded to Claire Stevenson (2018 1288 12).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.


Acknowledgments

We thank Maartje Raijmakers for her helpful comments on a previous version of this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.


References

1. Amabile, Teresa M. 1982. Social Psychology of Creativity: A Consensual Assessment Technique. Journal of Personality and Social Psychology 43: 997–1013.
2. Amabile, Teresa M. 1983. The Social Psychology of Creativity: A Componential Conceptualization. Journal of Personality and Social Psychology 45: 357–76.
3. Amabile, Teresa M., Regina Conti, Heather Coon, Jeffrey Lazenby, and Michael Herron. 1996. Assessing the Work Environment for Creativity. Academy of Management Journal 39: 1154–84.
4. Amabile, Teresa M., and Michael G. Pratt. 2016. The Dynamic Componential Model of Creativity and Innovation in Organizations: Making Progress, Making Meaning. Research in Organizational Behavior 36: 157–83.
5. Ananiadou, Katerina, and Magdalean Claro. 2009. 21st Century Skills and Competences for New Millennium Learners in OECD Countries. OECD Education Working Papers, No. 41. Paris: OECD Publishing.
6. Baas, Matthijs. 2019. In the Mood for Creativity. In The Cambridge Handbook of Creativity. Edited by James C. Kaufman and Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 257–72.
7. Baas, Matthijs, Carsten K. W. De Dreu, and Bernard A. Nijstad. 2008. A Meta-Analysis of 25 Years of Mood-Creativity Research: Hedonic Tone, Activation, or Regulatory Focus? Psychological Bulletin 134: 779–806.
8. Baas, Matthijs, Bernard A. Nijstad, Nathalie C. Boot, and Carsten K. W. De Dreu. 2016. Mad Genius Revisited: Vulnerability to Psychopathology, Biobehavioral Approach-Avoidance, and Creativity. Psychological Bulletin 142: 668–92.
9. Baas, Matthijs, Bernard A. Nijstad, Jessie Koen, Nathalie C. Boot, and Carsten K. W. De Dreu. 2019. Vulnerability to Psychopathology and Creativity: The Role of Approach-Avoidance Motivation and Novelty Seeking. Psychology of Aesthetics, Creativity, and the Arts 14: 334.
10. Baer, John. 1994a. Why You Shouldn't Trust Creativity Tests. Educational Leadership 51: 80–83.
11. Baer, John. 1994b. Why You Still Shouldn't Trust Creativity Tests. Educational Leadership 52: 72–73.
12. Baer, John. 2011a. Four (More) Arguments against the Torrance Tests. Psychology of Aesthetics, Creativity, and the Arts 5: 316–17.
13. Baer, John. 2011b. How Divergent Thinking Tests Mislead Us: Are the Torrance Tests Still Relevant in the 21st Century? The Division 10 Debate. Psychology of Aesthetics, Creativity, and the Arts 5: 309–13.
14. Baer, John. 2012. Domain Specificity and the Limits of Creativity Theory. The Journal of Creative Behavior 46: 16–29.
15. Baer, John, and James C. Kaufman. 2005. Bridging Generality and Specificity: The Amusement Park Theoretical (APT) Model of Creativity. Roeper Review 27: 158–63.
16. Baer, John M. 1988. Long-Term Effects of Creativity Training with Middle School Students. The Journal of Early Adolescence 8: 183–93.
17. Baer, John, and Sharon S. McKool. 2009. Assessing Creativity Using the Consensual Assessment Technique. In Handbook of Research on Assessment Technologies, Methods, and Applications in Higher Education. Hershey: IGI Global, pp. 65–77.
18. Barbot, Baptiste. 2019. Measuring Creativity Change and Development. Psychology of Aesthetics, Creativity, and the Arts 13: 203–10.
19. Barbot, Baptiste, Maud Besançon, and Todd Lubart. 2016. The Generality-Specificity of Creativity: Exploring the Structure of Creative Potential with EPoC. Learning and Individual Differences 52: 178–87.
20. Barron, Frank. 1963. Creativity and Psychological Health. Oxford: D. Van Nostrand.
21. Basadur, Min, George B. Graen, and Stephen G. Green. 1982. Training in Creative Problem Solving: Effects on Ideation and Problem Finding and Solving in an Industrial Research Organization. Organizational Behavior and Human Performance 30: 41–70.
22. Bayer, Alan E., and John Folger. 1966. Some Correlates of a Citation Measure of Productivity in Science. Sociology of Education 39: 381–90.
23. Beaty, Roger E., Mathias Benedek, Paul J. Silvia, and Daniel L. Schacter. 2016. Creative Cognition and Brain Network Dynamics. Trends in Cognitive Sciences 20: 87–95.
24. Beaty, Roger E., Yoed N. Kenett, Alexander P. Christensen, Monica D. Rosenberg, Mathias Benedek, Qunlin Chen, Andreas Fink, Jiang Qiu, and Thomas R. Kwapil. 2018. Robust Prediction of Individual Creative Ability from Brain Functional Connectivity. Proceedings of the National Academy of Sciences 115: 1087–92.
25. Beaty, Roger E., Paul J. Silvia, Emily C. Nusbaum, Emanuel Jauk, and Mathias Benedek. 2014. The Roles of Associative and Executive Processes in Creative Cognition. Memory & Cognition 42: 1186–97.
26. Beghetto, Ronald A., and James C. Kaufman. 2007. The Genesis of Creative Greatness: Mini-c and the Expert Performance Approach. High Ability Studies 18: 59–61.
27. Benedek, Mathias, Emanuel Jauk, Markus Sommer, Martin Arendasy, and Aljoscha C. Neubauer. 2014. Intelligence, Creativity, and Cognitive Control: The Common and Differential Involvement of Executive Functions in Intelligence and Creativity. Intelligence 46: 73–83.
28. Boden, Margaret A. 2004. The Creative Mind: Myths and Mechanisms. London: Routledge.
29. Byron, Kris, and Shalini Khazanchi. 2012. Rewards and Creative Performance: A Meta-Analytic Test of Theoretically Derived Hypotheses. Psychological Bulletin 138: 809–30.
30. Byron, Kris, Shalini Khazanchi, and Deborah Nazarian. 2010. The Relationship between Stressors and Creativity: A Meta-Analysis Examining Competing Theoretical Models. Journal of Applied Psychology 95: 201–12.
31. Campbell, Donald T. 1960. Blind Variation and Selective Retentions in Creative Thought as in Other Knowledge Processes. Psychological Review 67: 380–400.
32. Carson, Shelley H., Jordan B. Peterson, and Daniel M. Higgins. 2005. Reliability, Validity, and Factor Structure of the Creative Achievement Questionnaire. Creativity Research Journal 17: 37–50.
33. Casner-Lotto, Jill, and Linda Barrington. 2006. Are They Really Ready to Work? Employers' Perspectives on the Basic Knowledge and Applied Skills of New Entrants to the 21st Century U.S. Workforce. Partnership for 21st Century Skills. Available online: (accessed on 26 June 2019).
34. Cassandro, Vincent J. 1998. Explaining Premature Mortality Across Fields of Creative Endeavor. Journal of Personality 66: 805–33.
35. Chamorro-Premuzic, Tomas, and Adrian Furnham. 2006. Intellectual Competence and the Intelligent Personality: A Third Way in Differential Psychology. Review of General Psychology 10: 251–67.
36. Chan, David W., and Yongjun Zhao. 2010. The Relationship Between Drawing Skill and Artistic Creativity: Do Age and Artistic Involvement Make a Difference? Creativity Research Journal 22: 27–36.
37. Cole, Jonathan R., and Stephen Cole. 1973. Social Stratification in Science. Chicago: University of Chicago Press.
38. Colvin, Geoff. 2011. Talent Is Overrated: What Really Separates World-Class Performers from Everybody Else. London: Hachette UK.
39. Creativity Testing Services. 2011. Runco Creativity Assessment Battery (RCAB). Bishop: Creativity Testing Service. Available online: (accessed on 26 June 2019).
40. Cronbach, Lee J. 1957. The Two Disciplines of Scientific Psychology. American Psychologist 12: 671–84.
41. Cropley, Arthur J. 2000. Defining and Measuring Creativity: Are Creativity Tests Worth Using? Roeper Review 23: 72–79.
42. Dane, Erik. 2010. Reconsidering the Trade-off Between Expertise and Flexibility: A Cognitive Entrenchment Perspective. Academy of Management Review 35: 579–603.
43. De Dreu, Carsten K. W., Bernard A. Nijstad, Matthijs Baas, Inge Wolsink, and Marieke Roskes. 2012. Working Memory Benefits Creative Insight, Musical Improvisation, and Original Ideation Through Maintained Task-Focused Attention. Personality and Social Psychology Bulletin 38: 656–69.
44. Deary, Ian J., Steve Strand, Pauline Smith, and Cres Fernandes. 2007. Intelligence and Educational Achievement. Intelligence 35: 13–21.
45. Den Hartigh, Ruud J. R., Marijn W. G. Van Dijk, Henderien W. Steenbeek, and Paul L. C. Van Geert. 2016. A Dynamic Network Model to Explain the Development of Excellent Human Performance. Frontiers in Psychology 7.
46. DeYoung, Colin G. 2006. Higher-Order Factors of the Big Five in a Multi-Informant Sample. Journal of Personality and Social Psychology 91: 1138–51.
47. Dietrich, Arne, and Riam Kanso. 2010. A Review of EEG, ERP, and Neuroimaging Studies of Creativity and Insight. Psychological Bulletin 136: 822–48.
48. Ericsson, K. Anders, Ralf T. Krampe, and Clemens Tesch-Römer. 1993. The Role of Deliberate Practice in the Acquisition of Expert Performance. Psychological Review 100: 363–406.
49. Ericsson, K. Anders. 1999. Creative Expertise as Superior Reproducible Performance: Innovative and Flexible Aspects of Expert Performance. Psychological Inquiry 10: 329–33.
50. Ericsson, K. Anders. 2018. An Introduction to the Second Edition of The Cambridge Handbook of Expertise and Expert Performance: Its Development, Organization, and Content. In The Cambridge Handbook of Expertise and Expert Performance, 2nd ed. Edited by K. Anders Ericsson, Robert R. Hoffman, Aaron Kozbelt and Mark A. Williams. Cambridge: Cambridge University Press, pp. 3–20.
51. Ericsson, K. Anders, and Kyle W. Harwell. 2019. Deliberate Practice and Proposed Limits on the Effects of Practice on the Acquisition of Expert Performance: Why the Original Definition Matters and Recommendations for Future Research. Frontiers in Psychology 10.
52. Eysenck, Hans Jürgen. 1995. Genius: The Natural History of Creativity. Cambridge: Cambridge University Press.
53. Feist, Gregory J. 1999. The Influence of Personality on Artistic and Scientific Creativity. In Handbook of Creativity. Edited by Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 273–96.
54. Feist, Gregory J. 2019. The Function of Personality in Creativity. In The Cambridge Handbook of Creativity. Edited by James C. Kaufman and Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 353–73.
55. Feist, Gregory J., and Frank X. Barron. 2003. Predicting Creativity from Early to Late Adulthood: Intellect, Potential, and Personality. Journal of Research in Personality 37: 62–88.
56. Finke, Ronald A., Thomas B. Ward, and Steven M. Smith. 1992. Creative Cognition: Theory, Research, and Applications. Cambridge: MIT Press.
57. Ford, Cameron M. 1996. A Theory of Individual Creative Action in Multiple Social Domains. Academy of Management Review 21: 1112–42.
58. Gabora, Liane. 2017. Honing Theory: A Complex Systems Framework for Creativity. Nonlinear Dynamics, Psychology, and Life Sciences 21: 35–88.
59. Gardner, Howard. 2011. Frames of Mind: The Theory of Multiple Intelligences. London: Hachette UK.
60. German, Tim P., and Margaret Anne Defeyter. 2000. Immunity to Functional Fixedness in Young Children. Psychonomic Bulletin & Review 7: 707–12.
61. Glăveanu, Vlad Petre. 2012. Habitual Creativity: Revising Habit, Reconceptualizing Creativity. Review of General Psychology 16: 78–92.
62. Gocłowska, Małgorzata A., Simone M. Ritter, Andrew J. Elliot, and Matthijs Baas. 2019. Novelty Seeking Is Linked to Openness and Extraversion, and Can Lead to Greater Creative Performance. Journal of Personality 87: 252–66.
63. Gottfredson, Linda S. 1997. Why g Matters: The Complexity of Everyday Life. Intelligence 24: 79–132.
64. Grabner, Roland H., Elsbeth Stern, and Aljoscha C. Neubauer. 2007. Individual Differences in Chess Expertise: A Psychometric Investigation. Acta Psychologica 124: 398–420.
  65. Grohman, Magdalena G., Zorana Ivcevic, Paul Silvia, and Scott Barry Kaufman. 2017. The Role of Passion and Persistence in Creativity. Psychology of Aesthetics, Creativity, and the Arts 11: 376–85. [Google Scholar] [CrossRef]
  66. de Groot, Adrianus D. 2014. The Meaning of ‘Significance’ for Different Types of Research [Translated and Annotated by Eric-Jan Wagenmakers, Denny Borsboom, Josine Verhagen, Rogier Kievit, Marjan Bakker, Angelique Cramer, Dora Matzke, Don Mellenbergh, and Han L. J. van Der Maas]. Acta Psychologica 148: 188–94. [Google Scholar] [CrossRef]
  67. Guilford, Joy P. 1967. Creativity: Yesterday, Today and Tomorrow. The Journal of Creative Behavior 1: 3–14. [Google Scholar] [CrossRef]
  68. Han, Ki-Soon. 2003. Domain-Specificity of Creativity in Young Children: How Quantitative and Qualitative Data Support It. The Journal of Creative Behavior 37: 117–42. [Google Scholar] [CrossRef]
  69. Hargadon, Andrew B., and Beth A. Bechky. 2006. When Collections of Creatives Become Creative Collectives: A Field Study of Problem Solving at Work. Organization Science 17: 484–500. [Google Scholar] [CrossRef]
  70. Harris, Julie A. 2004. Measured Intelligence, Achievement, Openness to Experience, and Creativity. Personality and Individual Differences 36: 913–29. [Google Scholar] [CrossRef]
  71. Hayes, John R. 1989. Cognitive Processes in Creativity. In Handbook of Creativity. Edited by John A. Glover, Royce R. Ronning and Cecil R. Reynolds. Perspectives on Individual Differences. Boston: Springer US, pp. 135–45. [Google Scholar] [CrossRef]
  72. Hélie, Sébastien, and Ron Sun. 2010. Incubation, Insight, and Creative Problem Solving: A Unified Theory and a Connectionist Model. Psychological Review 117: 994–1024. [Google Scholar] [CrossRef] [PubMed]
  73. Hennessey, B. A. 2019. Motivation and Creativity. In The Cambridge Handbook of Creativity. Edited by James C. Kaufman and Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 374–95. [Google Scholar]
  74. Henriksen, Danah, Punya Mishra, and Petra Fisser. 2016. Infusing Creativity and Technology in 21st Century Education: A Systemic View for Change. Journal of Educational Technology & Society 19: 27–37. [Google Scholar]
  75. Hunter, John E., and Frank L. Schmidt. 2004. Methods of Meta-Analysis: Correcting Error and Bias in Research Findings. Thousand Oaks: SAGE. [Google Scholar]
  76. IBM. 2010. Capitalizing on Complexity: Insights from the Global Chief Executive Officer Study. CTB10. Available online: (accessed on 2 November 2020).
  77. Jaeggi, Susanne M., Martin Buschkuehl, John Jonides, and Walter J. Perrig. 2008. Improving Fluid Intelligence with Training on Working Memory. Proceedings of the National Academy of Sciences 105: 6829–33. [Google Scholar] [CrossRef] [PubMed]
  78. Jauk, Emanuel, Mathias Benedek, Beate Dunst, and Aljoscha C. Neubauer. 2013. The Relationship between Intelligence and Creativity: New Support for the Threshold Hypothesis by Means of Empirical Breakpoint Detection. Intelligence 41: 212–21. [Google Scholar] [CrossRef]
  79. Jensen, Arthur R. 1998. The g Factor: The Science of Mental Ability. Westport: Praeger Publishers/Greenwood Publishing Group. [Google Scholar]
  80. de Jesus, Saul Neves, Claudia Lenuţa Rus, Willy Lens, and Susana Imaginário. 2013. Intrinsic Motivation and Creativity Related to Product: A Meta-Analysis of the Studies Published Between 1990–2010. Creativity Research Journal 25: 80–84. [Google Scholar] [CrossRef]
  81. Johnson, Steven. 2010. Where Good Ideas Come from: The Natural History of Innovation. London: Penguin UK. [Google Scholar]
  82. Karwowski, Maciej, Jan Dul, Jacek Gralewski, Emanuel Jauk, Dorota M. Jankowska, Aleksandra Gajda, Michael H. Chruszczewski, and Mathias Benedek. 2016. Is Creativity without Intelligence Possible? A Necessary Condition Analysis. Intelligence 57: 105–17. [Google Scholar] [CrossRef]
  83. Kaufman, James C. 2015. Why Creativity Isn’t in IQ Tests, Why It Matters, and Why It Won’t Change Anytime Soon Probably. Journal of Intelligence 3: 59–72. [Google Scholar] [CrossRef]
  84. Kaufman, James C., and Ronald A. Beghetto. 2009. Beyond Big and Little: The Four C Model of Creativity. Review of General Psychology 13: 1–12. [Google Scholar] [CrossRef]
  85. Kaufman, James C., Ronald A. Beghetto, and John Baer. 2010a. Finding Young Paul Robeson: Exploring the Question of Creative Polymathy. In Innovations in Educational Psychology: Perspectives on Learning, Teaching, and Human Development. Edited by Robert J. Sternberg and David D. Preiss. New York: Springer Publishing Company, pp. 141–62. [Google Scholar]
  86. Kaufman, James C., Ronald A. Beghetto, John Baer, and Zorana Ivcevic. 2010b. Creativity Polymathy: What Benjamin Franklin Can Teach Your Kindergartener. Learning and Individual Differences 20: 380–87. [Google Scholar] [CrossRef]
  87. Kaufman, James C., and Vlad P. Glaveanu. 2019. A Review of Creativity Theories. In The Cambridge Handbook of Creativity. Edited by James C. Kaufman and Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 27–43. [Google Scholar]
  88. Kaufman, James C., Scott Barry Kaufman, and Elizabeth O. Lichtenberger. 2011. Finding Creative Potential on Intelligence Tests via Divergent Production. Canadian Journal of School Psychology 26: 83–106. [Google Scholar] [CrossRef]
  89. Khan, Mahreen, and Amirali Minbashian. 2019. The Effects of Ageing on Creative Performance Trajectories. Applied Psychology 71: 384–408. [Google Scholar] [CrossRef]
  90. Kharkhurin, Anatoliy V. 2014. Creativity. 4in1: Four-Criterion Construct of Creativity. Creativity Research Journal 26: 338–52. [Google Scholar] [CrossRef]
  91. Kim, Kyung Hee. 2008. Meta-Analyses of the Relationship of Creative Achievement to Both IQ and Divergent Thinking Test Scores. The Journal of Creative Behavior 42: 106–30. [Google Scholar] [CrossRef]
  92. Kim, Kyung Hee. 2011a. Proven Reliability and Validity of the Torrance Tests of Creative Thinking (TTCT). Psychology of Aesthetics, Creativity, and the Arts 5: 314–15. [Google Scholar] [CrossRef]
  93. Kim, Kyung Hee. 2011b. The APA 2009 Division 10 Debate: Are the Torrance Tests of Creative Thinking Still Relevant in the 21st Century? Psychology of Aesthetics, Creativity, and the Arts 5: 302–8. [Google Scholar] [CrossRef]
  94. Kozbelt, Aaron. 2008. Longitudinal Hit Ratios of Classical Composers: Reconciling ‘Darwinian’ and Expertise Acquisition Perspectives on Lifespan Creativity. Psychology of Aesthetics, Creativity, and the Arts 2: 221–35. [Google Scholar] [CrossRef]
  95. Kozbelt, Aaron, Ronald A. Beghetto, and Mark A. Runco. 2010. Theories of Creativity. In The Cambridge Handbook of Creativity. Edited by James C. Kaufman and Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 20–47. [Google Scholar]
  96. Krampe, Ralf T., and Neil Charness. 2018. Aging and Expertise. In The Cambridge Handbook of Expertise and Expert Performance, 2nd ed. Edited by K. Anders Ericsson, Robert R. Hoffman, Aaron Kozbelt and Mark A. Williams. Cambridge: Cambridge University Press, pp. 835–56. [Google Scholar]
  97. Kuncel, Nathan R., Sarah A. Hezlett, and Deniz S. Ones. 2004. Academic Performance, Career Potential, Creativity, and Job Performance: Can One Construct Predict Them All? Journal of Personality and Social Psychology 86: 148–61. [Google Scholar] [CrossRef]
  98. Law, Lily N. C., and Marcel Zentner. 2012. Assessing Musical Abilities Objectively: Construction and Validation of the Profile of Music Perception Skills. PLoS ONE 7: e52508. [Google Scholar] [CrossRef] [PubMed]
  99. Lotka, Alfred J. 1926. The Frequency Distribution of Scientific Productivity. Journal of the Washington Academy of Sciences 16: 317–23. [Google Scholar]
  100. Lubart, Todd, Maud Besançon, and Baptiste Barbot. 2011. Evaluation Du Potentiel Créatif (EPoC) [Evaluation of Creative Potential]. Paris: Editions Hogrefe France. [Google Scholar]
  101. Lubart, Todd, and Jacques-Henri Guignard. 2004. The Generality-Specificity of Creativity: A Multivariate Approach. In Creativity: From Potential to Realization. Washington, DC: American Psychological Association, pp. 43–56. [Google Scholar] [CrossRef]
  102. Ma, Yifang, and Brian Uzzi. 2018. Scientific Prize Network Predicts Who Pushes the Boundaries of Science. Proceedings of the National Academy of Sciences 115: 12608–15. [Google Scholar] [CrossRef] [PubMed]
  103. van der Maas, Han L. J., Conor V. Dolan, Raoul P. P. P. Grasman, Jelte M. Wicherts, Hilde M. Huizenga, and Maartje E. J. Raijmakers. 2006. A Dynamical Model of General Intelligence: The Positive Manifold of Intelligence by Mutualism. Psychological Review 113: 842–61. [Google Scholar] [CrossRef] [PubMed]
  104. van der Maas, Han L. J., Kees-Jan Kan, Maarten Marsman, and Claire E. Stevenson. 2017. Network Models for Cognitive Development and Intelligence. Journal of Intelligence 5: 16. [Google Scholar] [CrossRef] [PubMed]
  105. van der Maas, Han L. J., and Eric-Jan Wagenmakers. 2005. A Psychometric Analysis of Chess Expertise. The American Journal of Psychology 118: 29–60. [Google Scholar]
  106. Macnamara, Brooke N., David Z. Hambrick, and Frederick L. Oswald. 2014. Deliberate Practice and Performance in Music, Games, Sports, Education, and Professions: A Meta-Analysis. Psychological Science 25: 1608–18. [Google Scholar] [CrossRef]
  107. Mednick, Martha T. 1963. Research Creativity in Psychology Graduate Students. Journal of Consulting Psychology 27: 265–66. [Google Scholar] [CrossRef] [PubMed]
  108. Mednick, Sarnoff. 1962. The Associative Basis of the Creative Process. Psychological Review 69: 220–32. [Google Scholar] [CrossRef] [PubMed]
  109. Montag, Tamara, Carl P. Maertz, and Markus Baer. 2012. A Critical Analysis of the Workplace Creativity Criterion Space. Journal of Management 38: 1362–86. [Google Scholar] [CrossRef]
  110. Mumford, Michael D., and Tristan McIntosh. 2017. Creative Thinking Processes: The Past and the Future. The Journal of Creative Behavior 51: 317–22. [Google Scholar] [CrossRef]
  111. Murphy, Kevin. 2017. What Can We Learn from ‘Not Much More than g’? Journal of Intelligence 5: 8. [Google Scholar] [CrossRef] [PubMed]
  112. Murray, Charles. 2009. Human Accomplishment: The Pursuit of Excellence in the Arts and Sciences, 800 B.C. to 1950. New York: Harper Collins. [Google Scholar]
  113. Myszkowski, Nils, and Martin Storme. 2019. Judge Response Theory? A Call to Upgrade Our Psychometrical Account of Creativity Judgments. Psychology of Aesthetics, Creativity, and the Arts 13: 167–75. [Google Scholar] [CrossRef]
  114. Newell, Allen, and Herbert Alexander Simon. 1972. Human Problem Solving. Upper Saddle River: Prentice-Hall. [Google Scholar]
  115. Nijstad, Bernard A., Carsten K. W. De Dreu, Eric F. Rietzschel, and Matthijs Baas. 2010. The Dual Pathway to Creativity Model: Creative Ideation as a Function of Flexibility and Persistence. European Review of Social Psychology 21. [Google Scholar] [CrossRef]
  116. Nijstad, Bernard A., and Wolfgang Stroebe. 2006. How the Group Affects the Mind: A Cognitive Model of Idea Generation in Groups. Personality and Social Psychology Review 10: 186–213. [Google Scholar] [CrossRef] [PubMed]
  117. Nosek, B. A., G. Alter, G. C. Banks, D. Borsboom, S. D. Bowman, S. J. Breckler, S. Buck, C. D. Chambers, G. Chin, G. Christensen, and et al. 2015. Promoting an Open Research Culture. Science 348: 1422–25. [Google Scholar] [CrossRef] [PubMed]
  118. Oberauer, Klaus, and Stephan Lewandowsky. 2019. Addressing the Theory Crisis in Psychology. Psychonomic Bulletin & Review 26: 1596–618. [Google Scholar] [CrossRef]
  119. Perry-Smith, Jill E., and Pier Vittorio Mannucci. 2017. From Creativity to Innovation: The Social Network Drivers of the Four Phases of the Idea Journey. Academy of Management Review 42: 53–79. [Google Scholar] [CrossRef]
  120. Plucker, Jonathan A. 1999. Is the Proof in the Pudding? Reanalyses of Torrance’s (1958 to Present) Longitudinal Data. Creativity Research Journal 12: 103–14. [Google Scholar] [CrossRef]
  121. Plucker, Jonathan A., and Ronald A. Beghetto. 2004. Why Creativity Is Domain General, Why It Looks Domain Specific, and Why the Distinction Does Not Matter. In Creativity: From Potential to Realization. Washington: American Psychological Association, pp. 153–67. [Google Scholar] [CrossRef]
  122. Powers, Donald E. 2004. Validity of Graduate Record Examinations (GRE) General Test Scores for Admissions to Colleges of Veterinary Medicine. Journal of Applied Psychology 89: 208–19. [Google Scholar] [CrossRef] [PubMed]
  123. Preckel, Franzis, Heinz Holling, and Michaela Wiese. 2006. Relationship of Intelligence and Creativity in Gifted and Non-Gifted Students: An Investigation of Threshold Theory. Personality and Individual Differences 40: 159–70. [Google Scholar] [CrossRef]
  124. Rhodes, Mel. 1961. An Analysis of Creativity. The Phi Delta Kappan 42: 305–10. [Google Scholar]
  125. Runco, Mark A. 2009. Parsimonious creativity and its measurement. In Measuring Creativity. Edited by Ernesto Villalba. Proceedings of Can Creativity Be Measured? Brussels: Publications Office of the EU, pp. 393–406. [Google Scholar]
  126. Runco, Mark A., and Selcuk Acar. 2012. Divergent Thinking as an Indicator of Creative Potential. Creativity Research Journal 24: 66–75. [Google Scholar] [CrossRef]
  127. Runco, Mark A., and Robert S. Albert. 1985. The Reliability and Validity of Ideational Originality in the Divergent Thinking of Academically Gifted and Nongifted Children. Educational and Psychological Measurement 45: 483–501. [Google Scholar] [CrossRef]
  128. Runco, Mark A., and Garrett J. Jaeger. 2012. The Standard Definition of Creativity. Creativity Research Journal 24: 92–96. [Google Scholar] [CrossRef]
  129. Said-Metwaly, Sameh, Wim Van den Noortgate, and Eva Kyndt. 2017. Methodological Issues in Measuring Creativity: A Systematic Literature Review. Creativity. Theories – Research – Applications 4: 276–301. [Google Scholar] [CrossRef]
  130. Salthouse, Timothy A. 2009. When Does Age-Related Cognitive Decline Begin? Neurobiology of Aging 30: 507–14. [Google Scholar] [CrossRef] [PubMed]
  131. Schmidhuber, Jürgen. 2010. Formal Theory of Creativity, Fun, and Intrinsic Motivation (1990–2010). IEEE Transactions on Autonomous Mental Development 2: 230–47. [Google Scholar] [CrossRef]
  132. Schmidt, Frank L., and John E. Hunter. 1998. The Validity and Utility of Selection Methods in Personnel Psychology: Practical and Theoretical Implications of 85 Years of Research Findings. Psychological Bulletin 124: 262–74. [Google Scholar] [CrossRef]
  133. Scott, Ginamarie, Lyle E. Leritz, and Michael D. Mumford. 2004. The Effectiveness of Creativity Training: A Quantitative Review. Creativity Research Journal 16: 361–88. [Google Scholar] [CrossRef]
  134. Silvia, Paul J. 2015. Intelligence and Creativity Are Pretty Similar After All. Educational Psychology Review 27: 599–606. [Google Scholar] [CrossRef]
  135. Silvia, Paul J., and Camilla E. Sanders. 2010. Why Are Smart People Curious? Fluid Intelligence, Openness to Experience, and Interest. Learning and Individual Differences 20: 242–45. [Google Scholar] [CrossRef]
  136. Simonton, Dean K. 1977. Creative Productivity, Age, and Stress: A Biographical Time-Series Analysis of 10 Classical Composers. Journal of Personality and Social Psychology 35: 791–804. [Google Scholar] [CrossRef]
  137. Simonton, Dean K. 1991. Personality Correlates of Exceptional Personal Influence: A Note on Thorndike’s (1950) Creators and Leaders. Creativity Research Journal 4: 67–78. [Google Scholar] [CrossRef]
  138. Simonton, Dean K. 1997. Creative Productivity: A Predictive and Explanatory Model of Career Trajectories and Landmarks. Psychological Review 104: 66–89. [Google Scholar] [CrossRef]
  139. Simonton, Dean K. 1999a. Talent and Its Development: An Emergenic and Epigenetic Model. Psychological Review 106: 435–57. [Google Scholar] [CrossRef]
  140. Simonton, Dean K. 1999b. The Continued Evolution of Creative Darwinism. Psychological Inquiry 10: 362–67. [Google Scholar]
  141. Simonton, Dean K. 1999c. Origins of Genius: Darwinian Perspectives on Creativity. Oxford: Oxford University Press. [Google Scholar]
  142. Simonton, Dean K. 2000. Creative Development as Acquired Expertise: Theoretical Issues and an Empirical Test. Developmental Review 20: 283–318. [Google Scholar] [CrossRef]
  143. Simonton, Dean K. 2003a. Genius and g: Intelligence and Exceptional Achievement. In The Scientific Study of General Intelligence. Edited by Helmuth Nyborg. Oxford: Pergamon, pp. 229–45. [Google Scholar] [CrossRef]
  144. Simonton, Dean K. 2003b. Expertise, Competence, and Creative Ability: The Perplexing Complexities. In The Psychology of Abilities, Competencies, and Expertise. Edited by Robert J. Sternberg and Elena L. Grigorenko. Cambridge: Cambridge University Press. [Google Scholar]
  145. Simonton, Dean K. 2003c. Scientific Creativity as Constrained Stochastic Behavior: The Integration of Product, Person, and Process Perspectives. Psychological Bulletin 129: 475–94. [Google Scholar] [CrossRef] [PubMed]
  146. Simonton, Dean K. 2008. Scientific Talent, Training, and Performance: Intellect, Personality, and Genetic Endowment. Review of General Psychology 12: 28–46. [Google Scholar] [CrossRef]
  147. Simonton, Dean K. 2012. Foresight, Insight, Oversight, and Hindsight in Scientific Discovery: How Sighted Were Galileo’s Telescopic Sightings? Psychology of Aesthetics, Creativity, and the Arts 6: 243–54. [Google Scholar] [CrossRef]
  148. Sio, Ut Na, and Thomas C. Ormerod. 2009. Does Incubation Enhance Problem Solving? A Meta-Analytic Review. Psychological Bulletin 135: 94–120. [Google Scholar] [CrossRef] [PubMed]
  149. Sternberg, Robert J. 2018. A Triangular Theory of Creativity. Psychology of Aesthetics, Creativity, and the Arts 12: 50–67. [Google Scholar] [CrossRef]
  150. Sternberg, Robert J., Elena L. Grigorenko, and Donald A. Bundy. 2001. The Predictive Value of IQ. Merrill-Palmer Quarterly 47: 1–41. [Google Scholar] [CrossRef]
  151. Sternberg, Robert J., James C. Kaufman, and Anne M. Roberts. 2019. The Relation of Creativity to Intelligence and Wisdom. In The Cambridge Handbook of Creativity. Edited by James C. Kaufman and Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 337–52. [Google Scholar]
  152. Sternberg, Robert J., and Linda A. O’Hara. 2000. Intelligence and Creativity. In Handbook of Intelligence. Cambridge: Cambridge University Press, pp. 611–30. [Google Scholar] [CrossRef]
  153. Stevenson, Claire E., Sietske W. Kleibeuker, Carsten K. W. de Dreu, and Eveline A. Crone. 2014. Training Creative Cognition: Adolescence as a Flexible Period for Improving Creativity. Frontiers in Human Neuroscience 8. [Google Scholar] [CrossRef]
  154. Süß, Heinz-Martin, and André Beauducel. 2005. Faceted Models of Intelligence. In Handbook of Understanding and Measuring Intelligence. Edited by Oliver Wilhelm and Randall W. Engle. London: SAGE, pp. 313–22. [Google Scholar]
  155. Torrance, E. Paul. 1962. Guiding Creative Talent. Englewood Cliffs: Prentice-Hall, Inc. [Google Scholar] [CrossRef]
  156. Torrance, E. Paul. 2008. The Torrance Tests of Creative Thinking—Norms—Technical Manual—Figural (Streamlined) Forms A and B. Bensenville: Scholastic Testing Service. [Google Scholar]
  157. Wallach, Michael A., and Nathan Kogan. 1965. A New Look at the Creativity-Intelligence Distinction. Journal of Personality 33: 348–69. [Google Scholar] [CrossRef] [PubMed]
  158. Wallas, Graham. 1926. The Art of Thought. San Diego: Harcourt, Brace. [Google Scholar]
  159. Weisberg, Robert W. 1999. Creativity and Knowledge: A Challenge to Theories. In Handbook of Creativity. Edited by Robert J. Sternberg. Cambridge: Cambridge University Press, pp. 226–50. [Google Scholar]
  160. Weisberg, Robert W. 2018. Expertise and Structured Imagination in Creative Thinking: Reconsideration of an Old Question. In The Cambridge Handbook of Expertise and Expert Performance, 2nd ed. Edited by K. Anders Ericsson, Robert R. Hoffman, Aaron Kozbelt and Mark A. Williams. Cambridge: Cambridge University Press, pp. 812–34. [Google Scholar]
  161. Zabelina, Darya, and Mark Beeman. 2013. Short-Term Attentional Perseveration Associated with Real-Life Creative Achievement. Frontiers in Psychology 4. [Google Scholar] [CrossRef]
  162. Zeng, Liang, Robert W. Proctor, and Gavriel Salvendy. 2011. Can Traditional Divergent Thinking Tests Be Trusted in Measuring and Predicting Real-World Creativity? Creativity Research Journal 23: 24–37. [Google Scholar] [CrossRef]
Divergent thinking performance is sometimes treated as an indicator of everyday creative achievement (little-c creativity; see e.g., Kaufman and Beghetto 2009). In our view, divergent thinking cannot serve as the criterion variable. Instead, it should be seen as a cognitive ability that predicts real-world creative achievement (Karwowski et al. 2016; Runco and Acar 2012). Going one step further, we regard divergent thinking as one of the many cognitive abilities involved in creativity, alongside, for example, memory search and retrieval or analogical reasoning, all of which fall under the umbrella of intelligence.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.