Article

Competences of Flexible Professionals: Validation of an Invariant Instrument across Mexico, Chile, Uruguay, and Spain

by Andrea Conchado Peiró *, José Miguel Carot Sierra and Elena Vázquez Barrachina
Department of Applied Statistics and Operational Research, and Quality, Universitat Politècnica de Valencia, Camino de Vera, s/n. Building 7A, 46022 Valencia, Spain
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(12), 5224; https://0-doi-org.brum.beds.ac.uk/10.3390/su12125224
Submission received: 26 March 2020 / Revised: 12 June 2020 / Accepted: 22 June 2020 / Published: 26 June 2020

Abstract

The purpose of this study was to validate and test latent mean differences in a second-order factorial structure for self-assessed competences across four Spanish-speaking countries (Spain, Chile, Mexico, and Uruguay). Self-assessments of their own level of competences provided by 11,802 higher education graduates were examined. To compare assessments across countries, we first found support for metric and scalar invariance in a second-order factor structure comprising innovation, cooperation, knowledge management, communication, organizational, and participative competences. Given this invariance, the latent mean differences observed in our data lend support to earlier findings in the context of universities from these four countries. These findings have important managerial implications for institutional evaluations developed by national accreditation bodies and for the identification of competence requirements by the labor market. In addition, our research provides a powerful tool for young students and employers, as it contains valuable information about the competences students should expect to have developed by the end of their studies.

1. Introduction

The inclusion of competences in study programs has most often been considered an opportunity to improve quality in higher education [1,2]. Similarly, Latin American institutions have been working over the last decade on the development of quality assurance programs in higher education. These programs aim to set out the standards to be met by universities and, in particular, the competences every student can expect to develop there. Most of them have been implemented by the National Accreditation Commission (CNAP) in Chile, the Higher Education Accreditation Council (COPAES) in Mexico, and the National System of Accreditation and Promotion of Quality in Higher Education in Uruguay. As a result, the majority of universities have upgraded existing study programs and launched new courses following this approach.
Prior to examining these required competences, the term “competence” should be defined. Many authors have attempted to define the concept, but there is still no commonly accepted definition. In fact, there is so much discussion around this term that it is difficult to find a definition capable of fitting all the approaches in which it is used [3]. The understanding of competence depends to a large extent on the cultural context [4]. The term “competency”, often used in the US, refers to a particular behavior that can be learned and assessed as job performance [5,6,7]. This approach belongs to a body of literature that emphasizes the importance of fitting the competences required in each workplace.
The alternative term “competence”, prevalent in the UK, has been used to refer to the set of learning outcomes acquired as a result of a training period which qualifies the learner to perform a particular task or occupation. As clearly pointed out in [8], a competency is a part of generic competence, which can be used in real performance contexts. The well-known taxonomy developed in [9], which combines mental skills (knowledge), the affective domain (attitudes), and the psychomotor domain concerning manual or physical skills (skills), has become an essential reference for this approach. This influential taxonomy is widely considered the basis for the multi-dimensional framework of competence underpinning the European Qualifications Framework [10]. Following this approach, competences have been defined as individual capacities, skills, and aptitudes that have a positive direct effect on productivity gains [11]. It has also been pointed out that a competence is more than just knowledge and skills, as it involves the ability to meet complex demands by drawing on and mobilizing psychosocial resources (including skills and attitudes) [12]. Throughout this paper, we use the term “competence” in accordance with this approach (KSA: Knowledge, Skills, and Attitudes). More recent research has turned its attention to competence measurement and assessment through systematic approaches [13,14].
The definition and selection of key competences and learning outcomes concerning the student’s workload for teaching, learning, and assessment activities have received much attention over the last two decades [15,16,17]. In this context, competences for sustainable development have gained renewed importance as a tool for preparing graduates to transform our society into a more sustainable one [18]. To accomplish this aim, study programs are currently being improved to provide graduates with a complete set of sustainability competences [19] through appropriate frameworks for teaching and learning, involving specific teaching methodologies and alliances with other stakeholders [20]. The definition of this set of competences for sustainable development remains unclear due to the complexity of articulating them in higher education programs [21]. While some programs have focused on the integration of emotional intelligence, other approaches have failed to address action taking, personal commitment, or system and future orientation [22]. The conceptualization of individual competences for sustainability also varies to a great extent across countries due to cross-cultural validity issues [23].

1.1. Cross-Cultural Differences in Self-Assessment of Competences

Formal testing of measurement invariance is particularly important in cross-cultural research. By applying this analysis to diverse groups of individuals, we may obtain critical information for the judicious use of latent construct assessments [24,25,26]. According to contributions made in the framework of the Tuning project, European and Latin American graduates differ in their opinion about the competences they possess. European graduates considered some particular competences important, such as the capacity for analysis and synthesis, problem-solving, the ability to work autonomously, and information management skills. In contrast, Latin American graduates underlined that the commitment to quality, ethical commitment, and the ability to make decisions were also relevant [17,27].
There is a vast amount of literature on the assessment of competences by European graduates. Overall, graduates seemed to feel better prepared for their job than the job actually required. In all cases, a large majority of flexible graduates thought that their level of competences was high enough to meet employers’ requirements. Nonetheless, these flexible graduates also experienced a shortage of competences pertaining to the realm of authority, the ability to mobilize the capacity of others, and the ability to perform well under pressure [28]. In particular, widespread dissatisfaction was found among Spanish graduates [29]. A sizable proportion of these graduates felt that their professional careers did not match their academic performance in higher education. Moreover, they associated the difficulties of their transition to the labor market with the excessively theoretical, generalist, and obsolete approach of their studies, whilst a substantial part of the education acquired at university was regarded as irrelevant [30].

1.2. Classifications of Competences

As previously mentioned, various heterogeneous approaches have been put forward to address the question of what competences graduates should possess. Unfortunately, there appears to be little agreement on this issue [31]. The term “generic competence” is generally understood to mean those competences which provide the basis for continuous learning, problem-solving, and analytical thinking. In the literature, generic competences refer to systemic, instrumental, and interpersonal competences [27]. On the other hand, the term “specific competence” has been applied to vocational or field-specific knowledge, skills, and attitudes [32,33]. There are three categories of specific competences, depending on their specificity to firms, tasks, or the economic sector [34].
Regarding the relationship between competences and the labor market, it has been examined to what extent specific and generic competences predict labor market outcomes [35]. In addition, several authors have attempted to characterize the link between different learning environments and competences. Proactive learning environments have been found to foster reflective competences [36,37], whereas other authors point to their effectiveness in the acquisition of generic and specific competences [38].
From the employers’ perspective, some authors have studied what the labor market requires of Spanish university graduates: vocational and generic competences, the latter category being divided into interpersonal, methodological, and knowledge-related competences [39]. Furthermore, the framing of conflicts of interest between firms and apprentices has been called into question using this basic division between industry-specific and generic skills [40].
Based on a quantitative approach, a remarkable number of new classifications of competences has appeared in the literature. Some authors differentiated between management competences, on the one hand, and general-academic and discipline-specific competences, on the other, within the context of the EU’s Targeted Socio-Economic Research (TSER) program [33]. A more exhaustive classification distinguished generic, socio-emotional, participative, specialized, organizational, rule-application, physical, and methodological competences [41]. However, further analysis dropped the rule-application and physical competences from the list [42] and included an item concerning the ability to assert one’s authority. Although this item does not refer to a particular knowledge or skill, its attitudinal approach, based on discipline and organizational routines, is considered a useful resource in one’s identity work [13].
More recent evidence highlights the importance of a reduced number of competence dimensions, such as cognitive, professional, social-reflexive, and physical (or manual) skills [43]. Competences defined in the framework of the Latin American Tuning project were classified into learning capabilities, social values, interpersonal skills, and technological and international skills [17]. Other proposals of generic competences refer to the mobilization of human resources, functional flexibility, innovation, and knowledge management, whereas specific competences concern mainly professional expertise [31].
Since 2009, much more information on this issue of a common classification of competences has become available in the Spanish context. Some authors suggested a division into methodological, social, participative, and specialized competences [44]. It has also been concluded that competences in higher education could be divided into six groups, namely interpersonal competences, knowledge management, communication, organizational skills, innovation, and participative competences [45], which is the factor structure this paper is based on.

1.3. Current Study

In light of the above, to the best of our knowledge no one has studied how the Bologna principles concerning competences have been implemented in Latin American universities. Moreover, despite this interest in a common classification of competences, there is little agreement on which competences should be emphasized throughout higher education studies. Lastly, although the analysis of competences in European universities is an undeniably interesting issue, current answers to this question appear not to be well grounded in quantitative procedures of data analysis. Whereas most previous works used exploratory procedures, such as exploratory factor analysis, this research aimed to develop a more rigorous, confirmatory methodology for addressing the question. This paper therefore outlines a new approach to the issue of finding a set of competences in higher education across countries, according to the competence classification provided by [45].

2. Methods

2.1. Participants and Procedure

The participants in this study were 11,802 higher education graduates from four Spanish-speaking countries: 4680 graduates (39.7%) were from Spain, 3994 (33.8%) from Mexico, 2554 (21.6%) from Chile, and 574 (4.9%) from Uruguay. Data were obtained in the framework of two different research projects: the Spanish participants were surveyed in “The Flexible Professional in the Knowledge Society” (REFLEX) project during 2005–2006, while participants from Chile, Mexico, and Uruguay were interviewed in its Latin American follow-up, the “El Profesional Flexible en la Sociedad del Conocimiento” (PROFLEX) project, between 2007 and 2008. In both projects, the questionnaire was administered to graduates of higher education programs classified according to the International Standard Classification of Education (ISCED), covering short-cycle degrees as well as Bachelor’s (ISCED 6), Master’s (ISCED 7), and Doctoral (ISCED 8) degrees.
Both projects were implemented in universities across Europe, Latin America, and Japan. Spain, Chile, Mexico, and Uruguay were selected for the analysis of measurement invariance following a two-fold strategy. Firstly, Spanish-speaking countries were chosen to avoid potential confusion due to language misunderstandings [25,27]. Secondly, representative samples at the national level were obtained only for these four countries.
Participants had obtained their degrees at a wide range of universities: 33 in Spain, 17 in Chile, 9 in Mexico, and 12 in Uruguay. The average age of Spanish participants was 30.5 years (SD = 3.3), and 65.7% of the graduates in this sample were female. The average age of Mexican graduates was 28.5 years (SD = 3.8), with 54.5% females. In the case of Chilean graduates, the average age was 29.6 years (SD = 3.8), and 55.4% of the participants were female. Finally, in the Uruguayan group, the average age was 28.6 years (SD = 3.9), and 59.8% of the sample was female. All participants were volunteers and had previously been informed of the aim and purpose of the study, as well as of their right to withdraw from the questionnaire at any time, during or after data collection.

2.2. Instrument

The instrument was composed of 19 items in which graduates rated their own level of generic competences from 1 (very low) to 7 (very high), so that higher scores represented the perception of a higher level of competence. This scale allowed us to measure six constructs of competences, namely (a) innovation, (b) cooperation (interpersonal), (c) knowledge management, (d) communication, (e) organizational, and (f) participative competences, as well as the general construct “competence”, represented by a second-order factor [21]. Each latent factor was measured with two to four items, and its internal consistency was considered acceptable despite the relatively low Cronbach’s alpha values obtained for the second factor, Cooperation (α = 0.695), and the fourth factor, Communication (α = 0.677), which are partly explained by the low number of items in the Cooperation subscale [32]. Table 1 shows the scale description, Cronbach’s alpha values, and additional descriptive statistics.
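As an illustration of how the internal consistency values in Table 1 can be reproduced from the raw ratings, the following Python sketch computes Cronbach's alpha for a subscale. The data frame and column names are hypothetical placeholders, not the variable names used in the REFLEX/PROFLEX datasets.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a subscale: one column per item, one row per graduate."""
    k = items.shape[1]                              # number of items in the subscale
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: 'ratings' holds the 1-7 self-assessments, one column per item.
# ratings = pd.read_csv("proflex_items.csv")  # placeholder file name
# alpha_cooperation = cronbach_alpha(ratings[["mobilize_others", "work_with_others"]])
```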

2.3. Data Analysis

The model hypothesized a priori that six first-order factors and a single second-order factor explained the variability in the observed data (Model 0). Several criteria were used to evaluate the goodness of fit of this scale and to test measurement invariance. Maximum likelihood estimation was used to estimate all model parameters, and robust statistics were used to correct for non-normality [46]. Given the very large sample size and the well-known sensitivity of the chi-square statistic to sample size, a statistically significant model misfit was not surprising [47,48]. Therefore, the overall absolute model fit for each country was assessed using additional goodness-of-fit indexes: the Root Mean Square Error of Approximation (RMSEA), the Comparative Fit Index (CFI), and the Standardized Root Mean Square Residual (SRMR). Confidence intervals (CI) for RMSEA values were also reported to assess the precision of the estimates [49,50]. RMSEA values below 0.05 indicated an acceptable model fit, representing a reasonable approximation to the population [49], whereas CFI values near 1.0 were considered optimal, and values greater than 0.90 showed a satisfactory fit [51]. Finally, an SRMR value under 0.08 is generally considered an indicator of good fit [48]. The analysis was computed using the EQS software, version 6.2 [32].
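The analyses reported in this paper were run in EQS 6.2. Purely as an illustrative sketch, and not the authors' code, the hypothesized six-factor, second-order structure (Model 0) and its overall fit indices could be specified in Python with the semopy package as follows. The observed variable names are placeholders grouped as in Table 1, and the robust corrections used in the paper are not reproduced here.

```python
import pandas as pd
from semopy import Model, calc_stats

# Lavaan-style description of the hypothesized second-order structure (Model 0).
# Observed variable names are illustrative placeholders for the 19 items of Table 1.
MODEL_DESC = """
Innovation     =~ new_ideas + computers_internet + question_ideas
Cooperation    =~ mobilize_others + work_with_others
Knowledge      =~ own_field + other_fields + analytical_thinking + acquire_knowledge
Communication  =~ present_ideas + write_reports + clear_meaning + foreign_language
Organizational =~ use_time + coordinate_activities + under_pressure
Participative  =~ assert_authority + new_opportunities + negotiate
Competence     =~ Innovation + Cooperation + Knowledge + Communication + Organizational + Participative
"""

def fit_country(items: pd.DataFrame) -> pd.DataFrame:
    """Fit the second-order CFA to one country's item ratings and return overall fit indices."""
    model = Model(MODEL_DESC)
    model.fit(items)          # maximum likelihood estimation (no robust correction here)
    return calc_stats(model)  # includes the chi-square statistic, CFI and RMSEA, among others

# Hypothetical usage with a placeholder file name:
# stats_spain = fit_country(pd.read_csv("reflex_spain_items.csv"))
# print(stats_spain[["DoF", "chi2", "CFI", "RMSEA"]])
```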
Once the factorial structure was established, a multi-group confirmatory factor analysis was performed to test the validity of the scale in Spain, Chile, Mexico, and Uruguay by building and assessing several nested models. The evidence of multi-group invariance relied on both overall (CFI, RMSEA, and SRMR) and incremental (∆CFI and ∆χ2) goodness-of-fit indexes [23]. Whenever a non-significant change in the χ2 was observed and the change in CFI was lower than or equal to 0.01, we considered that the invariance criteria were met [52,53,54]. Differences in scaled (corrected) chi-square tests were computed using the robust procedure [55].
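For reference, the scaled (corrected) chi-square difference described in [55] can be summarized as follows; this is a hedged restatement rather than the paper's own formula. For a constrained model M0 nested in a less constrained model M1, with ordinary chi-square statistics $T_0$ and $T_1$, robust (scaled) statistics $\bar{T}_0$ and $\bar{T}_1$, and degrees of freedom $d_0 > d_1$,

$$ \bar{T}_d = \frac{T_0 - T_1}{c_d}, \qquad c_d = \frac{d_0 c_0 - d_1 c_1}{d_0 - d_1}, \qquad c_i = \frac{T_i}{\bar{T}_i} \;(i = 0, 1), $$

and a set of invariance constraints was retained when $\bar{T}_d$ was non-significant and, in addition, $|\Delta \mathrm{CFI}| \le 0.01$ for the nested comparison.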
The primary purpose of testing configural invariance (Model 1) across the samples was to establish a well-fitting multi-group baseline model [53]. The next stage was to evaluate first-order metric invariance (Model 2) for the four countries. At the following level, we tested scalar invariance (Model 3) to examine whether the scores from different countries had the same unit of measurement and origin. In this nested model, the intercepts of the measured variables were constrained to be equal across groups, including those associated with measured variables whose factor loadings had previously been fixed to 1.00. We then evaluated the invariance of the second-order factor loadings by introducing equality constraints on all second-order factor loadings (Model 4). Latent mean differences and their effect sizes were assessed subsequently (Model 5), following [56]. As in the case of first-order latent mean differences, analogous conditions were specified for the second-order model to avoid misspecification problems. Therefore, equality constraints were placed on second-order factor loadings or, where required, on the variances of latent factors. However, given that the estimation of variance parameters for dependent variables was not consistent with the hypothesized model, the residual variances of first-order factors were constrained to 1.0 (Model 5). Furthermore, the latent factor means for the Spanish sample were fixed to zero, with Spain serving as the reference group, due to the need to fix an arbitrary origin for the latent factor intercepts at this level of invariance [32]. The standardized effect size was computed following [57].
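As a brief restatement, not a quotation, of the standardized effect size approach in [57], the latent mean difference between a comparison country g and the reference group (Spain, whose latent means were fixed to zero) can be expressed on a Cohen's d metric as

$$ \hat{d}_g = \frac{\hat{\kappa}_g - \hat{\kappa}_{\mathrm{Spain}}}{\sqrt{\hat{\psi}}}, $$

where $\hat{\kappa}_g$ is the estimated latent factor mean for country g and $\hat{\psi}$ is the common latent factor variance implied by the constraints described above. The Cohen's d values reported in Table 4 can be read as effect sizes of this kind.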

3. Results

3.1. Validation of the Measurement Instrument

As shown in Table 2, testing of the initially hypothesized model in each group yielded a marginally acceptable fit. However, exploration of the modification indexes suggested that the model fit would improve if an additional factor loading were added between Item 13 (Ability to assert your authority) and Factor 5 (Organizational skills). As this modification was meaningful and coherent with the theoretical background [13], the parameter corresponding to this factor loading was freely estimated.
After carrying out this modification, the model was re-estimated for each country. The goodness-of-fit indexes showed a remarkable improvement in all four countries, as can be seen in Table 2. The corrected χ2 difference test was significant in all cases. Moreover, the absolute increase in the CFI index approached or exceeded the 0.01 threshold [52], indicating that the data fitted the modified model better. The unstandardized estimates for the additional factor loading between Item 13 and Factor 5 were 0.762, 0.918, 1.089, and 1.174 for Spain, Mexico, Chile, and Uruguay, respectively.
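Continuing the illustrative semopy sketch introduced in Section 2.3, and again as a sketch rather than the authors' EQS specification, the respecified model simply adds a cross-loading of the authority item on the organizational factor, leaving the rest of the structure untouched.

```python
# Improved model (Section 3.1): the item "Ability to assert your authority" also loads
# on the organizational factor; 'assert_authority' is the placeholder name used earlier.
MODEL_DESC_IMPROVED = MODEL_DESC + "\nOrganizational =~ assert_authority\n"

# Refit per country, e.g.: Model(MODEL_DESC_IMPROVED).fit(spain_items)  # hypothetical DataFrame
```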

3.2. Factorial Invariance Analysis

Reasonable evidence of configural invariance across countries was achieved, as shown in Table 3. These results indicate a well-fitting multi-group baseline model against which to compare all subsequently specified invariance models [52]. Next, first-order factor loading equality (metric invariance) was tested by constraining factor loadings to be equal across countries (Model 2), obtaining a satisfactory fit to the multi-sample data. The small decrease in the CFI index (−0.003) between Model 1 and Model 2 indicated that factor loadings were not substantially different across countries. Scalar invariance of the model was then assessed, also yielding a satisfactory fit to the multi-sample data; goodness-of-fit results for Model 3 showed a remarkable improvement in overall fit, and the difference in CFI was negligible (ΔCFI = 0.007). Subsequently, the invariance of second-order factor loadings was examined in Model 4, leading to a good fit of the multi-sample data. Lastly, the computed difference in the robust CFI values between Model 4 and the configural Model 1 was ΔCFI = 0.005, which supported the invariance of all first- and second-order factor loadings. The invariant factor loadings can be examined in Appendix A.

3.3. Latent Factor Mean Differences

Given the evidence of invariant factor loadings and intercepts, factor means could then be compared across countries. Model 5 showed a reasonable fit to the data. As shown in Table 4, Mexican, Chilean, and Uruguayan graduates obtained, on average, higher scores than Spanish graduates in all competences. The largest difference was found in participative competences in all countries, followed by communication competences in Chile and Uruguay and knowledge management in Uruguay. Differences from Spanish graduates in knowledge management and communication were moderate in Chile and large in Mexico and Uruguay, and positive in all three countries.

4. Discussion

This study examined the construct validity of an instrument for measuring self-assessed competences and its invariance across Spain, Chile, Mexico, and Uruguay, according to previous research. All items seemed to work well in these four countries, regardless of cultural differences. Results showed that the instrument comprised six first-order factors: interpersonal competences, knowledge management, communication, organizational skills, innovation, and participative competences [45], as well as a global competence factor.
Thus, our work led us to conclude that this factorial structure is invariant across countries, which confirms previous findings in the literature about the classifications of competences. However, the most valuable contribution of this work is the comparison of self-assessed competences by higher education graduates across countries. We obtained comprehensive results showing that competences can be quantified and compared across different contexts. The methodology we devised in this paper represents a powerful tool for the assessment of competences acquired in higher education.
An interesting result of this work was the definition of a knowledge management construct. Thanks to this factor, our instrument constitutes a novelty with regard to the traditional division between generic and specific competences. As reported in this work, we found evidence that the knowledge management factor is composed of learning processes and specific competences, not limited to the field of study of the graduate’s degree. Our research is in line with previous results [31], which also point out the existence of this factor. In fact, other works also refer to knowledge management competences [53] or professionally knowledgeable graduates [43]. On the other hand, the reference to specific competences is present in most of the previous research using this particular term, or the words “specialized” and “technical” [27,34]. Most of these works also corroborate the need to develop learning process skills [17], general-cognitive abilities [43], and methodological or theoretical competences [42,44]. This agreement strengthened our confidence in the conclusion that specific competences and learning skills should be combined in a single construct concerning knowledge management.
Regarding interpersonal competences, the current study differs in the term used to refer to this construct. Our interpersonal competences construct includes an item about the ability to mobilize the capacities of others. This supports the previous definition of a participative competence construct [18] as the ability to construct the environment, make decisions, and assume responsibility, among other tasks. The term “participative competences” has been frequently used in previous works to refer to this construct [41,42,44]. Other works have also used the term “mobilization of human resources” [31,58], whose meaning is hardly distinguishable from the item included in the interpersonal competences construct defined by this work. However, we should be aware that our construct does not consider the socio-emotional approach, which is undeniably essential in the graduates’ workplaces.
Both constructs, knowledge management and interpersonal competences, were found to be consistent with well-established models of competences. This work also reinforces the conclusions suggested by more recent works regarding innovation, communication, participative, and organizational competences. The innovation competences construct is in complete agreement with works published during the last decade [31,58]. However, in contrast to them, we found that the item about alertness to new opportunities should be included in the participative competences construct, rather than in the innovation construct. This slight discrepancy could be due to the wider definition of the construct in [31], which combines the innovation and knowledge management constructs.
Likewise, our conclusions regarding communicative competences are in line with [59]. The only difference is the placement of the item regarding the use of computers and the internet, which they include in the communication construct, whereas we classified it into the innovation construct, as other authors have done in their research [31]. The relative scarcity of references to communication skills does not mean that previous works have discarded them from their classifications of competences; rather, they have frequently been combined with cooperation skills.
Given an adequate level of configural, first-order metric, scalar, and second-order metric invariance, differences between latent means were tested. As expected, Mexican, Chilean, and Uruguayan graduates gave higher self-assessments of their own level of competences than Spanish graduates. These results offer compelling evidence that Spanish graduates tend to feel less self-confident about their competences. This result could be due to the well-known difficulties that became apparent in Spain during the last decade in the transition from higher education to the labor market [29]. Therefore, Mexican, Chilean, and Uruguayan graduates may be likely to rate their own competences higher than their Spanish counterparts, as only some of the latter consider that their professional careers match their academic performance in higher education. Additionally, and according to previous research, Latin American academics, graduates, and employers considered all competences to be important [17].
The largest differences with reference to Spanish graduates were found in Uruguay, followed by Mexico and Chile, and the highest assessments corresponded to participative competences, communication competences, and knowledge management. These findings are consistent with the fact that Latin American students point to participative competences as among the most important, as reflected in the commitment to quality, ethical commitment, and the ability to make decisions [5]. Results concerning knowledge management were also in complete agreement with this report. Finally, although the approach to communication skills is focused solely on the ability to communicate in a second language, it obtained the greatest difference between what was considered important and the rating given to its achievement. Therefore, the latent mean differences observed in our data lend support to earlier findings in the context of Latin American universities.
Overall, our results are in satisfactory agreement with earlier classifications of competences, although slight differences were also observed in some constructs. Further research could benefit from our conclusions as a meaningful contribution to previous classifications of competences. However, the most remarkable result to emerge from the data is the validation of a second-order factorial structure. This result implies that a general construct of competence underlies each of the constructs defined in this work. To the best of our knowledge, no other authors have provided evidence of such a hierarchical factorial structure in their measurement models.
From a practical point of view, the findings of our research have important managerial implications for quality assurance programs in higher education developed by accreditation bodies. Our results could be exploited to establish a common framework of key competences against which study plans and academic programs can be assessed. We are confident that the definition of such a set of competences will make it easier for academics to create new teaching and learning activities aimed at developing particular learning outcomes, as well as to assess the level of competence achieved by students. With this background, universities will be able to determine whether the Bologna guidelines concerning competence-based education have been successfully implemented in new degree proposals. In addition, our research provides a powerful tool for young students and employers, as it contains valuable information about the competences students should expect to have developed by the end of their studies.
A number of limitations may have influenced the conclusions obtained in this work. The first is the limited selection of items in the questionnaire, which omits some interesting aspects such as the emotional dimension of teamwork. A second limitation, inherent to our methodology, is that we focused on testing the invariance of factor loadings and intercepts. Although we could have examined the equality of other parameters, such as measurement error covariances or factor variances, we considered that these parameters were of little interest for this research. Finally, there is a lack of updated datasets addressing the research question of this paper. We analyzed a dataset gathered about a decade ago, as we could not find any more recent international assessment of higher education graduates covering Spain and Latin American countries.

5. Conclusions

In conclusion, this paper presented a robust measurement instrument for self-assessed competences and described to what extent graduates’ perceptions differ according to the country where they studied. Our research provided a framework for a new way of defining hierarchical constructs of competences in higher education and offered considerable insight into the implementation of competence-based education in Latin American countries, as substantiated by the findings of the present study.
This study contributed toward enhancing our understanding of competences in higher education. In our view, the strength of our study lies in the definition of a methodology for the quantification and comparison of self-assessed competences across different contexts through the validation of an invariant measurement instrument. The present findings might help to identify areas for improvement in teaching and learning processes. Specifically, our approach would lend itself well to the assessment of the competences required by employers recruiting higher education graduates in the labor market. The use of standardized instruments may be useful for updating the contents of current study programs according to these requirements and to the self-assessments provided by graduates.
Our research could also be a useful aid in the assessment of learning outcomes in the curricula of higher education students. Some universities are already working on the development of an evaluation system that provides a competence score for each student as a complement to academic marks. Policymakers could also encourage stakeholders to develop institutional accreditation processes based on this methodology. Meanwhile, we are confident that future students and employers may use these findings to examine what should be expected from each professional profile.

Author Contributions

Conceptualization, J.M.C.S. and A.C.P.; methodology, E.V.B. and A.C.P.; writing—original draft preparation, E.V.B., A.C.P., and J.M.C.S.; funding acquisition, J.M.C.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Commission, grant number AML/19.0902/04/16909/II-0546-A, and grant number CIT2-CT-2004-506352.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Factor loadings invariant across countries.
Latent Factor | 2nd-Order Loading | R2 | Item Description | 1st-Order Loading | R2
F1 Innovation | 0.913 | 0.834 | Ability to come up with new ideas and solutions | 0.859 | 0.737
F1 Innovation | | | Ability to use computers and the internet | 0.680 | 0.463
F1 Innovation | | | Willingness to question prevailing ideas | 0.580 | 0.337
F2 Cooperation | 0.904 | 0.816 | Ability to mobilize the capacities of others | 0.782 | 0.611
F2 Cooperation | | | Ability to work productively with others | 0.724 | 0.524
F3 Knowledge management | 0.910 | 0.828 | Mastery of your own field or discipline | 0.598 | 0.358
F3 Knowledge management | | | Knowledge of other fields or disciplines | 0.562 | 0.316
F3 Knowledge management | | | Analytical thinking | 0.794 | 0.631
F3 Knowledge management | | | Ability to rapidly acquire new knowledge | 0.796 | 0.633
F4 Communication | 0.831 | 0.691 | Ability to present products, ideas or reports to an audience | 0.782 | 0.612
F4 Communication | | | Ability to write reports, memos or documents | 0.707 | 0.500
F4 Communication | | | Ability to make your meaning clear to others | 0.790 | 0.623
F4 Communication | | | Ability to write and speak in a foreign language | 0.382 | 0.146
F5 Organizational | 0.991 | 0.983 | Ability to use time efficiently | 0.605 | 0.366
F5 Organizational | | | Ability to coordinate activities | 0.766 | 0.586
F5 Organizational | | | Ability to perform well under pressure | 0.724 | 0.523
F6 Participative competences | 0.999 | 0.999 | Ability to assert your authority | 0.444 | 0.623
F6 Participative competences | | | Alertness to new opportunities | 0.734 | 0.539
F6 Participative competences | | | Ability to negotiate effectively | 0.689 | 0.475

References

  1. Bricall, J.M. University 2000: Executive Summary; Ministry of Education: Madrid, Spain, 2000.
  2. Krüger, K.; Jiménez, L.; Piqué, V. Higher education in the Spanish transition to a knowledge society. In European Studies on Inequalities and Social Cohesion; Warzywoda-Kruszyńska, W., Rokicka, E., Woźniak, W., Eds.; Lodz University Press: Lodz, Poland, 2007; pp. 59–109. [Google Scholar]
  3. Delamare Le Deist, F.; Winterton, J. What Is Competence? Hum. Resour. Dev. Int. 2005, 8, 27–46. [Google Scholar]
  4. Cseh, M. Facilitating learning in multicultural teams. Adv. Dev. Hum. Res. 2003, 5, 26–40. [Google Scholar] [CrossRef]
  5. Boyatzis, R.E. The Competent Manager: A Model for Effective Performance; Wiley: New York, NY, USA, 1982. [Google Scholar]
  6. McClelland, D. Identifying competencies with behavioural-event interviews. Psychol. Sci. 1998, 9, 331–339. [Google Scholar] [CrossRef]
  7. Spencer, L.; Spencer, S. Competence at Work: A Model for Superior Performance; Wiley: New York, NY, USA, 1993. [Google Scholar]
  8. Mulder, M. Conceptions of Professional Competence. In International Handbook of Research in Professional and Practice-Based Learning; Billet, S., Harteis, C., Gruber, H., Eds.; Springer: Dordrecht, The Netherlands, 2014; pp. 107–137. [Google Scholar]
  9. Bloom, B.S.; Mesia, B.B.; Krathwohl, D.R. Taxonomy of Educational Objectives: The Classification of Educational Goals; David McKay Company: New York, NY, USA, 1964. [Google Scholar]
  10. Winterton, J. Competence across Europe: Highest common factor or lowest common denominator? J. Eur. Ind. Train. 2009, 33, 681–700. [Google Scholar] [CrossRef]
  11. Hartog, J. Capabilities, Allocation, and Earnings; Springer, Kluwer Academic Pub: Dordrecht, The Netherlands, 1992. [Google Scholar]
  12. Salganik, L.H.; Rychen, D.S. Key Competencies: For a Successful Life and a Well-Functioning Society; Hogrefe and Huber Pub: Boston, MA, USA, 2003. [Google Scholar]
  13. Blömeke, S.; Zlatkin-Troitschanskaia, O.; Kuhn, C.; Fege, J. Modeling and Measuring Competencies in Higher Education; Sense Publishers: Rotterdam, The Netherlands, 2013. [Google Scholar]
  14. Shavelson, R. On the measurement of competency. Empir. Res. Vocat. Educ. Train. 2010, 2, 41–64. [Google Scholar]
  15. Rychen, D.S.; Tiana, A. Developing Key Competencies in Education: Some Lessons from International and National Experience; UNESCO International Bureau of Education: Paris, France, 2004. [Google Scholar]
  16. Kennedy, D. Writing and Using Learning Outcomes: A Practical Guide; University College Cork: Cork, Ireland, 2007. [Google Scholar]
  17. Beneitone, P.; Esquetini, C.; González, J.; Maletá, M.M.; Siufi, G.; Wagenaar, R. Tuning América Latina. Reflections on and Outlook for Higher Education in Latin America; Final Report; Publicaciones de la Universidad de Deusto: Bilbao, Spain, 2007. [Google Scholar]
  18. De Kraker, J.; Lansu, A. Competences and competence-based learning for sustainable development. In Crossing Boundaries. Innovative Learning for Sustainable Development in Higher Education; De Kraker, J., Lansu, A., van Dam-Mieras, M.C., Eds.; Verlag für Akademische Schriften: Frankfurt am Main, Germany, 2007; pp. 103–114. [Google Scholar]
  19. Lozano, R.; Merrill, M.; Sammalisto, K.; Ceulemans, K.; Lozano, F.J. Connecting Competences and Pedagogical Approaches for Sustainable Development in Higher Education: A Literature Review and Framework Proposal. Sustainability 2017, 9, 1889. [Google Scholar] [CrossRef] [Green Version]
  20. Zamora-Polo, F.; Sánchez-Martín, J.; Serrano, M.C.; Espejo-Antúnez, L. What Do University Students Know about Sustainable Development Goals? A Realistic Approach to the Reception of this UN Program amongst the Youth Population. Sustainability 2019, 11, 3533. [Google Scholar] [CrossRef] [Green Version]
  21. Mochizuki, Y.; Fadeeva, Z. Competences for sustainable development and sustainability: Significance and challenges for ESD. Int. J. Sustain. High. 2010, 11, 391–403. [Google Scholar] [CrossRef]
  22. Lambrechts, W.; Mulà, I.; Ceulemans, K.; Molderez, I.; Gaeremynck, V. The integration of competences for sustainable development in higher education: An analysis of bachelor programs in management. J. Clean. Prod. 2013, 48, 65–73. [Google Scholar] [CrossRef] [Green Version]
  23. Vandenberg, R.; Lance, C.E. A Review and Synthesis of the Measurement Invariance Literature: Suggestions, Practices, and Recommendations for Organizational Research. Organ. Res. Methods 2000, 3, 4–70. [Google Scholar] [CrossRef]
  24. Hambleton, R.K.; Patsula, L. Increasing the validity of adapted tests: Myths to be avoided and guidelines for improving test adaptation practices. J. Appl. Test. Technol. 1999, 1, 1–13. [Google Scholar]
  25. Heine, S.J.; Lehman, D.R.; Peng, K.; Greenholtz, J. What’s wrong with cross-cultural comparisons of subjective Likert scales? The reference-group effect. J. Pers. Soc. Psychol. 2002, 82, 903–918. [Google Scholar] [CrossRef] [PubMed]
  26. Morse, B.; Weinhardt, J.M.; Griffeth, R.W.; De Oliveira, M.Z. Cross-cultural Measurement Invariance of the Employment Opportunity Index (EOI) in Mexican and Brazilian Professionals. Int. J. Sel. Assess. 2014, 22, 139–148. [Google Scholar] [CrossRef]
  27. González, J.; Wagenaar, R. Tuning Educational Structures in Europe. Universities’ Contribution to the Bologna Process; Publicaciones de la Universidad de Deusto: Bilbao, Spain, 2005. [Google Scholar]
  28. Calmand, J.; Frontini, M.; Rostan, M. Being flexible: Graduates facing changes in their work environment. In The Flexible Professional in the Knowledge Society: New Challenges for Higher Education; Allen, J., Van der Velden, R., Eds.; Springer: Dordrecht, The Netherlands, 2011; pp. 83–110. [Google Scholar]
  29. Alonso, L.E.; Fernández, C.J.; Nyssen, J.M. El Debate Sobre Las Competencias. Una Investigación Cualitativa en Torno a la Educación Superior y el Mercado de Trabajo en España; Agencia Nacional de Evaluación de la Calidad y Acreditación: Madrid, Spain, 2008. (In Spanish) [Google Scholar]
  30. National Agency for Quality Assessment and Accreditation (ANECA). Los Procesos de Inserción Laboral de los Titulados Universitarios en España. Factores de Facilitación y Obstaculización; Agencia Nacional de Evaluación de la Calidad y Acreditación: Madrid, Spain, 2009. (In Spanish) [Google Scholar]
  31. Allen, J.; Van der Velden, R. The Flexible Professional in the Knowledge Society; Springer: Berlin/Heidelberg, Germany, 2011. [Google Scholar]
  32. Bentler, P.M. EQS Structural Equations Program Manual; Multivariate Software: Encino, CA, USA, 2006. [Google Scholar]
  33. Heijke, H.; Meng, C.; Ramaekers, G. An investigation into the role of human capital competences and their pay-off. Int. J. Manpow. 2003, 24, 750–773. [Google Scholar] [CrossRef] [Green Version]
  34. Nordhaug, O. Human Capital in Organizations, Competence, Training and Learning; Scandinavian University Press: Oslo, Norway, 1993. [Google Scholar]
  35. Semeijn, J.H.; Van Der Velden, R.; Heijke, H.; Van Der Vleuten, C.; Boshuizen, H.P.A. Competence indicators in academic education and early labour market success of graduates in health sciences. J. Educ. Work. 2006, 19, 383–413. [Google Scholar] [CrossRef]
  36. Billing, D. Teaching for transfer of core/key skills in higher education: Cognitive skills. High. Educ. 2007, 53, 483–516. [Google Scholar] [CrossRef]
  37. Usher, E.L.; Pajares, F. Self-Efficacy for self-regulated learning a validation Study. Educ. Psychol. Meas. 2008, 68, 443–463. [Google Scholar] [CrossRef]
  38. Meng, C.; Heijke, H. Student Time Allocation, the Learning Environment and the Acquisition of Competences; ROA Research Memorandum 001, Maastricht University: Maastricht, The Netherlands, 2005. [Google Scholar]
  39. Hernández-March, J.; Del Peso, M.M.; Leguey, S. Graduates’ Skills and Higher Education: The employers’ perspective. Tert. Educ. Manag. 2009, 15, 1–16. [Google Scholar] [CrossRef]
  40. Smits, W. Industry-specific or generic skills? Conflicting interests of firms and workers. Labour Econ. 2007, 14, 653–663. [Google Scholar] [CrossRef]
  41. García-Aracil, A.; Mora, J.G.; Vila, L.E. The rewards of human capital competences for young European higher education graduates. Tert. Educ. Manag. 2004, 10, 287–305. [Google Scholar] [CrossRef]
  42. García-Aracil, A.; Van Der Velden, R. Competencies for young European higher education graduates: Labor market mismatches and their payoffs. High. Educ. 2007, 55, 219–239. [Google Scholar] [CrossRef] [Green Version]
  43. Kellerman, P. Acquired and required competences of graduates. In Careers of University Graduates: Views and Experiences in Comparative Perspectives; Teichler, U., Ed.; Springer: Dordrecht, The Netherlands, 2007; pp. 115–131. [Google Scholar]
  44. Clemente-Ricolfe, J.S.; Escribá-Pérez, C. Analysis of the perception of generic skills acquired at university. Rev. Educ. 2013, 362, 535–561. [Google Scholar]
  45. Conchado, A.; Carot, J.M.; Bas, M.C. Competencies for knowledge management: Development and validation of a scale. J. Knowl. Manag. 2015, 19, 836–855. [Google Scholar] [CrossRef]
  46. Yuan, K.-H.; Bentler, P.M. 5. Three Likelihood-Based Methods for Mean and Covariance Structure Analysis with Nonnormal Missing Data. Sociol. Methodol. 2000, 30, 165–200. [Google Scholar] [CrossRef]
  47. Gerbing, D.W.; Anderson, J.C. The Effects of Sampling Error and Model Characteristics on Parameter Estimation for Maximum Likelihood Confirmatory Factor Analysis. Multivar. Behav. Res. 1985, 20, 255–271. [Google Scholar] [CrossRef] [PubMed]
  48. Hu, L.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. A Multidiscip. J. 1999, 6, 1–55. [Google Scholar] [CrossRef]
  49. Browne, M.W.; Cudeck, R. Alternative ways of assessing model fit. Sociol. Methods. Res. 1992, 21, 230–258. [Google Scholar] [CrossRef]
  50. Steiger, J.H. Structural Model Evaluation and Modification: An Interval Estimation Approach. Multivar. Behav. Res. 1990, 25, 173–180. [Google Scholar] [CrossRef] [Green Version]
  51. Kline, R.B. Principles and Practice of Structural Equation Modeling; Guilford press: New York, NY, USA, 2011. [Google Scholar]
  52. Byrne, B.M. Testing for multigroup equivalence of a measuring instrument: A walk through the process. Psicothema 2008, 20, 872–882. [Google Scholar]
  53. Chen, F.F.; Sousa, K.H.; West, S.G. Teacher’s Corner: Testing Measurement Invariance of Second-Order Factor Models. Struct. Equ. Model. A Multidiscip. J. 2005, 12, 471–492. [Google Scholar] [CrossRef]
  54. Cheung, G.W.; Rensvold, R.B. Evaluating Goodness-of-Fit Indexes for Testing Measurement Invariance. Struct. Equ. Model. A Multidiscip. J. 2002, 9, 233–255. [Google Scholar] [CrossRef]
  55. Satorra, A. Scaled and adjusted restricted tests in multi-sample analysis of moment structures. In Innovations in Multivariate Statistical Analysis: A Festschrift for Heinz Neudecker; Heijmans, R.D.H., Pollock, D.S.G., Satorra, A., Eds.; Kluwer Academic Publishers: London, UK, 2000; pp. 233–247. [Google Scholar]
  56. Hong, S.; Malik, M.L.; Lee, M.-K. Testing Configural, Metric, Scalar, and Latent Mean Invariance Across Genders in Sociotropy and Autonomy Using a Non-Western Sample. Educ. Psychol. Meas. 2003, 63, 636–654. [Google Scholar] [CrossRef]
  57. Hancock, G.R. Effect size, power, and sample size determination for structured means modeling and mimic approaches to between-groups hypothesis testing of means on a single latent construct. Psychometrika 2001, 66, 373–388. [Google Scholar] [CrossRef]
  58. Kivinen, O.; Nurmi, J. Job requirements and competences: Do qualifications matter? In Careers of University Graduates. Views and Experiences in Comparative Perspectives; Teichler, U., Ed.; Springer: Dordrecht, The Netherlands, 2007; pp. 131–142. [Google Scholar]
  59. Shmatko, N. Competences of Engineers. Evidence from a Comparative Study for Russia and EU countries. Foresight STI Gov. 2012, 6, 32–47. [Google Scholar]
Table 1. Description of the scale, Cronbach’s alpha, and descriptive statistics.
Latent Factor | Cronbach’s α | Items
F1 Innovation | 0.728 | Ability to come up with new ideas and solutions
F1 Innovation | | Ability to use computers and the internet
F1 Innovation | | Willingness to question prevailing ideas
F2 Cooperation | 0.695 | Ability to mobilize the capacities of others
F2 Cooperation | | Ability to work productively with others
F3 Knowledge management | 0.773 | Mastery of your own field or discipline
F3 Knowledge management | | Knowledge of other fields or disciplines
F3 Knowledge management | | Analytical thinking
F3 Knowledge management | | Ability to rapidly acquire new knowledge
F4 Communication | 0.677 | Ability to present products, ideas or reports to an audience
F4 Communication | | Ability to write reports, memos or documents
F4 Communication | | Ability to make your meaning clear to others
F4 Communication | | Ability to write and speak in a foreign language
F5 Organizational | 0.761 | Ability to use time efficiently
F5 Organizational | | Ability to coordinate activities
F5 Organizational | | Ability to perform well under pressure
F6 Participative competences | 0.784 | Ability to assert your authority
F6 Participative competences | | Alertness to new opportunities
F6 Participative competences | | Ability to negotiate effectively
Table 2. Goodness-of-fit statistics of the initial and improved model (Model 0) in each country.
Model | χ2 | df | S-B χ2 | CFI | RMSEA | RMSEA 90% CI | SRMR
Spain (Initial) | 2712.5 | 146 | 1994.0 | 0.908 | 0.052 | [0.050–0.054] | 0.041
Spain (Improved) | 2439.0 | 145 | 1803.4 | 0.917 | 0.049 | [0.047–0.051] | 0.039
Mexico (Initial) | 1889.4 | 146 | 1248.7 | 0.972 | 0.042 | [0.040–0.044] | 0.025
Mexico (Improved) | 1691.4 | 145 | 1121.5 | 0.975 | 0.040 | [0.048–0.052] | 0.024
Chile (Initial) | 1433.0 | 146 | 931.9 | 0.908 | 0.048 | [0.045–0.051] | 0.043
Chile (Improved) | 1298.9 | 145 | 849.4 | 0.918 | 0.045 | [0.038–0.042] | 0.041
Uruguay (Initial) | 645.1 | 146 | 388.3 | 0.908 | 0.057 | [0.050–0.057] | 0.055
Uruguay (Improved) | 612.0 | 145 | 366.5 | 0.916 | 0.055 | [0.048–0.061] | 0.053
Note: S-B χ2 = Satorra-Bentler χ2 Statistic, df = Degrees of Freedom; CFI = Robust Comparative Fit Index; RMSEA = Robust Root Mean Square Error of Approximation; CI = Confidence Interval; SRMR = Standardized Root Mean Square Residual.
Table 3. Multiple-group confirmatory factor analysis of nested models.
Model | Comparison | χ2 | df | S-B χ2 | CFI | RMSEA | RMSEA 90% CI | SRMR
1. Configural invariance | - | 6457.2 | 584 | 4199.9 | 0.945 | 0.046 | [0.045–0.047] | 0.043
2. Metric invariance (1st order) | 2 vs. 1 | 6769.6 | 623 | 4055.8 | 0.942 | 0.046 | [0.045–0.047] | 0.052
3. Scalar invariance | 3 vs. 1 | 11,157.4 | 680 | 7845.8 | 0.949 | 0.049 | [0.048–0.051] | 0.105
4. Metric invariance (2nd order) | 4 vs. 1 | 11,575.3 | 698 | 8261.2 | 0.944 | 0.051 | [0.050–0.052] | 0.193
5. Latent factor mean differences | 5 vs. 1 | 9209.9 | 680 | 6426.1 | 0.947 | 0.049 | [0.048–0.051] | 0.166
Table 4. Estimated latent mean differences, standard error, and Cohen’s d for Model 5.
Competences | Mexico Est. Diff. | Mexico Std. Err. | Mexico Cohen’s d | Chile Est. Diff. | Chile Std. Err. | Chile Cohen’s d | Uruguay Est. Diff. | Uruguay Std. Err. | Uruguay Cohen’s d
Innovation | 0.25 | 0.021 | 1.864 | 0.55 | 0.026 | 0.620 | 0.64 | 0.057 | 1.743
Interpersonal | 0.32 | 0.023 | 1.801 | 0.61 | 0.027 | 1.011 | 0.67 | 0.060 | 1.640
Knowledge management | 0.20 | 0.017 | 1.525 | 0.54 | 0.021 | 0.597 | 1.76 | 0.136 | 6.287
Communication | 0.32 | 0.027 | 1.386 | 0.81 | 0.032 | 0.628 | 0.85 | 0.070 | 1.454
Organizational | 0.21 | 0.020 | 2.422 | 0.51 | 0.025 | 0.719 | 0.57 | 0.053 | 5.387
Participative competences | 0.79 | 0.023 | 3.885 | 0.86 | 0.028 | 2.007 | 1.12 | 0.057 | 4.903
