Article

Declared and Real Level of Digital Skills of Future Teaching Staff

Department of Social Pedagogy and Adult Education, Pedagogical University of Cracow, 30-084 Kraków, Poland
Submission received: 13 September 2021 / Revised: 1 October 2021 / Accepted: 3 October 2021 / Published: 9 October 2021

Abstract

Digital competence is undoubtedly one of the key skills that teaching staff should possess. Currently, there are many theoretical frameworks and ways to measure skills and knowledge related to the use of information and communication technologies (ICT). This article attempts to show the real and the declared level of digital skills among future teaching staff. The research was conducted in Poland among 128 students of pedagogical faculties (first-year undergraduate studies). The research used a triangulation of research methods and techniques: a diagnostic survey and competency tests covering the use of word processors and spreadsheets and the level of knowledge about the use of ICT. The competency tests conformed to the European Computer Driving Licence (ECDL) standard. The collected data showed the following: (1) more than half of the students rate their own skills in the use of word processors and spreadsheets, and their overall theoretical knowledge, as high or very high; (2) in the real assessment of digital competence, fewer than 20% reached the passing threshold in the areas of word processing and theoretical knowledge, and only 1.6% passed in the area of spreadsheets; (3) the declared and actual levels of digital competence were only moderately related in the surveyed group; (4) attitudes towards new media, self-assessment of digital skills, and previous learning experience in handling ICT are not predictive factors for ECDL test results.

1. Introduction

Operating computers, smartphones, and the Internet is an elementary skill within modern society. Limitations in this area generate a phenomenon defined as digital exclusion [1]. Therefore, for more than two decades, shaping the effective use of ICT in private and everyday life has been a priority for formal, nonformal, and informal education. Digital knowledge and skills are crucial for many professional groups. Recent months have shown how important it is for teachers and students to have appropriate ICT knowledge and skills [2]. Nowadays, the level of digital competence among teachers is attracting increasing interest from public and school-related stakeholders, as well as experts in media pedagogy. All of this has translated into increasingly bold attempts to operationalise the concept of digital competence and to measure it.
This text is part of the scientific discourse related to the digital competence of future teaching staff. It is the new generation of teachers that will soon change the face of school digitalisation. Moreover, this text is an attempt to validate research tools that measure different key aspects of digital skills such as data processing, text editing, and elementary knowledge related to the use of ICT. The text also shows the relationship between the self-assessment of digital competences and the actual results of competence tests. The text fills an important gap by emphasising the importance of moving away from measuring digital competences through self-declaration [3,4,5,6,7,8] in favour of real measurement based on competency tests, as is the case in determining the level of other skills, e.g., language skills [9,10].

2. Theoretical Framework

Digital competences are now firmly established in the canon of skills that teaching staff must possess [11]. In the literature, digital competences are defined in different ways. Usually, these skills are included in the general category of the ability to use hardware and software. High proficiency in operating software and hardware is taken for granted in many professions and is not subject to in-depth analysis [12]. Systematic analysis of the concept reveals that digital literacy is most often understood as the ability to use information and communication technologies (ICT) [13]. However, this is not an exhaustive definition. Media scholars indicate that digital literacy should not be considered and defined in isolation from information processes or from the understanding of social phenomena occurring through new media [14]. Reflecting on the mechanisms by which digital media affect individuals and groups is thus an integral component of digital competence [15].
The definition of digital competences depends on the context in which these skills are situated. For example, researchers associated with the Digital Centre indicate that digital skills are closely related to human needs; analysis of these key skills should therefore be conducted in relation to the group that they concern [16]. This flexible approach to defining digital competences is also evident in other studies, where ICT skills are strictly assigned to an occupation, a potential occupation, or an age category [17,18]. This approach makes the development of an unambiguous and subject-independent definition of digital competence a difficult task, fraught with the errors of reductionism. The complexity of the concept manifests itself primarily in definitions that depend on the time at which the research was implemented (i.e., the stage of development of the information society) [19,20], the needs and the level of computerisation of a given institution, or the field represented. Nevertheless, attempts to construct definitions of digital competences and to measure them are a valuable activity, allowing intentional activities to be conducted that are aimed at supporting individuals in their professional work and everyday life in the information society. In this text, digital competences were intentionally reduced to the use of selected software: the software most often used in education and higher education. Nevertheless, digital competence is a very complex key skill that evolves in accordance with the changing sociotechnical environment. In this study, ICT skills and knowledge of new media are included as digital competences. Although the literature diverges on the definitions of digital literacy, digital skills, and digital competence, in this text all three terms include basic software skills and ICT knowledge.
This text does not claim to be definitive, but merely an attempt to measure the elementary components of all three terms. The main defining characteristics of digital literacy (DL) are presented in Figure 1.
Digital competence is, therefore, a complex concept that goes beyond the simple operation of software and hardware. Definitions of digital competence clearly emphasise that it is a skill; therefore, its measurement should be realised through the performance of specific activities using target hardware and software. However, in the current literature, it is increasingly common to study this key competence using self-assessment, i.e., without the use of computers, smartphones, the Internet, or software, relying only on the respondents’ intuition or on conclusions drawn from their experiences of using ICT. Looking at research carried out in the last five years in Poland, it is immediately apparent that media educators rarely measure digital competences through the more complex and laborious procedure of a competence test. A brief summary of the results of research conducted among future media educators in this area is presented in Table 1.
Such diverse methodologies for measuring digital competences are, on the one hand, a richness, as they allow for an understanding of a selected fragment of digital competences to be reached, and offer a new perspective on the operationalisation of the concept. On the other hand, increasing the number of indicators makes it impossible to conduct comparative research or to create universal research tools. Moreover, taking into account the issue of the great freedom in interpreting the notion of key competences, this area is characterised by increasing distortions resulting from the misuse of measurement techniques based on self-declarations.
Another issue, and one that should be clearly stated when conducting research on digital competences among future teaching staff, is the conditions associated with the specific characteristics of the current generation of students training to become teachers. This is a group whose formal education falls at an intensive time in the development of the information society. It is a collective characterised by a relatively high level of confidence in the effectiveness of the use of ICT in education, and who very frequently use media [27,28] while at the same time recognising the possibilities of using software and hardware not only in education, but also in the effective management of leisure time [29]. Simultaneously, the prior preparation of pedagogical students in the effective use of ICT might raise concerns and reservations about the level of digital competence that they possess [30]. Therefore, the training of new pedagogical staff for an increasingly computerised school requires analysis using standardised knowledge tests, showing the real level of possessed skills, as it is these that are the foundation for building more complex, specialised knowledge linking digital competence to pedagogy [31]. On the basis of the above theoretical analyses, the following research hypotheses are posited:
Hypothesis 1 (H1).
Students rate their own digital competence as high or very high.
Hypothesis 2 (H2).
The real level of digital competence differs from the declared one.
Hypothesis 3 (H3).
Digital competence is related to previous educational experiences.

3. Research Methodology

3.1. Research Problems

The aim of the research was to measure digital competence among future teaching staff. The goal was achieved using a triangulation of research tools based primarily on a competency test showing the actual level of knowledge and skills in operating the most popular office software. The indirect aim of the research was to validate the research tools that test skill level in the operation of selected office suite software.
The specific aim of the research was to juxtapose the results of the competency test with self-declarations in terms of assessing one’s own level of DL and assessing the validity of using ICT in education, and one’s own experiences of shaping DL at earlier stages of education.
The study material comprised answers from a questionnaire survey, together with the results of competence tests on the use of word processors, spreadsheets, and ICT knowledge.
The following research problems were assumed in the study:
  • What is the level of knowledge and skills related to the use of ICT equipment among students of education?
  • How do future educators assess their own digital competencies?
  • To what extent is the level of digital competence in using an office suite related to the self-diagnosis of digital competence, the assessment of the relevance of using ICT in education, and previous experience with formal education in the development of digital competence—in short, how well does the objective align with the subjective in the evaluation of digital competence?

3.2. Test Procedure

The research was conducted at the largest Polish state university orientated towards the education of pedagogical personnel, the Pedagogical University of Kraków. Research was conducted using a triangulation of research methods and techniques. Measurement was conducted during three meetings that took place within the framework of an academic course in information technology. During the first meeting, the students, future pedagogical staff, completed an online questionnaire composed of questions related to the self-assessment of the level of their digital competence, a self-assessment of the speed by which they learn to use new software and hardware, their attitude towards the use of ICT in the teaching process, and their own experiences related to the formation of digital competence in secondary school. In addition, during the first meeting, the students also completed a knowledge assessment test on the basics of handling IT equipment (ECDL Module I). The students had 45 min to complete the test.
The second stage of the research, which took place a week later, involved a test of skills connected with the use of text editors. These activities were also conducted in the computer lab of the Pedagogical University of Kraków. The students had 45 min to complete a series of practical tasks associated with operating a word processor (Word 2013) in accordance with the ECDL standard (word processing module). After the tasks were completed, the number of correctly solved tasks was recorded separately in the protocol for each student.
The third and final stage, which took place during the third week, was the spreadsheet skills test. The students completed the ECDL compliant test (spreadsheets module) using hardware and software resources available in the computer lab. The students had 45 min to complete 32 tasks. Results were checked by the tutor and entered against each student’s name.
Figure 2 presents the entire research procedure.

3.3. Research Tools

The whole research was embedded in the quantitative stream of pedagogical research, allowing for the measurement of digital competence and of attitudes towards the use of new media. A triangulation of research methods and techniques was used in the study. The measurement of digital competence was realised according to the European Computer Driving Licence (ECDL) standard in order to capture the real level of ICT knowledge and skills possessed by the respondents. The following tests were used in the implementation of the ECDL standard [32,33]:
  • Operation of digital devices and knowledge of IT equipment (theoretical test) consisting of 32 questions with single-choice answers. A maximum of 1 point could be obtained for each question. The points were then converted into a percentage of correct answers on a scale of 0 to 100%. Each student had 45 min to complete the test.
  • Use of word processing software at a basic level. Each student received a set of instructions in a PDF file and a set of work files in which the activities were performed. Each student was given 45 min to complete 32 tasks. Each correctly completed task could be awarded 1 point. The points were then converted into percentages on a scale from 0 to 100%.
  • Spreadsheet maintenance. As in the previous modules, students were given work files and 32 tasks to complete with a time limit of 45 min. Results were then compiled and stored as percentages.
All of the ECDL assessment tasks were completed in accordance with the ECDL Foundation syllabi: Computer Fundamentals Module B1 Syllabus—Version 1.0; ECDL/ICDL Word Processing Module B3 Syllabus—Version 5.0; ECDL/ICDL Spreadsheets Module B4 Syllabus—Version 5.0.
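The scoring rule described above (1 point per task, converted to a 0–100% scale, with the 75% pass mark applied in the results section) can be sketched as a minimal helper. The function names below are illustrative, not part of the ECDL standard:

```python
def ecdl_percentage(correct_tasks: int, total_tasks: int = 32) -> float:
    """Convert a raw ECDL task count (0-32) into a percentage score."""
    return round(100.0 * correct_tasks / total_tasks, 2)

def ecdl_passed(correct_tasks: int, threshold: float = 75.0) -> bool:
    """Apply the 75% pass mark used for the ECDL competency tests."""
    return ecdl_percentage(correct_tasks) >= threshold
```

For example, 24 of 32 correctly solved tasks gives exactly 75.0% and counts as a pass, while 23 of 32 (71.88%) does not.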
Additionally, the objective measurement of digital competence was preceded by a diagnostic survey consisting of:
  • A general self-assessment of digital competence level (5 questions) using a Likert scale from 1—very low to 5—very high. Previously used tools [34] were employed to develop the scale. The Cronbach alpha coefficient was 0.778.
  • Self-assessment of digital competence in using an office suite, consisting of 5 questions using a Likert scale from 1—very low to 5—very high. This scale was used in previous comparative studies in Visegrad countries [35,36]. The Cronbach alpha coefficient was 0.780.
  • An assessment of the ability to use new hardware and software (3 questions). Previous multiauthor studies from the Smart Ecosystem for Learning and Inclusion (SELI) project [37] were used to create the scale, with responses on a 5-point Likert scale from 1—very rarely to 5—very often. The Cronbach alpha coefficient was 0.720.
  • The validity of the use of ICT in education, consisting of 6 questions using a 5-point response scale from 1—strongly disagree to 5—strongly agree. The scale was the author’s own and was derived from Serbian research on the use of ICT in education [38]. The Cronbach alpha coefficient was 0.630.
Cronbach’s alpha for the entire tool was 0.816. Exploratory factor analysis (EFA) was conducted for the entire survey questionnaire. Results are attached as Appendix A.
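For reference, the reliability coefficients reported for each scale can be reproduced from raw item responses with the standard Cronbach's alpha formula. The sketch below is a generic Python implementation, not the statistical tool actually used in the study:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a scale.

    item_scores: one list of responses per item, all covering the same
    respondents in the same order.
    """
    k = len(item_scores)
    sum_item_var = sum(pvariance(item) for item in item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    return (k / (k - 1)) * (1 - sum_item_var / pvariance(totals))
```

Items that vary together push alpha towards 1; for two identical items the formula returns exactly 1.0.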

3.4. Sampling Procedure and Characteristics of the Study Sample

The sample selection was purposive in nature. The key criteria for the selection of respondents were as follows: (1) having the status of a student of pedagogy; (2) participating in the academic course “information technology” as a first-year student; (3) being able to use ICT at a basic level. The research should be treated as a pilot study aimed primarily at testing the time-consuming measurement of digital competences using the ECDL standard, going beyond self-evaluation. The collected results do not authorise generalisation to the whole population of students in Poland; the procedure should be repeated with the use of sampling frames.
The research involved 128 first-year full-time and extramural students. The research was conducted in Poland at the largest national pedagogical university, namely, the Pedagogical University of Kraków. The sample comprised 94.53% females and 5.47% males, which reflects the structure of students in pedagogical faculties. The students came from the following areas: large city (26.56%), medium city (17.20%), small city (16.40%), and village (39.84%). The vast majority of the respondents graduated from a general secondary school (82.03%), while 17.97% had a diploma from a secondary technical school. Data were collected in the academic year 2020/2021.

3.5. Research Ethics

Participation in the study was linked to the completion of mandatory classes in the academic subject of information technology. The students were informed about the purpose of the study. Completing the online survey and competency tests is a standard procedure that first-year undergraduate students undergo. Participation in the research had no bearing on obtaining credit for the course and was solely related to the diagnostic activities. The results of the study were not personally linked to the students in any way, and results were anonymised prior to analysis. After the final results had been recorded, the test protocols were secured by removing the students’ personal data, i.e., name and surname.

4. Test Results

4.1. Declared Level of Digital Literacy—A Diagnostic Survey

Future teachers rate their skills in operating software for creating multimedia presentations the highest, with more than half of the respondents declaring high or very high skills in this field. A little more than half of the respondents also assessed their digital competence in operating text editors as very high. Almost one-quarter of the respondents declared high skills in handling spreadsheets. Respondents rated their skills in creating and using databases the lowest. The detailed percentage distribution of responses related to self-assessment is presented in Figure 3.

4.2. Objective Level of Digital Skills—Skills Test

Each of the ECDL skills tests allows a maximum of 32 points to be gained, i.e., 1 point for each correctly solved task. The points were converted into percentages on a scale from 0% to 100%. On the basis of the collected data, it is clear that the respondents are least able to handle spreadsheets, while significantly better results were obtained in word processing and theoretical knowledge related to the use of ICT. Thus, in this group, word processing skills and theoretical knowledge are at a higher level than that of mathematical calculations in a spreadsheet. Basic descriptive statistics for the competence tests are presented in Table 2.
The pass mark for the ECDL competency tests was set at 75%. Taking the ECDL principles into account, the highest pass rates here were for the word processing skills test and the ICT knowledge test. However, in both cases, fewer than 20% of the respondents reached the official pass mark. For the handling of spreadsheets, only slightly more than 1.5% of the respondents achieved a positive result. The detailed distribution of results is presented in Figure 4.
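The pass rates in Figure 4 are simply the share of respondents whose percentage score meets the 75% mark. A one-line sketch (with an illustrative function name):

```python
def pass_rate(percent_scores, threshold: float = 75.0) -> float:
    """Percentage of respondents whose ECDL score meets the pass mark."""
    return round(100.0 * sum(s >= threshold for s in percent_scores)
                 / len(percent_scores), 1)
```

For instance, scores of 80, 70, 75, and 50 yield a pass rate of 50.0%, since two of the four respondents reach the threshold.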

4.3. Declared vs. Actual Level of Digital Skills

The declared and actual levels of digital competences were positively correlated. This applies first to the use of word processors and spreadsheets. Only the relationship between the self-evaluation of theoretical knowledge and the actual result of the ECDL knowledge test was not statistically significant. However, the correlations between self-evaluations and actual test scores were only slightly above the threshold of moderate strength. This means that the two forms of assessment were not interchangeable; self-evaluation is not equivalent to hard evaluation by means of standardised tests. Correlation values are shown in Table 3.
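The coefficients in Table 3 compare each self-assessment scale with the corresponding test score. The article does not state which correlation coefficient was used, so the plain Pearson coefficient below is an assumption for illustration:

```python
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))
```

Values near +1 would indicate that self-assessment tracks the test score closely; the moderate values reported in the study indicate only partial agreement.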

4.4. Prediction of Real Level of Digital Skills

Using multivariate regression analysis, predictive analyses were conducted on the results of the competence tests regarding knowledge of ICT use, word processing, and the use of spreadsheets. All three variables were included in the model as dependent variables. The independent variables were the subjective assessment of new media literacy, ways of operating new hardware and software, attitudes towards the use of new media in education, and experiences with formal education in information technology at earlier stages. The results are surprising: in the adopted model, none of the aforementioned factors had a significant effect on the level of basic ICT skills as determined by the ECDL tests. Details of the multivariate regression analysis are shown in Table 4.
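A regression of this kind fits, for each dependent variable (each ECDL score), a linear model on the predictor scales. A minimal ordinary-least-squares sketch with NumPy is shown below; the study's actual statistical software is not named, so this is only an illustration of the model form:

```python
import numpy as np

def ols_coefficients(X, y):
    """Fit y = b0 + b1*x1 + ... + bk*xk by ordinary least squares.

    X: (n_respondents, n_predictors) array of predictor scale scores.
    y: length-n vector of one ECDL test score.
    Returns [intercept, b1, ..., bk].
    """
    A = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta
```

In the study's setup, this would be run once per ECDL module, with the four predictor scales as the columns of X.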

5. Discussion

Digital competence is not a new theoretical and practical construct. The term has appeared in the literature, in reference to ICT literacy, ever since the commercialisation of the first personal computers [39,40]. With the development of the information society, including the rise in popularity of various e-services, the concept has become increasingly prominent in the pedagogical literature [41,42]. Although every educator is intuitively aware of the indicators that characterise this key skill, there are now many diverse approaches to defining and measuring it [43]. We can often observe the dominance of quick diagnostic surveys, which are characterised by a relatively uncomplicated methodology based on self-declarations. This is a simple way of measuring digital competences, but it is burdened with many errors. The measurement of digital competence through tests of knowledge and skills, i.e., classical methods known and valued for years in the pedagogical sciences, appears to be somewhat more complicated. Both methods have their positive and negative sides, and these are presented in Table 5.
The aim of the present text was to validate a selected aspect of digital competence that is consistent with the ECDL framework, and to compare these results with self-declarations among pedagogical students. The study was conducted in several stages. The first stage was characterised by rapid data collection due to the used diagnostic survey. However, the next stage, which involved the assessment of digital competence levels using the ECDL standard, was time-consuming and required the involvement of additional resources in the form of both hardware and software. The issue of time-consumption is the justification for the use of quick measurements through self-declaration by many researchers dealing with this topic.
The collected data showed, however, that students overestimate their own digital skills. At the self-evaluation stage, most of the respondents rated their knowledge and skills above their actual level, as confirmed by the results of the competence tests. Therefore, when using self-evaluation in this type of research, one should be aware that the Dunning–Kruger effect [44,45,46] can significantly distort the assessed level of digital competence. Research conducted using self-assessment alone can lead to significant distortions in research findings, generating unreliable data. In many cases, it is difficult for respondents to assess their own digital skills. A clear postulate arising from this research is therefore to limit the measurement of digital competences based on self-declaration and to develop new measurement standards that involve authentic activities using ICT. Transposing the described situation to other areas, it is easy to imagine how distorted the results of tests of key competences in other areas, e.g., foreign language and mathematical skills, would be if they were based only on vague, subjective feelings.
The collected data showed that the level of DL among students is a challenge in preparing for the teaching profession. Only one in five students of pedagogical sciences would meet the passing criteria of the ECDL tests in word processing and knowledge of ICT. In addition, fewer than 2% of the respondents would be able to obtain a positive result in managing a spreadsheet. This means that the intended effects of education in the field of information technology at earlier stages (secondary and primary school) were not achieved or were erased (e.g., forgotten due to the underuse of individual skills formed in secondary school). The research results presented in the article therefore raise questions about the quality of education related to digital competences and the necessary catalogue of digital competences that secondary school graduates should have [47,48]. Due to the complexity of the notion of digital competence, a further series of questions also arises relating to the preparation of a universal set of competences characteristic of particular professional groups, e.g., teachers [49,50,51].
Predicting changes in digital competence levels is not an easy task. Collected data showed that the change in proficiency in using word processing and spreadsheets, and the knowledge of ICT among future teachers, is not influenced by attitudes towards the use of new media in education, previous educational experiences, or experimenting with new software and hardware. The lack of such influences may be due to the methodological limitations of this text (and especially the small research sample used). Other individual variables, such as the frequency of using Office suites [52], readiness to learn [53,54,55], and situational factors related to work experience [56], which are difficult to include in the course of a single study due to the limitations of the length of the tool, may also be a limiting factor in the prediction of change in the selected elements of key competences.

6. Research Limitations and New Directions in Digital Literacy Research

This study consisted of two parts: the measurement of digital competences using a diagnostic survey, allowing for a quick, albeit “surface”, verification of this key skill, and a more complex part that involved taking a competency test. The research was characterised by several methodological limitations that may have contributed to the distortion of the results. First, the self-assessment part is characterised by a high level of subjectivity that does not provide a realistic representation of the level of ICT knowledge and skills. Second, the skills tests were conducted using real tasks related to the operation of the Office suite, and not all students had the opportunity to form these competences using the MS Office suite at earlier educational stages. In addition, the imposed time constraint (45 min) to complete the 32 tasks of each test may have caused stress among the test takers. Nevertheless, this stage of the study was conducted under friendly conditions, and this limitation was minimised by repeatedly emphasising to the subjects that the test scores did not have any bearing on the final evaluation of their academic course activities.
The research is also characterised by theoretical limitations. The approach applied to measuring digital competence was only limited to selected elements of operating an Office suite and knowledge of how ICT works, which means that only a fragment of the measurement of digital competence is presented in this text. Moreover, due to the rapid development of digital skills, it is currently difficult to construct permanent definitions and measurement tools that measure a universal core of digital competences, and one that is not subject to rapid change over time.
Therefore, further research should include the development of a universal theoretical framework, while preparing tools that allow for the quick and precise measurement of digital competences. Such a measure should be similar to competence tests related to foreign language skills (e.g., TELC A1–C2 standard).

7. Conclusions

Measuring digital competence using self-assessment is a quick but very imprecise process. On the other hand, measuring digital competences using competence tests that require completing specific actions at the computer is a complex, time-consuming task that requires direct reference to existing standards. Due to the convenience of data collection, many researchers prefer the first approach. The second, which goes beyond self-declaration, is time-consuming and burdened with technical requirements (e.g., software unification), but it brings much more precise results.
This text compared the two measurements, showing the pros and cons of both approaches. The collected data do not allow for generalisation due to the sampling procedure, but they are a voice in the discussion on the need for increased attention to the preparation of new pedagogical staff in the use of ICT in education. Moreover, the text challenges the popular standards of data collection that follow a fast but vague path, fraught with many theoretical and procedural shortcomings. A deeper debate on the development of a universal digital competence framework for future educators is therefore necessary [57].

Funding

The article was written as part of the project “Teachers of the future in the information society: between risk and opportunity paradigm”, funded by the Polish National Agency for Academic Exchange (NAWA) under the Bekker programme, grant number PPN/BEK/2020/1/00176.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data are available on request by e-mail.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

Table A1. EFA: exploratory factor analysis.
Kaiser–Meyer–Olkin test (measure of sampling adequacy, MSA)
  • Overall MSA: 0.751
  • I can create a web page: 0.730
  • I can work with a computer better than others: 0.830
  • I can work on the Internet better than others: 0.813
  • I can use a smartphone better than others: 0.795
  • I can also use the Internet in a creative way: 0.680
  • Handling a word processor—self-assessment: 0.776
  • Self-assessment on using a spreadsheet programme: 0.792
  • Self-assessment in using presentation software: 0.776
  • Self-assessment on using a database editor: 0.770
  • Knowledge of how computers and the Internet work: 0.837
  • Experimenting with new software: 0.813
  • Ease of use of new software: 0.714
  • Ease of use of new ICT devices: 0.648
  • Prediction of using ICT in the professional work of an educator: 0.751
  • Education using ICT is interesting: 0.650
  • ICT used in teaching improves students’ concentration: 0.543
  • ICT used in teaching increases student engagement: 0.742
  • ICT used in teaching increases students’ interest in the subject matter: 0.743
  • Modern school needs ICT: 0.730
  • Level of computer science teaching in secondary school: 0.730
  • Equipment of a computer lab in secondary school: 0.623
  • Content-related preparation of computer science teachers in secondary school: 0.638
Bartlett’s test
X2dfp
1001.309231.000<001
Chi-squared Test
Valuedfp
Model242.486149<001
Factor Loadings
Factor 1Factor 2Factor 3Factor 4Uniqueness
I can create a web page.0.817
I can work with a computer better than others.0.7270.390
I can work on the Internet better than others.0.9030.243
I can use a smartphone better than others.0.8160.411
I can also use the Internet in a creative way.0.898
Handling a word processor—self-assessment.0.4100.647
Self-assessment on using a spreadsheet programme.0.6070.478
Self-assessment in using a presentation software0.663
Self-assessment on using a database editor.0.6320.616
Knowledge of how computers and the Internet work.0.4940.628
Experimenting with new software.0.6630.604
Ease of use of new software.0.6720.585
Ease of use of new ICT devices.0.6170.657
Prediction of using ICT in the professional work of an educator.0.4580.738
Education using ICT is interesting.0.4030.717
ICT used in teaching improves students’ concentration.−0.4230.797
ICT used in teaching increases student engagement.0.7360.466
ICT used in teaching increases students’ interest in the subject matter.0.6470.550
Modern schools need ICT.0.7450.418
Level of computer-science teaching in a secondary school.0.5660.508
Equipment of a computer lab in a secondary school.0.8240.315
Content-related preparation of computer science teachers in secondary school.0.6480.569
Note: applied rotation method is promax.
Factor Characteristics
SumSq. LoadingsProportion var.Cumulative
Factor 12.9250.1330.133
Factor 22.5800.1170.250
Factor 32.2320.1010.352
Factor 41.5490.0700.422
Factor Correlations
Factor 1Factor 2Factor 3Factor 4
Factor 11.0000.5980.2910.217
Factor 20.5981.0000.198−0.025
Factor 30.2910.1981.0000.005
Factor 40.217−0.0250.0051.000
Additional fit indices
RMSEARMSEA 90% confidenceTLIBIC
0.0780.054–0.0860.806−480.467
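The sampling-adequacy statistics reported in Table A1 (overall KMO/MSA and Bartlett's test of sphericity) can be reproduced from any item correlation matrix. The sketch below, in Python with NumPy and SciPy, uses simulated data with one shared factor; the variable names and the simulated scores are illustrative, not the study's materials.

```python
import numpy as np
from scipy import stats

def kmo(R):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy
    computed from a correlation matrix R."""
    inv = np.linalg.inv(R)
    # anti-image (partial) correlations from the inverse correlation matrix
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d
    np.fill_diagonal(partial, 0.0)
    off_diag = R - np.eye(len(R))          # off-diagonal zero-order correlations
    num = (off_diag ** 2).sum()
    return num / (num + (partial ** 2).sum())

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity: chi-square statistic, df, p-value."""
    p = R.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2                  # with p = 22 items, df = 231 as in Table A1
    return chi2, df, stats.chi2.sf(chi2, df)

# simulated responses: 300 respondents, 22 items sharing one common factor
rng = np.random.default_rng(0)
f = rng.normal(size=(300, 1))
X = 0.7 * f + 0.5 * rng.normal(size=(300, 22))
R = np.corrcoef(X, rowvar=False)

overall_kmo = kmo(R)
chi2, df, pval = bartlett_sphericity(R, n=300)
```

With 22 items, Bartlett's degrees of freedom come out as 22 × 21 / 2 = 231, matching the value in the table above.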

References

1. Tomczyk, L.; Eliseo, M.A.; Costas, V.; Sanchez, G.; Silveira, I.F.; Barros, M.-J.; Amado-Salvatierra, H.R.; Oyelere, S.S. Digital Divide in Latin America and Europe: Main Characteristics in Selected Countries. In Proceedings of the 2019 14th Iberian Conference on Information Systems and Technologies, Coimbra, Portugal, 19–22 June 2019.
2. Potyrała, K.; Tomczyk, Ł. Teachers in the Lifelong Learning Process: Examples of Digital Literacy. J. Educ. Teach. 2021, 47, 255–273.
3. Hajduová, Z.; Smoląg, K.; Szajt, M.; Bednárová, L. Digital Competences of Polish and Slovak Students—Comparative Analysis in the Light of Empirical Research. Sustainability 2020, 12, 7739.
4. Záhorec, J.; Hašková, A.; Poliaková, A.; Munk, M. Case Study of the Integration of Digital Competencies into Teacher Preparation. Sustainability 2021, 13, 6402.
5. Tomczyk, L.; Potyrala, K.; Demeshkant, N.; Czerwiec, K. University Teachers and Crisis E-Learning: Results of a Polish Pilot Study on: Attitudes towards e-Learning, Experiences with e-Learning and Anticipation of Using e-Learning Solutions after the Pandemic. In Proceedings of the 2021 16th Iberian Conference on Information Systems and Technologies, Chaves, Portugal, 23–26 June 2021.
6. Guillén-Gámez, F.D.; Ramos, M. Competency Profile on the Use of ICT Resources by Spanish Music Teachers: Descriptive and Inferential Analyses with Logistic Regression to Detect Significant Predictors. Technol. Pedagog. Educ. 2021, 30, 511–523.
7. Cabero-Almenara, J.; Romero-Tena, R.; Palacios-Rodríguez, A. Evaluation of Teacher Digital Competence Frameworks through Expert Judgement: The Use of the Expert Competence Coefficient. J. New Approaches Educ. Res. 2020, 9, 275.
8. Guillén-Gámez, F.D.; Mayorga-Fernández, M.J.; Contreras-Rosado, J.A. Incidence of Gender in the Digital Competence of Higher Education Teachers in Research Work: Analysis with Descriptive and Comparative Methods. Educ. Sci. 2021, 11, 98.
9. Evtyugina, A.; Zhuminova, A.; Grishina, E.; Kondyurina, I.; Sturikova, M. Cognitive-Conceptual Model for Developing Foreign Language Communicative Competence in Non-Linguistic University Students. Int. J. Cogn. Res. Sci. Eng. Educ. 2020, 8, 69–77.
10. Rabiej, A.; Banach, M. Testowanie Biegłości Językowej Uczniów w Wieku Szkolnym Na Przykładzie Języka Polskiego Jako Obcego. Acta Univ. Lodz. 2020, 27, 451–467.
11. Potyrała, K.; Demeshkant, N.; Czerwiec, K.; Jancarz-Łanczkowska, B.; Tomczyk, Ł. Head Teachers’ Opinions on the Future of School Education Conditioned by Emergency Remote Teaching. Educ. Inf. Technol. 2021, 1–25.
12. Lankshear, C.; Knobel, M. Digital Literacy and Digital Literacies: Policy, Pedagogy and Research Considerations for Education. Nord. J. Digit. Lit. 2015, 10, 8–20.
13. Biezā, K.E. Digital Literacy: Concept and Definition. Int. J. Smart Educ. Urban Soc. 2020, 11, 1–15.
14. Koltay, T. The Media and the Literacies: Media Literacy, Information Literacy, Digital Literacy. Media Cult. Soc. 2011, 33, 211–221.
15. Eger, L.; Tomczyk, Ł.; Klement, M.; Pisoňová, M.; Petrová, G. How Do First Year University Students Use ICT in Their Leisure Time and for Learning Purposes? Int. J. Cogn. Res. Sci. Eng. Educ. 2020, 8, 35–52.
16. Ramowy Katalog Kompetencji Cyfrowych. Available online: https://depot.ceon.pl/bitstream/handle/123456789/9068/Ramowy_katalog_kompetencji_cyfrowych_Fra.pdf?sequence=1&isAllowed=y (accessed on 28 September 2021).
17. Liu, Z.-J.; Tretyakova, N.; Fedorov, V.; Kharakhordina, M. Digital Literacy and Digital Didactics as the Basis for New Learning Models Development. Int. J. Emerg. Technol. Learn. 2020, 15, 4.
18. Novković Cvetković, B.; Stošić, L.; Belousova, A. Media and Information Literacy—the Basic for Application of Digital Technologies in Teaching from the Discourses of Educational Needs of Teachers. Croat. J. Educ. 2018, 20.
19. Ziemba, E. Synthetic Indexes for a Sustainable Information Society: Measuring ICT Adoption and Sustainability in Polish Government Units. Springer International Publishing: Cham, Switzerland, 2019; pp. 214–234.
20. Ziemba, E. The Contribution of ICT Adoption to the Sustainable Information Society. J. Comput. Inf. Syst. 2019, 59, 116–126.
21. Majewska, K. Przygotowanie Studentów Pedagogiki Resocjalizacyjnej Do Stosowania Nowych Technologii w Profilaktyce Problemów Młodzieży. Pol. J. Soc. Rehabil. 2020, 20, 283–298.
22. Kiedrowicz, G. Współczesny Student w Świecie Mobilnych Urządzeń. Lub. Rocz. Pedagog. 2018, 36, 49.
23. Wobalis, M. Kompetencje Informatyczne Studentów Filologii Polskiej w Latach 2010–2016. Pol. Innow. 2016, 4, 109–124.
24. Jędryczkowski, J. New Media in the Teaching and Learning Process of Students. Dyskursy Młodych/Adult Educ. Discourses 2019, 20.
25. Pulak, I.; Staniek, J. Znaczenie Nowych Mediów Cyfrowych w Przygotowaniu Zawodowym Nauczycieli Edukacji Wczesnoszkolnej w Kontekście Potrzeb Modernizacji Procesu Dydaktycznego. Pedagog. Przedszkolna i Wczesnoszkolna 2017, 5, 77–88.
26. Romaniuk, M.W.; Łukasiewicz-Wieleba, J. Zdalna Edukacja Kryzysowa w APS w Okresie Pandemii COVID-19; Akademia Pedagogiki Specjalnej: Warszawa, Poland, 2020.
27. List, A. Defining Digital Literacy Development: An Examination of Pre-Service Teachers’ Beliefs. Comput. Educ. 2019, 138, 146–158.
28. List, A.; Brante, E.W.; Klee, H.L. A Framework of Pre-Service Teachers’ Conceptions about Digital Literacy: Comparing the United States and Sweden. Comput. Educ. 2020, 148, 103788.
29. García-Martín, J.; García-Sánchez, J.-N. Pre-Service Teachers’ Perceptions of the Competence Dimensions of Digital Literacy and of Psychological and Educational Measures. Comput. Educ. 2017, 107, 54–67.
30. Arteaga, M.; Tomczyk, Ł.; Barros, G.; Sunday Oyelere, S. ICT and Education in the Perspective of Experts from Business, Government, Academia and NGOs: In Europe, Latin America and Caribbean; Universidad del Azuay: Cuenca, Ecuador, 2020.
31. Fedeli, L. School, Curriculum and Technology: The What and How of Their Connections. Educ. Sci. Soc. 2018, 2.
32. Konan, N. Computer Literacy Levels of Teachers. Procedia Soc. Behav. Sci. 2010, 2, 2567–2571.
33. Leahy, D.; Dolan, D. Digital Literacy: A Vital Competence for 2010? In Key Competencies in the Knowledge Society; Springer: Berlin/Heidelberg, Germany, 2010; pp. 210–221.
34. Tomczyk, Ł.; Szotkowski, R.; Fabiś, A.; Wąsiński, A.; Chudý, Š.; Neumeister, P. Selected Aspects of Conditions in the Use of New Media as an Important Part of the Training of Teachers in the Czech Republic and Poland—Differences, Risks and Threats. Educ. Inf. Technol. 2017, 22, 747–767.
35. Eger, L.; Klement, M.; Tomczyk, Ł.; Pisoňová, M.; Petrová, G. Different User Groups of University Students and Their ICT Competence: Evidence from Three Countries in Central Europe. J. Balt. Sci. Educ. 2018, 17, 851–866.
36. Eger, L.; Egerová, D.; Mičík, M.; Varga, V.; Czeglédi, C.; Tomczyk, L.; Sládkayová, M. Trust Building and Fake News on Social Media from the Perspective of University Students from Four Visegrad Countries. Commun. Today 2020, 1, 73–88.
37. Oyelere, S.S.; Tomczyk, Ł. ICT for Learning and Inclusion in Latin America and Europe. Case Study from Countries: Bolivia, Brazil, Cuba, Dominican Republic, Ecuador, Finland, Poland, Turkey, Uruguay; University of Eastern Finland: Joensuu, Finland, 2019.
38. Stošić, L.; Stošić, I. Perceptions of Teachers Regarding the Implementation of the Internet in Education. Comput. Hum. Behav. 2015, 53, 462–468.
39. Pangrazio, L.; Godhe, A.-L.; Ledesma, A.G.L. What Is Digital Literacy? A Comparative Review of Publications across Three Language Contexts. E-Learn. Digit. Media 2020, 17, 442–459.
40. Falloon, G. From Digital Literacy to Digital Competence: The Teacher Digital Competency (TDC) Framework. Educ. Technol. Res. Dev. 2020, 68, 2449–2472.
41. Reynolds, R. Defining, Designing for, and Measuring “Social Constructivist Digital Literacy” Development in Learners: A Proposed Framework. Educ. Technol. Res. Dev. 2016, 64, 735–762.
42. Tomczyk, Ł. Skills in the Area of Digital Safety as a Key Component of Digital Literacy among Teachers. Educ. Inf. Technol. 2020, 25, 471–486.
43. Trifonas, P.P. (Ed.) Learning the Virtual Life: Public Pedagogy in a Digital World; Routledge: London, UK, 2012.
44. Peled, Y. Pre-Service Teacher’s Self-Perception of Digital Literacy: The Case of Israel. Educ. Inf. Technol. 2020, 26, 2879–2896.
45. Mahmood, K. Do People Overestimate Their Information Literacy Skills? A Systematic Review of Empirical Evidence on the Dunning–Kruger Effect. Commun. Inf. Lit. 2016, 10, 199.
46. Dunning, D. The Dunning–Kruger Effect. In Advances in Experimental Social Psychology; Elsevier: Amsterdam, The Netherlands, 2011; pp. 247–296.
47. Frania, M. Selected Aspects of Media Literacy and New Technologies in Education as a Challenge of Polish Reality. Perspect. Innov. Econ. Bus. 2014, 14, 109–112.
48. Duda, E.; Dziurzyński, K. Digital Competence Learning in Secondary Adult Education in Finland and Poland. Int. J. Pedagogy Innov. New Technol. 2019, 6, 22–32.
49. Góralczyk, N. Identity and Attitudes towards the Past, Present and Future of Student Teachers in the Digital Teacher of English Programme. Teaching English with Technology 2020, 20, 42–65.
50. Hall, R. Systemu Certyfikacji ECDL—Wpływ Zmian w Procedurze Oceny Egzaminów Na Jakość Systemu Certyfikacji. Nierówności Społeczne a Wzrost Gospodarczy 2015, 1, 192–204.
51. Cabero-Almenara, J.; Guillen-Gamez, F.D.; Ruiz-Palmero, J.; Palacios-Rodríguez, A. Classification Models in the Digital Competence of Higher Education Teachers Based on the DigCompEdu Framework: Logistic Regression and Segment Tree. J. e-Learn. Knowl. Soc. 2021, 17, 49–61.
52. Eger, L. E-Learning a Jeho Aplikace (E-Learning and Its Applications); Západočeská Univerzita v Plzni: Plzeň, Czech Republic, 2020.
53. Jakešová, J.; Kalenda, J. Self-Regulated Learning: Critical-Realistic Conceptualization. Procedia Soc. Behav. Sci. 2015, 171, 178–189.
54. Mirke, E.; Kašparová, E.; Cakula, S. Adults’ Readiness for Online Learning in the Czech Republic and Latvia (Digital Competence as a Result of ICT Education Policy and Information Society Development Strategy). Period. Eng. Nat. Sci. 2019, 7, 205.
55. Jakešová, J.; Gavora, P.; Kalenda, J.; Vávrová, S. Czech Validation of the Self-Regulation and Self-Efficacy Questionnaires for Learning. Procedia Soc. Behav. Sci. 2016, 217, 313–321.
56. Tomczyk, Ł. Edukacja Osób Starszych. Seniorzy w Przestrzeni Nowych Mediów; Difin: Warszawa, Poland, 2015.
57. Tomczyk, Ł.; Jáuregui, V.C.; de La Higuera Amato, C.A.; Muñoz, D.; Arteaga, M.; Oyelere, S.S.; Akyar, Ö.Y.; Porta, M. Are Teachers Techno-Optimists or Techno-Pessimists? A Pilot Comparative among Teachers in Bolivia, Brazil, the Dominican Republic, Ecuador, Finland, Poland, Turkey, and Uruguay. Educ. Inf. Technol. 2020, 26, 2715–2741.
Figure 1. Main characteristics of digital competence. Source: author’s illustration, prepared using Piktochart.
Figure 2. Scheme of the test procedure.
Figure 3. Self-assessment of digital competence in using an Office suite.
Figure 4. Percentage distribution of responses on a scale from 0% to 100% (X axis).
Table 1. Measurements of digital competence conducted in the last five years in Poland—samples of future teachers.
Majewska (2020) [21]. Tool: triangulation of tools, including a survey questionnaire. Conclusion: about 20% of respondents use ICT efficiently.
Kiedrowicz (2018) [22]. Tool: diagnostic tests. Conclusion: the Internet is most often used for entertainment and communication.
Wobalis (2016) [23]. Tool: triangulation of tools, including a knowledge test. Conclusion: digital literacy levels have been steadily declining for several years.
Jędryczkowski (2019) [24]. Tool: data from an e-learning platform. Conclusion: students of pedagogical faculties have a low level of information literacy and self-education in relation to operating e-learning platforms.
Pulak, Staniek (2017) [25]. Tool: diagnostic survey, self-evaluation. Conclusion: students highly rate their skills in operating entertainment websites (e.g., social networking sites, music and video sites), while their skills in operating office software are much lower.
Romaniuk, Łukasiewicz-Wieleba (2020) [26]. Tool: diagnostic survey, self-evaluation. Conclusion: most students rate their own digital skills as good.
Table 2. Results of three ECDL standardised competency tests–descriptive statistics.
Statistic: Word / Excel / Theoretical
Mean: 0.599 / 0.347 / 0.617
Std. error of mean: 0.019 / 0.018 / 0.009
Median: 0.594 / 0.375 / 0.625
Mode: 0.469 / 0.375 / 0.656
Std. deviation: 0.150 / 0.146 / 0.098
Skewness: 0.017 / 0.429 / −1.059
Std. error of skewness: 0.304 / 0.302 / 0.214
Kurtosis: 0.250 / −0.165 / 3.914
Std. error of kurtosis: 0.599 / 0.595 / 0.425
Shapiro–Wilk: 0.985 / 0.959 / 0.937
p value of Shapiro–Wilk: 0.644 / 0.034 / <0.001
Range: 0.719 / 0.656 / 0.688
Minimum: 0.219 / 0.063 / 0.125
Maximum: 0.938 / 0.719 / 0.813
25th percentile: 0.508 / 0.219 / 0.563
50th percentile: 0.594 / 0.375 / 0.625
75th percentile: 0.688 / 0.406 / 0.688
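The descriptive statistics and normality test in Table 2 are standard measures. As an illustration, they can be computed with SciPy as below; the score vector is invented for the example and is not the study's data.

```python
import numpy as np
from scipy import stats

# hypothetical test results expressed as a share of points (0-1), not the study's data
scores = np.array([0.59, 0.47, 0.63, 0.38, 0.72, 0.66, 0.53, 0.41,
                   0.78, 0.50, 0.56, 0.61, 0.44, 0.69, 0.58])

mean = scores.mean()
median = np.median(scores)
sd = scores.std(ddof=1)                     # sample standard deviation
skew = stats.skew(scores, bias=False)       # bias-corrected skewness
kurt = stats.kurtosis(scores, bias=False)   # excess kurtosis, as reported in Table 2
w, p_sw = stats.shapiro(scores)             # Shapiro-Wilk test of normality
```

A p value below 0.05 in the Shapiro–Wilk test (as for the Excel and theoretical scores in Table 2) indicates a departure from normality, which motivates the use of the non-parametric Spearman correlation in Table 3.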
Table 3. Spearman’s correlation coefficient between self-evaluation and ECDL test scores.
Each row lists the coefficients with variables 1–7, in order.
1. Word—self-evaluation: —
2. Excel—self-evaluation: 0.577 ***
3. PowerPoint—self-evaluation: 0.570 ***; 0.511 ***
4. Access—self-evaluation: 0.210 *; 0.478 ***; 0.208 *
5. Theory—self-evaluation: 0.385 ***; 0.382 ***; 0.340 ***; 0.419 ***
6. Theoretical test ECDL: 0.089; 0.077; 0.018; −0.033; 0.161
7. Word ECDL: 0.354 **; 0.128; 0.181; −0.143; 0.060; 0.283 *
8. Excel ECDL: 0.273 *; 0.471 ***; 0.312 *; 0.152; 0.223; 0.235; 0.067
* p < 0.05, ** p < 0.01, *** p < 0.001.
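The coefficients in Table 3 are Spearman rank correlations between ordinal self-ratings and test scores. A minimal sketch with SciPy follows; the paired values (Likert self-ratings and ECDL score shares) are invented for illustration only.

```python
import numpy as np
from scipy.stats import spearmanr

# hypothetical paired observations: self-evaluation on a 1-5 Likert scale
# and the share of points obtained in a skills test (0-1); not the study's data
self_eval = np.array([3, 4, 2, 5, 4, 3, 2, 4, 5, 3])
test_score = np.array([0.50, 0.63, 0.41, 0.72, 0.59, 0.47, 0.38, 0.66, 0.78, 0.53])

# Spearman's rho ranks both variables and correlates the ranks,
# so it tolerates ties and non-normal distributions
rho, p = spearmanr(self_eval, test_score)
```

Because the rho is computed on ranks, it is appropriate for the Likert-type self-assessment items even though the ECDL scores are continuous.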
Table 4. Multivariate regression analysis for independent variables ECDL test scores.
Values per predictor: β, SE, p for Model 1 (Theoretical); Model 2 (Word); Model 3 (Excel).
I can create a web page: −0.11, 0.09, 0.25; 0.09, 0.14, 0.52; 0.13, 0.14, 0.35
I can work with a computer better than others: −0.11, 0.14, 0.44; 0.35, 0.24, 0.15; 0.16, 0.25, 0.52
I can work on the Internet better than others: −0.07, 0.15, 0.65; −0.02, 0.26, 0.95; −0.23, 0.26, 0.36
I can use a smartphone better than others: 0.22, 0.13, 0.10; −0.25, 0.23, 0.29; 0.32, 0.20, 0.11
I can also use the Internet in a creative way: 0.01, 0.09, 0.89; −0.12, 0.14, 0.39; −0.09, 0.14, 0.54
Experimenting with new software: −0.14, 0.10, 0.18; −0.07, 0.15, 0.63; 0.05, 0.17, 0.79
Ease of use of new software: 0.22, 0.12, 0.08; 0.33, 0.17, 0.06; 0.26, 0.19, 0.19
Ease of use of new ICT devices: 0.13, 0.12, 0.29; 0.08, 0.18, 0.63; −0.03, 0.18, 0.87
Prediction of using ICT in the professional work of an educator: −0.02, 0.11, 0.88; 0.12, 0.15, 0.41; −0.20, 0.16, 0.22
Education using ICT is interesting: 0.05, 0.11, 0.63; −0.24, 0.17, 0.17; 0.31, 0.16, 0.06
ICT used in teaching improves students’ concentration: −0.02, 0.10, 0.85; −0.14, 0.14, 0.30; −0.11, 0.15, 0.49
ICT used in teaching increases student engagement: −0.06, 0.12, 0.60; 0.14, 0.20, 0.49; 0.06, 0.16, 0.71
ICT used in teaching increases students’ interest in the subject matter: 0.05, 0.12, 0.66; 0.01, 0.19, 0.94; −0.21, 0.17, 0.24
Modern schools need ICT: 0.11, 0.12, 0.37; 0.06, 0.20, 0.77; 0.18, 0.19, 0.34
Level of computer science teaching in a secondary school: 0.00, 0.12, 0.98; −0.28, 0.20, 0.17; −0.25, 0.16, 0.13
Equipment of a computer lab in a secondary school: 0.16, 0.12, 0.18; −0.10, 0.22, 0.65; 0.03, 0.16, 0.84
Content-related preparation of secondary school computer science teachers: −0.06, 0.11, 0.56; 0.16, 0.17, 0.35; 0.24, 0.17, 0.16
Assessment in computer science in secondary school: 0.10, 0.10, 0.32; 0.20, 0.14, 0.17; −0.08, 0.15, 0.62
Model 1: R = 0.412, R² = 0.170, F = 1.245, p < 0.001. Model 2: R = 0.610, R² = 0.372, F = 1.418, p < 0.001. Model 3: R = 0.602, R² = 0.362, F = 1.389, p < 0.001.
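The models in Table 4 are multiple (multivariate) ordinary least squares regressions of each ECDL score on the attitude and experience items. The sketch below shows how the coefficients and R² of such a model can be obtained with NumPy; the synthetic predictors and response are illustrative, not the survey data.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares fit: returns coefficients (intercept first) and R^2."""
    Xd = np.column_stack([np.ones(len(X)), X])   # design matrix with intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    ss_res = resid @ resid                       # residual sum of squares
    ss_tot = ((y - y.mean()) ** 2).sum()         # total sum of squares
    return beta, 1.0 - ss_res / ss_tot

# synthetic example: 128 respondents (the study's sample size), three predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(128, 3))
y = 0.5 + 0.3 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(scale=0.5, size=128)

beta, r2 = ols(X, y)
```

As in Table 4, a low R² with non-significant individual β coefficients indicates that the predictors (attitudes, self-assessment, prior learning experience) explain little of the variance in the ECDL results.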
Table 5. Measuring digital competence through self-declaration and a knowledge and skills test.
Self-declarations
Advantages:
  • Speed of data collection
  • Ability to easily collect data on large samples (including representative samples)
  • No need for additional hardware (computers, software)
  • Possibility of easy modification of research tools
  • Possibility to use typical scales (e.g., Likert)
Disadvantages:
  • Subject to large subjectivity error (e.g., Dunning–Kruger effect)
  • Multiple research tools, leading to the loss of standardisation and the inability to compare results across the literature

Knowledge and skills test
Advantages:
  • Precise measurement of the real level of knowledge and skills
  • Ability to use existing standards, e.g., ECDL
  • Transparency of the examination procedure based on the pass rate
Disadvantages:
  • Time-consuming research
  • The need to prepare the subjects for the procedure in organisational terms
  • Need for additional equipment
  • Complex procedure for modifying research tools
  • Differing level of respondents’ preparation for using hardware and software
  • The cost of software and hardware that should be included in the research tool

Share and Cite

Tomczyk, Ł. Declared and Real Level of Digital Skills of Future Teaching Staff. Educ. Sci. 2021, 11, 619. https://0-doi-org.brum.beds.ac.uk/10.3390/educsci11100619

