Article

Digital Competences of Students of Library Studies: Comparison of Research Results for 2018–2020

Faculty of Arts, Masaryk University in Brno, 602 00 Brno, Czech Republic
Submission received: 12 October 2021 / Revised: 5 November 2021 / Accepted: 8 November 2021 / Published: 12 November 2021

Abstract

This study analyses changes in the digital competence profile of students of Information and Library Studies at Masaryk University in Czechia. As a research tool, we used a DigComp-based self-assessment questionnaire that students were asked to fill in after completing a course. Our research shows that students are insufficiently prepared for work as highly qualified information specialists. At the same time, we found that their competence profile remained very stable between 2018 and 2020. This finding indicates that students do not readily respond to new societal changes at the level of individual competences. The results are based on data collected from 152 students during three runs of a compulsory course at the university. Students of Information and Library Studies have long perceived their competences to be strongest in the domains of information and data literacy and of communication and collaboration. Programming is the weakest competence, followed by solving technical problems and engaging in active citizenship through digital technologies. These findings can be used to innovate the curriculum to meet the demands placed on digitally competent information workers.

1. Introduction

The work of a librarian or information scientist is increasingly associated with digital technologies. A future librarian can therefore be expected to work with modern technologies and to use them to transform individual processes in the practice of an information specialist. Beverley et al. [1] presented 10 areas of work of an information specialist in the context of healthcare. The authors claim that an information specialist should be a project leader and manager; an expert in finding information and working with resources who provides assistance in working with documents (and in gaining access to them); and should be able to critically evaluate information, work with data sources, synthesise diverse data, create usable and comprehensible reports and disseminate their conclusions within the institution.
Such a delimitation of the role of the information specialist is certainly possible, but it raises two significant issues. The first is that Beverley et al. [1] did not describe the role of the information specialist as a profession, but in terms of a general requirement for any scientist that, in addition to the characteristics mentioned above, may be supplemented with other attributes pertaining to a particular professional domain. In this sense, the notion of scientific literacy (or, according to the ALA, information literacy) completely replicates the role of the information specialist [2,3]. The second issue is that the study referred to above is anchored in only one specific discourse, namely the medical one, which does not easily allow for an extension of the role of the information specialist. At the same time, the approach represented by Beverley et al. does not explicitly reflect technology as a catalyst for changes and shifts in the field.
Miller [4] has also adhered to the concept of the information specialist as a kind of “meta-scientist”, emphasizing that it is information specialists who can support scientific work in libraries. She argues that they should take an active part in education and build partnerships between information specialists and specialised scientists. In this respect, the author approaches the library as a service in an academic institution, which is developed through information specialists.
Engerer and Sabir [5] have distinguished three different discourses: information specialists, librarians facilitating research, and iHumanists. Those belonging to the last category are perceived as full-fledged members of the professional community, i.e., as autonomous scientists. The authors draw attention to the fact that while the first two discourses are traditional, they do not reflect the real needs of the current state of the world. iHumanists, or information humanists, have knowledge in three basic domains: the first concerns the ability to comprehensively analyse complex interconnected systems, the second a certain methodological proficiency, and the third technological skills.
In their study, Engerer and Sabir [5] placed a strong emphasis on what we call digital competences in our study. This is evident at the analytical level, at which a sound knowledge of technologies and the ability to place them in a broader social and scientific framework is essential for iHumanists, and also in the third dimension, which focuses on what we might call “engineering” skills; i.e., specific knowledge, procedures and the ability to solve a specific problem with the use of certain technologies.
As indicated above and confirmed by other studies [6,7], librarianship, or the work of information specialists embedded in it, requires strongly developed digital competences, such as a broad ability (and willingness) to use technology to solve a variety of problems ranging from civic needs and links to eGovernment to the support of science.
The aim of our study was to analyse and reflect on the digital competences of students of the Information and Library Studies (LIS) Bachelor's degree programme at the Faculty of Arts of Masaryk University in Brno (Czechia). In our research, we applied the digital competences model (DigComp) defined by the European Union for citizens [8]. There are several reasons why we used this model. In general, DigComp is currently probably the most widely recognised and most mature model of digital competences, and it is broadly applicable and usable. In the Czech context, it is also used, for example, by the Ministry of Labour and Social Affairs and other state institutions.
The second motivation for using this framework was the very role of libraries, which we understand as community-centred and socially engaged [9,10,11] and which is also often associated with digital services. In order for librarians to fulfil their social role, they must be citizens in the full sense of the word, but also digitally competent, which is the direction that we pursued in our research.

1.1. Definition of the Concept of Digital Competence

Janssen et al. [12] pointed out the lexical issue concerning the use of the notion of digital literacy by many authors, while in the Scandinavian environment [13,14], the notion of competence with an emphasis on a broader educational concept is preferred. Digital competences do not exist on their own, and it is problematic to evaluate them separately, because they form an interconnected whole with a broader personal and educational background. Janssen et al. [12] presented a model that uses thematic blocks: (1) competence as a tool of everyday life; (2) the ability to communicate and cooperate through ICT; (3) the ability to work with information; (4) the ethical and legal dimension; (5) a certain sociological understanding of digital competences and (6) the ability to learn and develop through ICT.
Although we applied the DigComp model [8] in our analysis, Janssen's position is crucial for us. On the one hand, it determined our research methods; above all, however, it also frames the overall design of the curriculum in the field of digital competences for our students.
Although some authors argue that digital competences are tied to a profession [15,16] or only to work on a computer [17], our view is that considering competences as something that merely serves the labour market can be short-sighted. In our library context, we see it as more important to integrate them either into the role of the iHumanist [5] or into the civic competence profile.
The DigComp framework [8] distinguishes 21 competences, which are divided into five basic domains. As a whole, these competences are aimed at the ability to exercise fully-fledged, digitally conceived citizenship. However, citizenship is not understood individually. It is clear from the structure and scaling of the competences that the framework moves towards a social dimension: competences are something to help both the individual and their surroundings; at the two highest levels, there is even the expectation of changing the nature of work as such and the approach to certain problems in society at large or in the field of work activity. In this respect, the DigComp framework shows a rare alignment with the model presented by Engerer and Sabir [5] and represents a good springboard for its further development and possibly also its evaluation.
The DigComp framework distinguishes the following five dimensions of competence (individual competences are listed in the results tables): information and data literacy, communication and collaboration, digital content creation, safety and problem solving. It is clear from looking at this list that this is indeed a relatively comprehensive approach to what every European citizen (and from our perspective, also every librarian, information specialist or iHumanist) should be able to do (and in what context).
The relationship between the concepts of information literacy and digital literacy is not clearly defined. Numerous studies and competence frameworks, such as UNESCO's MIL [18], perceive information literacy as the superordinate concept, which includes digital competences [19] or other technical skills [20]. On the contrary, DigComp [8], which we follow, and other studies [21,22] take the opposite view, treating digital competence as the general concept of which information literacy is an integral part. In this research, we lean towards this second (DigComp) variant.

1.2. Self-Assessment as a Method of Evaluating Digital Competences

Research into students' digital competences is now commonplace [23,24,25]. There are several ways to approach it; one can encounter test-based or practice-oriented tasks as well as self-assessment [26,27,28]. Digital competences can be thought of as transferable competences [12]. Self-evaluation is therefore a crucial measurement tool, as it reveals how people perceive their capabilities in specific life situations. In some cultures or some research of this kind, self-assessment could lead to an overestimation of competences [29,30], but this is not a general problem, and our research does not account for students' overestimation of their strengths, as it does not seem to occur; rather, the opposite is found [31]. As Aesaert et al. [32] point out, as ability increases, so does underestimation of the self. Our study population of university students belongs to this group prone to underestimation. Measurements of digital competence based on the DigComp framework can be found relatively frequently in the current research discourse [33,34,35].
The advantages of the self-assessment of transferable competences include the fact that self-assessment leads to self-reflection and possible further development [24,25]. A person's self-perception also determines how they work with competences in everyday life and to what extent they are able and willing to rely on them to solve specific problems. Although this self-perception can be affected by many factors, it is something that each person subconsciously calls upon whenever they think about their ability to solve a problem. Our research focuses on university students in the field of library and information studies [5].

2. Methods

Student self-evaluation is a well-established concept in the pedagogical literature and has mostly been associated with positive effects on the educational process and the personal development of the student [36,37,38,39,40]. At the same time, we agree with Stallings and Tascoine [39], who argue that self-assessment is a key tool for personal development and reflection, even though the way students rate themselves is not unproblematic. In our research, for example, we identified a significant degree of systematic underrating among the students, which, in our opinion, is culturally conditioned. Students who were able to complete tasks requiring at least level 5 without major problems oscillated around level 3 in their self-assessment, which results in a clear discrepancy.
In our research, we used a self-evaluation questionnaire in which students assigned themselves to individual levels of digital competences as described by the DigComp framework [8]. We did not use a simple numerical scale; in each case, a specific description of the level was given in order to reduce the impact of individual students' interpretations of the levels. The DigComp model defines 21 competences with 8 levels of proficiency (from foundation to highly specialised). In our research, we added level 0, which indicates that the student does not have the given competence at all. This level was introduced in response to the fact that DigComp level 1 already assumes some (although limited) degree of competence.
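To make the structure of the instrument concrete, the following is a minimal Python sketch of its data model. Only the 21-competence structure and the extended 0–8 scale come from the description above; the class, names and example values are illustrative, not the implementation used in the university information system.

```python
from dataclasses import dataclass

# One of the five DigComp areas, with its competences; the remaining four
# areas (21 competences in total) are omitted here for brevity.
DIGCOMP_AREAS = {
    "Information and data literacy": [
        "Browsing, searching and filtering data, information and digital content",
        "Evaluating data, information and digital content",
        "Managing data, information and digital content",
    ],
}

@dataclass
class SelfAssessmentItem:
    competence: str
    level: int  # 0 = competence absent (our added level); 1-8 = DigComp levels

    def __post_init__(self) -> None:
        if not 0 <= self.level <= 8:
            raise ValueError("level must lie on the extended 0-8 scale")

# Hypothetical usage: one student's answer for a single competence.
item = SelfAssessmentItem("Evaluating data, information and digital content", 4)
```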
Given that the research took place in the Czech environment (i.e., with Czech-speaking students), it was necessary to localise the DigComp framework, including the description of individual items. A translation of the framework can be found in a monograph on digital competences [41]. In Czech, there is an official translation of only specific competences, published by the Ministry of Labour and Social Affairs, which has been produced with the decisive participation of the author of the present research. Nevertheless, we perceive the problem of localisation as significant, because linguistic shifts in meaning can affect the understanding of certain concepts.
Our research was designed in such a way that students completed a standardised course (with the same study materials, test questions, assignments, etc.), although the lectures within the course varied. The lectures were given by the same teacher, so the influence of this parameter should be limited. The course is compulsory for Bachelor's degree students of Information and Library Studies.
At the end of the semester, after submitting all the necessary assignments and passing the tests, students filled in a self-evaluation questionnaire. This was a mandatory part of the course, but its content was not reflected in the evaluation and its primary function was to help students to self-evaluate or analyse the areas in which they could further develop and improve. The questionnaire was completed in the university information system. Students were informed that anonymised results were to be published as part of the research.
The self-evaluation questionnaire was filled in by students in the university's internal information system. Response time was not limited. The student always saw the level number together with its description. The individual competences in the questionnaire were sorted in the order given by DigComp 2.1, and the individual levels were sorted from 0 to 8.
The research allowed us to compare results from the Autumn 2018, Autumn 2019 and Spring 2020 semesters. The last course (Spring 2020) was specific in that, although it had the same content, it was included in the new Bachelor's degree accreditation, which had an impact both on the semester of completion (it was moved from the third semester to the second) and on the context: students in this semester lacked the preceding course focused on algorithmisation, which subsequently affected the score in programming. The code of the course also changed. In presenting the results, however, we follow only the scheme indicating the courses by semesters.
The dimension of self-evaluation is essential for us in this research—it is not an exact score that is important for the performance of job tasks, but rather the personal belief that the student has mastered the activity or competence at a certain level. Our position is that self-evaluation provides a certain indication with regard to the possibility of a student’s involvement in specific areas of work, whether the position of a librarian, an information specialist, iHumanist or any other role is concerned.
A specific limit to the representativeness of the data may be that the course in which the data were collected is mandatory for students. On the other hand, the self-evaluation was not part of the course assessment, so students had no reason to intentionally skew the results in their favour. We do not have a tool for determining the external validity of the data. However, from the results of the knowledge test at the end of the course and the continuous tasks, it can be estimated that the absolute level of competences is around level 5 for students, one point higher in selected areas (especially competences related to information literacy and online communication and cooperation) or, exceptionally, one point lower (programming). Thus, it can be estimated that the self-assessment results mimic the students' competency profile well but are not "calibrated". Students know what they can or cannot do, but they cannot determine an adequate level of competence according to the descriptions of the DigComp framework [8].
Further limitations of the research were the short data-collection period of three years (this is not long-term research) and a small sample, determined by the limited number of students in the given study programme (see Table 1). Our research is thus a probe into a specific student population in a particular curriculum, rather than straightforwardly generalisable quantitative research.
Our research sample consisted of students of the Information and Library Studies Bachelor's degree programme at the Faculty of Arts of Masaryk University in Brno. In the Autumn 2018 semester, there were 41 students (of whom 65% were women), in Autumn 2019 there were 47 students (72% women) and in Spring 2020 there were 64 students (75% women). In all cases, the sample included students of both the full-time and the combined form of study. In total, we analysed data collected from 152 students over the course of three years. We used normalised data; therefore, the differing numbers of respondents in individual years are not problematic.
Our research sought to answer two research questions:
  • Is there any difference between the competence profile of students of teacher training, as described by Napal Fraile et al. [42], and the results of students of information and library studies?
  • Are the digital competences of students of information and library studies gradually increasing in line with the development of information society?

3. Results

We used a scale of competences from 0 to 8 (i.e., a nine-point scale) in our research. The calculation of the average value assumes a linear character of the competence scale: the individual frequencies listed in the tables are multiplied by the level and subsequently normalised. As we stated in the methodological part of our study, the results appear to be affected by a relatively large systematic error related to the students' limited ability to evaluate themselves. However, what the results demonstrate quite clearly is a certain distribution of knowledge among the individual competences and their domains.
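As a concrete illustration of this averaging, here is a minimal sketch in Python; the frequency table in the example is hypothetical, not taken from the collected data.

```python
def mean_level(freq_by_level: dict[int, int]) -> float:
    """Weighted mean on the 0-8 scale: each level is multiplied by its
    frequency and the sum is normalised by the number of respondents."""
    n = sum(freq_by_level.values())
    if n == 0:
        raise ValueError("no responses")
    return sum(level * count for level, count in freq_by_level.items()) / n

# Hypothetical example: 41 respondents for one competence in one semester.
example = {0: 1, 1: 2, 2: 5, 3: 8, 4: 10, 5: 9, 6: 4, 7: 1, 8: 1}
print(round(mean_level(example), 2))  # -> 3.9
```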
The data are presented in summary form in the figures below. Figure 1, Figure 2 and Figure 3 provide a view of the distribution of competences according to student responses by course. We have used box plots to illustrate the competence distribution in the sample as a whole. Between Autumn 2019 and Spring 2020, we can see a significant shift in the subjective reflection of digital competence levels: there is a decrease in the number of responses corresponding to complete beginners (levels 0 and 1) and an increase in the representation of advanced users (levels 5 and 6). The data do not provide an answer as to whether this is a broader trend or whether (given the Autumn 2018 data) this is just a drop in competency for students in the Autumn 2019 semester (see Figure 4).
The distribution of competence levels in the student sample is approximately normal, as indicated by the values χ2 (Autumn 2018) = 1.13, χ2 (Autumn 2019) = 4.69 and χ2 (Spring 2020) = 1.25.
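The paper does not spell out the test procedure, so the following is a hedged sketch of one common way to obtain such a statistic: a chi-square goodness-of-fit test of observed level frequencies against a normal distribution fitted to the sample. The sample data and the per-level binning are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical sample of self-assessed levels (0-8), stored as raw values.
observed = np.array([1, 2, 5, 8, 10, 9, 4, 1, 1])
levels = np.repeat(np.arange(9), observed)
mu, sigma = levels.mean(), levels.std(ddof=1)

# Expected counts per level bin [k-0.5, k+0.5) under the fitted normal.
edges = np.arange(-0.5, 9.5)
probs = np.diff(stats.norm.cdf(edges, mu, sigma))
probs /= probs.sum()  # renormalise so expected counts match the sample size
expected = probs * observed.sum()

chi2 = ((observed - expected) ** 2 / expected).sum()
print(f"chi-square = {chi2:.2f}")
```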
If we excluded the programming component, the average competence would shift as follows: Autumn 2018: 3.84; Autumn 2019: 2.59; Spring 2020: 3.76. An interesting perspective can be obtained when we compare the competence areas as outlined in the DigComp framework. Here, we do not evaluate individual competences but competence areas, which seem to have a higher information value with regard to the research tool. The average values and their deviations from the overall average in a given year are shown in Table 2.
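A minimal sketch of this area-level comparison, assuming per-competence means are already available; the deviation is simply the area mean minus the overall mean across all competences. The values and the reduced area membership below are hypothetical.

```python
# Hypothetical per-competence means (the full instrument has 21 of them).
competence_means = {
    "Searching information": 4.50,
    "Evaluating information": 4.68,
    "Programming": 1.61,
}
area_members = {
    "Information and data literacy": ["Searching information", "Evaluating information"],
    "Digital content creation": ["Programming"],
}

overall = sum(competence_means.values()) / len(competence_means)
for area, members in area_members.items():
    area_mean = sum(competence_means[m] for m in members) / len(members)
    print(f"{area}: mean={area_mean:.2f}, deviation={area_mean - overall:+.2f}")
```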
Table 2 shows quite clearly that students rated themselves highest in information and data literacy, with a score exceeding the others by 0.51–0.71 points. The results for the ability to cooperate and communicate were also above average (0.22–0.31). The result for competences in the field of safety is interesting in that it corresponds well with the average value (deviation of −0.14 to −0.17). Digital content creation and problem solving were evaluated very similarly, and students felt weaker than average in them. Table 3 shows the highest- and lowest-rated competences for the individual years.
In the case of content creation, this decline can be attributed to programming, which in the long run appears to be the weakest competence, with the largest negative deviation from the average (0.69–2.05 below the average of the whole). The category of problem solving is the least homogeneous and least clearly delineated domain within the entire DigComp framework, which can impact the way students perceive and understand it.
The spring semester was affected by the partial need to switch to the online teaching mode due to the COVID-19 pandemic, which may have had a negative effect on some competences. However, we cannot sufficiently analyse this effect from the data we collected.
The decline in students' self-assessment in the Autumn 2019 semester is unclear, and we do not have a robust enough model to explain it. A poorly explained assignment or a misinterpretation of the questionnaire seems unlikely, because the same model (textual description of the questionnaire, study materials, ongoing tasks, questionnaire format, instructions, etc.) was used as in the other two examined intervals. Above all, there is a significant drop in assessments at levels 5–7 compared to the other years, which could be related to a socio-psychological anomaly in the self-assessment of this specific group of students, indicating lower self-confidence. Because this finding emerged with a considerable time lag, it could not be used for further research. We see the group perception of the competence profile among the students themselves as a probable cause, which (due to the design of the whole course) probably has a strong influence on the result.
Table 3 indicates that students have difficulties with technically oriented competences, even in areas where this is contrary to expectation, such as copyright or identifying digital competence gaps, which are topics accentuated by the LIS (Library and Information Science) curriculum of the field of study (not only in this course). Conversely, the highest scores point to three competences related to information and data literacy and to elements directed towards online cooperation and communication in various forms.
As we have already pointed out, when interpreting the results provided above, we need to take into consideration the systematic error associated with self-evaluation, specifically with a systematic underrating of oneself. Nevertheless, the results show a relatively good consistency in most competences when monitoring deviations, which means that it is not possible to consider them as a measurement error or to question their reliability, as has been quite clearly demonstrated by the last two tables.
Figure 5 demonstrates that the weakest digital competence in terms of self-perception is programming. This fact can be explained by the research being carried out at the Faculty of Arts. At the same time, we must emphasize that the field of librarianship and information science is computerised and that its digital transformation is unquestionable [5]. The development of programming skills must therefore become part of the curriculum, and students should, in addition to formal and informal support, also receive psychological intervention in this area. Students perceive the competence associated with programming as significantly more demanding than other competences, which is certainly not the goal of the DigComp framework.

4. Analysis and Discussion

Napal Fraile et al. [42] focused their empirical study on the self-assessment of teacher-training students in the natural and technical sciences across the 21 competences defined by the DigComp framework. The data showed an extremely low level of perceived competences, which is probably related to the students' inability to assess themselves appropriately, but also to the fact that, in Spain, there are limited opportunities for training students in this area. The authors distinguished three levels of competence (low, medium, high). We would like to at least briefly compare our results with the results of this research.
Napal Fraile et al. [42] also reported the worst results for the competence related to programming (over 75% of respondents rated it as low), followed by the integration of services and content and by licences and copyright. Their research also showed a strong position of information and data literacy, which was, however, associated with a very small number of respondents who achieved a high level and with a very large variance. The remaining four competence areas were perceived relatively evenly (the strongest was the area of safety, which also had the highest number of excellent respondents), which is also not apparent from our research. In our data, the homogeneity of results for individual competences decreased over the semesters.
These results seem to indicate clearly that students of LIS cannot be easily compared with Spanish science teachers. Librarians are, naturally, stronger in the domain of information and data literacy, but also in the field of communication and collaboration tools. Here, we agree with Napal Fraile et al. [42] that a closer examination might reveal that greater differences occur during out-of-school learning than within a formal curriculum. As part of our course, we performed measurements before and after the semester in 2020, and the difference in average competence corresponds to an increase of 0.85 points. We can also perceive this result as favourable for possible further educational activities: it seems that targeted courses for the development of digital competences can improve these competences relatively quickly at the level of university studies, although we cannot claim that the result depends only on such a course, disregarding the overall impact of the curriculum and society at large.
If we consider possible changes at the curriculum level, we would argue for changes that to a large extent coincide with the recommendations of the above research focused on future teachers. We see it as necessary to focus on supporting hard technical skills—developing programming competences, integrating services and content or solving problems through technology. If students improved in these three areas, there would be a significant shift in the entire competence profile, and subsequent shifts in other areas can be expected as well.
Self-directed learning [43] seems to be inaccessible to a large proportion of students, although it is strongly emphasised by the LIS curriculum. It is necessary to look for ways to strengthen it in this respect, both at the level of personal development and specific learning competences.
However, in relation to professional identity, we consider the development of competences included in the domain of information and data literacy to be essential. The fact that only 8–15% of students indicated the two highest levels in their self-assessment is definitely not optimal. This may suggest that students do not have sufficient professional skills, or that their professional self-confidence, and with it the ability to present themselves as experts, is extremely low in this area.
As far as the structure of competences in the DigComp framework is concerned, there is a certain turning point in the proficiency levels—moving beyond level 4 signifies the ability not only to work independently but also to help others. Competence, therefore, includes the area of capacity as well as the dimension of obligation. The question is whether the students feel ready to accept the commitment element of competence. On the one hand, we consider competence as a personal skill but, on the other hand, as a commitment to be honoured.
At first glance, it is clear that, in comparison with the "ordinary citizen", the ability of students of Library and Information Studies to work with information can be expected to be substantially higher than in the general population. Moreover, previous research [43] has shown that these students consider working with information to be probably the most important competence, and they often reduce the other competences to a kind of "pendant" to it. In our perspective, the goal of using these types of frameworks is not to create an "average user", but to analyse the areas in which the respondents have the greatest problems and the areas in which, on the other hand, they excel.
Our results, together with the research presented by Napal Fraile et al. [42], raise another important question: the question of the design of the DigComp model. It turns out that it does not lead, at least for information specialists and teachers, to a consistent level of competence, and that the defined levels, according to which we could explicitly expect a level of around 4–6, are not detected. The result is therefore a widely accepted model [44,45,46,47] which, however, does not fulfil one of its key functions: i.e., enabling the adequate self-assessment of individuals who want to use it to learn something about their digital competences. Its general and universal character, which was probably supposed to ensure good sustainability, has made it an essentially very problematic tool in terms of producing scores that can be relied upon without knowledge of a broader research context.
Another topic that needs to be discussed is the presence of programming as a competence directed towards creation. Does an active citizen really have to be able to create new objects using code? Would it not be more appropriate to aim at the ability to apply simple algorithmisation to certain tasks to save time and to use a computer for routine tasks? Here, our research converges with the results of Napal Fraile et al. [42], even though when we consider the competences of iHumanists [5], it is beyond any doubt that they would need a more thorough and more deeply technically oriented foundation.
The answer to the second research question is more difficult—we can see that the competence profile of students with regard to their average competence has been relatively stable, which may imply that during the three years of our research, no deeply structural changes in the information society [48,49] have occurred. In other words, the changes that have taken place in the last three years were not found to have any significant effect on the competence profile of students.
At the same time, we did not observe any systematic increase in the level of competences—students reached the highest level in the Autumn 2018 semester, which was the first year of measurement. On the one hand, we encountered the declared development of the information society; on the other hand, only a very slow adaptation of students to this kind of change was seen. However, we could probably identify a visible shift if longer time scales were used.
A question that our quantitative research cannot answer is how students approach self-assessment as such. It is possible that unlocking new possibilities and the ability to work with an increasing number of tools may lead to the Socratic attitude of “I know that I know nothing”, because deeper skills and knowledge reveal a wider horizon of possibilities.
We suggest that the teaching of programming and algorithmic thinking should become not only a separate course (e.g., an "Introduction to Programming"), which would undoubtedly be helpful, but, above all, a cross-cutting element of the curriculum. Students will develop a lasting programming competence only if they learn to use programming to solve problems in their field-specific topics. Therefore, programming must not be fixed to one course but must appear in various forms throughout the curriculum. Hauck [21] argues that such an approach leads to active study and can be very effective. We believe that such an approach could also be helpful for other problematic competences.

5. Conclusions

The results that we present in this study have not yet been published, apart from the results for Autumn 2018. They represent a systematic probe, repeated over time, into the structure of digital competences according to the DigComp framework [8]. The ethical aspect of the research is not problematic; we ensured a high degree of anonymity of the data, and the students were acquainted with the research design in advance.
Let us summarise the main research findings of our study:
  • Self-evaluation is an important and interesting tool for digital competence research, both for students and for curriculum developers and educators, as it allows them to respond to the level of competences as perceived by the students.
  • There seems to be a relatively clear agreement as to which areas are problematic for non-technical students (programming, integration of services and content, technical problem solving, copyright). These areas are supplemented with specific, often probably locally conditioned, weak competences, such as engagement in active citizenship through digital technologies in Czechia. On the other hand, we identified a strong emphasis on the area of information and data literacy and, naturally, field-specific strengths.
  • Both the students in the research of Napal Fraile et al. [42] and the students in our research had difficulty with appropriate self-evaluation. Napal Fraile et al. [42] therefore applied a reduced three-level model; in our case, the level descriptors turned out to be a problematic point, most probably also due to a certain degree of vagueness.
  • Our research revealed a relatively high degree of competence stability, in the sense that in the course of the (fewer than) three years during which our research was conducted, there were no significant changes in the competence domains in which students achieved above average or below average scores. A high degree of stability was also found in the competences which showed extreme values.
  • In our sample, the variance in students’ self-evaluation also gradually increased—students are becoming an increasingly less homogeneous group in terms of their digital competences, although education seems to at least partially reduce this variance.
  • A research challenge to be addressed by further studies is undoubtedly a more detailed examination of the process of student self-evaluation and the search for certain clusters of students with similar competence characteristics, which could have significant educational implications.

Funding

This research was funded by the Technology Agency of the Czech Republic, Eta 2 programme, under the project "Platform for knowledge transfer: information literacy for high school students in an open collaborative virtual learning environment" (TL02000040).

Institutional Review Board Statement

Students were informed about the research at the first face-to-face meeting and were continuously informed about how the data would be processed. The course had a website with teaching materials where the research information was also provided. Students who did not participate in the research were not penalised in any way during their studies. No particular informed consent forms were needed for this type of research. We worked only with aggregated data, so that no individual student could be identified from the published data. The researcher was also the course teacher, so he had access to all the students' learning outcomes based on their enrolment in the course. The data were not passed on to any third party.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data were obtained manually from the university’s closed information system.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Beverley, C.A.; Booth, A.; Bath, P.A. The Role of the Information Specialist in the Systematic Review Process: A Health Information Case Study. Health Inf. Libr. J. 2003, 20, 65–74.
  2. Jin, K.Y.; Reichert, F.; Cagasan, L.P., Jr.; de la Torre, J.; Law, N. Framework for Information Literacy for Higher Education; Association of College & Research Libraries: Chicago, IL, USA, 2015.
  3. Sales, D.; Pinto Molina, M. Pathways into Information Literacy and Communities of Practice: Teaching Approaches and Case Studies; Chandos Publishing: Cambridge, UK, 2017.
  4. Miller, F.Q. Encountering Relatable Information in Experiential Learning Spaces. J. Doc. 2019, 75, 517–529.
  5. Engerer, V.P.; Sabir, F. Information Professionals Meet Arthur Prior. J. Librariansh. Inf. Sci. 2020, 52, 288–305.
  6. Hallman, C.N. Technology: Trigger for Change in Reference Librarianship. J. Acad. Librariansh. 1990, 16, 204–208.
  7. Alvaro, E.; Brooks, H.; Ham, M.; Poegel, S.; Rosencrans, S. E-Science Librarianship: Field Undefined. Issues Sci. Technol. Librariansh. 2011, 66.
  8. Carretero, S.; Vuorikari, R.; Punie, Y. DigComp 2.1: The Digital Competence Framework for Citizens with Eight Proficiency Levels and Examples of Use; Publications Office of the European Union: Luxembourg, 2017.
  9. Edwards, J.B.; Robinson, M.S.; Unger, K.R. Transforming Libraries, Building Communities: The Community-Centered Library; Scarecrow Press: Lanham, MD, USA, 2013.
  10. Fillip, B.; Foote, D. Making the Connection: Scaling Telecenters for Development; Information Technology Applications Center: Bethesda, MD, USA, 2007.
  11. Moor, A.; Assem, R. Public Libraries as Social Innovation Catalysts. In Proceedings of the CIRN Prato Community Informatics Conference, Prato, Italy, 28–30 October 2013; pp. 28–30.
  12. Janssen, J.; Stoyanov, S.; Ferrari, A.; Punie, Y.; Pannekeet, K.; Sloep, P. Experts’ Views on Digital Competence: Commonalities and Differences. Comput. Educ. 2013, 68, 473–481.
  13. Krumsvik, R.J. Situated Learning and Teachers’ Digital Competence. Educ. Inf. Technol. 2008, 13, 279–290.
  14. Krumsvik, R.J. Teacher Educators’ Digital Competence. Scand. J. Educ. Res. 2014, 58, 269–280.
  15. Siddoo, V.; Sawattawee, J.; Janchai, W.; Yodmongkol, P. Exploring the Competency Gap of It Students in Thailand: The Employers’ View of an Effective Workforce. J. Tech. Educ. Train. 2017, 9, 1–15.
  16. Špernjak, A.; Šorgo, A. Outlines for science digital competence of elementary school students. In Proceedings of the 2018 41st International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 21–25 May 2018; pp. 825–829.
  17. Calzarossa, M.C.; Ciancarini, P.; Maresca, P.; Mich, L.; Scarabottolo, N. The ECDL Programme in Italian Universities. Comput. Educ. 2007, 49, 514–529.
  18. Calzada Prado, J.; Marzal, M.Á. Incorporating Data Literacy into Information Literacy Programs: Core Competencies and Contents. Libri 2013, 63, 123–134.
  19. Kirschner, P.A.; Stoyanov, S. Educating Youth for Nonexistent/Not yet Existing Professions. Educ. Policy 2020, 34, 477–517.
  20. Grizzle, A.; Moore, P.; Dezuanni, M.; Asthana, S.; Wilson, C.; Banda, F.; Onumah, C. Media and Information Literacy: Policy and Strategy Guidelines; UNESCO: Paris, France, 2014.
  21. Hauck, J. From Service to Synergy: Embedding Librarians in a Digital Humanities Project. Coll. Undergrad. Libr. 2017, 24, 434–451.
  22. Mackey, T.P.; Jacobson, T.E. Reframing Information Literacy as a Metaliteracy. Coll. Res. Libr. 2011, 72, 62–78.
  23. Calvani, A.; Cartelli, A.; Fini, A.; Ranieri, M. Models and Instruments for Assessing Digital Competence at School. J. E-Learn. Knowl. Soc. 2008, 4, 183–193.
  24. Calvani, A.; Fini, A.; Ranieri, M. Assessing digital competence in secondary education. Issues, models and instruments. In Issues in Information and Media Literacy: Education, Practice and Pedagogy; Informing Science Press: Santa Rosa, CA, USA, 2009; pp. 153–172.
  25. Sillat, L.H.; Tammets, K.; Laanpere, M. Digital Competence Assessment Methods in Higher Education: A Systematic Literature Review. Educ. Sci. 2021, 11, 402.
  26. Ferrari, A. Digital Competence in Practice: An Analysis of Frameworks; Publications Office of the European Union: Luxembourg, 2012.
  27. Ramírez-Montoya, M.S.; Mena, J.; Rodríguez-Arroyo, J.A. In-Service Teachers’ Self-Perceptions of Digital Competence and OER Use as Determined by a XMOOC Training Course. Comput. Hum. Behav. 2017, 77, 356–364.
  28. Ortega-Sánchez, D.; Gómez-Trigueros, I.M.; Trestini, M.; Pérez-González, C. Self-Perception and Training Perceptions on Teacher Digital Competence (TDC) in Spanish and French University Students. Multimodal Technol. Interact. 2020, 4, 74.
  29. Mabe, P.A.; West, S.G. Validity of Self-Evaluation of Ability: A Review and Meta-Analysis. J. Appl. Psychol. 1982, 67, 280.
  30. Larrouy-Maestri, P.; Wang, X.; Nunes, R.V.; Poeppel, D. Are You Your Own Best Judge? On the Self-Evaluation of Singing. J. Voice 2021.
  31. Maderick, J.A.; Zhang, S.; Hartley, K.; Marchand, G. Preservice Teachers and Self-Assessing Digital Competence. J. Educ. Comput. Res. 2016, 54, 326–351.
  32. Aesaert, K.; Voogt, J.; Kuiper, E.; Braak, J. Accuracy and Bias of ICT Self-Efficacy: An Empirical Study into Students’ over-and Underestimation of Their ICT Competences. Comput. Hum. Behav. 2017, 75, 92–102.
  33. Siiman, L.A.; Mäeots, M.; Pedaste, M.; Simons, R.J.; Leijen, Ä.; Rannikmäe, M.; Timm, M. An Instrument for Measuring Students’ Perceived Digital Competence According to the DIGCOMP Framework. In Proceedings of the International Conference on Learning and Collaboration Technologies, Toronto, ON, Canada, 17–22 July 2016; Springer: Cham, Switzerland; pp. 233–244.
  34. Morze, N.; Kuzminska, O.; Mazorchuk, M.; Pavlenko, V.; Prokhorov, A. Digital Competency of the Students and Teachers in Ukraine: Measurement, Analysis, Development Prospects. Inf. Commun. Technol. Educ. Res. Ind. Appl. Commun. Comput. Inf. Sci. 2018, 2104, 366–379.
  35. Siiman, L.A.; Mäeots, M.; Pedaste, M. A Review of Interactive Computer-Based Tasks in Large-Scale Studies: Can They Guide the Development of an Instrument to Assess Students’ Digital Competence? In Proceedings of the Technology Enhanced Assessment; Joosten-ten Brinke, D., Laanpere, M., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 148–158.
  36. Borislow, B. Self-Evaluation and Academic Achievement. J. Couns. Psychol. 1962, 9, 246.
  37. Klenowski, V. Student Self-evaluation Processes in Student-centred Teaching and Learning Contexts of Australia and England. Assess. Educ. Princ. Policy Pract. 1995, 2, 145–163.
  38. Rolheiser, C.; Ross, J.A. Student Self-Evaluation: What Research Says and What Practice Shows. Plain Talk Kids 2001, 43, 57.
  39. Stallings, V.; Tascoine, C. Student Self-Assessment and Self-Evaluation. Math. Teach. 1996, 89, 548–554.
  40. McMillan, J.H.; Hearn, J. Student Self-Assessment: The Key to Stronger Student Motivation and Higher Achievement. Educ. Horiz. 2008, 87, 40–49.
  41. Černý, M. Digitální Kompetence v Transdisciplinárním Nahlédnutí: Mezi Filosofií, Sociologií, Pedagogikou a Informační Vědou; Masarykova Univerzita: Brno, Czech Republic, 2019.
  42. Napal Fraile, M.; Peñalva-Vélez, A.; Mendióroz Lacambra, A.M. Development of Digital Competence in Secondary Education Teachers’ Training. Educ. Sci. 2018, 8, 104.
  43. Morris, T.H. Adaptivity through Self-Directed Learning to Meet the Challenges of Our Ever-Changing World. Adult Learn. 2019, 30, 56–66.
  44. Černý, M. Reflexe evropského rámce digitálních kompetencí studenty (především) Filozofické fakulty MU optikou kvalitativního výzkumu. Gramotnost Pregramotnost Vzdělávání 2020, 1, 43–62.
  45. Evangelinos, G.; Holley, D. A Qualitative Exploration of the EU Digital Competence (DIGCOMP) Framework: A Case Study within Healthcare Education. In Proceedings of the International Conference on E-Learning, E-Education, and Online Training, Bethesda, MD, USA, 18–20 September 2014; pp. 85–92.
  46. Lucas, M.; Moreira, A.; Costa, N. Quadro europeu de referência para a competência digital: Subsídios para a sua compreensão e desenvolvimento. Observatorio (Obs*) 2017, 11, 181–198.
  47. Pérez-Escoda, A. Digital Competence’s Frameworks in Europe: An Approaching to Spanish and Norwegian Framework. In Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain, 1–3 October 2014; pp. 469–474.
  48. Webster, F. Theories of the Information Society; Routledge: London, UK, 2014.
  49. Van Dijk, J.A. The Deepening Divide: Inequality in the Information Society; Sage Publications: Thousand Oaks, CA, USA, 2005.
Figure 1. Development of the frequency of representation of individual levels of competences according to students’ self-evaluation in Autumn 2018.
Figure 2. Development of the frequency of representation of individual levels of competences according to students’ self-evaluation in Autumn 2019.
Figure 3. Development of the frequency of representation of individual levels of competences according to students’ self-evaluation in Spring 2020.
Figure 4. Average competences of students and their variance over the years.
Figure 5. Distribution of individual competences in the three semesters examined.
Table 1. Research sample by year.

           Autumn 2018    Autumn 2019    Spring 2020
Men        14 (35%)       13 (28%)       15 (23%)
Women      27 (65%)       34 (72%)       49 (75%)
Total      41 (100%)      47 (100%)      64 (100%)
Table 2. Average competence values for individual semesters and the respective deviations from the overall average.

                                 Autumn 2018          Autumn 2019          Spring 2020
Competence Area                  Average  Deviation   Average  Deviation   Average  Deviation
Information and data literacy    4.33     +0.54       3.07     +0.51       4.47     +0.71
Communication and collaboration  4.01     +0.22       2.87     +0.31       3.98     +0.22
Digital content creation         3.51     −0.28       2.25     −0.31       2.63     −1.13
Safety                           3.62     −0.17       2.42     −0.14       3.60     −0.16
Problem solving                  3.51     −0.28       2.16     −0.39       3.25     −0.51
Table 3. Sets of three extremely positively and three extremely negatively evaluated competences.

Autumn 2018
  Highest: Managing information (4.37); Collaborating through digital technologies (4.37); Evaluating information (4.34)
  Lowest: Programming (2.68); Solving technical problems (3.22); Engaging in active citizenship (3.23)
Autumn 2019
  Highest: Searching information (3.30); Sharing through digital technologies (3.30); Interacting through digital technologies (3.28)
  Lowest: Programming (1.87); Licences and copyright (2.00); Creatively using technologies (2.09)
Spring 2020
  Highest: Evaluating information (4.68); Collaborating through digital technologies (4.52); Searching information (4.50)
  Lowest: Programming (1.61); Solving technical problems (3.00); Engaging in active citizenship through digital technologies (3.13)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
