Article

An Integrated Instruction and a Dynamic Fuzzy Inference System for Evaluating the Acquirement of Skills through Learning Activities by Higher Middle Education Students in Mexico

by Cecilia Leal-Ramírez * and Héctor Alonso Echavarría-Heras
Centro de Investigación Científica y de Estudios Superiores de Ensenada, Carretera Ensenada—Tijuana No. 3918, Zona Playitas, Ensenada 22860, Baja California, Mexico
* Author to whom correspondence should be addressed.
Submission received: 3 February 2024 / Revised: 21 March 2024 / Accepted: 25 March 2024 / Published: 28 March 2024
(This article belongs to the Special Issue New Advances in Fuzzy Logic and Fuzzy Systems)

Abstract

Background: The evaluation of the development of a student’s abilities and skills through a learning activity is a topic strongly questioned by the education system in Mexico. Several instruments have been developed to achieve said evaluation. However, these involve both qualitative and subjective assessment, thereby precluding the possibility of unambiguously verifying the development of a student’s aptitudes. Methods: We developed a new instrument composed of an integrated instruction and a dynamic fuzzy inference system. Integrated instruction is a table that contains a set of instructions and a set of indicators that make it possible to evaluate knowledge, procedure, and attitude without establishing qualitative or subjective criteria to rank them. The dynamic fuzzy inference system assesses the indicators under a criterion to demonstrate the development of a student’s abilities and skills. Results: The method was applied to three different learning activities, where the assessment was precise and transparent for the student, contributing to a clear identification of the acquired knowledge, procedure, and attitude that the student displayed in developing the activity. Conclusions: Our instrument evaluates the development of abilities and skills without ambiguity or subjectivity, making efficient feedback possible and allowing it to be perfected without difficulties for future adaptations.

1. Introduction

Referring to modern curricular design undoubtedly brings about discussions of Education 4.0. This concept epitomizes the impact of the Fourth Industrial Revolution on education, including advanced technology, digitalization, and new teaching methods. It emphasizes personalized learning experiences, using technologies such as artificial intelligence, virtual reality, and online platforms to improve education delivery. Education 4.0 aims to equip students with the skills needed to grow in a global context: creativity, critical thinking, collaboration, and technical skills. This paradigm shift in education is accompanied by the need to align learning methods with the requirements of a rapidly changing, technology-driven society [1]. The notion of a competencies framework fits within the context of Education 4.0. A competencies framework is a descriptive tool pinpointing the skills, knowledge, personal traits, and behaviors that an individual must develop for proficient role performance within a workplace or organization [2,3]. Moreover, competence frameworks can be conceived as configured tools that arrange the essential skills, knowledge, behaviors, and attributes required by an individual to achieve effective performance in various roles or professions [2]. The primary purpose of competencies frameworks is to align individual capabilities with an organization’s strategic goals, ensuring that workforce skills directly contribute to success. Additionally, they serve as benchmarks for performance evaluation, aiding organizations in identifying skill gaps and areas for targeted improvement [4].
Competencies can be categorized into two groups: Cross-Functional Competencies and Subject-Specific Competencies [5]. Cross-Functional Competencies, such as critical thinking, teamwork, collaboration, communication, and creativity, contribute to holistic student development for addressing complex challenges. Subject-Specific Competencies, on the other hand, focus on developing specialized knowledge and technical skills, encompassing research, design, technology application, and best-practice integration within disciplines, ensuring proficiency in both theory and practical application [5].
Implementation of the Education 4.0 approach in Mexico is still a pending goal [6]. However, adaptations to enhance the acquisition of competencies by Mexican students have already been attempted. Indeed, the Secretary of Public Education (SEP by its acronym in Spanish) reformed the curricular framework of the higher middle education system and implemented the new framework in 2008, which included a set of competencies to be developed by students [7]. Competencies in the context of the aforesaid Mexican educational reform were conceived as abilities and skills necessary for the development of a basic activity or one related to a work environment. Therefore, evaluating acquired competencies required a different approach than the one used before said reform, which gave way to an evaluation based on constructivism, on a flexible curriculum, on the notion of collaborative learning, on situated teaching, on learning based on problem solving, and on the use of simulators in teaching. The introduction of this evaluation perspective sometimes contradicted evaluation techniques focused on the memorization process and the management of information [8].
To acquire competencies, it becomes necessary to integrate “knowing how”, “knowing how to do”, and “knowing how to be” to perform a task [9]. These notions imply “knowing how to know”, which relates to prior knowledge and the new knowledge created, “knowing how to do”, linked to the application of expertise efficiently and effectively, and “knowing how to be”, which concerns a set of attitudes that shape behavior or thought to carry out a task. Therefore, developing a sound evaluation scheme for acquiring competencies poses inherent difficulties [10].
Competency-focused assessment has faced various challenges and limitations, and its effectiveness has been the subject of continuous debate and analysis. Teachers have faced various challenges when adapting to a broader conception of evaluation. The evaluation by competencies requires an inference through specific actions [11]. In addition, it requires expected learning outcomes [12], which include the continuous evaluation of the development of abilities and skills by students [12]. On the other hand, the lack of adequate training for teachers has hindered their ability to design instruments aimed at evaluating the development of competencies effectively. Furthermore, developing such training requires additional time and effort on their part, which has prompted resistance to the implementation of the reform. Therefore, attempts to comply with the demands of the 2008 reform, to date, have not had the expected results. Since 2008, the educational model has included: (1) updating study plans and programs, (2) improving the training and professional development of teachers, (3) strengthening educational evaluation, (4) integrating information technologies in education, (5) the implementation of standardized tests, (6) the evaluation of teacher performance, and, currently, (7) guaranteeing more equitable access to quality education. The values reported by the Program for International Student Assessment (PISA) [13] have shown that since 2006, the performance of Mexican students presents severe deficiencies, mainly in mathematics, science and reading.
In response to this scenario, in 2022 the SEP once again reformed the curricular framework of the higher middle education system, with the main purpose that students develop abilities, skills, knowledge, and culture that allow them to learn how to learn for life. Among other things, achieving this objective requires a continued evaluation of the student’s development of competencies, or, put another way, of the acquisition of abilities and skills.
The educational model based on the development of abilities and skills requires a continuing and integral evaluation where performance is assessed by attitude, knowledge, and its application or procedure [12]. In theory, the current orientation of the Mexican higher middle education system emphasizes aligning both the objectives of the units and the learning activities that must be developed into a study program with the competencies that the student must acquire [14]. This scheme assigns the teacher the responsibility of carrying out planning where both the learning activities and the evaluation instruments per unit are established. However, in practice, this is far from reality. On the one hand, the educational system manipulates the degree of student learning development by overvaluing or undervaluing the results [15]. On the other hand, plans are often poorly prepared because the learning activities and their corresponding evaluation instruments do not generate any evidence about the real development of the student’s abilities and skills [15]. Leal-Ramírez and Echavarria-Heras [15] proposed a method to align competencies with the objectives of the unit and learning activities, as well as a fuzzy inference system that allowed evaluation of the adequacy of learning activities. This protocol conceives the operation of a method of aligning competencies in a way that precludes the well-known manipulation that the educational system exercises on the results. However, this alignment is not enough. It is a well-known fact that the learning activities and evaluation instruments should be considered the two essential parts of an evaluation system. Therefore, both must complement each other for the student to demonstrate the “knowing how”, “knowing how to do”, and “knowing how to be” [9].
In attempts to build instruments to evaluate the development of competencies, several paradigms emerged. These include oral presentations, research projects, case studies, peer evaluation, self-assessment and reflection, interviews, practical exams, portfolios, and the rubric [16,17]. Practitioners who created these evaluation instruments planned to collect evidence on student performance. However, although all these paradigms aim to evaluate abilities and skills, the type of assessment that all of them handle is qualitative, which implies subjectivity. This type of assessment can lead to inconsistencies and can affect the reliability of the evaluation results [18]. Some paradigms, such as research projects and the rubric, integrate qualitative and quantitative assessment. The rubric, in particular, is one of the approaches most used by teachers, since it allows practically any activity to be evaluated [19]. The rubric is a table made up of a set of criteria that detail each of the tasks to be performed. It specifies the expected levels of performance (called scores), which can be qualitative or quantitative. However, the use of the rubric produces certain problems. One of them is an inadequate design that leads to bias, imprecise evaluations, and subjective interpretation. Another problem is the time and resources required to create the rubrics, particularly when it comes to projects with multiple criteria. In addition, rubrics may not be suitable for all evaluation situations, or may provide feedback too limited for students to understand their performance and/or how to improve it [20,21]. Moreover, in practice, we have identified that the main problem in evaluating the development of skills and abilities is not the paradigms used to evaluate the learning activity, but the lack of understanding about how these paradigms function.
In this work, we present a new instrument to assess learning activities, that is, a paradigm called “Integrated Instruction”. This instrument relies on a fuzzy inference system aimed at evaluating the development of a student’s abilities and skills. Its structure is in the form of a table, like a rubric; however, the Integrated Instruction has only eight columns. The table transcribes the instructions corresponding to the learning activity, and these instructions represent the evaluation criteria. The remaining columns contain a set of indicators classified into the following essential attributes: knowledge, procedure, and attitude. Unlike the rubric, Integrated Instruction contains neither performance levels nor qualitative or quantitative scores. Building the dynamic fuzzy inference system requires the set of indicators contained in the Integrated Instruction and classified into these attributes. The dynamic fuzzy inference system acts as an expert that interprets the indicators presented in the Integrated Instruction and determines a quantitative rating as a result. The extensive experience of the teacher in each topic determines the set of characteristics contained in a learning activity. The purpose of Integrated Instruction is to facilitate the teacher’s development of learning activities. Its use facilitates the adaptation of instructions focused on demonstrating the development of the student’s abilities and skills, and not only on assigning them a qualitative or quantitative assessment. Typically, abilities and skills develop over a certain time interval. Therefore, the teacher must focus on adapting the learning activities contemplated in the study plan so that they are aligned to evidence the development of said capabilities, considering the time required to acquire them. This work aims to present and explain the operation of this new instrument to objectively evaluate the development of skills and abilities, without assigning biased ratings and with complete transparency.
The Methods section presents the structure of the Integrated Instruction and the mathematical formulation of the elements that make up the structure of the fuzzy inference system to evaluate it. The Results section illustrates the performance of the offered protocol by relying on examples dealing with evaluating the skills and abilities developed by a group of students using real data. Finally, in the discussion section, we elaborate on the instrument’s scope, advantages, barriers to its use, and we also justify why we consider it a powerful tool to generate objective evidence about a student’s actual performance. We conclude with a brief description of the possible limitations in the implementation of the instrument within the Mexican higher middle education system and highlight the versatility of the offered instrument in adapting to further evaluation tasks.

2. Materials and Methods

2.1. Integrated Instruction

Integrated Instruction is conceived as an evaluation instrument characterized by a structure in the form of a table, which contains only eight columns (Figure 1).
The Integrated Instruction contains one or more instructions (Figure 1). The text of each must be harmonized with the learning activity. The relevance of the instruction lies in the way the text is written. Regardless of the context, the text must follow several rules: (a) select the appropriate verb and use it in a way that expresses a command, since this makes the instruction clearer; (b) start the text with a verb; (c) use precise and simple terms so that the reader can understand the instruction; (d) avoid ambiguous or confusing wording and negations as much as possible, and also avoid reiterating instructions.
We must note that an integral evaluation requires consideration of the conjunction of all the attributes mentioned above [3]. Therefore, the Integrated Instruction (Figure 1) includes the following attributes: knowledge, procedure, and attitude. Each of these contemplates a set of elements of the learning activity, which are defined by indicators. Therefore, an instruction must include indicators related to at least one of them, but Integrated Instruction must include enough instructions so that all of them are covered (Figure 1).
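As a compact way to picture this structure, the following sketch represents an Integrated Instruction as a simple Python data structure. The class and field names are illustrative assumptions of ours, not part of the instrument itself; the totals c, p, and a anticipate the parameters used by the dynamic fuzzy inference system of Section 2.2.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical representation of an Integrated Instruction: each instruction
# lists the indicators it covers, classified into the three attributes.
@dataclass
class Instruction:
    number: int
    text: str                                             # imperative wording, starting with a verb
    knowledge: List[str] = field(default_factory=list)    # knowledge indicators
    procedure: List[str] = field(default_factory=list)    # procedure indicators
    attitude: List[str] = field(default_factory=list)     # attitude indicators

@dataclass
class IntegratedInstruction:
    instructions: List[Instruction]

    # c, p, and a are the totals of indicators per attribute; they parameterize
    # the dynamic fuzzy inference system of Section 2.2.
    @property
    def c(self) -> int:
        return sum(len(i.knowledge) for i in self.instructions)

    @property
    def p(self) -> int:
        return sum(len(i.procedure) for i in self.instructions)

    @property
    def a(self) -> int:
        return sum(len(i.attitude) for i in self.instructions)
```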

2.2. Dynamic Fuzzy Inference System

Fuzzy inference systems can emulate approximate reasoning by relating input data to output data through a computational process that allows a solution to be determined with a high degree of precision. These systems use fuzzy logic, which allows them to represent common knowledge in a mathematical language through the theory of fuzzy sets and the characteristic functions associated with them [22].
A fuzzy inference system consists of four processes: fuzzification, inference, aggregation, and defuzzification. The fuzzification process converts a precise input into a fuzzy value, that is, the degree of membership of the given input in a suitable fuzzy set. This degree is determined by the membership functions associated with said set. Fuzzy inference works on the basis of a set of fuzzy rules. It evaluates each of the rules according to the degrees of membership that the input variables have in fuzzy sets called antecedents [23]. Then, relying on the interpretation of the connectives that join the antecedents, it determines the contribution of each rule to output fuzzy sets called consequents [23]. The aggregation process then combines the contributions of all the rules, and, finally, the defuzzification process converts the combined result into the value of the system output.
In this contribution, we adapt the fuzzy inference system presented in [15] to a new version. Particularly, we carry out modifications in the fuzzy sets and rules implied in the paradigm offered by these authors to characterize the emerging fuzzy sets through dynamic membership functions and to adapt the rules to a behavior more suitable for efficient evaluation. That is, in the scheme offered here, the fuzzy sets depend on several parameters within the fuzzy inference system, which have a direct correspondence with the Integrated Instruction designed to evaluate a learning activity. The dynamic fuzzy inference system evaluates a learning activity in a particular way. Therefore, for each learning activity, there will be a corresponding Integrated Instruction and a unique fuzzy inference system for it.
In this section we will present the mathematical formulation of the dynamic fuzzy inference system. We begin by defining the antecedents of the dynamic fuzzy inference system. Correspondingly, C , P , and A will denote the fuzzy sets that represent them and that are associated with the terms “Knowledge”, “Procedure” and “Attitude”, respectively:
$$C=\{(x_1,\mu_P(x_1)),(x_1,\mu_S(x_1)),(x_1,\mu_M(x_1))\mid x_1\in U\},\tag{1}$$
$$P=\{(x_2,\mu_{NF}(x_2)),(x_2,\mu_I(x_2)),(x_2,\mu_F(x_2))\mid x_2\in U\},\tag{2}$$
$$A=\{(x_3,\mu_{NE}(x_3)),(x_3,\mu_{ES}(x_3))\mid x_3\in U\},\tag{3}$$
where $U$ is the universe that contains all the possible elements belonging to a particular context. These terms are attributes to be evidenced by the set of elements composing the learning activity, which are defined by indicators in the Integrated Instruction. The membership functions associated with the fuzzy sets $C$ and $P$ (Equations (1) and (2), respectively) each have a direct relationship with a linguistic term: $\mu_P(x_1)$ with “Little”, $\mu_S(x_1)$ with “Enough”, $\mu_M(x_1)$ with “Much”, $\mu_{NF}(x_2)$ with “No-Realized”, $\mu_I(x_2)$ with “Incomplete”, and $\mu_F(x_2)$ with “Realized”. Membership functions assign each element of $U$ a number within the interval $[0,1]$:
$$\mu_P(x_1)=\begin{cases}1, & \text{if } x_1\le a_P\frac{c}{10}\\[4pt] 1-2\left(\dfrac{x_1-a_P\frac{c}{10}}{b_P\frac{c}{10}-a_P\frac{c}{10}}\right)^{2}, & \text{if } a_P\frac{c}{10}\le x_1\le \dfrac{a_P\frac{c}{10}+b_P\frac{c}{10}}{2}\\[8pt] 2\left(\dfrac{x_1-b_P\frac{c}{10}}{b_P\frac{c}{10}-a_P\frac{c}{10}}\right)^{2}, & \text{if } \dfrac{a_P\frac{c}{10}+b_P\frac{c}{10}}{2}\le x_1\le b_P\frac{c}{10}\\[8pt] 0, & \text{if } x_1\ge b_P\frac{c}{10}\end{cases}\quad\text{for } 0\le x_1\le c,\tag{4}$$
$$\mu_S(x_1)=\exp\!\left(-\frac{\left(x_1-m_S\frac{c}{10}\right)^{2}}{2\left(\sigma_S\frac{c}{10}\right)^{2}}\right)\quad\text{for } 0\le x_1\le c,\tag{5}$$
$$\mu_M(x_1)=\begin{cases}0, & \text{if } x_1\le a_M\frac{c}{10}\\[4pt] 2\left(\dfrac{x_1-a_M\frac{c}{10}}{b_M\frac{c}{10}-a_M\frac{c}{10}}\right)^{2}, & \text{if } a_M\frac{c}{10}\le x_1\le m_M\frac{c}{10}\\[8pt] 1-2\left(\dfrac{x_1-b_M\frac{c}{10}}{b_M\frac{c}{10}-a_M\frac{c}{10}}\right)^{2}, & \text{if } m_M\frac{c}{10}< x_1< b_M\frac{c}{10}\\[8pt] 1, & \text{if } x_1\ge b_M\frac{c}{10}\end{cases}\quad\text{for } 0\le x_1\le c,\tag{6}$$
$$\mu_{NF}(x_2)=\begin{cases}1, & \text{if } x_2\le a_{NF}\frac{p}{10}\\[4pt] 1-2\left(\dfrac{x_2-a_{NF}\frac{p}{10}}{b_{NF}\frac{p}{10}-a_{NF}\frac{p}{10}}\right)^{2}, & \text{if } a_{NF}\frac{p}{10}\le x_2\le \dfrac{a_{NF}\frac{p}{10}+b_{NF}\frac{p}{10}}{2}\\[8pt] 2\left(\dfrac{x_2-b_{NF}\frac{p}{10}}{b_{NF}\frac{p}{10}-a_{NF}\frac{p}{10}}\right)^{2}, & \text{if } \dfrac{a_{NF}\frac{p}{10}+b_{NF}\frac{p}{10}}{2}\le x_2\le b_{NF}\frac{p}{10}\\[8pt] 0, & \text{if } x_2\ge b_{NF}\frac{p}{10}\end{cases}\quad\text{for } 0\le x_2\le p,\tag{7}$$
$$\mu_I(x_2)=\exp\!\left(-\frac{\left(x_2-m_I\frac{p}{10}\right)^{2}}{2\left(\sigma_I\frac{p}{10}\right)^{2}}\right)\quad\text{for } 0\le x_2\le p,\tag{8}$$
$$\mu_F(x_2)=\begin{cases}0, & \text{if } x_2\le a_F\frac{p}{10}\\[4pt] 2\left(\dfrac{x_2-a_F\frac{p}{10}}{b_F\frac{p}{10}-a_F\frac{p}{10}}\right)^{2}, & \text{if } a_F\frac{p}{10}\le x_2\le m_F\frac{p}{10}\\[8pt] 1-2\left(\dfrac{x_2-b_F\frac{p}{10}}{b_F\frac{p}{10}-a_F\frac{p}{10}}\right)^{2}, & \text{if } m_F\frac{p}{10}< x_2< b_F\frac{p}{10}\\[8pt] 1, & \text{if } x_2\ge b_F\frac{p}{10}\end{cases}\quad\text{for } 0\le x_2\le p,\tag{9}$$
where the parameters $m$, $\sigma$, $a$, and $b$ are real numbers (Table 1) and the parameters $c$ and $p$ are positive integers determined by the Integrated Instruction (Figure 1).
In particular, the membership functions expressed in Equations (4) and (7) are z-type functions, those expressed in Equations (5) and (8) are Gaussian-type functions, and those expressed in Equations (6) and (9) are s-type functions. In general, the choice of the type of membership function associated with the fuzzy sets of a fuzzy inference system depends mainly on the context of the addressed problem and its solution, while keeping in mind that the essential aim is always improving the performance, accuracy, or flexibility of said system. However, practice reveals that, in a wide range of contexts, Gaussian-type membership functions are the most used [24,25]. In the context of education, in particular, this is a widespread approach [26]. Gaussian functions have the property of providing smooth transitions between different degrees of membership, which gives a fuzzy inference system remarkable versatility for modeling uncertainty. In addition, they can be implemented efficiently, which makes them practical in applications that require fast responses. For their part, s-type functions are useful for assigning membership values in modeling endeavors based on linguistic terms or concepts where, at the beginning, there is certainty of non-membership, followed by a zone of uncertainty and ending with certainty of membership. Z-type membership functions work in the opposite way to s-type ones; they are suitable for representing situations where an element clearly belongs to a fuzzy set up to a certain point, after which the certainty of its membership decreases to zero. Combining Gaussian membership functions with z- and s-type membership functions allows modeling a broader range of membership relations and transitions between membership states, especially where different sections of the membership curve must be modeled with different degrees of sensitivity. All of the aforementioned membership function types can be adapted according to the specific needs of the application, which can lead to improved system performance. In the present study, knowledge and procedure are attributes that determine, for the student, the possibility of acquiring competencies while addressing a learning activity, and the combination of different levels of both attributes can clearly increase or decrease this possibility. Therefore, we chose Gaussian, s-type, and z-type membership functions to characterize these attributes, given that the properties implicit in their combined use allow a more natural characterization.
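To make the three shapes concrete, the sketch below gives a minimal Python implementation of the z-type, Gaussian, and s-type membership functions of Equations (4)–(9). The function names are ours; the parameters a, m, σ, and b stand for the corresponding entries of Table 1, and the scale argument stands for c (or p), so that every breakpoint is multiplied by c/10 (or p/10) as in the equations.

```python
import math

# Minimal sketch of the membership-function shapes of Equations (4)-(9).
# `scale` is c for the Knowledge antecedent and p for the Procedure antecedent.
def z_mf(x, a, b, scale):
    a, b = a * scale / 10.0, b * scale / 10.0   # dynamic scaling by c/10 or p/10
    if x <= a:
        return 1.0
    if x <= (a + b) / 2.0:
        return 1.0 - 2.0 * ((x - a) / (b - a)) ** 2
    if x <= b:
        return 2.0 * ((x - b) / (b - a)) ** 2
    return 0.0

def gauss_mf(x, m, sigma, scale):
    m, sigma = m * scale / 10.0, sigma * scale / 10.0
    return math.exp(-((x - m) ** 2) / (2.0 * sigma ** 2))

def s_mf(x, a, m, b, scale):
    a, m, b = a * scale / 10.0, m * scale / 10.0, b * scale / 10.0
    if x <= a:
        return 0.0
    if x <= m:
        return 2.0 * ((x - a) / (b - a)) ** 2
    if x < b:
        return 1.0 - 2.0 * ((x - b) / (b - a)) ** 2
    return 1.0
```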
The membership functions associated with the fuzzy set $A$ each have a direct relationship with a linguistic term: $\mu_{NE}(x_3)$ is associated with “Not Expected” and $\mu_{ES}(x_3)$ with “Expected”. These are of the trapezoidal type and are defined as follows:
$$\mu_{NE}(x_3)=\begin{cases}0, & \text{if } x_3< a_{NE}\frac{a}{10}\\[4pt] \dfrac{x_3-a_{NE}\frac{a}{10}}{b_{NE}\frac{a}{10}-a_{NE}\frac{a}{10}}, & \text{if } a_{NE}\frac{a}{10}\le x_3\le b_{NE}\frac{a}{10}\\[8pt] 1, & \text{if } b_{NE}\frac{a}{10}< x_3< c_{NE}\frac{a}{10}\\[4pt] \dfrac{d_{NE}\frac{a}{10}-x_3}{d_{NE}\frac{a}{10}-c_{NE}\frac{a}{10}}, & \text{if } c_{NE}\frac{a}{10}\le x_3\le d_{NE}\frac{a}{10}\\[8pt] 0, & \text{if } x_3> d_{NE}\frac{a}{10}\end{cases}\quad\text{for } 0\le x_3\le a,\tag{10}$$
$$\mu_{ES}(x_3)=\begin{cases}0, & \text{if } x_3< a_{ES}\frac{a}{10}\\[4pt] \dfrac{x_3-a_{ES}\frac{a}{10}}{b_{ES}\frac{a}{10}-a_{ES}\frac{a}{10}}, & \text{if } a_{ES}\frac{a}{10}\le x_3\le b_{ES}\frac{a}{10}\\[8pt] 1, & \text{if } b_{ES}\frac{a}{10}< x_3< c_{ES}\frac{a}{10}\\[4pt] \dfrac{d_{ES}\frac{a}{10}-x_3}{d_{ES}\frac{a}{10}-c_{ES}\frac{a}{10}}, & \text{if } c_{ES}\frac{a}{10}\le x_3\le d_{ES}\frac{a}{10}\\[8pt] 0, & \text{if } x_3> d_{ES}\frac{a}{10}\end{cases}\quad\text{for } 0\le x_3\le a,\tag{11}$$
where $a_{NE}$, $b_{NE}$, $c_{NE}$, $d_{NE}$, $a_{ES}$, $b_{ES}$, $c_{ES}$, and $d_{ES}$ are real numbers (Table 1) and the parameter $a$ is a positive integer determined by the Integrated Instruction (Figure 1). The shape of the trapezoidal membership function makes it ideal for representing attributes with a zone of certainty surrounded by zones of uncertainty. In the context of attitude, this type of function allows us to model how the perception or evaluation can vary gradually, reflecting that the attitude towards a concept is not only absolutely expected or unexpected, but often presents itself on a spectrum of uncertainty. The fuzzy set $A$ is characterized by two trapezoidal membership functions. The first (Equation (10)) characterizes the region where there is certainty of an unexpected attitude and then gradually moves towards the region where that certainty is lost. The second (Equation (11)) characterizes the region where there is no certainty of an expected attitude and then gradually moves towards the region where an expected attitude is certain.
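A corresponding sketch of the trapezoidal shape of Equations (10) and (11) follows, again with names and argument order chosen by us; the four shape parameters stand for the corresponding Table 1 entries, and the scale argument stands for the attitude indicator total a of the Integrated Instruction.

```python
# Sketch of the trapezoidal membership functions of Equations (10) and (11).
# `scale` is the total number a of attitude indicators in the Integrated Instruction.
def trap_mf(x, p1, p2, p3, p4, scale):
    p1, p2, p3, p4 = (v * scale / 10.0 for v in (p1, p2, p3, p4))
    if x < p1 or x > p4:
        return 0.0
    if x < p2:
        return (x - p1) / (p2 - p1)   # rising edge
    if x <= p3:
        return 1.0                    # plateau of full membership
    return (p4 - x) / (p4 - p3)       # falling edge
```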
Now, we denote by means of D the fuzzy set associated with the term “Learning”, which represents the consequent of the dynamic fuzzy inference system:
$$D=\{(x_4,\mu_T(x_4)),(x_4,\mu_{MM}(x_4)),(x_4,\mu_{MA}(x_4)),(x_4,\mu_R(x_4)),(x_4,\mu_B(x_4)),(x_4,\mu_{MB}(x_4)),(x_4,\mu_E(x_4)),(x_4,\mu_{EX}(x_4))\mid x_4\in U\}.\tag{12}$$
The membership functions associated with $D$ (Equation (12)) each have a direct relationship with a linguistic term: $\mu_T(x_4)$ is associated with “Terrible”, $\mu_{MM}(x_4)$ with “Very-Bad”, $\mu_{MA}(x_4)$ with “Bad”, $\mu_R(x_4)$ with “Regular”, $\mu_B(x_4)$ with “Good”, $\mu_{MB}(x_4)$ with “Very-Good”, $\mu_E(x_4)$ with “Excellent”, and $\mu_{EX}(x_4)$ with “Outstanding”. Furthermore, these membership functions are triangular and are defined as follows:
$$\mu_K(x_4)=\begin{cases}\dfrac{x_4-a_K}{m_K-a_K}, & \text{if } a_K\le x_4\le m_K\\[8pt] \dfrac{b_K-x_4}{b_K-m_K}, & \text{if } m_K\le x_4\le b_K\\[6pt] 0, & \text{otherwise}\end{cases}\quad\text{for } 0\le x_4\le q,\qquad K\in\{T,MM,MA,R,B,MB,E,EX\},\tag{13–20}$$
with Equations (13)–(20) corresponding, in that order, to $K=T,\,MM,\,MA,\,R,\,B,\,MB,\,E,\,EX$,
where $q=10.5$ and the parameters $a_K$, $m_K$, and $b_K$ are real numbers (Table 1). The triangular shape of the membership functions makes it easier to interpret the fuzzy rules and the logic behind them. It is especially useful in the design phase, when a straightforward understanding of how different inputs affect the outputs of the system is required. Triangular membership functions, due to their simplicity, facilitate the aggregation and defuzzification stages, especially when using the centroid method. They often provide a good enough approximation for modeling fuzzy relationships in many practical problems [24,25]. This is particularly true in Mamdani-type fuzzy inference systems, where the goal is to capture fuzzy rule-based reasoning rather than to accurately model the mathematical complexities of membership functions. In summary, the triangular shape allows for direct and efficient calculations to determine the output value.
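For completeness, a one-function Python sketch of the triangular shape of Equations (13)–(20); the name is ours, and the parameters a, m, and b stand for the corresponding entries of Table 1.

```python
# Sketch of the triangular membership functions of Equations (13)-(20),
# defined on the output universe 0 <= x4 <= q with q = 10.5.
def tri_mf(x, a, m, b):
    if x <= a or x >= b:
        return 0.0
    if x <= m:
        return (x - a) / (m - a)   # rising edge up to the peak at m
    return (b - x) / (b - m)       # falling edge down to b
```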
Linguistic terms are fundamental for formulating rules because they allow the manipulation of imprecise or vague concepts and can be easily understood and interpreted by humans. Our system comprises a total of 18 fuzzy rules, which we express by using the linguistic terms defined in Equations (4)–(11) and (13)–(20) as follows:
If ($x_{1,int}$ is $C_{\mu_P}^{1}$) and ($x_{2,int}$ is $P_{\mu_{NF}}^{1}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{1}$) then ($y_1$ is $D_{\mu_T}^{1}$)
If ($x_{1,int}$ is $C_{\mu_P}^{2}$) and ($x_{2,int}$ is $P_{\mu_I}^{2}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{2}$) then ($y_2$ is $D_{\mu_{MA}}^{2}$)
If ($x_{1,int}$ is $C_{\mu_P}^{3}$) and ($x_{2,int}$ is $P_{\mu_F}^{3}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{3}$) then ($y_3$ is $D_{\mu_B}^{3}$)
If ($x_{1,int}$ is $C_{\mu_P}^{4}$) and ($x_{2,int}$ is $P_{\mu_{NF}}^{4}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{4}$) then ($y_4$ is $D_{\mu_T}^{4}$)
If ($x_{1,int}$ is $C_{\mu_P}^{5}$) and ($x_{2,int}$ is $P_{\mu_I}^{5}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{5}$) then ($y_5$ is $D_{\mu_R}^{5}$)
If ($x_{1,int}$ is $C_{\mu_P}^{6}$) and ($x_{2,int}$ is $P_{\mu_F}^{6}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{6}$) then ($y_6$ is $D_{\mu_{MB}}^{6}$)
If ($x_{1,int}$ is $C_{\mu_S}^{7}$) and ($x_{2,int}$ is $P_{\mu_{NF}}^{7}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{7}$) then ($y_7$ is $D_{\mu_T}^{7}$)
If ($x_{1,int}$ is $C_{\mu_S}^{8}$) and ($x_{2,int}$ is $P_{\mu_I}^{8}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{8}$) then ($y_8$ is $D_{\mu_R}^{8}$)
If ($x_{1,int}$ is $C_{\mu_S}^{9}$) and ($x_{2,int}$ is $P_{\mu_F}^{9}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{9}$) then ($y_9$ is $D_{\mu_{MB}}^{9}$)
If ($x_{1,int}$ is $C_{\mu_S}^{10}$) and ($x_{2,int}$ is $P_{\mu_{NF}}^{10}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{10}$) then ($y_{10}$ is $D_{\mu_{MM}}^{10}$)
If ($x_{1,int}$ is $C_{\mu_S}^{11}$) and ($x_{2,int}$ is $P_{\mu_I}^{11}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{11}$) then ($y_{11}$ is $D_{\mu_B}^{11}$)
If ($x_{1,int}$ is $C_{\mu_S}^{12}$) and ($x_{2,int}$ is $P_{\mu_F}^{12}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{12}$) then ($y_{12}$ is $D_{\mu_E}^{12}$)
If ($x_{1,int}$ is $C_{\mu_M}^{13}$) and ($x_{2,int}$ is $P_{\mu_{NF}}^{13}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{13}$) then ($y_{13}$ is $D_{\mu_T}^{13}$)
If ($x_{1,int}$ is $C_{\mu_M}^{14}$) and ($x_{2,int}$ is $P_{\mu_I}^{14}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{14}$) then ($y_{14}$ is $D_{\mu_B}^{14}$)
If ($x_{1,int}$ is $C_{\mu_M}^{15}$) and ($x_{2,int}$ is $P_{\mu_F}^{15}$) and ($x_{3,int}$ is $A_{\mu_{NE}}^{15}$) then ($y_{15}$ is $D_{\mu_E}^{15}$)
If ($x_{1,int}$ is $C_{\mu_M}^{16}$) and ($x_{2,int}$ is $P_{\mu_{NF}}^{16}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{16}$) then ($y_{16}$ is $D_{\mu_{MA}}^{16}$)
If ($x_{1,int}$ is $C_{\mu_M}^{17}$) and ($x_{2,int}$ is $P_{\mu_I}^{17}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{17}$) then ($y_{17}$ is $D_{\mu_{MB}}^{17}$)
If ($x_{1,int}$ is $C_{\mu_M}^{18}$) and ($x_{2,int}$ is $P_{\mu_F}^{18}$) and ($x_{3,int}$ is $A_{\mu_{ES}}^{18}$) then ($y_{18}$ is $D_{\mu_{EX}}^{18}$)
where $x_{1,int}$, $x_{2,int}$, and $x_{3,int}$ are the input values to the dynamic fuzzy inference system. The superscript of the fuzzy sets $C$, $P$, $A$, and $D$ indicates the number of the rule, and the subscript specifies the membership function that characterizes the linguistic term associated with these sets.
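The same rule base can be written compactly as data. The following sketch is our own encoding, not part of the original system: it lists the 18 rules as tuples of the linguistic-term subscripts (Knowledge, Procedure, Attitude, Learning) and is reused by the inference sketch given at the end of this section.

```python
# The 18 Mamdani rules as (Knowledge, Procedure, Attitude) -> Learning tuples,
# using the subscripts of the membership functions defined above.
RULES = [
    ("P", "NF", "NE", "T"),  ("P", "I", "NE", "MA"),  ("P", "F", "NE", "B"),
    ("P", "NF", "ES", "T"),  ("P", "I", "ES", "R"),   ("P", "F", "ES", "MB"),
    ("S", "NF", "NE", "T"),  ("S", "I", "NE", "R"),   ("S", "F", "NE", "MB"),
    ("S", "NF", "ES", "MM"), ("S", "I", "ES", "B"),   ("S", "F", "ES", "E"),
    ("M", "NF", "NE", "T"),  ("M", "I", "NE", "B"),   ("M", "F", "NE", "E"),
    ("M", "NF", "ES", "MA"), ("M", "I", "ES", "MB"),  ("M", "F", "ES", "EX"),
]
```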
The fuzzy inference system that we consider here is dynamic and of the Mamdani type. It also has the peculiarity of being unique for each Integrated Instruction prepared. The membership functions (Equations (4)–(11) and (13)–(20)) that characterize the fuzzy sets (Equations (1)–(3) and (12)) are established in terms of the values of the parameters $c$, $p$, and $a$, together with the parameter values presented in Table 1. The behavior of the system is established by the 18 fuzzy rules given above. The dependence on the values taken by $c$, $p$, and $a$ is what makes the fuzzy inference system dynamic. Furthermore, these values vary from one Integrated Instruction to another; nonetheless, even when the values of these parameters coincide, the indicators of each Integrated Instruction will be conceptually different, so in both cases the dynamic fuzzy inference system will evaluate different indicators.
The maximum operator is a function, definable on every totally ordered set, that assigns to each n-tuple of values the largest of them. Conversely, the minimum operator, equally definable, assigns to each n-tuple of values the smallest of them. In fuzzy set theory, both operators act on membership degrees, so that their output is again a fuzzy set [27,28]. Given two fuzzy sets $M$ and $N$, the minimum is obtained through:
$$\mu_{M\cap N}(x)=\min\left[\mu_M(x),\mu_N(x)\right],\tag{21}$$
and the maximum through:
$$\mu_{M\cup N}(x)=\max\left[\mu_M(x),\mu_N(x)\right].\tag{22}$$
These two operations are also known as intersection (Equation (21)) and union (Equation (22)), and they are widely used in all processes of a fuzzy inference system. The minimum operation (Equation (21)) is used to evaluate the conjunction of the propositions associated with the antecedents, whose arguments are the input values $x_{1,int}$, $x_{2,int}$, and $x_{3,int}$; the implication of each rule also follows from the minimum operation, which clips the consequent. Each rule measures the degree of truth of the relationship between the antecedents and the consequent according to the input values. In the inference stage, the output of rule $i$ is a fuzzy set $D_{\mu}^{i}$, where $1\le i\le n$. In the aggregation process, the outputs of the rules are combined using the maximum operation (Equation (22)) as follows:
$$D=D_{\mu}^{1}\oplus D_{\mu}^{2}\oplus\cdots\oplus D_{\mu}^{n},\tag{23}$$
where $\oplus$ denotes the maximum operator and the subscript $\mu$ stands for one of the membership functions $\mu_T$, $\mu_{MM}$, $\mu_{MA}$, $\mu_R$, $\mu_B$, $\mu_{MB}$, $\mu_E$, or $\mu_{EX}$. Equation (23) can be equivalently expressed as:
$$D=\bigoplus_{i=1}^{n}D_{\mu}^{i}.\tag{24}$$
Finally, to perform the defuzzification process, we select the center of gravity method [28], that is, a procedure to convert a fuzzy output into a precise output:
$$y=\frac{\sum_{i=1}^{n}y_i\,\mu_D(y_i)}{\sum_{i=1}^{n}\mu_D(y_i)}.\tag{25}$$
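Putting the pieces together, the sketch below shows how the inference, aggregation (Equations (21)–(24)), and centroid defuzzification (Equation (25)) stages can be coded for a Mamdani system of this kind. It is an illustration under our own naming conventions, not the authors' software: mf_C, mf_P, mf_A, and mf_D are assumed to be dictionaries mapping the linguistic-term subscripts to the membership functions sketched earlier, and rules is the RULES table given above.

```python
# Minimal sketch of the inference, aggregation, and defuzzification stages.
def evaluate(x1, x2, x3, mf_C, mf_P, mf_A, mf_D, rules, q=10.5, steps=1051):
    ys = [q * i / (steps - 1) for i in range(steps)]   # discretized output universe [0, q]
    aggregated = [0.0] * steps
    for term_c, term_p, term_a, term_d in rules:
        # Degree of fulfilment of the antecedent: the "and" connective is the minimum (Eq. (21)).
        w = min(mf_C[term_c](x1), mf_P[term_p](x2), mf_A[term_a](x3))
        # Mamdani implication clips the consequent at w; aggregation takes the maximum (Eqs. (23)-(24)).
        for i, y in enumerate(ys):
            aggregated[i] = max(aggregated[i], min(w, mf_D[term_d](y)))
    # Centroid (center of gravity) defuzzification (Eq. (25)).
    num = sum(y * mu for y, mu in zip(ys, aggregated))
    den = sum(aggregated)
    return num / den if den > 0 else 0.0
```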

2.3. Data

To illustrate the operation of the evaluation instrument proposed here, we relied on learning activities taken from a programming course designed for Mexican higher middle education students. Between August 2023 and December 2023, several learning activities were developed by 18 students from a single group. We used only three of these, presented in Appendix A, Appendix B and Appendix C, to form the related study cases.

3. Results

To illustrate the operation of the proposed evaluation instrument and to review the differences and similarities between the developments addressed, we explain the results of the three study cases. In each, as a first step, the Integrated Instruction was harmonized with its corresponding learning activity (Appendix A, Appendix B and Appendix C). As a second step, the associated dynamic fuzzy inference system was built. Finally, as a last step, the learning activities developed by the 18 students were harmonized in the Integrated Instruction and evaluated by its corresponding dynamic fuzzy inference system.

3.1. Study Case 1

We first present the method applied to the learning activity in Appendix A. We begin by harmonizing the Integrated Instruction with said activity (Figure 2). It is required to provide evidence of knowledge about the types and design of algorithms contemplated in the referred learning activity. It is also necessary to provide proof of the development of intuitive and comprehensive thinking in the first part of the activity. Finally, there is a need to assess both the concentration and the ingenuity of the student in the second part. Therefore, these four characteristics are included as indicators in the Integrated Instruction in the “Attitude” column.
In Figure 2, it is possible to observe that Integrated Instruction includes indicators in knowledge, procedure, and attitude. The instructions are prepared in such a way that each of them evaluates different elements of the learning activity. After harmonizing the Integrated Instruction, the involved fuzzy sets were constructed through Equations (1)–(3) and (12) of the dynamic fuzzy inference system, based on the values that c ,   p , and a take in the Integrated Instruction (illustrated in Figure 2) and considering the values of the parameters of the membership functions (4–11) and (13–20) presented in Table 1 (Figure 3).
Figure 3 shows that the domains of the antecedents (fuzzy sets $C$, $P$, and $A$) have a direct relationship with the values of the parameters $c$, $p$, and $a$, respectively. This is explained by Equations (1)–(3), which define said sets one to one; these determine the shape of the membership functions defined by Equations (4)–(6) in Figure 3a, by Equations (7)–(9) in Figure 3b, and by Equations (10) and (11) in Figure 3c.
For each student, in the Integrated Instruction, we indicated with the symbol “✓” all those indicators that were present in the learning activity developed and with the symbol “x” those that were not present; this explains the way by which we find the input values $x_{1,int}$, $x_{2,int}$, and $x_{3,int}$ (Table 2). Finally, the evaluation of the learning activities developed by the 18 students was carried out through the dynamic fuzzy inference system generated by using the fuzzy sets created (Figure 3), the 18 rules that model the behavior of said system (Section 2.2), and its defuzzification method (Equations (23)–(25)).
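As an illustration of how one such evaluation is computed, the sketch below wires the earlier pieces together, assuming the helper functions and the RULES table from the sketches in Section 2.2 are in scope. Every numeric membership-function parameter shown here is a placeholder of ours, not a value taken from Table 1, and the totals c, p, and a as well as the indicator counts are likewise hypothetical; only the structure of the computation follows the method.

```python
from functools import partial

# Placeholder totals (in practice, read from the harmonized Integrated Instruction).
c, p, a = 12, 4, 4

# Placeholder membership-function parameters (NOT the values of Table 1).
mf_C = {"P": partial(z_mf, a=0.0, b=5.0, scale=c),
        "S": partial(gauss_mf, m=5.0, sigma=1.5, scale=c),
        "M": partial(s_mf, a=5.0, m=7.5, b=10.0, scale=c)}
mf_P = {"NF": partial(z_mf, a=0.0, b=5.0, scale=p),
        "I":  partial(gauss_mf, m=5.0, sigma=1.5, scale=p),
        "F":  partial(s_mf, a=5.0, m=7.5, b=10.0, scale=p)}
mf_A = {"NE": partial(trap_mf, p1=0.0, p2=1.0, p3=4.0, p4=7.0, scale=a),
        "ES": partial(trap_mf, p1=3.0, p2=6.0, p3=9.0, p4=10.0, scale=a)}
mf_D = {"T":  partial(tri_mf, a=0.0, m=0.75, b=1.5),
        "MM": partial(tri_mf, a=1.0, m=2.25, b=3.5),
        "MA": partial(tri_mf, a=2.5, m=3.75, b=5.0),
        "R":  partial(tri_mf, a=4.0, m=5.25, b=6.5),
        "B":  partial(tri_mf, a=5.5, m=6.75, b=8.0),
        "MB": partial(tri_mf, a=7.0, m=8.25, b=9.5),
        "E":  partial(tri_mf, a=8.5, m=9.5, b=10.25),
        "EX": partial(tri_mf, a=9.5, m=10.0, b=10.5)}

# Hypothetical indicator counts for one student (ticks per attribute column).
x1_int, x2_int, x3_int = 9, 3, 3
rating = evaluate(x1_int, x2_int, x3_int, mf_C, mf_P, mf_A, mf_D, RULES)
print(round(rating, 2))
```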
In this case study, we have 18 evaluations. Let us analyze the results in parts. (i) By total number of indicators, that is, $x_{1,int}+x_{2,int}+x_{3,int}$: student number 8 has 20 indicators present in the development of the learning activity, for which they obtained a 6.5 rating. However, student number 2 has 19 indicators present in their development but obtained a rating of 6.74; that is, student number 2 obtained a higher rating than student number 8, even though student number 8 obtained a greater number of indicators in their development. This is because the dynamic fuzzy inference system does not evaluate proportionally. This system integrates all the indicators under rules that establish criteria on the possible performance that the student can show under certain elements present in the considered learning activity. (ii) For the same total number of indicators: students number 8 and number 6 have 20 indicators present in their developments, but student number 8 has a rating of 6.5 while student number 6 has a rating of 6.12; both students agree on the value of $x_{3,int}$ but display different records in the $x_{1,int}$ and $x_{2,int}$ indicators. This demonstrates, once again, that the system works with rules that establish criteria based on the possible performance that a student can show relative to certain elements present in the development of the learning activity. This means that student number 8 presented characteristics in their development that allowed them to obtain better performance than student number 6. (iii) Due to the lack of indicators, that is, when $x_{1,int}=0$ or $x_{2,int}=0$ or $x_{3,int}=0$: student number 15 did not present indicators related to attitude, that is, $x_{3,int}=0$; however, they obtained a passing rating (6.01). This means that the student could do enough to obtain this assessment despite not developing the skills associated with attitude. It is worth clarifying that attitude is an implicit quality in the development of the learning activity; it is something observed during said development and has nothing to do with a literal “positive” or “negative” attitude, but rather with the student’s response.
On the other hand, from Table 2 we can determine that 72% of the total number of students obtained the total number of indicators in $x_{1,int}$. This means that almost 30% did not obtain sufficient knowledge to carry out the activity. The maximum number of indicators in $x_{3,int}$ was only obtained by student number 1 and student number 13. This is an important piece of information, since it quickly identifies both these students and the indicator that separates them from the rest of the students, which in this case was ingenuity: only these two students found the shortest paths to move a chair from position B1 to E3 (Appendix A). Another important fact is that more than 50% of the students presented half or less than half of the total number of indicators in $x_{3,int}$. This may be due to one or more of the following possibilities: (a) they could not concentrate for some reason; (b) they did not understand the instructions; (c) their abilities or skills were not sufficient to correctly carry out the activity; (d) they need to develop some of their abilities or skills in terms of attitude, which in this case was concentration. This information is useful because the teacher can make appropriate adjustments to the learning activity or to the topics covered in class.

3.2. Study Case 2

With the 18 students considered, the teacher formed two groups, E and F, with 10 and 8 students, respectively. In group E, there were students displaying more knowledge for developing pseudocodes than in group F. In turn, within each group the teacher formed teams of two.
In this study case, we explain the method applied to the learning activity that corresponds to group E (Appendix B), which contains five teams. First, we harmonize the Integrated Instruction with said activity (Figure 4). This corresponds to a bimonthly exam where the objective to be achieved by the student is to solve a problem; to do so, the student must create a pseudocode following a set of instructions. The student must demonstrate that they can solve the problem, that they can understand and follow instructions, and that they can make correct use of the structures in an interpreter such as PSeInt. Furthermore, verifying the effectiveness of the activity requires providing evidence of the development of the student’s analytical, intuitive, and purposeful thinking, as well as concentration to develop the activity and reach a good result. Therefore, we include these four characteristics as indicators in the Integrated Instruction in the “Attitude” column (Figure 4).
In Figure 4, Integrated Instruction includes indicators tied to the Knowledge, Procedure, and Attitude attributes. The instructions are prepared in such a way that each of them evaluates different elements of the learning activity. After harmonizing the Integrated Instruction, the required fuzzy sets were constructed through Equations (1)–(3) and (12) of the dynamic fuzzy inference system, based on the values that c ,   p , and a take in the Integrated Instruction, illustrated in the Figure 4, and the parameter values of the membership functions (4–11) and (13–20), presented in Table 1 (Figure 5).
Figure 5 shows that the domains of the antecedents (fuzzy sets $C$, $P$, and $A$) have a direct relationship with the values of the parameters $c$, $p$, and $a$, respectively. This is explained by Equations (1)–(3), which define said sets one to one; these determine the shape of the membership functions defined by Equations (4)–(6) in Figure 5a, by Equations (7)–(9) in Figure 5b, and by Equations (10) and (11) in Figure 5c.
For each team, in the Integrated Instruction, we indicated with the symbol “✓” all those indicators that were present in the learning activity developed by the team and with the symbol “x” those that were not present; this explains the way in which we found the input values $x_{1,int}$, $x_{2,int}$, and $x_{3,int}$ (Table 3). Finally, the evaluation of the learning activities developed by the five teams was carried out through the adapted dynamic fuzzy inference system, which is made up of the fuzzy sets in Figure 5, involves the 18 rules that model the behavior of said system (Section 2.2), and uses its defuzzification method (Equations (23)–(25)).
In this case study, we dealt with five teams; therefore, we have five evaluations (Table 3). From Table 3, we can note that teams $T_1$ and $T_3$ have a difference equal to 1 in the total number of indicators; however, their ratings are practically the same. This means that both teams need to present more indicators that demonstrate the development of their abilities and skills. Although team $T_3$ has a higher value in $x_{1,int}$ than team $T_1$, it is not enough to obtain a higher rating than team $T_1$, which has one indicator less in the “Knowledge” attribute. Correspondingly, team $T_3$ would have to work harder to present a larger value of $x_{1,int}$ that determines better performance. With this, we demonstrate that the dynamic fuzzy inference system does not yield a proportional assessment. On the other hand, the system integrates all the indicators and operates with the criteria established in the 18 rules to evaluate the development of the student’s abilities and skills.

3.3. Study Case 3

In this case study, we present the method applied to the learning activity that corresponds to group F (Appendix C), which contains four teams. First, we harmonize the Integrated Instruction with this learning activity. This activity also corresponds to a bimonthly exam where the objective to be accomplished by the student is to solve a problem; to do so, the student must create a pseudocode following a set of instructions. The student must also demonstrate that they can solve the problem, and therefore that they can understand and follow instructions, as well as make correct use of the structures in an interpreter such as PSeInt. Furthermore, the evaluation requires providing evidence of the development of the student’s analytical, intuitive, and purposeful thinking, as well as their concentration to develop the activity and reach a good result. Therefore, these four characteristics are also included as indicators in the Integrated Instruction in the “Attitude” column (Figure 6).
In Figure 6, Integrated Instruction includes indicators in the Knowledge, Procedure, and Attitude attributes. The instructions are prepared in such a way that each of them evaluates different elements of the learning activity. After harmonizing the Integrated Instruction, the required fuzzy sets were constructed through Equations (1)–(3) and (12) of the dynamic fuzzy inference system, based on the values that c ,   p , and a take in the Integrated Instruction, illustrated in Figure 6, and with the values of the parameters of the membership functions (4–11) and (13–20), presented in Table 1 (Figure 7).
Figure 7 shows that the domains of the antecedents (fuzzy sets $C$, $P$, and $A$) have a direct relationship with the values of the parameters $c$, $p$, and $a$, respectively. This is explained by Equations (1)–(3), which define said sets one to one; these determine the shape of the membership functions defined by Equations (4)–(6) in Figure 7a, by Equations (7)–(9) in Figure 7b, and by Equations (10) and (11) in Figure 7c.
For each team, in the Integrated Instruction, we indicated with the symbol “✓” all those indicators present in the learning activity developed and with the symbol “x” those absent; this explains the way in which we find the input values $x_{1,int}$, $x_{2,int}$, and $x_{3,int}$ (Table 4). Finally, the evaluation of the learning activity developed by the four teams was carried out through the adapted dynamic fuzzy inference system, which is made up of the fuzzy sets displayed in Figure 7, involves the 18 rules that model the behavior of said system (Section 2.2), and uses its defuzzification method (Equations (23)–(25)).
Table 4 shows that teams $I_2$ and $I_3$ obtained the same rating, 6.71. However, the total number of indicators that team $I_2$ achieved was 10, while that of team $I_3$ was 12. These teams are in the same situation as teams $T_1$ and $T_3$ from case study 2: in the present case, neither team showed enough indicators in their development to obtain a higher rating. Even though both groups (E and F) pursued the same programming course, group F performed better. It should be noted that each group developed a learning activity according to its knowledge level. It is also worth clarifying that student number 15 and student number 16 developed the activity not as a team but individually and that, despite this, student number 16 obtained a rating of 10 while student number 15 obtained a rating of 6.71, the same as team $I_2$. This means that student number 15 displayed fewer abilities and skills than student number 16 in developing the activity. If these students had developed the activity as a team, student number 15 would probably have acquired the skills and abilities of student number 16 more easily. Furthermore, they would have had even better results as a group than those obtained by developing the activity individually, compared with the results obtained by group E from study case 2, in which there are students with greater abilities and skills to create pseudocodes in PSeInt. On the other hand, both team $I_2$ and student number 15, which together represent three students, presented half of the total number of indicators in $x_{3,int}$, which indicates that what they lacked was both intuition and analytical ability.

4. Discussion

The results exhibited clearly illustrate how our evaluation instrument works. In particular, we explained in detail three different study cases in which the Integrated Instruction and the dynamic fuzzy inference system were applied. Through these examples we intend to demonstrate that the main task to be developed by the teacher will be to conceive learning activities focused on demonstrating the development of the student’s abilities and skills. The teacher must harmonize the Integrated Instruction scheme with the learning activity. To clarify, one of the fundamental tasks of the teacher is the adaptation of the Learning Activity, regardless of whether or not the teacher uses our instrument to evaluate it. Therefore, the development of Learning Activities is not a new task for the teacher. So, considering that Learning Activities are a set of instructions like those presented in Appendix A, Appendix B and Appendix C, our methodology requires that these instructions be written in the Integrated Instruction format presented in Figure 1 and that the solution or answer expected by the teacher be written as indicators under the corresponding attributes. We call this harmonizing. Examples of the harmonization of the Learning Activities presented in Appendix A, Appendix B and Appendix C are shown in Figure 2, Figure 4 and Figure 6, respectively.
In the Integrated Instruction format, the teacher does not need to write a quantitative or qualitative value for the instructions or for the indicators (Figure 2, Figure 4 and Figure 6), as in the case of the rubric where these could cause subjectivity for different reasons [21,29,30]. It is pertinent to mention that the dynamic fuzzy inference system evaluates the development of the student’s abilities and skills through a criterion established in its fuzzy rule base (Section 2.2). Therefore, it is worth highlighting that the evaluation of Learning Activities using our instrument is an easy task for the teacher since the result produced by the dynamic fuzzy inference system does not induce subjectivity.
Furthermore, said evaluation is carried out in a transparent manner, which is convenient for both the teacher and the student. That is, in the Integrated Instruction one can observe both the indicators present and those not present in the development of the Learning Activity by the student. In feedback to the student, the teacher can show exactly the indicators that were lacking in the application of a procedure or in the knowledge required to develop the activity, as well as the expected attitude that the student did not display in providing a valid response to the instructions given.
Using Integrated Instruction as an evaluation instrument has other advantages. (a) The possibility of improving the preparation of the learning activities: the evaluation of these (Table 2, Table 3 and Table 4) provides values that could be used statistically to know the behavior of the students based on the development of the activity. That is, the teacher could identify elements of the activity that one or more students could not develop for some reason. For example, in case study 1, only two students obtained the maximum number of indicators in $x_{3,int}=4$. (b) The possibility of knowing, at the group level, whether the knowledge, procedure, or attitude is present in the abilities of many of the students through the values of $x_{1,int}$, $x_{2,int}$, and $x_{3,int}$, respectively (Table 2, Table 3 and Table 4). If the result were a very low value, the teacher could review one or more topics or make some adjustments in the preparation of the learning activity. (c) The possibility of performing statistical analysis using the specifications of the Integrated Instruction. Each instruction has a number (Figure 2, Figure 4 and Figure 6), so it would be possible to know, at the group level, the average number of students or teams that correctly developed each instruction, or the trend in the development of the activity in terms of the number of students. This would allow us to know general situations present in the development of the activity, as in case study 3, where three students still need to develop their intuition and analytical capacity. (d) The possibility of providing feedback to the student in a precise manner. Within the Integrated Instruction, each indicator establishes a point in the development of the activity; the student can know exactly where they failed to perform the instruction correctly, without ambiguity. For the teacher, this implies a reduction in the work time that must be invested to carry out this task, especially when there are many students in the group.
In other published research, it has been stated that using a simple assessment instrument, such as ranking ten items composing a learning activity and evaluating it based on the number of correct items, encourages a superficial evaluation of knowledge or memorization rather than deep understanding or critical thinking [31]. For obvious reasons, using a simple assessment instrument cannot provide enough information for detailed feedback. That is, students may know which items they got wrong, but they may not know how to improve their skills. The validity and reliability of this type of assessment may be questionable [32], especially if the set of items does not adequately provide information on the skills and abilities that are intended to be assessed. Other assessment instruments, such as portfolios, oral presentations, research projects, peer assessment, case studies, practical exams, self-assessment and reflection, interviews, and the rubric, are used to evaluate learning activities; however, the type of assessment that all these paradigms handle is qualitative, which implies a subjectivity that can lead to inconsistencies and therefore affect the reliability of the evaluation results. Different authors have used these instruments and have demonstrated, in some way, what was said above. For example, Oakley et al. [31] offer valuable insights into subjectivity in portfolio evaluation. Luoma [33] provides updated methods for the evaluation of oral skills, addressing subjectivity in the evaluation of presentations. Buckley [34] discusses the evaluation of classroom research projects and the challenges in maintaining objectivity. Li et al. [35] examine the effectiveness of peer assessment and discuss variability and subjectivity as challenges. Kim et al. [36] address evaluation in medical case studies and the challenges of subjectivity. Norcini et al. [37] review assessment methods in medical education, including practical examinations, and discuss subjectivity and inconsistency. Andrade et al. [38] present research on how students respond to self-assessment and discuss the subjectivity inherent in this process. Cleland et al. [39] provide insights into assessing clinical skills through interviews, highlighting challenges of subjectivity. Brookhart et al. [40] examine the quality and effectiveness of descriptive rubrics, including discussions of subjectivity and consistency in assessment.
We agree that introducing a new evaluation paradigm, such as the one offered here, could face opposition. It is evident that the Integrated Instruction approach does not amount to a simple evaluation instrument. The most important implication of using Integrated Instruction as an evaluation instrument is the change of mentality that the teacher must undergo when producing learning activities. This means that the teacher must design learning activities in which the development of skills and abilities by the student can be evidenced through indicators related to knowledge, procedure, and attitude (Figure 2, Figure 4 and Figure 6).
It is possible to suggest that Integrated Instruction can be used as an evaluation instrument in various fields of study; for example, it could be applied in computer science, mathematics, biology, and other fields of basic science. In addition, it could help improve the results of the PISA exams, which mainly evaluate basic areas such as mathematics, science, and reading. Moreover, the only requirement for the teacher to use the Integrated Instruction is mastery of the pertaining field of study.
In the future, Integrated Instruction can become a tool that makes it easier for the teacher to prepare an entire study program in which the evaluation of learning activities and the analysis of results are systematized, allowing the results of two or more groups to be compared across different school years or within the same school year. However, we think that an app allowing easy use of the method would lead teachers to accept it more quickly as an evaluation tool. We are currently working on the development of the required platform.

5. Conclusions

The use of fuzzy inference systems has wide application in many areas of knowledge and in the development of technologies. However, we sometimes fail to apply our mathematical models to other areas of knowledge or to activities that could benefit from them. The present study is an example: we introduce Integrated Instruction as an instrument for evaluating the development of skills and abilities through a learning activity carried out by the student. Our evaluation instrument, based on a fuzzy inference paradigm, determines a precise value for the evaluation according to criteria established in a set of rules. The impact of this model is beneficial for evaluating a learning activity in a transparent manner, as demonstrated in the three case studies (A, B, and C).
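To make the inference step concrete, the following is a minimal, self-contained Python sketch of a Mamdani-type evaluation with three antecedents (Knowledge, Procedure, Attitude) and one consequent (Learning). It is an illustration only: the membership-function shapes, the two rules, and the function name evaluate_learning are assumptions made for this sketch; they do not reproduce the fuzzy sets of Equations (1)–(3) and (12), the parameters of Table 1, or the rule base of our system. The sketch merely shows the generic pattern such systems share: fuzzification of the indicator counts, rule activation by the minimum, aggregation by the maximum, and defuzzification to a crisp score.

```python
import numpy as np

def trapmf(x, a, b, c, d):
    """Trapezoidal membership function with feet a, d and shoulders b, c."""
    x = np.asarray(x, dtype=float)
    rise = (x - a) / (b - a + 1e-12)
    fall = (d - x) / (d - c + 1e-12)
    return np.clip(np.minimum(rise, fall), 0.0, 1.0)

# Output universe for the consequent "Learning" (0-10 scale, as in Tables 2-4).
Y = np.linspace(0.0, 10.0, 1001)

def evaluate_learning(x1, x2, x3, c=3, p=24, a=4):
    """Hypothetical Mamdani-style evaluation. x1, x2, x3 are the counts of checked
    knowledge, procedure and attitude indicators; c, p, a are the totals defined in
    the Integrated Instruction (defaults follow Figure 2)."""
    # Normalize the counts to a common 0-10 universe.
    k, pr, at = 10.0 * x1 / c, 10.0 * x2 / p, 10.0 * x3 / a

    # Antecedent memberships (assumed shapes, not the paper's Equations (1)-(3)).
    k_high = trapmf(k, 5.0, 8.0, 10.0, 11.0)
    pr_high = trapmf(pr, 5.0, 8.0, 10.0, 11.0)
    at_high = trapmf(at, 5.0, 8.0, 10.0, 11.0)
    k_low, pr_low, at_low = 1.0 - k_high, 1.0 - pr_high, 1.0 - at_high

    # Two illustrative rules: min for AND, clipping for implication, max to aggregate.
    w_good = min(k_high, pr_high, at_high)   # IF all three are high THEN learning is good
    w_poor = max(k_low, pr_low, at_low)      # IF any of them is low THEN learning is poor
    good = np.minimum(w_good, trapmf(Y, 6.0, 9.0, 10.0, 11.0))
    poor = np.minimum(w_poor, trapmf(Y, -1.0, 0.0, 1.0, 6.0))
    aggregated = np.maximum(good, poor)

    # Centroid defuzzification yields the crisp evaluation y.
    return float(np.sum(Y * aggregated) / (np.sum(aggregated) + 1e-12))

# Example: a student who checked 3/3 knowledge, 20/24 procedure and 3/4 attitude indicators.
print(round(evaluate_learning(3, 20, 3), 2))
```

The actual instrument instead uses the activity-specific fuzzy sets shown in Figure 3, Figure 5 and Figure 7 together with its own rule base; the sketch only conveys the inference pipeline.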
The introduction of our evaluation instrument may face some disadvantages, like any other newly proposed method, namely: (a) a refusal to use it, because both teachers and students are familiar with traditional evaluation schemes; indeed, according to Fullan [41], changes in education require a process of adaptation and acceptance by everyone involved, which can be a significant challenge; (b) a lack of training for teachers, which could limit the effective use of our instrument; Tondeur et al. [42] highlight continuous teacher training as a critical component for the effective integration of new technologies and methodologies in educational environments; (c) inequality in access to the technological resources needed to implement our evaluation instrument; concerning this issue, UNESCO [43] points out that equity in access to technological resources is essential to guarantee inclusive and quality education; and (d) the reliance of our development on fuzzy inference systems, which could pose substantial analytical barriers for many teachers, since they are more prone to rely on conventional statistics-based evaluation schemes. Teachers may also have concerns about the fairness and validity of the fuzzy inference system in accurately assessing students' acquisition of skills; they may question whether the system adequately captures the nuances of student learning and whether it provides reliable results that can inform instructional decisions. Nevertheless, the effectiveness of fuzzy-based evaluation methods has been amply demonstrated. Fuzzy inference systems handle the uncertainty and ambiguity inherent in learning assessment effectively, because fuzzy logic allows degrees of membership in different categories, which better reflects the complexity of student learning [44]. Evaluation based on fuzzy inference systems can work with linguistic variables, which facilitates the interpretation and communication of results to teachers and can thereby improve pedagogical decision-making [45]. Additionally, these systems allow a more effective personalization of instruction, which can improve learning outcomes [46]. Furthermore, they support continuous formative assessment, providing immediate and relevant feedback to students and teachers; this is crucial for adjusting teaching and learning strategies and promoting continuous improvement [47]. Likewise, fuzzy inference systems are flexible and can easily adapt to different educational contexts and to changes in learning objectives or curricula, which makes them particularly useful in dynamic educational environments [48]. In summary, although the adoption of our assessment instrument offers opportunities to improve the evaluation of student skills, before it can be fully implemented it is crucial to address the disadvantages listed above through careful planning, appropriate teacher training, and investment in technological resources, so as to ensure a successful transition to the use of our evaluation instrument.
As future work, and to promote the adoption of our evaluation system, we intend to provide an app that allows easy application of the method, with the goal of achieving faster acceptance by teachers. We are working on the development of the required platform.

Author Contributions

C.L.-R. and H.A.E.-H. both contributed to conceptualization, methodology, validation, formal analysis, writing, review and editing, and funding acquisition. C.L.-R. dealt with the programming and computing duties. All authors have read and agreed to the published version of the manuscript.

Funding

This research received internal funding from Centro de Investigación Científica y de Educación Superior de Ensenada, grant number 622157.

Data Availability Statement

The data involved are available at https://drive.google.com/drive/folders/1_XJSA39YcsPzKVhIx1S10mOlxGltJI58?usp=sharing (accessed on 28 January 2020).

Acknowledgments

We thank CICESE for its encouragement.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

CLASS 2-LEARNING ACTIVITY—22 AUGUST 2023
NAME:
(1)
Look at the two solutions in the table below and indicate whether each one is a qualitative algorithm. If it is not, modify it so that it is.
| Solution 1 | Solution 2 |
|---|---|
| I dress quickly. | Bring the tools I am going to use. |
| I get up in the morning. | Make the arrangement carefully. |
| I take a 10-min shower. | Locate the damage to the tank. |
| I have tea with milk, and I leave. | Pass the invoice for the work done. |
| I finish showering at 7:25. | Organize how I am going to do the work. |
| I arrived early at school. | See what type of fix you need. |
| The clock says 7:15. | Buy arrangement efficiently. |
| Is it an algorithm? | Is it an algorithm? |
| Algorithm 1: ________ | Algorithm 2: ________ |
(2)
Make an algorithm to move the chair that is in position B1 to position E3.
Rules:
(a)
You can move a chair one position at a time.
(b)
You may move a chair to an adjacent position if there is no chair in that adjacent position.
(c)
You cannot move a chair in a diagonal position.
Figure A1. Instructions for Class 2-learning activity. Distribution of chairs on a board. Arrangement of the chairs that should not be moved (brown color) and the chair that should be moved (blue color) on a board.

Appendix B

CLASS 17—EXAM—2ND BIMESTER—SUBGROUP A—5 DECEMBER 2023
NAME:
(1)
Write a pseudocode in PSeInt (a pseudocode interpreter) to calculate the average expense incurred for each of the water, telephone, and gas services in the months of January, February, and March:
(a)
The pseudocode must use the PARA (FOR) structure in solving the problem.
(b)
The pseudocode must ask the user about the amounts spent for each service.
(c)
The pseudocode should produce the following output:
-
The average expense for water service was: WWWW
-
The average expense for telephone service was: YYYY
-
The average expense for gas service was: XXXX
(2)
The pseudocode must verify that the average expense for the water service has not exceeded 300 pesos; if it has, it must calculate the surplus and produce the following output:
You have exceeded your water consumption; you have paid a surplus of: ZZZZ
(3)
Write “Average of three payments for services” as the name of the Algorithm.
(4)
Once points 1 to 3 have been completed, save your pseudocode in a file named after the last name of one of your team members followed by the word version_1, for example:
Smith_version_1
(5)
Make a second version of your pseudocode, example:
Smith_version_2
(6)
In the second version of your pseudocode, change the PARA (FOR) structure to the MIENTRAS (WHILE) or REPETIR (REPEAT) structure.
(7)
In the second version of your pseudocode, write the necessary instructions so that it calculates the average considering only amounts spent greater than zero. If all of them are equal to or less than zero, show the following message to the user:
Averaging can only be done with amounts greater than zero.
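For reference, the following is a minimal Python sketch of the control flow that version_1 of this exam calls for. It is illustrative only: students must submit PSeInt pseudocode, not Python, and the prompt texts and two-decimal formatting used here are assumptions; only the variable roles and the 300-peso threshold come from the statement above.

```python
# Illustrative Python rendering of the "version_1" logic requested above; the
# exam itself asks for PSeInt pseudocode that uses a PARA (FOR) structure.
SERVICES = ["water", "telephone", "gas"]
MONTHS = ["January", "February", "March"]

averages = {}
for service in SERVICES:                      # one average per service
    total = 0.0
    for month in MONTHS:                      # the counted (PARA/FOR) loop of point 1(a)
        amount = float(input(f"Amount spent on {service} in {month}: "))
        total += amount
    averages[service] = total / len(MONTHS)

print(f"The average expense for water service was: {averages['water']:.2f}")
print(f"The average expense for telephone service was: {averages['telephone']:.2f}")
print(f"The average expense for gas service was: {averages['gas']:.2f}")

# Point 2: report a surplus if the average water expense exceeded 300 pesos.
if averages["water"] > 300:
    surplus = averages["water"] - 300
    print(f"You have exceeded your water consumption; you have paid a surplus of: {surplus:.2f}")
```

Version 2 (points 5 to 7) would replace the counted loop with a MIENTRAS (WHILE) or REPETIR (REPEAT) loop and average only the amounts greater than zero, showing the indicated message when no positive amount is entered.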

Appendix C

CLASS 17—EXAM—2ND BIMESTER—SUBGROUP B—5 DECEMBER 2023
NAME:
(1)
Create a pseudocode in PSeInt (a pseudocode interpreter) to perform one of the basic operations (addition, subtraction, multiplication, or division) between two numbers.
(2)
The pseudocode should show the user the following options menu:
(a)
Sum
(b)
Subtraction
(c)
Multiplication
(d)
Division
(3)
The pseudocode must ask the user:
Which option do you want to perform?
(4)
The pseudocode must save the option chosen by the user in the variable option.
(5)
The pseudocode should ask the user to enter the value of the first number and save it in the variable num_1.
(6)
The pseudocode should ask the user to enter the value of the second number and save it in the variable num_2.
(7)
The pseudocode must perform the operation chosen by the user and save the result in the variable result.
(8)
The output of the pseudocode will be as follows. For example, if num_1 equals 4, num_2 equals 5, and the chosen option is "1", then the output will be:
The sum of 4 + 5 is equal to 9.
(9)
The pseudocode must check whether the amount contained in the variable result is greater than $100.00 pesos; if so, it must increase that amount by 10% and save the result of the operation in the variable result. If the amount is less than $100.00 pesos, it must increase that amount by 20% and save the result of the operation in the variable result. Show the user the increase that was applied.
(10)
Write “basic operations” as the pseudocode name.
(11)
Once points 1 to 7 have been completed, save your pseudocode in a file named after the last name of one of the team members followed by the word "version_1", for example:
Smith_version_1
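Similarly, a minimal Python sketch of the menu-driven calculator requested in this exam is shown below. It is illustrative only: the exam requires PSeInt pseudocode named "basic operations", the option is matched here against the menu letters (a)–(d), and the prompt wording, output phrasing, and handling of an invalid option are assumptions of this sketch.

```python
# Illustrative Python rendering of the requested calculator (the exam itself
# asks for PSeInt pseudocode named "basic operations").
print("(a) Sum\n(b) Subtraction\n(c) Multiplication\n(d) Division")
option = input("Which option do you want to perform? ")
num_1 = float(input("Enter the value of the first number: "))
num_2 = float(input("Enter the value of the second number: "))

if option == "a":
    result = num_1 + num_2
elif option == "b":
    result = num_1 - num_2
elif option == "c":
    result = num_1 * num_2
elif option == "d":
    result = num_1 / num_2            # the statement does not ask for a zero check
else:
    raise SystemExit("Invalid option")

print(f"The result of the chosen operation between {num_1} and {num_2} is equal to {result}.")

# Point 9: increase the result by 10% if it exceeds $100.00 pesos, by 20% otherwise,
# store it back in result, and report the increase to the user.
increase = result * 0.10 if result > 100 else result * 0.20
result = result + increase
print(f"An increase of {increase:.2f} was applied; result is now {result:.2f}.")
```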

References

  1. Hussin, A.A. Education 4.0 made simple: Ideas for teaching. Int. J. Educ. Lit. Stud. 2018, 6, 92–98. [Google Scholar]
  2. Fitsilis, P. Navigating the Skills Revolution: The Essential Role of Competence Frameworks. Qeios 2024. [Google Scholar] [CrossRef]
  3. Le Deist, F.D.; Winterton, J. What is competence? Hum. Resour. Dev. Int. 2005, 8, 27–46. [Google Scholar] [CrossRef]
  4. Sultana, R.G. Competence and competence frameworks in career guidance: Complex and contested concepts. Int. J. Educ. Vocat. Guid. 2009, 9, 15–30. [Google Scholar] [CrossRef]
  5. Miranda, J.; Navarrete, C.; Noguez, J.; Molina-Espinosa, J.M.; Ramírez-Montoya, M.S.; Navarro-Tuch, S.A.; Bustamante-Bello, M.; Rosas-Fernández, J.B.; Molina, A. The core components of education 4.0 in higher education: Three case studies in engineering education. Comput. Electr. Eng. 2021, 93, 107278. [Google Scholar] [CrossRef]
  6. Alvarez-Cedillo, J.; Aguilar-Fernandez, M.; Sandoval-Gomez Jr, R.; Alvarez-Sanchez, T. Actions to Be Taken in Mexico towards Education 4.0 and Society 5.0. Int. J. Eval. Res. Educ. 2019, 8, 693–698. [Google Scholar] [CrossRef]
  7. Razo, A.E. La Reforma Integral de la Educación Media Superior en el aula: Política, evidencia y propuestas. Perfiles Educ. 2018, XL, 90–106. [Google Scholar] [CrossRef]
  8. Díaz-Barriga, A. El enfoque de competencias en la educación, ¿Una alternativa o un disfraz de cambio? Perfiles Educ. 2006, XXVIII, 7–36. [Google Scholar]
  9. Arellano-Gámez, L.A. La competencia es un saber, saber ser y un saber hacer. Rev. Od Los Andes 2009, 4, 3–5. [Google Scholar]
  10. Van Der Vleuten, C.P. The assessment of professional competence: Developments, research, and practical implications. Adv. Health Sci. Educ. 1996, 1, 41–67. [Google Scholar] [CrossRef]
  11. Ruiz, M. Instrumentos de Evaluación de Competencias. Universidad Tecnológica de Chile, 2007. Available online: https://www.academia.edu/8621606/INSTRUMENTOS_DE_EVALUACI%C3%93N_DE_COMPETENCIAS (accessed on 5 December 2023).
  12. Morales-López, S.; Hershberger del Arenal, R.; Acosta-Arreguín, E. Evaluación por competencias: ¿cómo se hace? Rev. De La Fac. De Med. De La UNAM 2020, 63, 46–56. [Google Scholar] [CrossRef]
  13. Programa para la Evaluación Internacional de Alumnos. 2018. Available online: https://www.oecd.org/pisa/publications/PISA2018_CN_MEX_Spanish.pdf (accessed on 5 December 2023).
  14. Romero-Escobar, H.M. Modelo Para Alinear Las Competencias del Docente, del Alumno y Las Requeridas Por la Instrucción, y Mejorar la Calidad de la Educación en el Nivel de Educación Media Superior. Ph.D. Thesis, Universidad Iberoamericana, Tijuana, México, 2018. [Google Scholar]
  15. Leal-Ramírez, C.; Echavarría-Heras, H.A.; Romero-Escobar, H.M. A Mamdani Type-Fuzzy Inference-Alignment Matrix Method Aimed to Evaluate Competencies Acquired by Students Enrolling at the Mexican Higher Middle Education System I: Formulation and Explanation Based on Simulation, and A Real but Incomplete Data Set. Comput. Y Sist. 2022, 6, 1–31. [Google Scholar] [CrossRef]
  16. Del Pozo-Flórez, J.A. Competencias profesionales. Herramientas para a evaluación: El portafolios, la rúbrica y las pruebas situacionales. Rev. Española De Pedagog. 2012, 71, 375–377. [Google Scholar]
  17. Rahmawati, Y. Assessing cross-cultural understanding and intercultural communication skills in EFL classrooms: Challenges, best practices, and perceptions. NextGen Educ. Rev. J. 2023, 1, 22–32. [Google Scholar] [CrossRef]
  18. Lam, R. Teacher Learning of portafolio Assessment Practices: Testimonies of two writing teachers. In Teacher Learning with Classroom Assessment; Jiang, H., Hill, M., Eds.; Springer: Singapore, 2018. [Google Scholar]
  19. Alsina, J. Rúbricas para la Evaluación de Competencias. In España; Cuadernos de Docencia Universitaria, Ediciones Octaedro: Barcelona, Spain, 2013. [Google Scholar]
  20. Cano, E. Las rúbricas como instrumento de evaluación de competencias en educación superior: ¿uso o abuso? Profesorado. Rev. De Currículum Y Form. Profr. 2015, 19, 265–280. [Google Scholar]
  21. Jonsson, A.; Svingby, G. The use of scoring rubrics: Reliability, validity and educational consequences. Educ. Res. Rev. 2007, 2, 130–144. [Google Scholar] [CrossRef]
  22. Zadeh, L.A. Fuzzy logic = computing with words. IEEE Trans. Fuzzy. Syst. 1996, 4, 103–111. [Google Scholar] [CrossRef]
  23. Zadeh, L.A. Fuzzy logic and the calculus of fuzzy if-then rules. In Proceedings of the 22nd International Symposium on Multiple-Valued Logic, Los Alamitos, CA, USA, 19–22 May 2004; IEEE Computer Society Press: Washington, DC, USA, 1992; p. 480. [Google Scholar]
  24. Mehmet Kaya, A.A. Determination of fuzzy logic membership functions using genetic algorithms. Fuzzy Sets Syst. 2001, 118, 297–306. [Google Scholar]
  25. Sadollah, A. Introductory chapter: Which membership function is appropriate in fuzzy system? In Fuzzy Logic Based in Optimization Methods and Control Systems and Its Applications; IntechOpen: London, UK, 2018. [Google Scholar]
  26. Ibrahim, A.H.; Claus, G.S. Fuzzy Systems in Education: A More Reliable System for Student Evaluation. In Fuzzy Systems; Azar, A.T., Ed.; IntechOpen: Rijeka, Croatia, 2010. [Google Scholar]
  27. Lee, E.T.; Zadeh, L.A. Note on fuzzy languages. Inf. Sci. 1969, 1, 421–434. [Google Scholar] [CrossRef]
  28. Klir, G.J.; Bo, Y. Fuzzy Sets and Fuzzy Logic: Theory and Applications; Prentice Hall: Hoboken, NJ, USA, 1995. [Google Scholar]
  29. Reddy, Y.M.; Andrade, H. A review of rubric use in higher education. Assess. Eval. High. Educ. 2010, 35, 435–448. [Google Scholar] [CrossRef]
  30. Panadero, E.; Jonsson, A. The use of scoring rubrics for formative assessment purposes revisited: A review. Educ. Res. Rev. 2013, 9, 129–144. [Google Scholar] [CrossRef]
  31. Oakley, G.; Pegrum, M.; Johnston, S. Introducing e-portfolios to pre-service teachers as tools for reflection and growth: Lessons learnt. Asia Pac. J. Teach. Educ. 2014, 42, 36–50. [Google Scholar] [CrossRef]
  32. Ambrose, S.A.; Bridges, M.W.; DiPietro, M.; Lovett, M.C.; Norman, M.K. How Learning Works: Seven Research-Based Principles for Smart Teaching; John Wiley & Sons: Hoboken, NJ, USA, 2010. [Google Scholar]
  33. Luoma, S. Assessing Speaking; Cambridge University Press: Cambridge, UK, 2019. [Google Scholar]
  34. Buckley, C.A. Assessment of classroom-based, authentic research projects. Biochem. Mol. Biol. Educ. 2016, 44, 12–19. [Google Scholar]
  35. Li, H.; Xiong, Y.; Hunter, C.V.; Guo, X.; Tywoniw, R. Does peer assessment promote student learning? A meta-analysis. Assess. Eval. High. Educ. 2019, 45, 193–211. [Google Scholar] [CrossRef]
  36. Kim, S.; Phillips, W.R. Using case-based discussions to teach ethics in medicine. Fam. Med. 2014, 46, 712–716. [Google Scholar]
  37. Norcini, J.; McKinley, D.W. Assessment methods in medical education. Teach. Teach. Educ. 2017, 23, 239–250. [Google Scholar] [CrossRef]
  38. Andrade, H.L.; Du, Y. Student responses to criteria-referenced self-assessment. Assess. Eval. High. Educ. 2017, 42, 472–484. [Google Scholar] [CrossRef]
  39. Cleland, J.; Roberts, R.; Kitto, S.; Strand, P.; Johnston, P. Students’ clinical experiences in the undergraduate phase of medical education: A qualitative study of the medical student voice. Med. Educ. 2018, 52, 739–750. [Google Scholar]
  40. Brookhart, S.M.; Chen, F. The quality and effectiveness of descriptive rubrics. Educ. Rev. 2015, 67, 343–368. [Google Scholar] [CrossRef]
  41. Fullan, M. The New Meaning of Educational Change, 4th ed.; Teachers College Press: New York, NY, USA, 2007. [Google Scholar]
  42. Tondeur, J.; Petko, D.; Christensen, R.; Drossel, K.; Starkey, L. Quality criteria for conceptual technology integration models in education: Bridging research and practice. Educ. Technol. Res. Dev. 2021, 69, 2187–2208. [Google Scholar] [CrossRef]
  43. UNESCO. Informe de Seguimiento de la Educación en el Mundo 2019: Migración, Desplazamiento y Educación: Construyendo Puentes, no Muros; UNESCO: Paris, France, 2019. [Google Scholar]
  44. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef]
  45. Kuncheva, L.I. Fuzzy Classifier Design; Springer: Berlin/Heidelberg, Germany, 2000. [Google Scholar]
  46. Negnevitsky, M. Artificial Intelligence: A Guide to Intelligent Systems, 2nd ed.; Pearson Education: London, UK, 2005. [Google Scholar]
  47. Hwang, G.J.; Li, K.C.; Lam, S.L. Applying a fuzzy logic approach to model the process of problem-solving in a project-based learning environment. Comput. Educ. 2018, 115, 202–212. [Google Scholar]
  48. Dounis, A.I.; Caraiscos, C. Advanced control systems engineering for energy and comfort management in a building environment—A review. Renew. Sustain. Energy Rev. 2009, 13, 1246–1261. [Google Scholar] [CrossRef]
Figure 1. Structure of Integrated Instruction. The symbols c, p, and a represent parameters that correspond to the values of the total sum of the indicators located in the "Knowledge", "Procedure", and "Attitude" columns, respectively. The x symbol represents indicators that are not included in the learning activity undertaken by the student. The sum of all the "✓" symbols found in the column Id_c is characterized by x_{1,int}. Similarly, the sum of all the "✓" symbols found in the column Id_p is characterized by x_{2,int}, and the sum of all the "✓" symbols found in the column Id_a is characterized by x_{3,int}.
Figure 2. Integrated Instruction harmonized with the learning activity of Appendix A. The Knowledge component comprises c = 3 indicators, the Procedure component p = 24 indicators, and the Attitude component a = 4 indicators.
Figure 3. Fuzzy sets tied to the Integrated Instruction presented in Figure 2: (a) Antecedent associated with the term “Knowledge” defined by Equation (1); (b) Antecedent associated with the term “Procedure” defined by Equation (2); (c) Antecedent associated with the term “Attitude” defined by Equation (3); (d) Consequent associated with the term “Learning” defined by Equation (12).
Figure 4. Integrated Instruction harmonized with the learning activity of Appendix B. The Knowledge component comprises c = 7 indicators, the Procedure component p = 7 indicators, and the Attitude component a = 4 indicators.
Figure 5. Fuzzy sets tied to the Integrated Instruction presented in Figure 4: (a) Antecedent associated with the term “Knowledge” defined by Equation (1); (b) Antecedent associated with the term “Procedure” defined by Equation (2); (c) Antecedent associated with the term “Attitude” defined by Equation (3); (d) Consequent associated with the term “Learning” defined by Equation (12).
Figure 6. Integrated Instruction harmonized with the learning activity of Appendix C. The Knowledge component comprises c = 5 indicators, the Procedure component p = 11 indicators, and the Attitude component a = 4 indicators.
Figure 7. Fuzzy sets tied to the Integrated Instruction presented in Figure 6: (a) Antecedent associated with the term “Knowledge” defined by Equation (1); (b) Antecedent associated with the term “Procedure” defined by Equation (2); (c) Antecedent associated with the term “Attitude” defined by Equation (3); (d) Consequent associated with the term “Learning” defined by Equation (12).
Table 1. Values of the parameters of the membership functions (MF) that characterize the fuzzy sets C, P, A, and D.
| Set | MF | a | b | c | d | m | σ | Equation |
|---|---|---|---|---|---|---|---|---|
| C | μ_P(x_1) | a_NE = 0.5 | b_NE = 4.5 | - | - | - | - | (4) |
| C | μ_S(x_1) | - | - | - | - | m_S = 1 | σ_S = 5 | (5) |
| C | μ_M(x_1) | a_NE = 5.5 | b_NE = 9.5 | - | - | m_NE = (a_NE + b_NE)/2 | - | (6) |
| P | μ_NF(x_2) | a_NF = 0 | b_NF = 3.6 | - | - | - | - | (7) |
| P | μ_I(x_2) | - | - | - | - | m_I = 1.274 | σ_I = 5 | (8) |
| P | μ_F(x_2) | a_F = 6.4 | b_F = 10 | - | - | m_F = (a_F + b_F)/2 | - | (9) |
| A | μ_NE(x_3) | a_NE = 0 | b_NE = 0 | c_NE = 1 | d_NE = 8.796 | - | - | (10) |
| A | μ_ES(x_3) | a_ES = 1.2 | b_ES = 8.9 | c_NE = 10 | d_NE = 10 | - | - | (11) |
| D | μ_T(x_4) | a_T = 1 | b_T = 1 | - | - | m_T = 0 | - | (13) |
| D | μ_MM(x_4) | a_MM = 0.5 | b_MM = 2.5 | - | - | m_MM = 1.5 | - | (14) |
| D | μ_MA(x_4) | a_MA = 2 | b_MA = 4 | - | - | m_MA = 3 | - | (15) |
| D | μ_R(x_4) | a_R = 3.5 | b_R = 5.5 | - | - | m_R = 4.5 | - | (16) |
| D | μ_B(x_4) | a_B = 5 | b_B = 7 | - | - | m_B = 6 | - | (17) |
| D | μ_MB(x_4) | a_MB = 6.5 | b_MB = 8.5 | - | - | m_MB = 7.5 | - | (18) |
| D | μ_E(x_4) | a_E = 8 | b_E = 10 | - | - | m_E = 9 | - | (19) |
| D | μ_EX(x_4) | a_EX = 9.5 | b_EX = 10.5 | - | - | m_EX = 10 | - | (20) |
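As a reading aid, the following Python fragment encodes the consequent rows of Table 1 for D ("Learning"). The symmetric triangular shape is an assumption: it is consistent with m = (a + b)/2 holding in rows (14)–(20), but the authoritative definitions are Equations (13)–(20). Because the printed a_T = 1 does not satisfy that midpoint relation for m_T = 0, the sketch assumes a_T = -1.

```python
import numpy as np

# Parameters (a, b, m) of the consequent terms of D ("Learning") taken from Table 1.
# The triangular shape and a_T = -1 are assumptions (see the note above); the
# authoritative membership functions are Equations (13)-(20) of the paper.
D_TERMS = {
    "T":  (-1.0, 1.0, 0.0),
    "MM": (0.5, 2.5, 1.5),
    "MA": (2.0, 4.0, 3.0),
    "R":  (3.5, 5.5, 4.5),
    "B":  (5.0, 7.0, 6.0),
    "MB": (6.5, 8.5, 7.5),
    "E":  (8.0, 10.0, 9.0),
    "EX": (9.5, 10.5, 10.0),
}

def triangular(x, a, b, m):
    """Symmetric triangular membership function with feet a, b and peak m."""
    return float(np.maximum(np.minimum((x - a) / (m - a), (b - x) / (b - m)), 0.0))

# Example: membership degrees of the defuzzified score y = 9.59 (student 3 in Table 2).
for label, (a, b, m) in D_TERMS.items():
    print(f"mu_{label}(9.59) = {triangular(9.59, a, b, m):.2f}")
```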
Table 2. Input and output values of the dynamic fuzzy inference system. The evaluated learning activity is the one presented in Appendix A, with the corresponding Integrated Instruction shown in Figure 2.
| Student | x_{1,int} | x_{2,int} | x_{3,int} | y |
|---|---|---|---|---|
| 1 | 3 | 24 | 4 | 10 |
| 2 | 3 | 14 | 2 | 6.74 |
| 3 | 3 | 24 | 3 | 9.59 |
| 4 | 1 | 19 | 3 | 6.93 |
| 5 | 3 | 22 | 3 | 9.54 |
| 6 | 1 | 18 | 1 | 6.12 |
| 7 | 3 | 19 | 2 | 8.7 |
| 8 | 3 | 16 | 1 | 6.5 |
| 9 | 2 | 17 | 3 | 7.34 |
| 10 | 3 | 24 | 3 | 9.59 |
| 11 | 3 | 17 | 1 | 7.07 |
| 12 | 1 | 22 | 3 | 7.7 |
| 13 | 2 | 23 | 4 | 9.18 |
| 14 | 3 | 22 | 2 | 9.26 |
| 15 | 3 | 12 | 0 | 6.01 |
| 16 | 3 | 18 | 2 | 7.89 |
| 17 | 3 | 22 | 3 | 9.54 |
| 18 | 3 | 19 | 2 | 8.7 |
Table 3. Input and output values of the dynamic fuzzy inference system. The evaluated learning activity is the one presented in Appendix B, with the corresponding Integrated Instruction shown in Figure 4.
| Team | Students | x_{1,int} | x_{2,int} | x_{3,int} | y |
|---|---|---|---|---|---|
| T1 | 1, 2 | 6 | 5 | 3 | 7.40 |
| T2 | 3, 4 | 7 | 5 | 4 | 7.89 |
| T3 | 5, 6 | 7 | 5 | 3 | 7.41 |
| T4 | 7, 8 | 7 | 7 | 4 | 10 |
| T5 | 9, 10 | 7 | 7 | 3 | 9.59 |
Table 4. Input and output values of the dynamic fuzzy inference system. The evaluated learning activity is the one presented in Appendix C, with the corresponding Integrated Instruction shown in Figure 6.
| Team | Students | x_{1,int} | x_{2,int} | x_{3,int} | y |
|---|---|---|---|---|---|
| I1 | 11, 12 | 5 | 11 | 3 | 9.59 |
| I2 | 13, 14 | 2 | 6 | 2 | 6.71 |
| I3 | 15 | 4 | 6 | 2 | 6.71 |
| I3 | 16 | 5 | 11 | 4 | 10 |
| I4 | 17, 18 | 5 | 11 | 3 | 9.59 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
