Article

Investigating Students’ Perception with an Online Dynamic Earth Course during COVID-19: A Quantitative Inquiry

by Md Iftekhar Alam 1,*, Jian Su 2, Hongwei Yang 3 and Jacob Benner 4

1 Geological Sciences Department, Ohio University, Athens, OH 45701, USA
2 Office of Digital Learning, University of Tennessee at Knoxville, Knoxville, TN 37902, USA
3 School of Education, University of West Florida, Pensacola, FL 32514, USA
4 Department of Earth, Environmental and Planetary Sciences, University of Tennessee at Knoxville, Knoxville, TN 37996, USA
* Author to whom correspondence should be addressed.
Submission received: 6 March 2024 / Revised: 10 May 2024 / Accepted: 17 May 2024 / Published: 28 May 2024

Abstract:
This study investigated Earth science students’ experiences with online education during the COVID-19 pandemic at the University of Tennessee, Knoxville, in the US. We used an existing survey from the online education literature, the Online Learning Environment Survey (OLES), which consists of three instruments: (a) community of inquiry (CoI), (b) Institutional Support (IS), and (c) the Self-Directed Online Learning Scale (SDOLS). Ordered from highest to lowest, the survey rating subscales were autonomous learning, asynchronous online learning, institutional support, teaching presence, social presence, and cognitive presence, indicating interest in the online learning environment. Among all of the subscales, the asynchronous online category was rated the highest by the students. The data were then analyzed using Rasch modeling. According to the Rasch analyses, asynchronous online teaching represents the most favorable course delivery technique for geoscience education. Overall, the survey data show a general interest in online delivery and in the effectiveness of the modality, indicating the potential to evolve into an online Earth science program. Finally, possible future extensions of the research are also discussed (e.g., extending the research to other introductory online geoscience courses).

1. Introduction

One of the key attributes of an online learning program is bi-directional communication between learners and instructors, which is primarily dependent on the delivery method and overall accessibility [1,2,3]. The online format means that course-related activities (e.g., lecturing by the instructor) may not require either a physical classroom or a designated class meeting time (i.e., delivery can be asynchronous). While the transition to online education has been witnessed in many disciplines such as physics, chemistry, and mathematics [4,5], the online delivery of Earth science courses is still not well established, except for several online programs at a small number of universities (e.g., the University of Florida, Texas A&M University, Western Governors University, Ohio University).
Asynchronous online courses feature the flexibility to access teaching and learning without the simultaneous presence of students and instructors in a physical classroom. This course delivery mode can be very effective during adverse conditions, such as the recent COVID-19 global pandemic. Although online course delivery is known to be effective, when it comes to geoscience courses, the lack of an adequate field support system could pose challenges to the successful delivery of such courses [6]. On the other hand, technological advances in the last decade have had a significant impact on online education [7,8], so it may be reasonable to anticipate that the online delivery of geoscience courses could also benefit from such advances, particularly introductory-level courses (i.e., physical geology and historical geology).
This study presents a novel analysis of students’ feedback survey data from an online Dynamic Earth course developed before and during the COVID-19 pandemic. The objective of the study is to evaluate student responses to an online geoscience course using the OLES and to analyze the implications for further course development. The study serves two purposes: (1) to understand students’ lived experiences regarding the online delivery of Earth science courses (e.g., course accessibility), and (2) to gain insights into the development of other online Earth science courses based on the findings from this study.
In this context, our goal is to address the following research questions (RQs):
RQ1: How important did students perceive various community of inquiry factors to be in terms of contributing to their online learning in the introductory-level Earth science course?
RQ2: How important did students perceive various self-directed learning factors to be in terms of contributing to their online learning in the introductory-level Earth science course?
The organization of the paper is as follows. First, we present a literature review that establishes the context and relevance of the study. Then, we describe the methods, including the research site, survey instrument, data collection, and data analysis. The Methodology section is followed by the results, discussion, and conclusions.

2. Literature Review

Existing studies have examined and compared learning outcomes between online and traditional classroom-based courses in various disciplines, including geology. Burger (2015) [9] described an effective online Earth science course with a distinct delivery method and directly compared student performance and experience, including perceived course difficulty, learning effectiveness, student engagement, satisfaction and retention, and student comfort with technology. The study also concluded that the rarity of online Earth science classes may be attributed to employers’ and professional geologists’ negative perceptions of the online delivery approach, which cannot provide in-person field experiences. Ni (2013) [10] suggested that the online teaching mode can be more successful and accessible than traditional classroom teaching, primarily because of the flexibility of course delivery and the independent mode of instruction. Triantafyllou and Timcenko (2016) [11] reported successful online student learning in mathematics courses delivered through media technology, including screencasts, online readings, quizzes, and lecture notes.
A typical learning goal in geoscience education is to ensure that students develop three-dimensional perception and interpretation skills. To develop a strong conceptual foundation, it is important for students to manipulate and reconcile 3D representations of structures that are not fully elucidated by 2D textbook figures. The field component of introductory geoscience courses, therefore, is imperative because it may assist students in making inferences and reconciling classroom learning [12,13,14]. Meeting this learning goal has been a long-standing limitation for online geoscience courses, as the virtual medium was long considered not to be a viable alternative to the field component. With recent advancements in computer graphics and the increasing availability of virtual field trips and high-resolution 3D images, technology-enabled Earth science education is no longer hindered by those limitations.
User experience is a crucial metric of the success of an online class [15]. Several obstacles could exist that are not directly related to the subject matter but rather to the course delivery interface and the clarity of the course structure. Challenges may include (1) demonstrating geologic concepts and examples without in-person interaction, (2) students’ backgrounds in technology, and (3) students’ adaptability to the online environment [16,17,18]. Importantly, different students in the same course may have vastly different levels of background and experience with technology, which could lead to varied responses [19].
Key geologic topics, including earthquakes, plate tectonics, groundwater movement, contaminants, and mountain-building processes, can be better perceived through models [20,21,22], animations, and simulations than through still photographs in textbooks, as these processes are dynamic and temporally complex. The modeling of such concepts is best performed using a digital medium. For example, there are several obstacles to comprehending the concept of plate tectonics. First, it requires a general factual understanding of Earth’s composition. Second, unobservable multiscale processes that vary in time and space (e.g., at great depth in subduction zones over tens of millions of years) pose further obstacles to comprehension [23,24,25,26].
Apart from the demonstration of key geologic concepts, efficient course delivery on an online platform can be an added challenge. For a better understanding of the online learning environment in the field of geoscience, we applied an established tool comprising community of inquiry (CoI), Institutional Support (IS), and Self-Directed Learning (SDL). CoI frames online learning in terms of three elements: social presence, teaching presence, and cognitive presence. It has been reported that social interaction alone may not be as effective as structured online learning modes for critical thinking [27]. Online learning experiences have the potential to support the deepest levels of reflective thought and learning. The integration of cognitive, social, and teaching elements expands learning opportunities beyond simple social exchanges. The CoI framework applies meaningful inquiry supported by teaching, social, and cognitive elements, which are essential for promoting deep and durable learning in online learning environments [28,29]. Students benefit from the cognitive processes that online instructors encourage through collaborative learning activities that sustain reflective discourse [30]. On the other hand, SDL has been a primary theoretical construct in adult education, and research on it has evolved over time [31]. Prior work suggests that SDL is both a process and a personal attribute [32]. Applications of the OLES in online geoscience education remain largely unexplored; the potential of the online geoscience mode can be assessed through user feedback on social presence, teaching presence, cognitive presence, and self-directed learning.

3. Methodology

The study was approved by the Institutional Review Board (IRB) of the University of Tennessee at Knoxville (UTK), where the research was conducted. With IRB approval, UTK students enrolled in the Dynamic Earth Online classes in summer 2019, summer 2020, and fall 2020 completed the Online Learning Environment Survey electronically [33,34,35].

3.1. Research Site

UTK is the flagship campus of the University of Tennessee system. As per Data USA (2020) [36], approximately 19,867 students were enrolled in AY 2020, comprising white/Caucasians (77%), black/African Americans (5.55%), Hispanics/Latino Americans (4.83%), two or more races (3.8%), Asians (3.63%), American Indian or Alaska Native (0.16%), and Native Hawaiians or other Pacific Islanders (0.0556%). The campus offers geoscience degrees at both undergraduate and graduate levels.
Dynamic Earth (GEOL 101, or physical geology) is a heavily enrolled, multi-section introductory course that includes geology majors, minors (2–5%, internal data), and non-major students. Motivated primarily by the lack of an entirely online geology course at UTK, we developed Dynamic Earth Online. Initially delivered as a 5-week accelerated summer course, the course was later expanded to 15 weeks and delivered in both fall and spring semesters. The course enrollment primarily consists of non-STEM majors seeking to fulfill a natural science general education requirement.

3.2. Survey Instrument

Establishing an effective e-learning system is a complex process. The design of such a system often depends on various factors, including infrastructure, content and assessment, the quality of learner support systems, the assumptions that learners and educators make about the learning experience itself, and peer support networks for learners and educators [37]. Obtaining students’ feedback about the learning environment is an essential step in identifying areas where improvements could be made in the future. Based on this rationale, we applied the OLES instrument in this study.
The OLES consists of three separate instruments: (a) community of inquiry (CoI), (b) Institutional Support (IS), and (c) the Self-Directed Online Learning Scale (SDOLS). As outlined in Table 1, CoI is composed of three subscales representing three subdimensions: (a) teaching presence (TP), (b) social presence (SP), and (c) cognitive presence (CP) [38,39]. IS consists of just one item. The SDOLS has two subscales representing two subdimensions: (a) autonomous learning (AUL) and (b) asynchronous online learning (ASL) [40]. The authors slightly modified the survey items in the OLES to make them more relevant to the current study. The modified survey consists of 46 closed-ended items (Appendix A). All OLES items were measured on a five-point Likert scale: 1 for strongly disagree (SD), 2 for disagree (D), 3 for neutral (N), 4 for agree (A), and 5 for strongly agree (SA).
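For readers who want to work with the raw responses, the item-to-subscale mapping implied by the appendix tables can be expressed compactly. The minimal Python sketch below is illustrative only; the dictionary and scoring helper are ours, not part of the OLES, and the item groupings follow the numbering in Tables A1–A6.

# Item groupings follow the numbering in Tables A1-A6 (Appendix A).
OLES_SUBSCALES = {
    "TP":  range(1, 14),   # Teaching Presence, items 1-13
    "SP":  range(14, 23),  # Social Presence, items 14-22
    "CP":  range(23, 35),  # Cognitive Presence, items 23-34
    "IS":  range(35, 36),  # Institutional Support, item 35
    "AUL": range(36, 42),  # Autonomous Learning, items 36-41
    "ASL": range(42, 47),  # Asynchronous Online Learning, items 42-46
}

def subscale_scores(responses):
    """responses: dict mapping item number (1-46) to a Likert rating (1-5)."""
    return {name: sum(responses[i] for i in items if i in responses)
            for name, items in OLES_SUBSCALES.items()}

# Example: a hypothetical respondent who rates every item "agree" (4).
print(subscale_scores({i: 4 for i in range(1, 47)}))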

3.3. Data Collection

The participants of the survey were enrolled in the Dynamic Earth Online class over the course of multiple semesters. After the last day of grade submission, the students were invited to take the survey anonymously using a link provided by the instructor through the QuestionPro survey software (versions 2019 and 2020), which was available from the Office of Information Technology (OIT) at UTK. The students had one week to complete the survey.
In total, 56 participants completed the survey with responses to all three instruments. The results included the views of non-geoscience majors from general education, architecture, construction engineering, and business programs. We believe that including feedback from non-geoscience majors may reduce the bias that could have been introduced had the survey participants been limited to geoscience majors only. At the same time, feedback from non-geoscience majors may contribute to efforts to engage students from other disciplines so that they are more likely to pursue geoscience career opportunities.

3.4. Data Analysis

The Rasch model is built on the assumption that the most parsimonious and effective predictor of a response is the relationship between an item’s difficulty and a person’s ability. The underlying logic is that respondents have a higher probability of correctly answering (i.e., endorsing) easier items and a lower probability of correctly answering more difficult ones [41]. Rasch analysis has been used successfully in multiple disciplines, including the human sciences, health [42], mathematics [43], and education [44]. Recently, the field of geoscience has also witnessed an increasing number of Rasch analysis applications [45].
In the study, we performed Rasch analysis based on the rating scale model (RSM) separately for each of the subscales of CoI and SDOLS using the Winsteps software (version 4.3.2). In particular, the Wright item–person map of each Rasch analysis addressed the perceptions of student respondents regarding various aspects of the online Earth science course (RQ1 and RQ2).
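For reference, the rating scale model can be written in its usual log-odds form, where θ_n is the ability of student n, δ_i is the difficulty (endorsability) of item i, and τ_k is the threshold, shared across all items of a subscale, for moving from rating category k − 1 to category k:

\ln\left(\frac{P_{nik}}{P_{ni(k-1)}}\right) = \theta_n - \delta_i - \tau_k, \qquad k = 1, \ldots, 4.

With five Likert categories there are four thresholds, and a larger value of θ_n − δ_i makes the higher categories (agree, strongly agree) more probable, which is the logic underlying the Wright item–person maps discussed below.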
Before looking at the maps, each analysis began with an investigation of the psychometric properties of all subscales, each of which had a Cronbach’s alpha statistic of 0.825 or higher. First, we examined each subscale to see whether the unidimensionality assumption of the Rasch analysis was satisfied using the Principal Component Analysis of Standardized Rasch Residuals (PCASRR). Second, we investigated how well the items of each subscale fitted the rating scale model based on the infit and outfit mean squares (MNSQ) statistics. Third, we examined the reliability and separation statistics of persons/students and items. Finally, we analyzed the Wright item–person map to estimate the item difficulty and student ability together on a logit scale. In each map, the mean of the item difficulty was arbitrarily set to zero. Items of greater difficulty were placed towards the top of the map, whereas those of lower difficulty were at the bottom. Similarly, students with greater ability were placed at the top of the map, whereas those with lower ability were at the bottom. For each subscale, the map provided the distribution of the items on the right side and that of the students on the left side.
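As a concrete illustration of the first of these checks, Cronbach’s alpha can be computed directly from a subscale’s raw Likert responses. The short Python sketch below uses a small hypothetical student-by-item matrix rather than the study data.

import numpy as np

def cronbach_alpha(X):
    """X: array of shape (n_students, n_items) holding Likert ratings (1-5)."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_variances = X.var(axis=0, ddof=1)      # variance of each item
    total_variance = X.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical ratings from four students on a three-item subscale.
X = [[4, 5, 4],
     [3, 3, 4],
     [5, 5, 5],
     [2, 3, 2]]
print(round(cronbach_alpha(X), 3))  # about 0.939 for this toy matrix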

4. Results

The data from all participants who responded to one or more of the five subscales and the IS scale are summarized using descriptive statistics in Figure 1, which consists of a panel of two subfigures. For each response category of each subscale and the IS scale, the subfigure on the upper level of the panel documents the total number/count of responses, whereas the subfigure on the lower level shows the corresponding proportion of responses. Figure 1 shows that participant responses were clearly dominated by favorable ratings of online learning. Based on Figure 1a, for each subscale/scale, most of the students responded in the strongly agree or agree categories. Specifically, as per Figure 1b, for the CoI subscales, around 60% of the participants used the two highest rating categories; for the IS scale and SDOLS subscales, the proportions were as high as 80%.
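Summaries of this kind can be produced from a long-format response table with one row per answered item. The following sketch is illustrative only; the column names and toy data are hypothetical, not the study dataset.

import pandas as pd

# One row per (student, item) answer; "subscale" is TP, SP, CP, IS, AUL, or ASL,
# and "response" is the Likert rating from 1 (strongly disagree) to 5 (strongly agree).
df = pd.DataFrame({
    "subscale": ["TP", "TP", "SP", "CP", "IS", "AUL", "ASL", "ASL"],
    "response": [5, 4, 3, 5, 4, 4, 5, 5],
})
counts = pd.crosstab(df["subscale"], df["response"])    # analogue of Figure 1a
proportions = counts.div(counts.sum(axis=1), axis=0)    # analogue of Figure 1b
print(counts)
print(proportions.round(2))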

4.1. Rasch Modeling Analyses

As per the PCASRR analyses, for each subscale, the fundamental Rasch assumption of unidimensionality was satisfied from a practical perspective because a single, primary Rasch dimension existed that explained over 50% of the raw variance (Table 2). Regarding the quality of items, among the three subscales of CoI, only three items in the SP and CP subscales had infit and/or outfit MNSQ statistics slightly above 1.50, thus supporting the quality of the vast majority of the CoI items as being adequate [46]. For the SDOLS instrument, a similar conclusion was drawn about the quality of items, except for SDL_Q01, which had a relatively high outfit MNSQ statistic of 2.19. Next, all point-measure correlations in each subscale were positive and (moderately) strong (at least 0.63), indicating a proper alignment between each item and the latent construct that the subscale was designed to measure [46]. Finally, regarding the reliability and separation statistics of each subscale, the CoI instrument subscales mostly had satisfactory scores on person separation, person reliability, and item reliability (CP may be a little low on item reliability), but their item separation scores were all lower than the threshold of 3.00. In contrast, the SDOLS performed more weakly than CoI on these four statistics. A major underlying reason is that the two subscales of SDOLS had fewer items than the subscales of CoI [46].
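The item screening just described reduces to a simple rule: flag any item whose infit or outfit MNSQ exceeds 1.50 or whose point-measure correlation is unexpectedly low. A minimal sketch using hypothetical fit statistics of the kind exported by Rasch software is shown below; only the SDL_Q01 outfit value of 2.19 comes from the study, and the other item labels and values are invented for illustration.

# Hypothetical per-item fit statistics; SDL_Q01's outfit MNSQ of 2.19 is from the study.
fit_stats = {
    "SP_Q16":  {"infit": 1.58, "outfit": 1.52, "pt_measure": 0.66},
    "SP_Q19":  {"infit": 0.85, "outfit": 0.81, "pt_measure": 0.78},
    "SDL_Q01": {"infit": 1.12, "outfit": 2.19, "pt_measure": 0.63},
}

MNSQ_CUTOFF = 1.50        # infit/outfit threshold used in the text
PT_MEASURE_FLOOR = 0.30   # items below this floor would be inspected (none were here)

flagged = [item for item, s in fit_stats.items()
           if s["infit"] > MNSQ_CUTOFF
           or s["outfit"] > MNSQ_CUTOFF
           or s["pt_measure"] < PT_MEASURE_FLOOR]
print(flagged)  # -> ['SP_Q16', 'SDL_Q01']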

4.1.1. Community of Inquiry

Teaching Presence: Based on the Wright map for the TP subscale of the CoI instrument in Figure 2, the students more easily endorsed Item 2, suggesting that the students easily agreed that the instructor clearly communicated important course goals. Next, Items 3, 4, 1, and 8 were the four items that students almost as easily endorsed, indicating that the students very much agreed that the instructor clearly communicated instructions on participation in course learning activities, related due dates and time frames, important course topics, and that the instructor helped keep the students on task in a way that helped them to learn. At the mean difficulty level, there was a group of three items: Items 11, 13, and 9, with the same level of difficulty estimates. This indicates that, on average, the students shared the view that the instructor helped them to focus discussions on relevant issues in a way that helped them to learn, that the instructor provided timely feedback, and that the instructor encouraged them to explore new concepts in the course. Slightly above the mean difficulty level were Items 10 and 6, where the students had some reservations regarding the instructor’s actions reinforcing the development of a sense of community among students and on the instructor being helpful in guiding the class towards understanding topics in a way that helped the students clarify their thinking. The students also found it relatively difficult to endorse Item 7 (The instructor helped keep the students engaged and participating in productive dialogues) and Item 5 (The instructor was helpful in identifying areas of agreement and disagreement on course topics that helped the students to learn). Finally, the hierarchy continued to advance upward, reaching the item that the students found most difficult to endorse, Item 12, indicating that the students hardly agreed that the instructor provided feedback that helped them understand their strengths and weaknesses.
Social Presence: Based on the Wright map for the SP subscale of the CoI instrument in Figure 3, the students felt comfortable interacting with other students (Item 19) and in participating in the course discussions (Item 18) and in disagreeing with other students without compromising a sense of trust (Item 20). Next, Items 17 and 21, with the same level of endorsability, were slightly more difficult for the students to agree on. Despite that, since the two items were still below the mean level of difficulty, it was concluded that the students felt comfortable conversing through the online medium and felt that their points of view were acknowledged by other students in the class. Next, the hierarchy continued to move upward, and right above the mean difficulty level was Item 15, indicating that the students had some reservations about being able to form distinct impressions of some of the other students in the class. Even more difficult to endorse were Items 14 and 22. From this, it is interpreted that the students tended to disagree that getting to know other students in the class gave them a sense of belonging in the course and that online discussions helped them to develop a sense of collaboration. Finally, the most difficult item to endorse was Item 16, suggesting the students hardly agreed that online communication was an excellent medium for social interaction.
Cognitive Presence: Based on the Wright map for the CP subscale of the CoI instrument in Figure 4, the map begins with a group of five items with an identical level of endorsability, which the students found it easy to agree on: (1) Item 24 (Some course activities piqued my curiosity), (2) Item 27 (Brainstorming and finding relevant information helped me resolve content-related questions), (3) Item 29 (Combining new information from a range of sources helped me answer questions raised in course activities), (4) Item 30 (Learning activities in this course helped me construct explanations/solutions), and (5) Item 31 (Reflecting on course content and discussions helped me understand fundamental concepts in this class). Next, at the mean difficulty level were two items: Items 32 and 33. This indicates that, on average, the students could describe ways to test and apply the knowledge from the course and that they developed solutions to course problems that could be used in practice. Next, the hierarchy continued to advance upward to reach Items 26, 34, and 28. The interpretation is that, compared with the previous items, the students found it difficult to share the views that they used various information sources to explore problems/issues presented in the course, that they could apply the knowledge created in this course to their work or other non-class related activities, and that those online discussions were valuable in helping them appreciate different perspectives. Finally, Items 23 and 25 were the two most difficult items, suggesting that students hardly agreed that the problems posed in this course increased their interest in course issues and that they felt motivated to explore content-related questions.

4.1.2. Self-Directed Online Learning Scale

Autonomous Learning: Based on the Wright map for the AUL subscale of the SDOLS instrument in Figure 5, the easiest item was Item 38, suggesting the students believed they were in control of their online learning. In contrast, the most difficult item on the subscale was Item 36, indicating that students hardly agreed that they were able to make such decisions about their online learning as selecting online project topics. Between these two items, there were four items, in ascending order of difficulty: Items 37, 40, 39, and 41. First, below the mean level of difficulty, the students had little difficulty agreeing that they worked online during times that they found convenient and that they approached online learning in their own way. Second, on average, the students tended to agree they played an important role in their online learning. Third, they had reservations about being able to remain motivated even though the instructor was not always online.
Asynchronous Online Learning: Based on the Wright map for the ASL subscale of the SDOLS instrument in Figure 6, the easiest item was Item 42, indicating that the students were easily able to access the discussion forum at places convenient for them. Next, they were almost equally easily able to read posted messages at times that were convenient for them (Item 43). Next, Item 46 was slightly more difficult, but still below the mean level of difficulty, suggesting that the students usually took notes while watching a video on the computer. Finally, Items 44 and 45 were the two most difficult items on the subscale, which indicates that the students had difficulty in relating the content of online course materials to the information they read from the books and in understanding course-related information when it was presented in video format.
Finally, the literature recommends that the sample size should be at least six times the number of items for stable results in a factor analysis, of which Rasch analysis is a special type for categorical data [47,48,49]. The sample size of 56 participants (relative to the number of items in each of the five subscales) failed to meet this criterion in two instances: (a) the 13 items in the TP subscale and (b) the 12 items in the CP subscale. To further strengthen the results presented above, an additional Monte Carlo simulation study was conducted based on Linacre (1994) [50] to determine whether the relatively small sample size in each of the two subscales had any negative impact on the estimates of the model parameters (i.e., the difficulty/endorsability of item statements and the ability of person/student participants). Based on the parameter estimates from the collected/real data, for each of the two subscales, 100 datasets were simulated to have the same number of participants (56) and items (13 in TP and 12 in CP) as the collected data, and each was analyzed under the RSM to produce 100 sets of difficulty and ability parameter estimates. Next, for each subscale, the 100 estimates of each model parameter were aggregated across all 100 simulated datasets to arrive at the mean and median estimates of the parameter, both of which were then compared with the estimate from the collected data. As shown in Figure 7, an overlay of three line charts documenting the three estimates of each parameter and their comparisons was created for the item difficulty and person ability parameters (TP and CP results are in the top and bottom panels of the figure, respectively). For virtually all parameters in each subscale, the median and mean estimates from the simulated data were very close to each other and also close to the estimate from the collected data, which further strengthens the results of the study.
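The data-generation step of this simulation design can be sketched as follows: hypothetical ability, difficulty, and threshold values on the logit scale are used to draw rating-scale responses from the RSM category probabilities. The re-estimation of each simulated dataset, which the study carried out in Winsteps, is not shown, and all parameter values below are illustrative rather than the study’s estimates.

import numpy as np

def simulate_rsm(theta, delta, tau, rng):
    """Draw RSM responses coded 0..m for persons (theta) and items (delta) with shared thresholds tau."""
    responses = np.empty((len(theta), len(delta)), dtype=int)
    for p, ability in enumerate(theta):
        for q, difficulty in enumerate(delta):
            # Log-probability of category k (up to a constant) is sum_{j<=k} (ability - difficulty - tau_j).
            logits = np.concatenate(([0.0], np.cumsum(ability - difficulty - tau)))
            probs = np.exp(logits - logits.max())
            probs /= probs.sum()
            responses[p, q] = rng.choice(len(tau) + 1, p=probs)
    return responses

rng = np.random.default_rng(2024)
theta = rng.normal(1.0, 1.0, size=56)    # 56 simulated students
delta = rng.normal(0.0, 0.5, size=13)    # 13 items, mimicking the TP subscale
tau = np.array([-1.5, -0.5, 0.5, 1.5])   # four thresholds for the five Likert categories
simulated = simulate_rsm(theta, delta, tau, rng)
print(simulated.shape)  # (56, 13); each entry is a simulated rating category 0-4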

5. Discussion

A long-standing debate in geoscience education concerns how the field component can be replaced or recreated through virtual course delivery. It may be true that field experience cannot be replaced entirely by a virtual or online modality. However, under the curricula of certain introductory courses, rigorous fieldwork may not be required to cover the concepts and laboratory studies effectively and efficiently and to meet the learning goals. These introductory Earth science courses may include Physical Geology and Historical Geology, for which supporting logistics and services, including rock, mineral, and fossil samples, can be arranged [7,51].
This study was conducted to investigate students’ perceptions of the delivery of online Earth science courses while overcoming the challenge of such courses traditionally being field based. We studied the outcomes of converting the in-person Dynamic Earth (i.e., Physical Geology) course to its online equivalent by evaluating students’ feedback. Our study assessed students’ experiences of the online class using three survey instruments. Two research questions were proposed and addressed in the study.

5.1. Addressing RQ1 and RQ2

Regarding RQ1, the results of the Rasch analysis demonstrate the students’ perceptions of their community of inquiry experience in this introductory geoscience course. On the TP subscale, while the students easily endorsed that the instructor clearly communicated important course goals, they found it harder to agree that the instructor provided feedback that helped them understand their strengths and weaknesses. On the SP subscale, the students indicated that they felt comfortable interacting with other students, participating in the course discussions, and disagreeing with other students without compromising a sense of trust. At the same time, they found it challenging to endorse that online communication was an excellent medium for social interaction. On the CP scale, the students easily agreed that some course activities piqued their curiosity, brainstorming and finding relevant information helped them resolve content-related questions, combining new information from various sources helped them answer questions from the course activities, learning activities in the course helped them construct explanations/solutions, and reflecting on the course content and discussions helped them to understand fundamental concepts. However, they hardly agreed that the problems posed in this course increased their interest in course issues and that they felt motivated to explore content-related questions. Finally, there were items measuring other aspects of community of inquiry whose level of difficulty (i.e., endorsability) fell in between the most endorsable and the least endorsable items outlined above.
Regarding RQ2, the results of the Rasch analysis demonstrated the students’ perceptions of their self-directed learning experience in this introductory-level geoscience course. On the AUL subscale, while they did not have any difficulty endorsing that they were in control of their online learning, they found it difficult to agree that they were able to make such decisions about their online learning such as selecting online project topics. On the ASL subscale, the students easily endorsed that they could access the discussion forum at places convenient for them, but found it challenging to relate the content of online course materials to the information they read from the books and understand course-related information when it was presented in video format. Finally, there were items measuring other aspects of self-directed learning whose level of difficulty (i.e., endorsability) fell in between the most endorsable and the least endorsable items outlined above.
Our Rasch analysis results suggest a strong alignment between students’ experiences and the course objectives. Overall, the Rasch analysis shows that the students had little difficulty agreeing with most of the survey items. In contrast, students found it more difficult to agree with items related to communication and presence. These include getting to know their peers in the class well, the lack of in-person presence of the instructor, participation in course discussions, and project work. These are common challenges for asynchronous online courses, where the physical presence of class participants and instructors may not be mandatory [52,53]. Because of this spatiotemporal flexibility, the asynchronous online mode offers the freedom to complete the course without being present in a physical classroom at a particular class time. We believe that the students’ responses regarding the lack of sufficient peer interaction could be due to their unfamiliarity with the learning style, which could be improved through professional development and more exposure to online learning. Finally, the students agreed that the support and facilities were adequate for the online course and wished to have more detailed feedback on their homework submissions.

5.2. Implications for Online Earth Science Programs

Our study suggests an overall interest in asynchronous online programs in the Earth science discipline. Although the survey was administered to students enrolled in one course (introductory physical geology), this viewpoint might extend to other Earth science courses as well. At the same time, the more critical ratings on the CoI subscales point to areas where improvements could be made. The course interface could also be made smoother, which is critical for online education. In addition, because of diverse student backgrounds (e.g., traditional vs. non-traditional), students may not possess the same level of technological skill required for the various tasks involved in online course structures.
The various factors of CoI are highly useful in explaining the effectiveness of online learning [27]. Our results related to the CoI factors highlight a significant influence of all three elements of social, cognitive, and teaching presence on online geoscience learning. One of the main factors that may have driven the positive feedback in this study is that students were able to express themselves without peer influence and/or confrontation. At the same time, there may be some limitations in initiating enough interaction, which should improve with familiarity with the mode, prior experience, and practice. The online learning platform is well known for its flexibility in accommodating learners’ busy schedules, as it offers individuals the opportunity to take charge of their own learning. SDL is a crucial attribute for a learner’s adjustment and success in online learning [54,55,56]. Interest, curiosity, and a desire for self-improvement were among the most important factors reflected in the student feedback in this study, where students could use various devices and locations to learn and meet their self-directed learning requirements at their own pace [57]. Therefore, learners show an increasing interest in online learning, which, in this case, was in geoscience.
Online Earth science programs used to be rare because of the field-based nature of the science. However, with technological advancements, laboratory materials and virtual or self-guided fieldwork are now available through different service providers and publishers for introductory geoscience classes [8]. Compared with traditional, face-to-face course delivery, the online option helps reach more diverse groups of students who may not be available for in-person classes [58] and thus contributes to student enrollment. Consequently, online course delivery has become an important revenue generator for colleges and universities [59] and an integral component of their long-term strategies [60].

5.3. Limitations and Future Directions

It is important to address the authors’ plans for potential future expansion of the study at this stage. Although the results show a favorable relationship between the online delivery of a geoscience course and user experience feedback, a wider demographic representation would allow the outcomes to be generalized more broadly. Our results are based on data reflecting the responses of students from one introductory-level course at one institution. The authors plan to address these two areas in a future extension of this study in two steps. First, we plan to conduct the survey over a longer period and with a broader demographic representation to obtain a larger sample size. Then, we will include other introductory core courses in the study to assess the potential of developing a comprehensive online Earth science program.

6. Conclusions

The current study investigated the endorsability of certain aspects of online learning environments and resources through Rasch modeling, provided substantial evidence of the demand for online Earth science courses, and presented supportive evidence for promoting online Earth science programs. Our results showed that, with advanced technology and laboratory services, introductory Earth science classes could be transformed into their online equivalents while also overcoming the challenge of geoscience education usually being field based. To implement the online delivery of introductory courses effectively and efficiently, both instructors and students need to be trained through appropriate professional development sessions, and a trial-and-error approach may need to be taken. As per our analysis, fewer items were found difficult to endorse than were found easy to endorse. The difficult items could reflect subject-specific challenges, in this case those of geoscience. The course can be updated as needed through continued studies in the field and modifications to the instrument over time. In the end, the study may also serve to promote similar research using surveys designed to measure geoscience students’ feedback regarding the online delivery of introductory Earth science courses.

Author Contributions

Conceptualization: M.I.A. and J.S.; methodology, software, and validation: M.I.A. and H.Y.; formal analysis and investigation: M.I.A. and H.Y.; resources: M.I.A., J.S. and H.Y.; writing—original draft: M.I.A. and H.Y.; writing—review and editing: J.S. and J.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially supported by the UTK Office of Information Technology (OIT) new Course Improvement with Technology Enhancement (CITE) Grant.

Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.

Acknowledgments

The authors would like to thank the University of Tennessee Course Improvement with Technology Enhancement (CITE) for the support for the study. We would also like to thank the Office of Information Technology (OIT) for providing the software and survey platform.

Conflicts of Interest

The authors declare that they have no known conflicts of interest that could have appeared to influence the work reported in this paper.

Appendix A

Table A1. Community of inquiry (Teaching Presence). N = 56 responses.
1. The instructor clearly communicated important course topics.
2. The instructor clearly communicated important course goals.
3. The instructor provided clear instructions on how to participate in course learning activities.
4. The instructor clearly communicated important due dates/time frames for learning activities.
5. The instructor was helpful in identifying areas of agreement and disagreement on course topics that helped me to learn.
6. The instructor was helpful in guiding the class towards understanding course topics in a way that helped me clarify my thinking.
7. The instructor helped to keep course participants engaged and participating in productive dialogue.
8. The instructor helped keep the course participants on task in a way that helped me to learn.
9. The instructor encouraged course participants to explore new concepts in this course.
10. Instructor actions reinforced the development of a sense of community among course participants.
11. The instructor helped to focus discussion on relevant issues in a way that helped me to learn.
12. The instructor provided feedback that helped me understand my strengths and weaknesses.
13. The instructor provided feedback in a timely fashion.
Table A2. Community of inquiry (Social Presence). N = 56 responses.
14. Getting to know other course participants gave me a sense of belonging in the course.
15. I was able to form distinct impressions of some course participants.
16. Online or web-based communication is an excellent medium for social interaction.
17. I felt comfortable conversing through the online medium.
18. I felt comfortable participating in the course discussions.
19. I felt comfortable interacting with other course participants.
20. I felt comfortable disagreeing with other course participants while still maintaining a sense of trust.
21. I felt that my point of view was acknowledged by other course participants.
22. Online discussions helped me to develop a sense of collaboration.
Table A3. Community of inquiry (Cognitive Presence). N = 56 responses.
23. Problems posed in this course increased my interest in course issues.
24. Some course activities piqued my curiosity.
25. I felt motivated to explore content-related questions.
26. I utilized a variety of information sources to explore problems/issues presented in this course.
27. Brainstorming and finding relevant information helped me resolve content-related questions.
28. Online discussions were valuable in helping me appreciate different perspectives.
29. Combining new information from a range of sources helped me answer questions raised in course activities.
30. Learning activities in this course helped me construct explanations/solutions.
31. Reflecting on course content and discussions helped me understand fundamental concepts in this class.
32. I can describe ways to test and apply the knowledge created in this course.
33. I have developed solutions to course problems that can be applied in practice.
34. I can apply the knowledge created in this course to my work or other non-class related activities.
Table A4. Institutional support. N = 56 responses.
35. I was aware of whom to contact for questions about programs and services at UT.
Table A5. Self-directed learning (autonomous learning). N = 56 responses.
36. I was able to make decisions about my online learning (e.g., selecting online project topics).
37. I worked online during times I found convenient.
38. I was in control of my online learning.
39. I played an important role in my online learning.
40. I approached online learning in my own way.
41. I was able to remain motivated even though the instructor was not online at all times.
Table A6. Self-directed learning (asynchronous online learning). N = 56 responses.
42. I was able to access the discussion forum at places convenient to me.
43. I was able to read posted messages at times that were convenient to me.
44. I was able to relate the content of online course materials to the information I have read in books.
45. I was able to understand course-related information when it was presented in video formats.
46. I was able to take notes while watching a video on the computer.

References

  1. Baggaley, J. Educational distancing. Distance Educ. 2020, 41, 582–588. [Google Scholar] [CrossRef]
  2. Macdonald, R.H.; Manduca, C.A.; Mogk, D.W.; Tewksbury, B.J. Teaching methods in undergraduate geoscience courses: Results of the 2004 on the cutting edge survey of U.S. faculty. J. Geosci. Educ. 2005, 53, 237–252. [Google Scholar] [CrossRef]
  3. Ramirez, S., II; Teten, S.; Mamo, M.; Speth, C.; Kettler, T.; Sindelar, M. Student perceptions and performance in a traditional, flipped classroom, and online introductory soil science course. J. Geosci. Educ. 2022, 70, 130–141. [Google Scholar] [CrossRef]
  4. Rosa, M.; Lerman, S. Researching online mathematics education: Opening a space for virtual learner identities. Educ. Stud. Math. 2011, 78, 69–90. [Google Scholar] [CrossRef]
  5. Sadaghiani, H.R. Using multimedia learning modules in a hybrid-online course in electricity and magnetism. Phys. Rev. Spec. Top. Phys. Educ. Res. 2011, 7, 010102. [Google Scholar] [CrossRef]
  6. Clary, R.M.; Wandersee, J.H. Virtual field exercises in the online classroom: Practicing science teachers’ perceptions of effectiveness, best practices, and implementation. J. Coll. Sci. Teach. 2010, 39, 50. [Google Scholar]
  7. Feig, A.D. An online introductory physical geology laboratory: From concept to outcome. Geosphere 2010, 6, 942–951. [Google Scholar] [CrossRef]
  8. Shinneman, A.L.C.; Shane, L.; Myrbo, A.E. Self-guided field trips allow flexibility in undergraduate student introductory field experiences. J. Geosci. Educ. 2020, 68, 371–379. [Google Scholar] [CrossRef]
  9. Burger, B. Course Delivery Methods for Effective Distance Science Education: A Case Study of Taking an Introductory Geology Class Online. In Interdisciplinary Approaches to Distance Teaching; Routledge: London, UK, 2015; pp. 104–117. [Google Scholar]
  10. Ni, A.Y. Comparing the Effectiveness of Classroom and Online Learning: Teaching Research Methods. J. Public Aff. Educ. 2013, 19, 199–215. [Google Scholar] [CrossRef]
  11. Triantafyllou, E.; Timcenko, O. Student Perception on Learning with Online Resources in a Flipped Mathematics Classroom. In CERME 9—Ninth Congress of the European Society for Research in Mathematics Education; Charles University in Prague, Faculty of Education: Prague, Czech Republic, 2016; pp. 2573–2579. [Google Scholar]
  12. Ault, C. The everyday perspective and exceedingly unobvious meaning. J. Geol. Educ. 1984, 32, 89–91. [Google Scholar] [CrossRef]
  13. Orion, N.; Ault, C.R. Learning earth sciences. In Handbook of Research on Science Education; Abell, S.K., Lederman, N.G., Eds.; Routledge: New York, NY, USA, 2007; pp. 653–687. [Google Scholar]
  14. King, C. Geoscience education: An overview. Stud. Sci. Educ. 2008, 44, 187–222. [Google Scholar] [CrossRef]
  15. Deslauriers, L.; McCarty, L.S.; Miller, K.; Callaghan, K.; Kestin, G. Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. USA 2019, 116, 19251–19257. [Google Scholar] [CrossRef]
  16. Brierton, S.; Wilson, E.; Kistler, M.; Flowers, J.; Jones, D. A comparison of higher order thinking skills demonstrated in synchronous and asynchronous online college discussion posts. NACTA J. 2016, 60, 14–21. [Google Scholar]
  17. Olt, P.A. Virtually there: Distant freshmen blended in classes through synchronous online education. Innov. High. Educ. 2018, 43, 381–395. [Google Scholar] [CrossRef]
  18. Peterson, A.T.; Beymer, P.N.; Putnam, R.T. Synchronous and asynchronous discussions: Effects on cooperation, belonging, and affect. Online Learn. 2018, 22, 7–25. [Google Scholar] [CrossRef]
  19. Krause, J.; Portolese, L.; Bonner, J. Student perceptions of the use of multimedia for online course communication. Online Learn. 2017, 21, 36–49. [Google Scholar] [CrossRef]
  20. Alam, M.I. Near-surface characterization using traveltime and full-waveform inversion with vertical and horizontal component seismic data. Interpretation 2018, 7, T141–T154. [Google Scholar] [CrossRef]
  21. Alam, M.I.; Uddin, A.; Hames, W.E. Late Paleozoic detrital history of eastern Gondwanaland: Petrofacies and detrital geochronology of Permo-Carboniferous intracratonic sequences of the northwest Bengal Basin. J. Sediment. Res. 2020, 90, 389–402. [Google Scholar] [CrossRef]
  22. Alam, M.I.; Katumwehe, A.; Atekwana, E. Geophysical characterization of a leachate plume from a former municipal solid waste disposal site: A case study on Norman landfill. Am. Assoc. Pet. Geol. Bull. 2022, 106, 1183–1195. [Google Scholar] [CrossRef]
  23. Clark, S.; Libarkin, J.; Kortz, K.; Jordan, S. Alternative conceptions of plate tectonics held by nonscience undergraduates. J. Geosci. Educ. 2011, 59, 251–262. [Google Scholar] [CrossRef]
  24. Dolphin, G.; Benoit, W. Students’ mental model development during historically contextualized inquiry: How the ‘tectonic plate’ metaphor impeded the process. Int. J. Sci. Educ. 2016, 38, 276–297. [Google Scholar] [CrossRef]
  25. Gobert, J.D.; Clement, J.C. Effects of student-generated diagrams versus student-generated summaries on conceptual understanding of causal and dynamic knowledge in plate tectonics. J. Res. Sci. Teach. 1999, 36, 39–53. [Google Scholar] [CrossRef]
  26. McDonald, S.; Bateman, K.; Gall, H.; Tanis-Ozcelik, A.; Webb, A.; Furman, T. Mapping the increasing sophistication of students’ understandings of plate tectonics: A learning progressions approach. J. Geosci. Educ. 2019, 67, 83–96. [Google Scholar] [CrossRef]
  27. Garrison, D.R.; Anderson, T.; Archer, W. Critical inquiry in a text-based environment: Computer conferencing in higher education. Internet High. Educ. 2000, 2, 87–105. [Google Scholar] [CrossRef]
  28. Rovai, A.P. Development of an instrument to measure classroom community. Internet High. Educ. 2002, 5, 197–211. [Google Scholar] [CrossRef]
  29. Shea, P.J. A study of students’ sense of learning community in an online learning environment. J. Asynchronous Learn. Netw. 2006, 10, 35–44. [Google Scholar] [CrossRef]
  30. McCombs, B.L.; Vakili, D. A learner-centered framework for e-learning. Teach. Coll. Rec. 2005, 107, 1582–1600. [Google Scholar] [CrossRef]
  31. Garrison, D.R. Self-Directed Learning: Toward a Comprehensive Model. Adult Educ. Q. 1997, 48, 18–33. [Google Scholar] [CrossRef]
  32. Song, L.; Hill, J.R. A conceptual model for understanding self-directed learning in online environments. J. Interact. Online Learn. 2007, 6, 27–42. [Google Scholar]
  33. Arbaugh, B.; Cleveland-Innes, M.; Diaz, S.; Garrison, D.R.; Ice, P.; Richardson, J.C. Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry Framework using a multi-institutional sample. Internet High. Educ. 2008, 11, 133–136. [Google Scholar] [CrossRef]
  34. Bangert, A.W. Building a validity argument for the community of inquiry survey instrument. Internet High. Educ. 2009, 12, 104–111. [Google Scholar] [CrossRef]
  35. Su, J. Successful Graduate Students’ Perceptions of Characteristics of Online Learning Environments. Unpublished. Doctoral Dissertation, The University of Tennessee, Knoxville, TN, USA, 2016. [Google Scholar]
  36. Data USA. 2020. Available online: https://datausa.io/profile/university/the-university-of-tennessee-knoxville (accessed on 24 September 2022).
  37. Macnish, J.; Trinidad, S.; Fisher, D.; Aldridge, J. The online learning environment of a technology-rich secondary college. In Proceedings of the Annual Meeting of the American Educational Research Association, Chicago, IL, USA, 21–25 April 2003. [Google Scholar]
  38. Caskurlu, S. Confirming the subdimensions of teaching, social, and cognitive presences: A construct validity study. Internet High. Educ. 2018, 39, 1–12. [Google Scholar] [CrossRef]
  39. Dempsey, P.R.; Zhang, J. Re-examining the construct validity and causal relationships of teaching, cognitive, and social presence in community of inquiry framework. Online Learn. 2019, 23, 62–79. [Google Scholar] [CrossRef]
  40. Yang, Y.; Su, J.; Bradley, K.D. Applying the Rasch Model to Evaluate the Self-Directed Online Learning Scale (SDOLS) for Graduate Students. Int. Rev. Res. Open Distrib. Learn. 2020, 21, 99–120. [Google Scholar] [CrossRef]
  41. Bond, T.G.; Fox, C.M. Applying the Rasch Model: Fundamental Measurement in the Human Sciences; Psychology Press: London, UK, 2013. [Google Scholar]
  42. Tennant, A.; McKenna, S.P.; Hagell, P. Application of Rasch analysis in the development of quality of life instruments. Value Health 2004, 7, S22–S26. [Google Scholar] [CrossRef]
  43. Bradley, K.; Sampson, S.; Royal, K. Applying the Rasch rating scale model to gain insights into students’ conceptualisation of quality mathematics instruction. Math. Educ. Res. J. 2006, 18, 11–26. [Google Scholar] [CrossRef]
  44. Liu, R.; Sun, L.; Yuan, J.; Bradley, K. Using the 2006 PISA questionnaire to evaluate the measure of educational resources: A Rasch measurement approach. Int. J. Assess. Tools Educ. 2017, 4, 211–222. [Google Scholar] [CrossRef]
  45. Libarkin, J.C.; Gold, A.U.; Harris, S.E.; McNeal, K.S.; Bowles, R.P. A new, valid measure of climate change understanding: Associations with risk perception. Clim. Change 2018, 150, 403–416. [Google Scholar] [CrossRef]
  46. Linacre, J.M. Winsteps® (Version 4.1.0) [Computer Software]. Winsteps.com: Beaverton, Oregon. Available online: http://www.winsteps.com/ (accessed on 10 July 2018).
  47. Bartholomew, D.J.; Steele, F.; Moustaki, I.; Galbraith, J. Analysis of Multivariate Social Science Data, 2nd ed.; Routledge: London, UK, 2008. [Google Scholar]
  48. Mundfrom, D.J.; Shaw, D.G.; Ke, T.L. Minimum sample size recommendations for conducting factor analyses. Int. J. Test. 2005, 5, 159–168. [Google Scholar] [CrossRef]
  49. Skrondal, A.; Rabe-Hesketh, S. Generalized Latent Variable Modeling: Multilevel, Longitudinal, and Structural Equation Models; Chapman Hall/CRC: Boca Raton, FL, USA, 2004. [Google Scholar]
  50. Linacre, J.M. Sample size and item calibration [or person measure] stability. Rasch Meas. Trans. 1994, 7, 328. [Google Scholar]
  51. Davi, N.; Pringle, P.; Fiondella, F.; Lockwood, J.; Oelkers, R. Online labs to introduce undergraduate students to scientific concepts and practices in tree-ring research. J. Geosci. Educ. 2022, 70, 73–84. [Google Scholar] [CrossRef]
  52. Houlden, S.; Veletsianos, G. A posthumanist critique of flexible online learning and its “anytime anyplace” claims. Br. J. Educ. Technol. 2019, 50, 1005–1018. [Google Scholar] [CrossRef]
  53. Veletsianos, G.; Houlden, S. An analysis of flexible learning and flexibility over the last 40 years of Distance Education. Distance Educ. 2019, 40, 454–468. [Google Scholar] [CrossRef]
  54. Hyland, N.; Kranzow, J. Faculty and student views of using digital tools to enhance self-directed learning and critical thinking. Int. J. Self-Dir. Learn. 2011, 8, 11–27. [Google Scholar]
  55. Kim, R.; Olfman, L.; Ryan, T.; Eryilmaz, E. Leveraging a personalized system to improve self-directed learning in online educational environments. Comput. Educ. 2014, 70, 150–160. [Google Scholar] [CrossRef]
  56. Loizzo, J.; Ertmer, P.A.; Watson, W.R.; Watson, S.L. Adult MOOC learners as self-directed: Perceptions of motivation, success, and completion. Online Learn. 2017, 21, n2. [Google Scholar] [CrossRef]
  57. Bonk, C.J.; Lee, M.M.; Kou, X.; Xu, S.; Sheu, F.R. Understanding the self-directed online learning preferences, goals, achievements, and challenges of MIT OpenCourseWare subscribers. J. Educ. Technol. Soc. 2015, 18, 349–368. [Google Scholar]
  58. Seaman, J.E.; Allen, I.E.; Seaman, J. Grade Increase: Tracking Distance Education in the United States; Babson Survey Research Group: Babson Park, MA, USA, 2018. [Google Scholar]
  59. Cheslock, J.J.; Ortagus, J.C.; Umbricht, M.R.; Wymore, J. The cost of producing higher education: An exploration of theory, evidence, and institutional policy. In Higher Education: Handbook of Theory and Research; Paulsen, M.B., Ed.; Springer: Berlin/Heidelberg, Germany, 2016; Volume 31, pp. 347–390. [Google Scholar] [CrossRef]
  60. Allen, I.E.; Seaman, J. Online Report Card: Tracking Online Education in the United States; ERIC: Babson Park, MA, USA, 2016. [Google Scholar]
Figure 1. Survey scale comparison: (a) comparison of survey scale/subscale response counts, and (b) comparison of survey scale/subscale response proportions. Dark blue, yellow, gray, orange, and light blue colors represent the respective categories of the Likert scale from 5 to 1 in descending order. Notice that panel (b) shows more responses representing strongly agree (5) and agree (4).
Figure 2. Wright map for the Teaching Presence (TP) subscale of the CoI instrument. The left side of the map shows the person measures (Table A1) and the right side shows the item measures of the TP subscale. Each X on the left side represents an individual participant. The letters along the vertical axis mark the mean (M), one standard deviation from the mean (S), and two standard deviations from the mean (T) of the person ability distribution (left side of the figure) and the item difficulty distribution (right side of the figure).
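The Wright maps in Figures 2, 3, 4, 5 and 6 can be read through the Andrich rating scale form of the Rasch model, the usual choice for Likert-type survey items (the exact parameterization used by the study's software is assumed here):

$$\ln\left(\frac{P_{nik}}{P_{ni(k-1)}}\right) = B_n - D_i - F_k$$

where B_n is the person measure, D_i the item measure, and F_k the threshold separating response category k from category k − 1, all on the same logit scale. Because persons and items share one scale, a participant plotted higher than an item on a Wright map is more likely to endorse that item's upper response categories, so the vertical positions of the person markers and the item labels can be compared directly.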
Figure 3. Wright map for the Social Presence (SP) subscale of the CoI instrument. The left side of the map shows the person measures (Table A2) and the right side shows the item measures of the SP subscale. Each X on the left side represents an individual participant. The letters along the vertical axis mark the mean (M), one standard deviation from the mean (S), and two standard deviations from the mean (T) of the person ability distribution (left side of the figure) and the item difficulty distribution (right side of the figure).
Figure 4. Wright map for the Cognitive Presence (CP) subscale of the CoI instrument. The left side of the map shows the person measures (Table A3) and the right side shows the item measures of the CP subscale. Each X on the left side represents an individual participant. The letters along the vertical axis mark the mean (M), one standard deviation from the mean (S), and two standard deviations from the mean (T) of the person ability distribution (left side of the figure) and the item difficulty distribution (right side of the figure).
Figure 5. Wright map for the Autonomous Learning (AUL) subscale of the SDOLS instrument. The left side of the map shows the person measures (Table A5) and the right side shows the item measures of the AUL subscale. Each X on the left side represents an individual participant. The letters along the vertical axis mark the mean (M), one standard deviation from the mean (S), and two standard deviations from the mean (T) of the person ability distribution (left side of the figure) and the item difficulty distribution (right side of the figure).
Figure 6. Wright map for the Asynchronous Online Learning (ASL) subscale of the SDOLS instrument. The left side of the map shows the person measures (Table A6) and the right side shows the item measures of the ASL subscale. Each # on the left side represents multiple participants. The letters along the vertical axis mark the mean (M), one standard deviation from the mean (S), and two standard deviations from the mean (T) of the person ability distribution (left side of the figure) and the item difficulty distribution (right side of the figure).
Figure 7. Monte Carlo statistical simulation for TP and CP subscales: (a) comparisons of TP subscale item estimates; (b) comparisons of TP subscale person estimates; (c) comparisons of CP subscale item estimates; and (d) comparisons of CP subscale person estimates. Red, blue, and black lines represent the real data, simulated mean, and simulated median, respectively.
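The comparison in Figure 7 rests on generating synthetic response data under the Rasch model and re-estimating item and person measures from them. The sketch below illustrates one way the data-generation step could be done, assuming an Andrich rating scale parameterization for the 5-point Likert items; the person measures, item measures, and thresholds are hypothetical placeholder values, not the estimates from this study, and the study's own simulation procedure may differ.

```python
import numpy as np

rng = np.random.default_rng(42)

def rsm_category_probs(theta, delta, taus):
    """Andrich rating scale model category probabilities for one person-item pair.

    theta: person measure (logits); delta: item measure (logits);
    taus: array of K-1 category thresholds for a K-category Likert item.
    """
    # Cumulative "step" sums; the k = 0 term is 0 by convention.
    steps = np.concatenate(([0.0], np.cumsum(theta - delta - np.asarray(taus))))
    probs = np.exp(steps - steps.max())   # subtract max for numerical stability
    return probs / probs.sum()

def simulate_responses(thetas, deltas, taus, rng):
    """Simulate a persons-by-items matrix of responses coded 0..K-1."""
    data = np.empty((len(thetas), len(deltas)), dtype=int)
    for n, theta in enumerate(thetas):
        for i, delta in enumerate(deltas):
            p = rsm_category_probs(theta, delta, taus)
            data[n, i] = rng.choice(len(p), p=p)
    return data

# Hypothetical generating values for illustration only -- not the estimates from this study.
thetas = rng.normal(1.0, 1.5, size=35)   # person measures
deltas = np.linspace(-1.0, 1.0, 13)      # item measures (13 items, as in the TP subscale)
taus = np.array([-2.0, -0.5, 0.5, 2.0])  # four thresholds for a 5-point Likert scale

simulated = simulate_responses(thetas, deltas, taus, rng)
print(simulated.shape)       # (35, 13)
print(simulated[:3, :5])     # first few simulated responses
```

Re-fitting the Rasch model to many such simulated data sets and overlaying the resulting item and person estimates on the real-data estimates yields comparisons analogous to panels (a)–(d) of Figure 7.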
Table 1. Scales of the Online Learning Environment Survey.

Scale | Subscale | Items
Community of Inquiry | Teaching Presence | 1–13
Community of Inquiry | Social Presence | 14–22
Community of Inquiry | Cognitive Presence | 23–34
Institutional Support | Institutional Support | 35
Self-Directed Learning | Autonomous Learning | 36–41
Self-Directed Learning | Asynchronous Online Learning | 42–46
Table 2. Rasch Analysis of the Survey Responses. Columns give the CoI subscales (TP, SP, CP) and the SDOLS subscales (AUL, ASL).

Aspect of Rasch analysis | Statistic | TP | SP | CP | AUL | ASL
– | Number of items | 13 | 9 | 12 | 6 | 5
– | Cronbach’s alpha | 0.959 | 0.904 | 0.934 | 0.825 |
Unidimensionality | Percentage of variance explained by the Rasch dimension | 60.1% | 53.0% | 48.8% | 52.8% | 51.9%
Unidimensionality | Eigenvalue of the largest secondary dimension | 2.51 | 2.81 | 2.14 | 2.26 | 2.23
Item measure quality | Items with an OUTFIT MNSQ statistic greater than 1.50 | None | SP_Q14 (1.60) | CP_Q23 (1.52), CP_Q28 (1.60) | AUL_Q01 (2.19) | ASL_Q15 (1.60)
Item measure quality | Items with an INFIT MNSQ statistic greater than 1.50 | None | None | CP_Q28 (1.56) | AUL_Q01 (1.99) | None
Item measure quality | Range of point-measure correlations | 0.70–0.86 | 0.66–0.83 | 0.65–0.80 | 0.63–0.79 | 0.72–0.82
Reliability and validity | Person separation | 2.92 | 2.49 | 2.81 | 1.70 | 1.76
Reliability and validity | Item separation | 2.24 | 2.42 | 1.39 | 3.00 | 2.17
Reliability and validity | Person reliability | 0.90 | 0.86 | 0.89 | 0.74 | 0.76
Reliability and validity | Item reliability | 0.83 | 0.85 | 0.66 | 0.90 | 0.82
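As a quick consistency check on Table 2, the separation and reliability rows are linked by the standard Rasch relationship between the separation index G and the corresponding reliability R:

$$R = \frac{G^{2}}{1 + G^{2}}, \qquad G = \sqrt{\frac{R}{1 - R}}$$

For example, the TP person separation of 2.92 implies a person reliability of 2.92²/(1 + 2.92²) ≈ 0.90, matching the reported value, and the CP item separation of 1.39 implies an item reliability of about 0.66, matching the reported CP item reliability.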
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
