Article

How Fifth-Grade English Learners Engage in Systems Thinking Using Computational Models

1 Teaching and Learning, New York University, New York, NY 10003, USA
2 Teaching and Learning, University of Miami, Coral Gables, FL 33124, USA
3 Scheller Teacher Education Program, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
* Author to whom correspondence should be addressed.
Submission received: 27 October 2020 / Revised: 18 November 2020 / Accepted: 20 November 2020 / Published: 22 November 2020

Abstract

The purpose of this study was to investigate how computational modeling promotes systems thinking for English Learners (ELs) in fifth-grade science instruction. Individual student interviews were conducted with nine ELs about computational models of landfill bottle systems they had developed as part of a physical science unit. We found evidence of student engagement in four systems thinking practices. Students used data produced by their models to investigate the landfill bottle system as a whole (Practice 1). Students identified agents and their relationships in the system (Practice 2). Students thought in levels, shuttling between the agent and aggregate levels (Practice 3). However, while students could think in levels to develop their models, they struggled to engage in this practice when presented with novel scenarios (e.g., open vs. closed system). Finally, students communicated information about the system using multiple modalities and less-than-perfect English (Practice 4). Overall, these findings suggest that integrating computational modeling into standards-aligned science instruction can provide a rich context for fostering systems thinking among linguistically diverse elementary students.

1. Introduction

Systems thinking has become increasingly expected in elementary (i.e., primary) science classrooms. In the international context, the International Society for Technology in Education (ISTE) Standards for Students [1] focus on integrating systems thinking into core content areas, such as science. In the U.S. context, A Framework for K-12 Science Education [2] and the Next Generation Science Standards [3] include “systems and system models” as a crosscutting concept across science disciplines. By upper elementary school, students are expected to “understand that a system is a group of related parts that make up a whole and can carry out functions its individual parts cannot” [3]. In recent years, computational models, or “non-static representations of phenomena that can be simulated by a computer” ([4], p. 137), have been used more widely in science classrooms as a tool for modeling complex systems (e.g., [5,6,7]). Research has offered preliminary evidence that computational models, especially the agent-based variety, can support students in developing systems thinking practices [8,9,10,11,12]. However, there is a need for more studies investigating how computational modeling can promote systems thinking at the elementary level, in knowledge domains other than biology and ecology, and with diverse student groups [12].
The need for further research on systems thinking has become more urgent in the context of rapidly changing student demographics. In the international context, intensified patterns of migration and globalization have meant that students across the globe are learning academic subjects (e.g., science) in second or additional languages [13]. In the U.S. context, traditionally underrepresented student groups in terms of race and ethnicity are now the majority across the nation, and students classified as English learners (ELs) make up the fastest growing subset of the student population, constituting 10.1% of public school students [14]. The call for greater attention to diverse student groups has been emphasized by science reform initiatives, as evidenced by the NGSS vision of “all standards, all students” [3]. These science standards expect all students, including ELs, to engage in rigorous science practices, even as they are still developing English proficiency [15]. Nonetheless, ELs tend to be excluded from educational initiatives and innovations, thus widening an opportunity-to-learn gap and perpetuating these students’ underrepresentation in STEM fields [16].
The purpose of this study was to investigate how computational modeling promotes systems thinking for ELs in fifth-grade science instruction. Specifically, we analyzed how ELs engaged in systems thinking practices as they developed computational models using StarLogo Nova—an agent-based programming environment that uses blocks-based programming—to explain a physical science phenomenon. This research could inform the design of science learning environments that engage all students in systems thinking practices expected by the latest science standards as well as the complex, networked society in which we live.

2. Literature Review

In this section, we review the literature on systems thinking and computational models to support systems thinking. Then, we describe the contribution of the present study. In particular, we call for further research addressing diverse student groups, especially ELs.

2.1. Defining Systems Thinking

Systems thinking has roots in both K-12 science education and computer science education. We consider systems thinking from both perspectives.

2.1.1. Science Education

From a science education perspective, systems have been a part of national science initiatives for the past three decades. Science for All Americans [17] and the subsequent Benchmarks for Science Literacy [18] included systems as a common theme that applies across science, technology, engineering, and mathematics. The National Science Education Standards [19] defined a system as “an organized group of related objects or components that form a whole ... and have boundaries, components, resources, flow (input and output), and feedback” (p. 116). More recently, the Framework [2] and the NGSS include systems and system models as one of seven crosscutting concepts, which are a refinement of the common themes in Science for All Americans [17] and the unifying concepts and processes in the National Science Education Standards [19]. According to the NGSS, students are expected to develop a more sophisticated understanding of systems and system models across their K-12 schooling.

2.1.2. Computer Science Education

From a computer science education perspective, systems thinking refers to a student’s ability to understand the relationship between elements in a given environment [20]. If a system is “something more than a collection of its parts,” then systems thinking is “a system of thinking about systems” ([21], p. 2). Studies have shown that systems thinking can be challenging for students, especially thinking about connectedness within systems and understanding complexity [22,23,24]. Even at the college level (e.g., [25]), students may struggle with systems thinking. While much of the research has been carried out with secondary and postsecondary students, studies show that elementary students can, to some extent, engage in systems thinking practices [8], though some practices may be more difficult than others [26]. In a comprehensive review of the literature on teaching and learning about complex systems in K-12 science education, Yoon et al. [12] reviewed 75 empirical studies and found that, while there was a strong representation of systems thinking research in biology and ecology, there was a need to extend this research to other science disciplines. Additionally, Yoon et al. [12] highlighted the need for further research with diverse student groups.
This empirical research has been complemented by conceptual work on how to integrate systems thinking into science instruction. As one prominent example, Weintrop et al. [4] developed a framework for integrating computational thinking into secondary STEM classrooms. Their framework identified systems thinking practices as one of the four overarching categories of computational thinking practices. Within this category, Weintrop et al. [4] named five systems thinking practices: Students investigate a complex system as a whole (Practice 1), which involves understanding the characteristics of a system in the aggregate. Students understand the relationships within a system, meaning they identify the constituent agents of a system and describe how those agents interact (Practice 2). Students think in levels; in other words, they shuttle back and forth between agent and aggregate levels of a system to understand how agent-level interactions produce the system’s aggregate-level behavior (Practice 3). Students communicate information about a system, meaning they identify and convey essential information about a system to others (Practice 4). Finally, students define systems and manage complexity; in other words, they set boundaries of the system and determine its size and complexity (Practice 5). The Weintrop et al. [4] framework provided the foundation for how we conceptualized and operationalized systems thinking in this study. This framework was appropriate for our purposes, because it emphasizes the integration of systems thinking into STEM learning. At the same time, our study extended Weintrop et al. [4] by applying the framework to elementary science classrooms with ELs.

2.2. Using Computational Models to Support Systems Thinking

In recent years, computational models have been used to promote systems thinking practices. Yoon et al. [12], in their comprehensive review, highlighted several studies that leveraged computational tools to help students make sense of systems over time (e.g., [9,27,28]). This literature shows that, in general, students are able to develop computational models that demonstrate at least a rudimentary understanding of systems and how to analyze them (e.g., [9,29]). Students can identify the individual elements in a system [10] and run multiple tests to determine whether their models produce results matching their predictions [9]. In particular, agent-based models can be especially effective for fostering systems thinking. Unlike simulation models, which typically involve adjusting pre-programmed parameters, agent-based models require students to write computer programs for the interactions of agents [30]. Thus, these models can open up the “black box” of how agent-level interactions contribute to producing aggregate-level effects [25,31]. Nonetheless, even with the support of agent-based modeling tools, certain systems thinking practices (e.g., thinking in levels) have proven to be challenging [10,11,32]. For example, in their study of high school students interacting with computational models of ecosystems, Wilensky and Reisman [10] found that students tended to oversimplify agent-level interactions (e.g., predator–prey relationships), which posed challenges for interpreting aggregate-level effects (e.g., survival of different populations).

2.3. Contribution of the Present Study

While both science education and computer science education highlight the importance of systems thinking and computational modeling with all students, little work has been done to consider diverse student groups. We propose that integrating computational modeling into science offers distinct affordances for ELs to engage in systems thinking practices. In particular, computational modeling provides opportunities for ELs to deploy multiple modalities of communication, including visual representation, gesture, and oral and written language [33]. The communicative affordances of computational models can be further enhanced in classrooms that value ELs’ contributions for their disciplinary meaning rather than their linguistic accuracy [34]. As Lee et al. [15] argue, ELs can participate meaningfully in NGSS-aligned science practices (including systems thinking practices) “using less-than-perfect English” (p. 227). However, few studies have examined ways that “STEM subjects [can] support ELs in developing computational thinking” ([16], p. 70).
This study builds on and extends the literature on computational modeling and systems thinking in multiple respects. First, this study integrated computational modeling into NGSS-aligned instruction, which offers policy implications for achieving the vision of science education based on the Framework and the NGSS. Second, this study was carried out with elementary students, which provides developmental implications to the existing literature that typically involves secondary students. Third, this study focused specifically on ELs, which contributes to the limited literature on computational modeling and systems thinking with ELs [16]. Finally, this study engaged ELs in explaining a physical science phenomenon, thus responding to the critical need “for more research [on systems thinking] in different knowledge domains outside of the content areas of biology and ecology” ([12], p. 316). The study was guided by the following research question: How do fifth-grade ELs engage in systems thinking practices as they develop and use computational models to explain a physical science phenomenon?

3. Methods

This study was carried out as part of a larger research project aimed at integrating computational thinking and modeling into NGSS-aligned science instruction with a focus on ELs. Taking a design-based research approach [35], our research team developed a yearlong, fifth-grade science curriculum composed of four units. In each unit, students develop a computational model to explain the unit phenomenon and answer the unit driving question. The curriculum is unique in that it embeds computational thinking and modeling into regular science instruction (rather than a technology block or an after-school program).

3.1. Participants and Setting

During the year when this study took place, the curriculum was being field-tested in five fifth-grade teachers’ classrooms in an urban school district. According to the district website, 19% of students in the district were ELs, 73% of students identified as Hispanic, and 78% were eligible for free or reduced-price lunch. To answer the research question for this study, we conducted individual interviews with students about the computational models they had developed in the first unit of the curriculum. The interviews took place in the classroom of the teacher whose class had the largest number of ELs. This teacher was in her third year implementing the NGSS and her first year integrating computational thinking and modeling into science instruction. Of the 19 students in her class, nine students were classified as ELs by their school, and all nine agreed to participate in the interviews. Of the nine students, three identified as female and six as male. According to an English language proficiency assessment administered during the previous school year (which was the latest data available), all nine students were at an intermediate level of English proficiency (Level 3 of 6). All nine students spoke Spanish at home.

3.2. Overview of the Unit

This study focused on the first unit in our yearlong curriculum. The unit fully addresses fifth-grade NGSS performance expectations in physical science (related to the structure and properties of matter) and also introduces a performance expectation in life science (related to decomposers in the environment). Over the course of the unit, students investigate the garbage system in their home, school, and community to answer the driving question: What happens to our garbage?
Students begin the unit by sorting their lunch garbage according to different properties and generating questions about garbage. Students obtain information about their local landfill and develop physical models of the landfill (referred to as “landfill bottle systems”) by putting garbage materials (e.g., fruit, plastic, aluminum foil) in open and closed bottles (see Figure 1).
Over several weeks, students observe changes in the properties (e.g., color, smell) and weights of the open and closed landfill bottle systems (the curriculum uses the term “weight,” not “mass,” in accordance with the NGSS performance expectation for fifth grade). Over time, students observe that the weight of the open system decreases, but the weight of the closed system stays the same, even as the properties of the food materials change (e.g., a smell is produced). To investigate the cause of these changes, students carry out an investigation with agar plates and figure out that microbes (i.e., a type of decomposer) are present in the landfill bottles and contributing to decomposition. However, students still do not know how the microbes decompose the food materials while conserving weight in the closed landfill bottle system.
Computational modeling enters the unit as a way to explain how microbes decompose food materials in the closed landfill bottle system. Specifically, students develop computational models of the closed landfill bottle system to test their ideas about what is happening when microbes break down the food materials. For this task, students work in partners (two students per computer) to develop their computational models. Students begin by participating in “design meetings” in which they consider ideas they have for their computational models and brainstorm how they could represent those ideas in StarLogo Nova. Then, students develop initial computational models. When their initial models are complete, two partner groups work together to provide feedback on each other’s model, and the class engages in a discussion about possible model revisions. Students then revise their computational models and share their revised models with the whole class (each day that students work on their computational models, they are asked to save a new version so that our research team can trace their models over time). The computational models, in tandem with data collected through observations and measurements during the unit, enable students to construct an explanation of how microbes cause changes in the landfill bottles while conserving weight in the closed landfill bottle system.
In the sample computational model shown in Figure 2, students created agents (i.e., banana solid, banana gas, microbes) and programmed the interactions of those agents (i.e., microbes, on collision with banana solid, delete the banana solid and create banana gas). As students ran their computational model, they collected and interpreted data from three data boxes and a graph showing the weight of the banana solid, the weight of the banana gas, and the total weight of the banana. Then, students compared the weight data from their computational model to the weight data from their physical landfill bottle model of the closed system. In the computational model shown in Figure 2, the total weight of the banana remained constant (i.e., 500), which was consistent with the weight data from the physical model (i.e., total weight was conserved).
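For readers less familiar with agent-based modeling, the following minimal sketch illustrates, in Python rather than StarLogo Nova blocks, the agent-level rule and aggregate weight data described above. Only the agent names and the initial total weight of 500 follow the sample model in Figure 2; the grid size, agent counts, and class structure are our own illustrative assumptions rather than a reconstruction of the students’ actual blocks.

import random

# A minimal sketch of the closed landfill bottle model described above,
# written in Python rather than StarLogo Nova blocks. The agent names and the
# initial total weight of 500 follow Figure 2; the grid size, agent counts,
# and class structure are illustrative assumptions only.

GRID = 20  # size of the hypothetical bottle "world"

class Agent:
    def __init__(self, kind, weight, x, y):
        self.kind = kind      # "banana_solid", "banana_gas", or "microbe"
        self.weight = weight  # microbes carry no banana weight
        self.x, self.y = x, y

def make_closed_bottle(n_solid=50, n_microbes=10, total_weight=500.0):
    """Create banana solid pieces (sharing 500 units of weight) and microbes."""
    agents = [Agent("banana_solid", total_weight / n_solid,
                    random.randrange(GRID), random.randrange(GRID))
              for _ in range(n_solid)]
    agents += [Agent("microbe", 0.0,
                     random.randrange(GRID), random.randrange(GRID))
               for _ in range(n_microbes)]
    return agents

def step(agents):
    """One tick: microbes wander; on collision with banana solid, the solid is
    deleted and banana gas of equal weight is created in its place."""
    for a in agents:
        if a.kind == "microbe":
            a.x = (a.x + random.choice([-1, 0, 1])) % GRID
            a.y = (a.y + random.choice([-1, 0, 1])) % GRID
    microbes = [a for a in agents if a.kind == "microbe"]
    for solid in [a for a in agents if a.kind == "banana_solid"]:
        if any(m.x == solid.x and m.y == solid.y for m in microbes):
            agents.remove(solid)                             # delete banana solid
            agents.append(Agent("banana_gas", solid.weight,  # create banana gas
                                solid.x, solid.y))           # of the same weight

def weights(agents):
    """The three 'data boxes': weight of solid, weight of gas, total weight."""
    solid = sum(a.weight for a in agents if a.kind == "banana_solid")
    gas = sum(a.weight for a in agents if a.kind == "banana_gas")
    return solid, gas, solid + gas

agents = make_closed_bottle()
for tick in range(200):
    step(agents)
solid, gas, total = weights(agents)
print(f"solid={solid:.0f}  gas={gas:.0f}  total={total:.0f}")  # total stays at 500

Because every collision replaces a piece of banana solid with banana gas of equal weight, the total reported by weights() remains at 500 for as long as the sketch runs, mirroring the weight data that students compared against their physical landfill bottles.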
This computational modeling task promotes four of the systems thinking practices in the Weintrop et al. [4] framework. Students investigate the system as a whole (Practice 1) by tracking the weight of the banana solid, banana gas, and closed system as the model runs. Students develop an understanding of the relationships within a system (Practice 2) by programming the interactions of microbes, banana solid, and banana gas. Students think in levels (Practice 3) by considering how the agent-level interactions of microbes with the banana produce the aggregate-level weight data. As students engage in all of these practices, they communicate information (Practice 4) about the landfill bottle system that they are modeling. Weintrop et al.’s [4] fifth practice (defining systems and managing complexity) is not a focus of the unit, since the boundaries of the system that students are modeling (i.e., the closed landfill bottle system) are largely determined by the teacher. This practice is addressed in subsequent units of the curriculum.

3.3. Data Collection

The data for this study came from individual student interviews that were conducted at the end of the unit. Interviews were conducted within several days of students developing their final computational models. Students met individually with the first or second author and were provided with a laptop showing the computational model they had developed during the unit. The laptop was made visible to both the student and interviewer. Even though students had developed their computational models with a partner, student interviews were conducted one-on-one with the interviewer to provide more in-depth insight into how individual students were engaging in systems thinking practices.
The interviews were semi-structured. The protocol consisted of questions targeting students’ participation in systems thinking practices while leaving flexibility for students to co-construct the conversation [36,37]. At the beginning of the interviews, students were asked to describe their models (e.g., “What does this model show?”). By starting with this broad question, we aimed to elicit what was, for students, most salient about the system they had modeled. However, because students tended to focus on the final versions of their models rather than their modeling processes, students were subsequently asked to describe how they had developed their models during the unit (e.g., “Did you and your partner’s ideas change as you were developing your computational model?”). This retrospective aspect of the interviews offered insight into the systems thinking practices that students had employed in developing their models. Finally, students were asked about potential changes to their models (e.g., “How could you change your model to represent the open landfill bottle system?” and “How would your model work differently if there were twice as many microbes?”). These extension questions offered insight into whether students could apply systems thinking practices to a novel version of the system being modeled.
Interviews took place in a quiet location during science instructional time and lasted approximately 15–20 min per student. Student interviews were video recorded and later transcribed by the authors. Each transcript file included a link to multiple iterations of the student’s computational model.

3.4. Data Analysis

To answer the research question (How do fifth-grade ELs engage in systems thinking practices as they develop and use computational models to explain a physical science phenomenon?), we developed an initial coding scheme based on the Weintrop et al. [4] framework. Given that this framework was intended for secondary education and is frequently applied to the study of life science topics, we adapted the framework to be useful with our focal grade level (fifth grade) and science discipline (physical science). For example, we paid less attention to ideas beyond the scope of elementary science (e.g., stocks and flows within the practice of investigating a system as a whole). We also excluded from our analysis the practice of defining systems and managing complexity. Although this is a key practice in the study of ecosystems, where the size and complexity of the system demands explicitly defining boundaries, it is less relevant to the closed landfill bottle system in physical science, where the boundaries have already been defined.
Table 1 displays the four systems thinking practices included in our analysis along with excerpts from the student interviews that were coded as indexing student engagement in each practice. The definitions for each code were iteratively revised as the first author conducted an initial review of the interview transcripts. Once the coding scheme was established, the transcripts were coded in full by the first author and another member of the research team. After the transcripts had been coded independently, reliability was established, and discrepancies were resolved. Throughout the process, students’ computational models (and the multiple iterations of each) were used to contextualize students’ interview responses, especially their retrospective accounts of their modeling work. After coding the interviews and identifying broad patterns in the data, we identified two students whose responses were representative of the patterns that emerged. By focusing on these two students in the results below, we are able to illustrate, with sufficient depth and turn-by-turn detail, how fifth-grade ELs engaged in systems thinking.

4. Results

We report findings from Miguel and Angel, whose interview responses were typical of the nine students. For each student, we describe how they engaged in the four systems thinking practices in Table 1: (a) investigating a system as a whole (Practice 1), (b) understanding the relationships within a system (Practice 2), (c) thinking in levels (Practice 3), and (d) communicating information about a system (Practice 4). We highlight that students were generally able to engage in systems thinking while, at times, struggling with certain practices.

4.1. Miguel

The following excerpt comes from the beginning of the interview with Miguel. In presenting the interview excerpts, we preserve students’ linguistic inaccuracies.
Interviewer:
Can you tell me about the model that you made? What does this model show?
Miguel:
It shows me how, um the banana decomposes while the microbes, um, decompose the banana. And you can see the data boxes here (points to weight of banana solid data box) and this just starts to go down because of the microbes decomposing it and this (points to total weight data box) stays the same, the total weight is 500.
Interviewer:
And how did you represent those ideas in your model?
Miguel:
The microbes, I collided them with the banana solid and then I backwards-ed the microbe and I put the banana gas color to turquoise.
Interviewer:
Oh why?
Miguel:
Because I wanted, I had to see it because if it was colored black, I couldn’t see it so I had to change the color.
Interviewer:
Okay, so when the microbe collides with the solid banana, what happens?
Miguel:
The solid banana it gets eated by the microbe and then it makes gas, and it, it, it goes everywhere (moves hands all around).
To describe his model, Miguel began by focusing on the system as a whole (Practice 1). Specifically, he described how microbes decompose the banana, which results in the weight of the solid banana decreasing while the total weight of the system stays the same. Then, when asked how he represented these ideas in his model, Miguel described the relationships within the system (Practice 2). Specifically, he identified each agent (i.e., microbes, banana solid, banana gas) and the relationships among the agents as represented in his code blocks (“The microbes, I collided them with the banana solid and ... I put the banana gas color to turquoise”). While Miguel’s focus on the color of the agents may, at first, seem trivial, this was motivated not by any aesthetic preference but by Miguel’s interest in distinguishing between two agents crucial to the process of decomposition: microbes (black) and banana gas (initially black but subsequently changed to turquoise). At the end of the exchange, Miguel summarized the key relationships among agents that result in weight being conserved in the closed landfill bottle system (“solid banana it gets eated by the microbe and then it makes gas”).
At this point, Miguel was prompted to describe how he and his partner arrived at their representation and whether their ideas changed over time:
Interviewer:
So Miguel, did you and your partner’s ideas change at all while you were developing your computational model?
Miguel:
Hmhmm (affirmative).
Interviewer:
How?
Miguel:
Well at the beginning we wanted the weight should stay the same so we made our model, and then we coded it and ran our code but the data boxes started to grow a little, and the total weight just started to go down and that isn’t what we wanted. We know it wasn’t right.
Interviewer:
So what did you do?
Miguel:
So, we change, we had to go to the microbes, and we saw that we didn’t have any gas coming out so we had to fix that.
Interviewer:
How?
Miguel:
We added the banana gas to the microbe coding.
Interviewer:
And what happened?
Miguel:
We had to check it, so we made the model go again and the weight stays the same at 500.
In this excerpt, Miguel demonstrated that he and his partner were thinking in levels (Practice 3) to develop their model. They began at the aggregate level by anticipating that the weight of the closed system “should stay the same” in the model. This was likely informed by the data they had collected from their physical landfill bottle models, which conserved weight in the closed system over several weeks of observations. Miguel and his partner continued working at the aggregate level when they ran their code and observed that the “total weight just started to go down.” This unanticipated result prompted the two students to dig deeper into their agent-level code (“we had to go to the microbes [tab]”), which revealed that no gas was being produced (“we didn’t have any gas coming out”). After revising the code to include the creation of a banana gas, Miguel and his partner shuttled back to the aggregate level to “check” how this change at the agent level would impact the system in the aggregate. When the pair observed that the total weight was conserved (“the weight stays the same at 500”), they knew they had developed a plausible representation of microbes decomposing the banana. Thus, Miguel and his partner moved fluidly between aggregate and agent levels to develop a revised model that conserved weight in the closed landfill bottle system.
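As a rough illustration of the revision Miguel describes, the hypothetical sketch below (again in Python, not the pair’s actual StarLogo Nova blocks, with agents simplified to dictionaries) contrasts a collision rule that deletes the banana solid without creating gas, which makes the aggregate weight drift downward, with a revised rule that creates banana gas of equal weight.

# A hypothetical Python rendering (not the pair's actual StarLogo Nova blocks)
# of the revision Miguel describes. Agents are simplified to dictionaries.

def on_collision_initial(agents, solid):
    """Initial rule: the banana solid is deleted but no gas is created,
    so the aggregate total weight drifts downward as the model runs."""
    agents.remove(solid)

def on_collision_revised(agents, solid):
    """Revised rule ('We added the banana gas to the microbe coding'):
    creating gas of equal weight keeps the total weight constant."""
    agents.remove(solid)
    agents.append({"kind": "banana_gas", "weight": solid["weight"]})

bottle = [{"kind": "banana_solid", "weight": 10.0},
          {"kind": "microbe", "weight": 0.0}]
on_collision_revised(bottle, bottle[0])
print(sum(a["weight"] for a in bottle))  # 10.0: weight is conserved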
As shown in the previous exchange, Miguel and his partner were engaging in the practice of thinking in levels to develop their model. However, when presented with a hypothetical change to the model, Miguel showed some difficulty with this systems thinking practice:
Interviewer:
Speaking of open and closed, your model is a closed system, right?
Miguel:
Ya.
Interviewer:
How could you change the system to represent the open landfill bottle system?
Miguel:
If it was a open landfill bottle, the weight started were go down, and the banana gas would start rising up.
Interviewer:
What do you mean rising up?
Miguel:
It will be getting out of the jar.
Interviewer:
So what would happen to the system?
Miguel:
So at the microbe, the microbe would touch the banana solid and, well then there would be no gas, gas wouldn’t be created cuz it would come out of the jar.
This excerpt shows how Miguel struggled to think in levels when considering how to represent a different version of the landfill bottle system (i.e., open instead of closed). This hypothetical scenario can be difficult for students, since it involves a change in aggregate system behavior (i.e., total weight of the system decreases) but not in the agent-level interactions that produce the behavior (i.e., microbes still decompose the solid banana and create banana gas). At the aggregate level, Miguel understood that, in an open system, “the weight would go more down,” and he correctly attributed this decrease in weight to banana gas “getting out of the jar.” However, at the agent level, Miguel indicated that a gas would no longer be created “cuz it would come out of the jar,” thus confounding the agent and aggregate levels of the system.
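To make the distinction this scenario demands more concrete, the illustrative sketch below shows one way the open bottle could be represented: the agent-level decomposition rule is left untouched, and only a boundary rule is added so that banana gas can leave the jar, lowering the aggregate weight. The function names and escape probability are our own assumptions, not part of the unit or the students’ models.

import random

# An illustrative Python sketch (not the students' blocks) of the open bottle:
# the agent-level rule that microbes delete banana solid and create banana gas
# is unchanged; only a boundary rule is added so that gas can leave the jar.

def decompose(agents, solid):
    """Agent-level rule, identical to the closed system: delete the banana
    solid and create banana gas of equal weight."""
    agents.remove(solid)
    agents.append({"kind": "banana_gas", "weight": solid["weight"]})

def gas_escapes(agents, p_escape=0.05):
    """Boundary rule for the open bottle: each tick, some banana gas leaves
    the jar, so the aggregate weight inside the system decreases."""
    for a in list(agents):
        if a["kind"] == "banana_gas" and random.random() < p_escape:
            agents.remove(a)

def weight_in_bottle(agents):
    return sum(a["weight"] for a in agents
               if a["kind"] in ("banana_solid", "banana_gas"))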
In the excerpts above, Miguel, an EL, was able to communicate information about the landfill bottle system (Practice 4) using multiple modalities of communication and less-than-perfect English. In particular, he used deictic gestures in combination with the dynamic visualization produced by his model (e.g., “this just starts to go down” while pointing to the total weight data box) to communicate about aggregate-level system effects. Miguel also used expressions, such as “banana it gets eated” and “data boxes start to grow a little” that, while less than perfect, were nonetheless effective for conveying his intended meaning. A narrow focus on Miguel’s developing English proficiency could have distracted from his meaningful participation in systems thinking practices.

4.2. Angel

The interview with Angel also began with him describing his computational model:
Interviewer:
Can you just tell me about the model that you made?
Angel:
Well, we first created agents and we had to put traits in and everything. We made this solid banana, and gas banana, and microbes. So we used code blocks (points to the code blocks in the model). When the microbes (makes fist with hand) got collision with the banana solid (moves fist to other hand and clasps hands together), the banana solid turns to a banana gas (waves hands around the air). And we show how the microbes are eating the banana, and how the banana solid, um how the banana solid, it turns to a banana gas.
Interviewer:
Then what did you do?
Angel:
We ran the model and see that all the weight, it stays the same. See that right here (points to the data boxes in the model). The weight of the banana started, see it’s normal. But when the forever goes (referring to the forever button), the weight of the banana solid suddenly it turns down (gestures hands in a downward motion) and the weight of the total weight it stays the same. Because it’s the total, the total weight of all the things that are there.
Angel began by describing the agents and their relationships (Practice 2). Specifically, Angel took an agent-based perspective [38] by using his hands and body to physically show how the microbes interact with the banana solid to produce banana gas. Then, Angel described how he ran the model to investigate the system as a whole (Practice 1). Specifically, he observed that, while the weight of the banana solid decreases as the model runs, the total weight of the system remains constant.
At this point, Angel was prompted to describe how he and his partner arrived at their representation and whether their ideas changed over time:
Interviewer:
So Angel, did your ideas change at all while you were developing your model?
Angel:
My ideas changed.
Interviewer:
How did they change?
Angel:
By doing that the microbes, that the banana solid turns into banana gas. And then the banana solid disappeared.
Interviewer:
What was your idea in the beginning?
Angel:
That, well we had the microbe and we know we need the microbe to come in and make gas. So we added the gas to the microbe code. But then we have to setup and we ran the model. But the model it was going up.
Interviewer:
What was going up?
Angel:
The total weight was going up and up and we knew that was wrong because the weight need to stay the same. But it was weird cuz we knew we made the banana gas. So we checked out the code blocks. And we had a big problem, because we had the banana gas, but the banana solid was still there, like it wasn’t deleting yet.
Interviewer:
So what did you do?
Angel:
So we used new code so the microbe, the microbe delete the banana solid, and then we checked it, and it worked.
Interviewer:
How do you know it worked?
Angel:
The weight was stay the same, the weight didn’t change.
Similar to Miguel above, Angel and his partner were thinking in levels (Practice 3) to develop their model. However, Angel and his partner took a slightly different approach than Miguel and his partner. Whereas Miguel began thinking at the aggregate level, anticipating that the total weight of the system “should stay the same,” Angel and his partner did not begin with this a priori goal for the system as a whole. Instead, they began thinking at the agent level, with the goal of representing the way that microbes produce a gas (“we know we need the microbe to come in and make gas”). Having coded this agent-level interaction, Angel and his partner then shuttled to the aggregate level to run their model and observe the weight data. They were surprised to notice that the “total weight was going up,” suggesting the pair had some expectation that weight should be conserved in the system, even if not explicitly stated as an a priori goal of their modeling work. This outcome prompted the pair to return to their agent-level code and notice that, while they had created banana gas as intended, the solid banana remained in the system even after being eaten by the microbes (“we had the banana gas, but the banana solid was still there, like it wasn’t deleting yet”). After revising their agent-level interaction to include the microbe deleting the solid banana, the pair moved back to the aggregate level to “check it” and concluded that their model “worked” because the total weight of the system “didn’t change.” Thus, while both Angel and his partner (this example) and Miguel and his partner (previous example) were thinking in levels to develop a plausible representation of decomposition, their modeling work was motivated by different goals (representing what microbes do vs. conserving weight in the system) and launched from different starting points (agent level vs. aggregate level).
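A brief illustrative sketch (Python, not the students’ actual blocks) makes Angel’s “big problem” concrete: when gas is created but the solid is not deleted, its weight is counted twice and the total climbs.

# An illustrative Python sketch (not the students' actual blocks) of the
# problem Angel describes and the revision that resolved it.

def on_collision_with_problem(agents, solid):
    """Creates banana gas but leaves the banana solid in place,
    so its weight is double-counted and the total goes 'up and up.'"""
    agents.append({"kind": "banana_gas", "weight": solid["weight"]})

def on_collision_fixed(agents, solid):
    """Revised rule: the microbe also deletes the banana solid,
    so the total weight stays the same."""
    agents.remove(solid)
    agents.append({"kind": "banana_gas", "weight": solid["weight"]})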
While Angel and his partner were engaging in the practice of thinking in levels to develop their model, Angel showed some difficulty with this practice when presented with a hypothetical change to the model:
Interviewer:
How would your model be different if there were twice as many microbes?
Angel:
If there would be a lot more microbes that would mean that it would touch the banana solid faster so that would mean that the banana solid would be disappearing faster.
Interviewer:
And how would that impact the system?
Angel:
It would’ve changed it by make like, the weight go a bit more higher than usual and it would make more gas.
Interviewer:
What weight?
Angel:
The total weight of the banana because they would touch the banana solid faster, which would’ve created much more weight of the gas banana.
This exchange shows how Angel struggled to think in levels when considering how to represent a different version of the landfill bottle system, in this case, a system with twice as many microbes. This hypothetical scenario can be difficult for students, since it involves a change at the agent level (i.e., doubling the number of microbes) that does not affect the total weight of the system (i.e., weight will always be conserved in a closed system) but does affect the rate of decomposition at the aggregate level (i.e., how quickly the microbes decompose the solid banana). At the agent level, Angel understood that the solid banana would “disappear faster” (i.e., rate of decomposition). However, at the aggregate level, Angel indicated that this increase in rate would result in the total weight of the closed system “going a bit more higher,” since there would be “more weight of the gas banana” (i.e., no conservation of weight). Thus, similar to Miguel, who struggled with the hypothetical scenario of changing his model to an open system, Angel faced difficulty thinking in levels when presented with a novel scenario.
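The intended contrast can be illustrated with a small simulation, sketched below in Python with parameters that are our own assumptions rather than the students’ model: doubling the microbes changes how quickly the banana solid disappears, but it cannot change the total weight of the closed system, because each collision still swaps solid for gas of equal weight.

import random

# A small illustrative simulation of the 'twice as many microbes' scenario.
# Parameters are our own assumptions, not the students' model.

def run_closed_bottle(n_microbes, n_pieces=50, total_weight=500.0,
                      ticks=100, p_hit=0.02):
    """Each tick, each microbe has a small chance of bumping into a remaining
    piece of banana solid; when it does, that piece becomes banana gas."""
    piece_weight = total_weight / n_pieces
    solid_pieces, gas_pieces = n_pieces, 0
    for _ in range(ticks):
        for _ in range(n_microbes):
            if solid_pieces > 0 and random.random() < p_hit:
                solid_pieces -= 1
                gas_pieces += 1
    return (solid_pieces * piece_weight, gas_pieces * piece_weight,
            (solid_pieces + gas_pieces) * piece_weight)

for n in (10, 20):  # original number of microbes vs. twice as many
    solid, gas, total = run_closed_bottle(n_microbes=n)
    print(f"{n} microbes: solid={solid:.0f}  gas={gas:.0f}  total={total:.0f}")
# More microbes leave less solid after the same number of ticks,
# but the total is 500 in both runs: only the rate of decomposition changes.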
In the excerpts above, Angel, an EL, was able to communicate information about the landfill bottle system (Practice 4) using multiple modalities and less-than-perfect English. For example, Angel used embodiment and gesture to represent the microbes eating the banana solid (clasping hands together) and creating banana gas that moved freely around the closed system (waving hands in the air). He also used less-than-perfect English to describe both agent-level interactions (e.g., “microbes got collision with the banana solid”) and their aggregate-level effects (e.g., “turns down” to describe a decrease in weight). Thus, similar to his classmate Miguel, Angel was able to engage in systems thinking practices through his emerging English proficiency.

5. Discussion

The purpose of this study was to investigate how fifth-grade ELs engaged in systems thinking as they developed and used computational models to explain a physical science phenomenon. Based on interviews with students about their closed landfill system models, we found evidence of student engagement in four systems thinking practices: (a) investigating a complex system as a whole, (b) understanding the relationships within a system, (c) thinking in levels, and (d) communicating information about a system. These findings are consistent with the emerging literature that suggests that elementary students can engage in systems thinking, though not without some challenges [8,26].
When describing their computational models in the interviews, students showed evidence of investigating a complex system as a whole (Practice 1). Specifically, both Miguel and Angel used the weight data produced by their models to make sense of what was happening in the closed landfill bottle system. Their participation in this systems thinking practice was supported by a key affordance of the StarLogo Nova programming environment, namely, the ability to generate data about a system. This affordance (which is not made available in all blocks-based programming environments) enabled students to engage meaningfully in data analysis and interpretation, which typically poses challenges for younger students [31,39,40].
Additionally, students were able to understand and describe the relationships in a system (Practice 2). For example, both Miguel and Angel programmed how microbes interacted with banana solid such that weight would be conserved in the closed landfill bottle system. This finding is consistent with literature that shows that students are generally able to understand relationships in a system [8,9]. The fact that fifth-grade students were able to describe the agents and interactions in the landfill bottle system is particularly promising given the extensive body of literature on challenges associated with teaching and learning the particulate nature of matter [41,42,43,44]. The landfill bottle computational model may have contributed to overcoming these challenges by enabling students to program the interactions of agents too small to see (e.g., banana gas), which would typically be “black boxed” in a simulation or pre-made model [9,45].
Miguel and Angel also demonstrated that they were thinking in levels (Practice 3) as they described their modeling work retrospectively. Whereas Miguel and his partner began at the aggregate level, with the goal of conserving weight in the closed system, Angel and his partner began at the agent level, with the goal of faithfully representing what microbes do. From these different starting points, both pairs were able to shuttle between agent and aggregate levels to develop a plausible representation of decomposition in a closed system. These results are promising, as thinking in levels is typically considered a challenging systems thinking practice, even for secondary students [11,46,47]. We offer three plausible explanations for students’ success with thinking in levels, which could offer insights into designing learning environments to promote this practice with elementary students, including ELs.
One explanation could be the nature of the phenomenon that students were modeling. Unlike ecosystem models, which are typically characterized by complex, nonlinear relationships [10,48], the closed-system landfill bottle model focused on a single agent-level interaction (i.e., microbes decompose banana solid and create banana gas) that had a somewhat predictable aggregate-level effect (i.e., as long as students maintained a 1:1 ratio of banana solid deleted and banana gas created, matter would be conserved in the system). Nonetheless, fifth-grade students were often surprised by the outcome when they ran their computational models (e.g., Angel’s comment that the weight data were “weird”). Thus, computational models such as this one may provide an entry point into thinking in levels that students can build on as they encounter more complex systems in different science domains and across grade levels.
Another explanation could be that students’ computational models were embedded in a meaningful science context. Previous research has documented the benefits of grounding computer science tasks in contexts that are familiar and relatable to real-world observations, especially with younger learners [32]. This was indeed the case with our fifth-grade students, who were deeply invested in (and knowledgeable about) the phenomenon of garbage by the time they developed their computational models.
A related explanation could be that students were able to use multiple sources of data (beyond their computational models) to think in levels. Most notably, students used weight data from their physical landfill bottle investigation to determine whether they had faithfully represented microbes in their computational models. For example, based on the weight data from his physical landfill bottle, Miguel knew the total weight of the closed system in his computational model “should stay the same.” When the weight did not stay the same in his computational model, Miguel shuttled to the agent level and revised his representation of decomposition. By allowing for triangulation of data from multiple sources, computational modeling integrated into science instruction proved fruitful ground for fostering systems thinking practices.
While students were able to think in levels to develop their models, they struggled to engage in this practice when faced with a novel scenario. For example, when asked to predict what would happen if the landfill bottle system were open (rather than closed), Miguel described an agent-level interaction (e.g., no banana gas would be created) that directly conflicted with his prediction for the system in the aggregate (e.g., total weight would decrease due to banana gas leaving the bottle). One explanation for this finding is that thinking in levels is a practice that develops over time and that these students were at some point along a progression toward more sophisticated engagement in the practice [49,50]. This is consistent with studies suggesting a developmental progression of systems thinking, with some practices (or components of a practice) emerging before others [26,51]. Another explanation relates to the differences between the two contexts in which students were thinking in levels. In the instructional context (as described retrospectively in the interviews), students were actively manipulating their models to test their ideas and get feedback on those ideas from the data their models produced. However, in the interview context, students were asked to think in levels without this active engagement with their models. It is possible that, if Miguel had been given the opportunity to change his model to represent an open system, he would have run the model and reevaluated his idea about the agent-level interaction (thus providing evidence of thinking in levels). The fact that students were able to think in levels with their computational models (but were not able to think in levels without their models) suggests the promise of computational modeling for engaging students in sophisticated systems thinking practices. It also suggests the importance of designing learning experiences that offer opportunities for students to model multiple versions of a system.
Finally, ELs were able to communicate information about the landfill bottle system using multiple modalities and less-than-perfect English [15]. Angel, for example, used gesture and embodiment to think like an agent [10], in this case, a microbe. Importantly, Angel’s use of gesture and embodiment was not merely a scaffold or crutch, as nonlinguistic modalities have traditionally been conceived in the literature on teaching ELs. Instead, multiple modalities were essential to engaging in systems thinking practices while also being beneficial to ELs [33]. In addition to deploying a range of modalities, ELs used less-than-perfect English to communicate their science ideas about the system. For example, Miguel used the phrase “the data boxes grow a little” in the context of describing how the aggregate behavior of the system prompted him to revise his agent-level interaction. Although, over time, Miguel will be encouraged to adopt the more canonical way of expressing this science idea (e.g., “the weight increased”), Miguel’s case cautions against assuming that ELs must have developed “full” English proficiency before they are able to engage in sophisticated systems thinking practices. Overall, this study suggests the promise of computational modeling for promoting systems thinking with ELs as well as the need for further research on the affordances (and challenges) of fostering systems thinking with diverse student groups.

6. Conclusions

The findings of this study provide compelling evidence that fifth-grade ELs can engage in systems thinking. We highlight four main contributions of the study to the field. First, the findings show that integrating computational modeling into NGSS-aligned science instruction provides a rich context for fostering students’ systems thinking. Second, the findings show that elementary students can engage in systems thinking practices, which can be further developed across their K-12 schooling. Third, the findings illustrate that computational modeling fosters systems thinking in physical science, which could lay the foundation for engaging with more complex system models in other domains (e.g., ecosystems). Finally, the findings show that systems thinking practices can be made accessible to all students, including students who are still developing the language of instruction.
These findings suggest future directions for research on systems thinking broadly and our design-based research specifically. First, research should investigate whether computational modeling tasks that engage students in modeling multiple versions of a system better support them to develop sophisticated systems thinking practices. In our design-based research, this could involve, for example, revising the landfill bottle computational modeling task to include modeling both the open and closed landfill bottle systems. Second, research should examine how elementary students’ systems thinking develops over longer periods of time than a single unit of instruction. In our design-based research, this could involve examining the development of students’ systems thinking as they encounter new phenomena in new science disciplines over our yearlong, NGSS-aligned curriculum. Third, research should investigate systems thinking with larger samples and diverse groups of students. In our design-based research, this could involve implementation in a larger number of classrooms and with ELs from diverse backgrounds (e.g., home language, level of English proficiency)—an important future direction given the heterogeneity of this student population [16]. As systems thinking becomes increasingly necessary in a complex, networked society, it is important that the education community supports all students in leveraging tools such as computational models to develop systems thinking.

Author Contributions

Conceptualization, A.H., S.E.G., D.W., L.L. and O.L.; Data curation, A.H. and S.E.G.; Writing—original draft, A.H., S.E.G., D.W., L.L. and O.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Science Foundation, grant number DRL-1742138.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. International Society for Technology in Education. ISTE Standards for Students. 2016. Available online: https://www.iste.org/standards/for-students (accessed on 1 October 2020).
  2. National Research Council. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas; The National Academies Press: Washington, DC, USA, 2012.
  3. Next Generation Science Standards Lead States. Next Generation Science Standards: For States, by States. 2013. Available online: https://www.nextgenscience.org/ (accessed on 1 October 2020).
  4. Weintrop, D.; Beheshti, E.; Horn, M.; Orton, K.; Jona, K.; Trouille, L.; Wilensky, U. Defining computational thinking for mathematics and science classrooms. J. Sci. Educ. Technol. 2016, 25, 127–147.
  5. Pierson, A.E.; Brady, C.E.; Clark, D.B. Balancing the environment: Computational models as interactive participants in a STEM classroom. J. Sci. Educ. Technol. 2020, 29, 101–119.
  6. Sengupta, P.; Kinnebrew, J.S.; Basu, S.; Biswas, G.; Clark, D. Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Educ. Inf. Technol. 2013, 18, 351–380.
  7. VanLehn, K. Model construction as a learning activity: A design space and review. Interact. Learn. Environ. 2013, 21, 371–413.
  8. Evagorou, M.; Korfiatis, K.; Nicolaou, C.; Constantinou, C. An investigation of the potential of interactive simulations for developing system thinking skills in elementary school: A case study with fifth-graders and sixth-graders. Int. J. Sci. Educ. 2009, 31, 655–674.
  9. Klopfer, E.; Yoon, S.; Um, T. Teaching complex dynamic systems to young students with StarLogo. J. Comput. Math. Sci. Teach. 2005, 24, 157–178.
  10. Wilensky, U.; Reisman, K. Thinking like a wolf, a sheep, or a firefly: Learning biology through constructing and testing computational theories—An embodied modeling approach. Cogn. Instr. 2006, 24, 171–209.
  11. Wilensky, U.; Resnick, M. Thinking in levels: A dynamic systems approach to making sense of the world. J. Sci. Educ. Technol. 1999, 8, 3–19.
  12. Yoon, S.A.; Goh, S.E.; Park, M. Teaching and learning about complex systems in K–12 science education: A review of empirical studies 1995–2015. Rev. Educ. Res. 2018, 88, 285–325.
  13. Macaro, E.; Curle, S.; Pun, J.; An, J.; Dearden, J. A systematic review of English medium instruction in higher education. Lang. Teach. 2018, 51, 36–76.
  14. National Center for Education Statistics. The Condition of Education 2020; U.S. Department of Education: Washington, DC, USA, 2020. Available online: https://nces.ed.gov/programs/coe/indicator_cgf.asp (accessed on 1 October 2020).
  15. Lee, O.; Quinn, H.; Valdés, G. Science and language for English language learners in relation to Next Generation Science Standards and with implications for Common Core State Standards for English language arts and mathematics. Educ. Res. 2013, 42, 223–233.
  16. National Academies of Sciences, Engineering, and Medicine. English Learners in STEM Subjects: Transforming Classrooms, Schools, and Lives; The National Academies Press: Washington, DC, USA, 2018.
  17. American Association for the Advancement of Science. Science for All Americans: A Project 2061 Report on Literacy Goals in Science, Mathematics, and Technology; American Association for the Advancement of Science: Washington, DC, USA, 1989.
  18. American Association for the Advancement of Science. Benchmarks for Science Literacy; Oxford University Press: Oxford, UK, 1993.
  19. National Research Council. National Science Education Standards; National Academy Press: Washington, DC, USA, 1996.
  20. Shute, V.J.; Masduki, I.; Donmez, O. Conceptual framework for modeling, assessing and supporting competencies within game environments. Technol. Instr. Cogn. Learn. 2010, 8, 137–161.
  21. Arnold, R.D.; Wade, J.P. A definition of systems thinking: A systems approach. Procedia Comput. Sci. 2015, 44, 669–678.
  22. Cox, M.; Elen, J.; Steegen, A. Systems thinking in geography: Can high school students do it? Int. Res. Geogr. Environ. Educ. 2019, 28, 37–52.
  23. Hmelo-Silver, C.E.; Azevedo, R. Understanding complex systems: Some core challenges. J. Learn. Sci. 2006, 15, 53–61.
  24. Jacobson, M.J. Problem solving, cognition, and complex systems: Differences between experts and novices. Complexity 2001, 6, 4–49. [Google Scholar] [CrossRef]
  25. Jacobson, M.; Wilensky, U. Complex systems in education: Scientific and educational importance and implications for the learning sciences. J. Learn. Sci. 2006, 15, 11–34. [Google Scholar] [CrossRef] [Green Version]
  26. Ben-Zvi Assaraf, O.; Orion, N. System thinking skills at the elementary school level. J. Res. Sci. Teach. 2010, 47, 540–563. [Google Scholar]
  27. Repenning, A.; Ioannidou, A.; Luhn, L.; Daetwyler, C.; Repenning, N. Mr. Vetro: Assessing a collective simulation framework. J. Interact. Learn. Res. 2010, 21, 515–537. [Google Scholar] [CrossRef]
  28. Yoon, S.A.; Koehler-Yom, J.; Anderson, E.; Lin, J.; Klopfer, E. Using an adaptive expertise lens to understand the quality of teachers’ classroom implementation of computer-supported complex systems curricula in high school science. Res. Sci. Technol. Educ. 2015, 33, 237–251. [Google Scholar] [CrossRef] [Green Version]
  29. Basu, S.; Sengupta, P.; Biswas, G. A scaffolding framework to support learning of emergent phenomena using multi-agent-based simulation environments. Res. Sci. Educ. 2015, 45, 293–324. [Google Scholar] [CrossRef]
  30. Railsback, S.; Grimm, V. Agent-Based and Individual-Based Modeling: A Practical Introduction; Princeton University Press: Princeton, NJ, USA, 2019. [Google Scholar]
  31. Hsiao, L.; Lee, I.; Klopfer, E. Making sense of models: How teachers use agent-based modeling to advance mechanistic reasoning. Br. J. Educ. Technol. 2019, 50, 2203–2216. [Google Scholar] [CrossRef]
  32. Levy, S.T.; Wilensky, U. Inventing a “mid level” to make ends meet: Reasoning between the levels of complexity. Cogn. Instr. 2008, 26, 1–47. [Google Scholar] [CrossRef]
  33. Grapin, S.E. Multimodality in the new content standards era: Implications for English learners. TESOL Q. 2019, 53, 30–55. [Google Scholar] [CrossRef] [Green Version]
  34. Grapin, S.E.; Llosa, L.; Haas, A.; Goggins, M.; Lee, O. Precision: Toward a meaning-centered view of language use with English learners in the content areas. Linguist. Educ. 2019, 50, 71–83. [Google Scholar] [CrossRef]
  35. Cobb, P.; Confrey, J.; DiSessa, A.; Lehrer, R.; Schauble, L. Design experiments in educational research. Educ. Res. 2003, 32, 9–13. [Google Scholar] [CrossRef]
  36. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches; Sage Publications: Thousand Oaks, CA, USA, 2017. [Google Scholar]
  37. Leech, B.L. Asking questions: Techniques for semistructured interviews. Political Sci. Politics 2002, 35, 665–668. [Google Scholar] [CrossRef] [Green Version]
  38. Brady, C.; Holbert, N.; Soylu, F.; Novak, M.; Wilensky, U. Sandboxes for model-based inquiry. J. Sci. Educ. Technol. 2015, 24, 265–286. [Google Scholar] [CrossRef]
  39. National Research Council. Taking Science to School: Learning and Teaching Science in Grades K-8; The National Academies Press: Washington, DC, USA, 2007. [Google Scholar]
  40. National Research Council. Ready, Set, Science! Putting Research to Work in K-8 Science Classrooms; The National Academies Press: Washington, DC, USA, 2008. [Google Scholar]
41. Lee, O.; Eichinger, D.C.; Anderson, C.W.; Berkheimer, G.D.; Blakeslee, T.D. Changing middle school students’ conceptions of matter and molecules. J. Res. Sci. Teach. 1993, 30, 249–270.
42. Driver, R.; Squires, A.; Rushworth, P.; Wood-Robinson, V. Making Sense of Secondary Science: Research into Children’s Ideas; Routledge: London, UK, 1994.
43. Johnson, P. Progression in children’s understanding of a “basic” particle theory: A longitudinal study. Int. J. Sci. Educ. 1998, 20, 393–412.
44. Merritt, J.; Krajcik, J. Learning progression developed to support students building a particle model of matter. In Concepts of Matter in Science Education; Tsaparlis, G., Sevian, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 11–45.
45. Levy, S.T.; Wilensky, U. Mining students’ inquiry actions for understanding of complex systems. Comput. Educ. 2011, 56, 556–573.
46. Grotzer, T.A.; Powell, M.M.; Derbiszewska, K.M.; Courter, C.J.; Kamarainen, A.M.; Metcalf, S.J.; Dede, C.J. Turning transfer inside out: The affordances of virtual worlds and mobile devices in real world contexts for teaching about causality across time and distance in ecosystems. Technol. Knowl. Learn. 2015, 20, 43–69.
47. Hmelo-Silver, C.; Pfeffer, M.G. Comparing expert and novice understanding of a complex system from the perspective of structures, behaviors, and functions. Cogn. Sci. 2004, 28, 127–138.
48. Dickes, A.C.; Sengupta, P. Learning natural selection in 4th grade with multi-agent-based computational models. Res. Sci. Educ. 2013, 43, 921–953.
49. Mohan, L.; Chen, J.; Anderson, C.W. Developing a multi-year learning progression for carbon cycling in socio-ecological systems. J. Res. Sci. Teach. 2009, 46, 675–698.
50. Songer, N.B.; Kelcey, B.; Gotwals, A.W. How and when does complex reasoning occur? Empirically driven development of a learning progression focused on complex reasoning about biodiversity. J. Res. Sci. Teach. 2009, 46, 610–613.
51. Ben-Zvi Assaraf, O.; Orion, N. Development of system thinking skills in the context of earth science system education. J. Res. Sci. Teach. 2005, 42, 518–560.
Figure 1. Physical model of landfill bottles in open and closed systems.
Figure 2. Computational model to explain how microbes decompose food materials in a closed landfill bottle system.
Table 1. Codes and Examples for Systems Thinking Practices.
Code Related to Systems Thinking Practice | Example Excerpt from Student Interviews

Investigating a complex system as a whole
- “The total weight will come right here (points to the total weight data box) and see it just stay the same because the solid banana is turning to the gas.”
- “This total weight going up doesn’t make sense because we haven’t added anything to the system.”
- “It’s (points to the banana gas data box) getting more larger like 262, but the total weight, the banana still there it hasn’t disappear, it’s just turning gas. See, it’s still 500 (points to the total weight data box).”

Understanding the relationships within a system
- “The agents in my model are the microbes, banana solid, and banana gas.”
- “I want the microbes to move forward so they can get to the banana solid.”
- “When the microbes touches the banana, the banana turns the banana solid to gas.”

Thinking in levels
- “We saw the total weight was going up, so we started working on the microbe to fix it.”
- “First we checked the microbe code, and then we ran the model so we could see if it worked.”
- “After we changed the microbes, we had to check to make sure it works.”

Communicating information about a system
- “The solid banana it gets eated by the microbe.”
- “This goes up and up and up (pointing to the banana gas data box).”
- “When the microbes (makes fist with hand) got collision with the banana solid (moves fist to other hand and clasps hands together).”
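For readers unfamiliar with agent-based modeling, the sketch below illustrates the kind of agent rule the students describe in Table 1: microbes move about, and when one collides with banana solid, that bit of solid becomes banana gas, while the aggregate-level output (total weight of the closed system) stays constant. This is a minimal Python analogue written for illustration only; the students built their models in a block-based, agent-based environment, and the grid size, agent counts, and weights here are hypothetical.

import random

# Minimal, hypothetical sketch of the agent rule in Table 1:
# microbes wander on a small grid; when a microbe lands on a cell with
# banana solid, one unit of solid converts to banana gas. In a closed
# system, the total weight (solid + gas) stays constant every step.

GRID = 10
STEPS = 50
random.seed(1)

# Hypothetical starting amounts: 500 g of banana solid spread over 100 cells.
solid = {(x, y): 5 for x in range(GRID) for y in range(GRID)}
gas = 0
microbes = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(20)]

def total_weight():
    # Aggregate-level quantity, analogous to the total weight data box.
    return sum(solid.values()) + gas

print("start total weight:", total_weight())

for step in range(STEPS):
    moved = []
    for (x, y) in microbes:
        # Agent-level rule: move one cell in a random direction (wrapping grid).
        x = (x + random.choice([-1, 0, 1])) % GRID
        y = (y + random.choice([-1, 0, 1])) % GRID
        # On collision with banana solid, convert 1 g of solid to gas.
        if solid.get((x, y), 0) > 0:
            solid[(x, y)] -= 1
            gas += 1
        moved.append((x, y))
    microbes = moved

print("end total weight:", total_weight())  # unchanged: matter is conserved
print("banana gas produced:", gas)

Running the sketch prints the same total weight before and after, which mirrors the reasoning in the first row of Table 1: the banana “hasn’t disappear, it’s just turning gas.”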
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
