Applying Lessons Learned to Simulation-Based Reflection in a Digital Circuits Course
Abstract
This study implemented a unique method for promoting reflection among engineering students in a digital circuits course. The method combines computer-based simulation of digital circuit designs with reflective-thought prompts for post-exam analysis and reflection following a midterm exam. The method was first implemented in a microelectronics course using the SPICE simulator, and lessons learned from that initial implementation were applied to the digital circuits course: the need to scaffold students in the use of the simulation tool for reflection, the need to balance the frequency of reflection against student workload and fatigue, and the value of administering prompts that elicit broad thought after a milestone event such as a midterm exam (versus a quiz). Using a published depth rubric, the assessment found increased depth of reflection in the present course relative to the initial implementation in microelectronics. Specifically, depth of reflection after the midterm exam in the present course exceeded that after the midterm exam and two quizzes in the microelectronics course, and the increases were significant relative to the quizzes. The relative occurrence of broad reflections also increased in the present course, again with significant differences compared to the quizzes. Although no significant differences were found in final exam averages based on depth of reflection after the midterm exam or on participation in this reflection, results from a follow-up survey administered several months after the course ended indicated benefit for students: 80% of those who completed the reflection exercise reported a high or very high perceived benefit. The primary reasons given by the approximately 50% of students who chose not to complete the reflection exercise were also identified via the follow-up survey. Findings from this work align with and add to the developing literature on student reactions to reflection.
Keywords
Reflection, Metacognition, Computer-Aided Design, Simulation, Digital Circuits
Introduction
Reflection can be defined as thinking about what one is doing [1]. Kolb's Experiential Learning Theory holds that learning occurs through doing plus reflecting on the doing; therefore, reflection is necessary for learning [2]. A second supporting theory, Schön's Reflective Practitioner Theory, states that reflection provides professionals with skills for solving complex, real-world problems and gaining a deeper understanding of the design problem [3]. Reflection is closely linked to metacognition, which fosters the self-regulated learning that is so important for one's career, in higher education, and in any new scenario [4]-[8]. In particular, regular, repeated reflection promotes the development of metacognitive knowledge and skills [9], [6]. Individuals who reflect and develop metacognitive skills tend to have self-directed, lifelong learning abilities, including assessment of the task at hand, evaluation of one's skill level for completing the task, monitoring of task progress, and self-adjustment as needed [4]-[7].
The present NSF-funded study implemented a unique method for cultivating reflection and metacognition among engineering students. The method combines computer-based simulation for circuit design with reflective-thought prompts and was first implemented in a microelectronics course using the SPICE simulator [10]. In microelectronics, students must analyze circuits with complex, non-linear components (e.g., diodes, transistors, logic gates), which is much more difficult than analyzing the linear circuits introduced in physics courses. Therefore, after each quiz and the midterm exam, students used SPICE during the next class period to reflect on their performance by comparing their hand calculations to the simulated values. In this way, they could identify errors and improvement opportunities. Students essentially "re-took" the quiz or exam by building the circuit schematic in SPICE, setting various parameters, running the simulation, and identifying any differences between the simulated values and their initial quiz/exam answers. They then responded to the following reflective prompts in writing: "How is my solution different from the provided solution?" and "How can I use this information to improve my performance in the future?" (Benson & Zhu, 2015; Claussen & Dave, 2017).
Following this initial work in microelectronics, we applied several lessons learned in a separate course on digital circuits, where the same approach to promoting reflection using simulation and question prompts was implemented. One lesson learned was the need to scaffold students in the use of the simulation software by instructing them on the setup of the simulation. Focus group and survey results in the microelectronics course revealed that students struggled to complete the reflective exercises due to the complexity and learning curve of SPICE, which is professional-grade software. Also, based on analysis of the students' responses to the reflective prompt across the six quizzes and midterm exam in the microelectronics course, we investigated the use of a reduced amount of reflection in the digital circuits course: the reflective exercise was given only after the midterm exam, a higher-stakes assessment than the individual quizzes. In the microelectronics course, we suspected student fatigue in responding to the same reflective prompt after multiple quizzes, which may have been a limiting factor.
For the digital circuits course, which is the focus of this paper, we applied the simulation-driven reflection method in a different simulation environment. SPICE is not applicable to digital logic circuits; nonetheless, simulation tools are used extensively in digital circuit design. Typically, digital circuits are modeled using a hardware description language (HDL), such as VHDL or Verilog, and the HDL models are then simulated on a logic simulation platform. In this study, we employed VHDL for modeling and ModelSim for simulation. Since logic circuits are often large in scale and complexity, logic simulation is used rather than transistor-level simulation.
Digital logic courses are common, required parts of all electrical and computer engineering curricula. In these courses, students study topics ranging from Boolean algebra and logic gates to the fundamentals of computer organization. Based on the author's experience, the topic students struggle with most is sequential logic circuits (e.g., flip-flops, memories, finite state machines). Sequential logic circuits require students to keep track of the inputs and the state history, in contrast to combinational logic, where the output is purely a function of the circuit inputs. The added complexity that students face in analyzing sequential logic circuits is illustrated in Figure 1, which shows one of the most fundamental sequential logic circuits, an RS-latch. For this circuit, the output nodes (Q and QB) are fed back to the inputs of the logic gates that produce the outputs. Thus, to determine the output at any point in time, a student must know the inputs (R and S) as well as the outputs from the previous state (e.g., Q(t-1)). This can be tricky for students to analyze even in the simple case of Figure 1, and it becomes significantly more challenging as circuit complexity increases (e.g., flip-flops, registers, counters). Simulation tools are of great assistance to students in these cases, as they provide a simple means to visualize the transient behavior of circuit inputs and outputs over time, as well as to rapidly explore various input scenarios.
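To make the feedback concrete, a minimal VHDL sketch of a cross-coupled NOR RS-latch is shown below. This is an illustrative model only, not the exact circuit of Figure 1 or a course template, and the entity and signal names are hypothetical.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity rs_latch is
  port (
    R  : in  std_logic;   -- reset input
    S  : in  std_logic;   -- set input
    Q  : out std_logic;
    QB : out std_logic
  );
end entity rs_latch;

architecture dataflow of rs_latch is
  -- Internal copies are needed because VHDL "out" ports cannot be read
  -- back inside the architecture. Initialized to a consistent state so
  -- the feedback loop settles at time zero.
  signal q_int  : std_logic := '0';
  signal qb_int : std_logic := '1';
begin
  -- Cross-coupled NOR gates: each output feeds back into the other gate.
  -- This feedback makes the circuit sequential: Q at time t depends on
  -- R, S, and the previous state Q(t-1). A small gate delay keeps the
  -- settling visible in the simulator's waveform view.
  q_int  <= R nor qb_int after 1 ns;
  qb_int <= S nor q_int  after 1 ns;

  Q  <= q_int;
  QB <= qb_int;
end architecture dataflow;
```

The internal feedback signals make explicit why the output at any time depends on the previous state, the very property students find difficult to track by hand and that a waveform viewer makes visible.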
The goal of this work is two-fold. First, the authors aimed to improve student understanding of sequential logic circuits using the simulation-guided reflection method. A secondary goal was to improve the method itself by applying lessons learned from the previous study in microelectronics. Specifically, we aimed to improve the reflection method in two ways: 1) scaffold students to a greater degree in the use and setup of simulation for reflection, and 2) engage students to a greater degree in reflection by establishing conditions conducive to reflecting, such as after a milestone event (e.g., a midterm exam). The following research questions are examined in this study:
- RQ1) Do students reflect more deeply and broadly after milestone events?
- RQ2) Do students perceive simulation-guided reflection as beneficial?
Literature Review
Reflection is defined as thinking about what one is doing, and it is necessary for learning, since Kolb's Experiential Learning Theory tells us that learning occurs through doing and reflecting on the doing (Bishop-Clark & Dietz-Uhler, 2012; Kolb & Kolb, 2009). A second relevant theory, Schön's Reflective Practitioner Theory, states that reflection furnishes designers and other professionals with skills for solving complex problems, likely enabling deeper understanding of the problem (Schön, 1987). Reflection is closely linked to metacognition, an important component of an engineering education since it fosters the self-directed, lifelong learning abilities needed in any new situation (Ambrose, 2013; Ambrose, Bridges, Dipietro, Lovett, & Norman, 2010; Jamieson & Shaw, 2019; Marra, Kim, Plumb, Hacker, & Bossaller, 2017; Steiner & Foote, 2017).
Unfortunately, despite the known benefits, reflection and metacognition are typically not formally cultivated as part of an engineering education. Education scholars have called this out and suggested that more research involving reflection and metacognition in the curriculum should be published (Ambrose, 2013; Ambrose et al., 2010; Csavina, Nethken, & Carberry, 2016; Cunningham, Matusovich, Hunter, & Mccord, 2015; Marra et al., 2017). Susan Ambrose called for "opportunities for reflection to connect thinking and doing," since students learn only when they reflect on what they have done [5, pp. 17, 20]. Ambrose continued, "Why, then, don't engineering curricula provide constant structured opportunities and time to ensure that continual reflection takes place?" [5, p. 20].
Metacognition is the act of thinking about one's thinking or knowing about one's knowing. A metacognitive individual can adjust or control his or her learning through various self-regulating behaviors (Steiner & Foote, 2017; Turns, Mejia, & Atman, 2020). Metacognition consists of the following two main components:
- Knowledge about one's knowledge or thinking processes, and
- Self-regulation (i.e., self-control) of one's thinking processes or learning (Cunningham et al., 2015; Schraw, 1998; Wengrowicz, Dori, & Dori, 2018).
In a classic article, three elements of the first component of metacognition (i.e., knowledge) were identified: knowledge of person, task, and strategy (Flavell, 1979). The second component of metacognition includes the self-regulating elements of planning, monitoring, and evaluating one's work on a task (Cunningham et al., 2015). Fortunately, an instructor can intentionally and easily promote metacognitive skills through practices such as reflective writings and post-exam reviews by students (Ambrose et al., 2010; Schraw, 1998; Steiner & Foote, 2017). It has been recommended that metacognitive instruction be embedded directly within regular content lessons (Pintrich, 2002).
Both self-evaluation and self-adjustment are associated with self-reflective behavior (Ambrose et al., 2010; Zimmerman, 2002). Regular, repeated, reflection is important in the development of metacognitive knowledge and skills, and reflective questions requiring a written or verbal response can promote metacognition (Schraw, 1998; Steiner & Foote, 2017). Questions from the Exam Analysis and Reflection (EAR) technique were used as the basis for the reflective questions used in the present study (Benson & Zhu, 2015; Claussen & Dave, 2017). The EAR technique prompts students to reflect as follows: “How is my solution different from the provided solution?”, and “How can I use this information to improve my performance in the future?”
Turns, Atman, and colleagues are key researchers of reflection and have developed a survey as part of an NSF grant on reflection (Award No. 1733474), with the survey focused on student reactions and resistance to reflection (Mejia, Turns, & Roldan, 2020; Turns et al., 2020). They explain the importance of investigating these student reactions, as this information can be used to improve reflective exercises, identify why a reflective exercise may not be working as expected, and ultimately enhance engagement and knowledge gains (Mejia et al., 2020; Turns et al., 2020). They identified the following student reactions and contributing factors (among others): effort and time involved, competing obligations, perceived usefulness of reflection, optionality, comfort level, and perceived need and importance (Mejia et al., 2020). Turns and Atman are core team members of CPREE, the Consortium to Promote Reflection in Engineering Education, which was funded by the Helmsley Charitable Trust [21].
Methods and Context
Course Methods
This study was conducted in a sophomore-level digital circuits course in the fall of 2020. The student population (N = 61) comprised electrical and computer engineering majors. The structure of the course was typical, with two lectures per week plus an additional hands-on laboratory session. In the lab, students completed several assignments using an HDL, so by the time they were asked to reflect, students were very familiar with it. The assessments given in the course included homework, weekly quizzes, lab assignments, and three examinations. This study occurred during the COVID-19 pandemic; thus, the examinations were taken online, and assessments were open-book.
In our initial implementation of simulation-based reflection in a microelectronics course, students were asked to reflect after each of six quizzes and a midterm exam [10]. In the present course (i.e., digital circuits), the frequency of the reflective exercise was reduced to a single administration, which occurred after the midterm exam. This was done to investigate the potential issue of student fatigue in responding to the same reflection question over time, which was believed to have been a factor in the microelectronics course. Care was also taken to ensure that the reflection exercise was administered following a significant event (i.e., the midterm exam). This particular exam was selected because students had to demonstrate their knowledge of basic sequential logic circuits, which was the foundation for topics presented later in the course, including counters, finite state machines, memories, and datapath control.
There were some key differences as well as similarities in the use of simulation-guided reflection in the two courses. First, the circuits analyzed by students in the digital circuits course did not require extensive mathematical calculations (i.e., calculus and differential equations); rather, the analysis relied on a solid foundation in Boolean algebra and logic and on intuition about the circuit's intended operation. Second, the computer-aided simulation environment was different. Digital circuits are simulated using a hardware description language (HDL) along with a logic simulator, whereas analog circuits require SPICE for simulation. Describing circuits in an HDL requires students to craft both the components used and the overall simulation scenario (i.e., the testbench), whereas users of SPICE rarely craft models of the components in a simulation and typically only create schematics and set parameters. While graphical entry tools do exist for digital circuits, students in this class were asked to model their circuits using plain-text VHDL files, as sketched below.
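To illustrate what crafting the overall simulation scenario entails in an HDL flow, a minimal VHDL testbench sketch is given below. It reuses the hypothetical rs_latch entity from the earlier sketch; none of this is the course's actual template.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity tb_rs_latch is
end entity tb_rs_latch;

architecture sim of tb_rs_latch is
  signal R, S, Q, QB : std_logic := '0';
begin
  -- Instantiate the circuit under test (direct entity instantiation).
  dut : entity work.rs_latch
    port map (R => R, S => S, Q => Q, QB => QB);

  -- The "simulation scenario": drive a sequence of inputs and let the
  -- simulator show how the outputs evolve over time.
  stimulus : process
  begin
    wait for 10 ns;            -- hold:  latch retains its initial state
    S <= '1'; wait for 10 ns;  -- set:   expect Q = '1'
    S <= '0'; wait for 10 ns;  -- hold:  Q should remain '1'
    R <= '1'; wait for 10 ns;  -- reset: expect Q = '0'
    R <= '0'; wait for 10 ns;  -- hold:  Q should remain '0'
    wait;                      -- end of stimulus
  end process stimulus;
end architecture sim;
```

In SPICE, by contrast, the equivalent scenario would typically be captured as a schematic plus source and parameter settings rather than as hand-written stimulus code.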
Reflective Exercise
The midterm exam contained 10 problems, primarily covering basic sequential logic circuits. Shortly after this midterm (the second of the three examinations), the ungraded exams were returned to the students. Exams were returned ungraded so that the reflection exercise would not reduce to a simple comparison of right versus wrong answers; rather, students were encouraged to revisit the steps they took to arrive at their answers and to think critically about their results. Students were given guidance in using the simulation tool to reflect on each of the 10 problems. Participation in the reflection exercise was voluntary, and students who completed it were awarded extra credit.
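To give a flavor of this guidance, a minimal ModelSim run script of the kind that might accompany a problem is sketched below; the file and entity names reuse the hypothetical RS-latch example above and are not the course's actual scripts.

```tcl
# Hypothetical ModelSim run script; a minimal sketch of the kind of
# scaffolding provided, not the course's actual templates.
vlib work
# Compile the circuit model and the testbench into the work library.
vcom rs_latch.vhd tb_rs_latch.vhd
# Load the testbench, then display all top-level signals as waveforms.
vsim work.tb_rs_latch
add wave *
# Simulate long enough to cover the full stimulus sequence.
run 50 ns
```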
Figures 2-5 illustrate the reflection process used in the digital circuits course, highlighting one of the 10 exam problems. The exam was administered online (Figure 2), and students worked through the problems using pen and paper before uploading their final responses (Figure 3). Since the emphasis of the exam was on sequential logic circuits, most problems were best solved by considering transient output signal waveforms before calculating final output values. Figure 4 shows the simulation guidance students were given during the reflection process. Similar guidance was provided for each of the 10 questions, along with VHDL templates and simulation scripts. This additional scaffolding was included in response to student feedback from the microelectronics course, where students faced hurdles in using SPICE simulation (e.g., software issues, simulation setup) that were not relevant to the exercise at hand. Such hurdles were thought to overwhelm students and discourage participation in reflection. Figure 4 also shows the results of a student simulation and the evaluation of the simulation results. Simulations were carried out using the ModelSim logic simulation environment. The simulation result provided a baseline against which students could compare their answers and re-evaluate their work. Finally, after carrying out similar analyses for each exam problem, students were asked to respond to the following reflective prompt:
Q: Please discuss anything you learned from completing this comparison exercise.
Figure 5 shows a reflection written by a student after completing the simulation exercise. The wording of the reflective prompt was carefully chosen so as not to bias or lead students in their responses. Some students composed thoughtful, critical reflections, while others submitted responses that may be considered shallow and/or lacking in detail. Furthermore, some student reflections contained a great amount of detail but focused on content specific to the course material rather than on how they might improve as students. Due to the subjective nature of the responses, great care was taken in assessing them using structured qualitative methods; the assessment methods used to categorize the responses are described in the next subsection.
Assessment of Student Reflections
A qualitative analysis of the responses to the reflective prompt ("Please discuss anything you learned from completing this comparison exercise.") was conducted by two analysts (i.e., the author and co-author). The analysis used a rubric to assess the depth of each reflection, as well as a coding scheme to categorize each reflection as broad, specific, or both. The level/depth rubric was obtained from the literature and consists of four categories: 1) non-reflection, 2) understanding, 3) reflection, and 4) critical reflection (Kember, Mckay, Sinclair, & Wong, 2008). A level 1 statement (i.e., non-reflection) is characterized by a lack of serious thought or lack of evidence of understanding of a concept or theory. A level 2 statement exhibits understanding of a concept or topic, but the reflection is confined to theory or textbook material without relation to real-life matters. A level 3 statement exhibits personal insights that extend beyond book theory by discussing practical situations. A level 4 statement, which occurs rarely, exhibits evidence of a change in perspective surrounding a fundamental belief in the understanding of a concept.
The coding scheme of Table 1 was used to characterize each reflection as broad, specific, or possibly both. This coding scheme was adapted from earlier work by the authors (Dickerson, Clark, & Jiang, 2020). The “specific” versus “broad” categorization might be compared to the concepts of “near” versus “far” transfer. “Near” transfer occurs when the new setting or context in which one’s learning or skills are applied is similar to the original setting, and “far” transfer occurs when skills are used in a broader range of applications or dissimilar contexts (Ambrose et al., 2010; Marra et al., 2017).
Table 1. Coding scheme for categorizing reflection content as broad or specific.

| Category | Description |
|---|---|
| Broad | Need for care/thought in one's work; think before answering |
| | Confidence enhanced |
| | Want to learn from mistakes / avoid in future |
| | Review work multiple times |
| | Review/reflect on work to fully understand or verify, including with simulator |
| | Review to refresh knowledge |
| | More time/effort needed for study/review |
| Specific | Enhanced understanding or application of course content, including analysis methods |
| | Identification of errors, including mathematical |
| | Simulator know-how or knowledge |
All reflections were double coded to ensure reliability. The authors independently analyzed and coded all reflections. They then compared their codes and engaged in discussion to reach consensus when there was initial disagreement. The inter-rater reliability based on the intra-class correlation coefficient (ICC) for the numerical depth ratings was 0.965 based on average measures and 0.933 based on single measures, which are each associated with excellent reliability (Fleiss, 1986; Lexell & Downham, 2005).
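For reference, a common single-measures form of the ICC, the two-way random-effects (Shrout-Fleiss) form, is computed from the rater-by-subject ANOVA mean squares as shown below; the paper does not state which variant was used, so this form is an assumption:

$$\mathrm{ICC}(2,1) = \frac{MS_R - MS_E}{MS_R + (k-1)\,MS_E + \frac{k}{n}\left(MS_C - MS_E\right)}$$

where $MS_R$, $MS_C$, and $MS_E$ are the mean squares for subjects (reflections), raters, and error, $k = 2$ is the number of raters, and $n$ is the number of reflections rated.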
Comparison of Final Exam vs Reflection Depth
Statistical analyses were carried out to determine whether a relationship existed between final exam score and the depth to which students had reflected in the post-midterm reflection exercise. This analysis was done using Welch's F-test, a variant of analysis of variance that does not assume equal variances. The analogous non-parametric test, the Kruskal-Wallis test, was also run, given the small sample size associated with one of the depth levels. A similar analysis was conducted between final exam score and participation (yes/no) in the post-midterm reflection exercise; this analysis used an independent samples t-test, corroborated by the analogous non-parametric Mann-Whitney test (Norusis, 2005).
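For reference, Welch's statistic weights each group by the inverse of its variance rather than assuming a pooled variance (standard form):

$$F_W = \frac{\frac{1}{k-1}\sum_{j=1}^{k} w_j\,(\bar{x}_j - \bar{x}_w)^2}{1 + \frac{2(k-2)}{k^2-1}\sum_{j=1}^{k}\frac{\left(1 - w_j/\sum_i w_i\right)^2}{n_j - 1}}, \qquad w_j = \frac{n_j}{s_j^2}, \quad \bar{x}_w = \frac{\sum_j w_j\,\bar{x}_j}{\sum_j w_j},$$

where $k$ is the number of groups and $n_j$, $\bar{x}_j$, and $s_j^2$ are each group's size, mean, and variance.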
Follow-up Survey
After the conclusion of the course, a short, anonymous follow-up survey was administered. The purpose of the survey was to assess the impact of the reflective exercise as perceived by students several months later. A second purpose was to determine why students chose not to participate in the exercise, since approximately half of the students had not participated. A list of possible reasons for not participating was presented to students; these reasons were informed by recent research on student reactions to and resistance towards reflection in the engineering classroom (Mejia et al., 2020). Students who had been enrolled in the course were contacted via e-mail approximately 8 months after the course ended and were reminded of the exercise through images embedded within the survey. Students were asked the following survey questions:
- Did you submit the simulation-based reflection exercise after the midterm? (Yes / No / Don't recall)
- If "Yes": Indicate the degree to which the reflection exercise was beneficial to you as a student. (1-Not at all, 2-Low benefit, 3-Neutral, 4-High benefit, 5-Very high benefit)
- If "No": Please indicate your primary reason for not completing the reflection exercise:
  - Amount of effort or time involved to complete it, or a lack of time on my part.
  - The reflection exercise required me to write.
  - It was an optional assignment, or I was doing well in this course at that time, so I didn't need to participate.
  - The reflection exercise has minimal usefulness for this course or for my engineering education in general.
  - The reflection exercise made me go outside my comfort zone or feel exposed.
  - Other (textual entry allowed).
Results
Assessment of Student Reflections
Table 2 summarizes the results of the analysis of student reflections for content and depth in both the microelectronics and digital circuits courses. For the digital circuits course, 83% of the submitted reflections after the midterm exam contained content characterized as having broad implications, while 41% had specific implications. These percentages aligned with the results obtained after the midterm exam in the microelectronics course, where 74% of students’ responses contained broad content and 45% contained specific content. However, following the two quizzes in the microelectronics course, the percentages of broad responses were much smaller at 55% and 41%, respectively. These proportions were each significantly different from the proportion of responses classified as broad after the midterm exam in the digital circuits course (i.e., 83%). This was based on a z-test of proportions, with p = 0.009 and p < 0.0005 associated with quiz 3 and quiz 6, respectively. This result indicates that the perceived importance of the event preceding the reflection (i.e., a midterm exam) may impact the degree to which students think broadly about themselves, their preparation, and their performance. Thus, reflection after a milestone event, such as a midterm exam versus a quiz, may encourage students to reflect more broadly and generally.
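For reference, the two-proportion z-test in its standard pooled form (the paper reports only the resulting p-values) is

$$z = \frac{\hat{p}_1 - \hat{p}_2}{\sqrt{\hat{p}\,(1-\hat{p})\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}}, \qquad \hat{p} = \frac{x_1 + x_2}{n_1 + n_2}.$$

Using the counts in Table 2 for the digital circuits midterm (24/29 broad) versus microelectronics quiz #3 (38/69 broad): $\hat{p} = 62/98 \approx 0.633$, giving $z \approx 2.60$ and a two-tailed $p \approx 0.009$, consistent with the value reported above.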
Similar outcomes were found with the depth coding. In the digital circuits course, the average depth level of the post-midterm reflections was 2.83, whereas it was 2.69 in the microelectronics course. As shown in Table 2, the depth averages after the two midterm exams were each higher than the depth averages after the two quizzes in the microelectronics course (i.e., 2.34 and 2.20, respectively, for quiz #3 and quiz #6). This suggests that reflection after a milestone event such as a midterm exam, versus a quiz, may also be successful in motivating students to reflect to a greater depth.
Table 2. Assessment of student reflections for depth and content in both courses.

| Course | Reflection After | n | Average Depth | Broad # (%) | Specific # (%) |
|---|---|---|---|---|---|
| Microelectronics | Quiz #3 | 69 | 2.34 | 38 (55%) | 37 (54%) |
| Microelectronics | Midterm | 82 | 2.69 | 61 (74%) | 37 (45%) |
| Microelectronics | Quiz #6 | 51 | 2.20 | 21 (41%) | 27 (53%) |
| Digital Circuits | Midterm | 29 | 2.83 | 24 (83%) | 12 (41%) |
A Welch's analysis of variance test found significant differences in the reflective depth averages across the four assessments (p < 0.0005) (Norusis, 2005). Based on the Games-Howell paired comparisons test, there was a significant difference in depth between each midterm reflection and each quiz reflection. In Figure 6, ELEC MID and DL MID refer to the microelectronics and digital logic/circuits midterm reflections, respectively, while ELEC Q3 and ELEC Q6 represent the microelectronics quiz 3 and quiz 6 reflections. As shown in Figure 6, DL MID differed from each of Q3 and Q6, since the confidence intervals for the differences did not contain zero. The same was true for ELEC MID, which differed from each of Q3 and Q6, since those confidence intervals also did not contain zero.
Examples of level 2 and level 3 responses are given below. There were no non-blank level 1 reflections.
- (Level 2) "I learned how clock cycles are supposed to work (I was confused on the exam). I learned that the critical path is the fastest path possible in a circuit, I didn't realize that included an undefined answer. I also learned how clock cycles can be triggered by different things and how a line of multiple different d-flip-flips are triggered in a row."
- (Level 3) "I learned a lot by completing the comparison of problem 4. The critical path delay, I assumed the critical path delay of the adders could be added together with no consequence. By forgetting the limitations of the inputs I really shot myself in the foot. This is a classic example of moving too quickly without really thinking about the question. We did several of these in class, so my brain wrote them off as basic and not worthy of my attention. Going too quickly and ignoring critical information has tripped me up many times before and it quite difficult to prepare for in my opinion. Despite this, I hope to correct this type of mistake on the next exam and in the future in general."
Exam Performance vs Reflection Depth and Participation
Table 3 summarizes the analysis of final exam scores in the digital logic course versus students' reflective depth levels after the midterm exam. The final exam score for this analysis was based on the four problems most directly related to the content of the midterm exam and post-midterm reflection. As shown in the table, there were no significant differences in exam scores across the three reflective depth levels: based on Welch's F-test, the p-value was 0.53, corroborated by the non-parametric Kruskal-Wallis test (p = 0.66). Although a greater reflective depth level was hypothesized to be associated with a significantly higher final exam score, this was not the case.
Table 3. Final exam scores versus reflective depth level.

| Depth Level | n | Mean | Std. Dev. |
|---|---|---|---|
| 1 | 31 | 43.8 | 4.8 |
| 2 | 5 | 41.3 | 6.7 |
| 3 | 24 | 42.1 | 7.6 |
A similar analysis was run to identify any differences in average final exam scores based upon whether the student participated in the reflective exercise (Table 4). Approximately half of the students participated by submitting their work with the VHDL simulator and a written reflective response. No significant difference was found in the exam averages based on participation, with p = 0.29 from an independent samples t-test and p = 0.49 from the Mann-Whitney test.
Table 4. Final exam scores versus participation in the reflective exercise.

| Participation | n | Mean | Std. Dev. |
|---|---|---|---|
| No | 27 | 43.8 | 4.9 |
| Yes | 33 | 42.2 | 7.1 |
Follow-up Survey
Approximately 58% of enrolled students responded to the follow-up survey. Of the students who responded, 52% reported that they had completed the exercise, 21% said they had not completed it, and 27% did not recall. The percentage who reported having completed the exercise aligned with the actual percentage of students who had participated.
Of those who reported having completed the exercise, the results in Table 5 were obtained in response to the following question: Indicate the degree to which the reflection exercise was beneficial to you as a student. As shown in Table 5, 80% indicated that the reflective exercise was of high or very high benefit to them as a student. This is an encouraging indication of the value students perceived in the reflective exercise several months after experiencing it.
Table 5. Perceived benefit of the reflective exercise.

| Answer | % | n |
|---|---|---|
| Not beneficial at all | 6.7% | 1 |
| Low benefit | 0.0% | 0 |
| Neutral | 13.3% | 2 |
| High benefit | 73.3% | 11 |
| Very high benefit | 6.7% | 1 |
Sample quotes from students who perceived high or very high benefit from the exercise are as follows:

- "It gave me an opportunity to identify and fix the gaps in my understanding."
- "I only recall doing the simulation on one problem and it greatly changed how I looked at the problem. The problem regarding latency was extremely important and I am actually using that information in my research now, so I would consider that experience to be very important."
Of those who reported not having completed the exercise, the results in Table 6 were provided in response to the following question: Please indicate your primary reason for not completing the reflection exercise. Although one student indicated "other," the reasons the student listed in the text entry box directly corresponded to two categories already listed in the response options. Thus, the counts for the two pre-existing categories were updated and are given in Table 6. The total response count is therefore one more than the number of students who responded to this question. The two reasons stated by the students for not participating were related to 1) time and effort involved, and 2) the optional nature of the assignment and/or perceived lack of need to participate. Fortunately, these are conditions or perceptions that can be adjusted by the instructor so as to encourage, motivate, and enable reflection by all students.
Table 6. Primary reasons for not completing the reflective exercise.

| Reason | % | Response Count |
|---|---|---|
| Amount of effort or time involved to complete it, or a lack of time on my part. | 57.1% | 4 |
| The reflection exercise required me to write. | 0.0% | 0 |
| It was an optional assignment, or I was doing well in this course at that time, so I didn't need to participate. | 42.9% | 3 |
| The reflection exercise has minimal usefulness for this course or for my engineering education in general. | 0.0% | 0 |
| The reflection exercise made me go outside my comfort zone or feel exposed. | 0.0% | 0 |
| Other | 0.0% | 0 |
Discussion
In this paper, the method of using computer-aided simulation tools to drive written reflections was applied to a digital circuits course using a logic simulator (i.e., ModelSim) and VHDL. Previously, this same method was applied in a microelectronics course using SPICE [10]. In addition to adapting the method to a new course, the simulation-guided reflection process was improved. Specifically, students were provided with additional scaffolding in the use of the tools for reflection. Also, the frequency of the reflective exercise was reduced, the reflection exercise was associated with a milestone event (i.e., the midterm exam), and the reflection prompt was simplified to allow for a wider range of student responses.
To address RQ1 (Do students reflect more deeply and broadly after milestone events?), the reflective exercise after the midterm exam was assessed for depth and content by the authors, and the results were compared to those from the previous study of the microelectronics course. The average depth of the reflections was greater after the digital circuits midterm than after the microelectronics midterm and quizzes. This suggests that the combination of reduced frequency of reflection, simplified prompting, and deployment after a milestone event may have been successful in prompting students to reflect more deeply and more broadly in the digital circuits course than in the microelectronics course.
Student exam scores versus reflective depth level and participation were analyzed with ANOVA and a t-test, respectively, in the digital circuits course. No statistically significant differences were found in exam scores based on either depth level or participation. However, this does not imply that the reflection exercise lacked benefit for students. This is supported by results from the follow-up survey, where 80% of respondents who had completed the exercise indicated that it was of high or very high benefit to them. Since approximately half of the students chose not to participate in the reflective exercise, students were asked in the follow-up survey to indicate their primary reasons for not participating. The results revealed that the primary reasons were the amount of time and effort required to complete the exercise and students feeling it was not necessary for them to do so. These results address RQ2 (Do students perceive simulation-guided reflection as beneficial?).
Limitations
There are some limitations to this work. The conclusion about fatigue was based on the instructor's observation and assessment; it could have been confirmed by asking students at the end of the microelectronics course whether fatigue became an issue for them. For this reason, the follow-up survey was sent to students in the digital circuits course to explore their perceptions of reflection. We recommend obtaining students' perceptions of and reactions to reflection, in line with the research currently underway by Turns et al. and Mejia et al. [15], [20], which was discussed in the literature review.
Conclusions
The implementation of simulation-based written reflection in digital circuits, following the initial implementation in microelectronics, yielded encouraging results. This was indicated by the greater average reflective depth levels, the increased percentage of broad (vs. specific) responses, and student responses to the follow-up survey. This work demonstrated that the simulation-driven reflection method can easily be adapted to topics outside of microelectronics. Also, since simulation tools are common to all engineering disciplines, courses outside electrical and computer engineering can likewise adopt this method.
There are several areas where future implementations may improve and build upon this initial work with reflection. The first recommendation is to ensure that reflective exercises are deployed after milestone events, such as examinations. Also, many students did not participate in the reflection exercise because they felt it would not significantly impact their grade or was otherwise unnecessary. However, reflection benefits all students in their development as engineers, regardless of current performance or prior achievement. To increase participation, it is suggested that reflection after milestone events be made mandatory or otherwise highly rewarding (e.g., in terms of recovering points).
In using simulation-based reflection, it is also important that the instructor strike a balance between frequency of reflection and student workload or potential fatigue. One way to achieve this is a combination of optional and mandatory reflection exercises throughout the semester: for example, optional reflection opportunities after quizzes to recover lost points, plus mandatory reflection after higher-stakes exams, so that all students reflect at some point during the term. It is critical that instructors scaffold students in the specific use of the simulation tool for reflection, including the setup of the simulation scenario. For example, in this work, VHDL template files were provided for students to input their calculated circuit parameters, and students benefited from guidance on what to look for in the simulation results when comparing them to their hand calculations.
A key question for future research is the optimal amount of reflection to request of students: the amount that balances benefit with possible fatigue.