We explore the logistics and impact of incorporating computational, team-based laboratory experiences into mainstream calculus instruction at a large public university, with the goal of supplementing (or partially supplanting) the traditional knowledge-transfer instructional model with rich experiences.
This project is motivated by the present cultural upheaval concerning the purpose of mathematics instruction, specifically in classes traditionally labeled as “service” or “general education” courses (as opposed to courses designed primarily for mathematics majors). For classes such as the mainstream calculus sequence1, the majority of whose enrollment comes from students intending to major in the physical sciences or engineering disciplines, the traditional emphasis on knowledge transfer and rote calculation is increasingly losing its relevance outside of classroom settings. In many such classes, the main learning goal has long been reduced to fluency in pen-and-paper computations, a skill rapidly rendered obsolete by modern technology and now de-emphasized in recent policy recommendations2. Simultaneously, there is a growing consensus on the importance of interpersonal and transferable skills training3 in higher education.

On a local level, in addition to this cultural shift slowly taking root among the mathematics faculty, we also face external pressure and incentive to revisit our instruction of the mainstream calculus sequence. Specific feedback from colleagues in the Engineering college brought to light several deficiencies in how we approach calculus instruction. The University, meanwhile, initiated a push for interdisciplinary research programs surrounding the use of computational mathematics in other STEM disciplines, also bringing the inclusion of computational methods in STEM coursework into the spotlight.
Our project is a three-year (six-semester) initiative aiming toward the large-scale deployment of computational labs in Calculus II instruction. We are to design new curricular material, pilot its deployment (first in small classroom settings, ramping up toward large lectures), and assess its benefits (to students) and costs (to the department). The material developed and the data gathered will inform the ongoing discussion within our department about updating our instruction of gateway and service courses.
Our experience implementing this project is detailed in our PRIMUS article:
You can read the accepted version of the author manuscript here.
This project was supported in part by a generous grant from MathWorks through their MathWorks Academic Support program.
The PIs would also like to thank Dan Normand, Arman Tavakoli, and Rocco Tenaglia—the TAs for Pilots 3 and 4 of the project—for their many observations and feedback. Additionally, the TAs for Pilots 5 and 6 of the project have our gratitude for their patience and gusto teaching in this new and unfamiliar format.
Mark Iwen and I produced an initial proposal of the changes we intended to implement. We reached out to individual units within the Engineering College to arrange meetings to discuss our proposal. The goals were to set expectations, clearly establish parameters for the project, and gather ideas for the lab projects to be used in the course.
Outcome of the meetings: our mandate requires that we minimize changes to the “calculus content” of the course while including hands-on, exploratory labs based on real-life settings, with computational aspects using the MATLAB programming language. A gradual ramp-up of enrollment was agreed upon: the AY2016–2017 edition of the pilot program will enroll only selected engineering majors recommended by their college advisors.
MTH133 s62 had enrollment restricted to incoming freshmen with AP credit, a declared interest in an engineering major, and a recommendation from their college academic advisors. Mark Iwen and I collaborated with Rachael Lund in running this course, which consisted of three 50-minute “traditional” lectures (using the blackboard) and one two-hour lab per week. Each lab session began with a “traditional quiz” on standard content material, followed by 80 minutes of computer-based activity.
This course can be described as a “standard Calculus II class” augmented with lab material.
We collected student feedback concerning labs and overall structure for this first pilot through weekly surveys.
MTH133 s37 had enrollment restricted to engineering students recommended by their college academic advisors. We again co-taught with Rachael Lund. The course structure was the same as in Pilot #1, with the addition of more “real-life” examples presented during lecture.
To facilitate the latter, we partially flipped the course. Students were expected to watch short course videos (prepared by Ryan Maccombs of the Math Department independently of this project) prior to coming to class. Each class started with a 5-minute quiz on basic concepts covered by the video and continued with motivational material and examples.
Lectures were presented in mixed media: blackboard, overhead projector, and slides.
Student feedback was collected through weekly surveys.
Additional changes are inspired by observations that:
We revisited the labs, with the main aim of shortening them so they can be completed in 60 minutes. Additionally, to facilitate the transition to larger lecture sizes while maintaining the partially flipped structure, we developed WeBWorK “reading check” quizzes to be completed after students watch the video but before arriving at class. The graded work incentivizes student engagement with the pre-class videos, allowing us to include “engineering applications” in lecture without sacrificing curricular material.
With the help of an undergraduate student assistant, Katrina Gensterblum, I performed a statistical analysis of student performance data gathered on the WeBWorK platform. We made various preliminary findings, some of which confirmed and some of which refuted conventional wisdom concerning student engagement with WeBWorK. In the context of sections 40 and 41 (Pilot #3), however, our main finding was that student participation rates on the video check questions that Mark and I developed in the summer of 2017 were low. Additionally, using the average number of attempts before reaching the correct answer as a metric, the video check questions appear to be markedly harder than the standard homework questions.
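As an illustration of the metric, the attempts-before-correct computation can be sketched as follows. This is not our actual analysis code; the record format (student, question, correctness flags) is a hypothetical simplification of what a WeBWorK past-answers export provides.

```python
from collections import defaultdict

def average_attempts_to_correct(attempt_log):
    """Per-question average of the number of attempts a student makes
    up to and including their first correct submission.

    attempt_log: iterable of (student, question, correct) tuples,
    in chronological order for each (student, question) pair.
    Students who never answer correctly are excluded from the average.
    """
    counts = defaultdict(int)  # attempts so far per (student, question)
    solved = {}                # attempt count at first correct submission
    for student, question, correct in attempt_log:
        key = (student, question)
        if key in solved:
            continue  # ignore attempts after the first success
        counts[key] += 1
        if correct:
            solved[key] = counts[key]
    # aggregate over students who eventually solved each question
    per_question = defaultdict(list)
    for (student, question), n in solved.items():
        per_question[question].append(n)
    return {q: sum(ns) / len(ns) for q, ns in per_question.items()}

# Example: student "a" needs two attempts on "q1", student "b" needs one.
log = [("a", "q1", False), ("a", "q1", True),
       ("b", "q1", True), ("a", "q2", True)]
print(average_attempts_to_correct(log))  # {'q1': 1.5, 'q2': 1.0}
```

Under this metric, a higher per-question average indicates a harder question, which is how the video check questions separated from the standard homework in our data.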
We have several hypotheses to explain this:
We are forced to conclude that our pre-class video structure is not appreciably improving student learning, and may actually contribute more to student frustration with the class.
In preparation for the large-scale deployment of Pilots #5 and 6 (150 students each semester), we decided to modify the course yet again based on what we learned in Pilots #3 and 4. During the summer of 2018 we rebuilt our curricular material to reflect some structural changes that we will be making.
Logistic changes:
About the summative assessments:
Instead of weekly 15–20 minute quizzes, we run 40-minute “mini-tests” every two to three weeks. The mini-tests are designed to be exactly half the length of a midterm exam, with exactly the same formatting and grading style. The intention is to borrow from backward design principles: in traditional sections, the three types of student assessment (WeBWorK homework, quizzes, and exams) are formatted and graded differently. While there is not much we can do about the necessary evil that is WeBWorK, the least we can do is replace the quizzes with mini-tests that give students lower-stakes, accurate preparation for their exams.
About the labs:
Evaluation of program: Program assessment is designed and implemented by Andy Krause, and includes:
Additional informal feedback from students is collected through an anonymous comment box.
One piece of student feedback we received on this new format is that many students wish they could “find out the correct answers” for the labs. The labs are designed to supplement student learning, with the emphasis on the process rather than the results, so “correct answers” were not made available. Students indicated, however, that knowing what they should have been doing, and seeing the activities demonstrated, would bring them closure and satisfy their curiosity. We therefore decided that an additional task is to prepare videos addressing these issues. It was also decided that these lab videos will make good training tools for graduate TAs implementing the labs.
Analysis of the collected data will also be performed, in preparation for publication.