Below, you will see the description of a research study that is being launched this year in the department of Mathematics, Physics & Engineering at Mount Royal University (MRU). Please contact me by email or by leaving a comment on this blog if you have comments or questions, would like references, or would like to contribute in any way, for example by repeating the study in your context!
In short, there are budgetary pressures at our institution to reduce all courses to 3-hour courses (plus labs, if appropriate), despite the fact that math, physics & engineering courses have traditionally had tutorials (which usually means an extra hour or hour-and-a-half of class per week). The thinking behind this tradition has been that students need the extra practice time to hone their mathematics and problem-solving skills. However, besides our own anecdotal experiences, instructors know little about which tutorials are most effective for our students and why. Therefore, Phase I of this study will collect data on what is happening in our current tutorials, whether students find them useful (or not), what kinds of students find them useful (or not), and why. The results of this study will be important for us as we strategize and/or redesign our courses in the future, and we believe they will also be informative for the broader STEM education community. The results will also inform a Phase II of the study, which will ask more focused research questions based on the themes that emerge from Phase I.
Are tutorials useful? If so, why and for whom? If not, why not?
The objective of this study is to make recommendations about if and when tutorials are appropriate and effective in math, physics and engineering courses.
Classes in STEM (Science, Technology, Engineering & Math) disciplines, particularly mathematically “heavy” courses such as math, physics and engineering courses, have traditionally been taught using a lecture and tutorial format (and sometimes labs, depending on the course), where the lecture time is spent largely on new “content” and tutorial time is spent with students getting practice at solving problems themselves. Tutorials can have different names; for example, in physics courses at American universities they are often called “recitations”. Therefore, for the purposes of this study, the term tutorial is defined as scheduled class time, in addition to a regular 3-hour weekly lecture class, where students do activities designed to support their learning of material that has already been addressed in lecture. These activities could include problem-solving, running simulations, and group work or discussions, but do not include hands-on, experiment-like activities, which are commonly known as labs.
Rather than strict lecturing during the traditional 3-hour “lecture” time, the STEM education literature increasingly supports the use of active learning pedagogies. Recently, specific pedagogies such as “Peer Instruction” (developed in physics but used in many disciplines) and “Flipped Classrooms” have received a lot of attention in the literature and media. These pedagogies require students to do homework before coming to class, so that the “lecture” time with the instructor is more active and more like a traditional tutorial, with the goal of helping students gain a deeper understanding of the concepts. There is a large body of literature demonstrating the effectiveness of active learning pedagogies (e.g. Springer et al., 1999; Hake, 1998; Prince, 2004). However, most studies showing the effectiveness of active learning pedagogies do not discuss what other supports are provided to students outside of lecture time (e.g. Smith et al., 2009). Physics classes at Harvard, where Peer Instruction was developed, use “lecture” time to help students develop conceptual understanding, and tutorial time for students to practice their quantitative problem-solving skills (Julie Schell, personal communication), although this is not always evident in the Peer Instruction literature.
There has been less focus in the literature on what other supports, such as tutorials, are helpful to student learning, especially in computationally-heavy courses. Many studies have used control groups and pre-post tests to determine the effectiveness of a specific activity implemented during a tutorial of a certain course (e.g. Slezak et al., 2011); however, these types of studies do not address the question of whether tutorials are more or less helpful for certain types of courses or students. We do know that expert tutorial instruction requires a nurturing tutor with “superior content as well as pedagogical content knowledge”, who is “questioning, knows students’ level of understanding precisely, and can adjust strategy accordingly” (Wood and Tanner, 2012), and one study found that successful implementation of a specific tutorial curriculum required a small student-to-teacher ratio and highly trained instructional staff (Koenig and Endorf, 2004). We also know from the learning literature that “deliberate practice” and “time on task” are key variables related to learning. It would seem, then, that tutorials are an important part of instruction and should be taught by experts.
However, due to budget restrictions at MRU and many other institutions, tutorials are under threat of being cancelled. If we are to make decisions about re-designing courses, including potentially cancelling some or all tutorials or offering tutorials in a different way, then as reflective teachers it is incumbent upon us to collect data that will both inform our decisions and provide a baseline for assessing the changes we make. We need to know what kinds of instruction and course formats work best in our context. We don’t really know, for example, whether some tutorials are more needed than others, depending on the goals and content of the course, or whether our current tutorials are effective for only some types of students. Currently, each instructor has only anecdotal data about what they feel works best in their own class.
This study will begin to address this deficiency by collecting data on student performance and student perceptions of tutorial effectiveness in multiple mathematics, physics and engineering courses at MRU. This proposal is for Phase I of a larger study, with Phase I data being collected in the 2012/2013 academic year at MRU.
Currently we are in the very early stages of developing this line of research. After an initial year of collecting data at MRU, Phase II will be developed to ask more focused research questions based on the themes that emerge from Phase I. It is possible we may start to ask questions from a cognitive, social cognitive, motivational, or community-centered perspective.
If, as a result of the study or of budgetary constraints, we end up redesigning courses and/or cancelling tutorials in future years for at least some courses, Phase III would involve repeating the study in subsequent years in the revised courses. In Phase III we would compare the new courses to the baseline data of Phases I and II to determine whether the new courses are meeting the same level of student satisfaction and learning outcomes.
What do you think?
Do you have thoughts or references you’d like to share related to this question? Would the results of this study be useful for you? What questions do you have about the effectiveness of tutorials in math, physics and engineering courses, or in your own context or discipline?
Update as of December 20, 2012: we will be conducting this study in some business classes at MRU in the coming winter semester as well!