I have been waiting to write this post so that I could review my student evaluations and formulate my newest plan of action. For those of you who have been following this journey with me through my previous posts (Flippn' Biochemistry, In Defense of the Flip, and Flippn' Biochemistry Part II), I hope you have discovered that completely flipping a course cannot be done successfully in just one iteration. Instead, I have learned that it is a constant cycle of implementation, surveying and reflection, and adjustment; and I am hoping that by keeping a record of the challenges and strategies I have run across over the last couple of years, I can encourage my colleagues to incorporate more active learning into their own classrooms while alleviating some of the stress. As I mentioned in my previous post, I implemented a number of changes the second time around: finding a TA, moving from a fixed-chair lecture hall to a classroom that facilitated group work, embedding quizzes into the lecture videos in place of the online homework, emphasizing the learning objectives both in the videos and in class and directly tying those objectives to the in-class activities, and redeveloping the exams so that they reflected more of the activities and less rote memorization. My main objective for this term was to overcome the student perception that a course "lecture" must be a lecture by stripping away the typical lecture environment and embracing the components of a laboratory that emphasize problem-based learning.
I was also fortunate this year that two colleagues (Dr. Pete Van Zandt and Dr. Melanie Styers) and I were awarded a grant from the Associated Colleges of the South to assess the impact of blended learning and flipped teaching on our students' ability to think critically. For this study we used the Critical Thinking Assessment Test (CAT), developed and validated by Tennessee Tech, in a pre/post-test format in three of our classes that utilize flipped teaching to some degree. We also employed pre/post SALG surveys to assess student-perceived gains, in hopes of discovering correlations between the categories in which the students believed they gained and what the CAT measures. While we are still waiting on the results of the CAT, my student evaluations, the technology survey I administer each year, and the SALG responses all indicate that we are headed in the right direction.
In 2014 (light blue) and 2015 (dark blue), students were surveyed about halfway through the term with the following prompt: "Throughout the term we have utilized a variety of tools to help deepen student understanding of a variety of biochemistry-related topics and to facilitate the development of critical thinking skills. To help further develop this course, please rate the tools that we utilized on a scale of 1 to 10 (1: you found this tool not beneficial at all, or even distracting; 10: you found this tool to be very beneficial)." The results of that survey are shown below (note that in 2014 I goofed and forgot to ask students what they thought of the textbook... so that result is not included for 2014).
As you can see, students' evaluations of each tool (except for the previous year's exams) increased or stayed the same from 2014 (light blue) to 2015 (dark blue). Of particular note are the large gains in appreciation for the POGIL activities, the video lectures, and the lecture quizzes (which were compared against last year's online homework assignments through Sapling). I believe these gains in appreciation are due in large part to stripping away the lecture environment to facilitate group work and communication, and to lowering, at the onset of the term, the expectation that lectures are for passive learning (just as we do in the laboratory).
My other goal this term was to redesign the in-class exams so that they better reflected the POGIL activities, which is why we see a significant decrease in appreciation for the previous year's exams: they were dramatically different in structure, with short answer in place of fill-in-the-blank and multiple choice. Still, the highest-rated tool for "promoting critical thinking" again this year was the course management page on Moodle... which again makes me question whether students truly understood the question "Which tools did you find were beneficial in facilitating critical thinking?" But I was happy to see that the POGIL activities and case studies came in close behind.
Student SALG Survey Results
In addition to comparing the technology survey results between 2014 and 2015, I also looked at how the class of 2015 perceived changes in the various skills tested by the CAT (pre = blue and post = orange):
As we can see in the results above, the students felt that they made gains in all of the various skills, and based on a two-sample T-test with unequal variances, we found the majority of those gains to be significant. Again, I am anxious to see if and how these results correlate with the CAT findings!
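For readers who want to run the same comparison on their own survey data, here is a minimal sketch of the unequal-variance (Welch's) two-sample t statistic with the Welch-Satterthwaite degrees of freedom. The pre/post ratings below are purely hypothetical, invented for illustration; they are not our SALG data.

```python
import math

def welch_t(sample_a, sample_b):
    """Two-sample t statistic assuming unequal variances (Welch's test)."""
    n1, n2 = len(sample_a), len(sample_b)
    m1 = sum(sample_a) / n1
    m2 = sum(sample_b) / n2
    # Sample variances (Bessel-corrected)
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    se = math.sqrt(v1 / n1 + v2 / n2)
    t = (m1 - m2) / se
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = (v1 / n1 + v2 / n2) ** 2 / (
        (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1)
    )
    return t, df

# Hypothetical pre/post ratings on a 1-5 scale for one skill category
pre = [2, 3, 3, 2, 4, 3, 2, 3]
post = [4, 4, 3, 5, 4, 4, 3, 5]
t, df = welch_t(post, pre)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The resulting t statistic is then compared against the t distribution with df degrees of freedom to get a p-value; in practice, a library routine such as SciPy's `scipy.stats.ttest_ind(post, pre, equal_var=False)` does all of this in one call.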
In comparing my student evaluations between 2014 and 2015, I have also found significant improvements, particularly in categories such as "The course improved my ability to think critically and reason effectively", "The course was organized in a way that enhanced my learning", and "The instructor's overall teaching effectiveness", which I hope is also a reflection of the adjustments I made in approaching the flipped class between 2014 and 2015. Below are some of the comments and feedback provided by the students:
Comments that made me smile:
"Pretty well organized for how much stuff was needed. Lots of thinking by the students that was then reinforced by the teacher". (2015) vs "I think the class should be more lecture based. While the flipped idea is fun, I think that for a class with this much information, we need a lecture." (2014)
"She has made students think critically every class period. She created a new spin to the science department at BSC". (2015) vs "A better focus on making sure students are learning rather than memorizing metabolic pathways" (2014)
"I liked how she supplement the videos with some in-class explanations. The activities were pretty solid too; very helpful. The objectives were AWESOME". (2015) vs "POGIL activities - some concepts, actually most, were too complicated for the score of this course" (2014)
"Forced us to reason through problems rather than simply memorizing facts"(2015) vs "More teaching in class" (2014)
Comments that demonstrate challenges still exist (besides the "I learn better with straight lecture" comments):
"Narrow the learning objectives to better match the exams, make the exams have stuff on them that we learned in class before taking it, link the activities in class with the material more"
"I also never felt prepared for test despite strenuously studying"
"What was on the test always took me by surprise"
So while the comments above lead me to believe that the students do realize the flipped model is improving their ability to reason and think critically, I think they are still very unsure of themselves when it comes to the exams and believe that they should still rely on rote memorization. Now, I do have to defend myself: in re-creating the exams, I pulled questions directly (and sometimes literally) out of their in-class POGIL activities. And with averages climbing from a D on the first exam to around a high B on the third and even on the cumulative final, I'd have to say that the students improved DRAMATICALLY on what are very challenging exams!
Changes for next year
In order to continue improving this course, I have come up with three modifications to test next year:
- In the first week of class, better model how the groups should work together and the pattern of the activities. I think if we walk through the first two activities step by step as a larger group, it may help alleviate some of the stress and give them a rhythm to work with for the rest of the term.
- Consistently remind them that the videos are there to introduce or explain course content, while the in-class meetings are designed to help them see how that content is applied. (And they need not rely on just the videos; they have a textbook and internet resources at their disposal as well.)
- Instead of requiring students to complete the video quizzes for credit, I will have students write out their muddiest points for each video lecture/topic and submit them online (through either Facebook or Moodle) for a muddiest-point mini-lecture at the start of each class. While I consistently tell my students that I will address questions in these "muddiest point" mini-lectures at every class meeting, I rarely actually get questions; then at the end of the term, my students always ask that I do this more. By requiring the questions for credit, I hope to increase engagement and better address their needs for clarity.
In terms of the exams, it is my hope that, with last year's redesigned exam format now freely available for students to analyze and study, some of the frustrations the students voiced will decrease. I have also implemented an objective-alignment activity between the video learning objectives and the in-class learning objectives, to help students see how the two build on each other to lead to higher-order thinking, and how those higher-order critical thinking skills are what I test for on the exams.
Overall, I am really pleased with how the class has progressed and am excited to see how it goes next year! Also, stay tuned for the CAT results, soon to come!