Look what happens when you use your data to improve!

Even though I know you have plenty of things to think and fret about these days, with the start of a new term and the little matter of a proposed calendar and curriculum revision, I hope you are enjoying the weather and finding ways to keep your students motivated despite it!


With that said, I hope you’ve also had a chance to look through your IDEA course reports from the winter term and your packets of student forms.  Although many of you have attended one of the “interpreting the IDEA reports” sessions over the last year or so, I know that some of you continue to have questions.  I’m glad to sit down with you any time and answer any questions you might have.


I would like to share some of my observations after seeing almost every report over the last two terms.  My hope is that these observations are helpful, not only as you think about using your reports to inform your course design for future terms, but also as you consider whether the switch to the IDEA Center process has helped Augustana College improve our teaching and student learning.


First, it appears to me that the average PRO score (Progress on Relevant Objectives) went up between fall and winter terms.  There are a number of potential explanations for this – the types of courses offered, student acclimation to college (within the year as well as for first-year students), and general attrition among those least able to succeed at Augustana.  But it struck me that there are also some reasons why we might expect learning (as represented by the PRO score) to decrease in the winter term – most notably the big break in the middle of the term and its impact on students’ motivation to restart the academic engine or remember what they had learned prior to the holiday break.  So I don’t think it’s completely out of bounds to suggest that the increase in the overall PRO score is worth noting.


Second, it appears that many faculty members reduced the number of learning outcomes they selected for their individual courses.  I would argue that this is probably a good thing in the vast majority of cases.  First, I interpret the number of objectives selected as an indication of focus rather than an indication of learning.  In other words, as I’ve noted to some of you, in many cases your students reported learning substantially on objectives that you did not select.  In fact, it wasn’t at all uncommon for faculty who selected fewer objectives to discover that they could have selected additional objectives and the PRO score would have remained the same or even gone up.  Choosing fewer objectives and focusing on them set the conditions for the “spillover” learning that was then evident on your reports.


Conversely, for faculty who initially selected many outcomes, the results of those reports suggested that the diffusion effect I have mentioned repeatedly held true more often than not.  Folks who initially selected many objectives often found that, although some of the objectives they selected played out as they had intended, there were enough objectives on which students reported lower average learning that the average PRO score suffered as a result.  In my mind, the drop in the average number of objectives selected suggests that more faculty have engaged in exactly the kind of purposeful thinking about course design and course outcomes that the adoption of this instrument was intended to produce.  Some of you might argue that this is only evidence of “gaming the system.”  I would argue that if “gaming the system” sets better conditions for learning, then you can call it “manipulating,” “negotiating,” or “peppermint bon bon” for all I care.


With all of the uncertainty and ambiguity that goes with the work that we do – especially when it comes to trying to make decisions about the future of Augustana College – I think it is useful to look at a decision the faculty made last year and assess its impact.  In the case of the decision to switch to the IDEA Center system, I think that there is preliminary evidence to suggest that this switch is helping us improve the conditions for optimal student learning.  Whether or not it actually directly impacts student learning – I think that is a question for another Delicious Ambiguity Column that I will write more than a few years from now.


Make it a great day,


