One of the best parts of my job is teaming up with others on campus to help us all get better at doing what we do. Over the past seven years, I’ve been lucky enough to work with almost every academic department or student life office on projects that have genuinely improved the student experience. But if I had to choose, I think my favorite partnership is the annual student learning assessment initiative that combines the thoughtfulness (and sheer intellectual muscle) of the Assessment for Improvement Committee with the longitudinal outcome data (and nerdy statistical meticulousness) from the Office of Institutional Research and Assessment.
For those of you who don’t know about this project already, the annual student learning assessment initiative starts anew every summer – although it takes about four years before any of you see the results. First, the IR office chooses a previously validated survey instrument that aligns with one of Augustana’s three broad categories of learning outcomes. Second, we give this survey to the incoming first-year class just before the fall term starts. Third, when these students finish their senior year we include the same set of questions in the senior survey, giving us a before-and-after set of data for the whole cohort. Fourth, after linking all of that data with freshman and senior survey data, admissions data, course-taking data, and student readiness survey results, we explore both the nature of that cohort’s change on the chosen outcome as well as the experiences or other characteristics that might predict positive or negative change on that outcome.
The most recent graduating cohort (spring 2017) provided their first round of data in the fall of 2013. Since we had already started assessment cycles of intrapersonal conviction growth (the 2011 cohort) and interpersonal maturity growth (the 2012 cohort), it was time to turn our attention to intellectual sophistication (the category that includes disciplinary knowledge, critical thinking and information literacy, and quantitative literacy). After exploring several possible assessment instruments, we selected an 18-item survey called the Need for Cognition Scale. This instrument tries to get at the degree to which the respondent is interested in thinking about complicated or difficult problems or ideas. Since the Need for Cognition Scale had been used in the Wabash National Study of Liberal Arts Education, that study's researchers had already produced an extensive review of the ways in which this instrument correlated with aspects of intellectual sophistication as we had defined it. And since this instrument is short (18 questions) and cheap (free), we felt very comfortable putting it to work for us.
Fast forward four years and, after some serious number crunching, we have some interesting findings to share!
Below I’ve included the average scores from the 2013 cohort when they took the Need for Cognition Scale in the fall of their first year and in the spring of their fourth year. Keep in mind that scores on this scale range from 1 to 5.
The difference between the two scores is statistically significant, meaning that we can confidently claim that our students are becoming more interested in thinking about complicated or difficult problems or ideas.
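For readers curious about the mechanics behind a claim like this, the standard tool for comparing the same students' scores at two points in time is a paired t-test. Here is a minimal sketch in Python; the scores below are made-up illustrative values on the instrument's 1-to-5 scale, not our actual cohort data.

```python
# Illustrative paired t-test sketch -- NOT the office's actual code or data.
# Each student contributes a first-year (pre) and senior-year (post) score;
# the test asks whether the mean of the per-student differences is
# reliably different from zero.
import math
import statistics

def paired_t(pre, post):
    """Return (mean difference, t statistic, degrees of freedom)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)       # sample std. dev. of the differences
    t = mean_d / (sd_d / math.sqrt(n))   # t = mean_diff / (s_d / sqrt(n))
    return mean_d, t, n - 1

# Hypothetical pre/post scores for ten students (1-5 scale)
pre  = [3.2, 3.5, 3.1, 3.8, 3.4, 3.0, 3.6, 3.3, 3.7, 3.2]
post = [3.5, 3.7, 3.4, 3.9, 3.6, 3.3, 3.8, 3.4, 3.9, 3.5]

mean_d, t, df = paired_t(pre, post)
print(f"mean gain = {mean_d:.2f}, t({df}) = {t:.2f}")
```

If the resulting t statistic exceeds the critical value for the chosen significance level (about 2.26 for ten students at the 5% level, two-tailed), the gain is deemed statistically significant. With a real cohort of hundreds of students, much smaller average gains can clear that bar.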
For comparison purposes, it’s useful to triangulate these results against Augustana’s participation in the Wabash National Study between 2008 and 2012. Amazingly, that sample of students produced remarkably similar scores. In the fall of 2008, they logged a pre-test mean score of 3.43. Four years later, they registered a post-test mean score of 3.63. Furthermore, the Wabash National Study overall results suggest that students at other small liberal arts colleges made similar gains over the course of four years.
It’s one thing to look at the overall scores, but the proverbial devil is always in the details. So we’ve made it a standard practice to test for differences on any outcome (e.g., critical thinking, intercultural competence) or perception of experience (e.g., sense of belonging on campus, quality of advising guidance, etc.) by race/ethnicity, sex, socioeconomic status, first-generation status, and pre-college academic preparation. This is where we’ve often found the real nuggets that have helped us identify paths to improvement.
Unlike last year’s study of intercultural competence, we found no statistically significant differences by race/ethnicity, sex, socioeconomic status, or first-generation status in either where these different types of students scored when they started college or how much they had grown by the time they graduated. This was an encouraging finding because it suggests that the Augustana learning experience is equally influential for a variety of student types.
However, we did find some interesting differences among students who come to Augustana with different levels of pre-college preparation. These differences were almost identical whether we used our students' ACT scores or their high school GPAs to measure pre-college academic preparation. Below you can see how those differences played out based upon incoming test score.
[Table: first-year and senior Need for Cognition scores by incoming ACT tercile: Bottom 3rd (< 24), Middle 3rd (24-28), Top 3rd (> 28)]
As you can see, all three groups of students grew similarly over four years. But the students entering with a bottom third ACT score started well behind the students who entered with a top third ACT score. Moreover, by the time this cohort graduated, the bottom third ACT students had not yet reached the entering score of the top third ACT students (3.54 compared with 3.59).
So what should we make of these findings? First, I think it’s worth noting that once again we have evidence that on average our students grow on a key aspect of intellectual sophistication. This is worth celebrating. Furthermore, our student growth doesn’t appear to vary across several important demographic characteristics, suggesting that, on at least one learning metric, we seem to have achieved some outcome equity. And although there appear to be differences by pre-college academic preparation in where those students end up, the change from first-year to fourth-year across all three groups is almost identical. This suggests something that we might gloss over at first, namely that we seem to be accomplishing some degree of change equity. In other words, no matter where a student is when they arrive on campus, we are able to help them grow while they are here.
At the end of our presentation of this data last Friday afternoon, we asked everyone in attendance to hypothesize about the kinds of student experiences that might impact change on this outcome. Everyone wrote their hypotheses (some suggested only one idea while others, who shall yet remain nameless, suggested more than ten!) on a 4×6 card that we collected. Over the next several months, we will do everything we can to test each hypothesis and report back to the Augustana community what we found at our winter term presentation.
Oh, you say through teary eyes that you missed our presentation? Well, lucky for you (and us) we are still taking suggestions. So if you have any hypotheses, speculations, intuition, or just outright challenges that you want to suggest, bring it! You can post your ideas in the comments below or email me directly.
I can’t wait to start digging into the data to find what mysteries we might uncover! And look for our presentation of these tests as an upcoming winter term Friday Conversation.
Make it a good day,