It’s Hard to Argue with this Welcome Week Data

Good Morning!

It’s week 10!  The last week of the fall term!  You can make it!

This week I’d like to send a virtual shout-out to all of the folks who run Welcome Week for our new freshmen at the beginning of the fall term. This four-day whirlwind is a logistical Cirque du Soleil of social and academic acclimation.

But in many ways, it’s really more of an orientational triage. There are certain things that the students have to know by the time classes start or they’ll tank right out of the gate. Then there are other things that we’d love them to learn but we know these things might be a bridge too far. In reality, four days isn’t a lot of time, and the students’ ability to digest information is undercut by all of the anxieties that come with knowing, “Holy crap, I start college in a few days!” So the Welcome Week design team is faced with a stark reality: be very clear about the difference between what these new students have to know and what would be nice to know. Then teach them all of the first category and as much of the second as possible – knowing that too much time spent on any of the “would be nice to know” could cut into the “have to know” and then we’ve got a potential problem.

A few years ago, I highlighted the ways that the Welcome Week team has used some simple assessment design principles to improve the quality of the experience. But in that post, we only had anecdotal data to suggest that some good things were happening as a result. Now that we have a couple years of quantitative data, the evidence is pretty clear: Welcome Week has gotten even better at doing exactly what it is supposed to do.

A few weeks after the beginning of the fall term, we ask freshmen to complete a short online survey to find out their perception of Welcome Week. Specifically, we want to know the degree to which they think they learned the things we tried to teach them. I’d like to highlight four items that represent things that we think students have to know. Below each item is the average response score on a 1-5 scale (1=strongly disagree and 5=strongly agree) from each of the last four years. Notice the steady improvement.

My Welcome Week experience . . .

. . . helped me learn exactly how to get to the location of my classes.

  • 2013 – 3.55
  • 2014 – 3.79
  • 2015 – 4.18
  • 2016 – 4.21

. . . helped me find places on campus where I can study most effectively.

  • 2013 – 3.59
  • 2014 – 3.63
  • 2015 – 3.82
  • 2016 – 4.00

. . . taught me specific ways to make the best use of my time during the school day.

  • 2013 – 3.23
  • 2014 – 3.39
  • 2015 – 3.40
  • 2016 – 3.68

. . . emphasized the importance of finding places on campus where I can take time for myself.

  • 2013 – 3.51
  • 2014 – 3.60
  • 2015 – 3.69
  • 2016 – 3.84

As you can see, the Welcome Week team deserves some well-earned praise. They’ve stuck to the overarching design and philosophy of the program and used evidence to inform change. They have redesigned several parts of the experience, revised the way that they train peer mentors, and tackled some difficult logistical challenges to ensure that our new students are more likely to be as ready as possible for the first day of classes. Equally difficult (and probably even more impressive), they’ve stopped doing a number of things, no matter how strongly they believed in the potential of those activities, in order to concentrate more precisely on making the most of every minute of those four days.

Late last week I was playing with our freshly collected freshman data from the end of the first term to see if we could detect any lasting effects of the Welcome Week experience. As you might expect, the impact of Welcome Week tends to fade as subsequent fall term experiences become more influential in driving student success. However, one particularly gratifying finding popped out when I tested whether any of the Welcome Week survey items might predict our students’ response to an item on the end-of-term survey: “Welcome Week provided the start I needed to succeed academically at Augustana.” Even though the Welcome Week survey data were gathered during the second week of the term and the end-of-term data during weeks seven and eight, the item “My Welcome Week experience taught me specific ways to make the best use of my time during the school day” proved to be a statistically significant positive predictor of our freshmen’s perception of the preparatory effectiveness of Welcome Week. Impressively, this is also one of the learning goals where the Welcome Week team seems to have made substantial strides in preparing our new students to succeed.

So congratulations to everyone involved in putting together and pulling off Welcome Week!  I hope you’ll take a moment to send a kudos to anyone you know, even yourself, who contributed to a great Welcome Week way back at the beginning of the term.

Make it a good day,


Men, Social Responsibility, Volunteering, and Some Troubling Data

Last week I shared the first round of findings from our study of the 2012 cohort’s intercultural competence development during their college career. One finding that jumped out was the disappointing difference in change between men and women. While women’s scores improved on both the cognitive and the behavioral scales, the men’s scores only improved on the cognitive scale. In addition, the women’s improvement on the cognitive scale was notably larger than the men’s, and the degree of women’s improvement on the behavioral scale almost doubled the advantage they started with over men four years earlier.

At the Board of Trustees meetings last week, I provided our annual Academic Quality Markers for the 2016 cohort to the Academic Affairs Committee. It’s pretty apparent that there is something troubling going on with male participation and engagement. Male participation in study abroad, service learning, and volunteering is significantly lower than female participation. This pattern continues in three student experience items that address our efforts to cultivate citizenship. Moreover, the other comparisons by race/ethnicity and socio-economic status don’t contain such repeated disparities between groups. The only other significant difference occurs where one would expect: white students report less encouragement to interact across difference compared to students of color. Given the substantially higher proportion of white students on campus, it would certainly take relatively less “encouragement” for students of color to find themselves interacting across difference.

I’m sure that the explanations for these differences between men and women are complex. However, we might have found something that could enlighten an effort to better educate our male students within the Global Perspectives Inventory (GPI) data that I shared last week and referenced above. One set of questions within this survey, the Social Responsibility Scale, is composed of statements that focus on the degree to which the respondent engages in the public sphere to effect change. As an example, two of the statements (to which the respondent indicates a level of agreement or disagreement) are: “I work for the rights of others,” and “I consciously behave in terms of making a difference.”

It might not surprise you to find out that male and female Augustana students from the 2012 cohort entered with different average scores, different enough that the gap would be considered marginally statistically significant.

  • Female: 3.76
  • Male: 3.62
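For readers curious how a gap like this gets labeled “marginally statistically significant,” the standard tool is a two-sample t-test, which can be computed from nothing more than each group’s mean, standard deviation, and size. The two means below come from the post; the standard deviations and group sizes are made-up stand-ins (we don’t report them here), so the resulting t value is purely illustrative.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic and approximate degrees of freedom
    computed from summary statistics alone (no raw data needed)."""
    v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2
    t = (mean1 - mean2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# 3.76 and 3.62 are the entering means from the post; the common SD (0.7)
# and the group sizes (350 women, 250 men) are assumed for illustration.
t, df = welch_t(3.76, 0.7, 350, 3.62, 0.7, 250)
```

The larger the t statistic relative to its degrees of freedom, the smaller the p-value; “marginally significant” typically means a p-value hovering near the conventional 0.05 cutoff.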

But what surprised me was that over the course of four years, only the women had grown on this scale. Male students had on average remained unmoved.

  • Females: 2012 – 3.76; 2016 – 3.88
  • Males: 2012 – 3.62; 2016 – 3.59

Maybe this lack of male growth in prioritizing social responsibility partially explains the difference between men and women in volunteering and service learning participation. Maybe it partially explains the male deficit in getting something substantive out of Symposium Day. And maybe it partially explains the relatively lower sense among men that Augustana encouraged them to interact across difference.

If our goal, as our mission statement seems to suggest, is to graduate individuals who engage in both leadership and service, it appears that we may need to revisit the ways that we develop a service orientation among our male students.

Hmm . . . if only there were a major reconfiguration of the Augustana educational experience that would allow us to try something new based on these findings . . .

Make it a good day,


Does Augustana students’ intercultural competence improve during college?

In the fall of 2011 we set in motion a college-wide assessment plan where we would collect learning outcome data from each entering cohort, link this data to the various student experience surveys these students complete at different points during their four years at Augustana, then collect the same learning outcome data just before the cohort graduates. This plan allows us to track our students’ four-year change on a specific learning outcome and identify connections between student experiences and variations in the direction and degree of that change.

Obviously it would be logistically impossible (and a little stupid) to tackle all of the Augustana Learning Outcomes every year. So we decided to rotate annually through the three broad categories of learning outcomes starting with intrapersonal conviction, moving to interpersonal maturity, and finally addressing intellectual sophistication before returning to intrapersonal conviction. Because each category includes a variety of more specific outcomes, this framework allows us some flexibility in selecting outcomes that seem particularly pertinent to our students’ success while maintaining a more general pattern that keeps us tuned in to the totality of our learning goals.

The first cohort (starting in the fall of 2011 and graduating in the spring of 2015) provided data on orientations toward different types of motivation, something that undergirds the learning outcome that we have called “Wonder.” I wrote about some of our findings from that study last fall and last winter.

The freshmen who started in the fall of 2012 completed a survey called the Global Perspectives Inventory, an instrument designed to measure intercultural competence (an important aspect of the learning outcome category we call Interpersonal Maturity). In the spring of 2016 we collected the final set of data from this cohort. On September 16th, the Assessment for Improvement Committee (AIC) presented the first of three Friday Conversations (one in each term during the 16-17 academic year) intended to examine this data and explore what it might suggest. For those of you who were unable to attend the Friday Conversation on September 16th, I thought I would post the PowerPoint slides below. They give a brief description of intercultural competence, convey the nature of our students’ change on three aspects of intercultural competence as measured by the GPI, and pose some questions for us to begin thinking about what we might explore in preparation for our winter term Friday Conversation.

So click on this presentation of 4-year change at Friday Conversation 9/16/2016 and you will be able to scroll through the PowerPoint slides.

As you can see, we found that our students (at least this cohort of students) grew on two of the three elements of intercultural competence. Our students grew the most on the cognitive scale that assesses knowledge of cultures and the implications of differences between cultures. Our students also grew, albeit to a lesser degree, on the behavioral scale that attempts to capture the likelihood to enact behaviors that reflect intercultural competence. Finally, we found that our students made no statistically significant gains on the affective scale that assesses the attitudes that would motivate one to be interculturally competent.

In addition to examining the overall change, we also explored the change among several subgroups of students based on pre-college demographic characteristics. As represented by the bar graphs on several slides, this exploration uncovered interesting differences in intercultural competence growth between men and women, white students and students of color, and students whose ACT scores suggested low versus high academic preparation.

Reflecting on the changes that we see in our student data, the important next question becomes, Why? Why do our students grow in the way that they do?  Why do some students change differently than others? What experiences influence positive or negative changes in intercultural competence? In my mind, these are the more interesting questions to explore because they can point us toward concrete ways that we might improve the education we provide.

Of course, there are an almost infinite number of questions that we could ask of our data. Are there specific experiences from participating in distinct activities that improve intercultural competence? What about the possibility that a combination of experiences (especially in a specific sequence) might do more than any single experience? Finally, is it possible that a particular dynamic that pervades one’s college experience might transcend an individual experience or combination thereof?

Although we were able to solicit a long list of research questions to test from the folks in attendance at our first Friday Conversation, I’m sure there are many more that we have yet to consider. So please add a research question or two in the comments section below.  We will test as many as we possibly can.  And we will report back at the winter Friday Conversation and on this blog all of what we find.

So put on those hypothesizing caps, and send us your suggestions. If we can find a way to test it, we will!

Make it a good day,


Retention is up. Great. But can we take any credit for this?

A couple of weeks ago I shared with everyone the eye-popping news of our most recent 1st-2nd year retention rate. The CliffsNotes/SparkNotes/Jiffynotes version (whatever happened to Reader’s Digest?) of that post is:

  1. 88.9% of the 2015 class came back this fall,
  2. this is the highest retention rate we’ve ever recorded (and more than a full point higher than the previous high set in 2010), and
  3. maybe we shouldn’t second-guess goals that seem at first to be too high.

Since we’ve made a concerted investment of people and resources toward improving our 1st-2nd year retention rate, this is nice to see. But since our efforts have been targeted to improve retention rates among several specific populations of students (i.e., students who have historically persisted at lower rates than the overall population), it makes sense to look and see whether those specific efforts are bearing fruit. After all, money doesn’t grow on trees at Augustana (although we’d be stinkin’ rich if it did!); it actually matters a lot whether or not our investments are paying off. So let’s dig a little deeper and examine the last four years of retention rates among four groups of students long known to leave Augustana at higher rates than the rest of the first-year class: students of color, low income students, less academically prepared students, and first generation students.

While it would be a mistake to think that these groups are somehow completely independent of one another (i.e., there are certainly individual students who fit into more than one of these categories), it is true that research on the factors that influence the decision to withdraw from college has found differences among these four groups. Students of color often feel relegated to the margins of a college community – especially when that community is mostly white. Low income students often find that financial constraints undermine their ability to access the college experience offered to, and touted by, mainstream students. Students who are less academically prepared often find themselves overwhelmed and without the resources that might help them adjust to the academic rigors of college. And first generation students often struggle with a lack of confidence in their ability to succeed in college and a lack of knowledge about navigating the unwritten rules and norms that the rest of us unconsciously perpetuate every day.

With these findings in mind, we have developed specific programs to address each of these issues for students who fit into these groups. So . . . are these programs working?

Below I’ve listed the retention rates for each student subpopulation over the last four years. Remember, the overall retention rates for each of the last four years are:

  • 2013 – 84.9%
  • 2014 – 82.9%
  • 2015 – 86.1%
  • 2016 – 88.9%

Over the same period, these are the retention rates for:

Students of Color

  • 2013 – 81.3%
  • 2014 – 78.4%
  • 2015 – 82.2%
  • 2016 – 86.1%

Low Income Students

  • 2013 – 81.3%
  • 2014 – 80.8%
  • 2015 – 83.4%
  • 2016 – 86.6%

Less Academically Prepared Students

  • 2013 – 75.0%
  • 2014 – 78.6%
  • 2015 – 77.4%
  • 2016 – 83.9%

First Generation Students

  • 2013 – N/A (we didn’t create an easy way to track these students until 2013)
  • 2014 – 80.8%
  • 2015 – 80.5%
  • 2016 – 85.3%

Clearly, the retention rates of students of color, low income students, less academically prepared students, and first generation students have improved. And although the nerdy PhD in me would like to see a few more years of retention data before announcing that we have a definitive trend, at the very least we can say that our investments of money, positions, and space into these programs are not not working. Frankly, I think it’s far more reasonable to suggest that our efforts seem to be working quite well.

Lest you forget what this means in terms of real money, the difference in net revenue between last year’s retention rate of 86.1% and this year’s retention rate of 88.9% isn’t chump change. If we conservatively assume that:

  1. term-to-term attrition rates don’t change (when in reality they are almost certain to drop as the overall year-to-year retention rate goes up), and
  2. actual revenue per first year student will not be less than the five-year low of $14,251 (2014/15),

the estimated increased net tuition revenue to the college this year ends up at just over $270,000. Moreover, the estimated increased net comprehensive fee revenue (i.e., including housing and student fees in addition to tuition) – again using the five-year low in actual numbers – ends up closer to $432,800.
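For anyone who wants to see the arithmetic behind the tuition estimate, here is a quick sketch. The retention rates and the $14,251 five-year-low net tuition figure come from the post; the first-year class size (roughly 680 students) is my assumption, so treat the output as a sanity check rather than the official number.

```python
# Back-of-envelope net tuition revenue gain from the retention increase.
# All figures per the post except class_size, which is assumed.
class_size = 680                # assumed first-year cohort size
net_tuition = 14_251            # five-year low in net tuition per student

# A 2.8-point retention bump means roughly 19 more students returning.
extra_students = (0.889 - 0.861) * class_size
tuition_gain = extra_students * net_tuition
print(f"${tuition_gain:,.0f}")  # lands just over $270,000
```

The comprehensive-fee estimate works the same way; swapping in a per-student net comprehensive fee figure in place of net tuition pushes the total toward the larger number quoted above.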

By the way, if you want to test my math you can find all of the numbers I referenced above on the college dashboard that is always posted on the Augustana Institutional Research web page, along with the detailed breakdown of retention rates.

Triangulating these data with all of the anecdotal evidence I’ve seen over the past year, I’m gonna go out on a pretty thick limb and say that I believe that what we have been doing is working. So the next time you see someone whom you think might be involved in this work (I think you can figure that part out on your own) thank them for all of the effort they have put into helping our students succeed. And if you hear someone grouse about additional resources being invested when things are tight all over, it might be worthwhile to remind them that, sometimes those investments are worth it.

Not to mention that whole “educational mission” thing.

Make it a good day,


What if you could get your students to daydream about learning?

Rumor has it that in the late afternoon, after the students have all retreated to upper campus, you might catch a glimpse of a lone professor strolling under the leafy canopy, daydreaming of students who ponder their learning just for the fun of it. Although this might be an ever-so-slight exaggeration (it’s not THAT leafy), this vision of liberal arts nirvana isn’t just a fool’s paradise. When testing the effect of the first-year survey item, “I find myself thinking about what I’m learning in my classes even when I’m not in class or studying,” we regularly find that students who strongly agree with this statement also earn better grades (no matter their incoming ACT scores), say that they would definitely choose to come to Augustana again, and strongly agree that they can think of specific experiences that helped them clarify their life or career goals.

It appears that students who think about their learning when they don’t have to aren’t just a professor’s dream come true; this behavior is one indicator of a very successful student. Of course, I can already hear you blurting out the obvious, only semi-rhetorical, albeit entirely reasonable, next question.

“But we don’t have any control over that trait, do we?”

I can understand why you might ask that question, especially in that way. Sometimes it feels like all we do is implore students to embrace learning and truly engage the stuff we are trying to teach them. And sadly, all too often it can feel like those passionate pleas just bounce off the classroom’s back wall, reminding us of our inadequacies as the slap-back echo of our own voice hits us in the face.

But if there were some things that you could do, whether you are working with students in the classroom or outside the classroom, that might actually turn students into more intellectually curious, contemplative thinkers, would you do it? Sign me up!

We’ve just finished analyzing last year’s first-year student data and it looks like two items that we’ve recently introduced to the survey might point us toward some ways that could increase the degree to which students think about what they learn in class when it isn’t required. The first item that we found to be predictive of students’ thinking about their learning when they don’t have to asks students the degree to which they agree or disagree with this statement:

“My instructors recommended specific experiences outside of class (such as a lecture, forum, public meeting, demonstration, or other event) on campus or in the community that would complement or enhance my learning in class.”

Even after accounting for students’ sex, race, incoming ACT score, and socioeconomic status, as students reported these kinds of recommendations coming from their instructors more frequently, they also reported that they found themselves thinking about the things they learned in class even when they weren’t in class or studying.
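For readers unfamiliar with the phrase “even after accounting for,” it describes a regression in which the demographic variables enter as controls alongside the predictor of interest. Below is a minimal sketch of that idea using ordinary least squares on synthetic data; every variable name, distribution, and effect size is invented for illustration and does not come from the actual survey.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the survey variables (all hypothetical).
act = rng.normal(25, 4, n)            # incoming ACT score
ses = rng.normal(0, 1, n)             # socioeconomic index
female = rng.integers(0, 2, n)        # sex indicator (0/1)
recommend = rng.integers(1, 6, n)     # "instructors recommended..." (1-5)

# Outcome built so the recommendation item has a true effect of 0.3
# after the controls are taken into account.
think = (1.0 + 0.3 * recommend + 0.05 * act
         + 0.1 * female + rng.normal(0, 0.5, n))

# OLS with an intercept, the predictor of interest, and the controls.
X = np.column_stack([np.ones(n), recommend, act, ses, female])
beta, *_ = np.linalg.lstsq(X, think, rcond=None)
print(round(beta[1], 2))  # estimated effect of the recommendation item
```

Because the controls are in the model, the coefficient on the recommendation item reflects its association with the outcome holding sex, ACT, and socioeconomic status constant, which is exactly the logic behind the survey finding.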

In addition, we found a similar relationship between students’ thinking about learning and the degree to which they agreed with this statement:

“Symposium Day activities influenced the way that I now think about real world issues.”

It strikes me that these two items fit together perfectly.  On Tuesday (that would be tomorrow!), we hold our first Symposium Day of the year. In addition to four fantastic featured speakers, faculty, staff, and students will present an array of thought-provoking sessions that tackle one or more aspects of the deliberately broad theme for the day, “Crossroads.” Some crossroads are physical, some are ideological, and some are about values and standing up for a set of principles even when it might not be the most popular thing to do. No matter the angle you take, every one of us faces these sorts of choices every day. If we’re paying attention, these moments can bring powerful meaning into our lives.

So if you want your students to be more likely to think about what they are learning when they don’t have to, take advantage of the upcoming Symposium Day and encourage them to soak up the atmosphere and the opportunity to choose what they want to learn. Maybe find a few sessions that sound particularly intriguing or controversial and suggest that your students practice hearing out an idea that they might not initially agree with.

Who knows? By the end of tomorrow that rumored sighting of meandering thinkers might include a healthy dose of students, too.

Make it a good day,


Not Much to Say . . . Except, “Wow!”

Although President Bahls announced it at last week’s faculty meeting, it’s possible that the news about our latest first-to-second year retention rates hasn’t quite made it out to everyone who reads this blog. So just in case you haven’t heard, let me share with you a little number that still has me shaking my head a little bit.

  • 1st-2nd year retention rate of Augustana’s 2015 freshman class – 88.9%

Wow.  Just, wow.

So why am I so blown away by this number?

In the fall of 2010, we recorded a retention rate of 87.8%. At the time this was the highest retention rate we’d seen in 25 years of tracking the persistence of first-year students to the second year. For almost a quarter of a century, Augustana’s retention rate had bounced around somewhere between 82 and 87 percent. So in context, 87.8% was an awfully high number and more than a few of us (particularly me) didn’t think we’d be able to do much better than that.

But three years ago while we were in the midst of developing the Augustana 2020 strategic plan, someone asked me to estimate (AKA guess with data) what might be the best possible retention rate that Augustana could achieve given our student profile and educational resources. After crunching some numbers, I suggested that if the stars aligned we might be able to hit a retention rate of 90% in a given year. In all honesty, I wasn’t convinced that we’d ever break 88% since I’ve never seen the stars align outside of a Disney cartoon. Even in my most optimistic moments, I certainly didn’t think we’d crack 88% until we got all of the programming described in Augustana 2020 up and running and had worked out the kinks. If you had forced me to guess before the start of the fall term what our retention rate would be this year, I would have probably said something just short of 87%.

But we blew past 87%. We blew past 88%. We almost cracked 89%. Wow.

The part that is most surprising to me is that we have just started to get all of our programs aimed at first-year student success up and running. Folks have been working extremely hard, but I don’t think anyone would say that we have all hit our stride yet.

And as if all that weren’t enough, it appears that our retention efforts might just be spilling over to our second-year students. Our retention rate for 2nd-3rd year students this fall hit 94.4% – the highest for those students since we began tracking that number six years ago.

Now it wouldn’t be right if I didn’t acknowledge that this might be an anomaly; next year we could be lamenting a retention rate that is back within our familiar range. But maybe, just maybe, we might be on to something and all of the work that so many people have been doing over the last two years is starting to pay off.

Will we actually get to 90%?  I don’t know.  But the next time someone asks me to give them a ceiling prediction for what the Augustana community is capable of doing, I’m going to think twice before I tell anyone what I think we can’t do.

Congratulations to everyone who has worked so hard on behalf of our students.  It’s humbling to be on the same team with all of you.

Make it a good day,


What do we know about successful first-year Augustana students?

Good morning! I hope you took some time over the three-day weekend to relax and refuel. Before you know it, we’ll be watching the leaves turn, wondering where the warm weather went, and counting down the days until the end of fall term.

Although it seems like school has just started, some of our first-year students already feel like they might be in deeper water than they can handle. And even though you can tell them that the trimester current moves faster than the gentle drift of semesters, it doesn’t get real until the first wave hits them in the face. So week three is the perfect time to hammer home the behaviors that make first-year Augustana students successful.

Now that we have more than five years of data from first-year students that track their behaviors, experiences, and growth, we can start to make some pretty confident assertions about what successful students do. Based on repeated analyses that identify statistically significant relationships between specific student behaviors and outcomes like GPA, a sense of fitting in, and an increased sense of direction and purpose, successful first-year students do these three things.

  • Successful first-year students build a healthy network of friends, guides, mentors, and resources.

This doesn’t mean that successful students have a larger network of friends, guides, and mentors than less successful students. The key factor is the healthy nature of that network. This means that a successful student’s friend network brings out the best in each person and stretches every member of that network to make their community a better place. Likewise, successful students find at least one guide or mentor who both believes in them and challenges them to grow, mature, and think in more complex terms. Finally, successful students seek out the campus resources that they might need before they actually need them, and use them to get better instead of waiting until trouble bubbles up.

  • Successful first-year students dedicate themselves to studying smart.

Successful Augustana students have dedicated themselves to four rules that define the way they study. Data from Augustana students repeatedly indicates that these behaviors impact everyone regardless of their pre-college academic preparation or ability.

  1. Religiously use a planner. Although important, it’s not just to keep track of what things need to get done. Really, it’s about organizing and logging when to do each thing on that list.
  2. Study during the day. Just like an 8 to 5 job, get up early and make every minute of the day count – especially the time between classes. The impact of this behavior on stress, sleep, and the quality of academic work turns out to be sort of amazing.
  3. Don’t study in the dorm room. Even though first-year students might be used to studying in their rooms when they were in high school, the residence hall environment is pretty different from home in terms of visitor frequency, noise, and potential distractions. Similar to what happens to students who do most of their studying at night instead of during the day, studying in one’s dorm room invites a level of inefficiency that often makes studying take longer and be less effective.
  4. Build a like-minded study group. Sometimes it is necessary to study alone, but other times it’s much more beneficial to study with a group. Successful students find like-minded students (not unlike the characteristics of a healthy network of friends) to study with when a group session might be particularly helpful.

If you want your students or your advisees to make the most of their first term at Augustana, tell them to grab hold of those four points and don’t let go.

  • Successful Augustana students take charge of their own growth.

It’s hard to get through a single day without seeing or hearing an invitation or exhortation to get involved in a student club, activity, organization, or event. And we’ve all seen the student email signature that lists membership in more groups than there is time in the day. But the most successful Augustana students aren’t the ones who are involved in a lot of stuff. Instead, the most successful students are the ones who focus on experiences that specifically impact their growth in learning more about themselves and learning more about how they can better relate to others. This bit of advice can get lost if we don’t emphasize it to our students – don’t just get involved in stuff, get involved in the right stuff.

In addition to choosing the right combination of involvement in activities, organizations, and events, successful first year Augustana students connect with CORE right away. They recognize the importance of the relationship between the things they do right now and the person they want to be when they graduate. All the services that CORE provides help students embrace and develop a sense of purpose and fuel an increasing sense of momentum in that direction. As simple as it might sound, students who start building a resume or a grad school portfolio during their first year are more likely to have a job or a graduate school placement at graduation – regardless of their college GPA. This isn’t magic or assembly line educating – it’s just that these students start considering and articulating the connection between what they are doing now and where they want to be four years from now.

So if you want to drop some knowledge on your students that is virtually guaranteed to make a difference, hit them with these three golden nuggets.

Make it a good day,



Three highlights from the 2016 Student Readiness Survey Results

As most of you know by now, we developed the Student Readiness Survey a few years ago to give us more nuanced information about key traits and dispositions that impact the nature of our students’ transition to college. Instead of basing our conclusions about readiness for college on indicators of a student’s academic preparation or intellectual strength, we wanted to zero in on the dispositions and traits that make a student successful in every aspect of the residential college experience. The results of this instrument have become a key piece of first-year advising and have turned out to be statistically predictive of numerous important developmental and learning outcomes.

The 36 statements on the survey describe a trait or a disposition. For each item, the respondent chooses from a response set that ranges from “never like me” to “always like me.” As an example, one item states, “I like to cooperate with others.” The response that a student selects gives us a glimpse into the way that he or she perceives him or herself regarding an important interpersonal skill that will undoubtedly shape the transition to residential college life.

As you might suspect, most of our students’ responses tend toward the kinds of traits and dispositions that we’d like to see (e.g., for the item about cooperation listed above, scoring “never like me” = 1 and “always like me” = 5 produces an average across all incoming students of 4.26). However, there are some dips in scores on a few items that might be telling.

There are six groups of items that are organized into categories, or as a stats geek would call them, scales. The scales attempt to capture:

  • Academic Confidence
  • Academic Habits
  • Comfort with Social Interaction
  • Interpersonal Skills
  • Persistence and Grit
  • Stress Management

Interestingly, a gap appears in the average scale scores that splits these six scales into two groups. The scores for Academic Confidence, Persistence and Grit, and Interpersonal Skills each average between 4.11 and 4.25. By contrast, Academic Habits, Stress Management, and Comfort with Social Interaction each average between 3.76 and 3.85. Even at its narrowest (i.e., 3.85 versus 4.11), this gap is statistically significant – unlikely to be the product of random chance alone. I’m not sure I have any answers – or even hypotheses – as to why this might be, but it seems to me that there might be something more fundamental going on here.
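For the statistically curious, a claim like that can be checked with a two-sample t-test on the scale means. Here is a minimal sketch using summary statistics; the two means (4.11 and 3.85) come from the survey above, but the standard deviations (0.6) and sample sizes (500 per group) are invented placeholders, so the output illustrates the method rather than reproducing our actual analysis.

```python
import math

def welch_t_from_stats(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's two-sample t statistic from summary statistics,
    with a normal approximation for the two-sided p-value
    (reasonable when both samples are large)."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    t = (mean1 - mean2) / se
    p = math.erfc(abs(t) / math.sqrt(2))  # two-sided p, normal approximation
    return t, p

# Hypothetical inputs: the means are from the post; the standard
# deviations and sample sizes are assumptions for illustration only.
t, p = welch_t_from_stats(4.11, 0.6, 500, 3.85, 0.6, 500)
print(f"t = {t:.2f}, p = {p:.2g}")
```

With a quarter-point gap and samples of this size, the test comes back wildly significant, which is the intuition behind calling even the narrowest version of the gap more than random chance.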

In addition, the three individual items with the lowest overall average scores all sit in the Academic Habits category.

  • When I am confused by an assignment, I seek help right away. (3.48)
  • I highlight key points when I read assigned materials. (3.39)
  • I start homework assignments early enough to avoid having to rush to complete them. (3.38)

Each of these items tries to capture an element of academic habits that would indicate self-efficacy and the wherewithal to take assertive action in response to a challenge. These items seem to fit into a larger conversation about the degree to which we need to move many students from thinking that education “happens to them” to thinking that “they make their learning happen.”

In your conversations with students this week, just as they are starting to feel the first wave of readings and homework fully wash over them, it might make sense to consider how far each of your students still has to go in making that shift. Sometimes it turns out that we have to tell our students how to do what we want them to do just as much as we have to tell them what we want them to turn in. I am realizing how much I have forgotten about that difference as I am teaching an FYI 100 section for the first time.

So hang in there with your students, even when they give you that glazed look of overwhelminghood (I know it’s not a word, but you get the idea).

Make it a good day,


Something tells me this is gonna be a great year!

Good morning everyone!

Welcome to campus – whether you can’t remember being anywhere else in late August or you are the first person in your family to start your fall on a college campus! No matter how you got here, how long you’ve been here, or how soon you’ll be diving into your next great adventure, I’m really glad each of you is here right now.

Somehow you’ve stumbled onto a blog called “Delicious Ambiguity” written by me, Mark Salisbury. (Ok, so I emailed you the link and you clicked on it thinking it might be important.) I’m the Director of Institutional Research and Assessment at Augustana College or, as some students have taken to calling me (a supreme compliment, IMHO), the Chief Nerd. I started writing this blog in 2011 as a column in the Faculty Newsletter. The goal then was to share snippets of Augustana data with everyone and hopefully encourage each of us to take a moment to ponder the implications of that data. Most of the time, it’s been statistical data (hence the name Chief Nerd), but sometimes it’s data that comes from interviews or focus groups. No matter the source, I try to explore data points that can help all of us – faculty, staff, and students alike – maximize our experience at Augustana. And if you ever think to yourself, “Why doesn’t Mark write about that?” send me an email or comment at the bottom of a blog post. If we’ve got the relevant data, I’ll try to write about it.

With every new group of students, be they traditional freshmen or non-traditional transfers, we gather a set of data points that help us better understand the breadth and depth of the diversity contained within that group. Tracking these data points is one way to remind all of us that cultivating a diverse and vibrant community is about exponentially more than just tracking skin color or biological sex.

Today I’d like to share two data tidbits from our incoming class that seem worth pondering.

First, 31.8% of our new students indicate that neither of their parents earned a four-year college degree. Certainly a substantial proportion of these students come from families where they are the first to go to any kind of college. Equally important, this is not a new phenomenon; the proportion has stayed near 30% since we began asking incoming students this question in 2012, so, as best we can tell, Augustana has long enrolled a substantial proportion of “first generation” college students. While we can certainly parse the nuances of this student category, the reality remains that many students may not grasp the unstated but oft-assumed implications of our liberal arts college culture, both in the intentions behind various policies and in the behaviors that many of us enact every day without a second thought. Moreover, many of these students likely harbor an additional layer of internal anxiety about whether or not they truly “belong” in college at all, let alone at a private institution like Augustana.

Second, Augustana’s growing enthusiasm for interfaith understanding in recent years couldn’t have come at a better time. Our incoming class is peppered with students from every kind of western and non-western faith. We have new students who self-identify as Muslim, Buddhist, Hindu, Jewish, Mormon, Greek Orthodox, Catholic, Episcopal, Lutheran, Presbyterian, Methodist, Baptist, Pentecostal, non-denominational, and a whopping 6% of students who categorized themselves as “other.” Oh, and to top it off, 16.7% of our incoming students identify as “no religious background” or “atheist.” I don’t know if this is a one-year phenomenon or if we’ve crossed a tipping point of some sort, but this year that group of students is larger than our incoming proportion of Lutheran students (14.4%).

These two data points hold important implications for the assumptions we make about individual students. All of us probably have some growing to do as we think about the way that we interact with each student. I certainly do. I’ve already made the mistake of assuming that someone I had just met came to Augustana from another country. Based on this faulty assumption, I made a comment that I wish I could take back because it might have been interpreted to reinforce the sense that I am a part of the “natural” in-group and they are still a member of a “probationary” out-group. I owe that person an apology, one that I intend to deliver soon.

I don’t say any of that to hold myself up as some grand example, but rather to suggest that adapting to this increasingly prevalent and multifaceted diversity is a process during which we are each likely to stumble. But in stumbling, depending on how we respond to it, we might just be able to communicate more clearly that we genuinely want to make Augustana a welcoming and inclusive place for everyone – no matter where they are from, who they are, or what they want to become.

Make it a good day,


Triangulating our assessment of quantitative literacy

Whether we like it or not, the ability to convey, interpret, and evaluate data affects every part of our personal and professional lives. So it’s no surprise to find quantitative literacy among Augustana’s nine student learning outcomes. Yet, of all those outcomes, quantitative literacy may be the most difficult to pin down. First, the concept is relatively new compared to other learning outcomes like intercultural competence or critical thinking. Second, there isn’t nearly the range of measurement mechanisms – surveys or otherwise – that capture this concept effectively. And third, quantitative literacy is the kind of skill that is particularly susceptible to inflated self-assessment (i.e., the tendency to believe that you are better at a desirable intellectual skill than you actually are).

Despite the obstacles I noted above, the Assessment for Improvement Committee (AIC) felt like this was an outcome ripe for the assessing. First, we’ve never really measured quantitative literacy among Augustana students before (it wasn’t addressed in the Wabash National Study when we participated between 2008 and 2012). Second, it isn’t clear that we know how each student develops this skill, as we have defined it in our own college documents, beyond what a student might learn in a “Q” course required by the core curriculum. As a result, it’s entirely possible that we have established a learning outcome for all students that our required curriculum isn’t designed to achieve. Uh oh.

In all fairness, we do have one bit of data – imperfect as it is. A few years ago, we borrowed an idea from the National Survey of Student Engagement (NSSE) and inserted a question into our senior survey that asked students to respond to the statement, “I am confident in my ability to interpret numerical and statistical quantities,” giving them five response options that ranged from “strongly disagree” to “strongly agree.”

Since we began asking this question, about 75% of seniors have indicated that they “agree” or “strongly agree” with that statement. Unfortunately, our confidence in that number began to wane as we looked more closely at those responses. For that number to be credible, we would expect students from majors with no quantitative focus to be less confident in their quantitative abilities than students from majors that employ extensive quantitative methods. However, we often found the opposite. Students who had learned something about how complicated quantitative methods can be were less confident in their quantitative literacy skills than students who had no exposure to such complexities, almost as if knowing more about the nuances and trade-offs that make statistics such a maddeningly imperfect exercise had a humbling effect. In the end, it appeared that in the case of quantitative literacy, ignorance might indeed be bliss (a pattern that echoes the well-documented Dunning-Kruger effect).

So last year the AIC decided to conduct a more rigorous study of our students’ quantitative literacy skills. To make this happen, we first had to build an assessment instrument that matched our definition of quantitative literacy. Kimberly Dyer, our measurement ninja, spent weeks poring over the research on quantitative literacy and the survey instruments that had already been created to find something that fit our definition of this learning outcome. Ultimately, she combined the best of several surveys to build something that matched our conception of quantitative literacy and included questions that addressed interpreting data, understanding visual presentations of data, calculating simple equations (remember story problems from grade school?), applying findings from data, and evaluating the assumptions underlying a quantitative claim. We then solicited faculty volunteers who would be willing to take time out of their upper-level classes to give their students this survey. In the end, we were able to get surveys from about 100 students.

As you might suspect, the results of this assessment project painted a rather more sobering picture of our students’ quantitative literacy skills. Below are the proportions of questions within each of the aforementioned quantitative literacy categories that students who had completed at least one Q course answered correctly.

  • Interpreting data  –  41%
  • Understanding visual presentations of data  –  41%
  • Calculating simple equations  –  45%
  • Applying findings from data  –  52%
  • Evaluating assumptions underlying a quantitative claim  –  51%

Interestingly, students who had completed two Q classes didn’t fare any better. It wasn’t until students had taken three or more Q classes that the proportion of correct answers improved significantly.

  • Interpreting data  –  58%
  • Understanding visual presentations of data  –  59%
  • Calculating simple equations  –  57%
  • Applying findings from data  –  65%
  • Evaluating assumptions underlying a quantitative claim  –  59%
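One way to sanity-check whether a jump like 41% to 58% correct clears the bar of statistical significance is a two-proportion z-test. The percentages below come from the results above, but the underlying counts (200 question responses per group) are purely hypothetical – we never report the group sizes here – so this sketches the method rather than reproducing the AIC’s actual analysis.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test with a two-sided p-value
    (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: 41% vs. 58% correct on "interpreting data" items
# are from the results above; treating them as 82/200 and 116/200
# question responses is an assumption for illustration only.
z, p_value = two_proportion_z(82, 200, 116, 200)
print(f"z = {z:.2f}, p = {p_value:.3g}")
```

Note how much the verdict depends on those assumed counts: the same 17-point jump measured over only 50 responses per group would not reach conventional significance, which is one more reason to treat small-sample findings like ours with care.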

There are all kinds of reasons that we should interpret these results with some caution – a relatively small sample of student participants, the difficulty of the questions in the survey, or the uneven distribution of the student participants across majors (the proportion of STEM and social science majors who took this survey was higher than the proportion of STEM and social science majors overall). But interpreting with caution doesn’t mean that we discount these results entirely. In fact, since prior research on students’ self-reported attainment of learning outcomes indicates that students often overestimate their abilities on complex skills and dispositions, the 75% of students who agree or strongly agree is probably substantially higher than the proportion of graduates who are actually quantitatively literate. Furthermore, since the students who took this survey skewed toward majors where quantitative literacy is a more prominent part of the curriculum, these findings are more likely to overestimate the average student’s quantitative literacy than underestimate it. Triangulating these data with prior research suggests that our second set of findings might paint a more accurate picture of our graduates.

So how should we respond to these findings? To start, we probably ought to address the fact that there isn’t a clear pathway between what students are generally expected to learn in a “Q” course and what the college outcome spells out as our definition of quantitative literacy. That gap alone leaves each student’s likelihood of meeting our definition of quantitative literacy up to chance. So our first task might be to explore how we can ensure that all students get the chance to achieve this outcome, especially those who major in disciplines that don’t normally include quantitative literacy skills.

The range of quantitative literacy – or illiteracy, as the case might be – is a gnarly problem. It’s not something that we can dump onto a single course or experience and expect that box to be checked. It’s hard work, but if we are serious about the learning outcomes that we’ve set for our students and ourselves, then we can’t be satisfied with leaving this outcome to chance.

Make it a good day,