One of the “exciting” parts of my job is building surveys. I’ve worked with many of you over the past two years to construct new surveys to answer all sorts of questions. On the one hand, it’s a pretty interesting challenge to navigate all of the issues inherent in designing what amounts to a real life “research study.” At the same time, it can be an exhausting project because there are so many things you just can’t be sure of until you field test the survey a few times and find all of the unanticipated flaws. But in the end, if we get good data from the new survey and learn things we didn’t know before that help us do what we do just a little bit better, it’s a pretty satisfying feeling.
As many of you already know, Augustana College has been engaged in a major change over the last several years in terms of how we assess ourselves. Instead of determining our quality as an institution based on what we have (student incoming profile, endowment amount, etc.), we are trying to shift to determining our quality based on what we do with what we have. Amazingly, this places us in a very different place than many higher education institutions. Unfortunately, it also means that there aren’t many examples on which we might model our efforts.
One of the implications of this shift involves the nature of the set of institutional data points that we collect. Although many of the numbers we have traditionally gathered continue to be important, the measure of ourselves that we hope to capture is what we do with those traditional numbers. And while we have long maintained pretty robust ways of obtaining the numbers you would see in our traditional dashboard, our mechanisms for gathering data that would help us assess what we do with what we have are not yet robust enough.
So over the last few months, I have been working with the Assessment for Improvement Committee and my student assistants to build a new senior survey. While the older version had served its purpose well over more than a decade, it was ready for an update, if not an overhaul.
The first thing we’ve done is move from a survey of satisfaction to a survey of experiences. Satisfaction can sometimes give you a vague sense of customer happiness, but it often falls flat in trying to figure out how to make a change – not to mention the fact that good educating can produce customer dissatisfaction if that customer had unrealistic expectations or didn’t participate in their half of the educational relationship.
The second thing we’ve done is build the senior survey around the educational and developmental outcomes of the entire college. If our goal is to develop students holistically, then our inquiry needs to be comprehensive.
Finally, the third thing we’ve done is “walk back” our thinking from the outcomes of various aspects of the college to the way that students would experience our efforts to produce those outcomes. So, for example, if the outcome is intercultural competence, then the question we ask is how often students had serious conversations with people who differed from them by race/ethnicity, culture, social values, or political beliefs. We know this is a good question to ask because we know from a host of previous research that the degree to which students engage in these experiences predicts their growth in intercultural competence.
If you want to see the new senior survey, please don’t hesitate to ask. I am always interested in your feedback. In the meantime . . .
Make it a good day!