Improving Interfaith Understanding at Augustana

This is a massively busy week at Augie. We had a packed house of high school students visiting on Monday (I’ve never seen the cafeteria so full of people!), the Board of Trustees will gather on campus for meetings on Thursday and Friday, and hundreds of alumni and family members will arrive for Homecoming over the weekend. With all of this hustle and bustle, you probably wouldn’t have noticed three unassuming researchers from the Interfaith Diversity Experiences and Attitudes Longitudinal Survey (IDEALS) quietly talking to faculty, staff, and students on Monday and Tuesday. They were on campus to find out more about our interfaith programs, experiences, and emphasis over the past several years.

Apparently, we are doing something right when it comes to improving interfaith understanding at Augustana. Back in the fall of 2015, our first-year cohort joined college freshmen from 122 colleges and universities around the country to participate in a 4-year study of interfaith understanding development. The study was designed to collect data from those students at the beginning of the first year, during the fall of the second year, and in the spring of the fourth year. In addition to charting the ways in which these students changed during college, the study was also constructed to identify the experiences and environments that influence this change.

As the research team examined the differences between the first-year and second-year data, an intriguing pattern began to emerge. Across the entire study, students didn’t change very much. This wasn’t so much of a surprise, really, since the Wabash National Study of Liberal Arts Education had found the same thing. However, unlike students across the entire study, Augustana students consistently demonstrated improvement on most of the measures in the study. This growth was particularly noticeable in areas like appreciative knowledge of different worldviews, appreciative attitudes toward different belief systems, and global citizenship. Although the effect sizes weren’t huge, a consistent pattern of subtle but noticeable growth suggested that something good might be happening at Augustana.

However, using some fancy statistical tricks to generate an asterisk or two (denoting statistical significance) doesn’t necessarily help us much in practical terms. Knowing that something happened doesn’t tell us how we might replicate it or how we might do it even better. This is where the qualitative ninjas need to go to work and talk to people (something we quant nerds haven’t quite figured out how to do yet). Guided by the number-crunching, the real gems of knowledge are more likely to be unearthed through focus groups and interviews where researchers can delve deep into the experiences and observations of folks on the ground.

So what did our visiting team of researchers find? They hope to have a report of their findings for us in several months. So far all I could glean from them is that Augustana is a pretty campus with A LOT of steps.

But there is a set of responses from the second-year survey data that might point in a direction worth contemplating. There is a wonderfully titled grouping of items called “Provocative Encounters with Worldview Diversity,” from which the responses to three statements seem to set our students’ experience apart from students across the entire study as well as students at institutions with a similar Carnegie Classification (Baccalaureate institutions – arts and sciences). In each case, we see a difference in the proportion of students who responded “all the time” or “frequently.”

  1. In the past year, how often have you had class discussions that challenged you to rethink your assumptions about another worldview?
    1. Augustana students: 51%
    2. Baccalaureate institutions: 43%
    3. All institutions in the study: 33%
  2. In the past year, how often have you felt challenged to rethink your assumptions about another worldview after someone explained their worldview to you?
    1. Augustana students: 44%
    2. Baccalaureate institutions: 34%
    3. All institutions in the study: 27%
  3. In the past year, how often have you had a discussion with someone of another worldview that had a positive influence on your perceptions of that worldview?
    1. Augustana students: 48%
    2. Baccalaureate institutions: 45%
    3. All institutions in the study: 38%

In the past several years, there is no question that we have been trying to create these kinds of interactions through Symposium Day, Sustained Dialogue, course offerings, a variety of co-curricular programs, and increased diversity among our student body. Some of the thinking behind these efforts dates back six or seven years, when we could see from our Wabash National Study data and our prior NSSE data that our students reported relatively fewer serious conversations with people who differed from them in race/ethnicity and/or beliefs/values. Since a host of prior research has found that these kinds of serious conversations across difference are key to developing intercultural competence (a skill that certainly includes interfaith understanding), it made a lot of sense for us to refine what we do so that we might improve our students’ gains on the college’s learning outcomes.

The response to the items above suggests to me that the conditions we are trying to create are indeed coming together. Maybe, just maybe, we have successfully designed elements of the Augustana experience that are producing the learning that we aspire to produce.

It will be very interesting to see what the research team ultimately reports back to us. But for now, I think it’s worth noting that there seems to be early evidence that we have implemented intentionally designed experiences that very well might be significantly impacting our students’ growth.

How about that?!

Make it a good day,

Mark

 

Does Our Students’ Interest in Complex Thinking Change over Four Years?

One of the best parts of my job is teaming up with others on campus to help us all get better at doing what we do. Over the past seven years, I’ve been lucky enough to work with almost every academic department or student life office on projects that have genuinely improved the student experience. But if I had to choose, I think my favorite partnership is the annual student learning assessment initiative that combines the thoughtfulness (and sheer intellectual muscle) of the Assessment for Improvement Committee with the longitudinal outcome data (and nerdy statistical meticulousness) from the Office of Institutional Research and Assessment.

For those of you who don’t know about this project already, the annual student learning assessment initiative starts anew every summer – although it takes about four years before any of you see the results. First, the IR office chooses a previously validated survey instrument that aligns with one of Augustana’s three broad categories of learning outcomes. Second, we give this survey to the incoming first-year class just before the fall term starts. Third, when these students finish their senior year we include the same set of questions in the senior survey, giving us a before-and-after set of data for the whole cohort. Fourth, after linking all of that data with freshman and senior survey data, admissions data, course-taking data, and student readiness survey results, we explore both the nature of that cohort’s change on the chosen outcome as well as the experiences or other characteristics that might predict positive or negative change on that outcome.

The most recent graduating cohort (spring 2017) provided their first round of data in the fall of 2013. Since we had already started assessment cycles of intrapersonal conviction growth (the 2011 cohort) and interpersonal maturity growth (the 2012 cohort), it was time to turn our attention to intellectual sophistication (the category that includes disciplinary knowledge, critical thinking and information literacy, and quantitative literacy). After exploring several possible assessment instruments, we selected an 18-item survey called the Need for Cognition Scale. This instrument tries to get at the degree to which the respondent is interested in thinking about complicated or difficult problems or ideas. Since the Need for Cognition Scale had been utilized by the Wabash National Study of Liberal Arts Education, its researchers had already produced an extensive review of the ways in which this instrument correlated with aspects of intellectual sophistication as we had defined it. And since this instrument is short (18 questions) and cheap (free), we felt very comfortable putting it to work for us.

Fast forward four years and, after some serious number crunching, we have some interesting findings to share!

Below I’ve included the average scores from the 2013 cohort when they took the Need for Cognition Scale in the fall of their first year and in the spring of their fourth year. Keep in mind that scores on this scale range from 1 to 5.

Fall 2013: 3.43
Spring 2017: 3.65

The difference between the two scores is statistically significant, meaning that the improvement is very unlikely to be a product of chance: our students really do seem to become more interested in thinking about complicated or difficult problems or ideas.
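To gauge the practical size of that change, here is a minimal effect-size sketch. The two means come from this post, but the standard deviation is a hypothetical placeholder (the actual SD isn’t reported here), so treat the result as illustrative only.

```python
# Effect size (Cohen's d) for the pre/post change on the 1-5 scale.
# The means are from the post; the SD is a made-up placeholder.
pre_mean, post_mean = 3.43, 3.65
assumed_sd = 0.65  # hypothetical pooled standard deviation

mean_change = post_mean - pre_mean
cohens_d = mean_change / assumed_sd
print(f"Mean change: {mean_change:.2f}")
print(f"Cohen's d (assuming SD = {assumed_sd}): {cohens_d:.2f}")
```

With that assumed SD, the gain works out to a small-to-moderate effect of roughly d = 0.34; plugging in the real standard deviation would sharpen the number.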

For comparison purposes, it’s useful to triangulate these results against Augustana’s participation in the Wabash National Study between 2008 and 2012. Amazingly, that sample of students produced remarkably similar scores. In the fall of 2008, they logged a pre-test mean score of 3.43. Four years later, they registered a post-test mean score of 3.63. Furthermore, the Wabash National Study overall results suggest that students at other small liberal arts colleges made similar gains over the course of four years.

It’s one thing to look at the overall scores, but the proverbial devil is always in the details. So we’ve made it a standard practice to test for differences on any outcome (e.g., critical thinking, intercultural competence) or perception of experience (e.g., sense of belonging on campus, quality of advising guidance, etc.) by race/ethnicity, sex, socioeconomic status, first-generation status, and pre-college academic preparation. This is where we’ve often found the real nuggets that have helped us identify paths to improvement.

Unlike last year’s study of intercultural competence, we found no statistically significant differences by race/ethnicity, sex, socioeconomic status, or first-generation status, either in where these different types of students scored when they started college or in how much they had grown by the time they graduated. This was an encouraging finding because it suggests that the Augustana learning experience is equally influential for a variety of student types.

However, we did find some interesting differences among students who come to Augustana with different levels of pre-college preparation. These differences were almost identical whether we used our students’ ACT scores or their high school GPAs to measure pre-college academic preparation. Below you can see how those differences played out based upon incoming test score.

ACT Score            Fall 2013    Spring 2017
Bottom 3rd (< 24)       3.33         3.54
Middle 3rd (24-28)      3.42         3.63
Top 3rd (> 28)          3.59         3.77

As you can see, all three groups of students grew similarly over four years. But the students entering with a bottom-third ACT score started well behind the students who entered with a top-third ACT score. Moreover, by the time this cohort graduated, the bottom-third ACT students had not yet reached the entering scores of the top-third ACT students (3.54 compared with 3.59).
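The near-identical growth across the three groups is easy to verify directly from the numbers above:

```python
# First-year and fourth-year Need for Cognition means by incoming
# ACT tertile, taken from the table in this post.
scores = {
    "Bottom 3rd (< 24)": (3.33, 3.54),
    "Middle 3rd (24-28)": (3.42, 3.63),
    "Top 3rd (> 28)": (3.59, 3.77),
}

changes = {}
for group, (fall_2013, spring_2017) in scores.items():
    changes[group] = round(spring_2017 - fall_2013, 2)
    print(f"{group}: change = {changes[group]:+.2f}")
```

The bottom and middle thirds each gain 0.21 points and the top third gains 0.18, differences small enough to support the claim of roughly equal growth.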

So what should we make of these findings? First, I think it’s worth noting that once again we have evidence that on average our students grow on a key aspect of intellectual sophistication. This is worth celebrating. Furthermore, our student growth doesn’t appear to vary across several important demographic characteristics, suggesting that, on at least one learning metric, we seem to have achieved some outcome equity. And although there appear to be differences by pre-college academic preparation in where those students end up, the change from first-year to fourth-year across all three groups is almost identical. This suggests something that we might gloss over at first, namely that we seem to be accomplishing some degree of change equity. In other words, no matter where a student is when they arrive on campus, we are able to help them grow while they are here.

At the end of our presentation of this data last Friday afternoon, we asked everyone in attendance to hypothesize about the kinds of student experiences that might impact change on this outcome. Everyone wrote their hypotheses (some suggested only one idea while others, who shall yet remain nameless, suggested more than ten!) on a 4×6 card that we collected. Over the next several months, we will do everything we can to test each hypothesis and report back to the Augustana community what we found at our winter term presentation.

Oh, you say through teary eyes that you missed our presentation? Well, lucky for you (and us) we are still taking suggestions. So if you have any hypotheses, speculations, intuition, or just outright challenges that you want to suggest, bring it! You can post your ideas in the comments below or email me directly.

I can’t wait to start digging into the data to find what mysteries we might uncover! And look for our presentation of these tests as an upcoming winter term Friday Conversation.

Make it a good day,

Mark

Just when you think you’ve got everything figured out . . .

This post started out as nothing more than a humble pie correction; something similar to what you might find at the bottom of the first page of your local newspaper (if you are lucky enough to still have a local newspaper). But as I continued to wrestle with what I was trying to say, I realized that this post wasn’t really about a correction at all. Instead, this post is about what happens when a changing student population simply outgrows the limits of the old labels you’ve been using to categorize them.

Last week, I told you about a stunningly high 89.8% retention rate for Augustana’s students of color, almost five percentage points higher than our retention rate for white students. During a meeting later in the week, one of my colleagues pointed out that the total number of students of color from which we had calculated this retention rate seemed high. Since this colleague happens to be in charge of our Admissions team, it seemed likely that he would know a thing or two about last year’s incoming class. At the same time, we’ve been calculating this retention rate in the same way for years, so it didn’t seem possible that we suddenly forgot how to run a pretty simple equation.

Before I go any further, let’s get the “correction,” or maybe more precisely “clarification,” out of the way. Augustana’s first-to-second year retention rate for domestic students of color this year is 87.3%, about a point higher than the retention rate for domestic white students (86%). Still impressive, just not quite as flashy. Furthermore, our first-to-second year retention rate for international students is 88.4%, almost two percentage points higher than our overall first-to-second year retention rate of 86.5%. Again, this is an impressive retention rate among students who, in most cases, are also dealing with the extra hurdle of adapting to (if not learning outright) Midwestern English.

So what happened?

For a long time, Augustana has used the term “multicultural students” as a way of categorizing all students who aren’t white American citizens raised in the United States. Even though the term is dangerously vague, when almost 95% of Augustana’s enrollment was white domestic students (less than two decades ago) there was a reasonable logic to constructing this category. Just as categories can become too large to be useful, so too can categories become so small that they dissipate into a handful of individuals. And even the most caring organization finds it pretty difficult to explicitly focus itself on the difficulties of a few individuals.

Moreover, this categorization allowed us to construct a group large enough to quantify in the context of other larger demographic groups. For example, take one group of students from which 10 of 13 return for a second year and compare it with another group of students from which 200 of 260 return for a second year. Calculated as a proportion, both groups share the same retention rate. But in practice, the retention success for each group seems very different; one group lost very few students (3) while the other group lost a whole bunch (60). In the not so distant past when an Augustana first-year class would include maybe 20 domestic students of color and 5 international students (out of a class of about 600), grouping these students into the most precise race, ethnicity, and citizenship categories would almost guarantee that these individuals would appear as intriguing rarities or, worse yet, quaint novelties. Under these circumstances, it made a lot of sense to combine several smaller minority groups into one category large enough to 1) conceptualize as a group with broadly similar needs and challenges and 2) quantify in comparable terms to other distinct groups of Augustana students.
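The arithmetic behind that comparison is worth seeing side by side. These are just the made-up numbers from the example above, not real Augustana data:

```python
# Two groups with the same retention *rate* but very different
# numbers of students lost (hypothetical example from the text).
small_returned, small_total = 10, 13
large_returned, large_total = 200, 260

small_rate = small_returned / small_total
large_rate = large_returned / large_total

print(f"Small group: {small_rate:.1%} retained, "
      f"{small_total - small_returned} students lost")
print(f"Large group: {large_rate:.1%} retained, "
      f"{large_total - large_returned} students lost")
```

Both groups retain 76.9% of their students, yet one loses 3 people while the other loses 60, which is exactly why a tiny category can distort how we read a rate.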

In no way am I arguing that the term “multicultural” was a perfect label. As our numbers of domestic students of color increased, the term grew uncomfortably vague. Equally problematic, some inferred that we considered the totality of white students to be a monoculture or that we considered all multicultural students to be overflowing with culture and heritage. Neither of these inferences was necessarily accurate, but as with all labels, seemingly small imperfections can morph into glaring weaknesses when the landscape changes.

Fast forward fifteen years. Entering the 2017-18 academic year, our proportion of “multicultural students” has increased by over 400%, and the combination of domestic students of color and international students now makes up about 27% of our total enrollment – roughly 710 students. Specifically, we now enroll enough African-American students, Hispanic students, and international students to quantitatively analyze their experiences separately. To be clear, I’m not suggesting that we should prioritize one method of research (quantitative) over another (qualitative). I am arguing, however, that we are better able to gather evidence that will inform genuine improvement when we have both methods at our disposal.

After a conversation with another colleague who is a staunch advocate for African-American students, I resolved to begin using the term “students of color” instead of multicultural students. Although it’s taken some work, I’m making slow progress. I was proud of myself last week when I used the term “students of color” in this blog without skipping a beat.

Alas, you know the story from there. Although I had used an arguably more appropriate term for domestic students of color in last week’s post, I had not thought through the full implications of shifting away from an older framework for conceptualizing difference within our student body. Clearly, one cannot simply replace the term “multicultural students” with “students of color” and expect the new term to adequately include international students. At the same time, although the term “multicultural” implies an air of globalism, it could understandably be perceived to gloss over important domestic issues of race and inequality. If we are going to continue to enroll larger numbers across multiple dimensions of difference, we will have to adopt a more complex way of articulating the totality of that difference.

Mind you, I’m not just talking about counting students more precisely from each specific racial and ethnic category – we’ve been doing that as long as we’ve been reporting institutional census data to the federal government. I guess I’m thinking about finding new ways to conceptualize difference across all of its variations so that we can adopt language that better matches our reality.

I’d like to propose that all of us help each other shift to a terminology that better represents the array of diversity that we’ve worked so hard to achieve, and continue to work so hard to sustain. I know I’ve got plenty to learn (e.g., when do I use the term “Hispanic” and when do I use the term “Latinx”?), and I’m looking forward to learning with you.

And yes, I’ll be sure to reconfigure our calculations in the future. Frankly, that is the easy part. Moreover, I’ll be sure to reconceptualize the way I think about student demographics. We’ve crossed a threshold into a new dimension of diversity within our own student body. Now it’s time for the ways that we quantify, convey, and conceptualize that diversity to catch up.

Make it a good day,

Mark

Retention, Realistic Goals, and a Reason to be Proud

When we included metrics and target numbers in the Augustana 2020 strategic plan, we made it clear to the world how we would measure our progress and our success. As we have posted subsequent updates about our efforts to implement this strategic plan, a closer look into those documents exposes some of the organizational challenges that can emerge when a goal that seemed little more than a pipe dream suddenly looks like it might just be within our grasp.

Last week we calculated our first-to-second year retention numbers for the cohort that entered in the fall of 2016. As many of you know, Augustana 2020 set a first-to-second year retention rate goal of 90%, a number that we’d never come close to before. In fact, colleges enrolling students similar to ours top out at retention rates in the upper 80s. But we decided to set a goal that would stretch us outside of this range. To come up with this goal, we asked ourselves, “What if the stars aligned with the sun and the moon and we retained every single student that finished the year in good academic standing?” Under those conditions, we might hit a 90% retention rate. A pipe dream? Maybe. But why set a goal if it doesn’t stretch us a little bit? So we stuck that number in the document and thought to ourselves, “This will be a good number to shoot for over the next five years.”

Last year (fall of 2016), we were a little stunned to find that we’d produced an overall first-to-second year retention rate of 88.9%. Sure, we had instituted a number of new initiatives, tweaked a few existing programs, and cranked up the volume on our prioritizing-retention megaphone to eleven. But we weren’t supposed to have had so much success right away. To put this surprise in the context of real people, an 88.9% retention rate meant that we whiffed on a whopping seven students relative to our 90% goal! SEVEN!!! Even if we had every retention trick in the book memorized, out of a class of 697, an 88.9% retention rate is awfully close to perfect.
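For the curious, that seven-student figure can be reconstructed from the rounded numbers in this post. Since the exact retained count isn’t reported here, treat this as an approximation:

```python
# How far was an 88.9% retention rate from the 90% goal, out of a
# class of 697? The retained count is rebuilt from the rounded rate.
class_size = 697
retained = round(0.889 * class_size)  # roughly 620 students
goal = int(0.90 * class_size)         # 627 students retained at 90%
print(f"Students short of the 90% goal: {goal - retained}")
```

Retaining seven more students out of 697 would have nudged the rate from 88.9% to the 90% target.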

So what do you do when you find yourself close to banging your head on a ceiling that you didn’t really ever expect to see up close? The fly-by-night thought leader tripe would probably answer with an emphatic, “Break through that ceiling!” (complete with an infomercial and a special deal on a book and a DVD). Thankfully, we’ve got enough good sense to be smarter than that. In reality, a situation like this can set the stage for a host of delicately dangerous delusions. For example, if we were to exceed that retention rate in the very next year we could foolishly convince ourselves that we’ve discovered the secret to perfect retention. Conversely, if our retention rate were to slip in the very next year we could start to think that our aspiration was always beyond our grasp and that we really ought to just stop trying to be something that we are not (cue the Disney movie theme song).

The way we’ve chosen to approach this potential challenge is to start with a clear understanding of all of the various retention rates of the student subpopulations that make up the overall number. By examining and tracking these subgroups, we can make a lot more sense of whatever the next year’s retention rate turns out to be. We also need to remind ourselves that for the overall retention rate to hit our aspired goal, the subpopulation retention rates, weighted by the size of each group, have to average out to that final number. And that has always been where the really tough challenges lie, because retention rates for some student groups (e.g., low income, students of color, lower academic ability) have languished below those of other groups (e.g., more affluent, white students, higher academic ability) for a very long time.
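To illustrate that point, an overall retention rate is simply the enrollment-weighted average of the subgroup rates. The subgroup sizes and rates below are entirely hypothetical, made up only to show the mechanics:

```python
# Overall retention as a weighted average of subgroup rates.
# All subgroup numbers here are hypothetical.
subgroups = [  # (number of students, retention rate)
    (500, 0.87),
    (150, 0.89),
    (50, 0.82),
]

total = sum(n for n, _ in subgroups)
overall = sum(n * rate for n, rate in subgroups) / total
print(f"Overall retention rate: {overall:.1%}")
```

Because the largest subgroup dominates the weighted average, the overall number can look healthy even while a small subgroup lags well behind it.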

With all that as a prelude, let’s dive into the details that make up the retention rate of our 2016 cohort.

Overall, our first-to-second year retention rate this fall is 86.5%. No, it’s not as strong as last year’s 88.9% retention rate. Even though our rolling three-year retention rate averages continue to improve (84.7%, 86.0%, and most recently 87.2%), I would be lying if I said that I wasn’t just a little disappointed by the overall number. Yet this sets up the perfect opportunity to examine our data more closely for evidence that might confirm or counter the narrative I described above.

As always, this is where things get genuinely interesting. Over the last few years, we’ve put additional effort into the quality of the experience we provide for our students of color. So we should expect to see improvement in our first-to-second year retention rate for these students. And in fact, we have seen improvement over the past four years as retention rates for these students have risen from 78.4% four years ago to 86.1% last year.

So it is particularly gratifying to report that the retention rate for students of color among the 2016 cohort increased again, this time to an impressive 89.8%! Amazingly, these students persisted at a rate almost five percentage points higher than white students.

That’s right. Retention of first-year students of color from fall of 2016 to fall of 2017 was 89.8%, while the retention of white students over the same period was 85.6%.

Of course, there is clearly plenty of work for us yet to do in creating the ideal learning environment where, as a result, the maximal number of students succeed. And I don’t for a second think that everything is going to be unicorns and rainbows from here on out. But for a moment, I think it’s worth taking just a second to be proud of our success in improving the retention rate of our students of color.

Make it a good day,

Mark

For the Want of a Nail: Maybe another vital predictive clue?

I’ve always loved the Todd Rundgren song “The Want of a Nail.”  In addition to a jumpin’ soul groove that could levitate a gospel choir, syncopated piano, punchy horn arrangements, and the legendary Bobby Womack singing call-and-response with Rundgren’s lead vocals (as if that weren’t enough!), the lyrics relay the old proverb “For Want of a Nail” that laments the loss of an entire kingdom due to the seemingly minor detail of a missing nail that would have kept a horse’s shoe attached to a lone steed during a fleeting cavalry charge.

After I wrote last week’s post comparing a couple of major pre-college predictors of first year success, I wondered again whether there might be other key predictors of first year success that tend to fly just under the radar. I guess I’m thinking of the kinds of traits and attributes that, although they might not dominate a first impression, might just tip the balance for a student teetering on the edge of academic survival. In addition, by “under-the-radar” I’m referring to the type of variables that might normally get overshadowed in large-scale statistical analyses of college student success. These almost imperceptible influencers are often (for lack of a better term) predictors of predictors, like the way that a regular sleep schedule might increase the ability to focus when studying, which in turn increases one’s cognitive stamina during extended test-taking and ultimately improves the likelihood of a higher standardized test score. Whether they are called “soft skills” in the popular press or “non-cognitive factors” in the academic press, it has become increasingly clear that these attributes matter for college success. So if we could figure out whether any of these attributes play a similar role among Augustana students who fell into that muddy middle of maybe when we were considering their application for admission, we might know a little bit more about teasing out just the right details in the process of trying to decide whether a person is a good fit for Augustana.

Fortunately, we already collect some of this kind of data through the Student Readiness Survey that incoming freshmen take before attending summer registration. Although the Student Readiness Survey was primarily designed to better inform early conversations between advisors and new students, this data mirrors several of the soft skill traits and is ripe for testing as a predictor of first year success. To make sure we avoid the brute force effect (AKA when a typically powerful predictor like test score or high school GPA drowns out the more subtle impact of other potentially important factors), we took into account high school GPA throughout our analyses so that we could focus on the influence of these attributes and traits.

As a quick reminder, the Student Readiness Survey collects data on six non-cognitive factors or soft skills:

  • Academic Habits (e.g., using a highlighter, planning ahead to do homework)
  • Academic Confidence (e.g., belief in one’s ability to learn)
  • Persistence and Grit (e.g., tendency to fight through difficulty to achieve a goal)
  • Interpersonal Skills (e.g., tendency to consider multiple views in navigating conflict)
  • Stress Management (e.g., perception of one’s temper or ability to be patient)
  • Comfort with Social Interaction (e.g., ability to make new friends)

Sure enough, we found that two of these traits stood out as predictors of cumulative first-year GPA even after taking into account high school GPA.  Can you guess which concepts rose to the top?

Although it might seem obvious, academic habits held up as a significant predictor of first year success. In other words, if you took two prospective students with similar high school GPAs, the one more likely to succeed would be the one who takes a more deliberate approach to academic habits, plans ahead, and studies actively rather than passively.

The second significant predictive trait turned out to be interpersonal skills. Specifically, we found that the more a student was inclined to consider another’s perspective as well as their own when negotiating a disagreement, the more likely they were to succeed as a first year student. Again, this finding took into account high school GPA. So if you were to consider two students with similar high school GPAs for admission, the one who seems to exhibit more sophisticated interpersonal skills may well be the one who is more likely to succeed. It’s important to note that this is not the same as leaning toward the student who seems more comfortable socially. In fact, our data suggests that there may be a small negative relationship between comfort with social interaction and first year success after taking into account high school GPA.

So what should we do with these findings? First of all, tread carefully. Applying these findings is sort of like fixing watches – one wrong move and you’ve turned a nifty time piece into an expensive paper weight. Moreover, these findings will only be useful when you put them in context with everything else you might know about a prospective student. Yet, when straining to find some whisper of evidence about a student’s potential for success, these may well be exactly the veins that you ought to mine. That doesn’t mean that you’ll necessarily find what you’re looking for, but it does mean that you know just a little bit more about where to look.

Although I’ve harped on this point in previous posts, I think it is worth repeating – a mountain of research demonstrates that the experiences during the first year are critical to student success, notwithstanding all sorts of pre-college characteristics. So nothing we might know about a prospective student before they start college rises to the level of slam-dunk proof of their future success or failure. But knowing just a little bit more about traits and attributes under the radar might help us make better decisions about students who seem to cluster at the margins.

Make it a good day,

Mark

Standardized test score vs. high school GPA: The battle of the predictors!!

No matter how solid the overall academic characteristics of a first year class, by the time spring rolls around we always seem to wish that we could be just a bit more precise in identifying students who can succeed at Augustana. Yes, there are some applicants who are almost guaranteed to succeed and some who are obviously nowhere near ready for the Augustana experience. But most applications fall somewhere in between those two poles. And even though this is an exceedingly complicated and imperfect exercise, every bit of information we can tease from our own data can help us perfect our efforts to pick out those diamonds in the rough.

Standardized test score (i.e., ACT or SAT) and high school GPA have always been the big dogs in predicting college success. In the not too distant past, these two metrics were often the only numbers that a college used to make admissions decisions. But (at least) two problems have emerged that make it critically important to test the veracity of both metrics among our own students. First, the test prep business has become an almost ubiquitous partner to the tests themselves. ACT or SAT preparation resources (be they online or in person) are often strongly encouraged and sometimes are even offered as a part of the high school curriculum. As a result, one could argue that standardized test scores increasingly predict test taking skills rather than academic preparation. (Given that the average ACT composite score has remained the same from 1997 to 2016, I’m not exactly sure what the growth in the test prep business says about the industry or the people who pay for those services . . . but that is another story). When we add to the mix the correlation between socioeconomic status and available educational resources, test score becomes an even more suspect measure of academic potential.

Second, high school GPA has become an increasingly “flexible” number as high schools have added more and more weighted courses and varying academic “tracks” for different types of students. As a result, high school GPAs sometimes appear much more tightly clustered for certain types of students, making it more difficult to claim that a moderate difference in high school GPA between two applicants represents an actual difference in college readiness. In addition, these patterns of clustering quickly become specific to an individual school or district, making it even harder to compare applicants between schools or districts. For these reasons Augustana decided a number of years back to generate for each applicant a recalculated GPA that removes much of the peculiarity of the GPA initially provided by the high school.

Given the increasing murkiness of these two metrics, it makes sense to test the predictive validity (i.e., trustworthiness) of each among our own students. In addition, since students almost always submit both a test score and a high school GPA, it would help us a lot to know more about each metric in the context of the other. For example, what if a student’s test score seems much stronger than their high school GPA, or the other way around? Should we place more value on one over the other? Should we put our trust in the more favorable of the two metrics?

To conduct this inquiry thoroughly, we tested the effect of high school GPA and standardized test score on three different measures of first year success: cumulative GPA at the end of the first year, retention to the second year, and the number of credits completed during the first year.

In short, it isn’t much of a contest. The recalculated high school GPA significantly predicts first year cumulative GPA, retention, and number of credits completed. Standardized test score only predicts first year cumulative GPA, while producing no statistically significant effect on retention or number of credits completed. In addition, when high school GPA and test score were analyzed head-to-head (i.e., both variables were included in the same statistical analysis predicting first year cumulative GPA), the size of the high school GPA effect was two and a half times larger than the effect of the standardized test score.
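For readers curious about the shape of that head-to-head analysis, here is a minimal sketch using entirely synthetic data (the toy model, variable names, and numbers are mine for illustration, not the actual Augustana analysis). Standardizing both predictors before fitting makes their coefficients directly comparable as effect sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic applicants -- illustrative only, not real Augustana data.
hs_gpa = rng.normal(3.3, 0.4, n)
test = 0.3 * (hs_gpa - 3.3) / 0.4 + rng.normal(0.0, 1.0, n)  # loosely tied to GPA
# Toy model in which first-year GPA tracks high school GPA far more than test score.
fy_gpa = 2.0 + 0.5 * hs_gpa + 0.02 * test + rng.normal(0.0, 0.3, n)

def z(x):
    # Standardize to mean 0, standard deviation 1.
    return (x - x.mean()) / x.std()

# Head-to-head OLS: both standardized predictors in one model, so the fitted
# coefficients can be compared directly.
X = np.column_stack([np.ones(n), z(hs_gpa), z(test)])
beta, *_ = np.linalg.lstsq(X, z(fy_gpa), rcond=None)
print(f"HS GPA effect: {beta[1]:.2f}, test score effect: {beta[2]:.2f}")
```

In a setup like this, the high school GPA coefficient dwarfs the test score coefficient, which is the same kind of comparison behind the “two and a half times larger” finding above.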

This would suggest that one ought to prioritize the recalculated high school GPA over the standardized test score when evaluating a prospective student. But remember the questions I posed a few paragraphs back about an applicant whose test score and high school GPA don’t seem to match up? We felt like we needed to run one more test just in case.

To test the phenomenon of a test score and a high school GPA that appear to “disagree” with each other, we created a variable that reflected the relative gap between test score and high school GPA and tested the relationship between this variable and the first year cumulative GPA (I’ll gladly explain in more detail the steps we took to build this variable offline. Suffice it to say that, “We got our stats nerd on, and it was awesome.”) Interestingly, our findings closely mirrored our prior results. As the test score exceeded the high school GPA (i.e., as the test score represented an increasingly higher academic potential than the GPA), the first year cumulative GPA tended to drop. Conversely, when the high school GPA exceeded the test score, first year cumulative GPA tended to rise. Although there is a point at which this variable is no longer useful (e.g., if the test score is 35 and the high school GPA is 1.5, one starts to suspect something more nefarious might be at play), these findings corroborate our earlier tests. For Augustana applicants, recalculated high school GPA is a more accurate predictor of first year success than the standardized test score.
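The post deliberately leaves the construction details offline, but one common way to build such a “disagreement” variable (an assumption on my part, not necessarily the method used here) is to standardize both metrics and take the difference. With synthetic data in which first-year GPA follows high school GPA, that gap variable naturally correlates negatively with first-year GPA, mirroring the pattern described above:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400

# Synthetic, illustrative data -- not the actual Augustana dataset.
hs_gpa = rng.normal(3.3, 0.4, n)
test = rng.normal(24.0, 4.0, n)  # independent of GPA in this toy example

def z(x):
    # Standardize to mean 0, standard deviation 1.
    return (x - x.mean()) / x.std()

# Gap variable: how far the test score runs "ahead of" the high school GPA
# once both sit on the same standardized scale. Positive values mean the test
# score suggests higher potential than the GPA does.
gap = z(test) - z(hs_gpa)

# Toy outcome in which first-year GPA tracks high school GPA only.
fy_gpa = 2.0 + 0.55 * hs_gpa + rng.normal(0.0, 0.3, n)

r = np.corrcoef(gap, fy_gpa)[0, 1]
print(f"correlation between gap and first-year GPA: {r:.2f}")
```

The negative correlation here is exactly what you would expect if high school GPA carries most of the predictive signal: the further the test score overshoots the GPA, the lower the first-year GPA tends to be.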

So if you are advising first-year students, be careful about making assumptions about your students’ ability based on their test scores. Likewise, when there appears to be a gap between what the test score and the high school GPA suggest, knowing which metric exceeds the other might tell very different stories about the needs of a given student.

Make it a good day,

Mark

Strap yourself in. It’s gonna be an awesome year!

Although every weekend before the start of a fall term seems to bring a surge of energy to campus, this fall feels just a little bit different. I’m not sure I can put my finger on it just yet, but what usually seems like a low hum of electrical current has been elevated to a palpable buzz throughout the quad. It’s almost as if we’ve crossed some sort of invisible threshold where a confluence of undercurrents has achieved a synergy that is about to burst into a kaleidoscopic firework display of sparkle and color.

Over the last 10 years, the Augustana student body has grown more diverse on a variety of dimensions. The graph below shows the growing proportions of students from five different demographic groups since 2006. The race/ethnic line (blue) represents the proportion of non-white students. The socioeconomic line (red) represents the proportion of Pell Grant recipients. The religious line (green) represents the proportion of students who identify with a belief system that is not Protestant or Catholic. The out-of-state line (purple) represents the proportion of students who are from states other than Illinois. And the international line (light blue) represents the students who are coming to Augustana from a country outside of the United States.

graph

These demographic shifts don’t happen by chance, and we ought to recognize and applaud the policy decisions that helped to bolster these trends.

But without a concerted and sustained commitment to make the most of this blossoming tapestry of difference, we leave ourselves vulnerable to a devastating missed opportunity. It would be terribly disappointing if all of this effort to bring such a wealth of diversity to campus produced nothing more than larger groups of homogeneity sitting in separate corners of the Gerber dining hall.

Weaving our varied demographics into a single diverse tapestry will take a continual effort to create, cultivate, and sometimes even coerce meaningful interactions across difference. This will mean that all of us will have to recognize our very human tendency to find comfort in familiarity. Although I’m not suggesting that we shun the familiar, we must humbly recognize our frailty and make the choice to open ourselves to new relationships, new ideas, and new experiences.

I know that helping young people navigate such a complex challenge is not simple, and it certainly doesn’t happen overnight. And, to be clear, I’m definitely not suggesting that we should shove our students into situations that are beyond their capability or maturity. But if we are to weave the diverse tapestry that we imagine, then we will need to put our collective shoulder to the wheel and push, steadily, kindly, and purposefully, especially when the conversations turn difficult and our vulnerabilities and insecurities feel exposed.

We have gathered the resources together on this campus to make difference one of our most powerful assets. Now let’s get to work.

Make it a good day,

Mark

What kind of work goes into recruiting a freshman class?

As you have almost certainly seen by now, the number of tuition deposits received by the end of last week for next year’s incoming class had nearly reached 740, well above the number we had cautiously hoped to reach at the beginning of this recruiting cycle. In the context of a shrinking population of high school graduates in the Midwest, this is a genuinely impressive feat.

So, for the last Delicious Ambiguity post of the year, I thought I’d share some numbers that spell out the enormity of this effort.

From June 1, 2016 through April 29, 2017, 3,123 prospective seniors (i.e., students who would start college in August, 2017) visited our campus. Averaged over that 47-week period, this works out to about 66 visits per week. Given the extended planning required to host any of the several Saturday admissions events, and given that prospective students visit campus even when classes aren’t in session or many of us might be enjoying a holiday week, it seems pretty clear that this office is running at a high clip all year long.

Moreover, the nature of recruiting students to a campus seems to have slowly shifted in recent decades so that relatively more time is required to recruit students after they have already been accepted for admission. This is where I found two numbers to be pretty astounding.

Between December 1, 2016 and April 29, 2017, the admissions counselors and student ambassadors sent 2,868 emails and made 6,145 phone calls to accepted applicants. That averages out to about 137 emails and about 293 phone calls per week. All of this communication happened on top of the multitude of in-person campus visits.

And these numbers likely don’t even include the emails and phone calls that faculty and coaches made to prospective students all year long.

So, at least for a few minutes this week, let’s lose the good ole Lutheran Midwestern reserve and congratulate ourselves unabashedly for a job well done. In addition, I think that the admissions staff and the crew of student ambassadors deserve a giant shout out. Well done, y’all!

Make it a good day, everybody . . . and have a wonderful summer,

Mark

 

Designing semesters bit by bit – Look what we can do!

In the midst of all the inevitable end-of-spring-term craziness, the thought of contemplating one more semester design vote doesn’t seem all that appealing. Arguably, the question of whether or not to include advising within our calculus of faculty load is the most complicated of the many decisions we’ve made this year. I don’t fault anyone one bit for feeling overwhelmed, or even a little crabby, about this last vote – no matter what you think we ought to do.

But in the midst of all this, I think it is worthwhile to step back a little and have a look at what we’re on the verge of accomplishing. You might not be in the mood for hyperbole at the moment, but the truth is that we are about to complete something that almost no other institution has done. We’ve actually designed an entire semester calendar and curriculum framework out in the open, step by step, modeling the implications of all the competing issues from the very beginning and then remodeling the implications of each decision on the larger picture at each step along the way. That isn’t to say that we’ve done everything perfectly – after all, we are no more than a bunch of imperfect yokels trying to pull off something extraordinary, something that few schools have ever done and that most wouldn’t dare to try. Call me a Pollyanna, but after zooming out and having a look back at what we’ve accomplished this year, you’d be hard-pressed not to be impressed.

What have we done since September?  Here are the decisions faculty have made that set each of the major elements of the new semester design in place.

  • Voted for an immersive term 140 to 26
  • Voted for a 4-credit course base instead of a 3-credit course base 126 to 37
  • Voted for the immersion term to occur in January instead of May 136 to 35
  • Voted for 124 credits to graduate instead of 120 credits to graduate 92 to 71
  • Approved the structure proposed for General Education 109 to 31
  • Approved the second language requirement unanimously
  • Approved a framework for major design and footprint unanimously

Other than the vote about the total number of credits to graduate, each vote seems to reflect a clear sense among the community about the direction that is best for us.

In addition, two faculty votes have provided advisory positions to the Board of Trustees, the body that makes the final decision on these two specific issues.

  • Voted for a pre-Labor Day start to the academic year 67 to 59
  • Voted that tuition should cover a relatively higher number of credits per year rather than a relatively smaller number of credits 98 to 62

All in all, the amount of intellectual and emotional work that we have successfully sorted through to accomplish all of these decisions is truly extraordinary. It’s hard to imagine anyone NOT feeling at least a little bit more tired than normal these days.

So even if you’re feeling like you are running on fumes these days, try to take a second, breathe deeply, and look at how much we have accomplished. I, for one, am truly amazed and humbled. It’s an honor to be able to call myself a member of the Augustana community.

Make it a good day,

Mark

Improving our first-year advising: sometimes structure does matter

If you’ve been reading this blog for a while, you’ve almost certainly seen some of my posts about the data we’ve collected to assess and guide our advising practices at Augustana College (here, here, and here). However, those posts only get at part of the story. Since all of those posts drew from senior survey data, we can be almost sure that those findings primarily reflect our students’ advising experiences in their major(s). But we also know that first-year advising matters a lot. Many would argue it matters at least as much as major advising. So I’d like to dive into some of the advising data from our first-year students and see if there’s anything that we can learn from it.

In this post I’d like to focus on two items that we know are important for a successful first-year experience. First-year students answered these questions late in their fall term.

  1. My first year adviser connected me with other campus offices, resources, or opportunities (offices like Student Activities, the Community Engagement Center, the Counseling Center) to help me succeed during my first year.
  2. My first-year adviser made me feel like I could succeed at Augustana.

The table below presents the average response scores to these items over the last four years. The response options were strongly disagree, disagree, neutral, agree, and strongly agree. These responses were converted to a 1-5 scale where 1 equals strongly disagree.

Question    2013-14    2014-15    2015-16    2016-17
My first-year adviser connected me with other campus offices, resources, or opportunities to help me succeed during my first year.    3.55    3.62    3.83    3.89
My first-year adviser made me feel like I could succeed at Augustana.    4.07    4.20    4.25    4.21
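As a quick illustration of the 1-5 conversion described above (the handful of sample responses below is hypothetical, not actual survey data):

```python
# Map Likert responses to the 1-5 scale described in the post, then average.
scale = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

# A hypothetical handful of responses, for illustration only.
responses = ["agree", "neutral", "strongly agree", "agree", "disagree"]
scores = [scale[r] for r in responses]
avg = sum(scores) / len(scores)
print(avg)  # 3.6
```

Each cell in the table above is this kind of average, computed over all first-year respondents in a given year.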

You can see that we’ve improved on both measures since 2013-14. I know that our first-year advising program has emphasized the importance of connecting students with the campus offices that can best help them, and it’s heartening to see that this effort may be producing results. With that said, it looks like we might still need to improve since our average score hasn’t quite surpassed “agree” yet. By contrast, in each of the last four years, on average our students “agree” that we have made them feel like they could succeed at Augustana.

Interestingly, while the improvement in referring students to other campus resources seems fairly consistent, the improvement in making students feel like they could succeed seems to have plateaued over the last couple of years. But digging a little deeper, there is a wrinkle in our 2016-17 data that seems both to explain this plateau and to further emphasize the value of moving to the first-year advising structure that the faculty has now approved for implementation next year.

This year (i.e., during the fall of 2016), about a third of our first-year student advising groups were enrolled in an FYI-100 course instead of merely meeting informally with their adviser throughout the term. For the students who were enrolled in this class, the average response score to the statement “My first-year adviser made me feel like I could succeed at Augustana” was 4.34. For the students who were not enrolled in this class (about two-thirds of the whole group), the average response score was 4.17.
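As a rough sanity check, those two group means and the approximate one-third/two-thirds enrollment split should reproduce something near the overall 2016-17 average of 4.21 reported in the table above:

```python
# Group means for fall 2016 as reported in the post; the one-third /
# two-thirds enrollment split is approximate.
fyi_mean, fyi_share = 4.34, 1 / 3        # students enrolled in FYI-100
other_mean, other_share = 4.17, 2 / 3    # students not enrolled

# Weighted average across the two groups.
overall = fyi_share * fyi_mean + other_share * other_mean
print(round(overall, 2))  # ~4.23, close to the reported overall of 4.21
```

The small gap between the weighted estimate and the reported 4.21 is consistent with the enrollment split being only approximately one-third.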

Many long-time advisers said that the FYI-100 format helped them develop stronger relationships with their advisees. These advisers indicated that the stronger relationships allowed them to engage in more substantive conversations that, in turn, helped the students think more deeply about the nature of their college experience and the ways in which they could make the most of it.

As wonderful as it is to hear that we seem to be making improvements in our advising practices, it is even more exciting to see data confirming these bold strides toward even better first-year advising.

Make it a good day,

Mark