Just when you think you’ve got everything figured out . . .

This post started out as nothing more than a humble pie correction; something similar to what you might find at the bottom of the first page of your local newspaper (if you are lucky enough to still have a local newspaper). But as I continued to wrestle with what I was trying to say, I realized that this post wasn’t really about a correction at all. Instead, this post is about what happens when a changing student population simply outgrows the limits of the old labels you’ve been using to categorize them.

Last week, I told you about a stunningly high 89.8% retention rate for Augustana’s students of color, more than four percentage points higher than our retention rate for white students. During a meeting later in the week, one of my colleagues pointed out that the total number of students of color from which we had calculated this retention rate seemed high. Since this colleague happens to be in charge of our Admissions team, it seemed likely that he would know a thing or two about last year’s incoming class. At the same time, we’ve been calculating this retention rate in the same way for years, so it didn’t seem possible that we had suddenly forgotten how to run a pretty simple equation.

Before I go any further, let’s get the “correction,” or maybe more precisely “clarification,” out of the way. Augustana’s first-to-second year retention rate for domestic students of color this year is 87.3%, about a point higher than the retention rate for domestic white students (86%). Still impressive, just not quite as flashy. Furthermore, our first-to-second year retention rate for international students is 88.4%, almost two percentage points higher than our overall first-to-second year retention rate of 86.5%. Again, this is an impressive retention rate among students who, in most cases, are also dealing with the extra hurdle of adapting to (if not learning outright) Midwestern English.

So what happened?

For a long time, Augustana has used the term “multicultural students” as a way of categorizing all students who aren’t white American citizens raised in the United States. Even though the term is dangerously vague, when almost 95% of Augustana’s enrollment was white domestic students (less than two decades ago) there was a reasonable logic to constructing this category. Just as categories can become too large to be useful, so too can categories become so small that they dissipate into a handful of individuals. And even the most caring organization finds it pretty difficult to explicitly focus itself on the difficulties of a few individuals.

Moreover, this categorization allowed us to construct a group large enough to quantify in the context of other larger demographic groups. For example, take one group of students from which 10 of 13 return for a second year and compare it with another group of students from which 200 of 260 return for a second year. Calculated as a proportion, both groups share the same retention rate. But in practice, the retention success for each group seems very different; one group lost very few students (3) while the other group lost a whole bunch (60). In the not so distant past when an Augustana first-year class would include maybe 20 domestic students of color and 5 international students (out of a class of about 600), grouping these students into the most precise race, ethnicity, and citizenship categories would almost guarantee that these individuals would appear as intriguing rarities or, worse yet, quaint novelties. Under these circumstances, it made a lot of sense to combine several smaller minority groups into one category large enough to 1) conceptualize as a group with broadly similar needs and challenges and 2) quantify in comparable terms to other distinct groups of Augustana students.
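If you like to see the arithmetic laid out, here is a quick sketch using the two hypothetical groups from the paragraph above:

```python
# A quick illustration of the point above: identical retention rates,
# very different numbers of students lost.
groups = {"small group": (10, 13), "large group": (200, 260)}  # (returned, enrolled)

for name, (returned, enrolled) in groups.items():
    rate = returned / enrolled
    lost = enrolled - returned
    print(f"{name}: retention rate = {rate:.1%}, students lost = {lost}")

# Both groups retain about 76.9% of their students,
# but one loses 3 students while the other loses 60.
```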

In no way am I arguing that the term “multicultural” was a perfect label. As our numbers of domestic students of color increased, the term grew uncomfortably vague. Equally problematic, some inferred that we considered the totality of white students to be a monoculture or that we considered all multicultural students to be overflowing with culture and heritage. Neither of these inferences was necessarily accurate, but as with all labels, seemingly small imperfections can morph into glaring weaknesses when the landscape changes.

Fast forward fifteen years. Entering the 2017-18 academic year, our proportion of “multicultural students” has increased by over 400%, and the combination of domestic students of color and international students makes up about 27% of our total enrollment – roughly 710 students. Specifically, we now enroll enough African-American students, Hispanic students, and international students to quantitatively analyze their experiences separately. To be clear, I’m not suggesting that we should prioritize one method of research (quantitative) over another (qualitative). I am arguing, however, that we are better able to gather evidence that will inform genuine improvement when we have both methods at our disposal.

After a conversation with another colleague who is a staunch advocate for African-American students, I resolved to begin using the term “students of color” instead of multicultural students. Although it’s taken some work, I’m making slow progress. I was proud of myself last week when I used the term “students of color” in this blog without skipping a beat.

Alas, you know the story from there. Although I had used an arguably more appropriate term for domestic students of color in last week’s post, I had not thought through the full implications of shifting away from an older framework for conceptualizing difference within our student body. Clearly, one cannot simply replace the term “multicultural students” with “students of color” and expect the new term to adequately include international students. At the same time, although the term “multicultural” implies an air of globalism, it could understandably be perceived to gloss over important domestic issues of race and inequality. If we are going to continue to enroll larger numbers across multiple dimensions of difference, we will have to adopt a more complex way of articulating the totality of that difference.

Mind you, I’m not just talking about counting students more precisely from each specific racial and ethnic category – we’ve been doing that as long as we’ve been reporting institutional census data to the federal government. I guess I’m thinking about finding new ways to conceptualize difference across all of its variations so that we can adopt language that better matches our reality.

I’d like to propose that all of us help each other shift to a terminology that better represents the array of diversity that we’ve worked so hard to achieve, and continue to work so hard to sustain. I know I’ve got plenty to learn (e.g., when do I use the term “Hispanic” and when do I use the term “Latinx”?), and I’m looking forward to learning with you.

And yes, I’ll be sure to reconfigure our calculations in the future. Frankly, that is the easy part. Moreover, I’ll be sure to reconceptualize the way I think about student demographics. We’ve crossed a threshold into a new dimension of diversity within our own student body. Now it’s time for the ways that we quantify, convey, and conceptualize that diversity to catch up.

Make it a good day,

Mark

Retention, Realistic Goals, and a Reason to be Proud

When we included metrics and target numbers in the Augustana 2020 strategic plan, we made it clear to the world how we would measure our progress and our success. But a closer look at the updates we have posted about implementing that plan exposes some of the organizational challenges that can emerge when a goal that once seemed little more than a pipe dream suddenly looks like it might just be within our grasp.

Last week we calculated our first-to-second year retention numbers for the cohort that entered in the fall of 2016. As many of you know, Augustana 2020 set a first-to-second year retention rate goal of 90%, a number that we’d never come close to before. In fact, colleges enrolling students similar to ours top out at retention rates in the upper 80s. But we decided to set a goal that would stretch us outside of this range. To come up with this goal, we asked ourselves, “What if the stars aligned with the sun and the moon and we retained every single student that finished the year in good academic standing?” Under those conditions, we might hit a 90% retention rate. A pipe dream? Maybe. But why set a goal if it doesn’t stretch us a little bit? So we stuck that number in the document and thought to ourselves, “This will be a good number to shoot for over the next five years.”

Last year (that is, in the fall of 2016, when we measured the cohort that entered in 2015), we were a little stunned to find that we’d produced an overall first-to-second year retention rate of 88.9%. Sure, we had instituted a number of new initiatives, tweaked a few existing programs, and cranked up the volume on our prioritizing retention megaphone to eleven. But we weren’t supposed to have had so much success right away. To put this surprise in the context of real people, an 88.9% retention rate meant that we fell a whopping seven students short of our 90% goal! SEVEN!!! Even if we had every retention trick in the book memorized, out of a class of 697, an 88.9% retention rate is awfully close to perfect.
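If you want to see the arithmetic behind that “seven students” quip, here is a quick sketch. The class size of 697 and the two rates come straight from the paragraph above; everything else is just division.

```python
# Rough arithmetic behind the "seven students" remark above.
cohort = 697                              # size of the entering class
goal_rate, actual_rate = 0.90, 0.889

students_needed = goal_rate * cohort      # ~627 students retained would hit 90%
students_retained = actual_rate * cohort  # ~620 students actually retained

print(students_needed - students_retained)  # ~7.6, i.e., roughly seven students shy of the goal
```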

So what do you do when you find yourself close to banging your head on a ceiling that you didn’t really ever expect to see up close? The fly-by-night thought leader tripe would probably answer with an emphatic, “Break through that ceiling!” (complete with an infomercial and a special deal on a book and a DVD). Thankfully, we’ve got enough good sense to be smarter than that. In reality, a situation like this can set the stage for a host of delicately dangerous delusions. For example, if we were to exceed that retention rate in the very next year, we could foolishly convince ourselves that we’ve discovered the secret to perfect retention. Conversely, if our retention rate were to slip in the very next year, we could start to think that our aspiration was always beyond our grasp and that we really ought to just stop trying to be something that we are not (cue the Disney movie theme song).

The way we’ve chosen to approach this potential challenge is to start with a clear understanding of the retention rates of the student subpopulations that make up the overall number. By examining and tracking these subgroups, we can make a lot more sense of whatever the next year’s retention rate turns out to be. We also need to remind ourselves that the overall retention rate is simply a size-weighted average of the subpopulation rates, so we can only hit our aspirational goal if each subpopulation’s rate rises toward that number. And that has always been where the really tough challenges lie, because retention rates for some student groups (e.g., low income, students of color, lower academic ability) have languished below those of other groups (e.g., more affluent, white students, higher academic ability) for a very long time.
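To make that point concrete, here is a tiny illustration with made-up subgroup numbers (only the overall class size of 697 is real):

```python
# A tiny illustration (with made-up numbers) of how subgroup rates
# roll up into the overall retention rate as a size-weighted average.
subgroups = {            # (students in group, retention rate)
    "group A": (500, 0.88),
    "group B": (120, 0.82),
    "group C": (77, 0.79),
}

retained = sum(n * rate for n, rate in subgroups.values())
total = sum(n for n, _ in subgroups.values())
print(f"overall retention = {retained / total:.1%}")  # 86.0%
# The overall rate can only reach 90% if the lower-retaining subgroups
# move up, not just the groups that are already doing well.
```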

With all that as a prelude, let’s dive into the details that make up the retention rate of our 2016 cohort.

Overall, our first-to-second year retention rate this fall is 86.5%. No, it’s not as strong as last year’s 88.9% retention rate. Even though our three-year rolling average of retention rates continues to improve (84.7%, 86.0%, and most recently 87.2%), I would be lying if I said that I wasn’t just a little disappointed by the overall number. Yet this sets up the perfect opportunity to examine our data more closely for evidence that might confirm or counter the narrative I described above.
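For anyone curious how a single-year dip and an improving trend can coexist, here is a minimal sketch of a three-year rolling average. The first two single-year rates below are hypothetical placeholders; only the last two (88.9% and 86.5%) come from this post.

```python
# A minimal sketch of a three-year rolling average of retention rates.
yearly_rates = [84.0, 85.5, 88.9, 86.5]  # first two values are hypothetical

rolling = [sum(yearly_rates[i - 2 : i + 1]) / 3 for i in range(2, len(yearly_rates))]
print([round(r, 1) for r in rolling])  # [86.1, 87.0]
# A single-year dip (88.9 -> 86.5) can coexist with an improving rolling average.
```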

As always, this is where things get genuinely interesting. Over the last few years, we’ve put additional effort into the quality of the experience we provide for our students of color. So we should expect to see improvement in our first-to-second year retention rate for these students. And in fact, we have seen improvement over the past four years as retention rates for these students have risen from 78.4% four years ago to 86.1% last year.

So it is particularly gratifying to report that the retention rate for students of color among the 2016 cohort increased again, this time to an impressive 89.8%! Amazingly, these students persisted at a rate more than four percentage points higher than white students.

That’s right.  Retention of first-year students of color from fall of 2016 to fall of 2017 was 89.8%, while the retention of white students over the same period was 85.6%.

Of course, there is clearly plenty of work for us yet to do in creating the ideal learning environment in which the maximal number of students succeed. And I don’t for a second think that everything is going to be unicorns and rainbows from here on out. But I think it’s worth taking just a moment to be proud of our success in improving the retention rate of our students of color.

Make it a good day,

Mark

For the Want of a Nail: Maybe another vital predictive clue?

I’ve always loved the Todd Rundgren song “The Want of a Nail.”  In addition to a jumpin’ soul groove that could levitate a gospel choir, syncopated piano, punchy horn arrangements, and the legendary Bobby Womack singing call-and-response with Rundgren’s lead vocals (as if that weren’t enough!), the lyrics relay the old proverb “For Want of a Nail” that laments the loss of an entire kingdom due to the seemingly minor detail of a missing nail that would have kept a horse’s shoe attached to a lone steed during a fleeting cavalry charge.

After I wrote last week’s post comparing a couple of major pre-college predictors of first year success, I wondered again about whether there might be other key predictors of first year success that tend to fly just under the radar. I guess I’m thinking of the kinds of traits and attributes that, although they might not dominate a first impression, might just tip the balance for a student teetering on the edge of academic survival. In addition, by “under-the-radar” I’m referring to the type of variables that might normally get overshadowed in large-scale statistical analyses of college student success. These almost imperceptible influencers are often (for lack of a better term) predictors of predictors; like the way that a regular sleep schedule might increase the ability to focus when studying, which in turn increases one’s cognitive stamina during extended test-taking and ultimately improves the likelihood of a higher standardized test score. Whether they are called “soft skills” in the popular press or “non-cognitive factors” in the academic press, it has become increasingly clear that these attributes matter for college success. So if we could figure out whether any of these attributes play a similar role among Augustana students who fell into that muddy middle of maybe when we were considering their application for admission, we might know a little bit more about teasing out just the right details when trying to decide whether a person is a good fit for Augustana.

Fortunately, we already collect some of this kind of data through the Student Readiness Survey that incoming freshmen take before attending summer registration. Although the Student Readiness Survey was primarily designed to better inform early conversations between advisors and new students, this data mirrors several of the soft skill traits and is ripe for testing as a predictor of first year success. To make sure we avoid the brute force effect (AKA when a typically powerful predictor like test score or high school GPA drowns out the more subtle impact of other potentially important factors), we took into account high school GPA throughout our analyses so that we could focus on the influence of these attributes and traits.

As a quick reminder, the Student Readiness Survey collects data on six non-cognitive factors or soft skills:

  • Academic Habits (e.g., using a highlighter, planning ahead to do homework)
  • Academic Confidence (e.g., belief in one’s ability to learn)
  • Persistence and Grit (e.g., tendency to fight through difficulty to achieve a goal)
  • Interpersonal Skills (e.g., tendency to consider multiple views in navigating conflict)
  • Stress Management (e.g., perception of one’s temper or ability to be patient)
  • Comfort with Social Interaction (e.g., ability to make new friends)
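With those six factors in mind, here is a minimal sketch (not our actual code) of how one might test whether these traits predict first-year GPA after taking high school GPA into account, as described a couple of paragraphs up. The file name and column names are hypothetical stand-ins.

```python
# A minimal sketch of testing survey traits as predictors of first-year GPA
# while controlling for high school GPA. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_readiness.csv")

model = smf.ols(
    "first_year_gpa ~ hs_gpa + academic_habits + academic_confidence"
    " + grit + interpersonal_skills + stress_mgmt + social_comfort",
    data=df,
).fit()
print(model.summary())
# Because hs_gpa is in the model, each trait's coefficient reflects its
# association with first-year GPA over and above high school GPA.
```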

Sure enough, we found that two of these traits stood out as predictors of cumulative first-year GPA even after taking into account high school GPA.  Can you guess which concepts rose to the top?

Although it might seem obvious, academic habits held up as a significant predictor of first year success. In other words, if you took two prospective students with similar high school GPAs, the one more likely to succeed would be the one who takes a more deliberate approach to academic habits, plans ahead, and studies actively rather than passively.

The second significant predictive trait turned out to be interpersonal skills. Specifically, we found that the more a student was inclined to consider another’s perspective as well as their own when negotiating a disagreement, the more likely they were to succeed as a first year student. Again, this finding took into account high school GPA. So if you were to consider two students with similar high school GPAs for admission, the one who seems to exhibit more sophisticated interpersonal skills may well be the one who is more likely to succeed. It’s important to note that this is not the same as leaning toward the student who seems more comfortable socially. In fact, our data suggests that there may be a small negative relationship between comfort with social interaction and first year success after taking into account high school GPA.

So what should we do with these findings? First of all, tread carefully. Applying these findings is sort of like fixing watches – one wrong move and you’ve turned a nifty timepiece into an expensive paperweight. Moreover, these findings will only be useful when you put them in context with everything else you might know about a prospective student. Yet, when straining to find some whisper of evidence about a student’s potential for success, these may well be exactly the veins that you ought to mine. That doesn’t mean that you’ll necessarily find what you’re looking for, but it does mean that you know just a little bit more about where to look.

Although I’ve harped on this point in previous posts, I think it is worth repeating – a mountain of research demonstrates that the experiences during the first year are critical to student success, notwithstanding all sorts of pre-college characteristics. So nothing we might know about a prospective student before they start college rises to the level of slam-dunk proof of their future success or failure. But knowing just a little bit more about traits and attributes under the radar might help us make better decisions about students who seem to cluster at the margins.

Make it a good day,

Mark

Standardized test score vs. high school GPA: The battle of the predictors!!

No matter how solid the overall academic characteristics of a first year class, by the time spring rolls around we always seem to wish that we could be just a bit more precise in identifying students who can succeed at Augustana. Yes, there are some applicants who are almost guaranteed to succeed and some who are obviously nowhere near ready for the Augustana experience. But most applications fall somewhere in between those two poles. And even though this is an exceedingly complicated and imperfect exercise, every bit of information we can tease from our own data can help us perfect our efforts to pick out those diamonds in the rough.

Standardized test score (i.e., ACT or SAT) and high school GPA have always been the big dogs in predicting college success. In the not too distant past, these two metrics were often the only numbers that a college used to make admissions decisions. But (at least) two problems have emerged that make it critically important to test the validity of both metrics among our own students. First, the test prep business has become an almost ubiquitous partner to the tests themselves. ACT or SAT preparation resources (be they online or in person) are often strongly encouraged and sometimes are even offered as a part of the high school curriculum. As a result, one could argue that standardized test scores increasingly predict test-taking skills rather than academic preparation. (Given that the average ACT composite score has remained the same from 1997 to 2016, I’m not exactly sure what the growth in the test prep business says about the industry or the people who pay for those services . . . but that is another story). When we add to the mix the correlation between socioeconomic status and available educational resources, test score becomes an even more suspect measure of academic potential.

Second, high school GPA has become an increasingly “flexible” number as high schools have added more and more weighted courses and varying academic “tracks” for different types of students. As a result, high school GPAs sometimes appear much more tightly clustered for certain types of students, making it more difficult to claim that a moderate difference in high school GPA between two applicants represents an actual difference in college readiness. In addition, these patterns of clustering quickly become specific to an individual school or district, making it even harder to compare applicants between schools or districts. For these reasons, Augustana decided a number of years back to generate for each applicant a recalculated GPA that removes much of the peculiarity of the GPA initially provided by the high school.

Given the increasing murkiness of these two metrics, it makes sense to test the predictive validity (i.e., trustworthiness) of each among our own students. In addition, since students almost always submit both a test score and a high school GPA, it would help us a lot to know more about each metric in the context of the other. For example, what if a student’s test score seems much stronger than their high school GPA, or the other way around? Should we place more value on one over the other? Should we put our trust in the more favorable of the two metrics?

To conduct this inquiry thoroughly, we tested the effect of high school GPA and standardized test score on three different measures of first year success: cumulative GPA at the end of the first year, retention to the second year, and the number of credits completed during the first year.

In short, it isn’t much of a contest. The recalculated high school GPA significantly predicts first year cumulative GPA, retention, and number of credits completed. Standardized test score only predicts first year cumulative GPA, while producing no statistically significant effect on retention or number of credits completed. In addition, when high school GPA and test score were analyzed head-to-head (i.e., both variables were included in the same statistical analysis predicting first year cumulative GPA), the size of the high school GPA effect was two and a half times larger than the effect of the standardized test score.
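Here is a hedged sketch of what that head-to-head test might look like in code. It is not our actual analysis, and the file and variable names are made up, but it shows the general shape: ordinary least squares for first-year GPA and credits completed, a logistic regression for retention, and standardized predictors so the two coefficients can be compared.

```python
# A sketch of the head-to-head comparison described above. File and
# variable names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("first_year_outcomes.csv")

# Standardize the two predictors so their coefficients are comparable.
for col in ["hs_gpa_recalc", "test_score"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()

gpa_model = smf.ols("fy_gpa ~ hs_gpa_recalc_z + test_score_z", data=df).fit()
credits_model = smf.ols("fy_credits ~ hs_gpa_recalc_z + test_score_z", data=df).fit()
retention_model = smf.logit("retained ~ hs_gpa_recalc_z + test_score_z", data=df).fit()

print(gpa_model.params, credits_model.params, retention_model.params, sep="\n")
# Comparing the standardized coefficients in gpa_model is how one would see
# a high school GPA effect that dwarfs the test score effect.
```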

This would suggest that one ought to prioritize the recalculated high school GPA over the standardized test score when evaluating a prospective student. But remember the questions I posed a few paragraphs back about an applicant whose test score and high school GPA don’t seem to match up? We felt like we needed to run one more test just in case.

To test the phenomenon of a test score and a high school GPA that appear to “disagree” with each other, we created a variable that reflected the relative gap between test score and high school GPA and tested the relationship between this variable and first year cumulative GPA. (I’ll gladly explain in more detail the steps we took to build this variable offline. Suffice it to say, “We got our stats nerd on, and it was awesome.”) Interestingly, our findings closely mirrored our prior results. As the test score exceeded the high school GPA (i.e., as the test score represented an increasingly higher academic potential than the GPA), the first year cumulative GPA tended to drop. Conversely, when the high school GPA exceeded the test score, first year cumulative GPA tended to rise. Although there is a point at which this variable is no longer useful (e.g., if the test score is 35 and the high school GPA is 1.5, one starts to suspect something more nefarious might be at play), these findings corroborate our earlier tests. For Augustana applicants, recalculated high school GPA is a more accurate predictor of first year success than the standardized test score.
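Since the details stayed offline, here is just one plausible way to build that kind of gap variable. To be clear, this is an illustration of the general idea, not necessarily the construction we used, and the data file and column names are hypothetical.

```python
# One plausible "test score vs. high school GPA gap" variable, offered only
# as an illustration; the actual construction may differ.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("first_year_outcomes.csv")

# Put both metrics on the same scale, then take the difference: positive
# values mean the test score "outran" the recalculated high school GPA.
df["test_z"] = (df["test_score"] - df["test_score"].mean()) / df["test_score"].std()
df["hsgpa_z"] = (df["hs_gpa_recalc"] - df["hs_gpa_recalc"].mean()) / df["hs_gpa_recalc"].std()
df["score_gpa_gap"] = df["test_z"] - df["hsgpa_z"]

gap_model = smf.ols("fy_gpa ~ score_gpa_gap", data=df).fit()
print(gap_model.params)
# A negative coefficient on score_gpa_gap would match the pattern described above.
```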

So if you are advising first year students, be careful about making assumptions about your students’ ability based on their test scores. Likewise, when there appears to be a gap between what the test score and the high school GPA might suggest, which metric exceeds the other might tell very different stories about the needs of a given student.

Make it a good day,

Mark

Strap yourself in. It’s gonna be an awesome year!

Although every weekend before the start of a fall term seems to bring a surge of energy to campus, this fall feels just a little bit different. I’m not sure I can put my finger on it just yet, but what usually seems like a low hum of electrical current has been elevated to a palpable buzz throughout the quad. It’s almost as if we’ve crossed some sort of invisible threshold where a confluence of undercurrents has achieved a synergy that is about to burst into a kaleidoscopic firework display of sparkle and color.

Over the last 10 years, the Augustana student body has grown more diverse on a variety of dimensions. The graph below shows the growing proportions of students from five different demographic groups since 2006. The race/ethnic line (blue) represents the proportion of non-white students. The socioeconomic line (red) represents the proportion of Pell Grant recipients. The religious line (green) represents the proportion of students who identify with a belief system that is not Protestant or Catholic. The out-of-state line (purple) represents the proportion of students who are from states other than Illinois. And the international line (light blue) represents the students who are coming to Augustana from a country outside of the United States.

[Graph: the proportion of students in each of the five demographic groups described above, by year since 2006]

These demographic shifts don’t happen by chance, and we ought to recognize and applaud the policy decisions that helped to bolster these trends.

But without a concerted and sustained commitment to make the most of this blossoming tapestry of difference, we leave ourselves vulnerable to a devastating missed opportunity. It would be terribly disappointing if all of this effort to bring such a wealth of diversity to campus produced nothing more than larger groups of homogeneity sitting in separate corners of the Gerber dining hall.

Weaving our varied demographics into a single diverse tapestry will take a continual effort to create, cultivate, and sometimes even coerce meaningful interactions across difference. This will mean that all of us will have to recognize our very human tendency to find comfort in familiarity. Although I’m not suggesting that we shun the familiar, we must humbly recognize our frailty and make the choice to open ourselves to new relationships, new ideas, and new experiences.

I know that helping young people navigate such a complex challenge is not simple, and it certainly doesn’t happen overnight. And, to be clear, I’m definitely not suggesting that we should shove our students into situations that are beyond their capability or maturity. But if we are to weave the diverse tapestry that we imagine, then we will need to put our collective shoulder to the wheel and push, steadily, kindly, and purposefully; especially when the conversations turn difficult and our vulnerabilities and insecurities feel exposed.

We have gathered the resources together on this campus to make difference one of our most powerful assets. Now let’s get to work.

Make it a good day,

Mark

What kind of work goes into recruiting a freshman class?

As you have almost certainly seen by now, the number of tuition deposits received by the end of last week for next year’s incoming class had nearly reached 740, well above the number we had cautiously hoped to reach at the beginning of this recruiting cycle. In the context of a shrinking population of high school graduates in the Midwest, this is a genuinely impressive feat.

So, for the last Delicious Ambiguity post of the year, I thought I’d share some numbers that spell out the enormity of this effort.

From June 1, 2016 through April 29, 2017, 3,123 prospective seniors (i.e., students who would start college in August, 2017) visited our campus. Averaged over that 47-week period, this works out to about 66 visits per week. Given the extended planning required to host any of the several Saturday admissions events, and given that prospective students visit campus even when classes aren’t in session or many of us might be enjoying a holiday week, it seems pretty clear that the Admissions Office is running at a high clip all year long.

Moreover, the nature of recruiting students to a campus seems to have slowly shifted in recent decades so that relatively more time is required to recruit students after they have already been accepted for admission. This is where I found two numbers to be pretty astounding.

Between December 1, 2016 and April 29, 2017, the admissions counselors and student ambassadors sent 2,868 emails and made 6,145 phone calls to accepted applicants. That averages out to about 137 emails and about 293 phone calls per week. All of this communication happened on top of the multitude of in-person campus visits.

And these totals likely don’t even include the emails and phone calls that faculty and coaches made to prospective students all year long.

So, at least for a few minutes this week, let’s lose the good ole Lutheran Midwestern reservedness and congratulate ourselves unabashedly for a job well done. In addition, I think that the admissions staff and the crew of student ambassadors deserve a giant shout out. Well done, y’all!

Make it a good day, everybody . . . and have a wonderful summer,

Mark

 

Designing semesters bit by bit – Look what we can do!

In the midst of all the inevitable end-of-spring-term craziness, the thought of contemplating one more semester design vote doesn’t seem all that appealing. Arguably, the question of whether or not to include advising within our calculus of faculty load is the most complicated of the many decisions we’ve made this year. I don’t fault anyone one bit for feeling overwhelmed, or even a little crabby, about this last vote – no matter what you think we ought to do.

But in the midst of all this, I think it is worthwhile to step back a little and have a look at what we’re on the verge of accomplishing. You might not be in the mood for hyperbole at the moment, but the truth is that we are about to complete something that almost no other institution has done. We’ve actually designed an entire semester calendar and curriculum framework out in the open, step by step, modeling the implications of all the competing issues from the very beginning and then remodeling the implications of each decision on the larger picture at each step along the way. That isn’t to say that we’ve done everything perfectly – after all, we are no more than a bunch of imperfect yokels trying to pull off something extraordinary, something that few schools have ever done and that most wouldn’t dare to try. Call me a Pollyanna, but after zooming out and having a look back at what we’ve accomplished this year, you’d be hard pressed not to be impressed.

What have we done since September?  Here are the decisions faculty have made that set each of the major elements of the new semester design in place.

  • Voted for an immersive term 140 to 26
  • Voted for a 4-credit course base instead of a 3-credit course base 126 to 37
  • Voted for the immersion term to occur in January instead of May 136 to 35
  • Voted for 124 credits to graduate instead of 120 credits to graduate 92 to 71
  • Approved the structure proposed for General Education 109 to 31
  • Approved the second language requirement unanimously
  • Approved a framework for major design and footprint unanimously

Other than the vote about the total number of credits to graduate, each vote seems to reflect a clear sense among the community about the direction that is best for us.

In addition, two faculty votes have provided advisory recommendations to the Board of Trustees, the body that makes the final decision on these two specific issues.

  • Voted for a pre-Labor Day start to the academic year 67 to 59
  • Voted that tuition should cover a relatively higher number of credits per year rather than a relatively smaller number of credits 98 to 62

All in all, the amount of intellectual and emotional work that we have successfully sorted through to accomplish all of these decisions is truly extraordinary. It’s hard to imagine anyone NOT feeling at least a little bit more tired than normal these days.

So even if you’re feeling like you are running on fumes these days, try to take a second, breathe deeply, and look at how much we have accomplished. I, for one, am truly amazed and humbled. It’s an honor to be able to call myself a member of the Augustana community.

Make it a good day,

Mark

What’s all this talk about big data?

Maybe it hasn’t popped up on your radar yet, but it seems like everywhere one turns these days there’s another perfectly coiffed Nostradamus-impersonator lauding the inevitable big data revolution that’s just around the corner for higher education.

In case you’re wondering what I think about big data and all of the hubbub about it, I’ve shared a link to something I wrote recently for the Chronicle of Higher Education that they titled, “Big Data, Scant Evidence.” If you can’t access it from where you are reading this post but really want to read the piece, send me a note and I’ll try to get an unlocked copy to you. My article is part of a larger supplement published last week about the big data trend in higher education. You might find some of the other articles interesting, although it’s hard to read some of this stuff and not think, “Isn’t this what we’ve been doing at Augustana for a while now?” Well . . . yes. Except that we aren’t necessarily a big enough place to produce big data. So what do we call our data? Diminutive? Pocket-sized? Lean?

Whatever you want to call it, we seem to be pretty good at improving based upon solid information.

Make it a good day,

Mark

A Shameless Plea

First of all, I owe you all a hearty heap of thanks for your patience this spring. For a couple of reasons, some of which can be chalked up to coincidence and some of which can be blamed squarely on me, we are participating in more than the usual number of surveys this spring.

Of course, I’d be remiss if I didn’t say something about the awesome data that we will have at the end of this term and how much it will likely inform the ways that we keep trying to improve our campus. But you’ve heard all that from me before, and by now you either believe me or you don’t.

Nonetheless . . .

We really need your help in getting first year students to respond to our End of the First Year survey. It’s especially important because we’ve been paying close attention to the experience of various subgroups of students (i.e., African-American students, Hispanic students, first-generation students, students coming from particularly low income families, etc.) that have historically not succeeded at the same rates as more affluent white students. In order to have the most robust data from these students, we need to do everything in our power to encourage participation.

And this leads me to my shameless plea.

Please, please, please: if you interact with first year students or have the wherewithal to communicate with first year students, would you please take 30 seconds to make a personal plea on behalf of the college and encourage them to complete the first year survey? All first year students received an email earlier today inviting them to take the survey. I’ll gladly send the link to anyone who would like to include it on their course’s Moodle site or web page.

Thanks very much. It really does make a difference.

Make it a good day,

Mark

And it’s down to three . . .

Good morning everyone!

It’s not every week that you get to see three pretty smart people talk about the way that they might approach a leadership role as provost at Augustana College.  So if you can find a way to be there, I hope you’ll come to see each of the provost candidates present this week.

One candidate will present each day (Monday, Tuesday, and Wednesday), once at 11 AM and again at 3 PM. The afternoon presentation is the same as the morning one, so you can come to whichever session fits your schedule.

Your participation in this process matters for several reasons.

  1. The more feedback the better for the search committee after all three finalists have been to campus.
  2. The more questions asked of the candidates the more everyone in attendance gets a sense of each candidate’s approach to public communication.
  3. The more people in attendance at these presentations the more we communicate to each candidate our investment in our provost and the college.

So come on down to the Wilson Center as often as you can make it.

Make it a good day,

Mark