When an adviser’s suggestion becomes guidance that a student follows

Remember the Student Readiness Survey (the SRS)? We built this survey and compendium report two years ago to give each first year student and his or her adviser a better way to start a recurring conversation about strategies for succeeding during the first year. We modeled the SRS after research examining the psychological and behavioral factors that can influence college success – things like academic habits, academic confidence, propensity to persist, and stress management. Faculty advisers who have used the reports as intended have found that their interactions with first year students have changed dramatically.

Until now, however, we’ve had only anecdotal data to suggest that this might be an effective tool. So in our new mid-year survey of first year students, among several questions about the interactions between students and their advisers, we included one item that focuses specifically on the SRS. Students responded to this statement:

  • “My first year adviser helped me understand my Student Readiness Survey (SRS) results.”

The available response options and the response distributions were:

  • We never talked about them (what is the SRS?) – 76 (20%)
  • Only briefly – 74 (20%)
  • Yes, but they weren’t all that useful – 131 (35%)
  • Yes, and they influenced how I approached the beginning of my freshman year – 94 (25%)

As you can infer from the response options above, we hoped to find that students who selected “Yes, and they influenced how I approached the beginning of my freshman year” also had more positive responses to other items that we know are important for a successful first year.
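As a side note for anyone reproducing these figures, here is a minimal sketch (in Python, which the post itself does not use) of how a response distribution like the one above is tabulated from raw counts. The labels are shortened stand-ins for the full response options:

```python
# Counts for the SRS item, taken from the distribution in the post.
# The dictionary keys are abbreviated versions of the response options.
counts = {
    "never talked": 76,
    "only briefly": 74,
    "yes, not that useful": 131,
    "yes, influenced me": 94,
}

total = sum(counts.values())  # 375 respondents
percentages = {
    label: round(100 * n / total)  # whole-number percents
    for label, n in counts.items()
}
```

Here the rounded percentages (20, 20, 35, 25) happen to sum to 100, although rounding can make such columns total 99 or 101.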

Although we still have a lot to analyze, we’ve already found one statistically significant relationship that I think is worth sharing. Another item on the same survey asked students to respond to this statement:

  • “My first year adviser made me feel like I could succeed at Augustana.”

Student responses to this item were:

  • Strongly disagree – 9 (2%)
  • Disagree – 16 (4%)
  • Neutral – 60 (16%)
  • Agree – 144 (38%)
  • Strongly agree – 146 (39%)

Obviously, this is an important item, because students’ self-belief is often vital to their success. Moreover, that self-belief is directly influenced by the messages students receive during their interactions with faculty, staff, and administrators.

One of the more important interactions shaping this belief is the one between students and their advisers. So we tested the relationship between these two items while holding constant several factors that might also affect whether a student indicated that (a) the student’s adviser helped him or her understand the SRS results, and (b) that conversation influenced how the student approached the beginning of his or her freshman year. These controls included measures that might account for the degree to which a student was already fully prepared to succeed in the first year (and thus didn’t really need the additional advice) or found guidance through other means, like peer groups or faculty interactions (which would then “wash out” the impact of the SRS report and subsequent conversation).

Sure enough, we found a statistically significant and relatively large positive effect.  In other words, as students indicated a more positive response to the notion that their adviser made them feel like they could succeed at Augustana, students were also more likely (much more likely, to be frank) to report that their conversation with their adviser about the SRS results had influenced the way that they approached their freshman year.
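The post does not name the statistical model behind this result, so as a hedged illustration only, here is a simplified sketch in Python of the kind of association being described: a hand-rolled Spearman rank correlation between two ordinal survey items. The response vectors below are invented for the example, and the control variables the actual analysis held constant are omitted; a real analysis with controls would typically use a regression model instead.

```python
def ranks(values):
    """Assign 1-based ranks, with tied values sharing the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # Extend j to cover the whole run of tied values.
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Made-up responses for ten hypothetical students:
# adviser-support item coded 1 (strongly disagree) .. 5 (strongly agree),
# SRS-conversation item coded 1 (never talked) .. 4 (influenced my approach).
support = [5, 5, 4, 4, 3, 2, 5, 4, 3, 1]
srs = [4, 4, 3, 4, 2, 1, 3, 3, 2, 1]
rho = spearman(support, srs)
```

A coefficient near +1 means students who rated adviser support higher also tended to report more influence from the SRS conversation; on the invented data above, rho comes out strongly positive.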

This finding seems to validate the way we designed the SRS report and the way first year adviser training has emphasized using it as a formative conversation starter. Students often respond particularly well to advice when they hear it as a strategy for increasing their own likelihood of success rather than as just one more thing someone else has told them to do. In addition, this finding suggests a ready-made way for advisers to get their students to act on good guidance: use the SRS as a tool to start the conversation, revisit the strategies discussed with the student at a later date, and continue to work with the student to build an actionable plan for success in college.

So even though it is almost the middle of the spring term and you might have long since forgotten about your students’ SRS reports, now might be just the time to get them out and revisit those results with your students.

Make it a good day,


But the story is so much more interesting than the truth!

A couple of weeks ago, the Delta Cost Project produced a report titled “Labor Intensive or Labor Expensive? Changing Staffing and Compensation Patterns in Higher Education.”  The authors examined several decades of IPEDS data to better understand the hiring and compensation trends that might have driven tuition increases across each sector of higher education.  Overall, the report concluded that higher education institutions’ workforces had increased on average by 28% in the last decade as college enrollments increased at a similar pace.

However, the sound bite that won the news cycle asserted that this report supported the “administrative bloat” meme – the claim that an explosion of non-faculty hires has driven increasing tuition costs and has eroded institutional support for (or as some folks would spin it – the supremacy of the) faculty.  The report did highlight several national trends over the last decade including increases in part-time faculty, increases in mid-level administrators, increases in the cost of benefits for all types of employees, and a drop in the ratio of faculty to administrators (i.e., there are more administrators per faculty member now than there were 20 years ago).

But all of the numbers in the Delta Cost Project report portrayed national trends. A number of faculty and administrators asked me to examine our own Augustana data to see whether our trends replicate these national patterns. So I presented our local data to the Faculty Senate last week and have linked the PowerPoint for you to see here. In order to make any sense of the rest of this post, you’ll have to click on the PowerPoint and have a look at the graphs in it.

I’d like to quickly point to a couple of take-aways and then ask the same question that I asked at the end of my presentation.

First, as you can see from the graphs in the PowerPoint, Augustana has not mirrored the national trends in the relationship between faculty and administrator positions. In fact, we’ve gone the other direction: faculty positions have increased while administrator positions have declined.

Second, our own increasing use of part-time faculty parallels the national trends, although to a far smaller degree.  Similarly, albeit to an even smaller degree, we’ve increased the number of non-tenure track full time faculty in recent years.

Now I don’t expect for a second that presenting our local data will forever quiet the claim that administrative growth at Augustana is out of control. But I would like to ask one question: what do any of these numbers have to do with student learning? Do we know that more faculty, a lower student-faculty ratio, or a lower faculty-administrator ratio somehow improves our retention or graduation rates? The little evidence we have suggests that none of these changes produces any effect. Likewise, there is little evidence to suggest that more administrators, a lower student-administrator ratio, or a lower administrator-faculty ratio is a quick fix either. The fact is that we have no idea what the ideal mix of faculty and administrators might be. Indeed, the answer might not be in the numbers themselves, but rather in how all of our faculty, administrators, and staff collaborate to create the best possible conditions for student acclimation, learning, and growth.

Make it a good day,


Supporting Students IN ORDER TO Challenge Them

The most fundamental framework for successful student development, learning, and growth is the synergistic concept of challenge and support. Essentially, this concept articulates the critical balance between two approaches to facilitating learning. First, if we want to help students grow in substantial ways, we have to challenge them to push themselves beyond where they are comfortable. Then, in order to minimize the likelihood that they will quit in the midst of this discomfort, we must provide encouragement (support) to help them persist toward their goal. It is equally important to recognize that students need both types of interaction, regardless of the order in which they occur. If we want students to respond positively when we challenge them, we have to have already built a foundation of trust (by expressing a belief that they are capable of success) so that they will be willing to take the risk of responding to our challenge. In this way, challenge and support function almost like yin and yang. If we want our students to grow, and more importantly to take responsibility for their own growth, neither concept works without the continuous, healthy presence of the other.

In the mid-year first year survey, we asked freshmen how often their instructors had pointed out something that they had done well. We asked this question because we wanted to find out more about the degree to which students experience support. (Last week I discussed one of the questions that addressed the degree to which students experience challenge.) The responses were distributed like this:

  • Never – 3%
  • Rarely – 15%
  • Sometimes – 44%
  • Often – 29%
  • Very Often – 9%

Frankly, if you were to force me to pick an “ideal” response distribution, I’d like to see every student choose “sometimes” or “often.” At the same time, I’d hope that this response was balanced by students indicating that they also experienced consistent levels of challenge. Furthermore, I’d hope that this response reflected our students’ experiences in each course, rather than the possibility that our students all had some professors who were uniformly critical and others who were uniformly encouraging.

It troubles me that 69 of the 375 respondents (about 60% of our freshman class completed this survey) answered “rarely” or “never.” Of course, these students may have been classic screw-ups who rarely or never turned in work that merited a compliment. But even if that were so, given that human beings need both challenge and support to take on a difficult task and persist until they overcome it, throwing our hands in the air and saying that these students’ work didn’t merit a positive word simply increases the chances that they won’t succeed.

Humbly, I would suggest that our job as educators isn’t to ensure failure. Instead, I’d suggest that our job is to increase the likelihood of success, especially among those who don’t rise to the occasion on their own or who didn’t already have the tools before they got here.

One important detail to remember is that this question asked students to indicate the degree to which they think they received compliments for something they did well. That isn’t the same as finding out whether their instructors actually gave them compliments. Sometimes students don’t recognize the words we say or write as compliments, simply because of where they are in their own development. For example, students may not recognize as a compliment the academic language we often use to describe their effort in a paper.

So as you begin to provide feedback to students in discussion, on written work, in online fora, or on other assignments, find ways to provide enough support to gain their trust. For then, and only then, will you be in a position to really challenge them when it matters and push them to excel beyond what they originally thought was possible.

Make it a good day,



What if your sense of how hard your students work doesn’t match how hard they think they work?

There are so many times when I read or hear of a great idea that I know would help me in my work. But at the key moment when I could really put that piece of information to use, the little nugget might as well be circling a distant galaxy. So one of the things I am going to try to do better is write posts that are more relevant to the issues faculty and staff face when they face them. You’ve probably heard of Just-in-Time Teaching (it’s a great book, by the way). Think of this as just-in-time data.

Even though the spring term starts this week, many of you are probably still toying with the details of your syllabus(es), thinking about what you might do to make your class just a bit better without blowing it up and creating an avalanche of work for yourself precisely when you are already swamped. A couple of weeks ago I wrote about evidence from our own data suggesting the potential benefits of adding an early assignment to your course. Two other items from our just-completed mid-year survey of freshmen put me in mind of one other important, and sometimes easily forgotten, issue that can also make a big difference in your students’ learning and your course’s success.

Freshmen were asked at the end of last term to respond to these two items:

My instructors set high expectations for my learning and growth.

  • Never – 3 (1%)
  • Rarely – 6 (2%)
  • Sometimes – 38 (10%)
  • Often – 163 (43%)
  • Most or all of the time – 165 (44%)

I really worked hard to meet my instructor’s expectations.

  • Never – 2 (1%)
  • Rarely – 2 (1%)
  • Sometimes – 38 (10%)
  • Often – 146 (39%)
  • Most or all of the time – 187 (50%)

On one level, the fact that these two items correlate so closely is a good thing (it would indeed be frightening if they didn’t!). However, as I thought more about these two response sets, I began to wonder: would 89% of our faculty (the proportion that matches our students’ response distribution) also say that first-year students “really worked hard” to meet expectations “often” or “most or all of the time”?
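For what it’s worth, the 89% figure can be checked directly from the counts on the second item above:

```python
# Share of the 375 respondents choosing "often" or "most or all of the
# time" on the "really worked hard" item (counts from the post).
often, most_or_all = 146, 187
total = 375

share = round(100 * (often + most_or_all) / total)
print(share)  # prints 89
```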

My guess is that there is something worth unpacking here. To be fair, it’s possible that the questions themselves are problematic. Maybe students aren’t comfortable suggesting that their instructors didn’t push them all that much, or that they “mailed it in” more often than not. Yet we have prior NSSE data suggesting that our first year students complete homework assignments and write multiple drafts of papers more often than students at comparable institutions. Because multiple findings pointing in the same direction make a general claim harder to dismiss, I’m inclined to suspect that the responses outlined above might indeed point to something worth considering.

That leaves us with the possibility (especially if you are one of those faculty who don’t think that your first year students “really work hard” to meet your expectations about 90% of the time) that our students either don’t always have a clear sense of what is expected of them or that we aren’t always holding them to our own high expectations when we provide grades and feedback. Moreover, and I mean this genuinely, more than a few of our students may not yet have had the kind of life experiences that teach one what it really means to work hard to accomplish something.

Of course we know from our daily work that learning is messy business. Human beings aren’t always thrilled to be stretched outside of their comfort zone, nor are we always excited by the prospect of failure as a necessary precursor to real learning. This is why masterful teaching is a constant balancing act: pushing students beyond where they might want to go while at the same time supporting them by expressing a belief (even if it’s more theoretical than actual) that they can accomplish what you’ve asked them to do.

Most of us have probably been challenged at least once by a student who doesn’t think they deserve the grade we gave them. That conversation is always more difficult if the student doesn’t grasp the nature of the standards we applied to their work.

I mention all of this in order to suggest that student responses to these two questions may represent the degree to which our students really (sometimes desperately) need clear, precise, and pointed guidance about faculty expectations for quality work.  Although we might all think everyone knows what it means to write clearly, students – especially freshmen – often have only the vaguest notion of what that actually looks like.

There may be lots of other things going on behind these responses.  In fact, if you’ve got an observation that you’d like to share, by all means add a comment below.  But if you want one fairly simple thing to insert into your course(s) that can pay dividends later in the term, take some extra time to clarify your expectations for your students in ways that they can understand.  Then you can truly hold their feet to the fire when you push them to “really work hard” in order to meet the expectations you set.

Make it a good day,