Guest post by Peter B. Wylie, with John Sammis
With the advent of the internet and its exponential growth over the last decade and a half, web surveys have gained a strong foothold in society in general, and in higher education advancement in particular. We’re not experts on surveys, and certainly not on web surveys. However, let’s assume you (or the vendor you use to do the survey) e-mail either a random sample of your alumni (or your entire universe of alumni) and invite them to go to a website and fill out a survey. If you do this, you will run into the problem of poor response rates. If you’re lucky, maybe 30% of the people you e-mailed will respond, even if you vigorously follow up with non-responders, encouraging them to please fill the thing out.
This is a problem. There will always be the lingering question of whether or not the non-responders are fundamentally different from the responders with respect to what you’re surveying them about. For example, will responders:
To find out whether these questions point to a real problem, you (or your vendor) could do some checking to see whether your responders:
It is our impression that most schools that conduct alumni web surveys don’t do this sort of checking. Their reports may mention response rates, but few offer any analysis of how responders differ from non-responders.
Again, we’re talking about impressions here, not carefully researched facts. But that’s not our concern in this paper. Our concern is that web surveys (done at schools where potential responders are contacted only by e-mail) are highly unlikely to be representative of the entire universe of alums, even if the response rate were a full one hundred percent. Why? Because our evidence shows that alumni who have an e-mail address listed with their schools differ markedly from alumni who do not on two important variables: age and giving.
To make our case, we’ll offer some data from four higher education institutions spread out across North America; two are private, and two are public. Let’s start with the distribution of e-mail addresses listed in each school by class year decile. You can see these data in Tables 1-4 and Figures 1-4. We’ll go through Table 1 and Figure 1 (School A) in some detail to make sure we’re being clear.
Take a look at Table 1. You’ll see that the alumni in School A have been divided into ten roughly equal-sized groups, where Decile 1 represents the oldest group and Decile 10 the youngest. The table shows a very large age range. The youngest alums in Decile 1 graduated in 1958. (Most of you reading this paper had not yet been born that year.) The alums in Decile 10 (unless some of them went back to school late in life) are all twenty-somethings.
Table 1: Count, Median Class Year, and Minimum and Maximum Class Years for All Alums Divided into Deciles for School A
Now look at Figure 1. It shows the percentage of alums by class year decile who have an e-mail address listed in the school’s database. Later on in the paper we’ll discuss what we think are some of the implications of a chart like this. Here we just want to be sure you understand what the chart is conveying. For example, 43.0% of alums who graduated between 1926 and 1958 (Decile 1) have an e-mail listed in the school’s database. How about Decile 9, alums who graduated between 2001 and 2005? If you came up with 86.5%, we’ve been clear.
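If you’d like to try this kind of breakdown on your own data, here’s a rough sketch in Python with pandas of how you might build the class year deciles and tally the percentage of alums with an e-mail address in each one. The file name and the column names (class_year, email) are placeholders for whatever your database exports; this is our illustration, not anything these four schools actually ran.

```python
import pandas as pd

# A rough sketch, not the schools' actual code. Assumes one row per alum in a
# hypothetical alumni.csv with placeholder columns 'class_year' and 'email'
# (blank or missing when no address is on file).
alums = pd.read_csv("alumni.csv")

# Split alums into ten roughly equal-sized groups by class year, with
# Decile 1 = oldest class years and Decile 10 = youngest. Ranking first
# breaks ties so the deciles come out nearly equal in size.
alums["decile"] = pd.qcut(
    alums["class_year"].rank(method="first"), 10, labels=list(range(1, 11))
)

# Flag whether an e-mail address is listed.
alums["has_email"] = alums["email"].fillna("").str.strip().ne("")

# Table 1-style summary: count, median, minimum, and maximum class year per decile.
print(alums.groupby("decile", observed=True)["class_year"].agg(["count", "median", "min", "max"]))

# Figure 1-style summary: percent of alums with an e-mail address, per decile.
print((alums.groupby("decile", observed=True)["has_email"].mean() * 100).round(1))
```

The output should line up with the kind of decile table and e-mail percentages shown above, give or take how your system stores class years.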
Go ahead and browse through Tables 2-4 and Figures 2-4. After you’ve done that, we’ll tell you what we think is one of the implications of what you’ve seen so far.
Table 2: Count, Median Class Year, and Minimum and Maximum Class Years for All Alums Divided into Deciles for School B
Table 3: Count, Median Class Year, and Minimum and Maximum Class Years for All Alums Divided into Deciles for School C
Table 4: Count, Median Class Year, and Minimum and Maximum Class Years for All Alums Divided into Deciles for School D
The most significant implication we can draw from what we’ve shown you so far is this: If any of these four schools were to conduct a web survey by contacting only alums with an e-mail address, it would simply fail to reach large numbers of alums whose opinions it is probably interested in gathering. Some specifics:
Another way of expressing this implication is that each school (regardless of its response rate) would largely be tapping the opinions of younger alums, not older or even middle-aged alums. If that’s what a school really wants to do, okay. But we strongly suspect that’s not what it wants to do.
Now let’s look at something else that concerns us about doing web surveys when potential respondents are contacted only by e-mail: giving. Figures 5-8 show the percentage of alums who have given $100 or more lifetime, broken down by whether or not they have an e-mail address listed, across class year deciles.
As we did with Figure 1, let’s go over Figure 5 to make sure it’s clear. For example, in Decile 1 (the oldest alums), 87% of alumni with an e-mail address have given $100 or more lifetime to the school. Alums in the same decile who do not have an e-mail address? 71% of them have given $100 or more lifetime to the school. How about Decile 10, the youngest group? What are the corresponding percentages for alums with and without an e-mail address? If you came up with 14% versus 6%, we’ve been clear.
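If you want to reproduce the kind of comparison shown in Figures 5-8 for your own alums, here is a continuation of the earlier sketch. As before, the file and column names are placeholders; in particular, lifetime_giving stands in for whatever field your database uses for total lifetime giving.

```python
import pandas as pd

# Continues the sketch above: same hypothetical alumni.csv, plus an assumed
# 'lifetime_giving' column holding each alum's total lifetime giving in dollars.
alums = pd.read_csv("alumni.csv")
alums["decile"] = pd.qcut(
    alums["class_year"].rank(method="first"), 10, labels=list(range(1, 11))
)
alums["has_email"] = alums["email"].fillna("").str.strip().ne("")
alums["gave_100_plus"] = alums["lifetime_giving"].fillna(0) >= 100

# Figure 5-style summary: percent of alums who have given $100 or more lifetime,
# with vs. without an e-mail address on file, within each class year decile.
pct_givers = (
    alums.groupby(["decile", "has_email"], observed=True)["gave_100_plus"]
    .mean()
    .mul(100)
    .round(1)
    .unstack("has_email")
)
print(pct_givers)
```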
Take a look at Figures 6-8, for Schools B, C, and D. Then we’ll tell you the second implication we see in all these data.
The overall impression we get from these four figures is clear: Alumni who do not have an e-mail address listed give considerably less money to their schools than do alumni with an e-mail address listed. This difference can be particularly pronounced among older alums.
Some Conclusions
The title of this piece is: “Are We Missing Too Many Alumni with Web Surveys?” Based on the data we’ve looked at, we think the answer has to be “yes.” It can’t be a good thing that so many web surveys never reach the many older alums who don’t have an e-mail address listed, nor the alums without an e-mail address who (on average) haven’t given as much as those with one.
On the other hand, we want to stress that web surveys can provide a huge amount of valuable information from the alums who are reached and do respond. Even if the coverage of the whole alumni universe is incomplete, the thousands of alums who take the time to fill out these surveys can’t be ignored.
Here’s an example. We got to reading through the hundreds and hundreds of written comments from a recent alumni survey. We haven’t included any of the comments here, but my (Peter’s) reaction to the comments was visceral. Wading through all the typos, and misspellings, and fractured syntax, I found myself cheering these folks on:
In total, these comments added up to almost 50,000 words of text, the length of a short novel. And they were a lot more interesting than the words in too many of the novels I read.
As always, we welcome your comments.