My mind is on alumni surveys — what works, what doesn’t, and what responses are most useful for Development models. Not long ago I was handling a data set from a survey of alumni engagement and deriving predictor variables from it to use in my models. Today I’m doing the same thing with a similar survey from another university.
Similar, but not the same: One university avoided asking alumni about their level of household income; the other went for it. Who was smarter? I have to admit, I’m not really sure.
The Income Question seems like dangerous territory to me. I don’t know about Americans, but my impression is that Canadians are reluctant to divulge. The taxman gets to know, and the census-taker, but that’s it — it’s no one else’s business. People will either skip the question or, worse, abandon the survey. An additional risk for alumni surveys, which may have many goals related to alumni programming, is that it feeds cynicism about what contact from alma mater is really about.
In the data set before me now, a respectable 63.5% of survey participants chose to answer the question, a response rate lower than that for other, more innocuous questions, but favourable nonetheless. The designers of this survey observed some common best practices in handling the question: it was left to the very last (with the easy-to-answer questions loaded toward the front), and the wording made it clear that answering was optional.
But why was the question asked at all? Was it a hunt for prospects? Was it feeding into a research study? I don’t know yet. But my concern today is whether the responses are predictive of giving.
The act of answering the income question itself does not seem to be indicative of likelihood (positive or negative) of being a donor; the responders and non-responders are donors in equal proportions. However, alumni donors who skipped the question have average lifetime giving that is twice as high as that of donors who answered it; their median giving is also higher, so the picture is not being distorted by a few very generous donors who happened to skip the question.
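This kind of check is easy to sketch in pandas. The data frame below is entirely invented for illustration (the column names `answered_income` and `lifetime_giving` are my assumptions, not the survey's actual fields); only the pattern mirrors the one described: equal donor proportions among responders and non-responders, but higher giving among the donors who skipped the question.

```python
import pandas as pd

# Invented illustrative records: four alumni who answered the income
# question, four who skipped it.
df = pd.DataFrame({
    "answered_income": [True, True, True, True, False, False, False, False],
    "lifetime_giving": [0, 50, 100, 0, 0, 200, 150, 0],
})
df["is_donor"] = df["lifetime_giving"] > 0

# Donor rate by whether the income question was answered --
# in this toy data, identical for both groups.
donor_rate = df.groupby("answered_income")["is_donor"].mean()

# Among donors only: average and median lifetime giving by response status.
donors = df[df["is_donor"]]
giving_stats = donors.groupby("answered_income")["lifetime_giving"].agg(["mean", "median"])

print(donor_rate)
print(giving_stats)
```

Comparing the median alongside the mean, as the author does, is the key move here: it guards against a handful of outlier gifts driving the whole difference.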
That seems a little strange: donors AND non-donors are equally OK with the question, but the non-responders are the bigger donors?
Then, when I look at what people actually answered for income levels, it’s a whole different story again. The survey offered the respondent six income ranges to choose from — and lifetime giving increases dramatically with each increase in income. Both average and median lifetime giving shoot upward, and so does participation: At the lowest income level, more than 75% have never given; at the highest two levels (where the number of respondents is limited — 47 and 12), only about 20% have never given.
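The bracket analysis can be sketched the same way. Again, every number below is invented (six brackets, a handful of alumni each); the sketch just shows the shape of the computation: group by income range, then look at mean giving, median giving, and the share who have never given.

```python
import pandas as pd

# Invented illustrative data: six income brackets, with giving and
# participation rising by bracket, as in the pattern described.
df = pd.DataFrame({
    "income_bracket":  [1, 1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 5, 5, 6, 6],
    "lifetime_giving": [0, 0, 0, 25, 0, 0, 50, 0, 75, 100, 0, 150,
                        200, 250, 500, 1000],
})

by_bracket = df.groupby("income_bracket")["lifetime_giving"].agg(
    mean_giving="mean",
    median_giving="median",
    pct_never_gave=lambda g: (g == 0).mean() * 100,
)
print(by_bracket)
```

In this toy data, mean giving climbs with every bracket and the never-gave percentage falls, which is the relationship the real survey showed. With only 47 and 12 respondents in the top two real brackets, though, those cells deserve wide error bars.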
So while answering the question is not indicative of anything, what you answer appears to have a connection to what you give. That makes intuitive sense, but does this information provide us with something useful and predictive? I’m not sure yet — I have yet to introduce the variable into a model — but already I know that there will be plenty of interaction (or overlap) with other variables. For example, the lower-income alumni tend to be young and female, and the higher-income alumni older and male.
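Before adding income to a model, the overlap worry can be checked directly: if the income bracket is strongly correlated with age and splits cleanly by gender, it may carry little information the model doesn't already have. A minimal sketch, on invented data chosen to show exactly that kind of overlap:

```python
import pandas as pd

# Invented illustrative data in which income tracks age closely
# and differs by gender -- the overlap scenario described.
df = pd.DataFrame({
    "age":            [25, 28, 30, 35, 45, 50, 55, 60],
    "is_female":      [1, 1, 1, 0, 1, 0, 0, 0],
    "income_bracket": [1, 1, 2, 3, 4, 5, 5, 6],
})

# A high correlation here suggests income adds little beyond age.
corr_with_age = df["income_bracket"].corr(df["age"])

# Mean bracket by gender: a large gap points to the same overlap problem.
mean_by_gender = df.groupby("is_female")["income_bracket"].mean()

print(corr_with_age)
print(mean_by_gender)
```

A more thorough test would be to fit the model with and without the income variable and compare, which is presumably where the author's "internal jury" will get its evidence.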
In other words, it seems likely that the income question data will be partly or completely eclipsed by gender, age and other variables. My internal jury is still out. But for now, I'd advise that if you have any input into the design of an upcoming alumni survey, you think twice before putting the income question at the top of any list of must-haves.