CoolData blog

2 August 2016

Data Down Under, and the real reason we measure alumni engagement

Filed under: Alumni, Dalhousie University, engagement, Training / Professional Development — kevinmacdonell @ 4:00 pm

 

I’ve given presentations here and there around Canada and the U.S., but I’ve never travelled THIS far. On Aug. 24, I will present a workshop in Sydney, Australia — a one-day master class for CASE Asia-Pacific on using data to measure alumni engagement. My wife and I will be taking some time to see some of that beautiful country, leaving in just a few days.

 

The workshop attendees will be alumni relations professionals from institutions large and small. Keeping the audience’s needs in mind, I hope to convince them that measuring engagement is worth doing by talking about what’s in it for them.

 

This will be the easy part. Figuring out how to quantify engagement will allow them to demonstrate the value of their teams’ activity to the university, using language their senior leadership understands. Scoring can also help alumni teams better target segments based on varying levels of engagement, evaluate current alumni programming, and focus on activities that yield the greatest boost in engagement.

 

There is a related but larger context for this discussion, however. I am not certain that everyone will be keen to hear about it.

 

Here’s the situation. Everything in alumni relations is changing. Alumni populations are growing, the number of donors is decreasing, and traditional engagement methods are less effective. Friend-raising and “one size fits all” approaches to engagement are increasingly seen as unsustainable wastes of resources. (A Washington, DC–based consultancy, the Education Advisory Board, makes this point very well in an excerpt of a report, which you can download here: The Strategic Alumni Relations Enterprise.)

 

I don’t know as much about the Asia-Pacific region, but in North America university leaders are questioning the very purpose and value of typical alumni relations activities. In this climate, engagement measurement is intended for more than producing an informational report or having something to brag about: it is a tool that enables alumni relations to better align itself with the Advancement mission.

 

In place of “one size fits all,” alumni relations teams are under pressure to understand how to interact with alumni at different levels of engagement. Alumni who are somewhat engaged should be targeted with relevant programs and messages to bring them to the next level, while alumni who are at the lowest levels of engagement should not have significant resources directed at them.

 

Alumni at high levels of engagement, however, require special and customized treatment. They’re looking for deeper and more fulfilling experiences that involve furthering the mission of the institution itself. Think of guest lecturing, student recruitment, advisory board roles, and mentorship, career development and networking for students and new grads. Low-impact activities such as pub nights and other social events are a waste of the potential of this group and will fail to move them to continue contributing their time and money.

 

Think of what providing these quality experiences will entail. For one, alumni relations staff will have to collaborate with their colleagues in development, as well as in other offices across campus — enrolment management, career services, and academic offices. This will be a new thing, and perhaps not an easy thing, for alumni relations teams stuck in traditional friend-raising mode and working in isolation.

 

But it’s exactly through these strategic partnerships that alumni relations can prove its value to the whole institution and attract additional resources even in an environment where leaders are demanding to know the ROI of everything.

 

Along with better integration, a key element of this evolution will be robust engagement scoring. According to research conducted by the Education Advisory Board, alumni relations does the poorest job of any office on campus in providing hard data on its real contribution to the university’s mission. Too many of us are still stuck on tracking our activities instead of the results of those activities.

 

It doesn’t have to be that way, if the alumni team can effectively partner with other units in Advancement. For those of us on the data, reporting, and analysis side of the house, get ready: The alumni team is coming.

 

11 May 2015

A new way to look at alumni web survey data

Filed under: Alumni, Surveying, Vendors — kevinmacdonell @ 7:38 pm

Guest post by Peter B. Wylie, with John Sammis

 

Click to download the PDF file of this discussion paper: A New Way to Look at Survey Data

 

Web-based surveys of alumni are useful for all sorts of reasons. If you go to the extra trouble of doing some analysis — or push your survey vendor to supply it — you can derive useful insights that could add huge value to your investment in surveying.

 

This discussion paper by Peter B. Wylie and John Sammis demonstrates a few of the insights that emerge by matching up survey data with some of the plentiful data you have on alums who respond to your survey, as well as those who don’t.

 

Neither alumni survey vendors nor their higher education clients are doing much work in this area. But as Peter writes, “None of us in advancement can do too much of this kind of analysis.”

 

Download: A New Way to Look at Survey Data

 

 

16 July 2013

Alumni engagement scoring vs. predictive modelling

Filed under: Alumni, engagement, predictive modeling — kevinmacdonell @ 8:06 am

Alumni engagement scoring has an undeniable appeal. What could be simpler? Just add up how many events an alum has attended, add more points for volunteering, add more points for supporting the Annual Fund, and maybe some points for other factors that seem related to engagement, and there you have your score. If you want to get more sophisticated, you can try weighting each score input, but generally engagement scoring doesn’t involve any advanced statistics and is easily grasped.
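To make that concrete, here is a minimal sketch of the additive approach in Python, with hypothetical activity counts and gut-feel weights (nothing here is a prescribed standard):

```python
import pandas as pd

# Hypothetical per-alum activity counts pulled from the database.
alums = pd.DataFrame({
    "alum_id": [101, 102, 103],
    "events_attended": [3, 0, 1],
    "volunteer_roles": [1, 0, 0],
    "annual_fund_gifts": [2, 0, 5],
})

# Gut-feel weights, exactly the "common sense" weighting described
# above; these are illustrative, not statistically derived.
weights = {"events_attended": 1, "volunteer_roles": 3, "annual_fund_gifts": 2}

alums["engagement_score"] = sum(alums[col] * w for col, w in weights.items())
print(alums[["alum_id", "engagement_score"]])
```

No regression, no validation: just counting and weighting, which is why the scores are so easy to explain.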

Not so with predictive modelling, which does involve advanced stats and isn’t nearly as intuitive; often it’s not possible to really say how an input variable is related to the outcome. It’s tempting, too, to think of an engagement score as being a predictor of giving and therefore a good replacement for modelling. Actually, it should be predictive — if it isn’t, your score is not measuring the right things — but an engagement score is not the same thing as a predictive model score. They are different tools for different jobs.

Not only are engagement scoring schemes different from predictive models; their simplicity is also deceptive. Engagement scoring is incomplete without some plan for acting on observed trends with targeted programming. This implies the ability to establish causal drivers of engagement, which is a tricky thing.

That’s a sequence of events, not a one-time thing: measure, act on what you observe with targeted programming, then measure again to see whether the needle moved. In fact, engagement scoring is like checking the temperature at regular intervals over a long period of time, looking for up and down trends, not just for the group as a whole but via comparisons of important subgroups defined by age, sex, class year, college, degree program, geography, or other divisions. This requires discipline: taking measurements in exactly the same way every year (or quarter, or what-have-you). If the score is fed by a survey component, you must survey constantly and consistently.
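As a sketch of what this temperature-taking might look like in practice, assuming scores captured the same way at each measurement and a purely hypothetical age-band segmentation:

```python
import pandas as pd

# Hypothetical scores taken with an identical method at each measurement.
history = pd.DataFrame({
    "year": [2011, 2011, 2012, 2012, 2013, 2013],
    "age_band": ["under 35", "35 and over"] * 3,
    "engagement_score": [4.2, 6.1, 4.6, 6.0, 5.1, 5.8],
})

# Mean score per subgroup per year: the trend lines to watch over time.
trend = history.pivot_table(index="year", columns="age_band",
                            values="engagement_score", aggfunc="mean")
print(trend)
```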

Predictive models and engagement scores have some surface similarities. They share variables in common, the output of both is a numerical score applied to every individual, and both require database work and math in order to calculate them. Beyond that, however, they are built in different ways and for different purposes. To summarize:

  • Predictive models are collections of potentially dozens of database variables weighted according to strength of correlation with a well-defined behaviour one is trying to predict (e.g., making a gift), in order to rank individuals by likelihood to engage in that behaviour. Both Alumni Relations and Development can benefit from the use of predictive models.
  • Engagement scores are collections of a very few selectively chosen database variables, either not weighted or weighted according to common sense and intuition, in order to roughly quantify the quality of “engagement”, however one wishes to define that term, for each individual. The purpose is to allow comparison of groups (faculties, age bands, geographical regions, etc.) with each other. Comparisons may be made at one point in time, but it is more useful to compare relative changes over time. The main user of scores is Alumni Relations, in order to identify segments requiring targeted programming, for example, and to assess the impact of programming on targeted segments over time.

Let’s explore key differences in more depth:

The purpose of modelling is prediction, for ranking or segmentation. The purpose of engagement scoring is comparison.

Predictive modelling scores are not usually included in reports. Used immediately in decision making, they may never be seen by more than one or two people. Engagement scores are included in reports and dashboards, and influence decision-making over a long span of time.

The target variable of a predictive model is quantifiable (e.g., giving, measurable in dollars). In engagement scoring, there is no target variable, only an output – a construct called “engagement”, which itself is not directly measurable.

Potential input variables for predictive models are numerous (100+) and vary from model to model. Input variables for engagement scores are limited to a handful of easily measured attributes (giving, event attendance, volunteering) which must remain consistent over time.

Variables for predictive models are chosen primarily using statistical methods (correlation) and only secondarily using judgment and “common sense.” For example, if the presence of a business phone number is highly correlated with being a donor, it may be included in the model. For engagement scores, variables are chosen by consensus of stakeholders, primarily according to subjective standards. For example, event attendance and giving would probably be deemed by the committee to indicate engagement, and would therefore be included in the score. Advanced statistics rarely come into play. (For more thoughts on this, read How you measure alumni engagement is up to you.)
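As an illustration of the statistical route to variable selection, a minimal sketch of correlation screening; the candidate variables and donor flag below are hypothetical:

```python
import pandas as pd

# Hypothetical candidate inputs alongside the behaviour to predict.
df = pd.DataFrame({
    "is_donor":           [1, 0, 1, 1, 0, 0, 1, 0],
    "has_business_phone": [1, 0, 1, 1, 0, 1, 1, 0],
    "has_email":          [1, 1, 1, 0, 0, 1, 1, 1],
    "years_since_grad":   [20, 3, 15, 30, 2, 8, 25, 5],
})

# Correlation of each candidate with the target; weakly correlated
# inputs would be dropped before modelling.
screen = df.corr()["is_donor"].drop("is_donor").sort_values(ascending=False)
print(screen)
```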

In predictive models, giving and variables related to the activity of giving are usually excluded as variables (if ‘giving’ is what we are trying to predict). Using any aspect of the target variable as an input is bad practice in predictive modelling and is carefully avoided. You wouldn’t, for example, use attendance at a donor recognition event to predict likelihood to give. In engagement scoring, though, giving history is usually a key input, as it is common sense to believe that being a donor is an indication of engagement. (It might be excluded or reported separately if the aim is to demonstrate the causal link between engagement indicators and giving.)

Modelling variables are weighted using multiple linear regression or other statistical method which calculates the relative influence of each variable while simultaneously controlling for the influence of all other variables in the model. Engagement score variables are usually weighted according to gut feel. For example, coming to campus for Homecoming seems to carry more weight than showing up for a pub night in one’s own city, therefore we give it more weight.

The quality of a predictive model is testable, first against a validation data set, and later against actual results. But there is no right or wrong way to estimate engagement, therefore the quality of scores cannot be evaluated conclusively.
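To illustrate both ideas at once, statistical weighting and testability, here is a minimal sketch using logistic regression on synthetic data (one possible “other statistical method”; the data reflect no real institution):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: three database variables and a giving flag.
X = rng.random((500, 3))
y = (X @ np.array([2.0, 1.0, 0.2]) + rng.normal(0, 0.5, 500) > 1.6).astype(int)

# Hold out a validation set, then fit on the rest.
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# The coefficients are the weights, estimated while controlling for the
# other variables; the AUC on held-out data tests the model's quality.
print("weights:", model.coef_[0])
print("validation AUC:", roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
```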

The variables in a predictive model have complex relationships with each other that are difficult or impossible to explain except very generally. Usually there is no reason to explain a model in detail. The components in an engagement score, on the other hand, have plausible (although not verifiable) connections to engagement. For example, volunteering is indicative of engagement, while Name Prefix is irrelevant.

Predictive models are built for a single, time-limited purpose and then thrown away. They evolve iteratively and are ever-changing. On the other hand, once established, the method for calculating an engagement score must not change if comparisons are to be made over time. Consistency is key.

Which is all to say: alumni engagement scoring is not predictive modelling. (And neither is RFM analysis.) Only predictive modelling is predictive modelling.

6 June 2012

How you measure alumni engagement is up to you

Filed under: Alumni, Best practices, Vendors — kevinmacdonell @ 8:02 am

There’s been some back-and-forth on one of the listservs about the “correct” way to measure and score alumni engagement. One vendor, which claims to specialize in scientific rigor, is pressing for a greater emphasis on it. That emphasis is misplaced.

No doubt there are sophisticated ways of measuring engagement that I know nothing about, but the question I can’t get beyond is, how do you define “engagement”? How do you make it measurable so that one method applies everywhere? I think that’s a challenging proposition, one that limits any claim to “correctness” of method. This is the main reason that I avoid writing about measuring engagement — it sounds analytical, but inevitably it rests on some messy, intuitive assumptions.

The closest I’ve ever seen anyone come is Engagement Analysis Inc., a firm based here in Canada. They have a carefully chosen set of engagement-related survey questions which are held constant from school to school. The questions are grouped in various categories or “drivers” of engagement according to how closely related (statistically) the responses tend to be to each other. Although I have issues with alumni surveys and the dangers involved in interpreting the results, I found EA’s approach fascinating in terms of gathering and comparing data on alumni attitudes.

(Disclaimer: My former employer was once a client of this firm’s but I have no other association with them. Other vendors do similar and very fine work, of course. I can think of a few, but haven’t actually worked with them, so I will not offer an opinion.)

Some vendors may make claims of being scientific or analytically correct, but the only requirements for quantifying engagement are that it be reasonable and, if you are benchmarking against other schools, consistent from school to school. If you do want to benchmark against other institutions, engage a vendor to do it right, because it’s not easily done.

But if you want to benchmark against yourself (that is, over time), don’t be intimidated by anyone telling you your method isn’t good enough. Just do your own thing. Survey if you like, but call first upon the real, measurable activities that your alumni participate in. There is no single right way, so find out what others have done. One institution will give more weight to reunion attendance than to showing up for a pub night, while another will weigh all event attendance equally. Another will ditch event attendance altogether in favour of volunteer activity, or some other indicator.

Can anyone say definitively that any of these approaches are wrong? I don’t think so — they may be just right for the school doing the measuring. Many schools (mine included) assign fairly arbitrary weights to engagement indicators based on intuition and experience. I can’t find fault with that, simply because “engagement” is not a quantity. It’s not directly measurable, so we have to use proxies which ARE measurable. Other schools measure the degree of association (correlation) between certain activities and alumni giving, and base their weights on that, which is smart. But it’s all the same to me in the end, because ‘giving’ is just another proxy for the freely interpretable quality of “engagement.”
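For schools taking that correlation route, a minimal sketch of deriving weights from each indicator’s association with giving; the indicators and data are hypothetical:

```python
import pandas as pd

# Hypothetical engagement indicators plus a donor flag as the anchor.
df = pd.DataFrame({
    "reunion":   [1, 0, 1, 0, 1, 0, 0, 1],
    "pub_night": [0, 1, 1, 0, 0, 1, 0, 1],
    "volunteer": [1, 0, 1, 0, 0, 0, 1, 1],
    "is_donor":  [1, 0, 1, 0, 1, 0, 1, 1],
})

# Each indicator's correlation with giving becomes its weight
# (floored at zero so negative correlations don't subtract points).
weights = df.corr()["is_donor"].drop("is_donor").clip(lower=0)
df["engagement_score"] = (df[weights.index] * weights).sum(axis=1)
print(weights)
```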

Think of devising a “love score” to rank people’s marriages in terms of the strength of the pair bond. A hundred analysts would head off in a hundred different directions at Step 1: Defining “love”. That doesn’t mean the exercise is useless or uninteresting, it just means that certain claims have to be taken with a grain of salt.

We all have plenty of leeway to choose the proxies that work for us, and I’ve seen a number of good examples from various schools. I can’t say one is better than another. If you do a good job measuring the proxies from one year to the next, you should be able to learn something from the relative rises and falls in engagement scores over time and compared between different groups of alumni.

Are there more rigorous approaches? Yes, probably. Should that stop you from doing your own thing? Never!

17 October 2011

À la recherche du alumni engagement perdu

Filed under: Alumni, Why predictive modeling? — kevinmacdonell @ 8:56 am

Have you read any Proust? His voluminous novel contains many unsentimental thoughts about friendship and love. Among them is the idea that the opposite of love is not hate. The opposite of love is indifference.

The constituents in your database who are not engaged are not the ones who write nasty letters. They’re not the ones who give you a big thumbs-down on your survey. They’re not the ones who criticize the food at your gala dinner. They’re not the ones who tell your phone campaign callers never to call again.

Nope. Your non-engaged constituents are the ones you never hear from. The ones who chuck out your mailings unopened. The ones who ignore the invitation to participate in a survey. The ones who have never attended an event. The ones who never answer the phone.

If your school or organization is typical, a good-sized portion of your database falls into this category. There’s money in identifying who is truly not engaged, and therefore not worth wasting resources on.

The ones who are moved to criticize you, the ones who have opinions about you, the ones who want to be contacted only a certain way — ah, they’re different.

The future belongs to those who can tell the difference.

14 September 2011

Is the online behaviour of your alums worth exploring?

By Peter Wylie and John Sammis

(Download a printer-friendly PDF version of this paper: Online behaviour of alums)

For a number of years John Sammis and I have been pushing colleges and universities to examine the data they (or their vendors) collect for alums who are members of their online communities. For example, we encourage them to look at very basic things like:

  • The number of e-mails an alum has opened since it’s been possible to get such data
  • The number of “click throughs” an alum has made to the website in response to an e-mail, an e-newsletter, and the like
  • The number of times an alum visits the website
  • The date and time of each visit

Why do we think they should be recording and examining these kinds of data? Because (based on some limited but compelling evidence) we think such data are related to how much and how often alums give to their alma maters, as well as how engaged they are with these institutions (e.g., reunion attendance, volunteering, etc.). To ignore such data means leaving money on the table and losing a chance to spot alums who are truly interested in the school, even if they’ll never become major givers.

Frankly the response to our entreaties has been less than heartening:

  • “We don’t have an online community. If we get one, that’s probably a year or two away.”
  • “With the explosion of social media, we’re more interested in what we can learn about our alums through Facebook, LinkedIn, Twitter … I mean those are the sites our alums will be going to, not ours.”
  • “You want us to get record-by-record data from the vendor who maintains our site? Surely you jest. We’re lucky if they’ll send us decipherable summary data on email openings and click-throughs.”

But we’re nothing if not persistent. So what we’ve done here is put together some data from a four-year higher education institution that has a pretty active online community. Granted, it’s only one school, but the data show a pronounced relationship between number of website visits and several different measures of alumni engagement and alumni giving.

We have to believe this school is not a glaring exception among the thousands of schools out there that have online communities. Our hope is that you’ll read through what we have to show and tell and conclude, “What the heck. Why don’t we take a similar look at our own data and see what we can see. Can’t hurt.”

Nope. Can’t hurt, and it might help – might help a lot.

A View of the Overall Distribution of Website Visits and the Distribution of Visits by Class Year

Table 1 shows that almost exactly two thirds of the alums have never visited the school’s website as an identifiable member of the school’s online community. The remaining third are roughly evenly divided among four categories: one visit; two to three visits; four to seven visits; and eight or more visits.

Table 1: Frequency and Percentage Distribution of Website Visits for More Than 40,000 Alums

As soon as we saw this distribution, we were quite sure it would vary a great deal depending on how long people had been out of school. To confirm that hunch, we divided all alums into ten roughly equal-sized groups (i.e., into deciles).
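For anyone wanting to replicate the split, a minimal sketch using pandas (the class_year column is hypothetical):

```python
import pandas as pd

# Hypothetical alum records with preferred class years.
alums = pd.DataFrame({"class_year": range(1926, 2011)})

# Ten roughly equal-sized groups by class year, as in Table 2.
alums["cy_decile"] = pd.qcut(alums["class_year"], 10, labels=range(1, 11))
summary = alums.groupby("cy_decile", observed=True)["class_year"]
print(summary.agg(["count", "min", "max"]))
```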

Table 2: Count, Median Class Year, and Minimum and Maximum Class Years for All Alums Divided into Deciles

As you can see in Table 2, there are some very senior people in this alumni universe, and there are some very junior people. For example, the majority of folks in Decile 10 (CY 2006 – CY 2010) are probably in their 20s. What about Decile 1 (CY 1926 – CY 1958)? It’s a safe bet that these folks are all over 70, and we may have at least one centenarian in the group (which we think is pretty cool).

If you look at Table 3, you can see the percentage distribution of website visits for each Class Year Decile. However, the problem with that table (and most tables that have lots of information in them) is that (unless you’re a data geek like we are) it’s not something you want to spend a lot of time looking at. You’d probably prefer to look at a chart, a graphic display of the data. So what we’ve done here (and throughout this paper) is display the data graphically for the folks in Decile 1, Decile 5, and Decile 10 – very senior people, middle-aged people, and very young people.

Table 3: Percentage of Website Visits by Class Year Decile

 

Clearly our hunch was right. The distribution of website visits is highly related to how long people have been out of school:

  • Over 90% of alums who graduated before 1959 (Decile 1) have not visited the website.
  • In the youngest group (Decile 10) only a bit over 25% of alums have not visited the site.
  • You have to look at Table 3 to see the trend, but notice how the “0 Visits” percentage drops for Deciles 7-10 (a span covering alums graduating in 1992 up to 2010): 68.9% down to 64.3% down to 46.5% down to 27.7%.

 

The Relationship between Number of Website Visits and Alumni Engagement

If you work in higher education advancement, you probably hear the term “alumni engagement” mentioned several times a week. It’s something lots and lots of folks are concerned about. And plenty of these folks are finding more and more ways to operationally define the term.

Here we’ve taken a very simple approach. We’ve looked at whether or not an alum had ever volunteered for the institution and whether or not an alum had ever attended a reunion.

Volunteering

Table 4 and Figures 4 to 6 show the relationship between number of website visits and volunteering. Just to be clear on what we’re laying out here, let’s go through some of the details of Table 4.

We’ll use Class Year Decile 1 (alums who graduated between 1926 and 1958) as an example. Look at the alums in this decile who have never visited the website: only 17.1% of them have ever volunteered. On the other hand, 42.9% of alums who have visited the website 8 or more times have volunteered. If you look at Figure 4, of course, you’ll see the same information depicted graphically.

Table 4: Percentage of Alums by Number of Website Visits for All Deciles Who Ever Volunteered

There are two facts that stick out for us in Table 4 and Figures 4 to 6:

  • Alums who have never visited the website are far less likely to have volunteered than those who have visited even once.
  • In general, there is a steady climb in the rate of volunteering as the number of website visits increases.
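For readers who want to reproduce a table like Table 4 from their own data, here is a minimal pandas sketch; the data below are randomly generated, with a relationship between visits and volunteering built in purely so the pattern shows:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic alum-level data: visit counts plus a volunteer flag that
# (by construction) becomes likelier as visits rise.
visits = rng.poisson(2, 1000)
volunteered = rng.random(1000) < (0.10 + 0.04 * np.minimum(visits, 8))
df = pd.DataFrame({"web_visits": visits, "volunteered": volunteered})

# The same visit bins used in this paper: 0, 1, 2-3, 4-7, 8+.
df["visit_bin"] = pd.cut(df["web_visits"], bins=[-1, 0, 1, 3, 7, np.inf],
                         labels=["0", "1", "2-3", "4-7", "8+"])

# Percentage who ever volunteered within each visit bin; adding a
# class-year decile as a second grouping key would reproduce Table 4.
rates = df.groupby("visit_bin", observed=True)["volunteered"].mean() * 100
print(rates.round(1))
```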

Reunion Attendance

If you look through Table 5 and Figures 7 to 9, you’ll see a relationship between number of website visits and reunion attendance that’s very similar to what you saw between number of website visits and volunteering. The one exception would be the youngest group of alums – those in Decile 10, who graduated between 2006 and 2010. These alums are simply too young to have attended a five-year reunion. (Although it would appear that several of them found a way to make it back to school anyway – good for them.)

Table 5: Percentage of Alums by Number of Website Visits for All Deciles Who Ever Attended a Reunion

The Relationship between Number of Website Visits and Giving

There is no question that advancement offices are interested in alumni engagement. But if we’re realistic, we have to admit they tend to view engagement as mainly a step in the direction of one day becoming a donor. So let’s take a look at how number of website visits is related to alumni giving at this school.

We’ve created two sets of tables and figures to allow you to get a clear look at all this:

  • Table 6 and Figures 10 to 12 show the relationship between the number of website visits and giving over the past two fiscal years.
  • Table 7 and Figures 13-15 show the relationship between the number of website visits and lifetime giving of $10,000 or more.

Browse through all this material. After you’ve done that, we’ll tell you what we see.

Table 6: Percentage of Alums by Number of Website Visits for All Deciles Who Have Given Anything in the Last Two Fiscal Years

Table 7: Percentage of Alums by Number of Website Visits for All Deciles Who Have Given $10,000 or More Lifetime

Clearly, there is a lot of information contained in these tables and charts. But if we stand back from all that we see, the picture becomes clear. Regardless of how long alums have been out of school, those who have visited the website are better recent givers, and better major givers, than those who have not.

For example, let’s focus on alums who graduated before 1959 (Decile 1). Those who have visited the website at least 8 times are almost twice as likely to have given in the last two fiscal years as those who have never visited the site (75% versus 41.6%). If we look at giving of $10,000 or more lifetime for this same decile, the difference is even more striking: 42.9% versus 12.5%.

Let’s jump down to Decile 10, the “youngsters” who graduated between 2006 and 2010. Understandably, almost none of these alums have given $10,000 or more lifetime. But look at Figure 12. For this group the relationship between number of website visits and giving over the last two fiscal years is striking:

  • 27.8% of those with 0 website visits gave during this period.
  • 35.1% of those with 1 visit gave during this period.
  • 38.1% of those with 2-3 visits gave during this period.
  • 43.1% of those with 4-7 visits gave during this period.
  • 50.9% of those with 8 or more visits gave during this period.

Where to Go from Here

Clearly, there is a strong relationship between this simple web metric (number of website visits) and both alumni engagement and alumni giving at this particular school. It’s reasonable to assume that the same sort of relationship holds true at other schools. If you agree with that assumption, then we think it’s more than worth your while to take a look at similar data at your own institution.

At this point you might decide:

“Look guys, this is all very interesting, but we simply don’t have the time, resources, nor staff to do that. Maybe sometime in the future, when things are less hectic around here, we’ll take your advice. But not now.”

As much as we love this sort of analysis, we totally get a decision like that. We may be specialists, but we talk to enough people in advancement every week to realize you have a lot more on your minds than data mining and predictive modeling.

On the other hand, you might conclude that what we’ve done here is something you’d like to try to replicate, or improve on. If so, here’s what we’d recommend:

  1. Find out what kind of online data is available to you.
  2. Ask your technical folks to get those data into analyzable form for you.
  3. Do some simple analyses with the data.
  4. Share the results with colleagues you think would find it interesting.

 

1. Find Out What Kind of Data Is Available

Depending on how your shop is set up, this may take some persistence and digging. If it were us, we’d be trying to find out:

  • Has an alum ever opened an email that we’ve sent them? (In a lot of schools they don’t have to be a member of the online community for you to ascertain that.)
  • Have they ever opened an e-newsletter?
  • Have they ever clicked through to your website from an e-mail or e-newsletter?
  • Can you get counts for number of openings and number of click-throughs?

In all probability, you’ll be dealing with a vendor (either directly or through your IT folks) to get answers to these questions. Expect some pushback. A dialogue that goes like this would not be unusual:

YOU: Can I get the number of e-mails and e-newsletters that each of our alums has opened since the school has been sending out that kind of stuff?

VENDOR: We can certainly give you the number of e-mails and number of e-newsletters that were opened on each date that one was sent out.

YOU: That’s great, but that’s not what I’m looking for. I need to know, on a record-by-record basis, which alums opened the e-communication, and I need a total count for each alum for their total number of openings.

VENDOR: That’ll take some doing.

YOU: But you can do it?

VENDOR: I suppose.

YOU: Terrific!

 

2. Ask Your Technical Folks to Get The Data Into Analyzable Form.

What does “analyzable form” mean? To us that just means getting the data into spreadsheet format (probably Excel) where the first field will probably be the unique ID number you use to keep track of all the alums (and other constituents) in your fundraising database. For starters, we’d recommend something very simple. For example:

  • Field A: Unique ID number
  • Field B: Total amount of lifetime hard credit (for many alums, this value will be zero)
  • Field C: Total amount of hard credit for the last two fiscal years
  • Field D: Total number of e-mails or e-newsletters opened
  • Field E: Total number of click-throughs to your website from these e-mails and e-newsletters
  • Field F: Preferred class year of the alum

In our opinion, this kind of file should be very simple to build. In our experience, however, that is often not the case. (Why? How much time you got?)

Our frustrations with this sort of problem notwithstanding, keep pushing for the file. Be polite. Be diplomatic. And, above all, be persistent.
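Once the file finally arrives, loading it is the easy part. A minimal sketch, assuming pandas and hypothetical names for Fields A through F (the file name is made up):

```python
import pandas as pd

# Hypothetical export matching Fields A through F above.
cols = ["alum_id", "lifetime_giving", "giving_last_two_fy",
        "emails_opened", "click_throughs", "class_year"]
df = pd.read_excel("alum_online_activity.xlsx", header=0, names=cols)

# Quick sanity checks before any real analysis.
print(df.shape)
print(df.describe())
```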

 

3. Do Some Simple Analyses with The Data.

There are any number of ways to analyze your data. Our bias would be to have you import the Excel file into a stats software package, and then do the analysis. (You can do it in Excel, but it’s a lot harder than if you use something like SPSS or Data Desk [our preference]).
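If a stats package isn’t at hand, free tools can stand in. A minimal sketch with pandas and matplotlib, continuing from the hypothetical file in step 2:

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical file and columns from step 2.
df = pd.read_excel("alum_online_activity.xlsx")

# Bin click-throughs the same way the paper bins website visits.
df["visit_bin"] = pd.cut(df["click_throughs"], [-1, 0, 1, 3, 7, float("inf")],
                         labels=["0", "1", "2-3", "4-7", "8+"])

# Percentage who gave in the last two fiscal years, by bin.
rate = (df.groupby("visit_bin", observed=True)["giving_last_two_fy"]
          .apply(lambda s: (s > 0).mean() * 100))

rate.plot(kind="bar", xlabel="Click-throughs", ylabel="% giving, last two FY")
plt.tight_layout()
plt.show()
```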

If you can’t do this yourself, we’d recommend that you find someone on your team or on your campus to do it for you.  The right person, when you ask them if they can roughly replicate the tables and charts included in this paper, will say something like, “Sure,” “No problem,” “Piece of cake,” etc. If they don’t, keep looking.

 

4. Share The Results With Colleagues You Think Would Find It Interesting.

Sharing your results with colleagues should be stimulating and enjoyable. You know the folks you work with and have probably already got some in mind. But here are a few suggestions:

  • Look for people who think data driven decision making is important in higher education advancement.
  • Avoid people who are inclined to tell you why something can’t be done. Include people who enjoy finding ways around obstacles.
  • It’s okay to have one devil’s advocate in the group. More than one? That can be kind of frustrating.
  • If you can, get a vice president to join you. Folks at that level can help move things forward more easily than people at the director level, especially when it comes to “motivating” vendors to do things for you that they’d rather not do.

When you can, let us know how things go.
