Friday, July 23, 2010

Data Mining: What's The Big Deal?

I was participating in a meeting with a new client yesterday and we were discussing some preliminary results of a data-mining exercise. Since some of the team members were unfamiliar with me or the project, I explained the process we used, and then we brainstormed for an hour on what they would like to learn and how they would apply it immediately to their normal practices. As I told them at the end, it was a rare (and gratifying) meeting in which "data" and "really excited" occurred together in the same sentence.

What I found most fascinating about this meeting was the latent desire among staff to have the database tell them something—ANYTHING. This isn't an unsophisticated or small organization—without giving away state secrets, they have 3.26 million membership dues transactions over the past 17 years, for example—and the quality of their questions and thoughts about how they'd apply the learnings fell somewhere between pretty good and truly outstanding. But their environment—waiting six months for IT to download their data, receiving only what they refer to as 'sales reports' and little else—is very typical in associations, and it neither rewards nor cultivates expertise in working with data among association staff.

No wonder the "data driven decision making" approach recommended in ASAE's 7 Measures book never took off. Even in other ASAE publications, common advice regarding data seems to be "don't collect or store what you don't know how to use." This might be good advice and promote efficiency if it weren't so easy to snapshot your data—all of it—in an environment where it can support ad hoc queries, periodic dashboarding, product purchaser profiles, the migration patterns of your recent graduates into full membership, the relationship between product purchases or event attendance and membership conversion, repeat-customer rates, market basket analysis, an RFM matrix for your fundraising, and more.
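To make one of those possibilities concrete, here is a minimal sketch of the RFM (recency, frequency, monetary) roll-up mentioned above, run over a list of transactions. The field layout and member IDs are illustrative assumptions, not any particular AMS schema.

```python
from datetime import date

# Hypothetical transactions: (member_id, transaction_date, amount)
transactions = [
    ("M001", date(2010, 6, 1), 250.0),
    ("M001", date(2009, 5, 15), 250.0),
    ("M002", date(2007, 3, 2), 120.0),
    ("M002", date(2006, 2, 20), 120.0),
    ("M002", date(2005, 1, 10), 120.0),
]

def rfm(transactions, as_of):
    """Roll transactions up to one (recency_days, frequency, monetary) row per member."""
    summary = {}
    for member, when, amount in transactions:
        r, f, m = summary.get(member, (None, 0, 0.0))
        recency = (as_of - when).days
        if r is None or recency < r:
            r = recency  # keep the most recent transaction's age
        summary[member] = (r, f + 1, m + amount)
    return summary

scores = rfm(transactions, as_of=date(2010, 7, 23))
```

From here, binning each of the three values into quintiles gives the familiar RFM matrix for segmenting donors or members.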

The key is actually very simple: download your data just once from your AMS into a series of flat files (comma- or tab-delimited) and import them into a decision support tool. I use SAS, whose basic single-user product runs about $3,200 in the first year and then $1,600 for the annual license. John Dorman and the folks down at Texas Medical prefer the MS SQL that comes free with their network, and he describes the cost of upgrades and of training a staffer with at least some programming and analysis expertise as a one-time expense of maybe $8,000 to $10,000. I find that loading and programming an association's file takes me 2 to 10 hours, depending on how many modules the data is stored in and how much of the data we need to simplify or eliminate (you don't really need to know the name of the event they registered for on July 2, 2003—you just need to know it's one of ten they attended early in their membership tenure, before they stopped attending but continued paying their dues). Querying it, including re-sorts and the creation of new variables and categories in new datasets, might take 5 to 20 minutes, even for files with hundreds of thousands of members or millions of transactions.

Of course, most consulting isn't iterative: our reports tend to be large and episodic rather than small and applied, because we're paid to do projects rather than programs. But if you add this capability in-house (my recommendation), any association that takes this approach can have answers literally on demand, without annoying the IT staff with requests or annoying everyone by slowing down a production server.
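As a sketch of that "simplify or eliminate" step: the snippet below collapses a raw event-registration export into one attendance count per member, deliberately discarding the event names. The file layout and column names are assumptions to adjust to your own AMS export.

```python
import csv
from collections import Counter

def load_event_counts(path):
    """Collapse a raw event-registration export into one count per member.

    Assumes a tab-delimited file with (at least) the columns
    'member_id' and 'event_name' -- hypothetical names, adjust
    to whatever your AMS actually exports.
    """
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            # Discard the event name: for profiling, how *many* events a
            # member attended matters more than which ones they were.
            counts[row["member_id"]] += 1
    return counts
```

The resulting per-member counts can then be merged with dues history to build the tenure profiles described above.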

The sad part of this is that the technology is actually ancient. It's been this cheap for at least the almost 25 years I've worked with these systems: at first nobody believed it was possible, because we were using AS/400 minicomputers ('mini' actually meant 'not a mainframe'—machines weighing 1,000 pounds and sometimes fed by magnetic tape reels or cartridges). But once you turn the corner on this, in 1987 or 2010, seeing is believing. It's a simple process, and I promise you'll never miss your 80-page reports again, nor do you have to pay $50,000 for 'data integration' or other support to integrate Cognos, Crystal Reports, or whatever the tool du jour is. And after doing it inside several associations, I also never ran into the problem of being regarded as the nerdy 'stapler guy' once we proved the ease and power of real, daily data mining.

Thursday, July 8, 2010

Exit Surveys: Let Us Know How We're Doing ... And Turn Off the Lights When You Leave

I hope that most of us in associations conduct exit surveys; a recent discussion on the ASAE listserv considered how we should increase participation rates.

Good objectives for an exit survey include:

 Collecting valid research results
 Distinguishing between avoidable & unavoidable attrition
 Ending this stage of the member relationship on a good note, and
 Cleaning the database for subsequent efforts to reinstate those who will want to come "back to the fold" in the future.

Our advice in this area includes:

1) Do a multi-part contact sequence. Start with the lowest cost-per-contact option and migrate to costlier ones for the deliverable names who have not responded. A typical sequence might be: #1 online survey, #2 mail or fax survey, and (possibly) #3 telephone.

With any online survey, we recommend an invitation and two reminders to non-respondents over a several-week period. This constitutes a reasonable and consistent effort; recipients can opt out or respond at any time, and once they do, you should suppress them from the remaining reminders. You'll always annoy the vocal 1%, but persistence conveys to the former member that you're serious about wanting their feedback.
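The suppression step amounts to simple set arithmetic on your campaign lists. A minimal sketch, with made-up addresses:

```python
# Hypothetical campaign lists: anyone who has responded or opted out
# is removed before the next reminder is sent.
invited = {"a@example.org", "b@example.org", "c@example.org", "d@example.org"}
responded = {"b@example.org"}
opted_out = {"d@example.org"}

# The next reminder goes only to deliverable names still outstanding.
reminder_list = invited - responded - opted_out
```

Most survey platforms handle this automatically, but it's worth verifying that responders really do drop out of subsequent sends.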

2) Make sure the message carries a high-ranking executive or volunteer as the sender and signer. The reason many of our members leave may be tied to something big and strategic within the association or in the practice in general. Rightly or wrongly, a former member may regard correspondence from you or the membership department as transactional rather than an earnest attempt to learn and to act on the issues they identify that are within the association's control or influence.

3) Don't promise anonymity; offer confidentiality instead. You have to know who respondents are for proper campaign management, but you won't report any personally identifiable findings. For your own internal operations, it's often worth the time to go back through your AMS and trace the history of the one lapsed member who told you an interesting story about their experience. If you don't know who they are, you can't learn nearly as much about what your association did over their membership cycle to make them feel that way or form that impression.

4) Once you're using mail, offer a response premium for the first time, but make it effective. The academic literature has demonstrated for years that front-end premia outpull back-end ones; in our increasingly instant-gratification, Amazon one-click world, the effect is probably even more pronounced. Offering respondents their choice of a $25 gift certificate toward your membership/services or an AmEx gift card (a back-end premium) could appeal to some; it is far more timely and appealing to include the incentive with the survey itself, for ALL recipients.

It helps to guilt a response (you gave me something, okay, I guess I'll tell you what I think), and it's easy to implement through a buckslip coupon insert in your #10 survey mailing, or a perforated tab on your outbound postcard that the recipient can redeem online.

5) Monitor deliverability through each method and include this outcome in your final metrics. For example, if you have 500 lapsed members in April, it's helpful to know that 20% are unreachable due to job changes (bounces, NCOAs, phone confirmation of moves), and that among the rest—who may cite more than one reason—45% dropped due to inadequate value, 25% because their employer no longer reimburses dues, 20% because other associations meet their needs better, 15% for other cost-related issues, 10% due to retirement, and so on, and that 20% might seriously consider rejoining in the future. You can measure the first figure directly from campaign performance and extrapolate the remaining figures from your survey findings at whatever response rate you achieve.
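The extrapolation step is simple arithmetic: measure deliverability directly from the campaign, then project the respondents' reason shares onto everyone who was reachable. A sketch with made-up figures:

```python
# Illustrative extrapolation; every number here is invented for the example.
lapsed = 500
unreachable_rate = 0.20          # bounces, NCOAs, confirmed moves
reachable = lapsed * (1 - unreachable_rate)

# Reason shares among survey respondents (these can overlap if the
# survey allows multiple reasons, so they need not sum to 100%).
reason_shares = {
    "inadequate value": 0.45,
    "employer stopped reimbursing": 0.25,
    "retirement": 0.10,
}

# Project each share onto the full reachable population.
estimated_counts = {reason: round(share * reachable)
                    for reason, share in reason_shares.items()}
```

The implicit assumption is that non-respondents resemble respondents; if you suspect they don't (e.g., the angriest ex-members answer first), say so when you report the estimates.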

I think there can be far greater impact if you link exit-survey insight to your ongoing membership operation on a personal basis, by 'flagging and tagging' individuals who report specific reasons for dropping. In the long run, most former members still in the field may not actually care about confidentiality, and some exit-survey findings could lead to additional follow-up, such as more detailed discussions with the willing.

Former members in general appreciate knowing that you're working on the issues that led them to drop their membership, and the additional conversations allow you to probe further and test possible solutions. You certainly wouldn't want to spend all your time doing labor-intensive things for a low-potential reinstatement prospect, but if you can focus specialized attention on the individuals with the most interesting and/or prevalent comments, or those with pertinent demographics or practice characteristics, you can build this into a much stronger member retention process. –Kevin