One of the most important — and most neglected — parts of a social marketing campaign is evaluation. I’m not talking about typing up a report at the end to hand over to your funders. Rather, “baking in” monitoring and evaluation from the moment your campaign is developed will help you track progress, determine whether your approach is effective, and adjust mid-course if necessary. Here are three steps to help you do it.
- Clarify what you’re measuring. First, start with your goal. If you want building owners to perform energy efficiency upgrades, obviously the ultimate metric is how many upgrades get completed. But that process takes time, and you may not have a final count until your campaign is over. Are there interim metrics you can monitor to ensure you’re making good progress? Think about the steps necessary to reach your goal — in this case, you first have to talk to building owners, then get them to agree to an energy audit, and then finally complete the work. Thus, you can track the number of targets reached (this means an actual conversation, not just mailing them a flyer) and the number of audits — and if you’re halfway through your campaign and lagging on all fronts, you may want to re-evaluate your tactics. Some slackers may try to convince you that it’s OK to rely on “output measures” — how many brochures were distributed, how many times the program was mentioned in the newspaper, etc. Those are the easiest things to measure, but they matter the least. Who cares how many brochures were given out if no one read them? Focus on “outcome measures” — the things your target audience actually did in response to your activities.
- Determine your measurement technique/s. Depending on the nature of your campaign, you’ll measure things differently. Sometimes it’s a straightforward matter of recording numbers from a database — the number of building audits requested, or the number of new voters who went to the polls. Sometimes you’ll want to look at a “control group” for contrast — for instance, if you launch a smoking prevention campaign with one 8th grade homeroom and don’t with their counterparts, you can see whether the intervention worked. You might also consider quantitative surveys (phoning a large number of people to ask about their behavior), qualitative research (conducting focus groups or in-depth interviews with a small number of people), or observational research (watching what people do “in the field” — for instance, whether more or fewer people are throwing cigarette butts onto the sidewalk of a major plaza).
- Identify your schedule and budget. How frequently should you monitor your progress? And what will it cost (have you factored it into your campaign budget)? Those are two critical questions you must answer at the outset of your campaign. In general, you’ll want to get a baseline reading before you begin your campaign (to see what the situation is like without your involvement), test during the campaign (for some campaigns, it could be just once; for others, it might be on a very regular basis), and then conduct a final measurement after your campaign is over, to see what kind of impact you’ve had. You can determine the ideal frequency of “mid-course monitoring” based on two factors — first, how hard or expensive it is to get the data (checking a database is easy, while organizing focus groups is much more labor-intensive), and second, how frequently good data is available (you might want to track syphilis transmission rates every day, but if the government isn’t equipped to provide the data that frequently, you’re out of luck).
You want your campaign to be as successful as possible. The best way to get there is to monitor your progress carefully, adjust mid-course if your approach isn’t working, and learn what you can do even better next time.
This blog originally appeared on the Huffington Post.
Dorie Clark is CEO of Clark Strategic Communications and a frequent collaborator with the energy efficiency consulting firm Serrafix. Clark is the author of Reinventing You. She has taught social marketing at Tufts University, and has consulted for clients including Google, Yale University, and the National Park Service. Listen to her podcasts or follow her on Twitter.