Tuesday, October 31, 2006

Get Out the Vote
Some anecdotal responses

Donald P. Green and Alan S. Gerber are political scientists at Yale University. Eager to justify the discipline's claim to be scientific, they conducted experiments in U.S. elections over a period of several electoral cycles to measure the efficacy of various campaign methods. They randomly selected potential voters and a similar control group; they applied various electoral techniques to the selected voters while leaving the controls alone; they measured actual voting behavior from public records; and they replicated the same experiments in different times and places. That is, they set the scientific method loose on campaign tactics.

Their little book Get Out the Vote: How to Increase Voter Turnout is a fascinating treasury of data, all fully reported. Some of their conclusions agree with my long experience in field campaigns; a very few don't agree; and I have many anecdotes that I think amplify some of their findings.

First a caveat: though Green and Gerber name the book GOTV, they were not always measuring the same thing that campaigns mean by that acronym. In actual campaigns, GOTV means getting out their own votes, not voters at large. The last thing most campaigns want is to turn out every voter. Though Green and Gerber do describe some partisan experiments, a lot of their data derives from non-partisan turnout efforts in which increased participation by anyone was considered a success. Campaigns want something quite different. They seek to win, not simply encourage participation.

Green and Gerber report on many tactics, in descending order of efficacy. Here I am summarizing their findings on the subset of their data that refers to partisan, contested elections:
  • Door-to-door -- 1 vote per 14 contacts;
  • Leafleting -- in partisan campaigns, 1 vote per 66 lit drops to registered voters;
  • Direct mail -- 1 vote per 177 base voters; 1 vote per 600 persuaded voters;
  • Phone calls -- volunteer callers, 1 vote per 35 contacts; paid phonebank, 1 vote per 400 calls without enhanced training and supervision;
  • Robo calls -- no detectable effect;
  • Email -- no detectable effect.
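Those rates lend themselves to simple back-of-the-envelope arithmetic. The sketch below is purely illustrative (the tactic labels and the helper function are my own inventions); it combines the contacts-per-vote figures above with the roughly-12-door-contacts-per-hour pace discussed later in this post:

```python
# Illustrative arithmetic using the Green and Gerber efficacy
# rates listed above (contacts needed per additional vote).
contacts_per_vote = {
    "door-to-door": 14,
    "leafleting": 66,
    "volunteer phone": 35,
    "paid phonebank": 400,
}

def expected_votes(tactic: str, contacts: int) -> float:
    """Expected additional votes from a given number of contacts."""
    return contacts / contacts_per_vote[tactic]

# One canvasser making about 12 door contacts an hour over a
# 3-hour shift produces 36 contacts -- roughly 2.6 extra votes.
print(round(expected_votes("door-to-door", 12 * 3), 1))
```

On these numbers, a single afternoon of door knocking is worth more than 100 pieces of dropped literature, which is the comparison the rest of this post keeps circling back to.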
Some observations:

Door-to-door canvassing
My experience is certainly that door-knocking canvasses work well for campaigns. But I was initially skeptical about this book's assertion that organizers can assume roughly 12 contacts an hour; I thought that was too high a number. So I asked around among experienced organizers, and they agreed with Green and Gerber. Moreover, I tracked my own contacts on a precinct walk last Saturday, and they were right on target.

Why does much of my experience suggest lower numbers of voters contacted per hour? I think too much of my experience has been with inner-city canvasses in low-income areas where voters may be harder to find. Also, most of the door knockers I've worked with have been very inexperienced, and they were given the moderately difficult mission of identifying voters' partisan preferences. This is tougher than just encouraging turnout and may lead to lower contact rates.

An article by Michael McDonald in the Washington Post on Sunday suggested that canvass efforts may not be so effective in hot contests such as we are seeing this year. Green and Gerber read their experiments otherwise, saying door-knocking GOTV works in both competitive and non-competitive contests. I'm not surprised. What none of us who care about politics can readily grasp is just how remote these contests are from many voters' ordinary consciousness. Most people don't think much about politics -- or want to. Last weekend I walked door to door for one of the hottest contests around this year. Our opening line was supposed to be: "Have you heard of Jerry McNerney, who is running for Congress?" Ten days out, lots of people had not. They have better things to do than worry about politics. That is why they need person-to-person contact to get them to vote (our way, we hope).

Overall, I could not agree more with Green and Gerber:

Face-to-face interaction makes politics come to life and helps voters to establish a personal connection with the electoral process. The canvasser's willingness to devote time and energy signals the importance of participation in the electoral process.

Leafleting
Under this label, Green and Gerber mean dropping campaign literature, especially door hangers, at the doors of targeted voters. As an organizer, I hate this kind of "leafleting." I consider the choice to employ this tactic an admission of defeat: we must not have recruited enough canvassers willing to actually talk with people, so we defaulted to sending the ones we did get out on a "lit drop." Given that attitude, I was pleasantly surprised to see that GOTV maintains this activity is relatively effective.

They don't have a lot of data on this, so I feel quite free to offer my caveats about "leafleting." When five campaigns are lit-dropping the same area, I doubt the effect is positive. Or, rather, I doubt it helps determine the direction of the vote, even if it increases turnout. And when an area is frequently targeted by pizza-delivery door hangers, forget it!

I've seen a couple of kinds of large lit drops that I think did some good. If no other campaigns are door-hanging and the date is surprising, I think you get something out of hitting every house in a neighborhood. Once we door-hung a full half of San Francisco for a candidate the night before Thanksgiving; on the holiday morning, every address found a message from our candidate, who was facing a December runoff election. That was probably noticeable and a reasonable use of volunteer labor.

The other effective door hanging I've seen is done the night before or morning of Election Day with polling place information on each door hanger. I suspect this is novel enough to attract some notice, though it would be great to have better data on whether this really works.

Direct mail
Damn -- I love it that Green and Gerber consider direct mail relatively pricey and ineffective. Campaigns put vast sums into mailing and then starve their people-to-people efforts, when the money could better have been used for personal contact. At the very least, many could put more effort into creating literature that can be delivered at the doors. I think some of the popularity of mail as a tactic is that lots of campaign workers think of themselves as budding message wizards, sure they've got the magic words and pictures that will win the day. Again, the more one interacts with average, unengaged voters, the more one realizes that messages and their impacts include an awful lot of unknowns. Usually, the message is only one variable in a shifting context that determines electoral outcomes in the few contested races.

Phone calls
Green and Gerber make it clear that phone contacts have to feel personal and neighborly to the voter on the other end of the line, whether made by volunteers or professionals. That rings true and is hard to achieve. Organizers all include phoning as a major part of electoral field programs because it is relatively simple to organize, but growing numbers of unlisted numbers and cell phones, and the reality that more and more people simply don't answer, are diminishing the returns. Improved distributed computerized predictive-dialing systems are helping this cycle, but I would expect phoning success to continue to decline: from being the best way of reaching people aside from at their doors, phones are becoming just one among many communication media, alongside text messages, email, etc.

Quibbles aside, Green and Gerber have presented their research results in a way accessible to anyone who needs to think about how to win campaigns. Like the good professors themselves, I look forward to reading about more well-designed experiments that test all this stuff we do every couple of years. Organizers do enough flailing around in this very human enterprise; it would be nice to know that our tactical choices have been guided by smart experimental data.
