When a little boy, with some prompting from his mom, asked Texas Gov. Rick Perry how old he thought the earth was, Perry responded that he wasn't sure -- that evolution is just "a theory that's out there" and that's why Texas teaches creationism*. "Ask him why he doesn't believe in science!" the kid's mom pretended to whisper to her son, though the comment was clearly directed at Perry. But it turns out Perry does believe in science after all -- when it's applied to politics. As Sasha Issenberg explains to The New York Times' David Leonhardt, Perry's 2006 campaign imported "eggheads" from academia to apply the scientific method to classic campaign tools.

Issenberg explains how the experiments began:

As the 2006 election season approached, the governor's top strategist, Dave Carney, invited four political scientists into Perry's war room and asked them to impose experimental controls on any aspect of the campaign budget that they could randomize and measure. Over the course of that year, the eggheads, as they were known within the campaign, ran experiments testing the effectiveness of all the things that political consultants do reflexively and we take for granted: candidate appearances, TV ads, robocalls, direct mail. These were basically the political world's version of randomized drug trials, which had been used by academics but never from within a large-scale partisan campaign.
And how the tests were conducted:
The eggheads controlled Perry's schedule for three days and randomly assigned his travel across Texas. During that time, they conducted a massive volume of polling calls -- large enough to discern significant movement in each city -- and tracked contributions and volunteer activity. They found that Perry's presence in a city had an impact: his approval ratings went up, and contributions and volunteer signups increased after he did a public event.
 
Because they had randomized the schedule, the eggheads could attribute the changes to Perry's presence with confidence.
No one had ever used such rigorous methods to test the effect of the candidate's appearance on local poll numbers before, Issenberg explains. Further, the eggheads figured out that it was actually worth Perry's time to fly out to small towns to talk to the local press, instead of doing satellite interviews from a studio in Austin. The travel time was worth it, even if it meant Perry couldn't spend as much time on the phones raising money.
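The logic Issenberg describes -- randomize the treatment, then compare outcomes between the groups -- can be sketched in a toy simulation. Every number below is invented for illustration, not drawn from the Perry campaign's data:

```python
import random
import statistics

random.seed(42)

# Toy model: each city has its own baseline approval rating (in
# percentage points), drawn from a normal distribution.
N_CITIES = 200
TRUE_VISIT_EFFECT = 3.0  # hypothetical lift, in points, from a candidate visit

baselines = [random.gauss(45.0, 5.0) for _ in range(N_CITIES)]

# Randomly assign which cities get a visit -- the key step that lets us
# attribute any difference to the visit itself, rather than to the
# campaign cherry-picking friendly towns.
visited = set(random.sample(range(N_CITIES), N_CITIES // 2))

# Observed approval: baseline, plus the visit effect in treated cities,
# plus polling noise.
observed = [
    base + (TRUE_VISIT_EFFECT if i in visited else 0.0) + random.gauss(0.0, 2.0)
    for i, base in enumerate(baselines)
]

treated = [observed[i] for i in range(N_CITIES) if i in visited]
control = [observed[i] for i in range(N_CITIES) if i not in visited]

# Because assignment was random, the difference in group means is an
# unbiased estimate of the causal effect of a visit.
estimated_effect = statistics.mean(treated) - statistics.mean(control)
print(f"Estimated visit effect: {estimated_effect:.2f} points")
```

Without the random assignment, a simple before/after comparison could not separate the visit's effect from the campaign's tendency to schedule events where support was already rising -- which is exactly the confound the eggheads' design rules out.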
 
Interestingly, the campaign operation that has come closest to basing its strategy on hard data is one Perry hopes to run against in 2012. But even David Plouffe's methods weren't as rigorous as Carney's. Sure, Plouffe insisted on analyzing data, but Carney used real live scientists, Issenberg explains:
Like Carney, Plouffe is a ruthless empiricist who helped instill an analytical culture that pervaded the Obama organization. Everything that could be measured was measured, and data was used to judge the relative effectiveness of campaign techniques, both old and new.
 
The big difference was in methodology. As best I can tell, the Obama campaign never used randomized trials to test its operations offline. ... That reflects the fact that most of the people doing analytics within the Obama campaign, unlike the eggheads, didn't come out of the academic social sciences, where after 2000 randomized field experiments became an increasingly popular tool for measuring basic political communication techniques. The great use of randomized trials -- as in medicine or agriculture or development policy -- is that they can not only find relationships between sets of data, but can explain causality. So there's a different level of authority in some of the Perry findings than anything that came out of Obama's headquarters, which helps to explain why Perry already leads Obama in at least one head-to-head metric: the number of scholarly papers published based on campaign research.
 
Note: Texas does not actually teach creationism.