Farrah Bostic on Design thinking in the Political Process
Very - VERY interesting conversation with Farrah Bostic, founder of The Difference Engine and faculty at Parsons. We don't usually talk politics, but this look behind the curtain is insightful and engaging. We went overtime because there was so much ground to cover.
[00:00:11] Recording: This is The Strategy Inside Everything.
[00:00:25] Adam Pierno: All right, welcome back to another episode of The Strategy Inside Everything. This is a very exciting episode for me, because it's a topic that is really, really interesting and I'm getting to speak to someone who intellectually, from what I learned in the past 30 seconds of pre-talk, is going to run circles around me, which I always really enjoy. We have the founder and chief strategist of The Difference Engine, Farrah Bostic, who is also adjunct faculty at the Parsons School for Design, and we are just thrilled to have you here, Farrah. Thank you so much for joining us.
[00:00:59] Farrah Bostic: Yes, thank you, this is great.
[00:01:02] Adam: Can you do me a favor? Can you tell all our listeners a little bit more about who you are and what you have done before we dive into our topic?
[00:01:08] Farrah: Sure. As you said, I'm the founder and chief strategist at a consultancy here in New York called The Difference Engine. Our primary focus is what we have been calling "insight- and product-driven strategy" for about the last four, four and a half years. So a lot of what I do is design research: user interviews and ethnographic work, helping product teams better develop products that really work for their customers. It is also sometimes teaching those product teams how to do their own research, since I constitutionally object to the idea of outsourcing empathy.
Then the other thing that I found as part of that is that it's not just the skills of research, in terms of how you ask good questions, or how you find people to talk to, or how you take good notes, or how you do good analysis; it's also a process problem: how does this integrate into the rest of our design process, or the rest of our project management process? So I also do a lot of consulting with teams about that, the bigger question of fitting thinking and listening into the systems they have put in place: creating better briefs for design teams, creating better research reports and reporting structures.
Those are the things that I have been doing. My background is primarily in market research; I spent four and a half years with a brand and communications research consultancy called Hall & Partners here in New York.
[00:02:41] Adam: I know them well, yes.
[00:02:42] Farrah: Yes, where I became a partner and research director and I led the innovation practice space within the company, both internally, starting to develop new practices for us as qualitative researchers, new methods, and tools, but also helping to shepherd some of our clients through new product development cycles and better ways of thinking about incorporating research into that early innovation work that they were trying to do.
So, I was with them for, like I said, about four and a half years. I was with a company called OTX for a little while after that, until they were gobbled up by Ipsos. And before that I actually worked in ad agencies, as a digital creative to begin with, about 20 years ago.
[00:03:26] Adam: That must have been a huge shift, yes?
[00:03:28] Farrah: Yes, a huge shift. Well, one of the reasons I became a strategist and a researcher is that in 1999 and 2000, when you were trying to build e-commerce sites for companies, there was very little of anything that was truly off the shelf, so we had to build things from scratch. If you are building inventory systems from scratch just to enable a website that is ultimately a marketing website for a retailer in Santa Monica, then you have to understand something about logistics and inventory and the business model of retail.
Suddenly you find yourself elbow deep in the guts of the business, not really thinking much about marketing and copywriting at all, but instead thinking about the underlying business model, and the customer, and what it is that the business is trying to achieve. So after that experience, I went back to working in normal ad agencies, but I was a very bad agency employee as a result.
I was a nuisance in meetings.
[00:04:29] Adam: Is it because you were asking a lot of questions as a spectator?
[00:04:32] Farrah: Yes, asking a lot of questions and taking a lot of initiative that maybe people wish I hadn't taken.
[00:04:39] Adam: In big agencies, that is very unpopular.
[00:04:43] Farrah: Yes, although it also taught me a little bit about the virtue of asking for forgiveness instead of permission, and just getting on with it. "They will like you when you win" sort of became, for good or ill, an operating philosophy that I have.
[00:04:59] Adam: I love it, I love it.
[00:05:01] Farrah: Yes.
[00:05:02] Adam: Well, thank you for that background; I think it sets people up to understand what kind of a great talk we are about to have. The topic that you gave us is really interesting to me, so we are going to talk a little bit about politics today, but we are not going to talk about the candidates. We are going to talk about the other side of it: specifically, the way we are measuring intention, the way we are doing voter research, the way we are essentially treating user research practices so poorly in politics.
[00:05:37] Farrah: Yes, exactly. I think the thing that has been piquing my interest: obviously, as a researcher, I'm always having to reorient myself to be objective about the project that I'm working on, which is helping my client fix what's broken, or make more money, or see growth, or whatever it is they're trying to do. Because I spend time in the field with their customers, it is far easier for me to become an advocate for them and think about their needs first, so a big part of the synthesis and analysis process for me is bringing all of it together.
But as I've been observing what I think of as political research, which most people encounter as polling or those Frank Luntz focus groups they do on CNN sometimes, first of all, my delicate design researcher sensibility is offended by the practice and the methodologies.
[00:06:32] Adam: There are a lot of sensibilities that are offended by those.
[00:06:35] Farrah: Yes. What Frank Luntz calls a focus group is not recognizable to me as a focus group, but I appreciate that it is something that is done. I've sort of backed into investigating this a bit more, and I suppose it starts with my working assumption about the people I interview: that they are not stupid.
[00:06:59] Adam: Well, that's actually a huge piece of any kind of research. A lot of people go into it, and, since we are talking about politics, I think the pollsters and the media have a sense that everybody else is dumb, that the US public might be stupid, and when you start from that place nothing good can really come from the conclusion you're trying to draw.
[00:07:22] Farrah: Yes, I think you start from dumb, or gullible, or lazy and disengaged. Those are kind of the default presumptions about the average voter; they have a term for this, which is the low-information voter. If you are a news junkie you spend two years watching the run-up to the presidential election, the first year and a half of which talking heads will constantly remind each other and the home audience that most people are not paying attention to this yet, which may or may not be true.
I wouldn't know for sure, but I would guess that's probably true. Who wants to start thinking about it two years out, especially in the last two elections, where it has been such a crowded field of people that maybe you've never heard of before?
[00:08:09] Adam: It's like watching the first three weeks of Top Chef: a lot of these people are not really important, I don't need to pay attention just yet.
[00:08:18] Farrah: Exactly, exactly. Wait till the first cut, and you can check in after that. That's one set of defaults. What I think is interesting is we now have a new set of pejoratives for people who respond to political polls, and that is that they are possibly trolling the pollster, that they are giving the answer they think will piss off the political opposition, as opposed to honestly reflecting what they have to say.
I encounter these things in market research and design research with clients from time to time, who are concerned about different groups of respondents. One is the professional respondent, the person who signs up for every research project because they make money off of it. I used to think this person was a fiction, until I actually met a respondent whose e-mail address was something like "email@example.com."
I was like, "Wow, this is what you do for a living? You take surveys? You are the first person that I've met who fits that description." So they do exist.
[00:09:27] Adam: Did you go back in your mind to all the times clients had brought up that concern to you and you shrugged it off?
[00:09:33] Farrah: Yes, I did feel like, "Well, in 20 years of doing this I've only encountered that person once." My instincts are still basically right that there are so few as to be vanishingly small, but you do enough focus groups and interviews over a lifetime and I suppose you will eventually encounter one. I suppose in politics that would be a similar kind of person: a person who will, in fact, answer the phone during the dinner hour to take a 20-minute survey from a political pollster.
[00:10:04] Adam: On a landline.
[00:10:05] Farrah: Yes, on a landline, exactly. And so we can discount them, because they are not like other people, or they're not like what we think other people are like. Another group of people my clients have objected to is the dominant respondent, the person who has too much to say and seems to know too much about the topic, or pretends to know too much about the topic.
[00:10:24] Adam: I love it. We want answers, but we want the exact right quantity of words and-
[00:10:29] Farrah: Yes, exactly. We don't want-- The other day, during the hurricane in Florida, they stopped a guy on the beach in Miami and asked him what he made of all this weather, why he was still sticking around, and why he was down on the beach, and he gave this very technical response about wind speeds and direction and how many degrees north of the 80th meridian Miami was, and people were sort of like, "Wow! Fox News just inadvertently found the smartest man on the street ever. Why doesn't Jimmy Kimmel ever find those people?"
But I know my clients would be sitting in the backroom suspecting that that guy was full of it. That he was making things up for a bit, that he just wanted to impress people, or that he was some sort of dominant respondent and therefore not the ideal. So you have-
[00:11:18] Adam: Even if he was correct? You rule it out as an outlier. You say, "Well, let's rule that guy out and we'll rule out the bottom too."
[00:11:24] Farrah: Exactly. So we have those types of people, I suppose, in politics. These are the people who do what I did today, which is go and vote in a local primary election. I voted for five civil court judges and a district attorney today. Those are people who, according to national pollsters, are seen as also kind of outliers. They're engaged voters who vote in every election and have an opinion about everything. So we sort of discount them as well.
Then with all of that, you've got the assumption that users are dumb, that they don't understand what you're trying to accomplish, and that's their fault and not yours for failing to communicate it or failing to design it properly. That is something I've dealt with more in communications research than in product research: the belief that if someone doesn't get my vision, or get the idea for the campaign, they were recruited badly, as opposed to, no, they are representative of a subset of your customers, and they don't get it, and you need to do a better job of connecting with them. They don't need to do any more work to connect with you. They're busy.
[00:12:37] Adam: As it relates to politics, Farrah, are you more concerned about people's understanding of the political process itself, or the way candidates message and connect with voters or populations?
[00:12:55] Farrah: Yes. I think I've become a little bit less interested in how things are messaged, and more in how ordinary people access politics. What is it that brings them into it? What are the things they actually care about? How do they evaluate a candidate or an issue? What are the mental frames, the values, the priorities, the experiences, the metaphors that matter, that they're bringing to the party when they enter a voting booth, or when they respond to a political pollster, or when they watch the news?
These things seem to me not as well understood as we pretend they are, and I think primarily because we rely on political pollsters for information about what voters do, and increasingly we are also relying on big data analytics from social platforms about what people click on. There are certainly some good hypotheses to be formed from both of those types of sources.
And I'm certain that there are nonprofit groups, like the Pew Trusts, the Poynter Institute, the Knight Foundation, and various others, funding efforts to try to understand groups of voters better, or groups of citizens better, or certain issues better. But I think there's a sort of shocking lack of a moment where all of the political pollsters flip over from being predictors who are held to a standard of accuracy to being explorers of probability, of understanding the underlying system upon which all of this voting is actually based.
[00:14:44] Adam: Now, that's a huge distinction. Let's talk about applying design thinking there. If we were recreating this whole structure from scratch, what would the approach with design thinking be? We're thinking from the point of view of the user, and we're thinking about the problem they're trying to solve. We're trying to get to the motivation for why they're trying to solve it. Correct?
[00:15:06] Farrah: Yes. This is exactly the thought experiment I've been pitching to a few people, about a side project that I'm undertaking now, which is: imagine that for some reason it's that moment in, what was it, Back to the Future 2? Where we now draw the alternate timeline. So everything else is the same, but one party has disappeared.
If you were to design a new party from the ground up, and you didn't have some sort of anchoring point of the existing party, and it was not a re-positioning or a re-packaging of, let's say, the Democratic Party, but a totally new party: what would you do, with everything else the same today? I think because of the rise of things like Lean Startup thinking and Agile methodologies and design thinking, all of these things offer tools that are lightweight, iterative, prototype-oriented, but that also privilege thick analysis: going and looking early on at real people in their real experiences and getting real feedback about real stuff.
So instead of going and asking, I'll use their term, a low-information voter in a swing county in Ohio a year ago March, "What do you think about the TPP?", a document they no doubt had not read and that had not been covered in any real depth, in an accessible way, on normal network news. How are they supposed to respond to that question? Asking those questions feels a little bit like walking up to somebody in, you know, Amish country and asking them what they think about Uber. It's like, why not-
[00:16:53] Adam: They wouldn't like it either. It'll be unpopular there as well.
[00:16:56] Farrah: Probably true. But first you have to explain what Uber is, right? And that is not a thing that, generally speaking, happens in these kinds of survey instruments. So the thing you would start with, first of all, I think, is some desk research. What's going on in that county? How is that county shaped by economic forces, by demographics, by the environment? What are the political forces at play locally? What's gone on historically? How has that district or that county changed? Who are the types of people in it that we would want to go talk to, who are kind of representative of different groups?
And then you would think about those groups less from a demographic perspective, which is what traditional market research does because it's media-buying focused, and the way politics does things because that's also media-buying focused, and instead from a designer's perspective of what kinds of things people are trying to accomplish.
So people who are trying to solve the problem of sending their kids to college, or people trying to solve the problem of getting a better job or any job, or people who are trying to solve the problem of immigration status for extended family, or who are concerned about the local watershed, or whatever the thing might be that they are actually doing something about.
One of the things that we do in early-stage customer development design research is look for people who are cobbling together clunky solutions to an existing problem at great expense, and we find out what's up with that problem, and what's up with those clunky solutions, and why they are going to this great effort to solve these problems.
One of the things I've been thinking a lot about as kind of an interesting first topic is just the subject of voting. And maybe, you know, it's also relevant because I voted today. We take this for granted as a comparatively simple thing to do, but it's not simple. You have to register. You have to have ID the first time you go and vote. You have to know where the polling place is, and sometimes that changes. You have to know who's on the ballot and what they stand for.
Literally today, I had to select as many as five people for civil court judge, because we have six civil court judges. So, you're voting for five people at once. How do you make those selections? How do you find out anything about a civil court judge? There's not enough time or money in the world for all 12 of the candidates to put their messaging in front of me.
[00:19:20] Adam: There's no media campaign for those seats.
[00:19:22] Farrah: No. Although I tweeted that I voted, and now I'm being followed around on Facebook by a city council candidate who is not in my district- [laughs]
[00:19:31] Adam: So he gets it yes.
[00:19:36] Farrah: But in any case, if you think about voting as actually a fairly high-friction activity, not a low-friction activity, then it changes the way we think about why people don't vote. I find it hard to believe that as many as 50% of the eligible voters in the United States don't vote because they don't care. I might believe some of those people are disenchanted and have been turned off from the process, but that's not the same as not caring.
I think a lot of people don't vote because they don't know how. They don't know if they can. They don't know where. They can't get there. Canvassing prior to the election, just helping people figure out where their polling place was, reminding them about when it was happening, and helping them make a plan to vote, was illuminating, because you wind up in situations where you can't talk about voting without also talking about the criminal justice system. "I was arrested once, can I still vote?" Well, was it a misdemeanor? "Yes." Then you can vote. In some states-
[00:20:36] Adam: And it becomes more and more complex, and then you're dealing with-- the disenchantment becomes one more layer on the friction, and the confusion becomes one more layer on that.
[00:20:47] Farrah: Exactly. It's part of a bigger system. You can't really talk about voting by itself, because it overlaps with churches and schools, jails and courthouses. It overlaps with transit systems and highways and rideshare and access to other kinds of transportation. It overlaps with what kinds of jobs people have. Can they get time-
[00:21:09] Adam: Yes, right, workplace policy. Exactly.
[00:21:11] Farrah: -and child care, health care. We talked to a couple of people who weren't voting because they were invalids; they couldn't leave their apartment. They should have been given absentee ballots. All of these things converge to affect whether or not someone can vote. A design researcher would start to map out all those intersections and overlaps and start to identify where they think the solvable problems are, the most important solvable problems.
You might then look at quantitative data to back up the size of any of these problems. How many people are affected by a perceived inability to vote because they don't know that an expired driver's license is not necessarily a reason they can't go down and vote?
[00:21:58] Adam: Would you go into this, Farrah, with the assumption that the average user's desired outcome is to vote and to have participated, or would you go into it with the assumption that the average user is neutral on voting?
[00:22:15] Farrah: I think that I would probably go in with neutral. There is a possibility people don't want to vote. You would need to find out what that's about, too. So, there's always a possibility of using all of these perfectly legitimate reasons as pre-text for not voting. That it's a huge hassle and also, I don't want to. When really, it's, "I don't want to and also, it's a huge hassle."
[00:22:40] Adam: Right. That's a subtle difference but a big difference, at the same time.
[00:22:43] Farrah: Exactly. I think you'd want to start with: let's just talk about what it takes to vote. "Have you voted recently?" "What was that like?" For other people, "What do you think it's like? What do you think are the steps involved in doing this thing?" I've found this in other scenarios: I've done work on arthritis and asthma and the HPV vaccine, and these kinds of conversations are incredibly illuminating about how much work people think it is to get their problems resolved.
Just the belief that it's 15 steps when it's really only two (and I'm not saying it's two in those situations), the belief that it's a far more onerous process than it actually is, is enough for people to say, "In a trade-off of all the things I'm responsible for and care about in the world, this I can manage. I don't need to solve this problem."
[00:23:41] Adam: Now, that's huge. And trade-offs spark something here, because I wonder-- back where we started the conversation, you wrote an article about this on Medium. What's the organization, again, that gets critiqued for their crazy questions and polling people online-
[00:24:02] Farrah: Public Policy Polling in North Carolina.
[00:24:04] Adam: Yes. They have a non-traditional approach, because they're asking questions that get to the root of the answer they want, versus the direct "What do you think of the TPP?" Do you think, for some of these trade-offs, we'd have to have multiple polling types to get to multiple diverse populations and multiple motivation sets? Correct?
[00:24:28] Farrah: Yes.
[00:24:28] Adam: You can't just continue going the way we're going, with just landlines or online only, because I know that's where the trolling is most feared, anyway.
[00:24:37] Farrah: No, I think you do have to-- this is true in any market research: there's still a place for random-digit dialing of landlines. There's still a place for guys with clipboards in shopping malls, intercepting people where they are. Different methodologies, in that sense, of how you reach people, not just in online polls but in others. How you design the sample is another part of it. Who are the people that have opted in to PPP's panel? I have no idea.
There are critiques of their methodology and the way they assembled their sample and all of that. But I think we have that challenge in any kind of research: the types of people we're trying to reach, and what is the best method of reaching them? "Best" is a combination of factors: what's most expedient, what's most affordable, but also, "Are we really going to find the types of people we're looking for if we use this approach or that approach?"
Over and above that, though, I think there's another example. A couple of years ago, I want to say in 2015, there was an episode-- FiveThirtyEight actually had a podcast that was a data podcast, as opposed to a politics podcast.
[00:25:56] Adam: What's the Point?
[00:25:57] Farrah: Yes, What's the Point, exactly. They did an episode about religious polling, and about one of the struggles of religious polling being that not every community uses the same language and framework to describe its community of practice. Think about some communities where going to church does not necessarily mean going to a prayer service or listening to a sermon; it means a book club, or Bible study, or a barbecue. There are a million reasons people go to church in different types of communities. Whereas there are other communities where congregants show up once a year for a particular holiday and never come back.
Using the language of going to church: first of all, you'd have to swap out that word for mosque or temple, depending on who you're talking to. You might have to modulate your understanding of what it means to be churchgoing in communities where people don't go to churches per se; they go to meeting halls, and they go to each other's homes, where prayer takes place elsewhere. You might also have to separate out, "When are you going to church for prayer services?" and "When are you going for some other, more social and community-based activity?"
That really was interesting to me, because it means you can't actually field the exact same survey instrument in every religious community. You have to have different instruments. In which case, analysis becomes, not impossible, but harder, because you're not comparing like things. What you wind up with, then, is a lot more emphasis placed on the analysis and synthesis part, which requires a great deal of sensitivity.
That means that in addition to all of that survey taking, you need to have people who've actually spent time in those communities, gotten to know those communities, and understand at least a bit more about how the community actually functions. The interesting thing about PPP, I think, is that I don't think they get any greater insight into the underlying beliefs that people have, but I do think they're very good at measuring the breakthrough of strange messages. [laughs]
The weird things that have registered with different communities and different people's mindsets, that's very interesting, but it's only good for that. Instead of looking at it and saying, "Oh my god, all these people think that Hillary Clinton committed some crime, or Donald Trump did something," we should instead say, "Hey, that message really broke through. A lot of people heard that headline somewhere, or a lot of people think that's believable."
Not necessarily that that's why they made the decision, or that they really think it's true. If you asked them in a qualitative environment, "Do you really think that's true?" [laughs] you don't know what they would say to that. But it is an interesting-
[00:29:02] Adam: Do you think that it's a media issue, or-- I guess it's a combination: the media blowing up a polling issue or a research issue, where it's maybe a flawed methodology, or a single-- I shouldn't say flawed. I guess I should say a single methodology that yields a particular result that gets reported by the media as a fact, versus, "Well, according to these people who were reached by landline, or these people intercepted at a restaurant, or these people who answered on Facebook, this is what they said."
[00:29:35] Farrah: I think it tends to be picked up because it feeds a narrative that already exists about people's gullibility, or about the outrageous things that the American voter believes or is willing to say they believe. I also think that, in the same way-- I don't know what your personal political persuasions are, but there are elements of my family and friends who are deeply conservative and are willing to believe all kinds of things I would consider to be crazy about the left. And at the same time, I see somebody like, God love her, Rachel Maddow doing, "Exclusively tonight, we have this release from PPP about the latest crazy things those people believe."
That's the same thing. It's fueling a belief that I, on the left, have about how crazy their beliefs are. The thing about that, then, is it's not really helping [laughs] us do anything. It's just sort of saying, especially the way it's portrayed, "See how crazy they are," not, "What is going on in the culture? What is going on in the way we've covered the issues? What's going on in the way the information is transmitted? What's going on in people's mental frameworks and metaphors for politics? What's going on with people's value sets that makes these things more or less believable, or makes people want to associate themselves with these ideas in a semi-public way?"
And I don't think that's the best we can do. There's this sort of headline-- what was the thing the other day, about people in a Fox News poll who said they believed that the national media was more damaging to American society than white supremacists were? That's a great headline, and especially if you believe that all Fox News polls are biased, then it fits your narrative: conservatives are racists and hate the media. So it's a-
[00:31:37] Adam: According to some people.
[00:31:39] Farrah: Yes, yes. Some people, some of whom are very fine people.
[00:31:44] Adam: I think we all watch the news the same way when the narrative the media is putting out there is the dumb, gullible, lazy, disengaged voter. It's an offshoot of an optimism bias: almost all viewers think, "That's all those other 349 million people, but not me. Right? That is not who I am." I wonder about the other 349 million; if we are all thinking that, what is happening?
I have to laugh or else I'd cry, so I apologize.
[00:32:17] Farrah: Well, and I think that's the opportunity: to find out what is happening. Why are people answering that way? There are a lot of ways-- just take that poll, for example. That's an interesting one to reckon with. One is, let's set aside the assumption, which is probably false, that the Fox News polling organization is the same thing as Fox and Friends in the morning. It's not the same thing. Let's just take it that the methodology was more or less sound, and so we don't need to quibble over that any longer.
The question then is: when people answer that question, what do they have in mind? What were they picturing when they imagined the national news media? Who is included in that cohort? Is it just the failing New York Times? [laughs] Is it just CNN-
[00:33:11] Adam: Sad.
[00:33:13] Farrah: Are there other players incorporated in there, and within that cohort, how does trust rise and fall across different outlets? Where did you get that idea? In what way are they ruining American culture and society? Again, I'm a qualitative researcher, so I have questions when people say things like that. Who says? Who ruins it for you? How does that ruining impact your daily life? What do you do to solve that problem? Who's in that cohort? You know, what do you need?
On the flip side of it, you know, there are a lot of ways of reading the comparatively low belief that white supremacists are bad, and that's-- there are people who believe that white supremacists are a made-up thing by the left, that there are no white supremacists. I mean, Roger Stone was on Al Letson's Reveal the other day, basically saying the Klan is not real. It's made up of criminal informants for the FBI. It's a fiction, it does not exist. This is just a make-work program for dodgy people.
There are other people like me, who are of a certain age-- and I'm from Oregon originally-- where I have to wake up every day and install the patch on the phrase "white supremacists" in my brain, because when you say, "white supremacists," I see skinheads in Northern Idaho. Like, that's what I picture. And the guys in Charlottesville the other day don't look like those guys, in general.
So, what are we talking about when we talk about a white supremacist? There are people for whom that's a strange term that evokes a different era and a particular group of people. That might be part of their thinking: those people have long been gone, what are you talking about? Or maybe a belief that we're comparing to those people, who have no, you know, influence.
[00:35:06] Adam: No voice anymore. Yes, exactly. And I have two thoughts. One, we should compare Overcast playlists, because I have a feeling we listen to the same things-- I heard that Reveal episode as well. Two, I think it's context. I think what's missing, what's broken, in political polling and political research is really that context. Part one, what messages have you received? Part two, why did that message break through to you? And then, further down the line, what other pieces have you received, or what other pieces have the influence to get through, and what do you intend to do with that information?
[00:35:50] Farrah: Yes. This is where, I think, there's a need to-- and it certainly helped me-- understand the business realities of political pollsters, which is that they are expected to achieve a lot, like running polls that are constantly in the field during an election cycle. And now, because of this feeling that the whole world has been turned upside down, they're having to continue to do that, except that a lot of the time the people who pay for polls are newspapers and media outlets. Newspapers are still not doing that well [laughs] financially, so there's not so much money to go around for that.
So, there's a lot of pressure to make a lot out of not very much in terms of resources. On top of all of that, the main difference between political polling and marketing polling, even though it is sort of fundamentally the same practice, is that if Microsoft wants to launch enterprise software, it doesn't happen on November 9th every four years, with everyone making the same decision on the same day about which piece of business intelligence software to install in their corporation. And it's not just down to: is it going to be Microsoft, or is it going to be SAP or IBM, and that's it, there are no other choices, it's either that or we stay home and don't do business.
And so, everything is about prediction. And then, you rise or fall in terms of credibility amongst your clients, which are campaigns and political organizations and news organizations, based on your ability to accurately predict the outcome of an election-- not just calling who won or lost, but calling the margins. This is a comparatively high-stakes game that they play, compared to the market researchers and design researchers who are less engaged in accurate prediction of outcomes and more engaged in trying to understand the underlying systems and help clients make better choices. It's more like-
[00:37:52] Adam: Context and direction versus final results.
[00:37:56] Farrah: Right. In which case, what we do then informs the business analysts or CFO or CEO or COO, who then says we're setting this goal for next year or next quarter, and then these are all of the things that we have to do in order to meet that goal. It's a complete reversal of the workflow. It seems to me that there is an opportunity space to do the work market researchers and design researchers do that would eventually help issue-based organizations, advocacy groups, candidates, and campaigns make better decisions. But it also might, as a hypothesis, help news organizations tell better, more fulsome stories about what's really going on in any given electoral district.
[00:38:45] Adam: Assuming they want it, right?
[00:38:48] Farrah: Yes.
[00:38:50] Adam: There's always that.
[00:38:50] Farrah: I do want to give them a little bit of credit-- I've worked with these organizations. I've done some work with NBC News and with The New York Times, and they do want to understand. They have limited resources, they have limited time, and they tend to be busy with other things: natural disasters and wars and consumer reporting. The rest of the time, making it through a news cycle is kind of a race to catch up with all the things they have to catch up with, and this past cycle was even harder, because there was a guy who parked himself in the TV and wouldn't get out for 18 months or so, and still won't get out.
They were a little preoccupied. I think they had to get over that preoccupation. But I think they do genuinely want to understand what's happening with their viewers. Those are the people they are also trying to serve, and I don't doubt their intentions. But they don't have the tools for it, and they do ultimately also have to serve up something that is entertaining. And I think one of the-
[00:39:47] Adam: On a schedule. Yes exactly.
[00:39:50] Farrah: Exactly. On a schedule. And there’s an institutional belief that anything that is too explanatory is eating your peas and not having dessert. And they believe, once again, that most people are lazy and would rather have dessert than peas. There's a lot to unpack and a lot of interesting opportunities, I think, to do better work there, I expect, for those who want it. [laughs]
[00:40:14] Adam: Yes, I think that is the question. I think we've gone overtime. I think we're going to have to set up a part two of this thing, because there's so much more to discuss, and I'm really enjoying it. I want to ask you one parting question, which is: "We have an election coming in about three and a half years. Do you think polling will improve, or do you think this is just an institutional rut that we're caught in?"
[00:40:39] Farrah: I don't know. I think that for companies that are primarily fielding quantitative, survey-based polling data, some methods stay functionally the same, and that's primarily because, as the guys at FiveThirtyEight frequently point out, they have generally had high predictive value. So the questions that they ask and the sampling methodology that they use have a history of being right.
That said, there's also a tendency on the part of any marketer-- but political marketers are perhaps the most susceptible-- to throw everything out when all of their beliefs are challenged at once. Toss everything out and begin again. The problem is, I don't think they really begin again from a totally clean slate. Right, they still begin again from a belief that people are stupid, lazy, disengaged, or lying. [laughs] So, that's not helpful.
[00:41:34] Adam: [laughs]
[00:41:35] Farrah: I think there will be some organizations that are going to be more sensitive about the way they ask questions. I think there is going to be another look at what is a likely voter and what is not a likely voter. I think some assumptions about how influential demographic trends are on turnout, for example, have been called into question. So there will be some changes, I think, in sampling more than anything else and, maybe, some differences in the types of questions that are asked. My worry is that everything becomes a little bit more like PPP [laughs] and a little less-
[00:42:08] Adam: Right. It becomes Super Bowl Wednesday where it's just all zaniness and no hard science.
[00:42:13] Farrah: Yes. But maybe I can come up with something to help. We'll see how it goes. [laughs]
[00:42:20] Adam: Okay, you're on the clock. Yes, exactly.
[00:42:21] Farrah: Okay, [laughs] exactly.
[00:42:23] Adam: Well, Farrah. Thank you so much for joining me. I really, really appreciate it. I know this is a little different than what we've done so far with the show, but I loved it. So I'm assuming our audiences will as well.
[00:42:34] Farrah: I hope so. Thank you so much for having me.
[00:42:36] Adam: Hey Farrah, they can find you on Twitter you're @FarrahBostic.
[00:42:39] Farrah: That's correct.
[00:42:40] Adam: They can also find The Difference Engine which is @differenceNGN. Is that right?
[00:42:46] Farrah: Yes, that's right.
[00:42:48] Adam: And the new side projects that you mentioned-- do you want to talk about them, or are they not ready for reveal yet?
[00:42:54] Farrah: Not quite ready for reveal yet, but soon. [laughs]
[00:42:58] Adam: Awesome. Well if you share them with me, I'll share them with the group.
[00:43:02] Farrah: Wonderful. Thank you.