A recent Huffington Post article highlighted research findings suggesting that Fox News viewers are less informed than all other news consumers.
Researchers from Fairleigh Dickinson University conducted a nationwide phone poll asking nine political knowledge questions, four on international issues and five on domestic issues. The results show that Fox News viewers correctly answered an average of 1.04 questions on domestic policy, lower than NPR listeners (1.51) and people who watch only The Daily Show with Jon Stewart (1.42).
Does this mean that Fox News viewers must be less politically sophisticated than NPR listeners? Not at all. To paraphrase Boudreau and Lupia (2011): many people mistakenly treat a respondent’s ability to answer trivia questions as a reliable indicator of their ability to perform politically relevant tasks.
Before I begin my argument, here is the list of questions, word for word, as asked by the experimenters:
1. To the best of your knowledge, have the opposition groups protesting in Egypt been successful in removing Hosni Mubarak?
2. How about the opposition groups in Syria? Have they been successful in removing Bashar al-Assad?
3. Some countries in Europe are deeply in debt, and have had to be bailed out by other countries. To the best of your knowledge, which country has had to spend the most money to bail out European countries? Open ended (looking for Germany).
4. There have been increasing talks about economic sanctions against Iran. What are these sanctions supposed to do? Open ended (looking for nuclear program, uranium enrichment, or WMDs).
5. Which party has the most seats in the House of Representatives right now? Open ended (looking for Republicans).
6. In December, House Republicans agreed to a short term extension of a payroll tax cut, but only if President Obama agreed to do what? Open ended (looking for Keystone XL Pipeline, Canadian pipeline, State Department Review of Pipeline or Expedited Environmental review.)
7. It took a long time to get the final results of the Iowa caucuses for the Republican candidates. In the end, who was declared the winner? Open ended (looking for Romney)
8. How about the New Hampshire primary? Which Republican won that race? Open ended (looking for Romney).
9. According to official figures, about what percentage of Americans are currently unemployed? (Looking for 8.1%)
There are several things wrong with asking questions like these to measure “political knowledge.”
Firstly, political knowledge is not the same as political trivia. Does one need to know that Romney won the New Hampshire primary in order to know that Romney is the front-runner of the Republican Party? No. Many people may not pay attention to the presidential primaries precisely because they are a horse race; they would rather invest their attention once the field is down to two competitors. Does this make them bad citizens? No. It means they have busy lives.
This study also rests on a big assumption: that the researchers picked the best possible questions to measure political knowledge. I would disagree. What makes these questions so important? Why not ask “Which Republican candidate is a Mormon?” or “What major terrorist was killed in May of 2011?” My point is that the questions chosen were arbitrary. It is also worth noting that if the most informed citizens answered an average of only 1.51 out of 5 questions correctly, perhaps the questions did not adequately capture what “being informed” means.
Studies like this also confuse interest in politics with political knowledge. Not everyone is interested in foreign policy, and understandably so. Foreign policy decisions often do not affect a person at the local level, and there is a high “price of entry”: one has to be familiar with other countries’ histories and forms of government, and most of the time an individual has only one vote with which to influence foreign affairs. In other words, it is simply not worth the time or effort to follow world affairs when one is mostly unable to influence them.
A voter’s lack of interest in foreign affairs, however, says nothing about their interest and participation in other political issues. In fact, it does not even bar entry to high local political positions, as former Governor Sarah Palin demonstrated. (I highlight her because the example is well known, but I am sure there are others in local government who know little about foreign policy and still manage to be good at their jobs.)
When assessing political knowledge, we should instead be asking questions like “Are there a few political issues you have an opinion on, and if so, can you identify the political party that supports your position?” That is the kind of information that can show whether someone has enough political sense to be an informed voter. Most people do know their party’s general platform and do vote for the party that is in sync with their personal feelings on the big issues. It is nonsense to say that people who cannot answer trivia questions are bad citizens, uninformed, or, as studies like this insinuate, stupid.
Apart from these theoretical problems, the survey is filled with structural problems. The coding of the data is problematic: it does not fully allow partial answers or take into account all plausible responses. If, when asked about the economic sanctions against Iran, a respondent answered that their purpose was to “stop terrorist activities,” they would have been coded as incorrect, even though stopping terrorism is partly why the sanctions exist.
The survey also allows a “don’t know” response, and people answer “don’t know” for reasons other than lack of knowledge. Studies have shown that some individuals are more likely than others to guess when they are unsure, whereas others are (pun intended) more conservative and do not like to give answers unless they are absolutely certain.
Miller and Orr (2008) designed a survey experiment in which the first group’s questions encouraged “don’t know” responses, the second group’s discouraged them, and the third group had no “don’t know” option at all. They found that discouraging “don’t know” led to an increase in correct answers per respondent. In fact, the group with “don’t know” omitted outscored the group with “don’t know” encouraged by 9%. It may very well be that Fox News viewers are simply more risk averse or less confident than NPR listeners or Daily Show viewers.
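To see why guessing propensity alone can move these averages, here is a minimal simulation sketch. All numbers in it are assumptions for illustration (not from the study): both groups genuinely know the same 15% of answers, a blind guess is right 25% of the time, and the groups differ only in how often they guess rather than say “don’t know.”

```python
import random

random.seed(0)

N = 10_000          # simulated respondents per group (assumed)
K = 9               # questions asked, matching the survey's nine items
P_KNOW = 0.15       # assumed: both groups truly know 15% of answers
P_LUCKY = 0.25      # assumed: chance a blind guess happens to be right

def average_score(p_guess):
    """Mean correct answers per respondent when those who don't know
    an item guess with probability p_guess instead of saying 'don't know'."""
    total = 0
    for _ in range(N):
        for _ in range(K):
            if random.random() < P_KNOW:
                total += 1                      # genuinely knew it
            elif random.random() < p_guess and random.random() < P_LUCKY:
                total += 1                      # lucky guess
    return total / N

cautious = average_score(p_guess=0.1)   # rarely guesses when unsure
bold = average_score(p_guess=0.9)       # almost always guesses

print(f"cautious group: {cautious:.2f} correct of {K}")
print(f"bold group:     {bold:.2f} correct of {K}")
# Knowledge is identical by construction, yet the bold group scores higher.
```

The gap between the two printed averages comes entirely from willingness to guess, which is exactly the confound Miller and Orr’s “don’t know” manipulation points at.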
In conclusion, political trivia surveys are to political knowledge what IQ tests are to intelligence: both are arbitrary quizzes made up by scientists. Neither adequately measures all aspects of the construct it claims to assess. Both are limited by how the questions are asked and coded, and by which questions are asked in the first place. This recent “study” provides no evidence that viewers of Fox News are less politically savvy than viewers of other news.
Political scientists need to stay up to date on the methods they are using. There are better ways to measure constructs than survey questions, and researchers should refrain from conducting studies like this with a clear ideological agenda. Most of all, political scientists should stop confusing political knowledge with political trivia.
Boudreau, C., & Lupia, A. (2011). Political knowledge. In J. Druckman, D. Green, J. Kuklinski & A. Lupia (Eds.), Cambridge Handbook of Experimental Political Science (pp. 171-183). New York: Cambridge University Press.
Miller, M., & Orr, S. K. (2008). Experimenting with a “third way” in political knowledge experimentation. Public Opinion Quarterly, 72, 768-780.