The midterm elections have come and gone, but Arizona voters may still not understand what they actually voted for.
This year Ballotpedia, a non-partisan voter education organization, released a ballot readability report evaluating every state proposition on the midterm ballot. Based on its scores, an Arizona voter would need more than 20 years of education – the equivalent of a doctorate – to understand the five propositions on this midterm's ballots. The high scores are no anomaly, however. They mark the peak of a decade-long trend in which the language of Arizona ballots has grown steadily more complex.
Ballotpedia’s report analyzed ballots using the Flesch-Kincaid readability tests, which evaluate text based on the number of words, syllables and sentences used. The score reflects the approximate number of years of education someone would need to understand the text: a score of five means roughly five years of schooling, and so on.
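For readers curious how such a score is computed, the standard Flesch-Kincaid grade-level formula weighs average sentence length against average syllables per word. The sketch below is a minimal illustration of that formula, not Ballotpedia's actual tooling; the syllable counter is a rough vowel-group heuristic, whereas production tools typically use pronunciation dictionaries.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels.
    # Real readability tools use pronunciation dictionaries instead.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    # A trailing silent 'e' usually doesn't add a syllable.
    if word.endswith("e") and not word.endswith(("le", "ee")) and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    # Standard Flesch-Kincaid grade-level formula:
    # 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Because the formula looks only at word and sentence length, a proposition written in long sentences full of multisyllable legal terms scores far higher than the same idea stated plainly.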
Ryan Byrne, one of the staff writers for Ballotpedia who created the first ballot readability report in 2017, said the report was generated out of curiosity and in hopes of better informing their readers.
“A lot of people intuitively know that ‘wow this is kind of difficult to understand’ so we wanted to provide some number, some evidence to confirm or reject that suspicion,” he said.
According to Byrne, the readability test in particular was chosen because of how the test evaluates readability.
“Ideas are complex. We can’t actually measure ideas in any kind of objective light,” he said. “So we’re currently relying on formulas that primarily look at language and the structure of language.”
Given the test’s use and acceptance in the academic community and by the federal government – it was originally created to help craft manuals for Defense Department employees – Byrne said it was only fitting that other government documents, like ballots, be evaluated the same way.
Using the test as a standard, the reading level required to understand ballot titles and summaries in Arizona has shifted significantly. The average score for ballot titles has increased by 64 percent over the last decade, from 14, the equivalent of an associate's degree, to 23, the equivalent of a doctorate. The average score for ballot summaries has doubled, increasing from 14 to 28.
Richard Herrera, a political science professor at Arizona State University, said that a score of 16 or more is too high and contradicts the idea that ballots should be written at “a complete but accessible level for most adults.”
In a state where less than a third of the population over 25 has a bachelor’s degree or higher, according to U.S. census data, scores over 20 mean the majority of voters are faced with propositions worded in ways they might not understand. This discrepancy between the education level of registered voters and the reading level of ballots could have serious consequences.
Ballots with complicated language could skew results, whether because voters misunderstand a proposition or because they skip propositions they don’t understand.
“The issue of readability may mean some voters do not cast votes on those ballot proposals or they may vote based only on what ads suggest is correct,” Herrera said in an email interview.
A 2011 study of over 1,200 ballots, conducted by political scientists Shauna Reilly and Sean Richey, came to a similar conclusion: the more difficult the language – that is, the higher the reading score – the higher the rate of voter roll-off, meaning more ballots were cast with ballot measures left blank.
Herrera, however, couldn’t give a reason for the trend.
“Some ballot measures are very complicated and may be more so than others,” he said in an email. “Other than that, I do not have any evidence that there are other, political, reasons.”
The office of the Arizona Secretary of State, which is responsible for writing ballots, could not be reached for comment.
For the foreseeable future, Byrne says Ballotpedia intends to keep producing an annual readability report as long as there are statewide issues to evaluate. While the reports aren’t targeted at any specific audience, Byrne hopes they will at least “reach everyone that’s interested in ballot measures.”
Whether their viewers are voters, media, interested individuals or government officials, Byrne says the report has value because there are few others doing similar research.
“I think by providing some kind of standard analysis of ballot readability, it’s something for them to take into consideration because they may or may not currently be taking it into consideration,” he said.