How the Quest for Clicks Can Warp Perception

image via communitynews.net

In my last post, I mentioned that I was going to do a deeper dive on a survey published by Pew Research in April of 2024 because there were some distinctly odd things about how the data were presented.  The survey had this title:

About half of Americans say public K-12 education is going in the wrong direction

That title, absent any other information, doesn’t offer any clarity on what the “wrong direction” means.  It could be literally anything — teachers being paid too little or too much, school shootings, crappy textbooks; the possibilities are almost endless.  It might even be based on something demonstrably untrue.  It’s clickbait, and it’s the worst kind of clickbait, because the majority of people will only read the title and go no further — its untethered negativity will inform perception without being troubled with nuance.  And there is some interesting nuance to be found.

The Nuance

While it’s true that 51% of respondents said they thought education was going in the wrong direction and only 16% thought it was going in the right direction, a third of respondents chose “don’t know.”  I have no way of knowing why they chose that response, but the fact that so many of them did might have something to do with the vagueness of the question. I’m drawing this conclusion because when presented with concrete qualifiers, respondents were able to say where they stood.

image via pewresearch.org

Of those possible concrete reasons why the public K-12 education system was going the wrong way, respondents indicated whether they thought each was a major reason, a minor reason, or not a reason.  The reasons were:

    • Schools not spending enough time on core academic subjects, like reading, math, science and social studies.
    • Teachers bringing their personal political and social views into the classroom.
    • Schools not having the funding and resources they need.
    • Parents having too much influence in decisions about what schools are teaching.

These were presented in rank order, with the most-cited reason first.  In the table below, that number is shown in the grey column.  Then the report disaggregated the support for each reason by political party, which seems pretty straightforward BUT is actually very nuanced if you click through to the methodology:

image via pewresearch.org

The survey reported how Democrats and Republicans felt about these issues, but 38% of respondents identified as either Independent or “something else.”  However, since the question also asked them if they leaned Democrat or Republican, and 34% of the Independent/“something else” respondents identified their leanings, the survey folded those results into the party they leaned toward even though those respondents do not identify with that party.  This is problematic because it makes the survey seem a lot more bifurcated and divisive than it would with a third category.  As reasonably intelligent adults, I think we can hold the tension of multiple parties with a variety of opinions; we don’t need it simplified into two categories.

This nuance might just be visible in the table at right, where the data are broken out by conservative/moderate conservative and liberal/moderate Democrat.  I think — without any confirmation — that the “moderate” categories are meant to capture the Independents who lean Democrat or Republican, but this strikes me as a mischaracterization, since the question to respondents was whether they “lean” one way or the other.  A Libertarian wouldn’t call themselves a Republican but would more than just lean right.  I always worry about data if I’m saying to myself, “it’s probably fine?”  There was no place in the actual survey for respondents to identify as conservative/moderate/liberal, so the source for these designations is murky at best, and it again makes the results look very binary when in fact they might not be.

What’s less fine is this: of the people who responded to the survey, 75% did NOT have a child in public school at any level.  This is another example of a survey in which the majority of respondents are not directly experiencing teachers or teaching in public schools at the time of the survey.  Their opinions are not necessarily being formed by current, personal experience but by other things: media reports, social media posts, rumors, and survey headlines read in place of the full reports.

I’m guessing Pew Research did this for clicks; like everyone else, they’re competing to get eyeballs on their reports.  But in education, there is more at stake than web traffic.  It’s irresponsible to use headlines designed to stoke outrage; such tactics are likely to shape perception in negative ways when the truth is a lot more complex.  Breaking results into two categories when three would be more accurate and less divisive has real-world repercussions for students and teachers in the form of funding and other constraints.  If we’re going to read surveys (and we should — surveys are fascinating), we have to commit to reading the whole survey to fully understand the evidence.  And we have to demand that headlines be fair and balanced, even if balance isn’t all that exciting.
