The Denver Post has run two guest opinion columns (by a local columnist here and by a syndicated columnist here) relying heavily on a dubious study, financed by the right-wing Koch Foundation and conducted by a UCLA professor, which concluded that “freedom of expression is deeply imperiled on U.S. campuses.”

Yikes! I’ve been seeing conservatives fret about this, and I’ve been waiting for proof. Could this be it?

My kid is in college and my wife works for one. They both love free speech, and they haven’t complained about its demise on campus. And I haven’t seen any data, beyond minor anecdotes ridiculously overblown by the media, supporting the notion that American universities don’t love free speech or are anything but completely dedicated to protecting it and educating students about its value.

Yet, a new poll comes out, and Denver Post columnist Vincent Carroll sounds the alarm in an Oct. 1 column arguing that unlike today, activists in the 1960s and 1970s did not want to shut down hate speech:

“UCLA professor John Villasenor, who conducted the survey, found that a plurality of students believe the First Amendment does not protect hate speech (of course it does) and a majority thinks a school is ‘legally required’ to present opposing viewpoints to a speaker ‘known for making statements that many students consider to be offensive and hurtful’ (there is no such requirement),” wrote Carroll.

“Far more disturbingly, a slight majority also said it was acceptable for a student group to disrupt the speech of a controversial figure ‘by loudly and repeatedly shouting so that the audience cannot hear the speaker,’ while nearly one-fifth said it was acceptable to use violence ‘to prevent the speaker from speaking.’”

The trouble is, no one, not even Villasenor himself, seems to know whether his online, opt-in poll can be trusted.

And it doesn’t comport with most opinion polls of students on the topic, according to Prof. Seth Masket, chair of the political science department at the University of Denver.

“It’s true that researchers use on-line polls all the time,” Masket wrote me in response to my email query. “But generally great care is taken to make sure the sample is gathered or weighted to be representative of the underlying population. It’s not clear to me that this was done in this case.

“Regardless, the fact that this survey produced results that are highly inconsistent with other recent surveys on similar topics suggests that more research is needed. Either this poll is an outlier or there’s been a radical and massive shift in the beliefs of college students. The former is far more likely than the latter.”

I contacted Masket after my jaw dropped in response to reading an extraordinary exchange between Villasenor and the Guardian’s Lois Beckett, in which Villasenor wouldn’t even confirm that his own study was representative nationally of college students.

Beckett reports:

The way the survey results have been presented are “malpractice” and “junk science” and “it should never have appeared in the press,” according to Cliff Zukin, a former president of the American Association for Public Opinion Research, which sets ethical and transparency standards for polling…

Villasenor wrote in an email that he was reluctant to give a yes or no “sound bite” answer to the question of whether the students he surveyed were nationally representative of college students or not.

By some measures, Villasenor wrote, the 1,500 respondents to his survey had seemed to reflect the rough demographic makeup of American college students. By others, they might not.

Asked if he’d seen the Guardian story, Carroll emailed me:

“Yes, I saw the Guardian critique, although not until the day my column was published. It made me uneasy. Then I saw this analysis from the Washington Post, which made me feel better: [See it here.]

“I am not a polling expert, so I am not about to tell you I am sure which point of view is correct. Polls are used in the country far too often to measure the unmeasurable — to assess opinion that either doesn’t really exist or that is so shallow as to be essentially meaningless. I thought that the questions in the Brookings poll by contrast had the potential to provide useful information, which is why I wrote about the results.

Whatever the merits of that poll, I do think, as I wrote in my column, that the desire of some student activists today to cleanse campuses of hate speech and other opinions that offend them differs from the goals of student activists in the 1960s and 70s.”

I agree with Carroll about polls. They seem to be used to generate clicks or to manufacture public opinion more often than to provide meaningful information about what the public actually thinks.

And the article Carroll cites, by Washington Post opinion writer Catherine Rampell, who defended Villasenor’s survey, helped me understand the utility of an opt-in poll, like the one Villasenor conducted. But Rampell’s article didn’t make me feel much better about relying on conclusions derived from Villasenor’s specific poll.

For example, Rampell reveals that Villasenor “had not conducted a survey before,” and yet he was responsible for making sure his sample was representative. And we know from the Guardian that Villasenor didn’t really know if his poll was representative of college students nationally.

Still, Rampell’s column makes the point that these types of polls can be useful, and that any poll can be flawed. In this case, Rampell points out, the events in Charlottesville might have affected the results.

That’s why it’s a disservice for columnists or journalists to trumpet a single poll about any topic, particularly one that’s mostly out of step with other opinion data.

That’s what Masket soberly points out above, and what Tufts University Prof. Daniel Drezner argues here in a Washington Post column about the Villasenor poll.

“So let me be candid: Just as I think everyone was too hasty in trumpeting this [Villasenor] survey, I do not want everyone coming to the opposite conclusion because of this column,” writes Drezner. “Villasenor’s findings warrant some follow-up polling. If his results are substantiated in more rigorous follow-up research, I will be greatly concerned.

“But what concerns me far more right now is the eagerness with which columnists seized on these findings as vindicating their preconceived belief that today’s college students are just the worst. One of the common laments of modern pundits is that today’s college kids are snowflakes who rely on feelings more than logic to jump to conclusions. But it is the commentators who are leveling critiques against today’s college students by relying on arguments as well organized as a Berkeley free speech week. They are the ones who failed to look more closely at a result that they so badly wanted to be true.

“Given the hysteria that this poll produced, I am far less concerned about today’s students than I am about today’s scolds.”

To be fair, Carroll draws on his own experience to make the point that he didn’t see 1960s and 70s activists shouting down speakers, but toward the end of his column, he offers this broader pessimistic opinion:

“Villasenor believes colleges need ‘to do a better job fostering freedom of expression on their campuses,’ while middle and high schools should focus more attention on the First Amendment and constitutional principles. Of course they should, but the likelihood of either occurring is not good,” Carroll wrote.

In truth, most of the evidence we have says colleges in America are doing this now, and doing a good job of it.