Lies, Damn Lies, and Statistics
An election is coming up, and I recently received a call asking if I would be willing to participate in a survey regarding the issues on the ballot. I said, "Sure, why not?" After all, I'd love to have my voice and opinions represented, since I know I am way outside the norm when it comes to my political stance; the more of us oddballs who weigh in, the better for all the other oddballs out there who feel alone.
So the woman started to ask me questions. Her first few questions were generic—whether I was registered to vote, that sort of thing.
Then she asked me whether I would describe myself as Democratic or Republican. I said neither; I don't like what either party has done to our country. She pressed me, asking if I at least leaned one way or the other, and I said, "Not really." Then she pressed me some more, asking which one I would choose if I absolutely had to. So I answered her.
And that was the end of the survey. I was quite startled, because she had said she was going to ask me my opinion on the issues on the ballot, and as soon as she found out which way I would lean if forced, she stopped the survey.
(I won't say which way I said I would lean, as it isn't very relevant and I don't want to cloud this short essay with issues of party lines. For the sake of argument, I am going to say that it was Republican.)
The obvious implication here is that the survey is deliberately being biased in favor of one party. The researchers have an agenda, and they want to stack the deck as strongly as they can to support it. They are not interested in a true random sample or in getting a cross-party opinion on the issues; instead, they are preselecting respondents who, they can be fairly confident, will give the opinions they seek.
Let me explain. Let's say the issue is one that Democrats strongly oppose. If the researchers poll only Democrats, but don't mention that when they publish their results, they can say, "90% of the voters we surveyed are strongly opposed to this issue." It may well be true that 90% of the people they surveyed oppose the issue, but the survey is tainted from the start, and the result is dishonest: they didn't survey a true random sample, they surveyed only people who would tell them what they wanted to hear.
The dishonesty of this appalls me. They know that the herd mentality often kicks in, especially around politics. If people are told that "90% of the voters we surveyed oppose" an issue, most will tend to think that 90% of the voters can't be wrong, and if they haven't studied the issue themselves, going with the majority feels like the safe bet.
However, if people are told that "90% of the Democrats we surveyed" oppose the issue, that's a horse of a different color: now people can decide whether they want to go along with the Democratic platform or not.
This is a deeply insidious, despicable, and unethical trick that the bottom-feeding researchers (and whoever is paying them) are playing on the unsuspecting public.
That is why I am making this post today: to warn you, if you didn't already know, to closely examine and question any survey results you read. Better yet, don't pay attention to surveys at all; read about, study, and reflect on the issues yourself, and make up your own mind about the best course of action.
Oh, and remember to vote, too!
(PS In case you are wondering about the title of this post: Mark Twain is said either to have coined the line or to have quoted it from Benjamin Disraeli, the British prime minister from 1874 to 1880: "There are three kinds of lies: lies, damn lies, and statistics.")