In my last post, I discussed the unreliability of the press as journalists rush stories to print without checking facts and invent “facts” when truth takes too long to uncover. Often this is simply sloppiness and impatience—sometimes it’s something much more damaging and calculating.
But untruths in the press are not always entirely the fault of the reporters and fact-checkers. Sometimes the powers that be are to blame, feeding the media press releases that are less than truthful.
PRESS RELEASES: THE OPPOSITE OF INVESTIGATION
“American Planes Hit North Vietnam After Second Attack on Our Destroyers; Move Taken to Halt New Aggression,” read the headline of the Washington Post on August 5, 1964. As a result of this reported attack on our navy in the Gulf of Tonkin, President Johnson ordered retaliatory strikes on North Vietnam. But the report was not true; there had been no second attack on our navy in the Gulf of Tonkin. Relying entirely on government press releases, the media spread a deception designed to win the American public's acquiescence to escalating the war. It later came to light that journalists of the day had considerable information contradicting the official account, but that information was never used. It was not until late 2005 that the documents outlining the truth of the incident were officially declassified and made available to the public.
In this case, the media was not to blame for disseminating the false information, although had they drawn from sources other than the official ones, some of the truth might have been made known. Asking pertinent questions might have helped to alter events; who knows, perhaps public outcry might have brought the war to an end much sooner.
But not all press releases have global consequences. Sometimes there are consequences only for one average American family. Here’s a personal account of how the media completely misrepresented a terrifying accident in my own family.
My husband works at an airport and is trained both as a police officer and as a crash/fire/rescue emergency worker. While most of his week is spent in law enforcement, once or twice a week he is assigned to a truck in the fire hall and responds to aircraft emergencies. One night, responding to a potential crash site, he drove his 40-ton truck out of the fire bay, and as he turned onto the runway it began to tip and groan; then, off-balance, it crashed onto its side, its momentum nearly causing it to roll completely over. Every bit of glass in the structure shattered even before impact, and if my husband had not been wearing his safety belt he would have been flung out of the side window and then crushed to death as the twisted metal landed on him. As it was, he suffered internal injuries from the safety belt and from the stress of impact, but he was treated in the ER and released the same day. I received the phone call from the chief that every police officer's wife dreads, but as bad news goes, it was better than it could have been. We felt blessed that the accident had been no worse.
And then the morning paper came out.
The article stated that my husband had been the driver in a single-vehicle crash that destroyed an immensely expensive piece of airport equipment. “Officer Ross has tested negative for alcohol; the drug test results are pending.” And that was it.
Sounds pretty condemning, doesn't it? If you read that article, you'd have no doubt in your mind that the accident was due to recklessness on my husband's part, probably the result of drug use on the job. It wasn't entirely the fault of the reporter who wrote this damning bit of news; police officers are not permitted to speak to the press, so the reporter had to rely on the airport's Public Relations Department for information. And the airport was very happy to let my husband take the blame for the accident.
A little digging, however, would have turned up enough additional information to cast at least some doubt on my husband's culpability. Alcohol and drug testing are standard procedure after any accident, and regulations required that the tests be done; no one at the airport ever seriously thought that my husband might actually have been under the influence of anything. Furthermore, footage from the video camera inside the truck showed that it had been traveling at only 14 miles per hour at the time of the accident, hardly reckless driving. The resulting investigation showed that the airport had, against the manufacturer's warning, outfitted the vehicle with a two-ton boom for spraying fire-deterrent foam, making the truck top-heavy and increasing the load on the suspension, but had not reinforced the suspension to accommodate the added weight. Naturally, the suspension weakened and eventually collapsed, nearly killing my husband as it did so.
It was difficult enough as a family to deal with an accident which could so easily have been fatal, without also having to ward off accusations of culpability. Our children were old enough to understand what was happening, and it was very upsetting to us all. The press is very happy to assume that a police officer is at fault when an incident occurs, and it is not inclined to take the time to investigate the possibility that someone else might be responsible. Journalists also never seem to consider the families of police officers and how false reports might affect them.
STATISTICS: MARK TWAIN WAS RIGHT!
Numbers and graphs always seem to lend an element of veracity to an article; after all, numbers don't lie. Mark Twain would disagree, and he famously credited Benjamin Disraeli with the proof (an attribution historians now doubt, though the line endures): “There are three kinds of lies: lies, damned lies, and statistics.” Anyone who deals with statistics for a living, or who teaches statistics at a university, will tell you that one can make statistics say almost anything, whether it represents the truth or not.
The fact is, statistics exist for the purpose of estimation; they were never meant to be taken as literal fact. Stats track trends and help businesses and governments make educated guesses, but they are rarely, if ever, exactly true. Take a census, for example. In 2014, the population of the United States was reported to be 318,857,056. Was that number literally true at any point in time? Probably not, given the number of births and deaths that take place every moment of every day, the number of uncounted, displaced persons on the streets, and the inevitable errors made by census takers.
Here's another way to make statistics say two very different things. Doctor A and Doctor B both work in the ER at the same hospital. Forty-six percent of Doctor A's patients die under his care. Sixteen percent of Doctor B's patients die under his care. Which doctor would you rather have treat you if you have to go to the ER? What if I told you that Doctor A is an expert in severe injury cases, so all of the worst accident victims are placed under his care; his patients are dying when he gets to them, and yet 54% of them live because of his expertise. Doctor B is new and inexperienced; he is given the easiest cases to treat, and yet 16% of his patients don't make it. Now whom do you want to be your doctor? Numbers can tell you only so much; the most important bits of information cannot be quantified with statistics.
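The doctors' numbers above are a textbook case of what statisticians call Simpson's paradox: a comparison can flip once you condition on the kind of case each doctor handles. Here is a minimal sketch in Python, using hypothetical patient counts (my own assumption, chosen only so the overall rates match the 46% and 16% figures in the example):

```python
# Hypothetical patient counts (assumed for illustration), picked so the
# overall death rates match the example: 46% for Doctor A, 16% for Doctor B.
records = {
    "Doctor A": {"severe": (95, 46), "routine": (5, 0)},   # (patients, deaths)
    "Doctor B": {"severe": (5, 3),   "routine": (95, 13)},
}

overall = {}   # overall death rate per doctor
by_case = {}   # death rate per doctor, broken out by case type

for doctor, cases in records.items():
    patients = sum(p for p, _ in cases.values())
    deaths = sum(d for _, d in cases.values())
    overall[doctor] = deaths / patients
    by_case[doctor] = {kind: d / p for kind, (p, d) in cases.items()}
    breakdown = ", ".join(f"{kind} {rate:.0%}" for kind, rate in by_case[doctor].items())
    print(f"{doctor}: {overall[doctor]:.0%} overall ({breakdown})")
```

With these made-up counts, Doctor A has the worse overall death rate yet the better rate in every category of patient; the single overall number hides the fact that he takes the hardest cases.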
One often sees two very different things compared in the press to make a point, but false comparisons don't represent reality. For example, here's a news report: there are twice as many vehicle-related fatalities in the U.S. in one month as the U.K. sustains in an entire year. Are drivers more careful in Great Britain? Is it safer to drive there than in the United States? Consider the scale first: there are 254 million vehicles registered in the United States compared to only 34 million in the U.K., reflecting the great difference in population between the two countries: 319 million in the U.S. as opposed to 64 million in the U.K. Measured per capita or per vehicle, the gap between American and British drivers is a fraction of what the raw totals suggest.
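Taking the report's premise at face value, a back-of-the-envelope calculation with the population and vehicle figures quoted above shows how much normalization changes the picture (a sketch, not a road-safety study; the only input beyond those figures is the report's own "one U.S. month equals two U.K. years" claim):

```python
# Figures quoted in the text above.
us_pop, uk_pop = 319e6, 64e6            # population
us_vehicles, uk_vehicles = 254e6, 34e6  # registered vehicles

# The report's premise: one U.S. month sees twice the U.K.'s annual toll,
# so a full U.S. year sees 2 * 12 = 24 times the U.K.'s total.
raw_ratio = 2 * 12

# Normalize the same ratio per person and per vehicle.
per_capita_ratio = raw_ratio * (uk_pop / us_pop)
per_vehicle_ratio = raw_ratio * (uk_vehicles / us_vehicles)

print(f"raw annual ratio:  {raw_ratio:.0f}x")
print(f"per capita:        {per_capita_ratio:.1f}x")
print(f"per vehicle:       {per_vehicle_ratio:.1f}x")
```

Even granting the report its numbers, a 24-fold raw gap shrinks to roughly five-fold per person and three-fold per vehicle: the raw totals overstate the difference several times over.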
We've all seen those weaselly advertising claims: “up to a 99% success rate,” or “nine out of ten experts agree.” Hardly anyone reading these stats considers that every success rate from zero to 99 percent is, technically, “up to 99%.” And as for those “nine out of ten” experts: who are these people? How many experts were polled? Where did they live? How were the questions worded? Was every response counted, or just the ones the pollsters liked? So many variables! How can we take such information seriously?
Frankly, numbers and statistics are fairly meaningless as they are normally used in the media. The truth cannot be reduced to a number—reality is always so much more complicated than math.
Sorry—this project has grown into a huge production! Part Four should finish this topic for me, and then I will rest easy.