Liar, Liar, Pants on Fire, Hang Them Up on a Telephone Wire

"2 Men Looking On" from the "Perched" series by artist Jodi Chamberlain. Her multimedia work can be found at http://www.jodichamberlain.com (just click on the painting). Copyright by Jodi Chamberlain; used with permission of the artist.

By way of an introduction, I did some Internet research on the origin of the expression “Liar, liar, pants on fire, hang them up on a telephone wire” and found this reference to an 1810 poem by the English writer William Blake:

 The Liar

Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Shall they dangle in the night?
 
When I asked of your career
Why did you have to kick my rear
With that stinking lie of thine
Proclaiming that you owned a mine?
 
When you asked to borrow my stallion
To visit a nearby-moored galleon
How could I ever know that you
Intended only to turn him into glue?
 
What red devil of mendacity
Grips your soul with such tenacity?
Will one you cruelly shower with lies
Put a pistol ball between your eyes?
 
What infernal serpent
Has lent you his forked tongue?
From what pit of foul deceit
Are all these whoppers sprung?
 
Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Do they dangle in the night?

If you do your own Internet research (and of course you do; you’re probably verifying this on Google right now), you’ll find numerous references to this Blake poem, some of them in fairly reliable places. There’s only one small problem: William Blake didn’t write it.

Additional research has yet to turn up the actual author of the poem or its place and date of origin. I did find, however, that the word “alleged” had at one time been attached to the claims of Blake’s authorship. Some of those who later reposted the poem and its origin myth carefully edited out the word “alleged,” perhaps to boost their own authority. I’ll leave it to the Blake scholars to hash out why this could or could not possibly be an obscure verse from the poet who gave us “Tyger tyger, burning bright”—which, when you say it out loud, does share some strong poetic similarities with “Liar, liar, pants on fire” after all.

• • •

I am constantly amazed and dismayed by the number of people who both post and spread unproven statements and “facts” on the Internet. Quite a few of these people are close friends, many of them remarkably intelligent. Even so, they often clamor to be the first ones to expose some new controversy or conspiracy via Facebook or Twitter. Maybe it’s the journalist in all of us, desperate for a scoop, but if there’s one thing I learned while staying with my ailing father and watching endless episodes of Judge Judy, Judge Jeanine, and Judge Joe Brown, it’s that hearsay is irrelevant in a court of law.

Let’s stop a moment to take a look at that word, hearsay. We hear something, then we say it. From ear to mouth in an almost direct line that bypasses the brain. It’s a key ingredient of soap operas and celebrity gossip, yet it also sadly infects the serious discussions of the day’s major issues. Perhaps in this technological age we need new words for hearsay, such as readshare or seetweet. “I read those statistics on someone’s Facebook wall, so I immediately ‘shared’ them by reposting on my wall.” “I saw a sports commentator on television talking about that big trade that might happen, and so I tweeted that it actually had because he sounded so convincing.”

Another classic example is the Reverend Martin Luther King, Jr. quote that went viral immediately after the killing of Osama bin Laden. Here’s the sentence in question: “I mourn the loss of thousands of precious lives, but I will not rejoice in the death of one, not even an enemy.” I have no problem with the quote itself, but King never said it. All credit is due to Jessica Dovey, who prefaced an actual King quote with that sentence. In its original form, she even set her own thought outside of the actual quotation, clearly distinguishing her words from his. (The teacher in me can’t help but point this out: Punctuation matters!) As the quote ricocheted around the Internet, however, the quotation marks got knocked off and the two speakers’ sentiments became one. Since the Internet never forgets, there are now thousands—if not millions—of lingering misattributions.

After this incident, I began to wonder just how many of those inspirational/motivational “posters” that people share contain actual quotes from the pictured sources. With today’s technology, anyone with a basic knowledge of photo editing can attribute just about anything to anyone and fool millions in the process. Just look at all those often-shared fake Ryan Gosling quotes. It’s all in good fun…until someone starts attributing hateful or hurtful language to him. (Chances are pretty high that someone already has.) Then what started as parody or satire slides uncomfortably across a blurry line into libel and slander.

I know I come across as a killjoy here, but on matters like this, I reveal my true colors as a classic skeptic. I reserve judgment until I can ascertain the facts of the matter, and I remain open to the possibility that objective reasoning may lend credence and support to multiple sides of an argument, even those that might contradict my own previously held beliefs. Like William Blake and many other artists of the Romantic era, I seek a direct link to the primary source—though, devout skeptic that I am, I question his subsequent claims to a mystical intimacy with a supremely divine being. Perhaps his position in literary history—at the crossroads of the Enlightenment (or Age of Reason) and the Romantic era (or Age of Emotion)—describes my own inner identity crisis as a writer. I would also argue that it describes the manner in which America currently teeters as a political force in the world, and our casual (not causal) relationship with the truth seems to be a reliable measure of that tension.

As an educational writer, I am often governed by a remarkably strict and lengthy set of mostly objective state standards and guidelines, even as I am asked to write fairly sentimental pieces, such as articles about developmentally challenged children overcoming adversity. Age of Reason, meet Age of Emotion. For a recent series of writing assignments, I was required to provide two credible and reliable sources for each sentence (yes, each sentence) in nonfiction articles. When and where possible, I was to cite the primary source—in other words, an eyewitness account or an original text, not some second-hand or third-hand citation or discussion. After all, elementary school standards (i.e., grades one through six) stress the importance of primary sources in research. This is not “elitist, ivory tower” graduate-school-level academia stuff.

In my own college-level classrooms, I likewise impressed upon students the necessity of primary sources in validating claims, especially as we grappled with sensitive and controversial issues such as abortion, capital punishment, and religious freedom. By extension, I reminded them that primary sources were crucial in evaluating the claims made by advertisers and politicians, both of whom often seek to twist the truth to suit their own agendas. By further extension—and hoping to avoid turning those young people from skeptics into cynics—I reminded them that they needed to evaluate the primary source itself and take into account any conflicts of interest. After all, the makers of Brand X are likely to promote studies that show that their product is more effective than Brand Z, especially when those studies have been planned and paid for by the makers of Brand X.

I had hoped that this kind of education, common in most schools and colleges, would serve the students well in their adult lives and help them to make objective, reasonable, and justifiable decisions on all matters great and small, from the election of presidents to the selection of a new toothpaste. (Is it more important to remove plaque, avoid gingivitis, eliminate bad breath, or have whiter teeth? Honestly, a skeptic can spend hours analyzing the endless varieties of toothpaste offered by each brand these days.) I wanted students to realize the great potential and power they had to sort through all of the distractions and clamor around them in order to get at the truth, even if that earned truth challenged their preconceptions and ideological dispositions. In other words, I wanted them to be open to new information and perspectives, even if this led them to change their minds. After all, the human capacity for change and adaptation is a key aspect of our forward-moving evolution. Yes, I said it—and if you don’t believe in change or evolution, enjoy the view from your rut.

In the end, however, it appears that preconceptions and ideological dispositions still get the best of most of us. On social networks, people on the left post and spread unverified stories and statistics that seem to prove their political points; people on the right post and spread unverified stories and statistics that seem to prove their political points. Some of these are easily proven to be hyperboles, misstatements, or outright lies; others are carefully twisted and redirected versions of the truth (the process we all know as “spinning”). In nearly all instances, however, the posters (and re-posters) risk looking either naïve, misinformed, ignorant, or just plain stupid when the claims are analyzed and the facts made clear.

That risk—and the willingness of so many people to accept it for the most trivial of reasons—fascinates me. Checking the validity of some of these claims and stories takes less than a minute, especially if you’re clued in to sites such as snopes.com and politifact.com. (Disclaimer: These sites are good places to check first, but on their own they are not definitive arbiters of right and wrong. They can, however, provide valuable leads in a search for those coveted primary sources I mentioned earlier.) Despite that, there is still the temptation to hit that “share” button, which takes but a split second and provides that little hit of adrenaline that information junkies seem to crave. After all, if they wait too long, someone else might scoop them and ruin their status as the first (if not primary) source of breaking news. Take that, Fox News and CNN!

(Topical side note: The death of Whitney Houston on February 11 was a prime example of this. Even before reports had been confirmed, dozens of people raced to post the story as fact on their Twitter and Facebook feeds. Perhaps they all wished to be, if I may twist the New York Times’s catchphrase slightly, the “tweeters of record.”)

In many instances, perhaps, the risk of being wrong is offset by a firm belief that we are right. Even if the statement is later shown to be false, the underlying “truthiness” of it remains. (Thank you, Stephen Colbert, for having introduced that word into the public lexicon. We clearly needed it.) To explain this more plainly, the truth doesn’t need to actually be true as long as it truthfully supports what we truly believe to be true. This is why politicians can make wild claims in our legislative chambers (such as the “fact” that Planned Parenthood’s primary mission is to provide abortions) and later claim “no harm no foul” because the easily disproven “fact” was simply alluding to a basic “truth” anyway. (If I had to allow such reasoning in my classes…well, I couldn’t and wouldn’t. It undermines the entire premise of education, not to mention making a travesty of logic and reason. Civilized people simply don’t entertain this kind of idiocy.)

When it comes to believing the unbelievable (or giving credence to the incredible), it’s not just the lies that people tell that intrigue me—it’s also the lies that they hear and subsequently believe or let pass unchallenged. This takes us back to the crux of the reposting problem: people believe and spread untruths because, in some strange way, they WANT the untruths to be true somehow. To such believers, testing the claim would be evidence of weakness or a lack of confidence: If it feels or sounds right to you, it probably is—no further research necessary. In other words, if you’re a cynic, pointing out hypocrisy justifies your cynicism. If you’re paranoid, spreading conspiracy theories justifies your paranoia. If you’re (liberal/conservative), propagating the latest (conservative/liberal) scandal gives you that higher moral standing that you just know your side deserves. If you hate Company A or Team B, then that damning news about Company A or Team B simply MUST be true.

So let’s bring this full circle. Over the years, I’ve come to believe that you can learn a whole lot about a person from the lies he or she tells. Further, you can tell even more about that person from the lies he or she is eager to believe, especially at the risk of reputation and social standing. The more entrenched the belief, the less likely the person is to respond to reason. As a result, dissonance becomes an unavoidable consequence, and with that dissonance comes discomfort, anger, and sometimes violence. In all the hullabaloo, reason and logic are often the first casualties.

Perhaps a first-hand case study would help to illustrate the point. At a town-hall-style meeting a couple of years ago, Senator Bernie Sanders was taking questions from the crowd. One man, most likely a member of the Tea Party, asked him to explain why the government keeps raising his taxes. Sanders pointed out to him that President Obama had just signed into law tax cuts for the majority of wage-earners, but this man was insistent: His taxes had gone up. Sanders tried to reason with him, pointing out that he appeared to fit the demographic of people who received a tax cut, but no, there was no reasoning with the man. His taxes had gone UP. The discussion, as Sanders subsequently remarked in his typically brusque style, had reached an impasse and could not continue in that forum.

Had the man’s taxes gone up? Probably not. But here’s a pet theory I have about all of this. Had the amount of money being taken out of his weekly or monthly paycheck gone up? That seems likely, especially since many companies have been increasing the employee co-share of medical benefits as health insurance costs continue to rise. The disgruntled man might easily have lumped all of these paycheck deductions under the simple heading of “taxes,” giving him evidence, however misinterpreted, for his claim and belief. And what legislation had been introduced to try to deal with the problem of rising medical insurance costs? Why, the very legislation that members of the Tea Party are dead-set against.

The man at the town hall meeting was angry, and he wanted that anger to be heard and validated. When the facts didn’t serve that purpose, he was faced with a dilemma. He could change his mind, or he could stand his ground.

His final decision: ignore the facts. Therein lies the root of the word ignorance: someone who can be shown the truth yet insists on the lie, someone who can be told that something is fiction and yet still believes that it’s fact. In this respect, ignorance is a worse trait than naïveté or stupidity.

Just consider the synonyms for the feeble-minded: words like dull and dim. Ignorant people are hardly dull or dim. They carry the full force of their beliefs behind them. Their faith in one unwavering, indisputable view of the world can be both violent and catastrophic. After all, when one person runs through the village with his or her pants on fire, he or she can spread the flames to every surrounding building. That makes the practice an emergency worthy of public concern.

Luckily, we are all able firefighters. With all this “Information Age” technology around us, the truth is closer to each and every person than it ever has been in the history of mankind. And yet, the most powerful tools can often be used both to create and to destroy. A hammer can help build a hope chest for our dearest possessions, but it can also break a window or kill a man. One post can spark the overthrow of dictators and the rise of democracy, but it can also ruin careers and destroy a country’s economy.

I’ll end this section with a paraphrase of Tim Gunn from the reality show “Project Runway”: “Use the wall thoughtfully.” He was talking about shelves of fashion accessories, but he might just as well have been talking about Facebook. Whatever we do, we should do it thoughtfully, with care and consideration. To do otherwise would be, quite simply, uncivilized.

(NOTE: There is a second and perhaps even third part to this post already underway…)

Saying Goodbye to Santa Claus

Spoiler Alert: Santa’s “Big Secret” revealed in this blog entry.

Exhibit One: Me with Santa Claus at home in the 1960s, proof positive that he exists.

Now that Santa has flown in, tucked gifts under trees both hither and yon, and headed back to the North Pole for some well-deserved R&R, I feel it’s time to take a look at one of America’s biggest myths and think about how it may have affected us as a nation…or not.

But first, in the spirit of the holiday season, I offer a nostalgic visit to my hometown in Massachusetts circa 1970. Picture plastic candles in each street-facing window and a lacquered pinecone wreath adorned with a festive red felt bow on the front door. If you peer in through the spray-on snow frosting the windows, you can see me carefully filling a plastic garbage bag with dozens of gifts. My parents watch, slightly puzzled but mostly silent, as I pull on my snow boots and mittens, then leave the house, bag slung over my shoulder.

A week or so earlier, I had learned a shocking truth that rocked my little world—a secret that had been kept by nearly every adult I had ever met. They had lied to me, these adults. People whom I had trusted entirely, including the local minister and my own parents, had taken part in an international conspiracy and perpetrated a myth, a fantasy, a fiction. The story included a conveniently distant setting, a saintly protagonist (whom I had met in person on several occasions), and a desirable plotline that evoked grand themes of peace, good will, and generosity. To cover their tracks, my parents had even planted evidence: sleigh bells jangled as sound effects in the wee hours of Christmas morning; cookie crumbs and half-drunk tumblers of milk left on the metal TV table set up alongside the chimney.

All of this was an elaborate scheme that blurred the lines between fiction and nonfiction, between fantasy and reality. Young and gullible, I was easily duped. Of course there was a Santa. Of course reindeer flew. Why even question the physics of how, in one single night, a rather rotund man could pilot a craft to every single household around the world and leave presents for all the good boys and girls—and still have time to toss back some cookies and sip some milk in each abode?

In asking me to believe in such fantastic things, my parents taught me an important lesson that would be vital to my budding literary ambitions: how to suspend disbelief. In doing so, however, they taught a corollary lesson: how to suspend belief. In other words, in order to suspend my disbelief in Santa Claus, I also had to suspend my belief in many of the lessons learned in grade school (science, geography, math, etc.).

In some ways, then, the revelation that Santa Claus was a fabrication probably came as something of a relief to me. The dissonance between fantasy and fact, between what I was being told to believe and what I was learning to be true, lessened. That psychological summary may be a bit too deep to ascribe to an eight-year-old’s consciousness, so let me state it another way: Santa or no, the presents were still there on Christmas morning, and so all was well with the world.

Luckily for me, my parents didn’t serve up the revelation about Santa Claus with a simple “Sorry, kid, but that’s just the way it is.” They discussed the importance of symbolism and how this extended to the Santa myth, claiming that while Santa himself may not be real, the spirit of giving that he represents lives on in the hearts and souls of all those who have heard his story. Any fan of the famous “Yes, Virginia, There Is a Santa Claus” newspaper editorial might have accused my parents of plagiarism, but I could tell they were sincere.

Still: Such power in a fictional tale! Suddenly, my dream of becoming a fiction writer one day became a vastly more important, almost religious endeavor. See how the power of a story, even a fictional story like that of Santa Claus, could have such a great positive effect on the real human world!

And so I headed off into the night with my makeshift Santa sack. Inside I had placed carefully wrapped toys and books for the kids, and on the second and third winters’ visits, some ribbon candy for the adults in each household. I carried on the tradition until, one year, something unexpected happened. Some families had wrapped and readied gifts and treats for me. Somewhat embarrassed by their assumption that I expected something in return, I ended the Christmas Eve tradition that same year.

For years, I forgot about this bit of personal history. I was recently reminded of it by an article about a Vermont teacher accused of being unprofessional and irresponsible for spilling the beans about Santa in a fifth-grade classroom. The teacher had asked students to list names of famous people in American history. In order to keep the lesson focused on facts, the teacher felt compelled to leave figures such as Winnie the Pooh, Harry Potter, and Santa Claus off the list. (I could not tell from the article if she allowed the also-mentioned Jeff Foxworthy and Justin Bieber to remain on the list, but that’s another discussion for another time.)

(The full article is here:
http://www.reformer.com/ci_19576402?IADID=Search-www.reformer.com-www.reformer.com)

The mother who raised the “unprofessional” and “irresponsible” charges against the teacher went on to say that teaching about Santa Claus was like teaching about religion: the topic is best set aside with recommendations to ask one’s parents about such things. That seemed fair enough…until I thought about the goals of education in general.

Since a good part of my day job (writing and editing educational materials) relies on the various state standards developed by school boards (many of them quite conservative) around the country, I know that “learning to distinguish between fantasy and reality” is a pretty important benchmark in the lower grades. (Keep in mind that the instance noted above took place in a fifth-grade classroom.) In other words, children are required to differentiate between nonfiction and fiction (fairy tales, myths, legends, and the like). Teachers are required to provide students with the skills and strategies to do this. By fifth grade, then, your average American student should have the reasoning skills to figure out the Santa thing on her/his own. Any parent who disagrees risks spotlighting their children as slow learners—perhaps along with themselves.

According to research done by psychologists at Ithaca College and Cornell University in the 1990s, the average American child learns the truth about Santa at age 7 1/2. However, after interviewing 500 elementary-school children, they discovered that “Many children kept up the charade after they knew the truth…because they did not want to disappoint their parents.”

Parents, take a moment to reflect on the meaning of that last clause (no pun intended). Your kids may be duping you into believing that they still believe in Santa. I think back on my own behavior as a pseudo-Santa and wonder if that was, in some warped way, an effort to turn the lies my parents had told me into truths…ergo, my parents had not lied to me after all.

Further, Dr. John Condry, one of the authors of the Ithaca/Cornell study, reported, “Not a single child told us they were unhappy or upset by their parents having lied about Santa Claus. The most common response to finding out the truth was that they felt older and more mature. They now knew something that the younger kids didn’t.”

(You can read more about the study here:
http://www.nytimes.com/1991/11/21/garden/parent-child.html?pagewanted=2&src=pm)

This finding surprised me. “Not a single child”? Parents, take another moment to think about telling your child that he or she cannot have a toy or candy bar that he or she has already selected while you were shopping at the grocery store. When you took the item away, was your child calm and well-mannered about it? Or was the response similar to those submitted for a recent Jimmy Kimmel spot in which the talk show host asked parents to tell their children, “Hey, sorry, I ate all your Halloween candy.” (Permissions permitting, the videotaped results of this rather non-academic study are here: http://www.huffingtonpost.com/2011/11/03/jimmy-kimmels-ate-halloween-candy-challenge_n_1074334.html)

In a 2006 opinion piece in the New York Times, Jacqueline Woolley wrote, “Children do a great job of scientifically evaluating Santa. And adults do a great job of duping them. As we gradually withdraw our support for the myth, and children piece together the truth, their view of Santa aligns with ours. Perhaps it is this kinship with the adult world that prevents children from feeling anger over having been misled.” What is this “kinship with the adult world” of which Woolley writes? Is it the tacit understanding that adults lie, and that it is OK for them to lie (or “support a myth”) on a grand scale?

(The link to the Woolley article is here:
http://www.nytimes.com/2006/12/23/opinion/23woolley.html?_r=1&oref=slogin)

Surely someone sees this Santa thing differently. For balance, I turned to a group whose opposition to myths and distortions is part and parcel of their identity: the Objectivists. This group is huge these days with Republicans and the Tea Party, both of which have renewed a fervent interest in the writings of Ayn Rand, particularly as they apply to self-determination and self-interest. Surely, the somewhat socialist “give liberally to the poor children of the world” Santa myth (I base that description on the story’s historical roots in relation to Saint Nicholas, who, by the way, was also the patron saint of pawnbrokers) would be anathema to such a group. And it is.

According to Andrew Bernstein, a senior writer at the Ayn Rand Institute, “Santa Claus is, in literal terms, the anti-Christ. He is about joy, justice, and material gain, not suffering, forgiveness, and denial.” Another quote from the article: “The commercialism of Christmas, its emphasis on ingenuity, pleasure, and gift buying, is the holiday’s best aspect—because it is a celebration, the achievement of life.”

(You can read the full piece, a celebration of the commercialism of Christmas, here:
http://www.aynrand.org/site/News2?news_iv_ctrl=1263&page=NewsArticle&id=7632)

All of this leaves me as puzzled about Santa Claus as I was when I learned the dark secret of his nonexistence. To this day, I give presents that have “From Santa” scrawled on the tag, and I try to mask my own handwriting despite the fact that the recipients know they’re from me. Likewise, I love surprise presents: gifts that appear out of the blue from anonymous sources, those random acts of kindness that rekindle our faith in human generosity. (Special kudos to Ben and Jerry’s for a coupon they once published that granted a free ice cream cone to the person in line behind you at one of their scoop shops. Brilliant.)

The spirit of Santa lives on and is no lie. It survives despite the increases in greed and entitlement—both running rampant through our society today, malignant cancers that threaten human compassion and generosity. I’d even argue that the spirit of Santa, despite its secularization over the decades, also maintains its ties to the spirits of nearly every religion, even those that claim independence from mythology or dogma.

In the years ahead, perhaps we can pull that spirit back from fiction and establish it fully as year-round fact. After all, nearly every child longs for Santa to be more than a seasonal fantasy. Maybe it is up to the child within us adults to make it so.

Postscript: I dedicate this blog entry to my father (pictured above as Santa) who passed away in 2011 and was very dearly missed this Christmas season. His many gifts to me continue to resonate throughout my life.