Tag Archives: History

What’s Old and New about these Book Bans

6 Feb

Luis Alvaz, CC BY-SA 4.0 https://creativecommons.org/licenses/by-sa/4.0, via Wikimedia Commons

While it wasn’t the best book I read as a teen, Richard Peck’s 1995 young adult novel about a suburban town’s attempts to shield its teens from sex, drugs and rock ’n’ roll certainly had the best title summing up the whole idea: The Last Safe Place on Earth. The 1990s are often thought of as a more placid era in America in contrast to today. After all, no politician from an opposing party angrily denied Bill Clinton’s electoral victories, let alone urged a mob of violent citizens to stop the congressional counts of the election results. 

But right-wing extremists embracing both anti-government and white supremacist ideologies bombed a federal building in Oklahoma City, murdering 168 people, including 19 children. A total of seven women’s health workers were murdered and 13 more were injured by shootings, stabbings, bombings or acid attacks perpetrated by anti-abortion terrorists over the course of the decade. (That’s not counting attacks before or after the 90s.) I distinctly remember the day my schoolteacher wrapped up a debate about the death penalty and then flatly refused a student’s request to hold a debate on abortion. “No way. Grown-ups can’t even handle that debate without resorting to violence,” he declared. In the 90s, the culture wars were raging as we, the kids of the Baby Boomers, sat in schools while parents fought over whether we should be allowed to learn anything from the feminists or the gays who had fought and were fighting for liberation. If you ever heard about a proposed book ban in schools or libraries, you could safely guess it came from the Christian Right, opposing anything that didn’t portray premarital sex as sinful, feminists as destructive or queer kids as sick.

The current calls to remove certain books from school libraries are novel only in part. The American Library Association provides statistics on the most frequently challenged books since 1990, and some of the titles and many of the topics on this year’s list remain the same. In the 1990s, Robie H. Harris’s It’s Perfectly Normal was the villain of the hour, while today it’s Cory Silverberg’s Sex Is A Funny Word. Comprehensive sex education has been attacked ever since it was first proposed in America, and 19 states still mandate abstinence-only lessons. Last year’s miniseries Mrs. America deftly showed how Phyllis Schlafly used the power of an enormous mailing list to unite diverse conservative and religious groups across the country in their staunch opposition to gender equality, making them into the massively powerful political force they have become. Judy Blume, who has long been the most challenged author in the United States, wrote about her experience in 1999:

There was no organized effort to ban my books, or any other books that I knew of anyway. The seventies were a good decade for writers and readers. Many of us came of age during those years, writing from our hearts and guts, finding editors and publishers who believed in us, who willingly took risks to help us find our audience. We were free to write about real kids in the real world. Kids with real feelings and emotions, kids with real families, kids like we once were. And young kids gobbled up our books, hungry for books with characters with whom they could identify…

Then, almost overnight, following the presidential election of 1980, the censors crawled out of the woodwork, organized and determined. Not only would they decide what their children could read, but what all children could read. It was the beginning of the decade that wouldn’t go away, that still won’t go away…

But the calls to remove books about the Holocaust and Ruby Bridges today are something new. I can’t speak to the experience of students in the Southern states, where the United Daughters of the Confederacy fought successfully over a century ago to expunge discussions of slavery and human rights from school history lessons about the Civil War. But in the 1990s, it was very easy as a white teen living first on Long Island and then in an Upstate New York town with minimal racial diversity to think that racism existed but was mostly a problem of the past, thanks to the way it was taught. I learned in school how heroic American soldiers had liberated the concentration camps and how heroic Northerners had helped Dr. King end segregation through non-violent resistance. Both stories had happy endings. I never learned about the U.S. government rejecting a ship of Jewish asylum-seekers during the Holocaust. Or about any of the Americans who supported fascism or antisemitism, or the two-thirds of Americans who said German Jews were either fully or partly to blame for their own persecution. Or about violent reactions to racially integrating schools in the Northern states. Or about white flight, past or present. The Oklahoma City bombing was taught as tragic and militia groups were framed as crazy, but there were no lessons about those groups’ ties to white supremacy. The Ku Klux Klan faded from our history books after we finished the chapter on the Civil Rights Movement.

I knew homophobia was everywhere – from my classmates (and the occasional teacher) who used slurs regularly, to national figures who called lesbians degenerate, to the outrage in the local papers over an attempt to start a Gay-Straight Alliance at my school. Such viciousness regarding race seemed to exist only far away. When my mother bought a subscription to the newsletter of the Southern Poverty Law Center, I learned there were hate groups around the U.S. But such statistics were not taught in school and they did not make the front page of mainstream papers, which made me subconsciously wonder how powerful they really were. No mainstream sources were asking me to question why all the neighborhoods I had lived in were all-white, or where those who had so viciously opposed Dr. King had gone.

In the 90s, intersectionality and Critical Race Theory were around but never afforded attention outside of academia. Warren Beatty’s film Bulworth called out the left for having gone soft on human rights and taking Black voters for granted, but it attracted little more than passing popularity among my classmates for its brazen gangsta talk. We wouldn’t have been allowed to watch it in high school on the grounds of foul language.

At the same moment in modern history, my partner was across the Atlantic, sitting in a Catholic high school in Germany, learning in no uncertain terms that his country was responsible for the Holocaust. Here in Germany, book bans have been widely condemned since the 1960s as the work of fascists, as memorialized by Berlin’s Empty Library, seen in the photo above next to the plaque reading, “Those who burn books are capable of burning people.” Susan Neiman’s excellent book, Learning from the Germans, outlines how U.S. municipalities and schools could teach about our own history of racism, sexism, ableism and human rights crises in a way that precludes complacent self-congratulation and nationalism. Proposals echoing such suggestions are the target of so many of the book challenges and vitriolic debates in schools today.

The rise of voices calling out modern racism in the U.S. began in the 2000s when I was in college, where many of my fellow Millennials embraced Michael Moore and Jon Stewart. Such voices were regularly dismissed as fringe by the mainstream media, and you were easily dismissed as a crazy lefty if you mentioned them around certain neighbors or relatives in the post-9/11 era. A college course in genetics confronted me with the faulty science of The Bell Curve, a book I could barely believe had become a bestseller in the 90s. Barack Obama’s first run on the campaign trail left me shocked at how many white voters—Republicans and Democratic feminists alike—openly used racist arguments to attack him and his family in support of their preferred candidates. Discussions of racism in the mainstream gradually increased over the course of his presidency.

In 2015, the year after the first Black Lives Matter demonstrations, the New York Times revealed in a front-page story that the village of Yaphank, a 10-minute ride from my childhood home, was once the site of Hitler Youth camps and still had a whites-only housing policy on the books. In 2018, an in-depth, 10-part report featured in Newsday revealed that Long Island’s four counties—Suffolk, Nassau, Queens and Kings (Brooklyn)—top the list of the most racially segregated counties in the United States. Such mainstream media attention to racist policies that have been there all along is new, and if students in school today are not learning about it, they should be.

Even John McWhorter, a frequent critic of today’s human rights activism, has lauded this mainstream shift as an improvement:

I welcome the increased awareness of the notion of systemic racism. Despite my alarm at the excesses of today’s progressive politics, I’ve never argued the simplistic notion that racism boils down to cross-burnings and white people saying the N-word. I recall sadly a conversation I had, when I was a grad student, with a white woman who was an undergraduate. She said, roughly: “So today, Black people can go anywhere they want, they can do anything they want — what’s the problem?” And she wasn’t terribly interested in an answer. Her question was more of a declaration, what she regarded as just facts, and she felt no civic impulse to even consider otherwise.

Of course, her perspective, then, is alive and well now. Yet an undergrad today would be much less likely to see race matters only that far. The racial reckoning of recent years; the cultural decentering of whiteness; and the airing of what is meant by systemic racism have brought about that positive evolution. The other day I heard some white kids—upper-middle-class New Yorkers—casually referring in passing to systemic racism while walking down the street from school, clearly thinking of it as an assumed concept. I was hearing no such thing in my grad student days. Gallup polling asking “Are Black people in your community treated less fairly than White people?” in situations involving the workplace, shopping, dining out, interactions with police and access to health care, shows that from 1997 until 2021, white Americans and Americans overall became more aware of racial disparities.

Whether it’s a backlash to more probing lessons about racism or a decades-long effort to marginalize queer citizens, restrictions on libraries always threaten democracy. The current efforts to curtail human rights discussions by removing resources on history from schools in the United States are a crisis. But we should never ignore the evidence that the seeds of this crisis were sown long ago.

Because We Gotta Keep Telling the True Stories in Dwarf History

1 Sep

Painting by Anthonis van Dyck (public domain image used via)

 

Leaving you this week with a must-read feature in the New Zealand Herald: “The Civil War Soldier with Dwarfism Who Was Gifted to the Queen.” Following the extraordinary life of British man Jeffrey Hudson, the article quotes historian Dr. John Woolf, who points out that Hudson’s being handed over to Charles I’s wife as a present was not unusual at the time:

Dwarfs were around in the courts of Ancient Egypt, China and West Africa. Alexander the Great (356BC-323BC) gathered a whole retinue of dwarfs. The Romans collected dwarfs as pets, placing some in gladiatorial rings to fight with Amazons, and tossing others across the amphitheater for entertainment. By the Middle Ages, dwarfs were kept side-by-side with monkeys, sometimes traveling between royal households in birdcages.

I never learned that in school.

Through resources provided by Little People of America, I became aware around the age of 12 of the 20th-century circus freak tradition to which so many dwarfs were left to turn. As a teenager watching period films and documentaries that romanticized the days of beautiful people darting between horse-drawn carriages and candlelight, I grew increasingly suspicious that none of what I saw would have been imaginable* back then for someone who looked like me. My own research later confirmed those suspicions. It’s time the rest of the world started talking about it.

 

*Aside: As noted before on the blog, period films rarely depict what life truly would have been like for any of us. Invariably, Victorian women are portrayed wearing makeup while too many pre-Victorian kings are portrayed without. Not to mention that a third of us would have died in childhood, never surviving long enough to make it into the history books alongside Charles I. During his reign, you were most likely to die of smallpox. Play this game to find out what long-forgotten diseases would have killed you in other time periods in the West.


Misremembering What “Great” Looked Like

2 Apr

Portrait of Isabella of Portugal, workshop of Rogier van der Weyden (public domain image used via)

 

How much of a story about life in the good old days is fact and how much is fiction? In the HBO miniseries John Adams, a mob of Patriots attack a British customs officer, strip him naked and cover him in tar and feathers. The scene shows the victim slathered in asphalt tar – a substance that did not exist in the 1770s. Mobs instead used pine tar, which is brown instead of black, but filmmakers of course knew that modern viewers would not recognize it as easily as they would asphalt.

Such artistic license is arguably negligible, and John Adams deserves distinction as a period drama that is predominantly accurate, rendering its characters and indoor scenes as gray and as musty as life was before electricity and indoor plumbing. Most filmmakers prefer to embellish period dramas, opting for audience appeal over historical accuracy. In the 2002 film version of The Importance of Being Earnest, the Victorian protagonists serenade their beloveds with an upbeat jazz number, which is the equivalent of playing disco music in 1945. And for most of the story, Colin Firth and Rupert Everett look like they always do – that is, clean-shaven and sporting the boyish coiffures they previously wore in romantic comedies set 100 years later. While parasols and top hats abound, no one in the film is flaunting the glistening hair gel and heavy handlebar mustaches of the play’s original stage production in 1895.

Directors almost always decide that lovers and heroes in period pieces should adhere to contemporary fashion rules from the neck up, lest audiences be less likely to swoon. Thus pretty much any film set in Ancient Egypt, Rome or the Early Modern Era pretends that men never wore eyeliner or lip rouge. (And that all the good guys looked white.) Films set in the Victorian era correctly leave cosmetics off the men but wrongly apply it to the female characters, who would have been insulted by anything more than face powder. (Makeup was for actresses and prostitutes, and Victorians didn’t see much difference between the two.) Even though Queen Elizabeth II is the most famous woman in the world, the actress who portrays her in the award-winning series The Crown has a far daintier nose and jaw, with eyebrows plucked to evoke the cover girls of today. Filmmakers who wish to forego such historical inaccuracies face an uphill battle, according to John Adams director Tom Hooper: “Wherever possible I wanted to do things that weren’t about making people beautiful. The truth is there’s a whole machine of filmmaking that’s all about making people look great. And you have to really intervene in every department to sort of say, ‘No, I don’t want that. I don’t want people to wear any makeup. You’re not allowed to wash people’s hair.’ ”

Hollywood takes such liberties in the hopes that the audience will barely notice. Viewers watch period dramas in order to ooh and aah at the finery, and imagine that they could easily slip into an earlier era and have a grand old time. They can imagine this because they are protected from unpleasant information such as the fact that the powdered and painted aristocrats of Louis XIV’s court regularly relieved themselves in the gilded corridors and behind the velvet curtains of the palace. Horace Walpole noted the stench at the time, but Hollywood has yet to. The audience’s comfort comes at the expense of the opportunity to learn that standards of attractiveness, cleanliness, and morality are far from universal, shifting continuously throughout human history. Likewise, it is an opportunity to learn that our feelings of disgust are often not innate but a product of where and when we grew up.

A handful of films and plays have thrived by underscoring the changes between then and now. Mad Men earned critical acclaim and a loyal following not only for its meticulously authentic fashion but for subtly laying bare the secrets of everyday life in the early 1960s that TV shows of the era had omitted: rampant infidelity, casual racism, sexual harassment, anti-Semitism, misogyny, covert homosexuality and vicious homophobia, legal date rape, domestic violence, and health hazards as far as the eye can see. Hamilton has been a Broadway sensation for deliberately altering the facts and urging the audience to take notice – wanting all to be fully aware of the historical significance of people of color portraying national heroes who owned slaves.

Mad Men and Hamilton have garnered attention precisely because they deny audiences the escapism so commonly peddled by period pieces. Escapism can be innocuous, but not when it warps our sense of reality and the world as it is, once was, and should be. When wildly popular stories like Gone with the Wind and Song of the South portray plantation life as merry, influential social conservatives argue that African-Americans had no complaints before the Civil Rights Movement. When populist politicians inform voters who pride themselves on a lack of “elitist knowledge” that they can make their countries “great again,” difficult truths about the past remain problems unsolved. Too often our glorious history as we like to think of it is more fantasy than fact – which is why sociologist Stephanie Coontz called it The Way We Never Were.


High Heels Are A Civil Rights Issue

26 Feb

King Charles I, after the original by van Dyck (public domain image via)

 

Last week there was much discussion on the blog about the social ramifications of height, but what about high heels? The Women and Equalities Committee of the U.K.’s House of Commons recently found that employee dress codes requiring heeled shoes for women violate laws banning gender discrimination. The Committee reviewed the matter after receiving a petition signed by 138,500 people and started by Nicola Thorp, a London receptionist who in December 2015 had been suspended by her employer without pay for violating the company’s dress code for women by showing up for work in flats.

I personally find high heels quite becoming much of the time. I also personally find them physically hazardous. Pretty much anyone with any sort of orthopedic disability has been advised by their specialists again and again to keep the time they spend in heels to a minimum. While reporting on the U.K. findings, NBC News let women in on “an essential secret — carrying a pair of trainers in your handbag.” This is cold comfort to those of us who know that back pain is also caused by carrying more than 5% of your body weight in your handbag. One twentysomething friend with an invisible disability was told by her spinal surgeon that she should wear heels pretty much never. Thorp was right to protest on the basis of gender discrimination because only women are required by some employers to toddle about on their toes, but a case could be made on the basis of disability discrimination as well.

That disabled women could be fired—or simply looked upon unfavorably in the workplace for “not making an effort”—is indeed a social justice issue. We in the West have come to regard heels as a sign of female beauty and professionalism not so much because they are inherently smart looking, but because they were invented to signify wealth.

Heeled shoes were designed to be painful and inefficient for anyone who walked around much, because the upper classes around the world have traditionally used their fashion statements—from foot-binding to corsets to flowing robes and fingernails—to prove that they were wealthy and didn’t need to labor to survive like the lowly workers. Prof. Lisa Wade offers a wonderful breakdown of the history of the high heel at Sociological Images, pointing out that they were first considered manly because men were the first to don them to display social status. Women began wearing them to imitate this status, which led to men abandoning them. Wade explains:

This is a beautiful illustration of Pierre Bourdieu’s theory of class distinction. Bourdieu argued that aesthetic choices function as markers of class difference. Accordingly, the elite will take action to present themselves differently than non-elites, choosing different clothing, food, decor, etc. Expensive prices help keep certain things the province of elites, allowing them to signify their power; but imitation is inevitable. Once something no longer effectively differentiates the rich from the rest, the rich will drop it. This, I argue elsewhere, is why some people care about counterfeit purses (because it’s not about the quality, it’s about the distinction).

Eventually men quit wearing heels because their association with women tainted their power as a status symbol for men. (This, by the way, is exactly what happened with cheerleading, originally exclusively for men). With the Enlightenment, which emphasized rationality (i.e., practical footwear), everyone quit wearing high heels.

What brought heels back for women? Pornography. Mid-nineteenth century pornographers began posing female nudes in high heels, and the rest is history.

At many moments in the history of many cultures, extra pounds of body fat have also signified high social status because wealth was needed to keep someone well-fed. The prices of sugar and meat plummeted in the West over the 20th century, and the two were soon no longer considered delicacies only the wealthy could afford. This, coinciding with the eugenics craze of the early 20th century, brought about our modern preoccupation with not just longevity and bodily cleanliness but physical “fitness.” These shifts are why modern fashion dictates that those who wish to project high social status should dress inefficiently, like a traditional aristocrat, while remaining physically strong, slim and active, like a traditional laborer.

High-status men are now encouraged to wear expensive attire in addition to building and maintaining a muscular physique that can get down in the dirt – something the manly dukes and earls of yore would have considered horrifically common. High status women are now encouraged to diet and exercise to be “healthy” in addition to wearing heels to hint at sexiness in their physique via the historical association with both princesses and porn stars – at the risk of breaking down their bodies as they rush off to work and back like the peasant women of yore.

Indeed, our modern fashion rules for professional women are ever so young because upper class women who worked were an anomaly in the Modern Era until the 20th century. The First and Second Wave feminists successfully fought for our right to vote and become actors, bankers, flight attendants, and politicians, but we have yet to expunge the idea that a woman who suffers for beauty is admirable, rather than irresponsible. Nicola Thorp’s petition, however, has dealt it a blow.

Women should feel free to wear heels almost whenever they wish, but disabled women should not have to suffer social consequences for choosing to protect their bodies. True equality may also come when men can wear heels like Mozart and Louis XIV without fear of gay bashing, as long as such a fashion shift does not harden into a fashion decree. If it does, then disabled men will have to use their right to petition against discrimination.

No matter how you personally feel about them, just remember that modern ideas about fashion, gender/sex, class, and disability all meet whenever we consider a pair of high heels. That’s why we call it intersectionality.


What Should You Do When a U.K. Night Club Offers Guests a “Free Midget” for Its Easter Special?

3 Apr

(“Las Meninas” by Diego Velázquez via)

 

There are undoubtedly those who find the idea of a night club offering its VIP-members a “free midget” for the evening hilarious. (It’s just so novel, ain’t it?) And there are certainly those who find the idea offensive. (“That was offensive,” comedienne Joanna Hausmann points out, is the third most-uttered phrase in America.)

And then there are those of us who know that the idea is not original. Far from it. It is at least 2,000 years old. Records show people with dwarfism were purchased as slaves in Ancient Rome and China up through the Renaissance. In bondage for their entertainment value, they were made to dance like monkeys and sometimes kept in cages.

From the Early Modern Era on into the 18th century—and, in some parts of the world, the late 20th century—they remained ubiquitous as lifelong servants and entertainers to aristocrats and dictators. Whether such servitude constituted slavery is difficult to ascertain. There is no evidence to suggest dwarfs were relegated by law to slave status at birth like other minorities were, perhaps because dwarf entertainers and servants were a frivolity for monarchs rather than a source of cheap labor for major industries. Records predating the 20th century reveal a handful of people with dwarfism lived independent lives. But, like the freak shows of the circus, servitude was often dwarfs’ best hope for sustenance in a world where families often abandoned them as children.

Dwarf advocacy organizations have condemned the Manchester night club’s offer as “discriminatory.” But rather than entangle ourselves in another battle between the that’s-so-offensive crowd and the hey-lighten-up crowd, I would prefer to ask both sides if they are aware of the history of servitude and enslavement. And if, as I suspect, most are not aware of it, it is necessary to consider why.


“We’ve Never Lived in Such Peaceful Times”

4 Jan

“Time Allowed” (Image by H. Kopp-Delaney used under CC 2.0 license via)

 

“Is the world becoming a more dangerous place?” This is not a subjective question, but it is all too often answered by entirely subjective findings. Do you watch the local news and listen to a police scanner? Do you see graffiti as street art, or cause to clutch your valuables and not make eye contact with anyone? Do you know someone personally who has been robbed, attacked, or murdered?

The objective answer to the original question, however, is no. The world is in fact safer than it has ever been in human history because we humans have become drastically less violent. Never before has there been a place with such high life expectancy and such low levels of violence as Western Europe today. Around the globe, there are lower rates of war and lower rates of spankings. There is no guarantee that the decline in violence will continue. But most of us have a hard time even believing that it exists at all.

In his book The Better Angels of Our Nature, Harvard psychologist Steven Pinker shows that the human emotional response to perceived danger—especially danger towards ourselves or someone with whom we can easily empathize—always risks distorting our perceptions of safety. One of the problems of empathy, he argues, is that we more readily feel for those we perceive to be more similar to us. This results in our investing more time, money and emotion toward helping a single girl fighting cancer if she speaks our language and lives in a house that looks like our own than toward helping 1,000 foreign children fighting malaria. We are more likely to disbelieve a victim of abuse if we can more quickly identify with the accused, and the same is true for the reverse scenario. And if you have been the victim of a horrendous crime or are struggling to survive in any one of the countries ravaged by war this year, you may become angry at any suggestion that the world is getting better, lest the world ignore the injustices you have suffered.

Those of us working in human rights must beware these problems whenever we trumpet a cause. Every activist’s greatest enemy is apathy, and fear of it can lead us to underscore threats while downplaying success stories in order to keep the masses mobilized. But any method founded on the claim that we have never lived in such a dangerous time is spreading lies.

As Pinker and Andrew Mack report in a recent article:

The only sound way to appraise the state of the world is to count. How many violent acts has the world seen compared with the number of opportunities? And is that number going up or down? … We will see that the trend lines are more encouraging than a news junkie would guess.

To be sure, adding up corpses and comparing the tallies across different times and places can seem callous, as if it minimized the tragedy of the victims in less violent decades and regions. But a quantitative mindset is in fact the morally enlightened one. It treats every human life as having equal value, rather than privileging the people who are closest to us or most photogenic. And it holds out the hope that we might identify the causes of violence and thereby implement the measures that are most likely to reduce it.

There is a risk that some will see the decline in violence as reason for denying crime (“Rape hardly ever happens!”), dismissing others’ pain (“Quit whining!”), and justifying their disengagement (“See? We don’t need to do anything about it!”). Pinker and Mack, however, claim the decline can be attributed in the modern era to the efforts of those in the human rights movements. In the example of violence against women:

The intense media coverage of famous athletes who have assaulted their wives or girlfriends, and of episodes of rape on college campuses, have suggested to many pundits that we are undergoing a surge of violence against women. But the U.S. Bureau of Justice Statistics’ victimization surveys (which circumvent the problem of underreporting to the police) show the opposite: Rates of rape or sexual assault and of violence against intimate partners have been sinking for decades, and are now a quarter or less of their peaks in the past. Far too many of these horrendous crimes still take place, but we should be encouraged by the fact that a heightened concern about violence against women is not futile moralizing but has brought about measurable progress—and that continuing this concern can lead to greater progress still…

Global shaming campaigns, even when they start out as purely aspirational, have led in the past to dramatic reductions of practices such as slavery, dueling, whaling, foot binding, piracy, privateering, chemical warfare, apartheid, and atmospheric nuclear testing.

The decline of violence undermines the arguments of those who invest their energy in fear-mongering (“People are evil and out to get you!”), self-martyrdom (“I’ve tried for so long—I give up!”) or indifference (“There’s no point to even trying.”). In his excellent book, which is well worth your time, Pinker demonstrates that all humans are tempted to use violence when we are motivated by feelings of greed, domination, revenge, sadism, or ideology (i.e., violence for a greater good), but we have proven that we can overcome these temptations with our capacity for reason, self-control, sympathetic concern for others and the willingness to adhere to social rules for the sake of getting along. There is much work to be done, but the decline is ultimately cause for hope. 

Happy New Year!


Interpreting History Part II: Oppression Has Never Been Universal

5 Aug

(“Samurai Kiss” via)

 

Nothing divides a country quite like a national holiday.  When I was studying in St. Petersburg ten years ago, there was as much apathy as there was celebration on the Russian Federation’s June 12th decennial.  German reactions to Reunification Day every October 3rd are anything but united.  And on the United States Fourth of July last month, Chris Rock tweeted, “Happy white peoples independence day, the slaves weren’t free but I’m sure they enjoyed fireworks.”

Amid the outbursts of “unpatriotic!”, conservative blogger Jeff Schreiber shot back, “Slavery existed for 2000yrs before America. We eradicated it in 100yrs. We now have a black POTUS. #GoFuckYourself.” 

Schreiber has since written a post on his blog, America’s Right, apologizing for cursing and conceding that the slave trade was unconscionable.  But for all his insistence that he never intends to diminish the horrors of American slavery, he adds that President Obama’s policies are now “enslaving Americans in a different way.”  (Real classy.)  And for all his reiteration that slavery was always wrong, he still hasn’t straightened out all the facts skewed in his Tweet.

“Slavery existed for 2,000 years before America.”  He uses this supposed fact to relativize the oppression, as if to shrug, “Well, everyone was doing it back then.”  His tweet implies that the ubiquity of the slave trade makes America’s abolition of it exceptional, not its participation.  This argument hinges on fiction.  Slavery did not exist for 2,000 consecutive years.  In the West, it was pervasive in Antiquity and the Modern era, but it was downright uncommon in the Middle Ages.  (While anathema to our modern ideas of freedom for the individual, medieval serfdom was not slavery.)  Slavery was re-instituted in the West roughly 500 years ago with the advent of colonialism.  And the United States held on to it long after most other colonial powers had abolished it.  Critics can say what they want about the effectiveness of Chris Rock’s rain-on-a-parade tactics, but his argument did not distort history.      

In my last post, I argued against concealing the human rights abuses of the past for the sake of nostalgia, not least because it is the height of inaccuracy.  But portraying history as an unbroken tradition of straight, white, able-bodied male dominance like Schreiber did is also inaccurate.  The universal human rights movement in its modern form is indeed only a few decades old, but the idea of equality for many minorities can be found all over history at various times and places.  The Quakers have often been pretty keen on it.

And almost no minority has been universally condemned.  People with dwarfism appear to have been venerated in Ancient Egypt.  Gay men had more rights in Ancient Greece and in many American Indian societies than in 20th-century Greece or the United States.  Muslim women wielded the right to divorce long before Christian women.  English women in the Middle Ages were more educated about sex than their Victorian descendants.  Much of the Jewish community in Berlin, which suffered such unspeakable crimes culminating in the mid-20th century, was at earlier times better integrated into the city than Jewish people were in many other capitals of Central Europe.  In short, history does not show that racism, misogyny, homophobia, ableism, transphobia, and our current beauty standards are dominant social patterns only recently broken by our ultra-modern culture of political correctness.  The oppression of minorities may be insidious and resilient throughout history, but it has never been universal.

Downplaying the crimes of the past by claiming everybody did it is both historically inaccurate and socially irresponsible.  It is perverse when such misconceptions fuel arguments for further restrictions on human rights.  In 2006, Republican Congress member W. Todd Akin from Missouri claimed that, “Anybody who knows something about the history of the human race knows that there is no civilization which has condoned homosexual marriage widely and openly that has long survived.”  Even if this were true, the argument is absurd.  (It appears that no civilization has regularly chosen women with dwarfism for positions of executive power, but does that mean it’s a bad idea?)  But the argument collapses because it relies on facts that are untrue.

Granted, hyperbole is a constant temptation in politics.  Stating things in the extreme is a good way to grab attention.  In an earlier post on sex, I asserted that mainstream culture assumes women’s sex drive is lower than men’s because female sexual expression has been “discouraged for millennia.”  Patriarchy has certainly been a major cultural pattern around the world and throughout history, and we cannot emphasize its power on both the collective and individual psyche enough.  But patriarchy is by no means a cultural universal.  Ethnic groups in Tibet, Bhutan, and Nepal continue to practice polyandry into the present day, while history shows many others that have done the same at various times.  These exceptions call into question the biological theory that heterosexual male jealousy is an insurmountable obstacle to sexual equality.  And they undercut any conservative excuse that insists, “Everybody’s been doing it.”

They haven’t been.  Xenophobia has never been universal.  Humans may have a natural fear of the unfamiliar, of what they perceive to be the Other, but our definitions of the Other change constantly throughout time and space, as frequently and bizarrely as fashion itself.   This makes history craggy, complex, at times utterly confusing.  Like the struggle for human rights, it is simultaneously depressing and inspiring.  But whatever our political convictions, we gotta get the facts straight.

Despite what Stephen Colbert says.


Interpreting History Part I: Count Me Out

29 Jul

“Alter Ego” (Image by Bob May used under CC license via)

 

Anytime my partner and I don’t know what to do or say, one of us asks, “What’s in the news?” and we dive into a political discussion.  So it’s no surprise that we’ve become somewhat embarrassingly addicted to Aaron Sorkin’s The Newsroom.  The news media has been (unsurprisingly) critical of a show founded on the idea of chastising the news media.  Feminists have been (sometimes rightly) critical of its portrayal of women.  The show has almost countless strengths and weaknesses, but I find myself still obsessing over the brilliant, captivating opening scene that kicked off the series.  If you can’t watch this clip, it basically boils down to a flustered news anchor named Will McAvoy overcome with disgust at the state of the nation and nostalgia for the 1950s and 60s: “America’s not the greatest country in the world anymore,” he sighs.  “We sure used to be.”

We stood up for what was right.  We fought for moral reasons.  We passed laws, we struck down laws for moral reasons.  We waged wars on poverty, not poor people.  We sacrificed, we cared about our neighbors.  We put our money where our mouths were, and we never beat our chests…  We cultivated the world’s greatest artists and the world’s greatest economy.  We reached for the stars, acted like men.  We aspired to intelligence.  We didn’t belittle it.  It didn’t make us feel inferior…  We didn’t scare so easy.     

“Nostalgia” literally means “aching to come home.”  It’s the temporal form of homesickness, time rather than place being the source of pain.  We all do it.  It can be oddly soothing at times to be in awe of another era, especially the one you were born in.  But Will McAvoy should watch Woody Allen’s Midnight in Paris for proof that nostalgia is an ultimately futile pastime that every sad sack of every era has hopelessly indulged in.  (If “things were better back in the day,” then how come every generation says this?)  But since McAvoy’s nostalgia is an earnest, political battle cry, heaping laurels on the good old 1950s and 60s when the leaders of the day did their job right, I’m more inclined to have him watch Mad Men.  Or just open up the 1960 children’s illustrated encyclopedia I found at my great aunt’s house, which states, among other things: “The Australian aborigine is similar to the American negro in strength, but less intelligent.”  Didn’t scare so easy, indeed.     

The problem with nostalgia is that it is far more emotional than intellectual and thereby lends itself to inaccuracy all too easily.  America was indeed doing great things sixty years ago.  And reprehensible things.  We hid our disabled and gay citizens away in institutions, asylums and prisons.  We enforced the compulsory sterilization of mentally disabled and Native American women.  We took decades to slowly repeal segregationist laws that the Nazis had used as models.  We maintained laws that looked the other way when husbands and boyfriends abused their partners or children.  In short, we handed out privilege based on gender, sexuality, ethnicity, religion, physical and mental capabilities with far greater frequency and openness than we do today.  Perhaps we were the “greatest country in the world” compared to the others.  (Europe and East Asia were trying to recover from the devastation of World War II, after all, while other nations were trying to recover from the devastation of colonialism.)  But McAvoy’s wistful monologue is much more a comparison of America Then with America Now.  And that is hard to swallow when considering that a reversion to that society would require so many of us to give up the rights we’ve been given since then.   

Am I “another whiny, self-interested feminist” out to bludgeon the straight, cis, WASPy male heroes of history?  Am I “just looking to be offended”?  No, I’m struggling.  Next to literature and foreign languages, history has always been my favorite subject.  And pop history always touches upon this question:

“If you could go back to any period in history, which would it be?” 

From an architectural point of view?  Any time before the 1930s.  From an environmental point of view?  North America before European contact.  From a male fashion point of view?  Any period that flaunted fedoras or capes.  From a realistic point of view?  No other time but the present.  Because if I am to be at all intellectually honest in my answer, there has never been a safer time for me to be myself. 

Last year, I read The Lives of Dwarfs: Their Journey from Public Curiosity Toward Social Liberation by Betty Adelson.  Despite my love of history, I hated almost every minute of it.  Lies My Teacher Told Me by James Loewen had helped me understand how so many black American students feel uninspired by U.S. history and the figures we hold up as heroes because so many of those men would have kept them in shackles.  But it wasn’t until I read The Lives of Dwarfs that I understood how nasty it feels on a gut level to face the fact that most of history’s greatest figures would more likely than not consider you sub-human.

With the exception of Ancient Egypt, my own lifetime has been the only period wherein someone with dwarfism could have a fair chance of being raised by their family and encouraged to pursue an education and the career of their choice, as I was.  At any other point in Western history, it would have been more probable that I would have been stuck in an institution, an asylum or the circus (the Modern Era before the 1970s), enslaved by the aristocracy (Rome, Middle Ages, Renaissance) or left for dead (Ancient Greece).  Of course inspiring cases like Billy Barty show that a few courageous/decent parents bucked the trends and proved to be the exception to the rule, but that’s what they were.  Exceptions. 

I am fortunate to have been born when I was and for that reason, nostalgia for any other period in time can never be an intellectually honest exercise for someone like me.  The moment someone says, “Yeah, well, let’s not dwell on odd cases like that.  I’m talking about the average person,” they’re essentially saying, “Your experience is less important than mine.”

Everyone is entitled to have warm, fuzzy feelings about the era in which they grew up.  If any period can put a lump in my throat, it’s the 1970s.  The Sesame Street era.  The boisterous, primary-colored festival flooded with William’s Doll, Jesse’s Dream Skirt, inner-city pride à la Ezra Jack Keats, and androgynous big hair all set to funky music can evoke an almost embarrassing sigh from me.  Donning jeans and calling everyone by their first name, that generation seemed set on celebrating diversity and tearing down hierarchies because, as the saying goes, Hitler had finally given xenophobia a bad name.  Could there be a more inspiring zeitgeist than “You and me are free to be you and me”?

 

But I’m being selective with my facts for the sake of my feelings. 

Sesame Street and its ilk were indeed a groundbreaking force, but theirs was hardly the consensus view.  Segregation lingered in so many regions, as did those insidious forced sterilization laws.  LGBT children were far more likely to be disowned back then than today—Free To Be You And Me had nothing to say about that—and gay adults could be arrested in 26 states.  The leading feminist of the time was completely screwing up when it came to trans rights.  Although more and more doctors were advocating empowerment for dwarf babies like me, adult dwarfs faced an 85% unemployment rate with the Americans with Disabilities Act still decades away.  And Sesame Street was actually banned in Mississippi on segregationist grounds.  When the ban was lifted, its supporters of course remained in the woodwork.  We have made so much progress since then.  It would be disingenuous for me to ignore that simply for the sake of nostalgia.

To be fair to Sorkin, it’s a hard habit to kick.  We have always glorified the past to inspire us, no matter how inaccurate.  Much of American patriotism prides itself on our being the world’s oldest democracy, but we were not remotely a democracy until 1920.  Before then, like any other nation that held free elections, we were officially an androcracy, and of course we didn’t guarantee universal suffrage until the Voting Rights Act of 1965.  That my spellcheck doesn’t even recognize the word “androcracy” signifies how little attention we afford our history of inequality.  But we have to if accuracy is going to have anything to do with history.  A brash statement like “We sure used to be [the greatest country in the world],” even as a battle cry for self-improvement, is asking to be called out for its inanity.

Everyone is entitled to appreciate certain facets or moments in history, just as everyone is entitled to look back fondly upon their childhood.  Veracity falters, however, with the claim that not just certain facets but society as a whole was all-around “better.”  This is never true, unless you’re comparing a time of war to the peacetime preceding it (1920s Europe vs. 1940s Europe, Tito’s Yugoslavia vs. the Balkans in the 1990s), and even then the argument is sticky (Iraq during the insurgency vs. Iraq under Saddam Hussein).  In the words of Jessica Robyn Cadwallader, concealing the crimes of the past risks their reiteration.  Whenever we claim that something was socially better at a certain point in history, we must admit that something was also worse.  It always was. 

But such a sober look at the past need not be depressing.  It reminds me how very grateful I am to be alive today.  My nephews are growing up in a society that is more accepting than almost any other that has preceded it.  That is one helluva battle cry.  Because what could possibly be more inspiring than history’s proof that whatever our missteps, things have slowly, slowly gotten so much better?