Tag Archives: Ableism

So Who Should The Cliques Make Fun Of Now?

6 Jan

(Image: Christina Hendricks on the red carpet)

A new study claiming that overweight and Class 1 obese people have a lower mortality rate than people of "normal" weight has been bouncing around the world since Thursday.  National Public Radio's report seems to be the most comprehensive but hints at the two most extreme, polarized viewpoints:

Cosmetic: This is a victory for the overweight—now we can trash skinny people (again)!

Medical: If people hear about this, everyone will stop exercising and eating their vegetables and then everyone’s going to die!

Both views treat the public like infants who can’t possibly think for themselves.

Doctors are right to worry that a sizeable portion of the population will use this news as an excuse for whatever unhealthy habits they love.  This is why it is important to report the many possible factors that may be skewing the results.  But many people will always cherry-pick whatever statistics suit their lifestyle or claim to be the exception to the rule.  I don’t have any political solutions for engaging with contrarians—whether we’re debating eating habits or global warming—but talking down to them and using scare tactics has a pretty high failure rate.

And from the disability rights perspective, there are exceptions to the rule when it comes to health.  Thousands of them.  As said before, a round belly is not always a sign of excess fat.  A bony body is not always a sign of an eating disorder.  Many forms of exercise can be more hazardous than beneficial to people with certain conditions.  And many life-threatening conditions are invisible.  Medical tests, not appearance, are always the most reliable indicators of health.  This robs us of the easy answers we crave and which facilitate public debate, but there has never been and never will be a one-size-fits-all health program for the 7 billion humans on the planet.

You and your doctor know better than anyone else if you are healthy or not.  If she says you are overweight but your genes and cholesterol levels put you at no risk for heart disease, she’s probably right.  If she says your weight is ideal but your eating habits put you at risk for malnutrition, she’s probably right.  And if her advice seems sound but her delivery makes you feel too ashamed to discuss it, go find someone with better social skills to treat you.  At the individual level, it’s no one else’s business.  Outside of the doctor’s office, it shouldn’t be any more socially acceptable to discuss someone else’s weight or waist size than it is to discuss their iron levels, sperm count, or cancer genes.

But beauty standards and health trends often go hand-in-hand.  And what really needs to go is the lookist idea that we’re all semi-licensed doctors who can diagnose people just by glancing at them and deciding how they measure up according to the latest medical research.  The reason we have a hard time letting this go is that it’s fun to point out others’ supposed weaknesses.  It’s self-elevating and validating to snicker that ours is the better body type because it calms our insecurities.  Beauty standards are cultural and constantly morphing throughout history, but they have always remained narrow.  (This is especially the case for women, though I sincerely apologize for not providing more research on men.)  Whether fawning over big breasts or flat tummies, public praise for certain body types has almost always been at the expense of others.
After decades of the Kate Moss heroin chic, Christina Hendricks (see above) of Mad Men has garnered lots of attention for her curves and this week’s study is likely to encourage her fans.  “Christina Hendricks is absolutely fabulous…,” says U.K. Equalities Minister Lynne Featherstone.  “We need more of these role models. There is such a sensation when there is a curvy role model.  It shouldn’t be so unusual.”  She is dead right that it shouldn’t be hard for curvy women to find sexy heroines who look like them in film and on television, just as skinny women or disabled women or women of any body type shouldn’t have to give up on ever seeing celebrities with figures like theirs.  But “Real women have curves!” is just as exclusionary as the catty comments about fat that incite eating disorders.  And when Esquire and the BBC celebrate Hendricks as “The Ideal Woman,” they mistake oppression for empowerment.

We can accept the idea that people of all sorts of different hair colors and lengths can be beautiful.  Will mainstream medicine and cosmetics ever be able to handle the idea that all sorts of different bodies can be healthy?  History says no.  But maybe it’s not naïve to hope. 

And what does Christina Hendricks have to say about all of this?  “I was working my butt off on [Mad Men] and then all anyone was talking about was my body.”

Touché.

 

 

The Year In Review

30 Dec

Hidden Object (Image by Hans-Jörg Aleff, used under CC license)

 

When I launched Painting On Scars at the beginning of this year, I had loads to say and almost as much worry that few would be interested in issues of disability and physical difference.  As the year comes to a close, I look back and see that the posts about ableism and lookism have generally been the most popular, followed by my spring article about family planning, reproductive rights, and privacy.  This hasn’t been the only surprise.

Lots of people find this blog by googling “dwarf + woman + sex.”  I have no idea who these people are.  They may be fetishists, they may be researchers, they may be women with dwarfism.  Your guess is as good as mine.

Since March, Painting On Scars has been read in over 100 countries.  To the surprise of few, no one in China reads it.  To the surprise of many, at least one person in Saudi Arabia does.  So have people in St. Lucia, Jordan, and Benin. 

Thanks to blogging, I’ve discovered there is a considerable online community committed to combating ableism with its own terms and tropes such as “supercrip” and “inspiration porn.”  I love such communities.  I also love bridging communities.  Because responses to my blog have shown me, perhaps more than anything has, that I want to talk to everyone.  And I really don’t care what your label is. 

I don’t care if you consider yourself Republican or Democrat or feminist or anti-feminist or religious or atheist or socialist or libertarian or apolitical or intellectual or anti-intellectual.  Well, okay, I do take it into consideration.  Somewhat.  But there is rarely consensus when we ask that everyone define these terms.  And none of them carries a guarantee against nasty personality traits like narcissism and defensiveness and aggression and cowardice.  Novelist Zadie Smith noted that we are told every day by the media and our culture that our political differences are the most important differences between us, but she will never be convinced of that.  When lefty comedian Jon Stewart was asked earlier this year if there’s anything he admires about right-wing hardliner Bill O’Reilly, he said, “This idea that disagreeing with somebody vehemently, even to the core of your principles, means you should not engage with them?  I have people in my own family that make this guy look like Castro and I love them.”

This is not to say that it’s all relative and I see no point to social justice or politics.  On the contrary, difference continues to be marginalized by the tyranny of the majority, as evidenced by the fact that the number one Google search term that has brought readers to my blog is “freaky people.”  And far too many kind people will more readily lash out at a person or group whose recognition demands they leave their comfort zone, rather than the forces that constructed and defined their comfort zone.  Well-intentioned friends and parents and bosses and classmates and leaders and partners and siblings and colleagues are capable of the vilest selfishness when they are scared of a power shift.  (As the Christian activists pictured above acknowledge.)  This is heart-breaking.  And it is not okay. 

But on the flipside, people are constantly smashing the prejudices I didn’t even know I had about them.  Every day friends and family and strangers demonstrate strengths that highlight all the mistakes I make, proving to me that politics are tremendously important but they will never be the most important element of a human being.   That may be a political idea in itself, but regardless of the divisions, most people on earth do seem to believe deep down inside that everybody matters.

And that’s what makes the struggle for social justice worth it.  If you are friendly and well-mannered and generous and honor your commitments and don’t let your self-doubt make you self-centered and try to listen as much as you talk and are honest about your problems without fishing for compliments and are big enough to apologize when you’ve screwed up, I respect you and admire you and am humbled by you.  I want to do the best I can because of you. 

 And since you’ve read this far, it’s more than likely you’re good at listening.  Thank you and happy new year!

 

 

“ ‘I Am So Sorry’ Is A Start”

23 Dec

Last week 20 children and 7 women were murdered as I was celebrating my birthday.  Hearts leapt into throats and the urge to hug the little ones in our lives pushed the tears further down the cheeks.  As you undoubtedly know, the Internet has since been inundated with debates regarding gun control, violent video games, and even gender roles.  Amidst all the vitriol and special snowflake lecturing, it’s the lackluster discussions of psychiatric disorders that seem the least helpful.

Too much of what has been said about mental illness has been too simplistic, too unscientific, too dismissive of the fact that accurately diagnosing a deceased individual often requires years of research.  Liza Long’s piece “I Am Adam Lanza’s Mother” is brazenly presumptuous and fraught with problems, while most of the outraged responses obscure their excellent points with a few too many personal jabs at her.  Of course everyone wants to know as soon as possible why 20 children were chosen as targets, but in this quest our commitment should be to accuracy, not promptness.

Although much of my work is in disability rights, I rarely write about mental illness or psychiatric disorders.  I have family members who are mentally ill and many friends who work in psychiatric fields, but I do not know nearly enough about it to speak with any authority and all too often hearsay is copy-and-pasted as fact.  Genuine concern is sometimes obscured by sick fascination.  The term “mentally ill” is a gigantic umbrella that covers everything from paranoid schizophrenia to anorexia nervosa to hypochondria.  Those with psychiatric disorders make up what is perhaps the most misunderstood and diverse minority on earth.  Casually tossing out easy-reading explanations before the news cycle gets bored and moves on usually does them more harm than good. 

I’ve been reading as much as I can about the complexities of Asperger’s syndrome, schizophrenia, psychopathy, and the countless articles reminding everyone that most mentally ill people are far more likely to be the victims of violence rather than the perpetrators.  I plan on getting my hands on a copy of Richard J. McNally’s What Is Mental Illness? in the new year.  Meanwhile, I can only hope that news readers and viewers do not perpetuate the media’s easy-answer approach to something as complex as medicine.

And while filtering out the less helpful material, I found two beautifully honest pieces by Rev. Emily C. Heath and Linton Weeks about what to say to grieving parents.  People in bereavement are traditionally not classified as minorities, but fear, misconceptions, and snap judgments usually surround them.  (I wrote earlier this year about what loss has taught me about the complexities of grief and the prejudices I used to hold against it.)  As we continue the debates aimed at preventing future tragedies, we should learn how to deal with what this tragedy has done to those closest to it.

 

 

Degenerates, Nazis, & the U.N.

16 Dec


 

A reaction to last week’s post about the U.N. Convention on the Rights of People with Disabilities sparked a behind-the-scenes discussion about whether or not I should allow name-calling in the Painting On Scars comments section.  I like to engage with almost anyone who disagrees with me, but online I know I also tend to comment only on sites that have strict no-drama policies, because discussions can become pointless and boring really, really fast when there’s nothing but insults and exclamation points.  I ultimately decided that, for now, any rude behavior speaks for itself: Commenters can name-call all they want regarding people they dislike or say absolutely nothing, because in both cases they’re not going to change anyone’s mind.

That said, I will always tell any supporters if they adopt tactics I want to have nothing to do with.  And it’s important to call out invectives that are particularly malicious in a way some might not be aware of.  The comment in question last week referred to the U.N. as “a bunch of degenerates, throat cutters, and other trash.”  Using the word “degenerate” in a discussion about disability rights is exceptionally insensitive, if not mean-spirited.    

The first time I read the word out loud to a friend here in Germany, his eyes shot up and he said, “Be very careful with that word.  It immediately makes everyone think of the Nazis.”  And by “Nazis,” he meant the actual, goose-stepping, genocidal nationalists who tried as best they could to make sure disabled people either died off or were killed off.  Not “Nazis” in the Internet-temper-tantrum sense of “anyone I disagree with.”  The word also evokes the brownshirt term “degenerate art.”  Modern German sensitivity to the term is the result of looking honestly at the nation’s history of ableism.

Aktion T4 was the first genocide program ordered by the Nazis, calling for the extermination* of those deemed by doctors to be “incurably sick.”  Between 200,000 and 300,000 disabled people were killed, though many were used for scientific experiments first.  *And by the way, I DETEST any use of the term “euthanasia” in this context.  “Euthanasia” literally means ending life to end pain, and for this reason I find it applicable where patient consent has been given or where pets are concerned.  But to imply that what the Nazis did to disabled citizens was anything other than murder is to dehumanize the victims.

The forced sterilization programs of disabled people in Nazi Germany, meanwhile, were modeled after American laws.  The very first forced sterilization law in the world was introduced in Indiana in 1907, and 30 states followed suit.  The Supreme Court upheld Virginia’s eugenics program in 1927 and it remained on the books until 1974.  Oliver Wendell Holmes summarized the Supreme Court’s decision thusly:  

It is better for all the world, if instead of waiting to execute degenerate offspring for crime or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind…  Three generations of imbeciles are enough.

The Nazi poster featured above focused instead on the expense: “It costs the German people 60,000 Reichsmarks to keep this genetic defective alive.  Fellow German, that is your money!”  After World War II, the Nuremberg Doctors’ Trial and the resulting Nuremberg Code discouraged ableist politicians from openly promoting eugenics on either side of the Atlantic.  But it wasn’t until 1981, the year I was born, that the disability rights movement in West Germany came into full swing and sought to combat ableism head-on. 

Almost every human rights movement is said to have a trigger moment when oppression went a step too far and the people fought back.  For the American Civil Rights movement, it was the death of Emmett Till.  For the gay rights movement, it was the Stonewall Uprising.  For the German disability rights movement, it was the Frankfurt Travel Ruling of 1980, brought about by a woman suing her travel agency for booking her in a Greek hotel where a group of Swedish disabled guests were also vacationing.  She claimed that having to see and hear disabled people had ruined her trip and the judge agreed with her.  Protests exploded across the country and the next year, which the U.N. had declared the Year of the Disabled, several West German disability rights groups organized and formed agendas.  They used the U.N. events to draw attention to the dire situation of disabled citizens in the country.

Two years later, the Green Party entered the Bundestag for the first time and was the first to voice support for disability rights as a human rights issue.  The Greens were born out of the 60s student movement in West Germany.  The movement was famous for protesting what most young activists across the Western world opposed at the time: the Vietnam War (and war in general), traditional gender roles, consumerism, pollution, etc.  But first and foremost, the West German 68ers were young people demanding the nation come to terms with its dark past, decrying that an overwhelming number of the nation’s leaders and officials were former Nazis.  Their commitment to human rights was inspired by an unfaltering awareness of how horrific things can get.  Their actions led to the passing of anti-discrimination laws and a 1994 amendment to the German Constitution, modeled after the Americans with Disabilities Act.

Another result of the students growing up and entering the government came in 1983 when conscientious objectors to the draft were no longer required to argue their motivations before a board for approval.  This made it far easier for young men to opt for a year of community service in lieu of military service.  By 1991, half of those drafted became conscientious objectors.  For over 30 years, scores of German 19-year-old boys worked with mentally ill children at the Red Cross, in nursing homes, as assistants for physically and mentally disabled teenagers, and for Meals on Wheels.  This has created generations of men who often speak fondly of the experience and who are usually less fazed by disabilities or dependence, demonstrating a tolerance and openness that seems extraordinary for their age.

The draft was discontinued last year and since then the community service option has been suspended.  Military debates aside, I agree with conservative politicians who have called for preserving the community service requirement and expanding it to women because it is an excellent government tool for combating both ableism and social segregation on a personal level.  Ableism is still a tremendous problem here in Germany, but in three generations, the country has changed from one of the most ableist societies on earth to one of the least.   The word “degenerate” signifies humanity’s capacity for cruelty and sensitivity to the word signifies our commitment to never repeat it.

To be fair, the word in last week’s comment was not aimed directly at disabled people but at the U.N. members working for disability rights.  And frankly, I’m a little insulted.  Because if anyone’s a degenerate here, it’s me. 

I am scientifically a mutant by virtue of my fibroblast growth factor receptor 3 gene.  (Yes, yes, my genetics professor explained that technically all of us are mutants, but mostly just in boring ways… )  I am a semi-invertebrate now that pieces of my backbone were removed six weeks ago.  And I don’t take the last empty seat on the subway and request my friends slow down to my pace when walking for nothing.  So if anyone’s gonna go calling the organization that sprang from the Nuremberg Trials and founded the Universal Declaration of Human Rights a bunch of degenerates, they gotta get through me first.  I’m a degenerate living in Germany and proud of it.

 

 

Universal Disability Rights – Remind Me Again Why We Don’t Care?

9 Dec

 

Well, I was going to write about how conservatives are sometimes more open to discussing issues faced by disabled people than liberals are.  Then on Tuesday, all but eight Republican senators voted against the Convention on the Rights of Persons with Disabilities, making sure the United States distinguishes itself as one of the few nations on earth that will not commit to protecting disabled rights.  Appeals by the likes of the World Health Organization, the American Psychiatric Association, and senior Republicans (and disabled veterans) John McCain and Bob Dole were to no avail.  So I’m not in the mood to write any sort of tribute to conservative ideals this week.

Supporters of ratification like Dole and John Kerry argued that the United States would be leading the world, since much of the Convention was modeled after the Americans with Disabilities Act of 1990.  Opponents argued that this is exactly why ratification is of little importance.  We already have the ADA and we don’t like the UN, so who cares?  But by refusing to ratify the Convention, the United States is undermining its authority, ultimately saying, “Too bad!” to the disabled citizens of other countries that will also abstain, where ableism is sometimes deadly.  (Do we need to talk about the thousands of medical conditions that are still thought to be works of the devil or punishment by God in far too many cultures?)  But this is not just a matter of the United States choosing whether or not to officially lead the world.  When it comes to human rights at home, complacency can be devastating.

In many respects, the U.S. is not coming out on top.  According to a 2009 OECD study of 21 developed countries cited by the World Bank and WHO last year, working-age disabled people are more likely to live below the poverty line than non-disabled people in every country but Norway, Sweden, and Slovakia.  This likelihood is highest in the United States, Australia, Ireland, and Korea, and lowest in the Netherlands, Iceland, and Mexico.  According to WHO, the discrepancy between the employment rates of disabled and non-disabled citizens is twice as high in the United States (35 percentage points) as in Germany (18 percentage points).  And in the U.S., the risk of violence against people with disabilities is four to ten times higher than against people without disabilities.

I will never officially endorse a candidate or a party on this blog.  Despite obvious political trends at the macrocosmic level, personal experience has shown me that people of all political stripes believe in universal human rights and I never wish to alienate anyone over issues not directly related to equality.  But shame on every single senator who blocked the Convention.  No one has ever protected human rights on an international scale through isolationist policies.  In a world where people with dwarfism still have little hope of employment outside the circus, people with albinism are persecuted, surgeries are performed without consent, and a diagnosis of mental illness is thrust upon LGBT people and denied people with clinical depression, international cooperation is crucial.  Otherwise, human rights disintegrates back into its inconsistent old self and becomes nothing more than a matter of privilege.  

 

 

Happy Halloween

24 Oct

As of tomorrow, I have to go on medical leave and take a break from blogging for hopefully just a short while.  So, in the spirit of the season, I’ll leave you with a re-run of my old post, “Curiosity Kills the Rat.”  Happy Halloween and be back soon!

CURIOSITY KILLS THE RAT

“All the freaky people make the beauty of the world.”

— Michael Franti

Fourteen years ago, I made a trip to Hot Topic—that quintessential 90s chain store for all things goth—in search of some fishnet stockings for a friend.  It was my first visit to the store since I was back in a wheelchair for my third and final limb-lengthening procedure and the narrow aisles prevented me from venturing beyond the entrance.  My first time in a wheelchair, from ages 11 to 12, had been a completely humbling experience as I was forced to see how very inaccessible the world is for the non-ambulatory.  This time around I was battling the hot-cheeked self-consciousness that adolescence attaches to any signs of dependency. 

As I tried to look casual while flipping through black gloves, black stockings, and black dog collars, a guy approached me sporting crimson hair, eyebrow rings, an employee badge and a smile.  “This store is easily adjustable,” he grinned, and with that he began shoving aside the display cases and clothes racks—which were, like me, on wheels—clearing a path for me right through to the back and taking little notice of the other shoppers, some of whom took one to the shoulder.  It was one of those crushes that disappear as quickly as they develop but leave a lasting memory: my knight in shining jewelry.

Thanks to experiences like this, I have a special place in my heart for the acceptance of physical differences that can often be found in the subcultures of punks, hippies, and goths.  From the imagining of monsters to the examination of anything taboo, counter-culture is often unfazed by physical qualities that fall outside of mainstream beauty standards.  The first kid in my high school who chose not to stare at the external fixators on my arms but instead held the door for me had green and purple hair.  About a month after my trip to Hot Topic, I showed a death-metal-loving friend my right fixator (shown above) for the first time, with the six titanium pins protruding from open wounds in my thigh.  He grinned, “That is the ultimate piercing, man!”  He hardly could have come up with a more pleasing reaction.  That my wounds were cool instead of “icky” or “pitiful” was a refreshing attitude found almost exclusively outside mainstream culture.  This attitude more readily understands my belief that my scars are merit badges I earned, not deformities to erase. 

However, this tendency toward decency over discomfort is just one side of the alternative coin.  Every subculture has its strengths and its weaknesses, and for all the freaky heroes I’ve encountered, I’ve also met plenty whose celebration of difference devolves into a sick fascination with the grotesque.  “Weird for the sake of weird” is progressive when it asserts that weird is inescapable, that it is in fact as much a part of the natural order as any of our conventions, and when it serves as therapy for the marginalized.  But it is problematic when it involves self-proclaimed artists using others’ reality as their own personal toys.     

In a previous post, I referred to a friend of a friend including me in an Internet discussion about limb-lengthening.  His comments were in reaction to a photo of a leg wearing an Ilizarov fixator that had been posted on a Tumblr page focused on the wonders of the world.  There are countless sites like it, where photos of conjoined twins, heterochromatic eyes, intersexual bodies, and medical procedures are posted alongside images of animals, vampires, robots, cosplay, self-harm, manga and bad poetry.  I get it.  The world is “crazy” and it’s all art.  But if that’s not a freak show, what is?

Disabled people are no longer put behind glass or in the circus—at least not in the U.S., Canada or Western Europe—but many people still believe they reserve the right to stare, both in public and on the Internet.  Whether under the guise of promoting diversity or admiring triumph in the face of adversity, they suppress any realization they may have that no one likes being stared at.  Unless it’s on our terms.

I see endless art in my medical experiences and it can be so therapeutic.  During my first limb-lengthening procedure I also had braces on my teeth, leading my dad to observe, “She’s now 95% metal.”  Kinda cool.  During my third procedure, I had Botox injected into my hips twice to paralyze my muscles lest they resist the lengthening.  At the time, when I along with most people had no idea what it was, it was described to me as “basically the most deadly poison known to man.”  Whoa, hardcore.  When I happened upon photos of my anterior tibialis tendon graft surgery, I was enthralled: “I’m so red inside!”  And when a fellow patient recently alerted me to the fact that a high-end jeweler designed a bracelet strongly resembling the Ilizarov frame, I laughed my head off.  Almost all of us like looking at our bodies, and perhaps this is especially so for those of us who have had real scares over our health.  It’s a matter of facing our fears and owning it.  But no one likes the idea of others owning it.  This subtle but severe preference, this desire for dignity, determines the difference between human rights and property rights.

Two years ago, NPR featured a piece by Ben Mattlin, who is non-ambulatory and who said he used to be uncomfortable with the idea of Halloween and its objectification of the grotesque.  From my very first costume as a mouse to my most recent stint as the Wicked Witch of the West, my love of Halloween has not so much as once flickered, but his point is worth discussing.  Costume play, Halloween and any celebration of “weird” that is primarily attention-seeking inherently assumes there is a “natural” basis to be disrupted.  (And all too often Halloween devolves into offensive imitations of all sorts of minority identities.) 

I have my own collection of artsy photos stolen off the Internet that I use as screensavers and montages for parties, but they do not include photos of bodies taken outside the context of consensual artistic expression.  Re-appropriating a photo in a medical journal for a site about all things bizarre is protected under freedom of speech, but it can feel like disregard for consent.  And in any case, such xenocentrism will always be just as superficial as the status quo it seeks to disrupt.

When conjoined twins Abigail and Brittany Hensel agreed to be interviewed once—and only once—for a documentary about their lives (which I highly recommend), they explained that they don’t mind answering strangers’ questions at all.  (Ben Mattlin has said the same, as do I.)  What they hate more than anything is being photographed or filmed without permission.  While attending a baseball game outside their hometown, a sports film crew quickly directed their attention to the girls.  Even though they were already being filmed by their own documentary team, the stranger camera’s invasive, presumptuous stare ruined the day for them. 

Sensitivity toward others’ experience with medicine and death should never kill the discussion.  These discussions are imperative and art is the most glorious way we relate to one another.  But just as there’s more to good manners than simply saying “Please,” there’s more to genuine learning and artistic expression than poking at anything we can get our hands on.  Nuance, deference and respect are prerequisites for anyone with artistic or scientific integrity not only because they are the building-blocks of common decency, but because history has shown that curiosity will more likely harm the rat than the cat.

 

 

Dragging Entertainment Into the 21st Century

21 Oct


 

This week, humor site Cracked.com features a great article by J.F. Sargent titled “6 Insane Stereotypes That Movies Can’t Seem to Get Over.”  Alongside the insidious ways in which racism, sexism, and homophobia still manage to persevere in mainstream entertainment, Number Two on the list is “Anything (Even Death) Is Better Than Being Disabled”:

In movie universes, there’s two ways to get disabled: Either you get a sweet superpower out of it, like Daredevil, or it makes you absolutely miserable for the rest of your life. One of the most infamous examples is Million Dollar Baby, which ends with (spoilers) the protagonist becoming a quadriplegic and Clint Eastwood euthanizing her because, you know, what’s the point of living like that? Never mind the fact that millions of people do just that every day…

Showing someone using sheer willpower to overcome something is a great character arc, and Hollywood applies that to everything, from learning kung fu despite being an overweight panda to “beating” a real-world disability. The problem is, this arc has some tragic implications for the real-world people who come out with the message that they are “too weak” to overcome their disabilities.

The result is that moviegoers think that disabilities are way worse than they actually are, and filmmakers have to cater to that: For example, while filming an episode of Dollhouse where Eliza Dushku was blind, the producers brought in an actual blind woman to show the actress how to move and get around, but the result was that “she didn’t look blind,” and they had to make her act clumsier so the audience would buy it.

Even in Avatar, real paraplegics thought that Sam Worthington’s character was making way too much effort transferring from his chair, but that’s the way we’re used to seeing it in movies. It’s a vicious cycle, and it isn’t going to stop until either Hollywood wises up or people with disabilities stop living happy, fulfilling lives.

I’ve examined Hollywood’s ableist problems several times before, and there are still plenty left to fill an entire blog.  But like The Daily Show or The Onion, Cracked has a long history of excellent social critique embedded among the fart jokes, and it’s awesome—especially considering that not only mainstream but alternative entertainment all too often can’t seem to let go of these tired stereotypes.  That Cracked is a site not officially dedicated to politics or social activism suggests that the comics writing for it believe calling out the industry for its embarrassing ineptitude is just common sense.

 

 

   

Biology and “The Imprecision of Stereotypes”

16 Sep

 

This week the British newspaper The Telegraph asks:

Ever wondered why men can’t seem to tastefully decorate a house?  Or have a tendency for dressing in clothes that clash?  And why, for that matter, can’t women seem to hack it at computer games?  Now scientists claim to have discovered the reason: the sexes see differently.  Women are better able to tell fine differences between colors, but men are better at keeping an eye on rapidly moving objects, they say.

Professor Israel Abramov and colleagues at the City University of New York reached their conclusions after testing the sight of students and staff, all over 16, at two colleges…

The authors wrote: “Across most of the visible spectrum males require a slightly longer wavelength than do females in order to experience the same hue.”  So, a man would perceive a turquoise vase, for instance, as being a little more blue than a woman who was looking at it too.

Abramov, professor of cognition, admitted they currently had “no idea” about how sex influenced color perception.  However, writing in the journal Biology of Sex Differences, he said it seemed “reasonable to postulate” that differences in testosterone levels were responsible…

Men can’t perceive colors as deftly as women can.  That’s why all the great Western painters like Van Gogh and Cézanne and Leonardo and Picasso and Renoir and Monet and Munch and Vermeer and Kandinsky and Matisse are female.  And all the major fashion designers of the last century like Hugo Boss and Karl Lagerfeld and Gianni Versace and Giorgio Armani and Calvin Klein and Ralph Lauren were women.  Oh, wait. 

Maybe the study meant to say testosterone only triggers color ineptitude when male ears register the words “home decorating.”  Or that male color perception improves when money is involved. 

Or maybe The Telegraph author was exaggerating just a bit.  Tacking jazzy headlines onto reports of scientific studies is all the rage these days, no matter how much they distort the findings.  In June, Medical Daily ran an article under the title, “Racism Is Innate.”  Innate means, according to my biologist father, “present at birth,” so this seemed like a call to toss all those No child is born a racist buttons onto the trash heap.  Except that anyone who bothered to read the article would discover that the study simply concluded that brain scans of adults show simultaneous activity in the centers that process fear and emotion and those that differentiate between familiar and unfamiliar faces.  The idea that fear of the Other can be neurologically mapped lends itself to a great deal of speculation and debate, but nowhere did the study claim that racism is present at birth.

Such truth-stretching borders on mendacity, yet it pervades the science sections of so many newspapers.  Scientific studies are supposed to be free of bias, but the news media is severely biased toward publishing whatever will grab readers’ attention.  As several researchers have pointed out, differences between the sexes are currently considered a much more interesting discovery than no difference, so publishers often remain silent on an issue until they find a study that provides the juicier headline, no matter how numerous the contradicting studies are.  When the market is left to decide, it chooses salability over comprehensiveness.

Such an irresponsible approach to science results in a gravely misinformed public.  I can’t tell you how many people have repeated the claim that our modern Western female beauty standards are “natural” because a round waist resembles pregnancy and triggers the male fear of cuckoldry.  No one seems to remember that several cross-cultural studies discredited this idea years ago.  But how can anyone be expected to remember something the media chose not to promote in the first place?

And forget about waiting until the study is corroborated.  In 2007, The Times ran a headline claiming that women are naturally drawn to the color pink because of our savannah foremothers’ need to gather berries while the men hunted.  The Times published the study without consulting any historians, who eventually pointed out that pink was considered a manly color as recently as 1918, before fashion trends changed.  Oops.

This doesn’t mean that we should, as Mitt Romney has demanded, “keep science out of politics.”  Science is impartiality and corroboration and the best method we have for sorting facts from wishful thinking—for preventing our emotional, egotistical needs from weakening our objectivity.  To me, science is the most humbling force in the universe because it demands we always admit what we do not know.  It prevents hasty conclusions based on flimsy evidence, gut feelings, and political agendas.  It questions crude stereotypes and discovers more complex structures. 

But according to pop science reporters and the researchers they choose to spotlight, nearly every single modern joke about the differences between men and women stems from millennia-old evolutionary adaptations.  (Indeed, the Telegraph article claims that the female proclivity for detecting color helped our foremothers with gathering berries.  Always with the damn berries… )  As stated in the graphic below, such reports all too often suggest that prehistoric society on the African savannah looked just like something Don Draper or Phyllis Schlafly would have designed:

Men hunt, women nest, and every macho social pattern we see today has been passed down to us from our prehistoric ancestors.  Yet historians find that these patterns, like our racial categories, are barely more than two centuries old, if that, and that the gender binary is far from universal.  Misinterpreting scientific findings is just as dire as ignoring them.

When it comes to what women and men can and can’t do, neuroscientist Lise Eliot notes, “Expectations are crucial.”  When boys and young men grow up in a culture that mocks their supposed incompetence in all things domestic (“Guys don’t do that!”), it comes as no surprise that only the most self-confident will pursue any interest they have.  Meanwhile, studies show girls perform as well as boys do in math and science until they reach puberty.  Maybe the onset of menstruation paralyzes our visual-spatial intelligence because we’ve got to get picking those berries, or maybe girls pick up on the not-so-subtle message that guys think coquettish beauty is more important than nerdy brains in the dating game.  (For more details on the sexism faced by aspiring female scientists, see Cordelia Fine’s excellent book, Delusions of Gender.)  In her research, Dr. Eliot finds only two indisputable neurological differences between males and females:

1) Male brains are 8% to 11% larger than females’.

2) Female brains reach maturation earlier than male brains. 

All other neurological studies that find major differences between the sexes are studies of adults: i.e., the people most shaped by their culture and society.  Only cross-cultural studies of adults can isolate nurture from nature.  In any case, Eliot is a proponent of neuroplasticity, the idea that the pathways and synapses of the brain change depending upon its environment and the neural processes and behaviors it engages in.  In other words, painting or gaming from an early age or frequently throughout your life will condition your brain to do these tasks and related ones well.  It explains why the gender roles of a given time and place are so powerful: why mastering unfamiliar tasks is an uphill climb for men and women, but also why countries committed to equality have the narrowest gender gaps.

“Plasticity is the basis for all learning and the best hope for recovery after injury,” Eliot writes.  “Simply put, your brain is what you do with it.”  For more, see her brilliant parenting book, Pink Brain, Blue Brain: How Small Differences Grow into Troublesome Gaps—and What We Can Do About It.   

But I’ll never believe that a neuroscientist has all the answers.  I live in a country that showed the world the dangers of hastily trying to trace all social patterns back to biology.  As a result, the media here in Germany is usually much more reticent to casually toss around arguments like those in The Telegraph or The Times or Medical Daily.  Natural scientists have made discoveries like neuroplasticity and limb-lengthening that are crucial to progress, but social scientists have discovered that equality and empathy are crucial to any society that values peace and respect over power and greed. 

Or, in other words.

 

 

In the U.S., Paralympic Athletes Might As Well Be “Untitled”

9 Sep


 

The Paralympics end today after a week of what seemed to be decent coverage, though it depended on where you tried to watch them.  The host country allotted 150 hours of coverage to the Games, Australia clocked in at 100 hours, and Germany and France allotted 65 and 77 hours respectively.  Meanwhile, the United States broadcast a whopping five and a half hours and no live coverage at all, as per tradition.  Yay.

Considering how little attention was afforded the Games themselves, it is unsurprising that there was little dialogue stateside about disability rights and issues of equality.  What a missed opportunity.  The British media immersed itself in it, with articles like “Is it Ok To Call The Athletes Brave?”  Indeed, disrespectful attitudes toward people with disabilities today are more often implicitly patronizing than openly derisive, and it was pleasing to see the public address this.

The Paralympic Guide to Reporting that was handed out to media outlets brought up several interesting points about language.  It rightfully asserts that disabling conditions or features should not be turned into personal nouns that define the entire person or people in question: i.e., the disabled, the blind, a paraplegic.  Adjectives and verbs—a paraplegic athlete, athletes with disabilities—are less limiting, portraying a medical condition as one of many characteristics a person has.  (This has been repeated to me ad infinitum by a friend who’s uncomfortable whenever I refer to myself as a dwarf.  “You are Emily.  You have dwarfism!” he insists.  “And you have hazel eyes and freckles and long hair…”)  Other terms and phrases to avoid noted by the guide include:

normal

able-bodied

wheelchair bound

confined to a wheelchair

suffers from

afflicted with

victim of

The last three are commonly used today.  They’re problematic because they imply that a disability is always regrettable.  Sometimes it is, and sometimes it isn’t.  Suffering may have been an apt term for my achondroplasia two months ago, when severe lumbar pain made it hard for me to think of anything else during a sightseeing trip in England.  But suffering has nothing to do with all the ways in which my condition has brought me in contact with all sorts of unique people and places and outlooks.  I can’t imagine my life without it.  It’s my version of normal.  Unless the person in question specifically says otherwise, any assumption that a disability is a round-the-clock tragedy is wrong.

For the sake of splitting hairs, I sometimes think the words disabled and disability are problematic because they automatically draw attention to what a person cannot do.  In the worst case, they can sound pitiful.  I’m very fond of the word typical in lieu of normal or able-bodied because it highlights that the standard by which we group people is based on a body type chosen by the scientific community.  It implies medical averages, not social values.  Typical is used in everyday speech to mean “usual” at best and “unexciting” at worst, unlike normal, which implies a state of correctness worth striving for, as in the phrase “back to normal.”  Discussions of autism and some other psychiatric conditions now commonly use the term neurotypical to refer to people without the diagnoses.  Maybe physiotypical could someday be the term for non-disabled people.

But as I’ve said a few times before, the search for acceptable terms is not about deciding what automatically classifies a speaker as Tolerant or Bigoted.  Words are only half as important as the intentions behind them, and the desire to understand another’s perspective is what separates an empathic person from a selfish one.  In the recent words of Professor Charles Negy, “Bigots… never question their prejudices.”  

The above list of do’s and don’ts is probably disconcerting to some readers.  I always feel simultaneously inspired and confused when given a list of hot-button words I’m told to avoid from now on.  Hell, I’ve written the word able-bodied before, and I’m someone excluded by it.  I find no problem with the word handicapped—I had handicapped housing rights in college and a handicapped parking sticker during my limb-lengthening procedures—but it’s considered offensively archaic in the U.K., apparently similar to invalid or cripple.  As we’ve seen in the midget vs. dwarf vs. LP debate, rarely is there ever a consensus in a given community over labels.  Labels are almost always problematic.  In my experience, the dialogue always matters more than the conclusion it comes to. 

And the inability of the U.S. media to have such dialogue during the Paralympics was pitiful.

 

 

Fighting the Good Fight or Feeding The Ego?

19 Aug

“I know so many men and boys and trans individuals who wear dresses for so many different reasons, and they do it a lot more than mainstream movies, TV, and advertising suggest.”

I felt my fingers tremble just a tiny bit as I typed this sentence last week.  Not because of the subject matter.  Not because of the point I was trying to make.  Because of the “I.”  Was that word going to drive home my point, or derail it?

Studies show personally knowing someone who belongs to a minority group increases the likelihood that you will have empathy for that minority.  If you have a family member who is gay, you’re less likely to oppose marriage equality.  If you know someone with dwarfism well, you’re less likely to see their medical diagnosis whenever you look at them.  GLAAD emphasized the political potential for all this in a brilliant meme last fall.  Urging LGBT individuals to talk openly about their partners and love lives at the dinner table with the same frequency as their straight family members, they called it, “I’m Letting Aunt Betty Feel Awkward This Thanksgiving.” 

Truly caring for someone with a different perspective often—though, sadly, not always—inspires us to try to understand their perspective and this enhances our own.  Letting others know that They are not so different from Us because we know and care deeply about many of Them can effectively break down barriers.  And, when discussing social injustice, it’s always best to ask someone with personal experience, lest we unwittingly make erroneous assumptions.  But, of course, just having friends who belong to minority groups doesn’t solve everything. 

As I wrote about knowing men and trans people who wear dresses to elucidate that They are actually Us, I cringed at the idea of flaunting my loved ones’ Otherness for the purposes of my blog.  By inserting myself into the statement, there was a risk that some would think I was trying to prove my open-mindedness.  I’ve bragged like that in the past, especially when I was an egocentric teen.  (You know, back when you practiced writing your name over and over?)  And my own Otherness has been flaunted a few times by friends and acquaintances seeking attention for their open-mindedness.  It’s a serious problem in the social justice movements.  

In Black Like Me, the author tells the story of a New Yorker he encounters who has come to the South to “observe” the plight of the black citizens.  “You people are my brothers,” the New Yorker insists.  “It’s people like me that are your only hope.  How do you expect me to observe if you won’t talk to me?”  Although the man’s opposition to segregation was morally correct, his overt self-regard and patronizing disgust at his brothers’ “ingratitude” makes it one of the most cringe-inducing scenes in the book.

In Baratunde Thurston’s fantastic memoir, How To Be Black (just out this year), the author asks writers and activists about white people’s fear of being called racist.  damali ayo, the author of How To Rent A Negro and Obamistan! Land Without Racism, says it best:

It shows our values as a culture when somebody says, “I don’t want to be a called a racist.”  Really what they’re saying is, “I want you to like me.  I don’t want to not be liked.  I want to still be okay with you.”  They don’t mean, “What I really want is to know and understand experiences of people of color…”  That would be great.

And so, it just shows that, as I always have said, we are operating at this third-grade level of race relations.  And it’s that third-grader that goes, “Please like me, do please like me,” versus “Can I understand?”

We all want to be liked and we all want to do the right thing.  But the third-grader mindset can’t help but focus more on the former.  It is evident in common phrases like:

“We were the only white people there!” 

 “I’ve always wanted a gay friend!” 

“I think I’m [bisexual/learning disabled], too, because I [kissed a girl once/have difficulty concentrating]!” 

“I’m not prejudiced!  I have so many [nonwhite/foreign/LGBT/disabled] friends!”

Of course, in certain contexts and worded differently, these statements would not be offensive.  What makes them offensive is the need to let others know all about us, the belief that our support for equality deserves praise, the patronizing (and unjust) view that minorities should be grateful for our lack of prejudice.  We can note that we were the only white people in a group in order to spark a dialogue about social segregation, or we can flaunt the experience like a medal from the Liberal Olympics.  We can worry that having a homogeneous circle of friends will limit our perspective, or we can believe that racking up as many minority friends as we can is proof of our expertise on all minority issues.  We can try to empathize with someone labeled “different” because of their sexuality or biology in order to remove stigmas and barriers, or we can try to seek the attention they are getting for ourselves.  We can respond to accusations that we have offended by trying to understand why someone would be hurt, or we can respond by listing our liberal credentials.

This depends primarily on the individual.  Someone who likes to brag about their open-mindedness usually brags about most things they do.  This personality trait seems to be particularly common among educated elites—parodied so well at Stuff White People Like—because elite education frequently fosters competitiveness.  (Taking the time to count your degrees, count the books you own, count the minority friends you have…)  Competitiveness is anathema to selflessness.   But while bragging about the number of books we own is silly because we’re obviously missing the point of reading, bragging about the number of minority friends we have is grave because we’re missing the point of human rights.

Do we donate to charity privately because it makes us feel better to spend the money on someone else?  Or do we hope that others will notice and admire our sacrifice?  Then again, drawing attention to the work we’re doing is usually important if we want to advertise the cause and urge others to join.  That’s where things get murky.

A while back, within a few months of each other, two friends stood up to ableism and told me about it after the fact.  A guyfriend came fuming to me about his teacher who had used the word “midget” and who had then insisted, despite my guyfriend’s protests, that it wasn’t offensive at all.  A girlfriend told me that a mutual acquaintance had said something crass about my dwarfism and that she had told him to back off repeatedly because she wouldn’t tolerate such bigotry in her presence.  The first friend focused his story on the offender’s behavior.  The second focused her story on her heroic defense.  People who want to understand the problem more than anything tend to focus their feelings on the injustice they encountered.  People who want to be liked more than anything tend to focus their feelings on their performance.

This shouldn’t ever deter anyone from working for equality and social justice, from celebrating diversity or from spreading awareness.  Open minds should always be highly valued.  But to paraphrase the recent words of the Crunk Feminist Collective, by not being racist—or sexist or homophobic or lookist or ableist or transphobic—we’re not doing anything special.  We’re doing what we’re supposed to do.

 

 

When It Comes To A Boy In A Dress, The Question Is: What’s Wrong With Us?

12 Aug

When I was about 10 years old, a friend of mine with achondroplasia was being teased at her school for being so short.  After being shunned at lunchtime repeatedly—“No freaks at this table!”—her mother finally called her local chapter of Little People of America, which sent a spokesman into the school to give a presentation.  After he read Thinking Big to the class, explaining thoroughly in an age-appropriate manner why my friend looked the way she did, one of the biggest bullies raised his hand.  “So, you mean, she’s little because she’s a dwarf?” he asked.

The spokesman offered to let my friend answer the question herself and she replied, “Yes.”

The boy who had teased her so much suddenly had tears in his eyes.  It later came out that his new baby brother had just been diagnosed with dwarfism.  He had had no idea until that moment that his brother was going to grow up to look just like the girl he’d targeted. 

To anyone who insists, “He couldn’t have known,” he could have.  We could have let him know.  What is school for, if not the pursuit of knowledge?  With the exception of women, all minorities risk marginalization not only by others’ lack of empathy but by the lack of visibility automatically brought on by their lower numbers.  Any place that prides itself on learning should pride itself on learning about other perspectives, other identities, other behaviors, no matter how rare.

So “What’s Wrong With A Boy Who Wears A Dress?” asks The New York Times magazine on its cover this week.  Though the flippant headline sacrifices sensitivity for salability, at least it sheds light on the subject.  I know so many men and boys and trans individuals who wear dresses for so many different reasons, and they do it a lot more than mainstream movies, TV, and advertising suggest:

 


When asked why he likes regularly wearing his wife’s nightgowns, one man shrugged, “It’s comfy.”

The Times article has its flaws.  When discussing how boys who wear dresses turn out later in life, the article stuffs them into three overly simplistic boxes: a) gay, b) heterosexual, and c) transsexual.  Such labels do not encompass all the ways and reasons people of various gender identities and sexualities wear dresses into adulthood.  As one friend observed, “The path of least resistance for so many is to wear dresses in secret.  By using these limiting categories, the article implies that and also does nothing to change that.”  The use of the categories also implies that these individuals owe us a clear-cut, sex-based explanation for their behavior, which is itself a symptom of narrow-mindedness.  No one demands a woman explain why she likes wearing jeans.

And yet the article also keeps its subjects silent.  While documenting the struggles of both conservative and liberal parents, the author would have been wise to include the perspective of adults who wore or wear dresses.  In the absence of their agency, their nervous parents are essentially speaking for them.  (Rule Number One in Battling Intolerance: Never, ever let a minority’s agency be ignored.)

But for all these errors, the article concludes with those who ultimately support their sons as best they can.  One dad heard that his five-year-old was being taunted in kindergarten for wearing pink socks, so he bought himself a pair of pink Converse sneakers to wear in solidarity.  The kindergarten teacher jumped in, too, opening up a class discussion about the history of gender rules and shocking the kids with the information that girls were once not allowed to wear pants.

Whenever reports on “different” children list the anxieties parents have about their kids not being accepted, the message often starts to get muddled.  Sometimes the article is clear that we as members of society need to get over our hysterical hang-ups and start accepting these children as they are so that they and their parents no longer have to worry what we and our own children will say.  Too often, however, the article spends so much time quoting the parents’ fears that the source of the problem starts to sound more and more like the child’s disruptive identity, not others’ clumsy reactions to his identity.  And that’s wrong.

Whenever a child is made fun of for being himself, it’s our problem, not his.  Biologists can say what they want about a fear of difference being an evolutionary adaptation, but our culture values differences in one of two ways: either as “abnormal” (i.e., strange and pitiful) or “super-normal” (strange and admirable).  The Beatles’ mop-tops were abnormal to parents of the time (“They look like girls!”), and super-normal to their teenage children.  In the nature vs. nurture debate, we need to stop saying “nurture” and start saying “culture,” because changing the environment a child grows up in means changing the behaviors of more than just one set of parents.  Mine never once told my younger brother, “Only sissies cry,” but his little league coach told the team just that.

This is our culture and we are the ones shaping it as the creators and consumers.  By making and watching films and TV shows that state what’s “gay,” “wimpy,” “ugly,” “freaky,” or “gross.”  By stating, “Guys just don’t do that,” or letting such remarks go unchallenged.  By repeating traditional views of minorities—e.g. the dwarfs of Snow White and Lord of the Rings—and failing to provide more realistic portrayals with greater frequency.  As adults, we bear so much responsibility for shaping the world the younger generation is trying to navigate.   (As this German Dad proved so well.)

Since the Sixties, many parents and teachers and educational programs have embraced books that promote understanding of ethnic diversity, such as People, and of disability, such as I Have A Sister: My Sister Is Deaf, to broaden our children’s perspective and nurture empathy toward people they do not encounter every day.  Yet books like My Princess Boy or The Boy In The Dress have yet to break into the standard curriculum.  There seems to be an unspoken assumption that such books are primarily for the boys they’re about.  (Buy them only after your son starts actively asking for a tiara.)  But everyone should be reading them, for the same reason everyone should be reading Thinking Big.  By waiting to address the idea of free gender expression until a little boy gets bullied, we are cultivating the assumption that the problem never existed until that little boy came along.  The problem was always there.

Critics have argued The Boy In the Dress is unsuitable for any real-life boy who feels like the protagonist, because any school he attends is far less likely to rally around him so enthusiastically.  But that’s exactly why this book needs to be read and discussed and picked apart by school classes around the world, not just by boys alone in their bedrooms.

As a teacher, babysitter and relative, I encourage the little boys in my life to play dress-up, house or princess with their female playmates because I’ve yet to hear a convincing argument as to why it’s any different from encouraging the girls to get down and dirty in the mud with their brothers.  Sure it’s radical—just as my mother’s wearing jeans to school 42 years ago was radical—and the last thing I want to do is turn a child into something he’s not.  But as with a girl, I want him to feel that every option is open to him, despite any hang-ups tradition has about it.  And if it becomes evident that he truly has no interest in anything soft or sparkly, I at least want to do my best to ensure that he never, ever makes fun of any boys who feel otherwise.

 

 

Interpreting History Part II: Oppression Has Never Been Universal

5 Aug

(“Samurai Kiss”)

 

Nothing divides a country quite like a national holiday.  When I was studying in St. Petersburg ten years ago, there was as much apathy as there was celebration on the Russian Federation’s June 12th decennial.  German reactions to Reunification Day every October 3rd are anything but united.  And on the United States Fourth of July last month, Chris Rock tweeted, “Happy white peoples independence day, the slaves weren’t free but I’m sure they enjoyed fireworks.”

Amid the outbursts of “unpatriotic!”, conservative blogger Jeff Schreiber shot back, “Slavery existed for 2000yrs before America. We eradicated it in 100yrs. We now have a black POTUS. #GoFuckYourself.” 

Schreiber has since written a post on his blog, America’s Right, apologizing for cursing and conceding that the slave trade was unconscionable.  But for all his insistence that he never intends to diminish the horrors of American slavery, he adds that President Obama’s policies are now “enslaving Americans in a different way.”  (Real classy.)  And for all his reiteration that slavery was always wrong, he still hasn’t straightened out all the facts skewed in his Tweet.

“Slavery existed for 2,000 years before America.”  He uses this supposed fact to relativize the oppression, as if to shrug, “Well, everyone was doing it back then.”  His tweet implies that the ubiquity of the slave trade makes America’s abolition of it exceptional, not its participation.  This argument hinges on fiction.  Slavery did not exist for 2,000 consecutive years.  In the West, it was pervasive in Antiquity and the Modern era, but it was downright uncommon in the Middle Ages.  (While anathema to our modern ideas of freedom for the individual, medieval serfdom was not slavery.)  Slavery was re-instituted in the West roughly 500 years ago with the advent of colonialism.  And the United States held on to it long after most other colonial powers had abolished it.  Critics can say what they want about the effectiveness of Chris Rock’s rain-on-a-parade tactics, but his argument did not distort history.      

In my last post, I wrote about the risks of concealing the human rights abuses of the past for the sake of nostalgia, if only because it is the height of inaccuracy.  But portraying history as an unbroken tradition of straight, white, able-bodied male dominance, as Schreiber did, is also inaccurate.  The universal human rights movement in its modern form is indeed only a few decades old, but the idea of equality for many minorities can be found at various times and places throughout history.  The Quakers have often been pretty keen on it. 

And almost no minority has been universally condemned.  People with dwarfism appear to have been venerated in Ancient Egypt.  Gay men had more rights in Ancient Greece and in many American Indian societies than in 20th century Greece or the United States.  Muslim women wielded the right to divorce long before Christian women.  English women in the Middle Ages were more educated about sex than their Victorian descendants.  Much of the Jewish community in Berlin, which suffered such unspeakable crimes culminating in the mid-20th century, was at earlier times better integrated into the city than Jewish people were in many other capitals of Central Europe.  In short, history does not show that racism, misogyny, homophobia, ableism, transphobia, and our current beauty standards are dominant social patterns only recently broken by our ultra-modern culture of political correctness.  The oppression of minorities may be insidious and resilient throughout history, but it has never been universal. 

Downplaying the crimes of the past by claiming everybody did it is both historically inaccurate and socially irresponsible.  It is perverse when such misconceptions fuel arguments for further restrictions on human rights.  In 2006, Republican Congress member W. Todd Akin from Missouri claimed that, “Anybody who knows something about the history of the human race knows that there is no civilization which has condoned homosexual marriage widely and openly that has long survived.”  Even if this were true, the argument is absurd.  (It appears that no civilization has regularly chosen women with dwarfism for positions of executive power, but does that mean it’s a bad idea?)  But the argument collapses because it relies on facts that are untrue.

Granted, hyperbole is a constant temptation in politics.  Stating things in the extreme is a good way to grab attention.  In an earlier post on sex, I asserted that mainstream culture assumes women’s sex drive is lower than men’s because female sexual expression has been “discouraged for millennia.”  Patriarchy has certainly been a major cultural pattern around the world and throughout history, and we cannot emphasize its power on both the collective and individual psyche enough.  But patriarchy is by no means a cultural universal.  Ethnic groups in Tibet, Bhutan, and Nepal continue to practice polyandry into the present day, while history shows many others that have done the same at various times.  These exceptions call into question the biological theory that heterosexual male jealousy is an insurmountable obstacle to sexual equality.  And they preclude any conservative excuse that insists, “Everybody’s been doing it.”

They haven’t been.  Xenophobia has never been universal.  Humans may have a natural fear of the unfamiliar, of what they perceive to be the Other, but our definitions of the Other change constantly throughout time and space, as frequently and bizarrely as fashion itself.   This makes history craggy, complex, at times utterly confusing.  Like the struggle for human rights, it is simultaneously depressing and inspiring.  But whatever our political convictions, we gotta get the facts straight.

Despite what Stephen Colbert says.

Interpreting History Part I: Count Me Out

29 Jul

(Image by Bob May used under CC license via)

Anytime my partner and I don’t know what to do or say, one of us asks, “What’s in the news?” and we dive into a political discussion.  So it’s no surprise that we’ve become somewhat embarrassingly addicted to Aaron Sorkin’s The Newsroom.  The news media has been (unsurprisingly) critical of a show founded on the idea of chastising the news media.  Feminists have been (sometimes rightly) critical of its portrayal of women.  The show has almost countless strengths and weaknesses, but I find myself still obsessing over the brilliant, captivating opening scene that kicked off the series.  If you can’t watch this clip, it basically boils down to a flustered news anchor named Will McAvoy overcome with disgust at the state of the nation and nostalgia for the 1950s and 60s: “America’s not the greatest country in the world anymore,” he sighs.  “We sure used to be.”

We stood up for what was right.  We fought for moral reasons.  We passed laws, we struck down laws for moral reasons.  We waged wars on poverty, not poor people.  We sacrificed, we cared about our neighbors.  We put our money where our mouths were, and we never beat our chests…  We cultivated the world’s greatest artists and the world’s greatest economy.  We reached for the stars, acted like men.  We aspired to intelligence.  We didn’t belittle it.  It didn’t make us feel inferior…  We didn’t scare so easy.     

“Nostalgia” literally means “aching to come home.”  It’s the temporal form of homesickness, time rather than place being the source of pain.  We all do it.  It can be oddly soothing at times to be in awe of another era, especially the one you were born in.  But Will McAvoy should watch Woody Allen’s Midnight in Paris for proof that nostalgia is an ultimately futile pastime that every sad sack of every era has hopelessly indulged in.  (If “things were better back in the day,” then how come every generation says this?)  But since McAvoy’s nostalgia is an earnest, political battle cry, heaping laurels on the good old 1950s and 60s when the leaders of the day did their job right, I’m more inclined to have him watch Mad Men.  Or just open up the 1960 children’s illustrated encyclopedia I found at my great aunt’s house, which states, among other things: “The Australian aborigine is similar to the American negro in strength, but less intelligent.”  Didn’t scare so easy, indeed.     

The problem with nostalgia is that it is far more emotional than intellectual and thereby lends itself to inaccuracy all too easily.  America was indeed doing great things sixty years ago.  And reprehensible things.  We hid our disabled and gay citizens away in institutions, asylums and prisons.  We enforced the compulsory sterilization of mentally disabled and Native American women.  We took decades to slowly repeal segregationist laws that the Nazis had used as models.  We maintained laws that looked the other way when husbands and boyfriends abused their partners or children.  In short, we handed out privilege based on gender, sexuality, ethnicity, religion, physical and mental capabilities with far greater frequency and openness than we do today.  Perhaps we were the “greatest country in the world” compared to the others.  (Europe and East Asia were trying to recover from the devastation of World War II, after all, while other nations were trying to recover from the devastation of colonialism.)  But McAvoy’s wistful monologue is much more a comparison of America Then with America Now.  And that is hard to swallow when considering that a reversion to that society would require so many of us to give up the rights we’ve been given since then.   

Am I “another whiny, self-interested feminist” out to bludgeon the straight, cis, WASPy male heroes of history?  Am I “just looking to be offended”?  No, I’m struggling.  Next to literature and foreign languages, history has always been my favorite subject.  And pop history always touches upon this question:

“If you could go back to any period in history, which would it be?” 

From an architectural point of view?  Any time before the 1930s.  From an environmental point of view?  North America before European contact.  From a male fashion point of view?  Any period that flaunted fedoras or capes.  From a realistic point of view?  No other time but the present.  Because if I am to be at all intellectually honest in my answer, there has never been a safer time for me to be myself. 

Last year, I read The Lives of Dwarfs: Their Journey from Public Curiosity To Social Liberation by Betty Adelson.  Despite my love of history, I hated almost every minute of it.  Lies my Teacher Told Me by James Loewen had helped me understand how so many black American students feel uninspired by U.S. history and the figures we hold up as heroes because so many of those men would have kept them in shackles.  But it wasn’t until I read The Lives of Dwarfs that I understood how nasty it feels on a gut-level to face the fact that most of history’s greatest figures would more likely than not consider you sub-human. 

With the exception of Ancient Egypt, my own lifetime has been the only period wherein someone with dwarfism could have a fair chance of being raised by their family and encouraged to pursue an education and the career of their choice, as I was.  At any other point in Western history, it would have been more probable that I would have been stuck in an institution, an asylum or the circus (the Modern Era before the 1970s), enslaved by the aristocracy (Rome, Middle Ages, Renaissance) or left for dead (Ancient Greece).  Of course inspiring cases like Billy Barty show that a few courageous/decent parents bucked the trends and proved to be the exception to the rule, but that’s what they were.  Exceptions. 

I am fortunate to have been born when I was and for that reason, nostalgia for any other period in time can never be an intellectually honest exercise for someone like me.  The moment someone says, “Yeah, well, let’s not dwell on odd cases like that.  I’m talking about the average person,” they’re essentially saying, “Your experience is less important than mine.”

Everyone is entitled to have warm, fuzzy feelings about the era in which they grew up.  If any period can put a lump in my throat, it’s the 1970s.  The Sesame Street era.  The boisterous, primary-colored festival flooded with William’s Doll, Jesse’s Dream Skirt, inner city pride à la Ezra Jack Keats, and androgynous big hair all set to funky music can evoke an almost embarrassing sigh from me.  Donning jeans and calling everyone by their first name, that generation seemed set on celebrating diversity and tearing down hierarchies because, as the saying goes, Hitler had finally given xenophobia a bad name.  Could there be a more inspiring zeitgeist than “You and me are free to be you and me”? 

But I’m being selective with my facts for the sake of my feelings. 

Sesame Street and their ilk were indeed a groundbreaking force, but theirs was hardly the consensus.  Segregation lingered in so many regions, as did those insidious forced sterilization laws.  LGBT children were far more likely to be disowned back then than today—Free To Be You And Me had nothing to say about that—and gay adults could be arrested in 26 states.  The leading feminist of the time was completely screwing up when it came to trans rights.  Although more and more doctors were advocating empowerment for dwarf babies like me, adult dwarfs faced an 85% unemployment rate with the Americans with Disabilities Act still decades away.  And Sesame Street was actually banned in Mississippi on segregationist grounds.  When the ban was lifted, its supporters of course remained in the woodwork.  We have made so much progress since then.  It would be disingenuous for me to ignore that simply for the sake of nostalgia. 

To be fair to Sorkin, it’s a hard habit to kick.  We have always glorified the past to inspire us, no matter how inaccurate.  Much of American patriotism prides itself on our being the world’s oldest democracy, but we were not remotely a democracy until 1920.  Before then, like any other nation that held free elections, we were officially an androcracy, and of course we didn’t guarantee universal suffrage until the Voting Rights Act of 1965.  That my spellcheck doesn’t even recognize the word “androcracy” signifies how little attention we afford our history of inequality.  But we have to if accuracy is going to have anything to do with history.  A brash statement like “We sure used to be [the greatest country in the world],” even as a battle cry for self-improvement, is asking to be called out on its inanity. 

Everyone is entitled to appreciate certain facets or moments in history, just as everyone is entitled to look back fondly upon their childhood.  Veracity falters, however, with the claim that not just certain facets but society as a whole was all-around “better.”  This is never true, unless you’re comparing a time of war to the peacetime preceding it (1920s Europe vs. 1940s Europe, Tito’s Yugoslavia vs. the Balkans in the 1990s), and even then the argument is sticky (Iraq during the insurgency vs. Iraq under Saddam Hussein).  In the words of Jessica Robyn Cadwallader, concealing the crimes of the past risks their reiteration.  Whenever we claim that something was socially better at a certain point in history, we must admit that something was also worse.  It always was. 

But such a sober look at the past need not be depressing.  It reminds me how very grateful I am to be alive today.  My nephews are growing up in a society that is more accepting than almost any other that has preceded it.  That is one helluva battle cry.  Because what could possibly be more inspiring than history’s proof that whatever our missteps, things have slowly, slowly gotten so much better?

When You Gonna Start Makin’ Babies?

22 Jul

(Image by Clint McMahon used under CC license via)

A while back, tucked inside one of my longer posts was a link to a conversation Rosie O’Donnell had in February with comedienne Chelsea Handler on her show in which she discussed her phobia of dwarfs.  Driven by Handler’s insistence that sex with a dwarf would be “child abuse,” the conversation devolved into musing about how dwarf women give birth:

O’Donnell: When a little person has a normal-sized person, I don’t understand how that happens.

Handler: That I don’t understand!

O’Donnell: I don’t get it.  How come the little person isn’t dead when the normal-sized baby comes out?

Handler: Sometimes two smalls make a tall.

O’Donnell: But how does it come out?

Handler: I don’t know.  I think anything can come out of that.

For your information, Chelsea, when it comes to achondroplasia—the most common type of dwarfism—“two smalls” have the exact same chance of having a “tall” (25%) as they do of having a child with two copies of the achondroplasia gene—homozygous achondroplasia—which is fatal.  (The baby is usually stillborn or dies within the first few weeks after birth.)

O’Donnell has since apologized for talking about her phobia of dwarfs, though Little People of America have rightly said she missed the point.  Many have said that as an openly gay woman, she should know better when discussing prejudice, but I was more surprised by her callousness in light of her being an adoptive parent.  And I notice my (hyper-)sensitivity to that issue seems to grow every time I encounter it.

And of course I seem to be encountering it everywhere nowadays.  “When ya gonna start makin’ babies?”  Almost all of us in our late twenties and thirties are used to being asked this regularly.  I’ve been told I should take it as a compliment, since it’s rarely asked of couples who would make terrible parents.  Yet I’ve been amazed at how intrusive the questions and comments can be, how often something as personal as parenthood is treated like small talk.  It’s understandable as more of my peers become parents; the prospect of making humans is daunting and people need to vent about it.  Those who don’t want children while living in a baby-obsessed world feel the need to vent back.  All this venting results both in community-building and in tactless comments that knock those outside of the community. 

One of my friends who miscarried was told by a stranger, “Well, it wasn’t a real baby.”  A friend who adopted a girl from South Korea was told by a fellow church member, “Her eyes aren’t that bad.”  A friend who had a C-section was told she must not feel as close to her child as women who give birth “naturally.”  Childfree friends have been told that their lives will never be “complete” until they’ve had children.  A biology professor who had two foster daughters was asked if he was worried they would inherit their imprisoned father’s criminal tendencies because “that stuff’s in the genes, y’know.”  I’ve been told it’s selfish to want a child with achondroplasia, it’s selfish to want a child without achondroplasia, it’s selfish to allow my child to inherit my achondroplasia, it’s selfish to play God with genetics, it’s selfish to want to biologically reproduce what with the world population exploding, and it’s selfish to worry about any of this because it’s not like I’m infertile.  All of these comments were well-intentioned. 

Usually people are simply thinking out loud when they say such things.  It is important to remember that no one can be expected to know exactly what to say in unusual circumstances, lest I end up lecturing as if I’ve never inadvertently offended anyone.  Almost all of us have good intentions, but many are unaware of how quickly we redirect conversations back to our own experiences, how easily we forget to prioritize listening over interrogating, empathy over curiosity, respect over Thank-God-that’s-not-me! complacency.   

Hereditary conditions, finances, disabilities, infertility, relationships and emotions ensure that having children is not a universal experience.  There is no right way for everyone and any opinion that can in any way be construed as a judgment can cut someone deep because babies and bodies are entangled in supremely visceral feelings.  It’s no coincidence that Roe v. Wade was argued based on the right to privacy: Something as sensitive, as complicated and as profoundly emotional as your reproductive choices should be volunteered at your discretion. 

That said, parenthood is all about making decisions that will inexorably affect someone else’s life, not just your own, and this is why it is such a hot-button issue.  Our reproductive decisions, more than any other decisions, are the intersection of personal freedoms and social responsibility.  As the daughter of a social worker who worked for Child Protective Services, I have firm beliefs about right and wrong when it comes to parenting.  As someone whose genes make the prospect of parenthood unusually complicated, I’ve begun to see how judgmental those beliefs can come off when the presentation is sloppy. 

As an avid reader of Offbeat Families, I know that sharing knowledge and experiences can help others in so many ways.  But as someone who feels very ambivalent about offering up my not-yet-existent children’s potential situation as conversation fodder, I’ve become less trustful of many of my most well-meaning friends and family members.  Questions about my situation so quickly transform into lectures about their situation.  (I’ve also noticed that the more nervous someone is, the more they lecture.)  Besides making me more guarded about my personal experience, it has also taught me to stop myself from making snap judgments about others’ reproductive choices.  When dealing with anyone else’s family planning, I have been humbly learning to: 

 1)      Fight the urge to ask others about their reproductive choices, especially in the context of small talk.  Let them volunteer it.  Go ahead and volunteer your own stories, but don’t press the other person if they do not respond in kind.  We can never assume what’s lurking beneath the surface. 

 2)      Beware of talking about the decisions you made in a way that inadvertently hurts those who must make different decisions.  This is also very tricky, but if you are convinced water birth is the only way you can imagine doing it or you are proudly childfree or you know exactly how to make sure it’s a girl, be aware that people in different financial or medical situations may not have these options at all.    

 3)      When someone does want to share something you have little experience with (e.g. adoption, stillbirth, staying childfree, etc.), prioritize listening and learning over immediately finding something to compare it to.  Relativizing struggles can be helpful and I’ve gotten some great feedback from friends, but my guard goes up when someone without achondroplasia tells me right away they know what I should do because they know someone whose baby has diabetes, they took a college class on bio-ethics, or they heard something like it on the news.

4)      Only offer your ethical opinion if the person makes it perfectly clear they want to hear it.  Every society bears the responsibility of taking a legal stance on complex reproductive issues: prenatal testing, genetic counseling, birth control, abortion, sterilization, drug testing, assisted reproductive technology, the life of the mother vs. the life of the fetus, custody, adoption, foster care, etc.  We are all compelled as citizens to be aware of the laws concerning these issues.  And we all have our own opinions about them.  But anyone directly affected by them is likely to have heard it before and to have been thinking about it longer than we have.  I’ve been thinking about the effects my dwarfism may have on my kids since I was fourteen.

5)      Don’t gossip about others’ decisions behind their backs.  It makes your listeners aware how they will be talked about when it’s their turn to decide about having children.  There is a fine but crucial line between trying to understand situations that are new to you and using someone’s situation to tell an interesting story.

6)      Do try to actively listen when invited to, saying truly supportive things, as one or two particularly fantastic friends of mine have, such as: “I can only begin to imagine what I’d do in that situation.”  “Let me know if you don’t want to answer this question…”  “On a much smaller level, it sounds a tiny bit like what I felt when…”   “No matter what you decide, I know you’ll be great at it because…”  “I’m always here to listen if you ever need to spill, as long as it helps.”

Of course, in listing here what I have learned not to do, I can only hope that my own past SNAFUs have been minimal.  Insensitivity, by definition, is the disconnect between intention and effect.  Embarrassed apologies to anyone whose toes I stepped on while stomping through my own bigfooted opinions.

Cross-posted on August 27, 2012 at Offbeatfamilies.com

Body Image Part II: The Rules for Snark

10 Jun

(Image by Stephen Alcorn © 2003 http://www.alcorngallery.com)

Last week I went after talking about others’ bodies for the sake of analyzing what you can’t be attracted to.  Today I’m going after talking about others’ bodies for the sake of musing, or amusement…

Anyone who insists they never make fun of others behind their back is lying.  We all do it, so much so that snark now rivals porn as the Internet’s raison d’être.  Every bit of our outward appearance—our fashion choices, our speaking styles, our assertiveness or timidity—is out there for others’ scrutiny, and all of us pick targets when we’re in the mood, sometimes at random, sometimes with a purpose.  Just take the example of weddings.  I bet there’s at least one wedding you’ve seen that looked ridiculous to you.  Alternative brides think, Wear an expensive dress if that’s what you’ve always wanted, but it’s still vulgar materialism.  And the mainstream brides think, Don’t wear a white dress if you don’t want to, but you just want attention for being anti-everything.  While others simply think, Purple.  Yuck.  Or something to that effect. 

In wedding planning as in our everyday fashion, what we choose is a comment on what we don’t.  No one’s choice is in isolation of everyone else’s.  To dress like a punk or to dress like a cowboy, to speak a local dialect or to speak like a newsreader, to try to fit in or to try to stand out are all decisions we make that usually reflect both our tastes and our beliefs.  We give others’ decisions either the thumbs up or thumbs down accordingly.  As I’ve said before, it’s fair game when beliefs are targeted, because we should all take responsibility for our beliefs.  But too many of us make no distinction between the elements of someone’s appearance that reflect their beliefs, and the elements that reflect their biology.  

Many of my friends and family, along with most commenters on TV or online, see little difference between making assumptions about others’ clothes and making assumptions about the bodies they cover.  Just as they’ll assume the slick suit must belong to a businessman and the lady in shorts and sneakers is American, they’ll assume the particularly skinny woman must be anorexic, that the man whose hands shake must be an alcoholic, that the young woman who collapsed must be either diabetic or pregnant, that the large child over there getting his breast milk is obviously too old for that, that chubby guy over there is certainly overweight and should lose a few pounds, that the poor kid with acne isn’t using the right medicine.  Sometimes these flimsy diagnoses are voiced as expressions of sympathy or intellectual exercises à la Sherlock Holmes, sometimes they are dripping with self-aggrandizing pity or snarky complacency.  They are always unjust because, unlike quips about clothes or tattoos or cell phone ringtones, comments about another’s body have little to do with choices anyone has made. 

As someone who’s undergone limb-lengthening, I can of course attest that there are a few choices we make about our appearance.  But while I chose to try to add as many inches as possible to my height, I didn’t have much of a choice about how many inches I could go for.  (I gave all I could in physical therapy, but in the end, my ticked-off muscles stiffened and decided the limit for me.)  Nor did I have much of a choice about my anterior tibialis tendons severing on both legs, which now makes me stumble on average every few weeks and makes dismounting from a bicycle dangerous.  (After two surgeries to repair the tendons and three years of physical therapy, they remain weak.)  Nor have I ever had any choice about my hips swaying when I walk because the ball-and-socket hip joint in achondroplastic people is shaped like an egg-and-socket.  Skinny friends with hypoglycemia, heavy friends with slow metabolism, and friends with diastrophic dwarfism—whose growth plates do not respond to limb-lengthening—can also attest that any choices we make about our bodies are always limited.  Discussing these choices is important, but strangers’ assumptions about them are usually way, way off.

It is because I know so many kind, loving people who analyze strangers’ bodies that I wasn’t at all surprised by the nasty ruminations over her “puffy” appearance that Ashley Judd so awesomely bucked in Newsweek earlier this year.  And I’m only half-surprised by the website Too Big For Stroller, where people post street photos of children who appear to have outgrown the transport and smirk about what idiotic parents they must have.  In his essay, “Broken Phantoms,” Robert Rummel-Hudson writes beautifully, harrowingly about the unfair judgment strangers often heap on individuals with rare disabilities whose symptoms are less visible.  He went after the Too Big For Stroller crowd and summarized their defense arguments thusly: 

However many kids with invisible disabilities might be made fun of or hurt by that site, they are acceptable collateral damage, because some of them are probably lazy kids with weak parents, and they must be judged.

“Acceptable collateral damage” is the phrase I’ve been searching for my whole life.  It’s how Jason Webley downplayed the rights of “the few conjoined twins in the world” in light of his Evelyn Evelyn project.  It’s how so many minorities are dismissed as annoyances in our majority-rules society by the vacuous, relativist claim, “Everyone’s going to be offended by something.”  Which is another way of saying, “We can’t consider everyone’s rights.” 

All of us make automatic, silent assumptions about others’ bodies, often trying to figure out how we ourselves measure up, because we are all insecure about our bodies to some degree.  But the ubiquity of these thought patterns and the rate at which they are voiced is the problem, not the excuse.  There’s probably a list of catty things I’ve said the length of a toilet roll, but I try to stop myself from diagnosing strangers’ bodies, if anything out of awareness of my own vulnerability to inaccurate assumptions.  A few years spent in and out of hospitals also taught me to ask, what the hell do I know about where they’re coming from?  And we all think enough unproductive thoughts about others’ physical appearance as it is.  In an essay about me and my scars, Arthur W. Frank writes that when we see someone who looks either unattractive or pitiful to us, our first thought is, “I’m glad that’s not me.”  And our second thought is, “But if it were me, I’d get that fixed.”

This is, of course, more than anything a hope.  We hope we would be different in the same situation.  But we’re afraid we may not be, and this fear causes us to quickly deflect the problem onto someone else.  Why not the person who just upset our delusions of normalcy?  So we and our supposedly meritocratic society nurture this idea—“I wouldn’t be like that”—as a justification for being judgmental.  Whether or not we voice these assumptions is indeed a choice we make, and whether or not we add any hint of judgment is yet another.  Whether or not this is fair is often debated on a case-by-case basis, but anytime anyone insults someone else’s body, it is a demonstration of their own insecurities.  Period. 

We’re all constantly judging one another and judging ourselves in comparison to one another.  This can be fair game when we stick to focusing on the mundane decisions we all make.  There is a world of difference between quipping about fashion choices with head-shaking amusement—Sorry, Eddie Izzard, but sometimes you do not know how to put on makeup—and allowing our personal insecurities to fuel pity or disdain for others’ apparent physical imperfections.  There is no fair way to trash someone else’s body because, for the most part, your own biology is neither your fault nor your achievement.

In Comedy, It’s All About Deciding Who’s Us & Who’s Them

28 Apr

(Via)

The Guardian’s stylebook contains the greatest commentary on style I’ve ever seen in print:

political correctness: a term to be avoided on the grounds that it is, in Polly Toynbee’s words, “an empty right-wing smear designed only to elevate its user.”

Around the same time, while researching the back stories of Life’s Too Short for my review, I came upon the controversy over the word “mong” in which Ricky Gervais found himself embroiled this past fall.  Apparently “mong” is a British English insult derived from “Mongoloid,” the very antiquated and now unacceptable term once used to describe people with Down’s Syndrome.  Both Americans and Brits have probably heard “retard” used the same way.  Gervais eventually apologized to those who objected—including the mother of a child with Down’s Syndrome who has frequently endured the insult—but not without first dragging his heels screaming at what he called “the humorless PC brigade.” 

I will never get over how many comedians insist that any criticism of their work is an indictment of all comedy; as if there’s no such thing as an unfunny comedian, only stupid audiences.  This logic sets the bar for comedy so low that no comedian need ever try to be original.  Ignoring the “PC brigade” (i.e., anyone who doesn’t live with the privileges they do), they can simply regenerate old stereotypes, mining the minstrel shows, the frat houses and the school yards, and if no one laughs at this, it’s simply because we’re all too uptight, right?  Wrong.  We don’t refrain from laughing because we feel we shouldn’t.  We refrain because, unlike the repressed who giggle away in awe, we’ve heard it a thousand times before and we know it’s far from unique.  And isn’t unique what every comedian, entertainer and artist strives to be?   

Like politics, comedy can be divided into two categories: that which confronts our problems with our own selves, and that which confronts our problems with others.  Xenophobia literally means the (irrational*) fear of strangers and the second type of comedy relies upon this fear.  There has to be a “them” for “us” to laugh at.  So Republicans laugh at Democrats.  Hippies laugh at yuppies.  Academics laugh at hippies.  Progressives laugh at bigots.  It’s fair game when beliefs are targeted because we must always take responsibility for our beliefs.  However, when the joke defines “them” as those who have had no choice whatsoever about their distinguishing quality—ethnicity, gender identity, sexuality, physical traits, mental or physical capabilities, or class background—and who continue to be disenfranchised by society’s delusions of normalcy, the joke had better target those delusions to be in any way original.  Otherwise, why pay for cable or tickets to hear someone lazily reiterate the guffaws of playground bullies? 

Every good comedian, from Stephen Colbert to Eddie Izzard to Christian Lander to the writers at The Onion, knows that the best jokes mock people’s hang-ups and clumsy reactions to minority issues, not the mere existence of minorities. My beloved Flight of the Conchords frequently flip gender roles and ethnic stereotypes, exposing the absurdity of racism and misogyny.  As the following video demonstrates, 1970s machismo has been begging to be made fun of.  However, when it comes to physical Otherness, it is the body—not fearful attitudes toward it—that they choose to snicker over, 54 seconds into the video:

 

 

Hermaphrodite?  Really?  An intersex kid’s medical reality is your toy?  C’mon, Conchords.  You’ve proven you’re great at making fun of white Kiwis tripping over Maori culture.  (“Jemaine, you’re part Maori…  Please be the Maori!  If you don’t do it, we’re gonna have to get Mexicans!”)  Surely you could come up with some good bit about hipster comedians clinging to lookist and ableist jokes like teddy bears and throwing temper tantrums when they’re taken away.  Or take a tip from Mitchell & Webb and take a jab at the way the ableism of reality TV masquerades as sensitivity:

 

 

Of course comedians have the right to make jokes objectifying minorities.  But I’m more interested in why they feel the need to, why they choose to objectify some people and not others.  Being gay, disabled, trans, intersex or non-white is not inherently hilarious to anyone who doesn’t live their lives sheltered from anyone unlike them.  The American freak shows of P.T. Barnum and the racist British sitcoms of the 1970s signify not just how profoundly disenfranchised minorities were in these countries, but how absurdly provincial audiences must have been in order to be so easily titillated.  Many comedians who reiterate chauvinist jokes argue that in doing so they are pushing the boundaries, expanding freedom of thought in defiance of PC oppression, when in fact they are merely retreating to well-trod ground, relying on ideas that challenge nothing but the very young idea that minorities deserve to be included in the dialogue as speakers, not objects.  As Bill Bryson has pointed out, the backlash against “political correctness” took place the moment the idea was introduced and has always been far more hysterical than what it protests.   

Toni Morrison has said, “What I really think the political correctness debate is really about is the power to be able to define.  The definers want the power to name.  And the defined are taking that power away from them.”  Revealing that it is all about power explains why emotions run so high whenever minorities get upset by certain jokes and comedians get upset about their being upset.  But this redistribution of power can be productive.  Taking old slurs and xenophobic tropes away from today’s politicians and comedians challenges them to think beyond their own experience and to wean themselves off society’s long-held fears, to redefine “them” as those enslaved by the limits of their imagination; in essence, to really push the boundaries.  Yet too often they default to the tired claim that this challenge infringes on their right to free speech. 

Some progressive critics do bring on the censorship accusation by using the ineffective phrase “You can’t say that!” and sometimes this is indeed an open attempt at censorship because most media outlets self-censor.  For example, Little People of America has called for the Federal Communications Commission to add “midget” to its list of words you can’t say on television.  I understand the temptation to insist upon the same treatment afforded other minorities: If certain ethnic and gender slurs are banned by newspapers and TV networks, why not others?  But this tactic too easily insults those other minorities—are you claiming black people have it easier than you?—and creates the concept of a forbidden fruit that will only tantalize right-wing politicians and shock jock comedians.  Simplifying the issue into Good Words/Bad Words can be a waste of an opportunity.  Instead of limiting itself to which words are always unacceptable regardless of context or nuance, the dialogue should always aim to reveal which minority jokes truly blow people’s minds and which lazily replicate institutionalized chauvinism. 

Instead of splitting hairs over the modern meaning of the word “mong,” I’d love it if a comedian took on the fact that Dr. Down came up with the term “Mongoloid” because he thought patients with the diagnosis resembled East Asians.  Because really.  Who’s asking to be made fun of here?

 

 

* “Phobia” always indicates an irrational fear, hence arachnophobia, agoraphobia, claustrophobia, homophobia, etc.  Fears that are well-founded are not phobias.

Female Privilege

14 Apr

(Rates of violence worldwide, used under CC license)

 
 
Recently at Feministing, Cara Hoffman wrote about violence that targets men, setting off an angry debate. Most commenters rightly supported the idea of feminism openly discussing the ways in which men are specifically victimized, but some said this had no place in the movement. Such a women-only approach to feminism is itself a form of sexism, one that, like male chauvinism, will never succeed as long as it is determined to concern itself with only one half of the population. The hero and heroine gender tradition oppresses men, women and those who identify as neither. As women, we should never be so insecure as to ignore anyone’s true disenfranchisement or to deny the privileges patriarchy automatically bestows upon us.

Yes, being female comes with certain privileges under patriarchy. (And no, I don’t mean Phyllis Schlafly’s you-get-your-restaurant-meals-paid-for-so-be-happy-staying-out-of-the-workforce sort of “privilege.”) Privilege is granted by society to certain people based on things we had absolutely nothing to do with: our gender identity, our ethnicity, our sexuality, our physical traits, our mental capabilities, our class background. That is why any privilege—like any form of disenfranchisement—is the essence of injustice. 

Men face oppressive double standards in dating and the family unit that I will address in a later post, but, in the wake of the arrest of Trayvon Martin’s killer, I want to focus for now on prejudices against men that are truly life-threatening. To begin at the personal level: my husband has been beaten up twice by strangers. My brother and several guy friends have been attacked outside clubs by strangers. Others were shoved down the stairs and slammed against lockers in school by bullies. I’ve never once been challenged to fight as they have, just as they have never experienced sexual harassment as I have. Of course far too many women are beaten by both men and other women, just as far too many men are sexually assaulted by both women and other men, but my personal experience and my husband’s are representative of the increased risk each of us faces for certain kinds of attack in our society. There’s no need to try to decide which is worse: the threat of sexual assault or the threat of coming to blows. Both can end in the worst possible way, and both are always inexcusable. Both target people based on their apparent gender.

As a woman, I am far less likely to be challenged to fight or to be suspected of violence by authorities. As a woman, I am automatically more trusted to be around children. As a woman living in the United States and Europe, I have never been asked to die for my country.  As a woman, I can express more affection to a member of my gender without fear of gay bashing than a man can. As a woman, I can buy products of any color without fear of gay bashing. As a woman who’s not physically strong, I don’t have to worry as much as a man does about being picked on by bullies looking for an easy target. As an achondroplastic woman, I’ve always been less likely to be confronted by an assailant looking to engage in dwarf tossing than an achondroplastic man is. As a woman, I am permitted to choose emotional fulfillment over professional success without being considered a failure. This is why homeless women attract less contempt than homeless men. And part of why men are three times more likely to commit suicide than women. 

In a previous post discussing female sexuality, I quoted Chloe S. Angyal’s point that traditional gender roles make sexuality a no-win situation for women: any type of behavior we choose can be seen as an invitation to sexual assault. The same Catch-22 applies to men regarding violence. Looking tough? You’re a threat that needs to be knocked down. Looking vulnerable? You’re the perfect victim to pounce on. If you are identifiable as a minority through your appearance or behaviors, you’d better make sure you avoid the wrong parts of town, which, in some cases, may include your entire home town or country. Or shoot first.

Like the virgin/whore cycle with which women are encumbered, men are confronted with the brute/wuss standard from the earliest of ages. You’re a monster if you use your fists to solve your problems, but you’re a sissy if you can’t. Non-violent young men must endure society’s suspicion that they are prone to violence while at the same time enduring their own vulnerability as victims of violence. The reality of violence against women can never be denied or downplayed, but neither can violence against men, who are 2 to 4 times more likely to be killed by violence than women. Because of the pressures of the traditional model of masculinity, men are far less likely than women to seek help after being threatened or assaulted.

Most violence enacted upon boys and men is by other boys and men, which shows that, as with violence against women, the solution is not to condemn a gender but to condemn an attitude. Googling “female privilege” turns up some very creepy websites wherein men rage about women who wouldn’t sleep with them after they held the door open; patriarchy relies on this polarization of the genders for its survival.  Despite what so many of those misogynistic websites claim, women who identify as feminists demonstrate less hostility toward men than women who embrace traditional gender roles, because we know that those traditions screw everyone over, including men.  That’s why we unite with men against those traditions, taking them apart bit by bit, non-violently.

  

 

In Activism, The Medium Is The Message

6 Apr

 

An acquaintance recently referred to me in a discussion about limb-lengthening on a Tumblr page.  Having heard about my medical experiences from mutual friends, he insinuated that I may have been forced into it, reported that the procedure is used to make people with dwarfism “look normal,” and dismissed it as therefore morally wrong.

Around the same time that week, The New York Times featured a discussion regarding whether the Internet’s contributions to political discourse are always productive under the headline, “Fighting War Crimes, Without Leaving the Couch?”  The Internet itself is so multi-faceted it undoubtedly does as much good as harm.  Like all media, it has both cerebral and shallow corners.  And, as the Times piece reveals, there is a fine line between slacktivism and activism.  But the recent trend toward microblogging—Tweets, Facebook status updates, Tumblr—for political discussions is rife with problems.  For every productive comments thread I’ve read, there are conversations that never evolve beyond slogans, sneering, choir-preaching, or kneejerk reactions with most information based on hearsay.  Every single piece of information cited in the Tumblr discussion on limb-lengthening contained at least one factual error.  (More here on the fact that it was posted in the context of sick fascination rather than bio-ethics.)  That microblogging brings those who don’t have the time or energy to compose an entire blog post or article into the discussion is hardly a compelling argument, since it quickly extends to Those Who Don’t Have the Time to Research Or Think Much About the Issues. 

I’m quite used to having my story cited in debates because of the exposure I’ve allowed it.  I love debate like other people love video games, and limb-lengthening is a contentious issue.  (Just ask my friend who witnessed a stranger with dwarfism approach his mother and demand, “How could you ruin your child’s life like this?!”)  Setting aside the sweeping nature of his assertion, I consider this friend of a friend’s kneejerk opposition to cosmetic surgery preferable to, say, the handful of journalists who have interviewed me and chosen to portray limb-lengthening as a painless miracle cure for anyone unhappy with their size.  But reading his hasty dismissal of my seven-year-long experience based only on what our mutual friends had told him brought back memories of all the people I’ve observed summarizing deeply personal, overwhelmingly complicated decisions in 140 characters or less, both online and off:

“It’s been TWO months since she died.  He’s gotta move on.” 

“It was so selfish of her to get pregnant now with everything her husband’s going through.”   

“It’s absolutely horrible to abort a fetus that tests positive for a disability.  Who would do such a thing?!” 

“Only one girlfriend?  Well, then she’s not really gay.  She was just experimenting.” 

“It’s ultimately selfish to want a child with dwarfism.  You wouldn’t want to do that to a child.”

“No wonder she got mugged.  Any girl who goes hiking alone should know better.”

“It’s so stupid that women are supposed to be upset about not being able to have their own kids.  They could just adopt.” 

Assuming others’ motivations, knowing what’s best for everyone, passing on poorly researched information: too often gossip masquerades as political discourse, both in the media and at home.  We all feel compelled to have an opinion.  About everything.  The nobler root of this is the desire to take an active interest in everything.  But that nobleness dies the moment we can’t be bothered to consider anything beyond our gut reaction before spouting off; the moment a desire to improve the world devolves into the simple urge to mark everything we see with our own personal “GOOD” or “BAD” stamp.

Obviously, as a blogger I am constantly offering my opinions.  But I remain acutely conscious of my chosen medium, taking inspiration from Marshall McLuhan whose quote heads this post.  There is a difference between tabloids and broadsheets, between documentaries and reality TV, between a blog entry and a Tweet, and it’s not just big words: It’s the intellectual commitment required of the audience in order to consume.  True learning demands this commitment and risks upsetting our world view.  Voyeurism indulges our complacency and guarantees our prejudices will be cemented.     

Every blog post I put out is both a labor of love and a terrifying experience.  Every week I hear the imaginary voices of every individual who could in any way be implied in my arguments howling at me, “Who do you think you are?!”  The voices aren’t loud enough to scare me into silence.  But, combined with the inspiring examples set by my partner, my mom and dad, Ariel Meadow Stallings, Barack Obama and many others, they motivate my every edit of that girl in high school who was so well known for her righteous indignation that she was voted “Most Argumentative” in the yearbook.

That girl has made so many mistakes along the way.  I found out that posting your religious views online can earn you applause from strangers but cost you a friendship.  I’ve learned using the “I know someone who…” argument can offend or embarrass said person if you haven’t asked their permission, even when it’s intended as praise.  I’ve learned passion alone inspires your supporters but usually sounds like ranting to the unconvinced, especially on Facebook.  I’ve learned mass emails are not only passé outside the workplace but were never very popular to begin with.  (At least not among the recipients.)  I’ve learned to never read the comments section on YouTube unless I want to lose all my faith in humanity.

I intend to address all the reasons why I underwent limb-lengthening eventually, but at the moment I’m not sure yet if I can in anything less than the 13 pages I needed in Surgically Shaping Children.  I’m sorry to play Tantalus to those unable to shell out the cash for the book or find it at their library.  This undoubtedly limits the number of people I inform.  But, for now at least, I prefer to be held responsible for a few well-informed individuals rather than many misinformed ones.  And no matter how I end up condensing it, I know I won’t ever be able to fit seven years of limb-lengthening into one Tweet.    

 

 

Four Tiers of Fear

31 Mar

 

“How DARE you call me a racist!” 

We’ve all heard that one before, and it’s becoming ever more frequent with the debate over Trayvon Martin’s death.  Marriage equality opponents have been adopting the same tone over the past few years, claiming “homophobic” is now an insult.  In the video posted above, Jay Smooth makes an excellent argument for criticizing actions rather than people in order to spark more productive dialogue about racism, and this can be applied to any discussion about xenophobia.

But outrage at any charges of xenophobia is not only an issue of grammar.  This outrage usually relies on the assumption that “racist” or “homophobic” automatically denotes a Neo-Nazi level of vitriol.  (This is why it’s frequently accompanied by the protest, “Some of my best friends are black/gay/dwarfs!”)  The outrage silences any discussion about the more insidious forms of chauvinism, and this is the very discussion that needs to happen, because the most insidious forms are the most ubiquitous. 

Most people who harbor transphobic, racist, ableist, sexist, lookist, ethnocentric or homophobic views are not Neo-Nazis.  Most would never physically harm anyone, and as Jay Smooth demonstrates, most would never admit to being xenophobic.  My theory is that chauvinism appears in society today in four different forms:

***

1. Violence: Both organized and individual violence, though of course the more organized, the more terrifying.  (The Southern Poverty Law Center reports this month that hate groups are on the rise in the United States.)  A hate crime should not necessarily be punished more severely than any other case of assault or murder, but its designation is an essential counter-statement by society to the statement the violence was intended to make.  While the most horrific form of xenophobia, violence is also the least common.

2. Overt Animosity: Harassment and disrespect that falls short of violence.  It’s insulting someone to their face, knowingly using slurs, arguing in earnest against someone’s human rights.  It’s refusing to hire, date or talk to someone because they belong to a certain ethnic group, or because they do not belong to a certain ethnic group.  It’s parents disowning their children for being gay, trans or disabled.  It’s the guy I witnessed at the mall yesterday who tapped a Chinese woman on the shoulder, closed his eyes and babbled, “Ching-chong-chang!” before dashing off.  It’s the Yale Delta Kappa Epsilon fraternity’s pledge chant: “No means yes!  Yes means anal!”  It’s the New Orleans cop saying Trayvon Martin was a “thug and… deserved to die like one.”  Because the intention is either to provoke or to dismiss the victim, it’s extremely difficult to find a constructive counter-argument.  Beyond ignoring such provocations because they are beneath us, our only hope is to appeal to whatever capacity for empathy the offenders may have when they are not in a provocative mood.  Such cruelty always stems from profound personal insecurities.

3. Covert Animosity: Disrespect behind someone’s back.  This usually occurs when the speaker thinks they are surrounded by their “own kind,” and thus unlikely to offend anyone present with their slurs or jokes.  We’ve all heard at least one relative or coworker talk this way.  Often an environment encourages such disrespect and the peer pressure to join in is high.  Often someone will insult an entire minority privately but be utterly decent when meeting an individual from that minority.  A friend of mine once dismissed a boy band on TV as “a bunch of fags” just hours after he’d been raving to me about my awesome neighbor, whom he knew to be openly gay.  Sometimes this behavior is excused on the grounds that the speakers are from “a different generation,” an excuse I rarely accept since those with more progressive views can often be found in the same generation.

4. The Xenophobic Status Quo: The stereotypes and privilege that surround us.  Most of us harbor some of these prejudices without knowing it because we have been bombarded with them from birth.  It’s the invisibility of minorities in the media and the social segregation in public that causes us to stare when we see certain people.  It’s the jokes that rely on the assumption that all heterosexuals find gay sex, intersexuality or transsexuality at least a little gross.  Or the assumption that physical disabilities, mental disabilities and physical deformities are always tragic and sometimes morbidly fascinating.  It’s the virgin/whore standard to which Western women are still held, leading us to comment far more on the appropriateness of their clothes and promiscuity than on men’s.  It’s our collective misogyny, homophobia and transphobia that converge to make us wonder why a man would ever want to wear a dress, but not why a woman would want to wear jeans.  It’s the prevalence of chauvinist expressions in our language (e.g. “Congressman,” “flesh-colored”) and of chauvinist traditions in our books, films and legends (e.g. our god is a white male) that makes them difficult to avoid and easy to reiterate.  It’s our demanding that transgender people wait for the rest of us to “get used” to the idea of their transitioning instead of questioning our belief in the gender binary.  It’s our view of every person who belongs to a minority not as an individual but as an example representing that minority with every move they make.  It’s the assumption that a difference upsets normalcy instead of the concession that normalcy is a delusion.  The privileges bestowed by our society on some members at the exclusion of others, rewarding those who have done nothing but be born with characteristics considered “normal,” are perhaps the most insidious reinforcement of these prejudices.

***

There is a danger to placing too much emphasis on the differences between the four tiers—I never want to end up in a conversation where people’s actions are excused as being “only Tier 4 sexist”—because all four tiers feed off each other.  They don’t exist in a vacuum.  The non-violent ideas of covert animosity and the xenophobic status quo provide confrontational people with a means of choosing their victims.  Conversely, regularly seeing society’s long tradition of hate crimes and public humiliation both in our history books and in our everyday news is what leaves us all dangerously unsurprised by the less belligerent forms of disenfranchisement many of us help perpetuate. 

Yet it is important to distinguish between these manifestations of fear in order to avoid the assumption that only violence and overt animosity qualify as xenophobia.  That assumption lets millions of people off the hook.  You don’t have to belong to the Westboro Baptist Church in order to have homophobic views.  You don’t have to belong to the NPD or the BNP or the Georgia Militia in order to have racist views.  You don’t have to wait in a dark alley for a stranger in order to commit rape.  You don’t have to threaten someone in order to make them feel unwelcome.  Our society has been built on many xenophobic assumptions, making it very easy for all of us to pick some of them up along the way.  The fight for equality aims to make it more and more difficult, but it needs to be able to recognize its targets and use tactics suitable to each.

I make these distinctions in the hopes of facilitating the conversation on chauvinism.  Yet it should come as no surprise that chauvinism is difficult to discuss because, in the words of Jay Smooth, it’s a system that has been designed to insult and subjugate.  In other words, it’s hard to speak politely about the idea of being impolite. 

 

 

The Good, the Bad and the Boring of “Life’s Too Short”

21 Mar

 

Today Feministing.com features my review of HBO’s Life’s Too Short, the first sitcom I’ve ever seen starring someone with dwarfism.