Archive | Scars & Physical Ability

“Richard III Was Dwarf, Doctor Says”

10 Feb

KING RICHARD III (Image by Leo Reynolds used under CC license via)

 

From an article appearing over 20 years ago in The Seattle Times on August 23, 1991:

King Richard III was a dwarf, according to a medical diagnosis that has outraged defenders of the monarch.

“The combination of slow growth and short stature, preceded by a difficult breech birth… and intimations of physical weakness and sexual impotence… suggest idiopathic pituitary dwarfism,” Dr. Jacob Van der Werff ten Bosch said in an editorial published today in the medical journal Lancet.

Balderdash, say Richard’s partisans.  “Everyone knows Shakespeare’s Richard III, but not everyone knows the historical evidence,” said Jack Leslau, a biographer of the king. “There are various medical theories that all work on the assumption that he was some sort of monster with a physical deformity.”

The Lancet editorial was timed for the anniversary of Richard’s death in battle Aug. 22, 1485, at Bosworth Field – where, as Shakespeare had it, the monarch offered “my kingdom for a horse!”

Van der Werff ten Bosch, a former professor of medicine, says there is no reason to take offense. “As a doctor I would not think it’s ridiculing a king to call him a dwarf. It’s simply a medical diagnosis,” he said.

Since the excavation and analysis of the royal bones were announced this past Monday, the BBC now reports, “Richard III was portrayed by Shakespeare as having a hunched back and the skeleton has a striking curvature to its spine. This was caused by scoliosis, a condition which experts say in this case developed in adolescence. Rather than giving him a stoop, it would have made one shoulder higher than the other.”

So what Dr. Van der Werff ten Bosch said all those years ago was wrong.  At least half of it, anyway.

 

 

Props to The Observer for (Finally) Doing the Right Thing

20 Jan

A bit of controversy surrounding the transgender flag: San Francisco (2012)

A little background: A while ago a British journalist named Suzanne Moore, who specializes in women’s rights, made an offhand transphobic comment in an article about body image:  “We [women] are angry with ourselves for not being happier, not being loved properly and not having the ideal body shape – that of a Brazilian transsexual.”  There was an ensuing backlash from many in the trans community, especially on Twitter.  Her friend and fellow writer Julie Burchill penned a column in her defense titled, “Transsexuals Should Cut It Out,” which appeared last week in The Observer.  Without ever saying what exactly the trans activists in question had said to Moore that was so horrific, Burchill just called them names: “A bunch of dicks in chick’s clothing… bed-wetters in bad wigs… trannies…  They’re lucky I’m not calling them ‘shemales.’  Or shims.”

(Oh, really?  They’re lucky you don’t use the most dehumanizing terms you can think of?  Even though you just kind of did…  But I guess every member of every minority really should feel grateful to anyone who refrains from attacking their freak qualities with the worst slurs.  And in that case, thank you, Julie Burchill.  Thank you for not referring to people with dwarfism as midgets or Paralympic athletes as cripples.  I know the temptation is always there to vomit in disgust at people who are physically different and it takes a will of iron to keep the insults from dribbling out.  You are truly strong.  Anyone less magnanimous than you would mouth off.  You have shown yourself to be the paragon of generosity.  I for one am now going to get up every morning and feel grateful there are people like you saintly enough to walk down the street and not spit at those of us who truly belong in the circus.)

The Observer received a barrage of emails and commentary from horrified readers and promptly demonstrated that a small group of thoughtful citizens can indeed change the world when it pulled the column from its website.  The editors have issued this apology:

This clearly fell outside what we might consider reasonable. The piece should not have been published in that form. I don’t want the Observer to be conducting debates on those terms or with that language. It was offensive, needlessly. We made a misjudgment and we apologise for that.

A newspaper shouldn’t reject writing that merely argues against trans rights or any sort of human rights.  As awful as bigotry is, dialogue between opposing sides is the only way to change minds and spur progress.  But any publication looking to host productive debate should always be able to discriminate between substantive reasoning and a pointless list of pejoratives.  I would oppose printing Burchill’s piece not because her argument was chauvinistic, but because she failed to be civil and because she wasn’t even addressing the trans activists’ stance.  She was simply snarking about their bodies.  And I’ve said it once, and I’ll say it again: If you can’t make your point without trashing traits your opponent has no choice about—their gender identity, ethnicity, biology, sexuality, or class background—then your argument doesn’t have a leg to stand on.  At worst, it’s abuse, and doesn’t even belong in high school.  (Indeed, that’s what anti-bullying policies are all about.)  At best, it’s meaningless.  (Would anyone try to convince the world to depose Saddam Hussein by ranting about the ugliness of his moustache?)

Upon first discovering Burchill’s piece last week, I assumed the only reason the editors would publish such an uninhibited temper tantrum was that they’re a business and believe feuds sell papers.  It is a relief to see now that they do not want their readers thinking that’s the kind of business they’re running.

Unsurprisingly, The Telegraph and others have bellowed, “CENSORSHIP!” and—you can see it coming a mile away—“PC police!” and have joined up with Burchill in republishing her piece.  They apparently have no qualms about profiting from the attention a semi-famous writer’s bad manners will grab.  Which is why it is so important to commend The Observer.  A week ago, I was deeply depressed by their descent into yellow journalism.  Their current endeavors to wipe off the self-inflicted stains are better late than never.

 

(Via)

 

 

The Year In Review

30 Dec

Hidden Object (Image by Hans-Jörg Aleff used under CC license via)

 

When I launched Painting On Scars at the beginning of this year, I had loads to say and almost as much worry that few would be interested in issues of disability and physical difference.  As the year comes to a close, I look back and see that the posts about ableism and lookism have generally been the most popular, followed by my spring article about family planning, reproductive rights, and privacy.  This hasn’t been the only surprise.

Lots of people find this blog by googling “dwarf + woman + sex.”  I have no idea who these people are.  They may be fetishists, they may be researchers, they may be women with dwarfism.  Your guess is as good as mine.

Since March, Painting On Scars has been read in over 100 countries.  To the surprise of few, no one in China reads it.  To the surprise of many, at least one person in Saudi Arabia does.  So do people in St. Lucia, Jordan, and Benin.

Thanks to blogging, I’ve discovered there is a considerable online community committed to combating ableism with its own terms and tropes such as “supercrip” and “inspiration porn.”  I love such communities.  I also love bridging communities.  Because responses to my blog have shown me, perhaps more than anything else has, that I want to talk to everyone.  And I really don’t care what your label is.

I don’t care if you consider yourself Republican or Democrat or feminist or anti-feminist or religious or atheist or socialist or libertarian or apolitical or intellectual or anti-intellectual.  Well, okay, I do take it into consideration.  Somewhat.  But there is rarely consensus when we ask that everyone define these terms.  And none of them carries a guarantee against nasty personality traits like narcissism and defensiveness and aggression and cowardice.  Novelist Zadie Smith noted that we are told every day by the media and our culture that our political differences are the most important differences between us, but she will never be convinced of that.  When lefty comedian Jon Stewart was asked earlier this year if there’s anything he admires about right-wing hardliner Bill O’Reilly, he said, “This idea that disagreeing with somebody vehemently, even to the core of your principles, means you should not engage with them?  I have people in my own family that make this guy look like Castro and I love them.”

This is not to say that it’s all relative and I see no point to social justice or politics.  On the contrary, difference continues to be marginalized by the tyranny of the majority, as evidenced by the fact that the number one Google search term that has brought readers to my blog is “freaky people.”  And far too many kind people will more readily lash out at a person or group whose recognition demands they leave their comfort zone, rather than the forces that constructed and defined their comfort zone.  Well-intentioned friends and parents and bosses and classmates and leaders and partners and siblings and colleagues are capable of the vilest selfishness when they are scared of a power shift.  (As the Christian activists pictured above acknowledge.)  This is heart-breaking.  And it is not okay. 

But on the flipside, people are constantly smashing the prejudices I didn’t even know I had about them.  Every day friends and family and strangers demonstrate strengths that highlight all the mistakes I make, proving to me that politics are tremendously important but they will never be the most important element of a human being.   That may be a political idea in itself, but regardless of the divisions, most people on earth do seem to believe deep down inside that everybody matters.

And that’s what makes the struggle for social justice worth it.  If you are friendly and well-mannered and generous and honor your commitments and don’t let your self-doubt make you self-centered and try to listen as much as you talk and are honest about your problems without fishing for compliments and are big enough to apologize when you’ve screwed up, I respect you and admire you and am humbled by you.  I want to do the best I can because of you. 

 And since you’ve read this far, it’s more than likely you’re good at listening.  Thank you and happy new year!

 

 

Degenerates, Nazis, & the U.N.

16 Dec

(Via)

 

A reaction to last week’s post about the U.N. Convention on the Rights of People with Disabilities sparked a behind-the-scenes discussion about whether or not I should allow name-calling in the Painting On Scars comments section.  I like to engage with almost anyone who disagrees with me, but I know that online I tend to comment only on sites with strict no-drama policies, because discussions become pointless and boring really, really fast when there’s nothing but insults and exclamation points.  I ultimately decided that, for now, any rude behavior speaks for itself: Commenters can name-call all they want regarding people they dislike or say absolutely nothing, because in both cases they’re not going to change anyone’s mind.

That said, I will always tell any supporters if they adopt tactics I want to have nothing to do with.  And it’s important to call out invectives that are particularly malicious in a way some might not be aware of.  The comment in question last week referred to the U.N. as “a bunch of degenerates, throat cutters, and other trash.”  Using the word “degenerate” in a discussion about disability rights is exceptionally insensitive, if not mean-spirited.    

The first time I read the word out loud to a friend here in Germany, his eyes shot up and he said, “Be very careful with that word.  It immediately makes everyone think of the Nazis.”  And by “Nazis,” he meant the actual, goose-stepping, genocidal nationalists who tried as best they could to make sure disabled people either died off or were killed off.  Not “Nazis” in the Internet-temper-tantrum sense of “anyone I disagree with.”  The word also evokes the brownshirt term “degenerate art.”  Modern German sensitivity to the term is the result of looking honestly at the nation’s history of ableism.

Action T-4 was the first genocide program ordered by the Nazis, calling for the extermination* of those deemed by doctors to be “incurably sick.”  Between 200,000 and 300,000 disabled people were killed, though many were used for scientific experiments first.  *And by the way, I DETEST any use of the term “euthanasia” in this context.  “Euthanasia” literally means ending life to end pain, and for this reason I find it applicable where patient consent has been given or where pets are concerned.  But to imply that what the Nazis did to disabled citizens was anything other than murder is to dehumanize the victims.

The forced sterilization programs of disabled people in Nazi Germany, meanwhile, were modeled after American laws.  The very first forced sterilization law in the world was introduced in Indiana in 1907, and 30 states followed suit.  The Supreme Court upheld Virginia’s eugenics program in 1927 and it remained on the books until 1974.  Oliver Wendell Holmes summarized the Supreme Court’s decision thus:

It is better for all the world, if instead of waiting to execute degenerate offspring for crime or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind…  Three generations of imbeciles are enough.

The Nazi poster featured above focused instead on the expense: “It costs the German people 60,000 Reichsmarks to keep this genetic defective alive.  Fellow German, that is your money!”  After World War II, the Nuremberg Doctors’ Trial and the resulting Nuremberg Code discouraged ableist politicians from openly promoting eugenics on either side of the Atlantic.  But it wasn’t until 1981, the year I was born, that the disability rights movement in West Germany came into full swing and sought to combat ableism head-on. 

Almost every human rights movement is said to have a trigger moment when oppression went a step too far and the people fought back.  For the American Civil Rights movement, it was the death of Emmett Till.  For the gay rights movement, it was the Stonewall Uprising.  For the German disability rights movement, it was the Frankfurt Travel Ruling of 1980, brought about by a woman suing her travel agency for booking her in a Greek hotel where a group of Swedish disabled guests were also vacationing.  She claimed that having to see and hear disabled people had ruined her trip and the judge agreed with her.  Protests exploded across the country and the next year, which the U.N. had declared the Year of the Disabled, several West German disability rights groups organized and formed agendas.  They used the U.N. events to draw attention to the dire situation of disabled citizens in the country.

Two years later, the Green Party entered the Bundestag for the first time and was the first to voice support for disability rights as a human rights issue.  The Greens were born out of the 60s student movement in West Germany.  The movement was famous for protesting what most young activists across the Western world opposed at the time: the Vietnam War (and war in general), traditional gender roles, consumerism, pollution, etc.  But first and foremost, the West German 68ers were young people demanding the nation come to terms with its dark past, decrying that an overwhelming number of the nation’s leaders and officials were former Nazis.  Their commitment to human rights was inspired by an unfaltering awareness of how horrific things can get.  Their actions led to the passing of anti-discrimination laws and an amendment to the German Constitution in 1994, modeled after the Americans with Disabilities Act.

Another result of the students growing up and entering the government came in 1983, when conscientious objectors to the draft were no longer required to argue their motivations before a board for approval.  This made it far easier for young men to opt for a year of community service in lieu of military service.  By 1991, half of those drafted became conscientious objectors.  For over 30 years, scores of German 19-year-old boys worked with mentally ill children at the Red Cross, in nursing homes, as assistants for physically and mentally disabled teenagers, and for Meals on Wheels.  This has created generations of men who often speak fondly of the experience and who are usually less fazed by disabilities or dependence, demonstrating a tolerance and openness that seems extraordinary for their age.

The draft was discontinued last year and since then the community service option has been suspended.  Military debates aside, I agree with conservative politicians who have called for preserving the community service requirement and expanding it to women because it is an excellent government tool for combating both ableism and social segregation on a personal level.  Ableism is still a tremendous problem here in Germany, but in three generations, the country has changed from one of the most ableist societies on earth to one of the least.   The word “degenerate” signifies humanity’s capacity for cruelty and sensitivity to the word signifies our commitment to never repeat it.

To be fair, the word in last week’s comment was not aimed directly at disabled people but at the U.N. members working for disability rights.  And frankly, I’m a little insulted.  Because if anyone’s a degenerate here, it’s me. 

I am scientifically a mutant by virtue of my fibroblast growth factor receptor 3 gene.  (Yes, yes, my genetics professor explained that technically all of us are mutants, but mostly just in boring ways… )  I am a semi-invertebrate now that pieces of my backbone were removed six weeks ago.  And I don’t take the last empty seat on the subway and request my friends slow down to my pace when walking for nothing.  So if anyone’s gonna go calling the organization that sprang from the Nuremberg Trials and founded the Universal Declaration of Human Rights a bunch of degenerates, they gotta get through me first.  I’m a degenerate living in Germany and proud of it.

 

 

Universal Disability Rights – Remind Me Again Why We Don’t Care?

9 Dec

 

Well, I was going to write about how conservatives are sometimes more open to discussing issues faced by disabled people than liberals are.  Then on Tuesday, all but eight Republican senators voted against the Convention on the Rights of Persons with Disabilities, making sure the United States distinguishes itself as one of the few nations on earth that will not commit to protecting disabled rights.  Appeals by the likes of the World Health Organization, the American Psychiatric Association, and senior Republicans (and disabled veterans) John McCain and Bob Dole were to no avail.  So I’m not in the mood to write any sort of tribute to conservative ideals this week.

Supporters of ratification like Dole and John Kerry argued that the United States would be leading the world, since much of the Convention was modeled after the Americans with Disabilities Act of 1990.  Opponents argued that this is exactly why ratification is of little importance.  We already have the ADA and we don’t like the UN, so who cares?  But by refusing to ratify the Convention, the United States is undermining its authority, ultimately saying, “Too bad!” to the disabled citizens of other countries that will also abstain, where ableism is sometimes deadly.  (Do we need to talk about the thousands of medical conditions that are still thought to be works of the devil or punishment by God in far too many cultures?)  But this is not just a matter of the United States choosing whether or not to officially lead the world.  When it comes to human rights at home, complacency can be devastating.

In many respects, the U.S. is not coming out on top.  According to an OECD 2009 study of 21 developed countries cited by the World Bank and WHO last year, disabled people of working age are more likely to live below the poverty line than non-disabled people in every country but Norway, Sweden, and Slovakia.  This likelihood is highest in the United States, Australia, Ireland, and Korea, and lowest in the Netherlands, Iceland, and Mexico.  According to WHO, the discrepancy between the employment rates of disabled and non-disabled citizens is twice as high in the United States (35 percentage points) as in Germany (18 percentage points).  And in the U.S., the risk of violence against people with disabilities is four to ten times higher than against people without disabilities.

I will never officially endorse a candidate or a party on this blog.  Despite obvious political trends at the macrocosmic level, personal experience has shown me that people of all political stripes believe in universal human rights and I never wish to alienate anyone over issues not directly related to equality.  But shame on every single senator who blocked the Convention.  No one has ever protected human rights on an international scale through isolationist policies.  In a world where people with dwarfism still have little hope of employment outside the circus, people with albinism are persecuted, surgeries are performed without consent, and a diagnosis of mental illness is thrust upon LGBT people and denied to people with clinical depression, international cooperation is crucial.  Otherwise, human rights disintegrates back into its inconsistent old self and becomes nothing more than a matter of privilege.

 

 

Lessons Learned From A Laminectomy

2 Dec

Sippy Cup Forgotten

(Image by Randy Robertson used under CC license via)

 

Five weeks ago I had a spinal surgery to relieve compression brought on by my achondroplasia.  I took a break from blogging because, first of all, I’ve only recently been allowed to sit for longer than an hour or two, and secondly, major life interruptions are almost always best discussed in hindsight.  (Even though the personal usually ends up being political, this blog is not and never will be a tell-all of how high my temperature is or how my incision looks today.)

I will confess that the hardest aspect was the lack of community.  No one at home or in the hospital had the same condition I did.  Since several of my readers have achondroplasia or children with achondroplasia, and I myself was ravenous for any sort of information I could get my hands on, here’s a synopsis of the past five months:

One night in July, I noticed I couldn’t sleep on my stomach without the muscles in my left thigh and hip burning with pain.  I took some ibuprofen and applied a hot pack, but to no avail.  Within a few days, the burning sensation expanded up into my lower back and deep in my backside.  It came whenever I lay on my stomach, lay on my back, or walked more than a few yards.  Strangely, it disappeared when I was sitting up straight.  I had to sleep propped up on pillows to keep the pain at bay and woke up during the night whenever I curled into a new position.  I described it as sciatica – which is, apparently, just a name for a set of symptoms and has various causes.  Maybe sleeping for five nights straight on a friend’s uncomfortable couch had done it?  My doctor gave me a prescription for physical therapy and stronger painkillers, but the medication had no effect and, after three weeks of physical therapy, the symptoms only got worse.

By the time I met with an orthopedist, the burning began to be replaced with a pins-and-needles sensation that ran all up and down my left leg and worsened with walking.  Once again, it disappeared whenever I flexed my hips.  While the therapists tossed out the usual suspects for usual patients—disc herniation or degeneration, etc.—my family and I had begun to suspect achondroplastic lumbar spinal stenosis.  People with achondroplasia are at high risk for this because our spinal columns are exceptionally narrow and become acutely so with age.  The symptoms described in the medical literature on achondroplasia exactly matched mine.  Between one-quarter and one-third of all people with achondroplasia develop stenosis, usually in their 20s or 30s, and I was a perfect candidate.  Average-sized patients with stenosis are usually encouraged to turn to surgery only as a last resort, but achondroplastic patients almost always require a laminectomy.  And, according to most specialists I’ve spoken with, the sooner the better.  

I hate having surgery.  Talking with the anesthesiologist about all the medications I’m allergic to brought back all sorts of unpleasant memories.  But I eventually got in contact with an excellent team of neurosurgeons who were very informed and reassuringly confident that a laminectomy (without spinal fusion) would be the best defense against permanent paralysis.  And with my 13th operation now behind me, I know several things I didn’t before.

I learned that, unlike orthopedists, neurosurgeons cannot tell you at what time your surgery will take place until the day of, if at all, because emergency cases such as strokes and spinal cord injuries take priority.  Your surgery could be postponed by such cases more than once, as mine was.  It is surreal to find out you just spent a whole day without food or water for nothing, while also finding out the people who knocked you to the back of the line are probably fighting for their lives.

I learned that, contrary to what I had assumed, you wake up after back surgery lying on your back.  I was especially grateful for this after my partner pointed out that I had a black-and-blue mark on my cheek from lying on my face for the two-and-a-half-hour procedure.

I learned that the day of surgery is one of the easiest.  Waking up in the recovery room and discovering I could cope with the pain and seeing myself wiggle my feet sent waves of relief everywhere.  Seeing my husband waiting for me in my hospital room was thrilling.  And the drugs took care of the rest.

After that, however, each day threw a new curveball, whether it was the pain of moving, the vomiting that came after moving (typical for spinal patients), or the dilemma of never wanting to go to the bathroom because it destroyed whatever comfort I had finally found.  Unlike the patients whose stenosis had been caused by disc herniation, I could not walk without a walker after surgery and managed no more than baby-steps.  As with limb-lengthening, I learned to take it week by week in order to see that progress was happening, however slowly.  By the third week, the worst pain was gone and I could walk short distances without any assistance.  (After five weeks, I can now manage a few blocks, though it takes me twice as long as it used to and my balance remains fragile, so I like to avoid crowds.)

I learned that after spinal surgery, walking and lying down are good for you.  Sitting and standing are bad for you.  I can’t remember the last time I watched so many films in such a short time.

I learned that sippy cups are perfect for drinking when you have to lie flat on your back.  They make you look ridiculous/adorable.

I learned nurses are among the hardest working, strongest, most fearless people in the world.  No one whose work is free of analyzing other people’s vomit and urine can say otherwise. 

I learned (once again) that there is always someone at the hospital about to go through something a lot worse than what you’ve endured.  Hospitals have a bizarre way of inundating you with more self-pity than you’ve ever felt before and, at the same time, more sympathy for others than you’ve ever known before.

I learned that as an adult I could see how much skill and patience goes into being a great caregiver.  When you’re a child, you expect—and should be able to expect—your parents and relatives to provide unconditional support and tolerance for your needs and your bad moods.  When you’re an adult, you’re more likely surrounded by friends and partners: people who, of their own free will, choose to check in on you and listen to you and soothe you for three hours straight and accompany you to the doctor and run errands for you and reach things you can’t.  You begin to understand the sacrifices your family made and those your true friends are making.  Just because you don’t deserve the raw deal you’ve been given doesn’t mean you deserve to take their patience or attention for granted.  No matter how bad you think you have it, always, always say thank you to whoever is being kind to you.  (And take a break from whoever isn’t.)

So now I have a new scar and hopefully I’ve helped flood the web so that googlers can find information about “achondroplasia spinal stenosis” more easily.  In my experience, seeing what you’ve learned, what you’ve been humbled by, is the whole point of having scars.

 

 

Happy Halloween

24 Oct

As of tomorrow, I have to go on medical leave and take a break from blogging for hopefully just a short while.  So, in the spirit of the season, I’ll leave you with a re-run of my old post, “Curiosity Kills the Rat.”  Happy Halloween, and I’ll be back soon!

CURIOSITY KILLS THE RAT

“All the freaky people make the beauty of the world.”

— Michael Franti

Fourteen years ago, I made a trip to Hot Topic—that quintessential 90s chain store for all things goth—in search of some fishnet stockings for a friend.  It was my first visit to the store since I was back in a wheelchair for my third and final limb-lengthening procedure and the narrow aisles prevented me from venturing beyond the entrance.  My first time in a wheelchair, from ages 11 to 12, had been a completely humbling experience as I was forced to see how very inaccessible the world is for the non-ambulatory.  This time around I was battling the hot-cheeked self-consciousness that adolescence attaches to any signs of dependency. 

As I tried to look casual while flipping through black gloves, black stockings, and black dog collars, a guy approached me sporting crimson hair, eyebrow rings, an employee badge and a smile.  “This store is easily adjustable,” he grinned, and with that he began shoving aside the display cases and clothes racks—which were, like me, on wheels—clearing a path for me right through to the back and taking little notice of the other shoppers, some of whom took one to the shoulder.  It was one of those crushes that disappear as quickly as they develop but leave a lasting memory: my knight in shining jewelry.

Thanks to experiences like this, I have a special place in my heart for the acceptance of physical differences that can often be found in the subcultures of punks, hippies, and goths.  From the imagining of monsters to the examination of anything taboo, counter-culture is often unfazed by physical qualities that fall outside of mainstream beauty standards.  The first kid in my high school who chose not to stare at the external fixators on my arms but instead held the door for me had green and purple hair.  About a month after my trip to Hot Topic, I showed a death-metal-loving friend my right fixator (shown above) for the first time, with the six titanium pins protruding from open wounds in my thigh.  He grinned, “That is the ultimate piercing, man!”  He hardly could have come up with a more pleasing reaction.  That my wounds were cool instead of “icky” or “pitiful” was a refreshing attitude found almost exclusively outside mainstream culture.  This attitude more readily understands my belief that my scars are merit badges I earned, not deformities to erase. 

However, this tendency toward decency over discomfort is just one side of the alternative coin.  Every subculture has its strengths and its weaknesses, and for all the freaky heroes I’ve encountered, I’ve also met plenty whose celebration of difference devolves into a sick fascination with the grotesque.  “Weird for the sake of weird” is progressive when it asserts that weird is inescapable, that it is in fact as much a part of the natural order as any of our conventions, and when it serves as therapy for the marginalized.  But it is problematic when it involves self-proclaimed artists using others’ reality as their own personal toys.     

In a previous post, I referred to a friend of a friend including me in an Internet discussion about limb-lengthening.  His comments were in reaction to a photo of a leg wearing an Ilizarov fixator that had been posted on a Tumblr page focused on the wonders of the world.  There are countless sites like it, where photos of conjoined twins, heterochromatic eyes, intersexual bodies, and medical procedures are posted alongside images of animals, vampires, robots, cosplay, self-harm, manga and bad poetry.  I get it.  The world is “crazy” and it’s all art.  But if that’s not a freak show, what is?

Disabled people are no longer put behind glass or in the circus—at least not in the U.S., Canada or Western Europe—but many people still believe they reserve the right to stare, both in public and on the Internet.  Whether under the guise of promoting diversity or admiring triumph in the face of adversity, they suppress any realization they may have that no one likes being stared at.  Unless it’s on our terms.

I see endless art in my medical experiences and it can be so therapeutic.  During my first limb-lengthening procedure I also had braces on my teeth, leading my dad to observe, “She’s now 95% metal.”  Kinda cool.  During my third procedure, I had Botox injected into my hips twice to paralyze my muscles lest they resist the lengthening.  At the time, when I along with most people had no idea what it was, it was described to me as “basically the most deadly poison known to man.”  Whoa, hardcore.  When I happened upon photos of my anterior tibialis tendon graft surgery, I was enthralled: “I’m so red inside!”  And when a fellow patient recently alerted me to the fact that a high-end jeweler designed a bracelet strongly resembling the Ilizarov frame, I laughed my head off.  Almost all of us like looking at our bodies, and perhaps this is especially so for those of us who have had real scares over our health.  It’s a matter of facing our fears and owning it.  But no one likes the idea of others owning it.  This subtle but severe preference, this desire for dignity, determines the difference between human rights and property rights.

Two years ago, NPR featured a piece by Ben Mattlin, who is non-ambulatory and who said he used to be uncomfortable with the idea of Halloween and its objectification of the grotesque.  From my very first costume as a mouse to my most recent stint as the Wicked Witch of the West, my love of Halloween has never so much as flickered, but his point is worth discussing.  Costume play, Halloween, and any celebration of “weird” that is primarily attention-seeking inherently assume there is a “natural” basis to be disrupted.  (And all too often Halloween devolves into offensive imitations of all sorts of minority identities.)

I have my own collection of artsy photos stolen off the Internet that I use as screensavers and montages for parties, but they do not include photos of bodies taken outside the context of consensual artistic expression.  Re-appropriating a photo in a medical journal for a site about all things bizarre is protected under freedom of speech, but it can feel like disregard for consent.  And in any case, such xenocentrism will always be just as superficial as the status quo it seeks to disrupt.

When conjoined twins Abigail and Brittany Hensel agreed to be interviewed once—and only once—for a documentary about their lives (which I highly recommend), they explained that they don’t mind answering strangers’ questions at all.  (Ben Mattlin has said the same, as do I.)  What they hate more than anything is being photographed or filmed without permission.  While the girls were attending a baseball game outside their hometown, a sports film crew quickly directed its attention to them.  Even though the twins were already being filmed by their own documentary team, the invasive, presumptuous stare of the strangers’ camera ruined the day for them.

Sensitivity toward others’ experience with medicine and death should never kill the discussion.  These discussions are imperative and art is the most glorious way we relate to one another.  But just as there’s more to good manners than simply saying “Please,” there’s more to genuine learning and artistic expression than poking at anything we can get our hands on.  Nuance, deference and respect are prerequisites for anyone with artistic or scientific integrity not only because they are the building-blocks of common decency, but because history has shown that curiosity will more likely harm the rat than the cat.

 

 

Dragging Entertainment Into the 21st Century

21 Oct

(Via)

 

This week, humor site Cracked.com features a great article by J.F. Sargent titled “6 Insane Stereotypes That Movies Can’t Seem to Get Over.”  Alongside the insidious ways in which racism, sexism, and homophobia still manage to persevere in mainstream entertainment, Number Two on the list is “Anything (Even Death) Is Better Than Being Disabled”:

In movie universes, there’s two ways to get disabled: Either you get a sweet superpower out of it, like Daredevil, or it makes you absolutely miserable for the rest of your life. One of the most infamous examples is Million Dollar Baby, which ends with (spoilers) the protagonist becoming a quadriplegic and Clint Eastwood euthanizing her because, you know, what’s the point of living like that? Never mind the fact that millions of people do just that every day…

Showing someone using sheer willpower to overcome something is a great character arc, and Hollywood applies that to everything, from learning kung fu despite being an overweight panda to “beating” a real-world disability. The problem is, this arc has some tragic implications for the real-world people who come out with the message that they are “too weak” to overcome their disabilities.

The result is that moviegoers think that disabilities are way worse than they actually are, and filmmakers have to cater to that: For example, while filming an episode of Dollhouse where Eliza Dushku was blind, the producers brought in an actual blind woman to show the actress how to move and get around, but the result was that “she didn’t look blind,” and they had to make her act clumsier so the audience would buy it.

Even in Avatar, real paraplegics thought that Sam Worthington’s character was making way too much effort transferring from his chair, but that’s the way we’re used to seeing it in movies. It’s a vicious cycle, and it isn’t going to stop until either Hollywood wises up or people with disabilities stop living happy, fulfilling lives.

I’ve examined Hollywood’s ableist problems several times before and there are still plenty to dedicate an entire blog to.  But, like The Daily Show or The Onion, Cracked has a long history of excellent social critique embedded amongst the fart jokes and it’s awesome.  Especially when considering that not only mainstream but alternative entertainment all too often can’t seem to let go of the tired stereotypes.  That Cracked is a site not officially dedicated to politics or social activism suggests that the comics writing for it believe calling out the industry for its embarrassing ineptitude is just common sense.

 

 

   

What’s Privilege?

7 Oct

(Via)

 

This week I led a workshop about teaching pre-school children about diversity.  I started by asking the teachers what privilege is, and I got the same answer a family member had given just days before: “Privilege is what people who are really lucky have.  Like being born into a rich family, going to nice schools, or even just being exceptionally good-looking and therefore having an easier time of it.”

It is interesting that so many seem to be under the impression that privilege and luck are what extremely well-off people have.  Privilege belongs not only to anyone whose place in society is considered “better than normal,” but also to anyone whose place is considered simply “normal.”  As said before, privilege is granted by society to certain people based on things we had absolutely nothing to do with: our gender identity, our ethnicity, our sexuality, our physical traits, our mental capabilities, our class background.  That is why any privilege—like any form of disenfranchisement—is unjust.

In the workshop, I read off the following list of statements that illustrate privilege to the participants, who were lined up in a row.  (It’s a hodge-podge of original statements and ones taken from privilege activities created by Peggy McIntosh, Earlham College, and the Head Start Program.)  Anyone for whom the statement was true could step forward.  Anyone else had to stay behind.  All of us in the group stepped forward at least half the time.  You can see for yourself where you would have ended up:

1) I always felt safe in my neighborhood as a child.

2) If I wish to, I can be with people of my race/ethnicity most of the time.

3) I never have to plan how to reveal my sexual orientation or gender identity to friends, family, or colleagues.  It’s assumed.

4) I can go out in public without being stared at.

5) I participated in extracurricular activities as a child (swimming, football, ballet, piano, yoga, painting, etc.).

6) I can easily buy posters, picture books, dolls, toys and greeting cards featuring people of my race.

7) I can wear a skirt, a dress, jeans, or pants, without anyone staring or asking me to explain my choice.

8) In school, I could always take part in whatever activity or games the class was assigned.

9) None of my close friends or family has ever been arrested.

10) Rarely have I been asked to explain why my body looks the way it does or why I move or speak the way I do.

11) I have never worried that I might not be able to afford food.

12) When I learned about “civilization” in school, I was shown that people with my skin color made it what it is.

13) I have never heard of someone who looks like me being given up for adoption or aborted because of it.

14) Who I am attracted to is not considered a political issue.

15) I attended a private school.

16) I am never asked to speak for everyone in my ethnic group.

17) I can find colleges that have many people from my class background as students.

18) I can criticize our government without being seen as an outsider.

19) My family never had to move for financial reasons.

20) If I am assertive, it is never assumed that it comes from my need to “compensate” or struggle with my identity.

21) When I was a child, I never had to help my parents at their workplace regularly.

22) When I talk about my sexuality (such as joking or talking about relationships), I will not be accused of “pushing” my sexuality on others.

23) If I make a mistake or get into trouble, I am usually judged as an individual, not as an example of people who look like me.

24) I can go for months without being called straight, heterosexual, or cis.

25) I can use public facilities (store shelves, desks, cars, buses, restrooms, and train or plane seats) or standard materials (books, scissors, computers, televisions) without needing help or adaptations.

26) When I dress for a formal event, I don’t worry about being accused of looking too dolled up or not pretty enough.

27) As a child, I never had to help care for a family member.

28) When I watch family advertisements for food, medicine, clothing, games and toys, the families on TV usually look like mine.

29) I grew up feeling I could be whoever or whatever I wanted.

30) I have never been asked, “What do [people like] you like to be called?”

 

 

Playing Disabled

30 Sep

Miracle Worker

(Image by cchauvet used under CC license via)

 

Snow White and the Huntsman is out on DVD in Europe tomorrow. Unlike in most other Snow White films, the seven dwarfs are portrayed by average-sized actors, their bodies altered by digital manipulation. No one in the dwarf community is pleased about this.  Little People of America issued a statement criticizing the filmmakers’ failure to give priority to performers with dwarfism, while Warwick Davis argued, “It is not acceptable to ‘black up’ as a white actor, so why should it be acceptable to ‘shrink’ an actor to play a dwarf?” 

I don’t believe digitally generated dwarfism is on par with blackface and all that evokes, but it’s not too far off because there is a long tradition in cinema and theater of socially privileged actors portraying socially marginalized characters. And never the other way around. Blackface is a particularly hideous blemish on the history of entertainment because it was almost always used for mockery. Yellowface has a similarly horrid history: Until 1948, anti-miscegenation laws in the U.S. banned actors of different ethnicities from kissing onscreen, so whenever a white actor portrayed an Asian leading man, Anna May Wong knew the role of the heroine was off limits to her, despite her being the most successful Chinese-American actress of the era. Meanwhile, as noted before, the circus freak show tradition that caricatures people with disabilities is still going strong today. 

To be fair, Snow White and the Huntsman does not create the illusion of dwarfism in order to mock it. This is why, to me, the blackface comparison seems overblown.  (A more apt analogy to blackface would be an actor inhaling helium to play a dwarf, as David Hyde Pierce did for laughs on an episode of Frasier years ago.) When a character matter-of-factly has a disability and the performer simulates their body type with artifice, is this not comparable to any sort of makeup or costumes? Danny Woodburn (whom you might know from Seinfeld) discussed it in an excellent interview on The Patt Morrison Show in June:

Directors, producers have every right to cast who they want to cast.  I just think this is something that merits discussion when the disability community—not just the little people community but the disability community—is so underrepresented in the film and television industry…

Others without disability portraying people with disability.  When producers, directors don’t actively seek performers with disability—[and they’d have to] because a lot of those performers don’t have equal access to casting, don’t have equal access to representation—when they don’t actively seek out those performers, then there’s a real slight against our society, I believe…

This is about making a stand so that there’s at least some due diligence… When you have a community of disabled that is about twenty percent of the population and less than one percent of disabled actors appear on TV. And some of the disabled characters, many of them are not portrayed by disabled actors.

Woodburn and Little People of America raised this issue ten years ago when Peter Jackson announced that he would cast only average-sized actors in The Lord of the Rings. As noted before, part of me was glad to see those magical creatures distanced from real-life people with skeletal dysplasias, but if Jackson had chosen to use dwarf performers to portray the Hobbits or the Dwarves, might someone like Woodburn be as famous as Elijah Wood is today? It’s hard to say. Famous actors create box office draw. Almost no famous actors are disabled and almost no disabled actors are famous. And that’s the problem.

If digital manipulation and theater makeup are someday used to expand roles to minority performers, allowing actors of any body type or ability to play the Huntsman or Prince Charming, the practice will lose its exclusionary feel.  I adored Snow White and the Seven Dwarfs growing up and, even though I was the only kid with dwarfism, I always portrayed the princess in the living room productions put on for my parents and their friends.  But cinema has almost never swung that way.  There is no history of ethnic minorities portraying famous white characters or disabled performers portraying physiotypical heroes and heroines.  Plenty of ambulatory men have sat in wheelchairs to portray FDR, but no disabled man has been cast as JFK.  And that stings a bit.

And what stings even more is the way in which privileged actors so often earn automatic praise for portraying minority characters in epic films, as if all minorities are opaque, mystical people only geniuses could begin to understand.  John Malkovich as a mentally disabled man in Of Mice and Men, Colin Firth as stammering King George VI, and Patty Duke, Melissa Gilbert and more recently Abigail Breslin as Helen Keller have all been lauded for their performances.  They are all fine actors who have proven a wide range of talent, and the stories they tell are truly moving.  But the public’s nearly kneejerk assumption that a minority role is a feat of greatness for a privileged actor can feel very condescending.

In the very bizarre, direct-to-DVD film Tiptoes, Gary Oldman took the role of the leading man with dwarfism, playing the part on his knees.  Peter Dinklage, who played the comedic supporting role (and, in my opinion, the only good moments in the film), said: “There was some flak. ‘Why would you put Gary Oldman on his knees? That’s almost like blackface.’ And I have my own opinions about political correctness, but I was just like, ‘It’s Gary Oldman. He can do whatever he wants.’ ”

Fair enough, but when he was sappily introduced in the trailer as playing “the role of a lifetime,” I almost lost my lunch.


 

Biology and “The Imprecision of Stereotypes”

16 Sep

 

This week the British newspaper The Telegraph asks:

Ever wondered why men can’t seem to tastefully decorate a house?  Or have a tendency for dressing in clothes that clash?  And why, for that matter, can’t women seem to hack it at computer games?  Now scientists claim to have discovered the reason: the sexes see differently.  Women are better able to tell fine differences between colors, but men are better at keeping an eye on rapidly moving objects, they say.

Professor Israel Abramov and colleagues at the City University of New York reached their conclusions after testing the sight of students and staff, all over 16, at two colleges…

The authors wrote: “Across most of the visible spectrum males require a slightly longer wavelength than do females in order to experience the same hue.”  So, a man would perceive a turquoise vase, for instance, as being a little more blue than a woman who was looking at it too.

Abramov, professor of cognition, admitted they currently had “no idea” about how sex influenced color perception.  However, writing in the journal Biology of Sex Differences, he said it seemed “reasonable to postulate” that differences in testosterone levels were responsible…

Men can’t perceive colors as deftly as women can.  That’s why all the great Western painters like Van Gogh and Cézanne and Leonardo and Picasso and Renoir and Monet and Munch and Vermeer and Kandinsky and Matisse are female.  And all the major fashion designers of the last century like Hugo Boss and Karl Lagerfeld and Gianni Versace and Giorgio Armani and Calvin Klein and Ralph Lauren were women.  Oh, wait. 

Maybe the study meant to say testosterone only triggers color ineptitude when male ears register the words “home decorating.”  Or that male color perception improves when money is involved. 

Or maybe The Telegraph author was exaggerating just a bit.  Tacking jazzy headlines onto reports of scientific studies is all the rage these days, no matter how much they distort the findings.  In June, Medical Daily ran an article under the title, “Racism Is Innate.”  Innate means, according to my biologist father, “present at birth,” so this seemed like a call to toss all those No child is born a racist buttons onto the trash heap.  Except that anyone who bothered to read the article would discover that the study simply concluded that brain scans of adults show simultaneous activity in the centers that process fear and emotion and those that differentiate between familiar and unfamiliar faces.  The idea that fear of the Other can be neurologically mapped lends itself to a great deal of speculation and debate, but nowhere did the study claim that racism is present at birth.

Such truth-stretching borders on mendacity, yet it pervades the science sections of so many newspapers.  Scientific studies are supposed to be free of bias, but the news media is severely biased toward publishing whatever will grab readers’ attention.  As several researchers have pointed out, differences between the sexes are currently considered a much more interesting discovery than no difference, so publishers often remain silent on an issue until they find a study that provides the juicier headline, no matter how numerous the contradicting studies are.  When the market is left to decide, it chooses salability over comprehensiveness.

Such an irresponsible approach to science results in a gravely misinformed public.  I can’t tell you how many people have repeated the claim that our modern Western female beauty standards are “natural” because a round waist resembles pregnancy and triggers the male fear of cuckoldry.  No one seems to remember that several cross-cultural studies discredited this idea years ago.  But how can anyone be expected to remember something the media chose not to promote in the first place?

And forget about waiting until the study is corroborated.  In 2007, The Times ran a headline claiming that women are naturally drawn to the color pink because of our savannah foremothers’ need to gather berries while the men hunted.  The Times published the study without consulting any historians, who eventually pointed out that pink was considered a manly color as recently as 1918, before fashion trends changed.  Oops.

This doesn’t mean that we should, as Mitt Romney has demanded, “keep science out of politics.”  Science is impartiality and corroboration and the best method we have for sorting facts from wishful thinking—for preventing our emotional, egotistical needs from weakening our objectivity.  To me, science is the most humbling force in the universe because it demands we always admit what we do not know.  It prevents hasty conclusions based on flimsy evidence, gut feelings, and political agendas.  It questions crude stereotypes and discovers more complex structures. 

But according to pop science reporters and the researchers they choose to spotlight, nearly every single modern joke about the differences between men and women stems from millennia-old evolutionary adaptations.  (Indeed, the Telegraph article claims that the female proclivity for detecting color helped our foremothers with gathering berries.  Always with the damn berries… )  As stated in the graphic below, such reports all too often suggest that prehistoric society on the African savannah looked just like something Don Draper or Phyllis Schlafly would have designed:

Men hunt, women nest, and every macho social pattern we see today has been passed down to us from our prehistoric ancestors.  Even though historians find that these patterns, like our racial categories, are barely more than two centuries old, if that.  And that the gender binary is far from universal.  Misinterpreting scientific findings is just as dire as ignoring them. 

When it comes to what women and men can and can’t do, neuroscientist Lise Eliot notes, “Expectations are crucial.”  When boys and young men grow up in a culture that mocks their supposed incompetence in all things domestic (“Guys don’t do that!”), it comes as no surprise that only the most self-confident will pursue any interest they have.  Meanwhile, studies show girls perform as well as boys do in math and science until they reach puberty.  Maybe the onset of menstruation paralyzes our visual-spatial intelligence because we’ve got to get picking those berries, or maybe girls pick up on the not-so-subtle message that guys think coquettish beauty is more important than nerdy brains in the dating game.  (For more details on the sexism faced by aspiring female scientists, see Cordelia Fine’s excellent book, Delusions of Gender.)  In her research, Dr. Eliot finds only two indisputable neurological differences between males and females:

1) Male brains are 8% to 11% larger than females’.

2) Female brains reach maturation earlier than male brains. 

All other neurological studies that find major differences between the sexes are studies of adults: i.e., the people most shaped by their culture and society.  Only cross-cultural studies of adults can isolate nurture from nature.  In any case, Eliot is a proponent of neuroplasticity, the idea that the pathways and synapses of the brain change depending upon its environment and the neural processes and behaviors it engages in.  In other words, painting or gaming from an early age or frequently throughout your life will condition your brain to do these tasks and related ones well.  It explains why the gender roles of a given time and place are so powerful: why mastering unfamiliar tasks is an uphill climb for men and women, but also why countries committed to equality have the narrowest gender gaps.

“Plasticity is the basis for all learning and the best hope for recovery after injury,” Eliot writes.  “Simply put, your brain is what you do with it.”  For more, see her brilliant parenting book, Pink Brain, Blue Brain: How Small Differences Grow into Troublesome Gaps—and What We Can Do About It.   

But I’ll never believe that a neuroscientist has all the answers.  I live in a country that showed the world the dangers of hastily trying to trace all social patterns back to biology.  As a result, the media here in Germany is usually much more reluctant to casually toss around arguments like those in The Telegraph or The Times or Medical Daily.  Natural scientists have made discoveries like neuroplasticity and limb-lengthening that are crucial to progress, but social scientists have discovered that equality and empathy are crucial to any society that values peace and respect over power and greed. 

Or, in other words.

 

 

In the U.S., Paralympic Athletes Might As Well Be “Untitled”

9 Sep

(Via)

 

The Paralympics end today after a week of what seemed to be decent coverage, though it depended on where you tried to watch them.  The host country allotted 150 hours of coverage to the Games, Australia clocked in at 100 hours, and Germany and France allotted 65 and 77 hours respectively.  Meanwhile, the United States broadcast a whopping five and a half hours and no live coverage at all, as per tradition.  Yay.

Considering how little attention was afforded the Games themselves, it is unsurprising that there was little dialogue stateside about disability rights and issues of equality.  What a missed opportunity.  The British media immersed itself in that dialogue, with articles like “Is it Ok To Call The Athletes Brave?”  Indeed, disrespectful attitudes toward people with disabilities today are more often implicitly patronizing than openly derisive, and it was heartening to see the public address this.

The Paralympic Guide to Reporting that was handed out to media outlets brought up several interesting points about language.  It rightfully asserts that disabling conditions or features should not be turned into personal nouns that define the entire person or people in question: i.e., the disabled, the blind, a paraplegic.  Adjectives and verbs—a paraplegic athlete, athletes with disabilities—are less limiting, portraying a medical condition as one of many characteristics a person has.  (This has been repeated to me ad infinitum by a friend who’s uncomfortable whenever I refer to myself as a dwarf.  “You are Emily.  You have dwarfism!” he insists.  “And you have hazel eyes and freckles and long hair…”)  Other terms and phrases to avoid noted by the guide include:

normal

able-bodied

wheelchair bound

confined to a wheelchair

suffers from

afflicted with

victim of

The last three are commonly used today.  They’re problematic because they imply that a disability is always regrettable.  Sometimes it is, and sometimes it isn’t.  Suffering may have been an apt term for my achondroplasia two months ago, when severe lumbar pain made it hard for me to think of anything else during a sightseeing trip in England.  But suffering has nothing to do with all the ways in which my condition has brought me into contact with all sorts of unique people and places and outlooks.  I can’t imagine my life without it.  It’s my version of normal.  Unless the person in question specifically says otherwise, any assumption that a disability is a round-the-clock tragedy is wrong.

For the sake of splitting hairs, I sometimes think the words disabled and disability are problematic because they automatically draw attention to what a person cannot do.  In the worst case, they can sound pitiful.  I’m very fond of the word typical in lieu of normal or able-bodied because it highlights that the standard by which we group people is based on a body type chosen by the scientific community.  It implies medical averages, not social values.  Typical is used in everyday speech to mean “usual” at best and “unexciting” at worst, unlike normal, which implies a state of correctness worth striving for, like in the phrase “back to normal.”  Discussions of autism and some other psychiatric conditions now commonly use the term neurotypical to refer to people without the diagnoses.  Maybe physiotypical could someday be the term for non-disabled people.

But as I’ve said a few times before, the search for acceptable terms is not about deciding what automatically classifies a speaker as Tolerant or Bigoted.  Words are only half as important as the intentions behind them, and the desire to understand another’s perspective is what separates an empathic person from a selfish one.  In the recent words of Professor Charles Negy, “Bigots… never question their prejudices.”  

The above list of do’s and don’ts is probably disconcerting to some readers.  I always feel simultaneously inspired and confused when given a list of hot-button words I’m told to avoid from now on.  Hell, I’ve written the word able-bodied before, and I’m someone excluded by it.  I find no problem with the word handicapped—I had handicapped housing rights in college and a handicapped parking sticker during my limb-lengthening procedures—but it’s considered offensively archaic in the U.K., apparently similar to invalid or cripple.  As we’ve seen in the midget vs. dwarf vs. LP debate, rarely is there ever a consensus in a given community over labels.  Labels are almost always problematic.  In my experience, the dialogue always matters more than the conclusion it comes to. 

And the inability of the U.S. media to have such dialogue during the Paralympics was pitiful.

 

 

Germany Rules on Male Circumcision

26 Aug

Justice (Image by Viewminder used under CC license via)

We’ve been waiting all summer for this decision.  On Thursday here in Berlin, the German Ethics Council ruled that male circumcision is legally permissible without a doctor’s order, but several conditions must be met:

    • Both parents must be in full agreement.
    • All possible risks to the procedure must be explained in full detail.
    • Local anesthetics must be an option.
    • The procedure must be certified by a medical professional.

Some of these requirements, especially the last two, go against what some fundamentalist religious leaders mandate.  Why all the fuss?  In Europe, where female genital cutting is illegal, male circumcision is only common in Muslim and Jewish communities.  Earlier this year, a German court in Cologne ruled that the circumcision of an underage male constitutes aggravated assault and battery, and the debate has been raging ever since.  It has split the nation into two camps: those who see the procedure as cosmetic at best and mutilating at worst, carried out on patients too young to give consent, and those who believe any ban on age-old rituals and tribal markers constitutes religious and/or ethnic persecution.  That the ritual German lawyers sought to ban is a Jewish custom makes it a particularly sensitive case here.

Westerners are generally horrified when we hear stories of female genital cutting in Africa.  But few in the United States understand that many Europeans gape at our 60% rate of male circumcision and consider it almost, if not quite, as cruel.  “How on earth could parents do that to their baby boy?!” is the reaction I get from the vast majority of Christian and non-denominational European males I talk to.  They are much more prone to believe studies citing the problems it can cause—for example, a supposedly higher rate of dyspareunia for women who have intercourse with circumcised men—than studies that downplay such fears.  I usually admit to them that, because it is so very common where I come from, I’d never given it much thought beyond those pop culture jokes about what looks better.

Which just goes to show how powerful cultural customs and values can be.  In debates over both female and male genital cutting, one side says we should protect parents’ right to choose what they think is best for their children without government interference, while the other says the government should protect children from procedures that offer no medical benefit until they are old enough to decide for themselves, regardless of what their parents want.

I’ve written before that as someone who’s undergone limb-lengthening, I know how complex decisions about body alteration can be.  Determining an appropriate age of consent for surgery can be even more complicated.  But also due to my experience, I wince along with Jessica Valenti when parents choose procedures for their children that offer no real medical benefit.  While discussing circumcision, my European friends argue that patients should reach the age of consent before undergoing any procedure that, unlike limb-lengthening, does not become more medically complicated with age.  Should courts ever rule this way, it would inevitably lead to bans on juvenile nose-jobs like the one Valenti cites.  But then what about ear-piercing? 

Years ago, I was a panelist at a conference called “Surgically Shaping Children” at the Hastings Center, a think tank for bio-ethics, where we addressed elective procedures such as limb-lengthening on dwarfs and determining a gender for intersex children.  After a two-day debate and a resulting book, we concluded that the best way to prevent parents from making decisions that could be damaging to their children is to keep both the parents and their children as informed as possible about every issue that’s at stake: medical facts, cultural identity, individual identity, and agency.  The German Ethics Council’s ruling also implies that such comprehensive understanding is necessary. 

I think a ban on circumcision would have created more cultural resentment than understanding.  But the scientific community, and society as a whole, should take the place of the legal system in helping parents understand all the complexities of altering a child’s body without a medical purpose.  There may be no easy answer, but the discussion has got to keep on going.

Fighting the Good Fight or Feeding The Ego?

19 Aug

Body Art Chameleon

“I know so many men and boys and trans individuals who wear dresses for so many different reasons, and they do it a lot more than mainstream movies, TV, and advertising suggest.” 

I felt my fingers tremble just a tiny bit as I typed this sentence last week.  Not because of the subject matter.  Not because of the point I was trying to make.  Because of the “I.”  Was that word going to drive home my point, or derail it?

Studies show personally knowing someone who belongs to a minority group increases the likelihood that you will have empathy for that minority.  If you have a family member who is gay, you’re less likely to oppose marriage equality.  If you know someone with dwarfism well, you’re less likely to see their medical diagnosis whenever you look at them.  GLAAD emphasized the political potential for all this in a brilliant meme last fall.  Urging LGBT individuals to talk openly about their partners and love lives at the dinner table with the same frequency as their straight family members, they called it, “I’m Letting Aunt Betty Feel Awkward This Thanksgiving.” 

Truly caring for someone with a different perspective often—though, sadly, not always—inspires us to try to understand that perspective, and doing so enhances our own.  Letting others know that They are not so different from Us because we know and care deeply about many of Them can effectively break down barriers.  And, when discussing social injustice, it’s always best to ask someone with personal experience, lest we unwittingly make erroneous assumptions.  But, of course, just having friends who belong to minority groups doesn’t solve everything. 

As I wrote about knowing men and trans people who wear dresses to elucidate that They are actually Us, I cringed at the idea of flaunting my loved ones’ Otherness for the purposes of my blog.  By inserting myself into the statement, I risked having some readers think I was trying to prove my open-mindedness.  I’ve bragged like that in the past, especially when I was an egocentric teen.  (You know, back when you practiced writing your name over and over?)  And my own Otherness has been flaunted a few times by friends and acquaintances seeking attention for their open-mindedness.  It’s a serious problem in social justice movements.  

In Black Like Me, the author tells the story of a New Yorker he encounters who has come to the South to “observe” the plight of the black citizens.  “You people are my brothers,” the New Yorker insists.  “It’s people like me that are your only hope.  How do you expect me to observe if you won’t talk to me?”  Although the man’s opposition to segregation was morally correct, his overt self-regard and patronizing disgust at his brothers’ “ingratitude” make it one of the most cringe-inducing scenes in the book.

In Baratunde Thurston’s fantastic memoir, How To Be Black (just out this year), the author asks writers and activists about white people’s fear of being called racist.  damali ayo, the author of How To Rent A Negro and Obamistan! Land Without Racism, says it best:

It shows our values as a culture when somebody says, “I don’t want to be a called a racist.”  Really what they’re saying is, “I want you to like me.  I don’t want to not be liked.  I want to still be okay with you.”  They don’t mean, “What I really want is to know and understand experiences of people of color…”  That would be great.

And so, it just shows that, as I always have said, we are operating at this third-grade level of race relations.  And it’s that third-grader that goes, “Please like me, do please like me,” versus “Can I understand?”

We all want to be liked and we all want to do the right thing.  But the third-grader mindset can’t help but focus more on the former.  It is evident in common phrases like:

“We were the only white people there!” 

 “I’ve always wanted a gay friend!” 

“I think I’m [bisexual/learning disabled], too, because I [kissed a girl once/have difficulty concentrating]!” 

“I’m not prejudiced!  I have so many [nonwhite/foreign/LGBT/disabled] friends!”

Of course, in certain contexts and worded differently, these statements would not be offensive.  What makes them offensive is the need to let others know all about us, the belief that our support for equality deserves praise, the patronizing (and unjust) view that minorities should be grateful for our lack of prejudice.  We can note that we were the only white people in a group in order to spark a dialogue about social segregation, or we can flaunt the experience like a medal from the Liberal Olympics.  We can worry that having a homogeneous circle of friends will limit our perspective, or we can believe that racking up as many minority friends as we can is proof of our expertise on all minority issues.  We can try to empathize with someone labeled “different” because of their sexuality or biology in order to remove stigmas and barriers, or we can try to seek the attention they are getting for ourselves.  We can respond to accusations that we have offended by trying to understand why someone would be hurt, or we can respond by listing our liberal credentials.

Which of these we do depends primarily on the individual.  Someone who likes to brag about their open-mindedness usually brags about most things they do.  This personality trait seems to be particularly common among educated elites—parodied so well at Stuff White People Like—because elite education frequently fosters competitiveness.  (Taking the time to count your degrees, count the books you own, count the minority friends you have…)  Competitiveness is anathema to selflessness.  But while bragging about the number of books we own is silly because we’re obviously missing the point of reading, bragging about the number of minority friends we have is grave because we’re missing the point of human rights.

Do we donate to charity privately because it makes us feel better to spend the money on someone else?  Or do we hope that others will notice and admire our sacrifice?  Then again, drawing attention to the work we’re doing is usually important if we want to advertise the cause and urge others to join.  That’s where things get murky.

A while back, within a few months of each other, two friends stood up to ableism and told me about it after the fact.  A guyfriend came fuming to me about his teacher who had used the word “midget” and who had then insisted, despite my guyfriend’s protests, that it wasn’t offensive at all.  A girlfriend told me that a mutual acquaintance had said something crass about my dwarfism and that she had told him to back off repeatedly because she wouldn’t tolerate such bigotry in her presence.  The first friend focused his story on the offender’s behavior.  The second focused her story on her heroic defense.  People who want to understand the problem more than anything tend to focus their feelings on the injustice they encountered.  People who want to be liked more than anything tend to focus their feelings on their performance.

This shouldn’t ever deter anyone from working for equality and social justice, from celebrating diversity or from spreading awareness.  Open minds should always be highly valued.  But to paraphrase the recent words of the Crunk Feminist Collective, by not being racist—or sexist or homophobic or lookist or ableist or transphobic—we’re not doing anything special.  We’re doing what we’re supposed to do.

 

 

Interpreting History Part II: Oppression Has Never Been Universal

5 Aug

(“Samurai Kiss” via)

 

Nothing divides a country quite like a national holiday.  When I was studying in St. Petersburg ten years ago, there was as much apathy as there was celebration on the Russian Federation’s June 12th decennial.  German reactions to Reunification Day every October 3rd are anything but united.  And on the United States’ Fourth of July last month, Chris Rock tweeted, “Happy white peoples independence day, the slaves weren’t free but I’m sure they enjoyed fireworks.”

Amid the outbursts of “unpatriotic!”, conservative blogger Jeff Schreiber shot back, “Slavery existed for 2000yrs before America. We eradicated it in 100yrs. We now have a black POTUS. #GoFuckYourself.” 

Schreiber has since written a post on his blog, America’s Right, apologizing for cursing and conceding that the slave trade was unconscionable.  But for all his insistence that he never intends to diminish the horrors of American slavery, he adds that President Obama’s policies are now “enslaving Americans in a different way.”  (Real classy.)  And for all his reiteration that slavery was always wrong, he still hasn’t straightened out all the facts skewed in his Tweet.

“Slavery existed for 2,000 years before America.”  He uses this supposed fact to relativize the oppression, as if to shrug, “Well, everyone was doing it back then.”  His tweet implies that the ubiquity of the slave trade makes America’s abolition of it exceptional, not its participation.  This argument hinges on fiction.  Slavery did not exist for 2,000 consecutive years.  In the West, it was pervasive in Antiquity and the Modern era, but it was downright uncommon in the Middle Ages.  (While anathema to our modern ideas of freedom for the individual, medieval serfdom was not slavery.)  Slavery was re-instituted in the West roughly 500 years ago with the advent of colonialism.  And the United States held on to it long after most other colonial powers had abolished it.  Critics can say what they want about the effectiveness of Chris Rock’s rain-on-a-parade tactics, but his argument did not distort history.      

In my last post, I argued the risks of concealing the human rights abuses of the past for the sake of nostalgia, if anything because it is the height of inaccuracy.  But portraying history as an unbroken tradition of straight, white, able-bodied male dominance like Schreiber did is also inaccurate.  The universal human rights movement in its modern form is indeed only a few decades old, but the idea of equality for many minorities can be found all over in history at various times and places.  The Quakers have often been pretty keen on it. 

And almost no minority has been universally condemned.  People with dwarfism appear to have been venerated in Ancient Egypt.  Gay men had more rights in Ancient Greece and in many American Indian societies than in 20th century Greece or the United States.  Muslim women wielded the right to divorce long before Christian women.  English women in the Middle Ages were more educated about sex than their Victorian descendants.  Much of the Jewish community in Berlin, which suffered such unspeakable crimes culminating in the mid-20th century, was at earlier times better integrated into the city than Jewish people were in many other capitals of Central Europe.  In short, history does not show that racism, misogyny, homophobia, ableism, transphobia, and our current beauty standards are dominant social patterns only recently broken by our ultra-modern culture of political correctness.  The oppression of minorities may be insidious and resilient throughout history, but it has never been universal. 

Downplaying the crimes of the past by claiming everybody did it is both historically inaccurate and socially irresponsible.  It is perverse when such misconceptions fuel arguments for further restrictions on human rights.  In 2006, Republican Congress member W. Todd Akin from Missouri claimed that, “Anybody who knows something about the history of the human race knows that there is no civilization which has condoned homosexual marriage widely and openly that has long survived.”  Even if this were true, the argument is absurd.  (It appears that no civilization has regularly chosen women with dwarfism for positions of executive power, but does that mean it’s a bad idea?)  But the argument collapses because it relies on facts that are untrue.

Granted, hyperbole is a constant temptation in politics.  Stating things in the extreme is a good way to grab attention.  In an earlier post on sex, I asserted that mainstream culture assumes women’s sex drive is lower than men’s because female sexual expression has been “discouraged for millennia.”  Patriarchy has certainly been a major cultural pattern around the world and throughout history, and we cannot emphasize its power on both the collective and individual psyche enough.  But patriarchy is by no means a cultural universal.  Ethnic groups in Tibet, Bhutan, and Nepal continue to practice polyandry into the present day, while history shows many others have done the same at various times.  These exceptions call into question the biological theory that heterosexual male jealousy is an insurmountable obstacle to sexual equality.  And they preempt any conservative excuse that insists, “Everybody’s been doing it.”    

They haven’t been.  Xenophobia has never been universal.  Humans may have a natural fear of the unfamiliar, of what they perceive to be the Other, but our definitions of the Other change constantly throughout time and space, as frequently and bizarrely as fashion itself.   This makes history craggy, complex, at times utterly confusing.  Like the struggle for human rights, it is simultaneously depressing and inspiring.  But whatever our political convictions, we gotta get the facts straight.

Despite what Stephen Colbert says.

 

 

Interpreting History Part I: Count Me Out

29 Jul

alter ego (Image by Bob May used under CC license via)

 

Anytime my partner and I don’t know what to do or say, one of us asks, “What’s in the news?” and we dive into a political discussion.  So it’s no surprise that we’ve become somewhat embarrassingly addicted to Aaron Sorkin’s The Newsroom.  The news media has been (unsurprisingly) critical of a show founded on the idea of chastising the news media.  Feminists have been (sometimes rightly) critical of its portrayal of women.  The show has almost countless strengths and weaknesses, but I find myself still obsessing over the brilliant, captivating opening scene that kicked off the series.  If you can’t watch this clip, it basically boils down to a flustered news anchor named Will McAvoy overcome with disgust at the state of the nation and nostalgia for the 1950s and 60s: “America’s not the greatest country in the world anymore,” he sighs.  “We sure used to be.”

We stood up for what was right.  We fought for moral reasons.  We passed laws, we struck down laws for moral reasons.  We waged wars on poverty, not poor people.  We sacrificed, we cared about our neighbors.  We put our money where our mouths were, and we never beat our chests…  We cultivated the world’s greatest artists and the world’s greatest economy.  We reached for the stars, acted like men.  We aspired to intelligence.  We didn’t belittle it.  It didn’t make us feel inferior…  We didn’t scare so easy.     

“Nostalgia” literally means “aching to come home.”  It’s the temporal form of homesickness, time rather than place being the source of pain.  We all do it.  It can be oddly soothing at times to be in awe of another era, especially the one you were born in.  But Will McAvoy should watch Woody Allen’s Midnight in Paris for proof that nostalgia is an ultimately futile pastime that every sad sack of every era has hopelessly indulged in.  (If “things were better back in the day,” then how come every generation says this?)  But since McAvoy’s nostalgia is an earnest, political battle cry, heaping laurels on the good old 1950s and 60s when the leaders of the day did their job right, I’m more inclined to have him watch Mad Men.  Or just open up the 1960 children’s illustrated encyclopedia I found at my great aunt’s house, which states, among other things: “The Australian aborigine is similar to the American negro in strength, but less intelligent.”  Didn’t scare so easy, indeed.     

The problem with nostalgia is that it is far more emotional than intellectual and thereby lends itself to inaccuracy all too easily.  America was indeed doing great things sixty years ago.  And reprehensible things.  We hid our disabled and gay citizens away in institutions, asylums and prisons.  We enforced the compulsory sterilization of mentally disabled and Native American women.  We took decades to slowly repeal segregationist laws that the Nazis had used as models.  We maintained laws that looked the other way when husbands and boyfriends abused their partners or children.  In short, we handed out privilege based on gender, sexuality, ethnicity, religion, physical and mental capabilities with far greater frequency and openness than we do today.  Perhaps we were the “greatest country in the world” compared to the others.  (Europe and East Asia were trying to recover from the devastation of World War II, after all, while other nations were trying to recover from the devastation of colonialism.)  But McAvoy’s wistful monologue is much more a comparison of America Then with America Now.  And that is hard to swallow when considering that a reversion to that society would require so many of us to give up the rights we’ve been given since then.   

Am I “another whiny, self-interested feminist” out to bludgeon the straight, cis, WASPy male heroes of history?  Am I “just looking to be offended”?  No, I’m struggling.  Next to literature and foreign languages, history has always been my favorite subject.  And pop history always touches upon this question:

“If you could go back to any period in history, which would it be?” 

From an architectural point of view?  Any time before the 1930s.  From an environmental point of view?  North America before European contact.  From a male fashion point of view?  Any period that flaunted fedoras or capes.  From a realistic point of view?  No other time but the present.  Because if I am to be at all intellectually honest in my answer, there has never been a safer time for me to be myself. 

Last year, I read The Lives of Dwarfs: Their Journey from Public Curiosity To Social Liberation by Betty Adelson.  Despite my love of history, I hated almost every minute of it.  Lies my Teacher Told Me by James Loewen had helped me understand how so many black American students feel uninspired by U.S. history and the figures we hold up as heroes because so many of those men would have kept them in shackles.  But it wasn’t until I read The Lives of Dwarfs that I understood how nasty it feels on a gut-level to face the fact that most of history’s greatest figures would more likely than not consider you sub-human. 

With the exception of Ancient Egypt, my own lifetime has been the only period wherein someone with dwarfism could have a fair chance of being raised by their family and encouraged to pursue an education and the career of their choice, as I was.  At any other point in Western history, it would have been more probable that I would have been stuck in an institution, an asylum or the circus (the Modern Era before the 1970s), enslaved by the aristocracy (Rome, Middle Ages, Renaissance) or left for dead (Ancient Greece).  Of course inspiring cases like Billy Barty show that a few courageous/decent parents bucked the trends and proved to be the exception to the rule, but that’s what they were.  Exceptions. 

I am fortunate to have been born when I was and for that reason, nostalgia for any other period in time can never be an intellectually honest exercise for someone like me.  The moment someone says, “Yeah, well, let’s not dwell on odd cases like that.  I’m talking about the average person,” they’re essentially saying, “Your experience is less important than mine.”

Everyone is entitled to have warm, fuzzy feelings about the era in which they grew up.  If any period can put a lump in my throat, it’s the 1970s.  The Sesame Street era.  The boisterous, primary-colored festival flooded with William’s Doll, Jesse’s Dream Skirt, inner city pride à la Ezra Jack Keats, and androgynous big hair all set to funky music can evoke an almost embarrassing sigh from me.  Donning jeans and calling everyone by their first name, that generation seemed set on celebrating diversity and tearing down hierarchies because, as the saying goes, Hitler had finally given xenophobia a bad name.  Could there be a more inspiring zeitgeist than “You and me are free to be you and me”? 

 

But I’m being selective with my facts for the sake of my feelings. 

Sesame Street and its ilk were indeed a groundbreaking force, but they were hardly the consensus.  Segregation lingered in so many regions, as did those insidious forced sterilization laws.  LGBT children were far more likely to be disowned back then than today—Free To Be You And Me had nothing to say about that—and gay adults could be arrested in 26 states.  The leading feminist of the time was completely screwing up when it came to trans rights.  Although more and more doctors were advocating empowerment for dwarf babies like me, adult dwarfs faced an 85% unemployment rate with the Americans with Disabilities Act still decades away.  And Sesame Street was actually banned in Mississippi on segregationist grounds.  When the ban was lifted, its supporters of course remained in the woodwork.  We have made so much progress since then.  It would be disingenuous for me to ignore that simply for the sake of nostalgia. 

To be fair to Sorkin, it’s a hard habit to kick.  We have always glorified the past to inspire us, no matter how inaccurate the picture.  Much of American patriotism prides itself on our being the world’s oldest democracy, but we were not remotely a democracy until 1920.  Before then, like any other nation that held free elections, we were officially an androcracy, and of course we didn’t guarantee universal suffrage until the Voting Rights Act of 1965.  That my spellcheck doesn’t even recognize the word “androcracy” signifies how little attention we afford our history of inequality.  But we have to if accuracy is going to have anything to do with history.  A brash statement like “We sure used to be [the greatest country in the world],” even as a battle cry for self-improvement, is asking to be called out for its inanity. 

Everyone is entitled to appreciate certain facets or moments in history, just as everyone is entitled to look back fondly upon their childhood.  Veracity falters, however, with the claim that not just certain facets but society as a whole was all-around “better.”  This is never true, unless you’re comparing a time of war to the peacetime preceding it (1920s Europe vs. 1940s Europe, Tito’s Yugoslavia vs. the Balkans in the 1990s), and even then the argument is sticky (Iraq during the insurgency vs. Iraq under Saddam Hussein).  In the words of Jessica Robyn Cadwallader, concealing the crimes of the past risks their reiteration.  Whenever we claim that something was socially better at a certain point in history, we must admit that something was also worse.  It always was. 

But such a sober look at the past need not be depressing.  It reminds me how very grateful I am to be alive today.  My nephews are growing up in a society that is more accepting than almost any other that has preceded it.  That is one helluva battle cry.  Because what could possibly be more inspiring than history’s proof that whatever our missteps, things have slowly, slowly gotten so much better?

 

 

When You Gonna Start Makin’ Babies?

22 Jul

Gotcha by Clint McMahon (Image by Clint McMahon used under CC license via)

 

A while back, tucked inside one of my longer posts was a link to a conversation Rosie O’Donnell had in February with comedienne Chelsea Handler on her show in which she discussed her phobia of dwarfs.  Driven by Handler’s insistence that sex with a dwarf would be “child abuse,” the conversation devolved into musing about how dwarf women give birth:

O’Donnell: When a little person has a normal-sized person, I don’t understand how that happens.

Handler: That I don’t understand!

O’Donnell: I don’t get it.  How come the little person isn’t dead when the normal-sized baby comes out?

Handler: Sometimes two smalls make a tall.

O’Donnell: But how does it come out?

Handler: I don’t know.  I think anything can come out of that.

For your information, Chelsea, when it comes to achondroplasia—the most common type of dwarfism—“two smalls” have exactly the same chance of having a “tall” (25%) as they do of having a child who inherits the achondroplasia gene from both parents, a homozygous combination that is always fatal.  (The baby is usually stillborn or dies within the first few weeks after birth.)
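For anyone who wants to check that arithmetic, here is a minimal sketch of the Punnett square behind it, assuming the standard Mendelian model for a dominant mutation.  The allele labels (“A” for the achondroplasia allele, “a” for average stature) and the heterozygous parents are my own framing, not anything O’Donnell or Handler said:

```python
# A minimal Punnett-square sketch, assuming both parents with achondroplasia
# are heterozygous (Aa): each carries one dominant achondroplasia allele ("A")
# and one average-stature allele ("a").
from collections import Counter
from itertools import product

parent1 = parent2 = ["A", "a"]  # the alleles each parent can pass on

# Every equally likely combination of one allele from each parent.
offspring = Counter("".join(sorted(pair)) for pair in product(parent1, parent2))

for genotype, count in sorted(offspring.items()):
    print(f"{genotype}: {count}/4 = {count / 4:.0%}")

# Prints:
# AA: 1/4 = 25%   (homozygous achondroplasia, fatal)
# Aa: 2/4 = 50%   (achondroplasia, like the parents)
# aa: 1/4 = 25%   (average stature, the "tall")
```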

O’Donnell has since apologized for talking about her phobia of dwarfs, though Little People of America have rightly said she missed the point.  Many have said that as an openly gay woman, she should know better when discussing prejudice, but I was more surprised by her callousness in light of her being an adoptive parent.  And I notice my (hyper-)sensitivity to that issue seems to grow every time I encounter it.

And of course I seem to be encountering it everywhere nowadays.  “When ya gonna start makin’ babies?”  Almost all of us in our late twenties and thirties are used to being asked this regularly.  I’ve been told I should take it as a compliment, since it’s rarely asked of couples who would make terrible parents.  Yet I’ve been amazed at how intrusive the questions and comments can be, how often something as personal as parenthood is treated like small talk.  It’s understandable as more of my peers become parents; the prospect of making humans is daunting and people need to vent about it.  Those who don’t want children while living in a baby-obsessed world feel the need to vent back.  All this venting results both in community-building and in tactless comments that knock those outside of the community. 

One of my friends who miscarried was told by a stranger, “Well, it wasn’t a real baby.”  A friend who adopted a girl from South Korea was told by a fellow church member, “Her eyes aren’t that bad.”  A friend who had a C-section was told she must not feel as close to her child as women who give birth “naturally.”  Childfree friends have been told that their lives will never be “complete” until they’ve had children.  A biology professor who had two foster daughters was asked if he was worried they would inherit their imprisoned father’s criminal tendencies because “that stuff’s in the genes, y’know.”  I’ve been told it’s selfish to want a child with achondroplasia, it’s selfish to want a child without achondroplasia, it’s selfish to allow my child to inherit my achondroplasia, it’s selfish to play God with genetics, it’s selfish to want to biologically reproduce what with the world population exploding, and it’s selfish to worry about any of this because it’s not like I’m infertile.  All of these comments were well-intentioned. 

Usually people are simply thinking out loud when they say such things.  It is important to remember that no one can be expected to know exactly what to say in unusual circumstances, lest I end up lecturing as if I’ve never inadvertently offended anyone.  Almost all of us have good intentions, but many are unaware of how quickly we redirect conversations back to our own experiences, how easily we forget to prioritize listening over interrogating, empathy over curiosity, respect over Thank-God-that’s-not-me! complacency.   

Hereditary conditions, finances, disabilities, infertility, relationships and emotions ensure that having children is not a universal experience.  There is no right way for everyone and any opinion that can in any way be construed as a judgment can cut someone deep because babies and bodies are entangled in supremely visceral feelings.  It’s no coincidence that Roe v. Wade was argued based on the right to privacy: Something as sensitive, as complicated and as profoundly emotional as your reproductive choices should be volunteered at your discretion. 

That said, parenthood is all about making decisions that will inexorably affect someone else’s life, not just your own, and this is why it is such a hot-button issue.  Our reproductive decisions, more than any other decisions, are the intersection of personal freedoms and social responsibility.  As the daughter of a social worker who worked for Child Protective Services, I have firm beliefs about right and wrong when it comes to parenting.  As someone whose genes make the prospect of parenthood unusually complicated, I’ve begun to see how judgmental those beliefs can come off when the presentation is sloppy. 

As an avid reader of Offbeat Families, I know that sharing knowledge and experiences can help others in so many ways.  But as someone who feels very ambivalent about offering up my not-yet-existent children’s potential situation as conversation fodder, I’ve become less trustful of many of my most well-meaning friends and family members.  Questions about my situation so quickly transform into lectures about their situation.  (I’ve also noticed that the more nervous someone is, the more they lecture.)  Besides making me more guarded about my personal experience, it has also taught me to stop myself from making snap judgments about others’ reproductive choices.  When dealing with anyone else’s family planning, I have been humbly learning to: 

 1)      Fight the urge to ask others about their reproductive choices, especially in the context of small talk.  Let them volunteer it.  Go ahead and volunteer your own stories, but don’t press the other person if they do not respond in kind.  We can never know what’s lurking beneath the surface. 

 2)      Beware of talking about the decisions you made in a way that inadvertently hurts those who must make different decisions.  This is also very tricky, but if you are convinced water birth is the only way you can imagine doing it or you are proudly childfree or you know exactly how to make sure it’s a girl, be aware that people in different financial or medical situations may not have these options at all.    

 3)      When someone does want to share something you have little experience with (e.g. adoption, stillbirth, staying childfree, etc.), prioritize listening and learning over immediately finding something to compare it to.  Relativizing struggles can be helpful and I’ve gotten some great feedback from friends, but my guard goes up when someone without achondroplasia tells me right away they know what I should do because they know someone whose baby has diabetes, they took a college class on bio-ethics, or they heard something like it on the news.

4)      Only offer your ethical opinion if the person makes it perfectly clear they want to hear it.  Every society bears the responsibility of taking a legal stance on complex reproductive issues: prenatal testing, genetic counseling, birth control, abortion, sterilization, drug testing, assisted reproductive technology, the life of the mother vs. the life of the fetus, custody, adoption, foster care, etc.  We are all compelled as citizens to be aware of the laws concerning these issues.  And we all have our own opinions about them.  But anyone directly affected by them is likely to have heard it before and to have been thinking about it longer than we have.  I’ve been thinking about the effects my dwarfism may have on my kids since I was fourteen.

5)      Don’t gossip about others’ decisions behind their backs.  It makes your listeners aware of how they will be talked about when it’s their turn to decide about having children.  There is a fine but crucial line between trying to understand situations that are new to you and using someone’s situation to tell an interesting story.

6)      Do try to actively listen when invited to, saying truly supportive things, as one or two particularly fantastic friends of mine have, such as: “I can only begin to imagine what I’d do in that situation.”  “Let me know if you don’t want to answer this question…”  “On a much smaller level, it sounds a tiny bit like what I felt when…”   “No matter what you decide, I know you’ll be great at it because…”  “I’m always here to listen if you ever need to spill, as long as it helps.”

Of course, in listing here what I have learned not to do, I can only hope that my own past SNAFUs have been minimal.  Insensitivity, by definition, is the disconnect between intention and effect.  Embarrassed apologies to anyone whose toes I stepped on while stomping through my own bigfooted opinions.

 

 

Cross-posted on August 27, 2012 at Offbeatfamilies.com

Body Image Part IV: My Choice and Your Choice Entwined

24 Jun

(Image © Folke Lehr)

I began The Body Image Series with this question: If we were fully convinced that no one else cared one bit what we looked like, how much would we care?  Would we have any reason to envy conventionally attractive people?  Would weight loss have anything to do with waist size?  Would limb-lengthening still touch on the idea of “blending in”?

 ***

Ten years ago, I attended the premiere of HBO’s Dwarfs: Not A Fairy Tale along with the other subjects of the documentary.  Upon seeing me, one of the men with achondroplasia asked his friend, “What’s she doing here?  She’s not a dwarf.”

“She had limb-lengthening surgeries to make her taller,” his friend murmured.

 “What?!” he exclaimed. “She cheated!”

I felt myself blush before I could think of what to say.

Immediately, a woman with diastrophic dwarfism, the shortest of all of us, turned to me and said, “I’m on your side, Honey.  No way did you cheat.”

Part of me finds it hard not to laugh when others dismiss limb-lengthening on dwarfs as a “quick fix.”  Breaking bones, stretching them over a three-to-five-month period and then waiting for them to heal for another ten months is not exactly comparable to a boob-job done over the weekend.  Then again, you’d better have a damn good reason to be willing to go through something so intensive and risky.  So, did I do it to function better or, as a former president of Little People of America insisted, to “blend in”? 

I did it to access all facilities I could not modify myself, from public ones like plane or train seats to private ones like friends’ furniture. I did it to correct some of my lordosis, so that I would have less back pain. I did it to have the extra leverage enabling me to carry bigger armloads. I did it to take bigger steps when walking, so I could cover more ground before I got tired. I did it so that my weight would be slightly more evenly distributed, making spinal compression less of a danger. I did it to reach farther. I did it because the patients I met who had done it were just as happy as those who had not.  Looking back on it all, this was definitely reason enough for me, regardless of whether or not it is for others.  But I can’t just leave it at that.

In my last post, I argued why there is no right way to hate your body.  In my experience, you can take dramatic measures to alter your body without hating it.  Indeed, the work you put into it can and should be an act of love, not desperation.  The night before my first limb-lengthening surgery, I kissed my old legs goodbye.  I was willing to let them go, but I kissed them all the same.  Yet many if not most outsiders assume that dwarfism is a visible difference the patient must want to erase.  After all, trying to argue that you don’t want to blend in, even though you will blend in, sounds like you’re trying to square a circle. 

So why not just say that limb-lengthening was my personal choice and my choice doesn’t affect anyone else?  But it does.  By blending in, I automatically relieve myself of a good deal of prejudice, of stares, of awkward reactions.  I have fewer questions to answer from people on the street and fewer chances to educate them.  By blending in, I’m breaking ranks with the dwarf community to some degree.  That’s nothing to sneeze at when considering that before the Americans with Disabilities Act of 1990, dwarfs had an unemployment rate of 85% in the U.S. all because of lookism.  By blending in, I am contributing to the trend that may make limb-lengthening a fashion for people with dwarfism.  Both politics and beauty standards measure strength in numbers. 

In the late 90s, my first femur surgery was filmed for a feature about limb-lengthening on the American news show 20/20.  The interviewer asked a 12-year-old patient with dwarfism, “Did you do it to look normal or to function better?”

Without missing a beat, the boy answered, “So that I could function better.  I don’t care how I look.  I just want to do what everyone else can.” 

Sitting at home watching, I raised my fist in solidarity and whispered, “Right on, kid.”

In the follow-up commentary, Connie Chung reported, “He has since finished the procedure to combat his dwarfism.”

I shot up in my seat in disbelief: “COMBAT?!” Was that the automatic assumption?  I wasn’t in a battle against my dwarfism, and obviously neither was this patient.  I was working with my body, not against it!  I realized then that it was important that others knew this if they were going to know that I chose limb-lengthening.

We may someday live in a world in which every candidate for limb-lengthening makes the same decision I did and in doing so, makes the world a less physically diverse place.  I will accept such a world, since my own efforts to function better have helped contribute to it.  But I won’t make any arguments advocating such homogeneity.  If my dwarfism and limb-lengthening have taught me anything, it is that it’s far more important for me to argue that beauty is about so much more than blending in. 

Deep down inside, every one of us wants to be conventionally attractive to some degree, because life seems easier that way.  We love the idea of throngs of people admiring us, envying us, falling hard for us at first sight.  It makes us feel fantastic on a visceral, heart-thumping level to be praised for our looks.  But if everyone agrees that there’s more to love and romance than conventionally good looks, what is the point of having broad appeal?  During the years when my curly hair reached my backside, I enjoyed the compliments but they were always the same, regardless of whether they came from friends or strangers.  My short, round achondroplastic hands, meanwhile, have garnered a lot more attention to detail.  My dad always called them “starfish hands.”  A guy in college examined them and disagreed: “They’re Maggie Simpson hands.”  Another amended it with a giddy squeal, “They’re finger-painting hands!”  When I began my final limb-lengthening procedure, a guyfriend in high school nicknamed me “Legs” because I had the most expensive pair around.  Who needs broad appeal when you have genuine affection?  What better proof is there of such affection, of people’s capacity to look beyond convention than their fearlessly falling in love with features they’ve never seen before?

If I deeply regretted having dwarfism, then limb-lengthening would indeed be an extreme measure taken to offset severe personal insecurity, and that would be a major cause for concern. Hating my looks so profoundly would impact other dwarfs’ perception of their own looks.  This is why I blog.  I don’t want to live in a world where anyone is pressured to change their body just to be accepted, and I don’t want my story to be misused to contribute to the forces pushing the world in that direction.

This is not to say every person who is born on the margins should turn their life into a 24-hour political cause.  Trans individuals should never have to answer invasive questions about their bodies any more frequently than cis individuals should.  LGBT people should never be pressured to come out.  Black Americans shouldn’t have to put up with strangers and acquaintances trying to touch their hair all the time.  The right to privacy is a human right. Your sex life, your income, your medical records, and your body are all matters you shouldn’t ever have to submit to anyone’s microscope if you don’t wish to.  But if we do open our mouths, we have to take responsibility for the consequences.   

When I choose to talk about my body and my choices, it feels to me like I’m talking only about myself.  But others are listening for how it all affects them.  If they don’t care about me personally, it’s their only reason for paying attention.  It’s the only reason we read novels and newspaper articles and blogs about strangers’ lives.  We’re searching for something we can relate to, and if we can’t relate, we at least want to know how other people’s choices are shaping the world we live in.  Opinions such as “I was so gross when I weighed x pounds,” or “I can’t wait to get rid of these hideous scars” both reflect and influence the society comprising us all.  We love taking credit for our words when others agree or are inspired by them.  But if someone raises the possibility of our statements having a negative impact on others, the temptation to shirk all responsibility for others is strong.  But we can’t ever shirk it.  That’s cowardly.

This doesn’t mean we must accept others offhandedly judging our most complex decisions.  Unfortunately, no matter what we say or how carefully we try to shape the argument, there will always be those out there who judge before hearing the end of the sentence.  Putting more energy into brandishing our opinions than admitting what we don’t know is also cowardly. 

A friend I met in the hospital was ten years old and in the midst of limb-lengthening when a woman with dwarfism approached him in public and berated his mother for choosing limb-lengthening for her child.  My friend concluded that this is why we shouldn’t talk to strangers.

We are talking to strangers when we publicly discuss our personal decisions, and the Internet is blurring the lines between public and private discussions faster than ever.  As decision-makers, we cannot discuss our choices and our views free from any responsibility for the effect they will have on others.  As observers, we cannot accurately judge others’ decisions at face-value, free from the burdens of learning. 

During one of my limb-lengthenings, I was featured in a French magazine article that posed questions I’ve used in my workshops on dwarfism and diversity, paraphrased here: 

Society does not accept physical differences easily.  Without a doubt, that is society’s fault.  But who should change?  Society or the dwarf?  For the dwarf to change, she must undergo years of painful surgeries and intensive physical therapy, risking many complications.  For society to change, it must alter its way of thinking.  Who suffers more in the change?  Which change is harder to achieve?

My experiences with dwarfism and limb-lengthening have inspired me to try to change both.  As best as a bossy girl from Long Island can.

 

Body Image Part II: The Rules for Snark

10 Jun

(Image by Stephen Alcorn © 2003 http://www.alcorngallery.com)

 

Last week I went after talking about others’ bodies for the sake of analyzing what you can’t be attracted to.  Today I’m going after talking about others’ bodies for the sake of musing, or amusement…

Anyone who insists they never make fun of others behind their back is lying.  We all do it, to the extent that snark is now rivaling porn as the Internet’s raison d’être.  Every bit of our outward appearance—our fashion choices, our speaking styles, our assertiveness or timidity—is out there for others’ scrutiny, and all of us pick targets when we’re in the mood, sometimes at random, sometimes with a purpose.  Just take the example of weddings.  I bet there’s at least one wedding you’ve seen that looked ridiculous to you.  Alternative brides think, Wear an expensive dress if that’s what you’ve always wanted, but it’s still vulgar materialism.  And the mainstream brides think, Don’t wear a white dress if you don’t want it, but you just want attention for being anti-everything.  While others simply think, Purple.  Yuck.  Or something to that effect. 

In wedding planning as in our everyday fashion, what we choose is a comment on what we don’t.  No one’s choice exists in isolation from everyone else’s.  To dress like a punk or to dress like a cowboy, to speak a local dialect or to speak like a newsreader, to try to fit in or to try to stand out are all decisions we make that usually reflect both our tastes and our beliefs.  We give others’ decisions either the thumbs up or thumbs down accordingly.  As I’ve said before, it’s fair game when beliefs are targeted, because we should all take responsibility for our beliefs.  But too many of us make no distinction between the elements of someone’s appearance that reflect their beliefs and the elements that reflect their biology.  

Many of my friends and family, along with most commenters on TV or online, see little difference between making assumptions about others’ clothes and making assumptions about the bodies they cover.  Just as they’ll assume the slick suit must belong to a businessman and the lady in shorts and sneakers is American, they’ll assume the particularly skinny woman must be anorexic, that the man whose hands shake must be an alcoholic, that the young woman who collapsed must be either diabetic or pregnant, that the large child over there getting his breast milk is obviously too old for that, that the chubby guy over there is certainly overweight and should lose a few pounds, that the poor kid with acne isn’t using the right medicine.  Sometimes these flimsy diagnoses are voiced as expressions of sympathy or intellectual exercises à la Sherlock Holmes, sometimes they are dripping with self-aggrandizing pity or snarky complacency.  They are always unjust because, unlike quips about clothes or tattoos or cell phone ringtones, comments about another’s body have little to do with choices anyone has made. 

As someone who’s undergone limb-lengthening, I can of course attest that there are a few choices we make about our appearance.  But while I chose to try to add as many inches as possible to my height, I didn’t have much of a choice about how many inches I could go for.  (I gave all I could in physical therapy, but in the end, my ticked-off muscles stiffened and decided the limit for me.)  Nor did I have much of a choice about my anterior tibialis tendons severing on both legs, which now makes me stumble on average every few weeks and makes dismounting from a bicycle dangerous.  (After two surgeries to repair the tendons and three years of physical therapy, they remain weak.)  Nor have I ever had any choice about my hips swaying when I walk because the ball-and-socket hip joint in achondroplastic people is shaped like an egg-and-socket.  Skinny friends with hypoglycemia, heavy friends with slow metabolism, and friends with diastrophic dwarfism—whose growth plates do not respond to limb-lengthening—can also attest that any choices we make about our bodies are always limited.  Discussing these choices is important, but strangers’ assumptions about them are usually way, way off. 

It is because I know so many kind, loving people who analyze strangers’ bodies that I wasn’t at all surprised by the nasty ruminations over her “puffy” appearance that Ashley Judd so awesomely bucked in Newsweek earlier this year.  And I’m only half-surprised by the website Too Big For Stroller, where people post street photos of children who appear to have outgrown the transport and smirk about what idiotic parents they must have.  In his essay “Broken Phantoms,” Robert Rummel-Hudson writes beautifully, harrowingly about the unfair judgment strangers often heap on individuals with rare disabilities whose symptoms are less visible.  He went after the Too Big For Stroller crowd and summarized their defense arguments thus: 

However many kids with invisible disabilities might be made fun of or hurt by that site, they are acceptable collateral damage, because some of them are probably lazy kids with weak parents, and they must be judged.

“Acceptable collateral damage” is the phrase I’ve been searching for my whole life.  It’s how Jason Webley downplayed the rights of “the few conjoined twins in the world” in light of his Evelyn Evelyn project.  It’s how so many minorities are dismissed as annoyances in our majority-rules society by the vacuous, relativist claim, “Everyone’s going to be offended by something.”  Which is another way of saying, “We can’t consider everyone’s rights.” 

All of us make automatic, silent assumptions about others’ bodies, often trying to figure out how we ourselves measure up, because we are all insecure about our bodies to some degree.  But the ubiquity of these thought patterns and the rate at which they are voiced is the problem, not the excuse.  There’s probably a list of catty things I’ve said the length of a toilet roll, but I try to stop myself from diagnosing strangers’ bodies, if only out of awareness of my own vulnerability to inaccurate assumptions.  A few years spent in and out of hospitals also taught me to ask what the hell I really know about where strangers are coming from, and we all think enough unproductive thoughts about others’ physical appearance as it is.  In an essay about me and my scars, Arthur W. Frank writes that when we see someone who looks either unattractive or pitiful to us, our first thought is, “I’m glad that’s not me.”  And our second thought is, “But if it were me, I’d get that fixed.”

This is, of course, more than anything a hope.  We hope we would be different in the same situation.  But we’re afraid we may not be, and this fear causes us to quickly deflect the problem onto someone else.  Why not the person who just upset our delusions of normalcy?  So we and our supposedly meritocratic society nurture this idea—“I wouldn’t be like that”—as a justification for being judgmental.  Whether or not we voice these assumptions is indeed a choice we make, and whether or not we add any hint of judgment is yet another.  Whether or not this is fair is often debated on a case-by-case basis, but anytime anyone insults someone else’s body, it is a demonstration of their own insecurities.  Period.

We’re all constantly judging one another and judging ourselves in comparison to one another.  This can be fair game when we stick to the mundane decisions we all make.  There is a world of difference between quipping about fashion choices with head-shaking amusement—Sorry, Eddie Izzard, but sometimes you do not know how to put on makeup—and allowing our personal insecurities to fuel pity or disdain for others’ apparent physical imperfections.  There is no fair way to trash someone else’s body because, for the most part, your own biology is neither your fault nor your achievement.

In Comedy, It’s All About Deciding Who’s Us & Who’s Them

28 Apr

Krampus twins (Via)

The Guardian’s stylebook contains the greatest commentary on style I’ve ever seen in print:

political correctness: a term to be avoided on the grounds that it is, in Polly Toynbee’s words, “an empty right-wing smear designed only to elevate its user.”

Around the same time, while researching the back stories of Life’s Too Short for my review, I came upon the controversy over the word “mong,” in which Ricky Gervais found himself embroiled this past fall.  Apparently “mong” is a British English insult derived from “Mongoloid,” the antiquated and now unacceptable term once used to describe people with Down’s Syndrome.  Both Americans and Brits have probably heard “retard” used the same way.  Gervais eventually apologized to those who objected—including the mother of a child with Down’s Syndrome who has frequently endured the insult—but not before digging in his heels and railing at what he called “the humorless PC brigade.” 

I will never get over how many comedians insist that any criticism of their work is an indictment of all comedy, as if there’s no such thing as an unfunny comedian, only stupid audiences.  This logic sets the bar for comedy so low that no comedian need ever try to be original.  Ignoring the “PC brigade” (i.e., anyone who doesn’t live with the privileges they do), they can simply regenerate old stereotypes, mining the minstrel shows, the frat houses and the school yards, and if no one laughs, it’s simply because we’re all too uptight, right?  Wrong.  We don’t refrain from laughing because we feel we shouldn’t.  We refrain because, unlike the repressed who giggle away in awe, we’ve heard it a thousand times before and we know it’s far from unique.  And isn’t unique what every comedian, entertainer and artist strives to be?   

Like politics, comedy can be divided into two categories: that which confronts our problems with ourselves, and that which confronts our problems with others.  Xenophobia literally means the (irrational*) fear of strangers, and the second type of comedy relies upon this fear.  There has to be a “them” for “us” to laugh at.  So Republicans laugh at Democrats.  Hippies laugh at yuppies.  Academics laugh at hippies.  Progressives laugh at bigots.  It’s fair game when beliefs are targeted because we must always take responsibility for our beliefs.  However, when the joke defines “them” as those who have had no choice whatsoever about their distinguishing quality—ethnicity, gender identity, sexuality, physical traits, mental or physical capabilities, or class background—and who continue to be disenfranchised by society’s delusions of normalcy, the joke had better target those delusions to be in any way original.  Otherwise, why pay for cable or tickets to hear someone lazily reiterate the guffaws of playground bullies? 

Every good comedian, from Stephen Colbert to Eddie Izzard to Christian Lander to the writers at The Onion, knows that the best jokes mock people’s hang-ups and clumsy reactions to minority issues, not the mere existence of minorities.  My beloved Flight of the Conchords frequently flip gender roles and ethnic stereotypes, exposing the absurdity of racism and misogyny.  As the following video demonstrates, 1970s machismo has been begging to be made fun of.  However, when it comes to physical Otherness, it is the body—not fearful attitudes toward it—that they choose to snicker over, 54 seconds into the video:

[Embedded video]

Hermaphrodite?  Really?  An intersex kid’s medical reality is your toy?  C’mon, Conchords.  You’ve proven you’re great at making fun of white Kiwis tripping over Maori culture.  (“Jemaine, you’re part Maori…  Please be the Maori!  If you don’t do it, we’re gonna have to get Mexicans!”)  Surely you could come up with some good bit about hipster comedians clinging to lookist and ableist jokes like teddy bears and throwing temper tantrums when they’re taken away.  Or take a tip from Mitchell & Webb and jab at the way the ableism of reality TV masquerades as sensitivity:

[Embedded video]

Of course comedians have the right to make jokes objectifying minorities.  But I’m more interested in why they feel the need to, why they choose to objectify some people and not others.  Being gay, disabled, trans, intersex or non-white is not inherently hilarious to anyone who doesn’t live sheltered from everyone unlike them.  The American freak shows of P.T. Barnum and the racist British sitcoms of the 1970s signify not just how profoundly disenfranchised minorities were in those countries, but how absurdly provincial audiences must have been to be so easily titillated.  Many comedians who reiterate chauvinist jokes argue that they are pushing the boundaries, expanding freedom of thought in defiance of PC oppression, when in fact they are merely retreating to well-trod ground, relying on material that challenges nothing but the very young idea that minorities deserve to be included in the dialogue as speakers, not objects.  As Bill Bryson has pointed out, the backlash against “political correctness” began the moment the idea was introduced and has always been far more hysterical than what it protests.   

Toni Morrison has said, “What I really think the political correctness debate is really about is the power to be able to define.  The definers want the power to name.  And the defined are taking that power away from them.”  Recognizing that it is all about power explains why emotions run so high whenever minorities get upset by certain jokes and comedians get upset about their being upset.  But this redistribution of power can be productive.  Taking old slurs and xenophobic tropes away from today’s politicians and comedians challenges them to think beyond their own experience and to wean themselves off society’s long-held fears, to redefine “them” as those enslaved by the limits of their imagination; in essence, to really push the boundaries.  Yet too often they default to the tired claim that this challenge infringes on their right to free speech. 

Some progressive critics do invite the censorship accusation by using the ineffective phrase “You can’t say that!”  And sometimes it is indeed an open attempt at censorship, since most media outlets self-censor.  For example, Little People of America has called for the Federal Communications Commission to add “midget” to its list of words you can’t say on television.  I understand the temptation to insist upon the same treatment afforded other minorities: If certain ethnic and gender slurs are banned by newspapers and TV networks, why not others?  But this tactic too easily insults those other minorities—are you claiming black people have it easier than you?—and creates a forbidden fruit that will only tantalize right-wing politicians and shock-jock comedians.  Simplifying the issue into Good Words/Bad Words wastes an opportunity.  Rather than fixating on which words are always unacceptable regardless of context or nuance, the dialogue should aim to reveal which minority jokes truly blow people’s minds and which lazily replicate institutionalized chauvinism. 

Instead of splitting hairs over the modern meaning of the word “mong,” I’d love it if a comedian went after the fact that Dr. Down came up with the term “Mongoloid” because he thought patients with the diagnosis resembled East Asians.  Because really.  Who’s asking to be made fun of here?

* “Phobia” always indicates an irrational fear, hence arachnophobia, agoraphobia, claustrophobia, homophobia, etc.  Fears that are well-founded are not phobias.