Archive | Scars & Psychology

Can We Understand Race In Terms of Medicine?

14 Feb

(Image by Taylor Dave used under CC 2.0 license via)
 
Leaving you this Valentine’s Day by urging you to go read an excellent discussion at NPR titled “Is It Time to Stop Using Race in Medical Research?”

Then go read Alva Noë’s essay, “Can You Tell Your Ethnic Identity from Your DNA?” He writes:

…even if, in the ideal case, we find meaningful clusters of similarity in the space of genetic variation, there is no reason to think that these will map onto ethnicity or other categories in terms of which we understand our own identity. Identity, after all, varies non-continuously. French and German villages may be separated by the smallest of geographic distances. Genetic variation, on the contrary, so far as we now know, varies continuously. DNA is just not going to carve up groups at their culturally significant “ethnic” joints.

This interests me personally because any sort of categorizing of humans ends up being far more complicated than our everyday discourse would have us believe. Race, gender, and disability are so often thought to be concretely definable through bodily indicators, yet our categories for these identities—black/white/Asian, male/female, healthy/disabled—often fail fantastically to represent a good portion of humanity. As I’ve shown before, dwarfism itself is a social construct. All identities are to some extent.

 

 

Can We Understand What It Is Like To Hear Sound for the First Time?

17 Jan

(Image by Jay Morrison used under CC license via)
 
In the 1990s, Cristina Hartmann was one of the first of the few hundred deaf and hearing-impaired children in the United States to undergo surgery for a cochlear implant. She has written extensively about the experience of hearing sound for the first time after the implant in her right ear was activated, most recently this month on Quora.com:

My mother was the one who told me, “Raise your hand when you hear something.” That statement left me baffled. What was I looking for? It was a bit like searching for Waldo when you didn’t know what he looked like.

In that tiny, windowless room deep in the large Manhattan hospital, the audiologist began tapping away at her keyboard. Everyone stared at me, even a woman standing in the doorway whom I had never seen before. I felt the heavy weight of expectations on my shoulders. I had to do something. I concentrated very hard, searching for the mysterious, indefinite Waldo. Whenever I felt anything, an itch or a breeze, I raised my hand slowly, searching everyone’s expressions for whether I had gotten it right or wrong. Nobody gave me any confirmation, so I went on guessing. Twenty-five years later, I realize the whole thing was a show that I performed. I knew this was a momentous event, and I didn’t want to disappoint….

As a congenitally deaf child (who was a bit long in the tooth at 6), I had never formed the neural pathways for my brain to even begin processing auditory stimulation. In the fashion of the ostrich, my brain ignored the strange stuff, and I remained as deaf as I had been an hour prior…

It took months and plenty of therapy for her brain to adapt. Thirteen years later, the activation of a second implant, this time in her left ear, proved a more harrowing experience than the first:

As the audiologist began the beep sequence, I burst into tears and involuntarily clenched the left side of my face. She looked up, puzzled. “Why are you crying? You’ve had this before!” she said. The pain was like sparklers going off on the left side of my head. The stimulation, as little as it was, completely overwhelmed me.

Even though I had already laid the neural pathways for auditory stimuli for my right ear, my brain was unprepared for the stimuli coming from the left side. Since my brain had already experienced this type of stimuli, it could process it, but it was still sensory overload. That stuff hurts. It took me months to acclimate myself to the new implant, but in the meantime, I cringed every time I turned it on. As I said, laying new neural pathways takes work.

Hartmann was later told by the mother of another patient, “Once they started with the beeps, [my daughter] screamed and cried.”

Such narratives exist in stark contrast to the YouTube videos of newly activated implant users laughing and smiling—and, in one case, crying for joy—that have been bouncing around the Internet with far greater frequency. While both narratives provide important information for those considering cochlear implants for themselves or their children, they also make an important contribution to the broader public’s understanding of what it means to be deaf.

It makes sense that crossing out of the world of silence into the world of sound is just as disorienting as its opposite. A hearing person with a middle ear infection strains to perceive the sound of speech, and a deaf person with a new cochlear implant strains to tune out noise pollution: the knocks of a radiator in another room, car doors slamming on the street, wind, footsteps, not to mention the countless background beeps and clicks of the Digital Age. After all, when a baby leaves the womb, she does not instantly adapt to her new home. She comes out crying. There’s too much light and not enough warmth. And, if she is not deaf, there is too much sound.

Speech is no less difficult to learn than Sign language, just as English is no less difficult than Chinese. The ease with which we learn one form of communication or the other depends entirely upon our personal experience and place in the world. For those of us who have grown up hearing speech, the viral videos communicate something very different than they do for those who grew up in Deaf culture.

While the experiences of utter delight portrayed in the videos are valid, their popularity contributes to an oversimplification of the issue. Watching a toddler smile upon finally hearing his mother’s voice for the first time sends a very strong subliminal message: Being deaf must be worse than not being deaf, and therefore anyone would want to join the world of the hearing. But the general public as an audience is already biased toward the hearing world’s standards of happiness. We are moved by the sound of loved ones uttering our names but not at the image of them signing our names because our culture does not rely on—and therefore does not highly value—Sign language.

This is what inspired Lilit Marcus, the daughter of deaf parents and an active promoter of Deaf culture, to pen an article for The Wire titled, “Why You Shouldn’t Share Those Emotional ‘Deaf Person Hears for the First Time’ Videos”:

I want to make it clear that I don’t have a problem with people who choose to get cochlear implants. Medical decisions are painfully personal… I’m all for people making the health choices they think are best for them. What bothers me are the maudlin videos produced out of someone’s intense, private moment that are then taken out of context and broadcast around the world. What bothers me is how the viewer never learns how the individual came to the decision about their implant, which factors they took into account, whether their medical insurance covered it. Sometimes we don’t even learn their names.

This gives me pause. I consider the clip of me removing my casts to look at my newly lengthened legs, which was featured 15 years ago in the HBO documentary Dwarfs: Not A Fairy Tale and last year on Berlin’s public station. The moment was simply joyous—as was the moment I stood up, let go of my friend’s hands and took my first steps—but the story behind it was abundantly complex, which hopefully both documentaries portray.

I have endeavored to communicate that through this blog and all the media work I have done for the past 20 years.

Limb-lengthening and cochlear implant procedures are markedly different in several ways. Limb-lengthening, for example, does not threaten to endanger another language. But it does threaten to divide the dwarf community through the controversy of altering versus accepting extraordinary bodies. Both procedures have proven to evoke vitriol from proponents and detractors alike.

Hartmann reveals:

Most of my deaf friends were good about my CI. They didn’t mind it, except for the fact that my speech therapy cut into play time. That being said, people in the Deaf community felt free to make pointed and derisive comments about my CI. I still get these comments, even almost 24 years after my surgery. To some, I’ll always be a CI-wearer and a turncoat.

The CI advocates aren’t any better, if not worse.

I have very pleasant relationships with many parents of implanted children and CI users. I, however, have also been called a failure because I still use [American Sign Language] and don’t speak perfectly. I’ve also seen a mother run across a room to prevent her child from signing to another deaf child. I’ve been scolded for making gestures and looking too “deaf.”

The debate, of course, is ongoing.

But those of us not faced with the choice for or against a cochlear implant are faced with the challenge of overcoming our bias and remembering that Deaf culture is no less valid than the hearing culture we inhabit—especially when those admittedly tantalizing videos wind up in our Facebook feeds.

 

 

Body Dysmorphia & the Dangers of Operating Out of Insecurity

29 Nov

(Image by Miguel Tejada-Flores used under CC 2.0 via)
 
At the beginning of Mean Girls, Lindsay Lohan’s character watches her new high school friends indulge in body-bashing in front of a bedroom mirror:

“God, my hips are huge!”

“Oh, please. I hate my calves!”

“At least you guys can wear halters. I’ve got man-shoulders.”

“My hairline is so weird.”

“My pores are huge!”

“I used to think there was just fat and skinny,” Lohan thinks to herself. “Apparently there’s a lot of things that can be wrong with your body.”

While most women in the Western world are well-acquainted with this mentality, such self-hatred also occurs in men, albeit more covertly. Body dysmorphic disorder affects between 1% and 2% of the population and is distributed equally between men and women. And those with the means to pursue cosmetic surgery can become addicted to it.

In an article appearing at The Huffington Post last week, 27-year-old Reid Ewing (pictured above), who plays a run-of-the-mill hunk on Modern Family, revealed his seven-year struggle with body dysmorphic disorder and his subsequent addiction to cosmetic surgery. After describing in detail his self-hatred in front of the mirror and his misery after each of the several surgeries, he turns his lens to the doctors who were only too ready to put him under the knife:

Of the four doctors who worked on me, not one had mental health screenings in place for their patients, except for asking if I had a history of depression, which I said I did, and that was that. My history with eating disorders and the cases of obsessive compulsive disorder in my family never came up. None of the doctors suggested I consult a psychologist for what was clearly a psychological issue rather than a cosmetic one or warn me about the potential for addiction.

People with body dysmorphic disorder often become addicted to cosmetic surgery. Gambling with your looks, paired with all the pain meds doctors load you up on, make it a highly addictive experience. It’s a problem that is rarely taken seriously because of the public shaming of those who have had work done. The secrecy that surrounds cosmetic surgery keeps the unethical work practiced by many of these doctors from ever coming to light. I think people often choose cosmetic surgery in order to be accepted, but it usually leaves them feeling even more like an outsider. We don’t hear enough stories about cosmetic surgery from this perspective.

Not long after I had decided to stop getting surgeries, I saw the first doctor I met with on a talk show and then in a magazine article, giving tips on getting cosmetic surgery. Well, this is written to counter his influence. Before seeking to change your face, you should question whether it is your mind that needs fixing.

Plastic surgery is not always a bad thing. It often helps people who actually need it for serious cases, but it’s a horrible hobby, and it will eat away at you until you have lost all self-esteem and joy. I wish I could go back and undo all the surgeries. Now I can see that I was fine to begin with and didn’t need the surgeries after all.

I have written extensively about my decision to undergo six years of limb-lengthening. In the many, many conversations I have had with people in person, on panels and in print about this decision, I have emphasized that it was not for cosmetic purposes and that anyone who would do it to counteract feelings of bodily inferiority should refrain. Ewing’s stories of screaming at his scars and feeling anything but satisfied with himself are precisely why.

And for the majority of people who are not at risk for such all-encompassing self-destruction, it is still worth asking ourselves as a culture if the aforementioned tradition of bonding through body-bashing brings us any self-esteem or joy.

 

 

What the Stubblefield Rape Case Means for Disability Rights

22 Nov

(Image by Maurizio Abbate used under CC license via)

 

When people continue to believe in a method that has repeatedly been proven not to work, what harm can it do? Does it matter that an herbal supplement is ineffective if someone who uses it says it truly makes them feel better? Does it really matter whether or not primates can learn American Sign Language or parrots can learn to read English out loud if it makes animal lovers so happy to believe that they do?

Misinterpreting animal communication can of course be dangerous. In 2007, a Dutch woman who insisted she was bonding with an ape at her local zoo refused to believe the primatologists’ warnings that staring directly into a male gorilla’s eyes and showing one’s teeth—i.e., smiling—triggers aggression. She refused to believe this even after the gorilla broke out of his enclosure and attacked her.

But what if someone assumes a living person is communicating with them? What if they assume said person is confiding their wishes and life choices in them? What if they can do so because we don’t share a common language with the person they claim to be speaking for?

Facilitated Communication, a.k.a. “FC,” is a method developed in the late 20th century to help severely disabled people with little or no speech communicate with others. By supporting their patient’s hand or arm, a trained facilitator could theoretically help the patient type out sentences, thereby “unlocking” intelligence previously obscured. The method was considered a breakthrough for patients with diagnoses ranging from severe autism to severe cerebral palsy. It was touted as a miracle for their loved ones, who understandably wanted nothing more than to be able to hear their thoughts, wants and needs.

Anna Stubblefield is a philosophy professor and disability rights advocate who, until recently, taught seminars about FC at Rutgers University. What she did not teach her students is that FC has been condemned over the past three decades by the American Psychological Association, the American Academy of Pediatrics, the American Association on Intellectual and Developmental Disabilities, the Federal Trade Commission, and the New York State Department of Health, among others. Double-blind testing generally reveals the facilitator to be subconsciously guiding their patient’s typing, rather than simply supporting it. This year Sweden banned FC in schools nationwide.

Professor Stubblefield adamantly rejects the classification of FC as a pseudoscience. Her mother was a pioneer of the technique. When one of her seminar students asked her in 2009 if it could perhaps help his severely disabled young adult brother—referred to in the press as “D.J.”—she agreed to treat him. A 20-page report in The New York Times Magazine chronicles Stubblefield’s increasingly intimate relationship with her patient, eventually culminating in her announcement in 2011 to his family that she and D.J. were in love. She planned to leave her husband and two children for him. As his legal guardians, D.J.’s family told her she had overstepped her boundaries and requested she leave him alone. When she did not, they eventually filed charges against her. They testified that gradually Stubblefield’s claims to D.J.’s interests and values—typed out in their FC sessions—had begun to sound suspiciously like things she would want him to say. Stubblefield was sentenced last month to 40 years in prison for sexual assault.

Another proponent of FC, Martina Susanne Schweiger, was convicted last year in Queensland, Australia, for performing sex acts on a 21-year-old patient who she believed had reciprocated her love via FC.

I’ve written before about widespread prejudices against disabled people and how often they deny us our sexuality. But disabled people also suffer sexual abuse at rates far higher than the general population. Most are taken advantage of by their family members and/or caregivers. Stubblefield and the remaining proponents of FC argue that their critics are ableist for denying D.J.’s capacity for intellect and intimacy. The prosecution argued that Stubblefield is ableist for assuming she knows what D.J. wants.

The desire to be the next Miracle Worker is understandable and so often noble. Who doesn’t want to help those in need? And the lure of the controversial in the pursuit of justice is not uncommon. From Jodie Foster and Liam Neeson in Nell to Sean Penn and Michelle Pfeiffer in I Am Sam to Adam Sandler and Don Cheadle in Reign Over Me, Hollywood is rife with love stories and courtroom dramas about a misunderstood outcast who has finally found the one open-minded hero who understands him, believes in him and then must fight to keep the cold-hearted, closed-minded authorities from tearing them apart.

Yet red flags should go up whenever there is a risk that a self-appointed advocate is putting words in someone’s mouth, no matter which side that advocate thinks they are on. Particularly when their patient or client belongs to a highly marginalized minority.

News of this case has elicited many head-shaking responses along the lines of, “Well, they all sound nuts.” One of the jurors told NJ.com, “I was like…‘You’re going to leave your husband and your kids for someone like this?’” Disability rights advocates rightly bristle at the infantilizing of D.J.—not to mention the salacious headlines that seem obsessed with his personal hygiene—while ultimately declaring the case incredibly sad. Yet we rarely use “nuts” or “sad” to describe male teachers convicted of seducing students unable to give consent. We describe them as predators or abusers.

Abusers of course rarely think of themselves as such. Child molesters are often convinced their victims were flirting with them. Few would consider themselves sadistic. Most are simply skilled at rationalizing their behavior to themselves. But regardless of what they believe their intentions are, abusers by definition deny others power in pursuit of their own.

The Stubblefield case and the Schweiger case highlight a very uncomfortable fact for disabled people everywhere: that some of the caregivers and activists working and sometimes fighting on our behalf are doing it to feed a savior complex. And anyone with a savior complex is not truly listening to those they claim to be helping.

Addressing this problem becomes increasingly difficult when we consider how very young the concept of disability rights is over the course of human history. Living in any other era, most of us would have been abandoned by our families in asylums or elsewhere. Ancient Spartans advised throwing us off cliffs after birth. Some modern philosophers, such as Prof. Peter Singer, still advocate infanticide for some disabled infants. Awareness of all this often makes us feel compelled to be eternally grateful to anyone who offers us any sort of support or help, regardless of whether or not it is truly helpful or respectful of our boundaries.

That we do not yet have the means to access D.J.’s thoughts and desires is indeed tragic. But opposition to FC does not mean we damn severely disabled people to the realm of hopelessness. On the contrary, accepting criticism of FC can only help to improve upon the ways in which researchers develop better practices and technologies. Relying on discredited methods would not have gotten Stephen Hawking his voice. Annie Sullivan prevailed with Helen Keller because she not only relied on rigorously tested methods but also shed her status as Keller’s sole communicator by enrolling her in an interdisciplinary program at the Perkins School. The ability to kill your darlings is an ingredient of innovation.

And any true investment in disabled people and the methods that best assist them must be accompanied by the credo activists began using around the time D.J. was born: Nothing about us without us.

 

 

Who Should Think You’re Beautiful?

11 Oct

(Image by Aphrodite used under CC 2.0 via)

From the Archives

 

Should beauty pageants stay or go? The New York Times tackled this question during the 87th Miss America Pageant. Amidst all the discussions about deferential giggles and zombie smiles, I find myself echoing the conventional wisdom that “Let’s face it, it’s all about the swimsuit round,” and Caitlin Moran’s wisdom that “You can call it the ‘swimsuit round’ all you like, but it’s really the bra and panties round.”

A decade ago Little People of America entertained the idea of holding an annual beauty pageant, but it was swiftly nixed by the vast majority of members.  The inherent problems were pretty obvious: Isn’t being judged by our looks the biggest problem dwarfs face?  Do we really want to set a standard for dwarf beauty?  And if so, which diagnosis gets to be the standard?  Achondroplasia or SED congenita?  Skeletal dysplasias or growth hormone deficiencies?  Ironically—or perhaps not—there was also a widespread fear that heightism would dominate the judging.

What I find most unsettling about beauty pageants is not the nondescript personality types on display—although I am very concerned about that, too—but the idea that it is perfectly normal and okay to want millions of strangers to love your looks above all else. This idea seeps into every corner of Western culture, not just beauty pageants and women’s magazines. 

If you’ve ever entered “body image” into a search engine, it won’t take you long to come across the phrase You’re beautiful! It’s everywhere, and it’s usually geared at anyone, particularly anyone female, who believes they fall short of the beauty pageant prototype. You’re beautiful! is part battle cry, part mantra – a meek attempt to broaden society’s beauty standards and an earnest attempt to bolster individual self-confidence. Superimposed over flowers and rain clouds and sunsets and cupped hands, the phrase makes it hard to tell the online empowerment apart from the online valentines. And as much as I admire the intentions behind it, I’m tempted to question it.

Making peace with our bodies is important. Diversifying our criteria for human beauty is necessary. But why should we need to hear that we’re beautiful from someone we don’t know? Of course we can never hear it enough from friends and lovers. (I’ve heard it three times in the last 24 hours and I’m not giving it up for anything!) But basing self-confidence on strangers’ praise upholds the notion that it is bad to be thought of as ugly or plain by people who don’t know anything else about you.

We all have our secret fantasies about being gorgeous rock stars and princesses and Olympic heroes with throngs of admirers dying to throw their arms around us.  But, to echo Jane Devin, if most men can go through life with no one but their lovers daring to praise their looks, why do women still demand so much attention? 

This past spring Scientific American revealed that, despite our culture’s suggestion that most of us need to hear over and over how attractive we are before we even begin to believe it, the average person overestimates their appearance. This shouldn’t be too surprising. The world’s largest empire isn’t called “Facebook” for nothing. And as the Scientific American author pointed out, the vast majority of us consider ourselves to be above-average in most respects, which is statistically impossible. He explains:

If you think that self-enhancement biases exist in other people and they do not apply to you, you are not alone. Most people state that they are more likely than others to provide accurate self-assessments.

Why do we have positively enhanced self-views? The adaptive nature of self-enhancement might be the answer. Conveying the information that one has desirable characteristics is beneficial in a social environment…  Since in self-enhancement people truly believe that they have desirable characteristics, they can promote themselves without having to lie. Self-enhancement also boosts confidence. Researchers have shown that confidence plays a role in determining whom people choose as leaders and romantic partners. Confident people are believed more and their advice is more likely to be followed.

So self-confidence is good and self-doubt is bad, both in love and in life.  And demanding strangers and acquaintances tell us that we’re beautiful is narcissism, not self-confidence.  In the words of Lizzie Velásquez, who was voted Ugliest Girl in the World on YouTube, “I don’t let other people define me.”

This is not to suggest a ban on praising anyone’s looks ever.  I still harbor adolescent crushes on a pantheon of celebrities, from George Harrison to Harriet Beecher Stowe.  But between the beauty pageants and the You’re beautiful! memes, it does seem that most of us still believe that having broad appeal is some sort of an achievement, as opposed to dumb luck.  And that for a woman, it’s an achievement worthy of mention on a résumé. 

In April, President Obama touted California Attorney General Kamala Harris as “by far the best-looking attorney general.” After dealing the president a well-deserved eye-roll, Irin Carmon at Salon suggested that before publicly praising someone’s looks, we should ask ourselves: Is it appropriate to tell this person and/or everyone else that I want to sleep with them?

It’s an excellent point, though it is crucial to add that seeing beauty in someone is not always rooted in lust. Love for friends and family usually renders them absolutely adorable or heroically handsome. Whenever I overhear someone say, “You’re beautiful!” it registers as an expression of either desire or affection. (Neither of which, Mr. President, is ever appropriate in a professional context.)

Yet plenty of us still envy Kamala Harris a little.  And too many of us seem to think being conventionally attractive is truly important because it corresponds directly to being successful in love.  This is perhaps the most dangerous myth of all. 

If I hear the phrase, “She was out of my league!” one more time, I’m going to swat the sad sack who says it.  My dating history is nothing to brag about, but I can brag—shamelessly—about being a trusted confidante to dozens upon dozens of different people with all sorts of dating histories.  And after a few decades of listening to them spill their hearts out, I’ll let you in on a little secret: When it comes to love and lust, everyone is wracked with self-doubt. 

And I mean everyone.  The athletes, the models, the geeks, the fashionistas, the bookworms, the jet-setters, the intellectuals, the rebels, the leaders, the housewives, the musicians, the Zen Buddhists, the life of the party.  That girl who can’t walk through a club or the office without being propositioned.  That guy known as a heartbreaker because he can bed anyone he wants to and does so.  That stoic who doesn’t seem to care about anything.  That wallflower so set on navel-gazing that she thinks she’s the only one who’s lonely.  Every single one of them has fretted to me at 2 am, sometimes sobbing, sometimes whispering, sometimes hollering, always shaking: “Why doesn’t he/she love me?!” 

This isn’t to say that it all evens out completely and no one handles it better than anyone else.  Outside of abusive relationships, those who obsessively compare dating scorecards and create rules and leagues for turning sex into a competition are invariably the most miserable.  Some people date a lot because they’re popular, others because they have low standards.  Some marry early because they’re easy to know and like, others because they’re terrified of being alone.  Just being able to easily land a date or get laid has never made anyone I know eternally happy.  Narcissism and self-pity come from thinking it can. 

We’d all like to be the fairest of them all, but what we want more than anything is to be devastatingly attractive to whomever it is we’ve fallen in love with.  And because only those who genuinely know us can genuinely love us, any beauty they see in us comprises our style, our charisma, our perfections and imperfections.  It is the driving force behind all the world’s great works of art we wish we were the subject of.  And unlike beauty pageants or Google’s image search, true art is constantly redefining and questioning and promoting beauty all at once.   

I will always tell certain people how gorgeous they are because I can’t help but think that about those I’m in awe of. (And I guarantee that my friends are prettier than yours.) But for those of you out there who might feel tempted to rebut the compliment with that age-old line, “You’re just saying that because you’re my [friend/partner/family]!” consider that a compliment motivated by true love is hardly a bad thing.

And that being desired by someone who doesn’t love you at all can get really creepy.  Really fast. 

 

 

Originally posted September 15, 2013

Content Warnings and Microaggressions

20 Sep


(Image by Nicolas Raymond used under CC license via)

 

There’s a heated debate going on over at The Atlantic about trigger warnings and microaggressions. For those less familiar with online minority rights debates, trigger warnings originated as labels for videos or texts depicting graphic violence, often sexual, that could be triggering for survivors of assault suffering from PTSD. They have since evolved into “content warnings,” used to label any video or text containing arguments, comments, humor or images that marginalize minorities. I most recently ran into one preceding a beer ad in which two brewers tried to joke about never wanting to have to do anything so humiliating as dressing in drag in the red-light district in order to earn money.

Jonathan Haidt and Greg Lukianoff have argued that content warnings have led to “The Coddling of the American Mind,” a culture of silencing, wherein too many are afraid to initiate dialogue on these issues, lest they offend. They criticize restrictive speech codes and trigger warnings, and suggest universities offer students free training in cognitive behavioral therapy in order to “tone down the perpetual state of outrage that seems to engulf some colleges these days, allowing students’ minds to open more widely to new ideas and new people.”

“Microaggressions” is a term coined in 1970 by Harvard professor Chester M. Pierce to refer to comments or actions that are usually not intended as aggressive or demeaning but nevertheless do contribute to the marginalizing of minorities. Examples would be female physicians being addressed as “Nurse” at the workplace. Or nurses, secretaries, cashiers, and storage room workers constantly hearing the widespread Western belief that low-skilled jobs deserve a low degree of respect. Or men still being expected to prove their worth through their career and never their emotional fulfillment. Or lesbians being asked if they’ve had “real sex.” Or anyone hearing from magazines, sitcoms or even loved ones that body types like theirs are something to avoid ending up with or hooking up with.

Microaggressions are the essence of insensitivity and they highlight the widespread nature of many prejudices about minorities. I analyze them all the time on this blog, without labeling them as such. Blogs that feature them in list form are easy to find.

Citing a sociological study by professors Bradley Campbell and Jason Manning, Conor Friedersdorf has argued that calling out microaggressions on social media sites has led to a culture of victimhood, wherein the emotions of the offended always matter more than the perpetrator’s intentions. Victimhood culture is “characterized by concern with status and sensitivity to slight combined with a heavy reliance on third parties. People are intolerant of insults, even if unintentional, and react by bringing them to the attention of authorities or to the public at large.”

Cue the overemotion. Simba Runyowa rightly rebuts that many of Friedersdorf’s examples of hypersensitivity are cherry-picked, but then goes on to deny that anyone would ever want to be seen as a victim. (Not only do most petitioning groups—whether the majority or the minority—claim to be the victim of the other side’s moral failings and undeserved power, but it appears he has never tried to explain what it’s like to have a rare condition, only to be interrupted by the insistence, “I think I have that, too!”) On the other side, Haidt, Lukianoff and Friedersdorf have attracted plenty of support from those who are only too happy to believe that college campuses and the blogosphere today are ruled by the PC police, rendering such places far worse than Stalinist Russia.

I rarely issue content warnings on videos or quotations or any examples of bigotry I analyze on this blog. My primary reason is that a majority of the content we consume every day is arguably misogynistic or heteronormative or ableist or racist or classist or lookist. This does not at all mean that we should not address those problems, but demanding “warnings” on whatever has marginalized me leaves me open to criticism for not doing the same for all the other injustices I may not see. As both a Beatles fan and a social justice blogger, I would always rather read or hear a comprehensive critique of John Lennon’s ableism than see warnings on his biographies.

And I don’t label microaggressions as such because I agree with Friedersdorf that the word seems at odds with its definition. Insensitivity can be very hurtful. It can contribute to feelings of alienation by functioning as a reminder of how millions of people might think of you. But it is not aggressive. Highlighting, questioning and debating ubiquitous prejudices, stereotypes and traditions is crucial to human progress. Mistaking ignorance for hostility, however, is an obstacle to it.

Would it be accurate and productive to post something like this?

Microaggression: Having to hear yet another parent talk about how thrilled they are to have been able to give birth “naturally.”

(Avoiding a C-section is never an option for women with achondroplasia like me.) And would it be accurate and productive to post something like this?

Microaggression: Having to hear yet another childfree blogger brag about how great it is to have the time and energy to do things I’ll never be able to do, like hiking or biking—let alone if I have kids.

Would it be more practical to tweet such complaints rather than pen an extensive article about the intricacies of the problem because few have time to read the particulars of considering parenthood with achondroplasia? Would posting them on a site featuring microaggressions serve as a much-needed wake-up call, convincing the perpetrators to see the issue from my perspective, or would it put them on the defensive? Would it spark dialogue or shut it down? Are the comments that marginalize my experience veritably aggressive? Feel free to share your thoughts in the comments.

But whether we think people on either side of the majority vs. minority debates are too sensitive or insensitive, we should be aiming for dialogue over exclamation points.

 

 

We Gotta Watch Our Language When It Comes to End of Life Debates

13 Sep


(Image by Ekaterina used under CC license via)

 

On Friday the British Parliament resoundingly struck down a bill that would guarantee its citizens the right to physician-assisted death. Yesterday California’s legislature voted to legalize it, making California the sixth state in the U.S. to do so.

Robust, nuanced arguments have been made for and against physician-assisted death for terminally ill patients, and none of these arguments could be successfully summarized within a single article. This is why a conclusive stance on the issue will never appear on this blog. It is nothing short of moving to hear the deeply emotional pleas from those in the right-to-die movement who have thought long and hard about the prospect of death, who feel empowered by having some choice when facing down a daunting fate, who don’t want to find out which of their loved ones may turn out to be unskilled at care-giving. And it is equally moving to hear the experiences of those working in hospice and palliative care who face the approach of death every day with the determination to make it as minimally painful and emotionally validating as possible for all involved.

However, despite the emotional validity of both sides, there are tactics the right-to-die movement should avoid if it does not wish to make our culture more ableist than it already is. Openness about end of life decisions can shed light on a subject previously cloistered away, but the more the right-to-die movement celebrates the idea of ending someone’s life before it transforms into a certain condition, the less willing the public may be to engage with and invest in those who live in that condition.

Which is why no one should call physician-assisted death “Death with Dignity,” as lawmakers in Washington, Oregon, and New York have done. The implication that anyone who opts out of assisted death might live an undignified life is reckless and arrogant. A patient declaring the prospect of invasive life-saving interventions “too much” is fair. A writer declaring the quality of life of those who opt for them “pathetic” is ostracizing. It insults not only those enduring late-life debilitation, but the everyday conditions of many, many disabled people of all ages around the world.

Even today, when so many movements push to integrate disabled people into the mainstream, the average person is generally isolated from the reality of severe deformity, high dependence, and chronic pain. This isolation feeds fear and is thereby self-perpetuating. As opponents have pointed out, many right-to-die arguments quickly snowball, equating terminal illness with chronic illness and disability, and portraying all three as a fate worse than death. Hence the name of the New York-based disability rights group Not Dead Yet.

Vermont’s recent law, the Patient Choice and Control Act, bears a far less polemical name than the others currently on the books. That’s a start. Experts are divided as to whether the current openness about end of life decisions in the U.S. has led to more terminally ill Americans considering and opting for hospice and palliative care. Regardless, both sides should be encouraging well-informed discussions that honor a patient’s right to voice his beliefs based on personal experience, and a disabled person’s right to not be further marginalized by a culture that has historically feared her existence.

 

*Note: I use “physician-assisted death” and other terms in deference to World Suicide Prevention Day this past Thursday and the media guidelines from the Centers for Disease Control and Prevention, which discourage use of the word “suicide” in headlines to avoid contagion.

 

What Do You Think of When You See the Word “Healthy”?

6 Sep

(Image by Courtney Rhodes used under CC 2.0 via)
 
In late 2013, journalist Katy Waldman examined the juicing trend, which was cropping up in the corners of Western society where there is a heavy focus on modern notions of “natural and organic” (think anywhere from Berlin’s Prenzlauer Berg to Burlington, Vermont and Berkeley, California) as well as in those where people competitively strive to follow the latest fashions in health and beauty (think the high-earning sectors of London, Manhattan or Los Angeles). Two years later, lifestyle writers have declared that juicing has staying power, despite Waldman’s disturbing findings. Along with finding little to no evidence that cleansing the body with juice can be physically beneficial, she revealed that the language of most detox diets echoes the language used by those struggling with disordered eating – i.e., the idea that most of what the masses eat is on par with poison and you’re a bad person if you don’t purge it. She writes:

After days of googling, I still have no idea WTF a toxin is… Cleansing acolytes use the word toxin loosely, as a metaphor for our lapsed lifestyles…. The problem with this way of thinking is that food and weight are not matters of morality. Thin is not “good,” carbs are not “bad,” and in a world of actual pressing political and social ills, your dinner plate should not be the ground zero of your ethical renewal.

I’m neither a supporter nor an opponent of juicing in particular. People should drink whatever they want to drink. But Waldman made a fantastic point about the way the upper and middle classes in the West so often believe one’s health to be a sign of one’s morality.

This idea is hardly new. The eugenics craze of the 19th and 20th centuries—which culminated in the Nazis exterminating “degenerates”—involved Fitter Families contests held at county fairs wherein judges handed out trophies to those deemed to have the best heritage, skin color, and tooth measurements. Professor Alan Levinovitz argues in Religion Dispatches that these attitudes have survived into the present, altered only ever so slightly: “The sad thing is, it’s really easy to judge people on the basis of what they look like. We have this problem with race. In the same way, it’s really easy to look at someone who’s obese and say, ‘Oh look at that person, they’re not living as good a life as I am. They’re not as good on the inside because I can tell their outside isn’t good either.’ ”

Do we as a culture believe that being “healthy” is about appearance? Dieting often dictates that it’s about behaviors measurable through appearance. Psychologists agree to the extent that their notions of “healthy” are about behavior, but their notions also frequently intersect with notions of being “good.” But is being “healthy” about being brave, honest, generous and humble? Physicians would generally argue it’s about staving off death. Right-to-die advocates would argue it’s about quality of life over longevity. Is being healthy a matter of what scientists decide? Ed Cara found earlier this year that weight loss does not lead to happiness. Is happiness a measure of being healthy? Or are you only healthy if you suffer for it? Concepts of “healthy” vary vastly from person to person, and across cultures. Is that healthy?

In The Princess Bride—probably the Internet’s second-most quoted source after Wikipedia—the hero cautions, “Life is pain. Anyone who says differently is selling something.”

Yet the villain says, “Get some rest. If you haven’t got your health, you haven’t got anything.”

Whether you agree with any or none of the above, leave me your thoughts on the meaning of “healthy” either in the comments or via e-mail to paintingonscars[at]gmail.com.

 

 

The Problem of Dwarfs on Reality TV

30 Aug

(Image by Natasha Mileshina used under CC license via)

The new television schedule has kicked off both in the U.S. and the U.K. with the usual plethora of reality TV shows and the usual high number of shows zeroing in on people living with dwarfism: The Little Couple; Seven Little Johnstons; Our Little Family; Little Women of L.A.; and the grandfather of them all, Little People, Big World. Besides the patronizing titles and taglines, the shows feature factoids about dwarfing conditions and lots of melodrama thrown in with some social critique lite.

After I handed my life story over to a journalist for the umpteenth time this past spring, my husband and I discussed how important it is to be able to trust that your storyteller will not exploit you for entertainment value. It takes a perceptive mind and an agile hand to elucidate dwarf-related topics like bio-ethics, self-image, political correctness, beauty standards, harassment, adoption, job discrimination, pain management, and reproductive freedom—all of which could and have filled scholarly journals and books—via mere sound bites. At one point in the conversation my husband paused and said, “Just to make sure we’re on the same page, honey – we’re never appearing on reality TV. Right?”

I laughed and nodded reassuringly.

He was not unwise to worry.

Reality TV offers its subjects fame at the expense of their dignity. Documentaries and news features also carry this risk, but one element that distinguishes reality TV from journalism is the rock-solid guarantee of fights, tears, and bad-mouthing. For some participants there may be gratification in knowing that millions of viewers are interested enough in you to want to watch how you live every waking minute of your life, but it comes with the unspoken fact that they’re also waiting for you to slip up so that they have a good story to hash out among their friends and in gossip columns.

We are all vulnerable to voyeuristic temptation and the media knows this. It’s why it offers us up-close shots of survivors’ tears as soon as possible, and it’s why we click on them, despite recent and compelling arguments that this is socially irresponsible. The message of reality TV seems to be that no one really ever moves beyond middle school jealousy and superficiality, so we might as well let it all hang out. The better angels of our nature be damned.

Years ago, Cathy Alter mused in a glib article in The Atlantic about her rather bizarre obsession with dwarf reality shows. The greatest revelation came from her therapist, who explained, “I think regular size people feel more secure as people when they can observe midgets… I think that contrast is validating because we tell ourselves that at least there are people who have it worse, because they are small… We need the midgets to feel normal.”

This confirms what I have always suspected and, admittedly, feared: that millions of people are watching under the guise of wanting to understand difference while ultimately enjoying getting to look at lots of juicy pictures of freaks. This is why these sensationalist shows do so well, while earnest, in-depth documentaries like Little People: The Movie remain out of print. Before the birth of reality TV in the late 90s, dwarfs were most often featured on daytime talk shows, alongside episodes featuring people caught in affairs and people who believed they were the reincarnation of Elvis.

As I am often the only dwarf in a given person’s circle of acquaintances, I have been told by many how touching they find these shows. How wonderful it is to see that “dwarfs are just like everyone else!” I can accept that there will probably always be a market for shallow entertainment that twists tragedy into soap opera and reduces the complexities of life into easy-to-swallow sentimentality, no matter how far society progresses. Tabloids will continue to exist because millions of people—including kind, intelligent people I know—will continue to buy them. In this regard, the individual shows are not so much the problem as is the fact that they are where TV viewers are most likely to see people with dwarfism.

Actress Hollis Jane, who called out Miley Cyrus last year for exploiting performers with dwarfism as sideshow acts, explained this summer why she turned down a contract to appear on Little Women of L.A.:

Other than Peter Dinklage, Tony Cox (Bad Santa) and Danny Woodburn (who played Mickey Abbott on Seinfeld), it’s nearly impossible to name successful actors and actresses who also happen to be little people. People get upset about the Kardashians representing women in America but for every Kardashian there is a Meryl Streep, a Natalie Portman, or a Zoe Saldana. Little people don’t have that. I have wanted to be an actress since I was in first grade and I played the angel, Gabriel, in a nativity play. I held firm to this dream until sixth grade when a parasitic thought crawled into my head and told me that I would never be an actress because I was a little person. I realized that since there was no one on television who looked like me, it meant that there would never be… When Game of Thrones premiered, my world was rocked. Peter Dinklage was doing the impossible. He was being taken seriously as an actor without exploiting his height for shock value or a joke. The night he won his Emmy, I cried for an hour.

She adds, “I have nothing against the women on these reality shows. There is a part of me that thinks it’s great we have little people on TV in any capacity…but I also think we deserve more than that.”

If the general public truly believed this, if reality TV viewers truly saw their dwarf subjects as their equals rather than curiosities, then we would see a lot more dwarfs as newsreaders and game show hosts, starring in sitcoms and dramas, playing the lead detective and the lovely heroine and the hero facing impossible odds to save the day. Perhaps that day will come, but for now few people can name a single dwarf actress and many dwarfs get told that they look like “that guy on the show about the little people.” That’s our reality.

For Anyone Who Has Ever Been Asked “So What Do You Like to Be Called?”

2 Aug

 

Leaving you this summer day with some astute observations from comedian Hari Kondabolu about the power of social constructs, or rather, our strong attachment to them.

 

 

Difference Diaries Wants to Hear from You

19 Jul

Copyright Difference Diaries

 

I have recently become the Director of Educational and Multimedia Outreach at the Difference Diaries, and today marks the launch of the Difference Diaries Blog. We want submissions and we want them now.

The Need. This week Freeburg High School in Illinois jubilantly voted down a petition by Little People of America to retire their school mascot, the Freeburg Midgets.

Such incidents are hardly isolated. Dwarfs rarely make the news, and when we do, we often wish we didn’t. Two summers ago Slate magazine, one of my favorite socio-political periodicals geared at young adults, kicked off a blog about Florida with an opening article called, “True Facts About the Weirdest, Wildest, Most Fascinating State.” Among the facts that apparently render the Sunshine State weird are the python-fighting alligators and “a town founded by a troupe of Russian circus midgets whose bus broke down.” On the day of its release, Slate ran the article as its headline and emblazoned “A Town Founded By Russian Circus Midgets” across its front page as a teaser.

Face-palm.

Here’s the thing about dealing with all this. You get used to it, but not forever and always. Sometimes it rolls off your back, sometimes it hits a nerve. This time, seeing a magazine as progressive as Slate brandish RUSSIAN CIRCUS MIDGETS on its front page while leaving disability rights out of its social justice discussion brought me right back to college, where friends of friends called me “Dwarf Emily” behind my back and someone else defended them to my face. Where classmates cackled about the film Even Dwarfs Started Small—“because it’s just so awesome to see the midgets going all ape-shit!”—but declined my offer to screen the documentary Dwarfs: Not A Fairy Tale. Where a professor was utterly outraged that her students didn’t seem to care about immigration rights or trans rights, but she never once mentioned disability rights. Where an acquaintance asked to borrow my copy of Stiff: The Curious Lives of Human Cadavers, but awkwardly turned down my offer to lend her Surgically Shaping Children. Where roommates argued vociferously that they would rather be euthanized than lose the ability to walk. Where jokes about dwarf-tossing were printed in the student newspaper.

I won’t go into certain crude comments that involved me personally, but I will say that when a friend recently, carefully tried to tell me about how shocked he was to find a certain video of dwarfs in a grocery store, I cut him off and said, “Lemme guess, it was a dwarf woman porn video? That’s one of the top search terms that bring people to my blog.”

This is not to ignore all those I’ve met who, despite their lack of experience with disability, ask carefully constructed questions and consistently make me feel not like a curious object but like a friend who is free to speak her mind about any part of her life experience. And some young adults are doing awesome work for disability rights and awareness. But when a journalist and mother of a disabled twentysomething recently said to me, “No one wants to talk about disability rights – it’s not seen as sexy enough,” I knew exactly what she was talking about.

Maybe this is just a matter of my growing up, leaving the cocoon of childhood and finding out how uncaring the world can sometimes be. But ableism among young adults in the form of silence and/or sick fascination is a lot more prevalent than many would like to admit. And why does it have to be? Are physical differences truly not sexy enough? Is it because we associate disabilities, diseases and related issues—like caregiving—with older people and with dependence? Dependence is usually the last thing to be considered cool. But does it have to be?

The Means. As a non-profit organization, Difference Diaries aims to ignite ongoing conversation that will contribute to better lives for those living with defining difference as well as friends, families, and perfect strangers who “just never thought about it.” The young adults who share their stories offer real insights and an opportunity for viewers and readers to know a little more about “what it’s like.”

We focus on conditions as diverse as the individuals living with them, including cancer, hemophilia, dwarfism, sickle-cell anemia, albinism, facial deformity, blindness, HIV, amputation, hemangioma, vitiligo, diabetes, renal disease, Crohn’s disease, cystic fibrosis, cerebral palsy, OCD and more.

This is why we want to hear from you. We are seeking blog submissions about living with Difference as a young adult. Prospective bloggers should consider: What does Difference mean to you? What is your personal experience of being Different? What has to be explained most often at work, school, out in public? What would be the most helpful thing for people to know about your Difference? How would you like to see society improve in how it handles Difference?

Send us your submissions via e-mail to info[at]differencediaries.org.

 

 

Mother Petitions to End Germany’s Nationwide Youth Games

5 Jul

(Image by Tableatny used under CC license via)

The Nationwide Youth Games (Bundesjugendspiele) are a 95-year-old annual tradition here in Germany wherein students ages 6 to 16 spend a day competing against each other in track and field, swimming, and gymnastics. The total scores are read off in a ceremony before the entire school, and those who accumulate a certain number of points are awarded either a “certificate of victory” or a “certificate of honor.” Since 1991, “certificates of participation” have been handed out to the rest of the students.

After her son came home sobbing at having received a mere certificate of participation two weeks ago, journalist Christine Finke started an online petition to put an end to the Games. She explains on her blog:

I’m doing this for all the children who feel sick to their stomach the night before the Nationwide Youth Games, for those who wish they could disappear into the ground during the Games, and for those who want to burst into tears during the awards ceremony… Sports should be fun and make you feel good about your body. But the Nationwide Youth Games are founded on grading: on the upgrading and degrading of some at the expense of others.

She dismisses the Games as a relic of the Nazi era, and while the original Reich Games preceded Hitler, founder Carl Diem did go on to be an active member of the regime who instrumentalized the Games as propaganda for the Nazi obsession with bodily perfection. Finke points to the Nazi-like language of her critics on Twitter: “Our children shouldn’t be allowed to turn into sissies.”  Indeed, mottos such as Only the strong survive, commonly found in sports culture in the U.S. and other countries, are not taken lightly here in Germany, where sick and disabled citizens were murdered en masse less than a century ago.

As a semi-disabled kid, I had plenty of physical limitations, but, like most kids, I enjoyed the sports that I could play fairly well (baseball, tennis, jump rope) and I quickly got bored with those that put me at the bottom of the class (basketball, football, soccer).  Due to the vulnerability of the narrow achondroplastic spinal column, I wasn’t ever allowed to participate in gymnastics, and contact sports were forbidden after the age of 10 when my peers began to tower over me.  I countered the feelings of exclusion with feelings of pride for holding the pool record for staying underwater (1 minute 15 seconds), and for surpassing everyone in the joint flexibility tests. But what about the kids whose bodies ensure that they will never surpass anyone else in any competition? The best advice I ever got came from my primary school physical education teacher: “If you had fun, you won.” 

But then came adolescence, and with the onset of puberty, the body suddenly is no longer merely something that gets you from place to place. It becomes an object you are expected to sell to others in the brutal competition of dating and mating. It’s no wonder that an almost debilitating self-consciousness encompasses so many, whether in the form of sitting out of sports, refusing to ever dance or, in extreme cases, developing disordered eating habits.

I asked adult German friends how they felt about the Games. “It is the most humiliating memory I have from school!” one responded.

“It’s more likely to teach people to stay far, far away from sports for the rest of their lives, rather than inspire them to be more physically active,” argued one mother.

“Ach, it wasn’t humiliating,” insisted one man. “It was boring. It was all about skipping out to go smoke cigarettes while the super-athletes had their fun.”

“Exactly!” chimed another. “No one cared about it except the ones who won everything.”

I spent my high school years as the scorekeeper for the girls’ volleyball team at the urging of one of the two coaches, both of whom I admired greatly. Throughout three years of volleyball games, I witnessed edifying examples of cooperation and self-confidence, and I witnessed a lot of childishness and borderline cruelty from overemotional adults as well as teens.

From that time on, I’ve generally viewed competitive sports the same way most people view rodeos or yodeling clubs – i.e., good for you if you derive joy from that sort of thing, but the competitions and the medals say nothing to me about whether or not you’re a lovely person. 

Of course athletic achievement can signify important life skills like self-discipline and teamwork, as a recent Michigan State University study has found. But sports are not necessary for developing those skills. Self-discipline can also be demonstrated by reading two books a week or vowing to learn a foreign language and actually doing it. Tolerance, self-confidence and decisiveness have been shown to increase among students who study abroad. Teamwork can be learned from playing in a band. Or, as LeVar Burton taught us on Reading Rainbow, an aerobics-inspired dance troupe.

In arguing to keep the Games, physical education teacher Günter Stibbe says, “Sports are brutal, of course.  But students have to learn how to deal with humiliation.”

Indeed, narcissism is characterized not just by excessive bragging but also by reacting badly to criticism or failure. Performing poorly in sports—or in any field—can be an opportunity to learn to accept all the moments in life when you won’t be seen as special. But the idea that the body is only worth what it can do is deleterious. And too many educators fail to teach students the dangers of being too competitive and fearing weakness.

The heavier burden may in fact fall on those who come out on top in high school and risk panicking later when they learn that the big wide world doesn’t really care about how many points they accrued in the discus throw back when they were 16. Both the losers and the winners would benefit from learning that athletic competitions in youth are no more important than rodeos or yodeling competitions at any time in your life. After all, points and medals are no indication of whether or not you’ll know how to pursue healthy relationships, be a responsible member of your family and community, or find a fulfilling career. Those who go on bragging into adulthood about how hard they just worked out down at the gym—or how many books they read, or how much they earn—usually appear to be compensating.

This is perhaps why Stibbe criticizes the tradition of reading the scores aloud in front of the whole school as “pedagogically irresponsible.”

But in Der Spiegel’s online survey, there is no option for arguing for the Games on the grounds of sportsmanship and accepting one’s limitations. The two arguments to click on to support the tradition are “For God’s sake! It was the only thing I was ever good at in school!” and “What else would we do with our crumbling race tracks?” The majority of the 57,000+ respondents chose the latter.

 

 

Why Do Names for Minorities Keep Changing?

14 Jun

midget not wanted(Image by CN used under CC 2.0 via)

I’ve been writing about the word “midget” more than usual this month, thanks to an Irish public service announcement and then GoogleTranslate. The taboo nature of the word in the dwarf community is almost amusing when we consider that the world’s largest dwarf advocacy organization, Little People of America, was originally named Midgets of America. No lie. (You can read about why I feel that the change was hardly an improvement here and why others do as well here.)

Minority names have been changing a lot throughout the last century. This social pattern has been dubbed the euphemism treadmill by psychologist Steven Pinker. Toni Morrison has pointed out that it’s all about power: “The definers want the power to name. And the defined are now taking that power away from them.” But as names for minorities keep changing, many laypeople keep complaining about the seemingly convoluted nature of it all:

“Can’t they just stick to a name and be done with it?”

“Why should I have to be careful if they’re going to be so capricious about it?”

“It seems like they’re just looking for us to slip up so they can call us out!”

It’s not hard to understand where this frustration comes from. No one likes being accused of insensitivity for using a word they had thought was in fact accurate and innocuous. But rarely does anyone ask why the names change.

In 2010, President Obama signed Rosa’s Law, establishing “intellectually disabled” as the official government term to describe what in my childhood was referred to as “mentally retarded.” “Mentally challenged” and “mentally impaired” were other terms suggested and used in PC circles in the 1990s. Already I can sense a good number of my readers wondering whether these changes were truly necessary. I can also sense, however, that few would wonder whether it was necessary to abandon the terms “idiots,” “morons,” and “imbeciles” to refer to such people.

“Idiot,” “moron,” “imbecile,”  and “dumb” were all medical terms before they were insults, used by doctors and psychologists across the Anglophone world. But gradually laypeople started using them to disparage any sort of person they disagreed with. And now this is their only purpose. Instead of getting all of us to stop using these words as insults, the medical minorities have stopped accepting them as official names.

The names for psychiatric disorders and developmental disabilities are particularly prone to being re-appropriated by the mainstream to describe behaviors and tendencies that barely resemble the diagnoses. “Sorry, I wasn’t listening,” I once heard a colleague apologize. “I have such ADD today.”

“I think you’re becoming pretty OCD,” quipped a friend upon perusing my books, which are strictly organized by size.

“That movie kept going back and forth. It had no point! It was so schizophrenic.”

For over 10 years now, psychiatric researchers and patients have been working to abandon this last one. Using “schizophrenic” to describe anything that oscillates between two opposing views or behaviors can easily lead to widespread ignorance about the intricacies of the condition. “Psychosis susceptibility syndrome” is one proposed replacement, but the ubiquity of “psychotic” in common parlance may prove to be equally problematic. “Salience syndrome” was the term most preferred by patients participating in a survey at the University of Montreal published in 2013, the same year as the most recent edition of the Diagnostic and Statistical Manual of Mental Disorders.

This is the choice we have about labels for minorities: We either stop using minority labels to insult people, or get used to minorities asking us to use different labels to refer to them.

But if only it were that simple. Getting people to abandon marginalizing terms for minorities without fighting about it is as fraught as the phrase “political correctness” itself. Two reactions are all too common in any given conversation about political correctness, and both invariably botch it:

  • Libertarian Outrage: “You can’t tell me what to say!  I can call anyone what I wanna call ’em and it’s their own fault if they’re upset!”
  • Liberal Outrage: “I’ll humiliate you for using an old-fashioned term because PC is all about competition and it feels cool to point out others’ faults.”

Both reactions are based on a refusal to listen and a readiness to assume the worst of the other side. Plenty of anti-PC outrage is fueled by the belief that any discussion about names and language is hot-headed and humorless, and plenty of liberal bullying is fueled by the belief that honest-to-goodness naiveté is as morally objectionable as outright hostility.

Political correctness is not a competition, and if it were, it would be one that no one could win. A human rights activist may be an LP with SAD who is LGBTQIA and know exactly what all those letters mean, but they may not know that “Lapland” and “Fräulein” are now considered offensive by the people once associated with them. And they are even less likely to know about the taboo term in German for the former Czechoslovakia.

And as someone who’s spent her life having to decide how she feels about “midget” and “dwarf” and “little person,” I can tell you that attitudes are far more important than labels. Because even if the word often matches the sentiment, this is not always the case. There’s a difference between the stranger who told my father when I was a kid, “She’s an adorable little midget!” and the coworker who told my cousin recently, “The best thing about Game of Thrones is getting to laugh at that midget!” 

I will always prefer having an in-depth discussion with someone about the meaning of dwarfism to calling someone out for using a certain word. I will always prefer hearing someone earnestly ask me how I feel about a certain word to witnessing them humiliate someone else for uttering it.

Too often these discussions are boiled down into simple lists that start to look like fashion do’s and don’ts, and this is perhaps the gravest insult to the noble intentions of those who kick-started the PC movement. As one progressive blogger pointed out years ago in The Guardian, her lesbian parents are firm supporters of trans rights and, up until recently, used the word “tranny” without any idea that it is widely known among trans people as a pejorative. Too much sympathy for the couple’s ignorance could be harmful. When the mainstream insists that no one should be expected to know about newly taboo terms for minorities, it implies that no one should be expected to be listening to the human rights conversations that are going on about these groups. But conversely, too little sympathy for sheer ignorance is equally unproductive.

Because bigotry is not ignorance. As a wise man said, bigotry is the refusal to question our prejudices.   

Interview on Berlin Television

6 Jun

(Image ©Ines Barwig)

 

Berlin’s public broadcasting station rbb has just aired a report on Painting On Scars, which you can read about and watch here.

For those of you not fluent in German, I advise you against using GoogleTranslate. As a professional translator, I’ve always considered the service a bit of a rival, but now we’re talking full-blown war. Because while any half-educated human Germanist could tell you that the rbb report translates into English as “Short-Statured – Getting Taller Through Operations,” Google says:

 

(GoogleTranslate screenshot)

 
 

When It Comes To Health, Who Should Minorities Trust?

12 Apr

Medication diet squircle(Image by Barry used under CC 2.0 via)

 

At the beginning of this year, I underwent orthopedic surgery, and rare complications immediately arose from it, causing me to take three months of sick leave. In that time, both my country of origin and my country of residence experienced outbreaks of measles that have set the Internet ablaze with raging arguments about medicine, personal choice and the greater good. While the critics of Big Pharma have plenty of good points, recent studies of Big Herba—which is unregulated in the U.S.—have exposed an array of flaws that can be deadly. Setting aside the vitriol, at the crux of the matter lies a very reasonable question: When it comes to health, who should you trust?

“Trust your doctor” sounds simple enough until we consider the many instances throughout history when medical professionals have abused this trust, particularly in regard to minorities. Health organizations around the world classified gay people as mentally ill as late as 2001. A panelist on Larry Wilmore’s The Nightly Show last month cited the Tuskegee syphilis experiment, which treated African-American men like lab rats from 1932 to 1972, as the basis for his overarching distrust of government health organizations. Investigations recently revealed that the U.S. Public Health Service committed similar crimes against mental patients and inmates in Guatemala in the 1940s. The polio vaccine, which has saved millions of lives globally, was first tested on physically and mentally disabled children living in asylums and orphanages. Researchers advocated the forced sterilization of trans people and ethnic minorities as recently as 2012. And of course there were the Nazis and the many, many scientists before them who passionately promoted eugenics. ITV recently rebroadcast a documentary hosted by Warwick Davis detailing Dr. Mengele’s horrific experiments on dwarfs at Auschwitz.

In other words, minorities don’t have to dig too deep to come up with plenty of reasons to be wary of scientists and doctors. Regulation, transparency and a never-ending, highly public debate on bio-ethics and human rights are necessary to prevent such crimes from happening again.

But an ideological opposition to all doctors based on such abuses ignores the myriad successes. A Slate article appearing last fall, “Why Are You Not Dead Yet?” catalogs the thousands of reasons so many of us are living so much longer than our ancestors did—from appendectomies to EpiPens to everyday medications—which we so often overlook because we have come to take the enormous medical advances of the past 200 years for granted.

And yet, as so many scientists are only too ready to admit, science does not know everything. Almost no medical procedure can be guaranteed to be risk-free, and many people base their distrust of doctors on this fact. An acquaintance just cited my current post-surgical complications to me as reason enough that I never should have had the operation at all and should have gone to a TCM healer instead.

In my 33 years I have undergone 14 surgeries, physical therapy, hydrotherapy, occupational therapy, electro-muscular stimulation therapy, and the list of medications I’ve taken undoubtedly exceeds a hundred. I have also been treated with reiki, shiatsu, osteopathy, acupuncture, massage, prayer, and herbal remedies based on macrobiotic, homeopathic and detox theories. Some of these treatments I chose as an adult, and some of them were chosen for me by adults when I was a child and a teen. Some of the medical treatments worked, some didn’t, and some caused new problems. Some of the alternative treatments rid me of lingering pain, and some were a complete waste of time, money and energy as my condition worsened. I won’t ever advocate any specific treatment on this blog because my readership is undoubtedly diverse and the risk of making inaccurate generalizations is too great.

Indeed, a grave problem in the public debate on health is the frequent failure to acknowledge human diversity. Most health advice found online, in the media, at the gym or a healing center is geared not at minorities but at physiotypical people, who are seeking the best way to lower their risk of heart disease, fit into their old jeans, train for a marathon, or simply feel better. They are not seeking the best way to be able to walk to the corner or have enough strength to shop for more than half an hour. Those in the health industry who endorse one-size-fits-all solutions—“We just need to jog/Start tai-chi/Eat beans, and all our troubles will go away!”—rarely address minority cases that prove to be the exception to their rule. But atypical bodies have just as much to teach us about our health as typical bodies, and leaving them out of the conversation benefits no one but those seeking to profit off easy answers.

When it comes to seeking treatment for my condition, I follow a simple rubric: I don’t want to be the smartest person in the room. I have no professional training in medicine or anatomy. As this physician explains so well, self-diagnosis is a very dangerous game. Yet I sometimes am the expert on my body thanks to the relative scarcity of people with achondroplasia—there are only 250,000 of us on earth, or roughly 0.004% of the world population—compounded with the scarcity of people with achondroplasia who have undergone limb-lengthening and sustained bilateral injuries to the anterior tibialis tendons. A visit to a healing center or a hospital often entails conversations like these:

Shiatsu Healer: You’re walking with a sway-back. Your wood energy is obviously misaligned because you are stressed.

Me: My hips sway when I walk because the ball-and-socket joint in the hip is shaped instead like an egg-and-socket in people with achondroplasia.

***

Physical Therapist: Your hips sway when you walk because one leg is obviously longer than the other.

Me: No, I have my orthopedist’s report documenting that my legs are precisely the same length. My hips sway when I walk because the ball-and-socket joint in the hip is shaped instead like an egg-and-socket in people with achondroplasia.

 ***

Nurse: Your temperature is pretty high. I’m a bit worried.

Me: These anesthesiology guidelines I got from the Federal Association for Short-Statured People say that hyperthermia is to be expected post-op in patients with achondroplasia.

Sometimes the information I offer goes unheeded. In both the U.S. and Germany, I have found arrogance to be equally common among doctors and healers. Some of them are delightfully approachable, and others are so socially off-putting that they make you want to throw your wheelchair at them. The same arrogance, however, can take different forms. I have documented before the particular brand of pomposity so endemic to doctors, and it is safe to say that holistic healers are less likely to treat their patients like products on an assembly line because, by definition, they are more likely to take psychological well-being into account. But they are also more likely to endorse a one-size-fits-all solution for health, which invariably marginalizes minorities like me.

Those of us with extremely rare conditions are far more likely to find specialists among those licensed in medicine than among alternative healers. Living Naturally, the only website on alternative treatments I could find that even mentions achondroplasia, emphasizes that none of the therapies they suggest for achondroplasia have ever been tested on patients who have it. To be fair, rare conditions by definition are not well-known to your average GP either. But physicians more often know how to work with the facts, embracing the medical literature on achondroplasia I hand to them. Some alternative healers also embrace such literature, while others dismiss anything written by anyone in a white coat.

Even when a visceral hatred of hospitals and their hosts is irrational, it is understandable. My most recent stay involved some of the kindest medical professionals I have ever encountered but nevertheless left me waiting for two and a half hours on a metal bench with no back support in a hallway glaring with fluorescent lights and echoing with the cries of patients in pain. I respect everyone’s right to opt against surgery, or any medical treatment, as long as their condition does not cause others harm. But no matter how much modern medicine has abused minorities’ trust, disabled people are the only minority that cannot afford to forgo it.

A worldwide study presented to Little People of America found that, at this point in history, dwarfs have a higher quality of life—i.e., access to effective health care, employment opportunities, acceptance in society—in Northern Europe than anywhere else on earth. Reductive arguments that demonize all of Western medicine because the Nazis! can be canceled out by reductive arguments that dismiss anything developed outside the West because Asia’s terrible disabled rights record!  

Broad generalizations like “Natural is better” can only be upheld by those ensconced in the privileges of a non-disabled body. In 2011, the parenting website Offbeat Families banned the term “natural birth”—urging writers to instead refer to “medicated” and “unmedicated” birth—because “natural” had so often been used to imply “healthier.” An unmedicated birth is wonderful for anyone who can and wants to experience it, but it is important to remember that it is a privilege. A privilege, like a disability, is neither your fault nor your achievement.      

“Healthy” is a relative idea. Our choices about our bodies will always be limited. This is a sometimes terrifying fact to face. But in the public debate, we must remember that it is a fact those among us with rare disabilities and conditions can never avoid. In failing to remember it, we fail to make decisions about human health that are truly informed.

 


Pasch and Passover

5 Apr

(xkcd comic used under CC 2.5 via)

 

Taking this holiday weekend off and leaving you with the philosophical musings of Duncan Hull.  Until next week!

 

 

“We’ve Never Lived in Such Peaceful Times”

4 Jan

Time allowed(Image by H. Kopp-Delaney used under CC 2.0 license via)

 

“Is the world becoming a more dangerous place?” This is not a subjective question, but it is all too often answered by entirely subjective findings. Do you watch the local news and listen to a police scanner? Do you see graffiti as street art, or cause to clutch your valuables and not make eye contact with anyone? Do you know someone personally who has been robbed, attacked, or murdered?

The objective answer to the original question, however, is no. The world is in fact safer than it has ever been in human history because we humans have become drastically less violent. Never before has there been a place of such high life expectancy and such low levels of violence as Western Europe today. Around the globe, there are lower rates of war and lower rates of spankings. There is no guarantee that the decline in violence will continue. But most of us have a hard time even believing that it exists at all.

In his book The Better Angels of Our Nature, Harvard psychologist Steven Pinker shows that the human emotional response to perceived danger—especially danger towards ourselves or someone with whom we can easily empathize—always risks distorting our perceptions of safety. One of the problems of empathy, he argues, is that we more readily feel for those we perceive to be more similar to us. This results in our investing more time, money and emotion toward helping a single girl fighting cancer if she speaks our language and lives in a house that looks like our own than toward helping 1,000 foreign children fighting malaria. We are more likely to disbelieve a victim of abuse if we can more quickly identify with the accused, and the same is true for the reverse scenario. And if you have been the victim of a horrendous crime or are struggling to survive in any one of the countries ravaged by war this year, you may become angry at any suggestion that the world is getting better, lest the world ignore the injustices you have suffered.

Those of us working in human rights must beware these problems whenever we trumpet a cause. Every activist’s greatest enemy is apathy, and fear of it can lead us to underscore threats while downplaying success stories in order to keep the masses mobilized. But any method founded on the claim that we have never lived in such a dangerous time is spreading lies.

As Pinker and Andrew Mack report in a recent article:

The only sound way to appraise the state of the world is to count. How many violent acts has the world seen compared with the number of opportunities? And is that number going up or down? … We will see that the trend lines are more encouraging than a news junkie would guess.

To be sure, adding up corpses and comparing the tallies across different times and places can seem callous, as if it minimized the tragedy of the victims in less violent decades and regions. But a quantitative mindset is in fact the morally enlightened one. It treats every human life as having equal value, rather than privileging the people who are closest to us or most photogenic. And it holds out the hope that we might identify the causes of violence and thereby implement the measures that are most likely to reduce it.

There is a risk that some will see the decline in violence as reason for denying crime (“Rape hardly ever happens!”), dismissing others’ pain (“Quit whining!”), and justifying their disengagement (“See? We don’t need to do anything about it!”). Pinker and Mack, however, claim the decline can be attributed in the modern era to the efforts of those in the human rights movements. In the example of violence against women:

The intense media coverage of famous athletes who have assaulted their wives or girlfriends, and of episodes of rape on college campuses, have suggested to many pundits that we are undergoing a surge of violence against women. But the U.S. Bureau of Justice Statistics’ victimization surveys (which circumvent the problem of underreporting to the police) show the opposite: Rates of rape or sexual assault and of violence against intimate partners have been sinking for decades, and are now a quarter or less of their peaks in the past. Far too many of these horrendous crimes still take place, but we should be encouraged by the fact that a heightened concern about violence against women is not futile moralizing but has brought about measurable progress—and that continuing this concern can lead to greater progress still…

Global shaming campaigns, even when they start out as purely aspirational, have led in the past to dramatic reductions of practices such as slavery, dueling, whaling, foot binding, piracy, privateering, chemical warfare, apartheid, and atmospheric nuclear testing.

The decline of violence undermines the arguments of those who invest their energy in fear-mongering (“People are evil and out to get you!”), self-martyrdom (“I’ve tried for so long—I give up!”) or indifference (“There’s no point to even trying.”). In his excellent book, which is well worth your time, Pinker demonstrates that all humans are tempted to use violence when we are motivated by feelings of greed, domination, revenge, sadism, or ideology (i.e., violence for a greater good), but we have proven that we can overcome these temptations with our capacity for reason, self-control, sympathetic concern for others and the willingness to adhere to social rules for the sake of getting along. There is much work to be done, but the decline is ultimately cause for hope. 

Happy New Year!

 

 

Political Correctness Makes You More Creative

21 Dec

Europe According to Germany(“Europe According to Germany” by Yanko Tsvetkov used under CC 2.0 via)

 

Study On Avoiding Stereotypes Smashes Stereotype About Avoiding Stereotypes. Sounds like an Onion headline. A recent study at UC Berkeley reveals that encouraging workers to be politically correct—that is, to challenge and think beyond stereotypes—results in their producing more original and creative ideas. As Olga Khazan points out at The Atlantic, this flies in the face of conventional wisdom, which asserts that political correctness stifles the truth for the sake of acquiescing to the hypersensitive. Yet the study shows that truth and knowledge are obscured when facts are simplified into stereotypes.

Take, for example, the belief widely held in the West that women talk more than men do. Unpacking this stereotype unleashes several revelations about modern Western culture. All in all, women do not use more words than men on average. Women do talk more than men in certain small groups, but men talk more than women at large social gatherings. Listeners, however, tend to become more easily annoyed by women talking in such settings, so they notice it more. Baby girls in the West do start talking earlier than baby boys do, leading pop culture to promulgate the idea that female loquaciousness must be inborn. Yet more than one study has found that girls’ advantage may very well be because mothers talk more to their infant daughters than to their sons. And what about the stereotype that women remember emotional experiences better than men do? There appears to be evidence for this, rooted in the fact that American adults tend to ask girls more questions about their feelings during their developmental years, while encouraging boys to instead focus on their actions and achievements.

So while the genders may behave differently in some respects, further scrutiny shows that we certainly treat the genders differently. Political correctness demands we alter this. And then see what happens.

But instead of being seen as a great generator of progress and innovation, political correctness is more often perceived as a silencing technique, if Google’s image search is any indication. There is some valid cause for this concern. One of the worst tactics taken up by some minority rights activists is the phrase You can’t say that. It often stems from the noble idea that no one should have to endure threats, harassment and direct insults in everyday life. But simply banning bad words can lead to the destructive assumption that simply using the right words makes everything okay.

After all, avoiding stereotypes is not about shutting up but embracing depth and nuance. Professor Mihaly Csikszentmihalyi researches happiness and creativity, and in his latest book, he finds that one of the best tools for innovation is not limiting our own selves to gender stereotypes:

Psychological androgyny… refer[s] to a person’s ability to be at the same time aggressive and nurturant, sensitive and rigid, dominant and submissive, regardless of gender. A psychologically androgynous person in effect doubles his or her repertoire of responses and can interact with the world in terms of a much richer and varied spectrum of opportunities. It is not surprising that creative individuals are more likely to have not only the strengths of their own gender but those of the other one, too.

While the studies cited here focus on gender stereotypes, it’s easy to see how political correctness can foster productivity when applied to all sorts of minorities. For example, one way to react to urgings to avoid antiquated terms like “Bushmen” and “Hottentots” is to ask why. This will reveal that “Hottentot” was a name assigned by Dutch and German colonists to caricature the sound of the Khoekhoe language, and that “Bushmen” was a derogatory name for the San first assigned to them by the Khoekhoe. This uncovers the fact that the San have been the most exploited people of southwestern Africa, primarily because their society has no system of ownership. They have been stereotyped as primitive and therefore less intelligent, but like so many non-state societies surviving into the present day, they have done so by developing skills that help them live in isolation – i.e., in unforgiving environments where other peoples have perished.

Or you can react to the urging to avoid “Hottentots” and “Bushmen” by simply saying, “I’ll call them whatever I want to call them!”  As the saying goes, stereotypes are there to save us the trouble of learning.

 

 

How To Do Empathy Wrong

23 Nov

sssssh(Image by Valentina Cinelli used under Creative Commons license via)

Have you ever had someone say to you, “I know exactly what you’re going through!” only to have them then rip into a monologue that proves they have no idea what you’re going through?

SarahKat Keezing Gay, whose newborn son needed a heart transplant, has had plenty of experiences with this:

One of my favorites has always been people comparing children’s issues with those of anything that isn’t a child. “Oh, I know just what it’s like to have a newborn. My cat wakes me up all the time!” or “Having kids is expensive, sure, but it’s nothing like having a horse.”

With Hud’s medical stuff, most of the comparisons were to really old people with totally different, usually terminal conditions. “I know just what it feels like to wait for a baby to get a heart transplant. My 85-year old great-uncle had liver disease, and waiting for his transplant was so hard on my family!” … This was particularly chafing when entangled with glaring inaccuracies, such as: “He’s sick? When my grandma went through chemo, she looked terrible, so he must be taking lots of herbal supplements to stop the hair loss and everything, right?”

She is hardly the first survivor of trauma who has had to deal with blunt comparisons that are ultimately unhelpful. In college, I witnessed a trust fund kid compare his worries about paying for a new car to a trailer park kid’s worries about paying for his course books: “I hear ya, bro – I’m struggling, too!”

The best way to get along with the rest of the world is to try to understand it. And most understanding is achieved by comparing the unknown to that which we already know. But there is an unproductive tendency in the it’s-a-small-world-after-all mindset to relativize all hardship to the point of equating all hardship. Twilight star Kristen Stewart told interviewers that unwanted paparazzi photos made her feel “raped.” Millionaire businessman David Harding pronounced the words “geek” and “nerd” to be “as insulting as n*****.” Famed divorcée Elizabeth Gilbert of the Eat, Pray, Love franchise declared that divorce can be more anxiety-inducing than the death of a child, asserting this in a book devoted to gushing about the joys of her new-found love. I don’t know Gilbert or Harding or Stewart personally, so it would be presumptuous to conclude that they must simply be naïve and have no idea what trauma or death threats or bereavement feel like. But their utterances are false equivalencies that alienate more people than they enlighten.

In the recent words of NPR’s Annalisa Quinn: “ ‘We’re all the same on the inside!’ is not that far from ‘Everyone is like me!’ which is not that far from ‘My perspective is universal!’ ” The phrase I know exactly what you’re going through, while sometimes well-intentioned, can ultimately be silencing because it puts the listener in the awkward position of having to choose between keeping quiet and trying to find a gracious way to say, “No, you don’t know what I’m going through.” Saying such a thing can come off as angry and self-involved, so most polite people opt instead to hold their tongues, sparing the other person the upset but also denying them the opportunity to be taken out of their comfort zone and learn about an experience they’ve never had.

In his adorable piece “How To Be Polite,” Paul Ford writes that the fastest way to make a friend as an adult is to ask them what they do for a living and—no matter what their job is—react by saying, “Wow. That sounds hard.” The last time he used this line he was talking to a woman whose job it was to pick out jewelry for celebrities.

It’s a sure-fire way to a person’s heart because we all think we work really hard. We all think we have had trials and tribulations. The blues would never have broken out of the Mississippi Delta if we didn’t. But while our lives are all equally important, they are not equally painful:

Everyone on earth is privileged in some way, but not everyone has experienced severe pain.  Arguing with family, enduring rejection in love, searching for a lucrative and fulfilling job, dealing with the bodily breakdown that comes with the onset of age – it is all cause for pain. The pain is both valid and common, which is why there is a plethora of books and films and songs about these experiences. And which is why we expect such pain from life and why it is fair of others to expect us to learn how to deal with it. It is substantial, but it is not severe.

Those who experience severe pain are, thankfully, becoming a minority as our society becomes ever safer and healthier, with rates of life-threatening illness and violence lower than they have ever been in human history. But misery loves company, and severe pain brings on not only profound stress but great loneliness. That’s why support groups exist. Having friends who try to understand, not because they see a chance to tell their own story but because your happiness genuinely matters to them, is lovely. Their efforts signify bravery. But they can never offer the unique comfort of connection that blooms from really knowing what you’re going through.

This was clear when I recently spent an evening at a dinner table where I was the only one who did not have a parent who had died or disowned them. It is clear whenever I read Keezing Gay’s accounts of her baby’s transplant, which move me to tears every single time, all of them merging to constitute but a drop in the ocean of what her family went through.

The middle-aged mother of a deceased teenager said to me months after her death, “Our friends in Utah got the wrong news and thought for a while that it had been me. That I was the one who died. And I immediately thought when I heard that, Why couldn’t it have been me?  I had a good life.  My life was good until this moment.”

My life was good until this moment.

Unlike mundane pain, severe pain so often brings perspective. Of course, whether or not it does ultimately depends upon the wisdom and strength of the individual. This fact is lost on those who uphold the long tradition of viewing severe pain as a beauty mark worth yearning for because it supposedly imbues the sufferer with automatic heroism. This tradition pervades many circles, though most often those of the young and artsy navel-gazers.

Wes Anderson, who may be our generation’s king of the artsy navel-gazers, captured this problem surprisingly well in Moonrise Kingdom. The scene involves two pre-teens: Suzy the Outcast, who is angry about her mother’s infidelity and often gets into fights at school, and Sam the Oddball Orphan, who has been bounced around from foster family to foster family before being bullied at camp.

She tells him dreamily, “I always wished I was an orphan. Most of my favorite characters are. I think your lives are more special.”

Her sweetheart pauses and narrows his eyes. “I love you, but you don’t know what you’re talking about.”

Because it’s not empathy when it’s all about you.  As Nigerian feminist Spectra wrote in her critique of American Mindy Budgor’s white savior complex gone wild: “This isn’t about people ‘staying where they are’ and disengaging from the world. This is about learning to engage with other cultures with some humility, or at least some bloody respect.”

There is no benefit to engaging in Oppression Olympics; i.e., to trying to prove that abused children have it worse than soldiers with PTSD, or that black women have it worse in the U.S. than gay men. But there is a benefit to acknowledging the differences between their experiences as well as the differences between mild, moderate and severe pain. The benefit is true understanding.

Shortly after an uproar over her rape comment, Kristen Stewart apologized for her crudeness. Acknowledging what we don’t know is an indispensable step in the path toward true understanding. The most deeply thoughtful, impressively modest people I know do this all the time. Their frequent deference in combination with their unwavering support proves that there’s a world of a difference between trying to put yourself in someone else’s shoes and assuming you’ve already worn them.

 
*As in all of my posts, the identities of many of the people cited here have been altered to protect their privacy.

White Woman Sues Spermbank for Accidentally Giving Her Black Donor’s Sperm

5 Oct

Unity in Diversity(Image by Fady Habib used under CC 2.0 via)

 

Man, we can’t go two months without some couple making headlines over a baby they didn’t plan for. An Ohio woman named Jennifer Cramblett is suing a spermbank for impregnating her with the contents of a vial different from the one she selected. The mix-up resulted when a clerk misread Vial 330 as “380.” Her lawsuit reads:

On August 21, 2012, Jennifer gave birth to Payton, a beautiful, obviously mixed race, baby girl. Jennifer bonded with Payton easily, and she and [her partner] Amanda love her very much. Even so, Jennifer lives each day with fears, anxieties and uncertainty about her future and Payton’s future. Jennifer admits that she was raised around stereotypical attitudes about people other than those in her all-white environment. Family members, one uncle in particular, speaks openly and derisively about persons of color. She did not know African Americans until her college days at the University of Akron.

Because of this background and upbringing, Jennifer acknowledges her limited cultural competency relative to African Americans, and steep learning curve, particularly in small, homogeneous, Uniontown, which she regards as too racially intolerant.

As just one example, getting a young daughter’s hair cut is not particularly stressful for most mothers, but to Jennifer it is not a routine matter, because Payton has hair typical of an African American girl. To get a decent cut, Jennifer must travel to a black neighborhood, far from where she lives, where she is obviously different in appearance, and not overtly welcome.

One of Jennifer’s biggest fears is the life experiences Payton will undergo, not only in her all-white community, but in her all-white, and often unconsciously insensitive, family. Despite her family’s attempts to accept her homosexuality, they have not been capable of truly embracing Jennifer for who she is. They do not converse with her about her gender preference, and encourage her not to “look different,” signaling their disapproval of her lesbianism.

Though compelled to repress her individuality amongst family members, Payton’s differences are irrepressible, and Jennifer does not want Payton to feel stigmatized or unrecognized due simply to the circumstances of her birth. Jennifer’s stress and anxiety intensify when she envisions Payton entering an all-white school. Ironically, Jennifer and Amanda moved to Uniontown from racially diverse Akron, because the schools were better and to be closer to family. Jennifer is well aware of the child psychology research and literature correlating intolerance and racism with reduced academic and psychological well-being of biracial children.

Family planning is so endlessly complicated that any law-abiding individual seeking privacy deserves it. But Cramblett is going public with her pursuit of compensation for emotional distress and therein invites judgment. John Culhane writes at Slate that this sort of blunder is bound to happen in the free market of assisted reproductive technology. Julie Bindel at The Guardian warns of a creeping let’s-get-a-designer-baby approach to parenting among those using IVF. “Just remember,” she writes. “If the child you end up with does not exactly fit your ideal requirements, you can’t give it back – and nor should you even suggest that something bad has happened to you.”

Do parents have the right to be guaranteed certain kinds of children? Those pursuing parenthood via sperm donors, egg donors, or adoption have much more freedom to decide against certain kinds of children than those using nothing but their own biology. The application for becoming an egg donor in New York contains over one hundred invasive questions about family and medical history, as well as education, favorite sports, artistic talents and “additional characteristics” such as “cleft chin, full lips, big eyes, or high cheekbones.” Applicants are required to submit three photos “that shows [sic] your face and/or body type clearly.”

I understand why such questions are asked. Many if not most parents already know such things about those involved in producing their child, so why shouldn’t the IVF parents be allowed to know? If my partner and I were to join their ranks, what sort of donor profile would seem most appealing to us? Deciding upon something inherently entails deciding against something else. Nevertheless, it is hard not to see this tick-the-box approach to baby-making as eugenic. How many parents would accept my eggs, with their 50% chance of passing on achondroplasia? How many would sue if someone accidentally got them without asking for them?

Parents seeking to adopt children here in Germany are asked what kind of children they would and would not like to have before they look at profiles. For example, do you mind if your children look extremely different from you? What about physical disabilities? Mental disabilities? Drug addiction? In an interview with a family whose two children were adopted, I was told that the agencies encourage prospective parents to be utterly frank about their fears and prejudices – that an insistence along the lines of, “We can handle anything!” will sound suspiciously naïve.

Such brutal honesty strikes me as reassuringly well-informed, perhaps the result of infamously ideological parents like Josephine Baker or Jim Jones, who flaunted their rainbow families at the expense of the children’s individuality. Reading Cramblett’s descriptions of her relatives’ hurtful reactions to her sexuality, I can sympathize with the feeling that battling one kind of bigotry can be hard enough. Everyone deserves to live free from the unnecessary pain of bigotry. But if we’re going to be suing someone, wouldn’t it be more logical to file complaints against those who make her daughter feel stigmatized and unrecognized? Surely they’re the ones causing “emotional distress.”

While the spermbank does appear to have erred out of negligence and may be at fault, would awarding Cramblett damages for “emotional distress” not set a precedent and open the door for endless lawsuits over the births of minority children parents did not explicitly wish for? My parents had a 1 in 40,000 chance of producing a child with achondroplasia, as does anyone reading this. (That is, unless you already have achondroplasia.) Should doctors warn every prospective parent of those odds? Should they warn us of the chance for racial atavism? If homosexuality proves to be genetically determined, will parents have a right to sue doctors who fail to remind them of the risk? The very idea of being financially “compensated” for emotional distress is often silly to those of us who know from firsthand experience how vastly unreliable life can be.

Legal decisions aside, my primary hope is that Cramblett and her partner will explain the lawsuit to her daughter in a way that does not cause her to feel any more conflicted about her extraordinary appearance than her relatives’ racist views already do.