Tag Archives: Culture

On Terror, Danger & Perception

8 Jan


(Video by the notorious Jan Böhmermann, NSFW: strong language)

  

The results are in. After the government accepted just over 1 million refugees, primarily from Syria, over the past two years, there were six terrorist attacks in Germany in 2016 committed by suspected Muslim extremists (affiliated with the Islamic State and the Salafi movement). Over the same period, there were 857 attacks on refugee centers across the country committed by suspected German nationalist extremists.

Whenever we attempt to address cultural problems and discuss who needs to learn proper values, the answer should invariably be: everyone. The price of democracy is constant vigilance.

The Bathroom Debate and the Pursuit of Personal Comfort

5 Jun

(Public domain image via)

 

Cisgender people have been sharing bathrooms with transgender people throughout history, whether they have been aware of it or not. So when conservative groups across the United States mobilized this spring to draft bathroom bills, demanding all citizens use public facilities “according to the sex on their birth certificates,” most people I know responded with a head shake and a shrug, summed up best by this meme.

Comedian Stephen Colbert declared on his show, “To all those lawmakers out there who are so obsessed with who’s using what bathroom and what plumbing they’ve got downtown? Newsflash: You’re the weirdos.”

Here in Berlin, unisex bathrooms have been on the rise over the past three years in parts of the city and in federal buildings. (As seen in the image above.) Some citizens have expressed their outrage to reporters. Most have shrugged.

But in the U.S., the issue has been taken to court. When North Carolina became the first state to pass a bathroom bill in March, the federal government sued the state for non-compliance with anti-discrimination laws, and the Department of Education issued guidelines to schools nationwide for compliance regarding bathrooms and locker rooms. Attorney General (and North Carolina native) Loretta Lynch argued:

This is not the first time that we have seen discriminatory responses to historic moments of progress for our nation. We saw it in the Jim Crow laws that followed the Emancipation Proclamation. We saw it in fierce and widespread resistance to Brown v. Board of Education. And we saw it in the proliferation of state bans on same-sex unions intended to stifle any hope that gay and lesbian Americans might one day be afforded the right to marry. That right, of course, is now recognized as a guarantee embedded in our Constitution, and in the wake of that historic triumph, we have seen bill after bill in state after state taking aim at the LGBT community. Some of these responses reflect a recognizably human fear of the unknown, and a discomfort with the uncertainty of change. But this is not a time to act out of fear…

Let me speak now to the people of the great state, the beautiful state, my state of North Carolina. You’ve been told that this law protects vulnerable populations from harm – but that just is not the case. Instead, what this law does is inflict further indignity on a population that has already suffered far more than its fair share. This law provides no benefit to society – all it does is harm innocent Americans.

Conservative groups have fired back. Eleven states are suing the federal government. Matt Sharp, a lawyer for the faith-based legal group Alliance Defending Freedom, argued on National Public Radio:

And so we’ve got several families there that the Obama administration came in and forced the District 211 to allow a biological boy into the female’s restrooms. And so these girls are telling stories about how when they’re in their locker room changing for PE, they’re now uncomfortable knowing that a boy can walk in at any time under the school’s new policy. They talk about how one girl in particular does not change out of her gym clothes but rather wears them all day long, wears them after going to gym, after getting them dirty and nasty through PE class and then just puts her clothes on top of it because she’s so nervous about the possibility of having to change and shower and whatnot in front of this boy. And we hear stories like that across the country of these girls speaking out and saying, look, we don’t want this student to be bullied or harassed or anything, but we also want our privacy protected. And we just want to know that when we go into these lockers and shower rooms that we’re not going to be forced to share with someone of the opposite biological sex. That’s all these girls are asking for.

If we want to “protect” women in bathrooms and locker rooms from the presence of people who could be attracted to them, then we have to pretend lesbians do not exist. Or stamp out homosexuality altogether. We’ve tried both. Many times. It didn’t work. And countless people suffered.

If we want to assign people to bathrooms and locker rooms based on “the gender on their birth certificates,” then we have to pretend that intersex people don’t exist. Approximately 1 in every 2,000 people are born with sex characteristics that do not correspond with the traditional Western categories of male or female. Surgeries intended to “normalize” the appearance often cost the patient sensation and function. That few of us ever learn about the prevalence of such bodies in our biology classes at school—let alone anywhere else—is a testament to the Western World’s strong tradition of ignoring the evidence that questions the gender binary.

Conservatives argue on the shaky basis of common sense and personal comfort, but personal comfort is so often inculcated in us by our culture. Multiculturalism can increase conflict, but also open minds on both sides. In 2013, a devout Muslim student sued her school in central Germany over her right to be exempt from co-ed swim class on religious grounds. The court ruled that the right to religious freedom includes the right to adhere to a Muslim principle of modesty by wearing a burqini, but that it does not extend to being exempt from swim class and the knowledge of what boys look like in swimming trunks.

Here in the former East Germany, nudism is widely accepted at the beach. It’s not uncommon to see teens and senior citizens alike strip down for a quick dip in a lake at a park. While West Germans often find that a bit strange, they shrug at the fact that public saunas are unisex across the nation. Visitors from around the world, from Japan to the U.K., famously have a hard time accepting this.

Which is why I do not believe all of America will embrace such liberal values any time soon. And yet, 100 years ago mainstream American men and women alike were aghast at the idea of bare female ankles. And bathing suits looked much more like burqinis than anything the mainstream dons today.

After all, if you want to make a Northern European laugh, just tell them that mermaids in the U.S. are always depicted wearing seashells.

When society’s traditions clash with a person’s reality, one of the two will have to change. The moral question is: Who suffers more in the change? Demanding a trans woman use the men’s bathroom because she has an X and a Y chromosome puts her at very real risk for harassment and assault. And any person, cis or trans, who is denied their gender identity is at risk for a wide range of horrific experiences. For society to change, we must learn to accept the unalterable fact of human gender diversity with a willingness to learn about it, so that our descendants may someday look upon it the same way we look upon exposed ankles. History implies we are capable of that.

 

 

What Do You Think of When You See the Word “Healthy”?

6 Sep

(Image by Courtney Rhodes used under CC 2.0 via)
 
In late 2013, journalist Katy Waldman examined the juicing trend, which was cropping up in the corners of Western society with a heavy focus on modern notions of “natural and organic” (think anywhere from Berlin’s Prenzlauer Berg to Burlington, Vermont, and Berkeley, California), as well as in those where people competitively strive to follow the latest fashions in health and beauty (think the high-earning sectors of London, Manhattan, or Los Angeles). Two years later, lifestyle writers declared that juicing has staying power, despite Waldman’s disturbing findings. Along with finding little to no evidence that cleansing the body with juice is physically beneficial, she revealed that the language of most detox diets echoes the language used by those struggling with disordered eating – i.e., the idea that most of what the masses eat is on par with poison and that you’re a bad person if you don’t purge it. She writes:

After days of googling, I still have no idea WTF a toxin is… Cleansing acolytes use the word toxin loosely, as a metaphor for our lapsed lifestyles…. The problem with this way of thinking is that food and weight are not matters of morality. Thin is not “good,” carbs are not “bad,” and in a world of actual pressing political and social ills, your dinner plate should not be the ground zero of your ethical renewal.

I’m neither a supporter nor an opponent of juicing in particular. Anyone should drink whatever they want to drink. But Waldman made a fantastic point about the way the upper and middle classes in the West so often believe one’s health to be a sign of one’s morality.

This idea is hardly new. The eugenics craze of the 19th and 20th centuries—which culminated in the Nazis exterminating “degenerates”—involved Fitter Families contests held at county fairs, wherein judges handed out trophies to those deemed to have the best heritage, skin color, and tooth measurements. Professor Alan Levinovitz argues in Religion Dispatches that these attitudes have survived into the present, altered only ever so slightly: “The sad thing is, it’s really easy to judge people on the basis of what they look like. We have this problem with race. In the same way, it’s really easy to look at someone who’s obese and say, ‘Oh look at that person, they’re not living as good a life as I am. They’re not as good on the inside because I can tell their outside isn’t good either.’ ”

Do we as a culture believe that being “healthy” is about appearance? Dieting often dictates that it’s about behaviors measurable through appearance. Psychologists agree to the extent that their notions of “healthy” are about behavior, but they also frequently intersect with notions of being “good.” But is being “healthy” about being brave, honest, generous and humble? Physicians would generally argue it’s about staving off death. Right-to-die advocates would argue it’s about quality of life over longevity. Is being healthy a matter of what scientists decide? Ed Cara found earlier this year that weight loss does not lead to happiness. Is happiness a measure of being healthy? Or are you only healthy if you suffer for it? Concepts of “healthy” vary vastly from person to person, and across cultures. Is that healthy?

In The Princess Bride—probably the Internet’s second-most quoted source after Wikipedia—the hero cautions, “Life is pain. Anyone who says differently is selling something.”

Yet the villain says, “Get some rest. If you haven’t got your health, you haven’t got anything.”

Whether you agree with any or none of the above, leave me your thoughts on the meaning of “healthy” either in the comments or via an e-mail to paintingonscars[at]gmail.com

 

 

Is It Wrong to Give Your Kid an Extraordinary Name?

26 Apr

(Image by Alan O’Rourke of workcompass.com used under CC 2.0 via)

 

Every coupled friend I have here in Germany is, as of this year, a parent. And looking at the names bestowed upon the new generation, I must say I like them all. Or at least, I don’t hate any of them. This is impressive considering that, if my partner and I ever want to get into a fight, we simply start discussing names we would hypothetically pick for a child. Just give us five minutes and soon we’ll be shouting, “Bo-ring!” “Flaky!” “Hideous!”

And then we run up against the unanswerable question: Is it harder to have a mundane (a.k.a. boring) name or an unusual (a.k.a. weird) name?

While I enjoy the sound of my own name—as many if not most people do—I haven’t enjoyed seeing Emily end up in the top ten of the most popular U.S. baby names for the past three decades. Emily was the first name a sociologist in Freakonomics came up with when asked to list “typical white girl names” in the U.S. One hot summer in Upstate New York, I worked in a room with five other Emilys, all my age. One friend had so many Emilys in his life that he added permanent descriptors to differentiate us. (I was “Home Emily.”) Matt Groening was definitely on to something when he listed meeting-another-kid-with-your-name as one of childhood’s greatest traumas.

This is why I see the appeal of extraordinary names. After all, the whole point of giving a child their own name—as opposed to, say, calling them Person or Daughter No. 1—is to distinguish them from others. To have them, and not four other people, look up when you call them. In my years as a school teacher, I had a much easier time remembering Xenia, Letitia and Suma than Tom, Jim and Kate. I’m also grateful to parents who opt to avoid the sound-combinations that happen to be trending, reducing the likelihood of my having to remember which student is Julie and which is Julia, or whether the boy in front of me is Leon, Leo, or Leonard. I regularly confuse Kristen Stewart and Kristin Scott Thomas, but I’ll never forget Quvenzhané Wallis till the day I die.

Black Americans are renowned for frequently giving their children names that sound vaguely African with modern flourishes, from Baratunde and Beyoncé to Kwame and Malia. I spent a good deal of my childhood on Long Island and in Baltimore where I had classmates and friends named Chiwanna, LaTaesha, Zeeyaré, and Teyonté. South African comedian Trevor Noah has poked fun at how very not African such names sound where he comes from, but the attempt to reconstruct cultural ties, however inaccurate, is perhaps most understandable in the context of those whose ancestors were violently removed from their culture:

 

 

Looking down on extraordinary names can have xenophobic undertones. After all, the pre-1960s model of blending into middle-class America resulted in immigrants named Wei-Li and Helmut swiftly transforming into Winnie and Herbert. An insistence that it’s cruel to name your child something unusual suggests there is something wrong with diversity, or with being a minority.

“That kid is gonna get teased so bad!” is the usual response to an extraordinary name. But wouldn’t it be better to teach your child how to react to schoolyard teasing with self-confidence and empowerment rather than avoid anything that might make them remarkable? Studies show the Boy Named Sue Effect is real. That is, my friends Lucrezia, Baldur and Bronwyn are more likely to have strong and sturdy personalities than my friends Matt, Matt and Matt.

As one psychologist explained in The New York Times:

Researchers have studied men with cross-gender names like Leslie. They haven’t found anything negative — no psychological or social problems — or any correlations with either masculinity or effeminacy. But they have found one major positive factor: a better sense of self-control. It’s not that you fight more, but that you learn how to let stuff roll off your back.

Then again, some endeavors to be different do seem less defensible than others. As noted before, a study in 2010 showed that teachers here in Germany are more likely to give lower grades and presume unruly behavior of kids named Cindy, Mandy or Kevin because they are assumed to come from anti-intellectual, anti-social homes. These names are common among children born in the Eighties and Nineties in the former East Germany, where Hollywood had a strong influence, Kevin having boomed right after the international success of Home Alone. Smashing stereotypes about the people from behind the Iron Curtain is admirable, but destigmatizing Macaulay Culkin feels less necessary.

And what about the potential for sounding pretentious? German punk singer Nina Hagen named her daughter Cosma Shiva after allegedly seeing a UFO while pregnant. The most compelling argument against picking a name from a distant culture I’ve heard comes from a fellow Long Islander with an Indian first name and a Jewish surname, given to her by her Jewish dad and her mother, whose parents hail from Chennai:

I don’t think it’s offensive when a white couple reaches around the world for a name. I think it’s tacky. If you want to name your kid something foreign and exotic, then get to know someone foreign and exotic, and marry them. Otherwise, stick to what you know well. You’re trying to sound deep and yet your relationship to the culture isn’t deep. It’s shallow.

Not to immediately insult Dhani Harrison, but she has a point.

Having no cultural context for a name can be very problematic. What if the foreign name you’ve picked “just because it sounds nice” is widely known abroad as the name of a brutal dictator, infamous celebrity, or literary villain? If a WASPy American couple stumbled upon “Mohammad” or “Fidel” for the first time and decided to give it to their son just for the sound of it, they would be looked upon with a good deal of suspicion. In Amy Tan’s The Kitchen God’s Wife, a man returning home to China after a trip to the U.S. tricks his rival into taking on the name Judas when dealing with Western businessmen, promising him that it is the name of a very well-known, powerful historical figure.

And controversy aside, phonetics often don’t translate easily across cultures. Not only are my favorite English names often butchered by German accents, but most of the German names that sound loveliest to me and my American family elicit horrified looks from my contemporaries in Berlin. (Apparently “Hannelore” is one of the ugliest names anyone could ever think of in Germany today.)

This shows, however, that it is often nothing more than a matter of taste. One person’s tacky is another person’s terrific, and there is little we can do to change that.

 

 

The Real Reason You Should Learn A Foreign Language

27 Apr

(Image by Eric Andresen used under Creative Commons license via)

 

“Emily Sanford speaking, how may I help you?”

“Yeah, hi, I just got put through to you by one of your coworkers, and that guy can barely speak a damn word of German! Why do you hire foreigners? Because they’re so cheap?”

“I’d be happy to help you if you could tell me why you are calling, sir.”

“I need to ask about where to distribute some flyers your company mailed me, but I really want to know first why on earth you hire foreigners? I mean, seriously? Is it to save money?”

I pressed him for the details about the flyers, suppressing the urge to blurt out something in German to the effect of, “I American. I no understanded what you say me in Deutschy language.”

Contempt for immigrants who can’t speak the local language at the C1 Level or higher seems to pervade every country. I’ve witnessed an initiative to make English the official language of my parents’ tiny village in Upstate New York after some white farmers heard two words of Spanish on the street, and I’ve been yelled at here in Germany by surly locals for speaking English in public. These complaints are usually steeped in the explicit or implicit stance that if you can’t speak the language, you shouldn’t be here.

Yet speaking a second language is unlike any other skill. Plenty of fiercely intelligent people are terrible at foreign languages and, unlike being terrible at arithmetic or project management, this weakness will render any of their other talents virtually invisible if the job market does not operate in their mother tongue. Speaking the local language flawlessly and eloquently is the best bet to integration in any society. And if it doesn’t happen to be a language you grew up speaking, it’s a lot of work.

I speak German, French, Russian, Spanish, Swedish, and Dutch, but “speak” is a relative term. I can hold basic conversations in Russian and Spanish, but they’re always peppered with errors. (I’m probably the American equivalent of the intelligible but amusing foreigner who says things like, “I vant you to come sit on de table.”) A few years of self-teaching have led me to understand almost anything written in Dutch, but I can’t understand the nightly news and I can’t say anything not in the present tense. My in-laws in Stockholm sweetly praise whatever I dare to say in their language, but I miss most of the details of whatever they say among themselves. After starting a book called Swedish in Three Months seven years ago, I’m still on chapter four.

I’m fluent in German and French, but “fluent” is too simplistic a word for the complexity of what it denotes. My German feels about as good as my English was back when I was in middle school. That is, I can say almost anything I want to say, but I sound a lot less diplomatic and nuanced than I would like to. I still learn new words every day. (Added to my vocabulary this week were “chisel,” “epic,” and “sexual exploitation.”) Explaining an intricate issue like a budget report to a superior at work can still make me falter. I occasionally hear myself using the wrong gender or preposition, an instant giveaway that I’m foreign.  And because double-digit numbers in German are said in reverse order (e.g. “twoandthirty” instead of “thirty-two”), I hate taking down numbers. Always have and always will.

This is why it would be deceptive of me to simply say, “I speak seven languages.” To Brits and Americans, it sounds like bragging, and to Europeans, it sounds suspicious. After all, it’s an unspoken but well-known fact that Brits and Americans who fancy themselves cosmopolitan love to exaggerate whatever knowledge of a foreign language they have, especially when they’re in the company of those who can’t possibly test them on it. As British-Canadian satirist Christian Lander writes at Stuff White People Like:

… two years of college Italian does not confer fluency.  For the most part, these classes will only teach a white person how to order food in a restaurant, ask for a train schedule, and over pronounce words when they are mixed into English. Amazingly this small amount of proficiency is more than enough to warrant inclusion on a resume under “spoken languages.”

… When you hear a white person say that they speak your native language, you will probably think it’s a good idea to start talking to them in said language.  WRONG! Instead you should say something like “you speak (insert language)?” to which they will reply “a little” in your native tongue.  If you just leave it here, the white person will feel fantastic for the rest of the day.  If you push it any further and speak quickly, the white person will just look at you with a blank stare.  Within a minute you will notice that blank stare has shifted from confusion to contempt.  You have shamed them and your chance for friendship is ruined forever.

Finally, though they won’t admit it, white people do not believe that learning English is difficult. This is because if it were true, then that would mean that their housekeeper, gardener, mother-in-law … are smarter than them.  Needless to say, this realization would destroy their entire universe.

Indeed, my linguistic repertoire doesn’t sound at all impressive to the 216 million people around the world who speak four languages or more. Most of these people live in Africa and, unlike mine, their range always encompasses completely unrelated languages like French and Bangangte, or English and Wolof. Forty-five percent of my Facebook friends speak two or more languages well enough to say or describe whatever they want to say. For them, and for half of the people on earth, speaking more than one language is like knowing how to drive or swim. Sure, it requires dedication and practice, but it’s not something you flaunt once you learn how to do it. You just do it.

Conventional wisdom says it’s best to be complimented on your language skills by a native speaker.  But if that native speaker is monolingual, they will only notice what you can’t do.  It takes a polyglot to appreciate how far along you are because they know just how much work goes into what you’re trying to accomplish. Anyone who’s lived 24 hours a day in another language knows about the headaches, the falling into bed exhausted at 8 pm, the horrors of meeting someone who talks fast.

Tech reviews across the Interwebs have been abuzz this year about a new language program called Duolingo. The online program claims to be revolutionizing the way Anglophones learn other languages via the addictive nature of video games. That Duolingo inspires passion and dedication is wonderful, and after checking out the advanced German program, I’m impressed with how authentically modern the dialogue is. (None of that old-school drivel still found in too many online programs: “I am charmed to make your acquaintance. Which way to the discotheque?”) But I’m skeptical of the company’s insistence that you can learn a language without ever speaking to people.

Does the game teach you how to develop an intelligible accent? Does it teach you how to dive into a dinner conversation with sentences shooting at you from every direction? And, perhaps most importantly, does the game warn you about the crucial cultural connotations of certain words? To cite just a few examples, in German a “Pamphlet” isn’t just a pamphlet, it’s a manifesto. The word “deportieren” means what it sounds like except it’s only used to describe someone being sent away to a concentration camp. And you will come off as crass if you ever call a German woman “Fräulein.” As with all my knowledge of German slang, I learned these lessons from German people, not dictionaries. Language is culture and there are no cultures without people.

And just like every culture on earth, every language is a moving target. What sounds hip and what sounds sophisticated and what sounds rude and what sounds stuffy differs from generation to generation, from place to place, and from person to person. It’s exhausting, but it’s also pretty cool. In an increasingly homogeneous world, the most resilient differences are linguistic. American tourists are often disappointed to discover that businessmen in London dress more like Bill Gates than Winston Churchill, or that women in Barcelona don’t walk around with roses clenched between their teeth. But no matter the visual monotony, their ears are guaranteed to be confronted with new music.

Yet, despite its shortcomings, I suspect that Duolingo’s personless approach to foreign language learning is exactly what many bilingual wannabes yearn for. In my experience, the number one reason adults avoid or give up learning a foreign language is not that they dislike grammar or are overwhelmed by accents – it’s that whenever you try to speak a new language, you are bound to be laughed at.

Unlike learning to dance or sew or build a shed, you can only master a language by repeatedly practicing in the company of experts—i.e., native speakers—who are not paid to have the patience of teachers. No matter how good you are, the moment you venture out of the classroom to talk to others, someone will smirk at you and someone will correct you and someone else will get frustrated with how long it takes you to say the simplest thing. Someone is bound to make fun of you. And adults do not like being made fun of.  

They don’t like being corrected mid-sentence or being told they sound “cute.” It reminds them of being back in school, and they’ll do anything to avoid it. This is why trying to learn a foreign language from a romantic partner often puts strain on the relationship. Sure it’s fun to proudly whisper “I love you, my sweetness” to your boyfriend in another language. But it’s exasperating to try to discuss a film you just watched together and see a smirk creep across his face as you say, “I think that part not so good, but other part a little, little okay, but it hard understand why the… the… the… what’s the word?”

Adult pride can be so sensitive that there are debates as to whether or not it’s rude to correct a grown person’s linguistic mistakes outside of the classroom. I’m of the camp that insists on gulping down our pride because, as my French hostess told me my third day in Provence, “Do you want to learn French or don’t you?!” Her commitment to this credo was proven when she shouted grammatical corrections to me from another room while I was talking on the phone.

But there are other conflicts where the rules for etiquette are not so clear. My partner and I recently told a Danish-German couple about our latest trip to Stockholm. We had had a few tiffs about my being left out of the Swedish conversations and his relatives being left out of the English conversations. 

Our friends nodded knowingly. “The answer to that problem,” the Dane said with a grin, “is that it’s incredibly rude of them to leave you out of a conversation by speaking a language that’s hard for you, and it’s also incredibly rude of you to insist that everyone switch to a language that’s hard for them just for your sake.”

Indeed, being excluded from anything is a nasty feeling and nothing excludes like a foreign language. Then again, once a couple is fluent in more than one common language, the ability to speak in code is a pretty sweet reward. (Ex: “Do you mind if we change the subject, honey? I don’t want to hear him get going on this again… ”)

Many adults insist that they would have become fluent in a foreign language if only their parents had paid for early lessons, because kids pick up languages better. There is truth to this argument: children living abroad for a year or more are indeed more likely to become fluent than their parents are. But few understand why. I do not believe the pop-science assumption that kids have an easier time learning languages because they are neurologically predisposed. Studies at Cambridge University—and my own experience as an English teacher in Berlin pre-schools—show that kids above the age of three start off a new language with the same bad accent and tendency to make mistakes as adults do. The three advantages children do have over adults are all social.

First of all, while they don’t exactly enjoy being laughed at, kids are far less self-conscious about making mistakes than teenagers and adults are. Secondly, immigrant and expat kids can easily be immersed in the local language simply by being enrolled in school, as opposed to their parents, who must first land a job in the language and thereby already demonstrate some proficiency. Thirdly, and perhaps most importantly, kids have a lot less to learn to achieve fluency in their age group than adults do. A first grader’s mastery of a language involves being able to talk about Disney films and their favorite flavor of ice cream and all that other stuff found at the intermediate level of any language course. Fluency for an adult means being able to engage in debates about the next election, write business letters, or make witty jokes with a killer punch line—all skills that take us 12 years of schooling to master in our first language, never mind a second one.

Learning a foreign language takes a lot of patience and a sturdy ego. In return, it endows you with empathy for students of your own language. And with this empathy it is not rude to smile at a non-native speaker’s mistakes or to poke fun at languages and accents. It’s hilarious to hear someone with a thick German accent try to say “weather vane” (usually comes out as “fezzerwane”), and it’s just as hilarious to hear Americans try to say, “Geschlechtergleichberechtigung” (“gender equality”).

When I was staying in Tokyo two years ago, my friend Kazumi would call me to dinner. “Em-i-liiii!”

“Hai!” I’d reply with exuberant enthusiasm.

This always made her and her fiancé burst into giggles. “So cute how you say, ‘Hai!’ ” she would smile.

“So cute how you say my name,” I’d smile back.

This exchange would not be so innocuous if one of us were portraying the other’s accent as a sign of stupidity, or complacently refusing to ever leave our own linguistic comfort zone.

When Brits complain about the invasion of other languages and dialects, they ignore that millions throughout Asia, Africa, Oceania, the Americas and the Caribbean gave up their first language for the King’s English lest they face punishment. When Americans insist that they shouldn’t have to learn another language because immigrants and foreigners should learn theirs, they ignore that more than three-quarters of us are descended from ancestors who had to learn English as a second language. Many Americans seem to believe they did it so that we wouldn’t have to. But if they want to fully comprehend what exactly their ancestors achieved and what exactly they’re asking of immigrants today, then they will have to try to do it themselves. If I had wanted to be truly fair to my caller so angry about my coworker’s German, I would have switched into my own language and waited to see how well he fared.

Learning a foreign language is not about picking up enough exotic words to be able to show off at dinner parties. It’s about understanding why foreigners make mistakes in our language by exposing ourselves to the mistakes we are bound to make in theirs. It’s about both the guest and the host, the tourist and the immigrant, not giving anyone attitude for failing to speak flawlessly to them in their own language. It’s about forging a path to greater empathy, until it expands into your own backyard and all around the world.

 

 

Heritage on St. Patrick’s Day? It’s Complicated

16 Mar

IMG_1606(Image by Folke Lehr)

 

Along with millions of other Americans, I used to boast a bit every March 17th: “You know, I really am Irish.” It’s a common American pastime to cite one’s known heritage, either as demonyms (“I’m English and Irish and… ”) or percentages (“I’m a quarter Irish, one eighth Polish…” ). I still believe in self-determination, but having lived in Europe for nearly a decade, I have ceased to rattle off these titles. Not only is the latter a vain attempt at exactitude with no chance of ever being exact—we’re not even really sure if my great-grandmother was Polish or Belarusian—but it resembles the sort of puzzle-piecing that only pseudo-scientists of suspicious political convictions find relevant. And it makes Europeans laugh. And then correct me. “No, you’re not Irish. Your ancestors were Irish.” Which is true.

While Americans sometimes refer to their ancestors’ nation as their “homeland,” they usually can’t construct a sentence in the country’s official language and certainly cannot name the country’s current head of government, the second largest city, or any of its history that isn’t directly related to U.S. history. At best they know a handful of expressions, a recipe or two, maybe the region where their parents’ parents’ parents lived. For this reason, their claims to nationality usually strike the natives as silly.

But the melting pot concept is often admirably used to celebrate diversity. It muddles any sense of singular loyalty and discourages jingoism. I can’t really argue that the English are “naturally” evil for what they did to my Irish ancestors when my last name is Sanford. My known ethnic heritage is a split between some of Europe’s most notorious conquerors (English, German) and their victims (Irish, Polish). To claim only one or two of them as “my people” feels ridiculous. If I ever have children, their great-grandfathers will have fought on opposite sides of World War II.

Then again, not everyone’s heritage is such a hodge-podge, and plenty of conservative genealogists try to prove why the blending of certain cultures is “better” than the blending of others. That the perpetrators of segregation, Nazism, apartheid, aristocracy, and the internment camps are the most famous fans of genealogy causes me to cringe whenever anyone claims pride in having Irish or Italian or Icelandic “blood.”

Such pride is much more understandable when coming from minorities who have been made to feel that they don’t belong in the country they were born in. My grandfather, Michael Sullivan, was the grandson of Irish immigrants to America. He was the oldest of 9 children, my mother has 43 cousins, and I’ve never tried to count how many of us there are in my generation. He often began sentences with the word “ ’Twas,” and liked to sing folk songs that seemed to have come from Ireland, but may very well have originated in immigrant settlements in the States. This is the extent of my experience with his Irishness, but his was far more profound. He grew up in a time when he could easily find signs reading, “Irish need not apply,” and “mick” was a word he hated in the way that only people who have been called a slur do. When he married Barbara Tupper and her grandmother found out he was Catholic, she crossed my grandmother out of the family Bible. All this made John F. Kennedy’s election in his lifetime radical. It is my grandfather’s story and it is important. But it’s not my story.

An attempt to make it my story would feel intellectually dishonest and pretty flaky to boot. As Andrew O’Hehir writes this week at Salon: “Irishness [in America today] is a nonspecific global brand of pseudo-old pubs, watered-down Guinness, ‘Celtic’ tattoos and vague New Age spirituality, designed to make white people feel faintly cool without doing any of the hard work of actually learning anything.” Indeed, my middle name endows me with no expertise when it comes to picking out Celtic music or Irish books and films. I can’t tell what most Irish people actually enjoy and what’s just on display for tourists any more than I can tell what Finnish people actually enjoy and what’s just on display for tourists.

As said before, taking an interest in other cultures is always preferable to xenophobia. But it often comes with the temptation to flaunt minimal efforts like feats of greatness. Claiming credentials based on ancestry feels not entirely wrong, but not entirely right either.

The boundaries of countries and ethnicities are as blurry as our sense of self. Heritage is often seen as the recipe that resulted in an individual, yet there are so many more ingredients to the recipe. Yes, I wouldn’t be here today if the branches of my family tree were arranged any differently, but I also wouldn’t be here today if my parents had slept together in April 1981 instead of March. And placing too much importance on genetics insults any families who cannot or choose not to have children using only their own reproductive cells. Family is what you make of it.

This is not to say that everyone should always downplay their roots. Children with at least one parent who emigrated from another country often have undeniable ties to their ancestral culture – in any case, ties that are far more likely to be based on fact than fictitious romanticizing. Most of what constitutes our inexplicable sense of culture comes from traditions and foods and pastimes we experienced growing up, and great writers like Amy Tan, Gary Shteyngart, and Sandra Cisneros show that growing up with two cultures affords you special insights into both. If my German partner and I ever have children, we plan to raise them bilingually (English and German) and bi-culturally (Thanksgiving and St. Martin’s Day), teaching them anything there is to teach about where their mother grew up and where their father grew up. Whether or not to add some Swedish into the mix—my mother-in-law came from Stockholm—is a point of endless debate between us.

If we ever have grandchildren, it will be interesting to see how they approach their American heritage. If they’re at all ashamed or excessively proud, I’m determined to discuss it, but if they’re merely uninterested, so what? I predict that my great-grandchildren will not feel any strong connection to their American heritage, nor should they. As my partner points out, maybe they will be half-Czech or married to a Burkinabé and have their hands full raising their own children bilingually. Cultures and people move and morph constantly throughout time and space.

When I finally traveled to Ireland two years ago, there were traces of culture that seemed somehow familiar. And that was moving. But most of the charm—“The Irish Sea really is that green! They really do sing in the pubs!”—came from recognizing things I’d grown up seeing in movies, not in my grandfather’s house. And I also found traces of culture the following year in Amsterdam that were faintly familiar to me because, although I have no known Dutch forebears, I grew up on Long Island.

My most impressive sense of belonging in Ireland came from the fact that I was not the palest person around. Not by a long shot. (Hence my captioning the above photo taken on the cliffs of Howth in an e-mail sent to friends: “If there’s anything Sullivan about me, it’s my complexion.”) Lookism can be a very powerful force. But it does not have to be. In Dublin, we were never once served by someone who didn’t have a Slavic accent. If the current wave of Eastern European immigrants ends up staying in Ireland, their children will have much more of a claim to the place than I do.

They’ll at least be able to remember the name of the prime minister, after all.

 

 

When Food Preferences Surpass Politesse

24 Nov

yuck(Image by Boris Drenec used under Creative Commons license via)

 

Perhaps not quite in the spirit of Thanksgiving, I’m about to alienate half the people I know: I don’t have much patience for openly picky eaters over the age of 20. The covert ones don’t bother me at all. But announcing to your host that you simply won’t eat mushrooms or mustard or millet is to revert to your 10-year-old self, brazenly acting on the assumption that anyone cooking for you will find your pig-headedness as endearing as your parent or guardian apparently did. I don’t particularly like pears or peas or plenty of other things, but if I learned anything from my time as an exchange student with the American Field Service, it’s that intellectually curious, culturally respectful, fully-grown adults eat whatever is set in front of them. Or at least try a few bites and then leave it to the side without advertising their distaste.

Unless, of course, it threatens their health. I recently hosted a friend who has celiac disease and who apologized several times in advance for the inconvenience. While the evident sincerity of his remorse did help, I assured him there was no need for shame. I’ve had friends with juvenile diabetes and colitis and who need to be fed through tubes. At my wedding, where guests had been requested to bring cakes instead of presents, I chased down every single baker in order to mark any desserts containing traces of peanuts, oranges, or coconut. I never mind offering vegetarian options because they accommodate a wide array of dietary restrictions with both cultural and medical bases, just as alcohol-free beverages are helpful to kids, recovering alcoholics, devout Muslims, and pregnant women alike. But my tolerance generally ends there. Because beyond that boundary seems to be where stubborn intolerance for all sorts of food spreads like the plague.

According to the cover story of Die Zeit this week, less than 1% of Americans are gluten intolerant, yet 28% of Americans purchased gluten-free products last year. The percentage of Germans who purchase lactose-free products has tripled in just five years. Such sky-rocketing numbers sound much more like successful marketing trends than biological shifts in the population. Inordinate media attention to rare medical issues always inspires swathes of people to self-diagnose rather than check with their doctor. In response to what sometimes does seem like an epidemic of hypochondria, a kindergarten in Hamburg has recently taken to demanding medical documentation for any alleged food restrictions among its students. Die Zeit writes that parents were insisting their untested children had food allergies after the appearance of the slightest yucky face. Of course a child at risk for anaphylactic shock is better safe than sorry, but to teach a child to regularly cry wolf is to teach them to rely on their most narrow-minded instincts.

This is not a call to villainize health advocates or burn certain cookbooks. On the contrary, the greatest thing about the human culinary tradition is its diversity. When I grew up in the Eighties on Long Island, skim and lite and sugar-free products were in fashion,  but anything organic or “foreign” or “ethnic” was scarce because what’s wrong with some good old American spray-on cheese? Sushi was gross (“It’s raw fish, you know!”), vegetarian dishes were for pansies, and escargot was what made the French so weird in the first place. (See this Indiana Jones clip.) Kids today are growing up more environmentally conscientious and more open to exploring new cultures, and I am glad to see the American tradition of grimacing at all the icky cuisine of the savages and the smelly Europeans go the way of the Twinkie. But there’s no progress in simply switching the grimace from the sight of imported cuisine to the sight of anything that isn’t in line with the latest imported health fad.

While it seems many finicky eaters think their aversion to certain foods resembles a disability (“Please don’t criticize me for something I can’t do!”), it often resembles ableism (“I refuse to budge on this issue!”). We cannot be open-minded and at the same time refuse to leave our comfort zone. As the Food Commander writes in his excellent Huffington Post article, “Unless you suffer from a disease or real (unlike imagined) food allergies, … kindly embrace the fact that your body is not all that fragile. Humans survive every day in conditions way worse than, say, a four-course dinner in an Upper East Side townhouse.”

Outside of a meal, it can be fun to explore cultural differences and personal preferences: why so many Chinese love meat but dislike butter, or why German senior citizens detest turnips. It’s also amusing to try to argue the illogic of taste. (One such argument culminated in one of my relatives bellowing, “I am not a fan of the bean!”) It is also imperative that we eventually discover which food restrictions have been caused by environmental changes and which have been encouraged by marketing trends. But the fun comes to a screeching halt when these discussions ooze onto the dinner table.

Such candidness often has innocent origins. In these rather unrepressed times, where dinner guests discuss everything from politics to polyamory, why not share our honest opinion of what’s on our plates? This approach, however, ignores two very important facts. Firstly, unlike in a restaurant or your own home, the meal laid out for you has been paid for by someone else. Secondly, unlike the selection of films or games or whatever else it is you don’t like about the home you’re in, the meal laid out for you is the result of someone else’s time and effort. To go so far as to scrutinize it (“Is it organic?”) or disparage it (“It’s too bad it has olives in it!”) is to spit on the dinner invitation that was extended to you out of sheer generosity.

I know what it’s like not being able to participate in communal activities. This blog is all about those who have no choice about being an exception to the rule. Those who have bona fide difficulties digesting certain foods—perhaps akin to my difficulty walking long stretches—should not feel ashamed. But shame should not be supplanted with complacency, either. As my friend with celiac disease said, it is usually regrettable to have to limit one’s range of experience, and it is always regrettable when it involves rejecting an offer of kindness.

Indeed, my proudest moment during my bout of stenosis last year was pulling off a Thanksgiving dinner with my partner that fed 17 people while I was still recovering from spinal surgery. If any of this year’s guests cannot stomach something, they will hopefully follow the example of my more gracious friends who keep things discreet, at least at the table. I don’t subject them to judgment by examining their plates for leftovers and threatening to deny them dessert because they spare me the insult of telling me exactly how my offering failed to satisfy them.

They also resist the temptation to dive into an unsolicited monologue of healthier-than-thou moralizing, a tendency that accompanies food more than any other health issue. I’m usually the last to squirm at medical stories, but I’ve been thinking lately that if I have to hear about the details of the latest nutritional research every time I put a spoon to my mouth, maybe I should start lecturing about my back problems every time I see someone wearing heels or sitting at a computer.

Eating is a necessity and a health issue and an environmental issue and a cultural tradition. I love learning from friends and researchers about the different ways we all eat, and the socio-political forces of the food industries are absolutely fascinating. But I won’t ever admire someone merely for eating homemade bread or fine delicacies or simple fare or whatever it is that the Paleo diet currently dictates. Those I do admire cook joyfully in their own homes and, when invited to someone else’s home, plunge their hands obligingly into whatever their host has set out for them, whether it’s okra or Oreos. As minority rights activist Andrew Solomon has pointed out, a truly tolerant culture celebrates additive social models, not subtractive ones.

Or, more simply, I will always care a lot more about your table manners than your diet.

 

 

When Saying “I Don’t Judge” Is Judgmental

4 Aug

Beautiful and Softly(Image by Thomas Hawk used under Creative Commons license via)

 

“I’ve learned not to judge other people.” In the debate on marriage equality, many former opponents have softened their opinions with this all-too-common phrase. While a little progress and diplomacy in any debate is better than none, this should hardly be considered an acceptable assessment of same-sex marriage. Because whenever we say, “I don’t judge,” we’re implying that we think there is something morally ambiguous to judge about the situation.

We say “I don’t judge” when we observe pain or dishonesty and are hard pressed to think of a way it could have been prevented. We say it when we observe someone lose control and we know that everyone loses control sometimes. We say it when at least two sides are sparring and both have made major mistakes. It’s dishonest to pretend that we don’t have opinions about the decisions and actions we witness, because we all do. But ultimately saying, “I don’t judge” means my opinion is incomplete because I can’t say for sure what I would do in that situation. And when the act in question falls short of intentionally cruel behavior, it is often the appropriate thing to say.

It’s appropriate when we hear about a neighbor’s divorce (“I don’t know the details of the marriage, so I can’t judge”), when we hear that someone took a job that compromised their morals (“I can’t say what I would do if I were that strapped for cash”), when we see people with parenting methods that differ from our own (“That child isn’t my child, and I don’t know what I would do if she were”). We say it not to ignore the harm it may have wrought, but in order to remain humble, to avoid hypocrisy, and to remember that different circumstances prevent the human experience from being truly universal.

But we do not and should not say it regarding lifestyles that raise no moral questions. We don’t say, “She’s dating a foreigner, but I don’t judge,” or “They adopted a child, but I don’t judge.” If anyone said of my partner, “He married a woman with dwarfism, but I don’t judge,” that person would be implying there is something shameful or irresponsible about me and my condition.

A little over a hundred years ago, doctors were saying just that. A Virginia medical manual in 1878 advocated criminalizing marriages between average-sized men and women with dwarfism, insisting that such an act was on par with “murder.”

Modern readers hopefully find nothing morally ambiguous about two consenting adults falling in love and deciding to commit to one another. Regarding interracial or same-sex or international or medically “mixed” marriages, the only people who should invite our judgment are those who impugn these relationships with the statement, “I don’t judge.” It’s an oxymoron, not unlike a “Please” slathered in sarcasm. And it would be swell to see it less and less in political discussions on civil rights.

 

 

Who You Telling To Wear Makeup?

28 Apr

fashion show(Image by Alex Craig used under CC license via)

 

While chatting with colleagues over coffee this week, I ended up “outing” myself as a dwarf who’s had limb-lengthening. (Experience has taught me some people notice right away when they meet me that something is up, while others go a long time without the slightest idea, especially in the wintertime when my scars are hidden under sleeves and pants.) We arrived at this topic by discussing fashion—and the recent scandal in Sweden that’s left me almost speechless—and then beauty and self-confidence. Several of my colleagues pointed out that every person they know who’s undergone cosmetic surgery never struck them as unattractive before the fact. Only an idiot would think that there’s only one kind of beautiful nose or mouth or what have you. And only a jerk would tell someone to have cosmetic surgery.

As you may have guessed, I agreed wholeheartedly.  But what about telling someone to wear makeup?

This week, a man writing to Slate’s Dear Prudence advice column confessed he feels simultaneously guilty and helpless about the fact that some of his female friends are unlucky in love because “their looks are probably the only thing holding them back.” Prudence tends to give good, progressive advice, but this time, instead of telling him the ladies should move in less superficial circles, she suggested he pair them up with some similarly “average-looking” male buddies. She then added, “If the problem with your female friends is not their intrinsic looks but the fact that they dress like schlubs or never wear makeup, then a guy’s perspective that they aren’t doing everything with what they’ve got could spur them into action.”

Ugh.  Say what you want about clothes, but the makeup debate is as messy and gunky as makeup itself, which is why I’ve avoided it up until now.  But am I the only one who thinks telling someone to start using makeup is entirely different from giving them your opinion about the way they dress?

Everyone, from my partner to my grandmother, rolls their eyes at certain fashion choices and, as I’ve said before, anyone who denies they ever do it is lying.  It betrays a pathetic insecurity to trash others’ dress for the sake of your own self-aggrandizement—e.g. “I wouldn’t be caught dead in that!”—but it is fair to say what just isn’t your cup of tea.  We can snark a little about someone’s clothes, hairstyles, accessories, headgear or makeup style (if they have one) without too much malice because someone is probably snarking about ours.  No one on earth dresses in a way that is universally attractive because there is no such thing as a universal beauty standard.  And as the saying goes, there is no arguing taste.  Someone thinks this is kick-ass, and someone else thinks it’s sloppy:

(Image: Captain Jack Sparrow)

Someone thinks this is dreamy and someone else thinks it’s one big yawn:

(Image: a man in a suit and tie)

Someone thinks this is sexy and someone else thinks it’s garish:

(Image)

People find beauty in this:

(Image: traditional Korean dance)

Or this:

(Image: a Mursi woman in Ethiopia)

Or this:

(Image: the Bollenhut of Gutach in the Black Forest)

Or this:

(Image)

Or this:

(Image: Namibia, October 2008)

And that’s just a tiny sample from around the world. There is even more variation across time because, as Oscar Wilde said, “Fashion is a form of ugliness so intolerable that we have to alter it every six months.”  I think some of my friends, like some of the subjects above, have a great sense of style, while others do not.  They in turn probably think the same about me.  But if any of them thought I should wear makeup more often than I do—which is almost never—and told me so, they wouldn’t be my friends.  But what if they’re my supervisors?      

In January, a study featured in The New York Times revealed that (American) women who wear makeup are considered more competent and more likable in the workplace.  A panel of stylists and professors made various points about this that basically all boiled down to, “It’s a choice.  If it makes women feel more confident, they should go for it.”  But if the study indicates that their confidence would result from garnering more positive attention for their looks, then their lack of confidence without makeup would result from a fear of not getting attention for their looks. 

Many modern women, especially lipstick feminists, repeat, “Empowerment is all about being free to choose!” There is truth in this. I know guys who were bullied in school for wearing concealer or plucking their eyebrows. Women meanwhile are often forced into a nearly impossible balancing act wherein no makeup = plain Jane, but too much = slut, and kudos to anyone who refuses to play that game. Good girl culture, as well as the results from the study, asserts that “less makeup is more – you should look like you’re not wearing any.” This rule seems potentially problematic to me because it is insidious. If someone gets used to just slightly “improving” their face every day, it is more likely they’ll feel insecure without these improvements. I occasionally enjoy wearing heavy makeup bordering on the outrageous (like glitter), but it feels like a mask and everyone knows it’s a mask. When it’s so obviously part of a costume, there’s not much danger that I’ll start considering it an inalienable component of myself. But the subtle makeup seems to be a lot harder for people to let go of. I know women who refuse to be photographed without their makeup on—and you probably do, too—and if that doesn’t sound like an unhealthy insecurity, I don’t know what does.

In any case, it doesn’t sound like they are “free to choose,” as lipstick feminists advocate. As I’ve written before in explaining my choice to have my limbs lengthened, we should be free to make complex decisions about our bodies without others making snap judgments about our motivations. Anyone who does is a coward. But it is also cowardly of us to voice hatred for our natural faces and simultaneously deny that this has any impact on others. In the words of philosopher Arthur W. Frank, “When we make a choice, we confront others with that choice.” The freedom to choose diminishes when a strong majority bends in one direction, because majorities create social pressure. In a society that literally rewards women who wear makeup—i.e., with higher salaries—it is undeniable that many do so in order to win these rewards, ultimately playing by the rules under the guise of empowerment. The cosmetics industry, like any industry, always aims to make its customers feel that they cannot live without its product, and so it too has embraced the slogan of “Empowerment!”, leading The Onion to smirk, “Women Now Empowered By Everything A Woman Does!”

It would be obnoxious of me to assume that every woman with a compact in her purse does it to acquiesce. I know and admire self-confident women who love putting on bright red lipstick and self-confident men who wish they could, too, without being gawked at. Primping can be fun. Painting your skin certain colors can make you feel fine and refreshed, like slipping into a brand-new top or getting a new haircut. Or brushing your teeth after a hangover.

But it’s not quite the same thing, is it?  Once again, it’s a mask.  A friend of mine who loves dressing up but hates wearing makeup recently said, “I guess, ultimately, it’s weird looking in the mirror and seeing something that doesn’t look like me.  I don’t really like makeup on other people either though, so perhaps it’s a general class of trying to hide oneself that bugs me.”   

Indeed, that is one of my many reasons for rarely ever using cosmetics, why I graciously declined friends’ offers to do me up on my wedding day, why I cringe at the idea of anyone pressuring women into it.  I also like being able to rub my face without having to worry about smudging.  I’d rather spend the money on a million other things.  My partner hates the taste of cream, gloss or powder—“Kissing someone wearing foundation is like kissing a sandbox!”—and I must say I don’t blame him.  Most importantly perhaps, I don’t understand why our culture believes that women’s faces require some paint in order to be attractive but men’s faces don’t.  If I can’t compensate for the plainness of my natural face with my charisma, then no one should be able to.

Of course, almost all of us conform to our culture’s beauty standards to some degree.  I’ve worn concealer for blemishes and plucked my eyebrows to make them even, but I feel a strong attachment to my scars and so I’ve kept them.  I don’t always like my face—don’t we all have those days when we look in the mirror and just feel yucky and dissatisfied?—but even if I thought putting on some modern Western style of makeup would make me look “better,” it wouldn’t look like me.  Experience has also taught me that a dissatisfaction with one’s looks is almost always rooted in something more substantial: feeling not very fit, feeling overtired and stressed, feeling lazy because there’s been too much or too little to do.  And even if it’s not, I often feel very satisfied with my face, so on a bad day why not simply walk away from the mirror, focus on something a little more profound than my appearance, and have confidence that the feeling of self-satisfaction will return?

As psychologist Nancy Etcoff wrote in The Times:

Women who feel that makeup use is obligatory but unwanted, that it requires a forced confrontation with the mirror when they’d rather put their attention elsewhere, do not feel more confident after using it.  Research suggests that women can feel objectified by makeup, and for such women, any potential advantage may be offset by the emotional labor of wearing it.

And, in an excellent article on weddings, Ariel Meadow Stallings of Offbeatbride.com writes:

I’ve been thinking a lot lately about the pursuit of authenticity versus the pursuit of attention. The first feels very internal, like you really have to look within yourself with a lot of introspection and thought to determine what’s important … while the other feels very external, like you’re hunting for other people’s eyeballs. And why does one seem like so much fun, while the other seems like so much work? …

I guess it comes down to this: Attention gives you the cheap high of other people’s energy focused at you … but authenticity gives you that deep, long-lasting satisfaction of knowing that you’re on the right path and you’re doing the right thing.  While the quick high is more fun in the short run, the deep satisfaction is ultimately more filling.

This is why it is fine to wear makeup but wrong to tell someone else to. Not only is it a ludicrously presumptuous, boundary-crossing thing to say—like telling someone to switch careers or leave their spouse—but it’s vacuous because it has nothing to do with matters of justice or morality. It is purely a matter of beauty standards. The worst thing about beauty standards is that they create peer pressure based merely on taste. The best thing about them is that, as seen above, there are millions of them, and they are constantly changing. If humans are capable of thinking the lip-plate is attractive, then surely we are capable of thinking a woman without makeup is attractive.

Women and men should feel free to smear their faces with whatever they wish or go without, to pluck their eyebrows or leave them be, to shave any body part or refrain.  (Bearing in mind doctors have recently explained the cringeworthy risks of shaving certain parts.)  But the moment they say that someone should do the same in order to feel better or lure lovers or advance their career, we have a problem.  And it’s not physical.

 

 

Picking & Choosing Our Tragedies

21 Apr

World travel and communications recorded on Twitter

(Image by Eric Fischer used under CC license via)

 

What a week.  A suicide bomber in Pakistan killed four people.  A fertilizer plant explosion in Texas killed at least fourteen people.  Sixteen people died in a goldmine collapse in Ghana.  President Obama and members of the U.S. Senate were sent letters laced with poison.  A journalist in Mexico was assassinated, presumably by agents of the drug wars.  At least 65 people died in terrorist attacks in Iraq.  More than 150 people just died in an earthquake in Szechuan.  And after two young women and a little boy were murdered by bombs at the Boston Marathon, it felt surreal if not uncomfortable to see my last post about America’s inexperience with bombs at home emblazoned across the blog.  But what to say? 

For most of the week, we had no trace of a motive for the Boston bombing.  And now that one suspect of Chechen origin is dead and his brother is in custody, we still don’t have anything we could officially call a reason.  Polemicists on the right and left are using the event as “evidence” for the necessity of their own political agendas, arguing that we should have used drones, or that Dzhokhar Tsarnaev should face a military trial, or that we need more surveillance cameras everywhere, or that the two suspects seem more like the psychotic teens of Columbine than terrorist operatives.  As John Dickerson observed in Slate yesterday:

We need more restraint and less wild guessing. Free-flowing debate in the search for meaning is a part of these moments and a part of the human condition, but … In these fast-moving times when the only thing that is certain is that the first piece of news has repeatedly been wrong, perhaps those lawmakers and pundits who want to be part of the final conversation should (paraphrasing Mike Monteiro) follow the Quaker rule: Be meaningful or be quiet. 

Of course, we all like to think ourselves meaningful.  But so far, with no official motive, the only irrefutable point any politician has made came from the ambassador of the Czech Republic, who urged the media to note that Czechs and Chechens belong to two different countries located over a thousand miles apart.

Distance and borders matter, obviously, since none of us are equally horrified by every single one of this week’s tragedies.  But why?  I had friends in the Boston area who were stuck at home during Friday’s lockdown.  (Two were hoping to be allowed out in time for them to make it to the annual birthday celebration of their late brother Bill, whom I wrote about at this time last year in a post on grief.)  But I’ll hazard to guess that most of those glued to the news updates from Boston did not have loved ones there.  The story dominated the headlines across the ocean in Germany, in France, in the U.K.  Everyone seemed to be watching. 

The simplest reason for this is that people are naturally empathic, upset to see others upset and, in the words of the Czech ambassador, “It was a stark reminder of the fact that any of us could be a victim of senseless violence anywhere at any moment.”  But dead people in Pakistan and Iraq no longer serve as reminders of that fact.  They instead represent our ability to compartmentalize, to exile certain tragedies to a semi-numb region of the mind, either because they seem too frequent for us to commit to or because we want to believe there is some crucial difference between Us and Them, protecting us from their fate.  It’s not malicious of us to compartmentalize in this way—to tear up upon sight of the beautiful little boy in Boston while not even checking to see if any of the victims in Iraq were children—but it’s not fair either. 

And so I stared down my last post about World War II bombs, feeling inexplicably uncomfortable, wondering whether it was callous of me to not say anything about the tragedies going on in my old home country, yet knowing World War II would never have happened had my new home country not embraced a dangerous idea of what makes a country “home.”  Borders are always bizarre.  In a digital age, distance is all in the mind.  I’ll never be able to rationally explain why some things feel “close to home” and others don’t.   I’ll always care more about the safety of those I know personally than those I don’t, but I’ll never be completely comfortable with this fact because ignoring our common humanity is what builds borders and facilitates cruelty.  I’ll always tear up if you show me a picture of an innocent victim.  I’ll always try to remember to ask why we are shown pictures of some victims, and not others. 

Or, as a friend in Boston observed during the lockdown, “It is so hard to be inside on this gorgeous, beautiful spring day.  Minor problem, but reminds me how lucky we are most of the time to feel safe outside our homes.”

 

 

This Is What War Looks Like, 70 Years After the Fact

7 Apr

 

 

This past Wednesday, Berlin’s Central Station—the largest train station in Europe—was closed after a 220-pound Soviet bomb from World War II was discovered by construction workers.  Some 840 residents were evacuated from the area before the bomb was successfully defused.

Not all such bombs can be defused.  This past August, an American bomb discovered in Munich had to be detonated by experts, as you can see in the video above.  Approximately fifteen unexploded World War II bombs are discovered in Germany every day.  This does not happen where I come from.

To live in Berlin, my favorite place on earth, is to live in a city of scars.

 

 

“If He Was a Wee Bit Closer, I Could Lob a Caber at Him, Ye Ken”

3 Feb

 

 

Time for another break from the tough stuff.  I want to talk about Disney.  (In earnest, mind you.  As always.)  I just saw Pixar’s Brave and no, I’m not going to write about its heroine’s feminism—or the ludicrous musings about her lesbianism—or the radical imperfection of her eyebrows.  What pleased me most about this film was its break from the Broadway tradition that has been dominating—dare I say strangling—animated cinema for decades.  Throughout my childhood, Disney and its competitors would take you around the world with Alan Menken and his endless supply of wide-mouthed Middle American show tunes as your guide.  The main characters’ accents ranged from Beverly Hills to Burbank.

Like The Princess and the Frog, Brave has the guts to feature songs, accents, and expressions native to the story’s setting.  And it’s about time.  The Broadway model has its merits, but it can start to feel like overkill when it forbids any trace of historical or foreign flavor.  When it comes to family films, Hollywood has traditionally handled its American audiences like cultural infants, where conventional wisdom asserts that any voice that doesn’t immediately evoke baseball and apple pie risks obliterating our ability to empathize.  Only “artsy” films for grown-ups like Brokeback Mountain or Capote dare to let the dialect match the backdrop.  Hence our heroes Aladdin and Belle and Ariel and Simba and Esmeralda, who all sound like they went to school with the cast of Saved by the Bell.  As The New York Times observed in 1997, the closest the actors in Anastasia ever came to St. Petersburg was Pasadena.  A character speaking the Queen’s English has been permitted with some regularity, but if they’re not Julie Andrews, they’re probably the villain or the butler.

Paradoxically, these animated family films set in far-off lands usually feature one odd character who does speak with a local accent.  So is this proof we can catch words pronounced differently, or does it not matter what the Token Foreigner says because his character is inconsequential?  Beauty and the Beast lets one or two sidekicks babble, “Ooo la la!” and “Sacré bleu!” but pretty much leaves the plot exposition up to everyone else.  In Aladdin, the Arabic accent belongs only to the characters with the fewest lines, such as the merchant—who sings the racist song that was later edited—and Gazeem the thief, who dies before the end of Scene One.  And by the way, I haven’t been able to find anyone in The Little Mermaid who sounds Danish, under the sea or above.

Not only does Brave inject its lines with a kick-ass charisma brought on by a Scottish brogue, but most of its voice actors—with the exception of Emma Thompson and Julie Walters—are actually, truly, veritably from Scotland.  Traditionally, the Token Foreigner in a children’s film has been voiced by an American actor putting on a stereotypical accent.  (Kelsey Grammer as a Russian aristocrat, Jerry Orbach as a French candlestick…)  The ability to imitate an accent is a great skill for both an actor and an interpreter, but it can easily go horribly wrong without anyone in charge of the film noticing.  The fact that Dick Van Dyke got away with his impression of Cockney in Mary Poppins suggests that U.S. film critics of the time had pretty low standards.  Meryl Streep has been famously lauded for her ability to sound authentically Italian, Polish, and British, but almost none of those singing her praises are Italian, Polish, or British.  Her portrayals may very well be accurate, but ever since Mary Poppins, Americans have had a bit of a reputation for being too easily fooled.  My Nordic partner always rolls his eyes and shakes his head at the Seinfeld episode that tried to pass off this accent as Finnish:

 

 

This is not to say that Americans are the only ones who can’t tell Finnish from gibberish.  I’ve met plenty of French people who think Japanese sounds like that pathetically generic “Ching-chong-chang!”  And Brits who have claimed—a little arrogantly—that the U.S. does not have as many dialects or accents as the U.K.  Ethnologue cites 176 living languages in the U.S. compared to the U.K.’s 12.  Great Britain and Northern Ireland may contain more dialects relative to their size—though I would bet their dialects are fewer in absolute number, with more speakers per dialect—but this raises the philosophical question of what separates a dialect from a language.  The joke among linguists goes, “A language is a dialect with an army and a navy.”

Every culture tends toward simplistic views of other cultures.  When you begin to type “Brave Pixar” into Google, you get the apparently popular question, “Brave Pixar Irish or Scottish?”  Anyone outside of the Celtic-speaking regions could be asking this question.               

I’m sure Brave is still rife with Scottish stereotypes that owe more to Hollywood’s cravings than to authenticity.  And the ancient clans of the Highlands most likely sounded nothing like Billy Connolly or Craig Ferguson.  But it is nice to see the filmmakers trust us enough to handle protagonists who do not speak exactly like the average American moviegoer.  After all, what is the point of hearing stories from far-off lands if not to hear things we may not have heard before?  And the more we are exposed to different authentic accents, the more likely we are to realize that every one of us has one.  And that somewhere, someone is smiling at the way we talk.

 

 

 

What’s Censorship?

27 Jan

Banned Books Display At the Lacey Library(Image by the Timberland Regional Library used under CC via)

 

Eeeny, meeny, miny, moe, catch a tiger by the toe.  If he hollers let him go…  That’s the version I learned.  My British friends caught a fishy by the toe.  My mother’s generation caught a n***** by the toe.  Were they wrong to alter it for us? 

Last week I applauded The Observer’s decision to remove a childish, poorly argued opinion piece from its website on the grounds that it did not meet their standards for style, while others hollered, “Censorship!”  This week, the German media is abuzz with its own debate over publishing standards as Thienemann Verlag has announced its decision to replace racist terms—such as “die Neger-Prinzessin”—in certain classic children’s books.  To which some are saying, Finally, while others are saying, Censorship!  And some are saying, The N-word isn’t racist!

This debate is older than the civil rights movement.  Pull up reviews of The Five Chinese Brothers on GoodReads and you’ll find nostalgic fans shouting, “Book burners!” at anyone who criticizes the illustrations.  The problem with this debate is that it usually attracts extreme narrow-mindedness on both sides. 

Some progressive activists do mistake witch hunting for spreading diversity awareness.  A few years ago, feminist author Chris Lynch drew angry reactions from some women’s rights groups who demanded he change the name of his young adult series The He-Man Women-Haters Club.  But the books pick apart the machismo boys learn from pop culture and their fathers.  The mentality adopted by Lynch’s critics was so blunt that they couldn’t tell an opponent from an ally.  If the equality debate ends at which words are okay and which aren’t, regardless of context, it has failed.  Miserably.

But too many activists opposed to censorship demonstrate none of the openness and subtlety that are the building blocks of free thought and artistic integrity, which they purport to defend.  After reading Fahrenheit 451, an unparalleled tribute to the majesty of books, I got snagged in the inanity of Ray Bradbury’s hysterical afterword.  He begins by citing an editor who asked if he could put more female characters in The Martian Chronicles:

A few years before that I got a certain amount of mail concerning the same Martian book complaining the blacks in the book were Uncle Toms and why didn’t I ‘do them over’?  …  How did I react to all of the above? …  By ticketing the assembly of idiots to the far reaches of hell.  The point is obvious.  There is more than one way to burn a book.  Every minority… feels it has the will, the reason, the right to douse the kerosene, light the fuse…  For it is a mad world and it will get madder if we allow the minorities, be they dwarf or giant, orangutan or dolphin, nuclear-head or water conversationalist, pro-computerologist or Neo-Luddite, simpleton or sage to interfere with aesthetics.  The real world is the playing ground for each and every group to make or unmake laws.  But the tip of the nose of my book or stories or poems is where their rights end and my territorial imperatives begin, run and rule.  If Mormons do not like my play, let them write their own.  If the Irish hate my Dublin stories, let them rent typewriters.

That he dared them to back off and write their own books was a productive challenge, but his arrogance in damning them all to hell did not suggest he ever intended to read what they wrote.  (If he truly believed all art should be borne out of one person’s imagination alone, unscathed by anyone’s suggestions for improvement along the way, then he was probably the only writer in human history who never once accepted advice.)  This is not dialogue.  This is not open debate.  This is accusing your opponents of oppression in order to silence them.  This is failing to discern between book-burning and social critique.

Censorship is a serious issue.  Berlin’s memorial to the Nazi book-burning of 1933 is a window into an empty library.  It bears a plaque that reads, “Those who are capable of burning books are capable of burning people.”  No one should ever call for legally prohibiting the publication, sale, or existence of any sort of text if speech is to remain truly free.  Libraries should offer the public all they can eat and more.  But every publisher of children’s books should also be free to reject or revise what they release based on their own educational theories.  No one on earth believes any child of any age should read absolutely anything.  Releasing less hurtful editions of a story—while maintaining the right to publish the original—is not always censorship.  Indeed, automatically assuming it is betrays the sort of narrow-mindedness typical of censors.    

The leave-greatness-untouched argument ignores how many well-known stories have been severely distorted over time.  In the unadulterated Cinderella, the ugly stepsisters chop off pieces of their own feet to force them into the glass slipper.  The prince is fooled until he notices the slipper overflowing with blood.  Snow White forces the Evil Queen to dance in a pair of hot-iron shoes at her wedding until she drops dead.  As for Sleeping Beauty, do you think the medieval prince only kissed her as she slept?  It makes old-fashioned Disney look like a flaming liberal.  These violent versions are still around, but a lack of demand has nudged them out of the spotlight.  I wish the same fate upon racist versions of old children’s books. 

Of course, context is everything, and certain words can have many meanings.  Mark Twain used the N-word in Huckleberry Finn to portray a complex, admirable character who discredits racism and slavery.  But the N-word as it is used by Otfried Preußler—and Astrid Lindgren, and so many other white storytellers of the early and mid-20th century—evokes the colonialist stereotype of the savage who is either happy-go-lucky or bloodthirsty.  (In the words of Cracked.com, “Lesson Learned: What’s the deal with Africans?  If they’re not trying to eat it or throw a spear at it, they’re worshiping it as some sort of tribal deity, am I right?”)  Of course it’s absurd to think that every kid will automatically turn racist from reading this, but it’s also naïve to think such caricatures have no influence.  If childhood stories had no bearing on readers’ perceptions of minorities, then no one would ever promote children’s books that celebrate diversity.    

While I don’t object to students seeing racism or sexism or ableism in books, I strongly object to their being subjected to it before they’ve had any other exposure to more realistic depictions of the people these ideas dehumanize.  Psychologist Hartmut Kasten argues in the left-leaning newspaper Die Zeit that children ages four and up can and should “learn that there are people with different skin colors, learn what we used to call them, what we call them today, and that there is such a thing as prejudice.”  But is it necessary, when first introducing a child to someone who looks different, to immediately hand them all the historical baggage of racism, too?  Doesn’t that suggest to them that people with different skin colors are always controversial?  Prejudice can spring from seeing a minority constantly portrayed either as a stereotype or as a victim of stereotyping.

Prof. Kasten argues that expunging orientalism and other exotic tropes from children’s literature “destroys the imagination.”  But must the exotic always be colonialist just because that’s our tradition?  It is traditional in the Netherlands for St. Nicholas to be accompanied by a mischievous African man named Black Pete.  Some say he is supposed to be St. Nicholas’s servant, others say he is his slave.  For decades, white performers have donned blackface to portray him.  In recent years, some have replaced the blackface with multi-colored face paints, renaming the character “Rainbow Pete.”  This approach has long been popular in Suriname, a former Dutch colony with predominantly black citizenry.  Many are appalled to see an old tradition changed, but the St. Nicholas/Santa Claus/Kris Kringle/Father Christmas/Father Frost myth has been constantly evolving over time, forever an amalgam of various cultural influences.  Our nostalgia does not like us to admit this, but as said before, nostalgia is rarely honest, often revisionist.  And could Prof. Kasten argue that rainbow people are less imaginative than black slaves?         

And if children’s creativity is nurtured by stories from long ago in far off lands, why not make more of an effort to offer tales originating from those lands?  Indeed, in my workshops about teaching diversity awareness in pre-school, I promote translated folk tales and fairy tales such as Sense Pass King and Children of the Dragon to be read alongside Cinderella and Snow White.

 

The best way to combat uncreative stereotypes is to flood children’s libraries with beautiful stories that go deeper.  My hero Judy Blume agrees.  She is the most challenged author of all time in the United States.  Her brilliant books question everything from racism to religion to budding sexuality.  Her loudest critics usually argue that children under the age of 18 should never read about masturbation or wet dreams, despite how many 10-year-olds are already wise to such things.  Blume wants parents who object to her stories to engage their children in discussions about them, which is a stance I support.  Passionately.  But is any child of any age old enough for such discussions?  Was it censorial of me to be stunned when I found Zehn kleine Negerlein lying around in a Berlin pre-school in 2010?

 

 
Die Zeit insists that if we revise anything that is in any way offensive, then we must revise everything.  (Which will lead to a ban on any disagreeable characters who are female or black or gay or disabled…)  This could be true if we were talking about bringing the law into it, but we’re not.  As far as the law is concerned, anyone is free to adapt any artwork once granted permission by the copyright holder.  Otfried Preußler’s publisher began replacing the N-word in his texts after receiving approval from the author’s daughter.  As hard as it may be for artists to swallow, artwork in the public domain is free to be toyed with as anyone sees fit.  Almost every generation releases the classics with new illustrations, whether it’s The Jungle Book or a children’s Bible.

But to be fair, the modern illustrations bear the name of the modern illustrator, while a redacted version of an author’s text bears his.  Which feels somewhat mendacious.  Posthumous revisions would best be noted in an afterword discussing the original language and why the publisher does not wish to replicate it.  Alternatively, the cover could indicate that the story is a retelling.  Like so many of my friends, I grew up on abridged versions of Victorian classics such as Peter Pan, The Wizard of Oz, and Alice in Wonderland.  Only a handful of us went on to read the original texts when we were older.  Just as we went on to discover the original versions of “Eeeny, Meeny, Miny, Moe,” “Turkey in the Straw,” and the stanzas in the German national anthem that no one sings anymore.

We should never seek to erase our xenophobic heritage – on the contrary, it is something we must own up to and learn from.  But it is no more appropriate for a young child to learn about Little Black Sambo than it is for them to learn about the rape version of Sleeping Beauty.  (Or the most graphic Mother Goose rhymes.  Or old television cartoons like these.)  She will be ready to hear it at some point.  Unfortunately, pinpointing the right point, the right moment, the right age will always be a problem.  Because racism is a problem.