The Ask

I admit to having experienced a slight cringe upon hearing the word “ask” used as a noun recently. The usage to which I’m referring usually takes some form along the lines of “the ask is that you do so and so…” or “the ask is for such and such…” I only started noticing this replacement of “request” (or “demand”!) with a nounification of ask in the last year or so, and had been wondering about this apparently new semantic trend. Then, a few weeks ago, I received an email that reawakened my curiosity about the phenomenon. The message was from someone involved with union work who was asking a favor of his contacts. At the end of the message, he wrote:

To repeat “the ask” (union-organizer lingo for what we’re asking you to do): Please help…

Because I had first come across ask-as-noun-replacing-request in activist circles, this assertion that “the ask” is commonly used in union talk led me to speculate as to whether it was part of some leftist conspiracy. There may be some evidence for this. For example, a poster on the timely “Stop Using ‘Ask’ As A Noun” Facebook page writes that they hear it all the time on NPR, that bastion of the liberally biased media. And Hillary Clinton, former secretary of state for the socialist Obama administration, purportedly once said:

We’re going to expect more from the Afghan government going forward, and we’ve got some very specific asks that we will be making.

But it appears that use of “the ask” cuts across diverse sectors. For instance, in finance lingo, the minimum price a broker will take for a financial instrument is sometimes called “the ask.” Curiously, a nounified “ask” appears to have infected the vocabulary at corporate behemoth Microsoft in the mid-2000s, as can be seen in blog entries written by more than one frustrated employee. I find this especially amusing considering the fact that the Encarta Dictionary built into Microsoft Word does not include a noun form of “ask” in its entry for that word. As I type this, I see the phraseology “the ask” appearing throughout this post underlined in cautionary wavy green by Word’s grammar checker.

And this brings us to the potentially irrelevant issue of whether using ask as a noun is grammatically correct in the first place. Along with MS Word’s dictionary, the authoritative thesaurus.com offers no synonyms for the noun form. Ah, but the OED informs us that ask was used as a noun as far back as the year 1000 AD (and I don’t mean in its other guise as a noun: an English/Scottish term for a newt)! The OED update from 2005 further cements “the ask’s” grammatical propriety with reference to its contemporary colloquial usage in Australia, where it is usually preceded by the modifier “big” (e.g. “that’s quite a big ask” [please make sure to pronounce the "k" if you say this]).

Despite the grammatically correct roots of “the ask” in ancient English, though, it’s clear that this form had long since fallen out of regular use in modern English. It’s also evident that “the ask” has been making a comeback of late, and that it might be part of a more general linguistic trend that is meeting with some resistance. A piece in the New York Times by Henry Hitchings last year placed the revival of “the ask” in the context of a phenomenon he described as nominalization (the correct term for “nounification”): when a verb or adjective is transformed into a noun. Nominalization (itself a nominalization of “to nominalize”) comes in two types: the first involves adding a suffix and is more common (e.g. “to frustrate” becomes “frustration”), while in the second the same word is simply converted into a noun. The article’s opening paragraph gives a few notorious examples of the second, more controversial type of nominalization (though another NYT opinionator has lambasted academic writing for inelegant overuse of the first type):

“Do you have a solve for this problem?” “Let’s all focus on the build.” “That’s the take-away from today’s seminar.” Or, to quote a song that was recently a No. 1 hit in Britain, “Would you let me see beneath your beautiful?”

As Hitchings explains, this kind of nominalization can be employed in speech or writing to sound edgy or jaunty; think of the currently hip phrase “epic fail.” But he points out that in the case of “the ask” there is a sort of “distancing” or depersonalizing effect. For instance, the expression “the ask is that…” is undoubtedly less direct than saying “I am asking you to…” And I think this is what irks me about “the ask” (or, as a request of me was put in a recent email, “the want”), rather than some obstinate reaction to linguistic change. The role of “the ask” in obfuscating the personal dynamics of a request is perhaps supported by its prominence in fundraising lingo, as established in the title of the 2006 book The Ask: How to Ask Anyone for Any Amount for Any Purpose. (Along these lines, the protagonist of the 2010 novel The Ask is a university fundraiser.) There also seems to be a perception out there that “the ask” is an evil component of corporate doublespeak; many posters on askisnotanoun.com—a site that simply compiles the approximately bi-weekly tweets reaffirming the position stated in the URL—allude to its use in business jargon. Given this popular view, it seems a bit ironic to me that Marxist union activists are claiming “the ask” as their own!

“Here, YOU write this down”

Photo by City Year

During the first several years of my teaching career, like many other teachers, I became very accustomed to using the white (or, in the case of some CUNY classrooms, the black) board to underline key points of lectures and discussions.  A typical class period might have seen me leading a discussion about a particular piece of reading material, periodically turning my back on the students to write out pieces of what they were saying. Although I always liked the idea of visually underlining the knowledge we collectively created using the board, I’ve also always felt a bit disconnected from students when spending so much class time busily taking dictation. Plus, all that writing is tiring!

So, in the past few semesters I’ve started handing off the task of writing on the board to the students themselves. I bring about a dozen dry-erase markers to every class, leave them on the front table, and invite the students to get up out of their seats, grab a marker, and put something up on the board for all to see.  Since I started incorporating this technique into more formalized exercises, I’ve noticed a few immediate, and unexpected, benefits:

1.  First, it loosens things up. With the class divided into small groups, each responsible for writing a few notes or arguments or quotes drawn from the readings, students become more physically active, moving about the room and talking to each other. The atmosphere of the room feels more exciting, like a learning laboratory. My role is transformed from disseminator of knowledge to guide, helping students along their own path of critical inquiry.

2.  Because of the prevalence of social media, students are increasingly comfortable with the idea of “public postings.”  By duplicating the framework of a Facebook status or a tweet, but layering a more rigorous and critical element into the production of such writing, I’ve found students to be generally very receptive to the idea of complicated knowledge condensed through careful composition. During a recent exercise in my U.S. History survey, for example, I asked students to imagine that they were administering the Twitter accounts of different figures from the Civil Rights Movement of the 1950s and 1960s. After reading pieces from Martin Luther King, Jr., Malcolm X, Ella Baker, Stokely Carmichael, and others, I found many students eager to approach the board to attempt to fashion a tweet that accurately reflected the complex arguments made in these readings. As I surveyed a board full of these tweets, I realized that the exercise was really just asking them to draw out the main points of the work; there is nothing groundbreaking about that. But the marker in their hands, along with the form of social media, unquestionably helped many students to make connections that they hadn’t noticed before.

3. While students are writing on the board, I often take the opportunity to ask them about what they are writing and why they chose to put it up on the board. This is perhaps the most unexpected and rewarding element of the exercise.  Since the other students are working on their own readings, I have the time to have a one-on-one discussion with the student at the board, and since so many students are uncomfortable speaking in front of the whole class, I’m able to make contact with (and actually get to know) a wider section of my class than ever before.

4. Finally, and perhaps most importantly, putting the marker in students’ hands transfers power from the instructor to the students themselves. They are the creators of knowledge and the recorders of that knowledge, and because they are sharing their thoughts in writing with the rest of the class, aiming for those thoughts to be useful to everyone else, they are forced to become very critical and selective about their writing. And that’s exactly the point.

As I continue to experiment with these kinds of exercises, I’m realizing that I find the idea of students writing on the board appealing mainly because it transforms the classroom into a different kind of space, a more active space, in which the process of knowledge creation becomes the subject of the class itself.

Idealism, Pragmatism, and Evolution (or, Grappling with Academia.edu)

I confess I joined Academia.edu for the same reason I joined Facebook:  my friends pressured me into it. There are also, of course, professional and philosophical arguments to be made in the scholarly online community’s favor:  it’s a great way to network and share ideas outside of one’s particular department or the (to say the least) fraught world of peer review and academic publishing.

Intrinsically, idealistically, I love the idea of Academia.edu. It is a lovely idea to use a social networking model for furthering academic discovery and sharing. It builds on the essential freedoms offered by the web—free publication, a broad reach, a curated community—and enacts a model I have no philosophical quibble with, one of openness, generosity, and sharing.

All of these lovely ideals, though, come up against the more worrying reality of the academic world and our careers in the material world. I can’t be alone in feeling reluctant to share my work online, disseminating it among people who might be less than scrupulous about citation and attribution. Furthermore, many academic presses and journals will (understandably) only take on previously unpublished work, and our careers are highly dependent on publication by reputable presses and journals. The counterpoint to these concerns comes in stories like this one, in which someone used Academia.edu precisely for its intended purpose: to share research and gain recognition beyond her institution’s own politics and perceived limitations.

These questions only highlight for me the importance of Academia.edu. Like other social media platforms, it doesn’t cause the problems of transitioning into new professional, communicative, and economic modalities, but rather illustrates some of the defining tensions of this transition. I remain reluctant to share my ideas, but this is a consequence of living in the world as it is, where fear of plagiarism and the cutthroat system of peer review and academic publication can stifle creative, original research and a generous, collaborative culture. I hope that Academia.edu is an indication of where things are going, although at the moment professional pragmatism may still trump a full engagement in this evolution.

What’s in a Name?

Vegan.

What do you think of when you hear that word? Don’t sugarcoat it; everyone is probably thinking the same thing.

Weirdo. Crazy. Doesn’t wear deodorant. Smells funny. Hairy legs. Preachy. Cow-hugger. Hippie.

You may have also rolled your eyes when you read the word vegan. This seems to be a typical reaction, along with the above perceptions. What is it about this word that leads to so many negative connotations? In today’s society, knowledge of the dangers of factory farming for both our health and the environment is widespread. Non-meat and non-dairy food options are more widely available than they have ever been, and their number continues to grow, yet someone who lives a vegan lifestyle is still stereotyped as far from normal.

I’ll just come out and say it. I follow a vegan diet. I’m a vegan.



I know what you’re thinking: that now I’m going to try to convince you to become one too. I’m going to ridicule your food choices and tell you that you are a murderer and are going to die of some awful disease because you don’t eat like me. Right? Wrong. Yes, there are the crazy, cow-hugging, hippie types who give us all a bad name, but that’s not everyone. In fact, I probably have less interest in what anyone else eats than they do in what I eat.

See, things on this side of the fence are not so much greener for the vegans who are not preachy crazies. When people find out that you are a vegan, you become a table-side circus attraction whom people want to convince to “cheat” on your diet, as if you’re on Weight Watchers and might slip up given enough peer pressure. Are you suuuure you don’t want a bite of my steak? The cow left a note, he was suicidal, can you eat it now?

I sometimes find myself telling the server at a restaurant or new acquaintances that I don’t eat meat and have a dairy allergy. I never know what someone will make of me if I say I’m vegan. Vegan is a loaded synonym for “crazy,” one that comes with many perceptions and stereotypes. So, for social purposes, I am a vegetarian with a dairy allergy; this way I’m not left fearing that someone will spit in my food for being the difficult diner.

At the end of the day, we are all just people and we all need to eat. I might be a vegan, non-vegan, vegetarian, pescetarian, caveman dieter, I might have a dairy allergy, a gluten intolerance, or I might just like what I like. Can’t we all just eat what we like? I promise to wear deodorant, shave my legs, and get plenty of protein.

More on Mettā

Last week, Sarah contributed a review of a NY Times op-ed by Barbara L. Fredrickson on the Buddhist practice of Mettā (Loving-Kindness) and its physiological benefits for your vagal tone, “a subconscious process that controls [your] heart rate.”  The post was especially interesting to me as a four-year practitioner of Vipassana meditation and Mettā.  To say these practices have been hugely beneficial for me would be an understatement, and certainly I feel their interpersonal, mental, and physical effects.  When I am actively practicing I am less prone to anger or irritation, my mind is sharper, my muscles are less tense, and I don’t take things as personally.  That research might point to a tangible connection between “physical health” and “mental well-being” validates my own experience.  However, as Sarah also points out, Fredrickson takes a leap when she suggests that electronic devices might “take a toll” on our “biological capacity to connect.”  Here, Fredrickson doesn’t have data to back her up but is alluding to potential results of research in process.

Actually, I don’t doubt that there are biological (and not just social) effects of the widespread use of electronic devices — anything we do with our minds and bodies also transforms our minds and bodies in ways big and small.  So my dubiousness about Fredrickson’s assertion differs a bit from Sarah’s.  Here’s the thing: I don’t know how useful such research questions are.  First, they restate what we already know — in other words they prove the obvious (there’s a mind-body connection!) — as so many scientific studies these days seem to do.  Second, they take as a given (rather than as something to be analyzed) that the spiritual is a pristine realm within us that must be protected from the other parts of ourselves.

The notion that something can “take a toll” on our capacity to connect assumes this capacity is ideal and autonomous rather than ever shifting and embedded within the context of a differentiated and power-laden social world and multi-faceted personal life.  Life is a complex process of loss and gain.  As modes of communication change, so do our skills and physiologies.  Humans are social beings by our very definition; I think it’s impossible for us to lose our biological ability to connect.  It’s a different thing to recognize that we can make choices about how we want to connect, how we want to develop our capacities to communicate, and how we can do that in a manner that prioritizes social justice.  Because that’s the other thing we humans have going for us: we’re conscious beings.

Also, context matters.  To put it anecdotally: earlier this year I was texting on the elevator at Hunter College and a professor made a comment to her student — in a slightly derogatory tone — about how “people’s elevator behavior” would be good to study.  I guess I seemed like one of those folks who had lost their capacity for human connection.  In reality I had just finished teaching and was reaching out to a friend who was in the midst of a painful medical procedure and was feeling really down.  So maybe I’m not such a lost soul after all?

If being an anthropology doctoral student has taught me anything, it is the value of asking research questions that get at the lived realities and nuances of social life and moving beyond polarizing discourses of good/bad and hurt/protect.  In this case, it might mean asking how our minds, bodies, and relationships change with different modes of communication — for different groups of people in a diversity of social/economic/geographic settings — and with what effects.

Loving-Kindness and Your Vagal Tone

In the online OpEd column of Friday’s New York Times, Barbara L. Fredrickson shared the results of a scientific research study showing that the Buddhist practice of Mettā can positively influence the health of the human heart. According to Wikipedia, Mettā is the act of feeling tenderly and positively toward everyone and everything, even those that we hate; it is “associated with tonglen (cf.), whereby one breathes out (“sends”) happiness and breathes in (“receives”) suffering.”

In her OpEd, Fredrickson has a bipartite agenda. First, she shares the results of her research:

My research team and I conducted a longitudinal field experiment on the effects of learning skills for cultivating warmer interpersonal connections in daily life. Half the participants, chosen at random, attended a six-week workshop on an ancient mind-training practice known as metta, or “lovingkindness,” that teaches participants to develop more warmth and tenderness toward themselves and others.

We discovered that the meditators not only felt more upbeat and socially connected; they also altered a key part of their cardiovascular system called vagal tone. Scientists used to think vagal tone was largely stable, like your height in adulthood. Our data show that this part of you is plastic, too, and altered by your social habits.

Vagal tone is a subconscious process that controls one’s heart rate. I personally think that this study is a great example of how religion and social science can find some common ground. There is certainly a great deal of social value to meditation and spiritual practice; I wonder whether this and other studies will make those who disparage such practices think twice. This study also bridges the gap between notions of physical health and mental well-being.

The second part of Fredrickson’s agenda in the OpEd seems to me to be more dubious. She questions whether modern technology such as cell phones can negatively affect our social capabilities. This is the OpEd’s opening:

Can you remember the last time you were in a public space in America and didn’t notice that half the people around you were bent over a digital screen, thumbing a connection to somewhere else?

Most of us are well aware of the convenience that instant electronic access provides. Less has been said about the costs. Research that my colleagues and I have just completed, to be published in a forthcoming issue of Psychological Science, suggests that one measurable toll may be on our biological capacity to connect with other people.

As far as I can tell, nowhere in this brief OpEd does Fredrickson offer any rationale for how her study might prove that electronic devices reduce our biological capacity to connect. She uses the following inference to try to prove her point: face-to-face interaction positively influences social gene expression, therefore electronic communication diminishes our social abilities.

Where's the baby? Photo by Dan Zink.

I think that this is an interesting hypothesis, but I don’t think that Fredrickson’s study really goes very far in testing said hypothesis. What is needed is further study on how electronic communication affects our social capacity and gene expression, the territory of what Fredrickson calls “the new field of social genomics.” I suspect that there are many socially positive as well as negative aspects of electronic communication, and that different users are affected differently. Fredrickson’s warning to mothers that they “may need to worry less about genetic testing and more about how their own actions — like texting while breast-feeding or otherwise paying more attention to their phone than their child — leave life-limiting fingerprints on their and their children’s gene expression” seems a little hyperbolic to me, at least without the research to back it up. I’m left wondering whether this OpEd, while wonderful and intriguing, also favors fear-mongering over scientific subtlety.

Of Superheroes and Roy Scheider

One of the most striking things I have seen while working with students as a fellow is not the verbal material they present, but rather their non-verbal communication and body language.  Many times students huddle together behind the lectern, which I have dubbed their “Fortress of Solitude.”  Most of those who have a speaking role at a given time do so while contracting into themselves, as if they would love nothing better than to dig a hole in which to hide from the critical eye of the instructor and the rest of the class.  I certainly understand that feeling and remember it from days long past as an undergraduate student myself.  The question then is never why, but how to fix this.  After all, as a fellow, that is my mandate. Given that my training is more on the social psychology side of academia, my solutions have naturally been drawn from that field.  More specifically, drawn from Wonder Woman and the research of Dr. Amy Cuddy.

Dr. Cuddy’s research focuses on how our body language can influence how we feel about ourselves.  Adopting a “high power” pose can increase testosterone production, which promotes confidence, and suppress cortisol production, which in turn reduces stress.  What is a “high power” pose?  According to Dr. Cuddy’s research, most expansive types of body language qualify: leaning back in your chair with your feet propped up and hands behind your head; leaning forward onto the table with both hands planted on the table surface.  Or the one the media has loved most and has since labeled the “Wonder Woman” pose: feet apart and planted, hands at your waist, shoulders thrown back and head held high.

Of course, statistically significant peer-reviewed research can at times be a hard sell to students.  My constant instruction on the useful properties of the Wonder Woman Pose has tended to result in nervous giggles, some of the students perhaps wondering if their fellow has a few screws loose.  Cue hilarious laughter when I actually adopt the Wonder Woman Pose in class, tossing my head back in a mock gesture to shake back a luscious head of hair.  Although I wonder if making it a “Superman Pose” would be any more effective, my adherence to the Edna Mode “No capes!” school of cutting-edge superhero fashion would greatly diminish the effect.  After all, what would Superman be without his flowing red cape?  Batman would hardly be better.  Have you heard the guy?  The man rarely talks, and when he does he does not enunciate!  That, and Batman slouches slightly within the dark recesses of his cape.  Maybe I could ask them to channel Roy Scheider and shout “It’s showtime, folks!” into the mirror…

In the end, superheroes being of no help, I can only reinforce the message to my students much as a high school football coach would psych up the team: “Don’t withdraw into yourselves!  Walk into your presentation with your back straight, shoulders back, and head held high!  You know the material!  They don’t!”

Effective Communication in the Workplace

Emails and online chats have become the predominant modes of communication in the workplace, replacing live, face-to-face conversations. Technology-based modes of communication are considered more effective, as they allow workers to get their message across quickly while multitasking. It’s now common for a worker to chat online with the colleague sitting in the next cubicle rather than get up, walk over, and talk face-to-face.

Similarly, because it’s faster, more convenient, and effective, in many cases emailing is preferred to talking in person. But an email can never work like a live conversation, simply because the message is transmitted so differently in the two forms of communication. In an email, everything is in the words (their presence or lack thereof), and maybe to an extent in the punctuation: the meaning is carried almost solely by the words that have been typed. In a live conversation, on the other hand, the message is transmitted in a variety of ways: through words, tone of voice, facial expression, eye contact, body language. The gestalt of all these pieces of information is then interpreted to form the meaning of what’s being communicated. In a live conversation, each party uses these different channels to communicate their point, and, importantly, each party gets to experience the multiple ways in which the other party communicates theirs. In an email, by contrast, it is very difficult to read anything beyond the words it contains.  Although while typing the email each party might experience a tone of voice or body language similar to what they would in a live conversation, they never get to see what the other party experiences or how they react. Thus, in a way, through email, only part of the information is really being communicated.

In addition, plain words, stripped of body language, tone of voice, and eye contact, are open to multiple interpretations. You know what the other person says in his email, but what you don’t know and can only guess is how he is saying it; and many times that makes quite the difference. After all, in the workplace it is sometimes (if not most times) inappropriate to place a smiley face at the end of a sentence to “soften the tone” of what you are saying. Similarly, it is very inappropriate, if not rude, to use capital letters, bold, or underlined text to communicate the urgency and importance of what you are saying. It gets even more complicated when, in an effort to be effective in communications and work tasks, people fail to pay attention to basics in their writing. Many times employees use abbreviations, omit greeting lines, and fail to proofread their emails, which further reduces the quality of the email communication. The reduced quality is then often compensated for by increased quantity: in case of misinterpretation, or when clarification is needed, a worker might have to send several emails to get his point across – something that could probably be accomplished with a single phone call or a quick face-to-face chat.

All this is not to say that technology-based communication is ineffective, but only that it might not be always as effective as we think it is. The choice of communication channel should be made such that communication quality and quantity are in good balance. Companies are constantly thinking of effective strategies to improve communication in the workplace. The fact that some of the largest, technology-focused companies are taking steps to improve face-to-face employee communication speaks volumes (At Yahoo, Working from Home Doesn’t Work).

Perfect Strangers, Alone Together

This past Valentine’s Day, a once-viral video from 2006 re-made the rounds online: Ben Coonley’s Valentine for Perfect Strangers.

I never get tired of Coonley’s video, described as “a romantic e-card from Otto, a feral cat seeking love from a stranger on the Internet. Otto edits himself into clips from the 1980’s sitcom Perfect Strangers and asks strangers on YouTube to return the favor.”

Watching it again this year, I thought about its potential overlap with Sherry Turkle’s Alone Together: Why We Expect More from Technology and Less from Each Other (2012). Ostensibly, Turkle and Coonley are working the same soil: intimacy and the internet. The website for Turkle’s book features the following language: “Facebook. Twitter. SecondLife. ‘Smart’ phones. Robotic pets. Robotic lovers. Thirty years ago we asked what we would use computers for. Now the question is what don’t we use them for. Now, through technology, we create, navigate, and perform our emotional lives.” “Technology has become the architect of our intimacies,” she goes on to warn.

But as I recalled watching the video for the first time — years ago with the friend who introduced me to it — I thought not of the pathetic ironies of 21st-century digitally mediated longing but of actual relationships: the shared laughter with my friend, and then my subsequent inclusion of the video in a screening program I’d put together in Puerto Rico. In a steamy gallery space with bad acoustics, dozens of young people sat crowded on the floor and watched Coonley’s video and other short works about love and longing. The event wasn’t a particular success, and I don’t have a big thesis — just a tiny observation: that for every grand evaluation of the impact of technology, there is an immediately available example of its very opposite. Every online alienation might contain the shadow of a genuine encounter in another time/space dimension. We should keep tracking both story lines.

Notes on Saving the World

Anyway, I keep picturing all these little kids playing some game in this big field of rye and all. Thousands of little kids, and nobody’s around – nobody big, I mean – except me. And I’m standing on the edge of some crazy cliff. What I have to do, I have to catch everybody if they start to go over the cliff – I mean if they’re running and they don’t look where they’re going I have to come out from somewhere and catch them. That’s all I do all day. I’d just be the catcher in the rye and all. I know it’s crazy, but that’s the only thing I’d really like to be.

We’re all familiar with Holden Caulfield’s strange interlude near the end of The Catcher in the Rye, from which the book gets its title.

It reminds me of something Amity Bitzel says in her section of the “This American Life” broadcast called “Surrogates.”  She tells the story of how a 27-year-old man who was convicted of killing his parents comes to be adopted into her own abusive family: http://www.thisamericanlife.org/radio-archives/episode/485/surrogates

As she narrates the story of her father’s abuse, she crystallizes the terror into moments in which her father’s rage results in his breaking all the furniture, hitting the girls with a belt, or strangling their mother.  This is the mother who, she recalls, was always a silent bystander.  She doesn’t know why it never occurred to her to call the cops, and she makes reference to the responsible adults who never interceded: “The outside world was never coming to intervene, to save any of us.”

It doesn’t always hit us straightforward and sad.  Robert Hamburger’s REALUltimatePower is a testament to the sweetness of ninjas.  Written from the perspective of a 12-year-old boy, the aforementioned Robert, the website is hilarious.  Robert is obsessed with ninjas and thinks you should be too.  After all, they fight all the time and they “totally flip out and kill people.”  The book that resulted from the website starts out just as funny, praising all ninjahood, and even features his babysitter, a philosophy student, who provides “ontological proof of ninjas” in a footnote.  However, the humor ends abruptly once the reader realizes that Robert’s ninja craze is really about the fact that he lives in an abusive home in which he is unable to protect himself, and so he has created the fantasy of ninjas as a way of summoning those who can intercede, if only in his imagination.  The appendix of the book features various documents that make the situation fairly clear. They are as hyperbolic as they are true:

[scans of documents from the book’s appendix]

I had a Robert when I used to teach elementary school in France, but his name was Guillaume.  He would hit another student named Georges over the head with a dictionary and rant a string of sarcasms about how he did it because he’s a maniac. Or he would eat a crayon in a display of frenzy when the girls were watching.  Or he would start a fight at recess. He was always in trouble. One day I passed his desk and, as I raised my hand in a gesture, he flinched.  But this was not part of his theatrics.

Lauren Berlant thinks a lot about what it is for people to be with other people. In her book Cruel Optimism (2011), she poses the question: “Why do people stay attached to lives that don’t work?” Her question concerns adults, who have the option of choosing other lives, but it helps us to think about what people might do when they don’t have the option of changing lives.  They create things to save them. “Cruel Optimism,” says Berlant, “tells some pretty difficult stories about how people maintain their footing in worlds that are not there for them.”  In my mind, the idea of living in a world that is not there for you means being forced to inhabit simultaneous worlds which are out of sync with one another although they remain intimately connected. There is the given world, in which an order is present, especially a social order, whose vulnerability is its most necessary quality, and which we might say operates by way of Kant’s categorical imperative and the Golden Rule alike. And then there is the personal world, in which that order sometimes harmfully fails, so that order becomes bare and arbitrary. Yet one must go on doing as one would have done unto oneself, whatever that is supposed to mean in the disparate positions of these two worlds cleaved into one.

I entered graduate school thinking that if this gig doesn’t work out I’ll just go and teach elementary school, as my heart was split from the first in that decision.  I always wonder if this is the year that I will abandon my graduate studies and go off to teach kids about peregrine falcons and help them glue together those art projects that receive the unconditional “oooooooh” from a parent on whom this enormous gift is bestowed. Maybe this year I’ll walk away from these ridiculous academic struggles to go do that, that easier thing.

A friend was telling me the other day about a summer he spent teaching at a summer camp.  All day he was with the kids, teaching them, giving them the care that goes along with giving them ideas. But at the end of the day, he knew he was sending some of them to be decimated again in those war zones of hostile homes. No matter what he might help them to see through their own better minds, they were still and always going home. He only taught there the one summer.

Holden’s craziness is often misread as part of Salinger’s style: by also revealing that Holden is in a mental institution, Salinger twists the plot into a sad surprise ending, like some literary grace that solves the problem of Holden’s disappointment in life and its systems, of his revelations of people as selfish or shallow, so that we might consider taking it all with a grain of salt. That’s the dismissible reading. But another reading is the more cynical one, perhaps: that Salinger writes Holden’s altruism as an insanity.  You really can’t save the world.  It’s crazy.

I think that my friend didn’t go back to that summer camp because it is hard with the little ones. Who can stand on the edge of the cliff running back and forth without going crazy? “I’ll see them when they get to college,” my friend told me. And in that moment I knew I was never going to teach elementary school.

“I’ll see them when they get to college,” I told myself, hoping they find their way here.  Doesn’t Virgil take Dante’s hand, leading him through purgatory, teaching him to make sense of it?