Composition Across the Curriculum

Due to our many discussions about Communication Across the Curriculum and multimodal composing at the Schwartz Institute, I became interested in the idea of Composition Across the Curriculum. In particular, I wanted to think through the pedagogy of using writing, speech, and video in the same classroom. What is similar and what is different for students and instructors when it comes to these different technologies of expression?

Below is an interview with documentary filmmaker Sascha Just, who teaches film production and public speaking in Baruch’s Department of Communication Studies. Her short doc Ambassadors – The Native Jazz Quartet at Work has been screened at the American Documentary Film Festival in Palm Springs, CA; at the Queens World Film Festival, where it was nominated for best short doc; and at Woman with A Movie Camera.


1. What kind of assignments do the film production students create?

The students have four assignments. The first one is a dialogue scene. They form groups and pick a scene. We then film the scenes in class and they edit them in the library computer lab.

For the second assignment, they go out and film a chase scene: two or three people chase each other. It’s a very fun assignment – creative, somewhat adventurous. More than anything, it teaches how to compose a shot and how to create continuity – how to build a story. We are dealing with structure based on logic – basic film language. By this point, most of them could handle [the editing software] Final Cut, and all the scenes turned out extremely well.

The third project was a fundraiser/Kickstarter video for their final project, which was a short documentary. I figured this is a business school and I want to teach them the reality of filmmaking. It’s expensive. A short fundraiser forces you to focus on the essence of your project. The final assignment was a 10-minute documentary.

2. What kinds of writing do the students do during the semester? How does the writing prepare the students for filming or help them reflect on what they created?

For the chase scene, students drew storyboards to ensure that the shot order would be effective, economical, and logical. For the documentary project, they wrote a production plan: a premise of the project and a description of what they were going to shoot, where, and when. This helps tighten the production, schedule the shoots, plan interviews, and overall tailor production decisions to support the main idea of the film.

The students write a film analysis paper and an exam.  Both written assignments ask students to demonstrate their understanding of film language. This means, on the one hand, that they use the correct terminology and can communicate with other filmmakers. On the other, it means that they grasp the meanings a sequence of shots can express. For example, why does the filmmaker choose to shoot this scene with close-ups? What did she try to convey?

3. Do you see any strong connections between structuring a speech and structuring a doc? –In terms of clarity of perspective, editing (knowing what to put in, take out, when and how to present information), etc.?

Doc films in particular work with reality but they are no more realistic than so-called fiction films. No matter how accurately researched, they always play with reality. The same can be said about speeches and academic papers. I guess, altogether I question the possibility of representing reality.

However, I believe in putting great effort into creating a structure built on logic. That turns out to be one of the most challenging aspects of filmmaking and public speaking. The questions of “does this scene belong here or there, why does it feel right to place this scene here and not there” preoccupy me a lot. It’s a constant negotiation between the style or aesthetics I am trying to create and the content/information I am trying to communicate. Again, the same as with speeches or academic papers.

AMBASSADORS is a very simple story without a real dramatic climax, but it was nonetheless difficult to structure. The musicians noticed it: I used the songs as the structure. I personally do not like to work with voice-overs, but there are many great films that do (REEL INJUN, a must-see). I am trying to keep my own voice out of it as much as possible, because a) I am more interested in what the characters have to say and b) I feel that my viewpoint comes through a lot anyhow, simply because I select, interview, structure, etc.

4. Any thoughts on the communication that happens between the documentarian and the subject and between instructor and student? If the same, how so? If different, in what ways?

I hadn’t thought of it before, but I think there are parallels between interviewing and Q&A with students. Both require attentive and engaged listening. Waiting until the person is finished. Prompting further thoughts with short follow-up questions. Keeping questions short and clear. Neither students nor interviewees should spend too much time trying to figure out what it is I am asking, right?

Both students and interview subjects respond much more willingly if they sense that I care. Once I cried in an interview, because what the person (an older, very unhappy Indian) told me was heartbreaking. It turned out to be one of the most meaningful and informative interviews I have ever conducted. So much for neutrality.

We never are objective anyhow, so why would I try that in such heightened situations like an interview or the classroom? It becomes dishonest.

5. Do you have any thoughts about how people’s behavior changes in front of the camera (particularly in this digital smart phone age)? –I ask this particularly in the context of Baruch where we use the technique of taping students and doing an immediate playback so they can experience their vocal and bodily delivery habits as an audience member would.

Even very confident people who believe that they forget about the camera are on some level aware of it. In my opinion, they perform for the camera. Not necessarily a problem. Without the camera, they would perform for the teacher, class, or interviewer. Performing is so often defined as negative = fake. But ultimately it means that students or interviewees pull themselves together, focus, try to make a good impression, and eliminate distracting stories or habits to the best of their ability.

Sascha Just was born and raised in Berlin, Germany and is a doctoral candidate in the CUNY Graduate Center’s Theatre Department. Her dissertation is about the cinematic representation of New Orleans performance cultures. Just’s in-process documentary film Heirs is a music-driven portrait of New Orleans composed of three artists’ journeys into the city’s past: drummer/vibraphonist Jason Marsalis, Mardi Gras Indian Chief Darryl Montana, and theater artist Lisa D’Amour. 

The Ask

I admit to having experienced a slight cringe upon hearing the word “ask” used as a noun recently. The usage to which I’m referring usually takes some form along the lines of “the ask is that you do so and so…” or “the ask is for such and such…” I only started noticing this replacement of “request” (or “demand”!) with a nounification of ask in the last year or so, and had been wondering about this apparently new semantic trend. Then, a few weeks ago, I received an email that reawakened my curiosity about the phenomenon. The message was from someone involved with union work who was asking a favor of his contacts. At the end of the message, he wrote:

To repeat “the ask” (union-organizer lingo for what we’re asking you to do): Please help…

Since I first started coming across ask-as-noun-replacing-request in activist circles, this assertion that “the ask” is commonly used in union talk caused me to speculate as to whether it was part of some leftist conspiracy. There may be some evidence for this. For example, a poster on the timely “Stop Using ‘Ask’ As A Noun”  Facebook page writes that they hear it all the time on NPR, that bastion of the liberally biased media. And, Hillary Clinton, former secretary of state for the socialist Obama administration, purportedly once said:

We’re going to expect more from the Afghan government going forward, and we’ve got some very specific asks that we will be making.

But it appears that use of “the ask” cuts across diverse sectors. For instance, in finance lingo, the minimum price a broker will take for a financial instrument is sometimes called “the ask.” Curiously, a nounified “ask” appears to have infected the vocabulary at corporate behemoth Microsoft in the mid-2000s, as can be seen in blog entries written by more than one frustrated employee. I find this especially amusing considering the fact that the Encarta Dictionary built into Microsoft Word does not include a noun form of “ask” in its entry for that word. As I type this, I see the phraseology “the ask” appearing throughout this post underlined in cautionary wavy green by Word’s grammar checker.

And this brings us to the potentially irrelevant issue of whether using ask as a noun is grammatically correct in the first place. Along with MS Word’s dictionary, other authoritative references offer no synonyms for the noun form. Ah, but the OED informs us that ask was used as a noun as far back as the year 1000 AD (and I don’t mean in its other guise as a noun: an English/Scottish term for a newt)! The OED update from 2005 further cements “the ask’s” grammatical propriety with reference to its contemporary colloquial usage in Australia, where it is usually preceded by the modifier “big” (e.g. “that’s quite a big ask” [please make sure to pronounce the "k" if you say this]).

Despite the grammatically correct roots of “the ask” in ancient English, though, it’s clear that this form had long fallen out of regular use in modern English. It’s also evident that “the ask” has been making a comeback of late, and that it might be part of a more general linguistic trend that is meeting with some resistance. A piece in the New York Times by Henry Hitchings last year placed the revival of “the ask” in the context of a phenomenon he described as nominalization (the correct term for “nounification”): when a verb or adjective is transformed into a noun. Nominalization (itself a nominalization of “to nominalize”) comes in two types: the first, more common type involves adding a suffix (e.g. “to frustrate” becomes “frustration”), while in the second the same word is simply converted into a noun. The article’s opening paragraph gives a few notorious examples of the second, more controversial type of nominalization (though another NYT opinionator has lambasted academic writing for inelegant overuse of the first type):

“Do you have a solve for this problem?” “Let’s all focus on the build.” “That’s the take-away from today’s seminar.” Or, to quote a song that was recently a No. 1 hit in Britain, “Would you let me see beneath your beautiful?”

As Hitchings explains, this kind of nominalization can be employed in speech or writing to sound edgy or jaunty; think of the currently hip phrase “epic fail.” But he points out that in the case of “the ask” there is a sort of “distancing” or depersonalizing effect. For instance, the expression “the ask is that…” is undoubtedly less direct than saying “I am asking you to…” And I think this is what irks me about “the ask” (or, as a request of me was put in a recent email, “the want”), rather than some obstinate reaction to linguistic change. The role of “the ask” in obfuscating the personal dynamics of a request is perhaps supported by its prominence in fundraising lingo, as established in the title of the 2006 book The Ask: How to Ask Anyone for Any Amount for Any Purpose. (Along these lines, the protagonist of the 2010 novel The Ask is a university fundraiser.) There also seems to be a perception out there that “the ask” is an evil component of corporate doublespeak; many posters on one site that simply compiles the approximately bi-weekly tweets reaffirming the position stated in its URL allude to its use in business jargon. Given this popular view, it seems a bit ironic to me that Marxist union activists are claiming “the ask” as their own!

“Here, YOU write this down”

Photo by City Year

During the first several years of my teaching career, like many other teachers, I became very accustomed to using the white (or, in the case of some CUNY classrooms, the black) board to underline key points of lectures and discussions.  A typical class period might have seen me leading a discussion about a particular piece of reading material, periodically turning my back on the students to write out pieces of what they were saying. Although I’ve always liked the idea of visually underlining the knowledge we collectively created using the board, I’ve also always felt a bit disconnected from students when spending so much class time busily taking dictation. Plus, all that writing is tiring!

So, in the past few semesters I’ve started handing off the task of writing on the board to the students themselves. I bring about a dozen dry-erase markers to every class, leave them on the front table, and invite the students to get up out of their seats, grab a marker, and put something up on the board for all to see.  Since I started incorporating this technique into more formalized exercises, I’ve noticed a few immediate, and unexpected, benefits:

1.  First, it loosens things up. With the class divided into small groups, with each group responsible for writing a few notes or arguments or quotes drawn from readings, students become more physically active, moving about the room, and talking to each other. The atmosphere of the room feels more exciting, like a learning laboratory. Rather than acting as a disseminator of knowledge, my role is transformed to that of a guide, helping students along their own path of critical inquiry.

2.  Because of the prevalence of social media, students are increasingly comfortable with the idea of “public postings.”  By duplicating the framework of a Facebook status or a tweet, but layering in a more rigorous and critical element to the production of such writing, I’ve found students to be generally very receptive to the idea of complicated knowledge condensed through careful composition. During a recent exercise in my U.S. History survey, for example, I asked students to imagine that they were administering the Twitter account of different figures from the Civil Rights Movement of the 1950s and 1960s. After reading pieces from Martin Luther King, Jr., Malcolm X, Ella Baker, Stokely Carmichael, and others, I found many students eager to approach the board to attempt to fashion a tweet that accurately reflected the complex arguments made in these readings. As I surveyed a board full of these tweets, I realized that the exercise was really just asking them to draw out the main points of the work; there is nothing groundbreaking about that. But the marker in their hands, along with the form of social media, unquestionably helped many students to make connections that they hadn’t noticed before.

3. While students are writing on the board, I often take the opportunity to ask them about what they are writing and why they chose to put it up on the board. This is perhaps the most unexpected and rewarding element of the exercise.  Since the other students are working on their own readings, I have time for a one-on-one discussion with the student at the board, and since so many students are uncomfortable speaking in front of the whole class, I’m able to make contact with (and actually get to know) a wider section of my class than ever before.

4. Finally, and perhaps most importantly, putting the marker in students’ hands transfers power from the instructor to the students themselves. They are the creators of knowledge, and the recorders of that knowledge, and because they are sharing their thoughts in writing with the rest of the class on the board, aiming for those thoughts to be useful for everyone else, they are forced to become very critical and selective about their writing, and that’s exactly the point.

As I continue to experiment with these kinds of exercises, I’m realizing that I find the idea of students writing on the board appealing mainly because it transforms the classroom into a different kind of space, a more active space, in which the process of knowledge creation becomes the subject of the class itself.

Idealism, Pragmatism, and Evolution

I confess I joined the site for the same reason I joined Facebook:  my friends pressured me into it. There are also, of course, professional and philosophical arguments to be made in the scholarly online community’s favor:  it’s a great way to network and share ideas outside of one’s particular department or the (to say the least) fraught world of peer review and academic publishing.

Intrinsically, idealistically, I love the idea of the site: using a social networking model to further academic discovery and sharing. It builds on the essential freedoms offered by the web—free publication, a broad reach, a curated community—and enacts a model I have no philosophical quibble with, one of openness, generosity, and sharing.

All of these lovely ideals, though, come up against the more worrying reality of the academic world and our careers in the material world. I can’t be alone in feeling reluctant to share my work online, disseminating it among people who might be less than scrupulous about citation and attribution. Furthermore, many academic presses and journals will (understandably) only take on previously unpublished work, and our careers are highly dependent on publication by reputable presses and journals. The counterpoint to these concerns is stories like this one, where someone used the site precisely for its intended purpose:  to share research and gain recognition beyond her institution’s own politics and perceived limitations.

These questions only highlight for me the importance of such platforms. Like other social media, this site doesn’t cause the problems of transitioning into new professional, communicative, and economic modalities, but rather illustrates some of the defining tensions of that transition. I remain reluctant to share my ideas, but this is a consequence of living in the world as it is, where fear of plagiarism and the cutthroat system of peer review and academic publication can stifle creative, original research and a generous, collaborative culture. I hope the site is an indication of where things are going, although at the moment professional pragmatism may still trump full engagement in this evolution.

What’s in a Name?


Vegan.

What do you think of when you hear that word? Don’t sugarcoat it; everyone is probably thinking the same thing.

Weirdo. Crazy. Doesn’t wear deodorant. Smells funny. Hairy legs. Preachy. Cow-hugger. Hippie.

You may have also rolled your eyes when you read the word vegan. This seems to be a typical reaction, along with the perceptions above. What is it about this word that leads to so many negative connotations? In today’s society, knowledge of the dangers of factory farming for both our health and the environment is widespread. Non-meat and non-dairy food options are more prevalent than they have ever been and continue to grow, yet the stereotypical view of someone who lives a vegan lifestyle remains far from the norm.

I’ll just come out and say it. I follow a vegan diet. I’m a vegan.


I know what you’re thinking: now I’m going to try to convince you to become one too. I’m going to ridicule your food choices and tell you that you’re a murderer who is going to die of some awful disease because you don’t eat like me. Right? Wrong. Yes, there are the crazy, cow-hugging, hippie types that give us all a bad name, but that’s not everyone. In fact, I probably have less interest in what anyone else eats than they do in what I eat.

See, things on this side of the fence are not so much greener for the vegans who are not preachy crazies. When people find out that you are a vegan, you become a table-side circus attraction whom people want to convince to “cheat” on your diet, as if you’re on Weight Watchers and might slip up given enough peer pressure. Are you suuuure you don’t want a bite of my steak? The cow left a note, he was suicidal, can you eat it now?

I sometimes find myself telling the server at a restaurant or new acquaintances that I don’t eat meat and have a dairy allergy. I never know what someone will think of me if I say I’m vegan. “Vegan” is a loaded word, practically synonymous with “crazy,” that comes with many perceptions and stereotypes. So, for social purposes, I am a vegetarian with a dairy allergy; this way I need not fear that someone will spit in my food for being the difficult diner.

At the end of the day, we are all just people and we all need to eat. I might be a vegan, non-vegan, vegetarian, pescetarian, caveman dieter, I might have a dairy allergy, a gluten intolerance, or I might just like what I like. Can’t we all just eat what we like? I promise to wear deodorant, shave my legs, and get plenty of protein.

More on Mettā

Last week, Sarah contributed a review of a NY Times op-ed by Barbara L. Fredrickson on the Buddhist practice of Mettā (Loving-Kindness) and its physiological benefits on your vagal tone, “a subconscious process that controls [your] heart rate.”  The post was especially interesting to me as a four-year practitioner of Vipassana meditation and Mettā.  To say these practices have been hugely beneficial for me would be an understatement, and certainly I feel their interpersonal, mental, and physical effects.  When I am actively practicing I am less prone to anger or irritation, my mind is sharper, my muscles are less tense, and I don’t take things as personally.  That research might point to a tangible connection between “physical health” and “mental well-being” validates my own experience.  However, as Sarah also points out, Fredrickson takes a leap when she suggests that electronic devices might “take a toll” on our “biological capacity to connect.”  Here, Fredrickson doesn’t have data to back her up but is alluding to potential results of research in process.

Actually, I don’t doubt that there are biological (and not just social) effects of the widespread use of electronic devices — anything we do with our minds and bodies also transforms our minds and bodies in ways big and small.  So my dubiousness about Fredrickson’s assertion differs a bit from Sarah’s.  Here’s the thing: I don’t know how useful such research questions are.  First, they restate what we already know — in other words they prove the obvious (there’s a mind-body connection!) — as so many scientific studies these days seem to do.  Second, they take as a given (rather than as something to be analyzed) that the spiritual is a pristine realm within us that must be protected from the other parts of ourselves.

The notion that something can “take a toll” on our capacity to connect assumes this capacity is ideal and autonomous rather than ever shifting and embedded within the context of a differentiated and power-laden social world and multi-faceted personal life.  Life is a complex process of loss and gain.  As modes of communication change, so do our skills and physiologies.  Humans are social beings by our very definition; I think it’s impossible for us to lose our biological ability to connect.  It’s a different thing to recognize that we can make choices about how we want to connect, how we want to develop our capacities to communicate, and how we can do that in a manner that prioritizes social justice.  Because that’s the other thing we humans have going for us: we’re conscious beings.

Also, context matters.  To put it anecdotally: earlier this year I was texting on the elevator at Hunter College, and a professor made a comment to her student – in a slightly derogatory tone – about how “people’s elevator behavior” would be good to study.  I guess I seemed like one of those folks who had lost their capacity for human connection.  In reality I had just finished teaching and was reaching out to a friend who was in the midst of a painful medical procedure and was feeling really down.  So maybe I’m not such a lost soul after all?

If being an anthropology doctoral student has taught me anything, it is the value of asking research questions that get at the lived realities and nuances of social life and moving beyond polarizing discourses of good/bad and hurt/protect.  In this case, it might mean asking how our minds, bodies, and relationships change with different modes of communication — for different groups of people in a diversity of social/economic/geographic settings — and with what effects.

Loving-Kindness and Your Vagal Tone

In the online OpEd column of Friday’s New York Times, Barbara L. Fredrickson shared the results of a scientific research study which proved that the Buddhist practice of Mettā can positively influence the health of the human heart. According to Wikipedia, Mettā is the act of feeling tenderly and positively toward everyone and everything, even those that we hate; it is “associated with tonglen (cf.), whereby one breathes out (“sends”) happiness and breathes in (“receives”) suffering.”

In her OpEd, Fredrickson has a bipartite agenda. First, she shares the results of her research:

My research team and I conducted a longitudinal field experiment on the effects of learning skills for cultivating warmer interpersonal connections in daily life. Half the participants, chosen at random, attended a six-week workshop on an ancient mind-training practice known as metta, or “lovingkindness,” that teaches participants to develop more warmth and tenderness toward themselves and others.

We discovered that the meditators not only felt more upbeat and socially connected; but they also altered a key part of their cardiovascular system called vagal tone. Scientists used to think vagal tone was largely stable, like your height in adulthood. Our data show that this part of you is plastic, too, and altered by your social habits.

Vagal tone is a subconscious process that controls one’s heart rate. I personally think that this study is a great example of how religion and social science can find common ground. There is certainly a great deal of social value to meditation and spiritual practice–I wonder whether this and other studies will make those who disparage such practices think twice. This study also helps bridge the divide between notions of physical health and mental well-being.

The second part of Fredrickson’s agenda in the OpEd seems to me to be more dubious. She questions whether modern technology such as cell phones can negatively affect our social capabilities. This is the OpEd’s opening:

Can you remember the last time you were in a public space in America and didn’t notice that half the people around you were bent over a digital screen, thumbing a connection to somewhere else?

Most of us are well aware of the convenience that instant electronic access provides. Less has been said about the costs. Research that my colleagues and I have just completed, to be published in a forthcoming issue of Psychological Science, suggests that one measurable toll may be on our biological capacity to connect with other people.

As far as I can tell, nowhere in this brief OpEd does Fredrickson offer any rationale for how her study might prove that electronic devices reduce our biological capacity to connect. She uses the following inference to try to prove her point: face-to-face interaction positively influences social gene expression, therefore electronic communication diminishes our social abilities.

Where's the baby? Laptop time. Photo by Dan Zink.

I think that this is an interesting hypothesis, but I don’t think that Fredrickson’s study really goes very far in testing said hypothesis. What is needed is further study on how electronic communication affects our social capacity and the expression of what Fredrickson calls “the new field of social genomics.” I suspect that there are many socially positive as well as negative aspects of electronic communication, and that different users are affected differently. Fredrickson’s warning to mothers that they “may need to worry less about genetic testing and more about how their own actions — like texting while breast-feeding or otherwise paying more attention to their phone than their child — leave life-limiting fingerprints on their and their children’s gene expression” seems a little hyperbolic to me, at least without the research to back it up. I’m left wondering whether this OpEd, while wonderful and intriguing, also favors fear-mongering over scientific subtlety.

Of Superheroes and Roy Scheider

One of the most striking things I have seen while working with students as a fellow is not the verbal material they present, but rather their non-verbal communication and body language.  Many times students huddle together behind the lectern, which I have dubbed their “Fortress of Solitude.”  Most of those who have a speaking role at a given time do so while contracting into themselves, as if they would love nothing better than to dig a hole to hide in from the critical eye of the instructor and the rest of the class.  I certainly understand that feeling and remember it from my own days long past as an undergraduate.  The question then is never why, but how to fix this.  After all, as a fellow, that is my mandate.  Given that my training is more on the social psychology side of academia, my solutions have naturally been drawn from that field — more specifically, from Wonder Woman and the research of Dr. Amy Cuddy.

Dr. Cuddy’s research is focused upon how our body language can influence how we feel about ourselves.  Adopting a “high power” pose can increase testosterone production, which promotes confidence, and suppress cortisol production, which in turn reduces stress.  What is a “high power” pose?  According to Dr. Cuddy’s research, most expansive types of body language qualify: leaning back in your chair with your feet propped up and hands behind your head, or leaning forward onto the table with both hands planted on the surface.  And then there is the one the media has loved most and since labeled the “Wonder Woman” pose: feet apart and planted, hands at your waist, shoulders thrown back, and head held high.

Of course, statistically significant, peer-reviewed research can at times be a hard sell to students.  My constant instruction on the useful properties of the Wonder Woman Pose has tended to result in nervous giggles, some of the students perhaps wondering if their fellow has a few screws loose.  Cue hilarious laughter when I actually adopt the Wonder Woman Pose in class, tossing my head back in a mock gesture to shake back a luscious head of hair.  Although I wonder if making it a “Superman Pose” would be any more effective, my adherence to the Edna Mode “No capes!” school of cutting-edge superhero fashion would greatly diminish the effect.  After all, what would Superman be without his flowing red cape?  Batman would hardly be better.  Have you heard the guy?  The man rarely talks, and when he does, he does not enunciate!  That, and Batman slouches slightly within the dark recesses of his cape.  Maybe I could ask them to channel Roy Scheider and shout “It’s showtime, folks!” into the mirror…

In the end, superheroes being of no help, I can only reinforce the message to my students much like how a high school football coach would psych up the team: “Don’t withdraw into yourselves!  Walk into your presentation with your back straight, shoulders back, and head held high!  You know the material!  They don’t!”

Effective Communication in the Workplace

Emails and online chats have become the predominant modes of communication in the workplace, replacing live, face-to-face conversations. Technology-based communication is considered more effective because it allows workers to get their message across quickly while multitasking. It’s now common for a worker to chat online with the colleague sitting in the next cubicle rather than get up, walk over, and talk face-to-face.

Similarly, because it’s faster and more convenient, in many cases emailing is preferred to talking in person. But an email can never work like a live conversation, simply because the message is transmitted so differently in the two forms of communication. In an email, everything is in the words (their presence or absence) and, to an extent, in the punctuation: the meaning is carried almost entirely by what has been typed. In a live conversation, by contrast, the message is transmitted in a variety of ways: through words, tone of voice, facial expression, eye contact, and body language. The gestalt of all these pieces of information is then interpreted to form the meaning of what’s being communicated. In a live conversation, each party uses these multiple channels to communicate their point, and, importantly, each party gets to experience the multiple channels through which the other communicates. In an email, on the other hand, it is very difficult to see beyond the words it contains.  Although the writer of an email might experience a tone of voice or body language similar to what they would in a live conversation, they never get to see what the other party experiences or how they react. Thus, in a way, through email, only part of the information is really being communicated.

In addition, plain words, stripped of body language, tone of voice, and eye contact, can be open to multiple interpretations. You know what the other person says in their email, but what you don’t know, and can only guess, is how they are saying it; and many times that makes quite a difference. After all, in the workplace it is sometimes (if not usually) inappropriate to place a smiley face at the end of a sentence to “soften the tone” of what you are saying. Similarly, it is inappropriate, if not rude, to use capital letters, bold, or underlining to communicate the urgency and importance of what you are saying. It gets even more complicated when, in an effort to be efficient, people fail to pay attention to basics in their writing. Employees often use abbreviations, omit greeting lines, and fail to proofread their emails, which further reduces the quality of the communication. The reduced quality is then often compensated for by increased quantity: in the case of a misinterpretation, or when clarification is needed, a worker may have to send several emails to get the point across, something that could probably be accomplished with a single phone call or a quick face-to-face chat.

All this is not to say that technology-based communication is ineffective, only that it might not always be as effective as we think it is. The choice of communication channel should be made so that communication quality and quantity are in good balance. Companies are constantly devising strategies to improve communication in the workplace, and the fact that some of the largest technology-focused companies are taking steps to improve face-to-face employee communication speaks volumes (“At Yahoo, Working from Home Doesn’t Work”).

Perfect Strangers, Alone Together

This past Valentine’s Day, a once viral video from 2006 re-made the rounds online: Ben Coonley’s Valentine for Perfect Strangers.

I never get tired of Coonley’s video, described as “a romantic e-card from Otto, a feral cat seeking love from a stranger on the Internet. Otto edits himself into clips from the 1980’s sitcom Perfect Strangers and asks strangers on YouTube to return the favor.”

Watching it again this year, I thought about its potential overlap with Sherry Turkle’s Alone Together: Why We Expect More from Technology and Less from Each Other (2012). Ostensibly, Turkle and Coonley are working in the same soil: intimacy and the internet. The website for Turkle’s book features the following language: “Facebook. Twitter. SecondLife. ‘Smart’ phones. Robotic pets. Robotic lovers. Thirty years ago we asked what we would use computers for. Now the question is what don’t we use them for. Now, through technology, we create, navigate, and perform our emotional lives.” “Technology has become the architect of our intimacies,” she goes on to warn.

But as I recalled watching the video for the first time — years ago with the friend who introduced me to it — I thought not of the pathetic ironies of 21st-century, digitally mediated longing but of actual relationships: the shared laughter with my friend, and then my subsequent inclusion of the video in a screening program I’d put together in Puerto Rico. In a steamy gallery space with bad acoustics, dozens of young people sat crowded on the floor and watched Coonley’s video and other short works about love and longing. The event wasn’t a particular success, and I don’t have a big thesis — just a tiny observation: for every grand evaluation of the impact of technology, there is an immediately available example of its very opposite. Every online alienation might contain the shadow of a genuine encounter in another time/space dimension. We should keep tracking both story lines.