On Smartphones and Journalism

For the past two semesters, I’ve worked with students as they reported all over the five boroughs and Long Island for the Multimedia Journalism class. They’ve produced photo slideshows, videos, and podcasts for the class, and my role has been to coach them through the reporting and editing process.

Here at Baruch, we have audio recorders, video cameras, and basic still cameras that the students can borrow from the school if they don’t have their own equipment. At this point, we don’t have high-quality DSLR cameras to offer them (and in any case it’s not an advanced-level class). So most of the time, for the photojournalism assignments, we had them use their smartphone cameras.

I noticed fairly early on that some of the students seemed a little bummed that they had to rely on their smartphones rather than professional-grade equipment when it came time to shoot their photo essays. I’ve been a student journalist myself and know what it is like to feel as though my student status and tight budget are holding me back from telling stories as well as I’d like—so I sought to reassure them that there was no need to feel limited.

The first thing I did was tell them about photojournalist Michael Christopher Brown, who has been featured on TIME’s LightBox photo blog for his iPhone photo essays made in Libya and the Democratic Republic of the Congo. He told LightBox that he began shooting on his iPhone after dropping his SLR shortly after arriving in Libya, and then found that in many ways he actually preferred it.

Our multimedia class discussed the pros and cons of using a smartphone as a camera. True, the quality of the image isn’t as great with a smartphone, and the camera is much more limited in terms of the light conditions where it can shoot. But something strange happens when you pick up something the size of your palm to take a picture of someone instead of several pounds worth of glass, metal and plastic: You become invisible.

You could be checking your email, posting on Facebook, or playing Angry Birds. But even if the person knows you’re taking their picture, a phone is simply less intimidating. Subjects have blinked and even physically recoiled when I’ve pointed my DSLR quite close to their face to take a portrait. Using a camera phone often conveys a certain intimacy, and it makes you seem less of a threat. I know journalists who have been allowed access to places—field hospitals, for instance—with their iPhones while their colleagues with heavy cameras have been forced to wait outside.

Halfway through the fall semester, Hurricane Sandy shut down Baruch for a week. Many of our students were directly affected, and getting back on track as a class wasn’t easy. But one wonderful thing to come out of Sandy was the fact that it afforded the students an opportunity to report on a major story unfolding in their own backyards, and they did some truly beautiful work.

Sandy also led to an historic moment in photojournalism. For the first time ever, a photograph taken with a smartphone made the cover of TIME magazine.


Photo by Ben Lowy

The photographer, Ben Lowy, along with Brown and three other photojournalists, was commissioned by TIME to document Sandy and its aftermath on Instagram.

Earlier this spring, Baruch invited Australian photographer Andrew Quilty, one of TIME’s five Instagrammers, to speak at a panel called “Your Smartphone: A Window On The World.” Sitting on the panel alongside Quilty were Genevieve Belmaker and Kirsti Itämeri, who have both used smartphones extensively in their work. The presentations and discussion delved into the practical aspects of using smartphones, the ethical ramifications, and the future implications for journalism as they become increasingly ubiquitous and cost-effective tools.

Just two weeks ago, for instance, the photojournalism world was stunned by the news that The Chicago Sun-Times had laid off its entire photo department in favor of putting iPhones into the hands of its reporters. From reading my musings up until now, you might think I applauded this decision, but let me point out one key distinction: Quilty, Lowy, and Brown are all experienced photographers who have spent many years developing an eye for style, composition, and content. When they take pictures with an iPhone, it isn’t an afterthought, a way to have something to run alongside the story. As far as I’m concerned, there will always be a need for photojournalists who devote their lives to the craft.

One of the Sun-Times photographers started a Tumblr shortly after being laid off. In the description, he writes, “Rob Hart was replaced with a reporter with an iPhone, so he is documenting his new life with an iPhone, but with the eye of a photojournalist trained in storytelling.” And he delivers.

Ultimately, that’s what I want my students to see. That it’s not about the type of camera, it’s about the journalist holding it.

Idealism, Pragmatism, and Evolution (or, Grappling with Academia.edu)

I confess I joined Academia.edu for the same reason I joined Facebook: my friends pressured me into it. There are also, of course, professional and philosophical arguments to be made in the scholarly online community’s favor: it’s a great way to network and share ideas outside of one’s particular department or the (to say the least) fraught world of peer review and academic publishing.

Intrinsically, idealistically, I love the idea of Academia.edu. It is a lovely idea to use a social networking model for furthering academic discovery and sharing. It builds on the essential freedoms offered by the web—free publication, a broad reach, a curated community—and enacts a model I have no philosophical quibble with, one of openness, generosity, and sharing.

All of these lovely ideals, though, come up against the more worrying reality of the academic world and our careers in the material world. I can’t be alone in feeling reluctant to share my work online, disseminating it among people who might be less than scrupulous about citation and attribution. Furthermore, many academic presses and journals will (understandably) only take on previously unpublished work, and our careers are highly dependent on publication by reputable presses and journals. The counterpoint to these concerns comes from stories like this one, where someone used Academia.edu precisely for its intended purpose: to share research and gain recognition beyond her institution’s own politics and perceived limitations.

These questions only highlight for me the importance of Academia.edu. Like other social media platforms, it doesn’t cause the problems of transitioning into new professional, communicative, and economic modalities, but rather illustrates some of the defining tensions of this transition. I remain reluctant to share my ideas, but this is a consequence of living in the world as it is, where fear of plagiarism and the cutthroat system of peer review and academic publication can stifle creative, original research and a generous, collaborative culture. I hope that Academia.edu is an indication of where things are going, although at the moment professional pragmatism may still trump full engagement in this evolution.

Be Interested?

A few weeks ago, at the SUNY Council on Writing Conference, I heard Richard E. Miller give a fascinating keynote called “Who’s this for?: Audience in the Classroom without Walls.” What I found most exciting about his remarks was his description of an assignment he gave a creative nonfiction class: Be Interested. My understanding of what this means is that Miller asked his students to “produce a research project that others would read willingly.” My first reaction was of the “I want to steal that assignment” variety. But as I thought more about the prompt, I began to wonder if a student would be as excited as I was. Miller mentioned that he had students who grappled with questions like “How do you become interested in anything?” and struggled with finding a way to experience curiosity in a moment when information is “superabundant.”


The more I toyed with this kind of assignment, the more I found myself wondering what I’d actually be asking students to do, what it actually means to genuinely be interested in something, and what that might look like in writing. A cursory glance at the OED shows that the word “interest” is defined using terms like “concern,” “curiosity,” and “sympathy.” But, interestingly, one definition also lists “to share in something.”

The idea of “sharing” seems central to composing, at least to me. But it is often this component–engaging and collaborating with an audience outside of the “teacher”–that I think might be lacking for many students (and here I’m thinking specifically of the freshmen I work with). To return to Miller’s prompt: I suppose the “assignment” is really to be interested and to be interesting. And I also suppose that in an environment where students are perpetually on some kind of rubric quest, this probably feels very, very scary.

But, on the flip side, this kind of opportunity is one that we should hope students encounter more and more. As Gardner Campbell points out:

We might begin with a curriculum that brings students into creative, challenging contact with the history and dreams of the digital age, perhaps in a first-year experience that asks them to reflect critically on their own digital lives as well as begin to shape and share their own digital creations, both intramurally and publicly. Research into the neurobiology of learning, building on decades of educational research, has shown that students learn deeply when they are asked to narrate their learning, curate their creations within the learning environment, and share what they have curated with a wide and, when appropriate, a public audience. As students understand that they are not simply completing an assignment at a professor’s behest, but in fact beginning their life’s work, they will necessarily become more engaged and produce more authentic work reflective of their own growing interests.

This excerpt is from part 4 of Gardner Campbell’s excellent series of posts on “The Road to Digital Citizenship,” this one subtitled, “Fluency, Curriculum, Development.” Campbell connects student investment in their own work with developing a pedagogy that allows for rigorous reflection on what it means to live a digital life. Campbell also makes the important connection between “sharing” and “publicness,” an important link where the truly interesting might occur through the kinds of conversation digital compositions enable.

Asking students to approach this kind of inquiry marks an important shift in the definition of what it means to write an “academic essay.” I wonder if what is actually happening is a return to Montaigne’s sense of the essay as a “series of attempts,” or Francis Bacon’s “dispersed meditations.” By encouraging students to “be interested” and “curate their creations,” the usual chore of the “paper” becomes more of an experiment in invention or “making.”


It is no coincidence that “Composition as Explanation,” Gertrude Stein’s sonic exploration of what it means to “create a composition,” employs the verb “to make” as one of its central repeated words. For example: “This makes the thing we are looking at very different and this makes what those who describe it make of it, it makes a composition, it confuses, it shows, it is, it looks, it likes it as it is, and this makes what is seen as it is seen.” This work is also the first time that Stein refers to her sense of a “continuous present” which was crucial to how she thought of her own process.

Education writer Audrey Watters lists “The Maker Movement” as one of the “Top Ed-Tech Trends of 2012” and describes the importance of this kind of pedagogical approach: “we need more learning by making, through projects and inquiry and hands-on experimentation.” When we actually ask students to physically invent something, to take objects and turn them into something that did not exist ten minutes earlier, this is a very different kind of learning from writing a 3-5 page paper. It marks a return to the kind of “learning by doing” that John Dewey advocated: “Give the pupils something to do, not something to learn; and the doing is of such a nature as to demand thinking; learning naturally results.” In other words, when we are engaged in the act of “making” or “doing,” that is when real learning occurs, and that is also when I think the sensation of “being interested” is rediscovered.

In many ways this post feels like its own experiment in what Stein might describe as “beginning again and again is a natural thing…”–I wanted to think about this idea of “being interested,” which consequently was so interesting to me that only now have I realized its connection to my own recent experiences in the classroom. Meechal recently wrote about one of my latest forays into technology in the classroom, one that I am still processing. When given the chance to use the MaKey MaKey with my two Composition II sections (thanks to Mikhail & BLSCI), I jumped at it, trusting a gut feeling that “making” something physically might teach us something about what happens when we “make” academic essays.

In small groups, the students were given MaKey MaKeys, a number of different materials that conduct electricity, and access to a laptop, and were told to “make” and “invent.” As a teacher, what was interesting to me was watching the groups’ progress–many began by seeming a little confused, admittedly not knowing what to “invent,” and feeling at a loss for ideas (or “interest”). But I also got to watch each group work collaboratively and experientially and ultimately discover the spectrum of things they might do.
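For readers who haven’t handled one: the MaKey MaKey presents itself to the laptop as an ordinary USB keyboard, firing arrow-key, space bar, and mouse-click events whenever a circuit closes through something conductive (a banana, Play-Doh, a classmate). The software half of an “invention” can therefore be as small as a program that listens for key presses. Here is a minimal sketch of the kind of thing a group might pair with it; the bindings and messages are mine, purely illustrative:

```python
# A tiny MaKey MaKey sandbox: the board types ordinary key events,
# so we just listen for them. Swap the print calls for sounds, images,
# or game logic to turn bananas into an instrument.
import pygame

pygame.init()
screen = pygame.display.set_mode((400, 200))  # a focused window is needed to receive key events
pygame.display.set_caption("MaKey MaKey sandbox")

# The board's default outputs: arrow keys, space, and mouse click.
BINDINGS = {
    pygame.K_SPACE: "banana!",
    pygame.K_UP: "up arrow (alligator clip 1)",
    pygame.K_DOWN: "down arrow (alligator clip 2)",
    pygame.K_LEFT: "left arrow",
    pygame.K_RIGHT: "right arrow",
}

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key in BINDINGS:
            print(BINDINGS[event.key])  # the "invention" reacting to a touch

pygame.quit()
```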

And, after the class session, students blogged about what they experienced through “making.” A few sample responses:

  • “If we just looked at the surface of today’s session, we would see that we were just playing around with the Makey Makey and doing things that are totally unrelated to our English class. However, if we think more deeply, we will see many similarities, especially with the process of writing. At first, we need some ideas to invent something amazing with Makey Makey; if not, we will just be playing and there will not be any creation. It is like writing our essays; we need a specific thesis to write a good essay based on the thesis.”
  • “Making something with the Makey Makeys very much resembled the writing process. In class on Monday we were supposed to “outline” our plans and ideas for what we wanted to make today in class. An outline plays an important role in essay writing so that the writer has their thoughts and ideas organized and ready to be written down and explained. Each invention also required several “revisions” and “rewrites” in order for it to reach its “final draft” stage. I know that my group changed plans, inventions, and strategies a few times throughout the class period.”
  • “For a good portion of our time we were bouncing back and forth between these questions and sitting there thinking about what we should do. I felt frustrated at the fact that with all these tools we were just stuck, it was like our creativity was at a standstill. However after revisiting the objectives of using the Makey Makey and playing around with it, things made a turn for the better. With developing a greater understanding and applying that understanding to ideas we had, we were able to center on one idea and go with it…Relating to writing, when you have that moment where you know the message you want to communicate and gather all your information, everything comes together and flows. Centralizing your idea and making attempts towards it can assist in your creativity. Whether it be the next groundbreaking IT program or your final paper, the initial beginning may prove to be the most difficult; but after you overcome that, you will have your masterpiece.”

Perfect Strangers, Alone Together

This past Valentine’s Day, a once-viral video from 2006 re-made the rounds online: Ben Coonley’s Valentine for Perfect Strangers.

I never get tired of Coonley’s video, described as “a romantic e-card from Otto, a feral cat seeking love from a stranger on the Internet. Otto edits himself into clips from the 1980’s sitcom Perfect Strangers and asks strangers on YouTube to return the favor.”

Watching it again this year, I thought about its potential overlap with Sherry Turkle’s Alone Together: Why We Expect More from Technology and Less from Each Other (2012). Ostensibly, Turkle and Coonley are working in the same soil: intimacy and the internet. The website for Turkle’s book features the following language: “Facebook. Twitter. SecondLife. ‘Smart’ phones. Robotic pets. Robotic lovers. Thirty years ago we asked what we would use computers for. Now the question is what don’t we use them for. Now, through technology, we create, navigate, and perform our emotional lives.” “Technology has become the architect of our intimacies,” she goes on to warn.

But as I recalled watching the video for the first time — years ago with the friend who introduced me to it — I thought not of the pathetic ironies of 21st century digitally-mediated longing but of actual relationships: the shared laughter with my friend, and then my subsequent inclusion of the video in a screening program I’d put together in Puerto Rico. In a steamy gallery space with bad acoustics, dozens of young people sat crowded on the floor and watched Coonley’s video and other short works about love and longing. The event wasn’t a particular success, and I don’t have a big thesis, just a tiny observation: that for every grand evaluation of the impact of technology, there is an immediately available example of its very opposite. Every online alienation might contain the shadow of a genuine encounter in another time/space dimension. We should keep tracking both story lines.

Tearing Down the Academic Paywall

There are cracks in the great academic paywall. I’m not talking about academic article torrents, though they do exist (I will not link to them here). I’m thinking of how many humanists are cultivating online personas and attempting to bypass the paywall in a number of ways–by blogging about their research or getting permission from journals to share their articles publicly. Optimistically, this is a sign of contemporary scholars’ dedication to openness and democracy. Pessimistically, it is a sign of the pressure on the humanities to justify their existence to the public. Times are difficult when the President of the MLA appeals to CNN.com readers by insisting that “Having strong skills in another language may give you an edge when applying for a job.”

Academics’ efforts to bypass paywalls intensified following the recent suicide of programmer, Reddit co-founder, and hacktivist Aaron Swartz. JSTOR, the database whose articles Swartz allegedly tried to share freely, actually led the charge to bring down paywalls even before Swartz’s passing. In tribute to Swartz, many academics shared their previously-paywalled scholarship publicly, using the hashtag #PDFtribute (which in turn spawned pdftribute.org).

I support the ideal of open access to academic work, but I think that it is worth considering what it would mean to remove academic paywalls when most journals and databases have paid staff.


In a time when adjunctification is rampant, can we really justify de-monetizing all journals and databases? Journal contributors are unpaid to begin with, so for most academics removing paywalls translates into no monetary loss, only a gain in publicity. Yet, like it or not, academic journals, databases, and supportive software companies all make up an industry with paid staff. I personally work for an open-access journal, The Journal of Interactive Technology and Pedagogy. At this juncture, our staff do not receive stipends or course release time. In an ideal world, the staff of every journal would receive some kind of support from their institution; yet, this is more likely to be possible at colleges with large endowments, meaning that the playing field could potentially be even more uneven with the removal of paywalls. Again, while I am enthusiastic about the possibilities of open-access scholarship, I also have to point out that the system of labor in the academy is already precarious, so any new model should avoid exploitative labor practices.

Liberal education itself is broken, torn between “the life of the mind” and the reality of stifling student debt and increased adjunctification. Fewer students are majoring in English: in 1971, 7.6% of conferred degrees were in English, while in 2006 the figure was 3.7%. From a student’s point of view, at least, it seems as though the life of the mind doesn’t pay off.

Neither paywalls nor college enrollment limits can block the natural flow of ideas, especially today. Ideas are viral: they interbreed and sometimes occur spontaneously in different locations. We can see this even in the natural world when separate species independently evolve the same traits–what is known as convergent evolution. Ideas don’t really belong to anyone. They are a product of the accumulation of a variety of factors–social factors, economic factors, previous concepts/discoveries, etc. This is as true in the humanities as it is in the sciences. We often like to focus on one “genius,” one breakthrough moment, when most discoveries or inventions were many centuries and lifetimes in the making. For instance, Thomas Edison was only able to achieve so much success by outsourcing his work to others–to his “muckers.” In my opinion, in the humanities the “superstars” aren’t always the most original thinkers–often they are simply able to synthesize and express preexisting ideas in novel and exciting ways.


Academics in the humanities like to pretend that their ideas are theirs. However, there is no legal basis for such a belief. Intellectual property law doesn’t protect ideas; it only protects the specific expression of an idea. As the U.S. Copyright office states, “Copyright does not protect ideas, concepts, systems, or methods of doing something. You may express your ideas in writing or drawings and claim copyright in your description, but be aware that copyright will not protect the idea itself as revealed in your written or artistic work.”

Now that many academics have made a public turn and are on Twitter, the dissemination, adoption, and critique of ideas within academic discourse are instantaneous and publicly visible.

In the field of English, it seems as though we are already talking and interacting in public and online spaces above (or through) the paywall. The purpose of an academic paywall isn’t to protect authors’ ideas. Rather, it’s an outgrowth of the academic labor system. In our push to make academic discourse and higher education more open, we also have to consider what the ramifications might be for an academic system of labor that seems to be growing more unequal.

In summary, I suppose that what I’m getting at with this post is that paywalls, tuition, and the intellectual ownership of ideas are unnatural structures, contrary to the natural spread of ideas, that have grown out of higher education–which, as much as we hate to discuss it as such, is an industry. The new openness of scholarly communication serves to highlight this unnaturalness as well as the tensions between values such as “free thought” and “fair labor,” “ownership” and “openness,” or “prestige” and “access.”

If you see something, tweet something

I watched the first two presidential debates at my friends’ apartment. Sasha and Sam have a projector and a screen, so watching was a regal affair, like watching a movie, but way more depressing.

The frustration during and after the first debate was intense. I spent most of it looking out the window onto beautiful Sixth Avenue in Brooklyn. If Obama wanted to look down at his notes for what seemed to be 99% of Mitt Romney’s speaking time, I’d stare down and watch the people walking on the street below, wondering how they felt about shirking their civic duty, and whether it would be bad if I shirked mine next time around and caught a movie instead of the debate.

But I did go back to Sasha and Sam’s for the next debate. And one thing I thought about, as I sat back to enjoy the show, was why I was so drawn to following twitter while watching.

It is common currency to bemoan the fact that most people are swayed to an alarming extent by whatever pundits they happen to watch on TV. You are who you watch. And the amazing thing about watching our twitter feeds while watching the first debate was that we saw how quickly the pundits, those very same people who define the majority’s opinion, were deciding on twitter that Obama was eating Romney’s dirt. It took about two minutes, based on the people I follow, for the national story to coalesce. Obama was publicly shaming himself. What was he scribbling that whole time, anyway?

From Jon Stewart’s “The Daily Show”

Sam, who also happens to have been Obama’s chief blogger in the 2008 campaign, told us that in a speech about social media and elections he gave a few days ago at Miami University (trickily located in Ohio!), he argued that twitter users have a real capacity to sway the election. If the pundits, journalists, academics, and normal-but-witty people who had amassed a twitter audience called the debate for Romney, then Romney would get the headline: Romney won. Since everyone guessed that the second debate would be a closer call, the twittersphere had a real impact. If they uniformly announced that Obama was killing it, then the headline would read: Obama won. And that would sway the polls, cause Romney to falter, backtrack, explain, etc., and give Obama the lead. Call it for Obama two minutes in and save America.

What I actually saw happen on twitter on the night of the second debate was interesting, subtle, and strange. There was continent-sized relief, and almost immediately people were calling it for Obama, but not in a sinister way. They were also calling him on his idiocy (like when he seemed to argue that college students should stop worrying: there are jobs to be had on the production line!). They were calling it like it was. If you see something, tweet something.

One question about the twitter/debate combo is, of course, can we watch, listen, process, think, and tweet, or at least watch the twitter stream all at the same time? And does following your twitter stream enhance the experience?

I don’t know if I’d answer this way about every listening experience (the best of the academic talks I go to require every scrap of concentration I can muster; a concert is best attended sans twitter; I can’t imagine ever wanting to tweet or follow twitter at an event I was expecting to find moving, surprising, or deeply meaningful). But watching a debate, which is in many many many (many!) ways a mindless and depressing activity is, I would argue, made manageable, and even fun, by twitter.

I laughed a lot.*

Found buckets of good sense:

I saw my main man Whitman referenced:


Of course, scrolling down twitter is often an onanistic exercise. Doing it, we affirm what we already know or think. We see our funniest, wittiest selves reflected (you, too, can contain multitudes of jokes, memes, witticisms!). And when the next morning’s news comes out, we feel like we had the inside scoop. Of course “binders full of women” is getting hours of news time. We saw it get miles of tweets within seconds of it leaving Romney’s mouth!

It can, though, push us to hear what other people have to say. It depends, to some extent, on whom you populate your twitter feed with, but even if you’re following mostly like-minded people, there’s always someone who knows something you don’t know, thinks it’s going differently than you think it’s going, or thinks the twitter posts that you find Jon Stewart-worthy are inane. Twitter allows you to settle into yourself comfortably, but it can also startle you out of yourself.

Twitter is, as Doug Henwood suggested on twitter, a hyper-productive cliché production line (with many jobs available for aspiring college grads!).

It has the potential to be an election decider. It’s a sideshow one turns to when the main event promises to be a depressing debacle, no matter how well your horse is doing. It’s a condiment we have come to find necessary to the consumption of a political spectacle.

See you for round three on Monday!

* I promise these tweets were all during the debate (except the Doug Henwood tweet which was the next day). The hours on the side are misleading since I collected them all at varying times the next day.

Prolegomena to failure

We used to read liner notes like they were Bible verses.  I am prone to lamenting that texts like these are gone in the virtual space from which we fish for mp3s these days.  Long before we could wikipedia our favorite bands to find out what their deal was, we appealed to what was available to us. Yes, long before my life was ruled by incessant urls, I relied on the majesty of toner to know what culture was.

Most of what we knew we learned from each other.  Mainly it was stupid.  We argued over the correct pronunciation of Ian MacKaye’s last name; we informed each other that Op Ivy was essentially reforming under a new name and scrambled to buy tickets to their first show; there was a new split 7” coming out on colored vinyl of such and such band; Greg Graffin was actually a college professor.


What we didn’t spend on cigarettes—the greatest joy of our evenings spent loitering endlessly in parking lots outside of a diner that was central to all of us, though we attended three different high schools—was spent at record shops or mail-ordering away for vinyl to the far reaches of Olympia or the sprawling East Bay.  When the records finally arrived in the mail, we would carefully slide each one out and try to discern what was etched on the inner rim, were we lucky enough to receive such a secret message.  Placing the record on the turntable, we would turn our attention to those elaborate liner notes. We couldn’t post this stuff anywhere, you know, with no facebook pages or twitter feeds, so we photocopied those hand-lettered lyric sheets and witnessed by pasting them to telephone poles and street signs, to the front of newspaper stands and the backs of bus benches.  Chalk it up to teen angst, but we were the faithful.

Yeah, some of it was stupid, and sure, we reveled in those short soundtracks of our constantly breaking hearts, singing along: “I believe in desperate acts, the kind that made you look stupid, look like a fool,” and using it as a directive.  I don’t believe in desperate acts anymore but I still love this album.

And some of it seems smarter than I would even want to give my 15-year-old self credit for.  We listened pretty closely for that quick 1:27 on Bad Religion’s Suffer when we heard:

Tell me can the hateful chain be broken?
Production and consumption define our hollow lives.
Avarice has led us ‘cross the ocean,
Toward a land that’s better, much more bountiful and wide.
When will mankind finally come to realize
His surfeit has become his demise?
How much is enough to kill yourself?

We listened to Fifteen and agreed:

I’ve been having a hard time trying to justify
The clouds arising from the cars we drive
And a little too easy seems just a little too hard today
And I’m afraid my children are going to have to watch the world waste away
Been having a hard time trying to accept the fact
That paying money for four walls leaves the slavery intact
And a little too easy seems just a little too hard today
And I’m afraid my children are going to have to watch the world slip away
I know, I know, I know, life has become slavery
Costs two dollars a minute and additional charges to pray to god today
See I’ve been looking for some guidance but the voice on the phone ain’t got a damn thing to say
And a little too easy seems just a little too hard today
And I’m afraid my children are going to have to watch the world fade away
I was born a little too late to see the dream that they called America
See I only want to be a Free man but it’s against the law to sleep on the ground in Gods land
And a little too easy seems just a little bit insane
And I’m afraid I’m going to have to run for my life one of these days
I know, I know, I know, life has become slavery

(the cursor follows me now and asks if I want to post this to facebook.)

And to Screeching Weasel:

We don’t believe in god or jesus christ anymore
We don’t need colleges to validate our lives anymore
We don’t need twelve steps to show us how weak we’ve become anymore
We don’t need to buy into a system that offers empty promises anymore
We don’t need protection against anything anybody might say
We know that government can’t improve our lives anyway
We don’t need to drug ourselves anymore to keep the boredom away
We don’t need anything except relying on ourselves for a change
I can see a new tomorrow
Now

What we felt was a failure all around us, one that we did not want to inherit.

But we have. There was no revolution.  Little has changed and I am surely more complicit than I would like to admit.

Like Saul became Paul, the biggest sinners become the most zealous believers.  But does it also work the other way around?  I marched for Occupy Wall Street but I never once slept at Zuccotti Park.  I worried that it felt fascist to be one of the echoers of the People’s microphone, automaton-like, repeating things I had not thought about before they came from my own mouth. I am as skeptical as they come.  How many of these actions do we perform just to make ourselves feel better about everything that is wrong with the world?  I agree with Žižek that we contribute to the Children’s Fund to forget about hunger: we buy organic produce and think we have done a small part to save Mother Earth.  But radical change, the necessary changes, are frightening.  We congratulate the courage of Pussy Riot for saying f you to the state and Putin, but would everyone still like them if they found out that their leftist politics also include demonstrating by having public orgies while pregnant, as Nadezhda Tolokonnikova did in a Moscow museum in 2008?  I doubt it.


The Last 100 Miles

Or, Back: meet Wall.

David and I have recently discussed our respective strategies for hailing those winsome and fickle scholarly muses. You can explore this topic more by checking out this episode of Radio Lab in which Oliver Sacks and Elizabeth Gilbert discuss wooing their muses and their harrowing escapes from creative blocks.

Wooing the creative spirit, though, is not always enough to catch your writing fire. For me, the triumph of having completed the first full draft of my dissertation very quickly transformed into clammy dread. After finally finding ways to sneak my writing past my many resistances, I discovered that new – more forceful – strategies were needed to conquer my fear of the messy and sometimes intellectually violent revision and editing process.

So I have looked for help, and found some in yet another episode of Radio Lab: this one detailing the ingenious ways people have resolved their most entrenched conflicts of desire. Both episodes inspired me to explore more “bullying” than “coaxing” tactics to force myself into corners that I could only write myself out of.

I have tried two techniques thus far (nowhere near as high-stakes as those profiled on Radio Lab): One involved hitting a daily word count to “unlock” my access to the internet. I could only access my email, the web, and social networking by first writing either 1,000 unedited words or 500 words of edited, revised sections. This worked splendidly — for a few days. I had actually never been so motivated to attack my writing. However, work emails started to fall through the cracks, or went unseen for hours, because some days I didn’t have the chance to sit down and fill my word quota until later in the day. And then I nearly missed an appointment with students who were emailing me at the last minute.
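(For the tinkerers: a rough sketch of the kind of gate I mean, not the actual tool I used. It assumes the simpler of my two quotas and a hosts-file block on distracting sites; the draft path, quota, and site list are placeholders, and rewriting /etc/hosts requires admin rights.)

```python
# A minimal word-count gate: block distracting sites in /etc/hosts
# until today's draft crosses the quota, then lift the block.
from pathlib import Path

DRAFT = Path.home() / "dissertation" / "draft.md"  # hypothetical draft location
HOSTS = Path("/etc/hosts")                         # editing this needs admin rights
QUOTA = 1000                                       # unedited words per day
TAG = "# writing-gate"
BLOCK = f"127.0.0.1 facebook.com twitter.com {TAG}\n"

words = len(DRAFT.read_text().split())  # naive count: the whole file so far

lines = HOSTS.read_text().splitlines(keepends=True)
if words >= QUOTA:
    # Quota met: drop the tagged line so the sites resolve normally again.
    HOSTS.write_text("".join(line for line in lines if TAG not in line))
    print(f"{words} words: internet unlocked.")
else:
    # Quota not met: make sure the block is in place.
    if not any(TAG in line for line in lines):
        HOSTS.write_text("".join(lines) + BLOCK)
    print(f"{words}/{QUOTA} words: keep writing.")
```

Moving on: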

I next put my kitchen timer to work – one that audibly ticks. I set it for 50-minute writing increments. I could do nothing else but write.  The timer doesn’t raise the stakes so much as raise awareness of time passing, of an impending end point – of deadlines, but also of the end of the discomfiting writing process. Whenever I felt like stopping or doing something else, the ticking would remind me that time was passing, reinforcing a sense of urgency and pace, and I would have no choice but to return to my cursor (or pen).

But there is also a test-taking quality to this practice. It raises anxiety but also forces a necessary letting go of fear about the adequacy of preparation, a focus on simply doing the best you can with what you have in the moment. And the snap of the final ding never failed to shock me with the satisfaction of having written. How have you bullied your writing?

The Aftermath of Kony 2012, or How the Internet Rejected a Simple Message

The internet has bred a culture of skepticism that pushes back against all of the sourceless bits of information that get distributed online. And then, even if a source is cited, internet users will question that source’s motives. Sometimes this questioning is extreme. Recently an enormous study came out indicating that those who eat red or processed meat on a daily basis have higher rates of cancer and mortality. To me, this was a no-brainer, a confirmation of other studies on red meat consumption. I read about the study first on MSNBC, where I was shocked by how many internet users scoffed at the findings in comments like the following:

I’m sorry but the tone of this article just REEKS of agenda, specifically the veggie/peta agenda. The message is get them used to rejecting RED meat and then go after them and GET them to quit eating all meat. We’ll use the health excuse since the moral outrage hasn’t/isn’t working.

So it’s all a vegetarian conspiracy. Nice work! Pass the tofurkey. That was an extreme example, but so many of the comments basically derided the study without much concrete knowledge of how it was conducted. While I find skepticism refreshing, I also think that this type of skepticism is just another form of ignorance. It rejects scientific findings without really engaging with the scientific process. Of course, MSNBC doesn’t have the space to inform its readers about the exact process of the study, and not all of its readers are versed in research methods–these limitations almost render the article useless, just there to spoonfeed information. So is internet skepticism healthy or dangerous? Just like different sources of information, skepticism can run the gamut.

Here, by contrast, is what I consider an example of healthy skepticism. A woman whose family is from Uganda is suspicious of the Kony 2012 video:

She brings up some good points: Kony isn’t a current concern to Ugandans, only 30% of Invisible Children’s donations go to Uganda, there are many other pressing issues in the world, and there are concerns about increased militarization.

But then, if we dismiss Invisible Children, aren’t we partly just giving in to cold cynicism, like the hipster barista meme?

I guess my feelings about Kony 2012 are ambivalent. For one thing, like the woman above, I question why so much money needs to go into making videos, posters, and stickers–is it really okay that the main goal of a charity is to produce flashy media? Doesn’t it become a gimmick? Also, the exact intentions of the video are suspect–it seems to move rather uncomfortably around the fact that it is advocating violence.

Kony 2012 is a bit of a paradox because the same qualities that encouraged its viral spread also created skepticism and a number of backlashes. VICE, which is in the vanguard of independent gonzo journalism, threw one of the earliest punches. The Ugandan PM might have just issued a final blow.

I’m convinced by this that the internet isn’t just a soup of information and misinformation–it can also serve as a kind of information vetting system. Instead of testing students on the reliability of different online sources, it might make sense to approach the internet as a place where knowledge is continually constructed and deconstructed. Often we hear about the bad side of the internet–that Google is making us stupid, short-circuiting our thinking process. However, I think the whole Kony phenomenon should make us a bit optimistic. Maybe the internet isn’t just a place where things go viral; it can also be a place where simple messages are complicated, where difficult and complex views are weighed against easy answers.

Generation Y

Last week, I paid a visit to my doctor for a routine physical exam. Since I have gone to this doctor for years, he asked me about my research, hobbies, and other such things. When I somehow brought up the topic of teaching undergrads, the doctor looked at me with complete and utter revulsion. “These kids today,” he said quite angrily, “are nothing like they used to be when I was their age. All they care about is Facebook.” Not in the mood for a confrontation, I hesitantly nodded and quickly changed the topic while he went about taking my blood pressure.

After I left the doctor’s office, I began thinking. Was this generation of individuals, these so-called Generation Y young adults, really that bad? Curious, I decided to google the issue to see what other, more official (i.e., more research-based) sources had to say about it. Quickly, I noticed a slew of news articles in some of the most respected journals and magazines, the overwhelming majority of which cast a very negative shadow on any hope for Generation Y. As one article on USAToday.com put it, these “pampered, nurtured, and programmed” individuals who have a speak-your-mind philosophy often stand in contrast to older generations, especially when it comes to the workplace.

Ironically, around the same time, I came across a newspaper article in AMNY which also proclaimed that college graduates these days are just not as ready for the workplace as they used to be. These individuals are failing to impress their bosses and lack the skills needed to succeed in fields like business. Even worse, as one article on the NY Post’s website claimed, even if these Gen Y-ers are doing a horrible job, they still think they are doing great.

According to these (and many other) articles, much of the problem with these individuals today is rooted in their childhood, in the rather privileged, entitled ways they were raised in our society, and exacerbated by the failure of educators to properly prepare them for the real world. As an educator of lots of Generation Y students, I began thinking about what the real problem was. Maybe it wasn’t what we were teaching in the classroom, but how. I began analyzing my own teaching methods, which include things like group projects to teach collaboration skills, debates to hear both sides of an issue, and individualized presentations to foster critical thinking. But is this enough to not only prepare students for the workplace, but also make them better, more mindful individuals in society? I invite discussion to hear what other educators think about these Generation Y-ers, and how they can best be “taught” in the classroom. Lastly, I leave you with one rather positive article I found, commending Gen Y for their innovativeness, something others can most definitely learn from. So are we in fact moving them in a better direction? In the end, maybe my doctor was right that this generation is unlike its predecessors. But maybe it’s for the better. And besides, last time I checked, social media skills were at the top of anyone’s list.