The Irony of Healthy Food

For the past few posts, I’ve been writing about the importance of healthy lifestyles, and particularly healthy eating. Importantly, however, the definition of healthy isn’t always clear, and this confusion often leads to negative effects. Nevertheless, with better access to media and communication sources, the public is becoming more and more educated, and (presumably) better able to make choices about food.

In addition, many efforts have been made by governments and other organizations to push the healthy-eating agenda. For example, across various cities and states in the U.S., taxes on items like soda have been proposed in large part to curb health issues like childhood obesity. In New York City, a regulation that took effect in 2008 limits restaurants to trans fat-free ingredients and fines those who do not comply. And while such tax proposals on “bad” foods have become more and more commonplace in recent years, some legislators have even gone as far as suggesting a “Fat Tax,” or in other words, a tax for simply being obese.

Beyond the political side of things, companies themselves appear to be taking the initiative to provide consumers with healthier food options. For example, fast food chains like Subway position themselves almost entirely as a healthier option than the typical burger joint. Even restaurants like Dunkin’ Donuts, the epitome of unhealthy eating, have joined the fight and changed their recipes toward healthier alternatives. Moreover, a huge trend in the past few years has been the addition of altogether healthy menu options. One of the pioneering retailers to do this was McDonald’s, whose salads, fruit side dishes, and even whole wheat bun options have led others to follow suit.

While the addition of healthier options (theoretically) represents the good intentions of fast food restaurants, there is uncertainty about whether these efforts are truly benefiting society. In fact, recent work in the academic literature suggests the opposite. In one recent paper by Wilcox, Vallen, Block, and Fitzsimons (2009), the researchers examined consumers’ choices when presented with either menus that contained both unhealthy and healthy options (e.g., french fries, salad) or menus that included only unhealthy options (e.g., french fries, cheeseburgers). Quite surprisingly, the findings suggest that when the menu included healthy options, consumers were more likely to choose the most unhealthy option than when the menu included only unhealthy options. Further examination of this effect provided support for a vicarious goal fulfillment explanation: when consumers saw the healthy options on the menu, they felt as though they had vicariously fulfilled their health goals, and thus felt licensed to indulge by choosing the most fattening, unhealthy option instead.

In a similar vein, other research has found that consumers often perceive all food items at so-called healthier restaurants as having fewer calories than those found at restaurants that are not primarily positioned as healthy. In work by Chandon and Wansink (2007), consumers were found to underestimate the total number of calories in foods from restaurants positioned as healthy. Additionally, this health halo generalized to the side dishes they chose: the healthier the restaurant was perceived to be (due to the availability of healthier menu options and a little bit of marketing), the unhealthier the side dishes consumers chose.

Thus, I leave you to ponder what the best plan of action is. If the push toward healthy eating by the government and companies is backfiring, what is left to be done? Can educating consumers, and thus making them aware of the effects highlighted above, help? Whatever the answer may be, it is crucial that we figure it out. With 1 in every 3 Americans now technically considered obese, figuring it out now may greatly affect our society for generations to come.


Meatless Minds

In my last post, I wrote about the poor dissemination of information related to healthy eating. The discussion following the post brought up several issues related to this topic, including what exactly constitutes a healthy diet and lifestyle. There is undoubtedly very little agreement about what the “best” thing to do is. Every day, studies come out providing new evidence for what to eat, what to avoid, how much to exercise, and how to live life. Nevertheless, in my personal view, the push toward an animal-free diet seems to be becoming more and more powerful. Not only are such diets linked with short-term benefits (e.g., weight loss and increased energy), but studies have shown that eating a plant-based diet can lower the risk of serious, often deadly diseases. Even people like Bill Clinton seem to be following this type of regimen.

A few days after my last post, I came across news of a study from the Archives of Internal Medicine proposing that red meat, even in small quantities, can increase the risk of an early death. While there had been plenty of talk about meat consumption and health issues prior to this article, it seemed as though no research had ever posited such a strong link. In fact, according to the article, even a single serving of processed red meat (e.g., two strips of bacon) is enough to considerably raise one’s risk of dying from heart disease or cancer.

Given the potentially large implications of this research, I posted the article to my Facebook profile, mainly to get the attention of my meat-eating friends. Almost immediately, I received tremendous backlash and defensive comments. To my meat-eating friends, I was simply pushing the vegetarian agenda (despite the fact that I am actually not a vegetarian; well, not yet at least). They provided numerous reasons why red meat was in fact beneficial. As with much of the Internet reaction to this article, people became defensive on a level I never expected.

Naturally, in the days following my contentious post, I honestly thought I’d get emails with pictures of my friends eating a quadruple hamburger (or whatever it is; okay, so I’m not a vegetarian, but I haven’t eaten red meat in 20 years, so pardon my lack of proper terminology here), or links to counter-research proving the article’s findings faulty or erroneous. Interestingly, however, I received neither. Even more interesting and surprising were the emails that I did receive. Several of my most vocal Facebook commenter friends actually emailed me to say that they had eaten an alternative, red meat-free meal since seeing the article.

As a researcher of health risk perception, I became fascinated by this turn of events. Why were people changing their behavior, even after acting like staunch advocates for not changing it? Furthermore, how long would this article affect people’s behavior with regard to red meat consumption? Whatever the answers may be, something seemed to have worked. In today’s age of constant bombardment by messages in the media, how did my post get through to some people? Perhaps it was the little scare people felt when reading it, as they made a connection they had never thought of before. Maybe the NYC Department of Health and Mental Hygiene is doing something right with its graphic and vivid antismoking ads (after all, smoking rates are down to record lows). It may be interesting to wait and see what happens with soda drinking rates, too.


Food for Thought

Several months ago, I watched Forks Over Knives, a movie about the possibility of reversing some of even the most serious diseases afflicting people in the United States. The main premise of the film, like a lot of new research coming out these days, is that animal-based products can lead to a variety of problems, including heart disease, diabetes, and even cancer. Like most people who have watched the movie, I was astounded by these claims. While I knew that the typical, greasy cuisine found at most fast food joints was bad for your health, I never imagined that even the products I thought were okay (e.g., milk and fish) were considered detrimental. While the film made numerous statements about the dangers of consuming these foods, what was scarier were the mounds of research cited to back these claims. When I finished watching the film, it was not the legitimacy of these statements that really shocked me, but rather how little attention this cause was receiving.

After thinking about this film some more, I began to realize that there was no mystery as to why this type of extreme healthy eating has gone unnoticed in our society. Having grown up in an Americanized European family, I quickly remembered how my parents used to take me to McDonald’s, Pizza Hut, and KFC (formerly known as “Kentucky Fried Chicken”) for the convenience: typical 1980s-style dining. No one was coming out with grandiose health claims or posting caloric information in these restaurants, so my parents (like many people at the time) had no idea that these foods could potentially be so bad for my health. With fierce marketing efforts targeted directly at children, and a lack of knowledge about the nutritional value of fast food, it was no wonder the industry grew as large as it did, and healthier food was ignored.

Nevertheless, although a general lack of knowledge about healthy eating could have been to blame in the 80s, it seems like an unlikely explanation today, with all the health information available. So why isn’t word of better, healthier diets more prevalent? While I thought about this question, I recalled a marketing class I took in college. In that class, we learned how delicate a topic health is in this country, and about the enormous financial incentives tied to the pharmaceutical industry. Importantly, we learned about the marketing of disease, and how the pharmaceutical industry uses a variety of techniques to push medicines and diagnoses onto people through doctors and commercials. Was Big Pharma to blame for the lack of communication regarding healthy eating? After all, eating healthy and following the particular non-animal-based diet cited in Forks Over Knives is supposed to reverse many common diseases, so much so that people who follow this diet are able to get off their medications for good.

In thinking about this issue further, I also began to consider the problem of changing people’s minds with regard to eating. Given that people are so used to specific foods and lifestyles, trying to change their behavior is an extremely difficult task. As a researcher of health marketing myself, I know that mounds of effort have gone into attempts to alter maladaptive diets and habits, yet they have not proven as effective as they could be. Every day, there is some new article with staggering statistics on obesity, and how the problem is starting earlier and earlier in childhood. Yet despite all the news and publicity, these diseases persist.

So how can we, as a society, better communicate the vast amounts of information about healthy eating, and overcome the barriers in our way? My idea is that, like most important values and lessons in life, the solution can start in the classroom. Even though a strict, animal-free diet may be the gold standard of healthy eating, any move toward healthier eating might help. While some may argue that educating young consumers is not enough, I remind you of the power that children can have over adults’ and families’ decisions. Perhaps the solution does in fact start within the elementary school classroom.

Generation Y

Last week, I paid a visit to my doctor for a routine physical exam. Having seen me for years, he asked about my research, hobbies, and other such things. When I somehow brought up the topic of teaching undergrads, the doctor looked at me with complete and utter revulsion. “These kids today,” he said quite angrily, “are nothing like they used to be when I was their age. All they care about is Facebook.” Not in the mood for a confrontation, I hesitantly nodded and quickly changed the topic while he went about taking my blood pressure.

After I left the doctor’s office, I began thinking. Was this generation of individuals, these so-called Generation Y young adults, really that bad? Curious, I decided to google the issue to see what other, more official (i.e., more research-based) sources had to say about it. I quickly noticed a slew of news articles in some of the most respected journals and magazines, the overwhelming majority of which cast a very negative shadow on any hope for Generation Y. As one article on USAToday.com put it, these “pampered, nurtured, and programmed” individuals, with their speak-your-mind philosophy, often stand in contrast to older generations, especially when it comes to the workplace.

Ironically, around the same time, I came across a newspaper article in AMNY which also proclaimed that college graduates these days are just not as ready for the workplace as they used to be. These individuals are failing to impress their bosses, and they lack the skills needed to succeed in fields like business. Even worse, as one article on the NY Post’s website claimed, even when these Gen Y-ers are doing a horrible job, they still think they are doing great.

According to these (and many other) articles, much of the problem with these individuals today is rooted in their childhood, in the rather privileged, entitled ways they were raised in our society, and exacerbated by the failure of educators to properly prepare them for the real world. As an educator of many Generation Y students, I began thinking about what the real problem was. Maybe it wasn’t what we were teaching in the classroom, but how. I began analyzing my own teaching methods, which include things like group projects to teach collaboration skills, debates to hear both sides of an issue, and individualized presentations to build critical thinking. But is this enough, not only to prepare students for the workplace, but also to make them better, more mindful individuals in society? I invite discussion to hear what other educators think about these Generation Y-ers, and how they can best be “taught” in the classroom. Lastly, I leave you with one rather positive article I found, commending Gen Y for their innovativeness, something others can most definitely learn from. So are we in fact moving them in a better direction? In the end, maybe my doctor was right that this generation is unlike its predecessors. But maybe that’s for the better. And besides, last time I checked, social media skills were at the top of everyone’s list.


The Mixed Blessing of Bad Publicity

Earlier this month, celebrity Alec Baldwin made headlines when he was taken off an American Airlines flight due to his refusal to turn off his iPad because he was in the middle of a “Words With Friends” game. Perhaps what was even more shocking than Baldwin’s relatively petty reason for not complying with the airline’s rules was the astounding amount of publicity the story received in the days and weeks following the incident. In fact, Zynga, the company behind the Words With Friends application on Baldwin’s iPad, was reported to have gotten a boost from all the publicity surrounding the event.

As a consumer behavior researcher, I’ve often heard the saying “any press is good press.” While my good conscience doubted this notion at first, I quickly became a believer. Just turn on MTV these days and you will see what I mean (if you haven’t already, that is). People seem fascinated with shows that are filled with a smorgasbord of bad publicity, including (but not limited to) Jersey Shore, The Real World, and Celebrity Rehab. Nowadays, bad publicity even appears to be becoming a type of business strategy for companies, as instances of scandals leading to increases in sales are becoming the norm.

On a psychological level, researchers argue that the attention-grabbing power of bad publicity is so successful because it is exactly that: attention-grabbing. When some bad publicity incident, be it getting kicked off a plane or being unfaithful like Tiger Woods, is shown over and over again, the story (as well as its main characters) tends to stick with people. Thus, more exposure means more salience, and more salience can mean more audience interest. Combine that with the fact that individuals have a cognitive bias toward paying more attention to negative information than to positive, and you have quite the recipe.

Yet with the increasing value and popularity being placed on bad publicity, are we sending younger, more impressionable individuals in our society the message that doing something outrageously bad is a positive thing? After all, these are the individuals for whom a successful online presence is a priority, and who thus might think of any attention as good attention. Furthermore, how is the trend of bad publicity changing the very values adults attempt to instill in these individuals at an early age?

As an instructor of marketing courses, I often wonder how to solve the dilemma of trying to instill a sense of ethics and dignity in my students in the face of a culture that comes close to valuing bad publicity. Given how prevalent it has become, I find bad publicity a hard topic to ignore in the classroom. While my original stance on the matter was pure disapproval, I cannot help but think that my students’ perceptions are quite different. Nevertheless, I feel that it is an important issue to discuss to some extent, both in business courses and beyond. After all, these are the future leaders of the world we are educating here.

Nonverbal Communication

In 1957, James Vicary proclaimed that a movie theater in Fort Lee, NJ was broadcasting subliminal messages to viewers. More specifically, he claimed that ads for Coca-Cola and popcorn flashing for 0.03 seconds had led to an increase in sales of those items in the weeks following. As a result, broadcasters and regulators subsequently moved to ban anything that came remotely close to subliminal advertising. However, when challenged to replicate the results of this study, Vicary failed to do so, and the study has been deemed a hoax for decades.


Although the real results of Vicary’s study remained inconclusive, more recent work has suggested that things of which we are not fully aware can indeed influence our behavior. For example, a series of studies on “nonconscious influences” has suggested that stimuli that are too fast or otherwise too weak for our sensory organs to consciously perceive may nevertheless have a powerful effect on our thoughts and behavior. In one study in particular, researchers exposed participants to either an Apple logo or an IBM logo by flashing it in front of them on a screen for 2 milliseconds, below the threshold of conscious perception. Later, when asked to come up with uses for a brick (as a creativity assessment), the researchers found that participants who had been primed with the Apple logo were much more creative than those primed with the IBM logo. They reasoned that this happened because of the association between the Apple brand and creativity.

In addition to this study, there have been many other instances in which individuals’ behavior was shaped by stimuli with which they were nonconsciously primed (rather than detailing each of these studies here, I’ll note that googling “nonconscious influences” will lead you to many of them). While the implications of all these findings are endless, I believe it is important to consider the consequences that nonconscious influences can have on our (and especially our students’) behavior. In a previous post, I noted how the average American is exposed to roughly 5,000 advertisements in a single day.

If the research findings in the nonconscious influence area have any merit, it’s easy to imagine the potential effects this can have. Although we try to teach our students well, we are also competing with 5,000 other stimuli they are exposed to, a majority of which they are not even aware they are perceiving. Perhaps it is not our students’ fault when we get writing assignments that we deem “too dry” and uncreative. They may have been written on an IBM computer.

Although nonconscious influence may be a hugely complex phenomenon, I have often asked myself whether there is something I can learn from all this research and use to help my students in their academic endeavors. Ideally, I would love to have pictures of the Apple logo in every classroom I teach, but that doesn’t seem reasonable or feasible, or even ethically sound. Additionally, if we educate students about the possibility of nonconscious influences on their behavior, is it even remotely likely that anything would change? And if so, what do we tell them, short of cutting themselves off from all media? Thus, I invite others to provide their thoughts on this issue.

Objectification in the Classroom?

There is little doubt that the media has a profound influence on its audience. In fact, some experts say that the average American views over 5,000 advertisements in a single day.

With the advent of new technologies, that number is only expected to grow. Further, in American culture and society, the power of advertising to persuade, manipulate, and shape behavior has been undeniable. Despite its primary objective of selling products, advertising has long been criticized for having deeper and more complex effects on people’s attitudes and behaviors.

While there has been much research on the effects of the media on individuals’ behavior, one of the most prominent areas has been the objectification it fosters, specifically with regard to women. Many researchers have attempted to understand this phenomenon, and have come up with empirically validated theoretical accounts and explanations. One such account, objectification theory, posits that in Western society the female body is regarded as a sexual object that is to be looked at and evaluated (Fredrickson & Roberts, 1997). According to this theory, the female body is “treated as a body (or collection of body parts) valued predominantly for its use (or consumption) by others” (p. 174). As a result of this process, females come to internalize this “observer” position of themselves, and therefore view their bodies as objects for visual inspection and evaluation. The term self-objectification refers to the adoption of this observer view of the self, and includes constant monitoring and evaluation of how one’s body appears to others (http://en.wikipedia.org/wiki/Sexual_objectification).

Although levels of objectification can differ among women, it has been argued that objectification generalizes to all women due to the female gender role socialization found in Western society. Although research has documented many long-term effects of objectification (e.g., anxiety, depression), more recent findings suggest that objectification can lead to short-term effects (e.g., body consciousness, cognitive disruptions) as well.

Although objectification is a society-wide problem, there is little doubt that it has made its way into the classroom. I first experienced this firsthand a number of years ago, when I noticed that in all of my classes, I barely had any female students participating in class discussions. Not thinking much of it at the time, I simply encouraged more participation (from everyone) in general. Little did I know then that what I was facing was much deeper than I could have imagined. One day, a student came up to me after class to tell me how much she loved my class, but that she was afraid of speaking because of her fear of “looking stupid” in front of other students. So naturally I did what any educator would do, and over the course of a few weeks, tried to figure out just how much of a problem this was for other (female) students. I asked several students what they thought about class, and particularly about class discussions, presentations, and other assignments that involved some performance in front of other students. Quite surprisingly, I found what my female student had hinted at: while males had no issues speaking in class, it was the females who had considerable reservations, mainly due to their concern about how they would appear to others in class. As one female student put it, “I am always worried of what other people will think of me.”

To be honest, this was something that had never crossed my mind before. Here I was teaching topics in marketing, yet one of the most obvious effects of the subject matter was right in front of my eyes. Had the media had such an effect on my female students that it stifled even something as basic as their participation in class? Sure, there were some exceptions, as I had some female students who were clearly outspoken and (at least in my opinion) had no fear or anxiety about speaking up in class or “looking stupid.” Unfortunately, however, such female students weren’t the norm. And while I have tried to eliminate this problem as best I can (by encouraging participation from everyone, making it a point that I value everyone’s opinion, talking about the topic of objectification, and even showing the video (seen above) in class), I continue to encounter this problem semester after semester.

Although the issue of objectification and its effects on females is something that will be hard to change given that it is a systemic problem, I urge you (as instructors) to at the very least recognize it. As Jean Kilbourne mentions in her film, the first step in addressing the problem is awareness. Bringing it to light in the classroom, especially by students’ college years, might bring us one step closer to finding a solution.


LOL K TTYL: Our Undying Need for “Keeping in Touch”

We all know how dangerous talking on a cell phone while driving can be. In fact, the statistics are quite staggering: a recent CBS News story reports that accidents from using cell phones while driving increased from approximately 636,000 in 2003 to 1.6 million in 2008. Some sources have even equated the act to driving with a blood alcohol level of .08, or the equivalent of 3 margaritas in a single hour. While not every state has a law banning or penalizing drivers who talk on their phones, most (if not all) are pushing toward such legislation.

Aside from talking on cell phones, new laws now prohibit texting while driving. Agencies like the FCC are at the forefront of the effort to prevent drivers from texting, noting that over 5,800 deaths, or roughly 16 percent of all fatal crashes, have been attributed to distracted driving (e.g., texting while driving). In an attempt to get individuals not to text and drive, local governments and other groups have been broadcasting a variety of Public Service Announcements (PSAs) like the following:

(Note: This video was one of the very few I could find that did not include any graphic/disturbing content. Most others were, well…think traditional drunk driving videos with the guilt component, since we’ve all probably texted while driving at some point.)

Sure, we know about the dangers of doing anything that distracts us while driving. But what about our constant interaction with our phones while walking? Yes, we may be on the street, not controlling a car, but can this behavior really be dangerous? Well, you decide:

Apparently, texting while walking has gotten so bad in the opinion of some, like New York State Senator Carl Kruger, that laws attempting to ban the behavior have been proposed. Yet Senator Kruger isn’t alone in his efforts. In fact, as of July 2011, texting while walking is illegal in the great city of Philadelphia.

http://gawker.com/5822431/texting-while-walking-is-now-a-crime-in-philadelphia

Although discussions of texting or generally interacting with a phone while driving, walking, eating, or whatever have centered on public safety, the other interesting question I always think about is why we do all this to begin with. With the advent of new technology and cool, savvy devices that fall just short of doing our laundry (well, maybe this will happen sometime soon, at least I hope), how can we be expected to put down our cell phones? I mean, there’s a call log, text messages, emails, eBay and Groupon alerts, and yes, even Facebook and Twitter. God forbid we go even an hour without checking our phones.

Similarly, when talking about this topic and how society has to resort to laws to get us away from our phones (at least in the presence of law enforcement), it makes you wonder what the other consequences of this behavior are. There is a plethora of research suggesting that the use of technology, and specifically self-service technologies, can lead to anxiety, depression, and other emotional disturbances. For more on this topic, watch PBS’s Frontline program entitled “Digital Nation.”

http://www.pbs.org/wgbh/pages/frontline/digitalnation/etc/synopsis.html

Perhaps the million dollar question, then, is how do we escape the chains of technology before it becomes a true problem? Maybe the politicians are correct in their efforts to curb our use of technology, even just for a few minutes while we are in public. As an instructor of several classes, I see it all the time with my students. Should I start banning technology altogether in my classes?

In closing, I encourage you to think about some of the laws prohibiting the use of technology while driving or walking that have been gaining publicity. Yes, they can be seen as attacks on our basic freedom, but in my opinion they might not be such a horrible thing either. They may be a small step toward getting us away from our technology, even if it is for just a few minutes. And that may not be a bad thing.


Conformity in the Classroom

This past summer marked the 50th anniversary of Stanley Milgram’s famous obedience experiment conducted at Yale.

http://www.youtube.com/watch?v=GHuI2JIPylk

In what is considered one of the most notable experiments in social psychology, and perhaps in the research world in general, Milgram set out to examine why people obey authority, even when doing so contradicts some of their fundamental morals and conscience. In this research, an innocent participant was given the role of a “teacher” who had to punish a confederate “student” with an alleged electric shock of increasing intensity every time the student made an error on a memory task. The teachers were constantly prodded by the experimenter to continue, despite their blatant resistance and genuine concern whenever the student received a shock. Milgram’s question: how far would people follow the commands of the authority, in this case the experimenter, even when it meant “harming” another human being?

Although the methodology used was ethically questionable by today’s standards, Milgram’s conclusions were a shock to many: about 65% of the participants in his experiment went as far as administering the strongest voltage available.

While 50 years have passed since Milgram’s original experiment, we, as a society, would like to think that we have moved on, and that what Milgram found in his laboratory doesn’t pertain to the way we think and behave. After all, we are a society in which individualism is a value, and doing our own thing and going against authority are key. If put into that same experiment room, we would surely act much differently.

Yet has much changed? Have we really moved on and learned from research such as Milgram’s? Or is it simply human nature to act as Milgram’s subjects did? One can hardly imagine that in today’s day and age anyone would conform to authority to such an extent that his or her own conscience would suffer. After all, we are much “smarter” today than we were back then…

In thinking about these questions, I’d like to bring attention to the world of street art. Many street artists have found their inspiration creating art that represents society’s dire dependence on authority and conformity. In their eyes, as in those of many similar skeptics, we continue to act like Milgram’s subjects, albeit in a more disguised way. We continue to obey authority, act like everyone else, and believe it is the right way to exist. Commercialization, they argue, is simply a means to this end. We are constantly being bombarded with messages about how we should think, feel, and act, and indeed we follow.


Well, there may not be anything necessarily wrong with “fitting in” to the molds society has carved out for us. In fact, sometimes it’s required. For example, take the world of business, a place near and dear to my heart as an instructor of several business classes. To succeed in a place like corporate America, individuals must think, feel, and act like all those who have gotten ahead before them. Put another way, individuals need to conform and obey the rules that have been set forth, leaving little room for creative expressions of individuality.

So I ask: how can we, as educators of undergraduate students (and business students in particular) who are on the brink of entering worlds like corporate America, teach students to communicate and express themselves in their own voice while still fitting into this mold? How can we encourage them to be their own people without appearing so different that they won’t be able to succeed?

As other writers have noted, a crucial part of college education is teaching students the basics and having them conform to the rules until some comfort is reached and they can feel confident expressing themselves uniquely. However, based on my own experiences, it appears that students never fully disengage from this generic mold, but rather learn it and stick to it without really exploring their own selves and style. The reasons this occurs can be many, ranging from specific educational experiences and instruction that encouraged this type of communication, to fear of not landing a good job if one doesn’t do exactly as told, to the external pressures of a society which (implicitly) values conformity.

Thus, despite it being over 50 years since Milgram’s original experiments, it is easy to see that perhaps very little has changed about the ways in which we, as individuals, fundamentally behave. While that research may have taught us to be more knowledgeable and to stop and think before following fascist regimes, we might also want to think about the implications it still has for other areas of our lives. As educators, it is our job to ensure that students receive a quality education like everyone else, yet can also free themselves from the confines of our instruction.