For the first time, I am simply going to post a link to another person’s content: Madeline Gabriel’s post, “Should You Share That Cute Dog and Baby Photo?” on her blog “Dogs and Babies.” But of course, since I am an academic, this “simple” redirect will be followed by a few points of analysis.
This is a piece about using technology to document and preserve as well as connect anew. It is also about advocating for audio documentation as a break from the insistent and incessant visual realm. Rest your screen eyes (after you read this) and just listen. I hereby issue a challenge to you: this Thanksgiving, be the weird/annoying relative/friend who is always up to something and can’t just relax in front of the parade, dog show, or Detroit Lions. Tell them you just have to do this one thing…
Select one relative or friend, perhaps a parent or the oldest person at dinner, and ask to interview him or her. If you have a smartphone, then you have a piece of recording technology John and Ruby Terrill Lomax could scarcely have imagined when they lugged their heavy equipment around in the 1930s.
Even without an external directional microphone, the voice-recording feature on smartphones is an incredible tool. The oral history project StoryCorps has declared the day after Thanksgiving the “National Day of Listening.” I see this as a very intentional effort to combat the competitive shopping delirium of Black Friday. StoryCorps provides an excellent list of questions that suit a variety of themes such as Working, Religion, Family Heritage, and War.
Even if you think you’ve heard every one of a person’s stories one hundred times, themes can open new territory. When I interviewed my father in the StoryCorps booth in 2007, I focused on his childhood memories of World War II. He had told me many times about peeling the foil from the paper of Wrigley’s gum wrappers and getting cash for the foil. But it was not until the recorded interview that he described the profound trauma of seeing newsreels with concentration camp footage during Saturday movie matinees. I have a CD of the interview, and it remains startling when my father bursts into tears on the recording.
Starter questions such as “What is your earliest memory?” or “What are you proudest of?” put people in the zone of recollection. These questions can break surfaces that, through habit and routine, have congealed over something potentially rich and evocative, like a dull skin coating a luscious mousse. If you take up this challenge to conduct a Thanksgiving interview, and do wind up breaking through the stubborn skin to discover something profound, please report back to the blog and share your experience.
When we conduct interviews, we are not only communicating across various entities (curricula, generations, turkeys); we are creating primary documents for the potential researchers of tomorrow. A Speaker’s Guidebook (O’Hair et al.) discusses the use of different types of evidence for making a strong, clear argument. The types of evidence include: extended, brief, and hypothetical examples; lay and expert testimony; narrative or anecdote; facts; and statistics. The oral history interview potentially provides the listener / would-be researcher with most of these types of evidence. In one Lomax recording, the Lomaxes intended to document a short lullaby, but inadvertently documented the employment of a black woman transplanted from Virginia to Texas who nursed well over a dozen white children across two generations. This snippet of musical ethnography suddenly becomes relevant to the research of labor historians, women’s studies scholars, African Americanists, and southern culture historians, to name a few.
Not only can interviewing foster emotional connections and provide future researchers with material; it can also be a powerful and effective pedagogical tool. When I taught Introduction to Acting at Baruch, the final assignment of the semester was for each student to interview a family member and to create a monologue drawn from that interview. The project was inspired by the performer Anna Deavere Smith, whose bio describes an approach to performance that “combines the journalistic technique of interviewing her subjects with the art of interpreting their words through performance.”
Whatever one thinks of Smith’s final performances, her methodology provided a strong model for my class. Students conducted the interviews, edited them for clarity and narrative focus, and formulated blocking choices based on the emotional beats. Without exception, the work was much more affecting, detailed, and fully realized than anything that had come out of students selecting monologues from edited collections.
I’m not astonished by the hatred of fatness currently present in our culture, or by the extent to which it has intensified over the past few decades. Cultures go through phases and cycles, and there are always scapegoats and victims of shame and blame. What shocks me is how fully this hatred has been adopted into public discourse.
I’m not going to rehearse the critique of anti-fat discourse in any depth here. Suffice it to say that statistical correlations between fatness and illness have nothing to say about the causes of such illness or about how to avoid it. It is impossible to isolate the health effects of fatness in a context of rampant dieting, since dieting itself seems to be very unhealthy. Even if fatness were shown to be a predictor of certain kinds of illness, losing weight wouldn’t necessarily be a solution. And even if it were, a predisposition to illness is the last thing in the world that ought to provoke anger or scorn.
So much is going on in the world of communication that it makes my head spin just to think of where to focus, which is perhaps one of the great challenges of communication today. Communication seems to be so much about the technology of communication that we often forget what the basics are and how to measure them. To communicate today seems to center on getting information from one point to another, and making sure that it happens effectively, but the deeper issue is to understand what happens when we communicate and what the meaning is. Another question to think about is whether communication is an intentional activity, or whether it is just part of who we are regardless of the medium or intention.
Paul Watzlawick was a great researcher on the topic of communication, somewhat forgotten in the digital age but incredibly relevant. Born in Austria in 1921, he studied with Jung, continued his studies in El Salvador, and ended up in Palo Alto in 1960, where he worked with Don Jackson and followed the now famous Gregory Bateson. He worked mostly in the field of family therapy, but was interested in a much larger understanding of context in communication, seeing the dynamic as a system involving both parties and framing communication as something we do no matter what. To Watzlawick there was no non-communication, just as there couldn’t be a non-behavior, which meant that everything had to be studied as communication.
Another of his ideas is that communication involves not only the message delivered, verbal or non-verbal, but also the message received as a response, which might seem obvious but isn’t when you think about it. Many of our misunderstandings lie in the ways we decipher the responses to our communication, which is understandably difficult since they come from another person who might interpret things quite differently. In the age of rapid electronic messaging, the gap between what is communicated and what is received can be drastic, especially when we cannot measure what is received.
Watzlawick drew another distinction between what he called digital and analog communication, which is not the digital and analog we know. By digital he meant words, whereas analog referred to the non-verbal. Communication had to involve both, i.e. words and the way in which they were delivered through behavior. The behavior involved the relationship and the context in which words have meaning; words alone are not fully communicative without an understanding of their context and of the relationship and behavior of both parties communicating.
This all made me think that we have substantially reduced communication in the age of electronic media, in the sense that we have abstracted it to messages delivered without there being a true dialogue involving fully present and communicating individuals, integrating both the digital and the analog. Of course there is no going back to the old school, but perhaps we should think of what or who lies behind the screens that mediate.
How is Africa imagined in the 21st century? What notions does Africa conjure in the mind of a casual observer? A continent constantly mired in crisis, the site of humanitarian disasters, prone to conflict, or home to starving millions? These notions, along with many others, are the prism through which western observers view Africa. For many people around the world, Africa evokes images of war, destitution, and extreme poverty.
The noted Nigerian novelist and prolific writer Chimamanda Ngozi Adichie has an excellent quip about the dangers of misconceptions across cultures. As she states, “the problem with stereotypes is not that they are untrue, but that they are incomplete. They make one story become the only story…. Stories have been used to dispossess and to malign, but stories can also be used to empower and to humanize. Stories can break the dignity of a people, but stories can also repair that broken dignity….” (Also see Binyavanga Wainaina’s pieces here and here.)
The rise of social media such as Facebook, Twitter, and YouTube has invariably connected millions of people across the globe. In an unprecedented digital age, we no longer live in geographic isolation. Armed with our smartphones and iPads, we are walking receptacles of instant information and connectivity. In a culture where sound bites are king, how does one make sense of current events in African politics when the sum of all phenomena is viewed through a prism of perpetual conflict, dysfunctional institutions, repressive government, and the myths of a “single story”?
More specifically, how does one effectively navigate a new information culture that is often replete with attention-grabbing details that can obscure the larger context? To what extent do news stories in general invite us to delve deeper or inspire further inquiry on our part? For most of us, such pursuits are simply too time consuming and costly. The irony is that globalization has flattened our world and widened and deepened worldwide interconnectedness. Yet we know so little of Africa, that faraway, exotic place. Somehow, in our rapidly evolving technological environment, the rising awareness made possible by social media has not managed to produce careful dissemination of knowledge about events in far-flung corners of the world. Take the Kony 2012 hullabaloo, for example.
The meteoric and rapid attention that the thirty-minute video managed to garner was unprecedented. From blog entries to Facebook, Twitter, and classroom discussion, the sheer fire and debate it ignited speak to an enormous transformation of knowledge production and dissemination. It also highlighted, to some extent, a surprising shift in consciousness. Far from simply jumping on the bandwagon of the usual pity party that Africa’s conundrums frequently inspire, the YouTube video inspired considerable critique. Far fewer people bought into what many deemed a brilliant advertising or marketing strategy; instead they questioned the motives, the messianic overtones, and the seeming paternalism. Admittedly, Invisible Children’s plea for assistance in hunting down Ugandan warlord Joseph Kony for his horrific use of child soldiers to carry out a reign of terror is certainly noble, for it spurred United Nations action and renewed US attention. But many in the academic, policymaking, and blogging communities questioned the short shrift given to complicated circumstances in Uganda, the misinformation in the video, and the nature of Invisible Children’s agenda. In short, many viewed co-founder Jason Russell’s pleas as symptomatic of a continuing polemic of paternalistic western engagement.
I vividly remember the response of my students to the video, which I showed in my Africa in World Affairs class. Much to my surprise, the bulk of my students were not convinced and pointed to misleading facts of varying shapes and forms. Others expressed disdain for the seemingly “patronizing” tone, one-sided view, and absent voices of Ugandans themselves. What of the voices of Ugandans, they wondered? Some of my students bristled at the call for charity and at how the images presented seemed to reify the “White Man’s Burden.”
How do we re-imagine Africa in the digital age, or illuminate the wider historical, post-colonial realities of the continent without resorting to reductionism? Better yet, how do we move beyond stark and troubling stereotypes of Africa as the “dark continent” waiting for the light of the west, waiting to be saved? How far have we moved beyond prevailing images of the starving child, jutting bones, and swollen bellies? Just like those late-night infomercials, but perhaps more gripping, YouTube’s power to convey and transform the way we perceive and react to social phenomena is undeniable. The most interesting or newsworthy bits are not stories that seriously consider political and historical contexts. Instead, broadcast journalism is most concerned with shock-worthy sensationalism that is ephemeral at best. In the 21st century, this is simply unacceptable. Despite the continent’s quagmires, there is hope and promise.
African-initiated efforts to bolster economic growth and technological innovation and to increase indigenous capital and investment are evident, as are a growing number of emerging economies and reversals of the brain drain, among a plethora of other developments. As the Economist notes, “in the past decade, six of the world’s ten fastest growing economies are African. In eight of the past ten years, Africa has grown faster than East Asia, including Japan. Even allowing for the knock-on effect of the northern hemisphere’s slowdown, the IMF expects Africa to grow by 6% this year and nearly 6% in 2012, about the same as Asia.”
Re-imagining the continent requires rebranding: using the very powerful instruments of technology and communication to showcase diverse peoples whose futures will not hinge on the goodwill of the west or its aid. Indeed, the efforts of Herman Chinery-Hesse, Ghanaian software pioneer and architect of a technological revolution, are noteworthy. The third annual Africa 2.0 symposium is also testament to a new narrative of empowerment, amid efforts to transform the continent’s image. According to Jessica Ellis of CNN, not only is Chinery-Hesse considered the “Bill Gates of Africa,” he is a founder of one of Ghana’s biggest software companies and “has been spawning innovations for two decades, helping to break down tech barriers between the continent and the rest of the world.”
However, ordinary Africans must also do their part. By harnessing the power of social media, taking ineffectual and corrupt governments to task, and ultimately ushering in the much-awaited “African Spring,” they can remake, reshape, and reconfigure Africa’s image and upstage prevailing stereotypes of what Africa is and is not. North Africans in Egypt used social media forums to harness support and boost activism against a repressive regime. The western world, the African continent, and states elsewhere can smartly, sensitively, and effectively use social media in constructive ways that channel the capacity for cross-cultural understanding while avoiding the dangers of a “single story.”
This is my second post in a series on the politics of knowledge. My goal with these posts is to consider a basic question of critical university studies: How do universities differ from other kinds of social organization such as government agencies, corporations, and cause-oriented nonprofits? What is the importance of higher education? What kind of constituency does it present? What does it mean to build a social institution around the transmission and discovery of knowledge? What is “knowledge” in this context and what are its politics?
When I was a sophomore in China, I took a course called “Cross-Cultural Communication.” It was my first time hearing the phrase “culture shock,” which is defined as the personal disorientation that occurs when people experience an unfamiliar way of life in a foreign country. The professor wrote this definition on the blackboard. I took notes carefully and memorized them after class, focused on passing the final exam. What culture shock really is and how it feels fell out of my mind when the class ended.
Several years later, when I boarded the plane to New York to pursue a Ph.D. in Accounting, the phrase “culture shock” popped into my mind. For the following few months, everything was novel to me. I felt that I was experiencing a totally different life from the one I had in China. As time passed, I almost forgot the feeling of shock I had at that time.
But I could never forget the first time I went to Chinatown in Manhattan. Yes, the biggest “shock” I had was from Chinatown. It was a new world to me, unlike any of the places I had visited in China. If you ask me what the differences are, I have to say that I cannot tell exactly. Perhaps the feeling came from the old-fashioned seafood stores, the Cantonese-style lion dance, the Cantonese opera played by amateurs in Columbus Park, or even from the styles of store names I had only seen in Hong Kong movies. It is neither like the culture in which I grew up nor like the American culture I learned from English textbooks, TV shows, and my American friends. This might be the charm of New York’s multicultural environment.
The initial shock of Chinatown has weakened. I go shopping in Chinatown once or twice a month and will never get lost there again. I’m getting to the so-called “mastery phase” of culture shock now. Yet it seems that some of my undergraduate students are suffering a “shock” similar to the one I once had, which lies not only in language difficulties. Last semester, I had a meeting with a group of Chinese students to help them with their final presentation on Business Policy. After their presentation rehearsal, one of them told me that he had always felt nervous when speaking in public since he came to the U.S. Because his English is not bad, I was not sure whether his problem was also caused by “culture shock.” I could only help him by relaying my own experiences, and I am still not sure that I succeeded.
It seems to me that “culture shock” can really have an adverse influence on international students’ performance, especially in their first semester in a foreign country. As more and more international students come to the City, how can we help them suffer less from cultural differences? Is there any way to reduce this initial confusion and anxiety?
Last year I walked to class one day with a student. He told me that where he comes from professors are highly respected and that for him it was an honor to be walking to class with me. He also expressed surprise and curiosity about my being a professor at such a young age, since in his country the title of professor is usually attached to much older people. Finally, with no prompting from me, he began to explain to me why he is a proud Republican.
He told me that, as a devout Christian, he would like abortion to be completely outlawed. Furthermore, as an immigrant to this country, he would like all forms of governmental safety net to be abolished, forcing people to work harder and making things “more fair.” Finally he suggested that U.S. society can basically be understood as a conflict between white people and black people in which black people are responsible for most of the problems.
Despite women’s widespread participation in the “Arab Spring,” perhaps most notably in Egypt, many activists point out that women have been sidelined by the new political systems. The new governments created after the fall of regimes rarely feature prominent women, and their agendas almost never champion women’s concerns. Women have been left out of the political dialogue since Mubarak was ousted, and the committee to redraft the constitution excluded women, even female legal experts. Many Arab feminists express concern over the situation of women in Iraq, where, after the overthrow of a secular tyrant, four-fifths of all female pupils and students have discontinued their education.
The exclusion of women from the post-revolution state-building efforts in Tunisia, Egypt, and Libya is partly a result of political and social factors and of the speed at which these transitions are happening, which tends to favor groups that are already organized and seasoned in politics—mostly men. “Traditional social and cultural norms have relegated Middle Eastern women,” said Mahnaz Afkhami, the founder and president of the Women’s Learning Partnership, an international NGO working on women’s leadership and empowerment issues across much of the Muslim world. “They often lack the social, economic, and political power they need to overcome antagonistic groups and aggressive policy.”
Human Rights Watch researcher Nadya Khalife argues that the political culture in many regions across the Middle East has yet to prioritize women’s rights or take women’s voices seriously.
But it would be a mistake to put too much of the blame for the difficulties that Arab women face on their cultural background. It seems that all revolutions leave women behind. The peaceful transitions in Eastern Europe in the 1990s hardly created more egalitarian societies. In fact, arguably, the generous provisions of the paternalistic state, such as free child care, long paid maternity leaves, and free health care, have all been replaced with market-driven, capitalist policies. Even more dramatically, the right to abortion has been replaced by much stricter regulations, and in some countries, like Poland, it was all but outlawed. New leaders like Walesa and Havel certainly did not fight to implement gender-equality provisions in the newly democratizing states. Notably, Walesa openly called for the return of traditional roles for women.
The impact of Catholicism on the new society was overwhelming in Poland, where the old dogmas were replaced by the growing power of religious fundamentalism. The public space in these new democracies excluded many groups, namely women and sexual minorities. Finally, the public/private divisions continue to endure, and the roles of women continue to be prescribed according to old, gendered scenarios.
The culture wars in Poland intensified with the prospect of EU accession, perceived by some as a threat to existing social relations. But for a long time before the 2004 accession, one of the main characteristics of polarized Polish politics, particularly after the political opening of 1989, was the clash between conservatives promoting family values and defending tradition on one side, and emerging new social movements claiming citizenship rights and legal protection on the other. Gender roles played a special part in these conflicts because they were perceived as constitutive of the character of the Polish nation. The earlier socialist state’s insistence on freeing women from home confinement and domesticity (albeit limited in scope, and often in name only) is now contrasted with a Catholic ideology that emphasizes women’s roles as mothers and caretakers. Religion took on a political role, dictating acceptable social norms, and has a big impact not just on the public sphere but also on legislation. Hitchens was definitely not a feminist, but his assertions about the harmful effects of religious dogma played out in rather tragic ways for Polish women.
The accession of a number of post-communist states to the EU in 2004 was a double-edged sword for women’s rights advocates in countries like Poland. EU economic policies in many cases forced governments to yield significant social policies to EU demands, while also forcing those governments to start taking seriously the EU’s requirements for gender mainstreaming and the various equality measures already present in other member states.
The news from the EU has been gloomy lately, filled with reports of the euro crisis, debt burdens, and undisciplined spending. Many predict that to solve the growing financial crisis, countries will need to make drastic cuts in spending, curb social services, and limit generous pensions and public employees’ entitlements. While the economic model of the EU is being questioned, the liberal democratic model that governs it seems safely entrenched, and the inequalities of the political system persist. The political identity of the EU is closely tied to its economic system. Some feminist critics of the EU have long warned that punitive austerity measures will not affect male and female citizens of the EU in the same way. For women, who have lost much ground since 1989, further cuts in domestic spending and the dismantling of the welfare state will have disastrous effects.
And so revolutions everywhere have a way of bypassing women, and until we insist that women’s rights are a priority in every context and in every culture, this will continue to be so.
The very existence of my research site is unethical. It is a place of poverty and death—a mountaintop tuberculosis sanatorium in Romania where many patients are incurable. They know their situation is hopeless. Dozens of patients I have personally known died during the course of my research. Some have told me that because they are dying, they want to tell me their stories and to help those who still might live. I enter every interview knowing that I may not have the opportunity for follow-up questions. My months living there were filled with ethically tricky situations, from patients (and nurses) asking for my medical opinions to being propositioned sexually by patients. The worst was when Florin, a chubby-faced 20-year-old patient, committed suicide the same day I interviewed him. His doctor had given him the bad news that he had the same highly resistant strain of TB as his father and would have to stay at the sanatorium much longer. He was so scared that that evening he left and hanged himself. I didn’t find out until months later, when I asked his father, now also dead of multidrug-resistant tuberculosis (MDR-TB), how his son was. I didn’t know what despair looked like until I saw that man, cheeks sunken in, wearing his dead son’s brightly colored hooded sweatshirt. When he finally died, I was disgusted with myself for thinking it was merciful—that maybe death was better than constantly being tortured for having infected his son with a deadly disease. My university Institutional Review Board (IRB) did not prepare me for any of this—in fact, nothing did. Here I was worrying about protecting my participants from my research, but who was protecting them from their own lives?
The most important “ethics review” I ever received did not come from my university’s Institutional Review Board (IRB) or from the Romanian medical ethics board, both of which approved my anthropological research on tuberculosis in Romania. Rather, it came from Mr. Gheorghe, a fifty-year-old Roma man dying of MDR-TB, when he stepped out on the sanatorium balcony and told anyone within earshot something close to the following: “Jonathan is a good person. He wants to know about your lives and your families. You should talk to him.” I could feel myself blushing as he said this. His opinion mattered to the other patients, especially because he was the one selling them cigarettes out of his nightstand. Suddenly, other patients seemed eager to speak with me when they had been aloof and skeptical only days before. Gheorghe didn’t live long enough for me to thank him; he died of a “massive hemoptysis,” a technical way of saying he coughed up a massive amount of blood. This is how TB patients often die, and it is terrifying.
It took years for me to obtain the official permissions required to live at a Romanian TB sanatorium. I even had to sign a waiver for the U.S. National Science Foundation stating that they were not liable if I caught the disease. But just having the permission of my university and the Romanian government was not enough. I had to actually ask patients for their permission to ask them about sensitive issues, sometimes asking dying patients about their regrets and about how their families would survive without them. Part of my initial problem was that I didn’t know how to ask the patients to let me interview and survey them. Following my IRB protocol, I showed them my stamped informed-consent form, a full page of Romanian legalese with talk of risks and benefits. I would read sections out loud, and the more “informed” the patients became, the more uncomfortable they became. This level of formality does not exist in most aspects of their lives. They could not understand why, if I only wanted to talk with them, I needed such involved paperwork with multiple signatures, dates, and stamps. In fact, when I submitted my original protocol to the Romanian medical ethics board, I was laughed at and told that this research did not need approval because it was not “clinical.”
What did patients care about? That I would protect their identities and that the process was voluntary. Everything else, including talk of risks and benefits and the names and numbers of people to contact, made them uncomfortable. They just wanted my assurance that I would maintain their confidentiality by not publishing their names. Many patients did not even have an expectation of privacy and did not feel qualified to decide whether or not they should participate in my research. They did not want to hear about protocols. Rather, they wanted someone they trusted to tell them it was ok and that they could trust me. A document from my IRB could not accomplish this; only someone else vouching for me could.
I gained the endorsement of Mr. Gheorghe by accident. There was no plan; he just seemed willing to talk, so I sat on his bed with him and asked about the photographs on his wall: one of a handsome young man in a military uniform (him during socialism), another of a strikingly beautiful woman on a motorcycle (his 18-year-old daughter), and my favorite, him and his wife proudly standing with their eight children in front of their rural home. He told me that doctors never sit on patients’ beds and never ask about things like this. Visiting doctors and researchers care only about the numbers and information on patient charts. They are not interested in patients’ lives, only their disease.
In my last post, The Trobriand Islanders Never Friended Malinowski on Facebook, I suggested that the reason for the existence of IRBs is not primarily the protection of research participants. Rather, it is to provide legal protection to institutions such as hospitals and universities, which, despite their non-profit status, operate more like businesses every day. Every researcher connected with the CUNY system must undergo an online ethics training course where they are, without fail, asked questions about the Tuskegee syphilis study and the importance of informed consent. The problem is that researchers in any era operate under the ethical norms of their particular time and place. Withholding antibiotics from those men long after their syphilis could have been cured is ethically unconscionable now, but then it was not, at least to enough of the people involved. Today, it is still the medical industry (specifically pharmaceutical companies) that is pushing (and in my opinion far exceeding) ethical boundaries, in spite of the presence of IRBs in virtually every medical and educational institution.
In Romania, people generally don’t sue each other, especially the impoverished patients I work with. They live on a mountain “beyond the sight of God,” as one patient put it. They don’t have access to lawyers and cannot even call or email the contacts listed on my informed-consent form, because they lack internet access and money for international calls. When these patients give me their informed consent, it is informed by the personal relationship I have with them and with those they know. They do so with the knowledge that they would have little recourse if I did behave unethically. That makes their consent all the more meaningful. Ultimately, consent, at least at my research site, has little to do with my protocols and institutional approvals. For the patients, informed consent is not something I read out loud to them; it is earned over the course of months through drinking coffee, staring off the balcony, and exchanging stories of our families. It is something I take seriously not because of the IRB, but because I know that the people sharing their lives with me trust me on a personal level. I owe it to them to behave in a way that is ethically appropriate and respects their humanity and dignity. I think at this point we have a system of ethics approval designed by clinicians and enforced by lawyers for the protection of hospital and university endowments in a litigious society. It is the worst of possible worlds, and despite best intentions, twenty years from now future researchers will read of all the unethical research that took place even in this age of IRBs.
I think part of the issue is that ethical research means different things to different people and institutions. In the technical, clinical, and legal language of U.S. IRBs, it means limiting “risk” to the study participants. This definition of ethics was inadequate for one of my Romanian transcribers, who did not want to work on my project unless there was an actual benefit to Romanian TB patients—that is, unless I was not simply studying their “biosociality” or some other nebulous academic nonsense, but rather trying to use my research to improve people’s lives. I told her that is the only reason I do research. This is the same concern that many patients had. Yet it never comes up in my U.S. ethics reviews. I wish it did.