Climate refugees: why we can’t yet predict where millions of displaced people will go

November 28, 2019 9.41am GMT

In the near future, global warming is expected to create millions of climate refugees, and individuals and organisations are already searching for ways to help them. Some ideas are obvious, such as improving conditions in refugee camps.

But there are also more high-tech projects such as using algorithms to forecast where displaced people will travel to. Such forecasts are crucial. They can help support organisations prepare in the right places, they can evaluate current policy (by assessing a counterfactual “what if” scenario) and they can also help predict refugee populations in remote or dangerous areas where there is little empirical data.

So we can predict where climate refugees will go, right?

No. Despite bold and excitable claims that refugee forecasting is largely resolved, we are not convinced. As computer scientists who work on this exact problem, we see such claims as a painful example of running before we can walk.

Almost four years ago, we started to research how people fled from armed conflicts. Many people were displaced due to the Arab Spring and the Syrian War, but little work had been done to predict where they could end up.

Africa’s Sahel region contains many of the world’s most climate-vulnerable people. mbrand85 / shutterstock

With our colleague David Bell, we created a tool that could help, and published our work in Nature Scientific Reports. Our tool represents every person as an independent agent, and then uses simple rules-of-thumb derived from scientific insights – for instance “people tend to avoid travelling through mountains when it is raining” – to determine when they will move next, and to where.
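To make this concrete, here is a minimal sketch of what one such rule-of-thumb step could look like in code. It is an illustration only, not our published tool: the location graph, the daily rain probability and the route weights below are all invented for the example.

```python
import random

# Toy location graph: each node links to neighbouring locations, with simple
# attributes on each route. All names and attributes here are invented.
LOCATIONS = {
    "conflict_zone": [("mountain_pass", {"mountainous": True}),
                      ("river_town", {"mountainous": False})],
    "mountain_pass": [("camp_a", {"mountainous": True})],
    "river_town": [("camp_b", {"mountainous": False})],
    "camp_a": [],
    "camp_b": [],
}

def route_weight(attrs, raining):
    """Rule of thumb: people tend to avoid travelling through mountains
    when it is raining."""
    if attrs["mountainous"] and raining:
        return 0.2  # mountain routes strongly discouraged in the rain
    return 1.0

def step(location, raining):
    """Move one agent to a weighted-random neighbouring location."""
    links = LOCATIONS[location]
    if not links:  # no outgoing routes: the agent has reached a camp
        return location
    weights = [route_weight(attrs, raining) for _, attrs in links]
    return random.choices([dest for dest, _ in links], weights=weights)[0]

# Simulate 1,000 independent agents fleeing over 10 days.
agents = ["conflict_zone"] * 1000
for day in range(10):
    raining = random.random() < 0.3  # assumed 30% daily chance of rain
    agents = [step(loc, raining) for loc in agents]

print({camp: agents.count(camp) for camp in ("camp_a", "camp_b")})
```

In a real model, the graph would be built from geographic data, and each rule’s weight would be justified by scientific insight rather than picked by hand.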

This is different from “machine learning” approaches, which use historical data to “train” the algorithm to generate rules and thus predictions. So, for example, machine learning might be given this sort of data: “the number of people that arrived in a refugee camp close to a mountainous area in a conflict that occurred perhaps many years ago, or more recently but in a different country.” The main issue is that the historical data used for machine learning is always quantitative, and is never about the specific conflict for which the simulation is being developed.

To see how our method worked in practice, we tested our tool against UNHCR data from three recent conflicts in Burundi, the Central African Republic and Mali. Our tool correctly predicted where more than 75% of the refugees would go.

Network models for (a) Burundi, (b) Central African Republic and (c) Mali. Conflict zones (red circles), refugee camps (dark green circles), forwarding hubs (light green circles) and other major settlements (yellow circles). Suleimenova et al (2017)

We have since applied our analysis to refugees fleeing conflict in South Sudan, as part of the HiDALGO project. In this study, forthcoming in the Journal of Artificial Societies and Social Simulation, we also looked at how policy decisions like border closures affected the movement of refugees into neighbouring countries, such as Ethiopia or Uganda.

We found there was indeed a link – closing the Uganda border in our model caused 40% fewer “agents” to arrive in camps after 300 days, and that effect lingered even after we reopened the border on day 301. Our tool correctly predicted where 75% of the refugees would actually go in real life.

But doing a correct “retrodiction” in these historical cases does not mean that you can do a forecast. Forecasting where people will go is much harder than predicting a historical situation, for three reasons.

A school in Uganda for refugees from war in South Sudan. Roberto Maldeno / flickr, CC BY-NC-ND

  1. Every model makes assumptions. For instance, a model that forecasts where refugees go might make assumptions about their mode of transport, or the likelihood that they stay overnight in a place where violence has previously occurred. When forecasting, we need to know what happens when we give these assumptions a little shake (we examine this in the VECMA project; see the sketch after this list). The less evidence we have for an assumption, the more we need to shake it and analyse how our model responds. Machine learning models generate implicit (and ill-justified) assumptions automatically when they are trained – for example, chosen destinations correlate with the stock value of company X. In agent-based models, these assumptions come from physical factors like the presence of mountains or armed groups, and are explicitly testable.
  2. Forecasting one thing requires you to forecast many other things as well. When we forecast how people escape conflict, we must forecast how the conflict will evolve. And that could depend on future market prices, weather/climate effects, or political changes, all of which would need forecasting too. To be clear: we did not require any of these models when we validated our predictions against a historical situation, so we are building new models just to make forecasts possible.
  3. Forcibly displaced people are usually fleeing from unexpected and disruptive events. Here the data upon which the machine learning algorithms are “trained” is incomplete, biased or often non-existent. We argue that agent-based models are more effective because they do not need training data, and benefit from understanding the processes that drive forced displacement.
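To illustrate point 1, here is a minimal sketch of what “shaking” an assumption might look like. The `simulate` function is a toy stand-in, not the VECMA project’s actual methodology, and the route weights and rain probability are again invented:

```python
import random

def simulate(mountain_rain_weight, n_agents=1000, seed=42):
    """Toy stand-in for a full simulation run: returns the share of agents
    who flee via the mountain route. Purely illustrative."""
    rng = random.Random(seed)
    via_mountain = 0
    for _ in range(n_agents):
        raining = rng.random() < 0.3
        # Two routes out of the conflict zone: mountain pass and river road.
        weights = [mountain_rain_weight if raining else 1.0, 1.0]
        if rng.choices(["mountain", "river"], weights=weights)[0] == "mountain":
            via_mountain += 1
    return via_mountain / n_agents

# "Shake" a weakly evidenced assumption across a wide range and watch how
# strongly the output responds.
baseline = simulate(mountain_rain_weight=0.2)
for w in (0.1, 0.2, 0.4, 0.8):
    share = simulate(mountain_rain_weight=w)
    print(f"weight={w:.1f}  mountain share={share:.2%}  "
          f"shift={share - baseline:+.2%}")
```

If the output swings wildly across plausible values of a weakly evidenced assumption, a forecast built on that assumption deserves very little confidence.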

So we have not cracked it.

Yes, forecasting is hard. We do not yet know where climate refugees and other forcibly displaced people are going. We still need huge supercomputers just to forecast next week’s weather.

So it pays to be suspicious of the idea that refugee forecasting is already solved, especially if linked to claims that the “next frontier” for computer scientists is in (controversially) extracting data from vulnerable refugees who are often unaware of the privacy and security risks. Given how hard it remains to predict where the millions of climate refugees will go, the “next frontier” is still the last frontier.

Source Link: https://theconversation.com/climate-refugees-why-we-cant-yet-predict-where-millions-of-displaced-people-will-go-119414

Thanksgiving Is Another Reminder of What America Forgot

The absence of Native perspectives in American history books and classrooms has been remarked on for over 50 years. Will it ever change?

Nick Martin November 28, 2019

In a December 1862 letter to the Senate, President Abraham Lincoln ordered the execution of 39 Sioux citizens. In 1851, the Santee Sioux had ceded the land known as Minnesota to the United States in a pair of treaties, in exchange for a constant supply of services and wares to be provided by the Bureau of Indian Affairs. Like countless treaties signed by the U.S., the agreements were not honored. Corruption consumed the BIA, and basic food items were subject to price gouging. And so, on the brink of starvation in the early winter of 1862, several hundred Sioux raided white towns and villages, looking for the rations that the government had stolen from them and that the colonizers had previously refused to trade with them.

If one is to believe the historians, Lincoln’s decision to impose 39 death penalties for the Sioux Uprising was one of delicate political balance: He had to kill enough Native resisters so as to stifle any future uprisings but not so many that he provoked another. Thirty-nine was the number he landed on after reviewing the transcripts, down from the 303 execution requests made by the military leaders in Minnesota. His letter to the Senate simultaneously served as both the largest mass execution order and the largest clemency order in U.S. history. Ultimately, 38 Sioux were hanged by the neck until death for having the gall to try to keep their people alive. As they stood atop the trap door, with nooses waiting to deliver the final snap, the condemned men spoke their names and cried out “I’m here! I’m here!”

Ten months later, Lincoln signed another letter. This one was a proclamation: As of October 3, 1863, the president, hoping to bring a symbolic sense of calm and joy to a nation torn in two by the still-raging Civil War, declared the fourth Thursday in November to be “a Day of Thanksgiving and Praise.” Never mind the true history of the day Lincoln sought to memorialize, which, aside from its first peaceable but fragile iteration, had twice commemorated the slaying of Wampanoags in battle. Like the 38 Sioux, that was lost to the past: All that mattered was what the living told themselves and their children.

If you grew up going to public school in this country, you probably don’t recall much, if anything, about Lincoln’s execution order. In the long run, it was hardly exceptional for the U.S., save for how many Native lives it doomed, and even that figure was dwarfed by an endless number of massacres and “battles,” carried out by the military, private companies, and citizens. It was business as usual for a young nation with imperialist desires, with a touch of faux mercy to make it go down smoother for a president who would preside over the forced removal of the Navajo and Pueblo people and the Sand Creek Massacre of 1864. In truth, Lincoln, like many who would follow him, was not so different in practice from the more notoriously Native-hating Andrew Jackson: another chief executive who cared little for or about the Indigenous people he shared a continent with. But American textbooks only have room for so many villains.

It is a pity that so many Americans today think of the Indian as a romantic or comic figure in American history without contemporary significance. In fact, the Indian plays much the same role in our American society that the Jews played in Germany. Like the miner’s canary, the Indian marks the shifts from fresh air to poison gas in our political atmosphere; and our treatment of Indians, even more than our treatment of other minorities, reflects the rise and fall in our democratic faith.

Just a little over 100 years after Lincoln signed the first Thanksgiving proclamation, these words by Felix Cohen appeared quoted in the opening to the 1969 Kennedy Report on Indian Education. The report was helmed by Senator Ted Kennedy and serves as a bedrock document in the Native education and political communities: For the first time, possibly ever, it signaled that major U.S. political players were finally paying attention to the erasure of Native communities from the American mosaic. As part of the report, a review of 100 educational texts taken from public schools across the country came to the belated conclusion that Native people were viewed as little more than “subhuman wild beasts in the path of civilization.”

While I was reporting last year on North Carolina’s decision to close down the High Plains Indian School and integrate my tribe, the Sappony, in 1963, I heard directly from family members about how such slanted curricula affected Native students’ experience. My uncles and aunts told me stories of the other kids at school asking them if they had scalped anyone or if they carried tomahawks, and of discriminatory treatment doled out by teachers and administrators to Sappony children, whose only crime was having skin that was a little darker than their own.


In the 50 years since the Kennedy Report was published, Americans have barely moved an inch when it comes to demanding an accurate historical or contemporary view of Native people be taught in public schools. And this has had a marked effect on Native children forced to listen to their histories being twisted to fit a narrative of deity-ordained land theft and warfare. Writing on this in 1985, Lee Little Soldier found that Native students still often felt “trapped between their birthright and the dominant society, losing touch with the former, but not feeling comfortable in the latter.”

But almost more important than the need for Native children to see themselves properly represented is for those who will have a say in how these curricula are established (in other words, the white-run PTAs and local administrative boards) to correct their own understanding of American history. They also need to accurately perceive the present, realizing that Native communities and individuals exist everywhere, from the reservation to the city to the suburbs. Only by engaging with these communities, including them in the lesson-planning process as living societies rather than mythical figures, can the American school system begin to teach its children how not to exclude and appropriate Native history.

This point was underlined in a 2006 article in the Phi Delta Kappan by Bobby Ann Starnes. While Starnes was relatively well educated on Native history, she found when she began teaching at a predominantly Indigenous school system that all of her knowledge had been historicized—she had no idea of what it meant to actually converse with and teach and befriend Native people, teaching history in a way that has moral weight in the present. “What seem like small matters of word choice,” she wrote, “are important (e.g., did Indians wage war or resist aggression?).” 

Recent years have seen small steps toward delivering teachers and students the overdue updates required to teach these new lessons. The National Education Association now makes available materials on how to teach Thanksgiving in a historically accurate and culturally sensitive manner. Signed in December 2015, the Every Student Succeeds Act, President Barack Obama’s replacement for No Child Left Behind, required states and local educational agencies to consult with tribes and tribal organizations as they developed their state lesson plans, should they hope to obtain Title I grant funding. Charter schools established to teach history through a Native lens have sprouted with increasing popularity in cities such as Denver and Oklahoma City and Seattle. (As a case in North Carolina recently showed, there are drawbacks to this approach.) But half a century after the federal government declared the country’s biased Indigenous history lessons an educational crisis, these partial measures feel shockingly insufficient.

Fall is a brutal time of year to be Native. Halloween brings “Sexy Indian Princess” costumes. Native American Heritage Month almost inevitably comes with fumbles that undercut the purpose, even without a president trying to squeeze “Founding Fathers” into the month as well. And then there’s Thanksgiving. Even in 2019, principals and teachers deem it appropriate to dress their children up as Natives and celebrate “an annual Pow Wow,” and then post pictures on social media, before being yelled at and quickly deleting their appropriative efforts to whitewash history. Nor are the stereotypes and questionable attempts at representation limited to rural schools, as Saturday Night Live’s recent skit with Will Ferrell, Fred Armisen, and Maya Rudolph dressing up as the relatives of Matoaka (commonly known as Pocahontas) and rambling off a series of shallow punch lines again showed.

Native children have never had the pleasure of seeing themselves and their people adequately acknowledged in their teachers’ lesson plans. Thanksgiving is handled with satin gloves for the sake of the white children. They can learn the name of the chief who sat with the pilgrims of Plymouth, but not that those same pilgrims mounted his son’s head on a pike above their town and left his body to publicly rot. To acknowledge the true history of Thanksgiving would only be the first step, a slippery slope to a nation daring to utter the word “genocide” when thinking about its foundations. It would be a screeching slam of the guitar in the middle of a rendition of “This Land Is Your Land.” It would be the truth.

Source Link: https://newrepublic.com/article/155837/thanksgiving-another-reminder-america-forgot

A Blind Man Sees His Birthday Candles Again, Thanks to a Bionic Eye

A brain implant helped restore his sight after a tragic car crash

Jason Esterhuizen, who lost his eyesight in a car accident, practices locating objects and walking after getting a brain implant that’s meant to create artificial vision. Photo credit: UCLA Health

In December 2011, a horrific car accident knocked Jason Esterhuizen unconscious. When he woke up in a hospital in Pretoria, South Africa, hours away from his hometown, he couldn’t see. The crash had destroyed his eyes and left him completely blind.

Esterhuizen was devastated. At the time, he was 23 and studying to become an airline pilot. The first two years after the accident were the hardest. “Life changes in an instant,” he tells OneZero. “I used to fly airplanes and ride motorcycles and drive my own car.”

Esterhuizen eventually got mobility training and learned how to read braille, use assistive devices, and work on a computer. Then, in 2013, he tuned in to a TV news segment about a company developing a brain implant that could create artificial vision for people like him. Second Sight, based in Sylmar, California, had just won approval in the United States for a retinal implant designed to assist people with blindness caused by a rare genetic disorder called retinitis pigmentosa. Esterhuizen wasn’t a candidate for that device, but by 2018 the company had developed a brain implant that could change his life.

Now part of a small clinical trial, Esterhuizen is one of six blind patients to receive the experimental device, called the Orion. It’s meant to provide artificial vision to people who have gone blind from a wide range of causes, including glaucoma, diabetic retinopathy, optic nerve injury or disease, and eye injury. If it works and is proven safe, it and other brain implants could potentially help many more people who are blind.

When Esterhuizen learned he was a candidate for the trial at the beginning of 2018, he and his wife uprooted their lives in South Africa and moved to Los Angeles to be in the study. The device, he says, allowed him to see his birthday candles for the first time in more than seven years.

From the outside, the Orion looks like a pair of sunglasses with a small camera and video processing unit (VPU) attached to it. Implanted in the brain, however, is a postage stamp-sized chip containing 60 electrodes that sits on the visual cortex, the part of the brain that processes visual information. When the device is switched on, the camera captures a person’s surroundings, and the wireless VPU converts those images into electrical pulses using an algorithm. Those pulses are transmitted to the electrodes, and the brain interprets them as visual cues.
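Second Sight has not published the details of that algorithm, but the general idea can be sketched in a few lines of code: downsample each camera frame into a coarse grid with one cell per electrode, then map each cell’s brightness to a stimulation amplitude. Everything specific below (the 6×10 grid layout, the amplitude scale, the brightness threshold) is an assumption made purely for illustration.

```python
import numpy as np

N_ROWS, N_COLS = 6, 10  # assumed layout for the 60-electrode array

def frame_to_pulses(frame, max_amp=100.0, floor=0.15):
    """Toy VPU step: map a grayscale camera frame (2-D array of values
    in 0..1) to one stimulation amplitude per electrode."""
    h, w = frame.shape
    pulses = np.zeros((N_ROWS, N_COLS))
    for r in range(N_ROWS):
        for c in range(N_COLS):
            # Average the patch of pixels this electrode covers.
            patch = frame[r * h // N_ROWS:(r + 1) * h // N_ROWS,
                          c * w // N_COLS:(c + 1) * w // N_COLS]
            brightness = patch.mean()
            # Stimulate only where the scene is bright enough to matter.
            pulses[r, c] = max_amp * brightness if brightness > floor else 0.0
    return pulses

# A bright object in the upper-right of an otherwise dark frame...
frame = np.zeros((480, 640))
frame[40:160, 420:600] = 0.9
print(frame_to_pulses(frame).round(1))  # ...activates the matching electrodes
```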

Renderings of the Orion device. Credit: Second Sight

The brain implant is what sets the Orion apart from the Argus II and other so-called “bionic eyes” on the market. Those devices don’t require brain surgery: Instead, the electrodes are embedded behind the eye. Retinal implants require recipients to have some functioning cells present in the eyes, which is why they are currently only approved for patients with retinitis pigmentosa, which affects only a small percentage of the 3.4 million Americans who are legally blind or visually impaired.


Retinitis pigmentosa causes photoreceptors, the light-sensing cells responsible for sight, to gradually die, causing vision loss over time. But other eye cells called ganglion cells, which talk directly to the brain, remain intact. Retinal implants, like the Argus II and those made by France’s Pixium Vision and Germany’s Retina Implant, are designed to stimulate these cells, which transmit visual information along the optic nerve to the brain. About 350 retinitis pigmentosa patients worldwide have received the Argus II device.

But the Orion, which shares much of its technology with the Argus II, bypasses the eye and optic nerve completely. “With the current system we’re testing, you don’t even need to have eyes for the device to work,” says Dr. Nader Pouratian, the neurosurgeon at Ronald Reagan UCLA Medical Center who implanted Esterhuizen’s device. As the primary investigator of the trial at UCLA, he has outfitted four patients with the device. The other two study participants received the implant from Dr. Daniel Yoshor at Baylor College of Medicine in Houston, Texas.

Despite the risk of infection or bleeding or the possibility that the implant wouldn’t work, Esterhuizen didn’t hesitate to undergo brain surgery.

When investigators first switched the implant on after the procedure, they needed to test each of the 60 electrodes one by one to see how much electric current each needed to receive before patients started seeing light. They used that information to make a custom program for each patient, a process that took months.
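The article does not detail how that testing works, but a simple version of the per-electrode search might look like the sketch below: ramp the current up in small steps until the patient reports seeing a flash of light. The step size, the safety ceiling and the `patient_saw_light` callback are all hypothetical.

```python
import random

def find_threshold(electrode, patient_saw_light, start=10.0, step=10.0,
                   ceiling=200.0):
    """Ramp stimulation current upward until the patient reports light.
    `patient_saw_light(electrode, amplitude)` stands in for the real
    stimulate-and-ask loop; returns None if nothing is seen by the ceiling."""
    amp = start
    while amp <= ceiling:
        if patient_saw_light(electrode, amp):
            return amp  # lowest tested amplitude that produced a flash
        amp += step
    return None  # electrode may be unusable at safe current levels

# Build a custom per-electrode program for one simulated patient.
random.seed(1)
true_thresholds = {e: random.uniform(20, 180) for e in range(60)}

def saw(e, amp):
    return amp >= true_thresholds[e]

program = {e: find_threshold(e, saw) for e in range(60)}
print({e: program[e] for e in range(5)})  # first few electrodes
```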

Now, Esterhuizen and the other participants have regained a limited amount of vision after being completely blind for years or decades. While they don’t see color, shapes, or clear edges and can’t yet read text, they are able to distinguish light from dark, they can recognize moving objects, and they have some degree of depth perception. People and objects appear as dots of light corresponding to where they’re located, and as they get closer, more dots appear. “It’s like learning a new language,” Esterhuizen says. “You learn how to interpret what’s going on.”

Patients meet regularly with vision researchers at UCLA and Baylor to test the device and learn how to use it. In one exercise, they look at a black computer screen and point to a white square that appears intermittently in different locations. The majority of the time, they can successfully point to the square.

It’s not natural vision, but Pouratian says that the device lets the participants do everyday tasks they weren’t able to before. “It’s not that the system helps them become completely independent, but if you can’t see anything, being able to see just a little bit becomes extremely valuable,” he says.

Esterhuizen says he feels safer leaving his apartment alone because he can now see when cars are approaching. He can also sort laundry and even find certain objects in his home.

Vision starts in the eye, but it’s the brain that recognizes images and interprets them — a process that mostly remains a mystery. Scientists know that the brain contains maps of the visual field, and that every location in that field is represented by a unique location in the brain. But they haven’t figured out where those exact locations are yet. This is why the Orion and retinal implants can only create a limited range of vision for now.

“Our electrodes are big compared to neurons, so we’re stimulating a lot of neurons at once and the brain is interpreting that,” says Jessy Dorn, vice president of clinical and scientific affairs at Second Sight. “We’re not at the level of each individual cell.”

As a safety precaution, the implant that Esterhuizen and the other patients received only uses one electrode array to stimulate the left side of the brain. As a result, they can only perceive visual cues from their right-side field of vision. If the single array proves safe, Second Sight eventually plans to implant an array on each side of the brain. Dorn says the company is also working on ways to improve the technology to enhance the resolution and range of vision for patients.

The company wants to expand the number of implanted electrodes to between 150 and 200. And it’s working on improving its camera and VPU, potentially with thermal vision and facial and object recognition.

The researchers are also trying to achieve more natural vision by figuring out how to deliver the electrical stimulation in a way that better imitates the firing of groups of neurons. “This idea that we can stimulate the brain to produce visual perceptions is well-known, but the way we need to do it in order to maximize visual perceptions is not as well understood as we want it to be,” says Pouratian.

Dr. Abdhish Bhavsar, director of the Retina Center in Minneapolis and a clinical spokesperson for the American Academy of Ophthalmology who is not involved in the Orion study, says much more research on brain mapping will be needed to provide patients with a greater range of useful vision. “We have a long way to go before we understand what stimulating the brain will do in terms of vision,” he says. “If we developed a map of the brain that showed what exact parts generate what type of images or perceptions of the visual world, then we could start making models based on that.”

The early results from the Orion are encouraging, but brain implants for vision are still very much in their early days. Safety is a major concern: One patient in the Orion trial experienced a seizure after the device was implanted. People who receive such implants will need to be followed for years to make sure there are no complications that emerge later on. Electrodes in the brain also cause scar tissue to form over time, making them stop working, so it isn’t clear how long these implants will last. Second Sight’s Dorn says the electrodes used in the Orion device should work for at least five years. That means patients will probably eventually lose what little vision they acquire with the devices.

Another major limitation of the Orion is that it’s only useful for those who were born sighted and later lost their vision. In people who are born blind, the parts of the brain that are responsible for sight are not fully developed, and visual information cannot be effectively transmitted to the brain. A device that could help all people with blindness is still a long way off.

And if the FDA eventually approves the Orion, not everyone who’s eligible to get the implant will want to undergo brain surgery. The device is also likely to be expensive. The Argus II retinal implant costs about $150,000, though it is covered by Medicare in many states.

Esterhuizen, though, is hopeful about the future of assistive technologies for the blind and visually impaired. “It’s just baby steps for now,” he says. “But eventually I think this technology will change the lives of millions of people.”

Source Link: https://onezero.medium.com/a-blind-man-sees-his-birthday-candles-again-thanks-to-a-bionic-eye-be0d3d987e48

LGBTQ game characters get their moment

Tyler, a transgender guy, is one of two playable characters in the forthcoming Xbox game “Tell Me Why.” Image: Microsoft/DONTNOD Entertainment

In a significant move toward diversifying the world of video games, two major-studio titles due out next year will place queer protagonists at the center of the action.

Why it matters: Gaming has historically been a tough world for LGBTQ players, with plenty of harassment and few visibly queer characters.

Driving the news:

  • Announced Thursday and debuting next summer, “Tell Me Why” is an Xbox title that features Tyler, a trans character, as one of the two playable options — a first for a major-studio game.
  • “The Last of Us Part II,” a highly anticipated sequel to a PlayStation game, features Ellie, a young lesbian, as the game’s sole playable character. (Ellie was one of two main characters in the first game in the series.)

History lesson: The debuts come 5 years after Gamergate, a controversy that involved the online harassment of a number of prominent female game developers and that is frequently interpreted as a precursor to the broader alt-right movement.

What they’re saying: GLAAD’s Jeremy Blacklow notes that in many ways these new game titles represent a response to Gamergate, given that major-release games take several years to develop.

  • The industry, Blacklow adds, is effectively saying: “We care more about reaching the people who need to see themselves represented than the trolls. That’s huge.”

The creators of “Tell Me Why” said Tyler wasn’t created just to be a transgender character, but rather as one aspect of a complex character telling a unique story.

“With ‘Tell Me Why,’ we want to develop a unique depth of characters that includes a special strong bond between the twins. When we decided on having Tyler be a transgender man we didn’t want him to be recognized just for being transgender.”

“Tyler is a very likable young man, courageous, who knows who he is and what he stands for. He’s full of hopes, dreams, but also fears. He has a bright side, but also flaws, like all of us.”

Florent Guillaume, game director

Out gamers: As important as what is happening in the games is the experience of those playing the games, especially in a world of live-streaming. One of the world’s top gamers, Dominique “SonicFox” McLean, proudly identifies as gay, black and furry.

Yes, but: Everyone is prepared for a possible backlash when the games come out next year.

  • “We all know what’s in the comments section,” Blacklow said.

As a result, Microsoft has been tightening the policies on its Mixer streaming service and said it is “already hard at work on several new programs and tools aimed at reducing harmful content and toxic behavior.”

  • “We are committed to making intentional choices that embrace the vibrancy found in our differences, and hostility is not welcome in our culture,” Microsoft Xbox senior creative director Joseph Staten told Axios.

Some stumbles: Progress hasn’t been linear, even in the last couple years since GLAAD started working with the gaming industry to be more inclusive.

  • Ubisoft had a game, “Assassin’s Creed: Odyssey,” that let people choose whether their characters would find same-sex or opposite-sex romance, but a downloadable expansion forced all characters into heterosexual coupling. The company later reversed course after an outcry.

The bottom line: Done right, games that represent a wider range of human experience can help us all broaden our horizons. And if that helps a generation of video gamers better understand a bit of the transgender experience, all the better.

The Art of the Tattoo in Pictures

A woman displays her tattoos during the International Brussels Tattoo Convention. REUTERS/Yves Herman
Rolf Buchholz of Germany, who says he holds the world record with his 480 piercings, during the International Brussels Tattoo Convention. REUTERS/Yves Herman
The International Brussels Tattoo Convention invites over 450 of the world’s finest tattoo artists. Arie Asona/NurPhoto
Models display their tattoos at International Brussels Tattoo Convention. REUTERS/Yves Herman
Tattoo artists working on a customer at the International Tattoo Convention in Brussels, Belgium. Arie Asona/NurPhoto
A man displays his tattoos in Brussels. REUTERS/Yves Herman
Tattoo artist doing a portrait on the arm of a customer. Arie Asona/NurPhoto
A woman displays her tattoos during the International Brussels Tattoo Convention. REUTERS/Yves Herman
Tattoo artists working on customers at the International Brussels Tattoo Convention. Arie Asona/NurPhoto
The International Brussels Tattoo Convention is the leading show of its kind in Europe. Arie Asona/NurPhoto
Tattoo artist working on a customer in Brussels. Arie Asona/NurPhoto
Rolf Buchholz during the International Brussels Tattoo Convention. REUTERS/Yves Herman

Rowena Chiu’s Weinstein allegation highlights the issue of race in sexual assault

Harmful, erroneous stereotypes attached to Asian women played into Harvey Weinstein’s alleged abuse of Rowena Chiu, she said.

Rowena Chiu was interviewed on the “Today” show on Sept. 9, 2019. Nathan Congleton / NBC


Decades after Rowena Chiu alleges she was sexually assaulted by Harvey Weinstein, she wrote an op-ed article for The New York Times, opening with words that may have felt pointed or shocking to some, but gut-wrenching and all too familiar to many Asian women.

“Harvey Weinstein told me he liked Chinese girls,” wrote Chiu, who is British Chinese. “He liked them because they were discreet, he said — because they knew how to keep a secret. Hours later, he attempted to rape me.”

Race sits at the core of Chiu’s story. Harmful, erroneous stereotypes attached to Asian women played into Weinstein’s alleged abuse of her, according to Chiu. Race also comes into play through specific Chinese cultural values and taboos that made it notably difficult for Chiu, a former Miramax employee, to process and eventually speak out about what had happened to her, she told NBC News.

“I really strongly believe that it took me much longer than the other victims to think, ‘Am I prepared to live with the repercussions of speaking out?’” Chiu said. “It took me a full two years. People are like, ‘Why did it take you that long?’ and I always feel like my answer should be, ‘How did I come to that position so quickly?’ Because to think of myself as an Asian person and a really terrified individual in October 2017, it’s really a big journey to come in just two years.”

Weinstein has denied the attempted rape took place, instead claiming he had a consensual “six-month physical relationship” with her.

According to her account, however, Chiu was pressured into signing a nondisclosure agreement after she attempted to report the alleged attack. Miramax declined NBC News’ request for comment, and Weinstein did not respond to a request for comment.

Chiu, who was raised in a conservative Christian and Chinese household in a predominantly white area outside London, was uncomfortable with speaking about her experience when a New York Times reporter, Jodi Kantor, initially approached her in 2017 — but not solely because she had signed the agreement.

Silenced by the ‘model minority myth’

Shame and saving face are concepts deeply woven into several Asian cultures, in particular when it comes to how women are socialized to avoid acts that may be perceived as bringing shame to themselves or their families. As Asian American psychology researcher Stanley Sue points out, there’s even specific language for the notions.

“‘Haji’ among Japanese, ‘hiya’ among Filipinos, ‘mianzi’ among Chinese, and ‘chaemyun’ among Koreans are terms that reveal concerns over the process of shame or the loss of face,” he said.

Sung Yeon Choimorrow, executive director of the National Asian Pacific American Women’s Forum, explained that given the culture of shame in the Asian diaspora, Chiu’s act of speaking out is tremendously significant.

“On the outside, Asian American women might look like we’re successful, but the level of shame and isolation that comes with experiencing stigma is so deep, like mental health issues and dealing with violence,” she said. “More Asian American women deal with violence than will let on, whether it’s sexual violence or physical violence or emotional violence, because we are told not to talk about it, we are told not to disrupt, or we don’t know where to go for resources.”

There’s an additional layer of scrutiny when it comes to Asian immigrant families. Choimorrow explained that those from immigrant families are often told “from a very young age to assimilate, don’t bring attention to ourselves” as a means of survival in the new country.

Chiu understands this. “They talk a lot about the legal constraints of speaking out but I think it hasn’t centered around a lot of the personal constraints. I would say for me, personally, those were a lot stronger,” Chiu said. “I hadn’t talked to my family, I hadn’t talked to my husband, I hadn’t talked to my sister, I hadn’t talked to my network of friends. No one from that time in my life in ‘98 knew what really happened to me.”

Chiu eventually detailed her experience in “She Said,” the book by Kantor and Megan Twohey that was published in September.

By contrast, many of her former Miramax colleagues were ready to speak about their experiences with Weinstein when Kantor and Twohey approached them two years ago. Chiu underscored that she was raised to be someone who didn’t speak up, avoided calling attention to herself and never talked back. That presented an especially difficult dilemma when dealing with the trauma and confiding in loved ones about it.

“These are things that are perpetuated when we internalize the model-minority stereotype,” Choimorrow said. “This is what happens when our community internalizes the model minority myth and says, ‘Yes, we have to be those people to get ahead and be successful.’

“It’s not bringing shame to the family, it’s, ‘You are embarrassing us and bringing us shame in front of white people, in front of mainstream America.’”

To this day, Chiu’s parents have not spoken to her about the assault. She has noted, though, that she’s received a great deal of support from the Asian community.

“[White men] expect obedience and submission but if you’re from ‘model minority’ parents who don’t want to make a fuss, you’re in double danger,” Chiu said. “Because you don’t feel like you can stick your neck out or be an unpleasant person.”

She added: “You’re raised as someone who can be nice.”

‘He’d never had a Chinese girl’

In addition to that context, Chiu also found herself surrounded by executives, filmmakers, producers and others in the entertainment industry. She said she was often the only Asian person in the room in the early portion of her career. Looking back, she remembers that racially charged quips and jokes were common. Oftentimes the remarks were well intentioned but the incidents only served to highlight the industry’s lack of cultural and racial sensitivity.

Harvey Weinstein and attorney Benjamin Brafman exit State Supreme Court, on June 5, 2018 in New York. Drew Angerer / Getty Images file

The night Weinstein allegedly attempted to rape her was the first time the then-assistant encountered overt racism while on the job, she said.

In her written account in The New York Times, Chiu described the way Weinstein weaponized her race, diminishing her to a two-dimensional, exotic trope.

“My ethnicity initially marked me as different and inferior: He assured [then-colleague Zelda Perkins] that he wouldn’t harass me because he didn’t, as I remember it, ‘do Chinese or Jewish girls,’” Chiu wrote. “Then later, he turned around and defined me in terms of sexual exoticism, telling me, just before he tried to rape me, that he’d never had a Chinese girl.”

It wasn’t the first time Chiu had heard the “I’ve never had a Chinese woman” line, she said. Like many Asian women living in the West, Chiu said it wasn’t an uncommon comment directed at her in London bars when she was younger. Regardless of who spoke that line or the version they used, she believed the underlying purpose was the same: dehumanization.

“Guys would come up to me and say, ‘I’ve always fancied an Asian woman,’ which is very similar to what Harvey Weinstein said to me,” Chiu said. “What you’re actually saying, but you may not be conscious of, is: ‘Hey, I know you’re an inferior race and I’m doing you a favor by fancying you. There’s a blonde woman I could be talking to but I’m talking to you instead.’”

‘Geishas and prostitutes with hearts of gold’

Experts said the fetishization of Asian women that permeates both barside catcalls and Weinstein’s alleged comments to Chiu is rooted in a toxic mix of imperialism, discriminatory immigration legislation and problematic representations onscreen.

In the 16th and 17th centuries, Europeans imagined the East or the “Orient” as exotic and immoral, Catherine Ceniza Choy, professor of Asian American and Asian Diaspora Studies at the University of California, Berkeley, told NBC News. As European and American colonizers expanded into Asia, they perpetuated ideas of Asian women as attractive, available and submissive, cementing this characterization through postcards and photographs.

Legislation like the Chinese Exclusion Act, which put a 10-year moratorium on Chinese labor immigration, further exacerbated the prevalence of stereotypes by “codifying the foreignness of Asians in America,” Choy said.

And Hollywood didn’t help much with dismantling stereotypes, either, she added.

“Twentieth-century popular culture, especially the stereotyping of Asian women in Hollywood films as dragon ladies, lotus blossoms, geishas and prostitutes with hearts of gold, furthered the reach of these one-dimensional fantasies in more contemporary times,” Choy said.

Chiu noted that not only were Asian women portrayed as sex workers, they were “highly submissive sex workers.”

“The way that they were sexually submissive to a dominant white male, that was an enormous sexual stereotype,” Chiu said of the film industry, where Weinstein was a leader for years. “Whether or not one believes it directly translates into real life, I think dominant white men, who come from that sort of cultural hegemony, absorb those stereotypes consciously or unconsciously.”

Choimorrow agreed. The stigmas attached to Asian women have come at a cost to their safety and equity in sexual situations and beyond, she said.

“The stereotypes play into the culture and assumption about what men feel like they can do with women; objectify and use women at their disposal and at their pleasure,” she said.

According to the Asian Pacific Institute on Gender Based Violence, from 21 to 55 percent of Asian women in the U.S. report experiencing intimate physical and/or sexual violence during their lifetime. The range is based on a compilation of studies of disaggregated samples of Asian ethnicities in local communities. In comparison, 33 percent of women in the U.S. experience sexual violence.

‘Not just a rapist… also racist’

That’s why Weinstein’s alleged invocation of Chiu’s race is not negligible, Choimorrow said, describing the disgraced film executive as “not just a rapist, he is also a racist.” Based on Chiu’s account, Weinstein allegedly targeted Chiu specifically because she is an Asian woman, she added.


“By not talking about the racialized experience of her story, you are erasing the racism that played into her situation. This is often what women of color deal with,” Choimorrow said. “We are often forced to choose or not think about one aspect of our lives.”

While Chiu’s account of her experience was covered by numerous outlets, the majority highlighted the chronological events of the traumatic night Weinstein allegedly tried to attack her during the Venice Film Festival in 1998. Very few directly addressed Chiu’s race from any angle. For the most part, her Asianness was glossed over, or mentioned as an aside.

Perhaps outlets wanted to portray her as an “every-girl,” an “ordinary person who just graduated university with student debt who went up against the most powerful man in Hollywood,” Chiu said. In doing so, the intersectionality of her experience is neglected, according to Choimorrow.

Since coming out with her story, Chiu has spoken about it in front of audiences and with television hosts, exposing more of the issues she grappled with around her alleged assault. But given the dynamics at play surrounding her race and gender, she still worries.

“We don’t know how many silent Asian voices are out there,” she said.

Chiu stressed that sexual assault survivors should only come out if they are comfortable with doing so. In her own experience, “the dread and fear of coming out was worse than the actual coming out,” she said.

“I feared a lot of judgment from my community, from my family, from my culture that didn’t play true,” she said of her journey. “Because in the end, the monsters in your imagination are bigger.”

CORRECTION (Nov. 11, 2019, 10:18 p.m. ET): A previous version of this article misstated where Catherine Ceniza Choy is a professor. She teaches at the University of California, Berkeley, not the University of California, Los Angeles.

— Kimmy Yam

Kimmy Yam is a reporter for NBC Asian America.

Source link:  https://www.nbcnews.com/news/asian-america/rowena-chiu-s-weinstein-allegation-highlights-issue-race-sexual-assault-n1077876

Fall of Berlin Wall: How 1989 reshaped the modern world

World events often move fast, but it is hard to match the pace and power of change in 1989.

It culminated in one of the most famous scenes in recent history – the fall of the Berlin Wall.

The wall came down partly because of a bureaucratic accident but it fell amid a wave of revolutions that left the Soviet-led communist bloc teetering on the brink of collapse and helped define a new world order.

How did the Wall come down?

It was on 9 November 1989, five days after half a million people gathered in East Berlin in a mass protest, that the Berlin Wall dividing communist East Germany from West Germany crumbled.

East German leaders had tried to calm mounting protests by loosening the borders, making travel easier for East Germans. They had not intended to open the border up completely.

The changes were meant to be fairly minor – but the way they were delivered had major consequences.

Notes about the new rules were handed to a spokesman, Günter Schabowski – who had no time to read them before his regular press conference. When he read the note aloud for the first time, reporters were stunned.

“Private travel outside the country can now be applied for without prerequisites,” he said. Surprised journalists clamoured for more details.

Shuffling through his notes, Mr Schabowski said that as far as he was aware, it was effective immediately.

In fact it had been planned to start the next day, with details on applying for a visa.

But the news was all over television – and East Germans flocked to the border in huge numbers.

Harald Jäger, a border guard in charge that evening, told Der Spiegel in 2009 that he had watched the press conference in confusion – and then watched the crowd arrive.

There were emotional scenes as East Berliners entered the West

Mr Jäger frantically called his superiors, but they gave no orders either to open the gate – or to open fire to stop the crowd. With only a handful of guards facing hundreds of angry citizens, force would have been of little use.

“People could have been injured or killed even without shots being fired, in scuffles, or if there had been panic among the thousands gathered at the border crossing,” he told Der Spiegel.

“That’s why I gave my people the order: Open the barrier!”

Thousands flowed through, celebrating and crying, in scenes beamed around the world. Many climbed the wall at Berlin’s Brandenburg Gate, chipping away at it with hammers and pickaxes.

A turbulent year had reached a climax.

Why did the Wall come down?

After World War Two, Europe was carved up by the Soviet Union and its former Western allies, and the Soviets gradually erected an “Iron Curtain” splitting the East from the West.

Defeated Germany was divided up by the occupying powers – the US, UK, France and the USSR – with the eastern part occupied by the Soviets. East Germany, officially known as the German Democratic Republic, became the Soviet Union’s foothold in Western Europe.

But Berlin was split four ways, with British, French and American zones in the west of the city and a Soviet zone in the east. West Berlin became an island surrounded by communist East Germany.

The wall was eventually built in 1961 because East Berlin was haemorrhaging people to the West.

By the 1980s, the Soviet Union faced acute economic problems and major food shortages, and when a nuclear reactor at the Chernobyl power station in Ukraine exploded in April 1986, it was a symbolic moment in the impending collapse of the communist bloc.

Mikhail Gorbachev, the comparatively young Soviet leader who took power in 1985, introduced a reform policy of “glasnost” (openness) and “perestroika” (restructuring).

But events moved far faster than he could have foreseen.

Revolutionary wave

Reform movements were already stirring in the communist bloc. Years of activism and strikes in Poland culminated in its ruling communist party voting to legalise the banned Solidarity trade union.

By February 1989, Solidarity was in talks with the government, and partially free elections in the summer saw it capture seats in parliament. Though the Communists retained a quota of seats, Solidarity swept the board wherever it was allowed to stand.

Poland’s Solidarity movement was successful in partially free elections

Hungarians, too, launched mass demonstrations for democracy in March. In May, 150 miles (240km) of barbed wire were dismantled along the border with Austria – the first chink in the Iron Curtain. Hungary’s 1956 revolution was brutally suppressed by the Soviets, but this was succeeding.

By August, the revolutionary wave had truly re-ignited on the fringes. Two million people across Estonia, Latvia and Lithuania – then part of the Soviet Union – held one of the most memorable demonstrations of the so-called Singing Revolution when they formed a 370-mile (600km) human chain across the Baltic republics calling for independence.

Many East Germans were overcome by emotion as they crossed into Austria

In the heat of August, Hungary opened its borders to Austria in the west, allowing East German refugees an escape.

The Iron Curtain was buckling.

Czechoslovakia, whose push for liberalising reform had been brutally suppressed in 1968, provided another means of escape. East Germans could travel to the neighbouring socialist nation without restriction, and began to flood the West German embassy there by the hundreds, eventually being evacuated to the West by train.

East Germany ended up closing its border with Czechoslovakia in October to stem the tide.

But by then the revolution had spread to East Germany itself.

East Germany rebels

It began with demonstrators rallying for freedom in the centre of the city of Leipzig.

On 9 October, within days of East Germany celebrating its 40th anniversary, 70,000 people took to the streets.

There were calls for free elections from West Germany, and talk of reform from East Germany’s new communist leader Egon Krenz. No-one knew the fall of the Wall was weeks away.

In late October parliament in Hungary, which had been among the first to hold mass demonstrations, adopted legislation providing for direct presidential elections and multi-party parliamentary elections.

And then on 31 October, the numbers demanding democracy in East Germany swelled to half a million. Mr Krenz flew to Moscow for meetings – he recently told the BBC that he had been assured German reunification was not on the agenda.


On 4 November, a month after the East German protests had begun, around half a million people gathered in Alexanderplatz in the heart of East Berlin.

Three days later, the government resigned. But there was no intention to give way to democracy and Egon Krenz remained head of the Communist Party and the country’s de facto leader.

He would not be there long. Five days later, Mr Schabowski gave his world-changing press conference.

Why didn’t the Soviets use force?

Earlier in ’89, Beijing demonstrators in Tiananmen Square who had called for democracy in China were crushed in a major military crackdown.

The USSR had used its military to put down rebellions before. So why not now?

Within the Soviet Union itself, it did, killing 21 pro-independence protesters in the Soviet republic of Georgia. But elsewhere in the communist bloc, they did not.

In a break with Soviet policy, Mikhail Gorbachev decided against using the threat of military might to quell mass demonstrations and political revolution in neighbouring countries.

“We now have the Frank Sinatra doctrine,” foreign ministry spokesman Gennady Gerasimov told US television. “He has a song, ‘I Did It My Way.’ So every country decides on its own which road to take.”

A new chapter in European history

On 3 December, Mr Gorbachev and US President George HW Bush sat side by side in Malta, and released a statement saying the Cold War between the two powers was coming to a close.

More than half a million people gathered in Prague for this November 1989 demonstration as Czechoslovak communism was overthrown

The 1989 wave of revolutions was not over yet.

Student demonstrators in Prague clashed with police, triggering the Velvet Revolution which overthrew Czechoslovak communism within weeks.

In Romania, demonstrations ended in violence and saw the fall of communist dictator Nicolae Ceausescu. A new government took over as the ousted leader fled his palace and angry crowds stormed it.

The Romanian revolution was the only one in Eastern Europe that year that saw bloodshed

He and his wife Elena were captured and executed on Christmas Day. More than 1,000 people were killed in unrest before and after the revolution, setting Romania apart from the largely bloodless events elsewhere.

Postscript to 1989

And the Soviet Union itself?

In 1990, Latvia, Lithuania and Estonia took advantage of their new-found political freedoms to vote out their communist governments and make moves towards independence. The Soviet Union was falling apart, but Mr Gorbachev made one last ill-fated attempt to reform it by calling together the leaders of the 15 Soviet republics.

Hardline communists opposed to his reforms pre-empted him, attempting a coup while he was on holiday in Crimea in August 1991 and putting him under house arrest.

The coup was defeated in three days as pro-democracy forces rallied round Boris Yeltsin, president of the Russian republic.

But it was the death knell for the USSR, and one by one its constituent republics declared independence. By the end of the year the Soviet flag had flown for the last time.

Source Link: https://www.bbc.com/news/world-europe-50013048

Teens on TikTok have no clue they’re perpetuating racist stereotypes

By Brianna Holt 
TikTok is stoking a pop culture phenomenon rooted in a terrible history.

 

When TikTok launched in 2016, the Chinese app had to carve out a space alongside already popular video-sharing platforms like Instagram, Musical.ly, and Dubsmash. Just two years later, TikTok became the world’s most-downloaded app, surpassing Instagram in 2018.

TikTok is known for its trending internet challenges—like the Haribo Challenge, Fake Travel Challenge, and Raindrop Challenge—with the stunts oftentimes screen-recorded and then posted to other social media platforms like Twitter and Facebook. The Chinese-built app also has created a new wave of internet personas, like E-girls and E-boys. But if TikTok is a place where internet memes with teenage appeal get turned into videos featuring real-life teens, it’s also a place where the phenomenon of white teens perpetuating racist stereotypes is on the rise.

Blackface without the face paint

Videos from TikTok are surfacing all over the internet, oftentimes featuring white teens imitating stereotypical lifestyles or characteristics of black people or other people of color. As they nonchalantly change their accents, use appropriated slang terms, and demonstrate certain mannerisms for comedy, it’s obvious there is a gap in their understanding of, and respect for, different cultures. Videos of mostly young white teens portraying fictitious minority characters for the mere purpose of entertainment aren’t only cringe-worthy, offensive, and weird—they perpetuate racist cliches.

A plethora of young white women like Woah Vicky, who masquerade as black women on Instagram, have made names for themselves on social media for their heightened cultural appropriation. It’s not altogether different from what happened with Rachel Dolezal, the white woman who passed as a black woman for years and held leadership positions in black community organizations. While the videos populating TikTok tend not to show teens wearing blackface or blatantly referring to themselves as black people, their stars are taking everything but the burden of what it is to be black in America while simultaneously using black culture as a way to grow their own social following.

A deep-rooted history

The obsession with black culture by white people has been an uncomfortably bizarre phenomenon for decades, but portrayals of black people by white people for entertainment purposes go back even further. Blackface has many forms, but we typically only associate it with non-black people using makeup to portray a black person. To understand how the perhaps non-malicious but unconsciously racist trend of imitating or pretending to be black on social media, without painting your face, is also a form of blackface, one must first understand the history of blackface and its relationship to white identity.

Emerging in the US in the 1820s, blackface often appeared in minstrel shows that depicted people of African descent in comical forms. After the Civil War, when racial tensions were especially heightened, blackface became crueler than ever and was often performed at “coon shows.” During these minstrel shows, black people were portrayed as lazy, stupid, ignorant, criminal, and hyper-sexual. The impact of these shows has lasted for decades, creating harmful stereotypes widely seen in advertising, propaganda, literature, and film. Jim Crow, which inspired the name given to the Jim Crow laws of the American South, was actually one of the first fictional blackface characters recorded in popular culture, often paired with exaggerated African American jargon, painted-on large lips, and unintelligent behavior.

The cultural dynamics grew even more complicated in the early 20th century, when people from other ethnic groups began using blackface either to assert their social rank over that of black people, or in a bid for acceptance by other white people. Irish, Italian, and Jewish performers, for example, used it to signal that they, too, deserved the privileges of being white in America, and to defuse the ethnic tensions directed at them. In his book Love & Theft: Blackface Minstrelsy and the American Working Class, the cultural historian Eric Lott describes blackface as an act that “assuaged an acute sense of insecurity by indulging feelings of racial superiority.” European immigrants needed to prove their whiteness, and what better way to do so than to show that they were not as low on the social ladder as African Americans?

It was also around this time that white women could be found using blackface as a way to get into show business, oftentimes singing in black dialect and acting like black women in their roles. In some instances, there was an underlying sense of appreciation for black culture by those who put on blackface. Actors and jazz musicians recognized the talent of black artists and aspired to match their aptitude. But they simultaneously mocked them, creating a strange combination of obsession and bigotry.

More recently, when public figures like Virginia governor Ralph Northam and his state attorney general Mark Herring were exposed for having worn blackface as undergraduates, the internet shamed them and called for the cancellation of their political careers. Meanwhile, when teens on TikTok act as if they are black, with their made-up mannerisms, dialects, and jargon, we call it a trend.

But what is the difference between their portrayals and those of the actors in minstrel shows? Where is the outrage that followed the revelations about the college antics of our elected officials? All of these groups mocked a community they were not a part of, for their own personal gain or as a form of entertainment.

Social media meets segregation

Is TikTok specifically responsible for the rise in digital blackface? Not exactly. The more likely culprit is mass-media consumption, combined with stubbornly segregated schools and neighborhoods.

According to a report from UCLA’s Civil Rights Project, in 2016, 40% of African American students in the US were in schools with 90% or more students of color. This isn’t just the legacy of racism in the US South; the UCLA group finds that New York is consistently one of the most segregated states in the nation.

Access to other cultural groups can be found online, of course. But that access is limited and rarely a direct educational exchange, so it often inhibits, rather than cultivates, a deeper understanding of other groups. Many teens learn about other cultures from the media they constantly consume rather than from real-life relationships and friendships with people who belong to the cultures they’re tapping into. As a result of this real-life segregation paired with constant access to social media, young people are not only unconsciously perpetuating racist stereotypes, they’re also appearing foolish to millions of people online in the process.

For example, in two videos (one and two) that have gone viral on social media, several young white people are seen throwing up gang signs, seemingly unknowingly, as a funny trend. It can be assumed that they saw these signs somewhere online, thought they were cool, and taught them to their friends. They may well know nothing of the signs’ meaning or connotations, context that would probably be supplied in a more diverse circle. But who is there to tell them the actual meaning of what they’re doing if their schools, neighborhoods, and social circles are not diverse?

More work to do

It’s not enough for us to assume that Gen Z is the most “woke” generation or that its access to different people online will dissolve racism. While many of these teens would never dare paint their faces black, and probably wouldn’t hesitate to call out someone online who did, they clearly don’t see the similarity in their own “comical” recorded actions. They may not have ill intentions, but it is apparent that their understanding of racial stereotypes has not evolved.

Moreover, if these young TikTok users believe the stereotypes they see in movies, TV shows, or even on their social media apps are an accurate representation of black people as a whole, then that’s a huge signifier that upcoming generations are not as integrated as we think.

Optimistically, the abundance of TikTok users obsessed with “acting black” perhaps suggests that black culture is cool, exciting, and something a lot of people wish to be a part of but don’t know how to engage with appropriately. But these teens are making the mockery of other cultures a part of their own culture. And even if the intent is not racist or harmful, the legacy of blackface ensures that the effect is.

Tales From the Teenage Cancel Culture

What’s cancel culture really like? Ask a teenager. They know.

By Sanam Yar and Jonah Engel Bromwich

A few weeks ago, Neelam, a high school senior, was sitting in class at her Catholic school in Chicago. After her teacher left the room, a classmate began playing “Bump N’ Grind,” an R. Kelly song.

Neelam, 17, had recently watched the documentary series “Surviving R. Kelly” with her mother. She said it had been “emotional to take in as a black woman.”

Neelam asked the boy and his cluster of friends to stop playing the track, but he shrugged off the request. “‘It’s just a song,’” she said he replied. “‘We understand he’s in jail and known for being a pedophile, but I still like his music.’”

She was appalled. They were in a class about social justice. They had spent the afternoon talking about Catholicism, the common good and morality. The song continued to play.

That classmate, who is white, had done things in the past that Neelam described as problematic, like casually using racist slurs — not name-calling — among friends. After class, she decided he was “canceled,” at least to her.

Her decision didn’t stay private; she told a friend that week that she had canceled him. She told her mother too. She said that this meant she would avoid speaking or engaging with him in the future, that she didn’t care to hear what he had to say, because he wouldn’t change his mind and was beyond reason.

“When it comes to cancel culture, it’s a way to take away someone’s power and call out the individual for being problematic in a situation,” Neelam said. “I don’t think it’s being sensitive. I think it’s just having a sense of being observant and aware of what’s going on around you.”

The term “canceled” “sort of spawned from YouTube,” said Ben, a high school junior in Providence, R.I. (Because of their age and the situations involved, The New York Times has granted partial anonymity to some people. We have confirmed details with parents or schoolmates.)

He talked about the YouTuber James Charles, who was canceled by the platform’s beauty community in May after some drama with his mentor, Tati Westbrook, also a YouTuber, and a vitamin entrepreneur. That was a big cancellation, widely covered, that helped popularize the term. Teenagers often bring it up.

Ben, 17, said that people should be held accountable for their actions, whether they’re famous or not, but that canceling someone “takes away the option for them to learn from their mistakes and kind of alienates them.”

His school doesn’t have much bullying, he said, and the word carries a gentler meaning in its hallways, used in passing to tease friends. Often, the joke extends beyond people. One week, after students were debating the safety of e-cigarettes and vaping, some declared that Juul was canceled.

It took some time for L to understand that she had been canceled. She was 15 and had just returned to a school she used to attend. “All the friends I had previously had through middle school completely cut me off,” she said. “Ignored me, blocked me on everything, would not look at me.”

Months went by. Toward the end of sophomore year, she reached out over Instagram to a former friend, asking why people were not talking to her. It was lunchtime; the person she asked was sitting in the cafeteria with lots of people and so they all piled on. It was like an avalanche, L said.

Within a few minutes she got a torrent of direct messages from the former friend on Instagram, relaying what they had said. One said she was a mooch. One said she was annoying and petty. One person said that she had ruined her self-esteem. Another said that L was an emotional leech who was thirsty for validation.

“This put me in a situation where I thought I had done all these things,” L said. “I was bad. I deserved what was happening.”

Two years have passed since then. “You can do something stupid when you’re 15, say one thing and 10 years later that shapes how people perceive you,” she said. “We all do cringey things and make dumb mistakes and whatever. But social media’s existence has brought that into a place where people can take something you did back then and make it who you are now.”

In her junior year, L said, things got better. Still, that rush of messages and that social isolation have left a lasting impact. “I’m very prone to questioning everything I do,” she said. “‘Is this annoying someone?’ ‘Is this upsetting someone?’”

“I have issues with trusting perfectly normal things,” she said. “That sense of me being some sort of monster, terrible person, burden to everyone, has stayed with me to some extent. There’s still this sort of lingering sense of: What if I am?”

Alex is 17, and she hears the word “canceled” every day at her high school outside Atlanta. It can be a joke, but it can also suggest that an offending person won’t be tolerated again. Alex thinks of it as a permanent label. “Now they’ll forever be thought of as that action, not for the person they are,” she said.

“It’s not like you’ll sit away from them at lunch or something,” she said. “It’s just a lingering thought in the back of your mind, a negative connotation.”

During a mock trial practice a couple of weeks before a big competition, the song “Act Up” by City Girls was playing. One of Alex’s teammates, who is of Indian descent, rapped along with the lyrics, which include a racist slur.

The students, who until that point had been chatty because their teacher wasn’t in the room, went silent. “I was the only black person in the room,” Alex said.

Alex and another friend on the team explained to their teammate why he shouldn’t have used that word. “We’re a team, so we can’t have tension exist there,” she said.

He said he understood why they were uncomfortable but that it wouldn’t necessarily prevent him from using it again when singing along. He wouldn’t take it back.

“You’re canceled, sis,” her friend told the teammate. It was partly to lighten the mood, but also partly serious.

“It’s a joke, but still, we understand you have that opinion now and we’re not going to get closer,” Alex said.

Despite his initial tough stance, the teammate didn’t rap the word again, and Alex said that he had remained respectful during practice. The team took ninth and 11th place at the competition.

It was orientation day for freshmen at Sarah Lawrence College, where one new student was unnerved by a social justice group’s presentation. The presenters discussed pronoun use and called on the entering freshmen to “‘battle heteronormativity and cisgender language,’” the student said.

Even if you accidentally misgendered someone, the new students were told, you needed to be either called out or called in. (“Called in” means to be gently led to understand your error; call-outs are more aggressive.) The presenters emphasized that the impact on the person who was misgendered was what mattered, regardless of the intent of the person who had misgendered them.

The freshman thought back to a time when her father had misgendered a friend of hers. Her father had asked her to apologize on his behalf. She did. “‘I only get mad when people intentionally try to misgender me because they feel like they have to correct who I am,’” she recalled her friend saying.

Sarah Lawrence has fewer than 1,500 undergraduates. One upperclassman she became friends with said that she had been canceled in her own freshman year.

But, this upperclassman said, the politics enforced through cancellation don’t always fit neatly into the social dynamics of college.

“I think where it loses me, we’re taking these systems that are applying huge abstract ideas of identity’s role and we’re shrinking it into these interpersonal, one-on-one, liberal arts things,” the upperclassman said.

Among the upperclassman’s friend group now, the idea of cancellation is “basically a joke.” Too many people had been canceled. At a recent party the upperclassman had attended, one guy said, “‘If you haven’t been canceled, you’re canceled.’”

One night during Mike’s freshman year at a New York state college, he and a group of friends were headed to a party downtown. As they were waiting for their Uber, someone cracked a political joke, and then the casual conversation turned confrontational. One of Mike’s friends asked his roommate, D, if he was a Trump supporter.

D had a history of making the group uncomfortable. Mike and their mutual friend Phoebe said that he would make sexist, homophobic and racist remarks in past hangouts.

D said he did support the president — an anomaly in their liberal friend group — and “blew up” at the friend who asked the question. When the friend tried to change the subject, he became more upset. Mike stepped between the two to defuse the situation. “He got in my friend’s face, and that was the last straw,” Mike said.

He tried to cool D down; it didn’t work. D called Mike a homophobic slur, multiple times. The group split up. Mike didn’t return to his dorm that night, staying at a friend’s place instead.

“Even before this, we could tell, if I weren’t roommates with him, we wouldn’t have been friends,” Mike said. “So that was the breaking point for me, him saying that when I was sticking up for him.”

D left an apology note on Mike’s desk, which mostly tried to “justify his actions,” Mike said. “That set in my mind that he didn’t really feel bad about what he did,” he said. “He just felt bad for himself, that he would be looked at in a different light.”

A couple of days later, Phoebe, Mike and D sat down and D repeated the apology. Phoebe and Mike heard him out but said it didn’t clear him of wrongdoing and that he would have to demonstrate that he was different now. Both said that while D appeared sad about losing his friends, tearing up during their discussion, he didn’t show remorse.

Other friends didn’t accept the apology. “We wouldn’t tolerate it anymore, we cut him out of our lives,” Phoebe said.

Thus canceled, D moved from sadness to frustration and anger, Phoebe said. He grew “very bitter,” she said. She noticed that he had unfollowed and blocked the group on Snapchat and other social media a few weeks later.

“He did feel bullied by this whole canceled idea,” she said. “But in this case, no one felt bad doing it, because he didn’t really take responsibility for a lot of the things he said.”

Mike, though, still lives with D. He had signed on to live with him before the ordeal. They don’t speak. D has stopped acknowledging Mike and most everyone from their old group. “I’m definitely not living with him next year,” Mike said.

Phoebe managed to keep things civil. “Every time we see him, I still say hi,” she said. Sometimes, but not always, he nods or says hi back.

Original article: https://www.nytimes.com/2019/10/31/style/cancel-culture.html