Reddit and the Struggle to Detoxify the Internet
Still, Ohanian and Huffman never took their own rhetoric too literally. The site’s rules were brief and vague, and their unwritten policy was even simpler. “We always banned people,” Huffman told me. “We just didn’t talk about it very much.” Because Reddit was so small, and misbehavior relatively rare, Huffman could do most of the banning himself, on an ad-hoc basis. “It wasn’t well thought out or even articulated, really. It was ‘That guy has the N-word in his username? Fuck that.’ Delete account.”
As C.E.O., Huffman continued the trend Pao had started, banning a few viciously racist subreddits such as r/Coontown. “There was pushback,” Huffman told me. “But I had the moral authority, as the founder, to take it in stride.” If Pao was like a forbearing parent, then Huffman’s style was closer to “I brought you into this world, and I can take you out of it.” “Yes, I know that it’s really hard to define hate speech, and I know that any way we define it has the potential to set a dangerous precedent,” he told me. “I also know that a community called Coontown is not good for Reddit.” In most cases, Reddit didn’t suspend individual users’ accounts, Huffman said: “We just took away the spaces where they liked to hang out, and went, ‘Let’s see if this helps.’ ”
The first morning I visited the office, I ran into Huffman, who was wearing jeans, a T-shirt, and Adidas indoor-soccer shoes, as he tried to persuade an employee to buy a ticket to Burning Man. Huffman is far more unfiltered than other social-media executives, and every time he and I talked in the presence of Reddit’s head of P.R., he said at least one thing that made her wince. “There’s only one Steve,” Ohanian told me. “No matter when you catch him, for better or worse, that’s the Steve you’re gonna get.” I had a list of delicate topics that I planned to ask Huffman about eventually, including allegations of vote manipulation on Reddit’s front page and his personal feelings about Trump. Huffman raised all of them himself on the first day. “My political views might not be exactly what you’d predict,” he said. “I’m a gun owner, for example. And I don’t care all that much about politics, compared to other things.” He speaks in quick bursts, with an alpha-nerd combination of introversion and confidence. His opinion about Trump is that he is incompetent and that his Presidency has mostly been a failure. But, he told me, “I’m open to counterarguments.”
That afternoon, I watched Huffman make a sales pitch to a group of executives from a New York advertising agency. Like many platforms, Reddit has struggled to convert its huge audience into a stable revenue stream, and its representatives spend a lot of time trying to convince potential advertisers that Reddit is not hot garbage. Huffman sat at the head of a long table, facing a dozen men and women in suits. The “snarky, libertarian” ethos of early Reddit, he said, “mostly came from me as a twenty-one-year-old. I’ve since grown out of that, to the relief of everyone.” The executives nodded and chuckled. “We had a lot of baggage,” he continued. “We let the story get away from us. And now we’re trying to get our shit together.”
Later, Huffman told me that getting Reddit’s shit together would require continual intervention. “I don’t think I’m going to leave the office one Friday and go, ‘Mission accomplished—we fixed the Internet,’ ” he said. “Every day, you keep visiting different parts of the site, opening this random door or that random door—‘What’s it like in here? Does this feel like a shitty place to be? No, people are generally having a good time, nobody’s hatching any evil plots, nobody’s crying. O.K., great.’ And you move on to the next room.”
In retrospect, although Facebook denies this, it seems clear that the company was preparing for a blow that was about to land. On February 16th, the special counsel Robert Mueller filed an indictment against several Russian individuals and businesses, including the Internet Research Agency, a company aligned with the Kremlin. The indictment mentioned Facebook thirty-five times, and not in ways that made the platform seem like a “force for good in democracy.” According to recent reporting by the Daily Beast, the Internet Research Agency also seeded Reddit with disinformation during the 2016 election. (A group of impostors even tried to set up an A.M.A.) Last Monday, the Washington Post reported that the Senate Intelligence Committee will question Reddit executives about this; the same day, Huffman admitted that the company had “found and removed a few hundred accounts” associated with Russian propaganda. (A Reddit representative told me that the company has been coöperating with congressional investigators “for months,” although they haven’t spoken about it publicly.) As in all such disinformation campaigns, the Russians did not act alone: their messages were upvoted and repeated by thousands of unsuspecting Americans. “I believe the biggest risk we face as Americans is our own ability to discern reality from nonsense,” Huffman wrote. “I wish there was a solution as simple as banning all propaganda, but it’s not that easy.”
Zuckerberg recently set a “personal challenge” for himself: “enforcing our policies and preventing misuse of our tools.” This seems to be a reversal for Zuckerberg, who was once a fake-news truther. Two days after the 2016 election, he said, “The idea that fake news on Facebook, of which it’s a very small amount of the content, influenced the election in any way, I think, is a pretty crazy idea. Voters make decisions based on their lived experience.” This was a pretty crazy idea, and Zuckerberg has been walking it back ever since. It’s obvious that what we see online affects how we think and feel. We know this in part because Facebook has done research on it. In 2012, without notice or permission, Facebook tweaked the feeds of nearly seven hundred thousand of its users, showing one group more posts containing “positive emotional content” and the other more “negative emotional content.” Two years later, Facebook declassified the experiment and published the results. Users were livid, and, after that, Facebook either stopped conducting secret experiments or stopped admitting to them. But the results of the experiment were clear: the people with happier feeds acted happier, and vice versa. The study’s authors called it “massive-scale emotional contagion.” Since then, social media has only grown in size and influence, and the persuasive tools available to advertisers, spies, politicians, and propagandists have only become sharper. During the 2016 election, a few Russian impostors affected many Americans’ beliefs and, presumably, votes. With another election coming up, most of the loopholes that the Russians exploited have not been closed, and the main loophole—the open, connected, massively contagious world of social media—might not be closable.
When I raised this issue with Huffman over dinner last summer, he said, “I go back and forth on whether Reddit is the tail or the dog. I think it’s a bit of both.” First, he laid out the tail hypothesis: “Reddit is a reflection of reality. People are enthusiastic about Bernie or Trump in real life, so they go on Reddit and talk about how much they like Bernie or Trump. So far, so good.” Then he laid out the dog hypothesis, which his fellow social-media executives almost never acknowledge—that reality is also a reflection of social media. “All sorts of weird things can happen online,” he said. “Imagine I post a joke where the point is to be offensive—like, to imply, ‘This is something that a racist person would say’—but you misread the context and think, ‘Yeah, that racist guy has a good point.’ That kind of dynamic, I think, explains a lot of what happened on The_Donald, at least in the early days—someone keeps pushing a joke or a meme to see how far they can take it, and the answer turns out to be Pretty fucking far.”
Leftist communities on Reddit often implore the company to ban The_Donald. So far, Huffman has demurred. “There are arguments on both sides,” he said, “but, ultimately, my view is that their anger comes from feeling like they don’t have a voice, so it won’t solve anything if I take away their voice.” He thought of something else to say, but decided against it. Then he took a swig of beer and said it anyway. “I’m confident that Reddit could sway elections,” he told me. “We wouldn’t do it, of course. And I don’t know how many times we could get away with it. But, if we really wanted to, I’m sure Reddit could have swayed at least this election, this once.” That’s a terrifying thought. It’s also almost certainly true.
Early the next week, Reddit banned Physical_Removal. In Charlottesville, James Alex Fields, one of the white nationalists, had driven a car into a crowd of counterprotesters, injuring nineteen and killing a woman named Heather Heyer. “This is a good thing,” the top post on Physical_Removal read. “They are mockeries of life and need to fucking go.” Reddit had a rule prohibiting content that “encourages or incites violence,” and this was a violation of that rule. Huffman said, “We’d had our eye on that community for a while, and it felt good to get rid of them, I have to say. But it still didn’t feel like enough.”
“Encouraging or inciting violence” was a narrow standard, and Huffman and his team agreed to expand it. Four words became thirty-six: “Do not post content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or a group of people; likewise, do not post content that glorifies or encourages the abuse of animals.” This, too, required interpretation, and forced the company to create a non-exhaustive list of exceptions (“educational, newsworthy, artistic, satire, documentary”). Still, it made the team’s intentions clearer. Jessica Ashooh, Reddit’s head of policy, spent four years as a policy consultant in Abu Dhabi. “I know what it’s like to live under censorship,” she said. “My internal check, when I’m arguing for a restrictive policy on the site, is Do I sound like an Arab government? If so, maybe I should scale it back.” On the other hand, she said, “people hide behind the notion that there’s a bright line between ideology and action, but some ideologies are inherently more violent than others.”
In October, on the morning the new policy was rolled out, Ashooh sat at a long conference table with a dozen other employees. Before each of them was a laptop, a mug of coffee, and a few hours’ worth of snacks. “Welcome to the Policy Update War Room,” she said. “And, yes, I’m aware of the irony of calling it a war room when the point is to make Reddit less violent, but it’s too late to change the name.” The job of policing Reddit’s most pernicious content falls primarily to three groups of employees—the community team, the trust-and-safety team, and the anti-evil team—which are sometimes described, respectively, as good cop, bad cop, and RoboCop. Community stays in touch with a cross-section of redditors, asking them for feedback and encouraging them to be on their best behavior. When this fails and redditors break the rules, trust and safety punishes them. Anti-evil, a team of back-end engineers, makes software that flags dodgy-looking content and sends that content to humans, who decide what to do about it.