Every day on social media we see headlines about celebrities dating, breaking up, marrying, or divorcing, and in the comments beneath them, alongside ordinary reactions, plenty of "keyboard warriors" show up. Journalist Andrew Marantz spent three years deep inside the world of internet trolls and online propagandists, seeking out the people who push fringe topics into the heart of online conversation and trying to understand how they spread their ideas. He found that the more negative a message is, the more easily it stirs emotion, attracts attention, and spreads widely.
I spent the past three years talking to some of the worst people on the internet. Now, if you've been online recently, you may have noticed that there's a lot of toxic garbage out there: racist memes, misogynist propaganda, viral misinformation. So I wanted to know who was making this stuff. I wanted to understand how they were spreading it. Ultimately, I wanted to know what kind of impact it might be having on our society.
So in 2016, I started tracing some of these memes back to their source, back to the people who were making them or who were making them go viral. I'd approach those people and say, "Hey, I'm a journalist. Can I come watch you do what you do?" Now, often the response would be, "Why in hell would I want to talk to some low-t soy-boy Brooklyn globalist Jew cuck who's in cahoots with the Democrat Party?"
To which my response would be, "Look, man, that's only 57 percent true."
But often I got the opposite response. "Yeah, sure, come on by." So that's how I ended up in the living room of a social media propagandist in Southern California. He was a married white guy in his late 30s. He had a table in front of him with a mug of coffee, a laptop for tweeting, a phone for texting and an iPad for livestreaming to Periscope and YouTube. That was it. And yet, with those tools, he was able to propel his fringe, noxious talking points into the heart of the American conversation.
For example, one of the days I was there, a bomb had just exploded in New York, and the guy accused of planting the bomb had a Muslim-sounding name. Now, to the propagandist in California, this seemed like an opportunity, because one of the things he wanted was for the US to cut off almost all immigration, especially from Muslim-majority countries.
So he started livestreaming, getting his followers worked up into a frenzy about how the open borders agenda was going to kill us all and asking them to tweet about this, and use specific hashtags, trying to get those hashtags trending. And tweet they did -- hundreds and hundreds of tweets, a lot of them featuring images like this one.
So that's George Soros. He's a Hungarian billionaire and philanthropist, and in the minds of some conspiracists online, George Soros is like a globalist bogeyman, one of a few elites who are secretly manipulating all of global affairs.
Now, just to pause here: if this idea sounds familiar to you, that there are a few elites who control the world and a lot of them happen to be rich Jews, that's because it is one of the most anti-Semitic tropes in existence. I should also mention that the guy in New York who planted that bomb, he was an American citizen. So whatever else was going on there, immigration was not the main issue.
And the propagandist in California, he understood all this. He was a well-read guy. He was actually a lawyer. He knew the underlying facts, but he also knew that facts do not drive conversation online. What drives conversation online is emotion.
See, the original premise of social media was that it was going to bring us all together, make the world more open and tolerant and fair ... And it did some of that. But the social media algorithms have never been built to distinguish between what's true or false, what's good or bad for society, what's prosocial and what's antisocial. That's just not what those algorithms do.
A lot of what they do is measure engagement: clicks, comments, shares, retweets, that kind of thing. And if you want your content to get engagement, it has to spark emotion, specifically, what behavioral scientists call "high-arousal emotion."
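To make that concrete, here is a deliberately simplified toy sketch, not any platform's actual code, of what ranking purely by engagement looks like: every click, comment, share, and retweet counts the same, whether people loved the post or hated it, and the algorithm has no notion of truth or social value.

```python
# Toy illustration of engagement-only ranking (hypothetical, not a real
# platform's algorithm): interactions are summed with no regard for
# whether they signal approval, outrage, or misinformation.

def engagement_score(post):
    # Every interaction counts equally, love it or hate it.
    return (post["clicks"] + post["comments"]
            + post["shares"] + post["retweets"])

def rank_feed(posts):
    # The most "engaging" content rises to the top, outrage included.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "calm_explainer", "clicks": 120, "comments": 4,
     "shares": 2, "retweets": 1},
    {"id": "outrage_bait", "clicks": 90, "comments": 300,
     "shares": 150, "retweets": 400},
])
print([p["id"] for p in feed])  # ['outrage_bait', 'calm_explainer']
```

The outrage post "wins" even though fewer people clicked it, because angry comments and quote-tweets count just as much as anything else, which is exactly the incentive the propagandists exploit.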
Now, "high arousal" doesn't only mean sexual arousal, although it's the internet, obviously that works. It means anything, positive or negative, that gets people's hearts pumping. So I would sit with these propagandists, not just the guy in California, but dozens of them,
and I would watch as they did this again and again successfully, not because they were Russian hackers, not because they were tech prodigies, not because they had unique political insights -- just because they understood how social media worked, and they were willing to exploit it to their advantage.
Now, at first I was able to tell myself this was a fringe phenomenon, something that was relegated to the internet. But there's really no separation anymore between the internet and everything else. This is an ad that ran on multiple TV stations during the 2018 congressional elections, alleging with very little evidence that one of the candidates was in the pocket of international manipulator George Soros, who is awkwardly photoshopped here next to stacks of cash.
This is a tweet from the President of the United States, alleging, again with no evidence, that American politics is being manipulated by George Soros. This stuff that once seemed so shocking and marginal and, frankly, just ignorable, it's now so normalized that we hardly even notice it.
So I spent about three years in this world. I talked to a lot of people. Some of them seemed to have no core beliefs at all. They just seemed to be betting, perfectly rationally, that if they wanted to make some money online or get some attention online, they should just be as outrageous as possible.
But I talked to other people who were true ideologues. And to be clear, their ideology was not traditional conservatism. These were people who wanted to revoke female suffrage. These were people who wanted to go back to racial segregation. Some of them wanted to do away with democracy altogether.
Now, obviously these people were not born believing these things. They didn't pick them up in elementary school. A lot of them, before they went down some internet rabbit hole, they had been libertarian or they had been socialist or they had been something else entirely. So what was going on?
Well, I can't generalize about every case, but a lot of the people I spoke to, they seem to have a combination of a high IQ and a low EQ. They seem to take comfort in anonymous, online spaces rather than connecting in the real world. So often they would retreat to these message boards or these subreddits, where their worst impulses would be magnified.
They might start out saying something just as a sick joke, and then they would get so much positive reinforcement for that joke, so many meaningless "internet points," as they called it, that they might start believing their own joke.
I talked a lot with one young woman who grew up in New Jersey, and then after high school, she moved to a new place and suddenly she just felt alienated and cut off and started retreating into her phone. She found some of these spaces on the internet where people would post the most shocking, heinous things.
And she found this stuff really off-putting but also kind of engrossing, kind of like she couldn't look away from it. She started interacting with people in these online spaces, and they made her feel smart, they made her feel validated. She started feeling a sense of community, started wondering if maybe some of these shocking memes might actually contain a kernel of truth.
A few months later, she was in a car with some of her new internet friends headed to Charlottesville, Virginia, to march with torches in the name of the white race. She'd gone, in a few months, from Obama supporter to fully radicalized white supremacist.
Now, in her particular case, she actually was able to find her way out of the cult of white supremacy. But a lot of the people I spoke to were not. And just to be clear: I was never so convinced that I had to find common ground with every single person I spoke to that I was willing to say,
"You know what, man, you're a fascist propagandist, I'm not, whatever, let's just hug it out, all our differences will melt away." No, absolutely not. But I did become convinced that we cannot just look away from this stuff. We have to try to understand it, because only by understanding it can we even start to inoculate ourselves against it.
In my three years in this world, I got a few nasty phone calls, even some threats, but it wasn't a fraction of what female journalists get on this beat. And yeah, I am Jewish, although, weirdly, a lot of the Nazis couldn't tell I was Jewish, which I frankly just found kind of disappointing.
Seriously, like, your whole job is being a professional anti-Semite. Nothing about me is tipping you off at all? Nothing?
This is not a secret. My name is Andrew Marantz, I write for "The New Yorker," my personality type is like if a Seinfeld episode was taped at the Park Slope Food Coop. Nothing?
Anyway, look -- ultimately, it would be nice if there were, like, a simple formula: smartphone plus alienated kid equals 12 percent chance of Nazi. It's obviously not that simple. And in my writing, I'm much more comfortable being descriptive, not prescriptive. But this is TED, so let's get practical. I want to share a few suggestions of things that citizens of the internet like you and I might be able to do to make things a little bit less toxic.
So the first one is to be a smart skeptic. So, I think there are two kinds of skepticism. And I don't want to drown you in technical epistemological information here, but I call them smart and dumb skepticism. So, smart skepticism: thinking for yourself, questioning every claim, demanding evidence -- great, that's real skepticism.
Dumb skepticism: it sounds like skepticism, but it's actually closer to knee-jerk contrarianism. Everyone says the earth is round, you say it's flat. Everyone says racism is bad, you say, "I dunno, I'm skeptical about that." I cannot tell you how many young white men I have spoken to in the last few years who have said,
"You know, the media, my teachers, they're all trying to indoctrinate me into believing in male privilege and white privilege, but I don't know about that, man, I don't think so." Guys -- contrarian white teens of the world -- look: if you are being a round earth skeptic and a male privilege skeptic and a racism is bad skeptic, you're not being a skeptic, you're being a jerk.
It's great to be independent-minded, we all should be independent-minded, but just be smart about it.
So this next one is about free speech. You will hear smart, accomplished people who will say, "Well, I'm pro-free speech," and they say it in this way that it's like they're settling a debate, when actually, that is the very beginning of any meaningful conversation. All the interesting stuff happens after that point.
OK, you're pro-free speech. What does that mean? Does it mean that David Duke and Richard Spencer need to have active Twitter accounts? Does it mean that anyone can harass anyone else online for any reason? You know, I looked through the entire list of TED speakers this year. I didn't find a single round earth skeptic.
Is that a violation of free speech norms? Look, we're all pro-free speech, it's wonderful to be pro-free speech, but if that's all you know how to say again and again, you're standing in the way of a more productive conversation.
So the next one: making decency cool again. ... Great!
Yeah. I don't even need to explain it. So in my research, I would go to Reddit or YouTube or Facebook, and I would search for "sharia law" or I would search for "the Holocaust," and you might be able to guess what the algorithms showed me, right? "Is sharia law sweeping across the United States?" "Did the Holocaust really happen?" Dumb skepticism.
So we've ended up in this bizarre dynamic online, where some people see bigoted propaganda as being edgy or being dangerous and cool, and people see basic truth and human decency as pearl-clutching or virtue-signaling or just boring.
And the social media algorithms, whether intentionally or not, they have incentivized this, because bigoted propaganda is great for engagement. Everyone clicks on it, everyone comments on it, whether they love it or they hate it. So the number one thing that has to happen here is social networks need to fix their platforms.
So if you're listening to my voice and you work at a social media company or you invest in one or, I don't know, own one, this tip is for you. If you have been optimizing for maximum emotional engagement and maximum emotional engagement turns out to be actively harming the world, it's time to optimize for something else.
But in addition to putting pressure on them to do that and waiting for them and hoping that they'll do that, there's some stuff that the rest of us can do, too. So, we can create some better pathways or suggest some better pathways for angsty teens to go down.
If you see something that you think is really creative and thoughtful and you want to share that thing, you can share that thing, even if it's not flooding you with high arousal emotion. Now that is a very small step, I realize, but in the aggregate, this stuff does matter, because these algorithms, as powerful as they are, they are taking their behavioral cues from us.
So let me leave you with this. You know, a few years ago it was really fashionable to say that the internet was a revolutionary tool that was going to bring us all together. It's now more fashionable to say that the internet is a huge, irredeemable dumpster fire. Neither caricature is really true. We know the internet is just too vast and complex to be all good or all bad.
And the danger with these ways of thinking, whether it's the utopian view that the internet will inevitably save us or the dystopian view that it will inevitably destroy us, either way, we're letting ourselves off the hook. There is nothing inevitable about our future.
The internet is made of people. People make decisions at social media companies. People make hashtags trend or not trend. People make societies progress or regress. When we internalize that fact, we can stop waiting for some inevitable future to arrive and actually get to work now.
You know, we've all been taught that the arc of the moral universe is long but that it bends toward justice. Maybe. Maybe it will. But that has always been an aspiration. It is not a guarantee. The arc doesn't bend itself. It's not bent inevitably by some mysterious force. The real truth, which is scarier and also more liberating, is that we bend it.