Another vital distinction is between platform and publisher. As Ball explained, companies such as Twitter have long insisted they are the former and not the latter, which means they are not responsible for what others publish on their platform (just as AT&T is not responsible for how people use its telephones). Demanding that Twitter actively intervene in what speech is and is not permissible blurs that line, if it does not outright convert the company into a publisher. That conversion necessarily vests the company with far greater responsibility for determining which ideas can and cannot be aired.
If, despite these dangers, you are someone who wants Dick Costolo, Mark Zuckerberg, Eric Schmidt and the like to make lists of prohibited ideas and groups, then you really need to articulate what principles should apply. If, for instance, you want "terrorist groups" to be banned, then how is that determination made? There is intense debate all over the world about what "terrorism" means and who qualifies. Should they use the formal lists from the U.S. Government, thus empowering American officials to determine who can and cannot use social media? Should they use someone else's lists, or make their own judgments?
If you want these companies to suppress calls for violence, as Ronan Farrow advocated, does that apply to all calls for violence, or only certain kinds? Should MSNBC personalities be allowed to use Twitter to advocate U.S. drone-bombing in Yemen and Somalia and justify the killing of innocent teenagers, or use Facebook to call on their government to initiate wars of aggression? How about Israelis who use Facebook to demand "vengeance" for the killing of 3 Israeli teenagers, spewing anti-Arab bigotry as they do it: should that be suppressed under this "no calls for violence" standard?
A Fox News host this week opined that all Muslims are like ISIS and can only be dealt with through "a bullet to the head": should she, or anyone linking to her endorsement of violence (arguably genocide), be banned from Twitter and Facebook? How about Bob Beckel's call on Fox that Julian Assange be "assassinated": would that be allowed under Ronan Farrow's no-calls-for-violence standard? I had a long dialogue with Farrow on Twitter about his op-ed but was not really able to get answers to questions like these.
None of this is theoretical. It's the inevitable wall people run into when they cheer for the suppression of speech they find "harmful." Indeed, even as its decision was being applauded, Twitter declined to follow the new edict through to its logical conclusion: it announced it would not ban the account of the New York Post, even though that tabloid featured a graphic photo of the Foley beheading on its front page and promoted that front page on Twitter. The most plausible explanation for that refusal is that banning a newspaper's account because Twitter executives dislike its front page would too vividly underscore how dangerous the newly announced policy is.
There are cogent reasons for opposing the spread of the Foley beheading video, but there also are all sorts of valid reasons for wanting others to see it, including a desire to highlight the brutality of this group. It's very similar to the debate over whether newspapers should show photos of corpses from wars and other attacks: is it gratuitously graphic and disrespectful to the dead, or newsworthy and important in showing people the visceral horrors of war?
Whatever one's views are on all of these questions, do you really want Silicon Valley executives -- driven by profit motive, drawn from narrow socioeconomic and national backgrounds, shaped by homogeneous ideological views, devoted to nationalistic agendas, and collaborative with and dependent on the U.S. government in all sorts of ways -- making these decisions?
Perhaps you don't want the ISIS video circulating, and that leads you to support yesterday's decision by Twitter. But it's quite likely you'll object to the next decision about what should be banned, or the one after that, which is why the much more relevant question is whether you really want these companies' managers to be making such consequential decisions about what billions of people around the world can -- and cannot -- see, hear, read, watch and learn.