I was watching Legal Eagle discussing whether Trump incited the mob / insurrection https://youtu.be/XwqAInN9HWI and he goes over some previous cases of “incitement of violence” — it’s crazy how high the bar is.
On the other hand, tech companies can very quickly take down illegal content, like child pornography or people singing Happy Birthday. I do think they have the technical chops + content moderators to at least try to curb some of the more inflammatory posts.
But it’s not illegal, so they won’t do it; in fact, they’ll profit from it, like Facebook matching body armor ads to groups of people plotting to hang the Vice President. The solution might be to relax the definition of imminent threat, or to recognize that someone with millions of followers is effectively amplifying calls to violence with a retweet. The fact that people think “it’s just a retweet,” so you bear zero responsibility and accountability for influencing millions of people, shows how unprepared we are for technological change.
I can’t talk you out of it because I am very confused and conflicted. I think I know what needs to change, and even how, but not what to change it to.
Well, this is the thing isn't it, threats of nonspecific violence are a normal part of American politics. You could find thousands of examples of that kind of thing from both sides.
The unusual thing is turning that violence into reality.
Would it be great to dial that down? Yes. Will the Republicans stop doing enraging things and calm the situation? No.
Let me be more direct - how do you think these powers of restriction on "influencing people" that you want will be used by an administration you don't want? Do you think there might be some foreign countries where such powers are already in place, that could give us a clue how things might turn out, or is America too much of a unique exception to be able to learn from anyone else's experience?
Tweeting "burn the whole thing down" to a generally liberal crowd is a figure of speech which implies, by and large, the need for complete reform.
To far-right "conservatives", white supremacists, violent insurrectionists, and paranoid militias, it implies something completely different.
In other words, someone tweeting "burn the whole thing down" to a liberal following in regards to a judicial appointment is a very different thing from Trump or other far-right "influencers" tweeting "burn the whole thing down" to a large and frequently violent crowd in regards to their false claims of a stolen election. It's reasonable to assume that people upset about the replacement of RBG aren't going to literally burn down the Capitol, whereas it's clear that a small but sufficiently significant subset of Trump followers were not only willing to, but able to and intent on, burning down the Capitol in response to the false stolen-election claims.
Codifying a double-standard of when it's "reasonable to assume" that certain threats are figurative based on political orientation of the speaker is just the kind of thing that pours fuel on the fire.
It's not quite a plan, is it? Like, what exactly are they asking people to burn? "It"? And it's not necessarily advocating violence, either; while arson is terrible and destructive, it is not necessarily violence against people.
It's pretty well short of actually calling people to violent acts. The wording would need to be more specific.
Sure; they'd probably apply strict scrutiny [0]. In the USA, speech is typically protected by default; the burden of proof is on the prosecution to show that the speech was harmful.
Seriously, have you never heard an angry American yell that they are frustrated with the status quo and would like to "burn it down" [1]? It is a common refrain and generally taken as a hyperbolic statement about the speaker's dissatisfaction with the actions of the government.
Finally, if you're in the USA, you have the right to a trial by jury if you're accused of crimes [2]; you do not need to worry that some appointed judge will find your speech harmful, but rather that a panel of your peers will unanimously agree that your speech is harmful.
To what end are you arguing? Do you honestly think that the 6th Amendment will be repealed, or that strict scrutiny will suddenly not be applied when appropriate?
(I suspect that this entire comment thread was started in order to try to pull a rhetorical gotcha, and you didn't actually expect somebody to respond to your words as written, but hopefully you've learned some constitutional law on this beautiful sunny MLK Jr. Day.)
How is it a "rhetorical gotcha" to ask what it would look like to consistently apply a proposed principle or law?
Equal protection under the law is also a constitutional principle (I knew that one already, thanks), but it seems in short supply in much of modern thinking and advocacy.
The first question somebody should ask when advocating some kind of measure aimed at their adversaries is: what would it look like if this was applied to my friends also?
Groups like the ACLU used to think this way, which is why they defended the most undesirable groups advancing the most repulsive ideas.
That kind of thinking is falling by the wayside, as longstanding liberal principles like freedom of speech, due process, and the presumption of innocence are tarred by their association with the "right wing."
I agree entirely with your point. I think, however, that there's something of a false equivalence: the tweets that are inciting violence are not coming from all political persuasions in equal share.
We don't get "longstanding liberal principles" by publishing our speech on a private corporate platform. Freedom of speech and due process are merits of the law and not of Twitter's terms of service. Twitter is free to ban all incitements to violence without banning all hyperbolic political speech.
The ACLU hasn't left its liberal principles by the wayside. It warned about the power of social media companies as recently as January 8, after the events at the Capitol.
"ACLU Counsel Warns of 'Unchecked Power' of Twitter, Facebook After Trump Suspension"
"For months, President Trump has been using social media platforms to seed doubt about the results of the election and to undermine the will of voters. We understand the desire to permanently suspend him now, but it should concern everyone when companies like Facebook and Twitter wield the unchecked power to remove people from platforms that have become indispensable for the speech of billions – especially when political realities make those decisions easier," the statement read.
"President Trump can turn to his press team or Fox News to communicate with the public, but others – like many Black, Brown, and LGBTQ activists who have been censored by social media companies – will not have that luxury. It is our hope that these companies will apply their rules transparently to everyone."
The statement you quoted is a good statement, I agree. And maybe the ACLU is re-finding its way. But the ACLU has changed, and it's not just me saying this, it's the former head of the ACLU:
> ‘I believe that the national ACLU, if the Skokie case arose today, would not take it. They might take the same case for the Martin Luther King Jr Association, but they wouldn’t take it for the Nazis.’
> As Kaminer has long argued, the rot has been setting in for some time. But since Trump’s election, the ACLU has been more noticeably shying away from contentious free-speech cases.