You just said you yourself were able to identify them. Why couldn’t someone else?
And, suppose they were missing from your News Feed. How much would you miss them? And why would you need them back?
Finally, I’d love to hear about some of these “ads for products that have political messaging” (assuming they’re not the obvious case of T-shirts, stickers, etc. promoting a candidate or cause).
> You just said you yourself were able to identify them. Why couldn’t someone else?
Because as I also said, anybody else would likely make different judgements. The assessment I described is entirely my own opinion, and it would be impossible for me (or anybody else) to make any form of assessment that wasn’t influenced by opinion.
> And, suppose they were missing from your News Feed. How much would you miss them?
I personally don’t care about ads at all. But you’re understating how extreme your suggestion is. News outlets would be excluded from any sort of promotion on FB. Basically any NGO would be excluded, including environmental groups.
I would expect such an extreme idea to have a very strong justification, and I can’t see anything other than FB employees not liking it.
> Finally, I’d love to hear about some of these “ads for products that have political messaging”
Nike and Gillette are two incredibly high-profile examples. There are plenty of smaller ones too. Any company that participates in Pride Month is delivering a political message. The issue with classifying political speech is that it depends much more on the views of the audience than it does on the content of the speech. The graphics used in a weather report would be considered politically charged by a flat-earther.
It is simply not possible for FB to take a neutral position on this matter, so the choice is between FB taking a political stance on which election ads are allowed on its platform and FB not moderating them at all.
I’m not sure I agree it’s such a big problem, other than having to upset some advertisers by declining advertisements that used to get automatic acceptance. Remember, Facebook is still a private enterprise and they’re under no obligation to make everyone (or anyone) happy. They are free to limit activity they find inconsistent with their moral view, just like Apple does.
And also keep in mind, these are ads we’re talking about. People who want to publish political speech can still do so on their own pages or status updates, but they won’t get paid distribution into users’ news feeds.
If you accept the argument that forbidding political ads is not possible, then I guess it depends on how much you care about large corporations' ability to interfere with democratic processes. Having FB act as the gatekeeper deciding which political ads are allowed on the world's largest social media platform and which are not seems like a pretty bad outcome to me.
> Remember, Facebook is still a private enterprise and they’re under no obligation to make everyone (or anyone) happy.
Sure, but this is kinda irrelevant to the discussion about what they should do, or more importantly, what they should be allowed to do. The concept of Net Neutrality would be an almost identical violation of freedom of association, but it’s a regulation with a huge amount of popular support.
> If you accept the argument that forbidding political ads is not possible
I don’t agree (at least insofar as this is their policy and not, say, a legal restriction). I think they can get reasonably close using good judgment, and there will always be arguments about the edges. Such is life. We can’t let perfection be the enemy of the good.
None of this argument is about regulation or coercion, by the way. This is really about what I think their own policies should be, not about government regulation of political speech, which is an even bigger minefield.
I’d say this stance is based on the presupposition that a solution implemented imperfectly is in fact good, or at least better than no solution implemented at all. You say yourself that this would require FB to make (naturally subjective) judgements about what is or is not political. The very best case scenario would be that FB ends up marginalizing fringe views.
But would you even trust FB as an organisation to reach a level of neutrality where only the most fringe views were marginalized? I certainly wouldn’t. The alternative is simply that FB leaves it up to the audience to make their own judgements about the credibility of the content they view, something I’d consider to be a far superior outcome. Nothing FB does will ever relieve people of the burden of thinking for themselves.
> I’d say this stance is based on the presupposition that a solution implemented imperfectly is in fact good, or at least better than no solution implemented at all.
Yes. :)
> You say yourself that this would require FB to make (naturally subjective) judgements about what is or is not political. The very best case scenario would be that FB ends up marginalizing fringe views.
That sounds good to me!
> But would you even trust FB as an organisation to reach a level of neutrality where only the most fringe views were marginalized? I certainly wouldn’t.
Why not? We have no evidence to suggest they'd be terrible at it.
> The alternative is simply that FB leaves it up to the audience to make their own judgements about the credibility of the content they view, something I’d consider to be a far superior outcome.
Do you really consider the current state of affairs to be superior and/or ideal? Perhaps I would agree with you in a perfect world where more people were naturally distrustful of content, but we've seen that a large segment of the population is not particularly good at separating the wheat from the chaff.
> You say yourself that this would require FB to make (naturally subjective) judgements about what is or is not political. The very best case scenario would be that FB ends up marginalizing fringe views.
> That sounds good to me!
I think we just have very different values here. I’d consider this to be incredibly harmful. I remember a time when “gay people are just normal people who should be treated with respect and dignity” was a very fringe view. My parents remember when “black people shouldn’t be segregated from the rest of society” was a fringe view.
> Why not? We have no evidence to suggest they'd be terrible at it.
I think a lot of people would say that there’s plenty of evidence that FB isn’t capable of politically neutral moderation.
> Do you really consider the current state of affairs to be superior and/or ideal?
I think for as long as we’ve had language we’ve had misinformation. I don’t think there’s anything remarkable about the current state of affairs at all. There is no system you can put in place to determine what’s true on behalf of other people, because no matter what you do, people still need to make their own judgements about the information they consume.
There is only one institution in most democratic societies that acts as an authority on matters of fact or truth, and that is the justice system. In that system, every fact put forward for consideration is subjected to extensive scrutiny and debate, yet it still produces endless amounts of controversy. Every other truth seeking institution, whether public or private, is even less capable of establishing consensus.
The problem you can’t solve with such systems is that the truth is a fundamentally subjective matter. Any authority established to determine the truth can do nothing more than reflect its own biases and prejudices. History has no examples of misinformation being defeated by censorship; the only solution is competing viewpoints and critical thought. It’s not a problem FB can solve, nor one we should want them to try to solve.
I think there's a mix-up going on in this conversation. I thought the question was whether Facebook could determine whether a post was political in nature (erring on the side of "it is") and refuse to accept money in exchange for distributing it.
Never was this thread about Facebook's role as an arbiter of "truth." The difficulty of that is much, much higher, and one that I agree shouldn't be entrusted to Facebook (or probably to any single entity, for that matter).
Again, this is not about whether people should be allowed to post their opinions on Facebook. It's really about whether Facebook should be used to amplify political speech, and (secondarily) whether, if it is used that way, advertisers should be allowed to narrowly target political speech to specific audiences. With respect to your "gay rights" fringe argument, I see no difference between how that viewpoint (as formerly a fringe view) and a then-mainstream viewpoint would be treated under a broad rule like this. Neither pro- nor anti-rights messages would qualify for amplification.
BTW, there's a great discussion with Alex Stamos (former CSO at Facebook) in which he discusses it with more nuance and detail, but tries to thread the needle more than I feel comfortable doing: https://galley.cjr.org/public/conversations/-LsHiyaqX4DpgKDq...
To me, the whole issue seems to be based around making judgements about the truth. The initial problem is that political ads are high risk for distribution of false or deceptive information. Fact-checking them is too fraught with issues, so you just decide to ban them all instead. Now you’ve just kicked the can down the road a bit and narrowed your search for truth to “is this political or non-political”.
> It's really about whether Facebook should be used to amplify political speech
I would assert that any attempt to control that would require FB to make value judgements about said speech, naturally resulting in FB enforcing some particular set of values.
I would say that FB and other big tech firms have been following a trend that’s been seen in government a lot in recent times. Identify some threat to public wellbeing (in terms of government the canonical example would be terrorism), and leverage it to gain more power and influence. It’s no secret that society generally walks away from a terrorist attack with less freedom than it had before. For the big tech firms the threat has been “fake news” or “misinformation”, and they’ve been pretty consistent with the response of “we need to exert more control over the flow of information and public discourse to combat this”. Putting aside any debate about whether they should be allowed to do this, I think it’s overall a harmful outcome for society, and I’m surprised that FB hasn’t taken the same approach in this case.