Facebook admits it must do more to stop the spread of misinformation (techcrunch.com)
710 points by CodeGenie on Nov 10, 2016 | 614 comments



This article criticizes Facebook for firing the human editors who had been keeping things sane. However, Facebook was pushed into that by accusations of bias in the right-wing media, accusations that looked likely to lead to a Congressional investigation.

See, for example, http://thehill.com/policy/technology/279361-top-republican-d....

Now they are in a situation where they are damned if they do, damned if they don't. And people immersed in echo chambers will accuse them of bias no matter what.

But the entire system is fundamentally broken. Pay-per-ad incentives reward viral content, and content that induces outrage is far more likely to go viral than pretty much anything else. Plus it goes viral before people do pesky things like fact checks. And the more of this you have been exposed to, the more reasonable you find outrageous claims, even if you know that the ones you have seen were all wrong.

For an in-depth treatment of the underlying issues, I highly recommend Trust Me, I'm Lying by Ryan Holiday.


Yes, trying to solve this problem by creating a definition of "misinformation" sufficiently precise to act on it for all possible articles, then trying to remove it somehow is probably an AI-complete problem. If it's not AI-complete it's probably just plain an ill-formed question. That's before we ask questions about the biases that get embedded into the misinformation detector.

This can't be fixed at Facebook scales by building a platform that so highly incentivizes low-quality content of all kinds, then trying to stop the content so late in the cycle. The entire incentive structure has to be rethought. Unfortunately, that's probably a problem that Facebook is literally incapable of solving without going out of business, because Facebook the corporate entity is that incentive structure. To fix Facebook requires Facebook to become not-Facebook.


This is a problem with human wetware - there are several cognitive biases [1][2][3][4] that make people engage more strongly with ideas that fit their preconceptions and discount ideas that don't. Add information-sharing to the mix and the aggregate effect you get is large information cascades that fracture the population into tribes that believe things not necessarily because they're true, but because they are stated early, vehemently, and frequently.

Remember that evolution favors individuals who survive & gain status, not those whose representation of the world is true. Someone who is right but killed for it (eg. Copernicus) is still dead.

The fix is for people to be aware of these cognitive biases in themselves, to actively seek out information that differs from their preconceptions, and to challenge inaccurate information (with evidence!) when presented with it. You can't outsource this to a communication platform, because it's inherently less comfortable and more effort for the reader, and the reader will just go to a competing communication platform that makes them feel better.

[1] https://en.wikipedia.org/wiki/Illusory_truth_effect

[2] https://en.wikipedia.org/wiki/Confirmation_bias

[3] https://en.wikipedia.org/wiki/Bandwagon_effect

[4] https://en.wikipedia.org/wiki/Congruence_bias


Fact check: Copernicus was not killed.

He died of natural causes at 70. According to stories circulating at the time, it was moments after he had seen the first printed copy of his book.


Maybe people confuse Copernicus with Giordano Bruno, an advocate of related theories, who was killed in 1600 for his beliefs and advocacy (though not only for his beliefs about astronomy).

https://en.wikipedia.org/wiki/Giordano_Bruno

People may also confuse Bruno with Galileo Galilei, who was punished by the Inquisition for his defense of Copernican theory, but not executed.


See, just like that. Would be more effective with sources, though.


Given the amount of misinformation put out by the MSM this election they'd do well to blanket ban the NYT, CNN, et al.


> The fix is for people to be aware of these cognitive biases in themselves, to actively seek out information that differs from their preconceptions, and to challenge inaccurate information (with evidence!) when presented with it.

You are right, but I hope you are aware that this would require a giant leap to an essentially superhuman state of rationality and clear thinking for a vast majority of people.

Just stop for a moment and look around you. Is that how most people actually function in this world?


Separate ideology (or whatever happens to be the object of bias) from identity, and you're halfway there. People are much more amenable to changing their minds if their identities aren't at stake. For example, most people can discuss the pros and cons of F-150s and Silverados fairly rationally; those with the Calvin pissing logos, not so much.


Again agreed. But how would you accomplish that?

For large majorities of voters, dropping that ballot in the box is over 90% an identity game.


On a societal level, your guess is as good as mine. Perhaps increased infrastructure spending, so that high-skill, high-pay jobs are more evenly spread out and ideological segregation is reduced?

On an individual level, being active in (non-political) hobbies and volunteer groups can help, especially ones that attract both people from the cities and the countryside.


Prove that assumption, please.


We have to design systems that work with the people we have, not the people we wish we had.

Fail to recognize that, and you've failed before you've started.


>> most people can discuss the pros and cons of F-150s and Silverados fairly rationally; those with the Calvin pissing logos, not so much.

How is this different from stereotyping then?


Who am I stereotyping? People with "Calvin pissing on Ford/Chevy/Ram logo" stickers? The vast majority of truck drivers don't really care about brand--they just want the most appropriate truck for their situation.


None of us function in this way. While part of the solution does have to involve people becoming better at vetting their information-sources, it is equally important to acknowledge that for everyone to do this individually would be absolutely impossible given time constraints.

What we need, fundamentally, is a media environment that can harness the collective capacity of humans to vet and debate new information in an organized fashion that can be trusted. This site is a good example: a knowledgeable community and the simple mechanism of votes do some automatic, crowd-based content curation. But the Reddit-like model is very basic - so much more could be done to allow individual users to contribute to a collective process of knowledge-building.

I think we need more development and discussion devoted to what kind of systems could be designed to encourage transparency, quality, and critical analysis in our news-making.
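The vote-plus-decay curation described above can be sketched in a few lines. This is a widely discussed approximation of how sites like this one rank stories; the gravity exponent here is an assumed value, not the real parameter:

```python
def rank_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
    """Time-decayed vote score: high-voted items rank first, but sink as they age."""
    return (points - 1) / (age_hours + 2) ** gravity
```

A fresh story with modest votes can outrank an older one with more votes; the community supplies the votes, and the decay keeps the front page turning over.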


Well, a good first step would be an education system that encouraged a culture of curiosity and respectful questioning, rather than just "obey".


Hah! It's worse.

As a kid I learnt this and tried to be as rational and aware of my fallibilities as possible.

That makes you an alien to normal human beings. Conversation becomes impossible, processing fast enough becomes impossible, and the penalties for acting weird and communicating weird are huge.

Instead you need to find a way to act in a manner that allows you to blend in with people.


Is that how you function?


This is absolutely the right answer, but maybe Facebook can do some things to help. It's possible to gamify the process of learning to spot biases by presenting people with situations and getting them to think critically to solve the biases or fallacies. Especially on Facebook, people love filling out questionnaires that tell them something about themselves.

I'm not saying that one smart game is going to fix the entire problem, but adding the development of logic skills into the types of activities that people enjoy doing on Facebook is definitely possible.


It doesn't work that way. It assumes that what people are doing is wrong - it's not. The mental overhead of assessing everything clinically is sufficient that faster thinkers can just outrun you. People and brains are making trade-offs to deal with the world around them.

Other humans are taking advantage of those gaps to mobilize votes or advertisements.

There's no way out, and I've been wondering when people would realize the depth and intricacy of the mess we're in. Hopefully people pay more attention to the way cognitive cheat codes are being abused.

Hopefully a protocol to deal with it can be made.


There's really an amazing amount of misinformation and manipulation going on. I think there's a place for people developing critical thinking skills, but there's also a limit to what people can sort through. There's a big need for more software to fact-check and compare sources across the Internet. Having Facebook run it is one part of the solution, but people need to have transparent software that they control to do the same kind of work so it isn't entirely centralized. Google, Amazon, and Facebook assistants are great, but we need personal assistants that work for us.


> The fix is for people to be aware of these cognitive biases in themselves, to actively seek out information that differs from their preconceptions, and to challenge inaccurate information (with evidence!) when presented with it.

Exactly right. This should be strongly emphasized in our educational curriculum.


As any alien arriving in your world would immediately ask, why doesn't having a true representation of the world help you survive and gain status? And if it doesn't, why does it matter?


An interesting TED talk on how evolution does not necessarily favor a true representation of reality in our minds and bodies: https://www.ted.com/talks/donald_hoffman_do_we_see_reality_a...


It helps me gain status even more if I can plant false information in others.


As someone pointed out, Copernicus was not killed. And it illustrates the problem of FB putting its finger on the scales. Ever.


It's almost as if solving the problem of truth is antithetical to the notion of having a for-profit entity be the main channel of [legit|mis-]information in our world.

It's almost as if the way we understand and use money has hardened into a giant screwball of lies, base instincts, and perverse incentives.

Almost.


I think I would argue that this is beyond an AI-complete problem, it is an unanswerable problem. For instance, consider the following spectrum of ideas:

- The earth is flat

- 9/11 conspiracy theories

- Barack Obama was born in Kenya

- Anti-vax

- Climate change denial

- Anti-GMO

- Denial of underlying quantum randomness in the universe

- The Clinton Foundation has been or is involved in some shady dealings

- Muslims want to implement Sharia law in Europe

- The minimum wage decreases employment

- Donald Trump has sexually assaulted a number of women

- Monetary policy impacts the real economy

- The sun will, on balance, probably rise tomorrow

I've sorted this in a (rough, personal) order of increasing plausibility / conformance with facts. Where is the line between misinformation and alternative information? As a human, it's pretty unclear to me.


Could you elaborate on what "AI-complete" means? I have not heard this term before. I am assuming it is similar to NP-complete, but in the domain of AI?

If this is correct, I assume there are also problems classified as AI-hard?


AI-complete, I believe, means 'full AI'. An AI-Complete problem is a problem that can't be solved without creating full human-level (or above) AI.


That's a meaningless term since defining "human-level" is already a lost cause.

Some people can multiply ten digit numbers in their head. Some can't even tell you what a number is. There's a very wide bell-curve here.


Man, it must be me, but you ranked "Monetary policy" WAY more likely than I would (rates near zero for so long doing so little must at least put SOME doubt in your model), and "denial of quantum randomness" way less credible than I see it as. Like, man, reasonable people can disagree about pilot waves and many worlds.

Are there some confusing, emotionally laden, difficult questions of fact in there? Yes. Are some things ultimately unknowable (like what people want, which involves scrying into their minds)? Yes.

These, as it turns out, are not the things I can't help but be side-tracked on. I'm diverted by you claiming monetary policy has an effect with nearly the certainty of the sun rising.


Haha, sorry I knew this list would cause this sort of trouble. I'm a big believer in pilot wave theory myself, it's just very much not the 'mainstream' view. And I didn't mean to imply that the last two were close together, either. I do think monetary policy probably has some effect (whether we understand or can model that effect is another matter), but I didn't mean at all to imply that it was anywhere near the certainty of the sun rising, it just happened to be the closest :).

I think perhaps we disagree more about the probability of these two:

- The Clinton Foundation has been or is involved in some shady dealings

- Muslims want to implement Sharia law in Europe

than about the other two. I think both of those are, in some sense, almost certainly true. There are Muslims who want to implement Sharia law in Europe. The Clinton Foundation has been involved in transactions that at least have the appearance of moral uncertainty. The headlines that I listed are just overly strong statements/generalizations of those (IMO undeniable) facts.


I keep getting sidetracked by other issues, but your statements are about certainty and uncertainty and where to draw the line for heavy-handed intervention, so I might as well get into them:

"undeniable" does not mean what you think it means. Example: http://www.cbsnews.com/videos/hillary-clinton-defends-founda...

There, Clinton denies the foundation has been engaged in anything shady. Boom, definitely not undeniable #hackernewsdrama, hahaha.

If you want a disagreement, though, I prefer many-worlds over pilot wave theories, largely because I'm not entirely sure _what_ the propagation speed of the pilot waves would be. 'Greater than the speed of light' either (1) doesn't really narrow it down at all or (2) narrows it down to having no options at all. But they should have _a_ speed, because I'm not sure you can even have standing waves if the disturbances propagate instantly.


Rates near zero is part of the reason the stock market has been on a tear the last few years. It also was a driver of the housing bubble.

It didn't cause the hyperinflation that a lot of people thought would come, but the Great Depression is Bernanke's area of expertise, and he had sound theoretical reasons to believe QE wouldn't spark serious inflation.


The near zero rates occurred in response to the economic crisis after the housing bubble burst, not a driver of the bubble.


Rates were also comparatively low during the 2000s to juice the economy after the tech bust and 9/11. They got even lower after the crash and QE, but they were the lowest they had been in the 00s since the early 60s.


I don't deny quantum randomness per se, I just deeply hope that it isn't true for some reason.


There is no need to create a Grand Unified Theory of misinformation, here.

This problem isn't that hard. You filter this stuff and rank it by its SOURCE, not by the content of individual articles.

It's not hard to figure out which online sources are pushing out most of the abjectly-bad propaganda, and de-rank them.
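A minimal sketch of what source-level deranking could look like. Everything here is an illustrative assumption: the domains are made up, and a real system would learn or editorially assign the weights and keep them current:

```python
# Hypothetical per-domain trust weights.
SOURCE_WEIGHTS = {
    "reliable-wire.example": 1.0,   # assumed reliable
    "outrage-farm.example": 0.1,    # assumed propaganda mill
}

def feed_score(engagement_score: float, domain: str) -> float:
    """Scale an article's engagement score by its source's trust weight."""
    # Unknown domains get a neutral default rather than a free pass.
    return engagement_score * SOURCE_WEIGHTS.get(domain, 0.5)
```

The point of ranking by source rather than by article is that the expensive judgment (is this outlet a propaganda mill?) is made once per domain, not once per post.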


"This problem isn't that hard. You filter this stuff and rank it by its SOURCE, not by the content of individual articles."

I don't think reifying ad hominem into code is the solution to the problem.

Of course, if you don't mind a rare few false positives here and there on articles, I've got your filter right here:

    def source_is_trustworthy(source):
        return False
 
Remember as you sit here thinking through your exceptions what we're talking about; if your "trustworthy" source was confidently telling you about how Clinton was going to win, it's not one of the exceptions. I won't say I was confident Trump would win, but I was certainly less surprised than most; the fact that I generated this somewhat more predictive model by tossing out pretty much every "mainstream" news source is not a good sign for them. And to be honest, Trump news isn't the only thing that I find this useful for. Beyond the bare facts, most media outlets really aren't good for much anymore, and do you ever have to de-spin their news just to get those bare facts in the first place. (I can't, however, pack up into a recipe how to do this yourself. I think we're in a period of transition in the news industry, much greater than just "the internet makes the old dinosaurs stumble" makes it sound, and I'm at the moment not really all that confident in anything.)


As someone who vehemently opposes Trump, I feel that allegations of anti-Trump bias in the mainstream media were entirely correct and somewhat to be expected. Although Trump is certainly a name that sells papers, Trump's repeated threats to open up news media to broader libel laws were not well received. The news organizations' endorsements were pretty one-sided[0]. I don't want to take you too literally, but it seems to me pretty obvious that a model ignoring the MSM entirely would not be preferable to one that took into account the MSM opinions and then applied a correctional factor. I'm sure we agree that truth is a function of our means for determining truth, but I suspect we disagree strongly on the reliability of Internet news sources.

[0] https://en.wikipedia.org/wiki/Newspaper_endorsements_in_the_...


You must have missed the leaked emails in which Clinton exerts massive influence on the media, planning every last detail in exchange for all kinds of rewards [0]

[0] http://observer.com/2016/08/wikileaks-reveals-mainstream-med...


I did miss that. Was that article intended to support that view? The incidents listed don't seem to have been terribly well planned or executed, or to have garnered any positive results. That seems difficult to reconcile with the idea of a powerful media conspiracy.


"source_is_trustworthy" applies to all sources, not just "MSM". As I said, I don't have anything right now that I can point to and say "If they say it, I trust it."

I used to try to apply a correctional factor, but I've found it not even worthwhile anymore. I honestly don't know entirely why. I don't know if the news has gotten so much more biased that it almost swamps the signal entirely, or if the combined financial pressures away from expensive real reporting and towards clickbait headlines have removed the signal, or what, but beyond very bare facts they just aren't worth much anymore. It is also possible this has always been the case and I wasn't aware enough to realize it; there are some classic stories from history that back that possibility up.

I do know primary sources are getting easier and easier to consult directly. Which could itself also be a reason my opinion of them has plummeted so much in the past 10 years; it was a lot harder to see through media spin and simplification (deliberately for their audience, and accidentally when they in fact don't understand what's going on themselves, see especially science journalism for that) when they were the only source of information.


What do you think was unfair about the Trump coverage in mainstream newspapers?

What is the correct balance of endorsements? Is anything other than 50:50 incorrect?


0:0 would be fair.

It won't be reached in practice, but if you make it the policy, you have an argument to take down any bullshit. With 50:50 it's going to be an endless quarrel over who has had enough so far, who is underrepresented and why everything is dominated by two polar opposites with no voice given to people who don't want to belong to either camp.


The case of news sources failing to predict events seems like a red herring in the larger debate here - news media is generally about reporting events that have actually happened. Predicting presidential elections or other uncertain events is a side hustle they've been dragged into because it sells. And if you throw out their predictive failures, I think it's pretty easy to compose a trustworthiness function based on overall factual accuracy. And sure, there may be lingering biases that need to be corrected for, but even so, the MSM biases generally don't extend into the realm of pure fabrication, whereas clickbait outrage farms absolutely do - and they are a large chunk of the problem.


I don't think you're ever going to detect truth correctly, but you can absolutely detect falsehood.

Compare "{Candidate} up over 5000% in polls" vs "{Candidate} up over 15% in polls".

Baby steps. PageRank wasn't built in a day.
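The "5000% vs 15%" comparison suggests one genuine baby step: flag numeric claims that are impossible on their face, without attempting to verify the merely unlikely. A sketch, where the pattern and threshold are illustrative assumptions:

```python
import re

# Matches headlines like "Candidate up over 15% in polls".
POLL_CLAIM = re.compile(r"up (?:over )?(\d+(?:\.\d+)?)% in polls")

def is_impossible_poll_claim(headline: str) -> bool:
    """True only for claims no poll could produce; says nothing about truth."""
    m = POLL_CLAIM.search(headline)
    # A swing in the polls cannot exceed 100 percentage points.
    return bool(m) and float(m.group(1)) > 100
```

This never catches a plausible-but-false claim, which is exactly the point: detecting falsehood at the extremes is tractable, detecting truth is not.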


PageRank is also not unbiased. It has been gamed many times before, and pure PageRank would be pretty easy to game - that's why Google keeps tweaking it years after it was invented, and still the SEO industry exists. Now, SEOs just compete to be on the first page, not to claim anything as grandiose as "truth". Add that to the mix, plus all the biases there are, and you get a pretty much hopeless task. And of course you come under a constant stream of criticism, and maybe government regulation too. It's one thing to just produce search results (even that is controversial; remember the "right to be forgotten"?), it's another to claim the ability to "detect falsehood" - that's a very sure way to get sued and for the government to get involved.


It's a lot easier to look for inconsistency than it is to look for incorrectness.

Here's a PageRank-like reputation system I've built.

http://github.com/neyer/respect


Why so? I defer to your experience, but I would expect that incorrectness is simply inconsistency measured against a more complete/complex basis?


To determine correctness, the adjudicator requires knowledge outside the system. In the scientific method, this is done by generating falsifiable hypotheses and performing the experiment. Failing that, you have to fall back on heuristics, like "what I already believe", Occam's razor (simplest theory consistent with observations wins), or reliance on authority/consensus.

Consistency is simply majority wins, which completely stifles new ideas or minority opinions.


> Consistency is simply majority wins, which completely stifles new ideas or minority opinions.

No, it isn't.

You don't need an "objective view of the truth" - i.e., you don't need the ability to ask the system "Is this statement true?"

You just need the ability to ask the system "Will I think this is true, given the other things I have claimed to be true?"

Maintaining internal complexity of your own claim graph gets really tricky unless you keep it as accurate as possible.

For example, consensus view was that trump would not win the election. Trump did win the election. Thus, those two claims:

* (before) Trump will not win

* (later) Trump won

These two are inconsistent. It's possible to rectify that inconsistency by marking the earlier statement as cancelled. So now we have three statements:

* (before) Trump will not win

* (later) Trump won

* (now) I was wrong to say, "Trump will not win."

The gap between "now" and "before" can be used to compute the probability of a statement being retracted. The smaller those gaps, the more likely it is that whatever you say comes with an asterisk next to it, saying "the expected time to retraction of this statement is: x days"
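The "expected time to retraction" idea could be tracked per source with something as simple as the following sketch; a real system would of course need far richer statistics than a running mean:

```python
from datetime import datetime

class ClaimLog:
    """Record a source's claims and retractions; estimate days-to-retraction."""

    def __init__(self):
        self.open_claims = {}   # claim text -> datetime asserted
        self.gaps_days = []     # days between assertion and retraction

    def assert_claim(self, text: str, when: datetime) -> None:
        self.open_claims[text] = when

    def retract(self, text: str, when: datetime) -> None:
        asserted = self.open_claims.pop(text)
        self.gaps_days.append((when - asserted).days)

    def mean_days_to_retraction(self):
        if not self.gaps_days:
            return None  # no retractions observed yet
        return sum(self.gaps_days) / len(self.gaps_days)
```

Asserting "Trump will not win" on October 1 and retracting it on November 9 yields a 39-day gap; the running mean is the asterisk described above.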


Ah, you're only speaking of temporal consistency (or "constancy" - does the source's statement change over time).

I was speaking of consistency with the general body of knowledge in the system, (correspondence?).

In the specific case of evaluating major news sources for trust, I think the vast majority of their statements are never retracted/changed, and therefore they would rate highly on an overall "constancy" scale as well. Their many statements about Trump's likelihood of winning were very constant as well - for months they said he'd lose, then they changed, and forever after they will say he won.

Sounds very temporally consistent, with just a single change.

Not sure if the adjudicator could narrow it just to "trust of the source's future predictions about politics", but basically the whole country just got a negative feedback on the trust weighting for major media in that domain.


I am not saying build a machine to detect correctness, I'm saying build one to detect incorrectness.


Correct. The singular value decomposition it was based on was first described 100 years before.


>I can't, however, pack up into a recipe how to do this yourself.

Sum (from: 0, to: PosInf, Enumerable: Source - Bias)

converges asymptotically to Truth


Sorry. Most of the official sources in this recent campaign were dead wrong on a lot of issues, mostly because of their own massive biases.


Are you referring to them being wrong when they described what they thought was going on, which contains a large amount of opinion, or wrong on the facts as they exist?

Facts can be presented in a biased way (generally through omission of other contextual facts), but if what is said is factual, at least that bit of information is still correct. If you combine multiple sources to generate a grouping of facts, that presentation bias may well disappear. If the system weeds out people who present opinion as fact, all the better.


Frankly, papers that lie through crafted contexts are much more dangerous, because they give a false sense of authority due to their correctness at the micro level despite their incorrectness at the macro level.

Imagine I develop a newspaper that only reports crimes committed by a particular race, and I excruciatingly fact-check every incident and highlight when the perpetrators are uneducated (but don't mention education otherwise). What results is one of the biggest macro lies, fueling racism through its broader message, while everything looks perfect to a fact checker. This would rank perfectly in such a system, while broader newspapers would get hammered for messing up picture captions and confusing Airbus A319s with 737s.


Or opinions can be presented as facts, or at least presented in a way that makes it hard to tell if it is reporting or analysis.


The polls were not that far off. If 1 in 100 Trump voters had gone for Clinton instead she would have won.

I think polling is pretty broken though. A lot has changed from 20 years ago when everyone had a landline phone.


The polls were off by the same percentage amount that would have allowed Romney to win the popular vote over Obama.


Using the source to figure out whether something is accurate/correct has its own issues unfortunately. These include:

1. Editors/writers/management changes, since quite a few media publications have gone from being fairly reliable to biased as all hell based purely on who's taken over there. Or in some (rarer cases), the opposite has happened.

So you'd have to make sure older less biased pieces weren't being punished for the actions of the publication in the present.

2. A lot of sources are correct about some things but not about others. It's very possible to have a publication that's terrible at writing about politics in an accurate and unbiased way, but fantastic at writing about technology or sports. So ideally your system would have to detect the subject of the article as well as the publication.

However, I guess you could at least add known 'satire' and fake information sites to a 'non credible' list. Like the ones listed here:

http://www.snopes.com/2016/01/14/fake-news-sites/


This assumes their interest is in being fair, but I suspect their goal will gravitate mostly toward "not being sued/investigated", which may very well create a selection bias.


Non-starter. The NYTimes and Washington Post, for example, are respected but have a very strong yet subtle bias.

On the one hand, they seem more interested in fact-checking. On the other, they propagate concepts like: on-balance the US is a force for good and should lead the world. This is highly contentious and non-obvious for many (most?) people outside the US.


Pretty much all mainstream media outlets did, at some point, publish something that can be qualified as "bad propaganda" by at least some of the observers. Now, either you filter out pretty much all the internet except maybe for bare statistics, numbers, math, etc., or you say "well, these guys are not as bad as those guys, because obviously being completely wrong about X much worse than being slightly less than right about Y", at which point you are just implementing your own biases.


Yes. We're talking about a platform that at its best is birthday wishes, vacation photos, and cat pictures. It runs on a paid-content model, with advertisers selling through viral videos and clickbait. It's best not to expect too much out of it.


We're actually not; that's simply false. No, Facebook is not at its best when it sticks to birthday wishes and cat pictures.

It's actually at its best when it is truly social, in all senses of the word. When it facilitates real communication on real issues across a broad spectrum of people. When it spreads information, not misinformation.


Not trying to speak for the original poster, but what I took away from the argument is that when FB is "truly social", involving real issues, the potential rewards are sufficiently high that misinformation becomes valuable, thus dangerous. When it sticks to birthdays and cat pictures, there's no incentive to go hostile. Think of it as being somewhat similar to the arguments about the Flash Crash regarding HFT and liquidity: that HFT is great for liquidity until you need it, at which point it goes away[1].

I agree with others that detecting misinformation in the general case likely requires a theory of mind, and so something approaching "real" AI. I also agree that FB has a serious problem on their hands. And I'm so very glad I don't use it.

[1] I advance no comment here about the correctness of that; just a comparison to illustrate what seems like an overlooked point.


This is the same as Reddit. As social platforms get larger, they start diverging from their original userbase (programmers/geeks for Reddit, college students for Facebook) to appeal to a wider and wider audience. Eventually the wider audience starts to look a whole lot like the demographics of a whole country (or maybe the world), and it becomes a platform with as many arguments as society in general.

The solution to not becoming a battleground, which is inherently destabilizing to the platform, is to not grow, or to grow in only a single direction. State up front what your targeted user base is and stick to it. NeoGAF as a "gamers only" forum will survive a very long time, for instance.

Facebook has sidestepped it by becoming a personal community for each user, but whenever they try to connect that personal community to the wider world through "shared" news, "shared" trends etc that are not personalized it causes a big ruckus.


If it is truly social then it will spread misinformation like wildfire. That's what people do -- gossip game magnified. That said, maybe Facebook is best when it is just spreading obviously unimportant content (in the sense of national and global affairs), like birthday wishes and cat pictures. No harm if the cat is implicated in wild conspiracy theories.

It's not the cat pic sharing that is paid for by entities with motive and interest in spreading misleading messages.


When Facebook is at its best, it's taking money from nefarious actors and fucking up democracy

In Poland, for a good half year before our presidential election, and then for a year after, I was getting obviously paid-for posts from dozens upon dozens of right-wing extremists.

I am left-centrist (although my dad jokes that I'm a Bolshevik). I don't have many friends who skew right-wing either. Yet there was my wall, chock-full of sensationalist BS, for over A YEAR

Wonder how much money Facebook earned in that time just from sad little Poland


Social communication is in-person, face-to-face communication. It's not moving bits around over an http interface.

I love idealists, but it's time to take an honest accounting of the differences between the promises of the internet and the actuality.


Not anymore. I'm sorry, but that view is now obsolete.

The fact is that large and increasing quantities of our social communication now occur online.

I'm not claiming this is salutary or calling it a good thing. I'm just stating the fact.


I have no doubt that a lot of social information is being transferred in a new way, much the same as the introduction of the telephone allowed a lot of social information to be transferred in a new way. My point is that true socializing and socialization are human, not digital experiences.

Not trying to be pedantic here. It's just important to sort out the terms used if any analysis is going to be useful. What's happening on facebook is not social. It's bits of data moving around about being social. Different thing entirely. Perhaps we need a new term for whatever it is, but whatever we call it, it's not the same as face-to-face real socializing with real people.


Aren't Bud and you socializing right now? Hell, I even feel I'm socializing a bit with you, how wrong am I?


Are you picking on me? Making a joke that we share? Challenging me to come up with a retort? Asking me to join in to a shared humorous narrative? Giving me a friendly jibe about an error or weakness you perceive?

We assume positive intent online because otherwise we get flame wars, but there's a helluva lot of body language and inflection that can take the comment you made and make it mean a dozen things. All of that is lost. And all of it is important.

In addition to the nuance, would I know severine if I saw them on the street? Miss them if they died? Remember the nice way they did X every time we met?

We are exchanging written opinions and statements, yes. These exchanges also occur in social interactions. But we are not socializing. There's no bond here that's become stronger, no subtle nuance or interplay that we're managing at a subconscious level. For all either of us knows, the other one is a bot. Or an alien. There's no humanity here.


I disagree. For me, and I think for many, online interactions are just as much a form of socialization as going out in person. For very small groups, I generally prefer the in-person version, but for larger groups, I think I get more out of the online setting.

Let's take our relationship, for example. I've been cohabiting some of the same online spaces as you for at least the last decade. For me, you are a valued part of that ecosystem, more so than most of my physical neighbors who I chat with in person a couple times a year.

No, I wouldn't recognize you on the street, but I struggle with face blindness, so it's a weak criterion. Would I "miss you" if you died? No, but I'd certainly reflect sadly on your death if I learned about it. "Remember the nice way you did X"? No, but I have trouble coming up with such an X for most people I interact with locally.

> There's no humanity here.

I'm surprised you would say this. Is this a recent change in your view of online interactions or have you always felt this way? Quantitatively I find about the same amount of humanity here as I do elsewhere, and qualitatively I prefer many things about what I find here to what I see when I go out in person. Are you OK?

(ps. I notice that many of the links in the sidebar of your site are broken.)


Facebook completely controls the presentation priority, no? Bolt a Watson-like truth scoring system on to evaluate content and use the resulting confidence to boost or bury content in others' feeds.

It would absolutely have to be automatically generated, but it doesn't seem impossible (just much harder NLP than Watson playing Jeopardy).
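A minimal sketch of that ranking idea (all names and numbers here are hypothetical, not anything Facebook actually exposes): scale each story's engagement score by an estimated truth confidence, so dubious stories sink in the feed rather than being removed outright.

```python
def rank_feed(stories, truth_confidence, floor=0.1):
    """Re-rank a feed: scale each story's engagement score by an
    estimated truth confidence in [0, 1], so dubious stories are
    buried rather than censored outright."""
    def adjusted(story):
        conf = max(floor, truth_confidence(story))  # never zero out entirely
        return story["engagement"] * conf
    return sorted(stories, key=adjusted, reverse=True)

# Toy usage: a viral hoax gets buried below a modestly shared true story.
stories = [
    {"id": "hoax", "engagement": 1000},
    {"id": "news", "engagement": 300},
]
conf = {"hoax": 0.05, "news": 0.9}.get
ranked = rank_feed(stories, lambda s: conf(s["id"]))
# ranked[0]["id"] == "news"
```

The hard part, of course, is the `truth_confidence` function itself; the re-ranking around it is trivial.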


If the scoring system does anything other than maximize engagement with Facebook, Facebook ends up making less money and performing worse on KPIs than they otherwise would. It'd be very difficult for any decision maker at Facebook to build, deploy, and maintain that feature. It'd be the sort of thing that'd only be there if otherwise the company would be under existential threat.


> under existential threat

You mean government regulation a la China because some memes really are dangerous? If you increase the velocity of transmission, then even in a free system there are some dangerously infectious falsehoods that it might be in the public good to slow. Think War of the Worlds on Facebook.


Ah, but herein lies the difference. Memes seem to universally promote freedom of communication, communication about freedom, and a way to react to the oppression of the status quo. In China, this is counter to the powers that be, but in America, this is fully aligned with the anti-regulation party that is sweeping through all branches of government. Memes play into their hands, and so Facebook would never be under existential threat.


That, or a public reaction to their product (i.e., clickbait fatigue: Facebook gets really good at local optimization for the most engaging news posts but creates a news feed that convinces people that Facebook as a whole isn't worthwhile).

It basically needs to be serious enough to overwhelm the internal incentives of maximizing key performance indicators.


I think there's the global maximum vs local maxima argument in favor of doing something too. Why do people ultimately spend time on Facebook?

I would say they don't do so because of micro-optimized KPI targeting: they do so because they feel time on Facebook is valuable and makes them happy. If Facebook can't deliver increases in that, then optimizations only take them so far.


Yup. There's definitely some Goodhart's Law going on here as well. Micro-optimized KPI targeting is an inexact measure of feeling like one's time on Facebook is valuable.


Watson scoring something for truth is only as good as the data drawn from. Who decides that data? Jeopardy questions center around long-established, non-controversial information; how does a Watson-like system evaluate unprecedented breaking news? It's not a question of the language used, it's a much, much higher-order issue that current Watson sidesteps.


Agreed, much more difficult, but ultimately the same way we do so? We believe certain ideas to be facts (hopefully based on scientific sources), then we parse incoming information in light of those.

I can't imagine an unprecedented breaking news story that has no basis in any factual information. I'm not talking about an upvoter here, but primarily a downvoter (incongruence with accepted facts being easier to prove than vice versa).


Sure, and that's the higher-order problem. If we could make machines that solve any problem the way we do, we'd be a lot closer to AGI, but for now, it's still science fiction for a computer to have beliefs, and understand information in the context of those beliefs.


I don't think ranking new information according to held beliefs is necessarily that far towards AGI (admittedly, strong NLP may be, though).

There's no fundamentally creative step in deciding "How well does this new piece of information match pieces I previously had?"


It seems to me the hardest part would be separating truthiness from virality. How do they seed legit sources in the current media climate of racing to the bottom for eyeballs?


Isn't this essentially the problem Google's been solving with continued iteration on PageRank?

From an untrustworthy collection of input relationships, how do I produce the most reliably correct output?
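For the curious, the core of that PageRank-style approach can be sketched as a simple power iteration over the link graph (a toy version for illustration, not Google's production algorithm): trust flows along links, so a page endorsed by well-endorsed pages ranks higher.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. `links` maps each page to the pages it links to.
    Returns a score per page summing to ~1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Base "teleport" mass, plus mass passed along outgoing links.
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its mass evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# "b" is linked to by both "a" and "c", so it ends up ranked highest.
scores = pagerank({"a": ["b"], "b": ["c"], "c": ["a", "b"]})
```

The analogy to truthiness is imperfect, though, as the sibling comments point out: this measures endorsement, not accuracy.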


No because popularity does not equal truth.


It doesn't? Because in the sense that popularity is "agreement with consensus scientific opinion" then it absolutely does.


Well, to be more specific about what I mean: Google's algorithm is successful if it finds the pages that people find valuable. It's sort of baked into the value proposition that the end user will be able to gauge the quality of results, so there is a sort of feedback loop. The problem with misinformation is that there is no in-band signal for Google about its truthiness. I suppose there are heuristics you can use related to the fact-checking quality of certain publications, etc., but it's much trickier than general perceived quality.

Nevertheless, I agree that if anyone can do it algorithmically, it would probably be Google.


When was this ever the definition of popularity?

I'd like it if it was, but I've never witnessed popularity that embodies that statement.

Popular opinion is usually informed by the most often repeated narrative.


No, it doesn't. Sometimes, popularity blocks the process of scientific inquiry. As Kuhn pointed out. But in any case, there is no truth, as Popper pointed out.


This is basically impossible given the current and near future state of the art.

The Allen AI group has been working on solving year 4 multiple choice science tests for about 3 years. They now score a bit over 60% correct, and that is a much easier task.


What I'm suggesting we start with is not exactly the same. The analogous question in that domain is "How accurate have they been in eliminating wrong answers?"

Everything has to be right for an answer option to be correct. Only one thing has to be wrong for it to be incorrect.


And how do you think that they would do that?

Choosing a random story from politifact: http://www.politifact.com/missouri/statements/2016/nov/06/ro...

Like most false stories, there is a hint of truth to this, but that doesn't make it true. Politifact spends 3 pages discussing it, and concludes:

Courts did object in three cases to decisions Kander made as secretary of state. But the specifics of those cases are much more nuanced than Blunt lets on. And the claim that Kander tried to manipulate the election is, at best, unproven.

A SOTA computer system could probably parse that last paragraph I posted, but couldn't get anywhere near doing the research needed to reach that conclusion. We are years off that (I work on research in this area).


I'm not surprised parsing something nuanced is beyond state of the art. But what about "illegal immigrants are committing an epidemic of rape against our children" or "we have a flood of illegal immigrants"?

A lot of claims that were made this cycle are rooted in measured quantities where the measurements we have disagree with the claims.


Define "epidemic of rape" or "flood of illegal immigrants".

If there is at least one rape or at least one illegal immigrant then it becomes subjective or contextual.

Is 3 rapes a flood? What if it is in a week in one town? What if one was by a former illegal immigrant? What if it is over the course of a year, but all 3 occurred in one day?

Is 20 illegal immigrants a flood? What if they all arrive in a town with a population of 100?

Who is "we"? Is a story about millions of illegal immigrants in Greece relevant? What if it about thousands of illegal immigrants working in Athens? What if Athens is in Georgia?

A real example:

"Broken Families: Raids Hit Athens' Immigrant Community Hard" vs "Athens crackdown shows no hospitality for illegal migrants"

One is a real headline from Georgia, the other from Greece. Can you tell which is which?


They can just form a group of truth checkers, a ministry of sorts, whose job is to inform people of the truth. They can call it the Ministry of Truth, where busy citizens who have no time to research can get their truths from.


How would AI resolve religious issues?

The Jews say they are waiting for the Messiah.

The Christians say the Messiah came once and will come again.

The Muslims say there is only one God and Muhammad is his Prophet.

Which of these groups is correct?

I would love to see the AI algorithm that can finally settle this issue! Seriously, mad props to the programmer who writes that code!


Fortunately, you don't really have to solve that problem. It would be sufficient to assess claims of actual fact about the actual world. I think the first step in writing the classifier would be to write something like

    float IsStatementAboutFacts(string statement);  // Returns confidence estimate


Your statement is that of an atheist. You don't seem to realize that many of the followers of these religions believe their assertions to be factual in every way. When I got off the subway today, at the Port Authority, in New York City, there were 2 women, holding up signs about Jesus. One of the signs literally said "Historical fact: Jesus rose from the dead".


To me that's less of an issue than a statement like "if Trump is elected Jesus will return."


This statement is true because Jesus promised to return regardless of election results :)


Right, they have a few historical claims. But none of them have any proof for those claims and their confidence is off the charts.

Some of their statements pass the first filter, far fewer pass the second.


Why not do what Twitter does and let the floodgates open, i.e. do no filtering?

I'm however doubtful that a "successful" social network would learn from a struggling one. Well, c'est la vie.


Is it established that that is in fact what Twitter does? I see something non-chronological on Twitter and it hasn't been clear for years what criteria are used.


They also banned the Hillary-for-prison hashtag, I believe. The supporters had to spell it wrong in order to get it to trend again.


indeed, it would take more than likes and shares; an election of just votes and promises is insufficient too, yet there we are.


"they were pushed to that by accusations of bias in the right-wing media."

Accusations of bias? I think you mean to say that actual Facebook employees stated this as a fact. And the story was covered by ALL news outlets, starting with Gizmodo.


And Zuckerberg meeting right-wing leaders shows that there was a great element of truth in it.


It could also mean he was simply running damage control on a bullshit accusation he had no way of denying.

If sites are spreading factually provable lies through Facebook as fact, and people are acting on it, should Facebook have any say in telling people "Hey, this is garbage"?

It's an open question.


It's also partly a constitutional question. I want my friends and me to be able to share articles we find in whatever press on FB. I'll filter by friend. (Mea culpa: I haven't read TFA yet, so I may be missing some context.)


> ALL news outlets, starting with Gizmodo.

That's like saying "the truth of this information is at least 0"


> Now they are in a situation where they are damned if they do, damned if they don't. And people immersed in echo chambers will accuse them of bias no matter what.

What if they simply change the algorithm to perform some graph clustering/principal component analysis on the stories that are shared organically among groups, and then inject, let's say, 10% stories that are disproportionately shared by other clusters, but not this one?

The granularity could be tweaked to chunk everything into let's say ~5 clusters per nationality and language.

The machines do not have to assess facts, they just have to allow people to see beyond their friend-horizon. So call it the bubble-burster.
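A rough sketch of that bubble-burster (everything here is hypothetical: the cluster assignments and per-cluster share counts would come from the clustering step described above):

```python
def burst_bubble(own_stories, share_counts_by_cluster, my_cluster,
                 inject_fraction=0.10):
    """Replace a fraction of a user's feed with stories shared heavily
    by OTHER clusters but rarely by their own.
    share_counts_by_cluster: {story_id: {cluster_id: share_count}}."""
    def outside_score(story_id):
        counts = share_counts_by_cluster.get(story_id, {})
        other = sum(c for cl, c in counts.items() if cl != my_cluster)
        own = counts.get(my_cluster, 0)
        return other - own  # disproportionately "foreign" stories score high

    candidates = [s for s in share_counts_by_cluster
                  if s not in own_stories and outside_score(s) > 0]
    candidates.sort(key=outside_score, reverse=True)
    k = max(1, int(len(own_stories) * inject_fraction))
    injected = candidates[:k]
    keep = own_stories[:len(own_stories) - len(injected)]
    return keep + injected

# Toy usage: "x" is popular in cluster B but not in the user's cluster A,
# so it gets injected; "y" is already popular in A, so it does not.
feed = burst_bubble(
    ["s1", "s2", "s3", "s4", "s5", "s6", "s7", "s8", "s9", "s10"],
    {"x": {"B": 500, "A": 3}, "y": {"A": 400}},
    my_cluster="A",
)
```

Note this still doesn't assess truth at all; it only widens exposure, per the comment above.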


Quite sensible. The more options the better


Human editors can separate sane from insane? I don't think so.

The recent presidential poll fiasco is a prime example. All the large media houses, like CNN, ABC, WaPo, and NYT, completely failed to detect and acknowledge the silent right-wing voters, claiming Clinton would win hands down. This is not just about the political bias of the media but, more importantly, their complete failure to judge the pulse of the nation. In my opinion the mainstream media was just insane here.

On the contrary, the media engaged in active social bullying of conservative voters by calling them bigots, racists, misogynists, and what not. Had the media been a lot more sensitive and sane in understanding both liberal and conservative voters, had they focused on what issues really mattered to the conservative voter base instead of ridicule and bullying, I am pretty sure it would have changed things.

I blame the social bullying by the mainstream media as the prime reason why the country seems divided.


> The recent presidential poll fiasco is a prime example. All the large media houses, like CNN, ABC, WaPo, and NYT, completely failed to detect and acknowledge the silent right-wing voters, claiming Clinton would win hands down.

The fake news on Facebook is a whole different thing; it's easy to spot, and it's literally news stories made specifically to game Facebook. It's insane that it has gotten this far.

https://www.buzzfeed.com/craigsilverman/how-macedonia-became...


I don't think the "bias" of mainstream media sources is solvable, and it's not the problem we need to solve.

The problem we need to solve is the sharing of hysterical, blatantly false information, presented as facts, from specific, intentionally propagandistic sites. And I wouldn't hide or disallow it, I would flag it with a warning message.


I think the solution to "damned if you do, damned if you don't" here is to just remove the stupid news sidebar.


I imagine the news sidebar drives a lot of traffic.

(I agree, though.)


Facebook fixes misinformation with this one weird trick! Publishers hate it.

(I couldn't resist)


Yes. Remember that Facebook itself is essentially becoming what AOL used to be. AOL News was huge and it's still probably one of their most popular sites to this day.


No. The solution is to continue to provide news, to curate it, to shut down professional liars, and to tell the critics that if they don't like it, they are free to encourage their followers to use other social media.


> shut down professional liars

Would they really boot CNN?

CNN told us this: https://www.youtube.com/watch?v=_X16_KzX1vE

They were laughably wrong: https://popehat.com/2016/10/17/no-it-is-not-illegal-to-read-...

Maybe one can claim it wasn't an intentional lie, but this guy is someone who really should have known better. So if it wasn't an intentional lie, they were negligent. Not sure where that leaves us.


>Maybe one can claim it wasn't an intentional lie, but this guy is someone who really should have known better.

It would be great if an algorithm could fight this type of content, but it probably can't.

But we can probably start with sites that publish stories like "Rudy Giuliani SLAPPED A REPORTER WITH HIS DICK last Tuesday" and that might solve enough of this problem to call it a decent win.


Even a stopped clock is right twice a day.

Yes, you can completely filter them out figuring it's not worth the effort, but the problem is that's how you form bubbles to begin with.

Might have more luck boiling them down to verifiable facts and stripping all opinions (especially every explanation of "why X did Y" or "what X means").


I just don't know where the cutoff is for professional or habitual liars. There are many ways to mislead, cherry picking data, methodological errors, straight up confusion and overwhelming technical data. Do we need to model the frequency of offense or severity?

I'm not really going for the whole slippery slope aspect, more I feel like I'm miscategorizing the problem.

Maybe the metadata on misinformation looks different from the real deal. How quickly does it spread, and through which people? No need to actually judge the content.


Agreed! They just need a panel of curators from diverse backgrounds (men, women, race, religion, education...) who can provide different perspectives and reduce bias (I say reduce... human decisions will always have bias). Hell, maybe an independent company does the curation so there's no bias against anti-FB postings.

If there's no consensus on the "truthiness" then the default is benefit of the doubt that it's true.

I know we are all in IT, but not all problems are best solved by algorithms.


Would your definition of professional liars include Hillary Clinton and Barack Obama?


The news sidebar to me now is only a "Click to see if this random celebrity died or not" bar


It's amazing how much Trust Me, I'm Lying explains 90% of what you see shared on social networks these days. A great book, and still really relevant.


Not only in social media. The trend of the news in what were once subscription-only papers is disturbing as well now that they're online and supported in part by online advertising and eyeballs. I agree. Great book, written by someone who's been there.


Just a reminder that anyone in the UK or with an interest in UK politics should subscribe to Private Eye.

Ian Hislop likely has the distinction of being the most-sued person in English legal history - almost entirely by people seeking libel writs in attempts to bury the truth.

There is a podcast and selected content online. Funnily enough, it's the only periodical in the UK that has seen a rise in readership.


It seems like the best thing to do would be penalize a site if it shows verifiably false content until it both updates the original story and issues a retraction. Facebook could prioritize retractions in its algorithm and ensure that people who saw the original story would see the retraction as well.

Content providers who abide by journalistic guidelines could be prioritized. Providers who actually have an editor could even associate the editor of each article with the post so that the editor's individual track record could be checked alongside the site itself, which would work better for large organizations who have multiple departments and don't deserve to be globally penalized for a single bad editor.
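A toy sketch of such a retraction-aware penalty (the weights, field names, and retraction boost are invented for illustration; this is not anything Facebook has described):

```python
def source_weight(site, base=1.0, penalty=0.25):
    """Weight a source by its outstanding (un-retracted) debunked stories.
    Each outstanding story multiplies the weight by `penalty`; issuing a
    retraction (and updating the original) restores the weight."""
    outstanding = site["debunked"] - site["retracted"]
    return base * (penalty ** len(outstanding))

def post_score(post, site, retraction_boost=10.0):
    """Score a post for the feed: engagement scaled by source weight,
    with retractions boosted so they reach readers of the original."""
    score = post["engagement"] * source_weight(site)
    if post.get("is_retraction"):
        score *= retraction_boost
    return score

# A site with one un-retracted debunked story is heavily penalized...
site = {"debunked": {"story-a", "story-b"}, "retracted": {"story-a"}}
weight_before = source_weight(site)   # 0.25
# ...and recovers its full weight after retracting it.
site["retracted"].add("story-b")
weight_after = source_weight(site)    # 1.0
```

The incentive this creates is the interesting part: retracting quickly becomes cheaper than leaving a debunked story up.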


Do you really want Facebook in this role?

I guess it works as long as you agree with their choices, prioritization, and opinions but as we saw this cycle, there was a collusion between "journalists" and the Clinton campaign. Even if you see no problem with that, would you see a problem if the Trump campaign had done it?


Not particularly. Personally, I've always advocated for some type of "internet journalistic credentials" that could be embedded in meta tags to identify content producers, editors, originators and retractions according to a process of journalistic ethics that actually valued real journalism over content spam. Content originators over scraper and ad-farms.

I'd honestly envision something closer to the BBB to manage this type of thing and track membership, ratings, disputes and resolution that all browsers and search engines could tap into. The key would be that freelance journalists would need to be able to gain membership in the same way that journalists get officially credentialed to cover different types of events.


The BBB is just a way for angry consumers to snipe at vendors. It doesn't do any sort of investigation of issues, nothing close to Politifact or Snopes.


You can't win against clickfarmers and clickbait networks.

No machine learning will save you from botters, as the lion's share of commercial botters live in the country from which most of the math research behind machine learning comes.

In Russia, few people in their sane mind will start an advertising network startup; commercial-scale clickfraud, farming, and botting give an order of magnitude higher returns.

The term "Internet marketing business" is used here mostly as a euphemism for all of the aforesaid.


I think a lot of the problem arose because they kept the content editors close to home.

A lot could be solved by just adding people to that department, located across the country in states where they already have offices anyway, or hire editors to work from home. That way, the editorial content wouldn't have to be so biased by region.


That would help.

Another problem is the type of people who would take this job. They must be literate, but not able to find a better job. This will be people with degrees in English, journalism, and similar -- giving a pretty obvious political bias. You won't be getting mechanical engineers, coal miners, patent lawyers, and stock traders.


> lead to a Congressional investigation.

That really seems like "abridging the freedom of speech" to me. Constitutional conservatives sure love pushing the limits of the constitution when it inconveniences them.

But, you know, the election is over so now that the right has total control we must all put aside our differences and work together.


It is kind of Orwellian to claim an investigation of suppression of speech is "abridging the freedom of speech." Nonetheless, it isn't the government's business, what Facebook does with its feed, and constitutional conservatives should recognize that.

http://thehill.com/policy/technology/279498-some-conservativ...


You don't have freedom of speech in Facebook, if Facebook doesn't want you to, it's their platform...

They may lose credibility doing that, and it would become a worse place to discuss things, but that is up to them, unless you have some kind of contract with them that entitles you to something else.

That for me is the main problem, people just want to trust stuff blindly and get all jived up when that doesn't work. You're mad at your own stupidity folks.


The original "bias squad" was not a first amendment issue. Facebook isn't the government (despite their incredible reach).


I think if they control 60% of the news pipeline (and climbing), they're categorically different. The U.S. government regularly investigates similarly large entities.


Facebook, et al, campaigned to have internet access be considered a utility instead of a luxury. The UN is pushing it as a human right.

If you follow that line of reasoning, the companies who run the pipes and control the flow of information aren't just companies playing favorites. They're potentially influencing - if not outright regulating - human rights worldwide.. which means they should have a much higher level of scrutiny.

And they walked right into that one.


FB would need to abandon the algo-driven feed altogether. yes, manual forwarding is bad, but that existed in the email days.

nowadays you get all that suggested content which is driven by your past behavior, acting as a giant amplifier.

to go back to something idlewords is harping about - all of that is possible because of the persistent storage of user data. kill the deep profiles, kill the tracking, only use what the user directly, willingly entered.

in SV speak: utter heresy.


> Now they are in a situation where they are damned if they do, damned if they don't

On the topic of "don't," an unspoken option is that Facebook simply not try to do news.

> But the entire system is fundamentally broken...

Why would they persist in the face of this? Hamartia?


> Why would they persist in the face of this? Hamartia?

"If you're not a part of the solution, there's good money to be made in prolonging the problem."

See https://despair.com/products/consulting for the poster that I stole that from.


Facebook consumes too many eyeball hours not to do news.


> Now they are in a situation where they are damned if they do, damned if they don't. And people immersed in echo chambers will accuse them of bias no matter what.

You are making an assumption that algorithms can not be biased. Earlier, humans were biased. Now, algorithms are biased.


Provided the data isn't wrong, we can typically put tight statistical error bounds on the output of various ML algorithms. Most people's understanding of how algorithms can be "biased" is completely wrong. There was a whole furor a while back about algorithms discriminating due to language differences in the data for people of different races. Guess what: if you give race as a parameter to any half-decent algorithm trained on that data, the algorithm will learn that the written data contains those biases. Decent algorithms trained on true data are pretty much guaranteed to have results within a very very small margin of reality.


> Guess what; if you give race as a parameter to any half-decent algorithms trained on that data, the algorithm will learn that the written data contains those biases.

You know what? Even if you do not give race as a parameter; an algorithm could be biased. It can easily learn race from secondary or tertiary parameters.


I don't think you understood my post; giving race as a parameter allows ML algorithms to detect and counteract human racism.

If an ML algorithm notices a disparity along e.g. racial lines, it's because it's actually there, not because a human imagined it.


> damned if they do, damned if they don't

The level of damnation differs significantly.

Having a human team that is accused of bias (possibly falsely) and pressured by politicians to change puts them in the same category as the NYT, the Wall Street Journal, and every serious print and broadcast service on earth.

Algorithmically spreading exciting and blatantly false claims puts them in the category of the National Enquirer, and 'The Protocols of the Elders of Zion'

One category is significantly less damned than the other.


I think that's a genuine concern; the only way I can think of to break this paradigm is to have users elect moderators.

This would also give a platform for new and grassroots politicians to rise up over time. Right now, grassroots enthusiasm seems to flow towards senators (Ron Paul, Bernie Sanders, ...) because they didn't vote for something most of the other senators voted for.

Meanwhile, having access to the capital and connections needed to even become a senator is very constrained and unlikely to come without strings attached.

Anyway, having users elect moderators solves the problem of "trusting the moderator", and since I assume it would be a salaried position, Facebook can apply and enforce fairness guidelines.


It seems pretty obvious to me that they replaced the human-curated news section with a far (and intentionally) inferior product. They knew people would eventually begin complaining about the new system and ask for the old system back.


Why exactly shouldn't it be beneficial if we let the market (the viewer) without any regulation decide which information should spread and which shouldn't?

If people prefer seeing outrageous stuff then let them have it.

The reason why governments and institutions want to control this is in order to control its people, all the other reasons given are just an attempt to mask this.


> Why exactly shouldn't it be beneficial if we let the market (the viewer) without any regulation decide which information should spread and which shouldn't?

1) It doesn't work. It leads to the lowest common denominator winning.

2) Humans aren't rational.

3) The world is complicated enough that the median person simply doesn't know much about anything outside of their silo.

4) There are powerful forces conspiring to profit off of people's ignorance, rather than to turn them into less ignorant, more rational, more empathetic, more curious people.

5) There are many people who simply aren't that curious. Left to their own devices, they will be overly susceptible to trickery and populism.

This might be a nice idea in a socially homogeneous group of a small number of like-minded farmers. It fails surely and predictably in a large group trying to make good decisions in a complex and technical world.

Read: Why Smart People Make Big Money Mistakes And How To Correct Them: Lessons From The New Science Of Behavioral Economics


It doesn't matter whether I am rational or not; it is my freedom to consume information from whichever source I prefer, just as it doesn't matter whether it's justified to spend double the money on an Apple product over something from a competitor.

It's also irrelevant how you would like people to be (more rational, empathetic and so on); no one on this planet cares enough about your opinion on that matter to actually change themselves. (And no one ever will, trust me.)

What matters is that people are free to choose to be whatever kind of person they want to be.

Some of them choose to be persons that you don't like, well then don't be around them, issue solved.

You seem to believe that we exist to follow some great plan devised by other people who believe they are smarter and know what we should actually be instead of what we choose to be. This is not the case.

We've all seen the results of ideologies that seek to reach utopia by changing man into something elevated (the Soviet Union, Nazi Germany, North Korea, the Khmer Rouge, Venezuela). They always failed in a catastrophic manner and often caused unimaginable suffering because of the authoritarianism inherent in such ideologies.


We've learned enough neuroscience to know that humans make predictable systemic errors in judgement. This is when pursuing their own preferences, not someone else's preferences.

This has real effects that harm other people. Do what you want to yourself. The problem is when you harm others due to your own ignorance or avoidable biases.

None of the big bad movements you mention were genuinely trying to make humans into better thinkers. In fact, they all relied on humans not being such great thinkers.


> This has real effects that harm other people.

That's why we have courts where you can claim damages.

> None of the big bad movements you mention were genuinely trying to make humans into better thinkers. In fact, they all relied on humans not being such great thinkers.

Oh really? The Soviet man, the Übermensch as understood by the Nazis, and all similar socialist concepts are then completely new to you? Some of these concepts are pretty close to what you described a person should be like.

https://en.wikipedia.org/wiki/New_Soviet_man

So it seems to me that you basically share many of the views of a Hitler or a Stalin on questions like these.


You seem to be pattern matching on certain features and neglecting that details matter.

How can conscientious teachers or coaches exist in your worldview?

Surely educators are all evil because they dare to seek to improve people. Education has been used as a pretext to imprison people in oppressive regimes therefore education is evil. Absurd!


I have no issue with education and teaching without coercion; what you are proposing, though, is the use of force to prevent the spread of information you do not like or that might cause people to reject your ideology.

This is the real issue here. You are arguing in favour of using force (the basis of Socialism) while I'm arguing for freedom (the basis of Capitalism).

And please don't tell me that censorship or regulating media isn't really using force. Go break censorship laws in countries that have them and let's see what they will do to you.


You just don't have enough contempt for humanity. The masses are unaccountable and will lie as much as they think they can get away with. It won't help them. They will lose sight of their self-interests as they become more and more detached from the facts of the world.


That's what happens when you try to control others' speech. You will be blamed for bias, and the worst thing is that the accusations would be completely correct, because humans are biased. Humans have opinions, humans have cognitive biases, humans are prone to groupthink and a long list of fallacies. It is extremely hard work to get even close to objectivity, let alone achieve it - professional scientists who are trained and have tons of tools to avoid biases still regularly fail and produce irreproducible results and wishful-thinking papers that are then debunked - and for somebody involved in politics, where there are no such tools, it's doubly hard. Hoping Facebook somehow magically solves this problem for us is naive at best.


The solution is simple: stop being gullible and stupid, trusting things you hear or read without something credible to back them up.

Facebook has its own bias; most people do. Some professional reviewers and political commentators try to be as unbiased as humanly possible, but why blindly trust them? Even if that's their intention, there is nothing that says they cannot just be factually wrong.


Maybe their AI should be trained to fact-check stuff posted on Facebook? So if you post something that claims outrageous things, it would be visible right from the start that the claim is false.


There was a quote regarding this which I can't find, but paraphrased it was something like,

"Bullshit flies halfway around the world before the truth has even gotten its pants on."


It's a sign of the times that we shit on Facebook's AI for not quite being smart enough.


There's a proper response to "accusations of bias" from alt-right white-nationalist thug media. Never mind whether the accusations "looked likely to lead to a Congressional investigation".

The response is to tell them to fuck off, and that the professional editors in your private company will continue to do their job however they see fit.

Period, paragraph.


The only reason it looks alt-right white-nationalist to you is because anyone on the left who disagrees with prevailing left wing dogma is branded a heretic and a bigot, and shut down quickly.

Take The Guardian. They employ people like Jessica Valenti and Laurie Penny who publish feminist clickbait. Then they aggressively moderate comments which explain why those two are ill-informed at best and dishonest manipulators at worst. Then they do a "study" where they demonstrate how much harassment their female writers receive, by considering every deleted comment to be an instance of harassment.

Alternative hypotheses are not welcome, and there is no accountability. Actual studies like Pew's about the magnitude and nature of real harassment are ignored, or spun by cherry picking. The dogma remains: women have it worse, and it's because they are women. This system gets them more traffic and attention, so they are encouraged to continue, even as they insist it's terrible and someone should stop it.


I may be missing something here, but what do your two paragraphs of axe-grinding about women have to do with his remark about American "alt-right white-nationalist thug media"?


What do two paragraphs of axe-grinding about women like Jessica Valenti and Laurie Penny have to do with American "alt-right white-nationalist thug media"? Take a look at the narratives they've been pushing about Trump and the evils of the alt-right lately.


SOME women. You're already altering his words.


Where does he use the words "SOME women" then, if you want to be this anal?


Your comment implies that the commenter was axe-grinding with women in general. The complaint was about two particular women writers at the Guardian using their own sex and feminism as a shield against criticism and fuel for their own click bait.


> Your comment implies that the commenter was axe-grinding with women in general. The complaint was about two particular women writers at the guardian using their own sex and feminism as a shield against criticism and fuel for their own click bait.

So it's okay for you to infer what other people mean, but not okay for me to do it? Where did I say "women in general"? Why would my post imply what he said, when everyone can read exactly what he said and interpret it for themselves, just like you did?

He's grinding an axe (completely out of nowhere) about "feminist clickbait", "female writers", the "dogma" that "women have it worse", and singling out two women (and "people like" them) as examples. What am I supposed to do, pretend he's talking about men?

Likewise, why should I pretend that he said "some women" when he didn't? I have no idea what proportion of women he has a problem with.

None of this is relevant to my point though, which was that his rant about women (however many it may be) had absolutely nothing to do with what he was replying to, and was simply an attempt to inject this particular hobby horse of his into a thread where it doesn't belong.


He has a problem with two women. It's you that made it ambiguous by claiming he had a problem "with women".


> He has a problem with two women.

How do you know he has a problem with (just) two women? Did he say that? No. Even now he hasn't said that. On the contrary, he specifically said "people like" the two examples he named.

Are you genuinely telling me you don't know what "people like" means in this context?

> It's you that made it ambiguous by claiming he had a problem "with women".

The words "with women" don't even appear in any of my comments, in any context at all, so I don't know where you're quoting that from. I actually said "I have no idea what proportion of women he has a problem with".

And if my understanding of another person's comment (which is still there, in his own words) somehow made that comment "ambiguous", why are you not finding it ambiguous?


I think The Guardian is a bad example of this. Their articles follow a particularly narrow group of viewpoints. They will never have the mainstream (liberal) appeal of the New York Times or the Washington Post.


Don't you mean a good example of this?


Incidentally, which Pew studies? In my anecdotal experience, harassment is a pervasive issue; e.g., I personally don't know of any urban woman not adversely affected by street harassment. Like a lot of things, it is a couple of men doing a whole lot of damage.

I keep feeling like the whole alt-right anti-PC movement seems to be born of things people said online or corner case behavior in insulated Universities. Like, who cares? Most men I know who love the anti-PC movement would, in the moment, gladly intervene if a woman is getting verbally harassed and is visibly shaken up. Mention Valenti to the same person and they fly into a rage. Like wut?!

Most feminist and LGBT activists groups aren't talked about by Jezebel and are doing great work.

tl;dr: Message to alt-right activists everywhere: Talk to minority rights activists in person outside of emotionally charged protests. You'll find you don't disagree by much.


> Most men I know who love the anti-PC movement would, in the moment, gladly intervene if a woman is getting verbally harassed and is visibly shaken up. Mention Valenti to the same person and they fly into a rage. Like wut?!

I'm a man, I dislike men who are asses to women and women who are asses to men. I'm finding it consistent and deeply un-wut-worthy. Heck, it seems I sometimes even care about men vs men and women vs women too.


Telling people who aren't aligned with the liberal intelligentsia to fuck off is exactly what got us into the current mess. Most people are reasonable, but when the reasonable and rational media ignore their issues, we leave the door open to demagogues and dictators. How else could a silver spoon like Donald Trump ever claim to represent the everyman?


I think you're misinterpreting me, and also distorting my words significantly, which I object to. I didn't say we should tell people "who aren't aligned with the liberal intelligentsia" to fuck off. I was speaking narrowly about lawyers from places like Breitbart; i.e., the kinds of folks who have been trying to apply pressure to Facebook and making accusations.

NOT the general population.


Sorry, I apologize as I definitely was using your comment as more of a general soapbox than directly responding. Upvoted.


Don't apologize, it's how I took the comment as well. You can assume someone is guilty of generalizing when they generalize.


>Telling people who aren't aligned with the liberal intelligentsia to fuck off is exactly what got us into the current mess.

I am so sick of this argument that Trump's victory is somehow the fault of bubble-trapped city-dwelling liberals for not understanding how the rest of the country thinks. Trump supporters are the ones who have either been manipulated to vote against their own interests; implicitly elected a Republican administration no different from the ones that they now claim to have rejected; ignored the obvious signs that Trump is of the same "elite" ilk that seeks to exploit them; been duped to amplify social hatred and reinforce the worst bigotry and sexism of the right-wing; benefited from many of the federal supports they claim to hate; or neglected the existential peril of their belief that the economy is more important than the planet's ability to sustain life; if not all of the above. Should the left have picked a candidate that is more aligned with the issues of the working class? Absolutely. Does that mean it's entirely on them? Hell no.


To put some perspective on this, I am from Minnesota, I was there in 1998 and I helped Jesse Ventura get elected. It had a remarkably similar vibe of unexpected victory up-ending the establishment. Of course Jesse Ventura is much more of a rational and thinking man than Trump, but the race was similar in that no one took him seriously and he won in a huge upset against the major party candidates.

The thing with Trump is that liberals everywhere still can't get over what an asshole he is and what a disaster this is, but the fact that an asshole like him gets elected is very telling. We need to observe and understand this. Saying that people were duped and are idiots is not an interesting story, it doesn't help us make a better future, and quite frankly, Trump supporters are not all idiots and bigots. What the intelligent Trump supporter would say is that establishment politicians have duped everyone too, and that Hillary was not going to further your interests. Granted, Hillary was definitely the lesser of evils, but I don't think there's a cut-and-dried case that her policy was going to help the low to middle classes. Plenty of blame to go around here.


If you want to know why Trump got elected, it's because most people don't have a basic clue as to how the foundations of this country actually work. Seriously, go ask 100 random people what the national debt is, and I doubt you'll get more than 1 or 2 who have any sort of clue.


>What the intelligent Trump supporter would say is that establishment politicians have duped everyone too, and that Hillary was not going to further your interests

And that is why your "intelligent" Trump supporter is still clueless in this case, because he just voted in a white house full of career politicians (Newt Gingrich? Are you kidding me?). It's clear that Trump was mostly a protest vote against the establishment, but it was an incredibly hollow one and made literally no sense if the supporters looked beyond catchy slogans and strongman signaling. And every Trump voter, no matter how intelligent or decent or caring, still cast a vote for a racist and sexist candidate and that is something I find to be unconscionable.


Again with the racist accusation. You are either trolling or you have no intention of ever understanding why Trump supporters supported Trump, or you're just venting your anger from Hillary's humiliating defeat.


Trump actually promised to surround himself with experienced people, so "a white house full of career politicians" should not surprise anybody. It's keeping a campaign promise.

There was little protest vote. This was about corruption, job loss in the heartland, and some very non-racial issues with immigration.


Personally I blame it on the left's constant, ruthless beratement and demonization of Christian|white|rural|blue-collar|conservative people. I'm not surprised they pushed back after a decade of bullying in the name of tolerance.


Oh yeah I forgot it was the evil left that was demonizing people during the past ten years. Certainly no Christians during that time berated and demonized someone just on the basis of say, their sexual orientation or religious beliefs.

Give me a break. The right lost the culture wars (for good reason as their regressive social policies trampled all over civic rights) and this is them sending our country into a suicide because they'd rather die than face the reality that America is no longer a white monoculture.

Your reasoning is basically "maybe if you weren't so mean to the white supremacists Trump wouldn't have happened". Are we supposed to appease neo-Nazis and ignorance now? There were racially motivated attacks across the country last night, all done in the name of Trump. It does not look good.


Have you been on Twitter lately? If you phrase one thing the wrong way you get branded as a bigot, racist, sexist, whatever, and the torches and pitchforks come right out. If you don't see the parallels between the extreme right and the extreme left and prescribed moral code of what ideas and views are acceptable to express then you are not as smart and objective as you think you are.


> face the reality

Talk about reality checks.

You guys wanted to have it too fast and too easy. You thought you could change things by passing laws and silencing criticism with media hysteria. You celebrated your superficial legal victories while completely ignoring the fact that sentiments against "affirmative action", "LGBT rights", feminists, Muslims, Blacks and whatnot had been steadily growing for ten years all over the West.

You gave voice to the most bitter representatives of minorities to sell them as victims to the general population and didn't mind that the venom they spill enrages everybody else. As if you completely didn't expect that one day somebody may give voice to the most bitter of your opponents.


I rest my case.


> vote against their own interests

If an unemployed factory worker votes for a protectionist who promises to bring back factories, how is he voting against his own interests?


That's the first-level analysis. Let's dig deeper...

Why would one believe a so-called "protectionist" who has actually off-shored jobs?

Why would one believe a billionaire who claims to not be part of the establishment? A billionaire who was a celebrated part of the establishment during the '80s, in New York. That den of conservative values?

Why would one trust a candidate who was essentially a left-leaning Democrat until he decided to run for President?

Why would one believe a candidate who contradicts himself, is difficult to pin down on any policy details, and who says different things to different groups to a degree that's unprecedented, even for a politician?

Anybody can say anything. If you believe someone this erratic, you almost have to be wilfully ignorant of human nature.


Well, for many of your examples I would say they fit both candidates, with small adjustments. So the only choice is to go with the one that currently states what the voters agree with.

I'm always fascinated that people don't realize that what one side says today the other side said yesterday.


You make some good points, but this:

> Most people are reasonable

Is blatantly false. People are tribal.


How does tribal automatically mean being unreasonable?


It doesn't. I'm conflating two different things that make sense in my context, but that I don't have time to explain.


First of all, your use of biased language is ridiculous; you are as bad as the 'alt-right white-nationalist thug media', and I wish people like you had no place in rational discussions.

Second, if you tell them to "fuck off" you risk losing a significant chunk of your user base, who believe their content and information is being manipulated against them. Facebook isn't in the politics business; they are in the social networking business. If Facebook is seen as political at all, then a competing business will set up a service to sell to the other side. Look at what happened to mainstream media channels: claims of liberal bias in the media giants directly led to the success of Fox News.


You're right. Telling bullies "no" does risk that other bullies will leave your site. But that risk is already there.

And Facebook is already seen as political; that ship has sailed.


>And Facebook is already seen as political; that ship has sailed.

I don't think that's true. Certainly the extremes on each side will say that. But I think everyone in the middle doesn't. I don't. Certainly not in the same way as I see Fox News/CNN/MSNBC as political.


We're only talking about the extremes. We're talking about spurious pressure from alt-right media.


It's funny that you think people marching in the streets, blocking traffic, screaming obscenities and destroying things right now aren't bullies.


I think you're inferring something here that was never explicitly stated.


Correct. I don't know why editorial bias was a problem in the first place.

Right now they have false-equivalency bias that drives fake stories.


> I don't know why editorial bias was a problem in the first place.

Because, like it or not, Facebook has become a trusted news source. Because of that, they have a duty to accurately report the news.

Their human editors fell down by vanishing stories they found politically uncomfortable.

Their algorithmic editor fell down by promoting stories that were false.


> Because, like it or not, Facebook has become a trusted news source. Because of that, they have a duty to accurately report the news.

What other trusted news source has this "duty?" What other trusted news source does not have biased editors?


> What other trusted news source has this "duty?"

All of them.

> What other trusted news source does not have biased editors?

None of them, of course. Man is a fallible creature.

But other news sources try, most of the time, to be objective and accurate.


"OMG, Trump has won through lies and deception! We failed to stop him. How on Earth did that happen? We must out-manipulate our opponents next time."

If you read between the lines, this is what the article condenses to.

The discussion here is mostly creepy groupthink shit.

Social networks fact-checking their content? What's next? Should AT&T stop the spread of misinformation over its phone lines? Should USPS fact-check your mail?

Facebook is not a real news source and never going to be one. At best it's a communication medium. At worst it's a giant propaganda machine. Any moves to get it further away from the former and closer to the latter are just machinations to change who benefits from the propaganda and nothing else.

We don't need an "improved" Facebook. We need a working replacement for old-school newspapers, TV stations and radio channels. The "new media" eroded all of those, but failed (so far, at least) to provide anything of equal utility and value. Hence all the issues involved in the coverage of these elections.


Agreed.

The problem is the users, not the platform. If users propagate misinformation, the users propagate misinformation.

The easy solution people often jump to is fighting negatives with negatives, assuming it yields a positive. But often a positive approach is more effective. Offering incentives for good acts, not just disincentives for bad acts, is a fairly popular recent trend backed by research.

In this case I can't help but think that the best solution is to focus on education and the propagation of correct information, not censorship (or at least, something that smells like censorship) of bad information. If "new media" is a problem we should be fighting it closer to the source.

I don't know what Facebook's role in that would be, but ideally, as a platform, it would be minimal.

But ultimately, it's worth remembering that it's hard to build a good system with bad raw materials. If people are interested in falsehoods and echo chambers, their social media will reflect that.


Hmm, I mostly agree, but it's important to remember that a bad mechanism can encourage bad behavior as well. Facebook must make choices about design of the feed algorithm that have important effects on incentives and behavior. So even if they take a minimal censorship role, we cannot ignore the effect of the feed algorithm - in fact, we should focus on how to design it!

For example, the extent to which one "bubbles" users into their own echo chamber is largely up to the algorithm designer. If you look at "content aggregator" sites like HN, reddit, Facebook, they all have pros and cons. HN and reddit give incentives in the form of karma. They all have different levels of "bubbling" and opting into or out of bubbles.

Good design of such sites is an open problem -- even formalizing good goals for such sites is an open problem -- but it is a design problem we should be thinking about and addressing.


If the news feed were just a reverse-chronological list of everything your friends posted, then I'd agree with you -- Facebook would be just a medium. But they filter and sort it.

I would actually like to have a knob or a switch I could use to add or remove certain elements from the news feed filter algo. I think it would be great if one of those parameters was an (open-source) "factuality score."

They're already doing this with click-bait, why not brain-rot? Especially if it's an optional feature.
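To make the knob concrete, here's a toy sketch. Everything in it is hypothetical: the post structure, the "factuality" field, and the linear blend are invented for illustration, and real feed ranking would be vastly more complex.

```python
# Hypothetical sketch of the "factuality knob" idea; nothing here is a real
# Facebook API -- all names and the scoring scheme are invented.

def rank_feed(posts, factuality_weight=0.5):
    """Sort posts by a blend of engagement and an assumed factuality score.

    Each post is a dict with 'engagement' and 'factuality' in [0, 1].
    factuality_weight=0 reproduces pure engagement ranking;
    factuality_weight=1 ranks purely by the factuality score.
    """
    def score(post):
        return ((1 - factuality_weight) * post["engagement"]
                + factuality_weight * post["factuality"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "viral-hoax", "engagement": 0.9, "factuality": 0.1},
    {"id": "dry-report", "engagement": 0.3, "factuality": 0.9},
]

# With the knob off, the hoax wins; turned up, the report wins.
print([p["id"] for p in rank_feed(posts, factuality_weight=0.0)])
print([p["id"] for p in rank_feed(posts, factuality_weight=0.8)])
```

The hard part, of course, isn't the blending -- it's producing a trustworthy factuality score in the first place, which is the whole debate in this thread.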


>>But ultimately, it's worth remembering that it's hard to build a good system with bad raw materials. If people are interested in falsehoods and echo chambers, their social media will reflect that.

I wouldn't phrase it quite so negatively, but I think you're on to something here. If people don't enjoy going on Facebook, they won't go on Facebook. What do people enjoy going to Facebook for? Interacting with their Facebook friends, whom they likely friended in at least some part due to their shared beliefs.

"Fixing" Facebook in this regard means forcing people to interact more with entities outside their self-selected social group, when the whole reason they got on Facebook in the first place is to interact with their self-selected social group! Facebook would be foolish to fix this problem, it would drive their users away.


I don't see why Facebook is attempting to manipulate its users at all, nor how anyone thinks this is acceptable.

If people spread misinformation, then the way to stop it is to spread correct information.

If Facebook takes a more active role in shaping people's perceptions, I think it would be incredibly immoral and I would hope the company tanks.


The irony here is also the idea that all of this so-called propaganda comes from the right.

The left is constantly selecting its own set of "facts" and tactically leaves out whatever doesn't agree with their agenda.

Simply look at all the disingenuous pro-Clinton fact checking throughout the election. Just look at the amount of rape, assault, race-baity bullshit that is circulated by liberals every day. These articles may not fly in the face of science, but they are usually equally misleading.

Is this a veiled attempt at trying to silence wikileaks? Infowars? There is a reason they are being called the regressive left. They want some articles to be banned from social media because they believe they aren't true. They believe, yet again, that people should not be allowed to make their own decisions in life.


Please don't make generalizations about large groups of people. It leads to irreconcilable feuds.


Should the world stop generalizing because it hurts your feelings? You could just look it up in the dictionary and understand that generalizing does not mean that whatever is said concerns every single person of the generalized group. It should be way easier for both you and the world.


You shouldn't let a fear of feuds control you.


Maybe the word "feud" wasn't strong enough. Have you seen what's happening in Turkey? My Turkish friend suggested that's where this leads.


Eh... unless you have numbers, I'm going to disagree. I don't visit FB much, but I've seen enough other social media to know that both sides play the same game. They both leave out data, mis-use existing data, and outright lie. I'd love statistics to prove me wrong.


Isn't that what the person you are replying to is essentially saying? That it's both sides, not just the right?


Is it? I took it to mean that it's coming from one side. Maybe I'm misunderstanding?


Yes, you are misunderstanding. The comment correctly accuses both sides of misleading propaganda.


>They both leave out data, mis-use existing data, and outright lie. I'd love statistics to prove me wrong.

I don't disagree with this at all. I just said that they are framing this as if the only propaganda on social media comes from the right.


My apologies then for misunderstanding!


I'm sure it is true, but only one side has dominant control over internet services.


What would be so bad about at&t offering a service that banned fraudsters from calling you? Or if the USPS refused to deliver mail that said "URGENT TIME SENSITIVE" on the envelope unless the sender could prove it really was urgent?

As someone with elderly relatives who've fallen for scams over email (which major email providers already do block, automatically), I would celebrate both of those outcomes.


> What would be so bad about at&t offering a service that banned fraudsters from calling you?

That isn't what we're talking about, now is it?


Labeling something "misinformation" is not an objective act, period.

Facebook will be engaging in censorship if they pursue this.


Objectivity is literally about things being factual regardless of subjective opinions.

There's nuance in the world, but if someone says "The earth is flat", it's not dystopian for a newspaper or online newsfeed to editorialize and minimize those claims.


It's impossible to label an article presenting a theory or perspective as non-factual. No one ever said it was fact. Anyone trying to claim it is fact is a fool.

Facts are things like:

I have $20 in my pocket

Creationism isn't misinformation, even if it is lacking in evidence.


Agreed. This appears to advocate to fight propaganda with more propaganda. I don't trust anyone at Facebook to determine "facts."


Why? Facts are facts. The definition of a fact is that it is an objective, verifiable statement. Trust does not come into play because you don't need to trust anyone's judgment to verify a fact. Judgment is not involved.

Whether Fox fired Megyn Kelly or not is a fact. Facebook's is judgment is not involved in verifying it.

Whether an FBI agent who examined Clinton's emails is now dead or not is a fact. Facebook's judgment is not involved in verifying this fact.

You are conflating fact checking with opinion editing. The suggestion here is not that FB should filter out any kind of objectively verifiable fact, no matter which way the fact is used. It's that perhaps FB should filter out, or at least not signal-boost, objectively verifiably false information.


Humans are bad at determining facts. That is a fact.

Consider the irony of refuting that statement. Yes the word "fact" has a definition, but that doesn't mean humans can or will obey that definition. In early 2003 the NY Times reported Iraq had WMD, so that was a fact, right? How long would it take me to find a Facebook employee who would tell me it's a "fact" that vaccines cause autism?

Anyone who claims they can differentiate facts from opinion and truth from fiction without bias is the last person that should have that responsibility.


> "OMG, Trump has won through lies and deception! We failed to stop him. How on Earth did that happen? We must out-manipulate our opponents next time."

This is a ridiculous oversimplification of a complex and important set of issues.

The issue is the spread of facts vs misinformation, not liberalism vs conservativism, nor Democrats vs Republicans, nor Trump vs anti-Trump. Facts can work on both sides, as can misinformation. It's kind of fucked up to assume facts somehow only go one way.

> Social networks fact-checking their content? What's next? Should AT&T stop the spread of misinformation over its phone lines? Should USPS fact-check your mail?

Well, gee... Let's think carefully. Are your private phone calls on AT&T a public discourse? Is your mail? Do either AT&T or USPS signal boost some of your discussions during public transmittal? Is your analogy even logical?

Facebook is a communication medium, as you said. Moreover, it has a set of rules and policies governing what can be shared, which shared content gets shown to a given user, and how the content is further propagated and boosted. The entire discussion, which your post is completely sidestepping, is what content should or shouldn't be propagated and boosted.

One can make arguments for more rules or fewer on this, for different rules or keeping the same rules. But asserting that it is a non-issue or drawing false equivalences are non-arguments irrelevant to the discussion at hand.


I agree with some of this, but the main difference between AT&T and Facebook is that FB is controlling what you're receiving.

Imagine if USPS delivered the things that were sent to you, but also random packages it thought were good for you.

Facebook, even just through its algorithm, is exercising some editorial control and distribution.


Facebook will be less effective as a medium/content distributor if people become aware of the filtering it does. I think they are shooting themselves in the foot if they start actively censoring messages; people will just move on to Twitter for politics and use Facebook for the personal stuff only.

Trust the market; there is plenty of competition to put things in their proper place. Facebook doesn't own its users, either.


Couldn't agree more.

There is no actual problem: social networks are simply not the right media to be a news source, and they were never meant to be.

Moreover, to expect to be fed "correct" information all the time, with no effort on the consumer's part, is flat-out delusional. What we need, and what we have always needed, is to apply critical thinking.

Do you believe everything someone says? Well, then you have a problem.


I agree that the idea of "fact-checking" stories is troubling and is overall a bad idea.

You are not at all addressing what is actually in the article: the Newsfeed feature that surfaces the most shared stories. Facebook isn't discussing deleting homeopathy, Clinton body count, or Trump kompromat articles from our various Facebook walls.


They can fact-check whatever they want; it's their site. It's stupid, but then people should just leave for greener pastures. It's not like in the beginning there was Facebook and we're bound to use it forever.

The problem is people want to 1) trust blindly and 2) not be taken advantage of. You can only have one of those.


Such a comment betrays an incredibly unsophisticated understanding of the issues being discussed. Perhaps a revisit to McLuhan's maxim would be enlightening:

https://en.wikipedia.org/wiki/The_medium_is_the_message

https://www.youtube.com/watch?v=Ko6J9v1C9zE

https://www.youtube.com/watch?v=UoCrx0scCkM


Perhaps you could avail yourself of some of your erudition and explain what relationship these three links have to your objection.



It has less to do with "the digital age" and more to do with governments ignoring large numbers of voters. This is what's supposed to happen when the ruling class gets out of touch.


The idea of a unified "ruling class" is laughable. The current president is constantly at odds with the political party that controls the Senate and the House of Representatives.

I also disagree with the notion that most congressmen are out of touch with their voters. Each individual congressman is pursuing the agenda he or she was voted into office to pursue by his or her constituents. An example: some constituents want to repeal Obamacare while those in other districts want to preserve it. Each set is affected differently, so it is reasonable for different districts to hold different opinions.

Deadlock is a feature of this Republic; the founders considered it to be superior to a tyranny of the majority over minority interests.


>The idea of a unified "ruling class" is laughable. The current president is constantly at odds with the political party that controls the Senate and the House of Representatives.

Sort of. Not really. The political parties do represent different power factions.

But they are unified in the sense that people in the "deep state" - mid-high level bureaucrats, academia, and the media have more in common with each other than they do with you and me. And anything that threatens their collective control will be dealt with more harshly than they deal with each other.


Can you go into more depth? Also, I believe you are leaving out a group: high-level business leaders, who often do a tour in government posts. (I'm not criticizing the practice; we want competent, subject-matter experts working in government.)


I agree. Business leaders belong on the list.

I once worked a job that involved dealing with a lot of civil service folks. We worked on a land-locked Navy base which had exactly two uniformed naval personnel (that I ever saw, anyway), who were in theory the #1 and #2 people in charge of a few hundred civil servants and a similar number of contractors.

Now, everyone knew the Navy guys (a captain and his XO) would be there for about 18 months and then they'd get transferred or retire. If the captain wanted something to get done, it would only get done if the civil servants (who had been there 20+ years) wanted it to get done, because they knew how to gum up the works until he was gone.

They also knew how to undermine and embarrass him, which at flag ranks (or wannabe flag ranks) will end your career. You can't fire a civil servant unless there's a felony involved, so even if he figured out what was going on he couldn't do much.

That's what happens in Washington, too. Political leaders come and go. They put appointees at the top positions of giant bureaucracies, but the bureaucrats have their own agendas, and they know how to work the system. They know which reporters to leak what to if the president upsets them.

In Congress the Congressmen (and women) come and go. But they all rely on staff for information, and the same people pop up on congressional staffs over and over. Those are the people who actually write the laws (or edit what the lobbyists produce) - the congressmen don't even read what they're voting on.

The point is there's an entire layer of people, what I've seen called the "deep state", that you don't get to vote on except in the most indirect way. You could say they're not very ideological, if you're generous, or you could say their ideology is power. They went to the same schools, they go to the same parties, they marry each other, they read the same books, watch the same TV shows, etc.

It's not some grand conspiracy. It's just one of those self-organizing aristocracies that pops up whenever a government isn't overthrown for a long time.


Of course there are multiple elements of the ruling class with different interests. But I think it's also safe to say there are a lot of issues where Republicans and Democrats are in complete agreement and their constituents are not.


What would be an example of one of these issues?


How to handle Wall Street post-meltdown or various military ventures seem like the most glaring examples to me.


Immigration.


Brexit and Trump and Syriza and Bernie and all the rest of it point to the collapse of the middle class across the industrialized world more than anything else.


The middle class is doing fine. It's the working class that has been suffering, and they are the ones who voted for Trump in overwhelming numbers in swing states.


Sure, whichever. The terms are imprecise and many of the people we're talking about probably could have reasonably considered themselves middle-class in the past.


You've got it backwards. People imbue meaning into the medium.

Also, saying the comment you replied to is "poor understanding" is both not supported by anything you shared, since your argument is as factual as theirs, and obnoxious. Keep it to yourself, please.


It's also on us to resist the temptation to build social media bubbles around ourselves, and to poke into each other's bubbles. Every time I've wanted to block a friend or relative on facebook, I've put my phone down and come back to it later. Looking away from people we disagree with isn't working.

My cousin shared a post this morning asking why people on the left aren't celebrating the fact that a female campaign manager helped put someone in the white house for the first time. I wasn't sure what to say as his friends piled on to say things like "Yeah, I thought they were for women's rights?!" Here's the response I finally came up with:

I'm not celebrating her "success" because I imagine my Facebook and Twitter feeds look a lot different than yours. My feeds are filled with first-hand stories from women around the country who are being more openly harassed than they were last week. It's happening often enough that it's really dismissive to say "Oh, those are just a few assholes." People are openly harassing women in the name of Trump.

I think when we act as consumers of social media we need to stop building our own bubbles, and reach out into others' bubbles. And when we help build social networks, we need to intentionally structure them in a way that maintains connections, rather than isolating individuals and groups.


When I knew people who supported Romney, I felt no need based on that to exclude them from my life or my social media circles.

One's ability to support Trump tells me much more about a person. There are too many things that are good and should be fundamental about a functional, enlightened society that one must reject in order to support Trump. Prejudice, fraud, bullying, and sexual harassment must all be accepted.

People who would accept these things are not welcome in my life. It's not because they're on the other side. It wasn't like this in 2008 or 2012. This time it goes deeper than that.


And to support Hillary, one must accept lies, corruption, warmongering, corporatism, voter manipulation and unfettered globalization.

It's a choice between two evils, however you look at it.


Don't forget tolerance of rape and child prostitution. (See Jeffrey Epstein and William Jefferson Clinton.)


I mean, Trump is arguably FAR more involved with Epstein though - he literally has a quote saying that he acknowledges Epstein surrounds himself with young women:

“I’ve known Jeff for fifteen years. Terrific guy,” Trump told New York Magazine for a 2002 profile of Epstein. “He’s a lot of fun to be with. It is even said that he likes beautiful women as much as I do, and many of them are on the younger side. No doubt about it — Jeffrey enjoys his social life.”[1][2]

How can someone come to the conclusion that this means only Clintons support this? Clearly the common denominator is rich people abusing their power to rape children, not politicians raping children, and not Clintons only raping children. Rich people are abusing their power and America voted to fix it by electing a rich person who has abused that power.

[1]: http://dailycaller.com/2016/10/09/the-friendship-between-tru... [2]: http://www.snopes.com/2016/06/23/donald-trump-rape-lawsuit/


I don't think unfettered globalization and corporatism are on the same moral level as being a sexual predator.


agreed, they're worse.


So you are okay with supporting the killing of innocent civilians halfway around the world, the bombing of hospitals filled with women and children? Pointing out all the flaws of Trump (which are true, btw) while ignoring the bigger flaws of Hillary and assuming some moral high ground is more fucked up, imo.


But Hillary supports Bill? Why don't you support Bill, but do support Hillary?


There was a comment, just deleted, wondering whether people are really roaming around looking for liberals to hate on. The answer to that is yes. This just showed up in my feed [1]: driving around Wellesley (also where Clinton graduated from) taunting people.

[1] https://m.facebook.com/?_rdr#!/story.php?story_fbid=10210780...


There used to be a feature on Twitter for viewing other people's feeds[1], but was removed[2].

I made something to replicate that via Twitter Lists[3][4], but it seems like making one for Facebook is much more important.

[1] https://twitter.com/twitter/status/73833309163110400

[2] https://news.ycombinator.com/item?id=12117218

[3] https://otherside.site/

[4] https://gist.github.com/0x263b/7b391a1617fcbbabc57fb1e705884...


I think it would be relatively easy to have an "auto-snopes" feature which detects a URL being shared and immediately attaches a post that says "this has been debunked by X"

For example - I've seen dozens of links to Michael Moore's trumpland speech which strategically ends with the phrase "America will elect Trump and it will feel great"

Sample: http://www.zerohedge.com/news/2016-10-25/michael-moore-trump...

    He concludes:
         
    Yes, on November 8, you Joe Blow, Steve Blow, 
    Bob Blow, Billy Blow, all the Blows get to go
    and blow up the whole goddamn system because 
    it's your right. Trump's election is going to
    be the biggest fuck ever recorded in human 
    history and it will feel good.
but the truth is that is not what he CONCLUDED

He continues: https://www.youtube.com/watch?v=sVLTQIUMq18&t=30 [sic]

    ... and now you're fucked.

EDIT - I feel like I should clarify I mean "auto-snopes" figuratively. Auto fair-balance might be a better descriptor. Something that links to an opposing opinion automatically, in cases of absolute falsehood the debunking, or even a CSS CLASS where there is a bright red "FALSE" wrapper.

Snopes doesn't have to be the automatic choice.

Breaking filter-bubbles & ensuring truth would be my goal on an idea like this. Not strictly "promoting liberal media"


I have personally asked an individual why they don't trust snopes.

Their claim is that it's a Democrat rag towel and shill.

I then ask, then what fact-checker would you trust?

Their response: None of them.

Congratulations. We've reached a point, for this individual, where "facts" as decided in the common forum are suspect and the only thing they trust is themselves (and whatever non-Mainstream Media they listen to). We have reached a point where the only truth is what they decide is the truth.

That is a non-trivial problem to resolve. If we can't even agree what basic facts are, there is no way to even have a discussion.


I should also clarify that I'm not saying snopes is the end-all fact-checker. I would have been curious and explored any alternative, "non-liberal" fact-checker.

My point is that this individual has written off all fact-checking as an entire discipline. They are the final arbiter of truth. That fundamentally prohibits discussion of almost anything, or at least makes it extraordinarily labor intensive.

One might counter-argue, well, start with their beliefs and work outwards to illustrate contradictions and such. Very Kantian. Except that is 1) very labor intensive, and 2) doesn't cover all issues.


Well, is there a fact-checker that doesn't claim Trump's very unenthusiastic acceptance of war with Iraq when asked by surprise on the Howard Stern show means that Trump is somehow lying when he says he didn't support the war?

It's stuff like this, turning a single reluctant "I guess so" or "probably" or "it seems we should" into support, that make supposed fact-checkers suspect. Who will fact-check the fact checkers? The supposed fact-checkers were just another political weapon, tainted as could be.

Fact checkers are like certificate authorities. Once you lose credibility, you might as well close shop. You're done.


He said "I guess so" before the war started, and then started opposing it when others started opposing it. That's hardly the political courage he was attempting to convey here, especially when he says "I was against the Iraq war from the start"


I haven't seen the video yet, so this is just my interpretation.

Asking about the Iraq war at the time was similar to asking "Should we try to stop ISIS?" now. I'd give 80% courage credits to whoever dared to say "I'm not sure, I guess so".

All in all, unless Trump had access to more information than the general populace, it doesn't matter whether he opposed or supported it. We decidedly didn't have enough info to make a sound judgement, and the conclusion we made - be it correct or not - would merely be a metaphorical coin toss.


Read the wikileaks emails yourself. You've got eyes, you don't need to rely on New York Times editors to read it for you.


Generally those fact-checking sites will link to sources. It's much easier to find sources through them than to go digging through the emails manually.


Did and done.

However, I fail to see why Wikileaks is considered a Fact-checker, or how this pertains to a belief that NO fact-checker is reliable. Are you claiming that Wikileaks is trustworthy and can replace a fact-checker, but isn't a fact-checker? Because that seems a bit circular and oxymoronic to me.

Perhaps if you provided an alternative fact-checker source? The person in my original post has written off all fact-checking period.


You missed the point and sentiment of his comment.

Wikileaks is not the fact checker in this case, you are.

You were being invited to look at the raw data and draw your own conclusions/truths from it, not somebody else's.


You are being invited to look at "raw data" which is actually filtered, which has important information withheld as collateral and insurance, and which is released when it will have "maximum impact". To consider this alternative totally unbiased and impossible to refute just because it was founded in the internet era is absurd. It is run by human beings; it is not simply an "upload files here and instantly distribute them" tool. They have their own inherent biases, and it is dangerous, when receiving information this detailed and classified, to assume that the organization dealing in it has nothing of its own to hide. Obviously people will look bad if you get to snoop through all their trash, but it was the neighbour who let you snoop through this particular trash, and they won't let you look in theirs.


I too don't trust "any of them". I don't trust news sites or channels at all anymore, because there aren't any that even _pretend_ to be impartial. The biggest loser by far in this election cycle is the media, especially on the left side of the spectrum. No one will trust them again in the foreseeable future. As someone joked on Reddit: "If Donald Trump walked on water, healed cripples, and turned water into wine, the headline would be: 'Trump can't swim, takes jobs from doctors, and is a raging alcoholic.'" That's essentially how Trump's supporters perceived the media towards the end of the campaign, and in my opinion, very deservedly so. They're shameless shills, pure and simple. For the record: I don't consider Breitbart or Fox News to be a valid news source either.


The problem is that these "fact" checkers have become leftist doctrine... You just point to Snopes or PolitiFact and that is the end of the discussion? That's not how the world works. Wikileaks is a far more credible source than either of these. People need to do their own research if they want to form conclusions, not just swallow prepackaged conclusions.


I would argue that Wikileaks on its own can also be very misleading. Reading emails out of context, and without knowing whether they've been altered, is probably not the best way to get a full picture. The value the media adds by looking at them is that they can go get other sources to add context or to corroborate what they're reading in the Wikileaks releases, sources which aren't available to everyone. It's also easy to be fooled into seeing as nefarious things which turn out to be benign when given context.


Give me six lines written by the hand of the most honest man, and I would find something in them to have him hanged.

... and now Wikileaks has invited the whole internet to come play Richelieu with them, only with ten thousand times the ammunition the Cardinal (apocryphally) asked for. The online mob has been accusing (eg) Podesta and his correspondents of all kinds of insane things - from corruption & arms dealing to satanism & child abduction - based on the flimsiest of cherry-picked quotes, even when the context (beyond the leaks, within them, even in the very same email sometimes) showed without doubt that their claims were obvious bunk.

And Assange merrily fed into that: drawing spurious connections on his twitter feed; linking/retweeting the fever dreams of naked partisans; and dragging out a single dump into an episodic drip-fed spectacle, all the better to whip his fans into an inquisitorial vigilante frenzy.

"Wikileaks on its own can also be very misleading"? I can't argue with that - except perhaps to say it's far too charitable.


There's a lot of stuff in wikileaks that merits investigation but the MSM never did their job. Just wholesale dismissal from comedianchors like John Oliver. Not surprising that the MSM were asleep at the wheel when it came to predicting the outcome, or that most of the country no longer trusts them.

http://www.gallup.com/poll/195542/americans-trust-mass-media...


Oh, I'm not saying that they could not be faked or altered. And to be honest, when you have a supposedly accurate track record like this, it's a perfect storm brewing for someone to come in and exploit it.

But dismissing them as Russian hacking didn't really give a lot of people the chance to make up their own minds. Yes, I believe, according to Wikileaks, that the DNC colluded with media and fact-checking orgs and pushed their agenda too far. I read the information and made up my mind. So when someone points to PolitiFact again, I'll do the same. Not saying they can't be saved, but it's the idea of treating it as doctrine from the start, with a stamp of "True" or "False", that I don't like. Things are never that simple.


>... and now Wikileaks has invited the whole internet to come play Richelieu with them

But why do you think the whole internet wants to play the part of Richelieu? Everyone is allowed to choose the role they want, and Wikileaks does not hide the context of the emails; everyone is perfectly able to check the whole correspondence. If anything, it's the mass media that hides the context, because of their format and time limits.


Wikileaks claims to be entirely unbiased, yet they choose what to release and what to save for "insurance", while refusing to acknowledge a dangerous false equivalence: the absence of leaks about the other party does not imply the other party isn't doing the same things. Until we have equal information about both sides, this is dangerous. Wikileaks gets to be the unfair arbiter of truth while claiming a total transparency that we know doesn't exist.


Wikileaks has its own issues: the suggestions of Russian influence, the timing of their data releases, questions about Assange's political bias against the US/Clintons, etc.

"People need to do their own research if they want to form conclusions not just little to prepackaged conclusions." - I could not agree more.


Would you rather that Wikileaks kept the information for itself? Wouldn't that be just as much of an influence on the election?


Personally I think they should just leak it all rather than controlling the flow to suit their political agenda.


The way I understand it, they've been burned before by leaking everything at the same time: there's an initial "shock", then everyone forgets about it. If your purpose is generating the maximum amount of outrage and controversy (and, consequently, change), it makes sense to publish things slowly.

Although I agree that the timing was probably connected with the US elections. But again, that's a good thing: the only way organisations will change is if it hurts them. Now both US parties know that they can easily be targets next time.


So is Wikileaks an organization interested in free information, or using "free information" for maximum political effect? If they are twisting the knife and trying to find out how to cause the maximum possible damage for one side only, then they are clearly not a neutral organization and obviously should be treated as such. If their goal is maximum political discourse, then they are obviously a political organization, not an information or "truth" organization.


Why do you think they wouldn't be causing damage for the other side, if they could? I mean, even if they had emails of the GOP, I doubt it would really hurt Trump in any way, it's not like he had their support anyways... Actually, it would probably help him!


I'll agree with this.


I have personally seen people turn away from Fox News after an excess of anti-Trump coverage; though I guess by now Fox is prolific enough to reliably be called Mainstream Media.


Snopes did put out a lot of articles this season where the claim of "false" seemed to hinge on some sort of hair-splitting rather than the core of the claim being inaccurate.


This is something that concerns me a lot as well. How do we rebuild this?


Well, for a start we'd need some form of fact checking that was actually about just the facts rather than political spin, and that's really hard to do once politics becomes involved. Even purely factual questions like whether someone said something or not become blurry.

For example, take the child rape trial Hillary Clinton was a defence attorney for, in which she claimed that an unspecified person had told her the victim "is emotionally unstable with a tendency to seek out older men and to engage in fantasizing" and "has in the past made false accusations" to justify putting her through a nasty court-ordered psychiatric exam. Is this the same as Hillary herself making the claims? The original viral version and the victim saw it this way. Snopes argued that this was a lie because the document showed that other people had made the claims, neglecting to mention that the interpretation and wording was Hillary's and the existence and honesty of these other people rested entirely on her word. Neither of these is inherently more truthful than the other, it's a question of which spin is politically convenient.

This is probably the most clear-cut example imaginable too, since it's about the contents of a court filing. Most of the world is a lot more factually unclear, visible only through a lens of conflicting evidence and multiple sources of information.


A couple thoughts:

- How do we get broad, bi/cross/multi-partisan support?

- Is there any precedent for news articles having citations/being eligible for peer review (post-publication, of course), similar to scientific papers?


By meeting the other side, if at all possible in person but at least by video chat, showing respect, and genuinely caring about their side.

Either that or find a common enemy, such as Mexico (after the civil war).


It's reasonable to not trust a media outlet until it's earned your trust. Until then, cross check with the source material, as more often than not, journalists spin and filter the information in their product.

By the way, Snopes is not a common forum and is edited by a very small team. It is a single point of failure and is easily biased. No one entity should be the gatekeeper of truth.


>Congratulations. We've reached a point, for this individual, where "facts" as decided in the common forum are suspect and the only thing they trust is themselves (and whatever non-Mainstream Media they listen to). We have reached a point where the only truth is what they decide is the truth. //

This is similar to the UK going in to the Brexit vote. Politicians that were promoting the break away from Europe were pushing hard the idea that "we don't need experts" and that listening to expert opinions was wrong (listening to experts like pro-Brexit economists who stated, in common with the anti-s, that Britain would be financially worse off leaving Europe).

It's like the media managed to spin the idea that all people who have rigorously studied something are not to be trusted. The biggest wtf is that it worked, writing a big fat lie in big fat letters on a bus trumps (heh!) countless professors of economics telling the people it's a lie.

The trap is to conclude that people are idiots; postmodernism is not really the reason either, I feel. The reason, IMO, is that the media have become very, very good at manipulating people's thought processes and feeding them a position without having the person think through that position. Yay, we've won: the media can now make us all believe whatever they want.

But don't panic all those rich politicians and business people are sure to use their powers for good ... /s


Well, I mean, they ARE the same experts who told us deregulation of finance was a good idea.


Have you done any research at all on Snopes? I just spent 5 minutes on Google and found that one of Snopes's main "fact-checkers" is Kim LaCapria. She used to write for a blog called "The Inquisitr", a site known for posting bad news stories, similar to the right-wing Alex Jones programme. They have published fake quotes and stories without doing research: http://www.inquisitr.com/670091/retraction-and-apology-to-ro... http://www.rawstory.com/2015/07/story-about-costco-pulling-d...

As for Kim, she describes herself as "...an openly left-leaning individual myself", if you ctrl+f for those words here: http://www.inquisitr.com/402558/scandal-envy-behind-petraeus...

As you can see, journalistic integrity is lacking from both sides of the political spectrum. Please do not get all high-and-mighty and think you're better than everyone else, because you would be falling into the same trap you mock others for. I hope in the future you will examine sources with a more objective view and keep an open mind, and not just believe whatever someone wants you to believe so they can make a quick buck off of your ignorance.


Come on. You've highlighted one staff member as having once mentioned having a personal political leaning (like most people do) and that she once worked at a blog with a tarnished reputation and spun that into implying Snopes is untrustworthy. I'm sure you could meet this extremely low bar of "proof" for practically any news organization.

If you really want to present a case of bias, find some real evidence. Specific articles with specific instances of false or misleading information. Otherwise, you're just sowing doubt by casting aspersions based on vague associations.


This post wasn't meant to be a thorough debunking of Kim and Snopes. I actually think Snopes does a mostly good job of producing accurate information. However, bias does show in some cases, such as the Hillary Libya "We didn't lose one person" question from the debate. http://www.snopes.com/hillary-clinton-benghazi-msnbc/

Snopes and Kim claim that "Hillary Clinton overlooked Benghazi victims when she said that 'we didn't lose a single person in Libya' during a campaign event on MSNBC" is false. However, reading the article for her evidence, you can observe that Kim excuses Clinton's lie of omission by saying the conversation was not about Benghazi.

Looking at this from a high level, Clinton is implying that the Benghazi attack is a completely unrelated event from the USA's war in Libya. However, these two events are both part of the Libyan intervention, and it's disingenuous of Clinton to make such a claim, because as a result of the Libyan intervention, Americans did die. Clinton is obviously playing a political game to play down the negative consequences of the Libyan intervention. I don't blame her for doing this, by the way. I would too if I wanted to be president.

Just because they didn't die in the initial confrontation doesn't mean "No Americans died." Snopes fails to point out the fact that Clinton does not include Benghazi in the overall Libyan confrontation. Let's say I am in a car accident with someone, and they are injured. If 6 months later, they have complications and die from the injuries I caused with the initial car accident, I am still responsible for the death of the individual.

Left wing bias: No Americans died in the initial confrontation with the Libyan government.

Right wing bias: Clinton killed Americans in Libya.

Neutral bias: Clinton made a decision to support the invasion of Libya. As a result of the invasion, Americans died in Benghazi.


> Neutral bias: Clinton made a decision to support the invasion of Libya. As a result of the invasion, Americans died in Benghazi.

Even that statement could be twisted if you draw the conclusion that Clinton supported the invasion, people died, therefore she is partly responsible. And it's missing information about the "600 requests for additional security".

I do understand and agree with your point.

Perhaps the more important issue is that some voters have an overly simplistic world view. "Casualties from an attack on a diplomatic enclave in a volatile country" should be separate from "US Foreign policy had implications for the conflict in Libya".


Agreed, the unbiased statement could definitely use some more words to show it's an opinion that Clinton COULD be responsible for the result, but it's not black and white. It's just frustrating that people don't recognize that yes, snopes, in general, is a trustworthy source. However, snopes is a group of people, each with their own individual biases.

These biases can occasionally influence the content. To treat these sources as some kind of perfect "machine" that can just be unleashed onto facebook to automatically say "You're wrong, this is how it really happened" is a scary thought. Especially when the process and machine are a black box and we don't have any way to verify which "facts" are being pushed, by whom, and for what reasons. And when you bring up the possibility of bias, you become one of _those_ people, like you're some kind of conspiracy nut for asking questions.

I think that happened this election. The polls all said "Clinton crushes Trump." Anyone that asked about the validity of those polls was mocked and laughed at and seen as some kind of right-wing conspiracy nut.

Well, turns out they were dead wrong and maybe if people recognized their unconscious bias, we wouldn't have a president Trump right now and people would have said "Holy $@&* these polls are wrong we need to change our strategy so we can win." I kept repeating to people, don't underestimate Trump, he's smarter than you give him credit for and he's a huge threat to Clinton. Nothing but jokes and mockery from the majority of democrats I talked to.


Trump was a massive supporter of going into Libya[1] and doing "something" about Gaddafi. These world leaders are stuck with the hard problem of satisfying dumb idiots at home who think that the US can wave a wand and stop people across the sea from killing each other because it sucks seeing the poor brown people kill each other, yet they are only doing so because we meddled in their affairs before, and yet we are proposing to solve it by meddling in their affairs more. The US, including Trump, asked for Libya, and now Clinton is on the hook because the US population is too misinformed to understand why Libya is on edge, why it was America's fault, and why there will be no easy solution because, again, until America stops doing this it won't stop.

This is the issue. Americans don't know what they want and are mad because they feel like they somehow have a special place in the world that everyone else is "stealing" from them. This attitude is "anti-American" because it is the unfortunate truth, and America just voted that the truth hurts but it would rather blame it on the immigrants and mexicans and muslims. This is scary. You cannot have the attitude that you somehow have a god-given "right" to jobs in your country, and that filthy immigrants are stealing them because they are bad people. Conservatives feel like the government is a credit card they can use to get more when they want it, no matter what the consequence for the rest of the world.

This entitlement is a style of attitude and thinking that is guaranteed to cause global conflict, and is the inevitable end of a culture war between western thinking and other powers. Obama proposed to fix this by being basically an apologist and making reparations for the rest of the world, but the rest of the world is so fucked up because of America's history of trying to decide what's best for it that it is starting to turn on America, and deciding it doesn't really need America any more. This stubbornness is war-mongering, and it is why it is better to have free trade and globalism rather than using literal direct conflict to solve perceived differences between countries.

[1]: http://www.politifact.com/truth-o-meter/statements/2016/sep/...


In fairness, I should mention that by "auto snopes" I mean "implement an auto-debunker" or even just "auto fair balance" which automatically adds "the other side" to anything linked on facebook.

I have no hat in the ring with respect to which side publishes the truth, I just think that if something is FALSE facebook could alter their UI to immediately debunk something.

Like the other quote of Trump saying "I'd run as a republican because they're the dumbest" could have a big "FALSE - he never said that" right underneath.

People who thought it was funny could still post it to their hearts' content - but nobody would be FOOLED


Thanks for clarifying that! I don't have any skin in this game either, and I agree it might be helpful to have an auto-bias feature, but it needs to be completely transparent and open so we can fact-check the fact-checkers ourselves :) Facebook should do their best to ensure this is possible with any system they decide to implement.


Your comment is missing the link to the snopes article that's wrong.


I like this. But it needs to be open and multiple sources.

so you might get a panel with

SNOPES: FALSE
NYT: FALSE
HUFF: MIXED
DAVID AVOCADO WOLFE: QUANTUM CRYSTAL HARMONIZED
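A toy sketch of how such a multi-source panel could be aggregated. Every source name, verdict string, and the tie-breaking rule here is invented for illustration; nothing reflects a real API:

```python
from collections import Counter

def render_panel(verdicts):
    """Format a dict of source -> verdict into a one-line panel string."""
    return "  ".join(f"{source}: {verdict}" for source, verdict in verdicts.items())

def consensus(verdicts):
    """Return the most common verdict, or 'MIXED' if the top two verdicts tie."""
    counts = Counter(verdicts.values()).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return "MIXED"
    return counts[0][0]

panel = {"SNOPES": "FALSE", "NYT": "FALSE", "HUFF": "MIXED"}
print(render_panel(panel))  # SNOPES: FALSE  NYT: FALSE  HUFF: MIXED
print(consensus(panel))     # FALSE
```

The real difficulty, of course, is not the aggregation but deciding which sources get a seat on the panel in the first place.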


On the days leading up to the election, my feed was full of fake news stories. Every single one that looked fishy to me, Snopes had debunked. I posted a few replies, but I felt like it was a Sisyphean task of shoveling shit against the tide of misinformation.

Is there any hope that Facebook will do the right thing and put information ahead of profits? As a libertarian at heart, it pains me to say this, but I think the only way to fix this is government regulation. Of course it will never happen under Trump, but how else can we force them to be good curators with all the power they wield?


the problem is the deeper fundamental philosophical conversations about truth and the meaning of truth and relativism and biases etc aren't being discussed in the tech community, let alone at FB/Twitter HQ.

Having a lil fact checker isn't as easy as adding a button to snopes.

There's bias in language itself and how even a sentence is structured. But again a deeply complex conversation nobody seems to be having in this community.


This is a brilliant idea.

My hope is for a similar thing to exist in real time in future political debates, where live fact-checking spits back at anyone who speaks a mistruth, nipping B.S. in the bud.


could we shock the candidates when 3/4 fact checkers disagree? Joking (mostly) aside. I like this idea as well. I think it could replace the useless and toxic chat/twitter feed that is put on websites hosting debate streams.


Agree. We can have a bunch of people fact check on the fly. A ministry of some sort. We can call it The Ministry of Truth.


Unfortunately, sites like snopes and politifact also succumb to the same type of bias as the right-wing sites. See this article where politifact was handed propaganda material from the Clinton foundation and parroted it without doing any actual checking about AIDS drugs the Clinton foundation was funding:

http://www.politifactbias.com/2016/11/the-daily-caller-polit...

The problem is a lack of education by the consumer. I suggest everyone read this book as an intro on the topic:

https://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/03...

Everyone has an agenda, follow the money, and trust no one. Whether it's right-wing like Alex Jones or Left-wing like the Tampa Bay Times a.k.a politifact, you need to be suspicious and do your own research if you want the truth.


If Facebook wants to be a "source of truth" they're going to need to hire moderators outside of Silicon Valley that represent a wide array of values, traditions, and backgrounds, in a ratio that represents the actual population, and have them peer review each other without fear of repercussion from their employer. I think it's hard for the Silicon Valley types to surrender this type of control... For instance, would this article even be on HN if Hillary had won the election? If we're talking about neutrality, that's an important point to consider.

Actually, the lack of multiculturalism is true for quite a few Silicon Valley companies. Try to find a station that has 'Today's Metal' on Google Music, good luck... there are a couple hundred different indie channels to listen to that are very meticulously organized into genre, sub-genre, and sub-sub-genres. I'm not complaining, I simply use different products, but I think it points back to the source of where the lack of multiculturalism stems from.


You think the users are too stupid to fact check. But what makes you think the users care about the truth anyway?

More often than not they double down with "well it's emblematic of the greater problem" or something to that effect. They then look for new evidence to support their views. Like it or not, some sites are designed to build echo chambers and are not for general discourse.


A lot of people don't care. They believe an image on facebook that reaffirms their belief.

You try to counter with statistics from the FBI, DOJ, BLS, DOL, or any other organization and the numbers are 'made up'. And studies have shown that presenting people with counterarguments can actually strengthen their original misperceptions.

Corrections also don't get much traction. I've seen many conspiracy theories pop up for a week, then die down, while the correction gets no traction at all.

>As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases.

http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf

People don't want news, they want to hear what they already 'know'.


The problem is that there are no organizations that can give out unbiased facts. This includes the FBI, DOJ, etc. If you have grown up in a household that was in opposition to either of these organizations (and that could be either democrat or republican; I can't imagine a poor black person in the ghetto likes the FBI more than a Montana rancher does), why should you trust their statistics? Heck, even if you do trust them, don't forget the old saying: lies, damned lies, and statistics.


When playing identity politics, agreeing with a blatant lie is expected when you need to affirm the narrative of your tribe. If you don't, you're a heretic and not a true believer.


If you do a root cause analysis of this election's result, much of it points back to Silicon Valley in one way or another.

Workforce automation created millions of unemployed people in places far away from economic hubs, with no hope of employment in their hometowns or of re-skilling for the new roles, leaving them desperate. Proliferation of mobile apps led to an explosion in the use of social media. The wild popularity of this democratic social media encouraged a culture that allowed misinformation to flourish without a lot of consequences.

Bad actors took advantage of this, and reinforced distorted false narratives. Desperate people latched onto the messages presented to them, that gave them solutions repeatedly reinforced by social media, and many took an enormous risk - they voted against their best interests in order to solve their problems with the information they had available. And that is how we got here today.


>Workforce automation created millions of unemployed people in places far away from economic hubs, with no hope of employment in their hometowns or re-skilling for the new roles, leaving them desperate.

So did the Industrial Revolution.

We may mock the Luddites for breaking the Jacquard looms. But do you know what the punishment for frame-breaking was? Execution.

The Industrial Revolution made way for the tertiary-sector economy. When the tertiary-sector economy becomes redundant, what is next?

There are no politicians talking about this.


Yes, Federal jobs should have been moved from cities to fill those lost. Norway did something similar.


Agreed. Though personally I believe Silicon Valley shares a responsibility - tech companies should encourage more remote work and smaller hub offices in less dense urban areas. It would save them money and help distribute economic gains outside of just the major urban centers.

This was a root cause of Brexit in my opinion as well - all of the UK's best jobs are stuffed into London and London only.


That sounds like an interesting story, how can I find more about it?


Here is one article on the subject. I am sure there are more out there. http://www.lifeinnorway.net/2013/03/have-the-norwegians-got-...


This is seriously hilarious. Now that's he's won, against seemingly all odds, we have to find some other reason for why EVERYONE ELSE, was completely WRONG. Every media outlet, everyone in politics, all the polls, completely shit on Trump. Now look who is the most powerful person in the entire world.

The Democrats, and the rest, did it to themselves. Overconfidence, smear campaigns, and all of the morons doing the polls.

We are all sick of politicians, and the people have spoken, let's just see what happens. Get over it already.


The polls weren't that off actually. They missed the mark overall by about 2%, which is actually pretty similar to the error in past elections. That's not very surprising because without being able to sample the actual distribution you need to make some assumptions for the statistics to work.


I presume that pollsters did their job well. There is a pattern here: they got it wrong both with Brexit and with Trump. I think bad polling was not the reason. The media, the government and academia have created an atmosphere of militant orthodox liberalism, where dissenting voices are not only silenced, but often ridiculed, doxxed, forced to resign, etc. These people will choose to disappear from the polls. They will "think as they want, but behave like others" until they are given the chance to fight back.

Thankfully, in democracies we have elections to release those tensions. And setbacks in democracies are inevitable.


He is in charge of the EPA now, but he tweeted that global warming is a conspiracy created by the Chinese. He is in charge of the State Department, but he thinks that Mexico will pay for a border wall. He is in charge of the FDA, and he thinks that vaccines cause autism. Some of his views are deeply disturbing on a level much deeper than just "orthodox liberalism".


The response to people who are wrong should be to engage and argue with them. By dismissing them as deplorables, you ensure they will keep to themselves until they can get back at you.


The use of "deplorables" is presumably a reference to Clinton's remarks, and it's really interesting you reference that here, because the point of that comment is similar to yours: you can't just lump all Trump supporters into the racist and xenophobic basket, as many people did. I agree with you, and I will gladly argue politely with people who are willing, but I don't think it's unfair to criticize such damaging views in a presidential candidate.


Regardless of how you feel about him, you can't deny that he knows his audience. He's a salesman par excellence.


> They missed the mark overall by about 2%

This is misleading. Only on a national level were the polls that close. In the actual swing states that gave Trump the presidency, the polls were off well outside the margin of error.

For instance, in Wisconsin no poll ever showed Trump winning that state, and the closest IIRC showed Clinton winning by 3%. The average was significantly higher than 3%.


I will give you that they were wrong in Wisconsin in a pretty spectacular way.


When you take wildly changing vote count estimates (from +10 to +2 within two weeks) and turn that into more than an 80% probability of a specific candidate winning, that's a completely special level of stupidity.

Nassim Taleb also wrote about this https://twitter.com/nntaleb/status/796357059813974017


Nate Silver had an excellent response to this on one of the "Model Talk" editions of the FiveThirtyEight Elections Podcast[1]. TL;DR of the problems with Taleb's critique:

1. "Volatility" in the model is not random, as Taleb asserts -- it happens at predefined points in time. For example, the accuracy of polls jumps greatly after the conventions complete.

2. Volatility at a previous point in time does NOT mean there must be similar volatility going forward, as Taleb implies.

3. Taleb's quickie models that he posted for explanation don't remotely match empirical data.

I respect Taleb, but he seems to have fallen into a trap really smart people often fall into: being right with limited data, but wrong with complete data (in this case, he creates assumptions from his knowledge in other areas like options pricing, which are wrong in this domain).

[1] Skip to 13:30 remaining http://fivethirtyeight.com/features/what-makes-a-tipping-poi...

EDIT: formatting
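As an aside, the leap from a polling margin to a win probability is easy to demonstrate with a quick Monte Carlo sketch. The margin and error figures below are made-up inputs for illustration, not anything from FiveThirtyEight's actual model:

```python
import random

def win_probability(poll_margin, polling_error_sd, trials=100_000, seed=42):
    """Estimate P(candidate wins) given a polling margin (in points)
    and a normally distributed polling error."""
    rng = random.Random(seed)
    wins = sum(1 for _ in range(trials)
               if poll_margin + rng.gauss(0, polling_error_sd) > 0)
    return wins / trials

# A 3-point lead with a 2-point polling error looks near-certain...
print(win_probability(3.0, 2.0))
# ...but assume a 5-point error and the same lead is far shakier.
print(win_probability(3.0, 5.0))
```

The whole argument between the two camps is essentially about which error assumption (and which correlation between state errors) is defensible.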


>we have to find some other reason for why EVERYONE ELSE, was completely WRONG

>and the people have spoken

Trump lost the popular vote, so this isn't true. He got about 25% of the total voting population overall.


> Trump lost the popular vote, so this isn't true. He got about 25% of the total voting population overall.

Since the president is elected by the Electoral College and not the "popular vote", each candidate must campaign to win the Electoral College. That is the strategy to win the presidency. If you changed it to the popular vote, you would see campaign strategy change to optimize toward that metric. It's not something you can look at "in retrospect". Trump would have played the game slightly differently if that were the case.

So actually everyone crying about the fact that he didn't win the "popular vote" is just displaying their own ignorance of campaign strategy every time they bring up this point.


I agree that just because Trump won the election doesn't mean the people gave him a mandate, or have spoken, but looking at popular vote in a system that doesn't even focus on popular vote is incorrect IMO.

If the rules were to get popular vote then campaigning would be done totally differently. The candidates are doing everything they can to win states, not win popular vote.


Regarding popular election, "mandate" and "only 25% voted for him", it seems that all people who are now dissatisfied have voted for Clinton. Which suggests that the 50% who didn't vote maybe don't oppose Trump that much.

I observed that leftists have a tendency to systematically overestimate their numbers, influence and the amount of agreement or even familiarity with their ideas among the public.


Irrelevant. This is a republic, not a democracy.

If you want to cherry-pick small percentages by comparing to the voting-age population:

Clinton got 25.5% and Trump got 25.4%

Voting population 235,248,000

99% reporting per Google Polls.

Trump 47% 59,821,874

Clinton 48% 60,122,876


> He got about 25% of the total voting population overall.

Clinton got about 24% of the total voting population in 1992 and 1996. What is your point?


Trump didn't lose the popular vote yet; until the final vote is counted, it seems CNN and others are still shitting.


> the people have spoken

many marketing and propaganda people would smile knowingly seeing this, knowing that the will of the people is very shapeable. It's been known for 90 years now...


So because Trump was elected, it was shaped by marketing. If Hillary was elected, it was their will alone. Even though the Clinton campaign spent twice the amount of money.


This article would not have been written had the election outcome been different.


There's no way to know for sure. We can suspect, and presume based upon our political leanings, but there is no way to know.


Another place where Facebook's failure to detect fakeness has proved costly is in their social graph.

A lot of sites moved away from comment platforms like Disqus to Facebook in the hope that the quality of discourse would improve and trolling would decrease. Instead, clicking on some of the most vehement commentators' names would invariably lead to suspiciously bare accounts (and often suspiciously fake names) with a handful of friends themselves, all with similar characteristics. Unfortunately the people who are influenced by this sort of thing are not tech savvy enough to do even this basic level of checking.

There is a sort of "uncanny valley" that a technically savvy and experienced person can detect when looking at a fake profile, and that I daresay Facebook's algorithms just can't.

Then there is also the problem that Facebook really doesn't care about blatantly fake accounts until they are reported. The sorts of people who are trapped in some filter bubbles are unlikely to be savvy enough to know how to report these profiles. (I've reported many dozens at least, ranging from community noticeboards to cupcake businesses, all pretending to be people and missed by FB's much-vaunted ML.)
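A crude sketch of the kind of heuristic scoring a savvy reader does by eye. Every field name, threshold, and weight here is invented for illustration and has nothing to do with what Facebook actually runs:

```python
def fakeness_score(profile):
    """Crude heuristic: higher score = more suspicious.
    All field names, thresholds, and weights are made up for illustration."""
    score = 0
    if profile.get("friend_count", 0) < 10:
        score += 2   # suspiciously bare friend list
    if profile.get("account_age_days", 0) < 30:
        score += 2   # brand-new account
    if not profile.get("photos", 0):
        score += 1   # no photos at all
    if profile.get("posts_per_day", 0) > 50:
        score += 3   # implausibly prolific commenter
    return score

suspect = {"friend_count": 3, "account_age_days": 7,
           "photos": 0, "posts_per_day": 80}
print(fakeness_score(suspect))  # 8
```

The catch is exactly the one described above: hard cutoffs like these are trivial for troll farms to game once they're known, which is why the human "uncanny valley" instinct still beats them.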

Of course, there is a certain irony in using a throwaway account to discuss fake accounts, but without a patina of "realness", it triggers a greater level of skepticism, which is a good thing.

When the history of 2016 is written, a large part will be filter bubbles and trolls expertly manipulating huge swathes of electorates enabled by the hubris and greed of Social Media networks.


Except it isn't just the spread of misinformation. Even more important (IMHO) is the insulating effect of showing people only like-minded opinions, effectively trapping everyone in a bubble. I can't think of any way Facebook can solve this problem without drastically decreasing their reliance on ad revenue. What are they going to do, force people to view opinions they disagree with?


Everyone should be responsible for the curation of their own timeline. I barely use Facebook but hate the idea that I didn't get a straight chronological feed.

I was a huge fan of Twitter but was driven away when they followed suit. My reasons had more to do with my own preference of how I enjoy using the service, but having algorithms manipulate the information I see so as to maximize engagement metrics is even more troubling.


There is a "chronological" option on facebook.


It's illusory. They may or may not, based on their algorithms, bump something old to the top of your feed after it gets a like or a new comment. They call it "most recent" and they can define "recent" the way they want to.
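For what it's worth, the difference between the two orderings comes down to the sort key; a minimal sketch with hypothetical field names:

```python
# Sketch of strict chronological ordering vs. "most recent activity"
# ordering. Field names and timestamps are hypothetical.
posts = [
    {"id": "old_post", "created": 100, "last_activity": 300},  # bumped by a new like
    {"id": "newer",    "created": 200, "last_activity": 200},
]

strictly_chronological = sorted(posts, key=lambda p: p["created"], reverse=True)
most_recent_style = sorted(posts, key=lambda p: p["last_activity"], reverse=True)

print([p["id"] for p in strictly_chronological])  # ['newer', 'old_post']
print([p["id"] for p in most_recent_style])       # ['old_post', 'newer']
```

So a stale post can sit on top of a "most recent" feed indefinitely as long as people keep interacting with it.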


> showing people only like-minded opinions

Can't wait until they do the opposite in case of free speech then get sued by people who can't deal with their emotions and accept them. Some of them are already out protesting (and some "protesting" with destroying property[1]).

But hey I guess helicopter parenting and after that safe-space culture they grew in can only get you so far.

[1] https://www.youtube.com/watch?v=1d9lm-T87AQ


It wouldn't even work. Reread the news you found that you agreed with - how much of it is simply restating the same basic things and calling the other side evil, stupid, etc, at least indirectly?

If somebody with different beliefs than you were forced to read those, do you think the resulting society would be more or less polarized?

The kind of content you would need to do that is already disadvantaged since it has to be toned way down, and that makes it much less likely to be shared in the first place.


Not every decision needs to have a positive impact on the bottom line. Sometimes decisions are made which hurt revenue/profit for branding reasons.


> Last week Buzzfeed reported on an entire cottage industry of web users in Macedonia generating fake news stories related to Trump vs Clinton in order to inject them into Facebook’s Newsfeed as a way to drive viral views and generate ad revenue from lucrative US eyeballs.

Does anyone else find this incredibly ironic? Buzzfeed ratting out others doing clickbait?


Buzzfeed lately has been trying to 'outgrow' their reputation as clickbait, and actually do traditional investigative reporting. With moderate success.

I agree with you though, it's hard to take them seriously.


"Actual journalism" Buzzfeed articles are of surprisingly great quality. They fund that quality through clickbait, which is their primary monetization strategy if I recall correctly.

It does make the right hand difficult to believe when the left hand is off being silly - but I'm actually impressed by the level of works produced by their journalism department.


Their monetization strategy is advertorial content posts.

People are so used to clickbait as a way to get impressions and clicks from display ads that many don't realize there isn't a single 'traditional' display ad on BuzzFeed's entire site.

They make their money by charging for references to 'advertisers' products and services throughout BuzzFeed content - something they do really well.

Doesn't come cheap either - big cash outlays iirc.


Not long ago I saw a post about the oil pipeline protests. It was a photo of a huge crowd, and stated that the media was not covering the protests properly and that people are taking a stand. The photo was of tens of thousands of people, and had over 230,000 shares. 230k....

A quick image lookup showed that it was in fact a photo from Woodstock 1969, but since the comments on the photo were restricted, nobody had been able to point it out.
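That kind of quick image lookup boils down to comparing perceptual hashes. Here's a self-contained difference-hash (dHash) sketch over toy pixel grids; a real pipeline would first downscale the image to something like 9x8 grayscale, but no image library is needed to show the idea:

```python
def dhash(pixels):
    """Difference hash: for each row, record whether each pixel is
    brighter than its right neighbor. `pixels` is a 2D grid of ints."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return tuple(bits)

def hamming(h1, h2):
    """Number of differing bits; a small distance means likely the same image."""
    return sum(a != b for a, b in zip(h1, h2))

original  = [[10, 20, 30], [40, 30, 20]]
reposted  = [[11, 21, 29], [39, 31, 19]]   # same image, slightly recompressed
unrelated = [[90, 10, 80], [5, 60, 5]]

print(hamming(dhash(original), dhash(reposted)))   # 0 -- a match
print(hamming(dhash(original), dhash(unrelated)))  # 2 -- different image
```

Because the hash only encodes brightness gradients, it survives the recompression and resizing that shared photos go through, which is why the 1969 original is still findable.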

I went to report it, but facebook seems to have removed the "misinformation" option when reporting content (thought there was one before?)


Trump's strategy worked brilliantly on social media. Here's the strategy in a nutshell:

1) Say something outrageous that news orgs will grab for a quick, clickbait story that will generate tons of ad revenue.

2) Let outraged people share via social media.

3) Benefit as some of the people seeing the content will not disagree and will take the candidacy seriously.

4) Win.

It doesn't matter what Facebook does with its trending section, the real value of Trump's strategy came with the way it exploits the basic sharing / newsfeed mechanism's intended behavior. Even today, many Trump opposers think that they were helping by posting their outrage at every rude comment Trump said.


I agree, I place a great deal of responsibility on the news media for popularizing the circus mentality of the election cycle for profit.

Trump saw it for what it was near the beginning and took advantage. Every time the news media focused on his antics, true or not, and not on the issues, they pushed him closer to the Presidency.

Well, that and the fact they apparently had problems with admitting Clinton was a flawed candidate.


Except he closed the gap in the last few weeks of the race when he stopped saying outrageous shit.


People still want to believe that Trump couldn't have won except by exploiting stupid people.


His message resonated with a lot of people. His tactics exploited social media patterns and mass media patterns effectively. He also ran against a candidate plagued by scandal and widely disliked.


Let's be frank here: Free speech and facebook never go together. They want to present as a respectable forum where prominent people like politicians can have their platform (same with Twitter). That needs some kind of filtering. And, as we saw with pretty much the entire American press, as soon as you start filtering and selecting, you start to induce gross biases and distortions of the truth. The truth only comes out in a clutter of contradicting opinions, engaging discussions, pieces of evidence and leaked materials. Of course, that is an ugly mess not many people want to put up with.


> The truth only comes out in a clutter of contradicting opinions, engaging discussions, pieces of evidence and leaked materials

It feels like we're way beyond that. It's degenerated to the point of two tribes who no longer have any "shared truth" from which to argue. If I see a post on Facebook saying that Obama has instituted martial law in New York, how can I convince the 4364 people liking it that it's a lie? If I point out that the New York Times would probably write about it, I'll just get laughed at.

The truth seriously fared a lot better in times where newspapers and a few TV stations acted as completely undemocratic arbitrators.


I think it may be worse than two tribes. Its a million tribes with limited overlap in what they consider the truth, but all having a large megaphone that cuts across tribal lines while still building bubbles.

My analogy is breaking down. I don't think we understand how the complete removal of barriers to publishing are changing society. Nor do we understand how to cope with it. A bunch of folks just elected an authoritarian to "guide" the way.


I disagree. I think things like Wikileaks and independent, small blogs and feeds contributed a vast amount to this election - knowledge about Clinton that we would have never had otherwise. (And I say this as someone who supported neither candidate, and if I was an US citizen I would have probably thrown my vote away)


I am not a Trump supporter, as far as his anti-Muslim stance is concerned. But I do support some of his economic stances which are labeled as "protectionist" by mainstream media. I may be wrong as I am not an economic expert also his take on immigration is certainly a thing of significance.

A related note regarding Facebook here: AFAIK, Facebook's curators were very biased and were always removing anything that "mainstream vocal leftists" find objectionable, e.g. the shameless suppression of news related to people like Geert Wilders or Pamela Geller, who are not by any means right-wing fanatics. Geert Wilders is a staunch supporter of homosexuals. Just because he criticizes the barbaric ideology of Islam he is labeled as a "right-wing nut" by leftists with covert/overt Islam-apologetic stances.

It is well known that Saudi kingdom has large investments in mainstream US media, that's what Trump drew people's attention to. Who knows how much of Facebook is controlled by Saudi and the likes. [1]

It's no surprise that mainstream media got the prediction about Trump wrong. I guess their predictions and poll results were in fact propaganda against Trump. I felt it that way, many people I know felt it, and I am sure many more people must also have felt it.

[1] https://www.youtube.com/watch?v=Ex9ldUHSgjs


This is a tough position for FB. No matter what they do, half of the world will think the actions they take are wrong. Most humans are intelligent enough to know that stories like Hillary Clinton running a child sex ring (or whatever) are false. People choose to share these because they want to believe it or think it's funny. As they say in their mission statement, "Facebook's mission is to give people the power to share and make the world more open and connected." It is a commendable mission but it's also a messy one. I think inevitably they will have to do something. Maybe some kind of "truth barometer" on stories that attain a certain volume of engagement. At least then they can say "look! we're trying" without taking away the people's right to troll. In this particular election though I doubt anything would have helped. The winner has been openly trolling for years; his direct actions set the standard for truthfulness much lower than ever before.


It seems abundantly clear that the problem isn't low-quality content in the feed. That's unavoidable. It's the very small trickle of high-quality content in the feed – content from actual people who want to contribute to Facebook. There isn't much of it. And Facebook wants to show a fresh feed every refresh.


Having uninstalled Facebook from my phone and being too lazy to log in on my primary computer (2FA), I only look at the feed maybe once every other day. Amazing how much better it is. Relevant and interesting stuff from real friends I care about, no junk. I can recommend that to everyone.


The timing of this announcement does not seem to help assuage fears that facebook was using its platform to push an agenda. It was this accusation that initially forced them to switch from human editors to an algorithm for trending news. I suspect that we as a country are going to have a conversation regarding the nature of privately held nearly public spaces on the internet and what their obligation to us is, if any.


In corporate speak, "misinformation" is defined as "any information that doesn't support our narrative". Certainly it is the goal of Facebook, the government, and every other large media corporation to control the flow of information. You need look no further than Obama's recent statement that news needs to be "curated" by government-appointed gatekeepers, to filter out all that pesky information that doesn't jibe with government propaganda.


Do you mind linking to where Obama said that? The closest I can find is this: https://www.yahoo.com/news/obama-decries-wild-west-media-lan...

>"We are going to have to rebuild within this wild-wild-west-of-information flow some sort of curating function that people agree to," Obama said at an innovation conference in Pittsburgh.

Why do you think that the curation is going to be done by the government or by government appointed gatekeepers?


It's implied by the use of the word "we" and the fact that he is a government employee.


> It's implied by the use of the word "we" and the fact that he is a government employee.

I think it's clear that he's asking for a rebuilt mainstream media so that people can at least agree on facts.


I can see that interpretation, but I don't think that's what he meant. To me, he's saying that those platforms should develop their own systems in a way that's acceptable to them.


Same noises coming from Europe -- today, it was Schäuble proffering this "lesson" from the US election. The degree to which the news in this country is censored or distorted is frankly maddening, and to my mind, ultimately self-defeating.

For example, until half a year ago, important details in articles about terrorism were left out, but you could still sort of fill in the blanks yourself. Now, every single article about an incident has the same stock phrase: "Hintergrund und Motiv der Tat sind noch völlig unklar" ("The background and motive of the act are still completely unclear"), and nothing else.

The elision of all details is supported by the flimsiest of justifications, namely that the terrorists relish and are spurred on by the media attention (as if their religious fanaticism doesn't play the major role). There is, of course, not a shred of evidence for this contention, except for the opinion of a handful of "experts". It seems to me that a terrorist attack is newsworthy. That newspapers refuse to report details (say, by interviewing a witness) is an awful abdication of their duty. Sometimes reading the news here feels like living in Soviet Russia.


> Obama's recent statement that news needs to be "curated" by government appointed gatekeepers

Facebook is leaking, I can already imagine what the clickbait headline looked like...

"Obama to assemble government censorship agency"


Please point to the Obama quote on government appointed gatekeepers.


Here's one relevant monologue:

"We are going to have to rebuild within this wild-wild-west-of-information flow some sort of curating function that people agree to," Obama said at an innovation conference in Pittsburgh.

"There has to be, I think, some sort of way in which we can sort through information that passes some basic truthiness tests and those that we have to discard, because they just don't have any basis in anything that's actually happening in the world"

Obama in Pittsburgh PA 10/13/2016


I have no ponies in this race. Counterbalance - charitable interpretation:

"We [the people or the government - does not matter really] are going to have to rebuild within this wild-wild-west-of-information flow some sort of curating function that people [the people] agree to,"

If I interpret "We" as "the government" it still does not matter if "the people" can agree on it as in functioning democracy.

"There has to be, I think, some sort of way in which we [the people] can sort through information that passes some basic truthiness tests and those that we [the people] have to discard, because they just don't have any basis in anything that's actually happening in the world"


I don't see anywhere he says that there will be government appointed gate keepers


Obama saying "we" is not referring to his administration?

What do you think "we" refers to?


When a Prez says "we" it almost always refers to the American people


The public


...and where did this quote come from?


Have you heard of this newfangled thing called a "search engine"?

https://duckduckgo.com/?q="We+are+going+to+have+to+rebuild+w...


The OP asked for a source. I think that is fairly reasonable.


In order to be acceptable to the general public, you need to represent a wide array of contradicting opinions. If your propaganda is too bland, it will be rejected by increasing numbers of people, and turning up the volume will only make them close their ears. I don't think it is easy to run such a system, but I do think it works pretty well most of the time.


Yes. Tread lightly here. If you designate some power to determine what's good information and what's bad, you're headed down a pretty slippery slope.

Note: I really don't have an issue with Facebook doing this if they want. They're a company after all, not the government. I can just choose to get the information they deem 'bad' elsewhere.


I do object. They may be a private entity but they control a humongous amount of information flow which puts them in a very public position. We may argue where to draw the line but this extreme consolidation of user attention to a few gatekeepers is a problem for the public and needs to be dealt with by the public.


"As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he dreams himself your master."


AI curated feeds are, of course, not the free flow of information, rather they are blind algorithms looking at what one's "friends" like and what you've previously liked and then feeding you more of the same, creating filter bubbles and fracturing society, while allowing nefarious actors opportunities to game their inherent naïveté.

Closer to the free-flow of information was USENET, and even though it was eventually overrun by trolls and spammers, filtering was largely human driven and unsophisticated, via their news clients, and one had little choice but to look at contrary opinions, and even to debunk some of the totally false junk, which is now hidden from view or buried at the bottom of long threads, left for the "true-believers".

I've been reading parts of the "alt-right" (deliberately breaking my own filter bubble) for many years, and although I despise what they stand for, I do agree with them on one thing: human ability is not evenly distributed. Of course they use this point to advance a racist agenda, but the fact remains that many people of all backgrounds are just not smart enough to deal with the "free" flow of information. They lack the intelligence to deeply introspect on their own biases and to understand that what they are being fed is garbage aimed at manipulating them. The sooner geeks (and those with a "small-l libertarian" streak) accept that reality, the safer[1] the world will be.

[1] in a literal sense, now.


The elite mainstream media and the supposedly independent reporters are the biggest losers in this election, as they took sides so strongly that the other 50% will never bother to have anything to do with them. Does Facebook need improvement? Sure. But more importantly, those biased media outlets will become irrelevant for most people, due not only to technological advancement but also to what the news industry has been doing over the last few decades: they're all so biased that you always know what they're going to say, so why bother reading them?

Can anyone tell me one truly independent news source in the US today? There is _none_.


What about the non-elite, supposedly non-mainstream media? Like Fox News, who pushed a false story days before the election and came back to apologize for it? They are real winners, right?


For the record, I don't read Fox News and don't know what you're talking about. I read Reuters, but its US news is limited; I did some study and learned it is probably the closest thing to a neutral source.


Oddly enough, the two financial news wires, Reuters and Bloomberg, were the least biased, because they give sterile news about markets; their customers want to know how markets will actually go, not hear what they want to hear.


Wikileaks?

Oh wait...


That's because there was only one valid candidate; if anything, they're to blame for treating him seriously.


I read somewhere that Facebook isn't liable for information posted there, that the users posting the information are the liable party. Here in Sweden, that's not how news organizations work. Here, whatever they publish they are liable for, regardless of who originally wrote it.

I know Facebook isn't a news organization, but it's treading a fine line, isn't it? At some point one might argue that by aggregating stories and driving a large portion of its engagement through sharing them, it is akin to a publisher, and thus should be held liable for the things it publishes, regardless of who pulled the trigger. Of course, this would be a huge liability for Facebook, and a growth inhibitor, because all of a sudden a story going viral isn't exactly a good thing if it means they can easily be sued for libel.

I'm not a lawyer (probably obvious given the above) but I think it's a bit strange that given Facebook's position they can so easily get away with a "wasn't me!" kind of attitude to misinformation and straight up lies being spread around presented as though the stories were in fact news.


I'm not a lawyer either, but I don't think what worked in Sweden will work for the US. Not only is it hard to sue people for libel, but content sites like YouTube fought pretty hard to make sure they couldn't be held responsible for things other people post on their sites (as long as they promptly respond to DMCA notices).


And the same for Hacker News?


I think what you're missing is the safe-harbor provisions of the DMCA in the USA. They offer protection for user-generated content on the web, provided the operators respond to DMCA requests within a certain time (24h?). Without these safe-harbor provisions, the idea is that the web would crumble under the weight of liability. Think about shared hosting companies, AWS, or a public forum like this, and then consider how much effort it would take to police the amount of content being created by users.


The moderation of Facebook is outsourced to a select group of organizations, not unlike Twitter with their Trust and Safety Council.

https://about.twitter.com/safety/council

https://blog.twitter.com/2016/announcing-the-twitter-trust-s...

Unsurprisingly, Twitter's Safety Council members are predominantly left-leaning with ties to the (now defunct) Clinton campaign. I would presume there are plenty of seats to be filled now.


Isn't twitter moderation openly mocked for being, well, hardly functional at all? It's been a bastion of bigotry for the past year. It's Trump's main platform. If they're trying to suppress the right, they're doing a TERRIBLE job.


Twitter moderation is slow and ineffective, but still partisan; they're just not very good at it.


If it's ineffective, then what is the evidence that it's partisan?


Oh, they are doing a terrible job. And that is one of the reasons I don't use a platform that thinks it's OK to name groups things straight out of "Fahrenheit 451".


Ah yes, Twitter, the left-leaning echo chamber where alt-right types and racists don't bombard jews and black people constantly with death threats and gas chamber jokes. Free speech has been suppressed so vigorously by the Safety Council :(

(Spoiler alert: Twitter basically doesn't moderate anything and their department responsible for handling abuse reports doesn't do anything)


Okay but to be fair, there are something like 6000 tweets every second. Are they not moderating because they believe in the principle of free speech, or because it would be impractical?

I do think that they manually curate their trending sidebar. After Comey announced that the emails on the Wiener laptop were cleared by the FBI, the sidebar looked like this: http://i.imgur.com/9LW0iaU.png

Now, since when do hashtags get an explanation underneath them? Who wrote that explanation and why? Why doesn't it show the number of tweets below it like every other trend? I had never seen that before.

And in this particular case, if they did manually put that there, I think they're vindicated in the sense that they're actively trying to not spread misinformation. Because if I had gone on twitter and saw "Comey" trending, and I hadn't read about the announcement that she'd been cleared, I'd simply think that a lot of Americans were talking about it within the context of an ongoing investigation, and I'd get the sense that twitter users were collectively rallying against Clinton, rather than for her.

And that, in essence, is the real problem that I have with manual curation by FB/Google/Twitter/etc. People are underestimating how much trust the general population puts into algorithmically produced content, and how much of a betrayal of trust it is for them to inject something into that. They're underestimating just how quickly we've learned to interpret algorithmically produced content and infer certain things about it. And in spite of the fact that the sidebar could have misled you to think something else was being expressed by those tweeting about Comey than was actually taking place, I also think the public is more aware of the shortcomings of algorithmically produced content than we give them credit for.

If I watch Fox/MSNBC/CNN/etc, I know that there's an implicit bias to the pundits and the guests and the networks themselves. I factor that into whatever content I receive from them.

But when you have Twitter trends that are organically produced 95% of the time, and you silently inject something into that, you've done something much more sinister than a news network leaning towards a particular angle. You've taken a metric that a lot of Americans trust entirely to be reflective of the whole country, and you've elevated a certain few or silenced a few others.

That's significantly more powerful. You can provide the illusion that millions of people do or don't feel a certain way. It's the most dangerous thing to happen this election cycle.


Thanks for bringing that up. I had been unaware since I don't follow twitter much anymore. You are right that is amazingly vile.


Determining what is, and what is not, misinformation seems:

1. Intractable. How exactly do you propose to accurately vet every bit of information posted to facebook?

2. Dangerous. Who does the vetting? If it's an algorithm, who writes the algorithm? No matter which way you slice it, "doing more to stop the spread of misinformation" imposes someone's 'one true view' of reality and stifles the engine of western civilization: the open, competitive marketplace of ideas.

This is clearly an emotionally driven sentiment caused by the moral outcry of a person considered 'bigoted' being elected to office. Stifling political expression will only make Trump/next trump/next next trump's populist appeal stronger.


Or Facebook can take their thumbs off the scale on what is supposed to be friends sharing information. Do you realize how Orwellian it is for a big company to say friends Bobby and Sally have to be monitored in what they share? How does this not bother people?

I don't care if Bobby and Sally believe that we never made it to the moon and wear tin foil hats. It is _none of FB's damn business_. They are a _platform_ not the editors of all things true. They should not issue us all with truth detectors.


So what do they propose? To hire more human editors/moderators to evaluate/censor links and stories?

FB already has too much power and its moderators affects 'politically undesirable' content too much.

One week ago, FB blocked the event page of the Independence March, which traditionally takes place on November 11th (Polish Independence Day) in Warsaw. This is the biggest mass event of Independence Day, with more than 100k participants each year.

FB also blocked or removed pages of NGOs and political parties which are organizing or supporting the march (some of them had 80k or 170k followers), and the personal accounts of people involved in these organizations.

Then FB went into full rage mode and started blocking the personal accounts of everyone who sent invitations to, or even positively mentioned, the Independence March, including, for example, the personal account of the editor-in-chief of the second largest daily newspaper in Poland (https://twitter.com/sjastrzebowski/status/793001362070052864).

The most extreme case was the personal account of an MP, who wrote on his timeline: "I will be [on the Independence March] along with my family, whether FB likes that or not." (https://twitter.com/jakubiak_marek/status/793497135954202625...) His profile was blocked for 24 hours after that.

Another case was the personal profile of a retired Intelligence Agency officer, who revealed in a FB post that a local coordinator of an anti-government liberal-left protest movement during the communist period was a colonel of a Soviet-dependent military intelligence agency.

All of this happened just within the last month. FB actions generated a huge pushback and hit the headlines. Deputy Minister of Justice qualified FB actions as "censorship". Minister of Digitization tweeted that she "asked FB management for a talk". Many people started deleting their FB accounts in protest.

FB got frightened and reactivated the event page of Independence March, but many nationalist/conservative organizations profiles still remain blocked.


Here's what to do about Facebook.

(1) permanently close your account. https://www.facebook.com/help/delete_account

(2) if you're like me and you can't do that, "unfriend" everybody except the people you work with. Delete all your affiliations and locations.

If enough people did this, FB would have to struggle a little to retain membership. That's the kind of incentive they need. Altruism demonstrably doesn't work, especially Sand Hill Road altruism.

(I help a couple of little nonprofits so it happens that I need a fb account.)


Alternative, lightweight solution,

* Do not visit Facebook often; when you do, use only incognito mode, and delete any existing Facebook cookies in all your browsers.

* Do not use the Facebook app; inform people that it is a battery hog and steals everything it can from your phone.

* Do not accept new friend requests, or send new ones.

* Do not post status updates or any other kind of content

* Do not "Like" things, do not comment. Do not actively produce anything for Facebook.

* Keep it only as a place to check whether other people upload pictures of you, by scrolling the feed a little, so you can tell them to stop if they post information about you. Use the web interface for messaging any friends who don't know any other means of digital communication.

* Sometimes "Like" things you don't, spread some misinformation, devalue whatever content/profile they already have on you.

* Of course, set all your posts to private and all other settings to maximum privacy, allow no one to write on your "wall", and so on.


Also a good tip: unfollow all your friends by default, except for a select few.


No. I find facebook actually useful to talk to my friends and keep in touch with them.


If you unfollow your friends, you can still be in touch. Best of both worlds :)


Or just use messenger.com instead of facebook.com: no feed.


Wouldn't Facebook have a perverse incentive to define 'misinformation' as any information which offends the most people, regardless of truth, in order to keep users from leaving?


Of course. Worse, they actually have two different strategic factors that play poorly together: 1) retaining users, and 2) having a personal bias about what will offend users.

These two items can magnify each other in a false direction as they attempt to retain users that they perceive they have.


I don't understand how this is different than any other communication medium.

Telegraph operators shouldn't feel any onus to ensure that they stop the spread of misinformation, right?

In fact, I think it is VERY detrimental for them to step in and started editing posts, bringing back the human editors. Humans have biases, ML less so.

If an algorithm decides that something is trending, even if it is "incorrect" to some interpretation of the word, then it should be treated the same. It's just a matter of principles.


How are we ever to prevent misinformation when newspapers are happy to publish "political candidate did crime X! anonymous John/Jane Doe said"?

If newspapers were held to ethical requirements that prevented mere accusations from being news, I could see how Facebook could also be held liable for not following the same rules. As it is now, someone could just append "John Doe said" to the end of any conspiracy theory and, by news standards, it's fact-checked and ethical.


It is a temporary aberration. In due course, people will come to realize that they can't believe everything they read on the internet (duh), especially where provenance is not clear, and Facebook will improve their algorithms.

Some kind of reasonable balance will be found.

People blaming Facebook for Trump is far more a reflection of themselves than of FB news algorithms.


I certainly do not blame Facebook for Trump, but I don't see any balance happening anytime soon. It's not just an issue of the Internet, but also with traditional forms of media like cable TV, talk radio and print. Distrust and misinformation are shoveled out daily. Even the local pastor is in on the action these days.


You can't prevent false ideas spreading by censorship; quite the contrary.

The idea that one can create an AI or censor or committee to separate truth from falsehood and then put it in charge is plainly authoritarian.


What about just keeping out stories that are patently false? If I did an image search for cars, I wouldn't want a boat in there. If I did a search for news, I wouldn't want a fabricated story.


I agree with you. I wouldn't completely remove them, but I would mark the ones from certain sources as having a questionable origin.


Such stories would not go viral.


That's not true: false, viral stories were the very lifeblood of this election.


I'm sure that is so. However, for a story to spread, it has to seem like it might be true to a significant fraction of readers. So it can't be 'patently false'.


Throwaway for obvious reasons. It's dangerous to support reform against the reactionary left.

I'm a Trump voter. I consider myself pretty well educated (M.Sc. CS/EE, honors), and I work in a research lab. I'm not your typical Trump voter. Full disclosure: I'm also an immigrant (legal one, nearly 20 years), naturalized US citizen, I make several hundred thousand dollars a year, and pay a ton in taxes. I'm liberal on social issues. I support LGBT and gay marriage. I'm an atheist. I have a wife and a kid. I'm also pro-gun, pro marijuana, and pro letting people do whatever they want with their lives. I'm also extremely fiscally conservative, and against illegal immigration.

My FB feed was so skewed against Trump I had to close my account. Bombshell stories on Clinton wouldn't trend, while the most inane BS about Trump would immediately reach the top and stay there.

It is strange for me to hear FB being accused of being "too conservative". I certainly saw nothing like that "red" feed on WSJ.

BTW, I could elaborate why I voted for Trump if anyone is interested. I don't agree with him on everything, but on the balance he seemed like the lesser of two evils to me.


I would like to hear you elaborate on your reasons for voting Trump, particularly what kinds of changes you are expecting/looking forward to.


Happy to oblige. Let's get the big one out of the way: I disagree with Trump on climate change. That said, the current president did jack sh#t about it, and so would Clinton (who is owned by her wealthy donors). Combined with the fact that the President can't enact policy unilaterally, and there's broad consensus around climate change in the legislative branch, I don't think too much damage will be done if any at all.

I voted for Trump for several primary reasons:

1. I'm very much against war, and Clinton has shown herself as a war hawk again, and again, and again, and basically wrecked much of the Middle East as a secretary of state. Enough is enough. This alone would have been a sufficient reason for me to vote for Trump. 2. I do not believe in government handouts. I believe that for prosperity, and especially to improve the lives of the working poor, it is necessary to create jobs. To do this, you need to reduce supply of unskilled, below-minimum-wage labor flowing across the border, and make keeping jobs in the country the low energy state into which corporations will naturally arrive, if they want to have US markets at their disposal. Trump promised both of those things. 3. I do not believe that free flow of military age males from the countries with which we're de-facto at war is a wise thing to have. Refugee needs would be much better served by their neighbors, not by people whose military blew up their relatives. 4. It is becoming blindingly obvious that Obamacare is a trillion-dollar handout to the Big Pharma and medical insurance companies. The healthcare is SIGNIFICANTLY more expensive now than it was before, and coverage is worse as well. And the cost is growing double digit percentages every year. Donald Trump promised to improve competition by allowing import of drugs and letting people shop for medical insurance across state lines. Not sufficient, IMO, but we just can't afford to pay 10x for the same drugs anymore. 5. The vast majority of Donald Trump's contributions comes from his small scale supporters, not from Wall St and megacorps. As such, he's not beholden to their interests, unlike certain other candidate. Thiel being the only notable exception I can name off the top of my head. 6. I've found what the press did in this cycle completely disgusting.

So TL;DR: anti-war, pro-jobs, anti-sjw/pc, pro-LGBT, not owned by Wall St or the establishment. That sounds pretty good to me, loose tongue and personal antics notwithstanding. I don't agree with him on some things, but by and large there's more overlap than with Clinton.


Thank you for responding, this gave me food for thought. (Also, you can add extra line breaks in your comment to get breaks between your numbered points.)

Also wondering how much you were convinced about Trump's sincerity. Whether he truly cares about the promises he made or the people he made them to.


I'm more convinced of his sincerity than of Clinton's. Let's face it: he doesn't _need_ the presidency. 400K a year is pocket change for him, he's not a politician, and he did not spend 30 years of his life converging on this goal. Moreover, he partially financed his own campaign with tens of millions of dollars. Truth is, all politicians lie to get elected; he's no exception.

But I believe he did not lie about the core tenets of his campaign: wars, immigration, globalism. He will do something about those at the very least. That's good enough for me.


> That said, the current president did jack sh#t about it

Did you get a chance to see this?

https://www.whitehouse.gov/energy/climate-change

> In September 2013, EPA announced proposed standards for new power plants and initiated outreach to a wide variety of stakeholders to help inform the development of emission guidelines for existing plants. In June 2014, EPA released the Clean Power Plan — the first-ever carbon pollution standards for existing power plants that will protect the health of our children and put our nation on the path toward a 30 percent reduction in carbon pollution from the power sector by 2030.

> Since President Obama took office, the U.S. has increased solar electricity generation by more than ten-fold, and tripled electricity production from wind power. Building on the advancements of the first term, we continue to take new and comprehensive action to encourage cleaner forms of American-made energy. Through public-private partnerships, streamlining the federal permitting process, and furthering American leadership in clean energy, we are on track to meet our clean-energy goals: to install 100 megawatts of renewable capacity across federally subsidized housing by 2020, permit 10 gigawatts of renewable projects on public lands by 2020, deploy 3 gigawatts of renewable energy on military installations by 2025, and double wind and solar electricity generation in the United States — once again — by 2025.

> That is why President Obama created the Advanced Research Project Agency-Energy (ARPA-E) in 2009. This Agency helps to advance high-impact energy projects that have the potential to transform the way we generate, store, and use energy. Every year, the President’s budget continues to invest in the crucial programs that will keep the United States at the forefront of clean energy research, development, and deployment.

> The Obama administration has proposed the toughest fuel economy standards for passenger vehicles in U.S. history, requiring an average performance equivalent of 54.5 miles per gallon by 2025. The Administration has also finalized the first-ever fuel economy standards for commercial trucks, vans, and buses for model years 2014-2018. These standards are projected to save over 500 million barrels of oil and save vehicle owners and operators an estimated $50 billion in fuel costs.

And I'm quoting about a fifth of what's written there. I am honestly surprised that you can dismiss everything here. Even more so in comparison to a person who tweets like this.

http://www.ecowatch.com/6-of-donald-trumps-most-outrageous-t...


Blah blah blah, announcements of a plan to create a committee to create a committee. The fact is no tough environmental legislation was actually enacted. And solar grew because it got massively cheaper. Guess what, it'll continue to grow under Trump.

Don't get me wrong, I did vote for Obama (again mostly because he promised to end wars), but he ended up being a complete disappointment on almost everything.


> Blah blah blah, announcements of a plan to create a committee to create a committee.

You're willfully ignoring things I wrote up there to paint a very dishonest narrative.

> Every year, the President’s budget continues to invest in the crucial programs that will keep the United States at the forefront of clean energy research, development, and deployment.

Read this -- research into clean energy requires funding from the federal government. Much like the moonshot projects of the '60s in space and defense, we need something similar that isn't entirely dependent on the Wall Street markets (which are focused on short-term gains).

Don't you think this is much better and more actionable than a committee?

> And solar grew because it got massively cheaper. Guess what, it'll continue to grow under Trump.

Why can't Obama claim credit for that? Do you think it would have happened independently of his administration's viewpoints on solar?

No doubt, if he holds status quo, solar will continue to grow at the same or higher pace. But if he subsidizes coal further, I would be skeptical. But we will know whether your prediction was true or not in a few years time.

> Don't get me wrong, I did vote for Obama

I'm not getting you wrong, you're getting me wrong. You're just dismissing reasonable conversation. And voting for Obama doesn't give you a blank cheque to parade inarticulate arguments, nor the freedom to cherry-pick what you liked and disliked about his presidency.

> http://www.cnn.com/2016/02/09/politics/supreme-court-obama-e...

Read that. His administration has been hampered by Republicans on partisan lines from 2010, and maybe even day 1 depending on whom you ask.

Don't get me wrong, I welcome criticism about his transparency, whistleblower treatment, and spying.


Why do you consider the Clean Power Plan not tough?


Neil Postman argued in "Amusing Ourselves to Death: Public Discourse in the Age of Show Business" (1985) that entertainment had become the "supra-ideology of all discourse on television". Good television is entertaining while bad television is not, so we learn to judge all content on television based on its entertainment value.

"Americans no longer talk to each other, they entertain each other. They do not exchange ideas; they exchange images."

I think Postman's comments are extremely relevant today (30 years after 1985) except that now they would apply to the Facebook news feed. (The quote about 'exchanging images' is now literally true.) The role of "news as entertainment" does a lot to explain Trump's emergence as a candidate in the first place (he got a lot of coverage early on because of his outlandishness).


I don't think this is only Facebook's fault. I watched several extremely biased Youtube videos on both sides.

I guess it's just how decentralized media works; it only tells people what they want to hear.

Not so long ago, the media used to be relatively trustworthy, now everything has become propaganda. It started online mostly but now even major TV channels and newspapers which used to pride themselves on journalistic integrity have become propaganda machines.

There is no truth in the media anymore; the only way to approximate the truth is by watching opposing media channels and then 'averaging them out' in your mind.

All the news I was reading online was so obviously biased in Hillary's favor that I was compelled to watch Fox news (which I used to think was complete garbage) just to try to get the other side of the argument.


This assumes Fox is as biased in one direction as, say, NPR is in the other. But you can tell the quality of reporting from the content itself and the way it is presented. For example, while cable news channels will go to the same non-expert pundits for every story, NPR will actually call a different specific expert who is knowledgeable to comment on each story.


Sure, that's something you have to account for as well (they're not equally biased).

That said, you'd be surprised how well media organizations can bend hard facts (and so-called 'experts') to match their opinions. People who've worked in quantitative market research will tell you that you can find pretty much any pattern you want in any data if you present it (and label it) carefully.

Just because they've put in extra work to get an 'expert' and did some research doesn't mean they're unbiased; it just gives you extra confidence. But you still have to fact-check yourself.


> There is no truth in the media anymore; the only way to approximate the truth is by watching opposing media channels and then 'averaging them out' in your mind.

Nailed it. I force myself to deliberately read viewpoints I disagree with, to listen to what they're saying, to hear what is important to them. Sometimes I see their points, sometimes I don't, but I always learn something new. It's really quite shocking and scary how many people aren't even familiar with the arguments of "the other side." They only know "their side's" caricature of the other side, which is simply not enough.


Can't wait for the Facebook Department of Truth™ to inform me!


Just stop using aggregators. Pay for real news sources that hire actual investigative journalists.


This. Their aggregator is ridiculously bad. Even the sources of news it shows you when you click on a "trending" item lead to really terrible sites. I'd rather the news feed be an actual news feed and show me legit news from legit agencies.


Real news sources are dead and actual investigative journalists do not exist anymore.


Well, that's patently untrue. The Guardian, the NY Times, and Der Spiegel are the first three publications that come to mind that still do important investigations, but they are not alone.

There are a lot of PR plants even in the more reputable newspapers, which is unfortunate, but there are still regular thorough investigations, which I assume are getting harder to do as money and resources continue to shrink.


The blind NYTimes bias has been readily apparent in this election cycle. WaPo as well. The entire media system missed this election.

A staffer in the Obama White House remarked on the quality of reporting now, as of a few months ago:

"All these newspapers used to have foreign bureaus. Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing."

https://www.washingtonpost.com/blogs/erik-wemple/wp/2016/05/...


That said, NYT performed some significant investigative reporting. Though Gawker broke the Clinton private email server story, the New York Times vastly expanded the investigation and brought the story into the mainstream.

The statement that the entire media system missed the election is an exaggeration. One function of the NYT, the election prediction section, failed. It shouldn't reflect on the various investigative reporters doing actual digging.


I would recommend reading this commentary, from the political director of CBS News, today:

"The mood in the Washington press corps is bleak, and deservedly so.

It shouldn’t come as a surprise to anyone that, with a few exceptions, we were all tacitly or explicitly #WithHer, which has led to a certain anguish in the face of Donald Trump’s victory. More than that and more importantly, we also missed the story, after having spent months mocking the people who had a better sense of what was going on.

This is all symptomatic of modern journalism’s great moral and intellectual failing: its unbearable smugness. Had Hillary Clinton won, there’d be a winking “we did it” feeling in the press, a sense that we were brave and called Trump a liar and saved the republic."

http://www.cbsnews.com/news/commentary-the-unbearable-smugne...


I'm frankly amazed that was published at CBS and not some pseudonymous post on medium.


On this side of the pond, The Guardian is also so biased, that it is not worth reading anymore.


This is actually an incredible option ... and for me, quite a mind-blowing one.

Could news.facebook.com be an actual content-producer with real journalism?

Large, investigative projects on what is possibly the world's largest distribution platform.


The Ny times has a paid option, they have world class investigative journalism...


I do. Now how do we convince everyone else?


Show them the DNC Leaks - they'll inspire trust for the traditional media in anyone.


Misinformation is any information that does not follow the official guidelines laid out by the ministry of truth.


The problem is that the Facebook feed shouldn't be about news in the first place, or anything that isn't original content from the people I follow. I miss the days when the news feed was photos of my friends, notices about their new jobs, what they ate, etc. instead of a stream of tabloid junk and recycled news articles.

Unfortunately Facebook gives no tools to limit the posts to only original content and profile updates. Curating your followed list is pointless since reposted crap and original content are mixed together.


I agree. I think there could be a limit on the amount of links and of images containing text that you can post in a day, as well as warning messages next to content that comes from questionable sources.


Admitting to fighting misinformation amounts to admitting that they will pursue a left-wing bias.


We are developing systems to identify and check the impact of misinformation at scale (triage + human in the loop).

If you're interested in potentially joining or funding us, let us know https://docs.google.com/forms/d/e/1FAIpQLSclhq7zrUKI3nxJFiYw...

-----------------------------

Also relevant — my comment 43 days ago to the new "Request for Startups": https://news.ycombinator.com/item?id=12594955

"I'm really happy to see YC putting attention toward the (increasing) market failure around media quality. It's a serious problem, which IMHO imperils governance and stability worldwide.

I'm exploring a way to improve this dramatically, and fairly rapidly; essentially creating a stronger market for trust & quality, with resulting ranking rewards from FB, Google et al. If you are interested in learning more and perhaps being involved, let me know ([email protected]). Experience with news organizations, partnerships, platform companies, pr, moderation systems, or machine learning are all especially helpful."


The biggest source of bias in the media is not related to feeding people false information, it's about misdirection.

If you have two articles A and B, where A is more important than B, sometimes the media outlet will deliberately choose to publish B instead of A if it matches their philosophy better (even though it's less important). If the news outlet shows each reader five articles of type B for every one article of type A, then it is manipulating the reader's opinion to become biased towards type-B thinking (by focusing their attention on less important stuff). It doesn't matter if all articles are 100% accurate; your perception is being affected through the exposure imbalance, through priming.

I think that's what happened with the Clinton/Trump fiasco; "Hillary media" was publishing more articles about Trump's "sexual deviance and unstable character" while "Trump media" was publishing more articles about Hillary's "deception and murky financial schemes".


How do they define misinformation, given all the grey areas, religion for example?

In such grey areas, one person's information is another's misinformation, and that gets messy for anybody, let alone an AI.


So Facebook will be openly committed to censoring certain information for the public good. What could go wrong?


Their algorithms are already doing this. It's not like it's a pure chronological feed.


Facebook isn't the problem. It's a function of its users. It is as much responsible for misinformation as a city park is responsible for the crazy guy shouting about the lizard people. The problem is all the users who actually believe anything they read via facebook.


Suppose you were the person to relay the message though.

Imagine one friend tells you "Can you please tell Adam X", but you know X is a flat-out lie and is actually horrible and offensive. So do you agree to spread the lie? Oh, and you're doing this for free, and it's not your friend, it's two strangers.


"Can you please tell Adam X" is a hair away from spreading rumor, hearsay. I would ask, at least in the context of facebook, why the speaker cannot talk to Adam himself. If he cannot, if he doesn't have the connections required to speak to Adam, then his message probably isn't worth forwarding.


I was trying to paint the picture that Facebook is the one passing along messages from one person to another. So yes, people could turn their computer off and go directly to someone else's house and tell them the message, but I think we've grown accustomed to sending messages to each other over social media.


Systems are functions of nodes, and edges, and the biases and characteristics among them.

Facebook's users, absent Facebook, weren't influencing world events. Now they are.

What changed wasn't the nodes. It was how the edges (Facebook) connected them.


I would like to see fewer headlines framing every statement made by a company as "admits." Facebook is not admitting to wrongdoing or confessing to a secret crime it tried to cover up; it is acknowledging there is an issue it has to improve on.



If it's such a problem why not just disable the Newsfeed?

No one needs to get their news from Facebook


Very relevant to understanding today's media and politics is Adam Curtis' new documentary film Hypernormalisation. It's long but very worth it

http://www.bbc.co.uk/programmes/p04b183c

If anyone has links to the BBC iplayer for the US, if there is one, or another legit source for outside the UK, please do reply with it.

I strongly encourage you to watch it, and if you can only find it from a dodgy source, buy someone a classic BBC DVD of something for Festivus to support their making more. (lots of things, Dr Who, Jeeves and Wooster, the Century of the Self...)


Seconding this recommendation. American viewers can watch the film here:

https://thoughtmaybe.com/hypernormalisation/


"It’s totally forgotten now, but for the 100 years after the American Revolution, the U.S. government made it free or almost free to send newspapers anywhere by mail. It was available to papers of all political perspectives, with no government censorship. The rationale was straightforward: This was necessary for people to participate in governing themselves."

https://theintercept.com/2016/11/09/donald-trump-will-be-pre...


If the goal is not to bias, total transparency is the answer.


Here's a tool someone could write. Write a browser add-on which modifies Facebook pages so that the news links are changed to Google, Bing, and Wikipedia searches for the news headline. Connect to Google/Bing in a mode with no signin or cookie info, so you don't get "search personalization". This will get you out of Facebook's filter bubble quickly. Google and Bing have problems, too, but they are somewhat better at blocking spam, having been at it longer and being in the business of forcing spammers to buy their ads instead.
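In case anyone wants to sketch this, the core logic is just building no-personalization search URLs from a headline. A minimal sketch in Python (the function name is made up, and in a real add-on this logic would live in a browser content script that rewrites link hrefs; the `pws=0` parameter has historically asked Google to disable personalized results, but that behavior is not guaranteed):

```python
from urllib.parse import quote_plus

def headline_search_urls(headline):
    """Build no-personalization search URLs for a news headline.

    Hypothetical helper: a real add-on would call this from a content
    script and replace each news link's href with one of these URLs.
    """
    q = quote_plus(headline)
    return {
        # pws=0 has historically disabled Google search personalization
        "google": "https://www.google.com/search?q=" + q + "&pws=0",
        "bing": "https://www.bing.com/search?q=" + q,
        "wikipedia": "https://en.wikipedia.org/w/index.php?search=" + q,
    }
```

Cookies and sign-in state would still need to be stripped (e.g. by opening the links in a private window) for the "no personalization" part to hold.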


Better and easier to do a Google News search for the link itself. If the link doesn't show up in news search, it can be considered a less credible source. If it does show up, it'll be accompanied by follow-up sources.


Google as the arbiter of what's good or bad? Please!


well are you going to suggest a better solution?


Here is what I would propose: attach a very visible rating to each story shared from any news source; the rating says how well the source holds to its journalistic integrity. There should be very clear criteria that Facebook holds its news content providers to. If you click on the rating of a low-rated source, you should be shown exactly which extremely misleading articles they have published in the past, and these reviews should be tagged so you can view them directly if someone is spreading misinformation in the news feed.
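The lookup behind such a badge is simple enough to sketch. A toy version in Python, where the domains, scores, and example articles are all invented for illustration (a real system would need a defensible rating methodology, which is the hard part):

```python
# Hypothetical per-source integrity records; everything here is made up.
SOURCE_RATINGS = {
    "example-tabloid.com": {
        "integrity_score": 2,  # 0 (worst) to 10 (best)
        "misleading_articles": [
            "Celebrity X secretly a lizard person",
        ],
    },
    "example-wire-service.com": {
        "integrity_score": 9,
        "misleading_articles": [],
    },
}

def rating_badge(domain):
    """Return the badge text shown next to a story shared from `domain`."""
    record = SOURCE_RATINGS.get(domain)
    if record is None:
        return "UNRATED"
    score = record["integrity_score"]
    flagged = len(record["misleading_articles"])
    return "%d/10, %d flagged article(s)" % (score, flagged)
```

Clicking the badge would then list the `misleading_articles` entries, as described above.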


I agree with you. I would keep it simple and as transparent as possible.


Look folks, I get that this is a libertarian crowd but hear me out:

If someone says the earth is flat, it's not dystopian for editors to minimize those claims, or even point out how incorrect they are. Instead, in today's media world, the response would be, "two sides disagree on contour of the planet", and would present two talking heads to chatter about their opinions.

Facts are facts, and we're losing the battle online to disinformation.

Beware of slippery slope arguments about how editorializing leads to propaganda.


There are two big problems shown by this. First, that any one service like Facebook could become so large and unopposed that we need to care how it handles something like this. And second, that people seem fine with the idea of living only inside Facebook, unfriending or muting anyone who disagrees with them, until nothing else matters.

This is an utterly unhealthy way for people to be. If you want to combat misinformation, you must also encourage (teach?) people to stop being spoon-fed and start being more curious and more critical.


Facebook could deliberately inject a certain amount of contrarian content into people's feeds.


"Stop the spread of misinformation" is a subset of "stop the spread of information", which is yet one more way of saying that some people can't stand the freedom of speech.


That is fair - however it does not have to be implemented by stopping the information from spreading. An alternative could be to refuse to distribute the false information without a disclaimer reading "DEBUNKED / FALSE / see here for fair balance"

Or include a DIV to one side with a bulleted list of false claims.

Consider the TRUMP quote image: https://pbs.twimg.com/media/CRxEt7pU8AAzekC.jpg

People Magazine 1998 - I'd run as a republican because they are the dumbest. I'd lie my face off and they'd love me [paraphrase]

A Facebook fact-checker wouldn't have to suppress the sharing of the image. It could create an overlay or add a subtitle or tag which reads "FALSE QUOTE - this isn't real"

Rather than relying on the hundred million individual conversations being maybe fact checked - an algorithm / ai / editorial squad could apply fact checking automatically.

With the image thumbprint / URL flagged in the Facebook database as proven false, it gets the extra content on its own.
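The "flag the thumbprint once, annotate everywhere" idea could be sketched like this. All names are illustrative, and a production system would use a perceptual hash so re-encoded or resized copies still match; this sketch uses an exact SHA-256 fingerprint for simplicity:

```python
import hashlib

# Fingerprints of images already reviewed and found to be fabricated.
DEBUNKED_FINGERPRINTS = set()

def fingerprint(image_bytes):
    """Exact thumbprint of an image's bytes (a perceptual hash in practice)."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_as_debunked(image_bytes):
    """Record one reviewed image so every future share is annotated."""
    DEBUNKED_FINGERPRINTS.add(fingerprint(image_bytes))

def overlay_for(image_bytes):
    """Overlay text to render on a shared image, or None if not flagged."""
    if fingerprint(image_bytes) in DEBUNKED_FINGERPRINTS:
        return "FALSE QUOTE - this isn't real"
    return None
```

One fact-check of the quote image would then automatically annotate all hundred million shares, instead of relying on each conversation to fact-check it separately.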


It would be nice if this were approached more as a user experience enhancement as opposed to a content editorial function. I would love to see the former, as my feed is so overwhelmed with garbage. The latter, however, would almost inevitably turn sinister - the leftist, PC, Silicon Valley influence would have no ability to check itself - even if it's not on purpose, the isolated bubble think could not be overcome. I can't imagine that it would be hard to automate away teenagers purposely creating fake content.


I really enjoy the dynamic of large forums, where varying ideas can pitch in, and I appreciate that much more in comparison to family members who are steeped in facebook.

Global forums can end up with fairly civil discourse, because everybody knows and acknowledges that there are differences in opinion. If you want your thoughts to be respected, you need to make them level-headed, even if it is an emotional argument. The broader the forum, the better it tends to be. Of course, it also depends on good moderation, but even the self-moderation of HN and /. tends to end up being level-headed and fairly moderate.

But there are no alternative opinions or moderators on Facebook. People subscribe to small echo-chamber radicals, which laser-focus on some hyper-exaggerated version of their outlook, and there are no counterpoints in sight. Disagreements with what other people post end up in unfriending, because it's too personally dissonant.

This whole blurting out of one's impassioned political, religious, and social ideologies is simply not appropriate public discussion!

You don't go out to social gatherings and start yelling stuff like this. Yet, on facebook, people share such content on their walls for all their acquaintances to see, which ends up being incendiary. These are subjects reserved for those you have enough rapport with to warrant deep controversial dives, not your personally public web presence. The very nature of the mechanics of social media tends to stoke and reward this behavior.


Zuck seems to be crafting and curating his persona & reputation for a future life of public office. Changing up facebook's bad content reputation will be important for this path.


Something I haven't seen mentioned yet is the voting system. I think the voting system ruins online discourse.

Let me explain. Within my own social media and online forum bubbles, there are a certain set of norms and beliefs that are easy to express.

But should I express the 'wrong' beliefs in the 'wrong' forum, I simply get downvoted to oblivion. And then people don't see what I have to say. Look what happens here on Hacker News, my comments will literally start faaaaading awaaaaaay!


Truth isn't a majority opinion.

It's correspondence to an external Universe.

Voting systems _may_ help get you there, but not if the voting-group tendency rewards values other than truth.

Related: Celine's 2nd law.

The problem is complex: http://plato.stanford.edu/entries/epistemology/


Ministry of Truth


The linked article from NiemanLab [1] is a really odd read. It ends like this:

"It’s been said that we get the media we deserve: that the journalism we see is a reflection of business structures and audience decisions, not the result of an elite’s decisions to shape public opinion. There’s a lot of truth to that. But the information we produce and consume is generated by human beings, not systems, and those human beings have just gotten the shock of their professional lives."

So he starts by saying that the journalism we see is not the result of an elite's decisions to shape public opinion as if it is a good thing (which you would probably agree with).

Then the next line implies that the result of the opposite, journalism shaped by the unwashed masses perhaps, is that some bad thing will occur.

At this point, you are probably expecting something neutral to follow. I mean, after all, the author talks about the journalism profession, which is supposed to provide both sides of the picture.

What actually follows is this:

"If we’re going to build a better environment for news, we need to think about these issues in a much bigger context than one election night. And it’ll take everyone — journalists, readers, tech companies, and more — to make it happen."

Great, so the author has just defined himself, Facebook, the general media and the people who agree with his views as the "elite" who wants to "fix" the problem by shaping public opinion in a way which is more palatable to him.

Maybe this attitude is what caused this "systemic shock"?

[1] http://www.niemanlab.org/2016/11/the-forces-that-drove-this-...


I'm still waiting for the day there's a responsible entity that only ever publishes things when it's 100% certain and is willing to bet its freedom (going to prison) on it. But given how information and the influencing of opinions is a market and a means of controlling the population, I'm afraid this won't happen with official support; it would only be seen as the crazy-lunatics news agency that publishes stuff a week or a month later, after having vetted things.

I'd like to say leave speed reporting to the Twittersphere and mandate a clear label on unvetted news reports on any network, but I doubt politics can have such influence on the media. I would love it if the news reports had a watermark that says fresh-and-unvetted just like "preliminary results" or "consult your doctor before taking".

First we have to encourage and support critical thinking, but too much of it may lead to some influencers misusing it to support causes which deny past and current crimes of humanity on itself and the planet.


That's scary. 100% certain? Nothing is 100% certain.

So they can't even say "it's likely to rain tomorrow"? This doesn't seem thought through.


Not all topics warrant certainty, but given the power of news outlets in forming opinion and thereby influencing the population's behavior, I do think certain topics demand responsible reporting that doesn't report anything at all if uncertain. It's the same as a good police detective not disclosing speculation to the media: they pursue many leads but conclude only one, and you don't want a mob to go lynch people. The same logic applies to news outlets forming people's opinions. That's why I think there's a need for, admittedly few, certain-reports-only news agencies. That way, if you read posts on The Sun or E!Online, you get accustomed to not taking them seriously, forming doubt that this is most likely speculation. Once something gets confirmed unquestionably, it can migrate to one of the few vetted-only news outlets, if it's something they cover.

That said, our weather models are pretty good, but not good enough to make certain predictions that far into the future; they can for the next few hours.

It's like a software company's model of code branches. The Apple/Google/whatever filesystem team works on something; it gets pushed into their level of production branch, then it percolates up to the shared production kernel branch, and after a couple more layers it hits the common branch, which is what public production binaries are made from and consists of kernel, userland, and other modules all merged together. Not all software shops operate this way, but it's what a project can demand after it reaches a certain size. The Linux kernel works this way too, to name a successful non-commercial project. You can argue this doesn't prevent regressions, and that's true, but it's hard to deny there would be more regressions (aka false reporting) with unfiltered (aka unvetted) reporting.


I can understand having some news outlets with a much higher threshold.

I still don't believe the term "100% certain" is meaningful. Maybe if they were to put a label on certain facts: "This fact is considered by our editorial board to be 93% certain." And maybe have a chart, so that figure can change over time.

I think there are better ways, and that there should be accountability. I just am not in favor of black and white terms for concepts that, to me, are purely shades of gray.


So who's going to be Head of Truth at Facebook?


Someone very good at it, I hope.


I think it would be better to provide something like the opposite of "friends you'd get along with"

If we could switch the facebook algorithm to different modes - "people you agree with" "people who'd challenge you"

That would be better than straight up blocking false things. Let the conversations happen.
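One toy way to express those modes, assuming each post and each user carried some rough stance score (a big assumption; everything here is hypothetical):

```python
def rank_feed(posts, user_stance, mode="agree"):
    """Order posts by stance distance from the user.

    mode="agree" surfaces like-minded posts first;
    mode="challenge" surfaces the most opposed posts first.
    Each post is a (title, stance) pair with stance in [-1, 1].
    """
    def distance(post):
        return abs(post[1] - user_stance)

    # "challenge" mode simply reverses the similarity ordering.
    return sorted(posts, key=distance, reverse=(mode == "challenge"))
```

The hard part, of course, is estimating stance scores at all; the point is only that the ranking knob could be exposed to the user instead of fixed by Facebook.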


Censorship


This is sort of starting to remind me of the Reticulum in Neal Stephenson's book Anathem. He talks about how they are constantly checking the reputation of data and only valuing high reputation sources. It's certainly hand wavy but interesting. Life imitating art imitating life.


Good luck with that. Reddit's failed at it. Twitter's failed at it. It's almost always bundled with some sort of hate speech or political rhetoric that triggers swaths of people when they feel like they're being silenced.

There probably is no right way to do it.


I tend to agree—I think that the more we get upset by these sorts of behaviors, the more it encourages them. "Don't feed the trolls" is really the only thing that works, and it's a pretty miserable experience to have nothing you can do about harassment.

Which isn't to say I disapprove of reddit, twitter et al removing hate speech; just that it's a very late and reactive solution to a problem that seems inherent in social media.

Really, all we can do is improve filtering, and facebook seems almost ideologically opposed to increasing user control over the content they see. So I don't see facebook being a pleasant place to visit in the near future.


> hate speech

On many subreddits, opposing massive illegal immigration is considered 'hate speech' and is a reason for a ban.


Err, citation needed. Preferably by someone employed by Reddit. If it's just the subreddit.... I don't know what to tell you beyond "find better ones".


I was talking about /r/europe for example.

> find better ones

It's not that simple, bearing in mind that:

1. The current moderator clique holds the most appropriate name for a pan-Europe subreddit.

2. Everyone registering an account from a Europe-located IP address is automatically subscribed to this subreddit.


I'm willing to allow that approaches to collaborative filtering fail at some level. I've tried a few myself that failed.

Drawing expert opinion into the equation, and looking at how concepts spread, and who spreads them, might be useful.


My elders always taught me to believe only half of what I read, and 'believe' is not even a good word here.

People need to use their wetware and common sense. Tabloids were printing alien invasions with photos, or that certain celebrities were aliens, during the 70s and 80s, and most rational people did not believe them. The papers were allowed to continue printing.

How about multiple sources, and not just being tuned into FB for your entire world view? I'm glad I taught my children how to live without their smartphones, and now they use them wisely. They can even coordinate meeting up when out in the city if their phone's battery dies or the phone fails.

Common sense is a waning resource.


We obviously need a commission that determines what things are facts and which things are fiction. It will be comprised of experts who know more than the populace, so it will be much better. It will be called the Ministry of Truth.


The radio, print, and television media have laws concerning what they can publish. I don't think it's so incredibly impossible to imagine that the new technology which is replacing those old forms, might also have something in the spirit of those older laws. Also consider: Facebook is already censoring what you see, it's just censoring to show you things you will like.


Call me old fashioned but I blame people for being stupid rather than technology.


LOL, what's it going to do, run as an NPO and take donations from its users with no ads? C'mon Zuck, I dare ya, because you know that is the only way to "fix" this... fix your bottom line.


"Rhetorical Fever" is entirely too contagious. Recovery from it can provide immunity, but is fairly rare. (See Shikasta by Nobel-Prize winner Doris Lessing for more).


I deleted my Facebook account and suggest others do the same.


Are you going to suggest this solution to every mass media channel which turns out to have perverse or pathological dynamics?

Do you realise your personal actions in this case don't have any impact (at the margin) on the whole?

Are you willing to entertain that perhaps there's something in the function of Facebook (and other online sites) that leads to information flows with strongly negative social welfare outcomes?

(I don't have, and have never had, a personal Facebook account.)


Facebook needs a censorship department! Get a list of prohibited topics from the White House and block those posts on Facebook.

Better yet, ask Trump what topics to block :)


A humble question --

Had Hillary won, would this post even be on HN?


No. And Facebook does not pretend it's not pro-Hillary (its CEO endorsed her). That doesn't mean the discussion is not warranted. In fact it's long overdue. Facebook is now a medium like the rest; it needs to be regulated like the press, IMHO.


"Regulated like the press?" What the hell does that mean?


Freedom from censorship, enforcement of defamation laws.


Those things already apply to Facebook users. Facebook is actually proposing active censorship here.


Yes, that's what I'm saying: Facebook has no business censoring people.


Oh. Well then I agree.


Probably not. What's probably happened is that people at Facebook looked internally and realized they had the power to sway this election and didn't act on it. It's weird because we want Facebook to not sway us and let us build our own belief structure, yet in this case maybe I would want Facebook to do something, and helping to not spread lies and misinformation may be something everyone can get behind (except Snopes).


Yes.

The problem of disinformation, misinformation, propaganda, etc., spreading through any media channel eventually surfaces. This happens to be the event which finally brought it to consciousness.

Your comment reminds me of the early attitudes toward AIDS, dismissing it on the grounds it was just "a gay disease".

It wasn't. But it took a long time for collective (and political) consciousness to reflect that. It still hasn't in many parts of the world.


Yes, because it's an interesting technical and social problem. I do admit the discussion here would have a different tone if the result had gone the other way, but not necessarily a better tone as there's plenty of vitriol on both sides.


Facebook isn't the right venue for this.

Of course it's filled with garbage!

While it's noble to seek better, the reality is people are using poor tools because the good ones are failing them.

Ask where did journalism go and that's the start on real solutions.

If we had that as we used to, the garbage on FB would be much less of a worry. The laughable would get laughed at, as it should.


I love it when engineers try to fix social problems. When the constructs of the solutions are human-made then they are gameable and therefore not a solution. And also why is this suddenly a problem? I would wager that if Hillary won there would be much less talk of "bias" and "misinformation".


This headline could be retitled: "Silicon Valley calls upon Facebook to use more aggressive censorship"


Also, to all the people here who are OK with selling their privacy to Facebook and Google: how do you feel now that a Trump-led government will have 100% access to all your data? It is time that we fix the internet from the menace of the advertising business model, which has destroyed not only the internet but society itself...


Zuckerberg seems to be denying that it's a problem: http://money.cnn.com/2016/11/10/technology/facebook-mark-zuc...


What is the difference between 'fake' news and true news that's inconsistent with prevailing narratives?

By letting 'the people' choose what trends, at least you are being honest.

Most real news is fake anyways.


The same way parents are responsible for the education of their kids, not schools, individuals are responsible for feeding their brain and developing opinions.

Facebook isn't promising more than cat gifs, Buzzfeed click bait and rants. It is not more an echo chamber than your group of friends.


Can't proper moderation be crowdsourced somehow?

E.g.: a group of fact-checkers, with reputation based on previous performance and a page-rank like system, and perhaps a meta-level above that (people checking the fact-checkers), so that gaming the system becomes difficult.
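A minimal sketch of what such a reputation-weighted system could look like. Everything here is hypothetical: the vote encoding, the reputation update rule, and the learning rate are illustrative choices, not any real platform's algorithm.

```python
def weighted_verdict(votes, reputation):
    """Combine fact-checker votes (+1 true / -1 false), weighted by reputation.

    Returns a score in [-1, 1]; sign gives the crowd's verdict.
    """
    total = sum(reputation.get(checker, 0.0) for checker in votes)
    if not total:
        return 0.0
    return sum(reputation.get(c, 0.0) * v for c, v in votes.items()) / total

def update_reputation(votes, reputation, confirmed_outcome, lr=0.1):
    """Meta-level ("checking the fact-checkers"): nudge each checker's weight
    toward agreement with a later confirmed outcome from the layer above."""
    for checker, vote in votes.items():
        agreed = 1.0 if vote == confirmed_outcome else 0.0
        rep = reputation.get(checker, 0.5)
        reputation[checker] = rep + lr * (agreed - rep)
    return reputation

# One high-reputation checker outvotes two low-reputation ones.
votes = {"alice": 1, "bob": -1, "carol": -1}
reputation = {"alice": 0.9, "bob": 0.3, "carol": 0.3}
print(weighted_verdict(votes, reputation))  # 0.2 (leans "true")
```

The point of the meta-level update is that gaming the system requires sustained agreement with confirmed outcomes, which is exactly the behavior you want anyway.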


Define misinformation, because certain groups will call certain facts misinformation.


An interesting contextual point is that Amazon (or perhaps more precisely Jeff Bezos) has an outstanding and tremendously well respected journalism outlet whereby they publish original, factual, well-researched articles every day at the Washington Post.


They could start by including "misinformation" as a reason when you report a post.


Unlike traditional media, social media spreads free opinion https://en.wikipedia.org/wiki/Decline_of_the_West




I don't mean to be all doom-and-gloom but it seems too little, too late. Someone else in this thread put it well:

> people immersed in echo chambers will accuse them of bias no matter what.


I think that news and advertising is too powerful to be in private hands. All major news and advertising sources should be nationalized or turned into non-profits.


They have it backwards. It's not about blocking misinformation; that just puts you in the role of the Inquisition. It's about seeking the truth.


Both matter. Blocking fuckwits (or misinformation) is a very useful heuristic to promoting truths.

(Google "Block fuckwits" for my earlier discussion(s))


What if they added a flag button labelled "This post contains misinformation"? Then, if sufficient people click it, it is reviewed by a human editor.
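A sketch of that flag-then-review flow. The threshold and class are made up purely to illustrate the idea:

```python
FLAG_THRESHOLD = 50  # hypothetical: flags needed before human review

class Post:
    """A post that gets queued for a human editor once enough users flag it."""

    def __init__(self):
        self.flags = 0
        self.needs_review = False

    def flag_misinformation(self):
        self.flags += 1
        if self.flags >= FLAG_THRESHOLD and not self.needs_review:
            self.needs_review = True  # enqueue for a human editor
```

The human-review step is the important part: raw flag counts alone would just let brigades bury anything they dislike.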


Great, I can expect every New York Times and Fox News article to be flagged by people who didn't like it.


You'll notice I said that such articles would be reviewed by a human editor, which would prevent this type of abuse.


I mean it seems a little dystopian if Facebook is going to become the arbiter of what information is accurate and start culling stuff.


Facebook relatively recently added reaction buttons. Would a "This is untrue" reaction button help?


That's an option. You've got the fundamental problem that many people treat truth as a popularity contest or majority opinion. It's not.

You'd need at some level a source of trusted truths against which to compare or rate statements. Though user-based flagging could work.

There's both a wisdom and a madness of crowds to contend with.


All they need to do is have a skeptical reaction along with their other emotive reactions.


What if Facebook found a way to turn its billion users into tiny Wikipedia-style fact-checkers?


ITT: people who think the world is dominated by unequivocal ideas and information.


I was thinking something similar about Gmail. Once a year or so someone sends me a PowerPoint with "Mars will look as big as the Moon" and other obvious nonsense. Gmail should notify them before sending that it is indeed bullshit, if they really want to send it.


Why not just add a "spam flag" on all posts and let the users clean it up? Or something like a downvote.


So they admit they want to censor FB.


It makes sense to reduce the effect of "cottage industries" getting paid to misrepresent the popularity of certain topics. I wouldn't call that censorship.


One person's censorship is another person's editorial policy.


Why did the communist regimes never use such a great excuse for their media... /s

Seriously, that's a slippery slope.


No, they admit that it may not be the best idea after all to allow people who use 'censorship' as a verb equal footing with serious journalists.


"waahhh somebody told me Breitbart is racist and now I'm being CENSORED"


Facebook is located in the USA. If Facebook institutes its own ministry of truth, do you think it's more likely to antagonize the government, or to work with it as it has done thus far and comply with the requests of the Trump administration?


Censorship of libel and intentional mischief is no vice.


It's up to courts to decide whether something is libel or slander, not an algorithm or an engineer at Facebook.

Private communication / media organizations like Facebook deciding what's acceptable and what's not would only be censorship.


No, it's not. Courts are a last resort, when private mediation has failed, not the first line of action. If I publish news articles claiming "the sky is the color red, and only evil people disagree", courts do not need to be involved for Facebook to bury my articles. Most mischief is obviously classifiable without outside mediation.


You think the New York Times doesn't decide whether something is libel or slander before they print it in order to avoid having the "courts...decide whether something is libel or slander"? It's not a direct comparison, but close enough for me to dismantle the idea that only courts get a say on this.


Are you saying that Facebook is a government? By the same token as libel/slander, censorship is a government task. Everything else is just business rules.


I just had a picture in my head of a '60s white policeman in a Southern state saying the last part as he locked away some civil rights protesters.

One man's mischief and trouble-making is another person's fight for freedom.


Let people develop critical thinking skills and determine their trusted sources. Keep censorship out of this!


That's been tried. It scaled poorly.


What is misinformation? What is false today can be true tomorrow.


[Citation needed]?


It must stop its operations. That is too much.


I have had a lot of similar thoughts recently. If you think of this historically -- soon after mass-communication technologies were first invented, they were eventually abused, the radio propaganda of WWII being one of the best examples.

Laws were introduced, and corporations grew up around these technologies, the producers of the content became professionalized, and by doing it full time, some of them even developed ethics and standards of behavior.

Now, that old media has been destroyed by new technology. The new technology is fundamentally superior in terms of the volume of information that can be transmitted, and it is both push and pull. However, the old media's "checks and balances" were also destroyed, and that is having incredibly negative consequences.

I think there are some concrete, simple, technical steps that Facebook & browser vendors can do:

* Before allowing you to share an article from a source that is of questionable trustworthiness, it should show you a warning message: "This site is known to contain untrue content." They might also consider adding an alert next to links from these sites when they are being displayed in the news feed. I think technically, this problem is not that different from the way that email systems deal with spam.

* Similarly, the way that browser vendors show you an alert before allowing you to navigate to sites that contain viruses, etc, the browser can show you an alert message before taking you to a site that is known to have untrue content. And just like you see a red error message when visiting a site that doesn't have a proper SSL setup, this could appear for sites that are untrustworthy.

Don't underestimate the psychological power of strategically placed small red warning messages over time.
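A minimal sketch of the pre-share warning check described above. The domain list and warning text are invented for illustration; a real system would source the list from the kind of editorial process discussed later in this comment.

```python
from urllib.parse import urlparse

# Hypothetical curated list of low-trust domains (.test TLD used so these
# can never collide with real sites).
LOW_TRUST_DOMAINS = {"example-fake-news.test", "totally-real-facts.test"}

def share_warning(url):
    """Return a warning string if the URL's domain is on the list, else None."""
    host = urlparse(url).hostname or ""
    # Strip a leading "www." so www.foo.test matches foo.test.
    if host.startswith("www."):
        host = host[4:]
    if host in LOW_TRUST_DOMAINS:
        return "This site is known to contain untrue content."
    return None

print(share_warning("https://www.example-fake-news.test/article"))
print(share_warning("https://example.org/article"))  # None: not flagged
```

As the comment notes, this is structurally the same problem as email spam blocklists: the hard part is curating the list, not checking membership.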

Obviously making the decisions of what goes on the list will be a highly political affair, but what is nice to realize is that, unlike the government, all these technologies are developed by private companies and given to their users for free. If you keep the bar for what is considered "lying" pretty high, most educated professional people (who are the employees of these companies) would be able to agree on which things to warn about.

Additionally, the "social media" space has greatly matured and consolidated. There is no longer really a direct competitor to Facebook. Facebook & browser vendors now have the opportunity and, the responsibility to innovate in this area. I think they can do a lot without really impacting their bottom line.

Don't allow yourself to get sucked into the problem of how computers might be able to determine "truth" -- that is absolutely unsolvable. We will not be able to censor CNN, the NYTimes, or even Fox News (in general). However, there are many dark corners of the internet that are actually pretty influential, and that could be toned down with this approach.


Not just mass media.

Writing coincided, and probably facilitated, building of cities, and rule of law (which previously couldn't be written down).

Printing (1436) in Europe saw the split of the Catholic church into multiple denominations.

The continuing fall in printing and paper costs, and rising literacy, eventually led to revolution and populist reform, first in the American colonies, then Europe. Particularly the Chartist movement in England and the Revolutions of 1848 throughout the Continent.

Mass comms as you said, with Fascism in Italy and Germany.

https://ello.co/dredmorbius/post/gqzszjwf4unuqfupzqff8g


Absolutely: technological changes in "information technology" bring about social changes as well.


Corollary: it's not a question of "how do we preserve the current system given some change in media technology?" It's "how will our current system change given some change in media technology?"


ministry of truth'd


*some misinformation


Oh really? You think?


Deleted.


What should we do to you for posting it? Let's start there.


Good point. I've deleted it.


The question remains.


The video has been removed by the user.


How hard is it to write a bot that googles the facts and checks if they contradict the story?


See Google's Knowledge Graph.

The problem is challenging. Perfection is impossible (for deeply philosophical reasons), but a reasonably tractable system strikes me as possible.

https://www.google.com/intl/es419/insidesearch/features/sear...



