What Mark Zuckerberg (Facebook) Told Me…

…(and 85,000,000 others) a few weeks ago in a 6,000-word essay entitled “Building Global Community” was fascinating. He essentially sets out a view on how we can create a global community that works for all. Within the essay, he makes some astute observations on human behaviour and polarisation:

  • New religion? He writes of the importance that smaller community structures, such as churches, sports teams and unions, have had in the past in giving us a sense of purpose and hope. But he notes that since 1970, membership of such local groups has fallen by one-quarter. This touches on an idea put forward by Yale computer scientist and writer David Gelernter, who sees the decline of religion being replaced by a “religious”-type belief in political parties. Gelernter argues this explains much of the polarisation we are seeing in democracies around the world. So one’s political party becomes like one’s religion, which makes elections more about identity than economics.
  • Filter bubbles. In his essay, Zuckerberg tackles the big question of filter bubbles (only seeing posts with similar views) and fake news on social media. He argues that simply showing the opposite perspective often deepens polarisation. A more effective approach is to present a range of perspectives, letting people see where they sit on the spectrum.
  • Fake news. On “fake news”, one operational challenge for Facebook is to discern the difference between hoaxes, satire and opinion. But even where misinformation is found, banning it may not change much; instead, Facebook could surface additional information, including fact checkers’ posts. He has also found that people often share posts with a sensational headline without ever reading the story. If, after reading the story, the user does not share the post, it suggests the headline was sensational. Facebook can then factor this signal into which posts appear in your News Feed (a rough sketch of such a signal follows this list).
  • Artificial intelligence. Zuckerberg talks about how Facebook’s artificial intelligence models now account for one-third of all reports to the team that reviews content for “safety”. One challenge is that different cultures have different norms of what is “safe”. So in future, options will be available to block content with nudity, violence, graphic content or profanity (a sketch of such per-user controls also follows this list). This drive is leading to major advances in the ability of AI to understand text, videos and photos.
  • Know the complete person, not just the opinion. Finally, he notes that the best way to improve discourse between people is for them to get to know each other as whole people rather than just as opinions. So first sharing what we have in common – sports teams, TV shows, interests – makes it much easier to have a conversation with someone you disagree with. He also writes that the vast majority of conversations on Facebook are social, not ideological: lots of posts sharing jokes, funny clips and staying in touch across cities.
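
To make the “shared without reading” idea concrete, here is a minimal sketch in Python of how such a signal might be computed. It is purely illustrative, not Facebook’s actual system: the event log shape, the blind_share_rate function and the ranking_weight down-weighting rule are all assumptions invented for the example.

```python
# A minimal sketch (assumed names and data shapes, not Facebook's real
# pipeline) of the "shared without reading" signal described above.

from collections import defaultdict

def blind_share_rate(events):
    """events: (user_id, post_id, action) tuples in time order, where
    action is 'click' (opened the story) or 'share'.
    Returns {post_id: fraction of shares made without a prior click}."""
    clicked = set()                   # (user, post) pairs that read the story
    shares = defaultdict(int)         # total shares per post
    blind_shares = defaultdict(int)   # shares with no preceding click
    for user, post, action in events:
        if action == "click":
            clicked.add((user, post))
        elif action == "share":
            shares[post] += 1
            if (user, post) not in clicked:
                blind_shares[post] += 1
    return {post: blind_shares[post] / shares[post] for post in shares}

def ranking_weight(rate, threshold=0.7):
    """Hypothetical down-weighting: posts mostly shared unread get demoted."""
    return 0.5 if rate > threshold else 1.0

if __name__ == "__main__":
    log = [
        ("alice", "post_1", "click"), ("alice", "post_1", "share"),
        ("bob", "post_2", "share"), ("carol", "post_2", "share"),
        ("dave", "post_2", "click"),   # read it but chose not to share
    ]
    for post, rate in blind_share_rate(log).items():
        print(post, round(rate, 2), "weight:", ranking_weight(rate))
```

The point of the sketch is that the signal comes from aggregate behaviour (the ratio of shares to actual reads), not from any judgement about the content itself.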
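Similarly, the per-user content controls mentioned under “Artificial intelligence” could look something like the sketch below. Again this is only an assumption-laden illustration: the category names, UserSettings class and visible_posts filter are invented for the example, and the labels would come from whatever automated classifier tags the content.

```python
# A minimal sketch (purely illustrative, not Facebook's API) of per-user
# content controls: a classifier tags each post with categories, and the
# user's own settings decide what gets hidden from their feed.

from dataclasses import dataclass, field

CATEGORIES = {"nudity", "violence", "graphic", "profanity"}

@dataclass
class UserSettings:
    blocked: set = field(default_factory=set)   # categories this user hides

def visible_posts(posts, settings):
    """posts: dicts like {"id": ..., "labels": {"violence", ...}}, where
    'labels' would come from an automated content classifier.
    Returns only the posts whose labels don't intersect the user's blocks."""
    return [p for p in posts if not (p["labels"] & settings.blocked)]

if __name__ == "__main__":
    feed = [
        {"id": 1, "labels": set()},
        {"id": 2, "labels": {"profanity"}},
        {"id": 3, "labels": {"violence", "graphic"}},
    ]
    cautious = UserSettings(blocked={"violence", "graphic", "nudity"})
    print([p["id"] for p in visible_posts(feed, cautious)])  # -> [1, 2]
```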

Bilal
