Why I closed down my Facebook

This week I closed down my Facebook account.

It was not an easy decision to make. I’d been thinking about it for the last few months, after realizing that the site has become a very different creature from what it was when it all began, over ten years ago.

Back then, around 2005, social networks were a novelty. There was no “practical application” to them; they were more like collections of people grouped by common interests. Take Orkut communities, for example, which hosted pretty much every crazy idea on the planet, from serious discussions to whole pages dedicated to a single bad joke. Interfaces were rudimentary and awkward, we still had the concept of a “guestbook” and, most importantly of all, we were not yet our own worst enemies.

The idea of sharing as it exists today did not exist, either. No one would ever have thought of using Orkut, Facebook or any of those other sites as a source of news.

Facebook, fake news and debate-phobia

These days, Facebook has become the main news source for some people. This fact, in and of itself, is not as important as what comes next: it’s too easy to share anything on Facebook. However, the site has no way of tracking whether what is being shared is actually true, satire, or outright fake news. Add to that the fact that most people just don’t get the Internet, and there it is: the perfect formula for making people believe whatever “alternative facts” one wants them to. People see something on Facebook and don’t bother to check whether it’s legitimate; they just pass it along. This is how a fake news item gets propagated and gains “status”, sometimes even being published by more traditional news outlets when someone doesn’t fact-check as well as they should.

Even worse: since control over who can get an account on Facebook is, let’s say, “relaxed”, it’s easy to create a huge number of fake accounts and use them to propagate anything. You create a blog focused on your chosen subject, you publish a series of posts with whatever serves your purpose, and then those posts are shared by an army of fake accounts. By sheer force of numbers they reach regular users, who give them credit because they’ve been “widely shared”.

The number of shares grants credibility to a post, and from then on the process takes on a life of its own and grows under its own momentum. Unsuspecting people see the post and share it along, as long as they agree with its tagline. It doesn’t matter whether it’s true or not: confirmation bias is too strong, and it inhibits the critical impulse to decide whether something is true before sharing it.

Add to that the fact that, by its very nature, Facebook tends to group people who think alike. Therefore, unless a person makes a conscious effort to seek out diverging opinions, they will receive an avalanche of news that fits their own view. Lack of divergent opinion is a serious problem, because it tends to inhibit free debate. Without a healthy level of idea exchange, opinions tend to become more and more radical, leading to polarization and hostility.

From the beginning I was never too comfortable with Facebook’s “privacy policy”. But, back then, it still made sense to have an account. That way it was possible to keep some control over what kind of information about me could be shared; they always had a section in the settings page about personal information sharing.

That became much more complicated a couple of years ago. With the revelation that private data from some 90 million accounts was used by Donald Trump’s campaign, it has become clear that Facebook does not have the ability (or the motivation) to offer its users the level of control required to avoid incidents like that.

Why is this important? Because using the data of all those people, it’s possible to direct posts at the people who are most susceptible to them, which maximizes the probability that they will share them along. Using tools offered by Facebook itself, it’s possible to target posts at their intended audience for better response rates, governed by rules set by whoever pays for the promotion. With data obtained from the profiles harvested by Cambridge Analytica, the validating effect of numerous shares is amplified enormously. The fake profiles become much harder to detect, too. People end up doing the dirty work of propagating a post without ever realizing what is really going on.

And that’s how a troll becomes the man with the biggest button on the planet.

Leaving “Face”

Leaving Facebook, in my case, is a personal matter. Fortunately I had no professional tethers there, nor anything I needed to promote.

More than that, though, I’ve grown tired of the polarization that has taken over everybody. Brazil is riding the same wave as the United States. The 2018 election is clearly going to be decided on Facebook and its ilk, through manipulation, fake news and memes. Ahh, yes, the memes. They have an even higher destructive potential than fake news. Memes aren’t “serious”, they’re “just for fun”, and that makes them worse. Marketers are working hard to create the perfect meme for you, be certain of that.

And that’s how we learned to hate each other, and unlearned how to have empathy. We started to disdain any opinion that is not our own. We learned to trust the doubtful and to ignore fact-checking. We started to believe we have the right to impose our ideas on other people. And whoever disagrees deserves to be treated as an enemy, and as a threat.

That’s why I closed down my Facebook. And I suggest that, after reading this, you do yourself the same favor and leave, too. I am not the only one saying this.