Facebook is a media company, but what’s a media company?

March 6, 2018

When Facebook says that it's a tech company and not a media company, a silent shudder echoes across the internet.

It's like the reaction that, say, Morton Salt would get if the C-suite were to declare that it's running a logistics company, not a salt company. The audience might nod and applaud (as they often do whenever Facebook makes a public statement), but under their breath, they'd be laughing. How can we not be a salt company? That's literally part of our name.

The same irony applies to Facebook, a company synonymous with social media. Key word: media. But this didn't stop Facebook COO Sheryl Sandberg from putting her foot in her mouth last fall when she went on record, confidently declaring that Facebook is not a media company. "At our heart we're a tech company," Sandberg said. "We don't hire journalists."

Just because most of your labor force is engineers and computer scientists doesn't mean you're not a media company. Apparently, being a platform for news and information that reaches billions of users every day, selling ads, and paying companies to use your network to produce original news and entertainment content is not enough to justify relabeling your company to fit into another industry vertical, especially "media," which carries far more regulation and public responsibility than a rote Silicon Valley tech company. Or as Erin Griffith writes, "admitting Facebook is a media company would require Facebook to take responsibility for its role in the spread of fake news, propaganda, and illegal Russian meddling in the U.S. election."

Outside of the extensive public apologizing, massive legal fees and corporate restructuring that would follow if Facebook were officially categorized as a media company, the notion of public responsibility is especially sticky. It's also a big part of the reason we're still talking about Facebook's ethical compass more than a year after the Center for Digital Ethics and Policy's own Don Heider called on Facebook to hire a chief ethics officer and be the voice of corporate conscience in a morally turbulent media era.

Facebook still hasn't hired a chief ethics officer, and it won't, because it's still just a tech company. Plus, an ethics shift for a company with the reach, influence and profitability of Facebook will need more than a Ph.D. and a few administrators to change its culture. Writing for Slate in response to Heider's call to action, Anna Lauren Hoffman, a professor with the Information School at the University of Washington in Seattle, pointed out two problems with the idea. First, the notion that an internal ethics team could serve as "a panacea for all possible ethical problems" rests on a monolithic view of Silicon Valley principalities. It ignores "internal dissenters and external advocates already in the trenches, already grappling with key ethical issues."

Second, the chief ethics officer approach is based on the naive assumption that internal processes could be strong enough to combat the outside commercial and political forces that demand maintenance of the status quo. Like an incompetent internal affairs team on a B-grade police drama, they might be noble and high-minded, but they drink from the same coffee machine as everyone else. And as anyone who watches these shows will tell you, real change only happens when the outside agents come in.

Unfortunately for us, life isn't a B-grade police drama. There's no such thing as fairy-tale justice and freeze-frame endings. Real life is messy and vile, and our modern problems are infinitely complicated, especially when we're talking about Facebook's current existential dilemma. For one thing, Facebook isn't like a legacy news or media company, such as the New York Times, which had the advantage of old money, urbane authority and analog culture to establish its ethical differentiation. For another, Facebook stands in an entirely unprecedented industry space, more like a psychical kingdom than a digital front.

So as Facebook sees it, labels like "media company" don't really fit. Facebook is a platform. People use it like a modern commonplace book. But instead of handwritten recipes, quotes, measures and math equations, Facebook has targeted ads, Twitter plugins and Logan Paul videos. And as with any emerging business, Facebook has struggled with ethical complications for years, everything from live-streamed assaults and privacy violations to international free-speech restrictions (problems that may sound familiar to media companies). But lately it's the nagging and pervasive problem of "fake news" that has driven the ethical thorn deeper into Facebook CEO Mark Zuckerberg's side.

If Facebook were categorized as a media company, its fake news problem would have spelled the beginning of the end, especially given the integral role the network played in disrupting the 2016 U.S. presidential election. But because it's a tech company, it has skirted responsibility, much like a gun shop owner who says that what his customers do with his weapons is not his problem; he just sells guns. Facebook is just a social platform. What people do with it is their problem.

This rhetorical twist likely explains Zuckerberg's latest tactic in the company's ongoing moral battle against fake news: letting the users themselves decide what's trustworthy and what's not. In a Facebook post, Zuckerberg writes:

"The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking."

You won't hear this type of moral ambivalence from most legacy media companies. And in many ways, this "well, what do you think" type of thinking is typical of Facebook's philosophical provenance. But it also reflects Facebook's own ethical uncertainty. A chief ethics officer might be able to pose the question in more nuanced language, but it's increasingly clear that the solution is cultural, not institutional. This is to say, for Facebook to change and take responsibility, culture must change first.

Maybe the digital marketing cliché that "every company is a media company" isn't far from the truth, or at least what the truth should be. Because when a company says yes, we're a media company, it willingly takes on the ethical responsibility commensurate with that categorization. Few companies in the modern marketplace can operate or succeed without a strong sense of mission and an articulated message of corporate social responsibility. Even  has one. And according to recent research, there are emerging industry tools and a growing body of literature that media companies can use to assess the impact of their corporate responsibility programs, namely by analyzing credibility, usefulness and fairness.

But corporate responsibility still isn't a fix for broken institutions. In many cases, it's there for show. And while it's a useful metric for assessing where a company stands on the issues, it's not an ethical salve. The more Facebook waffles and defers on taking a stance as a media company, the more damage it will do and the more culturally outmoded it will become. If Facebook insists on being a tech company at heart, maybe that tells us all we need to know about what the heart of a tech company looks like.


Benjamin van Loon is a writer, researcher, and communications professional living in Chicago, IL. He holds a master's degree in communications and media from Northeastern Illinois University and bachelor's degrees in English and philosophy from North Park University. Follow him on Twitter and view more of his work online.
