Exeter Needs a Harkness Discussion about Facebook

By GORDON HOUGH ’61, JACK RUSSELL ’61, and PRESTON TORBERT ’61

Introduction

We and 24 of our classmates from the class of ’61 believe that Exeter needs to talk about Facebook. We seek a Harkness-style discussion of social media, in particular Exeter’s relationship with Facebook (now Meta, including its subsidiaries Instagram and WhatsApp). This issue came up in Principal Rawson’s most recent conversation, on Jan. 12, when he said, “Social media is a concern — for which I do not have answers to.” The discussion we propose could help the Exeter community reach a consensus on how to deal with Facebook and other social media.

Exeter currently has a relationship with Facebook. It has a page on Facebook, has placed a Facebook icon on the alumni and other webpages, and uses the Facebook app to engage with alums, prospective students, and others. The school attempts to meet alums and others where they are — on Facebook. 

This relationship is understandable. Facebook has many positive features. It has allowed Exeter students to message classmates about homework; enabled old friends to reconnect; allowed grandparents to enjoy photos of their grandchildren; helped users find organ donors; helped users raise relief funds after Hurricane Harvey; helped two million Americans to vote; allowed Egyptian protesters to coordinate demonstrations during the Arab Spring; and alerted French President Macron to Russian hackers two days before the French presidential election. In these ways, Facebook has been useful. 

But Exeter differs from other Facebook users. Exeter’s Deed of Gift instructs that “above all, it is expected that the attention of instructors to the disposition of the minds and morals of the youth under their charge will exceed every other care; well considering that though goodness without knowledge is weak and feeble, yet knowledge without goodness is dangerous . . . [Emphasis added].” As former principal Stephen Kurtz once told one of us, “Exeter is a moral education.”

Immorality of Facebook’s Business Model

We believe that Exeter’s relationship with Facebook betrays this moral mission. Facebook’s business — free service in exchange for private, personal data to be used for whatever business purposes Facebook chooses — is immoral and harmful. This business model is immoral because it uses addiction, surveillance, and manipulation to deprive the user of autonomy. While addiction is generally associated with substances, “behavioral addiction” is now included in the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders. Facebook’s designers build in social validation loops and intermittent reinforcement (the “Like” button) to “hook” users and induce addiction. Arguments that users should simply exercise self-discipline fail when the environment has been engineered to ensure that choices are not free, and when the latest findings of psychology on human weakness are applied to the design of Facebook’s platform.

Facebook’s need for data requires watching and tracking — surveillance. In fact, the major study of its business model calls it “surveillance capitalism” (Shoshana Zuboff, The Age of Surveillance Capitalism). Facebook’s collection of data — immediate, silent, invisible, unnoticed, and automatic — is so effective that Facebook is the biggest surveillance-based enterprise in history. It not only collects the information of its users, but it also obtains personal information on people who are not on Facebook.

Addiction and surveillance support manipulation. Manipulation is treating another person not as a fellow rational agent, but as a device to be operated. It violates the other person’s autonomy. Philosophers have not reached consensus on the precise nature of threats to autonomy. But clearly, when the environment is intentionally arranged to influence an individual in a way that benefits the influencer and harms the individual, that individual is manipulated and their autonomy diminished. As we adopt a more digital life, our online environment is intentionally arranged by Facebook and others for their benefit. Our online experiences are shaped to fit their commercial interests, and we pay with a loss of our autonomy. 

Our moral intuitions and analogies tell us that this business model is wrong. Take an example close to Exeter students. It is already common practice for colleges and employers to ask for access to applicants’ social media accounts, so consider the following scenario: How would students feel if they learned that Exeter recorded all their comments at the Harkness table and sold that information to advertising companies so they could manipulate the students? Wouldn’t this conduct “shock the conscience”? Yet this is a close analogy to what Facebook is already doing. Take another hypothetical. Assume that in the future a person’s DNA has market value. Would students feel comfortable if Exeter collected their DNA from each strand of hair left on the floor of a dormitory or classroom? Clearly, students — like the rest of us — would find that wrong. What about a person’s microbiome? These analogies and the business model of Facebook exemplify “universal commodification” (everything is a marketable commodity), which is morally repugnant.

This business model is immoral, yet Exeter’s posting of the Facebook icon on the alumni and other webpages, and the Exeter page on Facebook, advertise that Exeter condones, if not endorses, Facebook. Few alums, students, or others consult the online Privacy Notice, so it would be unrealistic to deny any endorsement of Facebook by relying on its statement that “None of the links on the Site should be deemed to imply that the Academy endorses or has any affiliation with the third-party.” 

Exeter also has a policy on social media in the E-Book, which states that the Academy recognizes the many benefits of technology but also the risks of internet use. It specifically acknowledges the right of students to use social media, including Facebook and Instagram, and supports open dialogue and diversity of thought. But it offers no judgment on the business model of Facebook or other social media companies. It would not surprise us if Exeter students were unaware of the moral failings of Facebook’s business model.

Exeter’s personal connections to Facebook suggest that it has a special responsibility to take a stand. First, Exeter inspired Facebook. The digitization of Exeter’s Photo Address Book by Kris Tillery ’02 was the precedent for Mark Zuckerberg ’02 to digitize the Harvard student directory — the original Facebook. Further, two of Facebook’s most cogent critics are Exeter classmates. Roger McNamee ’74, managing director and cofounder of Elevation Partners and an early investor in Facebook, harshly criticized Facebook in his book Zucked. Jim Steyer ’74, a civil rights lawyer, Stanford professor, and children’s advocate, founded Common Sense Media, a non-profit organization promoting safe technology and media for children, and wrote the book Talking Back to Facebook: The Common Sense Guide to Raising Kids in the Digital Age. And Kris Tillery has said of Facebook’s business model, “The moral ambiguity of the [Facebook] platform — which is today the revenue based on advertising and targeting — raises big questions about how we should spend our time for our own happiness.”

Exeter is, and sees itself as, a leader in American secondary education. We believe that deleting the Facebook icon would be an act of moral courage and leadership and set an example for other schools to review their own practices. In the effort to talk back to Facebook and call it to account, Exeter should be at the front of the line.

Finally, deleting the Facebook icon would remind students that popularity is not truth; convenience is not goodness; and instrumentalism is not morality.     

Some may argue that the business model is not immoral because users consent. It is true that by using Facebook each user enters into a contract with it that contains an apparent consent to the collection and use of the user’s data. But this consent is arguably void. The applicable law of the contract, that of California, renders void a contract that is “unconscionable.” Unconscionable contracts are those that are “oppressive” and “shock the conscience.” If the contract is found unconscionable, then the consent it contains would be void and Facebook would have no legal right to collect and use the user’s data. No court has so ruled, but precedent and Facebook’s immorality suggest the plausibility of such a result. 

Harms to Society of Facebook’s Business Model

Regardless of consent, Facebook’s business model, as noted above, is not only immoral but also harmful. The business model creates an insatiable appetite for data because the data makes possible the targeting, and therefore the success, of the advertising. The more time users spend on the app, the more data can be collected. The algorithms, which prioritize what users see, are designed to increase time on the app. They therefore lead users to content that will keep them more engaged: content that is extreme in some way. Often that content is sensational and false. The aim in recommending content or other users is not accuracy, truth, or the public good; it is profit. Facebook favors falsity. As Adrienne LaFrance, editor of The Atlantic, has written, Facebook was built to encourage the things that make it so harmful.

Examples of how Facebook’s business model harms society are:

1. Intentionally Spreading Misinformation and Disinformation

Facebook’s algorithms intentionally promote misinformation and disinformation because such content generates more engagement, more data, and more advertising revenue. When President Biden was asked what his message was to social media platforms about Covid-19 disinformation, he said, “They’re killing people.”

2. Prioritizing Profits over Safety

Facebook whistleblower Frances Haugen testified to Congress that “Facebook consistently resolved those conflicts [between its own profits and our safety] in favor of its own profits.”

 

3. Commercial Exploitation of Friendship

Facebook debases friendship by surreptitiously employing the data of a user’s friends to target the user with ads, encouraging extreme voices as friends, and copying all email contacts of users. 

    

4. Harm to Children

Jim Steyer said of Facebook’s Instagram Kids app, “The only thing they care about is hooking kids when they are most vulnerable, keeping them on the platform and getting access to as much of their personal data as possible.” 

5. Harm to Teenagers

Jean M. Twenge, a psychology professor at San Diego State University, has written that “Facebook use causes unhappiness, loneliness, and depression.” Jonathan Haidt, a social psychology professor at NYU’s Stern School of Business, has written, “We have the largest epidemic ever of teen mental health, and there is no other explanation [than social media]. . . . It is a raging public-health epidemic and the kids themselves say Instagram did it.” And Jim Steyer has reported that when he polls his Stanford class, more than half of the students say they wish Facebook didn’t exist.

6. Harms to Users Generally

Studies have shown that more time on Facebook is associated with worse mental health and that taking a break from or deactivating Facebook can improve wellbeing and reduce polarization. 

7. Harm to Country’s Direction

A Pew Research Center survey of U.S. adults found that about two-thirds of Americans (64%) said social media had a mostly negative effect on the country’s direction.  

8. Harm to Public Health 

Nonprofit organizations, many doctors, and the U.S. Surgeon General have said that, by intentionally enabling and spreading misinformation and disinformation, Facebook harms public health.  

9. Voter Manipulation 

Experiments show that Facebook can change voting behavior through explicit or subliminal messages. They strongly suggest that, to promote its business interests, Facebook could use such messages to intentionally affect the voting behavior of its users, as Harvard Law professor Jonathan Zittrain described in his 2014 New Republic article “Facebook Could Decide an Election Without Anyone Ever Finding Out.”

10. Facilitating the Jan. 6 Insurrection

Facebook, by fomenting and facilitating the spread of false narratives about the 2020 election result, was not a mere passive tool, but a catalyst for the insurrection. 

      

Conclusion

Facebook has positive uses, but as Roger McNamee has written, “The time has come to accept that in its current mode of operation Facebook’s flaws outweigh its considerable benefits.” The bad — the immorality and the harms — outweighs the good.

We believe that Exeter’s relationship with Facebook deserves discussion by the Exeter community. Of course, Exeter cannot prevent students’ use of social media, but we believe it can teach them an important moral lesson by eliminating the Facebook icon on the alumni and other webpages and even closing the Exeter Facebook page. These propositions would benefit from discussion, and that discussion would help address the concerns that Principal Rawson expressed about social media. Investigative reporting by The Exonian, together with inquiry by students and faculty, could help prepare for such a discussion. 
