Exploring how goodness fits into the future

Students and alumni mentors decode ethics in the Digital Age.

By Jennifer Wagner

July 26, 2019
Students look at the chalkboard in Religion Instructor Peter Vorkink's class.

Have you ever stopped to consider the ethical implications of clicking “like” on an Instagram post? Or whether digital assistants like Alexa and Siri have First Amendment rights to free speech? What about corporate social responsibility? Is that a thing? These are some of the salient questions bubbling up in the world today that, more often than not, are left unexplored, especially by many adolescents.

Religion Department Instructor Peter Vorkink P’95 is working to change all that and help keep Exeter’s curriculum in step with the times. Last winter term, the department offered Religion 597: Silicon Valley Ethics: Case Studies in the World of High Tech, an experimental new course Vorkink designed that required students to think deeply about the complex ways in which technology has altered not only how they live their daily lives, but also their values.

Toby Abelmann ’19

Technology use in schools

Discuss how best to educate today’s youth, considering both the benefits and liabilities of easy access to technology.


Catarina Schwab ’92

Connecting with Silicon Valley

Vorkink knew he had to connect Exeter with the epicenter of innovation, California’s Silicon Valley, in order to make the class authentic and relevant. But how? The idea he landed on was crowdsourcing — ask Exonians currently working in the technology field to take part in the class as co-designers of its syllabus. Vorkink leveraged his Rolodex — populated during his 47 years of teaching — and enlisted alumni from such high-tech giants as Google, Apple, Oracle, Instagram, Facebook and YouTube to collaborate one-on-one with students over the course of the term as mentors and resources in developing case studies. It was a first-of-its-kind experience at the Academy. “I know of no other secondary school offering a course like this, and few colleges, either,” Vorkink says, “and what a wonderful way to connect alums to the life of current students.”

Among the 40 experts Vorkink tapped was Christine Robson Weaver ’99, project lead for Google’s Machine Learning Division. “I’m thrilled that Exeter is diving into such a relevant and timely area of study,” Weaver says. “Right now, in Silicon Valley, and in my role at Google, these issues are top of mind.”

Penny Brant ’20 and Christine Robson Weaver ’99

AI, discrimination and corporate liability

AI-driven decision-making can make our lives simpler and more efficient. However, machines may also learn to discriminate against people based on gender, race and socioeconomic class.


Next-gen problem solvers

Anna Richardson White ’98, brand communications director at Instagram, readily participated as well. “It was an honor to be part of such an important and groundbreaking class,” White says. “I’m pleased that Exeter would create this critical course for the next generation of leaders, creators and problem solvers.”

When Religion 597 hit the Courses of Instruction, Vorkink was overwhelmed with registrants, enough to fill three full sections. Students were drawn to the class’s contemporary practicality and the prospect of learning how to make a difference. “As a person with a computer science background and interest in entering Silicon Valley,” Pavan Garidipuri ’19 says, “I wanted to explore Silicon Valley’s ethical dilemmas so that, in a non sibi spirit, I’ll know how to create products that will help society, not harm it.”

Anna Richardson White ’98

Image recognition

Examine the morality of current uses of image recognition and their potential discriminatory side effects as well as positive ramifications, including helping visually impaired persons.


Gracie Goodwin ’19

Parsing the Digital Age

Over 10 weeks, the students parsed some of the serious questions of the Digital Age. They took internet truth quizzes (Can you spot fake news?) and read the technology section of The New York Times, plus articles in Wired and on Slate. They listened to TED Talks, watched documentaries on extremists and conspiracy theorists, and viewed prank videos on YouTube. They scrolled through Twitter and Instagram (as well as Finsta, or fake Instagram) accounts. They did all of this with an eye toward identifying ethical conundrums and figuring out how those differed from political, legal, sociological, historical or technological issues.

As questions arose, the alums, via email, Skype and cellphone, stepped up to provide context, to frame the issues as constructively as possible, to offer resources for further inquiry and to be solid sounding boards. “This is the first exercise as an alum where I felt clear and direct attachment to a student,” says Quincy Smith ’97. “My mentee shared accountability, we used teamwork and gang tackling on top of Google docs, Sunday calls and a flurry of text messages. It was like Fortnite.edu.”

The term’s efforts culminated in the production of a student-written case study on a single issue. Students chose from a broad list of nearly 70 topics that Vorkink supplied, including body scanners, the surveillance economy, virtual currencies, the gig economy, public shaming and fake news. “Writing the case study was an eye-opening experience,” says Gordon Chi ’19. “I didn’t realize how complex the ethical considerations for self-driving cars could be. I started to question whether the design rationale of our algorithms should be determined by the government of the country where the product is used, or rather simply the programmers that create the self-driving cars.”

Aarsh Kak ’19

Corporate social responsibility (CSR)

CSR is a business model in which companies choose to make a positive impact on society.


John Griffin ’98

“Know thyself”

On high school campuses where many students may not align with a single religion, broadly scoped classes like Religion 597 offer another avenue for understanding what it means, at the deepest level, to be a person. “When I think about teaching religion to adolescents, I think about ferreting out, identifying, naming, discussing and comparing the personal and practical pieties, the value systems by which our students live, that which they hold most dear in their hearts, what’s really real for them, what really matters, what constitutes the center of meaning for them — all of which is encapsulated for me in the term ‘religion,’” Vorkink says.

An Episcopal priest and ardent world traveler, Vorkink has taught philosophy and religion at the Academy since 1972. He graduated from Yale and earned a divinity degree from Union Theological Seminary in New York. When he was 20, he spent a summer in Florida volunteering for Martin Luther King Jr. “Working closely with King showed me a different side of religion, namely, religion as praxis, or the practice of faith,” he says. He went on to study philosophy and religion at Harvard, leaving the doctoral program ABD (without writing his dissertation) to come to Exeter. Then, in 2015, Vorkink completed that dissertation, “Know Thyself: Why and How to Teach Religion and Philosophy to Secondary School Students,” and fulfilled the final requirement for his doctorate.

A key part of Vorkink’s unwritten lesson plan for the Silicon Valley course was to help students figure out how all of their individual choices in this technologically charged world pertain to their philosophical search to “know thyself.” “In a generation where our influences have expanded from our parents to millions of online users around the world, it has become harder for us to determine our own morals and make our own decisions,” Mia Kuromaru ’19 says. “This class helped me realize that every student should understand the importance of their own autonomy.”

Classmate Tina Wang ’19 noticed her habits changing throughout the term. “I’m much more aware of my own technology use now,” she says. “I often find myself pausing to think about the ethical dilemmas behind news stories I see. … This course pushed me, it taught me to question and challenge what we too often blindly accept.”

Vorkink was pleased to hear that what the students were learning in the classroom carried over into their everyday lives: “Probably the most encouraging comment any student made, and I heard it again and again in different contexts during this course, was, ‘You know, I never thought about that.’ That was encouraging in terms of why the course was valuable to offer. … It made them think.”

In a final act of reflection inspired by Jill Lepore’s New Yorker article “What 2018 Looked Like Fifty Years Ago,” Vorkink asked the students to imagine the technologies of 2069. Some prophesied a data-driven society completely devoid of privacy, while others saw the end of livestock farming in favor of lab-grown meats produced by cellular cloning. Many noted an increased reliance on robots to perform low-level jobs and the replacement of traditional “screens” with holograms. Each futuristic prediction was placed in the Academy Library vault as a time capsule, to be opened at the classmates’ 50th reunion.

Editor's note: This article first appeared in the summer 2019 issue of The Exeter Bulletin.