Ramesh Srinivasan | Educating and Protecting Our Future
13.12.2019
Are doctors and engineers more similar than different? Natasha Singer connected these two professions in a 2018 New York Times story, stating: “The medical profession has an ethic: First, do no harm. But Silicon Valley has a contrasting ethos: Build it first and ask for forgiveness later.”
Excerpted and adapted from “Beyond the Valley: How Innovators around the World are Overcoming Inequality and Creating the Technologies of Tomorrow” by Ramesh Srinivasan (The MIT Press, 2019).
Her story, “Tech’s Ethical Dark Side,” focused on how top engineering universities in the United States, such as MIT, Harvard, Stanford, and New York University (NYU), are now trying to bring a “medicine-like morality to computer science.” Singer goes on to discuss a wave of new class offerings at universities around the country bringing social, political, and cultural issues into a much-needed dialogue with engineering and design. Her point is that a new “human-centered” approach toward technology can and should start with what we learn in our schools and universities.
Building on this, it’s time to consider the relationship between technology and our planet, given that the “planned” shelf life of many of the devices we use is only a tiny fraction of the time they will remain on Earth. For example, our iPhones, which we usually keep for only a couple of years, might vanish from our sights and minds when we trade them in for new ones, but in reality they might be just at the beginning of their lives on Earth. They might travel to landfills and cause environmental harm. Or they might move to other parts of our cities, countries, or world—and once there, be repaired and distributed to those who lack the resources to buy into the corporate policy of “planned obsolescence,” a strategy that has made Apple so incredibly wealthy. This is why legislation that would grant everyone the “right to repair” is an important topic now debated in many parts of the world.
Currently in the United States, though, this right does not exist. Consider the story of Eric Lundgren, an entrepreneur from the Los Angeles area who is obsessed with recycling. According to a report from the Los Angeles Times, he built an electric car out of recycled parts that drove farther than a Tesla, and has created a recycling facility for electronics that processes 41 million pounds of e-waste each year. He counts IBM, Motorola, and Sprint among his clients. He has done pro bono work to clean up the e-waste that has accumulated in Ghana and China, and donated recycled cell phones to US soldiers overseas.
Lundgren is an iconic figure of selflessness and civic engagement, right? Nope. For his recycling efforts, he is considered a criminal and is going to jail. He should be a hero for helping our society and world, lessening the environmental impact of the electronic devices we throw away while creating a sustainable business that provides jobs. He thought he could get more affordable computers into people’s hands while contributing to a second-hand industry in digital technology devices. But because doing good for the community sometimes clashes with the business interests of tech companies, Lundgren has instead earned himself a powerful enemy: Microsoft. Why has the company targeted him for retribution? Because in some cases, turning the junk he recycled into workable technology that supports its users has required him to install Microsoft Windows on the personal computers he has salvaged. In fact, Microsoft and the government worked together to put him in prison for fifteen months. The assistant US attorney on the case told him explicitly, “Microsoft wants your head on a platter and I’m going to give it to them.”
Lundgren should be feted as a role model for leveraging the recycler’s ethic (recycle, reduce, reuse) to contribute to the common good. His story parallels the examples I’ve shared from Africa, where repair and recycling are necessities, not luxuries. Entrepreneurs like Lundgren can transform the “build to die” model that’s made computer and phone manufacturers so wealthy into a different kind of business focused on employing people, helping the environment, and providing consumers in the second-hand market with an alternative, one that will likely complement rather than compete with the current consumer market for new technology.
Repair is just one of the many social or environmental themes that have been overlooked by a traditional education in engineering or computer science. Schools and universities will need to open up and revise their curricula, incorporating new themes and subjects of study if they wish to educate students for a digital future that is inclusive, sustainable, and collaborative.
Another unfortunate legacy is that most education systems treat the science and engineering fields as separate from the humanities and social sciences. This is why we rarely see courses in which code writing or software design is taught alongside material that “understands” the places where the software will “work.”
As we begin to see cultural or social topics being taught in conjunction with engineering, I suspect that we will also see engineers who are better equipped to think deeply about the world they are transforming with the systems they design. New jobs can be created for those who can translate across technical and ethical domains as technologies are developed and rolled out. In a smattering of new offerings and initiatives in the United States, this process is just beginning: the prestigious Association for Computing Machinery has released a code of ethics, although it’s uncertain how it will be interpreted and taught across the world. A newly released list of computer science ethics classes taught at dozens of universities around the world reveals new course titles, such as “Race and Gender in Silicon Valley” or “Ethics in Video Games.”
Sure, our education is supposed to prepare us to work, to enter the job market. But it is also supposed to prepare us to be creative, reflective, deliberative humans. Education that discourages reflection and criticism treats people as tools, not as humans with social, ethical, and creative needs. Bringing different disciplines together in our schools, for example by marrying the sciences and the arts or by pairing engineering with social sciences, will not only prepare us for the jobs of the future but will also be ethically and intellectually enriching to us as human beings. Science, technology, engineering, and mathematics (STEM) education need not stand on its own. It’s unrealistic to think of science or technology as a given—as some sort of airtight study of “what is”—without recognizing how deeply philosophy, ethics, human behavior, politics, and the arts influence each of these fields.
Mitchell Baker, the executive chairwoman of the Mozilla Foundation, echoed these concerns recently. She believes that “we are intentionally building the next generation of technologists who have not even the framework or the education or vocabulary to think about the relationship of STEM … to society or humans or life.” She warns that as users (who number in the billions) become complacent, blindly following what technologies tell us to do, we lose our ability to ask fundamental questions like “Who does this serve?” or “How might we apply this technical knowledge in different manners?”
What about design, which is often seen only as a way to make something look pleasing or to make it “usable”? From this limited perspective, we give designers all the power, leaving us none. But this is not the only way we can think about design or engineering. Despite the “lone genius” myths we tend to circulate, great scientists (like Newton) or split-brain artist-engineers (like Leonardo da Vinci) didn’t work in a vacuum; their technical and artistic expertise evolved in response to (and shaped) the societal visions of their times. Design is also a process that can be imaginative and speculative. For example, what if supporting user autonomy were a design principle in itself?
These values—design as process, design as communication, even design as humility—can be guiding lights for how we build technology for the future. Tim Wu, a legal scholar and well-known author, recently applied the term toxic design to the most popular internet technologies and social media sites (Facebook in particular). Toxic design encourages unhealthy behavior, giving us incentives to act in ways that are counter to our best interests as human beings; it exploits our weaknesses and plays to our instinctive selves, subject to the twin forces of punishment and reward. These forces, which are at work in our bodies at all times, come from the way our central nervous system normally functions. The dopamine, adrenaline, and other neurotransmitter rushes that get us geared up to pay attention or respond can be hijacked. Toxic design (and toxic technology) works the same way.
Online recommendation systems are an obvious example of the problem: they can either expose us to extreme content to keep our attention, or they can enhance our addiction to affirmation, training us to obsessively log onto Instagram or Facebook in search of more likes, comments, or shares. It’s not that different from “rubbernecking” when there’s a car accident on the highway, or from bingeing on junk food. Both acts release dopamine because the brain believes that information related to danger and food is important to our survival. In the long term, though, such obsessions, attractions, and addictions can do us harm.
Good design can also be a source of empowerment, a way of delivering value to everyone. What we need to do, argues Wu, is escape from the “false loops” that make our experiences online feel perpetually incomplete and drive our endless need to check in “just one more time,” whether scrolling through our Facebook feeds or clicking on a YouTube “Up next” auto-recommendation. Google, meanwhile, follows us all over the internet, only to lure us back to its site through targeted advertising so we can sign on to additional Google products and services. Instead, Wu asks, “can social media be like, here’s the stuff you check up on, and then it ends?”
Related to design ethics is the potential of digital literacy. The term might seem self-explanatory, something like “ensuring that everyone knows how to use the existing technology.” But, in reality, it is far subtler and more important. Literacy, in its traditional definition, isn’t just the ability to read or write—it’s actually about the capacity to reflect, analyze, and create. It’s about taking a newspaper or magazine article, a book, even a fictional story and reflecting on what’s behind it: who wrote it, what their assumptions were, what world they were a part of, what other information there might be on a similar subject. After all, children who can sound out the words in a book still aren’t quite “reading” if they can’t understand what the characters are doing or why. It’s even more powerful to grasp the meaning of the story—why someone would tell it, or what cultural significance it has.
What about digital literacy? It should be about learning both how to use a technology and how to reflect upon it. But we’ve come to see how easy it is to blindly trust the information that finds us rather than critically reflect on what we see. For example, assuming that Google search results are neutral, trustworthy, and “all we need to know” has become a popular strategy for trying to stay afloat in a tidal wave of digital information.
This is a huge problem for two interrelated reasons: first, because the information pushed onto our phones, social media feeds, or search results does not reflect some universal truth; and second, because the information it does give us is based on computational choices that support private corporate goals rather than our own. In other words, it’s not as if Google might return the wrong answer to our search by mistake; rather, the answers suit whichever corporation is feeding us this information. Truth-value, social value, and our individual preferences aren’t being attacked; they simply don’t matter much. Our experience of the “open universe of information”—the way the internet was supposed to be—has been clouded by algorithmic goggles that filter the near-infinite possibilities available on the internet into results that masquerade as truth or knowledge.
In our secondary schools and universities, it will be important to teach the building blocks of digital literacy: how different platforms are built, the basic concepts of computing even if we don’t wade into code, and real glimpses into what is (or is not) happening behind the scenes when we use a system. Digital literacy is just the doorway, then, to other literacies that we must wrap our heads around to ensure that technology serves all of our best interests. As we step through that door we can develop algorithmic literacy (understanding bias in AI systems, or how a search engine works), data literacy (how, when, and where data is collected, how it is aggregated and retained, by whom, and with what effects), and political and economic literacy (which technologies are owned by whom, which industries are shaped by technology and in what ways, how technologies shape public and political life, and the relationships between corporate and public/political interests).
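To make “algorithmic literacy” a little more concrete, here is a minimal, purely hypothetical sketch in Python. The item names, scores, and scoring functions are all invented for illustration; this is not any real platform’s code. It simply shows the kind of behind-the-scenes choice at stake: the same pool of items, ranked by two different objectives, produces two different feeds.

```python
# Hypothetical illustration of "algorithmic literacy": the objective a system
# optimizes for decides what reaches us first. All items and scores are invented.

from dataclasses import dataclass


@dataclass
class Item:
    title: str
    relevance: float   # how well the item matches what the reader asked for (0-1)
    engagement: float  # how likely the item is to be clicked or shared (0-1)


ITEMS = [
    Item("Calm, factual explainer", relevance=0.9, engagement=0.3),
    Item("Sensational hot take", relevance=0.4, engagement=0.9),
    Item("Niche but useful how-to", relevance=0.8, engagement=0.2),
]


def rank(items, score):
    """Return the items ordered by the given scoring function, best first."""
    return sorted(items, key=score, reverse=True)


if __name__ == "__main__":
    print("Ranked by relevance to the reader:")
    for item in rank(ITEMS, score=lambda i: i.relevance):
        print("  ", item.title)

    print("Ranked by predicted engagement (attention kept on the platform):")
    for item in rank(ITEMS, score=lambda i: i.engagement):
        print("  ", item.title)
```

The point of the sketch is not the code itself but the design choice it exposes: whoever picks the scoring function (relevance to the reader, or engagement for the platform) decides what we see first.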