Engineering Ethical Students
Engineering ethics is concerned with the study of ethical issues involved in engineering practice. It explores the engineer's role in technical decision-making within organizations, and considers the engineer's relationship to the uses of technology in society.
The Bovay Program in the History and Ethics of Engineering seeks to be a catalyst for consideration of social and ethical issues in the Cornell College of Engineering. Dr. Park Doing, lecturer in electrical and computer engineering, guides the program to introduce ethical concepts to engineering students using real world case studies and current topics.
“Usually these issues come up when things go wrong,” said Doing. Examples include the recent Boeing 737 Max crashes, along with historical catastrophes such as the loss of the space shuttles Challenger and Columbia and the accidents at the Three Mile Island and Chernobyl nuclear plants. “But we also have another mode,” he continued. “What happens when things go right?”
Engineers are tasked with developing technologies, techniques and devices to be taken up and used by the world. Even when functioning as designed, these technologies have social and ethical implications to consider. “How engineers deliberate these questions is definitely worth exploring,” said Doing.
Recently, civil liberties groups as well as the general public have raised concerns about bias in algorithms and data sets, especially those used in corporate hiring practices, medical diagnoses, and law enforcement procedures.
“Algorithms can identify subtleties and correlations within data that a person might not see,” Doing said. “The use of an algorithm in some kind of screening process might well mitigate the bias of any individual human being who is performing the screening.”
For example, a human screener of job applicants might exhibit biased behavior in dismissing resumes from graduates of certain universities or any number of other prejudices. An algorithm could identify well-suited candidates by highlighting characteristics linked to successful employees already working for the company. “It's possible that helps the underdog,” Doing said. “It's not automatic, but it's possible.”
“Of course people are very concerned with biases that are built into the data sets that algorithms use,” he continued, “as well as the lack of transparency in the neural net-based decision making process.”
Bias in algorithms, or in the data sets they are trained on, can be especially consequential in security and law enforcement. Tools like facial recognition, deployed with the best of intentions, can nevertheless provoke concern and even outrage from the public as their biases are revealed.
When tested by the National Institute of Standards and Technology, facial-recognition algorithms from 99 developers falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces in a database of photos used by law enforcement agencies in the U.S.
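The disparity NIST measured is, at its core, a difference in false-positive rates across demographic groups. A minimal sketch of such an audit is shown below, using made-up synthetic records rather than NIST's actual methodology or data:

```python
# Minimal, hypothetical sketch of a bias audit: compare a matcher's
# false-positive rates across demographic groups. Synthetic data only;
# this is NOT the NIST FRVT methodology.
from collections import defaultdict

def false_positive_rate_by_group(records):
    """records: iterable of (group, predicted_match, true_match) tuples."""
    false_pos = defaultdict(int)   # wrongly flagged as a match, per group
    negatives = defaultdict(int)   # true non-matches seen, per group
    for group, predicted, actual in records:
        if not actual:             # only true non-matches can be false positives
            negatives[group] += 1
            if predicted:
                false_pos[group] += 1
    return {g: false_pos[g] / negatives[g] for g in negatives}

# Synthetic audit data: the matcher wrongly flags group B far more often.
records = (
    [("A", False, False)] * 98 + [("A", True, False)] * 2 +   # 2% FPR
    [("B", False, False)] * 80 + [("B", True, False)] * 20    # 20% FPR
)
rates = false_positive_rate_by_group(records)
print(rates)  # group B's false-positive rate is 10x group A's
```

An audit like this only reveals a disparity; deciding what error-rate gap is acceptable, and for which uses, is exactly the kind of value judgment the article describes.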
ECE Professor Stephen Wicker, who writes and teaches about data security and privacy, describes the ethical complexity of facial-recognition algorithms. “If we're capturing a face and comparing it to a database of known faces for identity—Which person is this? Do we have a match?—that's one thing,” said Wicker. “But if we want facial recognition to figure out whether a crime is likely to be under way, and it's in a public situation, how are we going to write this algorithm? Are we going to focus on people looking nervous, or hiding things? Are we going to look at particular types of people congregating? Then it becomes problematic.”
Both Wicker and Doing want their students to consider the downstream implications of their design decisions. What is the role of engineering in society and whom do we hold accountable when malevolent effects emerge?
“Most engineers would say: I just build stuff—how people use it is a different realm,” Doing said. “I think it's an open question. Can you really absolve yourself of all responsibility?”
Wicker argues that we cannot. “We accept the plaudits and remuneration for our work—we must also accept the responsibility for subsequent social and legal problems.”
Engineers must challenge their assumptions at every stage of development. Independent review panels within organizations could analyze software for potential bias before deployment, and government regulation could help promote transparency and give people more control over their personal data.
“The Accreditation Board for Engineering and Technology (ABET) has recognized that ethics is an important element of engineering education,” Wicker said, “and we build this into our curriculum.” The goal is to build ethics into engineering solutions as a fundamental ingredient, and not an add-on.
“It's OK to apply value judgment,” Wicker reminds his engineering students, “because we each have an innate ethical sense. The important thing is to pay attention to that sense and to continue to hone it over the course of your career.”
This article originally appeared in the 2019/2020 issue of ECE Connections.