The Ethics of a Potentially Artificial Future

Panel explores implications of Artificial Intelligence

Emma Pollans
The Santa Clara
April 25, 2019

The robot revolution has arrived—but have the ethics? That’s exactly what a panel explored on April 18, discussing issues that surround the fast-growing field of Artificial Intelligence (AI).

The panel consisted of Brian Green, the director of technology ethics at the Markkula Center for Applied Ethics; Maya Ackerman, a professor in the computer engineering department; and Juliana Shihadeh, a senior computer science and engineering major.

The panel was part of the “Ethics in Artificial Intelligence” event, hosted in collaboration with the Association of Computing Machinery, the Association of Computing Machinery Women’s Chapter and the Markkula Center.

Shihadeh began researching how to use machine learning algorithms for medical diagnosis, which eventually led her to question the ethics surrounding AI. Her goal is to increase awareness of the social and ethical implications of the growing presence of AI, since it is a part of everyone’s lives.

“I wanted engineers to become more aware of questions that needed to be asked and considered when developing AI,” Shihadeh said. “Not everyone is involved in developing AI but AI is involved in all of our lives in some form or another. Your opinion about the technology can go a long way in influencing what’s being created.”

The talk began with a brief explanation of AI and the different terms often associated with it. AI refers to a finished product that can act and make decisions on its own, like Apple’s virtual assistant Siri. Other frequently used phrases were machine learning (the process by which researchers train an AI to learn from data) and deep learning (a branch of machine learning in which data is processed through layered neural networks).

Once an AI is built, it becomes hard to understand the decisions it makes. Often, those who use and interact with AIs are working with a “black box” and have no way of seeing how a particular decision was reached. If an AI were to develop biases, it would be difficult to discern where those biases came from in order to correct them.

“AI is extremely misunderstood,” Ackerman said. “The biggest misunderstanding is that AI should try to measure up to [humans]. But AI is already better than us at a lot, and I think recognizing the difference and the diversity can help us understand AI better.”

Another issue with AI involves the question of who is responsible for it. Ackerman is also the cofounder of WaveAI, which created ALYSIA—a songwriting AI that creates original songs. Ackerman built the AI to write songs, something she could not do. However, since she built ALYSIA, Ackerman is able to receive credit for its creations.

On the other end of the spectrum, if an AI were to cause harm, Ackerman feels the creator should not be the one solely responsible for the harm caused. Creators are not able to control how their AI is used or implemented, despite being the ones who initially built them.

“I feel like I’m about as responsible for [the AI] as I am for my kid,” Ackerman said.

Both Ackerman and Green noted that humans are most proud of their creative capabilities and that many people have a utopian view of AI doing all the “garbage work” so humans are free to create. Additionally, Green noted that there are two things humans would want to protect from AI: interpersonal relationships and creativity.

“As we are working with AI, we should make sure that it is not getting in the way between us and other people,” Green said. “Hopefully it is getting out of the way so that we can better interact with each other.”

Green also talked about the work done by the Markkula Center in starting conversations about the ethics surrounding AI. The Markkula Center visits various technology companies in the surrounding area to discuss how AI can be used in the future, how it might be regulated and who might be responsible for it.

Senior computer science major Sam Suri attended the event and believes these talks are important because they help to examine complex issues and produce insight.

“We should have talks on events that may not have a simple answer because we are then at least acknowledging the complexity of the topic and laying infrastructure for further discussions,” Suri said. “If we shy away from the topic out of fear, then how can Santa Clara live up to its goal to make the world a better place through knowledge?”

Contact Emma Pollans at epollans@scu.edu or call (408) 554-4852.
