California Lutheran University's Student Newspaper Since 1961

The Echo


    Cars: Moral Machines?

    A self-driving car plows towards a crosswalk full of elderly people. The car can either hit them or swerve onto a sidewalk full of children. What should the self-driving car do?

    Iyad Rahwan's Scalable Cooperation group at the Massachusetts Institute of Technology explores this question with the Moral Machine. According to the Moral Machine website, the program was created to show how autonomous vehicles make these decisions and to gather a crowd-sourced picture of human opinion on them.

    The Moral Machine asks users to weigh the lives of children, senior citizens, athletes and even dogs in a very calculated way. It shows how one solution, such as swerving when people are in front of the car, is not always the best course of action.

    But, would the cars be able to recognize and prioritize a situation where a dog, a senior citizen and a child are all at risk?

    Craig Reinhart, Ph.D., associate professor of computer science at California Lutheran University, said autonomous vehicles have the ability to "see" through sonar, radar or GPS technology.

    According to Reinhart, if a sensor detects something on the road, it will trigger an action coded into the machine. If there is no code that responds to a stimulus, the system cannot respond because it doesn't have the intelligence to make one up.

    Reinhart said the machine must rely on the data from its sensors. It cannot override its programming to make "moral decisions" when a coded path wasn't triggered, or when it fails to execute a commanded path. If it did, the machine would not be morally correct; it would be broken.
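    To make that point concrete, the sketch below is a minimal, purely illustrative rule-based control loop written in Python. The stimulus names, the coded_responses table and the default_emergency_brake fallback are hypothetical and are not taken from any real vehicle software; the sketch only shows that the car executes whatever action was programmed for a recognized input and has no way to invent a response to anything else.

        # Hypothetical, simplified illustration of a rule-based control loop.
        # The stimuli and actions below are invented for illustration only.

        coded_responses = {
            "pedestrian_in_crosswalk": "brake_hard",
            "vehicle_stopped_ahead":   "brake_and_hold_lane",
            "obstacle_in_lane":        "slow_and_steer_around",
        }

        def respond(detected_stimulus: str) -> str:
            """Return the pre-programmed action for a recognized stimulus.

            If the programmers never coded a response for this input, the
            system cannot make one up; it can only fall back to a default.
            """
            return coded_responses.get(detected_stimulus, "default_emergency_brake")

        # A recognized input triggers the action its programmers chose in advance.
        print(respond("pedestrian_in_crosswalk"))        # brake_hard

        # An input nobody anticipated gets no "moral decision," only the default.
        print(respond("dog_senior_and_child_at_risk"))   # default_emergency_brake

    In this kind of design, any apparent "choice" the car makes is simply a lookup of decisions its programmers made ahead of time, which is exactly the distinction Reinhart draws.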

    One may assume that these cars act as we would, or even better than we do. In doing so, one fails to acknowledge that the machine has never been capable of making moral decisions.

    "When it comes to that computer part, it's basically parlor tricks. There's no intelligence in there," Reinhart said. "It's not people making these decisions, it's people programming them to make these decisions."

    Perhaps what is ultimately holding technology back is the belief that we will one day be able to give the machines moral capacity.

    While autonomous technology can recognize a situation where a dog, a senior citizen and a child are all at risk, we shouldn't expect it to have the morality to prioritize those lives on its own. The only moral machines are people themselves.

    Brandy Alonzo-Mayland
    Reporter