Technology dictates and controls our lives. With the advent of self-driving cars, it also controls our deaths.
Last year, a jaywalking pedestrian was killed by an autonomous Uber car while its supervising driver watched “The Voice.” Last week, it was revealed that the issue stemmed from the vehicle’s developers failing to account for jaywalkers; as a result, the software misclassified the pedestrian as a cyclist and did not recognize her until it was too late.
Surely jaywalkers don’t deserve death. So why did Uber programmers get to decide through their ignorance that this person’s time had come?
This is the problem with the speed at which technological progress is moving. We are in an unknown and unregulated time. Advances are being made faster than policymakers can keep up. These software developers and project leaders are not bad people, but they can’t be expected to know every externality they need to account for.
That is why we need to stress cooperation between disciplines, especially in technology. When too many people on a project come from the same perspective or background, the risk of tunnel vision and missed flaws increases.
For example, this autonomous Uber accident was easily preventable. If an urban planner were on the team, they could have flagged the team’s lack of understanding of how people in cities actually live and move. If a policymaker or criminal justice expert were consulted, they could have probed for holes in the framework. If an ethicist were on board, they could have raised concerns about releasing this product before it had undergone enough testing and consideration.
Maybe these consultations did happen, and this fault still slipped through the cracks. However, given Uber’s track record of prioritizing its own growth over consideration for others, I doubt the proper precautions were taken. And now a person is dead because of it.
We’re seeing this time and time again with big tech companies. YouTube is embroiled in investigations over its potentially predatory advertising practices. Facebook has shown it has little concern for protecting its own users. The list goes on. Maybe these companies are salvageable, but they clearly exhibit a pervasive problem in how companies and our culture view technological development.
There needs to be more oversight, plain and simple. If this can’t come from governments, then it needs to come from internal restructuring that prioritizes best practices over growth. The leaders of these companies need to take a step back and think about how their practices affect society. They need to adjust. And if they can’t, their workers need to do it for them. We can’t have a legion of software developers build out so much of our society alone; they’re going to fail. They are failing, not because they’re stupid or evil, but because diversity of perspective matters for projects of this scale.
The best products do not come from profit motive alone. They come because people see problems and work together to solve them. It’s inspiring to see some of the ways that technology has allowed us to achieve feats in the humanities and the arts. It’s not at all inspiring to see that technology has made a rotten company like Uber a lot of money by disrespecting communities and putting out dangerous products.
This is a paradigm shift we need to make. People inside and outside of the tech world need to think about how we can work together to build safe, equitable and well-studied systems. Computers are not just another subject to study; they are now the backbone of so much of our society. We cannot just consider technology a tool to be worked with by a specific group of people; we need to teach everyone how they can safely and conscientiously use tech to better our world.
Disclaimer: The views and opinions expressed by individual writers in the opinion section do not reflect the views and opinions of The Daily Campus or other staff members. Only articles labeled “Editorial” are the official opinions of The Daily Campus.
Peter Fenteany is the associate opinion editor for The Daily Campus. He can be reached via email at email@example.com.