11/19/2017

Reading 12: Self-Driving Cars

What is the motivation for developing and building self-driving cars? What are the arguments for and against self-driving cars? Would they make our roads safer?
The reasons behind the push for self-driving cars range from the belief that self-driving cars will be safer than human-driven ones, to lower transportation costs for those without cars, to the idea that people will be able to rest, work, or play during their commutes rather than focus on driving. On the other side, you have those who are understandably concerned about the viability of such vehicles. Some articles cite the numerous issues we've had in implementing them so far, including situations where the computer's lack of a "human" element results in it not judging another car's actions the way a human would, resulting in a crash. Some are concerned about the technological side of it, worrying about our ability to implement such cars at all, never mind all of the real-life mistakes (see: accidents, deaths) that would have to occur in order to work out the inevitable kinks. There are also liability issues that would surface if such a car killed someone (or multiple people). I personally think that self-driving cars are something we don't need and shouldn't pursue (at least not now). The primary push for them seems to be the interest in making roads safer. On that point, I think it's safe to assume that a majority of auto accidents result from careless driving by one or more parties, and a smaller share from road conditions (unusual construction situations, dangerous weather, etc.). This is something that can be solved through means other than far-fetched technological innovation. Want to get rid of poor drivers? Let's make the test to get a license more difficult and require a more reasonable level of all-around ability. Let's make drivers take the test more often than once in their entire life. Let's continue to push for people not to text or goof off while driving (which we've been doing, and we've seen driving fatality rates fall as a result).
Inventing a complex system of self-driving cars is overkill, unnecessary, and wasteful. And on top of all that, it only presents a possibility of being safer overall. Screw that.

How should programmers address the "social dilemma of autonomous vehicles"? How should an artificial intelligence approach life-and-death situations? Who is liable when an accident happens?

I think the way an AV's (autonomous vehicle's) decision-making occurs is greatly important. It's important to note that this isn't so much a programming problem as an eclectic one, requiring thought and approaches from several disciplines, including ethics and philosophy. The consequences of these discussions, and of how the decisions are ultimately programmed, are great, and it's necessary that they be treated as such. I think it makes the most sense for a vehicle to be more willing to sacrifice itself than to harm someone outside of the AV. If someone recklessly decided that owning an AV was in their best interest, they should be the ones to deal with the negative consequences of that decision, not other people. I feel that's very simple. Liability would likely lie with the company that made and operates the vehicle, provided that no laws or rules were being broken by the owner or the person otherwise harmed.

What do you believe will be the social, economic, and political impact of self-driving cars? What role should the government play in regulating self-driving cars?

It'll be great, I'm sure. Not great as in positive, but definitely impactful, for better or for worse. This makes sense, as we're discussing changing the way our world has operated for the last century: modifying the daily commute and more, an inevitable part of daily life. Companies' stocks will rise and fall depending on the success of their AVs, and politicians' vote counts will be affected by their views on AVs and whether they support or oppose them.
It'll be a big conversation moving forward as long as some people are interested in pursuing it. I feel that the government should definitely be involved in the regulation of these cars, because the implications are huge, and no one but the government has the comprehensive authority to oversee this.

Would you want a self-driving car? Explain why or why not.

Definitely not. I don't even want a car whose systems are connected to a computer, as many cars made in the last few years are. They're easily hackable, and the consequences of such "hackability" are dangerous. I will continue to trust my own ability to make proper and correct decisions on the road, and will likely refuse to put more trust in another computer or person as long as I live.
Author: Nikolas Dean Brooks is a current Senior at Notre Dame. This blog is for the "Ethics and Professional Issues" course under Dr. Peter Bui.