It's a pretty thing and very efficient for what it is, but this Buick likes to get in the way of my driving process.
I'm all about technology assisting a process. I'm happy to use the rear-view camera to make centimeter-perfect parking, but there is a big difference between interfering and assisting. When you're backing a car up and it starts bleeping at you about impending impacts that aren't happening, it isn't helping; it's introducing a false, interrupting signal into your process. When your car aims its mirrors at the ground and then leaves them there, preventing you from using them to assess incoming threats, they are a hazard rather than a help.
This 'we'll do it for you' technology sets all sorts of dangerous precedents.
Kia's ad for this kind of technology doesn't make me think, gee, I need a Kia so that when I'm operating a two-ton vehicle like a clueless git it'll save me from myself! It does suggest that there should be far fewer people with valid licenses on the road. Driver-intervention tools like this muddy the line between what we expect of driver competence and what we trust the technology to take care of. How often do educational technologies do the same thing in the classroom?
Self-driving cars are on the horizon. For many people this will be a great relief. Those who hate driving and do it poorly will be better off for it, and so will the rest of us once they are no longer operating a vehicle. I have no doubt that for the vast majority self-driving cars will drastically reduce accidents, but they also mean that those of us who are willing and capable lose the chance to learn how to do something well. The fact that I can toss pretty much anything into a parallel parking spot (I did it in a van... in Japan... with the steering on the wrong side) is a point of pride and a skill that took me years to develop. If machines end up doing all the difficult things for us, what's left for us to do well? If machines end up demonstrating our learning for us, what's left for us to learn?
Based on what I've seen recently, I'm more worried that machines will unbalance and panic us while they are taking care of us. I don't look forward to that future at all. Perhaps clueless, bad drivers won't notice any of this and will carry on as they do now, minus the part where they actually control the car. Perhaps poor learners will happily let AI write their papers and answer their math quizzes, and never have any idea whether what it's doing for them is right or not.
I often frustrate people by second-guessing GPS. Mainly it's because I know how hokey the software that runs it is, so I doubt what it's telling me. When GPS steers me up a dead-end road, I'm not surprised. Maybe I'll feel better about it when an advanced AI is writing the software and it isn't full of human programming errors. When that happens, maybe it won't matter how useless the people are. There's a thought.
I'm a big fan of technology that supports human action, but it should be used to improve performance, not to reduce effort and expectation. It should especially not damage my ability to operate a vehicle effectively. The same might be said for educational technology. If it's assisting me in becoming a better learner, then I'm all for it, but if it's replacing me as a learner, or worse, interfering with my ability to learn, then the future is bleak indeed.