I was talking with a friend of mine recently about the future of "full self-driving" capability of Tesla cars (though the same applies to other manufacturers). He hates technology. And so it follows, he absolutely hates the idea of a computer being in control of the car. (Never mind that he was driving my car, which is controlled by computers, or that he flies on jet airplanes.)
Part of what made the discussion interesting is that we agree on almost everything regarding the technology itself. Where we disagree is its application.
One of his arguments is that automation makes people stupid, because they lose any motivation to learn how to do it the "hard" way. Though I agree, I don't see this as a bad thing. I think it frees up the mind to do other things. Perhaps great things.
Another one of his arguments is that you're not in control of the car when a computer is driving. I pointed out that nobody has ever been in control, since you have always had to rely on some kind of mechanism, whether it's mechanical, electronic, or software. That was as far as the conversation got.
But I thought about it some more. Traffic accidents are a major cause of death and injury, which makes this a genuinely serious subject. If full self-driving cars are safer than human drivers, then their adoption will save lives. That seems far more important than one person's irrational bias against technology in general.
If the technology indeed makes cars safer, then the sooner it is deployed, the more lives will be saved. Anything we can do to speed up the release and adoption of self-driving technology would be a worthy cause, regardless of whether it comes from Tesla, Google, or anybody else.
So then I wonder: would buying the $3000 upgrade to full self-driving help speed up its progress? My guess is maybe. If so, then perhaps there is a moral imperative to buy it, even though you receive nothing in the short term and no guarantees for the long term.