Ming the Mechanic
The NewsLog of Flemming Funch

Thursday, November 24, 2011

 Blind and Automatic Punishment
I got my first HADOPI warning in an e-mail today. If you didn't know, HADOPI is the French three-strikes law that bans people from using the Internet if they have been caught three times downloading something that somebody has a copyright claim to. Not caught by the police, mind you, not verified by anybody. It just means that a few multi-national media companies have been given the power to send an IP address and a date/time to a certain government agency, and whoever happens to have been using that IP address at that time will be punished. They don't even have to mention what I possibly might have downloaded.
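Just to make the mechanics concrete, here is a rough Python sketch of the kind of blind matching described above. It is purely illustrative; the subscriber names, the allocation log and the lookup are made up, not how the actual HADOPI system works. The point is that the only inputs are an IP address and a timestamp, and nothing in the process ever asks what was downloaded or who was actually using the connection.

```python
# Hypothetical sketch, not the actual HADOPI system: a rights holder submits
# only an IP address and a timestamp, the system looks up whoever held that
# IP at that moment, and a strike is issued. No step verifies what was
# downloaded, or who was actually at the keyboard.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Complaint:
    ip: str                 # all the claimant has to provide...
    seen_at: datetime       # ...is an address and a time

# Assumed ISP allocation log: which subscriber held which IP, and when.
ALLOCATIONS = [
    ("82.120.4.7",
     datetime(2011, 11, 20, 21, 0),
     datetime(2011, 11, 21, 9, 0),
     "subscriber-1138"),
]

strikes = {}  # subscriber id -> number of strikes so far

def process(complaint):
    for ip, start, end, subscriber in ALLOCATIONS:
        if ip == complaint.ip and start <= complaint.seen_at <= end:
            strikes[subscriber] = strikes.get(subscriber, 0) + 1
            if strikes[subscriber] >= 3:
                print(f"{subscriber}: internet access suspended")
            else:
                print(f"{subscriber}: warning {strikes[subscriber]} of 3")
            return
    # If the allocation log is wrong or ambiguous, nothing here will notice.

process(Complaint(ip="82.120.4.7", seen_at=datetime(2011, 11, 20, 22, 30)))
```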

This is one example of a dangerous and growing trend: For people in power to use automated means of catching and punishing people who don't follow their rules.

Politicians are notoriously bad at making rules. As are lawyers, and most anybody who has the job of making rules for everybody. Making good rules that actually work for everybody would require some basic understanding of abstraction and of the limitations of words. Without that, the result is simply a string of thinking fallacies. And when such rules are connected to automated enforcement, very destructive things happen.

The rules (laws) in most countries are considered absolute, unless they can clearly be shown to be ambiguous or in conflict with other laws. By definition, a law is something that is supposed to be exactly as it says, or somebody will be punished. The trouble is that most laws are based on something that makes sense in a very specific context, but they're applied generally, at all times, in all contexts. The lawmaker might really just have been trying to make a statement, to communicate the importance of an idea, but he unfortunately used the medium of law.

If there are humans involved, such as police officers, or judges, or juries, or public opinion, there's a chance that an unfair application of a law gets corrected. If you have a good enough reason, the police officer might let you go. If you explain yourself to the judge, he might see that you did the right thing, despite what the law said. If everybody can see that it is a silly law, it might simply be ignored. Humans process complexity; they can take all sorts of things into consideration at the same time, at many levels, consciously as well as sub-consciously. If the law says one can't spit on the sidewalk, any reasonable person would grant an exception to somebody who's choking on a piece of food. The law assumes a situation where there's no real reason to spit, and somebody does so for some malicious or careless reason, but the law isn't likely to say so. It says that you will be punished if you spit on the sidewalk. Most laws are much too specific in the wrong way.

Most bodies of law are a mishmash of missing context, self-contradictions and exceptions. The practice of law is a mishmash of argumentation and reasoning and decisions that might go in one direction or another. If asked to actually look at it, most anybody would recognize that the words of the law itself aren't enough. At least anybody but the guys who get the clever idea of automatically enforcing laws.

Most people are now familiar with automated speed radars that measure your car's speed, take a picture of your license plate, and send you a ticket in the mail. No humans are involved. If you were measured driving 91 where the sign said 90, you have to pay. Even if you weren't in your car at all, even if there was a good reason for driving that speed, even if the speed limit isn't reasonable. In the town where I live, the 90 km/h limit on the Périphérique circling town was chosen not for safety or traffic flow reasons, but because somebody calculated that gas would be saved if everybody had to drive at most 90.
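To spell out how little such a system knows, here is a minimal sketch of the rule a speed camera enforces. The numbers and the function are my own illustration, not taken from any real radar; the point is simply that a hard threshold is the only thing the machine can see.

```python
# Illustrative only: the entire "judgment" of an automated speed camera,
# reduced to the one comparison it actually makes.
def issue_ticket(measured_kmh, limit_kmh=90):
    # There is no input for who was driving, why they were driving that
    # speed, or whether the limit makes sense on that stretch of road.
    return measured_kmh > limit_kmh

print(issue_ticket(91))  # True: a ticket is mailed to whoever owns the plate
print(issue_ticket(90))  # False
```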

The current examples you see are fairly harmless. But that's only while technology is catching up, and while simple-minded politicians catch up to the idea of what one can do with technology.

Automated drones are increasingly being used in warfare. Actually, most of them are still mostly remote-controlled unmanned aircraft, but that will change as the technology gets better. Imagine high-definition cameras with face recognition, reading of license plates, interpretation of body language, combined with offensive weapons, mounted on small flying drones. The military will be the first to use them, but police forces will very happily use such things as soon as they're allowed to. Just imagine how much easier their work would be in, say, policing the current Occupy protests. Automatic tear-gassing of people who walk on the street when they've been told not to. It isn't particularly far-fetched.

Again, the problem is that most rules are much, much too over-simplified and specific for a complex world. If somebody makes a law that says you're not allowed to talk on a telephone in your car, they'll probably be quite self-satisfied with the reasonableness of such a law. Lots of people will agree and think it is a great law, as they think of lots of people who're distracted while driving and therefore not driving as well. But the law says nothing about that. It doesn't define what is considered a phone, and it doesn't define the actual target: driving as safely as possible. Imagine that it was enforced automatically, that some device in the car would automatically kill any cell phone signal if a call is attempted. Because, again, the lawmaker thought about how much better it would be if drivers weren't distracted. But it would also kill potentially life-saving calls. It might also stop somebody from inventing a service that you could talk to that would help you drive better. It would stop a lot of things that the lawmaker just failed to imagine. If he were presented with the potential exception, he would of course admit that, yes, of course the law isn't meant to stop that. But his law didn't say so. And if we take the humans out of the equation, nobody else will be there to say so.
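A sketch of that imagined in-car blocker makes the problem obvious. This is hypothetical, my own illustration of the device described above; the function encodes exactly what the law says and nothing more, which is why it also blocks the calls the lawmaker never thought about.

```python
# Hypothetical in-car call blocker, encoding only what the law says:
# no phone calls while the car is moving.
def allow_call(vehicle_moving, number_dialed):
    # The number dialed is ignored. There is no branch for an emergency
    # call, or for a call that would actually make the driver safer,
    # because the law never mentioned those.
    return not vehicle_moving

print(allow_call(vehicle_moving=True, number_dialed="112"))  # False: blocked
```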

Overall, it is one of the prime insanities of humankind: the idea that you can take some words, put them together into some sentences, and somehow they'll remain true and appropriate in all possible situations, forever. Forgetting that those words were in the first place merely abstractions of something more real. Maybe the author of the words could clearly see the picture of what he felt those words applied to. But those words don't mean exactly the same thing to everybody else. And if we go ahead and apply those words to all sorts of other situations, very different from what their author was thinking of, they might not fit very well. Which is not such a big deal if we can notice it and talk about it. But if the words have been put on automatic, crazy things can happen. There's a consciousness of abstraction that's largely unknown to the ruling class in most areas.

One of the most central ways that humanity is likely to shoot itself in the foot, or even collectively commit suicide, is by ignoring, denying, or removing complexity. You see it in many fields of human activity: government, religion, even science. One-size-fits-all solutions tend to kill life. One moral code for everybody. One crop to grow for miles and miles. Volumes and volumes of laws that tell everybody what they always must do under all circumstances. While totally overlooking what it is that makes the world work. Life is a continuum, multi-dimensional, multi-level. There is somebody home. Something is aware, something evolves, changes based on circumstances. Humans have found themselves able to create stuff that doesn't change based on the circumstances. There are some advantages to that, and a whole bunch of potentially world-killing dangers.
