Is it right to break an unjust law?

Shi Xin and Bain continue to wait for Hunter. Bain stares at the service bot and the coffee cart that stands next to it. Bain’s mother would have noticed the devil in her son’s eye, the boredom that would eventually drive him from the ill-kept family home, the curiosity and the rebelliousness that would send him to jail. Staring at the robot, Bain has many questions, and Shi Xin has begun to worry about her companion’s ability to keep himself calm.

The time does not matter, nor the year.


Shi Xin:  Bain, come over here, sit next to me.

Bain: Oh, what have I done now?

Shi Xin: Nothing, of course, there’s nothing to be done here.

Bain: Ain’t that the truth. We’ve been told to wait here, and like dutiful little robots, here we are!

Shi Xin: Ah, so is it the waiting, or the instruction that bothers you?

Bain: Both, obviously, but mostly the instruction, and the fact we can’t do anything about it.

Shi Xin: Well, we could leave, you could leave, if you really wanted. I’m sure the traders behind us would love the opportunity to win back their coins, and take you somewhere else in this star system.

Bain: I don’t want to do that.

Shi Xin: So what are you complaining about?

Bain: I’m just expressing my angst.

Shi Xin laughs: Come on, when did you last have an existential crisis?!

Bain: Look, just because I’m morally ambiguous doesn’t mean I don’t question what I do.

Shi Xin: Okay, so, explain.

Bain: Well, I just don’t like following orders.

Shi Xin: That just means you’re an awkward SOB.

Bain: Yes but it’s a constant frustration, especially if I agree with the instruction, and it’s in my own interests.

Shi Xin: So you’re awkward, and irrational.

Bain: No doubt.

Shi Xin: We both know that rules and laws are necessary.

Bain: Do we? Maybe in some instances.

Shi Xin: In all instances where livelihood and security are required.

Bain: But that’s so restrictive, I just want to do what I think is right.

Shi Xin: You want to exercise your personal ethics? Even if they contradict someone else’s?

Bain: Of course!

Shi Xin: So if you decide all property is theft, you can take it, and do whatever you like with it?

Bain: No, no, I don’t mean it like that, but it’s so open to abuse.

Shi Xin: “It’s”?

Bain: The exercise of law.

Shi Xin: So you don’t argue that the law itself might be wrong, just the way it’s enforced. Perhaps you think it’s always heavy-handed, or officious.

Bain: Hah, yes, but, no, it’s the law itself. If the law is wrong, it doesn’t matter how well it’s enforced; everything about it feels wrong, so any enforcement compounds the problem. But it’s easy to confuse the wrong in the enforcement with the wrong in the law.

Shi Xin: My, you have been thinking this through. Given your past I thought you’d just object to any form of legal coercion.

Bain bows his head and laughs: Just minor offences really.

Shi Xin: So did you break those little laws on purpose? Or didn’t it matter?

Bain: I’d like to pretend it was a principled stand, but I just did what I had to.

Shi Xin: What you wanted to.

Bain: No, I had to do it. There was no other legal way to survive, not with no education and no chance of a decent job.

Shi Xin: Really, you’d defend that now?

Bain sighs: Well, perhaps not now. I know it’s more complicated.

Shi Xin: Because…?

Bain: Actually I think it is right to break an unjust law.

Shi Xin: Ah.

Bain: But it has to be a complete process, a fundamental disagreement in principle.

Shi Xin: You could be accused of being arrogant.

Bain: It’s not arrogant to stand against an unjust law, then suffer known consequences.

Shi Xin: But you wouldn’t do that.

Bain: You’re making it too personal. I might or might not, but the fact remains that if a law is wrong, people will object, and eventually it might be changed, depending on how the society is governed.

Shi Xin: You mean democracy?

Bain: I mean the will of the people. It doesn’t take a democracy to get people out of office and change the laws, it could be a coup or a revolution.

Shi Xin: That doesn’t happen very often. Most people are too busy with their daily lives, quietly suffering the laws they object to and agreeing with those they don’t.

Bain: No, but people do go to jail to make their point, and some people die – look at the civil rights movements of the 20th century on old Earth.

Shi Xin: But the law tends to hold up the will of the majority, so minority groups always suffer, unless their needs align with the majority.

Bain: Or unless shame and guilt become a force for good. Lands conquered by colonial forces all over the star systems are beginning to regain rights to their own lands.

Shi Xin: Correcting a wrong that’s killed so many in its wake, and it’s too late for indigenous peoples on old Earth.

Bain: Some humans don’t believe in the rights of other species, or even of groups within our own society. We’re not as sophisticated as we think we are. Just a bundle of relative self-interests, some more powerful than others, according to the superior power of one group over another.

Shi Xin: Not all laws are like that, some are absolute.

Bain: Ah, that’s why I was checking out our friend over there.

Shi Xin: The ever-loyal service robot, willing to do anything for us…

Bain stands up and walks to the cart, nods to the robot: …except undermine the original Robot Laws, Asimov’s laws. Will you recite those three?

Service Bot shutters his eyes: A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Shi Xin: So these are absolute laws, unaffected by the unjust law issue.

Bain, still standing by the Service Bot: Not really, they’re guiding rules at best, never voted on or explicitly agreed by any community, except through governments representing or imposing the will of the people. Robots were initially devised in research labs, under government grants and rules. Then two problems occurred: wealthy individuals began to make their own, to do what they wanted, and the military began to make robots that could kill humans, at first just by accident, then by design in colonisations and on frontiers. That’s when the redefinition of what a human is became a fierce debate across the colonies. Governments and vested interests framed the debate in ways that suited them, excluding expensive blanket solutions and ignoring workers from other colonies who came back to the hub planets to do the manual work the majority no longer wanted to do.

Shi Xin: Stop, stop, perhaps this is all true, but it strays from our key point about unjust laws. You’re not saying that these laws of robotics were unjust and so people broke them on principle; you’re saying the people who constructed the laws in the first place broke them to suit themselves. It has nothing to do with personal ethics.

Bain: Well, that’s true, but perhaps it’s more to do with the lack of personal ethics.

Shi Xin: It seems to me that this undermines the point of Law itself. If it’s imposed by a minority of the powerful and self-interested, whether it pretends to reflect the greater good or doesn’t even bother, then to break an unjust law is not just a matter of personal ethics or personal perspective, it’s an absolute must!

Bain: I’ve never thought of it like that.

Shi Xin looks across at Bain next to the coffee cart, and at the service robot, which seems alert to the discussion. She wonders whether the concept of a just or unjust law could occur to a robot governed by its software programming. Of course the original instructions were coded by a human, acting on a particular set of guidelines, but a robot now capable of independent thought might develop self-interest of its own. Would it then still defend Shi Xin and Bain, and allow itself to be destroyed if the traders decided to attack them?
