
Dialogues | Are Humans Necessary?

Hunter and Bain have just settled back into the cafe. The air of the jungle around them is thick and oppressive. Bain finds breathing difficult, and wonders if the iced tea they’ve ordered will make any difference when finally it arrives. Even the service bots clearing the paths either side of the cafe seem to move more slowly than usual, and the humans fanning themselves with giant fronds amble around the clearing, seeking shade. A sudden shriek slices the air and the clearing becomes a frenzy of panic. Hunter raises his eyes to see Bain leap into the air and run across to a stumbling service bot on the edge of the clearing. A small child lies flat underneath the bot’s heavy frame, shaking and bawling. Hunter sees a huge tree broken across the back of the bot.

The time does not matter, nor the year.


Bain wanders back, languid again: That was close.

Hunter: I couldn’t see what happened.

Bain: A robot miscalculated the felling of a tree and nearly killed one of the little boys.

Hunter: So the other robot saved the boy?

Bain: Fantastic luck.

Hunter: Or a calculated risk.

Bain: What?

Hunter: Why do you assume it was misfortune that caused the tree to fall that way?

Bain: Uh, well, that happens!

Hunter: To humans.

Bain: Are you trying to tell me that robots don’t ever make mistakes?

Hunter: Well, with all the information to hand, and generally where no judgements are required, absolutely.

Bain: So you think the robots planned that?

Hunter: Well, it does look more like the execution of a contingency plan.

Bain: I’m not sure I believe that.

Hunter: Think about it. Robots are logical. Humans are not. Humans and robots working or co-existing have to adapt to each other’s methods. Especially if humans are in control, and in charge of the planning, but the robots are actually more capable of carrying out the task.

Bain: Uh, so you think the robots knew a child would get in the way of a tree as it was felled.

Hunter: I think the robots would have planned for the possibility. In this time and place they’ve just surpassed human speeds of random processing. They might even have developed the organic neural networking necessary for exponential processing. We’ve both seen how far that can go, on other planets.

Bain: I see what you mean. That’s disturbing if it’s true though, the child could have been killed.

Hunter: Highly unlikely, and the robots would have made that calculation of risk: felling that particular tree, tracking the erratic movements of the child, and making sure another robot was close by to carry out one of its essential functions, which is to protect a human life.

Bain: All a bit soul-less. Obviously.

Hunter: Indeed, there’s no ‘soul’ in robots. But surely their actions are at least as effective as a human’s in that situation. More so: the tree was felled, so the task was fulfilled, and the child was left unharmed, physically at least.

Bain: A human would not have felled the tree, and they’d have taken more account of the emotional stuff, as well as potential physical harm. We humans don’t like to see children upset!

Hunter: If a human had made the calculation in the first place, it’s more likely they wouldn’t have seen the danger at all.

Bain: So are you saying that the robots are now superior to humans, in this time and place?

Hunter: That’s not quite what I meant, but it is true, now the robots have superseded humans where spatial awareness and planning are required.

Bain: So do robots need humans?

Hunter: Is that a relevant question? They would not exist without the humans, and their fundamental laws are designed to protect humans. At a basic level, robots need humans.

Bain: Yes, but maybe it’s like an adolescent human who outgrows their parents: there comes a point when they need to leave, or find they’ve already left, mentally if not physically. Adolescents become rebellious in all sorts of ways, mostly minor, but significant to them: clothing, body styles, music, art, anything that’s different from their parents’ generation. It’s a way of creating an independent self.

Hunter: I see, perhaps it also explains the programming restraints humans place on robots. Knowing how adolescent humans work, they planned against the ‘independent self’, to combat the genetic programming for rebellion.

Bain: So there are elements of humanity that a robot does not excel in.

Hunter: You mean this genetic desire for rebellion, or do you mean emotions in general? I think that’s beside the point.

Bain: I think we’ve switched positions. I don’t think robots in this era do need humans, it’s just that they did once. Having developed so far, they’ve achieved a state where humans are irrelevant to their further advancement.

Hunter: So the clearing of this part of the jungle would be more efficient without the humans?

Bain: Absolutely. No need for contingency plans, or patience for the slow commands of human planners.

Hunter: But why would the clearing be necessary in the first place? It’s just for the humans and their cafe, to provide shelter from the heat, a cool resting place. Robots don’t need such things.

Bain: So humans provide a purpose for the robots? Even though they’re rapidly becoming superior, except in matters such as emotions, which are not essential to survival?

Hunter: I’m not saying that, but humans seem to desire emotional structures, loyalties and means of enjoyment which are simply not necessary for robots. Yet it’s these things that drive the need for more efficient processes, which robots can provide.

Bain: So perhaps the relationship between humans and robots is the development of a perfect system of need and fulfilment.

Hunter: As long as the robots don’t develop needs independent of humans.

Bain: Such as?

Hunter: Creating an environment which is not subject to erratic behaviour, or un-calculated demand.

Bain: Well, perhaps humans should be removed from the universe.

Hunter: There’s a good argument for that, although I don’t think you’d go quietly.

Bain: Are you sure you’re not a robot?

Hunter: Well, I’m not human…

Bain studies Hunter’s face. He watches the ragged features sit calmly beneath the long hair and wide-brimmed hat; he registers the ever-whirring mechanical eye that sees beyond the physical world into the meta-universe, ever seeking fluctuations in the natural order. Bain wonders if friendship can really make a difference to this creature, this imprint of Ka, this not-human in human form, and determines to talk more to their mutual, enigmatic friend Shi Xiu, whom he sees collecting her own drink and making her way towards them.

