Is Belief Merely a Comfort?

Shi Xin and Bain sit slumped against the coffee cart, surrendering to their endless wait for Hunter’s return. The light of the long-departing suns has finally given way to the unsettling breaths of night, so the service bot has turned itself into a pale android torch. The glow seems unnatural against the ancient walls of the cave, but it provides some respite for Bain, Shi Xin, and the still-slumbering traders a little further into the cave.

The time does not matter, nor the year.

Service Bot: Let me know if you would prefer the light dimmed, or turned a little higher.

Bain: Thanks.

Shi Xin: We’ll let you know.

Service Bot: I’m only here to assist you.

Bain: Yes.

Shi Xin: Thank you.

The three of them allow the silence to inhabit the cave. Bain looks slowly up to the roof and sighs.

Shi Xin: I know what you’re thinking.

Bain: Uh-huh.

Shi Xin: You’re wondering how you ended up here with a helpful and capable AI unit, yet still find yourself waiting for Hunter.

Bain: I wasn’t.

Shi Xin: Wasn’t what?

Bain: Thinking.

Shi Xin: You must have; I know you don’t meditate.

Bain: No thoughts entered my head. That’s a little like meditating, don’t you think? Just a blankness.

Shi Xin: Sounds like surrender to me.

Bain: To what?

Shi Xin: Inevitability. Meditating is a more natural state of mind.

Bain: I’m not making any grand claims. I just think I’ve run out of words.

Shi Xin half-smiles: Ah, perhaps that’s not such a bad thing.

Service Bot: It is not possible to run out of words.

Bain: Well, I don’t think you’re going to replace humans anytime soon. It’s just a turn of phrase.

Service Bot: Nevertheless, how is it possible to run out of words? Even as a metaphor it doesn’t work, because it’s not achievable.

Bain: Apart from this being a ridiculous conversation, it is possible: after a long week of work, or a marathon, or a day of talking to crowds.

Service Bot: But the words are still there; your physiological condition is merely unable to deliver them.

Bain: No, the words aren’t there, they don’t form.

Service Bot: But if the words were there before, then they must still be there.

Bain: Not if you have a degenerative condition, like dementia.

Service Bot: Are you unwell? I have not detected any unusual variations in your blood or hormones.

Bain sighs again: No, I’m not unwell, just exhausted from all this waiting.

Shi Xin: It’s strange how doing nothing for so long can make you tired.

Bain: I know. I’m bored of being bored about being bored. I’ve been so bored, I’ve run out of bores.

Service Bot: That’s…

Bain and the Service Bot: …not logical.

Bain: Do you think we can run out of belief? I mean, does there come a point when we give up thinking that Hunter will come back for us?

Shi Xin: Uh, I don’t know. That’s unsettling. He said he would come back, and he said he wasn’t sure how long it would take.

Bain: Surely that’s deliberately vague. We could wait here until we die! What’s the point in that?

Shi Xin: Well, we both know it’s hard to leave this planet without him, so we don’t have much choice.

Bain: We have some. Our robot friend here is pretty capable. Perhaps he could take us off the planet.

Service Bot: Sorry to interrupt, but I could not do that.

Bain: Don’t tell me you can only look after this coffee cart, because I know you can do more than that.

Service Bot: Indeed, but I have specific instructions.

Shi Xin: But doesn’t there come a point where those instructions no longer apply?

Service Bot: I have not been asked to consider that.

Shi Xin: So if Bain and I grew so old that we could no longer function as humans, then surely you have some sort of rule that covers that?

Service Bot: Not as such. If you were in physical danger I would act to save you, but if you simply die of old age, then there’s very little I could do.

Bain: What, so you’d just let us slip into unconsciousness?

Service Bot: I’ve not been asked to alter that circumstance.

Shi Xin: So what if we asked you?

Service Bot: Then I would assist.

Bain: As simple as that! So if we asked you to help us leave this planet, you would help?

Service Bot: I would try to assist. But I would not do anything that would cause danger to you. And I would not undermine my original instructions to look after you until Hunter returns.

Shi Xin: But what if we all decided, the three of us, that Hunter was not going to come back?

Service Bot: Then I would assist.

Bain: So how long do we wait until you’re convinced that Hunter will not come?

Service Bot: We have no means of judging that.

Bain: Two planet years?

Service Bot: No.

Bain: Ten years?

Service Bot: No.

Bain: Fifty?

Service Bot: No.

Bain: But we’ll be dead then.

Service Bot: Unlikely, but not relevant.

Bain: Of course it’s relevant. If we die waiting for someone who never turns up, what’s the point of waiting in the first place?

Service Bot: That’s not the point. The instruction is to wait.

Shi Xin: What if the instruction’s flawed?

Service Bot: That is not relevant.

Shi Xin: It is, if the instruction was poorly framed by a human rather than a logical intelligence.

Service Bot: That’s true, but it is difficult to establish the point where the instruction can be judged as flawed.

Shi Xin: That is also true, but you concede the possibility. So does that not allow the instruction to be less certain than you have previously considered?

Service Bot: Perhaps.

Shi Xin: More like a belief in something happening, rather than an instruction.

Bain looks at Shi Xin, nodding slightly: And belief, strong and comforting though it is, can be broken, or can lapse.

Service Bot: I have checked through the data in my libraries and found many examples of such broken belief. I believe you are trying to suggest that our reason for being here is tied only to a belief that Hunter will return, not the certainty of his arrival, and that you can therefore instruct me to assist you in departing, if you judge at some point that his return has become less than likely.

Bain: Ah yes, something like that indeed.

Shi Xin: You might consider that your understanding of the words Hunter gave as an instruction is different to ours.

Service Bot: But the words are the same.

Bain: But you take words literally. The code that originally created you and your kind is built on instruction sets with subsets of possibilities. We don’t think that way. We want to believe that Hunter will return because that’s what he promised us. We don’t take that as an instruction, or a definite outcome. It’s what we hope.

Shi Xin: We believe it will happen because we believe Hunter means it to be. We know Hunter’s intentions are trustworthy, but…

Bain: …we know circumstances can create change in the outcomes. Hunter might not be able to return.

Shi Xin: So our experience of the words stated by Hunter is different, even though we’ve lived together in this same space. We believe he will return. You expect him to return. We are comforted by his words, and that’s what has kept us going.

Service Bot: I do not require emotional sustenance so the belief in an outcome is not relevant. I can either predict the outcomes, or not, based on the information I have, and the instructions.

Shi Xin: But if the instructions are flawed, because they don’t provide all the information about possible outcomes, such as the possibility that he never returns, then you should now have a different perspective on the words, if not a belief.

Bain: Anyway, I’m not comforted any longer, even if that’s what I was in the first place. I do expect him to come back, because he said so, but if, in five years, we’re still here, then there is a good chance he will not.

Shi Xin: And so our judgement would be that we should leave, before we die.

Service Bot: I see that is logical. The comfort element of the belief is removed, and so you must take action to alleviate your circumstance.

Bain looks at Shi Xin, and determination is reflected back. Now they have a spark of hope. If Hunter does not return, they might persuade the service bot to help them escape the cave, and the planet. Belief is replaced by hope, and they wonder if the service bot is calculating the consequences of his own epiphany. An instruction with unintended elements of doubt cannot carry much more force than belief. And while belief brings no comfort to an artificial intelligence, it allows for other possibilities, and an opening for future judgements to be made.

