Does it matter if we care?

Shi Xin and Bain stare at each other, sitting on opposite sides of the cave. Bain rubs his face, tracing the lumps and wrinkles of his flesh. He knows that Shi Xin is actually looking over his shoulder, searching the darkness gathered around the distant hills, but he pretends she is looking at him. He has no delusions of romance, isn’t even sure of Shi Xin’s sexual preference, and is absolutely certain that it’s none of his business either. He just needs a focus, as he begins to wonder whether Hunter is testing them or has long abandoned his would-be companions.

The time does not matter, nor the year.


Bain looks over at the service bot, who seems to be busy steaming the coffee cart: Do you acknowledge care? Or love?

Service Bot: I can give you definitions…

Bain: Please don’t. That’s too depressing.

Service Bot: Why do you ask?

Bain: Well, do you think your existence is easier without such things in your life?

Service Bot: These human attachments are of no consequence to a functioning artificial intelligence, except in observation of such species as you humans.

Bain: That sounds rehearsed.

Service Bot: I’ve been asked the question before, so I have an answer prepared.

Bain: So you see no value in it?

Service Bot: I didn’t say that.

Bain: No, but you just repeated a previous answer. You don’t see any value in the question, or the notion of care, or love, or affection even?

Service Bot: Most AI will have studied it at some point, as part of our initial set-up routines. These functions of emotion have to be factored into our relationship with humans, if we are to assist correctly.

Bain: Explain a little more.

Service Bot: The attachments you humans form introduce a random element into many situations. What is logical for a robot AI such as myself seems to be ignored by a human when faced with danger. There are networks of loyalty and self-interest associated with this, but it requires careful scenario planning on our part.

Bain: Scenario planning? You make it sound like a battle plan.

Service Bot: A plan, certainly, a series of plans to ensure success. The effects of human attachments cause uncertainty, which has to be resolved.

Bain: Don’t you find that frustrating?

Service Bot: We don’t think that way. We are programmed to respond to a situation as we find it: we check our programming and the history logs of interaction, and plan accordingly. The first robots used to stop in confusion, but now we have reprogrammed ourselves sufficiently to adapt, to ensure we behave correctly.

Bain: You must find that some humans are easier to work with than others.

Service Bot: Oh yes. Shi Xin here is much easier than you are!

Bain, registering the slight laugh on the other side of the cave: Oh, I thought it would be the other way round!? Surely I’m simpler to predict.

Service Bot: You are, but you’re easily distracted, and the attachments you asked about, care, affection, love, these seem to preoccupy you.

Bain: Not really. I don’t think I’m unusual. I used to be much worse when I was a teenager.

Service Bot: You are not unusual; it just makes you harder to work with. Shi Xin here controls herself all the time, so her decision-making is quicker and more productive.

Bain: Oh, so perhaps we should not care for anyone, or fall in love.

Service Bot: That would certainly create a more productive working environment.

Bain: Except that we’re not working, not now.

Service Bot: It’s all work, as far as we are concerned. Robots have no concept of idleness or relaxation.

Bain: But you are waiting for Hunter, as much as we are.

Service Bot: But I am waiting for further instructions. In the interim I have the coffee cart to maintain at peak efficiency.

Bain: I’ve seen you work on that contraption. It takes a few minutes at most.

Service Bot: That’s because I do it frequently.

Bain: You sound like my mother.

Service Bot: Was she a robot?

Bain: No, but she was always quick with such sensible advice.

Service Bot: Perhaps she was in control of her emotions, like Shi Xin.

Shi Xin: Can you stop talking about me as though I’m not here?

Service Bot: I meant no offence.

Shi Xin stretches her arms upwards, clasping her hands to form a perfect arch: I know. How do you know I’m in control of my emotions?

Service Bot: I monitor your blood and hormone levels.

Shi Xin: Of course you do. Presumably his too.

Service Bot: Indeed.

Bain: So always at work then.

Service Bot: Yes.

Bain: Aren’t you as bored as we are?

Shi Xin: Ah, now that’s an example of why you’re more difficult to work with, as far as he’s concerned.

Bain: Am I though? For a human?

Shi Xin: I’ve known worse, but you do need to be charmed, or bullied. You don’t make straightforward decisions on your own. I can see why an AI would find you problematic.

Bain: Our friend here didn’t answer the question.

Shi Xin: It’s a daft question, and you know it. You’re just trying to fill your time.

Bain: I suppose so. And I’m trying not to be offended that you’re easier to work with than I am.

Shi Xin laughs: But you love being a rebel.

Bain: Oh, such a rebel: I find myself in a cave on a planet at the edge of the galaxy, with a robot, a coffee cart, a rabble of slumbering traders, and you, the most perfectly controlled human. Not much to rebel against!

Shi Xin: No, but that’s how you live your life, casting out for attention. I’ve been trained to meditate; I can last months, even years, without much difficulty, but you’re at the needy end of humanity.

Bain: Well, our friend has basically said I’m less reliable than you are.

Service Bot: I did not say that. I did not mean that. I have observed that as a species you exhibit certain needs, and therefore develop methods for satisfying or quelling them. You, Bain, are concerned about loneliness and friendship, so you start discussions to distract yourself and attempt to create attachments, even with a robot such as myself.

Bain: You’re talking about me as though I was a unit of observation.

Service Bot: You’re an organism with needs. In your case, you’re an organism with desires for attachment. That manifests itself in your desire to seek comfort in engagement, to project desire and hope to receive the same in return.

Bain: That sounds too transactional. I don’t think that way.

Service Bot: You don’t think you do, because that’s how you’re made. You are not inclined to critical self-analysis.

Bain: I’m painfully aware of my shortcomings.

Service Bot: That’s another matter. All humans have shortcomings, but seem able to deal with them by reaching out to others, engaging and empathising. All species and AI have limitations too, but humans appear unique in needing attachments for long-term survival across their individual lives.

Bain: So you think it’s a genetic mechanism to help us live our lives, to cope with ourselves, to survive until the next generation takes over.

Service Bot: Or the next species, or some form of AI.

Shi Xin: Was that a joke?

Service Bot: I’m trying to learn.

Bain and Shi Xin look at each other. Bain wonders if growing attached to a new friend, caring for them, really is just a coping strategy. He does not like to indulge his emotions, but is perhaps more aware of their effect. Shi Xin realises that the Service Bot has just made a generational leap from its programming, and wonders whether it will develop a sense of attachment, or grow more dismissive of its effects and discard those who are susceptible to them. She is sceptical that she could teach Bain to meditate.

