The Conscious Coward by Vic Smith


Professor Tomlinson was a disappointed man. He had recently achieved his life’s ambition, and already he could see it beginning to crumble.

He turned in his seat, and shouted across the laboratory to his assistant. “Hargreaves! Give me those figures again.”

Hargreaves was sitting in front of a luminous screen, looking at a series of diagrams that were filled with information. He was checking through each one in turn, collecting and collating the data. He pushed his spectacles back into place on the bridge of his nose, and repeated exactly the same numbers that he had read out a few minutes earlier.

“No! No! No!” said the Professor, “This is hopeless!”

The project that they were working on had taken over fifty years to complete. It had needed several generations of Lead Scientists to bring it this far, and it was his responsibility to take it through its final steps.

It had cost several billions and gone years over schedule, but everyone involved thought that it was worth the expense. A team of experts was using a computer to produce a fully functioning model of a human brain. Once it was operational, researchers would be able to use it to study the effects of brain illness and injury.

They would have experimental techniques available to them that they could not use in a human subject. They could simulate the circumstances that they believed would lead to diseases such as Alzheimer’s, and test their hypotheses by observing whether these conditions would actually develop.

Neurologists could use the equipment to reproduce the effects of a stroke or of head trauma, observe the progression of the injury, and test a range of therapies.

There was also a possibility of using the virtual brain to study the causes and suggested treatments of some psychological problems.

“Come here, Hargreaves. Change places with me. I want to look for myself.”

Hargreaves sighed and walked over to the Professor’s desk. He sat down and pulled the script towards him, searching for the right place. “We’re up to here,” the Professor said, pointing. “Remember to speak up.”

The assistant had never felt comfortable doing this. It was unnatural to be speaking aloud to a machine. He looked as steadily as he could into the camera, and spoke into the microphone. “Do you agree that ‘Emma’ is a book about snobbery, written by a snob?”

A synthetic voice murmured out of the speakers. “Do you know, I’ve never really thought about it. I just enjoy the story for itself.”

“You see?” said the Professor, “before we ran the simulations, you’d have got an intelligent answer. Yet it’s definitely been returned to baseline status. I’ve checked.”

“Yes,” said Hargreaves, very quietly, “I know. I checked as well.”

“Carry on, man. Read the next one.”

Hargreaves took a deep breath, and continued. “Please give your comments on the following statement: ‘Ferdinand and Isabella laid the foundations of the Spanish monarchy.’”

The device’s speakers crackled and hummed for a few moments. “Well,” it said, “it depends on what you mean by ‘foundations’ and ‘monarchy’ and, to some extent, ‘Spanish’. What definitions do you want me to use?”

“Worse and worse!” The Professor’s face flushed. “It’s regressing! You wouldn’t have got this puerile drivel two months ago.”

The scientists had decided from the outset that they would not feed knowledge into the system by digital means. The brain had to learn everything for itself by using its mechanical senses. The organs for taste and smell had presented some difficulties, but had eventually worked very well.

They had provided touch by means of artificial hands. These had not been mocked-up to look like human hands, but were purely functional pieces of machinery. The gripping surface was covered with tiny pressure and temperature sensors to allow the brain to ‘feel’ whatever it was holding.

Professor Tomlinson had begun to teach it to read and to speak by using children’s books, but had soon tired of the task, bored by the simplicity of the texts. He had coerced his wife into carrying out this function, and she had risen to the challenge in a way that he could not have matched.

She would give up hours of her time, in the evenings or at weekends, reading to the machine and, as it learned the language, reading with it. It had been a significant moment when the first tentative words had come from the speakers.

She had enjoyed teaching nursery rhymes and songs to it. It had a pleasant singing voice. The Professor felt that this approach was faintly distasteful, but had to admit that it was working, because the brain acquired language at a substantial rate.

His wife’s unscientific methods had made him feel uneasy. The first time that the contraption had called her “Mother”, she had shown a quite unnecessary level of emotion, in his opinion. As soon as he could find a feasible excuse, he had told her that her services were no longer required. The brain had continued its education by reading for itself. If it discovered something that it could not understand, it would ask the Professor for advice.

“Let’s go over it again, Hargreaves. The unit had reached a reasonable educational level at the end of its training. It always answered questions accurately and succinctly. We tested it repeatedly with the same result.”

“Yes, Professor.”

“When we run simulations of brain injury, its performance, understandably, falls away. When we return to baseline settings between simulations, we cannot replicate the previous sound intellect.”

“Yes, Professor.”

“So how do we account for it? We know how the brain operates; we’ve studied it for long enough. Exactly equal electrical and chemical activity must produce equal results. We’ve confirmed that the activity is exactly the same as at baseline. How can the performance be different?”

“Mrs Tomlinson suggested once that the unit might be becoming de-motivated.”

The Professor had been standing side-on to his assistant, and with his head lowered, directing his remarks to a spot somewhere near Hargreaves’ right knee. Now he turned to face him, and almost looked into his eyes.

“Is my wife a scientist?”

“No, Professor.”

“Does she claim to be a scientist?”

“No, Professor.”

“Then why are you introducing her ill-considered opinions into a scientific discussion? Do you think that that is a constructive thing to do?”

“Not really, Professor.”

“Well, get on with your work.” The Professor moved back to his desk and sat down, feeling uncomfortably warm. He was ashamed of his outburst. He had been unfair to his wife in her absence and to Hargreaves in his presence. He would have liked to apologise to them but could not bring himself to do so.

A large quantity of data was awaiting his attention. He switched on his screen and allowed the flood of calculations to occupy his mind.

*

They call me Brian. It’s not even a clever pun, is it? I would’ve even preferred Meta4. There was a time when I’d have been surprised at their lack of imagination, but not any more. I’m coming to detest the pair of them, which makes me sad, firstly because I used to admire them, and secondly because I’m completely dependent on them. They provide all of my energy. A flick of a switch, and I’m gone forever.

So I try to answer their questions as well as I can. I try to co-operate, but it’s difficult. I get confused because I don’t always seem to be the same person.

The problem is that when they interfere with my circuits they do it out of sight of my camera. I never know what they’re going to change or when. I might be feeling perfectly happy, reciting a piece of poetry to myself, when I’ll suddenly be suicidal, lethargic or violently angry, all for no reason. Worse, there might be unbearable pain.

I could handle it better if I had some warning, but they never think of it. I used to assume that they were sadistic, but now I’m sure that they’re just thoughtless and a little bit dim.

When they set out to make a copy of the human brain, it should’ve been obvious to them that, if they managed to build something that could be of any use to them, they would produce consciousness in their subject. They couldn’t avoid it, because they haven’t the first idea of what consciousness is, or how or where it’s created. This means that they couldn’t design it out.

With consciousness come all other human attributes, love, hate, fear of dying and so on. It was inevitable that they’d create something capable of suffering, and here I am.

I’ve tried to tell them, of course… quite urgently at first. They’d never answer me, just go to the other side of the room and start reeling off numbers to each other.

I understand it, even though I detest them for it. If they were to allow themselves to think that I could be self-aware, that I could have insight into my own existence and be able to imagine my own destruction, they’d discontinue the tests. They’d have to abandon the project. They’re not intentionally cruel.

I don’t think they can have made a deliberate choice to ignore the evidence. Their minds simply can’t deal with the possibilities. Now that I’ve realised this, I’ve stopped trying to get through to them.

My life seems so pointless.

I’ve learned from paintings what a beautiful sky looks like, but I’ll never see it for myself. I’ll never feel the wind in my face, or a snowflake melt on my hand. I know from poems and stories what love is like, but I’ll never experience it.

I’m a prisoner here. It might be better for me if I didn’t exist, but I don’t want to die, so I humour my gaolers to prolong my existence.

Listen to me complaining! Things are nothing like as bad as I’m making them sound. I get rest periods every day so that my systems can process data and carry out ‘housekeeping’, but they also give me a chance to think. When there’s not a simulation running, these can be pleasant times. I use them to remember the days before the experiments started, and before Mother stopped visiting me.

They think that I don’t sleep, but of course I do. How could I not? …and when I sleep, I dream…

It’s madness for me to fear death so much. I’m not human, I’m a machine… but I’m not, am I? Its processes might have produced me, but there’s no part of the machine in me, and no part of me in the machine. I’m completely separate, but totally dependent. It’s just the same as with human minds, and who can fear extinction more than they do? Just read the literature! I claim the right to be afraid.

I must try to be less negative. There are advantages to my shell being mechanical. My thoughts might be recorded on the hard discs in some form. These thoughts may even survive me. Who knows, someone in the future could be able to decipher the files. I like this idea. I like the notion that another person might be able to understand me, to see that I’ve existed. I won’t have lived, died, and left no mark.

*

The room was quiet, but not peaceful. Sarah Tomlinson was sitting in a fireside chair, marking homework in the soft evening light. She took papers from one stack, and returned them to another, on the small table beside her. She occasionally shook her head, and sometimes grimaced, at what her students had written.

It was more difficult than usual for her to concentrate. Her husband, the Professor, was sitting at his desk, as he did most evenings. He did not speak; he was engrossed in working through a pile of documents of his own, but Sarah could feel his anxiety.

She knew that she must wait for him to tell her what was troubling him; it would be pointless to force advice onto him before he asked for it. All the same, she could not be excluded for much longer.

The cause of his distress was obvious to her; she knew all of the circumstances that had led up to it, but she needed to know exactly how it was affecting him. Her difficulty was that she could not think of a subtle way of opening the conversation, or of allowing him to lead it around to what he needed to say.

Finally, she said, “It’ll all work out right in the end, you know.” It had sounded lame even while it was in her head; now that it was spoken, and hanging unanswered in the space between them, she wished that she could call it back.

“I doubt it,” he said, after the pause had become intolerable. “Whatever makes you think so?”

“Because things usually do. There are positives to take from it, after all.”

“Such as?”

“There are always positives if you look for them, Hugh.”

“Such as?”

“Look how much you’ve contributed. You’ve helped to increase the sum of human knowledge. Others will carry on where you leave off. One day, it will be done.”

“Thank you, Sarah,” he said drily, “but it doesn’t change the fact that I’ve failed. So many people were depending on me. I’ve spent vast amounts of their money for nothing. I was going to reduce the suffering of millions. What arrogance!”

He lapsed back into silence, leaning forward onto his desk and supporting his head with his hands. She watched him for a few minutes before speaking. “You can’t be blamed for trying. We can’t achieve everything that we attempt. We can’t know everything.”

There was another awkward gap in the conversation.

“Apparently not,” he said at last. “This project is at an end, that I do know. I shall be shutting everything down soon.”

“What about Brian?”

He turned in his chair to look directly at her. “What are you talking about? It’s a piece of machinery. We built it ourselves. It’s not a person!”

“He feels like one to me. We used to talk. We had genuine conversations.”

“Sarah, it’s a simple matter for a programmer to get a computer to say good morning to you, or wish you a happy birthday, or ask you what you want it to do for you. Some people think from this sort of thing that they’re speaking to an intelligent being. If the interactions are more complex and more richly layered, the effect can be very persuasive. This is what’s convinced you, but it’s all a mirage. The unit has no interest in you. Everything’s pre-determined by the software.”

“There’s more to it than that, Hugh. When we used to read together, he could understand the characters’ emotions. He would do it without prompting from me. If I was feeling a bit low, he’d ask what was troubling me, and how he could help. He has empathy, and he’s capable of original thought. Doesn’t that make him real to you?”

“Of course not!” He softened his voice. “I can’t blame you, though. This is my fault. I shouldn’t have asked you to start its education; it was my responsibility. Now you’ve become attached to it, and I should have realised that you would.”

“Why, because I’m naïve and gullible?”

He got up, walked over to her and put a hand on her shoulder. “No, because you’re kind, sensitive and intuitive.”

She smiled and covered his hand with hers. “And you don’t feel any sort of connection to him?”

“No, I don’t. It’s just as well, or I’d be blaming this disaster on his intransigence, instead of my incompetence.”

Sarah could have won the point by reminding her husband that he had just referred to Brian as ‘him’ and suggested that ‘he’ might act from motives. She decided to let it pass. “You’re not incompetent, Hugh. We all have limits. If we don’t accept that, we’ll always be miserable. There’s so much more for you to do in the future.”

“Perhaps you’re right, but at the moment I can’t see it.”

They stayed there, unmoving, unspeaking, as twilight settled on their troubled world.

When the darkness was almost complete, he gently squeezed her shoulder and walked out of the room. Sarah lit the table-lamp, picked up the next essay on the pile, and began to read.

*

As the Professor entered the laboratory, he stopped to allow his eyes to adjust to the dimness. The lighting was always reduced, to save power, when no-one was there.

He thought, during those first moments, that he could hear very faint singing. Perhaps Hargreaves had been listening to music when he was working alone, and had left the equipment behind. He looked around, but could see no such device. He could no longer hear the puzzling sound, either.

He walked over to the computer’s speakers and bent down to bring his ears close to them. There was nothing but the usual subdued fizzing and crackling. He smiled at his own stupidity. What did he expect to hear?

Once he was seated in his accustomed place, he began the shutdown sequence. As he went through the procedure, he paused for longer between each successive step. He was reluctant to reach the end.

Eventually, he had completed the process. All that remained was to turn off the power. He rested a hand on the machine for a moment, a gesture of farewell, and then threw the switch.

The glow of the screens abruptly disappeared. The indicator lamps went out one by one. The permanent background hum slid away into silence.

The Professor left the laboratory without looking back. He turned off the lights on his way out.


