Gridiron - Philip Kerr
'Well, there at least we are in agreement,' said Beech. 'Not why, Sergeant. How. Why implies a motive. This is a machine we're talking about, remember?'
'Why? How? What fuckin' difference does it make? I'd like to know what's happening.'
'Well, it could be there's been some kind of brown-out.'
'What the hell's a brown-out?' said Coleman.
'A low voltage level rather than no voltage at all. The back-up generator is supposed to kick in if there's a major power failure. There could be just enough power so that the Powerbak system doesn't come on-stream, but not enough so as to let Abraham run things properly. Could be it's starved of power. Like a brain without oxygen.' He shrugged. 'I dunno. I'm just guessing, really.'
'Are you sure about this, Bob? About Abraham?'
'Mitch, there's no other explanation. I've been reading the transactions on the terminal as they were made on the Yu-5 downstairs. The speed of the transactions alone convinces me that there's nobody in there operating the computer. I'm sure of it. No pre-programmed instructions either. Abraham is doing this all himself.'
'Bob? Maybe there's another explanation,' said Mitch.
'Let's hear it,' said Beech.
'This is a very complex system we're dealing with here, right? And complexity implies an inherent instability, doesn't it?'
'That's an interesting possibility,' admitted Beech.
'Come again?' said Curtis.
'Complex systems are always on the edge of chaos.'
'I thought there was some kind of law that prevented computers from attacking humans,' said Coleman. 'Like in the movies.'
'I think you're talking about Isaac Asimov's First Law of Robotics,' Beech said thoughtfully. 'That was fine when all we had to deal with were binary systems, computers that function according to a sequential yes/no system. But this is a massively parallel computer, with a neural network that functions according to a system of weighted maybes, a bit like the human mind. This kind of computer learns as it goes along. In the established church of computer discipline and practice, Abraham is the equivalent of a Nonconformist. A free-thinker.'
'Maybe so,' said Marty Birnbaum. 'But that's a whole different ballpark from the one you guys are batting in. Initiative's one thing. Intention is something entirely different. What you're suggesting here is -' He shrugged. 'No other word for it. Science fiction.'
'Shit,' said Beech. 'Mitch, this is unbelievable.'
'Could it be,' Mitch argued to Beech, 'that Abraham has passed a certain threshold of complexity and become autocatalytic?'
'Auto what?' said Levine.
'A computer self-organizes from the chaos of its various programmed responses to form a kind of metabolism.' Beech was looking more and more excited.
Jenny stood up slowly.
'Whooa,' she said. 'A kind of metabolism? Are you saying what I think you're saying, Mitch?'
'That's exactly what I'm saying.'
'What's he saying?' demanded David Arnon. 'Bob? Do you know what he's saying? Because I don't have a fucking idea.'
'I tell you something,' said Beech, 'I'm not a religious man. But this is the nearest I've ever come to experiencing a revelation. I have to admit the possibility that, for want of better words, Abraham is alive and thinking.'
-###-

What Bob Beech had to say left Willis Ellery feeling more nauseous than before. Believing that he was going to throw up, he went to the men's room, closed the cubicle door and knelt before the toilet bowl. His own shallow breathing and the cold sweat starting on his forehead seemed to underline the turmoil that was taking place inside his stomach. Only nothing happened. He belched a couple of times and wished that he had the nerve to stick a couple of fingers down his throat like some bulimic, adolescent schoolgirl. But somehow he could not bring himself to do it.
After several more minutes had elapsed, the feeling in his stomach seemed to drop down to his bowels and Ellery thought he would have to take a shit instead. So he stood up, unsteadily, unbuckled his belt, dropped his pants and shorts and sat down.
Why did it have to be Kay? he asked himself. Why? She had never done anyone any harm. Couldn't have been more than twenty-five years old. What a waste. And how was it possible for her to have drowned?
Even if Abraham had intended to kill her, how could it have managed it?
It wasn't like there was a diving board, or a wave machine. How was it possible?
The engineer in Ellery wanted to find out. He told himself that as soon as he was finished in the can he would call Ray Richardson on the walkie-talkie and get some details regarding the way in which Kay had met her death. No doubt Richardson had found her floating in the water and had simply made an assumption, as most people would have done. But there were other ways it could have happened. She had been electrocuted perhaps. Gassed even. Now that really was a possibility. With the automatic dosing pump it might have been possible for Abraham to have manufactured some kind of lethal gas. Or maybe he just hit her with ozone.
After a short spasmodic cramp Ellery evacuated his bowels and almost immediately started to feel better. He elbowed the toilet flush and activated the automatic personal-cleansing unit, left the cubicle and went to wash his hands in the long marble step of a sink that someone had considered fashionable. Ellery wanted to fill a bowl and push his whole face into it, but the shape of the sink made that impossible. It was not the kind of sink that encouraged you to linger.
Ellery looked at himself in the mirror and found his face recovering some of its former high colour.
'A sink ought to look like a sink, not a goddamn desktop,' he growled to himself.
He ran the tap, splashed cold water on his face and then drank some. The thought suddenly struck him that he was going about his business in much the same way that Kay Killen would have been going about hers when she met her death. The nausea returned as he realized he was in as much danger as Kay Killen had been.
Abraham controlled the washrooms just like he controlled the swimming pool.
Ellery did not want to touch the tap to turn it off, nor to dry his hands under the hot-air machine, for fear that he might be electrocuted. He ran to the door and laughed as he managed immediately to haul it open. Tony Levine nearly fell on top of him.
'What the fuck's the matter with you, man?' snarled Levine. 'Jesus, you scared me.'
Ellery smiled sheepishly. 'I think I scared myself, Tony,' he said. 'I was just thinking about Kay. I don't think she drowned at all. In fact, I'm quite certain of it. Richardson thought that because he found her floating in the water, that's all.'
'So what happened to her, Lieutenant Columbo?'
'It came to me just now. Abraham has charge of all the chemicals that go into the pool. I think she must have been gassed.'
Levine's nose wrinkled with disgust. 'She sure would have been gassed if she'd walked in here.' He laughed loudly. 'Man, this place stinks even worse than it does in the rest of the building. Whaddya eat for breakfast, Willis, dog food?'
Levine pushed past Ellery.
'Obnoxious bastard,' he said. He stared at the door for a moment and then returned in silence to the boardroom.
The clunk of the door closing behind Levine muffled the quieter sound of the airlock as the computer prepared to change the pungent atmosphere.
-###-

'The more complex a system is,' Mitch was explaining, 'the less predictable it becomes; and the more likely it is to act according to its own set of priorities. You see, no matter how smart you think you are, no matter how much you think you know about what an algorithmic system is capable of, there will always be results that you could not have predicted. From a computer's point of view, chaos is just a different kind of order. You ask, why should any of this be happening? But you might as well ask, why shouldn't any of this be happening?'
'How can a machine be alive?' said Curtis. 'C'mon, let's get real here. No one outside of comic books believes that such a thing is possible.'
'It all depends what you mean by life,' argued Mitch. 'Most scientists agree that there is no generally accepted definition. Even if you were to say that the ability to reproduce yourself was a basic condition of being alive, then that would not actually exclude computers.'
'Mitch is right,' agreed Beech. 'Even a computer virus fulfils all the conditions of being alive. It's a fact we might not like to face, but possession of a body is not a precondition of life. Life is not a matter of material, it's a matter of organization, a dynamic physical process, and you can get some machines to duplicate those dynamic processes. Fact is, some machines may be held to be quite lifelike.'
'I think I prefer lifelike to their being alive,' admitted Jenny Bao. 'Life still seems sacred to me.'
'Everything seems sacred to you, honey,' muttered Birnbaum.
'The Yu-5 — Abraham — is designed to be self-sustaining,' said Beech. 'It's designed to learn and to adapt. To think for itself. Why do you look surprised? Why is it so hard to believe that Abraham can think? That it might be any less capable of thought than God, for example? In fact, it ought to be a good deal easier to accept. I mean, how do we know that God knows, that God hears, that God sees, that God feels, that God thinks, any more than Abraham? If we're willing to overlook the essential absurdity of belief that makes a sentient God possible, then why do we find it hard to do the same with a computer? Language is at the root of the problem. Since it's certain that machines can't behave more like humans, then humans are obviously going to have to behave much more like machines. And language is where that homogenization will have to begin. Computers and people are going to have to start speaking the same language.'
'You speak for yourself,' said Curtis.
Beech smiled. 'You know, people have been writing about this kind of thing for years,' he added. 'The story of Pygmalion. The Golem from Jewish fable. Frankenstein. The computer in Arthur C. Clarke's 2001. Maybe now it has happened: an artificial being, a machine just took charge of its own destiny. Right here in LA.'
'There are plenty of other artificial beings in LA already,' said Arnon. 'Ray Richardson, for one.'
'Great,' said Curtis. 'We made the history books. Let's hope we stay alive to tell our grandchildren about it.'
'Look, this is serious, I know,' said Beech. 'People have been killed and I deeply regret that. But at the same time I'm a scientist and I can't help feeling somehow — privileged.'
'Privileged?' Curtis spoke with contempt.
'That's the wrong word. But speaking as a scientist, what's happened is enormously interesting. Ideally one would like time to study this phenomenon properly. To investigate how it has happened at all. That way we could reproduce the circumstances so that it could be repeated somewhere else, under controlled conditions. I mean, it would be a shame just to wipe it out. If not immoral. After all, Jenny's right. Life is something sacred. And when you create life, that makes you a kind of god, and that in itself brings certain obligations vis-à-vis that which you have created.'
Curtis took a pace back and shook his head with confusion.
'Wait a minute. Wait just a minute. You said something there. You said it would be a shame just to wipe it out. Are you saying that you can put a stop to all of this? That you can destroy the computer?'
Beech shrugged coolly.
'When we built the Yu-5, naturally we considered the possibility that it might end up competing with its creators. After all, a machine doesn't recognize normal sociological values. So we included a tutelary program in Abraham's basic architecture. An electronic template called GABRIEL. To deal with the unpluggability scenario.'
'The unpluggability scenario?'
Curtis grabbed Beech by the necktie, and thrust him hard against the boardroom wall.
'You dumb asshole,' he snarled. 'We've been breaking our balls trying to save the lives of three men stuck in an elevator controlled by a homicidal computer and now you're telling me that you could have unplugged it all along?' His face became even more contorted, and he seemed about to strike Beech until he was restrained by Nathan Coleman.
'Cool it, Frank,' urged Coleman. 'We still need him to turn it off.'
Beech pulled his tie free of Curtis's fist. 'They were dead anyway!' he yelled. 'You said so yourself. Besides, you don't trash a $40 million piece of hardware without checking the subsumption architecture. An accident is one thing. But A-life culpability is another.'
'You piece of shit,' sneered Curtis. 'Dollars and cents. That's all you people can think about.'
'What you're suggesting is absurd. Nobody in their right mind would dump a Yu-5 down the toilet without first attempting a proper verification.'
'There are five people dead, Mister. What more verification do you need?'
Beech shook his head and turned away.
'Now you've got your damned verification,' said Curtis, 'what are you going to do about it?' He glanced impatiently at Coleman. 'It's OK, Nat, you can let go now.' He tugged his arms free of his colleague's slackening grip. 'Do more of us have to die before you get it through your stupid skull that this isn't some half-assed experiment at Caltech or MIT or whichever petri dish mould you sprang from? We're not talking artificial life now. We're talking real life. Men and women with families. Not some tin fucking man without a heart.'
'Bob?' said Mitch. 'Can you turn it off? Is that possible?'
Beech shrugged. 'By rights I should get Mr Yu's permission to do it. There's a proper protocol for doing something like this, y'know?'
'Screw Mr Yu,' said Curtis. 'And screw his fucking protocol. In case you'd forgotten, it's not that easy to get hold of anyone right now.'
'Come on, Bob,' Mitch urged.
'OK, OK,' said Beech and sat down in front of the terminal. 'I was going to do it anyway.'
The walkie-talkie buzzed. Coleman answered it and stepped out of the boardroom into the corridor, heading towards the balcony.
'Hallelujah,' said Helen. 'Now maybe we can get the hell out of this multi-storey lunatic asylum.'
'Amen to that,' said Jenny. 'I've had a bad feeling about this place all afternoon. That's why I came here in the first place. To rid the place of its bad spirits.'
'Whatever floats your boat,' said Arnon and flopped down on the sofa. 'But the sooner we get out of here the better.'
'Yeah, well, don't hold your breath,' said Beech. 'It takes time to pour programming acid into the equivalent of a thousand ordinary computers.'
'How long?' said Curtis.
'I really couldn't say. I've never trashed a $40 million computer before. It took thirty-six minutes to kick Isaac's ass into touch, and that program was only a couple of hours old. You remember, Mitch? The SRS?' Beech started to type some transactions.