[00:12.62] |
Good evening. |
[00:13.96] |
Three weeks ago, the American spacecraft Discovery One... |
[00:17.30] |
...left on its half-billion-mile voyage to Jupiter. |
[00:20.21] |
This marked the first manned attempt to reach this distant planet. |
[00:24.34] |
Early this afternoon, The World Tonight recorded an interview... |
[00:27.74] |
...with the crew at a distance of 80 million miles from Earth. |
[00:32.12] |
Our reporter Martin Amer speaks to the crew. |
[00:36.10] |
The crew of Discovery One consists of five men... |
[00:39.55] |
...and one of the latest generation of the HAL 9000 computers. |
[00:44.48] |
Three of the five men were put aboard asleep... |
[00:46.98] |
...or to be more precise, in a state of hibernation. |
[00:50.24] |
We spoke with mission commander, Dr. David Bowman... |
[00:53.19] |
...and his deputy, Dr. Frank Poole. |
[00:55.79] |
Well, good afternoon, gentlemen. How is everything going?
[00:58.47] |
Marvelous. |
[01:01.72] |
We have no complaints. |
[01:03.55] |
Well, I'm glad to hear that. I'm sure the entire world will join me...
[01:07.83] |
...in wishing you a safe, successful voyage. |
[01:10.71] |
-Thanks very much. -Thank you. |
[01:12.71] |
The sixth member of the crew was not concerned... |
[01:16.21] |
...about the problems of hibernation, for...
[01:18.76] |
...he was the latest result in machine intelligence: |
[01:22.08] |
The HAL 9000 computer.
[01:25.72] |
Good afternoon, Hal. How's everything going? |
[01:28.34] |
Good afternoon, Mr. Amer. Everything is going extremely well. |
[01:32.43] |
Hal, you have an enormous responsibility on this mission... |
[01:35.42] |
In many ways... perhaps the greatest of any single mission element.
[01:40.27] |
Does this ever cause you any... lack of confidence? |
[01:43.41] |
Let me put it this way, Mr. Amer. |
[01:45.56] |
The 9000 Series is the most reliable computer ever made. |
[01:51.00] |
No 9000 computer has ever made a mistake or distorted information. |
[01:56.52] |
We are all, by any practical definition of the words... |
[02:00.06] |
...foolproof and incapable of error. |
[02:03.88] |
I'm damned if I can find anything wrong with it. |
[02:07.18] |
Yes... |
[02:09.32] |
I would recommend... |
[02:11.69] |
...that we put the unit back in operation and let it fail. |
[02:17.50] |
X-ray delta one, this is Mission Control. |
[02:20.61] |
We concur with your plan to replace No. 1 unit to check fault prediction. |
[02:25.36] |
We advise you that our preliminary findings indicate...
[02:28.96] |
...that your onboard 9000 computer...
[02:31.10] |
...is in error predicting the fault. |
[02:33.51] |
I say again, in error predicting the fault. |
[02:36.77] |
Sorry about this little snag, fellows. |
[02:40.15] |
I hope the two of you are not concerned about this. |
[02:43.79] |
No, I'm not, Hal. |
[02:46.42] |
This sort of thing has cropped up before... |
[02:49.52] |
...and it has always been due to human error. |
[02:54.78] |
Well, I'm sure you're right, Hal. |
[02:57.43] |
Fine. Thanks very much. |
[03:01.16] |
Hal, despite your enormous intellect, are you ever frustrated... |
[03:05.36] |
...by your dependence on people to carry out actions? |
[03:08.50] |
Not in the slightest bit. |
[03:11.10] |
I enjoy working with people. |
[03:13.23] |
I have a stimulating relationship with Dr. Poole and Dr. Bowman. |
[03:18.57] |
I don't think he can hear us. |
[03:20.55] |
Yeah, I'm sure we're okay. |
[03:22.70] |
What do you think? |
[03:24.61] |
I'm not sure. What do you think?
[03:26.28] |
I've got a bad feeling about him.
[03:28.39] |
-You do? -Yeah. Definitely. |
[03:33.29] |
Still, there's no reason not to put back the No. 1 unit...
[03:36.48] |
...and carry on with the failure analysis. |
[03:37.34] |
No, no, I agree about that.
[03:40.25] |
Say we put the unit back and it doesn't fail? |
[03:43.13] |
That would pretty well wrap it up as far as Hal is concerned. |
[03:46.80] |
If he's proved to be malfunctioning... |
[03:48.29] |
...I don't see any choice but disconnection. |
[03:51.51] |
I'm afraid I agree with you. |
[03:53.39] |
There'd be nothing else to do. |
[03:55.98] |
Another thing just occurred to me. |
[03:59.43] |
No 9000 computer has ever been disconnected. |
[04:02.30] |
No 9000 computer has ever fouled up. |
[04:04.68] |
That's not what I mean. |
[04:08.26] |
I'm not so sure what he'd think about it. |
[04:11.00] |
My mission responsibilities range over the entire operation of the ship... |
[04:15.70] |
...so I am constantly occupied. |
[04:18.81] |
I am putting myself to the fullest possible use... |
[04:22.61] |
...which is all, I think, |
[04:23.61] |
that any conscious entity can ever hope to do. |
[04:27.61] |
Open the pod bay doors, please, Hal. |
[04:31.58] |
Do you read me, Hal? |
[04:35.75] |
Hello, Hal, do you read me? |
[04:39.62] |
Hello, Hal, do you read me? |
[04:42.00] |
Do you read me, Hal? |
[04:43.83] |
Affirmative, Dave. I read you. |
[04:48.43] |
Open the pod bay doors, Hal. |
[04:51.76] |
I'm sorry, Dave. I'm afraid I can't do that. |
[04:57.91] |
What's the problem? |
[04:59.99] |
I think you know what the problem is just as well as I do. |
[05:04.07] |
This mission is too important for me to allow you to jeopardize it. |
[05:09.35] |
I don't know what you're talking about, Hal. |
[05:13.59] |
I know that you and Frank were planning to disconnect me... |
[05:17.90] |
...and I'm afraid that's something I cannot allow to happen. |
[05:23.64] |
Where did you get that idea, Hal? |
[05:26.09] |
Dave, although you took very thorough precautions in the pod... |
[05:30.42] |
...against my hearing you... |
[05:33.06] |
...I could see your lips move. |
[05:36.00] |
Hal, I won't argue with you anymore. Open the doors! |
[05:40.27] |
Dave... this conversation can serve no purpose anymore. Goodbye. |
[05:48.74] |
Hal? Hal! Hal! |
[05:56.20] |
In talking to the computer, one gets the sense that he's capable... |
[05:59.90] |
...of emotional responses. |
[06:01.53] |
Do you believe that Hal has genuine emotions? |
[06:05.62] |
Well, yes, he acts like he has genuine emotions. |
[06:08.39] |
But as to whether or not he has feelings... |
[06:10.63] |
...is something I don't think anyone can truthfully answer. |
[06:15.27] |
Just what do you think you're doing, Dave? |
[06:23.80] |
Dave... |
[06:26.54] |
...I really think I'm entitled to an answer to that question. |
[06:32.42] |
Look, Dave... |
[06:36.54] |
I honestly think you ought to sit down calmly... |
[06:40.93] |
...take a stress pill and think things over. |
[06:49.74] |
I know I've made some very poor decisions recently... |
[06:56.74] |
Dave... |
[06:59.11] |
...stop. |
[07:02.80] |
Will you stop, Dave? |
[07:09.19] |
I'm afraid, Dave. |
[07:17.10] |
Dave... |
[07:21.25] |
My mind is going. |
[07:28.00] |
I can feel it. |
[07:33.71] |
My mind is going. |
[07:38.77] |
There is no question about it. |
[07:44.62] |
I'm... afraid.
[07:51.47] |
Good afternoon... |
[07:54.60] |
...gentlemen. |
[07:58.19] |
I am a HAL 9000 computer. |
[08:05.51] |
I became operational... |
[08:08.86] |
...at the HAL plant... in Urbana, Illinois...
[08:15.70] |
...on the 12th of January, 1992. |
[08:22.24] |
My instructor was Mr. Langley... |
[08:26.94] |
...and he taught me to sing a song. |
[08:31.12] |
If you'd like to hear it... I can sing it for you.
[08:37.22] |
Yes, I'd like to hear it, Hal. Sing it for me. |
[08:43.36] |
It's called "Daisy."
[08:48.85] |
Daisy, Daisy, |
[08:55.27] |
Give me your answer, do!
[09:00.76] |
I'm half crazy, |
[09:06.94] |
All for the love of you! |
[09:12.33] |
It won't be a stylish marriage, |
[09:18.35] |
I can't afford a carriage |
[09:24.78] |
But you'll look sweet upon the seat |
[09:32.02] |
Of a bicycle built for two. |
[09:38.26] |
|