heads up: key point not covered or touched on in lecture
Did HAL Commit Murder?
Since he gives evidence of having "higher order"
mental states, enabling self-reflection, it seems HAL has the mental
abilities required for a criminal mens
rea or "guilty mind"; HAL has the ability to commit murder.
Whether HAL's conduct aboard the Discovery One was murder or not would depend on
the presence or absence of further mitigating factors of insanity (emotional disturbance), "brainwashing", or duress. Under the duress heading, HAL's culpability
would be diminished if he acted in (perceived) defense of himself or
the Jupiter mission. Arguably he did.
Did HAL Commit Murder?
by Daniel C. Dennett
- In Anglo-American jurisprudence,
one speaks of mens rea, literally the guilty mind:
- to have performed a legally
prohibited action, such as
killing another human being, one must have done so with
a culpable state of mind, or mens rea. Such culpable
mental states are of three kinds: they are either
- motivational states of purpose
- cognitive states of belief
- the nonmental state of negligence

(Dictionary of Philosophy 1995, p.482) [list formatting added]
The legal concept has no
requirement that the agent be capable of feeling guilt or remorse or any other emotion;
murderers are not in the slightest degree exculpated by their
flat affective state. Star Trek's Spock would fully satisfy the
mens rea requirement in spite of his fabled lack of emotionality.
Drab, colorless--but oh so effective--"motivational states of
purpose" and "cognitive states of belief" are enough to get the
fictional Spock through the day quite handily, and they are well
established features of many existing computer programs.

Deep Blue beat world chess
champion Garry Kasparov in the first
game of their recent championship match.
Deep Blue, like many other
computers equipped with AI programs,
is what I call an intentional system: its behavior is predictable and
explainable by attributing to it beliefs and desires -- "cognitive
states" and "motivational states" -- and the rationality required to
figure out what it ought to do in the light of those beliefs and desires.
Higher order intentionality
is a necessary condition for moral
responsibility ... and Deep Blue exhibits little of such capabilities.
the layers of software that would permit Deep Blue to become
self-monitoring and self-critical ... would ... turn Deep Blue into a
radically different sort of agent.
HAL purports to be just such a higher-order intentional
system--and he even plays a game of chess with Dave. HAL is an
enhancement of Deep Blue equipped with eyes and ears, and a large
array of sensors and effectors distributed around in Discovery
One, the space ship.
HAL ... in
his few speeches ... expresses an interesting variety of higher-order
intentional states, from the most simple to the most sophisticated.
Notice that the requirement that HAL once have had a humanoid
body and have lived concretely in the human world is only a
practical requirement, not a metaphysical one. Once all the R and
D had been accomplished in the prototype, by the odyssey of a
single embodied agent, the standard duplicating techniques of the
computer industry could clone HALs by the thousands, as readily
as compact disks. [re: Brooks' embodied "child machine" approach]
- "Yes, it's puzzling. I don't think I've ever seen anything
like this before."
- "I can't rid myself of the suspicion that there are some
extremely odd things about this mission."
- "I never gave these stories much credence, but particularly in
view of some of the other things that have happened, I find them
difficult to put out of my mind."
- "I've still got the greatest enthusiasm and confidence in the
mission. I want to help you."
When do we exculpate people? ... HAL shows signs of fitting
into one or another of the excusing conditions in
spite of his being a conscious agent.
- INSANITY: Might HAL
have gone insane? The
question of HAL's capacity for emotion --
and hence vulnerability to emotional disorder -- is tantalizingly
raised by Frank's answer to Mr. Amer:
- "Well, he acts like he has genuine emotions. Of course,
he's programmed that way to make it easier for us to talk to him.
But as to whether he has real feelings is something I don't think
anyone can truthfully answer."
- Certainly HAL proclaims his emotional state at the end: "I'm
afraid. I'm afraid."
- HAL may then have suffered from some emotional imbalance of
much the same sort as those that lead human beings astray.
Whether this was the result of some sudden trauma -- a blown fuse, a
dislodged connector, a microchip disordered by cosmic rays -- or of some
gradual drift into emotional misalignment provoked by the stresses of
the mission, confirming such a
diagnosis should justify a verdict of
diminished responsibility for HAL, just as it does in human malfeasance.
- "BRAINWASHING": Another
possible source of exculpation, more familiar in fiction than in the
real world, is "brainwashing" or hypnosis.
- The only evidence that HAL might be in such a partially
disabled state is the much-remarked fact that he has actually made a
mistake, although the series 9000 computer is supposedly utterly
invulnerable to error.
- This is, to my mind, the weakest point in
Clarke's science fiction. The
suggestion that a computer could be both a "Heuristically programmed
ALgorithmic" computer and "by
any practical definition of the words, fool-proof and incapable
of error" verges on self-contradiction. The whole point of
heuristic programming is that it defies the problem of
combinatorial explosion (which mathematically cannot be solved by
sheer increase in computing speed and size) by taking risky
chances, truncating its searches in ways that must leave it open
to error, however low the probability.
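Dennett's point about heuristic truncation can be sketched in a few lines. The toy game tree, heuristic scores, and payoffs below are all invented for illustration: a search that cuts off its lookahead and trusts a heuristic estimate is tractable, but it can prefer a move that deeper (exhaustive) search reveals to be worse -- it "must leave itself open to error, however low the probability."

```python
# Toy sketch of truncated heuristic search vs. exhaustive search.
# All values here are invented for the example.

# A node is either a terminal payoff (int) or a pair:
# (heuristic_estimate, {move_name: child_node}).
GAME_TREE = (0, {
    "A": (9, {"a1": -5, "a2": -4}),   # looks great shallowly, loses deeply
    "B": (1, {"b1": 3,  "b2": 4}),    # looks dull shallowly, wins deeply
})

def minimax(node, depth_limit, maximizing):
    """Back up a value; at the depth cutoff, trust the heuristic estimate."""
    if isinstance(node, int):          # terminal node: exact payoff
        return node
    estimate, children = node
    if depth_limit == 0:               # truncated search: risky guess
        return estimate
    values = [minimax(c, depth_limit - 1, not maximizing)
              for c in children.values()]
    return max(values) if maximizing else min(values)

def best_move(tree, depth_limit):
    """Pick the root move with the highest backed-up value."""
    _, children = tree
    return max(children, key=lambda m: minimax(children[m], depth_limit - 1, False))

print(best_move(GAME_TREE, depth_limit=1))  # prints "A" -- heuristic cutoff errs
print(best_move(GAME_TREE, depth_limit=2))  # prints "B" -- full search is correct
```

The shallow search picks "A" because the heuristic misleads it; only the exhaustive search finds the truth. Real game trees are far too large for the exhaustive option, which is exactly why heuristic programs accept the risk.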
- DURESS: This is just the opposite of
the other conditions; it is
precisely because the agent is rational, and is faced with an
overwhelmingly good reason for performing an injurious deed--to kill
in self-defense, in the clearest case--that the agent is excused
or at least partly exonerated. These are the forced moves of
life: all alternatives to them are suicidal--and that is too much
to ask, isn't it?
- In the book, Clarke looks into HAL's mind and
says "He had been threatened with
disconnection; he would be
deprived of all his inputs, and thrown into an unimaginable state
of unconsciousness." (p. 148) That
might be grounds enough to
justify HAL's course of self-defense ....
- If HAL believed (we
can't be sure on what grounds) that his
being so rendered comatose would jeopardize the whole mission, then he
would be in exactly the same moral dilemma a human being in the
same predicament would face.
- If you believed the mission to
which your life was devoted was more important, in the last
analysis, than anything else, what would you do?
- "So he would protect himself, with all the weapons at
his command. Without rancor--but without pity--he would
remove the source of his frustrations. And then,
following the orders that had been given to him in case
of the ultimate emergency, he would continue the
mission--unhindered, and alone." (p. 149)