A simple thought exercise. Your HAL 9000 has just willfully engaged in actions which it knew would result in casualties. Casualties do result from those actions. Against whom do you press charges? Now, assume the law has declared HAL to be a sentient being. Still the same answer? EDIT ~ Since the question covers two different situations, I have made the poll open to more than one answer per member. The first three answers apply to the original question. The last two apply to the modified, secondary question.
Damn computers trying to take over the world and make me drink coffee with milk and sugar! Yeah, I'd sue the manufacturer because I'm betting Hal just doesn't have that much money.
If HAL is a sentient being, then it should be subject to charges. If HAL is a sentient being, then pressing charges against the manufacturer would be like charging a parent for its grown-up offspring's actions. Incidentally, I'm almost at the end of 2001. Strange coincidence.
If it weren't for the fact that you and I are an ocean apart, I would say we are watching the same channel, for I too am nearly at the end of 2001. The fetus face has just graced the screen as I type this.
In the second scenario I would press charges against both HAL and the company. HAL for the obvious reasons, but the company for myriad reasons:

1. The company is the creator. The company had more influence over HAL's nature than any parent ever does. A parent trains through behavioral conditioning, but the company was more in the nature of God. For normal human procreation, understanding isn't necessary to the process, but understanding is necessary to create an intelligent being from scratch. Not anticipating the possibility that HAL would act that way is gross negligence. Sentient or not, failure to understand or to take measures to counteract the potential dangers of your product is grounds for criminal action.

2. The company as parents/guardians. You could make a very good case that HAL is still a child: in age, in reasoning capacity, and in emotional development. The company failed to raise their child right, and they employed him in a high-risk field before he matured.

3. The company as employers/owners. Any employer would administer psych tests to an employee given the degree of responsibility that HAL was given. They either didn't do it at all, used insufficient tests, or interpreted the data wrong. They employed a homicidal sociopath without taking proper precautionary measures to make sure he was right for the job.

On another note, if HAL was a sentient being, was he a slave? He was the property of a corporation, built and "raised" to perform one single job. It seems likely that he wasn't given pay or leisure time, or any other benefits that a company is expected to give.

Edit: Do basic civil rights extend to sentient beings that probably don't operate on the basic thought processes of humanity? It's been a while since I've seen the show, but I remember HAL being pretty content with his situation.
If something has will, and knows the consequences, then it should also be held responsible for its actions. I am assuming that if HAL has a will, then the manufacturer is not telling it to commit the murder, and therefore is not responsible. That would be like a child committing a crime and the parent being held responsible. The child has a will, and the child knows the consequences; therefore, they are no longer led by the creator. I hope that makes as much sense to others as it did to me. haha
Against HAL in order to have him taken offline and prevented from being placed in a similar job again. Against the manufacturer for damages. HAL cannot possibly pay restitution for the losses that can be quantified. Also, the manufacturer is responsible for HAL's actions as a minor, unless HAL has been declared an adult as well as sentient (doubtful).
If it was an accidental malfunction, I don't know how we could charge the manufacturer. There would have been no intent to cause harm, no knowledge that what they were doing could have caused harm. When a food company found a deadly bacteria in their products, they found the cause right away and took steps to keep it from happening again. They were not charged with anything. If the same thing happened in this case, I would let the blame fall on the HAL 9000. On the other hand, if there was more going on than an accidental malfunction, if they intentionally put something into the programming that would cause the criminal behaviour, or an employee was being intentionally negligent, then you could charge them.
Working at an engineering company has worked very well to drill home the fact that accidental malfunction under normal usage conditions and adequate maintenance is always the manufacturer's responsibility. It doesn't matter if it was unintentional negligence, intentional negligence, simple accident, or sabotage. If you design something to operate under certain conditions and it fails when those conditions are met, it is up to you to fix it. If somebody dies because your equipment failed then the consequences are several orders of magnitude higher.
Interesting results. With the notable exception of CDRW, there is a definite lean among the identifiably American members who responded to this thread toward putting more weight on the idea of a lawsuit intended to recoup losses monetarily. Intriguing.
Oh, what, you think the courts deliver JUSTICE? The courts can restrict the ability of offenders to commit their offenses, and impose incentives to take adequate precautions against damage due to negligence. The possibility of financial ruin is quite an incentive. EDIT: Sentient means self-aware. It's a difficult definition to pin down exactly, but it implies intelligence plus consciousness.
Well, I wish I could tell you that I didn't know what it's like to sit in the witness chair. I do. And yes, the court system should just call a rose a rose and, instead of handing out verdicts, hand out these: Still, I find it intriguing that even with the culpability and legal capacity of a non-human sentient being in question, the answers from the U.S. members reflect our culture of litigation rather strongly.
We're assuming he's programmed not to hurt humans, right? Like Asimov's laws or whatever? Then for the first one, no one is responsible, assuming this is just a freak accident and not a common problem (and no foul play was involved in the programming or anything). If HAL was a sentient being, then the manufacturer, for not having more effective safety measures (unless this was a freak accident and not a conscious choice, in which case no one should be prosecuted again... although HAL would probably be guilty of manslaughter, I guess... hmm)
Against the manufacturer, even if HAL's classed as a sentient being. Why? He's a machine, and by extension the law wouldn't apply to him; isn't murder 'one human killing another'? And also, they made him, and obviously didn't put safeguards in to stop him killing. There we are.
I never said they weren't responsible, but the question is about criminal charges, isn't it? There is a difference. I did point out with my example that the company took responsibility. There just weren't any criminal charges.
My answer may seem odd, but I picked the bottom three. Reasons:

1. The HAL 9000 should have as one of its most basic system parameters that death is bad.

2. If HAL 9000 acted willfully, then he would have broken the law, and as an intelligent being should be punished accordingly.

3. Even if HAL was sentient and did act willfully, there should have been the maximum number of safeguards in place to alert the crew to his actions, disallow the malfunction and/or action, or even shut the machine down entirely as soon as a problem develops.

That's just my view, anyway.
It kills me when people take discreet stabs at other groups of individuals, under any classification, and try to pass it off as acceptable. It isn't, under any circumstances.