A Message From Eliezer Yudkowsky
I recently received an email from Eliezer Yudkowsky, in reference to a comment of mine which he had deleted. The message, and the content of the comment that precipitated it, follow:
You’re welcome to repost if you criticize Coherent Extrapolated Volition specifically, rather than talking as if the document doesn’t exist. And leave off the snark at the end, of course.
———- Forwarded message ———-
Date: Wed, Oct 15, 2008 at 3:01 PM
Subject: [Overcoming Bias] Caledonian submitted a comment to ‘Ends
Don’t Justify Means (Among Humans)’
A new comment from “Caledonian” was received on the post “Ends Don’t Justify Means (Among Humans)” of the weblog “Overcoming Bias”.
Comment: “Eliezer: If you create a friendly AI, do you think it will shortly thereafter kill you? If not, why not?”
At present, Eliezer cannot functionally describe what ‘Friendliness’ would actually entail. It is likely that any outcome he views as undesirable (including, presumably, his own murder) would be claimed to be impermissible for a Friendly AI. Imagine if Asimov not only lacked the ability to specify *how* the Laws of Robotics were to be implanted in robots, but couldn’t specify what those Laws were supposed to be. You would essentially have Eliezer. Asimov specified his Laws well enough for himself and others to analyze them critically and examine their consequences, strengths, and weaknesses. ‘Friendly AI’ is not so specified and cannot be analyzed.
No one can find problems with the concept because it’s not substantive enough – it is essentially nothing but one huge, undefined problem.
The last sentence — the one at which Eliezer took particular offense — concisely sums up the reality of Eliezer’s ‘work’ on Artificial Intelligence.
Over the next few days, I’m going to demonstrate why. If you’d like to argue the point, feel free — I’m not afraid of criticism. No comment addressing the subject of this thread will be deleted, regardless of its content, as long as it can legally be displayed to minors in the United States. I reserve the right to delete content incompatible with filtering programs. Everything else is fair game.