Running Away

Does the concept of Friendly AI somehow drain one’s enthusiasm? Does it cause motivation to dwindle? Does it drive people mad?

Or is it just that no progress has been made on even the most basic foundational aspects of FAI, and discussing the subject makes that painfully obvious?

Breaking down a complex problem into simpler ones is a good and useful step, but Eliezer is right in saying that trying to replace a complex problem with a simple one is an error. You may be able to switch from ‘failure’ to ‘success’ that way, but it’s not a useful success: it’s not what you needed to succeed at, and so it’s an entirely empty achievement.

But replacing a simple problem with a complex one, in order to avoid having to face the fact of your failure to deal with the simple problem, is the same sort of mistake and delusion, to a much greater degree.

Additional note: in this post, Eliezer turns away from rationality and towards sensation in his moral arguments. If you don’t have mirror neuron activation, and you see someone else being hurt, what should you do?

If you’re a rational agent, precisely the same thing you’d do if you did have mirror neurons.


4 Responses to “Running Away”

  1. Mhm. Morality has nothing to do with mirror neurons. Incidentally, I doubt if Eliezer could rigorously justify the prohibition on slavery, given that he is a vile utilitarian. I vomit a little every time he mentions his clever little Specks vs. Torture scenario.

  2. “Morality has nothing to do with mirror neurons.”

    Well, it does, and it doesn’t.

    Rationality, however, doesn’t.

  3. Maybe rather than an absolute prohibition on slavery there would be a general presumption against it. Eliezer’s post on ends justifying means, and on following clear moral rules, could suggest something like that.

  4. Slavery can easily be discounted from a utilitarian perspective. The enormous suffering of slavery serves only to indulge the decadence of a relative few. Meanwhile, abolishing slavery does not remove the jobs or the people willing to pay for them; it only forces people to implement fairness, which can bring more efficient valuation of goods and services to the economy, helping everyone out in the long run.

    The problem with Eliezer’s moral views is that they are not utilitarian (or some form thereof, like a reverse utilitarianism); in fact they involve a good deal of non-physicalism and a lack of definition.

    But P got it right when he called Eliezer’s dust-speck example “clever”. There’s nothing to that argument other than its being highly counter-intuitive, and therefore impressive – if it were right.
