Archive for Eliezer Yudkowsky

Dresden Codak vs. Eliezer Yudkowsky?

Posted in Comics on April, 2011 by melendwyr

See the latest update of Dresden Codak (#9 of the Dark Science series). Does anything about it seem… familiar… to you?

As per request, a link has been added to the comic in question.


Recognizing the Obvious

Posted in Uncategorized on February, 2009 by melendwyr

Well, it’s not an issue of Firefox interacting poorly with a CAPTCHA, as has happened to me before.

LessWrong refuses to register either of the handles Eliezer knows me by. Any handles he’s not familiar with are immediately registered.

I always thought the story of The Emperor’s New Clothes ended unrealistically. In reality, the child would be committed to an asylum and drugged until he wouldn’t know a hawk from a handsaw no matter what direction the wind came from.

We know what the Emperor had to fear – what’s Eliezer so frightened of?

A Deep And Dark December

Posted in Doom, GIGO, Politics and Society on February, 2009 by melendwyr

I don’t feel like coming up with an original post today, so I’ll just piggyback on these Overcoming Bias threads.

What are Robin and Eliezer talking about? Even they don’t seem to have a clear idea, so we’ll have to figure it out for ourselves. To the dictionary!

Cynicism:

An attitude of scornful or jaded negativity, especially a general distrust of the integrity or professed motives of others.
The attitude or beliefs of a cynic.
The holding or expressing of opinions that reveal disbelief and sometimes disdain for commonly held human values and virtues.

Also, a Greek philosophy of the 4th century B.C. advocating the doctrines that virtue is the only good, that the essence of virtue is self-control and individual freedom, and that surrender to any external influence is beneath the dignity of man.

Okey dokey. What about idealism?

Idealism:

The act or practice of envisioning things in an ideal form.
Pursuit of one’s ideals.
Idealized treatment of a subject in literature or art.
The theory that the object of external perception, in itself or as perceived, consists of ideas.
The tendency to represent things in their ideal forms, rather than as they are.

The penultimate definition there is specific to philosophy; incidentally, it’s not only useless, but silly. (What a surprise!)

None of the usage-meanings of ‘idealism’ is inherently incompatible with those of ‘cynicism’; they become incompatible only when certain variables are given particular values. So the two have the potential for conflict, but nothing more than that.

Why, then, do R & E take such a conflict for granted, when as far as I can tell they haven’t established that we should adopt one of the readings under which conflict is inevitable?

But I suppose assuming points they haven’t demonstrated is everyday practice for them.

Running Away

Posted in GIGO on February, 2009 by melendwyr

Does the concept of Friendly AI somehow drain one’s enthusiasm? Does it cause motivation to dwindle? Does it drive people mad?

Or is it just that no progress has been made on even the most basic foundational aspects of FAI, and discussing the subject makes that painfully obvious?

Breaking down a complex problem into basic ones is a good and useful step, but Eliezer is right in saying that trying to replace a complex problem with a basic one is an error. You may be able to switch from ‘failure’ to ‘success’ that way, but it isn’t a useful success; it isn’t the thing you needed to succeed at, and so it’s an entirely empty achievement.

But replacing a simple problem with a complex one, in order to avoid facing the fact that you have failed to deal with the simple problem, is the same sort of mistake and delusion, at a much greater level.

Additional note: in this post, Eliezer turns away from rationality and towards sensation in his moral arguments. If you don’t have mirror neuron activation, and you see someone else being hurt, what should you do?

If you’re a rational agent, precisely the same thing you’d do if you did have mirror neurons.

Archival of Dissent

Posted in Uncategorized on February, 2009 by melendwyr

A poster named Faré, with whom I was previously unfamiliar, has made a series of interesting posts on Overcoming Bias.

To prevent their loss in case of deletion, I reproduce them below. Note: I have not been explicitly given permission to do so by Faré; the messages appear exactly as they did originally, with the exception of an introduced link, and have not been altered in any way. I hope this will not give offense; if it does, I will remove the content from this site.

I wonder how long it will be before Eliezer bans this poster and deletes their thoughts.

Meanwhile, the Arabs and the Jews, communicating through the exclusive channel of the Great Khalif O. bin Laden negotiating through Internet Sex with Tzipi Livni, arrived at this compromise whereby the Jews would all worship Mohammed on Fridays, at which times they will explode a few of their children in buses, whereas the Arabs would ratiocinate psychotically around their scriptures on Saturdays, and spawn at least one Nobel prize winner in Medecine and Physics every five years.

Of course, meeting two new species on the same day is the crew of the Impossible having its leg pulled by some superior entity, namely Eliezer. But Eliezer is not above and outside *our* world, and we don’t have to let ourselves be intimidated by his scripture.

Why and how would communication possibly happen through only one channel? Since when is the unit of decision-making a race, species, nation, etc., rather than an individual? Is this Market-driven spaceship under totalitarian control where no one is allowed to communicate, and the whole crew too brain-damaged to work-around the interdiction? I wonder how the Soviet Union made it to the Interstellar Age. Where has your alleged individualism gone?

Why and how is compromise even possible between two species, much less desirable? In the encounter of several species, the most efficient one will soon hoard all resources and leave the least efficient ones but as defanged zoo animals, at which point little do their opinions and decisions matter. No compromise. The only question is, who’s on top. Dear Tigers, will you reform yourself? Can we negotiate? Let your Great Leader meet ours and discuss man to animal around a meal.

And of course, in your fantasy, the rationalist from way back when (EY) effectively wields the ultimate power on the ship, yet is not corrupted by power. What a wonderful saint! Makes you wonder what kind of wimps the rest of mankind has degenerated into to submit to THAT wimpy overlord. Where has gone your understanding of Evolutionary Forces?

Wanna see incredibly intelligent people wasting time on absurd meaningless questions? Come here to Overcoming Bias! A stupid person will believe in any old junk, but it takes someone very intelligent to specifically believe in such elaborate nonsense.

And while I’m at it — confusing pleasure and happiness is particularly dumb. Entities that would do that would be wiped from existence in a handful of generations, and not super-powerful. Habituation is how we keep functioning at the margin, where the effort is needed. The whole idea of a moral duty to minimize other people’s pain is ridiculous, yet taken for granted in this whole story. Eliezer, you obviously are still under the influence of the judeo-christian superstitions you learned as a child.

If you’re looking for an abstract value to maximize, well, it’s time to shut up and eat your food. http://sifter.org/~simon/journal/20090103.h.html

Pain and pleasure are *signals* that we are on the wrong or right path. There’s a point in making it a better signal. But the following propositions are wholly absurd:
* to eliminate pain itself (i.e. no more signal)
* to bias the system to have either more or less pain in the average (i.e. bias the signal so it carries less than 1 bit of information per bit of code).
* to forcefully arrange for others to never possibly have pain in their own name (i.e. disconnecting them from reality, denying their moral agency — and/or obey their every whims until reality strikes back despite your shielding).
* to feel responsible for other people’s pain (i.e. deny the fact that they are their own moral agents).

As for promising a world of equal happiness for all, shameless self-quote:
“Life is the worst of all social inequalities. To suppress inequalities, one must either resurrect all the dead people (and give life to all the potential living people), or exterminate all the actually living. Egalitarians, since they cannot further their goal by the former method, inevitably come to further it by the latter method.”

A rational individual has no reason to care for the suffering of alien entities, or even other human entities, except inasmuch as it affects his own survival, enjoyment, control of resources.

25% suicide rate? Over something completely abstract that they haven’t felt yet?

You didn’t tell us about humans having been overcome by some weird Death Cult.

But, now it makes sense why they would give power to the Confessor.

Obviously, in this fantasy of would-be-immortal 21st century abstract thinker, your immortal 21st century abstract thinkers are worshipped as gods. And unhappily, they were told too much about Masada and other Kool-Aid when they were young.

There comes your judeo-christian upbringing again, in addition to the intellectual masturbation.

Eliezer — get a life! The worst thing that ever happened to your intelligence was to be disconnected from reality by too early success.

Transparency

Posted in Doom on January, 2009 by melendwyr

There is a comment, held in suspension, for the post “The Other Shoe”.

I reference the principle of Lex Talionis. I am not obligated to respect the speech of those who do not respect my own.

The Other Shoe

Posted in Doom on January, 2009 by melendwyr

From this comment thread at OB:

[bored now. comment deleted.]

Comments under that name are no longer permitted. The transition is now complete.

For the record, O Moderately Beloved Readers: has Eliezer yet demonstrated that recursive self-improvement is possible, let alone likely? Has he shown that ‘Friendly AI’ is coherent, let alone desirable? Has he shown that the only alternative to FAI is a nightmare or a null? Has he made any objective progress towards his claimed goals? Has he been able to objectively demonstrate the validity of his assertions about rationality?

You’ll have to ask those questions yourselves, now.

[edited to add] Eliezer’s replacement text is probably a reference to Buffy the Vampire Slayer.

Specifically, when Willow is consumed by rage and despair after the shooting of her lover, she tracks down the people responsible. After capturing one, she toys with him for a time, then says “Bored now”, and kills him by magically ripping off his skin.