Jackdaws love my big sphinx of quartz
Scott

Stuff [Jul. 4th, 2012|05:57 pm]

The atheist blog Camels With Hammers has been pretty vocal lately about their theory of objective value. The theory is well-written and well-argued but also makes a sort of Grand Tour Of Philosophical Concepts Scott Finds Sketchy. It deserves a comprehensive treatment which I might or might not have time to give, but part of the setting-up of that treatment would be to reiterate: please don't try to relate moral value to "complexity".

(I'm going to deliberately confuse morality and axiology in this post for a few reasons: to keep things readable to people who don't know the distinction, to keep things writeable for me, and because the moral systems I'm talking about are very closely related to their axiologies)

I see why it's tempting to do this. It's the same reason it's tempting to relate value to "happiness" (hello, Jeremy Bentham!) Most of the things we find morally valuable seem kind of related to happiness in some vague way, and most of the things we dislike seem kind of related to unhappiness in some vague way. Robbing people? Makes them unhappy. Cheating on your spouse? Makes the spouse unhappy. Giving money to the poor? Makes them happy. Creating a well-ordered and peaceful society? Makes everybody happy.

But of course when you look at this closely it doesn't pan out. Tying someone down and injecting them with a super-powerful form of opium for their entire life would make them really happy, but it's certainly not moral if the victim doesn't consent and it's arguably not moral even if they do. If we hacked everybody to enjoy very simple works of art and literature, and then destroyed all existing complex works of art and literature as no longer necessary, everyone would be happy but we would feel like value had been lost.

I've seen some people try to save the identity of value and happiness. They say that true happiness is the kind you get from changing the world for the better - not the kind you get from super-powerful opiates. And that consuming great art necessarily produces greater true happiness than simple art, even if you've hacked people's brains to react otherwise.

But once you start saying these things, you gradually turn the concept of "happiness" into something exactly as complex as the concept "moral value" you were hoping to explain. Reductionism delights in being able to explain complicated ideas via simpler ideas. If you don't know what moral value is, I can say "Well, you know being happy? Of course you do! Moral value is just a fancy word for some complicated results of that" and now you've learned something new.

But if we've got to go with this concept of "true happiness", then it doesn't explain anything. If you don't know what moral value is, I say "Well, you know true happiness, a concept completely different from the regular happiness you experience every day? No? Well, um...". And then my problem of explaining true happiness is exactly the same problem I had when I just had to explain "moral value".

This is really hard to notice in philosophy, because we're almost never talking to someone who genuinely doesn't understand moral value. We're talking to a philosopher who understands it exactly as well as we do, but has to pretend not to understand it in order to see whether a certain definition is convincing. So if our interlocutor isn't very good at playing make-believe, and we say something that points to the concept of morality while assuming she already understands it, she may say "Oh, yeah, that thing! I recognize that!" even though we haven't given a definition from which someone who didn't already know the concept could derive it.

This is an especial curse of people who think they're deriving an objective morality. More often than not, what they're doing is sticking all of their pre-existing moral intuitions into a black box labelled "true happiness", drawing them right back out again, and exclaiming "Look! I discovered morality! All along it was right inside this here box!"

So let's go back to complexity. Like happiness, a lot of our value judgments seem to involve complexity. For example, a human seems more valuable than an ant, and humans are more complex than ants. A live person seems more complex than a dead person, and we think killing is wrong. The Large Hadron Collider is more complex than a paperclip, and if we had to choose which one we thought was more valuable, which one was a greater triumph of the human spirit and a representative of all that is best about our species, we'd choose the LHC.

Unfortunately, complexity is really hard to define. There are simple and precise mathematical descriptions of complexity, but they don't match our intuitive ideas at all. For example, a human-sized hard drive full of random bits (0s and 1s arranged randomly) is more complex than a human by most mathematical definitions of complexity.
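
A rough illustration of the mismatch, using compressed size as a crude stand-in for Kolmogorov complexity (a sketch only; the real quantity is uncomputable): random bytes barely compress at all, while highly patterned data compresses enormously, so by this yardstick the random data counts as "more complex".

```python
# Compressed size as a crude proxy for Kolmogorov complexity.
# Random bits are essentially incompressible; repetitive data is not.
import os
import zlib

random_bytes = os.urandom(100_000)   # the "hard drive full of random bits"
structured = b"ATCG" * 25_000        # a toy stand-in for highly patterned data

print(len(zlib.compress(random_bytes)))  # ~100,000 bytes: no real compression
print(len(zlib.compress(structured)))    # a few hundred bytes: hugely compressible
```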

Trying to come up with a non-mathematical definition of complexity is also harder than it sounds. Some people say humans are very complex because we have 20,000 genes, but rice has 32,000 genes. If we go by total genome information content in megabytes rather than raw gene number, we cruise to an easy victory over rice plants but get beaten by a rather silly-looking species of lungfish.

Other people might say that, although our genome is smaller, we can do more with it. But "more" is a relative term, and the lungfish might be skeptical of applying it to a species that can't even breathe underwater (they themselves are quite happy both in water and on land). We can't lay eggs, we're horrible swimmers and horrible at digging burrows, and we can't even place ourselves in suspended animation during the dry season.

It's tempting to say "But we humans have art and music and totally just discovered the Higgs Boson! Surely that's more complex than whatever tricks a lungfish can do!" I agree. Based on my moral values, art and music and boson-discovery are more complex, more interesting, better than the ability to dig burrows. But here I'm using an idea of "true complexity" which is much like the idea of "true happiness" - it takes my values as a given, then spits them back at me. The way I know art and music are more complex than amphibiousness is because of my value system; I can't then go back and define my value system in terms of complexity.

One can imagine a bunch of ways to rescue this argument. Humans have a greater ability to adapt to different situations (except that there are bacteria that can adapt to temperatures from freezing to boiling and even the void of space). Humans have the largest, most convoluted, most metabolically active, most synaptically connected brains (false; whales beat humans in all of these measures). The human brain has the most interacting parts of any object in the universe (false; the atmosphere has more interacting parts if you count air molecules, and good luck coming up with a non-value-based reason not to). Well, humans are just plain smarter than everyone else! (Well, giraffes just plain have the longest necks; why are you privileging intelligence other than that you have it and like it?)

I'm not saying that humans don't have more moral value than lungfish. I'm just saying this doesn't automatically fall out of some reasonable definition for "complexity", unless your definition for complexity is itself so complex that you selected it from zillions of equally plausible definitions for "complexity" solely in order to justify your original moral intuitions. In which case save us all some trouble and just keep your moral intuitions without justifying them.

And now that we've given some empirical counterexamples, let's discuss what's wrong with "complexity" on a more fundamental level.

I'm not a computer programmer, but by a strange coincidence everyone I know and talk to is. They tend to bemoan companies that pay their programmers per line of code, saying that this encourages people to write bloated and inefficient algorithms. One can understand the temptation - simple one-line programs like "Hello World" have very limited functionality, and the programs that do awesome things like launch the space shuttle are probably very long and complicated. But enshrine this relationship between program length and program usefulness in law, and you end up with a programmer writing a five-hundred line version of "Hello World" which runs like molasses just so he can get paid more.
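
A toy sketch of the incentive problem (mine, not the programmers' I know): both functions below do exactly the same thing, but a pay-per-line rule rewards the padded one.

```python
# Two programs with identical behaviour; under a pay-per-line rule the
# second is "worth" many times more than the first.

def hello_short():
    print("Hello, world!")

def hello_padded():
    # Imagine five hundred lines in this style...
    message_parts = []
    for character in "Hello, world!":
        message_parts.append(character)
    assembled_message = "".join(message_parts)
    print(assembled_message)

hello_short()
hello_padded()
```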

Enshrining complexity as the arbiter of value has some of the same risks as paying programmers by the line. We do assign higher value to the LHC than to a paperclip, but not because the LHC is more complex. We assign it higher value because it performs a more difficult and interesting task. If we were to make a device as complex as the LHC to hold two sheets of paper together, that wouldn't be super-valuable. It would just be stupid. If we were to come up with a way to discover the Higgs Boson with a paperclip, we wouldn't mourn the loss of complexity, we would celebrate the increase in efficiency. And if someone (presumably Rube Goldberg) made a paperclip with even more interlocking parts and complex machinery than the human brain, we wouldn't worship it as our new God, we'd laugh at him for wasting our time.

(Someone might say "Rube Goldberg machines aren't the kind of complexity I meant! I meant to judge the complexity of the machine by its output!" But an 800 x 600 monitor in which each pixel can display 256 different colors can produce...uh, something like 256^480000 different outputs. Are you confident you can produce that many distinguishable behaviors?)
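
For what it's worth, that back-of-the-envelope number checks out; a quick sketch:

```python
# 800 x 600 pixels with 256 colors each gives 256**(800*600) possible frames.
import math

pixels = 800 * 600                      # 480,000 pixels
digits = pixels * math.log10(256)       # decimal digits of 256**480000
print(f"256**{pixels} is roughly 10**{digits:,.0f}")  # about 10**1,155,955
```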

Our values are complex. They are small targets. Therefore, we often need complex things to achieve them. And many of the things we like, such as other people, are also complex. But we do not value complexity itself, or at least that's not the only thing we value. If we try to act like it is, we just end up redefining "complexity" so thoroughly that we import all our moral values into it.

Our real moral values are the ones that tell us that any definition of complexity that places a lungfish above a human is one we can reject out of hand. Even if there's some extremely specific and wildly over-fit definition of complexity that corresponds to our values, our real moral values are the ones that tell us to choose that definition, rather than all the others.

Camels With Hammers is not just saying morality is identical to complexity. Like I said, they're a Grand Tour of every philosophical argument I'm uncomfortable with, and a single stop does not a Grand Tour make. But I know some people who do try to ground morality in complexity, and I know a lot of other arguments that focus around this same idea of "identify morality with a vague concept and hope everyone just knows what you mean."

Comments:
From: sniffnoy
2012-07-04 10:23 pm (UTC)

While I agree that complexity (or any such similar thing) is not the answer, I feel like I should probably point out this blog entry of Scott Aaronson's in which he tries to get at just what we mean by "complexity" (not in a moral sense, but also not in an entropy sense or a Kolmogorov sense either).
From: squid314
2012-07-04 10:51 pm (UTC)

I understood about 50% of that, but not nearly enough to have the slightest intuition about whether we could apply it to the human/lungfish problem or what would happen if we did.
From: George Koleszarik
2012-07-05 06:58 am (UTC)

Off-topic: this didn't show up in my RSS feed and I had to see it on Luke Muelhauser's Twitter account and what is going on

On-topic: My thoughts on this subject (http://lesswrong.com/r/discussion/lw/bio/complexity_based_moral_values/69j4). Also I agree completely except for a few trivial caveats that I haven't the spoons to elucidate.
From: squid314
2012-07-05 05:07 pm (UTC)

I'd forgotten that thread...and the title gave me a scare for a second that this was repeating something that'd already been said.

You're Grognor, are you? I'd totally failed to make that connection before. What does the name "Grognor" mean, anyway?
From: George Koleszarik
2012-07-05 07:30 pm (UTC)

I completely forget how I actually came up with it, but my current retcon is that I had a dream about pirates in which the captain yelled the line, "Not a drop of neither grog nor whiskey's gonna fall overboard today, mates!"
From: ciphergoth
2012-07-05 07:37 am (UTC)

Agreed, except that I think you do the efforts in Camels with Hammers way too much credit; the arguments you knock down are better than the ones they set up.
From: squid314
2012-07-05 05:09 pm (UTC)

At least he doesn't just say it's all complexity, problem solved. And the guy's a philosophy Ph.D., so I tend to err on the side of interpreting what he says charitably, and after reading more of his posts I have slightly more positive feelings for the system than after reading only one or two.
From: naath
2012-07-05 09:22 am (UTC)

When it comes to lung-fish vs humans I think we are wired to think that "things that are more like me" are better than "things that are less like me" so naturally we prefer humans to lung-fish. Personally I think this whole notion is ludicrous and results in a lot of suffering (especially when applied to different groups of humans rather than lung-fish); I don't see why I shouldn't put the same value on a lung-fish as I do on a human.

With machines I guess value is a more complex problem... some combination of a ranking of tasks by difficulty/necessity (is it "better" to cure cancer or find the Higgs Boson?) and a ranking of machines/programs by complexity, where the "best" thing is the least complex machine that manages the hardest/most-needed task.

Of course there is the question of whether in "defining morality" we are attempting a description of the morality people exhibit (and trying to systematise it) or a prescription for correct moral action (even where that goes against our instincts).

From: squid314
2012-07-05 05:11 pm (UTC)

I think humans are better than lungfish because we're smarter (interpreting "smart" not just as doing well on IQ tests but as the whole use-language/build-tools/make-art thing). I don't know if that's a "true smartness" fallacy equivalent to the "true happiness" and "true complexity" fallacies, but I also don't think it's just "things that are more like us". I feel like something that was even "smarter" than us would deserve more moral worth.
From: (Anonymous)
2012-08-01 06:16 pm (UTC)

Even if the "smarter" agent was Clippy?
From: platypuslord
2012-07-06 06:05 am (UTC)

"I've seen some people try to save the identity of value and happiness. They say that true happiness is the kind you get from changing the world for the better - not the kind you get from super-powerful opiates."

Hmf. I say that true happiness is the kind you get according to your current utility function, not according to your hypothetical future utility function after you've altered it.

-- Like, let's suppose we've written a paperclip-maximizing AI, and that this AI can alter its own programming. The AI is considering whether to wirehead itself. The AI asks: "How many paperclips will be created in the future where I wirehead myself, versus how many paperclips will be created if I don't?". Of course it reacts to the wireheaded future in horror (so few paperclips!) and decides not to do any wireheading.

I think most humans have essentially the same response when presented with super-powerful opiates.

Under this definition, we still allow types of happiness such as "watch a sunset" and "hold a purring kitten" which don't really change the world for the better; but we don't allow happiness such as "alter my brain to enjoy X and then do X".
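
A toy sketch of the decision rule described above, with made-up names and numbers (an illustration, not anyone's actual code): the agent scores every candidate future with its current utility function, so the wireheaded future loses even though the post-wirehead self would endorse it.

```python
# Hypothetical futures, described by paperclip count and wirehead status.
futures = {
    "keep_maximizing": {"paperclips": 1_000_000, "wireheaded": False},
    "wirehead_now":    {"paperclips": 10,        "wireheaded": True},
}

def current_utility(future):
    # The *current* utility function only counts paperclips; the wireheaded
    # self's reported bliss is worth nothing to the agent deciding now.
    return future["paperclips"]

choice = max(futures, key=lambda name: current_utility(futures[name]))
print(choice)  # -> "keep_maximizing"
```
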
From: amuchmoreexotic
2012-07-06 02:38 pm (UTC)

I don't think it's that difficult to specify in what way humans are 'more complex' than other species. Gene counts are certainly the wrong measure to pick, especially given the latest findings of epigenetics showing that a lot of complexity of gene expression patterns isn't actually encoded in the DNA sequence itself.

Humans have a much wider range of behavioural flexibility and between-individual bandwidth due to their specialised brains - using the more appropriate encephalization quotient rather than absolute brain size, the human is a clear outlier - and larynxes. Being able to build on the ingenuity of others allows us to replicate the feats of the lungfish - I *can* breathe underwater using SCUBA gear, even though I could never have invented it or made it from scratch, but I can leverage the ingenuity of my fellow conspecifics, and human "burrows" like the Channel Tunnel beat lungfish burrows hands down. "Man is the time-binding animal".

I'm also not sure that the relationship between "moral values" and complexity is that mysterious, either. The instinctive 'moral values' to which you constantly refer (it's "certainly not moral if the victim doesn't consent" to drug someone into happiness, "our real moral values are the ones that tell us to choose that definition") arise from social instincts which our species has evolved to allow us to co-operate in tribes of a hundred or so individuals.

Drugging someone might upset their relatives and cause turmoil in the tribe, so we instinctively recoil. Flipping the trolley switch to save ten lives will make you a hero and get you laid - pushing a fat guy onto the tracks to save ten lives might subject you to reprisal from the fat guy's relatives, so we instinctively recoil.

Where complexity comes in is that only a complex brain designed to track complex social lives and punish subtle cheating could have these intuitions. The intuitions are a set of hacky rules of thumb. 'Values' don't arise from a coherent system of thought - there's that classic experiment where people who can't solve a maths problem can instantly get it if it's framed as working out who to exclude from a bar for breaking rules.

People are more concerned with humans than ants because humans are their conspecifics - complexity is beside the point. If someone had to choose between saving a human and a petabyte hard drive full of random bits, they would choose the human because that's a conspecific, not because of complexity.

Human intuitions, adapted for small groups, break down on large scales, which is how we somehow accept vast income inequality, and Westerners are happy to buy luxuries for themselves rather than save distant, dying children, even though they'd be horrified if those same children were on their doorsteps, and so on.

Putting aside my mammalian inheritance, I can't believe that there is such a thing as "objective value". Nothing exists per se except atoms and the void. I guess I am an expressivist moral nihilist.
From: eyelessgame
2012-07-06 03:28 pm (UTC)

"For example, a human seems more valuable than an ant, and humans are more complex than ants."

Yeah, but if an alien race visited us, and the aliens were demonstrably more complex than humans, we wouldn't consider them more valuable. Which is just agreeing that the simple concept isn't good enough. Complexity is like happiness - it is among the things that humans value. But the reason we humans think humans are more valuable than ants is that we're humans and not ants.

I agree you really can't get too reductionist about this. Value is, to be tautological, that which people want. Among the things we want is to be happy. And while one can argue that everything we choose and want and do ultimately gives us some sort of satisfaction in some way... well, yes, but why do those things give us satisfaction? Do we choose it? Is it chosen for us? Can it change? There are humans. Humans want things, here and now. Obviously this is different from what humans wanted at other times and places.

From: eyelessgame
2012-07-06 03:33 pm (UTC)

Wow, it's very easy to meander when trying to discuss philosophy, especially if one is (like me) not very good at it. I meandered a lot in only two paragraphs.

Which is to say, your essay was clearly hard to write, and required skill and effort. And that doesn't intrinsically mean it has value, but I like and admire it. :)
From: squid314
2012-07-07 04:44 am (UTC)

I think I might consider that alien race more valuable. See 7.6 on http://raikoth.net/consequentialism.html