Jackdaws love my big sphinx of quartz

Stuff [Jul. 4th, 2012|05:57 pm]

The atheist blog Camels With Hammers has been pretty vocal lately about their theory of objective value. The theory is well-written and well-argued but also makes a sort of Grand Tour Of Philosophical Concepts Scott Finds Sketchy. It deserves a comprehensive treatment, which I might or might not have time to give, but part of setting up that treatment would be to reiterate: please don't try to relate moral value to "complexity".

(I'm going to deliberately confuse morality and axiology in this post for a few reasons: to keep things readable for people who don't know the distinction, to keep things writable for me, and because the moral systems I'm talking about are very closely related to their axiologies.)

I see why it's tempting to do this. It's the same reason it's tempting to relate value to "happiness" (hello, Jeremy Bentham!). Most of the things we find morally valuable seem kind of related to happiness in some vague way, and most of the things we dislike seem kind of related to unhappiness in some vague way. Robbing people? Makes them unhappy. Cheating on your spouse? Makes the spouse unhappy. Giving money to the poor? Makes them happy. Creating a well-ordered and peaceful society? Makes everybody happy.

But of course when you look at this closely it doesn't pan out. Tying someone down and injecting them with a super-powerful form of opium for their entire life would make them really happy, but it's certainly not moral if the victim doesn't consent and it's arguably not moral even if they do. If we hacked everybody to enjoy very simple works of art and literature, and then destroyed all existing complex works of art and literature as no longer necessary, everyone would be happy but we would feel like value had been lost.

I've seen some people try to save the identification of value with happiness. They say that true happiness is the kind you get from changing the world for the better - not the kind you get from super-powerful opiates. And that consuming great art necessarily produces greater true happiness than simple art, even if you've hacked people's brains to react otherwise.

But once you start saying these things, you gradually turn the concept of "happiness" into something exactly as complex as the concept "moral value" you were hoping to explain. Reductionism delights in being able to explain complicated ideas via simpler ideas. If you don't know what moral value is, I can say "Well, you know being happy? Of course you do! Moral value is just a fancy word for some complicated results of that" and now you've learned something new.

But if we've got to go with this concept of "true happiness", then it doesn't explain anything. If you don't know what moral value is, I say "Well, you know true happiness, a concept completely different from the regular happiness you experience every day? No? Well, um...". And then my problem of explaining true happiness is exactly the same problem I had when I just had to explain "moral value".

This is really hard to notice in philosophy, because we're almost never talking to someone who genuinely doesn't understand moral value. We're talking to a philosopher who understands it exactly as well as we do, but has to pretend not to in order to test whether a certain definition is convincing. So if our interlocutor isn't very good at playing make-believe, and we say something that merely points at the concept of morality, presupposing she already understands it, she may say "Oh, yeah, that thing! I recognize that!" even though we haven't given a definition from which someone who didn't already know the concept could derive it.

This is an especial curse of people who think they're deriving an objective morality. More often than not, what they're doing is sticking all of their pre-existing moral intuitions into a black box labelled "true happiness", drawing them right back out again, and exclaiming "Look! I discovered morality! All along it was right inside this here box!"

So let's go back to complexity. Like happiness, a lot of our value judgments seem to involve complexity. For example, a human seems more valuable than an ant, and humans are more complex than ants. A live person seems more complex than a dead person, and we think killing is wrong. The Large Hadron Collider is more complex than a paperclip, and if we had to choose which one we thought was more valuable, which one was a greater triumph of the human spirit and a representative of all that is best about our species, we'd choose the LHC.

Unfortunately, complexity is really hard to define. There are simple and precise mathematical descriptions of complexity, but they don't match our intuitive ideas at all. For example, a human-sized hard drive full of random bits (0s and 1s arranged randomly) is more complex than a human by most mathematical definitions of complexity.
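For the programmers in the audience, here's one way to make "mathematical definitions of complexity" concrete. The standard formalization is Kolmogorov complexity: the length of the shortest program that reproduces a string. Random data is maximally complex under it, because there's no pattern to compress away. A minimal Python sketch, using compressed size as a crude, computable stand-in (Kolmogorov complexity itself is uncomputable):

    # Compressed size as a rough proxy for Kolmogorov complexity:
    # random noise is incompressible, so it scores as MORE "complex"
    # than highly structured data of the same length.
    import os
    import zlib

    def compressed_size(data: bytes) -> int:
        """Length of the zlib-compressed data, a crude complexity proxy."""
        return len(zlib.compress(data, 9))

    structured = b"ATCG" * 250_000   # a megabyte of repetitive "genome"
    noise = os.urandom(1_000_000)    # a megabyte of random bits

    print(compressed_size(structured))  # a few kilobytes: the pattern compresses away
    print(compressed_size(noise))       # ~1,000,000 bytes: random data doesn't compress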

Trying to come up with a non-mathematical definition of complexity is also harder than it sounds. Some people say humans are very complex because we have about 20,000 genes, but rice has about 32,000. If we go by total genome information content in megabytes rather than raw gene count, we cruise to an easy victory over rice plants but get beaten by a rather silly-looking species of lungfish.
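(Here's the back-of-the-envelope arithmetic behind that, in Python, assuming 2 bits per base pair; the genome lengths are approximate published figures and vary a bit by source:)

    # Genome information content in megabytes, at 2 bits per base pair
    # (4 possible nucleotides). Lengths are approximate; the lungfish is
    # the marbled lungfish, often cited as the largest animal genome.
    BITS_PER_BASE = 2

    def genome_megabytes(base_pairs: float) -> float:
        return base_pairs * BITS_PER_BASE / 8 / 1_000_000

    print(round(genome_megabytes(3.1e9)))   # human: ~775 MB
    print(round(genome_megabytes(4.3e8)))   # rice: ~108 MB
    print(round(genome_megabytes(1.3e11)))  # marbled lungfish: ~32,500 MB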

Other people might say that, although our genome is smaller, we can do more with it. But "more" is a relative term, and the lungfish might be skeptical of applying it to a species that can't even breathe underwater (they themselves are quite happy both in water and on land). We can't lay eggs, we're horrible swimmers and horrible at digging burrows, and we can't even place ourselves in suspended animation during the dry season.

It's tempting to say "But we humans have art and music and totally just discovered the Higgs Boson! Surely that's more complex than whatever tricks a lungfish can do!" I agree. Based on my moral values, art and music and boson-discovery are more complex, more interesting, better than the ability to dig burrows. But here I'm using an idea of "true complexity" which is much like the idea of "true happiness" - it takes my values as a given, then spits them back at me. The way I know art and music are more complex than amphibiousness is because of my value system; I can't then go back and define my value system in terms of complexity.

One can imagine a bunch of ways to rescue this argument. Humans have a greater ability to adapt to different situations (except that there are bacteria that can adapt to temperatures from freezing to boiling and even the void of space). Humans have the largest, most convoluted, most metabolically active, most synaptically connected brains (false; whales beat humans in all of these measures). The human brain has the most interacting parts of any object in the universe (false; the atmosphere has more interacting parts if you count air molecules, and good luck coming up with a non-value-based reason not to). Well, humans are just plain smarter than everyone else! (Well, giraffes just plain have the longest necks; why are you privileging intelligence other than that you have it and like it?)

I'm not saying that humans don't have more moral value than lungfish. I'm just saying this doesn't automatically fall out of some reasonable definition for "complexity", unless your definition for complexity is itself so complex that you selected it from zillions of equally plausible definitions for "complexity" solely in order to justify your original moral intuitions. In which case save us all some trouble and just keep your moral intuitions without justifying them.

And now that we've given some empirical counterexamples, let's discuss what's wrong with "complexity" on a more fundamental level.

I'm not a computer programmer, but by a strange coincidence everyone I know and talk to is. They tend to bemoan companies that pay their programmers per line of code, saying that this encourages people to write bloated and inefficient algorithms. One can understand the temptation - simple one-line programs like "Hello World" have very limited functionality, and the programs that do awesome things like launch the space shuttle are probably very long and complicated. But enshrine this relationship between program length and program usefulness in law, and you end up with a programmer writing a five-hundred line version of "Hello World" which runs like molasses just so he can get paid more.
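To belabor the point with a deliberately silly Python sketch: the two functions below do exactly the same thing, but under pay-per-line the second is an order of magnitude more "valuable":

    # Two programs with identical behavior. One line of useful work,
    # or a dozen lines of billable padding.

    def hello_world():
        print("Hello World")

    def hello_world_paid_by_the_line():
        # Laboriously assemble the greeting one character at a time.
        characters = []
        characters.append("H")
        characters.append("e")
        characters.append("l")
        characters.append("l")
        characters.append("o")
        characters.append(" ")
        characters.append("W")
        characters.append("o")
        characters.append("r")
        characters.append("l")
        characters.append("d")
        print("".join(characters))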

Enshrining complexity as the arbiter of value has some of the same risks as paying programmers by the line. We do assign higher value to the LHC than to a paperclip, but not because the LHC is more complex. We assign it higher value because it performs a more difficult and interesting task. If we were to make a device as complex as the LHC to hold two sheets of paper together, that wouldn't be super-valuable. It would just be stupid. If we were to come up with a way to discover the Higgs Boson with a paperclip, we wouldn't mourn the loss of complexity, we would celebrate the increase in efficiency. And if someone (presumably Rube Goldberg) made a paperclip with even more interlocking parts and complex machinery than the human brain, we wouldn't worship it as our new God, we'd laugh at him for wasting our time.

(Someone might say "Rube Goldberg machines aren't the kind of complexity I meant! I meant to measure the complexity of the machine by its output!" But an 800 x 600 monitor in which each pixel can display 256 different colors can produce something like 256^480,000 different outputs. Are you confident you can produce that many distinguishable behaviors?)
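(For the skeptical, a quick Python check of that arithmetic; the exact count is far too large to print, so we take its order of magnitude:)

    # An 800 x 600 monitor with 256 colors per pixel can show
    # 256 ** 480,000 distinct frames. Report the order of magnitude.
    import math

    pixels = 800 * 600                       # 480,000 pixels
    log10_frames = pixels * math.log10(256)

    print(pixels)                                          # 480000
    print(f"256^480000 is about 10^{log10_frames:,.0f}")   # ~10^1,155,955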

Our values are complex. They are small targets. Therefore, we often need complex things to achieve them. And many of the things we like, such as other people, are also complex. But we do not value complexity itself, or at least that's not the only thing we value. If we try to act like it is, we just end up redefining "complexity" so thoroughly that we import all our moral values into it.

Our real moral values are the ones that tell us that any definition of complexity that places a lungfish above a human is one we can reject out of hand. Even if there's some extremely specific and wildly over-fit definition of complexity that corresponds to our values, our real moral values are the ones that tell us to choose that definition rather than all the others.

Camels With Hammers is not just saying morality is identical to complexity. Like I said, they're a Grand Tour of every philosophical argument I'm uncomfortable with, and a single stop does not a Grand Tour make. But I know some people who do try to ground morality in complexity, and I know a lot of other arguments that focus around this same idea of "identify morality with a vague concept and hope everyone just knows what you mean."