Dawkins' Rule in Comparative Complexity

Agent Smith January 12, 2023 at 10:10 675 views 2 comments
What is complexity?

From what seems to be an informational point of view, Richard Dawkins (the famed evolutionary biologist, atheist, and science educator) offers a simple, easy-to-understand rule for comparative complexity (how complex is A with respect to B?).

Dawkins' Rule in Comparative Complexity

1. Take two objects, A and B.

2. Take a horizontal slice of both at the same level of organization, e.g. if A is a crab and B is a whale and one is being studied at the cellular level, study the other at the cellular level too; if at the molecular level, do the same for the other. You get the idea.

3. Now record the information content for both A and B. Let I(x) = the information content of x.

4. Possibilities [conclusions]

i) I(A) = I(B) [same complexity]
ii) I(A) > I(B) [A is more complex than B]
iii) I(A) < I(B) [A is less complex than B]

In a loose sense, it boils down to how thick a book on A is compared to a book on B.
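The steps above can be sketched in code. This is only a toy illustration, not Dawkins' own method: it assumes compressed length as a rough stand-in for information content (a common but crude proxy), and the function names are made up for this sketch.

```python
import zlib

def information_content(description: bytes) -> int:
    # Proxy for I(x): length in bits of the zlib-compressed description.
    # Repetitive descriptions compress well, so they score as less complex.
    return len(zlib.compress(description)) * 8

def compare_complexity(desc_a: bytes, desc_b: bytes) -> str:
    # Dawkins' rule: compare I(A) and I(B), assuming both descriptions
    # were taken at the same level of organization.
    i_a = information_content(desc_a)
    i_b = information_content(desc_b)
    if i_a == i_b:
        return "same complexity"
    return "A is more complex than B" if i_a > i_b else "A is less complex than B"

# Toy example: a highly repetitive description vs. a more varied one.
verdict = compare_complexity(b"cell " * 100, bytes(range(256)) * 2)
print(verdict)
```

The "book thickness" intuition is the same idea: a description that can be boiled down to a thin book carries less information than one that cannot.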

Comments (2)

180 Proof January 12, 2023 at 10:18 #771754
Complexity is information. Less complex, less informational (i.e. lower entropy). :fire:
Agent Smith January 12, 2023 at 10:32 #771756
Reply to 180 Proof

A given signal's information content seems to be a measure of how much uncertainty (Shannon entropy) it resolves.

Again, I got this from Dawkins: if I say message x has 4 bits of information, what I mean is there are 16 possibilities (x itself + 15 others) ([math]\log_2 16 = 4[/math]) and I need 4 pieces of information to rule out all the wrong alternatives: 1 piece/bit of information to go from 16 possibilities to 8; 1 more bit to narrow it down from 8 to 4; 1 bit again to pare the options down from 4 to 2; 1 more bit, the last one, and we're done, from 2 to 1 (we've finally got the message, as it were). That makes a total of 4 bits [1 + 1 + 1 + 1] of info.
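That halving argument can be checked directly: each yes/no answer (one bit) cuts the remaining alternatives in half, so 16 possibilities take exactly [math]\log_2 16 = 4[/math] bits to narrow down to one. A minimal sketch:

```python
from math import log2

# 16 equally likely possibilities carry log2(16) = 4 bits of information.
possibilities = list(range(16))
total_bits = log2(len(possibilities))

# Each bit rules out half of the remaining alternatives: 16 -> 8 -> 4 -> 2 -> 1.
bits_used = 0
while len(possibilities) > 1:
    possibilities = possibilities[: len(possibilities) // 2]  # discard half
    bits_used += 1

print(total_bits, bits_used)
```

Both counts agree: four halvings, four bits.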