---
title: "Complexity: A Guided Tour"
tags: reading notes
---
# Complexity: A Guided Tour
# Layer 1 Read
*Purpose of the book?*
*What question does the author want to solve?*
1. The author tries to explain, from a historical perspective, how emergent, self-organized behavior comes about in several contexts. (See the definitions of *emergent* and *self-organizing*.)
2. How can complexity be measured? The author does not have a definite answer yet...
*Subject of the book?*
It starts with the history and content of four fundamental areas of complex systems: information, computation, dynamics and chaos, and evolution. Then follow three chapters that describe how these four areas weave together.
*Conclusion*
*How does the author organize the content?*
*What did I learn?*
The building blocks are information and computation, but what makes the big difference is networking. Networks are the infrastructure for processing information (computing).
*How does this book relate to me?*
*Keywords to understand author:*
- Complex system
- Information
- Computation
- Evolution
TODOs:
- Explore in detail:
- Maxwell's demon
- Turing machine
- Cellular automaton
- Koch curve
- Copycat
- Game of Life
# Layer 2 Read
- Life in computing
- Computing with life
- Networking
## Part 1: Background and History
### 1. What is Complexity
Examples of complexity: insect colonies, the brain, the immune system, economies, the World Wide Web.
A large number of simple components (of a limited number of types), each with only limited communication with the others, can achieve complex functions without any central control.
Properties in common:
- Complex collective behavior
- Signaling and information processing
- Adaptation
:::info
*Complex System*: A system in which large network of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution.
*Self-organizing*: Systems in which organized behavior arises without an internal or external controller. (The author put some time into this definition... why?)
*Emergent*: Since simple rules produce complex behavior in hard-to-predict ways, the macroscopic behavior of such systems is called emergent.
:::
### 2. Dynamics, Chaos, and Prediction
*Dynamic* means changing. Chaos is an offspring of dynamical systems.
The chapter starts with Newton's mechanics and dynamics, then moves to Poincaré's three-body problem.
The uncertainty principle and chaos make prediction impossible at both small and large scales.
*Nonlinearity* is the reason. A *linear* system is one you can understand by understanding its parts individually and then putting them together.
*Chaos*, as used to describe dynamical system with sensitive dependence on initial conditions, was first coined by physicists.
Features of chaos:
- The period-doubling route to chaos
- The Feigenbaum constant
:::info
- *Bifurcation*: an abrupt change in the period of the system's behavior (e.g., from period 2 to period 4).
- *Rate*: how quickly successive bifurcations arrive; the ratio between successive bifurcation intervals converges to the Feigenbaum constant (≈ 4.6692), the same value for a whole class of systems.
:::
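The book's running example of chaos is the logistic map, x(t+1) = R·x(t)·(1 − x(t)). A minimal Python sketch (my own, not from the book) showing both features: sensitive dependence on initial conditions at R = 4, and period doubling at lower R:

```python
def logistic_map(x0, r, steps):
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t)."""
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        trajectory.append(x)
    return trajectory

# Sensitive dependence: at r = 4 (chaotic), two trajectories starting
# only 1e-7 apart diverge completely within a few dozen steps.
a = logistic_map(0.2000000, 4.0, 50)
b = logistic_map(0.2000001, 4.0, 50)
print(max(abs(x - y) for x, y in zip(a, b)))  # far larger than 1e-7

# Period doubling: at r = 3.2 the trajectory settles into a period-2 cycle.
t = logistic_map(0.2, 3.2, 200)
print(t[-2], t[-4])  # the same value repeats every two steps
```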
### 3. Information
*Information* is used to characterize and measure order and disorder, complexity and simplicity.
Energy, work, and entropy.
Laws of thermodynamics:
- Energy is conserved: it can be neither created nor destroyed.
- Entropy always increases until it reaches a maximum value, unless an outside agent does work to decrease it.
Interestingly, the second law is the ONLY fundamental law of physics that distinguishes between past and future.
the entropy (and thus information content) of a source is defined in terms of message probabilities and is not concerned with the “meaning” of a message.
:::info
Shannon defined the information of a macrostate (here, a source) as a function of the number of possible microstates (here, ensembles of possible messages) that could be sent by that source.
In other words: information concerns the predictability of a message source.
:::
(I need detail read about the micro/macro states of entropy)
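As a concrete sketch (mine, not the book's), Shannon entropy can be computed from a message by treating the observed symbol frequencies as the source's message probabilities, H = −Σ p·log₂(p):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Entropy in bits per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: a perfectly predictable source
print(shannon_entropy("abababab"))  # 1.0 bit: two equally likely symbols
```

A predictable source carries no information; the more uniform (less predictable) the symbol probabilities, the higher the entropy.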
### 4. Computation
Information is processed via *computation*.
There is a limit of what we can compute.
(I need more detail read here about the Turing Machine)
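For my own reference, a minimal Turing machine simulator; the rule table (a toy bit-inverting machine) is my hypothetical example, not one from the book:

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.
    rules maps (state, symbol) -> (new_state, write_symbol, move), move in {-1, 0, +1}.
    """
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += move
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Hypothetical rule table: invert every bit, halt at the first blank.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine(invert, "1011"))  # 0100
```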
### 5. Evolution
> We know that to decrease entropy, work must be done. Who or what is doing the work of creating and maintaining living systems and making them more complex?
Entropy decreases (living systems become more organized, seemingly more designed) as a result of the work done by natural selection. The energy for this work comes from the ability of individual organisms to metabolize energy from their environments (e.g., sunlight and food).
> Discrete variation in the genes of an organism can result in continuous-seeming variation in the organism’s phenotype—the physical traits (e.g., height, skin color, etc.) resulting from these genes. Darwinism and Mendelism were finally recognized as being complementary, not opposed.
### 6. Genetics
### 7. Defining and Measuring Complexity
> Science often makes progress by inventing new terms to describe incompletely understood phenomena; these terms are gradually refined as the science matures and the phenomena become more completely understood.
There are several possible ways to define and measure the complexity of a system: as size, entropy, algorithmic information content, logical depth, thermodynamic depth, computational capacity, statistical complexity, fractal dimension, or degree of hierarchy.
## Part 2: Life and Evolution in Computers
Explore notion of life and evolution occurring in computers.
### 8. Self-Reproducing Computer Programs
Example: a self-copying program. The solution is to use the information stored in memory in two ways: as instructions and as data!
DNA not only contains the code for its self-replicating program; it also encodes its own interpreter, the enzymes that copy and translate it.
John von Neumann's self-reproducing automaton design was the first work in the science of artificial life.
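The same instructions-as-data trick is what makes a quine (a self-copying program) possible. A minimal Python sketch: the two code lines below print an exact copy of themselves, using the string `s` once as data and once as the instructions being printed.

```python
# The string s is used twice: as data (it is what gets printed) and as
# the instructions doing the printing.
s = 's = %r\nprint(s %% s)'
print(s % s)
```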
### 9. Genetic Algorithms
It is about mutation, crossover, and selection: the GA framework. (I did some simple applications of this...)
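A toy sketch of the GA loop (selection, crossover, mutation) on the classic OneMax problem, maximizing the number of 1 bits; this is my own minimal example, not one of the book's GAs:

```python
import random

def evolve_onemax(length=20, pop_size=30, generations=60, mutation_rate=0.05, seed=0):
    """Toy GA: evolve bit strings toward all 1s (fitness = number of 1 bits)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)    # fitness = sum of bits
        parents = pop[: pop_size // 2]     # selection: keep the fitter half
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)  # single-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ 1 if rng.random() < mutation_rate else bit
                     for bit in child]      # bit-flip mutation
            children.append(child)
        pop = children
    return max(sum(ind) for ind in pop)

print(evolve_onemax())  # best fitness found (maximum possible is 20)
```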
## Part 3: Computation Writ Large
In contrast to Part 2, this part explores the extent to which computation itself occurs in nature.
### 10. Cellular Automata, Life, and the Universe
Turing machines provide a way of formalizing the notion of “definite procedure”—that is, computation.
A cellular automaton is computationally equivalent to a Turing machine, but slow... (Why does the time it takes matter here???)
Wolfram's principle of computational equivalence:
- Processes in nature can be viewed as computations.
- Simple rules can support universal computation. (Universal computation: a universal computer is one that can simulate any other computer, given a suitable program as input.)
- Universal computation is an upper limit on the complexity of computation in nature.
- The computations done by different processes in nature are almost always equivalent in sophistication.
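A minimal sketch of an elementary cellular automaton, where each cell updates from just its own state and its two neighbors'. Rule 110 is the one later proved capable of universal computation:

```python
def step(cells, rule):
    """One update of an elementary CA: each cell's new state depends only on
    itself and its two neighbors (periodic boundary). The 8-bit rule number
    encodes the output for each of the 8 possible neighborhoods."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Run rule 110 from a single live cell and print the space-time diagram.
cells = [0] * 30 + [1] + [0] * 30
for _ in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells, 110)
```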
### 11. Computing with Particles
There is a big difference between a von-Neumann-style computer and a cellular automaton: the cellular automaton has no RAM to store data and operations, and no CPU to do the arithmetic. Each individual cell has only its own state and the states of its neighbors.
> general difficult to design cellular automata to perform tasks that require collective decision making among all the cells. Peter and I were interested in how a genetic algorithm might solve this design problem.
The key to computing with a cellular automaton is setting the right rule.
### 12. Information Processing in Living Systems
The purpose of this chapter is to explore the notion of information processing or computation in living systems.
They are decentralized systems.
? What is information processing or computing?
-> Information processing in a Turing machine
-> Information processing in cellular automata
> For cellular automata, no such compilers or decompilers exist, at least not yet, and there is still no practical and general way to design high-level “programs.”
>
Three examples:
- The immune system
- Ant colonies
- Biological metabolism
All three systems described above use randomness and probabilities in essential ways.
### 13. How to Make Analogies (If You Are a Computer)
(I did not read much on this topic yet.) "Copycat"
### 14. Prospects of Computer Modeling
What is a model?
Computer model of the Prisoner's Dilemma.
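A minimal sketch of the iterated Prisoner's Dilemma using Axelrod's standard payoff values (3 each for mutual cooperation, 1 each for mutual defection, 5/0 for a lone defector), with TIT FOR TAT as one strategy:

```python
def play_ipd(strategy_a, strategy_b, rounds=100):
    """Iterated Prisoner's Dilemma: each strategy sees the opponent's history."""
    payoff = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)
        move_b = strategy_b(history_a)
        pa, pb = payoff[(move_a, move_b)]
        score_a += pa
        score_b += pb
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's last move."""
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

print(play_ipd(tit_for_tat, tit_for_tat))    # (300, 300): sustained cooperation
print(play_ipd(tit_for_tat, always_defect))  # (99, 104): exploited once, then mutual defection
```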
## Part 4: Network Thinking
### 15. The Science of Networks
“six degrees of separation.”
>**Network thinking** means focusing on relationships between entities rather than the entities themselves.
>
:::info
- *Network*: a collection of nodes connected by links.
- *Clustering*: the degree to which a node's neighbors are themselves linked to one another.
- *Degree* of a node: the number of links attached to that node.
- *Hub*: a node with high degree.
:::
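The degree and hub definitions above are easy to compute directly; a sketch on a tiny made-up network:

```python
from collections import defaultdict

def degrees(edges):
    """Degree of each node in an undirected network given as (a, b) links."""
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return dict(deg)

# A tiny hypothetical network: node "hub" links to everyone else.
edges = [("hub", "a"), ("hub", "b"), ("hub", "c"), ("a", "b")]
deg = degrees(edges)
print(max(deg, key=deg.get))  # "hub": the highest-degree node
```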
> A major discovery to date of network science is that high-clustering, skewed degree distributions, and hub structure seem to be characteristic of the vast majority of all the natural, social, and technological networks that network scientists have studied.
>
Two classes of networks:
- Small-world networks
- Scale-free networks
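Scale-free (skewed) degree distributions are commonly explained by preferential attachment, the "rich get richer" growth mechanism. A minimal growth sketch, my own illustration:

```python
import random

def preferential_attachment(n_nodes, seed=0):
    """Grow a network where each new node links to an existing node chosen
    with probability proportional to its degree.
    endpoints lists every edge endpoint, so node i appears deg(i) times,
    making a uniform draw from it automatically degree-proportional."""
    rng = random.Random(seed)
    endpoints = [0, 1]  # start with a single edge between nodes 0 and 1
    for new in range(2, n_nodes):
        target = rng.choice(endpoints)
        endpoints += [new, target]
    return [endpoints.count(i) for i in range(n_nodes)]

deg = preferential_attachment(2000)
# Skewed distribution: a few high-degree hubs, while most nodes keep degree 1.
print(max(deg), deg.count(1))
```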
### 16. Applying Network Science to Real-World Networks
Examples:
- The Brain
- Metabolic Network
- Genetic regulatory networks
### 17. The Mystery of Scaling
> the way in which properties of living organisms scale with size.
>
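The best-known scaling relationship here is Kleiber's law: basal metabolic rate scales as body mass to the 3/4 power. A quick numeric sketch; the constant k is a rough empirical value I am assuming for illustration:

```python
def metabolic_rate(mass_kg, k=70.0):
    """Kleiber's law: basal metabolic rate ~ k * mass^(3/4).
    k = 70.0 is a rough assumed constant (approx. kcal/day for mass in kg)."""
    return k * mass_kg ** 0.75

# Doubling body mass multiplies metabolic rate by 2**0.75 (about 1.68), not 2,
# so larger animals burn less energy per kilogram of body weight.
print(metabolic_rate(70.0) / metabolic_rate(35.0))
```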
### 18. Evolution, Complexified
## Part 5: Conclusion