Complexity: A Guided Tour by Melanie Mitchell

It’s the best possible time to be alive, when almost everything you thought you knew is wrong.

A definition of the term complex system: a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution.

An alternative definition of a complex system: a system that exhibits nontrivial emergent and self-organizing behaviors.

Aristotle believed that earthly objects move in different ways depending on what they are made of.

His scientific method was to let logic and common sense direct theory; the importance of testing the resulting theories by experiments is a more modern notion.

Physicists call the general study of motion mechanics. Mechanics is divided into two areas: kinematics, which describes how things move, and dynamics, which explains why things obey the laws of kinematics.

Calculus, the branch of mathematics that describes motion and change.

Linearity is a reductionist’s dream, and nonlinearity can sometimes be a reductionist’s nightmare.

A small error in initial conditions produces an enormous error in the final phenomenon.

Prediction becomes impossible….

The logistic map is perhaps the most famous equation in the science of dynamical systems and chaos. It is an extremely simple equation and is completely deterministic: every x_t maps onto one and only one value of x_{t+1}. And yet the chaotic trajectories obtained from this map, at certain values of R, look very random, enough so that the logistic map has been used as a basis for generating pseudo-random numbers on a computer.
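A minimal sketch in Python (my own helper names, not the book's code) of the logistic map, x_{t+1} = R * x_t * (1 - x_t), showing both its determinism and its sensitivity to initial conditions in the chaotic regime R = 4.0:

```python
def logistic_map(x, r):
    """One step of the logistic map: x_{t+1} = r * x_t * (1 - x_t)."""
    return r * x * (1 - x)

def trajectory(x0, r, n):
    """Iterate the map n times from x0, returning the full trajectory."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic_map(xs[-1], r))
    return xs

# Deterministic: the same x0 and r always give the same trajectory.
# Chaotic (r = 4.0): two trajectories whose starting points differ by
# only 1e-7 diverge until they are no closer than two random numbers.
a = trajectory(0.2, 4.0, 50)
b = trajectory(0.2000001, 4.0, 50)
print(abs(a[-1] - b[-1]))  # the tiny initial difference has been amplified
```

At lower values of R the same code settles onto a fixed point instead; only the parameter changes, not the rule.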

Even if we have a simple model in which all the parameters are determined exactly, long-term prediction is nevertheless impossible.

The discovery and understanding of chaos has produced a rethinking of many core tenets of science.

Seemingly random behavior can emerge from deterministic systems, with no external source of randomness.

Physicists have a specific meaning of “work” done by an object: the amount of force applied to the object multiplied by the distance traveled by the object in the direction that force was applied.
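The definition above is W = F * d (force times distance in the direction the force is applied). A small numerical sketch (mine, not from the book), generalized to a force at an angle to the motion so the "in the direction" clause is visible:

```python
import math

def work(force_newtons, distance_m, angle_deg=0.0):
    """W = F * d * cos(theta): only the component of the force along
    the direction of motion does work on the object."""
    return force_newtons * distance_m * math.cos(math.radians(angle_deg))

print(work(10.0, 2.0))        # 20.0 joules
print(work(10.0, 2.0, 90.0))  # ~0: a perpendicular force does no work
```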

Laws of thermodynamics apply to “isolated systems”—ones that do not exchange energy with any outside entity. First law: Energy is conserved. The total amount of energy in the universe is constant. Energy can be transformed from one form to another. However, energy can never be created or destroyed.

As R was increased from 2.0 to 4.0, iterating the logistic map for a given value of R first yielded a fixed point, then a period-two oscillation, then period four, then eight, and so on, until chaos was reached. In dynamical systems theory, each of these abrupt period doublings is called a bifurcation. This succession of bifurcations culminating in chaos has been called the “period doubling route to chaos.” The bifurcations come faster and faster as R increases, and the rate at which they converge approaches a universal number, approximately 4.6692016, now called Feigenbaum’s constant.
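The period doublings can be seen numerically. A sketch (my own code and function name, not the book's) that detects the attractor's period at a given R by iterating past the transient and then looking for the first repeat:

```python
def attractor_period(r, warmup=10000, max_period=64, tol=1e-9):
    """Iterate the logistic map past transients, then return the
    period of the attractor (None if chaotic or period > max_period)."""
    x = 0.5
    for _ in range(warmup):
        x = r * x * (1 - x)
    x0 = x
    for p in range(1, max_period + 1):
        x = r * x * (1 - x)
        if abs(x - x0) < tol:
            return p
    return None

# Sample R values on the period-doubling route:
for r in (2.9, 3.2, 3.5, 3.55):
    print(r, attractor_period(r))  # periods 1, 2, 4, 8
```

Pushing R past roughly 3.5699 (the onset of chaos), the function starts returning None.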

Second law: Entropy always increases until it reaches a maximum value. It will never decrease on its own unless an outside agent works to decrease it. The second law is the only fundamental law of physics that distinguishes between past and future. All other laws are reversible in time.

Shannon’s definition of information involves a source that sends messages to a receiver.

“Average amount of surprise” - in which “surprise” means something like the “degree of uncertainty” the receiver had about what the source would send next.
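The "average amount of surprise" is quantified as H = -sum_i p_i * log2(p_i), where p_i is the probability of the i-th message symbol. A minimal sketch (function name mine) that estimates the entropy of a message from its symbol frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(message):
    """Average surprise in bits per symbol: H = -sum p_i * log2(p_i),
    with p_i estimated from symbol frequencies in the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("AAAA"))  # 0.0 bits: the receiver is never surprised
print(shannon_entropy("ABAB"))  # 1.0 bit per symbol
print(shannon_entropy("ABCD"))  # 2.0 bits: four equally likely symbols
```

A source that always sends the same symbol carries zero information; maximal entropy comes from symbols that are all equally likely.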

Kurt Gödel: incompleteness theorem showed that if arithmetic is consistent, then there are true statements in arithmetic that cannot be proved: “This statement is not provable.”

Let’s call this statement “Statement A.” Now, suppose Statement A could indeed be proved. But then it would be false (since it states that it cannot be proved). That would mean a false statement could be proved—arithmetic would be inconsistent. Okay, let’s assume the opposite, that Statement A cannot be proved. That would mean that Statement A is true (because it asserts that it cannot be proved), but then there is a true statement that cannot be proved—arithmetic would be incomplete. Ergo, arithmetic is either inconsistent or incomplete.

Since this is a simple problem, it would probably be pretty easy for a human to figure out a good strategy for Robby to follow. However, the point of genetic algorithms is that humans, being intrinsically lazy, don’t have to figure out anything; we just let computer evolution figure it out for us. Let’s use a genetic algorithm to evolve a good strategy for Robby.
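Robby's soda-can world is too large to reproduce here, but the GA mechanics Mitchell describes (fitness evaluation, selection, crossover, mutation) can be sketched on a toy problem. Everything below is an illustrative assumption, including the "max-ones" fitness standing in for Robby's can-collecting score:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

GENOME_LEN = 30
POP_SIZE = 60
GENERATIONS = 100
MUTATION_RATE = 0.01

def fitness(genome):
    # toy fitness: number of 1s (stand-in for a strategy's score)
    return sum(genome)

def tournament(pop):
    # pick two random individuals; the fitter one gets to reproduce
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # single-point crossover of two parent genomes
    cut = random.randrange(1, GENOME_LEN)
    return p1[:cut] + p2[cut:]

def mutate(genome):
    # flip each bit independently with small probability
    return [1 - g if random.random() < MUTATION_RATE else g
            for g in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
print(fitness(best))  # close to the optimum of 30
```

The point survives the simplification: no human designed the winning genome; selection plus variation found it.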

Turing’s first goal was to make very concrete this notion of definite procedure. The idea is that, given a particular problem to solve, you can construct a definite procedure for solving it by designing a Turing machine that solves it. Turing machines were put forth as the definition of “definite procedure.”
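A toy simulator makes "definite procedure" concrete. This is my own minimal encoding, not Turing's notation: a machine is a table mapping (state, symbol read) to (symbol to write, head move, next state), operating on an unbounded tape of cells.

```python
def run_turing_machine(tape, rules, state="start", blank="_",
                       max_steps=1000):
    """Run a Turing machine until it halts (or max_steps is reached);
    return the non-blank contents of the tape."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: unary increment. Scan right over the 1s, write a
# 1 on the first blank cell, then halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run_turing_machine("111", rules))  # "1111"
```

Each rule is one line of the "definite procedure"; the machine does nothing that is not in the table.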

I knew that my strategy wasn’t perfect, but this little trick never occurred to me. Evolution can be pretty clever. GAs often come up with things we humans don’t consider.

Turing’s big result—there can be no definite procedure for solving the Halting problem. The Halting problem is an example that proves that the answer to the Entscheidungsproblem is “no”; not every mathematical statement has a definite procedure that can decide its truth or falsity.

Darwin’s theory of evolution: Philosopher Daniel Dennett: If I were to give an award for the single best idea anyone has ever had, I’d give it to Darwin, ahead of Newton and Einstein and everyone else. In a single stroke, the idea of evolution by natural selection unifies the realm of life, meaning, and purpose with the realm of space and time, cause and effect, mechanism and physical law.

Entropy decreases (living systems become more organized, seemingly more designed) as a result of the work done by natural selection.