By Edward Frenkel, The New York Times, September 30, 2013
Our daily lives are growing ever more dependent on algorithms, those omnipresent computational procedures that run programs on our laptops, our smartphones, our GPS devices and much else. Algorithms influence our decisions, too: when we choose a movie on Netflix or a book on Amazon, we are presented with recommendations churned out by sophisticated algorithms that take into account our past choices and those of other users determined (by still other algorithms) to be similar to us.
The importance of these algorithms in the modern world is common knowledge, of course. But in his insightful new book “Probably Approximately Correct,” the Harvard computer scientist Leslie Valiant goes much further: computation, he says, is and has always been “the dominating force on earth within all its life forms.” Nature speaks in algorithms.
Dr. Valiant believes that phenomena like evolution, adaptation and learning are best understood in terms of “ecorithms,” a term he has coined for algorithms that interact with and benefit from their environment. Ecorithms are at play when children learn how to distinguish cats from dogs, when we navigate our way in a new city — but more than that, Dr. Valiant writes, when organisms evolve and when brain circuits are created and maintained.
Here is one way he illustrates this complex idea. Suppose we want to distinguish between two types of flowers by measuring their petals. For each petal we have two numbers, x for length and y for width. The task is to find a way to tell which type of flower a petal with given measurements x and y belongs to.
To achieve this, the algorithm is fed a set of examples meant to “train” it to come up with a good criterion for distinguishing the two flowers. The algorithm does not know the criterion in advance; it must “learn” it using the data that are fed to it.
So it starts with a hypothesis and tests it on the first example. (Say flower No. 1’s petals satisfy the formula 2x-3y > 2, while flower No. 2’s satisfy 2x-3y < 2.) If the hypothesis classifies that example correctly, we proceed to the next one. If it misclassifies an example, it is updated by a certain precise rule, and the new hypothesis is applied to the next example. The algorithm is literally learning as it goes along.
A striking mathematical theorem is that if a rule separating the two flowers exists (within the class of criteria we are considering, such as linear inequalities), then our algorithm will find it after a finite number of steps, no matter what the starting hypothesis was.
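The learner described here can be sketched in a few lines of code. The sketch below assumes the classic perceptron update as the “certain precise rule” (the book’s own presentation may differ in detail), and the petal measurements are made up for illustration:

```python
# A minimal sketch of the flower-sorting learner described above. It assumes
# the classic perceptron update as the "certain precise rule"; the book's own
# rule may differ. Petal measurements below are hypothetical.

def train(examples, a=2.0, b=-3.0, c=2.0, max_passes=100):
    """Learn a linear criterion a*x + b*y > c separating the two flowers.

    examples: list of ((x, y), label) pairs, with label +1 for flower No. 1
    and -1 for flower No. 2. Starts from the hypothesis 2x - 3y > 2.
    """
    for _ in range(max_passes):
        mistakes = 0
        for (x, y), label in examples:
            predicted = 1 if a * x + b * y > c else -1
            if predicted != label:      # hypothesis misclassifies this petal
                a += label * x          # nudge the criterion toward the example
                b += label * y
                c -= label
                mistakes += 1
        if mistakes == 0:               # every training petal now classified correctly
            return a, b, c              # a separating rule was found
    return a, b, c                      # data may not be separable by this class of rules

# Hypothetical petals: flower No. 1 long and narrow, flower No. 2 short and wide.
petals = [((4.0, 1.0), +1), ((2.5, 1.0), +1), ((2.0, 2.5), -1), ((1.5, 3.0), -1)]
print(train(petals))
```

On data like these, which a linear rule can separate, the loop stops after a handful of passes, exactly as the theorem promises.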
And Dr. Valiant argues that similar mechanisms are at work in nature. An organism can adapt to a new environment by starting with a hypothesis about the environment, then testing it against new data and, based on the feedback, gradually improving the hypothesis by using an ecorithm, to behave more effectively.
“Probably Approximately Correct,” Dr. Valiant’s winsome title, is his quantitative framework for understanding how these ecorithms work. In nature, there is nothing so neat as our idealized flower algorithm; we cannot really hope to get a precise rule distinguishing between two types of flowers, but can hope only to have one that gives an approximate result with high probability.
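“Approximate result with high probability” has a precise meaning in the PAC framework Dr. Valiant introduced. Roughly (this is a standard paraphrase, not a quotation from the book), a learned hypothesis h must satisfy

$$\Pr\big[\operatorname{error}(h) \le \varepsilon\big] \ge 1 - \delta,$$

where error(h) is the chance that h misclassifies a fresh example, ε is the tolerated approximation error and δ the tolerated failure probability.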
The evolution of species, as Darwin taught us, relies on natural selection. But Dr. Valiant argues that if all the mutations that drive evolution were simply random and equally distributed, it would proceed at an impossibly slow and inefficient pace.
Darwin’s theory “has the gaping gap that it can make no quantitative predictions as far as the number of generations needed for the evolution of a behavior of a certain complexity,” he writes. “We need to explain how evolution is possible at all, how we got from no life, or from very simple life, to life as complex as we find it on earth today. This is the BIG question.”
Dr. Valiant proposes that natural selection is supplemented by ecorithms, which enable organisms to learn and adapt more efficiently. Not all mutations are realized with equal probability; those that are more beneficial are more likely to occur. In other words, evolution is accelerated by computation.
This is an ambitious proposal, sure to ignite controversy. But what I find so appealing about this discussion, and the book in general, is that Dr. Valiant fearlessly goes to the heart of the “BIG” questions. (I don’t know him, though we share an editor at Basic Books.) He passionately argues his case, but is also first to point out the parts of his theory that are incomplete, eager to anticipate and confront possible counterarguments. This is science at its best, driven not by dogma and blind belief, but by the desire to understand, intellectual integrity and reliance on facts.
Many other topics are discussed, from computational complexity to intelligence (artificial and not), and even an algorithmic perspective on teaching. The book is written in a lively, accessible style and is surprisingly entertaining. It’s funny how your perception of even mundane tasks can change after reading it — you start thinking algorithmically, confirming Dr. Valiant’s maxim that computer science is “more about humans than about computers.”
Edward Frenkel, a professor of mathematics at the University of California, Berkeley, is the author of the new book “Love and Math.”