Genetic programming (GP) is a method for automatically creating computer programs. It starts from a high-level statement of what needs to be done and uses the Darwinian principle of natural selection to breed a population of improving programs over many generations.
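The loop described above — generate a population of random programs, measure how well each one does, and breed the fitter ones over many generations — can be sketched in a few dozen lines. The toy symbolic-regression example below is purely illustrative (the function names, parameter values, and the target expression x² + x are our own choices, not code from the book): it evolves arithmetic expression trees toward a target function using fitness-based selection, subtree crossover, and mutation.

```python
import random

random.seed(0)  # for reproducibility of this illustrative run

# Primitive set: the building blocks from which programs are assembled.
FUNCTIONS = {'+': lambda a, b: a + b,
             '-': lambda a, b: a - b,
             '*': lambda a, b: a * b}
TERMINALS = ['x', 1.0]

def random_tree(depth=3):
    """Grow a random expression tree, represented as nested tuples."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(FUNCTIONS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Execute a program tree for a given input value x."""
    if tree == 'x':
        return x
    if not isinstance(tree, tuple):
        return tree
    op, left, right = tree
    return FUNCTIONS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    """Sum of absolute errors against the target x**2 + x (lower is better)."""
    return sum(abs(evaluate(tree, x) - (x * x + x)) for x in range(-5, 6))

def random_subtree(t):
    """Pick a random subtree by descending randomly from the root."""
    while isinstance(t, tuple) and random.random() < 0.5:
        t = random.choice(t[1:])
    return t

def crossover(a, b):
    """Replace a randomly chosen subtree of a with a random subtree of b."""
    if not isinstance(a, tuple) or random.random() < 0.3:
        return random_subtree(b)
    op, left, right = a
    if random.random() < 0.5:
        return (op, crossover(left, b), right)
    return (op, left, crossover(right, b))

def mutate(t, rate=0.1):
    """With small probability, replace a subtree with a fresh random one."""
    if random.random() < rate:
        return random_tree(2)
    if isinstance(t, tuple):
        op, left, right = t
        return (op, mutate(left, rate), mutate(right, rate))
    return t

def evolve(pop_size=200, generations=30):
    """Breed a population of improving programs over many generations."""
    population = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        if fitness(population[0]) == 0:
            break
        survivors = population[:pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return min(population, key=fitness)

best = evolve()
print(best, fitness(best))
```

Real GP systems, including the one described in this book, differ in many particulars (tournament selection, much larger populations, constrained syntactic structures, automatically defined functions), but the generational structure is the same.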
Genetic Programming IV: Routine Human-Competitive Machine Intelligence presents the application of GP to a wide variety of problems involving the automated synthesis of controllers, circuits, antennas, genetic networks, and metabolic pathways. The book describes 15 instances where GP has created an entity that either infringes or duplicates the functionality of a previously patented 20th-century invention, 6 instances where it has done the same with respect to post-2000 patented inventions, 2 instances where GP has created a patentable new invention, and 13 other human-competitive results. The book additionally establishes:
· GP now delivers routine human-competitive machine intelligence.
· GP is an automated invention machine.
· GP can create general solutions to problems in the form of parameterized topologies.
· GP has delivered qualitatively more substantial results in synchrony with the relentless iteration of Moore's Law.
A 42-minute video overview of the book is contained on a DVD that comes with the book.
“The research reported in this book is a tour de force. For the first time since the idea was bandied about in the 1940s and the early 1950s, we have a set of examples of human-competitive automatic programming. These examples include the automated re-creation of 21 previously patented inventions and the creation of 2 patentable new inventions” — John H. Holland, University of Michigan
“In 1992, John Koza published his first book on genetic programming and forever changed the world of computation. At the time, many researchers, myself included, were skeptical about whether the idea of using genetic algorithms directly to evolve programs would ever amount to much. But scores of conquered problems and three additional books make the case utterly persuasive. The latest contribution, Genetic Programming IV: Routine Human-Competitive Machine Intelligence, demonstrates the everyday solution of such ‘holy grail’ problems as the automatic synthesis of analog circuits, the design of automatic controllers, and the automated programming of computers. This would be impressive enough, but the book also shows how to evolve whole families of solutions to entire classes of problems in a single run. Such parametric GP is a significant achievement, and I believe it foreshadows generalized evolution of complex contingencies as an everyday matter. To artificial evolutionaries of all stripes, I recommend that you read this book and breathe in its thoughtful mechanism and careful empirical method. To specialists in any of the fields covered by this book’s sample problem areas, I say read this book and discover the computer-augmented inventions that are your destiny. To remaining skeptics who doubt the inventive competence of genetics and evolution, I say read this book and change your mind or risk the strong possibility that your doubts will soon cause you significant intellectual embarrassment.” — David E. Goldberg, University of Illinois
“The adaptive filters and neural networks that I have worked with over many years are self-optimizing systems where the relationship between performance (usually mean-square-error) and parameter settings (weights) is continuous. Optimization by gradient methods works well for these systems. Now, this book describes a wider class of optimization problems where the relationship between performance (fitness) and parameters is highly disjoint, and self-optimization is achieved by nature-inspired genetic algorithms involving random search (mutation) and crossover (sexual reproduction). John Koza and his colleagues have done remarkable work in advancing the development of genetic programming and applying this to practical problems such as electric circuit design and control system design. What is ingenious about their work is that they have found ways to approach design problems by parameterizing both physical and topological variables into a common code that can be subjected to genetic programming for optimization. It is amazing how this approach finds optimized solutions that are not obvious to the best human experts. This fine book gives an accounting of the latest work in genetic programming, and it is ‘must reading’ for those interested in adaptive and learning systems, neural networks, fuzzy systems, artificial intelligence, and neurobiology. I strongly recommend it.” — Bernard Widrow, Electrical Engineering Department, Stanford University
“John Koza’s genetic programming approach to machine discovery can invent solutions to more complex specifications than any other I have seen.” — John McCarthy, Computer Science Department, Stanford University
Chapter 1 is available in PDF format.
John R. Koza received his Ph.D. in Computer Science from the University of Michigan in 1972 under the supervision of John Holland. He was co-founder, Chairman, and CEO of Scientific Games Inc. from 1973 through 1987 where he co-invented the rub-off instant lottery ticket used by state lotteries. He has taught a course on genetic algorithms and genetic programming at Stanford University since 1988. He is currently a consulting professor in the Biomedical Informatics Program in the Department of Medicine at Stanford University and a consulting professor in the Department of Electrical Engineering at Stanford University.
Martin A. Keane received a Ph.D. in Mathematics from Northwestern University in 1969. He worked for Applied Devices Corporation until 1972, in the Mathematics Department at General Motors Laboratory until 1976, and was Vice-President for Engineering of Bally Manufacturing Corporation until 1986. He is currently chief scientist of Econometrics Inc. of Chicago and a consultant to various computer-related and gaming-related companies.
Matthew J. Streeter received a Master’s degree in Computer Science from Worcester Polytechnic Institute in 2001. His Master’s thesis applied genetic programming to the automated discovery of numerical approximation formulae for functions and surfaces. His primary research interest is applying genetic programming to problems of real-world scientific or practical importance. He is currently working at Genetic Programming Inc. as a systems programmer and researcher.
William Mydlowec is Chief Executive Officer and co-founder of Pharmix Corporation, a venture-funded computational drug discovery company in Silicon Valley. He received his B.S. degree in Computer Science from Stanford University in 1998. He formerly did research at Genetic Programming Inc. with John Koza between 1997 and 2000.
Jessen Yu is Director of Engineering of Pharmix Corporation. He received a B.S. degree in Computer Science and Chemistry from Stanford University. He formerly did research at Genetic Programming Inc. with John Koza between 1998 and 2000.
Guido Lanza is Vice President of Biology and co-founder of Pharmix Corporation. He received his B.A. in 1998 from the University of California at Berkeley, in the Department of Molecular and Cell Biology and the Department of Integrative Biology. He received an M.Sc. in Bioinformatics in 1999 from the University of Manchester, UK. He formerly did research at Genetic Programming Inc. with John Koza in 2000.
1 Introduction
2 Background on Genetic Programming
3 Automatic Synthesis of Controllers
4 Automatic Synthesis of Circuits
5 Automatic Synthesis of Circuit Topology, Sizing, Placement, and Routing
6 Automatic Synthesis of Antennas
7 Automatic Synthesis of Genetic Networks
8 Automatic Synthesis of Metabolic Pathways
9 Automatic Synthesis of Parameterized Topologies for Controllers
10 Automatic Synthesis of Parameterized Topologies for Circuits
11 Automatic Synthesis of Parameterized Topologies with Conditional Developmental Operators for Circuits
12 Automatic Synthesis of Improved Tuning Rules for PID Controllers
13 Automatic Synthesis of Parameterized Topologies for Improved Controllers
14 Reinvention of Negative Feedback
15 Automated Reinvention of Six Post-2000 Patented Circuits
16 Problems for Which Genetic Programming May Be Well Suited
17 Parallel Implementation and Computer Time
18 Historical Perspective on Moore’s Law and the Progression of Qualitatively More Substantial Results Produced by Genetic Programming
19 Conclusion
Appendix A: Functions and Terminals
Appendix B: Control Parameters
Appendix C: Patented or Patentable Inventions Generated by Genetic Programming
Bibliography
1 Introduction
1.1 Genetic Programming Now Routinely Delivers High-Return Human-Competitive Machine Intelligence
1.1.1 What We Mean by “Human-Competitive”
1.1.2 What We Mean by “High-Return”
1.1.3 What We Mean by “Routine”
1.1.4 What We Mean by “Machine Intelligence”
1.1.5 Human-Competitiveness of Results Produced by Genetic Programming
1.1.6 High-Return of the Results Produced by Genetic Programming
1.1.7 Routineness of the Results Produced by Genetic Programming
1.1.8 Machine Intelligence
1.2 Genetic Programming Is an Automated Invention Machine
1.2.1 The Illogical Nature of Invention and Evolution
1.2.2 Overcoming Established Beliefs
1.2.3 Automating the Invention Process
1.2.4 Patentable New Inventions Produced by Genetic Programming
1.3 Genetic Programming Can Automatically Create Parameterized Topologies
1.4 Historical Progression of Qualitatively More Substantial Results Produced by Genetic Programming in Synchrony with Increasing Computer Power
2 Background on Genetic Programming
2.1 Preparatory Steps of Genetic Programming
2.2 Executional Steps of Genetic Programming
2.2.1 Example of a Run of Genetic Programming
2.3 Advanced Features of Genetic Programming
2.3.1 Constrained Syntactic Structures
2.3.2 Automatically Defined Functions
2.3.3 Automatically Defined Iterations, Automatically Defined Loops, Automatically Defined Recursions, and Automatically Defined Stores
2.3.4 Program Architecture and Architecture-Altering Operations
2.3.5 Genetic Programming Problem Solver (GPPS)
2.3.6 Developmental Genetic Programming
2.3.7 Computer Code for Implementing Genetic Programming
2.4 Main Points of Four Books on Genetic Programming
2.5 Sources of Additional Information about Genetic Programming
3 Automatic Synthesis of Controllers
3.1 Background on Controllers
3.2 Design Considerations for Controllers
3.3 Representation of a Controller by a Block Diagram
3.4 Possible Techniques for Designing Controllers
3.4.1 Search by Hill Climbing
3.4.2 Search by Gradient Methods
3.4.3 Search by Simulated Annealing
3.4.4 Search by the Genetic Algorithm and Genetic Programming
3.4.5 Previous Work on Controller Synthesis by Means of Genetic and Evolutionary Computation
3.4.6 Possible Approaches to Automatic Controller Synthesis Using Genetic Programming
3.5 Our Approach to the Automatic Synthesis of the Topology and Tuning of Controllers
3.5.1 Repertoire of Functions
3.5.2 Repertoire of Terminals
3.5.3 Representing the Plant
3.5.4 Automatically Defined Functions
3.5.5 Three Approaches for Establishing Numerical Parameter Values
3.5.5.1 Arithmetic-Performing Subtrees
3.5.5.2 Perturbable Numerical Value
3.5.5.3 Arithmetic-Performing Subtree Containing Perturbable Numerical Values
3.5.6 Constrained Syntactic Structure for Program Trees
3.6 Additional Representations of Controllers
3.6.1 Representation of a Controller by a Transfer Function
3.6.2 Representation of a Controller as a LISP Symbolic Expression
3.6.3 Representation of a Controller as a Program Tree
3.6.4 Representation of a Controller in Mathematica
3.6.5 Representation of a Controller and Plant as a Connection List
3.6.6 Representation of a Controller and Plant as a SPICE Netlist
3.7 Two-Lag Plant
3.7.1 Preparatory Steps for the Two-Lag Plant
3.7.1.1 Program Architecture
3.7.1.2 Terminal Set
3.7.1.3 Function Set
3.7.1.4 Fitness Measure
3.7.1.5 Control Parameters
3.7.1.6 Termination Criterion and Results Designation
3.7.1.7 Knowledge Incorporated into the Preparatory Steps
3.7.2 Results for the Two-Lag Plant
3.7.3 Human-Competitiveness of the Result for the Two-Lag Plant Problem
3.7.4 AI Ratio for the Two-Lag Plant Problem
3.8 Three-Lag Plant
3.8.1 Preparatory Steps for the Three-Lag Plant
3.8.1.1 Program Architecture
3.8.1.2 Terminal Set
3.8.1.3 Function Set
3.8.1.4 Fitness Measure
3.8.1.5 Control Parameters
3.8.2 Results for the Three-Lag Plant
3.8.3 Routineness for the Three-Lag Plant Problem
3.8.4 AI Ratio for the Three-Lag Plant Problem
3.9 Three-Lag Plant with a Five-Second Delay
3.9.1 Preparatory Steps for the Three-Lag Plant with a Five-Second Delay
3.9.1.1 Program Architecture
3.9.1.2 Terminal Set
3.9.1.3 Function Set
3.9.1.4 Fitness Measure
3.9.1.5 Control Parameters
3.9.2 Results for the Three-Lag Plant with a Five-Second Delay
3.9.3 Routineness for the Three-Lag Plant with a Five-Second Delay
3.9.4 AI Ratio for the Three-Lag Plant with a Five-Second Delay
3.10 Non-Minimal-Phase Plant
3.10.1 Preparatory Steps for the Non-Minimal-Phase Plant
3.10.1.1 Program Architecture
3.10.1.2 Terminal Set
3.10.1.3 Function Set
3.10.1.4 Fitness Measure
3.10.1.5 Control Parameters
3.10.2 Results for the Non-Minimal-Phase Plant
3.10.3 Routineness for the Non-Minimal-Phase Plant Problem
3.10.4 AI Ratio for the Non-Minimal-Phase Plant Problem
4 Automatic Synthesis of Circuits
4.1 Our Approach to the Automatic Synthesis of the Topology and Sizing of Circuits
4.1.1 Evolvable Hardware
4.2 Searching for the Impossible
4.2.1 Preparatory Steps for the RC Circuit with Gain Greater than Two
4.2.1.1 Initial Circuit
4.2.1.2 Program Architecture
4.2.1.3 Function Set
4.2.1.4 Terminal Set
4.2.1.5 Fitness Measure
4.2.1.6 Control Parameters
4.2.2 Results for the RC Circuit with Gain Greater than Two
4.2.3 Routineness of the Transition from a Problem of Controller Synthesis to a Problem of Circuit Synthesis
4.2.4 AI Ratio for the RC Circuit with Gain Greater than Two
4.3 Reinvention of the Philbrick Circuit
4.3.1 Preparatory Steps for the Philbrick Circuit
4.3.1.1 Initial Circuit
4.3.1.2 Program Architecture
4.3.1.3 Terminal Set
4.3.1.4 Function Set
4.3.1.5 Fitness Measure
4.3.1.6 Control Parameters
4.3.2 Results for the Philbrick Circuit
4.3.3 Human-Competitiveness of the Result for the Philbrick Circuit Problem
4.3.4 Routineness for the Philbrick Circuit Problem
4.3.5 AI Ratio for the Philbrick Circuit Problem
4.4 Circuit for the NAND Function
4.4.1 Preparatory Steps for the NAND Circuit
4.4.1.1 Initial Circuit
4.4.1.2 Program Architecture
4.4.1.3 Terminal Set
4.4.1.4 Function Set
4.4.1.5 Fitness Measure
4.4.1.6 Control Parameters
4.4.1.7 Termination Criterion
4.4.2 Results for the NAND Circuit
4.4.3 Human-Competitiveness of the Result for the NAND Circuit Problem
4.4.4 Routineness for the NAND Circuit Problem
4.4.5 AI Ratio for the NAND Circuit Problem
4.5 Evolution of a Computer
4.5.1 Preparatory Steps for the Arithmetic Logic Unit
4.5.1.1 Initial Circuit
4.5.1.2 Fitness Measure
4.5.1.3 Control Parameters
4.5.2 Results for the Arithmetic Logic Unit
4.5.3 Routineness for the Arithmetic Logic Unit Circuit Problem
4.5.4 AI Ratio for the Arithmetic Logic Unit Circuit Problem
4.6 Square Root Circuit
4.6.1 Preparatory Steps for Square Root Circuit
4.6.1.1 Initial Circuit
4.6.1.2 Program Architecture
4.6.1.3 Terminal Set
4.6.1.4 Function Set
4.6.1.5 Fitness Measure
4.6.1.6 Control Parameters
4.6.2 Results for Square Root Circuit
4.6.3 Routineness for the Square Root Circuit Problem
4.6.4 AI Ratio for the Square Root Circuit Problem
4.7 Automatic Circuit Synthesis Without an Explicit Test Fixture
4.7.1 Preparatory Steps for the Lowpass Filter Problem Without an Explicit Test Fixture
4.7.1.1 Initial Circuit
4.7.1.2 Program Architecture
4.7.1.3 Function Set
4.7.1.4 Terminal Set
4.7.1.5 Fitness Measure
4.7.1.6 Control Parameters
4.7.2 Results for the Lowpass Filter Problem without an Explicit Test Fixture
4.7.3 Routineness for the Lowpass Filter Problem without an Explicit Test Fixture
4.7.4 AI Ratio for the Lowpass Filter Problem without an Explicit Test Fixture
5 Automatic Synthesis of Circuit Topology, Sizing, Placement, and Routing
5.1 Our Approach to the Automatic Synthesis of Circuit Topology, Sizing, Placement, and Routing
5.1.1 Initial Circuit
5.1.2 Circuit-Constructing Functions
5.1.3 Component-Creating Functions
5.1.4 Topology-Modifying Functions
5.1.5 Development-Controlling Functions
5.1.6 Developmental Process
5.2 Lowpass Filter with Layout
5.2.1 Preparatory Steps for the Lowpass Filter with Layout
5.2.1.1 Initial Circuit
5.2.1.2 Program Architecture
5.2.1.3 Function Set
5.2.1.4 Terminal Set
5.2.1.5 Fitness Measure
5.2.1.6 Control Parameters
5.2.2 Results for the Lowpass Filter with Layout
5.2.3 Human-Competitiveness of the Result for the Lowpass Filter Problem with Layout
5.2.4 Routineness of the Transition from a Problem of Circuit Synthesis without Layout to a Problem of Circuit Synthesis with Layout
5.2.5 AI Ratio for the Lowpass Filter Problem with Layout
5.3 60 dB Amplifier with Layout
5.3.1 Preparatory Steps for 60 dB Amplifier with Layout
5.3.1.1 Initial Circuit
5.3.1.2 Program Architecture
5.3.1.3 Function Set
5.3.1.4 Terminal Set
5.3.1.5 Fitness Measure
5.3.1.6 Control Parameters
5.3.2 Results for 60 dB Amplifier with Layout
5.3.3 Routineness for the 60 dB Amplifier Problem with Layout
5.3.4 AI Ratio for the 60 dB Amplifier Problem with Layout
6 Automatic Synthesis of Antennas
6.1 Our Approach to the Automatic Synthesis of the Geometry and Sizing of Antennas
6.2 Illustrative Problem of Antenna Synthesis
6.3 Repertoire of Functions and Terminals
6.3.1 Repertoire of Functions
6.3.2 Repertoire of Terminals
6.3.3 Example of the Use of the Functions and Terminals
6.4 Preparatory Steps for the Antenna Problem
6.4.1 Program Architecture
6.4.2 Function Set
6.4.3 Terminal Set
6.4.4 Fitness Measure
6.4.5 Control Parameters
6.5 Results for the Antenna Problem
6.6 Routineness of the Transition from Problems of Synthesizing Controllers, Circuits, and Circuit Layout to a Problem of Synthesizing an Antenna
6.7 AI Ratio for the Antenna Problem
7 Automatic Synthesis of Genetic Networks
7.1 Statement of the Illustrative Problem
7.2 Representation of Genetic Networks by Computer Programs
7.2.1 Repertoire of Functions
7.2.2 Repertoire of Terminals
7.3 Preparatory Steps
7.3.1 Program Architecture
7.3.2 Function Set
7.3.3 Terminal Set
7.3.4 Fitness Measure
7.3.5 Control Parameters
7.4 Results
7.4.1 Routineness of the Transition from Problems of Synthesizing Controllers, Circuits, Circuits with Layout, and Antennas to a Problem of Genetic Network Synthesis
7.4.2 AI Ratio for the Genetic Network Problem
8 Automatic Synthesis of Metabolic Pathways
8.1 Our Approach to the Automatic Synthesis of the Topology and Sizing of Networks of Chemical Reactions
8.2 Statement of Two Illustrative Problems
8.3 Types of Chemical Reactions
8.3.1 One-Substrate, One-Product Reaction
8.3.2 One-Substrate, Two-Product Reaction
8.3.3 Two-Substrate, One-Product Reaction
8.3.4 Two-Substrate, Two-Product Reaction
8.4 Representation of Networks of Chemical Reactions by Computer Programs
8.4.1 Representation as a Program Tree
8.4.1.1 Repertoire of Functions in the Program Tree
8.4.1.2 Repertoire of Terminals
8.4.1.3 Constrained Syntactic Structure
8.4.1.4 Example of a Program Tree
8.4.2 Representation as a Symbolic Expression
8.4.3 Representation as a System of Nonlinear Differential Equations
8.4.4 Representation as an Analog Electrical Circuit
8.4.5 Flexibility of the Representation
8.5 Preparatory Steps
8.5.1 Program Architecture
8.5.2 Function Set
8.5.3 Terminal Set
8.5.4 Fitness Measure
8.5.5 Control Parameters
8.6 Results for the Phospholipid Cycle
8.6.1 Routineness of the Transition from Problems of Synthesizing Controllers, Circuits, Circuits with Layout, Antennas, and Genetic Networks to a Problem of Synthesis of a Network of Chemical Reactions
8.6.2 AI Ratio for the Metabolic Pathway Problem for the Phospholipid Cycle
8.7 Results for the Synthesis and Degradation of Ketone Bodies
8.7.1 Routineness for the Metabolic Pathway Problem Involving Ketone Bodies
8.7.2 AI Ratio for the Metabolic Pathway Problem Involving Ketone Bodies
8.8 Future Work on Metabolic Pathways
8.8.1 Improved Program Tree Representation
8.8.2 Null Enzyme
8.8.3 Minimum Amount of Data Needed
8.8.4 Opportunities to Use Knowledge
8.8.5 Designing Alternative Metabolisms
9 Automatic Synthesis of Parameterized Topologies for Controllers
9.1 Parameterized Controller for a Three-Lag Plant
9.1.1 Preparatory Steps for the Parameterized Controller for a Three-Lag Plant
9.1.1.1 Program Architecture
9.1.1.2 Terminal Set
9.1.1.3 Function Set
9.1.1.4 Fitness Measure
9.1.1.5 Control Parameters
9.1.2 Results for the Parameterized Controller for a Three-Lag Plant
9.1.3 Routineness of the Transition from a Problem Involving a Non-Parameterized Controller to a Problem Involving a Parameterized Controller
9.1.4 AI Ratio for the Parameterized Controller for a Three-Lag Plant
9.2 Parameterized Controller for Two Families of Plants
9.2.1 Preparatory Steps for the Parameterized Controller for Two Families of Plants
9.2.1.1 Program Architecture
9.2.1.2 Terminal Set
9.2.1.3 Function Set
9.2.1.4 Fitness Measure
9.2.1.5 Control Parameters
9.2.2 Results for the Parameterized Controller for Two Families of Plants
9.2.3 Human-Competitiveness of the Result for the Parameterized Controller for Two Families of Plants
9.2.4 Routineness for the Parameterized Controller for Two Families of Plants
9.2.5 AI Ratio for the Parameterized Controller for Two Families of Plants
10 Automatic Synthesis of Parameterized Topologies for Circuits
10.1 Five New Techniques
10.1.1 New NODE Function for Connecting Distant Points
10.1.2 Symmetry-Breaking Procedure using Geometric Coordinates
10.1.3 Depth-First Evaluation
10.1.4 New TWO_LEAD Function for Inserting Two-Leaded Components
10.1.5 New Q Transistor-Creating Function
10.2 Zobel Network with Two Free Variables
10.2.1 Preparatory Steps for the Zobel Network Problem with Two Free Variables
10.2.1.1 Initial Circuit
10.2.1.2 Program Architecture
10.2.1.3 Function Set
10.2.1.4 Terminal Set
10.2.1.5 Fitness Measure
10.2.1.6 Control Parameters
10.2.2 Results for the Zobel Network Problem with Two Free Variables
10.2.3 Routineness of the Transition from a Problem Involving a Non-Parameterized Circuit to a Problem Involving a Parameterized Circuit
10.2.4 AI Ratio for the Zobel Network Problem with Two Free Variables
10.3 Third-Order Elliptic Lowpass Filter with a Free Variable for the Modular Angle
10.3.1 Preparatory Steps for the Third-Order Elliptic Lowpass Filter with a Free Variable for the Modular Angle
10.3.1.1 Initial Circuit
10.3.1.2 Program Architecture
10.3.1.3 Function Set
10.3.1.4 Terminal Set
10.3.1.5 Fitness Measure
10.3.1.6 Control Parameters
10.3.2 Results for the Lowpass Third-Order Elliptic Filter with a Free Variable for the Modular Angle
10.3.3 Routineness for the Lowpass Third-Order Elliptic Filter with a Free Variable for the Modular Angle
10.3.4 AI Ratio for the Lowpass Third-Order Elliptic Filter with a Free Variable for the Modular Angle
10.4 Passive Lowpass Filter with a Free Variable for the Passband Boundary
10.4.1 Preparatory Steps for the Passive Lowpass Filter with a Free Variable for the Passband Boundary
10.4.1.1 Initial Circuit
10.4.1.2 Program Architecture
10.4.1.3 Terminal Set
10.4.1.4 Function Set
10.4.1.5 Fitness Measure
10.4.1.6 Control Parameters
10.4.2 Results for the Passive Lowpass Filter with a Free Variable for the Passband Boundary
10.4.3 Routineness for the Passive Lowpass Filter with a Free Variable for the Passband Boundary
10.4.4 AI Ratio for the Passive Lowpass Filter with a Free Variable for the Passband Boundary
10.5 Active Lowpass Filter with a Free Variable for the Passband Boundary
10.5.1 Preparatory Steps for the Active Lowpass Filter with a Free Variable for the Passband Boundary
10.5.1.1 Initial Circuit
10.5.1.2 Program Architecture
10.5.1.3 Function Set
10.5.1.4 Terminal Set
10.5.1.5 Fitness Measure
10.5.1.6 Control Parameters
10.5.2 Results for the Active Lowpass Filter with a Free Variable for the Passband Boundary
10.5.3 Routineness for the Active Lowpass Filter with a Free Variable for the Passband Boundary
10.5.4 AI Ratio for the Active Lowpass Filter with a Free Variable for the Passband Boundary
11 Automatic Synthesis of Parameterized Topologies with Conditional Developmental Operators for Circuits
11.1 Lowpass/Highpass Filter Circuit
11.1.1 Preparatory Steps for the Lowpass/Highpass Filter
11.1.1.1 Initial Circuit
11.1.1.2 Program Architecture
11.1.1.3 Terminal Set
11.1.1.4 Function Set
11.1.1.5 Fitness Measure
11.1.1.6 Control Parameters
11.1.2 Results for the Lowpass/Highpass Filter
11.1.3 Routineness of the Transition from a Parameterized Topology Problem without Conditional Developmental Operators to a Problem with Conditional Developmental Operators
11.1.4 AI Ratio for the Lowpass/Highpass Filter Problem
11.2 Lowpass/Highpass Filter with Variable Passband Boundary
11.2.1 Preparatory Steps for the Lowpass/Highpass Filter with Variable Passband Boundary
11.2.1.1 Fitness Measure
11.2.2 Results for the Lowpass/Highpass Filter with a Variable Passband Boundary
11.2.3 Routineness for the Lowpass/Highpass Filter with a Variable Passband Boundary
11.2.4 AI Ratio for the Lowpass/Highpass Filter with a Variable Passband Boundary
11.3 Quadratic/Cubic Computational Circuit
11.3.1 Preparatory Steps for the Quadratic/Cubic Computational Circuit
11.3.1.1 Initial Circuit
11.3.1.2 Program Architecture
11.3.1.3 Function Set
11.3.1.4 Terminal Set
11.3.1.5 Fitness Measure
11.3.1.6 Control Parameters
11.3.2 Results for the Quadratic/Cubic Computational Circuit
11.4 A 40/60 dB Amplifier
11.4.1 Preparatory Steps for the 40/60 dB Amplifier
11.4.1.1 Initial Circuit
11.4.1.2 Terminal Set
11.4.1.3 Fitness Measure
11.4.1.4 Control Parameters
11.4.2 Results for 40/60 dB Amplifier
12 Automatic Synthesis of Improved Tuning Rules for PID Controllers
12.1 Test Bed of Plants
12.2 Preparatory Steps for Improved PID Tuning Rules
12.2.1 Program Architecture
12.2.2 Terminal Set
12.2.3 Function Set
12.2.4 Fitness Measure
12.2.5 Control Parameters
12.3 Results for Improved PID Tuning Rules
12.4 Human-Competitiveness of the Results for the Improved PID Tuning Rules
12.5 Routineness of the Transition from Problems Involving Parameterized Topologies for Controllers to a Problem Involving PID Tuning Rules
12.6 AI Ratio for the Improved PID Tuning Rules
13 Automatic Synthesis of Parameterized Topologies for Improved Controllers
13.1 Preparatory Steps for Improved General-Purpose Controllers
13.1.1 Function Set
13.1.2 Terminal Set
13.1.3 Program Architecture
13.1.4 Fitness Measure
13.1.5 Control Parameters
13.2 Results for Improved General-Purpose Controllers
13.2.1 Results for First Run for Improved General-Purpose Controllers
13.2.2 Results for Second Run for Improved General-Purpose Controllers
13.2.3 Results for Third Run for Improved General-Purpose Controllers
13.3 Human-Competitiveness of the Results for the Improved General-Purpose Controllers
13.4 Routineness for the Improved General-Purpose Controllers
13.5 AI Ratio for the Improved General-Purpose Controllers
14 Reinvention of Negative Feedback
14.1 Genetic Programming Takes a Ride on the Lackawanna Ferry
14.1.1 Fitness Measure
14.1.2 Initial Circuit, Function Set, Terminal Set, and Control Parameters
14.2 Results for the Problem of Reducing Amplifier Distortion
14.3 Human-Competitiveness of the Result for the Problem of Reducing Amplifier Distortion
14.4 Routineness for the Problem of Reducing Amplifier Distortion
14.5 AI Ratio for the Problem of Reducing Amplifier Distortion
15 Automated Reinvention of Six Post-2000 Patented Circuits
15.1 The Six Circuits
15.1.1 Low-Voltage Balun Circuit
15.1.2 Mixed Analog-Digital Variable Capacitor
15.1.3 Voltage-Current Conversion Circuit
15.1.4 Low-Voltage High-Current Transistor Circuit
15.1.5 Cubic Function Generator
15.1.6 Tunable Integrated Active Filter
15.2 Uniformity of Treatment of the Six Problems
15.3 Preparatory Steps for the Six Post-2000 Patented Circuits
15.3.1 Initial Circuit
15.3.1.1 Test Fixture for the Low Voltage Balun Circuit
15.3.1.2 Test Fixture for the Mixed Analog-Digital Variable Capacitor Circuit
15.3.1.3 Test Fixture for High-Current Load Circuit
15.3.1.4 Test Fixture for the Voltage-Current Conversion Circuit
15.3.1.5 Test Fixture for the Cubic Function Generator
15.3.1.6 Test Fixture for the Tunable Integrated Active Filter
15.3.2 Program Architecture
15.3.3 Function Set
15.3.4 Terminal Set
15.3.5 Fitness Measure
15.3.5.1 Fitness Measure for Low Voltage Balun Circuit
15.3.5.2 Fitness Measure for Mixed Analog-Digital Variable Capacitor
15.3.5.3 Fitness Measure for High-Current Load Circuit
15.3.5.4 Fitness Measure for Voltage-Current Conversion Circuit
15.3.5.5 Fitness Measure for Cubic Function Generator
15.3.5.6 Fitness Measure for Tunable Integrated Active Filter
15.3.6 Control Parameters
15.4 Results for the Six Post-2000 Patented Circuits
15.4.1 Results for Low-Voltage Balun Circuit
15.4.2 Results for Mixed Analog-Digital Variable Capacitor
15.4.3 Results for High-Current Load Circuit
15.4.3.1 Results for First Run of High-Current Load Circuit
15.4.3.2 Results for Second Run of High-Current Load Circuit
15.4.4 Results for Voltage-Current Conversion Circuit
15.4.5 Results for Cubic Function Generator
15.4.5.1 Results for First Run of Cubic Function Generator
15.4.5.2 Results for Second Run of Cubic Function Generator
15.4.6 Results for Tunable Integrated Active Filter
15.4.6.1 Results for First Run of Tunable Integrated Active Filter
15.4.6.2 Results for Second Run of Tunable Integrated Active Filter
15.4.6.3 Results of Third Run of Tunable Integrated Active Filter
15.4.6.4 Results of Fourth Run of Tunable Integrated Active Filter
15.5 Commercial Practicality of Genetic Programming for Automated Circuit Synthesis
15.6 Human-Competitiveness of the Results for the Six Post-2000 Patented Circuits
15.7 Routineness for the Six Post-2000 Patented Circuits
15.8 AI Ratio for the Six Post-2000 Patented Circuits
16 Problems for Which Genetic Programming May Be Well Suited
16.1 Characteristics Suggesting the Use of the Genetic Algorithm
16.2 Characteristics Suggesting the Use of Genetic Programming
16.2.1 Discovering the Size and Shape of the Solution
16.2.2 Reuse of Substructures
16.2.3 The Number of Substructures
16.2.4 Hierarchical References among the Substructures
16.2.5 Passing Parameters to Substructures
16.2.6 Type of Substructures
16.2.7 Number of Arguments Possessed by Substructures
16.2.8 The Developmental Process
16.2.9 Parameterized Topologies Containing Free Variables
16.3 Characteristics Suggesting the Use of Genetic Methods
16.3.1.1 Non-Greedy Nature of Genetic Methods
16.3.1.2 Recombination in Conjunction with the Population in Genetic Methods
The Changing Mix of Schemata
Viewing Mutation and Crossover in a Common Framework
Taking Advantage of the Information Stored in the Population in Allocating Future Trials
17 Parallel Implementation and Computer Time
17.1 Computer Systems Used for Work in This Book
17.1.1 Alpha Parallel Computer System
17.1.2 Pentium Parallel Computer System
17.2 Computer Time for Problems in This Book
18 Historical Perspective on Moore’s Law and the Progression of Qualitatively More Substantial Results Produced by Genetic Programming
18.1 Five Computer Systems Used in 15-Year Period
18.2 Qualitative Nature of Results Produced by the Five Computer Systems
18.3 Effect of Order-of-Magnitude Increases in Computer Power on the Qualitative Nature of the Results Produced by Genetic Programming
19 Conclusion
19.1 Genetic Programming Now Routinely Delivers High-Return Human-Competitive Machine Intelligence
19.2 Genetic Programming Is an Automated Invention Machine
19.3 Genetic Programming Can Automatically Create Parameterized Topologies
19.4 Genetic Programming Has Delivered Qualitatively More Substantial Results in Synchrony with Increasing Computer Power
Appendix A: Functions and Terminals
Appendix B: Control Parameters
Appendix C: Patented or Patentable Inventions Generated by Genetic Programming
Bibliography
Kluwer Academic Publishers
Customer Service Department
P.O. Box 358, Accord Station
Hingham, MA 02018-0358 USA
Telephone : (781) 871-6600
Toll Free Phone: (866) 269-9527
Fax : (781) 681-9045
E-mail : kluwer@wkap.com
URL: www.wkap.nl
Last updated July 29, 2003
· The home page of Genetic Programming Inc. at www.genetic-programming.com
· For information about the field of genetic programming in general, visit www.genetic-programming.org
· The home page of John R. Koza at Genetic Programming Inc. (including online versions of most papers) and the home page of John R. Koza at Stanford University
· Information about the 1992 book Genetic Programming: On the Programming of Computers by Means of Natural Selection, the 1994 book Genetic Programming II: Automatic Discovery of Reusable Programs, the 1999 book Genetic Programming III: Darwinian Invention and Problem Solving, and the 2003 book Genetic Programming IV: Routine Human-Competitive Machine Intelligence.
· For information on 3,198 papers (many on-line) on genetic programming (as of June 27, 2003) by over 900 authors, see William Langdon’s bibliography on genetic programming.
· For information on the Genetic Programming and Evolvable Machines journal published by Kluwer Academic Publishers
· For information on the Genetic Programming book series from Kluwer Academic Publishers, see the Call For Book Proposals
· For information on the annual GECCO conference (which includes the annual GP conference), to be held June 26–30, 2004 (Saturday – Wednesday) in Seattle, visit the International Society for Genetic and Evolutionary Computation (ISGEC).
· For information on the annual
Euro-Genetic-Programming Conference to be held on April 5-7, 2004 (Monday –
Wednesday) at the University of Coimbra in Coimbra Portugal, visit http://www.evonet.info/eurogp2004/