/*
* moses/moses/example-progs/nmax.cc
*
* Copyright (C) 2002-2008 Novamente LLC
* All Rights Reserved
*
* Written by Predrag Janicic
* Documented by Linas Vepstas, 2011
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License v3 as
* published by the Free Software Foundation and including the exceptions
* at http://opencog.org/wiki/Licenses
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program; if not, write to:
* Free Software Foundation, Inc.,
* 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
*/
#include "headers.h"
using boost::lexical_cast;
// Demonstration program for the "nmax" optimization problem. This
// is a standard optimization demonstration problem: a scoring
// function is given that totals up the values of a set of discrete
// variables. This is the "n_max" scoring function. The optimizer is
// expected to find the best solution to this function: namely, an
// assignment in which every variable takes its maximum value.
// This is a generalization of the "onemax" problem, which is nmax with n=2.
//
// XXX setting n=2 currently fails due to a bug, see
// https://bugs.launchpad.net/moses/+bug/908230
//
// NOTE: This is NOT a demonstration of program learning, which is what
// MOSES is designed for; rather, this a demonstration of the use of a
// certain component within MOSES, the so-called "optimizer". MOSES itself
// relies heavily on this optimizer to implement its meta-optimization
// algorithm.
//
// As such problems go, this is among the simplest to solve. The code
// below illustrates how to do this using the MOSES infrastructure.
//
// This program requires five arguments:
// -- an initial seed value for the random number generator
// -- the number of discrete variables
// -- the population size
// -- the maximum number of generations to run.
// -- the number of values that variables may take.
//
// Suggested values: 0 8 70 100 5.
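//
// Assuming the built binary is named "nmax" (the name is an
// assumption; it depends on how the example programs are built),
// a typical invocation might be:
//
//    ./nmax 0 8 70 100 5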
//
// The population size should always be at least (n-1)*num_variables,
// as otherwise the least-fit individuals will be bred out of the
// population before the fittest individual is found, causing the
// algorithm to loop forever (well, only up to the maximum number of
// generations). There is some risk of this even when the population
// is only a little larger than this lower bound; one and a half or
// twice this lower bound would be a good choice.
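//
// For example, with the suggested arguments above (8 variables,
// n=5), this lower bound is (5-1)*8 = 32, and the suggested
// population size of 70 is roughly twice that.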
//
// As output, this will print the fittest individual found at each
// generation. At the conclusion, the entire population will be printed.
int main(int argc, char** argv)
{
// Tell the system logger to print detailed debugging messages to
// stdout. This will let us watch what the optimizer is doing.
// Set to Logger::WARN to show only warnings and errors.
logger() = Logger("demo.log");
logger().set_level(Logger::FINE);
logger().set_print_to_stdout_flag(true);
// We also need to declare a specific logger for the algo.
// This one uses the same system logger() above, and writes all
// messages at the "debug" level. This allows the main loop of the
// algo to be traced.
cout_log_best_and_gen mlogger;
// Parse program arguments
vector<string> add_args{"<number of values>"};
optargs args(argc, argv, add_args);
int n = lexical_cast<int>(argv[5]);
// Initialize random number generator (from the first argument
// given to the program).
randGen().seed(args.rand_seed);
// Create a set of "fields". Each field is a discrete variable,
// with 'n' different possible settings. That is, each field has
// a multiplicity or "arity" of 'n'. The number of such discrete
// variables to create was passed as the second argument to the
// program.
//
// For the onemax problem, 'n' would be 2.
field_set fs(field_set::disc_spec(n), args.length);
// Create a population of instances (bit-strings) corresponding
// to the field specification above. The length of the bit string
// will be ceil(log_2(n)) times the length of the field specification.
// This is because 'n' values require at least log_2(n) bits to be
// represented in binary. The population size was passed as the
// third argument to the program.
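//
// For example, with the suggested arguments, n=5 values require
// ceil(log_2(5)) = 3 bits per field, so each of the 8 fields is
// encoded in 3 bits, for 24 bits per instance.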
instance_set<int> population(args.popsize, fs);
// Initialize each member of the population to a random value.
for (instance& inst : population)
generate(fs.begin_disc(inst), fs.end_disc(inst),
bind(&RandGen::randint, boost::ref(randGen()), n));
// Run the optimizer.
// For this problem, there is no dependency at all between different
// fields ("genes") in the field set. Thus, for the "structure
// learning" step, use the univariate() model, which is basically a
// no-op; it doesn't try to learn any structure.
//
// The "num to select" argument is the number of individuals to
// select for learning the population distribution. For this problem,
// it makes sense to select them all. For smaller selections, the
// SelectionPolicy is used to make the selection; here, the
// tournament_selection() policy is used to select the fittest
// individuals from the population. Since we select all, holding
// a tournament is pointless.
//
// The "num to generate" is the number of individuals to create for
// the next generation. These are created with reference to the
// learned model. If the model is working well, then the created
// individuals should be fairly fit. In this example, it makes
// sense to replace half the population each generation.
// The generated individuals are then folded into the population
// using the replace_the_worst() replacement policy. This
// replacement policy is unconditional: the worst part of the
// current population is replaced by the new individuals (even if
// the new individuals are less fit than those they replace!
// But this is good enough for this example...)
//
// The n_max() scoring function simply totals up the values of all
// the fields in the instance. It is defined in scoring_functions.h
// The termination policy will halt iteration if an individual is
// discovered to have a score of "(n-1)*args.length" -- since each
// field takes a value from 0 to (n-1), the largest possible score
// is the maximum field value times the number of fields.
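// For example, with the suggested arguments (n=5, 8 variables),
// iteration halts when an individual reaches a score of
// (5-1)*8 = 32.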
//
int num_score_evals =
optimize(population,       // population of instances, from above.
         args.popsize,     // num to select
         args.popsize / 2, // num to generate
         args.max_gens,    // max number of generations to run
         n_max(fs),        // ScoringPolicy
         terminate_if_gte<int>((n-1)*args.length), // TerminationPolicy
         tournament_selection(2), // SelectionPolicy
         univariate(),     // StructureLearningPolicy
         local_structure_probs_learning(), // ProbsLearningPolicy
         replace_the_worst(), // ReplacementPolicy
         mlogger);
// The logger is asynchronous, so flush its output before
// writing to cout, else the output will be garbled.
logger().flush();
cout << "A total of " << num_score_evals
     << " scoring function evaluations were done." << endl;
// Show the final population
// cout << "Final population:\n" << population << endl;
cout << "The final population was:" << endl;
instance_set<int>::const_iterator it = population.begin();
for (; it != population.end(); ++it) {
    cout << "Score: " << it->second
         << "\tindividual: " << population.fields().to_string(it->first)
         << endl;
}
}