Week 14
I. Problems Solved
- I added the ability to specify the activation functions for the hidden
layers by typing one-letter abbreviations at the "Hidden Nodes Per Layer"
prompt. For example, typing "8S,4S,2L" would generate a network whose
first hidden layer has 8 nodes and uses a sigmoid activation function,
whose second layer has 4 nodes, also with sigmoid activation functions,
and whose third layer uses 2 linear nodes.
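
A minimal sketch of how such a specification string might be parsed,
written in Python for illustration (not necessarily the language of my
program; the names ACTIVATIONS and parse_architecture are invented
here):

    # Parse an architecture string such as "8S,4S,2L" into
    # (node_count, activation) pairs; "S" = sigmoid, "L" = linear.
    ACTIVATIONS = {"S": "sigmoid", "L": "linear"}

    def parse_architecture(spec):
        layers = []
        for token in spec.split(","):
            token = token.strip()
            layers.append((int(token[:-1]), ACTIVATIONS[token[-1].upper()]))
        return layers

    # parse_architecture("8S, 4S, 2L")
    # -> [(8, 'sigmoid'), (4, 'sigmoid'), (2, 'linear')]
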
- By using an architecture of "8S, 1S", I was able to obtain an RMS
error of 0.008932 on the 30-sample subset of the balance-scale problem
that I used to train the networks. Unfortunately, when evaluated on
the entire problem set, the RMS error jumped to 0.210697. The following
image shows the network that obtained these fitness ratings:

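For reference, the fitness figures quoted throughout this report are
RMS errors. A minimal sketch of the computation, assuming the network
outputs and target values are plain lists of floats:

    # RMS error over paired outputs and targets; this is the fitness
    # measure quoted throughout this report.
    import math

    def rms_error(outputs, targets):
        squared = [(o - t) ** 2 for o, t in zip(outputs, targets)]
        return math.sqrt(sum(squared) / len(squared))

    # rms_error([0.9, 0.1], [1.0, 0.0])  ->  approximately 0.1
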
- The next problem I tried from the Machine Learning Repository was the
only one I could find that was even easier than the scale-balancing
problem: the "balloon classification" problem, which involved
determining whether a balloon was "inflated" or not based on four
boolean attributes. The rule was that the balloon was inflated if and
only if it was both "yellow" and "small". An admittedly simple problem,
but one worth trying nonetheless. The following network obtained an RMS
error of 0.000261 on this problem:

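Because the target rule is so small, the whole truth table can be
written out directly. A sketch, with attribute names assumed from the
problem description:

    # The balloon rule: inflated if and only if yellow and small.
    # Attribute names are assumed for illustration; the other two
    # attributes are deliberately ignored by the rule.
    from itertools import product

    def inflated(yellow, small, stretched, adult):
        return yellow and small

    # All 16 combinations of the four boolean attributes:
    table = [(attrs, inflated(*attrs))
             for attrs in product([False, True], repeat=4)]
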
- The next problem I solved consisted of predicting the market value
of a house based on 13 attributes, including crime rates, distance
to major employment centers, student/teacher ratios for the town, and
others. Using an architecture of "4S, 2S", I was able to obtain an RMS
error of just over 0.03 on my 30-sample subset, which corresponds to
guessing the house price to within about $1,500. On the entire data
set, the RMS error was about 0.17, or within $8,500 of the actual
price. After noting these figures, I left my program to run for an
hour or two while I went to a lab, hoping that it might improve a
little more given time. When I came back (several thousand generations
later), I found that the 30-sample error had dropped to 0.028316, but
the overall error had shot up to 0.79; evidently the extra generations
had only overfit the tiny training subset.
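
The dollar figures above come from undoing the normalization; I am
assuming the prices were scaled into [0, 1] over a range of roughly
$50,000, which is what the quoted numbers imply:

    # Convert a normalized RMS error back into dollars, assuming the
    # prices were scaled over a range of about $50,000 (an assumption
    # inferred from the figures quoted above).
    PRICE_RANGE = 50000.0

    def rms_in_dollars(normalized_rms):
        return normalized_rms * PRICE_RANGE

    # rms_in_dollars(0.03)  ->  1500.0
    # rms_in_dollars(0.17)  ->  8500.0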

- Try as I might, I was unable to obtain significant convergence on the
tic-tac-toe problem. Using an architecture of "8S, 4S, 2S", I was
able to get the RMS error down to about 0.21, which indicates that the
network is making some progress on the problem, but not nearly enough
to converge to a solution. I think it might help to use a bias in
solving this particular problem, as sketched below.
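
By a bias I mean an extra weight on a constant input of 1.0, which
lets each node learn its own threshold. A minimal sketch of a single
node's output with such a term (the names are invented for this
illustration):

    # A single node's output with a bias term. Without the bias, an
    # all-zero input (such as an empty tic-tac-toe board encoded as
    # zeros) always produces sigmoid(0) = 0.5, whatever the weights.
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def node_output(inputs, weights, bias):
        return sigmoid(sum(i * w for i, w in zip(inputs, weights)) + bias)
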
- Finally, when evolving solutions to a few of these problems, I noticed
long plateaus in the fitness graphs, followed by sudden drops. Though
this is what you would expect from an evolutionary algorithm, I was
surprised by how long the plateaus persisted. The plot on the left
shows the fitness scores I obtained in trying to solve the tic-tac-toe
problem; the plot on the right is for the housing problem described
above.
II. Executable Demo
The following link allows you to download the latest
version of my program, this time with the .DLLs included.