Week 15
I. A Busy Week
- Between finals, final projects, and coming down with the flu over
the weekend, I did not have much time to work on my MQP this week. One
thing I did do was add generation numbers next to the matrix displays
in the "Generations" window.

This makes it easier to keep track of what you're looking at, especially
when zooming in or out.
II. Term Summary
- Perhaps my most important accomplishment this term was making my
program dramatically more interactive. I have added the ability to
browse generations, to view the family tree for any network, to save
key networks for later use, and to adjust the level of mutation while
the evolution is in progress.
- I have made my network display more informative by using a
combination of line length and brightness to indicate weight strength,
and have added perpendicular lines to indicate the volatility of each
weight. In addition, I came up with a compact matrix display which
allows many networks to be viewed and compared at once.
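The display encoding above can be sketched roughly as follows. This is
only an illustration of the idea, not the program's actual code; the
scaling constants and clamping range here are made-up values:

```python
# Sketch of mapping a weight and its mutation sigma to drawing
# parameters. All constants are hypothetical illustrations.
MAX_LEN = 40      # pixels for the strongest weight shown
MAX_SIGMA = 1.0   # sigma value mapped to the longest volatility tick

def display_params(weight, sigma):
    """Map a weight and its sigma to (length, brightness, tick, sign)."""
    strength = min(abs(weight), 4.0) / 4.0        # clamp and normalize
    length = int(MAX_LEN * strength)              # longer line = stronger
    brightness = int(255 * strength)              # brighter = stronger
    # perpendicular tick length grows with the weight's volatility
    tick = int(MAX_LEN * min(sigma, MAX_SIGMA) / MAX_SIGMA / 2)
    sign = '+' if weight >= 0 else '-'
    return length, brightness, tick, sign

print(display_params(2.0, 0.25))   # mid-strength weight, small tick
print(display_params(-8.0, 2.0))   # strong negative weight, large tick
```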
- Finally, I have solved at least two non-trivial problems, namely
the scale-balancing problem and the problem with housing data described
in the previous week's notes.
III. Observed Utility
- I have found my network display to be useful in spotlighting
unnecessary hidden nodes. When all the weights going into or out of
a given node are near zero, I have been able to optimize the search
process by reducing the number of hidden nodes.
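The check I make by eye could be written down like this. The helper
and its weight-matrix layout are my own illustration, not code from
the program, and the threshold is arbitrary:

```python
# Hypothetical sketch: find hidden nodes whose incoming OR outgoing
# weights are all near zero -- candidates for removal.
def prunable_hidden_nodes(w_in, w_out, eps=0.01):
    """w_in[i][h]: input i -> hidden h; w_out[h][o]: hidden h -> output o.
    Returns indices of hidden nodes that look unnecessary."""
    prunable = []
    for h in range(len(w_out)):
        incoming = [row[h] for row in w_in]
        outgoing = w_out[h]
        if all(abs(w) < eps for w in incoming) or \
           all(abs(w) < eps for w in outgoing):
            prunable.append(h)
    return prunable

w_in = [[0.005, 1.2], [0.002, -0.8]]   # 2 inputs, 2 hidden nodes
w_out = [[0.9], [0.3]]                 # 2 hidden nodes, 1 output
print(prunable_hidden_nodes(w_in, w_out))  # node 0's inputs are ~zero
```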
- When I was trying to design neural networks by hand, I found the
Network Editor to be a useful debugging tool. By selecting specific
hidden nodes and monitoring their inputs and outputs for different
samples, I was able to see which instances in the training set were
causing problems.
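What the Network Editor lets me do by hand amounts to something like
the sketch below: watch one hidden node's net input and activation
across samples. The sigmoid, weights, and samples here are made up
for illustration:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def probe_hidden_node(samples, weights, bias):
    """Return (net input, activation) for one hidden node per sample."""
    readings = []
    for sample in samples:
        net = bias + sum(w * x for w, x in zip(weights, sample))
        readings.append((net, sigmoid(net)))
    return readings

samples = [[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
for net, out in probe_hidden_node(samples, weights=[2.0, -3.0], bias=0.5):
    print(f"net={net:+.2f}  out={out:.3f}")
```

Scanning the readings sample by sample is how I spot the training
instances that drive a node into saturation.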
- By looking at the "Generations" window and examining the family
trees for different networks, I was able to better understand the
phenomenon of genetic drift.
IV. Future Goals
- In the coming term, I would like to further increase the interactivity
of my program. I think it would be useful to be able to
treat the weight and sigma bars as sliders, and to be able to slide a
weight back and forth and watch the fitness change on the fly.
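The slider idea boils down to sweeping one weight through a range and
recomputing fitness at each step. As a toy sketch (a one-weight model
on made-up data, with negative squared error standing in for my real
fitness function):

```python
def fitness(w, data):
    """Toy fitness: negative sum of squared errors for y ~ w * x."""
    return -sum((y - w * x) ** 2 for x, y in data)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]    # y = 2x exactly
steps = [i / 4 for i in range(0, 13)]          # slide w from 0.0 to 3.0
curve = [(w, fitness(w, data)) for w in steps]
best_w, best_f = max(curve, key=lambda p: p[1])
print(f"best w = {best_w}, fitness = {best_f}")
```

Plotting that curve as the slider moves is exactly the "watch the
fitness change on the fly" behavior I have in mind.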
- I would also like to be able to apply back-propagation to individual
networks or to the entire population.
- It would be nice to be able to load & save entire sessions, rather
than just individual networks.
- Finally, I would like to solve more problems from the Machine Learning
Repository.
V. Executable Demo (Again)
When I uploaded the revised demo last week, I forgot to change the case
from DOS capitals to mixed, human-style upper and lower case. Here, at long
last, is a link to a demo that actually works.