Changing weights & local optima
- The error backpropagation algorithm performs a gradient-based search
  through weight space, so it converges to a local optimum of the error
  surface, not necessarily the global one
- If a weight is dragged away from its value while backprop is running,
  the network will either "snap back" to the original optimum, or all
  of the weights will shift together to a new optimum, depending on
  whether the perturbation leaves the original basin of attraction
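A minimal sketch of this snap-back behavior (my illustration, not from the source): plain gradient descent on a one-dimensional double-well loss. The loss function, learning rate, and step counts are all assumptions chosen so the two behaviors are easy to see.

```python
def grad(w):
    # Toy loss f(w) = w^4 - 2w^2 + 0.3w: two local minima (near
    # w ≈ 0.96 and w ≈ -1.04) separated by a barrier near w ≈ 0.08.
    return 4 * w**3 - 4 * w + 0.3

def descend(w, lr=0.01, steps=5000):
    # Plain gradient descent, standing in for backprop's weight updates.
    for _ in range(steps):
        w -= lr * grad(w)
    return w

w_opt = descend(0.5)            # converge into the right-hand basin
snapped = descend(w_opt + 0.3)  # small drag: snaps back to the same optimum
shifted = descend(-0.5)         # drag past the barrier: settles in a new optimum
```

A drag that stays inside the basin of attraction is undone by the gradient; a drag that crosses the barrier hands the search over to a different optimum.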
- By probing how far each weight can be displaced before the network
  fails to snap back, one can estimate the extent of a local optimum's
  basin along each axis in weight space
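The per-axis estimate can be sketched as follows (my illustration, under assumed parameters): displace the converged weights along one axis in growing increments and record the largest displacement from which gradient descent still returns to the same optimum. The two-dimensional toy loss, probe step, and cap are assumptions.

```python
def grad(w1, w2):
    # Toy loss f(w1, w2) = (w1^2 - 1)^2 + w2^2:
    # minima at (±1, 0), with a saddle at (0, 0).
    return 4 * w1 * (w1**2 - 1), 2 * w2

def descend(w1, w2, lr=0.02, steps=4000):
    # Plain gradient descent on both weights.
    for _ in range(steps):
        g1, g2 = grad(w1, w2)
        w1, w2 = w1 - lr * g1, w2 - lr * g2
    return w1, w2

def basin_extent(opt, axis, sign, cap=2.0, step=0.25, tol=0.05):
    # Largest probed displacement along one signed axis from which
    # descent still returns to `opt` (capped at `cap`).
    extent, d = 0.0, step
    while d <= cap:
        probe = list(opt)
        probe[axis] += sign * d
        end = descend(*probe)
        if abs(end[0] - opt[0]) > tol or abs(end[1] - opt[1]) > tol:
            break                # left the basin: stop probing
        extent = d
        d += step
    return extent

opt = descend(0.5, 0.5)          # lands near the minimum at (1, 0)
widths = {(ax, s): basin_extent(opt, ax, s)
          for ax in (0, 1) for s in (+1, -1)}
```

Here the basin of (1, 0) is bounded only in the negative w1 direction (by the saddle at the origin), so that probe stops early while the other three run to the cap.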