Readings are up in the normal place. This week's readings are fairly deep into neural networks, which we haven't really talked about. Rather than give you something else to read, I'm going to try to give you what you need to know in this post.

First, there are two kinds of neural networks: those that have recurrent connections and those that don't. A recurrent connection is something like this:

The circle represents a neuron (AKA node; unit) and the arrow represents a connection from the neuron back to itself. Recurrent connections can also be more distant, e.g. from a child neuron to its parent, as long as there is a path that leads back to the originating neuron, e.g. child --> parent --> child.

The important thing to understand about recurrent neural networks is that they are dynamical systems that unfold in time. Imagine that we initialize the neuron above to have a value (AKA energy; activation) of 1 and the recurrent connection to have a streng