There is no class 11/21 because of the Thanksgiving holiday; see the University calendar.
Feel free to reach out if you have any questions/concerns about your paper/project. I will be in town and on email :)
Just posted readings for the rest of the semester as requested :)
For those thinking of using lasso in their project, please see
- The McNeish paper we read
- The ISLR book we read, section 6.5 for R code examples using glmnet
- Older code using glmpath: http://members.cbio.mines-paristech.fr/~jvert/svn/tutorials/practical/linearclassification/linearclassification.R
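The ISLR examples use R's glmnet, but if anyone is working in Python instead, here is a rough sketch of the same idea using scikit-learn's Lasso (this is my own toy example, not from the readings; the synthetic data and penalty value are just for illustration):

```python
# Toy lasso sketch: with an L1 penalty, coefficients on irrelevant
# predictors are shrunk all the way to zero (variable selection).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
# Only the first two predictors actually matter in this fake data.
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=100)

model = Lasso(alpha=0.5).fit(X, y)
print(np.round(model.coef_, 2))  # most coefficients end up exactly 0
```

Note that the penalty strength (`alpha` here, `lambda` in glmnet) should really be chosen by cross-validation; see the ISLR section for how to do that properly.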
Readings are up in the normal place. This week's readings are fairly deep into neural networks, which we haven't really talked about. Rather than give you something else to read, I'm going to try to give you what you need to know in this post.

First, there are two kinds of neural networks: those that have recurrent connections and those that don't. A recurrent connection is something like this: the circle represents a neuron (AKA node; unit) and the arrow represents a connection from the neuron back to itself. Recurrent connections can also be more distant, e.g. from a child neuron to its parent, as long as there is a path that leads back to the originating neuron, e.g. child --> parent --> child.

The important thing to understand about recurrent neural networks is that they are dynamical systems that unfold in time. Imagine that we initialize the neuron above to have a value (AKA energy; activation) of 1 and the recurrent connection to have a streng...
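To make the "unfolding in time" idea concrete, here is a tiny sketch (my own illustration, not from the readings) of a single neuron with a self-connection: at each time step, the neuron's new activation is its old activation times the strength of the recurrent connection.

```python
# A single recurrent neuron unfolding in time: the activation loops
# back through the self-connection and is rescaled by its weight.

def unfold(weight, steps, activation=1.0):
    """Return the activation trajectory of a self-connected neuron."""
    trajectory = [activation]
    for _ in range(steps):
        activation = weight * activation  # signal re-enters via the loop
        trajectory.append(activation)
    return trajectory

print(unfold(0.5, 4))  # weight < 1: [1.0, 0.5, 0.25, 0.125, 0.0625]
print(unfold(2.0, 4))  # weight > 1: [1.0, 2.0, 4.0, 8.0, 16.0]
```

Notice that a self-connection weaker than 1 makes the activation decay toward zero, while one stronger than 1 makes it blow up. This is exactly the dynamical-systems behavior that makes recurrent networks both powerful and tricky to train.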
Readings are up for next week :)

Following up on class discussion yesterday, here is the paper that I thought of assigning but was concerned about overloading people: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/05/Bishop-MBML-2012.pdf This is not assigned, but it's an excellent paper and feedback is welcome.

Also, someone asked about Gaussian processes yesterday. In thinking about the whole context of that discussion, I thought it might be helpful to recap some terminology.

A random process is something that creates a non-deterministic (uncertain) outcome.

Examples:
- Rolling dice (fair dice or not)
- Flipping a coin (fair coin or not)
- Random sampling

Counterexamples:
- Flipping a coin with two heads (because there is only 1 outcome, it is deterministic/certain)

A random variable maps the outcome of a random process to a number.

Example: Flipping a coin
- Random process: flipping a coin
- Random variable X where 1 is heads and 0 is tail...
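One way to see the process/variable distinction in code (a little sketch of my own, not from the readings): the random process produces outcomes, and the random variable X is just a deterministic function that maps each outcome to a number.

```python
# Random process vs. random variable: the process produces outcomes
# ('heads'/'tails'); the random variable X maps outcomes to numbers.
import random

def flip_coin(p_heads=0.5):
    """Random process: returns the outcome 'heads' or 'tails'."""
    return "heads" if random.random() < p_heads else "tails"

def X(outcome):
    """Random variable: maps heads -> 1, tails -> 0."""
    return 1 if outcome == "heads" else 0

random.seed(0)
flips = [X(flip_coin()) for _ in range(1000)]
print(sum(flips) / len(flips))  # empirical mean; near 0.5 for a fair coin
```

Note that X itself is not random at all; all the randomness lives in the process it is applied to.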
Readings are up for next week 😀 Please also take a look at the data description for the latest Kaggle lab (link distributed yesterday). The lab for next week will step through the Zuur protocol to build a predictive model for this dataset.