The lab component of the course will cover mathematical concepts used in computational linguistics. The purpose of the lab is to familiarize you with not-so-basic probability theory, information theory, Bayesian inference, linear algebra, and descriptive and inferential statistics. These concepts are crucial to understanding the computational linguistics and natural language processing algorithms covered in lecture. If you are shaky on these topics, you are encouraged to attend the lab. If you are going to take CS134 (Statistical Natural Language Processing) next year, the lab is highly recommended.

Lab instructor: Kenneth Lai

Place and Time: Fridays, 2–3 p.m., weekly in Lown 201, subject to change depending on the progress of the class.

**Lab notes**

Notes from the lab will be posted here as the semester progresses.

- 1/17 (in Volen 201): Intro to NumPy | installation instructions, tutorial
- 1/24: Finite State Machines and Array Math in NumPy | slides, code
- 1/31: Probability and Broadcasting in NumPy | slides
- 2/7 (in class): Naive Bayes and Logistic Regression | slides, code
- 2/14: Basic Math of Neural Networks | slides, maxent code, mlp code, data
- 2/28: N-grams and More Math of Neural Networks | slides
- 3/6: Markov Chains and Hidden Markov Models | slides on LATTE
- 3/13: Recurrent Neural Networks | slides
- 3/20 (online): Fancy RNNs | slides
- 3/27 (online): Context-Free Grammars and CKY Algorithm | slides
- 4/3: no lab
- 4/17 (online): Word2vec Tutorial | slides on LATTE
- 4/24 (online): Contextualized Word Embeddings | slides, ELMo, BERT
- 5/1 (online): Neural Networks in Python | slides, code
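As a small taste of the array math covered in the early NumPy labs, here is a minimal sketch of broadcasting and row normalization (an illustration only, not the lab materials themselves):

```python
import numpy as np

# Broadcasting: adding a (3, 1) column vector to a (4,) row vector
# produces a (3, 4) matrix without writing any explicit loops.
col = np.arange(1, 4).reshape(3, 1)  # [[1], [2], [3]]
row = np.arange(4)                   # [0, 1, 2, 3]
grid = col + row                     # shape (3, 4)

# Normalizing each row to sum to 1 turns the counts into
# probability distributions, a pattern that recurs in the
# probability and Naive Bayes labs.
probs = grid / grid.sum(axis=1, keepdims=True)
```

After running this, `probs` has the same shape as `grid`, and each of its rows sums to 1.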

Last year’s labs:

- Intro to Python
- Earley Algorithm

Labs from before last year:

- Frequentist and Bayesian Statistics
- Entropy and Mutual Information
- Bayesian Inference, Conjugate Priors, and Summation Pushing