The lab component of the course covers mathematical concepts used in computational linguistics. Its purpose is to familiarize you with not-so-basic probability theory, information theory, Bayesian inference, linear algebra, and descriptive and inferential statistics. These concepts are crucial for understanding the computational linguistics and natural language processing algorithms covered in lecture. If you are shaky on these topics, we recommend attending the lab. If you plan to take CS134 (Statistical Natural Language Processing) next year, the lab is highly recommended.
Lab instructor: Kenneth Lai
Place and time: Fridays, 2–3 p.m., in Lown 201, held weekly depending on the progress of the class.
Notes from the lab will be posted here as the semester progresses.
- 1/17 (in Volen 201): Intro to NumPy | installation instructions, tutorial
- 1/24: Finite State Machines and Array Math in NumPy | slides, code
- 1/31: Probability and Broadcasting in NumPy | slides
- 2/7 (in class): Naive Bayes and Logistic Regression | slides, code
- 2/14: Basic Math of Neural Networks | slides, maxent code, mlp code, data
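As a taste of the NumPy material from the 1/24 and 1/31 labs, here is a minimal, illustrative sketch of broadcasting (this is not the lab code; the example data are made up):

```python
import numpy as np

# Word counts per document: 3 documents x 4 vocabulary items (toy data).
counts = np.array([[2, 0, 1, 1],
                   [0, 3, 0, 1],
                   [1, 1, 1, 1]])

# Normalize each row into a probability distribution. The column vector
# of row sums has shape (3, 1) and broadcasts across the 4 columns.
row_sums = counts.sum(axis=1, keepdims=True)
probs = counts / row_sums

print(probs.sum(axis=1))  # each row now sums to 1
```

Broadcasting lets the (3, 1) array of row sums divide the (3, 4) count matrix without an explicit loop, a pattern that comes up constantly in probability computations.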
Last year’s labs:
- Intro to Python
- Markov Chains and Hidden Markov Models
- CKY Algorithm
- Earley Algorithm
Labs from before last year:
- Frequentist and Bayesian Statistics
- Entropy and Mutual Information
- Word2vec tutorial
- Bayesian Inference, Conjugate Priors, and Summation Pushing
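For a flavor of the "Entropy and Mutual Information" lab, here is a small illustrative sketch of Shannon entropy (not the lab's code; the function name and examples are our own):

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a probability distribution
    given as a list of probabilities summing to 1."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A fair coin carries exactly 1 bit of entropy; a certain outcome, 0 bits.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([1.0]))       # 0.0
```

Terms with zero probability are skipped, following the convention that 0 log 0 = 0.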