- I have been organising some notes on topics I teach.
- Together with Philip Schulz, I am organising a tutorial on variational inference. The tutorial is modular and some of its modules have already been presented a few times (check the repository for a gist of our schedule).
2018-present (offered in Spring)
I coordinate and teach Unsupervised Language Learning together with Dr Ekaterina Shutova. This is a second-year course offered to students in the Master's in Artificial Intelligence programme of the University of Amsterdam. The course covers advanced unsupervised learning techniques in natural language processing with a focus on meaning representation.
2018-present (offered in Winter)
I coordinate and teach Natural Language Models and Interfaces. This is a second-year course offered to students in the Bachelor’s in Artificial Intelligence programme of the University of Amsterdam. The course covers some of the essential techniques in natural language processing with a focus on language modelling and word representation:
- Review of probability theory
- Probability of a sentence
- Markov Models and n-gram language modelling
- Hidden Markov Models and part-of-speech tagging
- Probabilistic context-free grammars
- Locally normalised log-linear models
- Distributional semantics and neural models of word representation
- Overview of NLP applications
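As a minimal illustration of the n-gram language modelling covered above, here is a sketch of a bigram model with add-one (Laplace) smoothing; the function and variable names are illustrative, not course material:

```python
from collections import Counter

def train_bigram(corpus):
    # corpus: list of tokenised sentences; pad each with boundary markers
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    vocab = len(unigrams)

    def prob(word, prev):
        # add-one smoothed estimate of P(word | prev)
        return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)

    return prob

prob = train_bigram([["the", "cat", "sat"], ["the", "dog", "sat"]])
```

Smoothing matters here because an unsmoothed maximum-likelihood estimate assigns zero probability to any bigram absent from the training data.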
2015-present (offered in Spring)
I organise and teach Natural Language Processing II at UvA; the course focuses on machine translation. This is the third edition I have organised; the first was in the spring of 2015.
The course is organised in three blocks:
- Unsupervised word alignments: directed models (EM estimation and VB) and undirected models (MLE via gradient-based optimisation)
- Statistical machine translation: (hierarchical) phrase-based MT (linear models), latent-variable CRF for hierarchical phrase-based SMT
- Neural machine translation: fully supervised sequence to sequence models, deep generative models
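The first block's EM estimation of directed alignment models can be sketched with IBM Model 1, whose E-step collects expected translation counts and whose M-step renormalises them; this is a toy implementation for illustration only, with made-up data in the usage below:

```python
from collections import defaultdict

def ibm1_em(pairs, iters=10):
    # pairs: list of (source_sentence, target_sentence), each a token list
    # t[f][e] approximates the translation probability P(f | e);
    # the uniform initialisation is implicit in the default value
    t = defaultdict(lambda: defaultdict(lambda: 1.0))
    for _ in range(iters):
        count = defaultdict(lambda: defaultdict(float))
        for f_sent, e_sent in pairs:
            for f in f_sent:
                # normaliser: total mass f receives from this target sentence
                z = sum(t[f][e] for e in e_sent)
                for e in e_sent:
                    # E-step: expected count of (e, f) under current t
                    count[e][f] += t[f][e] / z
        for e in count:
            # M-step: renormalise expected counts into probabilities
            total = sum(count[e].values())
            for f in count[e]:
                t[f][e] = count[e][f] / total
    return t

t = ibm1_em([(["das", "Haus"], ["the", "house"]),
             (["das", "Buch"], ["the", "book"])])
```

On this classic two-sentence example, the co-occurrence of "das" with "the" in both pairs drives t["das"]["the"] well above its uniform starting point.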
March 2017 (6 EC)
A four-week course on Bayesian inference for PCFGs. Topics covered:
- Parsing as weighted deduction (CKY, Earley)
- Inside-Outside and ancestral sampling
- Maximum likelihood of PCFGs via EM
- Introduction to the Dirichlet distribution and related processes
- Bayesian inference for PCFGs: Gibbs sampling and a collapsed Metropolis-Hastings sampler
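The inside computation underlying several of these topics (CKY as weighted deduction, Inside-Outside, EM for PCFGs) can be sketched as a CKY-style dynamic program that sums, rather than maximises, over derivations; this is a minimal version assuming a grammar in Chomsky normal form, with illustrative names throughout:

```python
from collections import defaultdict

def inside_cky(words, lex, rules, start="S"):
    # lex[(A, word)]: probability of the preterminal rule A -> word
    # rules[(A, B, C)]: probability of the binary rule A -> B C (CNF assumed)
    n = len(words)
    # inside[(i, j, A)]: total probability of A deriving words[i:j]
    inside = defaultdict(float)
    for i, w in enumerate(words):
        for (A, word), p in lex.items():
            if word == w:
                inside[(i, i + 1, A)] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):  # split point between the two children
                for (A, B, C), p in rules.items():
                    inside[(i, j, A)] += p * inside[(i, k, B)] * inside[(k, j, C)]
    return inside[(0, n, start)]
```

Replacing the sums with max (and keeping backpointers) turns this into the Viterbi CKY parser; the sums are what the Inside-Outside algorithm and the samplers above build on.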