Teaching

Neural Networks for NLP

Graduate Course, Carnegie Mellon University, Language Technologies Institute, 2017

I was a teaching assistant for the class “Neural Networks for NLP” (CMU 11-747). Working with Prof. Graham Neubig, I helped design part of the course syllabus, quizzes, and projects. I was also responsible for grading the quizzes and projects.

Computational Semantics for NLP

Graduate Course, Carnegie Mellon University, Language Technologies Institute, 2017

I was a teaching assistant for the class “Computational Semantics for NLP” (CMU 11-727). Working with Prof. Eduard Hovy and Prof. Teruko Mitamura, I helped grade the homework and projects. I also gave a lecture on deep learning methods for computational semantics.

Coreference Resolution

Guest Lecture, Carnegie Mellon University, Language Technologies Institute, 2015

I gave a guest lecture for the class “Tools for NLP” on the topic of coreference resolution. Starting with a review of the history of modern coreference resolution, I introduced the techniques, approaches, and evaluation methods for entity coreference. I then covered newer work on more general forms of coreference, including quasi-coreference and event coreference. I also discussed related topics with the students, such as scripts and clustering.

Script Induction in a Probabilistic Way

Seminar Talk, Carnegie Mellon University, Language Technologies Institute, 2014

I gave a talk at the “Advanced Seminar in Semantics” (CMU 11-728) on the topic of script induction, comparing various probabilistic methods for this task.