Computational Semantics for NLP

Graduate Course, Carnegie Mellon University, Language Technologies Institute, 2017

I was a teaching assistant for the class “Computational Semantics for NLP” (CMU-11-727). Working with Professor Eduard Hovy and Professor Teruko Mitamura, I helped grade the homework assignments and projects. I also gave a lecture on deep learning methods for computational semantics.

Lecture

I gave a lecture on “Embeddings and deep semantic representations” as part of the course.

Course Outline

This course surveys semantics from a language processing perspective. It is divided into three main sections, supplemented with a substantive semester-long computational project. The first section addresses traditional topics of computational semantics, such as formal representation systems, inference, and semantic processing. The second focuses on modern statistical distributional and neural deep-learning models of semantics. The third section covers the resources and tasks specific to computational lexical semantics, including resources such as WordNet, FrameNet, and other word-based ontologies, and computational applications such as coreference resolution and entailment. Each week focuses on one topic, covered by the lecturers, and includes one or two core introductory readings plus several optional, more advanced readings. All students read at least one paper weekly for in-class discussions.