An introduction to foundational ideas in computational linguistics, including n-gram language models, noisy channel models, supervised and unsupervised learning, and distributional semantics. The focus is on developing an understanding of the intuitions behind these ideas, including the relevant algorithms and math, and on implementing that understanding in hands-on programming applications.
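As a rough sketch of the first topic named above, a bigram language model can be estimated from raw counts by maximum likelihood. The Python below is illustrative only; the toy corpus, whitespace tokenization, and the <s>/</s> boundary symbols are assumptions, not course materials.

```python
from collections import defaultdict

def train_bigram_model(sentences):
    """Estimate P(next_word | word) by maximum likelihood from raw bigram counts."""
    bigram_counts = defaultdict(lambda: defaultdict(int))
    context_counts = defaultdict(int)
    for sentence in sentences:
        tokens = ["<s>"] + sentence.split() + ["</s>"]  # pad with sentence boundaries
        for w1, w2 in zip(tokens, tokens[1:]):
            bigram_counts[w1][w2] += 1
            context_counts[w1] += 1
    # Convert counts to conditional probabilities P(w2 | w1)
    return {
        w1: {w2: n / context_counts[w1] for w2, n in nexts.items()}
        for w1, nexts in bigram_counts.items()
    }

# Toy corpus: two whitespace-tokenized sentences
model = train_bigram_model(["the cat sat", "the cat ran"])
print(model["<s>"]["the"])  # 1.0
print(model["cat"]["sat"])  # 0.5
```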

Prerequisites: Linguistics 102 or Linguistics 205 or Computer Science 8 or equivalent experience.

Units: 4
Grading: Optional
Pass Time: 1, 2, 3
Level Limit: Graduate students only
College: Letters and Science
Instructor: TODD S J
Lecture: SH 3519, T R, 2:00 PM - 3:15 PM (6 / 15 enrolled)
Section: GIRV 2110, W, 10:00 AM - 10:50 AM (6 / 15 enrolled)
Previous offerings:
LING 208, Mithun M, Spring 2015 (total enrollment: 6)
LING 208, Mithun M, Spring 2014 (total enrollment: 5)
Other linguistics courses:
LING 199: Independent Studies in Linguistics. TBA. 0 / 10 enrolled.
LING 211: Experimental Methods in Linguistics. Brehm L E. M W, 12:30 PM - 1:45 PM. 3 / 15 enrolled.
LING 212: Discourse Transcription. Mary Bucholtz. T R, 9:30 AM - 10:45 AM. 15 / 15 (full).