Deep Learning for Natural Language Processing

Participants in this course will learn to solve a wide range of applied problems in Natural Language Processing, such as text representation, information extraction, text mining, word sense disambiguation, language modeling, similarity detection, and text summarization. The approaches studied in this course focus on neural network architectures such as recurrent neural networks, sequence-to-sequence models, and transformers.

Course Content

The lecture will cover the following topics:

  • Basics: Text representation, text processing, regular expressions, tokenization, stemming, lemmatization
  • Bag-of-Words, weighting schemes (e.g., tf-idf) (see the first sketch after this list)
  • Language models, n-grams, perplexity, information gain, smoothing
  • Word sense, lexical databases, distance measures
  • Applications and tasks: document classification, text summarization, named entity recognition, word sense disambiguation
  • Word embeddings and dense vector representation models: word2vec, GloVe, fastText, paragraph vectors, multi-sense embeddings (see the second sketch after this list)
  • Basics on neural networks, feed-forward networks
  • Activation functions, cost function, gradient descent, regularization, backpropagation
  • Neural language models, recurrent neural networks, vanishing gradients
  • Architectures used in NLP: LSTM, GRU, biLSTM, Seq2Seq, Attention, CNN, Transformers
  • Large language models such as ChatGPT
  • Bias and ethics in NLP
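
To give a first impression of the Bag-of-Words and tf-idf topics listed above, here is a minimal, illustrative Python sketch; it is not official course material. It assumes scikit-learn is installed, and the toy corpus and variable names are invented for the example.

```python
# Minimal Bag-of-Words with tf-idf weighting (illustrative only, not course code).
# Assumes scikit-learn is installed.
from sklearn.feature_extraction.text import TfidfVectorizer

# A toy corpus; real exercises would use much larger document collections.
corpus = [
    "Natural language processing turns text into representations.",
    "Language models assign probabilities to sequences of words.",
    "Word embeddings map words to dense vectors.",
]

# Tokenize, build the vocabulary, and compute tf-idf weights in one step.
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english")
tfidf_matrix = vectorizer.fit_transform(corpus)  # shape: (n_documents, n_terms)

# Inspect which terms received the highest weight in the first document.
terms = vectorizer.get_feature_names_out()
doc0 = tfidf_matrix[0].toarray().ravel()
for idx in doc0.argsort()[::-1][:5]:
    print(f"{terms[idx]}: {doc0[idx]:.3f}")
```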
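Similarly, the word-embedding topic (word2vec and related models) can be previewed with a small gensim sketch. This is a hedged illustration assuming gensim 4.x is available; the tiny corpus and hyperparameters are placeholders, not the settings used in the lecture.

```python
# Tiny word2vec demo with gensim (illustrative only; hyperparameters are placeholders).
from gensim.models import Word2Vec

# Pre-tokenized toy sentences; meaningful embeddings need far more data.
sentences = [
    ["deep", "learning", "for", "natural", "language", "processing"],
    ["word", "embeddings", "represent", "words", "as", "dense", "vectors"],
    ["language", "models", "predict", "the", "next", "word"],
]

# Train a skip-gram model (sg=1); vector_size, window, and epochs are toy values.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and its nearest neighbours in vector space.
vector = model.wv["language"]  # 50-dimensional numpy array
print(model.wv.most_similar("language", topn=3))
```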

In the exercise, students will work on applied research projects (teamwork is possible) that address complex information retrieval and natural language processing tasks. Implementing the projects in Python and presenting their intermediate and final results are mandatory.


Learning Objectives

After successfully completing the course, students should be able to:

  • Explain state-of-the-art methods to tackle NLP sub-problems, such as text representation, information extraction, text mining, language modeling, and similarity detection
  • Determine the conceptual requirements of specific NLP tasks
  • Assess the strengths and limitations of state-of-the-art NLP approaches
  • Devise solutions for complex, interdisciplinary NLP problems by implementing and adapting suitable algorithms and data structures
  • Evaluate NLP methods and systems quantitatively and qualitatively

The course provides a good foundation for a master’s thesis in our group. Check this page for our current thesis proposals.

Requirements

  • Knowledge of at least one object-oriented programming language, preferably Python, is required; Python is used throughout the course.
  • Basic knowledge of neural networks is desirable for participation in this course. We recommend that participants be familiar with basic neural network architectures, hidden layers, activation functions, derivatives, classification, training and test strategies, precision, recall, backpropagation, gradients, and other foundational topics in machine learning and artificial neural networks. For participants who are unfamiliar with these topics, an integrated, fast-paced introduction focused on the use case of natural language processing will be provided. At the University of Göttingen’s computer science department, the course B.Inf.1236 Machine Learning provides an excellent foundation for this course.

Exam

  • Applied research project (includes teaser, intermediate, and final presentation) – 67% of the final grade
  • Written test (90 min.) or oral exam (approx. 20 min.) on the lecture content – 33% of the final grade

Time schedule

Type      Day   Time           Periodicity   Room                                        Dates
lecture   Thu   10:15 – 11:45  weekly        Informatik-Provisorium 0.102 (Container)   2023-04-13 – 2023-07-13
exercise  Thu   12:15 – 13:45