Held in cooperation with the University of Chicago Department of Computer Science
TTIC 31010 - Algorithms (CMSC 37000) + Tutorial (100 units)
Instructor: Julia Chuzhoy
Location: TTIC Room 530
Time: Tuesday and Thursday - 9:30-10:50am | Tutorial: Wednesday - 4:30-5:20pm
TTIC 31150 - Mathematical Toolkit (CMSC 31150) (100 units)
Instructor: Avrim Blum
Location: TTIC Room 530
Time: Monday and Wednesday - 3:00-4:20pm
TTIC 31230 - Fundamentals of Deep Learning (CMSC 31230) (100 units)
Instructor: David McAllester
Location: TTIC Room 530
Time: Tuesday and Thursday - 2:00-3:20pm
TTIC 31040 - Introduction to Computer Vision (100 units)
Instructor: Greg Shakhnarovich
Location: TTIC Room 530
Time: Monday and Wednesday - 1:30-2:50pm
TTIC 31020 - Introduction to Machine Learning (100 units)
Instructor: Nathan Srebro
Location: TTIC Room 530
Time: Monday and Wednesday - 3:00-4:20pm | Tutorial: Friday - 3:00-4:20pm
TTIC 31200 - Information and Coding Theory (CMSC 37220) (100 units)
Instructor: Madhur Tulsiani
Location: TTIC Room 530
Time: Tuesday and Thursday - 11:00-12:20pm
TTIC 44000 - Special Topics: People, Society, and Algorithms (50 units)
Instructor: Jingyan Wang
Location: TTIC Room 530
Time: Tuesday and Thursday - 3:30-4:50pm
TTIC 31100 - Geometric Methods in Computer Science (CMSC 39010) (100 units)
Instructor: Yury Makarychev
Location: TTIC Room 530
Time: Tuesday and Thursday - 11:00-12:20pm
TTIC 31110 - Speech Technologies (CMSC 35110) (100 units)
Instructor: Karen Livescu
Location: TTIC Room 530
Time: Tuesday and Thursday - 2:00-3:20pm
TTIC 31120 - Statistical and Computational Learning Theory (100 units)
Instructor: Zhiyuan Li
Location: TTIC Room 530
Time: Tuesday and Thursday - 3:30-4:50pm
TTIC 31170 - Planning, Learning, and Estimation for Robotics and Artificial Intelligence (100 units)
Instructor: Matt Walter
Location: TTIC Room 530
Time: Tuesday and Thursday - 9:30-10:50am
Weekly lectures and discussions by TTIC researchers introducing their research and research problems. Provides a broad view of the research carried out at TTIC. The course is taken for pass/fail credit and satisfies one quarter of credit (of the three required) toward the Research at TTIC Series Requirement. (See the Academic Program Guide for details.)
100 units
This is a graduate-level course on algorithms, with an emphasis on central combinatorial optimization problems and on methods for algorithm design and analysis. Topics covered include greedy algorithms, dynamic programming, algorithms for maximum flow and minimum cut, applications of linear programming, randomized algorithms, combinatorial optimization, and approximation algorithms. Time permitting, additional topics, such as online algorithms and the probabilistic method, will be covered.
The course textbook is “Algorithm Design” by Kleinberg and Tardos.
Specific topics covered:
Expected outcomes:
Prerequisites: Assumes familiarity with proofs and formal reasoning, knowledge of basic graph notions (such as graphs, trees, paths, etc.) and algorithms (such as BFS, DFS, minimum spanning tree, etc.). Also assumes familiarity with asymptotic notation and running-time analysis of algorithms, as well as basic knowledge of the notion of NP-hardness.
100 units
A systematic introduction to machine learning, covering theoretical as well as practical aspects of the use of statistical methods. Topics include linear models for classification and regression, support vector machines, regularization and model selection, and introduction to structured prediction and deep learning. Application examples are taken from areas like information retrieval, natural language processing, computer vision and others.
I will focus on supervised learning and only talk about unsupervised settings when necessary (e.g., mixture models and density estimation for generative methods for classification). So, no clustering. There will be twenty 1.5-hour lectures; the numbers in parentheses are the estimated number of lectures per topic.
Prerequisites: knowledge of basic linear algebra, probability, multivariate calculus and programming.
Expected outcomes:
Course Webpage: https://ttic.edu/31020
100 units
Introduction to the principles and practice of computer vision. This course will provide an in-depth survey of topics involved in major tasks in modern vision, and offer hands-on experience in implementing some of them.
Prerequisites: a good background in linear algebra and probability (undergraduate level) and a background in machine learning (TTIC 31020 or equivalent) are required. CMSC 25040 is recommended.
Topics:
Expected outcomes:
100 units
This course will focus on the application of mathematical models and computer algorithms to studying molecular biology. In particular, this course will cover the following topics.
Expected outcomes:
Prerequisites: None
100 units
The tentative plan for this course is to do the following three parts.
Part I. Models for defining computable functions: primitive recursive functions, (general) recursive functions, and Turing machines; the Church-Turing Thesis; unsolvable problems; diagonalization; and properties of computably enumerable sets.
Part II. Kolmogorov (resource bounded) complexity: the quantity of information in individual objects.
Part III. Functions computable with time and space bounds of the Turing machine: polynomial time computability, the classes P and NP, NP-complete problems, polynomial time hierarchy, and P-space complete problems.
Here is the web page of the previous version of this course: http://people.cs.uchicago.edu/~razborov/teaching/spring14.html. I will be very willing to adjust the speed, style, and content depending on the background of the students enrolled in the course.
Expected outcomes:
100 units
The course will cover techniques in unconstrained and constrained convex optimization and provide a practical introduction to convex duality. The course will focus on (1) formulating and understanding convex optimization problems and studying their properties; (2) understanding and using the dual; and (3) presenting and understanding optimization approaches, including interior point methods and first-order methods for non-smooth problems. Examples will be drawn mostly from data fitting, statistics, and machine learning.
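As an illustration of item (2), here is the standard linear-programming primal/dual pair; this is a textbook example of our choosing, not necessarily the one used in the course:

```latex
% Textbook LP duality pair (an illustration of our choosing, not course-specific):
\[
\text{(primal)}\quad \min_{x \ge 0}\; c^\top x \ \text{ s.t. } Ax \ge b,
\qquad
\text{(dual)}\quad \max_{y \ge 0}\; b^\top y \ \text{ s.t. } A^\top y \le c.
\]
% Weak duality: for any feasible pair,
% $b^\top y \le (Ax)^\top y = x^\top (A^\top y) \le c^\top x$.
```

Strong duality (equality at the optimum) holds for linear programs, and for general convex programs under regularity conditions such as Slater's condition.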
Prerequisites: Linear Algebra, Multidimensional Calculus, Undergraduate Algorithms
Specific Topics:
Expected outcomes:
100 units
This course studies approximation algorithms - algorithms that solve a given combinatorial optimization problem approximately. One major reason to study approximation algorithms is that many central combinatorial optimization problems are NP-hard, and so they are unlikely to have efficient algorithms that solve them exactly. Approximation algorithms provide a way to get around this intractability and still efficiently obtain solutions that, while not optimal, are close to optimal. There are, however, many other settings in which approximation algorithms may be desirable. For example, the running time of the algorithm or its simplicity may be more important in some settings than obtaining the best possible solution. Some other settings, such as streaming algorithms and sublinear algorithms, also naturally lead to approximation.
The main focus of this course is on the design of approximation algorithms for combinatorial optimization problems. On the one hand, we will cover central tools and methods that are used in the design of approximation algorithms (and beyond). On the other hand, we will explore central combinatorial optimization problems that are used as building blocks in many different algorithms, in a variety of settings.
Much of this course is dedicated to approximation algorithms for NP-hard problems. Time permitting, we will also explore some additional settings where approximation takes place (such as very fast algorithms, streaming, and/or sublinear algorithms). We will also explore the question of why some problems have good approximation algorithms while others do not, via hardness of approximation, or inapproximability, proofs.
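To make the notion of "close to optimal" concrete, here is a small sketch (our example, not necessarily one from the syllabus) of the classic matching-based 2-approximation for minimum vertex cover:

```python
# Sketch of the classic 2-approximation for minimum vertex cover
# (an illustrative example of our choosing, not necessarily from the syllabus).
def vertex_cover_2approx(edges):
    """Greedily pick an uncovered edge and take both of its endpoints."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))  # the chosen edges form a matching
    return cover

# Any vertex cover must contain at least one endpoint of each matched edge,
# so |cover| = 2 * |matching| <= 2 * OPT.
print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4), (4, 1)]))  # e.g. {1, 2, 3, 4}
```

The guarantee is worst-case: whatever the input graph, the returned cover is never more than twice the minimum size.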
Assumes knowledge of the material covered in the Algorithms course.
Expected outcomes:
Prerequisites: Algorithms (TTIC 31010 or CMSC 37000)
Course Webpage: https://canvas.uchicago.edu/courses/37476
100 units
Introduction to the analysis of signals and linear time-invariant systems at a graduate level. Topics include: continuous- and discrete-time transforms (Fourier and others); linear filtering; sampling and aliasing; random processes and their interaction with linear systems; applications in areas such as speech and image processing and robotics.
Prerequisites: familiarity with basic linear algebra, notions of eigenvalues and eigenvectors, and (multivariate) Gaussian distributions.
Expected outcomes:
100 units
The course covers fundamental concepts in high-dimensional and metric geometry as well as applications of metric techniques in algorithm design. Specific topics include:
Expected outcomes:
Prerequisites: undergraduate-level algorithms, linear algebra, and probability classes; a solid background in mathematical analysis/calculus
100 units
This course will introduce techniques used in speech technologies, mainly focusing on speech recognition and related tasks. Speech recognition is one of the oldest and most complex structured sequence prediction tasks, and it has received significant research and commercial attention; it therefore provides a good case study for many of the techniques used in other areas of artificial intelligence involving sequence modeling. The course will cover core techniques in detail, including hidden Markov models and neural network-based models. The course will include practical homework exercises in which we will build and experiment with speech processing models. Finally, it will include a sampling of the connections between speech and other modalities like images and text.
Prerequisites: a good background in basic probability and introductory machine learning.
Expected outcomes:
100 units
We will discuss classic results and recent advances in statistical learning theory (mostly under the agnostic PAC model), touch on computational learning theory, and also explore the relationship with stochastic optimization and online regret analysis. Our emphasis will be on concept development and on obtaining a rigorous quantitative understanding of machine learning. We will also study techniques for analyzing and proving performance guarantees for learning methods.
Prerequisites: mathematical maturity, as obtained, e.g., in a rigorous analysis course; discrete math (specifically combinatorics and asymptotic notation); probability theory; Introduction to Machine Learning; Algorithms; basic complexity theory (NP-hardness). Familiarity with convex optimization, computational complexity, and a background in statistics can be helpful, but is not required.
Specific Topics:
Expected outcomes:
Course Webpage: https://canvas.uchicago.edu/courses/29926/assignments/syllabus
100 units
The course is aimed at first-year graduate students and advanced undergraduates. The goal of the course is to collect and present important mathematical tools used in different areas of computer science. The course will mostly focus on linear algebra and probability.
We intend to cover the following topics and examples:
Expected outcomes:
Prerequisites: None
100 units
TTIC 31160 will focus on the application of mathematical models and computer algorithms to studying structural biology, in particular protein, RNA, and DNA molecule structures.
Here is a list of topics that I am going to cover in this course.
There will be both homework assignments and final projects.
Expected outcomes:
Prerequisites: None
100 units
This course is concerned with fundamental techniques in robotics and artificial intelligence (AI), with an emphasis on probabilistic inference, learning, and planning under uncertainty. The course will investigate the theoretical foundations underlying these topics as rigorous mathematical tools that enable solutions to real-world problems drawn broadly from robotics and AI. The course will cover topics that include: Bayesian filtering (Kalman filtering, particle filtering, and dynamic Bayesian networks), simultaneous localization and mapping, planning, Markov decision processes, partially observable Markov decision processes, reinforcement learning, and graphical models.
Expected outcomes:
Prerequisites: familiarity with basic linear algebra; background in probability theory; basic programming experience
100 units
Many problems in machine learning, computer vision, natural language processing, robotics, computational biology, and beyond require modeling complex interactions between large, heterogeneous collections of random variables. Graphical models combine probability theory and graph theory to provide a unifying framework for representing these relationships in a compact, structured form. Probabilistic graphical models decompose multivariate joint distributions into a set of local relationships among small subsets of random variables via a graph. These local interactions result in conditional independencies that afford efficient learning and inference algorithms. Moreover, their modular structure provides an intuitive language for expressing domain-specific knowledge, and facilitates the transfer of modeling advances to new applications.
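Concretely, the factorization a directed graphical model encodes looks as follows (a standard formula, not specific to this course):

```latex
% Standard Bayesian-network factorization (not specific to this course):
\[
p(x_1, \dots, x_n) \;=\; \prod_{i=1}^{n} p\bigl(x_i \mid \mathrm{pa}(x_i)\bigr),
\]
% e.g., for the chain $X_1 \to X_2 \to X_3$:
% $p(x_1, x_2, x_3) = p(x_1)\, p(x_2 \mid x_1)\, p(x_3 \mid x_2)$,
% which encodes the conditional independence $X_3 \perp\!\!\!\perp X_1 \mid X_2$.
```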
This graduate-level course will provide a strong foundation for learning and inference with probabilistic graphical models. The course will first introduce the underlying representational power of graphical models, including Bayesian and Markov networks, and dynamic Bayesian networks. Next, the course will investigate contemporary approaches to statistical inference, both exact and approximate. The course will then survey state-of-the-art methods for learning the structure and parameters of graphical models.
Expected outcomes:
Prerequisites: TTIC 31020 (or equivalent)
100 units
This course will introduce fundamental concepts in natural language processing (NLP). NLP includes a range of research problems that involve computing with natural language. Some are user-facing applications, such as spam classification, question answering, summarization, and machine translation. Others serve supporting roles, such as part-of-speech tagging and syntactic parsing. Solutions draw from machine learning (especially deep learning), algorithms, and linguistics. There is particular interest in structured prediction in which the output structure is a sequence, tree, or sentence.
Topics include:
Assignments include formal exercises as well as practical exercises involving implementing algorithms and using NLP toolkits.
Expected outcomes:
Prerequisites: basic knowledge of calculus, linear algebra, and probability; basic programming experience; a machine learning course is recommended but not required.
Course Webpage: https://home.ttic.edu/~kgimpel/teaching/31190-a20/index.html
100 units
This course is meant to serve as an introduction to some basic concepts in information theory and error-correcting codes, and some of their applications in computer science and statistics. We plan to cover the following topics:
Expected outcomes:
Prerequisites: Discrete probability. Some knowledge of finite-field algebra is needed for the part on error-correcting codes, but the required basics are reviewed in class.
100 units
This course will cover some of the basic theory underlying machine learning and the process of generalizing from data. We will talk about both the PAC model for batch learning (learning over one set of data with the intent of producing a predictor that performs well on new data) and models for learning from feedback over time (online learning). We will discuss important fundamental concepts including overfitting, uniform convergence, formal notions of Occam’s razor, VC-dimension, and regularization, as well as several classic algorithms including the Perceptron algorithm, SVMs, algorithms for combining expert advice, and boosting. We will also discuss limited-feedback (bandit) algorithms, reinforcement learning, connections between learning and game theory, and formal guarantees on privacy. This will be a proof-oriented course: our focus will be on proving performance guarantees for algorithms that aim to generalize from data as well as understanding what kinds of performance guarantees we can hope to prove.
Prerequisites:
The main prerequisites are comfort with a proof-oriented course, having taken an algorithms class, and comfort with basic probabilistic modeling and reasoning. For example: 1000 programs are submitted to a stock-market prediction challenge, and we find that one of those programs has correctly predicted whether the market will go up or down the next week for 10 weeks in a row; should we feel confident that the program is a good one? Comfort with thinking about points and vectors in high-dimensional space is a plus.
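For intuition on that example, here is a back-of-the-envelope calculation (ours, not part of the course materials): even if every program just guessed up/down uniformly at random each week, the chance that at least one of the 1000 amasses a perfect 10-week record is substantial.

```python
# Back-of-the-envelope check (our sketch, not course material): if 1000
# programs each guess up/down uniformly at random for 10 weeks, how likely
# is it that at least one of them looks perfect by pure chance?
p_single = 0.5 ** 10                          # one program, 10 correct weeks
p_at_least_one = 1 - (1 - p_single) ** 1000   # complement of "all imperfect"
print(f"{p_at_least_one:.2f}")                # ~0.62
```

So a perfect record alone is weak evidence of skill; making this multiple-testing intuition rigorous is what concepts covered in the course, such as uniform convergence and formal notions of Occam's razor, are for.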
Specific Topics Include:
Expected outcomes:
100 units
This course is a follow-up to TTIC 31190. It will go into the fundamentals of natural language processing (NLP) in more depth and cover a broader range of applications. Some of the class meetings will be hands-on, guided, laboratory-style meetings; a laptop is strongly recommended for these class meetings, but not strictly required.
Topics include:
Assignments include formal exercises as well as practical exercises involving implementing algorithms and using NLP and deep learning toolkits.
Expected outcomes:
Prerequisites: TTIC 31190 or permission of the instructor.
100 units
This course will introduce concepts and techniques for analyzing and learning from unlabeled data. Unsupervised methods are used, for example, for visualization, data generation, and representation learning. The course will motivate settings in which unsupervised methods are needed or useful, and discuss relationships between unsupervised and supervised methods. Topics will include fixed feature representations such as Fourier methods and count-based features; linear and nonlinear dimensionality reduction; clustering; density estimation, mixture models, and expectation maximization; and semi-supervised/distant-supervision settings. Linear, kernel, and deep neural network-based methods will be included. Assignments will include theoretical and practical exercises.
Prerequisites: a good background in basic probability, linear algebra, TTIC 31020, and familiarity and comfort with programming; or permission of the instructor.
Expected outcomes:
100 units
Introduction to the fundamental principles of deep learning. Deep learning systems are evolving rapidly, and this course presents up-to-date material at a conceptual level. The course emphasizes theoretical and intuitive understanding rather than particular programming formalisms.
Topics:
Expected outcomes:
Prerequisites: linear algebra, vector calculus, familiarity with multivariate Gaussian probability distributions and Markov processes.
Course Website: https://mcallester.github.io/ttic-31230
100 units
This course considers problems in perception, navigation, and control, and their systems-level integration in the context of self-driving vehicles, through an open-source curriculum for autonomy education that emphasizes hands-on experience. Integral to the course, students will collaborate to implement concepts covered in lecture on a low-cost autonomous vehicle, with the goal of navigating a model town complete with roads, signage, traffic lights, obstacles, and citizens. The wheeled platform is equipped with a monocular camera, performs all processing onboard with a Raspberry Pi 3, and must: follow lanes while avoiding obstacles, pedestrians, and other robots; localize within a global map; navigate a city; and coordinate with other robots to avoid collisions. The platform and environment are carefully designed to allow a sliding scale of difficulty in perception, inference, and control tasks, making it usable in a wide range of applications, from undergraduate-level education to research-level problems. For example, one solitary robot can successfully wander the environment using only line detection and reactive control, while successful point-to-point navigation requires recognizing street signs. In turn, sign detections can be “simulated” using fiducials affixed to each sign, or implemented using “real” object detection. Realizing more complex behaviors, such as vision-based decentralized multi-robot coordination, poses research-level challenges, especially considering resource constraints. In this manner, the course is well suited to facilitate undergraduate- and graduate-level education in autonomy.
The course will be taught concurrently and in conjunction with classes at the University of Montreal and ETH Zurich, which provides opportunities for interaction and collaboration across institutions.
Topics:
The course covers fundamental techniques in perception, planning, and control for autonomous agents and their integration as part of a complex system. Specific topics include:
Expected outcomes:
Prerequisites: There are no formal course requirements, though having taken TTIC 31170 – Planning, Learning, and Estimation for Robotics and Artificial Intelligence is desirable. Students must be familiar with the GNU/Linux development environment and are required to have access to a laptop with Ubuntu 16.04 installed.
100 units
This course covers theoretical topics at the interface of computer science and economics. These include: solution concepts in game theory, such as Nash equilibrium and correlated equilibrium, and connections to learning theory; the price of anarchy in routing and congestion games; computational social choice: the axiomatic approach to ranking systems and crowdsourcing, manipulation; algorithmic mechanism design, focusing on truthful approximation algorithms; market design, with an emphasis on optimization and incentives; diffusion of technologies and influence maximization in social networks; and procedures for fair division, such as cake cutting algorithms.
Prerequisites: Basic knowledge in algorithms, computational complexity, and probability theory. This is a proof-oriented course.
The course textbook is Algorithmic Game Theory (Nisan, Roughgarden, Tardos, Vazirani, eds.), which is freely available online.
Topics include:
Expected outcomes:
100 units
Course Objectives:
The course aims to:
Intended Outcomes:
Prerequisites: A basic understanding of machine learning and some experience with programming.
Recommended prerequisite course: Introduction to Computer Vision: TTIC 31040 or CMSC 35040
50 units
This course considers designing and analyzing algorithms that explicitly take people and society into consideration. The course covers selected topics in this area, such as data elicitation, crowdsourcing, and causal inference, including recent research. The course will put an emphasis on the theoretical principles underlying problems in these domains, including derivations and proofs of theoretical guarantees. Some application-specific considerations and directions will also be discussed as case studies. As this is an interdisciplinary field, we will also touch upon literature in psychology and economics that studies the behavior of people.
Prerequisites: Knowledge of basic probability and linear algebra.
Topics include:
Expected outcomes:
100 units
Original academic research conducted under the guidance of an instructor (normally the student’s PhD advisor), directed at advancing the state of the art in the chosen area.
Expected outcomes:
100 units
A structured study of the literature on a topic chosen in consultation with an instructor (normally the student’s PhD advisor).
Expected outcomes:
Advisor
In-depth involvement in areas of computer science in a research lab, university, or business setting. Internship activities and objectives must be related to the student’s program objectives. The internship may require the student to be out of residence.
(Required enrollment for F-1 CPT internship for some international students).
Advisor’s Consent Required.
Internships do not award credit units, and do not fulfill set degree requirements.
Advisor
A part-time internship is an internship that requires a limited time commitment: no more than 20 hours per week. The student remains in residence at TTIC.
Advisor approval is required.
Internships do not award credit units, and do not fulfill set degree requirements.