Eric Price


My research explores the fundamental limits of data-limited computational problems. How efficiently can we recover signals from noisy samples? And how much space do we need to compute functions of large data streams? My work has produced algorithms with tight or near-tight sample complexities for a variety of such problems, along with corresponding lower bounds.
One nice feature of sample and space complexity is that (unlike time complexity) we have the tools to prove matching lower bounds. A large portion of my research is on proving such lower bounds. Strong lower bounds give us a target to aim for, guide how algorithms should be designed, and tell us when to stop looking for better algorithms and instead look for better problems.
You can find my list of publications here.
Spring 2022:  Introduction to Algorithms (CS 331). 
Fall 2021:  Randomized Algorithms (CS 388R), a graduate course. 
Spring 2021:  Honors Introduction to Algorithms (CS 331H). 
Fall 2020:  Sublinear Algorithms (CS 395T), a graduate course. 
Spring 2020:  Introduction to Algorithms (CS 331). 
Fall 2019:  Randomized Algorithms (CS 388R), a graduate course. 
Spring 2019:  Honors Introduction to Algorithms (CS 331H). 
Fall 2017:  Algorithms and Complexity (CS 331). 
Fall 2017:  Randomized Algorithms (CS 388R), a graduate course. 
Spring 2017:  Honors Introduction to Algorithms (CS 331H). 
Fall 2016:  Sublinear Algorithms (CS 395T), a graduate topics course. 
Spring 2016:  Honors Introduction to Algorithms (CS 331H). 
Fall 2015:  Randomized Algorithms (CS 388R), a graduate course. 
Fall 2014:  Sublinear Algorithms (CS 395T), a graduate topics course. 
I ran the Algorithms and Complexity Seminar at MIT.
I created and maintain NewsDiffs, which tracks post-publication changes to online news articles. [slides]
I am a coach for USACO, the USA Computing Olympiad. This program provides an excellent algorithms education for high school students.