Department 1: Algorithms and Complexity

The department investigates a broad range of theoretical and practical aspects of modern algorithmics. We design new algorithms and algorithmic techniques, analyze their efficiency and the quality of their solutions, develop provably efficient and correct software, and package our programs in software libraries. The strength of our approach lies in considering these aspects in unity rather than in isolation.

Headed by Prof. Danupon Nanongkai, PhD.

Research Areas

  • Algorithmic Game Theory

    In the problems considered in this group, we typically try to optimize some objective function while dealing with selfish agents that may have separate, conflicting goals and that may lie to us in order to improve their own objectives. In algorithmic mechanism design, we design mechanisms under which it is in each agent's best interest to tell us the truth. We also examine the price of anarchy and the price of stability for various problems. (A sketch of a classic truthful mechanism appears after this list.)

  • Approximation Algorithms

    Most interesting optimization problems are NP-hard, so unless P = NP, exact algorithms for them cannot be efficient. In the field of approximation algorithms, we take the reverse perspective: efficient algorithms cannot be exact. If we nevertheless insist on efficient algorithms, how close can we get to an optimal solution? In this group we design efficient algorithms with provable guarantees on the distance between their output and the (unknown) optimal solution. Complementing this, we also study the limits of such algorithms. (A classic example is sketched after this list.)

  • Fine-Grained Complexity and Algorithm Design

    Fine-grained complexity theory designs reductions that prove running-time lower bounds under a plausible complexity-theoretic conjecture such as the Strong Exponential Time Hypothesis (SETH). In this area the design of efficient algorithms goes hand in hand with proving fine-grained lower bounds: our goal is to prove matching upper and lower bounds, thus establishing best-possible algorithms that achieve the optimal constant in the exponent of the running time. We apply this methodology to problems from various domains such as graphs, strings, geometry, and optimization. (See the edit-distance sketch after this list.)

  • Graph Algorithms

    Our long-term vision is to develop techniques for designing efficient graph algorithms and to use them to understand graph data. We currently focus on algorithms that work across many models of computation, such as dynamic, distributed, streaming, parallel, and quantum algorithms. We aim to achieve two goals simultaneously: (i) solutions to notorious long-standing open problems, and (ii) efficient algorithms that fully exploit both the features of modern computing devices and the characteristics of contemporary data. (A small dynamic-connectivity sketch appears after this list.)

  • Optimization

    Many real-world applications are naturally formulated as combinatorial optimization problems, i.e., problems of finding the best solution(s) out of a finite set. Various methods have been developed to tackle such problems: integer programming, fixed-parameter tractable and exact algorithms, approximation algorithms, and combinatorial algorithms, among others. D1 applies these methods to problems from areas ranging from bioinformatics to geometry to scheduling, among several others. (An exact dynamic-programming sketch appears after this list.)

  • Parameterized and Counting Algorithms and Complexity

    Parameterized complexity analyzes how different parameters of the input influence the complexity of hard algorithmic problems. The general goal is to show, via fixed-parameter tractability results, that the combinatorial explosion can be confined to certain well-defined parameters, or to understand why such algorithms are not possible. An interesting class of hard problems that we consider are counting problems (where the goal is to count all solutions), as they may exhibit interesting phenomena that are not observed in the corresponding decision problems. (A bounded-search-tree sketch appears after this list.)

  • Robust Learning

    Machine learning algorithms have many applications, but can we prove theoretically that they are consistent and helpful? We focus on two paradigms: algorithmic stability and algorithms with predictions. Stable algorithms, which can tolerate changes in their inputs, inherit many desirable properties such as generalization, differential privacy, and replicability. Algorithms with predictions use predictions to circumvent worst-case bounds when the predictions are good, and mitigate the effect of bad predictions when they are not. (A ski-rental sketch appears after this list.)

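Illustrative Sketches

To make truthfulness concrete, here is a minimal sketch of the single-item second-price (Vickrey) auction, the textbook truthful mechanism; the function name and bid format are our own illustrative assumptions, not D1's software. The winner pays the second-highest bid, so bidding one's true value is a dominant strategy.

```python
# Single-item second-price (Vickrey) auction: a truthful mechanism.
# The price the winner pays does not depend on the winner's own bid,
# so bidding one's true value is a dominant strategy.

def vickrey_auction(bids):
    """bids: dict mapping agent -> bid. Returns (winner, price paid)."""
    assert len(bids) >= 2, "need at least two bidders"
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # second-highest bid
    return winner, price

if __name__ == "__main__":
    print(vickrey_auction({"alice": 10, "bob": 7, "carol": 4}))  # ('alice', 7)
```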
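A classic example of an efficient algorithm with a provable guarantee is the matching-based 2-approximation for minimum vertex cover. A minimal sketch, assuming an edge-list input:

```python
# 2-approximation for minimum vertex cover: pick an uncovered edge and
# add BOTH endpoints. The picked edges form a matching, and any optimal
# cover contains at least one endpoint of each, so |cover| <= 2 * OPT.

def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

if __name__ == "__main__":
    print(vertex_cover_2approx([(1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]))
```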
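For fine-grained complexity, edit distance is a canonical showcase: the textbook dynamic program runs in O(nm) time, and Backurs and Indyk showed that any strongly subquadratic algorithm would refute SETH, so the simple DP below is conditionally near-optimal. The rolling-array implementation is standard, not D1-specific code.

```python
# O(n*m) dynamic program for edit distance, using one rolling row.
# Under SETH there is no strongly subquadratic algorithm (Backurs &
# Indyk, 2015), so this upper bound is conditionally near-optimal.

def edit_distance(a, b):
    m = len(b)
    dp = list(range(m + 1))  # dp[j] = distance from "" to b[:j]
    for i in range(1, len(a) + 1):
        prev_diag, dp[0] = dp[0], i
        for j in range(1, m + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                           # delete a[i-1]
                        dp[j - 1] + 1,                       # insert b[j-1]
                        prev_diag + (a[i - 1] != b[j - 1]))  # substitute
            prev_diag = cur
    return dp[m]

if __name__ == "__main__":
    print(edit_distance("kitten", "sitting"))  # 3
```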
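Dynamic graph algorithms maintain answers while the graph changes. A minimal sketch of the insert-only case, incremental connectivity via union-find; fully dynamic deletions and the distributed, streaming, and quantum settings mentioned above require much heavier machinery.

```python
# Incremental dynamic connectivity: union-find with path halving and
# union by size; each operation runs in near-constant amortized time.

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, x, y):  # edge insertion
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.size[rx] < self.size[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        self.size[rx] += self.size[ry]

    def connected(self, x, y):
        return self.find(x) == self.find(y)

if __name__ == "__main__":
    uf = UnionFind(5)
    uf.union(0, 1)
    uf.union(3, 4)
    print(uf.connected(0, 1), uf.connected(1, 3))  # True False
```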
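As one example of an exact method for a combinatorial optimization problem, here is the standard pseudo-polynomial dynamic program for 0/1 knapsack; the item data are made up for illustration.

```python
# Exact 0/1 knapsack via dynamic programming over capacities,
# in O(n * capacity) time (pseudo-polynomial; knapsack is NP-hard).

def knapsack(items, capacity):
    """items: list of (value, weight) pairs. Returns the maximum value."""
    best = [0] * (capacity + 1)  # best[c] = max value within capacity c
    for value, weight in items:
        # iterate capacities downward so each item is used at most once
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

if __name__ == "__main__":
    print(knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 220
```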
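Fixed-parameter tractability in one picture: Vertex Cover parameterized by the solution size k. Every edge needs an endpoint in the cover, so branching on the two endpoints of any uncovered edge gives a search tree of depth at most k and size O(2^k); the combinatorial explosion is confined to the parameter. A minimal sketch:

```python
# Bounded search tree for Vertex Cover parameterized by k.
# Running time O(2^k * m): exponential only in the parameter k.

def has_vertex_cover(edges, k):
    """Return True iff some set of <= k vertices covers all edges."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = edges[0]  # one of u, v must be in any cover: branch on both
    for chosen in (u, v):
        remaining = [(a, b) for a, b in edges if chosen not in (a, b)]
        if has_vertex_cover(remaining, k - 1):
            return True
    return False

if __name__ == "__main__":
    edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]
    print(has_vertex_cover(edges, 2))  # True: {1, 4} covers every edge
```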
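Finally, a sketch of algorithms with predictions on the classic ski-rental problem, following the deterministic scheme of Purohit, Svitkina, and Kumar (2018): renting costs 1 per day, buying costs b, and a (possibly wrong) prediction of the season length, together with a trust parameter lam, decides when to buy. The parameter names and the tiny experiment are illustrative assumptions.

```python
# Ski rental with a prediction. The trust parameter lam in (0, 1] trades
# consistency (cost <= (1 + lam) * OPT when the prediction is correct)
# against robustness (cost <= (1 + 1/lam) * OPT no matter what).

import math

def ski_rental_cost(b, true_days, prediction, lam):
    """Cost of the prediction-guided strategy over a true_days season."""
    if prediction >= b:
        buy_day = math.ceil(lam * b)   # trust "long season": buy early
    else:
        buy_day = math.ceil(b / lam)   # trust "short season": buy late
    if true_days < buy_day:
        return true_days               # rented every day, never bought
    return (buy_day - 1) + b           # rented, then bought on buy_day

if __name__ == "__main__":
    b, lam = 10, 0.5
    opt = min(15, b)                                    # offline optimum
    print(ski_rental_cost(b, 15, prediction=20, lam=lam), opt)  # 14 10
```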