About

This is the homepage for the biweekly research lunch hosted by the Theory and Application of Algorithms (TAA) subunit of the Faculty of Computer Science at the University of Vienna. The talks are informal: the speaker can give a practice talk, explain an interesting new idea (it doesn’t have to be the speaker’s own work), present their own work, and so on. The goal is to share research within the group in an accessible manner.

The talks will be held biweekly on Thursdays, 11:45 – 12:45, in Forschungslabor 3 (the sofa room), unless otherwise specified in the post. Get in touch with us if you would like to give a talk in the seminar.

  • 10 April 2025 – Maximilian Vötsch

    Date: 10 April 2025
    Time: 11:45 – 12:45
    Location: Forschungslabor 3 (sofa room)

    Speaker: Maximilian Vötsch
    Title: The Power of Random Projections

    Abstract:
Many of the optimization problems computer scientists care about can be formulated as a matrix problem of the form min_X ||AX – B|| or min_{X,Y} ||XY – B||, where constraints on the matrices X and Y encode our problem’s specific structure. In some cases these problems are easy and have analytic solutions; in others they are hard, and solving them exactly involves trying all possible matrices.
What if there were a tool that lets us speed up solving such problems by reducing the search space for “free”? I will talk about one of the most fundamental lemmas (according to some) in computer science, the Johnson–Lindenstrauss lemma, and more recent developments in JL-flavored methods, as well as as many applications as we can fit into 60 minutes, to give you an idea of when to use these techniques in your own research.
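The idea behind the talk can be previewed in a few lines: projecting high-dimensional points through a random Gaussian matrix approximately preserves pairwise distances. A minimal sketch (not from the talk; the dimensions and seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 50, 1000, 300  # n points in d dimensions, projected down to k
X = rng.normal(size=(n, d))

# Johnson–Lindenstrauss-style projection: a random Gaussian matrix
# scaled by 1/sqrt(k) approximately preserves Euclidean distances.
P = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ P

# Compare one pairwise distance before and after projection.
i, j = 0, 1
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
ratio = proj / orig  # concentrates around 1 as k grows
```

The lemma guarantees that k = O(log n / ε²) dimensions suffice to preserve all pairwise distances up to a (1 ± ε) factor, independent of the original dimension d.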

  • 27 March 2025 – A. R. Sricharan

    Date: 27 March 2025
    Time: 11:45 – 12:45
    Location: Forschungslabor 3 (sofa room)

    Speaker: A. R. Sricharan
    Title: How to Concentrate If You Must

    Abstract:
    Random events happen a lot, generating random variables in their wake. Chernoff’s bound says that you can be pretty sure that these random variables are very close to where you expect them to be. One does not think twice about when to apply a Chernoff bound, but one (yours truly) does indeed think multiple times and gets oneself mightily confused on how to go about doing this.

    I tried a month ago to get to the bottom of this, and I have a few observations and many many more questions to share. It is related to, among other things,

    1. Legendre transforms in physics,
    2. Not wanting to solve integrals,
    3. Strong convexity, and who would have guessed,
    4. The Multiplicative Weights Update method.
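For readers who want the baseline the talk starts from, here is a minimal sketch of the standard multiplicative Chernoff bound for a sum of independent Bernoulli variables, checked against simulation (the parameters are illustrative, not from the talk):

```python
import math
import random

random.seed(42)

# Multiplicative Chernoff bound: for X a sum of n independent
# Bernoulli(p) variables with mean mu = n * p,
#   P(X >= (1 + delta) * mu) <= exp(-delta^2 * mu / 3)  for 0 < delta <= 1.
n, p, delta = 1000, 0.5, 0.2
mu = n * p
bound = math.exp(-delta**2 * mu / 3)

# Empirical tail probability from simulation.
trials = 2000
threshold = (1 + delta) * mu
exceed = sum(
    sum(random.random() < p for _ in range(n)) >= threshold
    for _ in range(trials)
)
empirical = exceed / trials  # should not exceed the bound
```

The bound comes from applying Markov’s inequality to the moment generating function E[e^{tX}] and optimizing over t, which is where the Legendre transform mentioned above enters the picture.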

  • 13 March 2025 – Peter Kiss

    Date: 13 March 2025
    Time: 11:45 – 12:45
    Location: Forschungslabor 3 (sofa room)

    Speaker: Peter Kiss
    Title: Dynamic (1+ε)-Approximate Matching Size in Truly Sublinear Update Time

    Abstract:
We show a fully dynamic algorithm for maintaining a (1+ε)-approximation of the size of a maximum matching of a graph with n vertices and m edges in m^(0.5−Ω_ε(1)) update time. This is the first polynomial improvement over the long-standing O(n) update time, which can be obtained trivially by periodic recomputation. Thus, we resolve the value version of a major open question in the dynamic graph algorithms literature (see, e.g., [Gupta and Peng FOCS’13], [Bernstein and Stein SODA’16], [Behnezhad and Khanna SODA’22]). Our key technical component is the first algorithm for (1, εn)-approximate maximum matching with sublinear running time on dense graphs. All previous sublinear-time algorithms suffered a multiplicative approximation factor of at least 1.499 or assumed that the graph has a very small maximum degree.
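To illustrate the trivial periodic-recomputation baseline the abstract improves on (this is emphatically not the paper’s algorithm): each edge insertion or deletion changes the maximum matching size by at most 1, so a size estimate recomputed every k updates stays within an additive k of the truth. A toy sketch, using a greedy maximal matching (a 2-approximation) as the stand-in recomputation routine:

```python
def greedy_maximal_matching_size(edges):
    """Size of a greedy maximal matching; a 2-approximation of the
    maximum matching size, used here only for illustration."""
    matched = set()
    size = 0
    for u, v in edges:
        if u not in matched and v not in matched:
            matched.add(u)
            matched.add(v)
            size += 1
    return size

class PeriodicMatchingSize:
    """Toy periodic-recomputation baseline (hypothetical helper, not
    from the paper): recompute every k updates, so the cached size is
    within an additive k (times the approximation factor) of the truth,
    at an amortized cost of one recomputation per k updates."""

    def __init__(self, k):
        self.k = k
        self.edges = set()
        self.updates = 0
        self.cached = 0

    def update(self, u, v, insert=True):
        if insert:
            self.edges.add((u, v))
        else:
            self.edges.discard((u, v))
        self.updates += 1
        if self.updates % self.k == 0:
            self.cached = greedy_maximal_matching_size(self.edges)

    def size(self):
        return self.cached
```

With k chosen proportional to ε times the matching size, this yields the O(n)-amortized baseline; the paper’s contribution is breaking below that with a genuinely sublinear update time.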