
Seminar and PhD Seminar on Combinatorics, Games and Optimisation

Below you'll find the programme for the Seminar and PhD Seminar on Combinatorics, Games and Optimisation.

This seminar series covers many of the research areas in the Department: discrete mathematics, algorithms, game theory and operational research.

Unless stated below, this Seminar normally takes place:

Questions, suggestions, etc., about the seminar can be sent to Enfale Farooq, the Research Manager, at E.Farooq@lse.ac.uk

Upcoming speakers:

Wednesday 19 February - Jim Orlin (MIT)
Venue: 32L.LG.14 from 15:30 - 17:00

On the Design of Fully Polynomial Time Approximation Schemes
A fully polynomial time approximation scheme (FPTAS) is an algorithm that guarantees the following: for any fixed positive ε, the FPTAS finds an ε-optimal solution while running in time polynomial in the size of the problem and in 1/ε. Many of the FPTASes that have been developed over the past 45 years have relied on dynamic programming.
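For reference, ε-optimality for a minimisation problem can be stated as follows (a standard definition, not specific to this talk): the returned solution $x$ satisfies

\[ f(x) \le (1+\varepsilon)\, f(x^{*}), \]

where $x^{*}$ is an optimal solution, and the running time is bounded by a polynomial in the instance size and in $1/\varepsilon$.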

In this talk, we briefly review some of the classical FPTASes. We then describe two frameworks. The first is the Calculus of K-approximation sets, which helps guide the development of FPTASes. The second is a framework for the development of new FPTASes. The framework uses a relatively small set of subroutines that operate on non-negative monotone functions of a single variable. This framework permits the development of computer codes for FPTASes with the following properties:

- The FPTASes can be expressed simply and concisely.
- The output of each FPTAS includes a proof that the solution is within a factor of 1+ε of optimal.

We illustrate the framework by developing an FPTAS for the stochastic lot-size problem.
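As a rough illustration of the kind of subroutine involved, here is a toy Python sketch of a K-approximation set for a nondecreasing positive function of one variable (a hedged reading of the standard construction; the names and interface are illustrative, not the speaker's actual library):

```python
def k_approximation_set(f, lo, hi, K):
    """Select a small set of points in {lo, ..., hi} such that, for a
    nondecreasing positive f, the value at the nearest kept point to the
    left of any query x approximates f(x) within a factor of K."""
    kept = [lo]
    for x in range(lo + 1, hi + 1):
        if f(x) > K * f(kept[-1]):   # value has grown by more than factor K
            kept.append(x)
    return kept

def approximate(f, kept, x):
    """Evaluate the K-approximation of f at x using only kept points."""
    # Largest kept point not exceeding x; f(left) <= f(x) <= K * f(left).
    left = max(p for p in kept if p <= x)
    return f(left)

# Example: with K = 2 only logarithmically many points are kept.
f = lambda x: x * x + 1
kept = k_approximation_set(f, 0, 1000, K=2)
print(len(kept), approximate(f, kept, 700), f(700))
```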

This research on FPTASes is joint work with many collaborators, including Nir Halman, CL Li, Giacomo Nannicini, and David Simchi-Levi.

Thursday 20 February - Heinrich Nax (ETH Zürich)
Venue: NAB.2.16 from 14:00 - 15:30

Dynamic matching in assignment games
We examine two-sided markets where players arrive stochastically over time and are drawn from a continuum of types. The cost of matching a client and provider varies, so a social planner is faced with two contending objectives: a) to reduce players' waiting time before getting matched; and b) to form efficient pairs in order to reduce matching costs. We show that such markets are characterized by a quick or cheap dilemma: under a large class of distributional assumptions, there is no 'free lunch', i.e., there exists no clearing schedule that is simultaneously optimal along both objectives. We further identify a unique breaking point signifying a stark reduction in matching cost contrasted with an increase in waiting time.
Generalizing this model, we identify two regimes: one, where no free lunch exists; the other, where a window of opportunity opens to achieve a free lunch. Remarkably, greedy scheduling is never optimal in this setting.
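To make the trade-off concrete, here is a small hedged Python simulation contrasting greedy clearing (match on arrival) with batched clearing (a toy setup with uniformly distributed types and absolute-difference matching costs; the model in the talk is more general):

```python
import random

def simulate(batch_size, n_pairs=5000, seed=0):
    """Alternate client/provider arrivals with types in [0, 1].
    Matching cost is |type difference|; waiting time is measured in
    arrival steps. batch_size=1 is greedy matching on arrival."""
    rng = random.Random(seed)
    clients, providers = [], []   # lists of (type, arrival_time)
    total_cost, total_wait, matched, t = 0.0, 0, 0, 0
    for _ in range(n_pairs):
        for pool in (clients, providers):
            pool.append((rng.random(), t))
            t += 1
        if len(clients) >= batch_size and len(providers) >= batch_size:
            # Clear the pools by pairing sorted types (optimal for
            # absolute-difference costs on the line).
            clients.sort(); providers.sort()
            for (c, tc), (p, tp) in zip(clients, providers):
                total_cost += abs(c - p)
                total_wait += (t - tc) + (t - tp)
                matched += 2
            clients, providers = [], []
    return total_cost / (matched / 2), total_wait / matched

for b in (1, 10, 100):
    cost, wait = simulate(b)
    print(f"batch={b:>3}: avg cost={cost:.3f}, avg wait={wait:.1f}")
```

Larger batches drive the matching cost down while the waiting time grows, which is the quick or cheap tension in miniature.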

Friday 21 February - Edin Husic (LSE)
Venue: 32L.B.09 from 12:00 - 13:00

Approximating Nash Social Welfare for asymmetric agents with Rado-valuations
The Nash Social Welfare problem asks for an allocation of indivisible goods to agents that maximises the geometric mean of their utilities, a well-studied objective for fair resource allocation. Finding an optimal allocation is NP-hard already for additive valuation functions. Constant-factor approximation algorithms have been known for additive valuations and some simple extensions, and under the assumption that the agents are symmetric: every agent is considered with the same weight in the objective. We present a constant-factor approximation algorithm for asymmetric agents with Rado-valuations. This is a general class of valuations that arise from maximum weight independent matching problems, including as special cases assignment valuations as well as weighted matroid rank functions. For the asymmetric case, this yields the first constant-factor approximation already for additive valuations, assuming the ratio between the weights is bounded by a constant. This is joint work with Jugal Garg and László Végh.
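For reference, the symmetric and weighted (asymmetric) objectives can be written as follows (standard formulations, with $u_i$ the valuation of agent $i$, $S_i$ its bundle, and $w_i$ the weights in the asymmetric case):

\[ \max_{(S_1,\dots,S_n)} \Big( \prod_{i=1}^{n} u_i(S_i) \Big)^{1/n} \qquad \text{and} \qquad \max_{(S_1,\dots,S_n)} \prod_{i=1}^{n} u_i(S_i)^{\,w_i}. \]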

Wednesday 26 February - Nathanaël Fijalkow (CNRS)
Venue: 32L.LG.14 from 15:30 - 17:00

Parity games, the quasipolynomial era
Parity games are zero-sum two-player games played on graphs; they play a central role in logic, automata theory, and verification, among other areas. Whether there exists a polynomial time algorithm for solving parity games is a long-standing open question. In 2017 a quasipolynomial time algorithm was constructed, quickly followed by several more algorithms with the same complexity. In this talk I will present a combinatorial notion called universal trees and explain that all existing quasipolynomial time algorithms for solving parity games exploit universal trees in some way. I will then discuss how to extend this notion to universal graphs for the study of mean payoff games.
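For reference, the winning condition can be stated as follows (the standard definition, not specific to this talk): in an infinite play $\pi$ over vertices labelled with integer priorities, player Even wins if

\[ \max \{\, p : p \text{ occurs infinitely often in } \pi \,\} \ \text{is even.} \]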

Thursday 27 February - Oliver Roche-Newton (Linz)
Venue: NAB.2.16 from 14:00 - 15:30

Hypergraph containers with applications in discrete geometry and additive combinatorics
In recent years, the newly developed theory of hypergraph containers has resulted in several remarkable results in graph theory and extremal combinatorics. Essentially, this theory gives detailed information about the independent sets of hypergraphs, provided that the edges are distributed reasonably well. I will discuss recent joint works with Audie Warren and Cosmin Pohoata, in which these tools were applied to problems in discrete geometry and additive combinatorics. In particular, an upper bound for the number of subsets of the finite plane with no collinear triples is given.

Friday 28 February - Coulter Beeson (LSE)
Venue: 32L.B.09 from 12:00 - 13:00

The Empirical Core of the Multicommodity Flow Game Without Side Payments
Policy makers focus on stable strategies as the ones adopted by rational players. If there are many such solutions, an important question is how to select amongst them. We study this question for the Multicommodity Flow Coalition Game, used to model cooperation between autonomous systems (AS) in the Internet. In short, strategies are flows in a capacitated network. The payoff to a node is the total flow that it terminates. Markakis-Saberi show this game is balanced and hence has a non-empty core by Scarf's Theorem. In the transferable utility (TU) version this gives a polynomial-time algorithm to find core elements, but for ASs, side payments are not natural. Finding core elements in NTU games tends to be computationally more difficult. For this game, the only previous result gives a procedure to find a core element when the supply graph is a path. We extend their work with an algorithm, Incorporate, which produces many different core elements.
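For reference, in the TU setting the core can be written as follows (the standard definition; the talk works with the NTU analogue, where payoffs must be realisable without side payments):

\[ \mathrm{core}(v) = \Big\{ x \in \mathbb{R}^{N} : \sum_{i \in N} x_i = v(N), \ \ \sum_{i \in S} x_i \ge v(S) \ \text{for all } S \subseteq N \Big\}. \]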

Thursday 5 March - George Szpiro 
Venue: NAB.2.16 from 14:00 - 15:30

Risky Decisions: How Mathematical Paradoxes and Other Conundrums Have Shaped Economic Science
At its core, economics is about making decisions. In the history of economic thought, great intellectual prowess has been exerted toward devising exquisite theories of optimal decision making in situations of constraint, risk, and scarcity. Yet not all of our choices are purely logical, and so there is a longstanding tension between those emphasizing the rational and irrational sides of human behavior. One strand develops formal models of rational utility maximizing while the other draws on what behavioral science has shown about our tendency to act irrationally.

In this talk George Szpiro will give examples of mathematical paradoxes and psychological conundrums that have led to advances in economic science. He will challenge the audience with questions about how to make decisions, and thereby show how people who believe themselves to be rational can be led astray.

Previous seminars in the series:

Thursday 13 February - Hubie Chen (Birkbeck)

One Hierarchy Spawns Another: Graph Deconstructions and the Complexity Classification of Conjunctive Queries
We study the classical problem of conjunctive query evaluation. This problem admits multiple formulations and has been studied in numerous contexts; for example, it is a formulation of the constraint satisfaction problem, as well as the problem of deciding if there is a homomorphism from one relational structure to another (which transparently generalizes the graph homomorphism problem).
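As a concrete toy instance of the homomorphism problem mentioned above, here is a hedged brute-force Python sketch for directed graphs (illustrative only; the talk concerns the complexity of restricted classes, not this exponential baseline):

```python
from itertools import product

def has_homomorphism(vertices_a, edges_a, vertices_b, edges_b):
    """Brute-force test: is there a map h from A to B such that every
    edge (u, v) of A is sent to an edge (h(u), h(v)) of B?"""
    edges_b = set(edges_b)
    for image in product(vertices_b, repeat=len(vertices_a)):
        h = dict(zip(vertices_a, image))
        if all((h[u], h[v]) in edges_b for (u, v) in edges_a):
            return True
    return False

# A directed 3-cycle maps homomorphically onto a self-loop, but it has
# no homomorphism into a single directed edge.
cycle = ([0, 1, 2], [(0, 1), (1, 2), (2, 0)])
print(has_homomorphism(*cycle, [0], [(0, 0)]))      # True
print(has_homomorphism(*cycle, [0, 1], [(0, 1)]))   # False
```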
We here restrict the problem according to the set of permissible queries; the particular formulation we work with is the relational homomorphism problem over a class of structures A, wherein each instance must be a pair of structures such that the first structure is an element of A. We present a comprehensive complexity classification of these problems, which strongly links graph-theoretic properties of A to the complexity of the corresponding homomorphism problem. In particular, we define a binary relation on graph classes and completely describe the resulting hierarchy given by this relation. This binary relation is defined in terms of a notion which we call graph deconstruction and which is a variant of the well-known notion of tree decomposition. We then use this graph hierarchy to infer a complexity hierarchy of homomorphism problems which is comprehensive up to a computationally very weak notion of reduction, namely, a parameterized form of quantifier-free reductions. We obtain a significantly refined complexity classification of left-hand side restricted homomorphism problems, as well as a unifying, modular, and conceptually clean treatment of existing complexity classifications, such as the classifications by Grohe-Schwentick-Segoufin (STOC 2001) and Grohe (FOCS 2003, JACM 2007).
After presenting these advances, we will compare this line of research with another that aims to classify the complexity of the homomorphism problem where the second (target) structure is fixed, and that is currently being studied using universal-algebraic methods. We will also present and discuss two intriguing variants, injective homomorphism (also called embedding) and surjective homomorphism.

This talk is mostly based on joint work with Moritz Müller.

Thursday 6 February - Rida Laraki (Paris-Dauphine and Liverpool)

Stable Games of Matching
Gale and Shapley defined in 1962 a matching problem between two finite sets of distinct agents M and W (men/women, workers/firms, students/universities, doctors/hospitals) who should be paired or matched. They call a matching stable if no unmatched couple can strictly improve its utility by being together, and they constructed a stable matching for every instance via their deferred acceptance algorithm. Our article proposes a new extension by assuming that a matched couple obtains its payoff as the outcome of a strategic interaction. Agents, besides choosing partners, now play a non-cooperative game in which they choose a strategy that maximises their payoff under the constraint that their partner does not want to quit the relationship. We call a matching externally stable if no unmatched couple can find a strategy profile which provides both of them a strictly higher payoff than the ones they have. It is internally stable if no agent can, by deviating and changing her strategy inside her couple, increase her payoff without breaking the external stability of her couple. We provide a sufficient condition on a strategic game (feasibility) under which there exist matching profiles that are externally and internally stable. We prove that the class of feasible games includes zero-sum, potential, infinitely repeated, and perfect-information games. We also provide a 3-by-3 game in mixed strategies which is not feasible. Finally, we show how our model extends the Shapley-Shubik model with monetary transfers and the Milgrom-Hatfield model with contracts.
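For background, here is a hedged Python sketch of the classical Gale-Shapley deferred acceptance algorithm referenced above (the 1962 construction, not the authors' extension):

```python
def deferred_acceptance(men_prefs, women_prefs):
    """Men propose in preference order; each woman holds her best
    proposal so far. Returns a stable matching as {man: woman}."""
    # rank[w][m] = position of m in w's preference list (lower is better)
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    free = list(men_prefs)           # men yet to be matched
    next_proposal = {m: 0 for m in men_prefs}
    engaged = {}                     # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_proposal[m]]
        next_proposal[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])  # w trades up; her old partner is freed
            engaged[w] = m
        else:
            free.append(m)           # w rejects m; he proposes again later
    return {m: w for w, m in engaged.items()}

men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
print(deferred_acceptance(men, women))  # {'m2': 'w1', 'm1': 'w2'}
```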

Joint work with my PhD student Felipe Garrido. 

Thursday 30 January - Christina Pawlowitsch (Université Panthéon-Assas)

Evolutionary dynamics of costly signaling
Costly-signaling games have a remarkably wide range of applications (education as a costly signal in the job market, handicaps as a signal for fitness in mate selection, politeness in language). The formal analysis of evolutionary dynamics in costly-signaling games has only recently gained more attention. In this paper, we study evolutionary dynamics in two basic classes of games with two states of nature, two signals, and two possible reactions in response to signals: a discrete version of Spence's (1973) model and a discrete version of Grafen's (1990) formalization of the handicap principle. We first use index theory to give a rough account of the dynamic stability properties of the equilibria in these games. Then, we study in more detail the replicator dynamics and to some extent the best-response dynamics. (Joint work with Josef Hofbauer.)
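For reference, the single-population replicator dynamics mentioned in the abstract takes the standard form

\[ \dot{x}_i = x_i \big( (Ax)_i - x^{\top} A x \big), \]

where $x_i$ is the population share of strategy $i$ and $A$ is the payoff matrix. (Signaling games are asymmetric, so the talk's setting uses the two-population analogue; this is shown only as a reminder of the general form.)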

2019, 2018, 2017, 2016

The Seminar on Combinatorics, Games and Optimisation started in 2016. It is a combination of two previous seminar series:

Seminar on Discrete Mathematics and Game Theory: 

2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008, 2007, 2006, 2005, 2004, 2003

Seminar on Operations Research: 

2016, 2015, 2014, 2013, 2012, 2011, 2010

PhD Seminar on Combinatorics, Games and Optimisation:

2019, 2018, 2017, 2016, 2015, 2014, 2013, 2012