Thursday 28 March - Olivier Gossner (LSE)
SCAMP
We study (interim correlated) rationalizability in games with incomplete information. For each given game with incomplete information, we show that all hierarchies of iterative deletion of dominated strategies can be captured through an automaton, the strategic automaton. We then prove that a finitely parameterized class of information structures, SCAMP, is sufficient to generate every outcome distribution induced by general common prior information structures. In this parameterized family, a profile of signals is identified with a path over the automaton, and the probability distribution over signal profiles defines a Markov chain.
Friday 22 March - Sahar Jahani (LSE)
Automated Equilibrium Analysis of 2x2x2 Games
The set of all Nash equilibria of a non-cooperative game with more than two players is defined by equations and inequalities between nonlinear polynomials, which makes it challenging to compute. This paper presents an algorithm to compute this set for the simplest game with more than two players, the 2x2x2 game, with arbitrary (possibly non-generic) payoffs, which has not been done before. We define degeneracy in these games and prove an upper bound on the number of possible equilibria in non-degenerate games. Furthermore, we introduce new, elegant formulas for completely mixed equilibria, as well as visual representations of the best-response correspondences and their intersections in 3D, which define the Nash equilibrium set. These have been implemented in Python and will be part of a public web-based software package for automated equilibrium analysis.
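To see concretely why the three-player case is nonlinear, the indifference conditions for a completely mixed equilibrium of a 2x2x2 game can be written as follows. This is standard background in notation chosen here for illustration (p, q, r for the players' mixing probabilities, a, b, c for their payoff arrays), not the new formulas from the paper:

```latex
% p, q, r: probabilities that players 1, 2, 3 put on their first strategies;
% a_{ijk}, b_{ijk}, c_{ijk}: payoffs of players 1, 2, 3 at the pure profile (i, j, k).
% At a completely mixed equilibrium each player is indifferent between her two strategies:
\[
\sum_{j,k\in\{1,2\}} q_j r_k\,(a_{1jk}-a_{2jk}) = 0,\qquad
\sum_{i,k\in\{1,2\}} p_i r_k\,(b_{i1k}-b_{i2k}) = 0,\qquad
\sum_{i,j\in\{1,2\}} p_i q_j\,(c_{ij1}-c_{ij2}) = 0,
\]
% with p_1 = p, p_2 = 1-p, and similarly for q and r.
```

Each equation is bilinear in two of the unknowns p, q, r, so the equilibrium conditions form a system of quadratic polynomial equations (together with inequalities for the non-completely-mixed cases), in contrast to the linear indifference conditions of two-player games.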
Thursday 21 March - Tara Abrishami (University of Hamburg)
Width parameters and the maximum independent set problem
Width parameters are graph invariants that measure the "complexity" of a graph. Treewidth, the most well-known graph width parameter, has important algorithmic implications: problems that are NP-hard in general, such as finding a maximum independent set in a graph, can be solved in polynomial time in graphs of bounded treewidth. However, treewidth does not capture the solvability of maximum independent set very well: there are many graph classes with unbounded treewidth which admit efficient algorithms for the maximum independent set problem. Recently, a number of new graph width parameters have been introduced with the intention of better representing the solvability of maximum independent set. In this talk, I will introduce some of these new width parameters, discuss recent results on a new parameter called induced matching treewidth, and describe several interesting open problems. This talk is based on joint work with Marcin Brianski, Jadwiga Czyżewska, Rose McCarty, Martin Milanic, Pawel Rzazewski, and Bartosz Walczak.
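As a concrete illustration of why bounded treewidth helps (textbook background, not one of the talk's new results), maximum-weight independent set can be solved by dynamic programming over a nice tree decomposition of width k in 2^k · n^{O(1)} time:

```latex
% c[t][S]: maximum weight of an independent set I of the graph induced by the vertices
% appearing in the subtree rooted at bag t, subject to I \cap X_t = S
% (states where S is not independent are assigned -\infty).
\begin{align*}
\text{leaf } t:&\quad c[t][\emptyset] = 0,\\
\text{introduce } v \text{ (child } t'\text{)}:&\quad
  c[t][S] = \begin{cases} c[t'][S] & v \notin S,\\
  c[t'][S\setminus\{v\}] + w(v) & v \in S,\end{cases}\\
\text{forget } v \text{ (child } t'\text{)}:&\quad
  c[t][S] = \max\bigl(c[t'][S],\, c[t'][S\cup\{v\}]\bigr),\\
\text{join (children } t_1, t_2\text{)}:&\quad
  c[t][S] = c[t_1][S] + c[t_2][S] - w(S).
\end{align*}
```

Since each bag has at most k+1 vertices, there are at most 2^{k+1} states per bag; the newer width parameters discussed in the talk aim to recover this kind of tractability on graph classes where treewidth itself is unbounded.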
Friday 15 March - Madison Van Dyk (University of Waterloo) **16.00-17.30 Joint CGO-Operations Research Reading Group Seminar**
Fast Combinatorial Algorithms for Efficient Sortation
Modern parcel logistics networks are designed to ship demand between given origin-destination pairs of nodes in an underlying directed network. Efficiency dictates that volume needs to be consolidated at intermediate nodes. In practice, such consolidation requires parcel sortation. In this work, we propose a mathematical model for the physical requirements and limitations of parcel sortation. We then show that, in the general model, it is NP-hard to determine whether a feasible sortation plan exists. We discuss several special settings where (near-)feasibility of a given sortation instance can be determined efficiently. The algorithms we propose are fast and construct solutions as well as combinatorial witness-set-type lower bounds that are reminiscent of, and extend, those used in early work on degree-bounded spanning trees and arborescences. This is joint work with Kim Klause, Jochen Koenemann, and Nicole Megow. To appear in IPCO 2024.
Friday 15 March - Laurentiu Ploscaru (University of Birmingham)
A bipartite version of the Erdos-McKay conjecture
A set of vertices in a graph is said to be 'homogeneous' if it is a clique or an independent set. An old conjecture of Erdős and McKay states that if all homogeneous sets in an n-vertex graph have order at most O(log n), then the graph contains induced subgraphs with every number of edges in {0, 1, ..., Ω(n^2)}. We prove a bipartite analogue of this conjecture: if all balanced homogeneous sets in an n by n bipartite graph have order at most O(log n), then the graph contains induced subgraphs with every number of edges in {0, 1, ..., Ω(n^2)}. Based on joint work with Eoin Long (Birmingham).
Friday 8 March - Michal Feldman (Tel-Aviv University)
Ambiguous contracts
In this work we explore the deliberate infusion of ambiguity into the design of contracts. We show that when the agent is ambiguity-averse and chooses an action that maximizes their max-min utility, then the principal can strictly gain from using an ambiguous contract. We provide insights into the structure of optimal contracts, and establish that optimal ambiguous contracts are composed of simple contracts. We also provide a geometric characterization of ambiguity-proof classes of contracts. Finally, we show that when the agent considers mixed strategies, then there is no advantage in using an ambiguous contract.
Friday 8 March - Estelle Varloot (University of Liverpool) **11.00-12.00**
Level-strategyproof Belief Aggregation Mechanisms
In the problem of aggregating experts' probabilistic predictions or opinions over an ordered set of outcomes, we introduce the axiom of level-strategyproofness (level-SP) and argue that it is natural in several real-life applications and robust as a notion. It implies truthfulness in a rich domain of single-peaked preferences over the space of cumulative distributions. This contrasts with the existing literature, which usually assumes single-peaked preferences over the space of probability distributions instead. Our main results are (1) explicit characterizations of all level-SP methods with and without the addition of other axioms (certainty preservation, plausibility preservation, proportionality); (2) comparisons and axiomatic characterizations of two new and practical level-SP methods: the proportional-cumulative and the middlemost-cumulative; (3) an application of the proportional-cumulative to construct a new voting method that extends majority judgment and where voters can express their uncertainties/doubts about the merits/qualities of the candidates/alternatives to be ranked.
Thursday 7 March - Nemanja Draganic (University of Oxford)
Hamiltonicity of expanders: optimal bounds and applications
An n-vertex graph G is a C-expander if |N(X)| ≥ C|X| for every X ⊆ V(G) with |X| < n/(2C) and there is an edge between every two disjoint sets of at least n/(2C) vertices. We show that there is some constant C > 0 for which every C-expander is Hamiltonian. In particular, this implies the well-known conjecture of Krivelevich and Sudakov from 2003 on Hamilton cycles in (n,d,λ)-graphs. This completes a long line of research on the Hamiltonicity of sparse graphs and has many applications. Joint work with R. Montgomery, D. Munhá Correia, A. Pokrovskiy and B. Sudakov.
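For intuition, the expander condition above can be checked directly by exhaustive enumeration on very small graphs. The sketch below is illustrative only (it is exponential-time and unrelated to the proof techniques of the talk); the graph is passed as a dictionary of neighbour sets:

```python
import math
from itertools import combinations

def is_c_expander(adj, C):
    """Brute-force check of the C-expander condition (exponential; tiny graphs only).

    adj: dict mapping each vertex to the set of its neighbours (undirected graph).
    (1) |N(X)| >= C|X| for every X with |X| < n/(2C);
    (2) an edge between every two disjoint sets of at least n/(2C) vertices.
    """
    vertices = list(adj)
    n = len(vertices)
    bound = n / (2 * C)

    # Condition (1): vertex expansion of all sufficiently small sets.
    for size in range(1, math.ceil(bound)):
        for X in combinations(vertices, size):
            Xset = set(X)
            outside = set().union(*(adj[v] for v in Xset)) - Xset
            if len(outside) < C * size:
                return False

    # Condition (2): it suffices to check disjoint sets of size exactly ceil(n/(2C)).
    m = math.ceil(bound)
    for A in combinations(vertices, m):
        rest = [v for v in vertices if v not in A]
        for B in combinations(rest, m):
            if not any(b in adj[a] for a in A for b in B):
                return False
    return True

# Example: the complete graph on 6 vertices satisfies the condition with C = 2.
K6 = {v: {u for u in range(6) if u != v} for v in range(6)}
print(is_c_expander(K6, 2))  # True
```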
Wednesday 6 March - Meike Neuwohner (LSE)
Improved Approximation Algorithms for Weighted k-Set Packing
Friday 1 March - R Ravi (Carnegie Mellon University) **16.00-17.30 Joint CGO-Operations Research Reading Group Seminar**
Models and Algorithms for Information Freshness in Graphs
I will describe two related formulations of information dissemination under a telephone model in graphs. After drawing out the relations between them, I plan to present an algorithm for a new variant (called Average Broadcast Time) described in this SODA 23 paper “Timeliness Through Telephones: Approximating Information Freshness in Vector Clock Models” by Chen, An, Niaparast, Ravi, and Rudenko. The talk will describe joint work with several graduate students over the years.
Friday 1 March - Miriam Fischer (Imperial College London)
Fair interventions in weighted congestion games
In this talk I present joint work with Martin Gairing and Dario Paccagnan. We study the power and limitations of fair interventions in weighted congestion games. Specifically, we focus on interventions that aim at improving the equilibrium quality (price of anarchy) and are fair. Within this setting, we provide three key contributions:
1) We show that no fair intervention can reduce the price of anarchy below a given factor depending solely on the class of latencies considered. This lower bound is unconditional, i.e., it applies regardless of how much computation interventions are allowed to use.
2) We propose a taxation mechanism that is fair and show that the resulting price of anarchy matches this lower bound, while the mechanism can be computed in polynomial time.
3) We complement these results by showing that no intervention (fair or not) can achieve a better approximation if polynomial computability is required. We do so by proving that the minimum social cost is NP-hard to approximate below a factor identical to the one previously introduced.
Our results also show that the randomized algorithm proposed by Makarychev and Sviridenko [JACM 2018] for the class of optimization problems with a "diseconomy of scale" is optimal, and provide a novel way to derandomize its solution via equilibrium computation.
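For readers less familiar with the price of anarchy, the classical Pigou example for nonatomic routing with affine latencies (standard background, not the weighted atomic setting of the talk) already exhibits the gap that interventions such as tolls try to close:

```latex
% Pigou's example: one unit of traffic, two parallel links with latencies
% l_1(x) = 1 and l_2(x) = x.
% At equilibrium all traffic uses the second link (its latency never exceeds 1),
% for a cost of 1. The social optimum splits the traffic:
\[
\min_{x\in[0,1]} \bigl[(1-x)\cdot 1 + x\cdot x\bigr]
= \tfrac12\cdot 1 + \tfrac12\cdot\tfrac12 = \tfrac34
\quad\text{at } x=\tfrac12,
\]
% so the price of anarchy is 1/(3/4) = 4/3, which is tight for affine latencies
% in the nonatomic model.
```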
Thursday 29 February - Andreas Feldmann (University of Sheffield)
Parameterized Algorithms for Steiner Forest in Bounded Width Graphs
We study the parameterized complexity of the well-studied Steiner Forest problem in several graph classes of bounded width. The problem takes an edge-weighted graph and pairs of vertices as input, and the aim is to find a minimum cost subgraph in which each given vertex pair lies in the same connected component. It is known that this problem is APX-hard in general, and NP-hard on graphs of treewidth 3, treedepth 4, and feedback vertex set size 2. However, Bateni et al. [JACM, 2011] gave an approximation scheme with a runtime of n^O((k^2)/ε) on graphs of treewidth k. Our main result is a much faster efficient parameterized approximation scheme (EPAS) with a runtime of 2^O((k^2)/ε log(k/ε)) n^O(1). If k instead is the vertex cover number of the input graph, we show how to compute the optimum solution in 2^O(k log k) n^O(1) time, and we also prove that this runtime dependence on k is asymptotically best possible, under ETH. Furthermore, if k is the size of a feedback edge set, then we obtain a faster 2^O(k) n^O(1) time algorithm, which again cannot be improved under ETH.
Friday 23 February - Fei Wu (King's College London)
Strategic Bidding Wars in On-chain Auctions
The Ethereum block construction market has changed significantly since the emergence of Proposer-Builder Separation. Validators access blocks through a marketplace, where block builders bid for the right to construct the block and earn MEV (Maximal Extractable Value) rewards in a competition known as the MEV-Boost auction. While more than 90% of blocks are currently built through the auction process, trade-offs between builders' strategic behaviors and auction design remain poorly understood.
We introduce a game-theoretic model of MEV-Boost auctions and use simulations to study the different builder bidding strategies observed in practice. We study various strategic interactions and auction setups and evaluate how the interplay between critical elements, such as access to MEV opportunities and improved network connectivity, impacts bidding performance and strategy effectiveness.
Thursday 22 February - Olaf Parczyk (FU Berlin)
Maximal diameter of simplicial d-complexes
The diameter of a simplicial d-complex is the diameter of its dual graph. We improve the best known lower bounds for the maximum diameter of a strongly connected simplicial d-complex. For d=2 we even obtain optimal constructions matching the trivial upper bound. This is joint work with Tibor Szabó and Silas Rathke.
Wednesday 21 February - Thomas Sauerwald (University of Cambridge)
Balanced Allocations: The Power of Choice versus Noise
In the balanced allocation problem we wish to allocate m balls (jobs) into n bins (servers) by allowing each ball to choose from some bins sampled uniformly at random. The goal is to maintain a small gap between the maximum load and the average load. For the one-choice protocol, where each ball is allocated to a random bin, the gap diverges for large m. However, for the two-choice protocol, where each ball samples two bins and is placed in the least loaded of the two, it was shown that the gap is only O(log log n) for all m. This dramatic improvement is widely known as the "power of two choices", and similar effects have been observed in hashing and routing.
In this talk, we will first give some intuition why two-choice maintains such a good balance in theory and practice. Then we will present new results in settings where the load information is incomplete or subject to some noise. Roughly speaking, our results demonstrate that if the noise is not too strong, then the performance of two-choice is not affected too much. However, we will also explore a setting with strong noise where having more choices leads to a worse performance. This is based on joint works with Dimitrios Los and John Sylvester.
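The contrast between the two protocols is easy to reproduce empirically. The short simulation below covers only the basic (noise-free) setting and is an illustration, not code from the talk:

```python
import random

def one_choice(m, n):
    """Throw m balls into n bins, each into a uniformly random bin; return the gap."""
    loads = [0] * n
    for _ in range(m):
        loads[random.randrange(n)] += 1
    return max(loads) - m / n  # gap = maximum load minus average load

def two_choice(m, n):
    """Each ball samples two bins uniformly and goes into the less loaded one."""
    loads = [0] * n
    for _ in range(m):
        i, j = random.randrange(n), random.randrange(n)
        loads[i if loads[i] <= loads[j] else j] += 1
    return max(loads) - m / n

n = 10_000
for m in (n, 50 * n):
    print(f"m = {m}: one-choice gap ~ {one_choice(m, n):.1f}, "
          f"two-choice gap ~ {two_choice(m, n):.1f}")
```

Typical runs show the one-choice gap growing with m while the two-choice gap stays at a small constant, consistent with the O(log log n) bound quoted above.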
Friday 16 February - Luming Zhang (LSE)
Stretching demi-bits and more on non-deterministic secure pseudorandomness
Super-bits and demi-bits are variants of pseudorandom generators which are secure against nondeterministic statistical tests. They were introduced to rule out certain approaches to proving strong complexity lower bounds beyond the limitations set out by the Natural Proofs barrier of Rudich and Razborov. Whether demi-bits can be stretched at all had been an open problem since their introduction. We answer this question affirmatively by showing that every demi-bit can be stretched into sublinearly many demi-bits. If time permits, I may talk more about our contributions to non-deterministic secure pseudorandomness. The paper "Stretching Demi-Bits and Nondeterministic-Secure Pseudorandomness" is available at arXiv:2304.14700.
Thursday 15 February - Ehud Shapiro (Weizmann Institute of Science and LSE)
Grassroots Digital Democracy: Transferring power and wealth from global platforms to the people
The digital realm today is dominated by two architectures: Centralized autocratic global platforms that have adopted surveillance-capitalism as their business model; and up-and-coming blockchain-based global platforms that are decentralized but fundamentally plutocratic. Global platforms exacerbate inequality and undermine the fabric of human society by depleting the social, economic, civic, and political capital of local communities, worldwide. The talk will review the work of my group at Weizmann with colleagues around the world on a third, alternative architecture for the digital realm, termed grassroots digital democracy. The architecture and its engendered grassroots applications (including social networking; cryptocurrencies; social contracts) aim to provide scalable foundations for grassroots digital economies that can emerge without initial capital or external credit, and for sovereign democratic digital communities, both operating solely on the networked smartphones of their members, independently of any global resources and platforms. The publications that form the basis of the talk are collectively available here.
Friday 9 February - Namrata (University of Warwick)
Kneser graphs are Hamiltonian
For integers k ≥ 1 and n ≥ 2k + 1, the Kneser graph K(n, k) has as vertices all k-element subsets of an n-element ground set, and an edge between any two disjoint sets. It has been conjectured since the 1970s that all Kneser graphs admit a Hamilton cycle, with one notable exception, namely the Petersen graph K(5, 2). This problem received considerable attention in the literature, including a recent solution for the sparsest case n = 2k + 1. The main contribution of this paper is to prove the conjecture in full generality. We also extend this Hamiltonicity result to all connected generalized Johnson graphs (except the Petersen graph). The generalized Johnson graph J(n, k, s) has as vertices all k-element subsets of an n-element ground set, and an edge between any two sets whose intersection has size exactly s. Clearly, we have K(n, k) = J(n, k, 0), i.e., generalized Johnson graphs include Kneser graphs as a special case. Our results imply that all known families of vertex-transitive graphs defined by intersecting set systems have a Hamilton cycle, which settles an interesting special case of Lovász’ conjecture on Hamilton cycles in vertex-transitive graphs from 1970. Our main technical innovation is to study cycles in Kneser graphs by a kinetic system of multiple gliders that move at different speeds and that interact over time, reminiscent of the gliders in Conway’s Game of Life, and to analyze this system combinatorially and via linear algebra.
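For concreteness, the few lines below (illustrative only, not from the paper) build K(n, k) and J(n, k, s) straight from the definitions; K(5, 2) comes out as the Petersen graph, with 10 vertices, 15 edges, and degree 3 throughout:

```python
from itertools import combinations

def generalized_johnson(n, k, s):
    """J(n, k, s): vertices are k-subsets of {0,...,n-1}; edges join sets meeting in exactly s elements."""
    vertices = list(combinations(range(n), k))
    edges = [(A, B) for A, B in combinations(vertices, 2)
             if len(set(A) & set(B)) == s]
    return vertices, edges

def kneser(n, k):
    """K(n, k) = J(n, k, 0): edges join disjoint k-subsets."""
    return generalized_johnson(n, k, 0)

V, E = kneser(5, 2)            # the Petersen graph
degree = {v: 0 for v in V}
for A, B in E:
    degree[A] += 1
    degree[B] += 1
print(len(V), len(E), set(degree.values()))  # 10 15 {3}
```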
Thursday 8 February - Andrea Freschi (University of Birmingham)
Hamilton cycles with large oriented discrepancy
Given graphs G and H, what is the largest t such that in any 2-colouring of the edges of G, there is a copy of H in G with at least t edges in the same colour? This topic was first raised by Erdős in the 1960s and has attracted renewed attention in recent years. Gishboliner, Krivelevich and Michaeli considered an analogous question for Hamilton cycles in oriented graphs. They conjectured that if G is an oriented graph on n vertices and δ(G) ≥ n/2 then G contains a Hamilton cycle with at least δ(G) edges pointing in the same direction. In this talk we present a proof of this conjecture, as well as an overview of the recent developments in the area. This is based on joint works with Allan Lo and Joseph Hyde, Joanna Lada and Andrew Treglown.
Wednesday 7 February - Martin Bullinger (University of Oxford)
Online Coalition Formation under Random Arrival or Coalition Dissolution
Coalition formation considers the question of how to partition a set of n agents into disjoint coalitions according to their preferences. We consider a cardinal utility model with additively separable aggregation of preferences and study the online variant of coalition formation, where the agents arrive in sequence. The goal is to achieve competitive social welfare. In a purely deterministic model, the natural greedy algorithm is known to achieve an optimal competitive ratio, which heavily depends on the range of utilities.
We complement this result by considering two related models. First, we study a model where agents arrive in a random order. We find that the competitive ratio of the greedy algorithm is of order 1/n^2, whereas an alternative algorithm, which alternates between waiting and greedy phases, achieves a competitive ratio of order 1/n. Second, we relax the irrevocability of decisions by allowing coalitions to be dissolved into singleton coalitions. Here we achieve the asymptotically optimal competitive ratio of order 1/n by drawing a close connection to a general model of online matching. Hence, in both models, we obtain a competitive ratio that not only removes the dependence on the utility range present in the base model but also essentially matches the best approximation ratio achievable by polynomial-time algorithms.
This is joint work with René Romen. The paper is available on arXiv: https://arxiv.org/abs/2306.16965. A preliminary version of this paper has appeared in the Proceedings of the 31st Annual European Symposium on Algorithms (ESA 2023).
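To make the online model concrete, here is one natural reading of a greedy rule for additively separable utilities, sketched under my own assumptions (welfare-maximizing placement of each arrival, with singletons as fallback); the paper's exact greedy algorithm and tie-breaking may differ:

```python
def greedy_online_coalitions(arrivals, utility):
    """Sketch of an online greedy rule for additively separable coalition formation.

    arrivals: agent ids in arrival order.
    utility(i, j): cardinal utility agent i assigns to agent j.
    Each arriving agent is irrevocably added to the existing coalition whose social
    welfare increases the most, or starts a singleton if no placement gains welfare.
    Illustrative reading only, not necessarily the algorithm analysed in the paper.
    """
    coalitions = []
    for agent in arrivals:
        best_gain, best = 0, None
        for coalition in coalitions:
            gain = sum(utility(agent, j) + utility(j, agent) for j in coalition)
            if gain > best_gain:
                best_gain, best = gain, coalition
        if best is None:
            coalitions.append([agent])   # start a singleton coalition
        else:
            best.append(agent)           # irrevocable placement
    return coalitions

# Tiny example with symmetric, hypothetical utilities.
values = {(0, 1): 3, (0, 2): -5, (1, 2): 1}
u = lambda i, j: values.get((min(i, j), max(i, j)), 0)
print(greedy_online_coalitions([0, 1, 2], u))  # [[0, 1], [2]]
```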
Friday 2 February 2024 - Alexandra Lassota (Eindhoven University of Technology), 4.00-5.30pm
Small Lecture on Block-Structured Integer Programs
Integer Programs (IPs) are a ubiquitous and potent modeling tool, but they are in general NP-hard to solve. Thus, special cases that are solvable in polynomial time are studied. One important case is that of block-structured IPs. The simplest class of such IPs has a constraint matrix consisting of independent sub-matrices that are linked by either few variables (2-stage stochastic IPs) or few constraints (n-fold IPs). This talk gives an overview of the results on these IPs (including our brand-new SODA results) and outlines the most important techniques used to achieve them.
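Concretely, the two block structures mentioned above have the following constraint-matrix shapes (standard definitions shown schematically; in general the blocks may differ from row to row and column to column):

```latex
% Left: n-fold IPs, with a few linking constraints (top block row),
% otherwise block-diagonal.
% Right: 2-stage stochastic IPs, with a few linking first-stage variables
% (left block column), otherwise block-diagonal.
\[
\begin{pmatrix}
A & A & \cdots & A\\
B & 0 & \cdots & 0\\
0 & B & \cdots & 0\\
\vdots & & \ddots & \vdots\\
0 & 0 & \cdots & B
\end{pmatrix}
\qquad\qquad
\begin{pmatrix}
A & B & 0 & \cdots & 0\\
A & 0 & B & \cdots & 0\\
\vdots & & & \ddots & \vdots\\
A & 0 & 0 & \cdots & B
\end{pmatrix}
\]
```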
Friday 2 February 2024 - Alex Malekshaian (KCL)
Counting antichains in the Boolean lattice
An old question of Dedekind asks for the number of antichains in the Boolean lattice on n elements. After a long series of partial results, this was answered asymptotically by Korshunov in the 1980s. Using methods from probability and statistical physics, we refine his asymptotics, giving a formula to arbitrary exponential precision. We also give asymptotics for the number of antichains of a given, large size. As a corollary, we derive a 'sparse' version of the classical theorem of Sperner which states that the largest antichain is precisely the middle layer: we determine the threshold for the size at which an antichain is almost surely contained in the middle layer.
Joint work with Matthew Jenssen and Jinyoung Park.
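For very small n the count can be reproduced by brute force. The sketch below (illustrative only; the methods of the paper are entirely different) recovers the Dedekind numbers 2, 3, 6, 20, 168 for n = 0, ..., 4:

```python
from itertools import combinations

def count_antichains(n):
    """Brute-force count of antichains in the Boolean lattice on n elements.

    Subsets of {0,...,n-1} are encoded as bitmasks; a <= b in the lattice iff a & b == a.
    Doubly exponential, so only feasible for n <= 4.
    """
    lattice = list(range(1 << n))                   # all 2^n subsets
    count = 0
    for family_mask in range(1 << len(lattice)):    # all families of subsets
        members = [s for i, s in enumerate(lattice) if family_mask >> i & 1]
        if all(a & b != a and a & b != b            # no two members comparable
               for a, b in combinations(members, 2)):
            count += 1
    return count

print([count_antichains(n) for n in range(5)])  # [2, 3, 6, 20, 168]
```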
Thursday 1 February - Robert Hancock (University of Oxford)
A resolution of the Kohayakawa Kreuter conjecture for almost all pairs of graphs
We study asymmetric Ramsey properties of the random graph G(n,p). For r ≥ 2 and a graph H, Rödl and Rucinski (1993-5) provided the asymptotic threshold for G(n,p) to have the following property: whenever we r-colour the edges of G(n,p) there exists a monochromatic copy of H as a subgraph. In 1997, Kohayakawa and Kreuter conjectured an asymmetric version of this result, where one replaces H with a set of graphs H_1,...,H_r and we seek the threshold for when every r-colouring contains a monochromatic copy of H_i in colour i for some i ∈ {1,...,r}.
The 1-statement of this conjecture was confirmed by Mousset, Nenadov and Samotij in 2020. We build on the many partial results for the 0-statement by resolving it in almost all cases. We reduce the remaining cases to a deterministic colouring problem. Our methods also extend to the hypergraph setting. Joint work with Candida Bowtell (University of Warwick) and Joseph Hyde (University of Victoria).
Friday 26 January - George Chionas (University of Liverpool)
Who gets the MEV? A Dynamical Sharing Blockchain Mechanism
Maximal Extractable Value (MEV) has emerged as a new frontier in the design of blockchain systems. The marriage between decentralization and finance gives the power to block producers not only to select and add transactions to the blockchain but, crucially, also to order them so as to extract additional value. However, the additional value comes from the economic activity of users. In this work, we propose to make the MEV extraction rate part of the protocol design space. Our aim is to leverage this parameter to maintain a healthy balance between block producers (who need to be compensated) and users (who need to feel encouraged to transact). Inspired by the principles introduced by EIP-1559 for transaction fees, we design a dynamic mechanism which updates the MEV extraction rate with the goal of stabilizing it at a target value. We analyze the evolution of this dynamic mechanism under various market conditions and provide formal guarantees about its long-term performance.
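For background, EIP-1559 adjusts Ethereum's base fee multiplicatively toward a usage target by at most 1/8 per block; the sketch below applies an update rule of that shape to a generic rate. It is only an illustration of the EIP-1559-style principle the abstract refers to, not the mechanism or parameters of the paper:

```python
def eip1559_style_update(rate, observed, target, max_step=1 / 8):
    """One EIP-1559-style update step (illustrative analogue, not the paper's mechanism).

    Nudges `rate` up when the observed quantity exceeds its target and down otherwise,
    changing it by at most a `max_step` fraction per block (EIP-1559 uses 1/8).
    """
    deviation = (observed - target) / target
    deviation = max(-1.0, min(1.0, deviation))  # clamp to keep the step bounded
    return rate * (1 + max_step * deviation)

# Hypothetical trajectory: the observed quantity sits 50% above target for ten blocks.
rate = 0.10
for _ in range(10):
    rate = eip1559_style_update(rate, observed=1.5, target=1.0)
print(round(rate, 4))  # the rate drifts upward multiplicatively, block by block
```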
Friday 19 January 2024 - Joachim Spoerhase (University of Sheffield)
Parameterized Approximation Schemes for Clustering with General Norm Objectives
This paper considers the well-studied algorithmic regime of designing a (1+ε)-approximation algorithm for a k-clustering problem that runs in time f(k,ε)·poly(n) (sometimes called an efficient parameterized approximation scheme or EPAS for short). Notable previous results of this kind include EPASes in the high-dimensional Euclidean setting for k-center as well as k-median and k-means. However, existing EPASes handle only basic objectives (such as k-center, k-median, and k-means) and are tailored to the specific objective and metric space. Our main contribution is a clean, simple, and unified algorithm that yields an EPAS for a large variety of clustering objectives (for example, k-means, k-center, k-median, priority k-center, l-centrum, ordered k-median, socially fair k-median aka robust k-median, or more generally monotone norm k-clustering) and metric spaces (for example, continuous high-dimensional Euclidean spaces, metrics of bounded doubling dimension, bounded treewidth metrics, and planar metrics), and which is (almost) entirely oblivious to the underlying objective and metric space. Key to our approach is a new concept that we call bounded ε-scatter dimension -- an intrinsic complexity measure of a metric space that is a relaxation of the standard notion of bounded doubling dimension.
This is joint work with Fateme Abbasi, Sandip Banerjee, Jarosław Byrka, Parinya Chalermsook, Ameet Gadekar, Kamyar Khodamoradi, Dániel Marx, and Roohani Sharma.
Thursday 18 January 2024 - Olivier Gossner, Matoula Kotsialou, Bernhard von Stengel, Adam Ostaszewski, Bob Simon (LSE)
Research Catch-Up Session