Sigma Club Past Events

Summer Term

28 April

This seminar has been cancelled.

12 May

J. Brian Pitts, Cambridge

Real Change Happens in Hamiltonian General Relativity

In General Relativity in Hamiltonian form, change has seemed to be missing, because the Hamiltonian is a sum of first-class constraints and thus supposedly generates gauge transformations. The gauge generator, a specially tuned sum of first-class constraints, facilitates seeing that a solitary first-class constraint generates not a gauge transformation but a bad physical change in electromagnetism or General Relativity, spoiling the constraints (Gauss's law or the Gauss-Codazzi relations) in terms of the physically relevant velocities. Insistence on Hamiltonian-Lagrangian equivalence holds the key, including a reformed Bergmann-inspired notion of observables. Taking objective change to be ineliminable time dependence, there is change in GR without matter just in case there is no time-like Killing vector. The Hamiltonian formalism agrees.

19 May

Owen Maroney, Oxford

How epistemic can a quantum state be?

The “psi-epistemic” view is that the quantum state does not represent a state of the world, but a state of knowledge about the world. It draws its motivation, in part, from the observation of qualitative similarities between characteristic properties of non-orthogonal quantum wave functions and those of overlapping classical probability distributions.
It might be suggested that this gives a natural explanation for these properties, which seem puzzling on the alternative “psi-ontic” view. However, looking at two such similarities, quantum state overlap and quantum state discrimination, it will be shown that the psi-epistemic view cannot account for the values shown by quantum theory, and must instead rely on the same kind of explanations as the “psi-ontic” view.

23 June

Luke Glynn, UCL

Unsharp Best System Chances

Much recent philosophical attention has been devoted to variants on the Best System Analysis of laws and chance. In particular, philosophers have been interested in the prospects of such Best System Analyses (BSAs) for yielding *high-level* laws and chances. Nevertheless, a foundational worry about BSAs lurks: there do not appear to be uniquely appropriate measures of the degree to which a system exhibits theoretical virtues, such as simplicity and strength. Nor does there appear to be a uniquely correct exchange rate at which the theoretical virtues of simplicity, strength, and likelihood (or *fit*) trade off against one another in the determination of a best system. Moreover, it may be that there is no *robustly* best system: no system that comes out best under *any* reasonable measures of the theoretical virtues and exchange rate between them. This worry has been noted by several philosophers, with some arguing that there is indeed plausibly a set of tied-for-best systems for our world (specifically, a set of very good systems, but no robustly *best* system). Some have even argued that this entails that there are no Best System laws or chances in our world. I argue that, while it *is* plausible that there is a set of tied-for-best systems for our world, it doesn't follow from this that there are no Best System chances. (As I will argue, the situation with regard to laws is more complex.) Rather, it follows that (some of) the Best System chances for our world are *unsharp*.


Katherine Brading, Notre Dame

Absolute, true and mathematical time in Newton’s Principia

In the famous scholium at the beginning of Newton’s Principia, Newton discusses time, space, place and motion, making use of three distinctions: absolute and relative; true and apparent; mathematical and common. The literature on the scholium has focused primarily on absolute versus relative motion, and on absolute space, with comparatively little discussion of time, and nothing that I know of addressing why Newton has this three-fold set of distinctions. I discuss the three distinctions for the case of time, arguing that all three are important for the project of the Principia, and that all three become subject to empirical investigation through their relationship to the project of the Principia. I conclude by explaining the significance of this for philosophy of time.

Lent Term

20 January

Richard Pettigrew, Bristol

Accuracy, Risk, and the Principle of Indifference

In Bayesian epistemology, the problem of the priors is this: How should we set our credences (or degrees of belief) in the absence of evidence? The Principle of Indifference gives a very restrictive answer. It demands that an agent with no evidence divide her credences equally over all possibilities. That is, she ought to adopt the uniform distribution. In this paper, I offer a novel argument for the Principle of Indifference. I call it the Argument from Accuracy. It shares much in common with Jim Joyce's “nonpragmatic vindication of probabilism” (Joyce 1998). Joyce showed that, for a broad range of accuracy measures, if an agent's credences do not satisfy the axioms of probability, then there are credences that do satisfy those axioms that are guaranteed to be more accurate. In my argument, I show that if an agent's credences do not satisfy the Principle of Indifference, then, for a similarly broad range of accuracy measures, they risk greater inaccuracy than they need to.

3 February

Bryan W. Roberts, LSE

Three Merry Roads to T-Violation

This talk aims to give a general perspective on how the fundamental laws of nature can distinguish between the past and the future, or be T-violating. I argue that, in terms of basic analytic arguments, there are really just three approaches currently being explored. I show how each is characterised by a symmetry principle, which provides a template for detecting T-violating laws even without knowing the laws of physics themselves. Each approach is illustrated with an example, and the prospects for each are considered in extensions of particle physics beyond the standard model.

17 February

James Ladyman, Bristol

What are Weak Values?

This paper is about the philosophical interpretation and implications of weak values in quantum mechanics. In particular, we present an argument for a subtle kind of retrocausality (which we define as the causal dependence of earlier events on later events), based on the probabilistic structure, which we explain, of experiments to determine weak values. We consider what type of property weak values are, and explain some of their unusual features and how they relate to the nature of the retrocausality they exhibit. We also briefly consider the merits of the two-state vector approach to weak values, and explore how weak values should be understood in the context of Everettian quantum mechanics.

17 March

Hans Halvorson, Princeton

Explanation via surplus structure

Our attempts to model physical systems seem to be cursed by the problem of surplus structure: our mathematical representations of such systems contain structure which apparently has no analogue in the system under study. When interpreting our theories, then, we invoke some notion of "physical equivalence" of models in order to wash out this surplus structure. At least that's how the story is typically told. I will argue, in contrast, that surplus structure can itself be explanatory. For concreteness, I will focus on the case of surplus structure in quantum field theory. I will show that explanations in QFT trade on the fact that there are multiple "representations" that are nonetheless mathematically isomorphic.

Michaelmas Term

7 October

Oswaldo Zapata, LSE

Is there still any hope for string theory?

In the first part of the talk we will cover how string theory is related to supersymmetry, and what we can learn from the experiments at the LHC. We will also touch upon how supersymmetry extends our notion of the standard relativistic four-dimensional space-time. A short discussion of supergravity, that is, local supersymmetry, follows. We will see that supergravity is in fact a special limit of superstring theory. Next, we will explore the string theory proposal stating that the universe is a sort of hologram. According to this idea, the universe we live in is in one-to-one correspondence with a lower-dimensional world where all the information of our higher-dimensional universe is stored. Some phenomenological consequences of this correspondence, such as the understanding and conception of new materials with technological applications, will be discussed at the end.

21 October

Joseph Henson, Imperial College London

Locality Reinstated?

The assumptions of Bell's theorem were meant by its author to characterise a lack of any physical superluminal influences. The “quantum non-locality” he discovered does violence not only to our intuitions, but arguably also to any thoroughgoing attempt to apply causal explanation in QM.

It is natural to wonder if there is a better way to characterise locality (= lack of superluminal influence) that is consistent with QM, thereby resolving an apparent contradiction by discarding some conceptual “excess baggage”. However, in order to do this we must be careful to preserve what is essential in the idea of locality. In this talk, I will describe a particular view of Bell's condition, and point out a (hopefully uncontroversial) list of conditions that any reinstated “locality” should obey. I will then apply these conditions to a number of candidates in the literature, including some treatments of locality in many-worlds QM, Howard's non-separability and “super-determinism”, and find them to be violated. Finally I will speculate about the extent to which a useful concept of locality *can* be preserved in QM.

4 November

Samuel Fletcher, Logic and Philosophy of Science, University of California Irvine

On the Reduction of General Relativity to Newtonian Gravitation

Accounts of the reduction of general relativity (GR) to Newtonian gravitation (NG) usually take one of two approaches. One considers the limit as the speed of light c → ∞, while the other focuses on the limit of formulae (e.g., three-momentum) in the low-velocity limit, i.e., as v/c ≈ 0. Although the first approach treats the reduction of relativistic spacetimes globally, many have argued that ‘c → ∞’ can at best be interpreted counterfactually, which is of limited value in explaining the past empirical success of NG. The second, on the other hand, while more applicable to explaining this success, treats only a small fragment of GR. Further, it usually applies only locally, and hence is unable to account for the reduction of global structure. Building on work by Ehlers, I propose a different account of the reduction relation that offers the global applicability of the c → ∞ limit while maintaining the explanatory utility of the v/c ≈ 0 approximation. In doing so, I highlight the role that a topology on the collection of all spacetimes plays in defining the relation, and how the choice of topology corresponds with broader or narrower classes of observables that one demands be well approximated in the limit.


Summer Term

29 April 2013: Jeffrey Ketland, Oxford. Leibniz Equivalence

This talk discusses two topics.

1. Leibniz Equivalence: Given a spacetime model (M, g, ...), where M = (X, C) is a manifold with atlas C, an arbitrary permutation h : X -> X of the base set of M generates an isomorphic model; h need not be a diffeomorphism of M to itself. Leibniz equivalence is the anti-haecceitist claim that isomorphic spacetime models represent the same possible worlds.

2. Abstract Structure & Possible Worlds: Given a model A, what is the abstract structure of A? It is proposed that the abstract structure of A is the categorical first-order-ramsified second-order propositional function that defines the isomorphism type of A.

What is a possible world? It is proposed that, for any possible world w, w = F[R1, ...], for some sequence R1, ..., of relations, where F is the abstract structure of some model. A possible world w is therefore a categorical proposition expressing a “pattern of instantiation” of relations.

14 May 2013 (Tuesday, LAK.2.06, 5.30-6.45 pm): Orly Shenker (Hebrew University) and Meir Hemmo (Haifa University). We Are the Demons, My Friend

Thermodynamic phenomena are successfully predicted by statistical mechanics, which assumes (as a working hypothesis) that mechanics is the fundamental theory of the world, and adds to it some extra assumptions. However, the justification for these extra assumptions is unclear. In particular, in the prevalent accounts, notions such as macrostates and probability are not grounded in any physical theory. In our book *The Road to Maxwell's Demon* we fill this gap by showing how statistical mechanical concepts such as macrostates and probability can be constructed bottom up from mechanics together with some testable empirical generalizations. In our construction a central role is played by the notion of a physical observer within a "theory without an observer" (such as classical mechanics). In this talk we discuss the nature and status of the physical observer in classical mechanics and address two implications of this idea. One implication is the emergence of an objective and physical notion of probability. Another implication is that entropy-decreasing operations, associated with Maxwell's Demon, may be ubiquitous rather than exceptional, and in fact, we may be the demons.

20 May 2013: Erica Thompson (Centre for the Analysis of Time Series, LSE). Attitudes Towards Uncertainty: The Perspective of a Physical Scientist

Different scientific disciplines have very different attitudes towards the presence and nature of uncertainty in scientific knowledge.  I will discuss some of these from my own perspective, having been trained in the physical sciences.  The differences are most distinct when we are making out-of-sample predictions and therefore, necessarily, relying on some form of modelling.  In my own field, climate science, there is a particular tension between a statistical view of the climate system and a dynamical-systems view, which lead to somewhat different attitudes towards uncertainty and potentially very different assessments of risk.  I will suggest why I think such disagreements recur, and the implications for climate (and other) decision making.

24 June 2013: László E. Szabó (Department of Logic, Institute of Philosophy, Eötvös University, Budapest). On the Meaning of the Special Relativity Principle

In its most widespread formulation, the special principle of relativity is the following statement: "The laws of physics have the same form in all inertial frames of reference." While there is a long-standing discussion about the interpretation of the extended, general principle of relativity, there seems to be a consensus that the above quoted special principle of relativity is absolutely unproblematic.

In my talk, I will challenge this view through an analysis of the precise meaning of the statement. The analysis will be based on a precise and general mathematical formulation of the principle. It will be seen, however, that the main difficulties are not formal or mathematical in nature, but rather conceptual. What counts as a "law of physics" here -- for example, the Maxwell equations, or a Coulomb solution describing a concrete physical situation? How do we identify a physical law, and how do we identify its counterpart in another reference frame? What does it take to be of the "same form", given that one and the same physical law can be expressed in different, but logically equivalent, forms? In what sense can a law of physics be "in" an inertial frame of reference? How do we identify a physical quantity, and how do we identify its counterpart in another reference frame? If they are identified by means of their operational definitions, how are the etalons and the measuring devices shared between the different reference frames? Under what physical conditions can two measuring devices -- one at rest in one inertial frame, the other at rest in another inertial frame -- be regarded as the same measuring device in the same (pointer-position) state? And there is a similar question about the physical objects to be measured. After all, under what conditions can a physical object or phenomenon -- Galileo's fish, butterflies, and smoke -- be regarded as being "in" or "co-moving with" an inertial frame of reference? In fact, some of these questions do not have a satisfactory answer.

Lent Term

4 February 2013: James Wells, CERN and University of Michigan, Ann Arbor. Will we ever be sure the Higgs boson has been discovered?

Recent news from CERN is that the Higgs boson has been discovered. I will first review what the Higgs boson is and why the discovery is so momentous. After that I will make the case that we will never be sure of what we have discovered. I give an example of how exotic new physics interactions can change its identity in subtle ways. Nevertheless, there are experiments to be made in the next few years that can establish it as the Higgs boson "for all practical purposes" or point the way to a new energy frontier. This talk is intended for an audience with limited exposure to particle physics.

25 February 2013: Richard Dawid, University of Vienna. Evidence for the Higgs Particle and the Look Elsewhere Effect

Last summer CERN reported the observation of a scalar particle that is strongly believed to be the long-expected Higgs particle. One specific aspect of the process of data collection and analysis that led up to this discovery is of particular philosophical interest. In the early months of 2012, when the collected data was already significant but technically did not yet amount to an observation of a new particle (i.e., did not yet constitute a 5 sigma effect), an interesting debate arose among particle physicists on the most adequate characterisation of the status of the available data. That debate circled around the role of the so-called ‘look elsewhere effect’ and provided an interesting perspective on the relation between theoretical and empirical arguments in high energy physics. The talk will analyse this debate and draw some conclusions regarding the epistemic status of claims in high energy physics.

4 March 2013: Silvia De Bianchi,  University College London and University of Rome "La Sapienza". "A symmetric theory of electrons and positrons" and its legacy for the history and philosophy of physics

In this talk I shall identify and discuss philosophical and methodological questions arising from the case study of Majorana's "A symmetric theory of electrons and positrons" (1937). According to Majorana, a symmetric theory can be constructed on the basis of a two-component theory that implies physical consequences for neutral particles different from those advanced by Dirac. The strategy for obtaining this result consists in adopting a variational principle different from Dirac's, assuming a priori a mass term for neutral particles, and then proceeding via second quantization. A consequence of this procedure is that the neutrino can be its own antiparticle. In the second part of my talk I show that, in Majorana's view, symmetries depend on the choice of a variational principle, and frame this question within the current realism and anti-realism debate. Finally, I shall address the question of objectivity that Majorana raised in accounting for quantum dynamical systems.

Michaelmas Term

29 October 2012: Gábor Hofer-Szabó, Institute of Philosophy, Hungarian Academy of Sciences. Bell Inequality and Common Causal Explanation in Algebraic Quantum Field Theory

In the talk it will be argued that the violation of the Bell inequality in algebraic quantum field theory does *not* exclude a common causal explanation of a set of quantum correlations if we abandon commutativity between the common cause and the correlating events. Moreover, it will turn out that the common cause is local, i.e. localizable in the common past of the correlating events. It will be argued furthermore that giving up commutativity helps to maintain the validity of Reichenbach's Common Cause Principle in algebraic quantum field theory.

19 November 2012: Giovanni Valente, Department of Philosophy, University of Pittsburgh. Local Disentanglement in Relativistic Quantum Field Theory

In their paper on "Entanglement and Open Systems in Algebraic Quantum Field Theory", Clifton and Halvorson (2001) raised the question whether entanglement between quantum systems can be destroyed by means of local operations, and claimed that, contrary to non-relativistic quantum mechanics, this can never be the case in relativistic quantum field theory. In this talk I will argue that Clifton and Halvorson's no-go result applies only to a special kind of local operations, and thereby I will reject their conclusion. In fact, after providing sufficient conditions for local disentanglement to be achieved, I will show that, if the split property holds, there exists a class of local operations which disentangle all states across any pair of spacelike separated quantum field systems.

3 December 2012: Hugo Touchette, School of Mathematical Sciences, Queen Mary, University of London. The large deviation approach to statistical mechanics: A basic introduction

I will give in this talk a basic overview of the theory of large deviations, developed in the 1970s by Varadhan (Abel Prize 2007), and of its recent applications in statistical mechanics. In the first half of the talk, I will discuss basic results of large deviation theory, which can be traced back in mathematics to Cramér (1938) and Sanov (1960), and on the physics side to Einstein (1910) and Boltzmann (1877). In the second half, I will then discuss how these results can be used to rebuild or re-interpret the foundations of equilibrium and nonequilibrium statistical mechanics.

Related Events

30 April 2013 (Tuesday 5.30 - 7pm): Matt Parker, LSE. Infinitesimal Probabilities and Regularity

In standard probability theory, probability zero is not the same as impossibility. If an experiment has infinitely many possible outcomes, all equally likely, then all the outcomes must have probability zero, but at least one of them must occur nonetheless. Many have suggested that this should not be so: probabilities (ontic or epistemic, depending on the author) should be regular, meaning that only impossible events should have probability zero and only necessary or certain events should have probability one. This can be arranged if we allow infinitesimal probabilities, but it turns out that infinitesimals do not solve all of the problems. I will show that regular probabilities cannot be translation-invariant, even for bounded and disjoint events. Hence, for various events confined to finite space and time (e.g., dart throws and vacuum fluctuations), regular chances cannot be determined by space-time invariant physical laws, and regular credences cannot satisfy seemingly reasonable symmetry principles. Moreover, these examples are immune to the main objections against Timothy Williamson's infinite coin flip examples.

13 May 2013: Barrie Tonkinson. Relativity Theory in the 21st Century - Formalism & Interpretation


Lent Term

16 January 2012: Seth Bullock, University of Southampton. Levins and the Legitimacy of Artificial Worlds

13 February 2012: Jeffrey A. Barrett, University of California at Irvine. How to Understand Everett's Pure Wave Mechanics as Empirically Adequate

27 February 2012: Michael Redhead, LSE. The Relativistic EPR Argument

Michaelmas Term

10 October 2011: William Harper, University of Western Ontario. Isaac Newton’s Scientific Method

31 October 2011: Huw Price, University of Cambridge. Retrocausality – what would it take?

21 November 2011: Nazim Bouatta, University of Cambridge. Fields, Strings and Black Holes, and all that


Summer Term

9 May 2011: Nick Huggett, University of Illinois at Chicago. Reading the Past in the Present

16 May 2011: Arthur Petersen, VU University Amsterdam. Model structure uncertainty: a matter of (Bayesian) belief?

23 May 2011: Meinard Kuhlmann, University of Bremen. On the Furniture of the Quantum World – Particles, Fields, Structures, or Tropes?

Lent Term 

24 January 2011: Guido Bacciagaluppi, University of Aberdeen. Quantum Logic and Truth Functionality

7 February 2011: Dennis Lehmkuhl, University of Wuppertal. The Difference between Matter and Spacetime

7 March 2011: Chris Wüthrich, University of California San Diego. To the Planck scale and back: On the emergence of spacetime in quantum theories of gravity

21 March 2011: Daniel Isaacson, University of Oxford. Structural realism for physical theories and for mathematics

Michaelmas Term

25 October 2010: Jeremy Butterfield, Trinity College, University of Cambridge. Particles as Properties (joint work with Adam Caulton, University of London)

1 November 2010: Bob Coecke, Oxford University. In the beginning God created tensor

15 November 2010: Chris L. Farmer, University of Oxford. Inverse Problems: Formulation, Computation and Interpretation


Summer Term

17 May 2010: Adam Caulton, University of Cambridge. Weak Discernibility, But of What?

24 May 2010: Michael Redhead, LSE. The Gödel Argument and EPR Relativity Argument

14 June 2010: Alexei Grinbaum, Ecole Polytechnique, Paris. Understanding quantum mechanics via information-theoretic reconstructions 

28 June 2010: Thomas Prellberg,  School of Mathematical Sciences, Queen Mary, University of London. The Mathematics of the Casimir Effect

Lent Term

18 January 2010: Sorin Bangu, University of Cambridge. Best Friends or Worst Enemies? Bridge Laws in Inter-theoretic Relations

15 February 2010: Eleanor Knox, School of Advanced Study, University of London. Geometry and Inertia: Constraints on spacetime theories

15 March 2010: Paul Busch, University of York. Unsharp Quantum Reality 

Michaelmas Term

28 September 2009: Aidan Lyon, University of Sydney. Counterfactual-Probability

12 October 2009: Erik Curiel, LSE. Classical Mechanics Is Lagrangian; It Is Not Hamiltonian

23 November 2009: Mathias Frisch, University of Maryland, College Park. Causes in Physics: Projection or Discovery?

7 December 2009: Alberto Cordero, CUNY Graduate Center & Queens College CUNY.  Realism and the Case of the Light Ether


Summer Term

1 June 2009: Antony Eagle, University of Oxford. Chance and Randomness

15 June 2009: Tian Yu Cao, Boston University. Structural Realism and Theory-Creation 

29 June 2009: Matt Parker, LSE. On the Physical Implementation of Universal Computers

Lent Term

19 January 2009: Fay Dowker, Imperial College London. Dynamical Logic

9 February 2009: Joseph Berkovitz, University of Toronto. On Predictions in Retro-causal interpretations of Quantum Mechanics

9 March 2009: Dan Parker, Virginia Tech. Molecular Disorder, Probability and Time

16 March 2009: José Díez, University of Barcelona. Operational Meaning and Metrical Concepts 

Michaelmas Term

13 October 2008: John Worrall, LSE. Structural Realism: the "Newman Objection"? What objection?

27 October 2008: Miklós Rédei, LSE. Is Algebraic Relativistic Quantum Field Theory Causally Complete?

17 November 2008: Cancelled due to illness of the speaker 


Summer Term

19 May 2008: Tim Palmer, University of Cambridge and ECMWF. Bell Inequalities, Free Variables and the Undecidability of Fractal Invariant Sets

9 June 2008: Andreas Doering, Imperial College, London. Topos Theory in the Foundations of Physics

23 June 2008: Frank Arntzenius, University of Oxford. More Space, Less Clutter?

7 July 2008: Lamberto Rondoni, Politecnico di Torino. Deterministic Thermostats and Fluctuation Relations in Nonequilibrium Statistical Mechanics

Lent term

21 January 2008: Reimer Kuehn, King's College London. On the Constitutive Role of Large Numbers in Theory Development for Macroscopic Systems

4 February 2008: F. A. Muller, Erasmus University Rotterdam. Leibniz's Revenge: How to Discern Elementary Particles in Quantum Mechanics

18 February 2008: Charlotte Werndl, University of Cambridge. What Are the New Implications of Chaos for Unpredictability?

3 March 2008: Lennie Smith, LSE and University of Oxford. Philosophical Questions at the Coal-face of Climate Policy: Models, Muddles, and Insight

24 March 2008: Eric Winsberg, University of South Florida. Verification and Validation in Computer Simulation

Michaelmas Term

23 November 2007: Jos Uffink, University of Utrecht. Motivating outcome independence: locality versus sufficiency


Lent Term

9 January 2007: Harvey Brown, Oxford University. Why is general relativity a geometric theory? The red shift argument revisited

30 January 2007: Stephan Hartmann, LSE. Probability and Decoherence

20 February 2007: Rob Spekkens, LSE. Quantum coherence: fact or fiction?


Summer Term

8 May 2006: John Lucas, University of Oxford. Informal Logic

11 May 2006: Pierre Cartier, Institut des Hautes Etudes Scientifiques (Chair: Sir Roger Penrose). Exotic Symmetries of Space and Time

7 June 2006: Nick Huggett, University of Illinois. The Identity of Indiscernibles for Quantum Particles of Any Symmetry

19 June 2006: Lee Smolin, University of Waterloo and Perimeter Institute for Theoretical Physics. Science and Democracy: The Essential Partnership

Lent Term

9 January 2006: Alexey Kryukov, University of Wisconsin Colleges/University of Wisconsin, Madison. Functional relativity: How to extend the principle of relativity to quantum mechanics?

16 January 2006: Christopher Timpson, University of Leeds. The Ontological Status of Quantum Information: Progress and Open Questions?

27 January 2006: Mini-Workshop on the Philosophical Foundation of Statistical Mechanics

  • Peter Ainsworth, LSE. The Spin-Echo Experiment and Statistical Mechanics
  • Wolfgang Pietsch, University of Augsburg. Equilibrium and Non-equilibrium - what's the difference and what can we learn from it?
  • Roman Frigg, LSE. Entropy and Randomness in Dynamical Systems

13 February 2006: George Zouros, LSE. An Analysis of Popper's Thought Experiment

Michaelmas Term

24 October 2005: Jonathan Halliwell, Imperial College, London. Commuting Position and Momentum Operators, Exact Decoherence and Emergent Classicality

14 November 2005: Ray Streater, King's College London. The Meaning of Entropy Out of Equilibrium 

28 November 2005: David Lavis, King's College London. Boltzmann and Gibbs: An Attempted Reconciliation