LSE100B      Half Unit
The LSE Course: How can we control AI?

This information is for the 2022/23 session.

Teacher responsible

Dr Christopher Blunt KSW.4.12 and Dr Jillian Terry KSW.4.11

Availability

The course will be compulsory for all first-year undergraduate students.

Course content

LSE100 is LSE’s flagship interdisciplinary course taken by all first-year undergraduate students as part of your degree programme. The course is designed to build your capacity to tackle multidimensional problems through research-rich education, giving you the opportunity to explore transformative global challenges in collaboration with peers from other departments and leading academics from across the School. Before registering at LSE, you will have the opportunity to select one of three themes to focus on during LSE100, each of which foregrounds a complex and pressing question facing social scientists. In 2022/23, the available themes are:

  • How can we avert climate catastrophe?
  • How can we control AI?
  • How can we create a fair society?

In the ‘How can we control AI?’ theme, you will explore the emergence of artificial intelligence and its implications. Rapid technological advances in artificial intelligence are augmenting our ability to solve previously intractable problems, fundamentally changing society in ways that are both thrilling and terrifying. 

The same tools which could tackle social problems, automate burdensome tasks, and optimise systems can be used to threaten the freedom, physical safety, and economic security of people worldwide. Will AI transform society for the better, or will it simply reinforce existing systems and relationships, further embedding biases, inequalities, and structures of power? Who decides? Can we harness the power of AI for good?

In this module, we will explore the ways in which social systems are being transformed by technological change. You will learn to use the tools and frameworks of systems thinking in order to analyse the impacts of AI, broaden your intellectual experience, and deepen your understanding of your own discipline as you test theories, evidence and ideas from different disciplinary perspectives.

Teaching

7 hours and 30 minutes of seminars in the MT. 7 hours and 30 minutes of seminars in the LT.

90-minute seminars take place in alternate weeks. Students will attend an LSE100 seminar in either weeks 1, 3, 5, 7 and 9 or weeks 2, 4, 6, 8 and 10 of Michaelmas term, and weeks 1, 3, 5, 7 and 9 or weeks 2, 4, 6, 8 and 10 of Lent term.

MT: Seminar – 5 x 90min

LT: Seminar – 5 x 90min

In addition to seminars, students will engage with bespoke video lectures featuring academics from across the School (approx. 20 minutes per seminar).

Formative coursework

In seminars throughout both terms, students will practise:

  1. analysing quantitative and qualitative data
  2. using systems thinking and systems change tools
  3. constructing and communicating evidence-based academic arguments

Teachers will provide feedback during seminars and in post-seminar communications to groups and individuals.

During the Lent Term, students will also have the opportunity to try out the tools of systems thinking and systems change that they will use in their digital reports and presentations.

Indicative reading

The following readings are indicative of the texts students will be assigned. The total amount of reading assigned for each seminar will be a maximum of 20 pages.

  • Oran R. Young (2017). ‘The age of complexity’, in Governing Complex Systems: Social Capital for the Anthropocene (MIT Press).
  • Ruha Benjamin (2019). ‘Default Discrimination: Is the Glitch Systemic?’, in Race after Technology: Abolitionist Tools for the New Jim Code (Polity).
  • Frank Levy (2018). ‘Computers and populism: artificial intelligence, politics and jobs in the near term’, Oxford Review of Economic Policy, 34(3): 393–417. https://doi-org.gate3.library.lse.ac.uk/10.1093/oxrep/gry004
  • Mark Coeckelbergh (2020). ‘AI for climate: freedom, justice, and other ethical and political challenges’, AI and Ethics. https://doi.org/10.1007/s43681-020-00007-2
  • Emily Bender et al. (2021). ‘On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?’, in FAccT '21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, pp. 610–623.
  • Kate Crawford & Ryan Calo (2016). ‘There is a blind spot in AI research’, Nature, 538: 311–313.
  • Sarah Myers West, Meredith Whittaker & Kate Crawford (2019). Discriminating Systems: Gender, Race and Power in AI (AI Now Institute).
  • Robert Sparrow & Mark Howard (2017). ‘When human beings are like drunk robots: driverless vehicles, ethics and the future of transport’, Transportation Research Part C, 80: 206–215.

Assessment

Coursework (50%, 1500 words) in the MT.
Project (50%) in the LT.

Summative assessment will include an individual written assessment in the Michaelmas Term (50%) and a collaborative research project in the Lent Term (50%).

Key facts

Department: LSE

Total students 2021/22: Unavailable

Average class size 2021/22: Unavailable

Capped 2021/22: No

Value: Half Unit


Personal development skills

  • Leadership
  • Self-management
  • Team working
  • Problem solving
  • Application of information skills
  • Communication
  • Application of numeracy skills