Sharad Goel (Harvard Kennedy School of Government), "Designing Equitable Algorithms for Criminal Justice and Beyond"
Machine learning algorithms are now used to automate routine tasks and to guide high-stakes decisions, but, if not carefully designed, they can exacerbate inequities. I'll start by describing an evaluation of automated speech recognition (ASR) tools, which power popular virtual assistants, facilitate automated closed captioning, and enable digital dictation platforms for health care. We find that five state-of-the-art ASR systems -- developed by Amazon, Apple, Google, IBM, and Microsoft -- exhibited substantial racial disparities, making twice as many errors for Black speakers as for white speakers, a gap we trace to a lack of diversity in the audio data used to train the models. I'll then describe recent attempts to mathematically formalize fairness. I'll argue that some of the most popular definitions, when used as design principles, can, perversely, harm the very groups they were created to protect. I'll conclude by describing a general, consequentialist paradigm for designing equitable algorithms that aims to mitigate the limitations of the dominant approaches to building fair machine learning systems.
The Applied Statistics Workshop (Gov 3009) meets throughout the academic year on Wednesdays, 12:00-1:30pm, in CGIS K354. This workshop is a forum for advanced graduate students, faculty, and visiting scholars to present and discuss methodological or empirical work in progress in an interdisciplinary setting. The workshop offers a tour of Harvard's statistical innovations and applications, with weekly stops in different fields and disciplines, and includes occasional presentations by invited speakers.
More information is available at the Gov 3009 website: https://projects.iq.harvard.edu/applied.stats.workshop-gov3009