2022/12/29

【Leading-Edge Lecture 32】Beyond Ethical Tech: Why Understanding the Sociocultural History of Our Technical Defaults Matter

Leading-Edge Lecture 32

Topic: Beyond Ethical Tech: Why Understanding the Sociocultural History of Our Technical Defaults Matter

Speaker: Wendy Hui Kyong Chun (Canada 150 Research Chair in New Media, the School of Communication, Simon Fraser University; Director, Digital Democracies Institute)

Moderator: Sebastian Hsien-hao Liao (Distinguished Professor, Department of Foreign Languages and Literatures; Dean, Institute of the Humanities and Social Sciences, NTU)

Time: 1/11 (Wednesday), 10:00 a.m.-12:00 p.m. (GMT+8)

Venue: The DFLL conference room in the Gallery of University History, NTU

Organizer: Institute for Advanced Studies in the Humanities and Social Sciences, NTU

Co-organizers: Department of Foreign Languages and Literatures, NTU; ASE Cultural & Educational Foundation

 

 

Speaker:

Wendy Hui Kyong Chun is Simon Fraser University’s Canada 150 Research Chair in New Media in the School of Communication and Director of the Digital Democracies Institute. She has studied both Systems Design Engineering and English Literature, which she combines and mutates in her research on digital media. Her books include Control and Freedom: Power and Paranoia in the Age of Fiber Optics (MIT Press, 2006), Programmed Visions: Software and Memory (MIT Press, 2011), Updating to Remain the Same: Habitual New Media (MIT Press, 2016), and Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition (MIT Press, 2021). She was Professor and Chair of the Department of Modern Culture and Media at Brown University, where she worked for almost two decades, and she is currently a Visiting Professor there. She is a Fellow of the Royal Society of Canada and has held fellowships from the Guggenheim Foundation, the ACLS, the American Academy in Berlin, and the Radcliffe Institute for Advanced Study at Harvard.

 

Abstract:

The dangers of predatory predictive algorithms are well known, from amplifying discrimination to cementing polarization. If this is so, what can we do? This talk outlines how the humanities, social sciences, and STEM might come together to address the problems we face, not by ignoring the past but by examining how past injustices, such as segregation, have been embedded within our technical defaults.

 
