About event

Seminar Series on COVID-19, co-hosted by the Alfred Deakin Institute for Citizenship & Globalisation and the Science and Society Network

Please join us for seminar #4 in the COVID-19 seminar series, co-hosted by the Alfred Deakin Institute for Citizenship & Globalisation (ADI) and the Science and Society Network (SSN).

Title:

COVID-Up: Trust and Transparency in Contact Tracing Applications?

Abstract:

On Sunday the 26th of April 2020, the Australian Government released a contact tracing application, “COVIDSafe”, to assist with the response to the global pandemic. The app was marketed as a way to speed up the process of contacting people exposed to the coronavirus, to “support and protect you.” The Australian Government is not alone in seeking such technological “solutions” to the global pandemic, and this represents but one significant surveillance development of the COVID pandemic. A series of privacy and civil liberties concerns quickly emerged, including: its intersection with Australia’s anti-encryption laws; the use of Amazon AWS and the potential for extraterritorial data access via the Clarifying Lawful Overseas Use of Data (CLOUD) Act; the app becoming mandatory; and the data being used for “national security” purposes. An open letter signed by circa 80 technical and academic experts called for the source code and design specifications to be released to enable greater transparency and trust, sentiments shared by academics quick to publish on the topic (see Greenleaf & Kemp, 2020). However, despite initial assurances that the source code and design specifications of the COVIDSafe app would be released, this has not occurred.

This talk takes these developments as its starting point and expands the critique of the transparency ideal, which has been termed a “fallacy” (Edwards & Veale, 2017) and is only meaningful if there is a critical audience to be transparent to (Kemper & Kolkman, 2019). Transparency does not constitute accountability in and of itself (Vedder & Naudts, 2017). A separate but related critique inspired by science, technology and society (STS) studies situates algorithms and applications in context (see Ananny, 2016; Crawford, 2016). Indeed, it has been argued that a focus on transparency reinforces a form of techno-determinist “fetishism” that “manifests in the pleasurable pursuit of opening the black box, discovering the code hidden inside, exploring its beauty and flaws, and explicating its intricacies” (Monahan, 2018, p. 2). In turn, this eschews understanding of algorithms and applications as embedded within institutions that have their own politics and power plays too. Related arguments have been presented by Ananny and Crawford (2018), who advance ten failings of transparency including (but not limited to) its disconnection from power and its advancement of neoliberal modes of agency. In connecting these developments with the critical literature on transparency, this talk advocates a need to consider broader contexts of surveillance and to “decenter technology” (Peña Gangadharan & Niklas, 2019), both in terms of techno-solutionist responses to the COVID pandemic and our critique of them.

References

Ananny, M. (2016). Towards an ethics of algorithms: Convening, observation, probability and timeliness. Science, Technology and Human Values, 41(1), 93-117.

Ananny, M. & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media and Society, 20(3), 973-989.

Crawford, K. (2016). Can an algorithm be agonistic? Ten scenes from life in calculated publics. Science, Technology and Human Values, 41(1), 77-92.

Greenleaf, G., & Kemp, K. (2020). Australia’s ‘COVIDSafe App’: An experiment in surveillance, trust and law. Work-in-Progress Draft 30 April 2020. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3589317

Kemper, J. & Kolkman, D. (2019). Transparent to whom? No algorithmic accountability without a critical audience. Information Communication and Society, 22(14), 2081-2096.

Monahan, T. (2018). Editorial: Algorithmic Fetishism. Surveillance and Society, 16(1), 1-5.

Peña Gangadharan, S., & Niklas, J. (2019). Decentering technology in discourse on discrimination. Information, Communication and Society, 22(7), 882-899.

Vedder, A., & Naudts, L. (2017). Accountability for the use of algorithms in a big data environment. International Review of Law, Computers & Technology, 31(2), 206–224.

About the speaker:

Dr Monique Mann is a Senior Lecturer in Criminology and member of the Alfred Deakin Institute for Citizenship and Globalisation at Deakin University. Mann’s research expertise concerns three main interrelated lines of inquiry: (1) new technology for policing and surveillance, (2) human rights and social justice, and (3) governance and regulation. Mann has contributed to advancing Australia’s national research agenda in these areas through her activities not only as an academic and author, but also as an advocate, media commentator, and policy advisor. She is author of ‘Politicising and Policing Organised Crime’ (Routledge, 2020) and ‘Biometrics, Crime and Security’ (Routledge, 2018), and editor of ‘Good Data’ (Institute of Network Cultures, 2019).

Watch the seminar:

The seminar will be streamed live on YouTube. Access it using the live link: https://youtu.be/QFWCQhtO2Is

Date/time: Tuesday 19th May, 10am – 11:30am (Australian Eastern Standard Time, GMT+10)

A Q&A with the speaker will follow. To send questions or participate in the chat, you’ll need to sign in using a YouTube account.

The seminar will be recorded and available to watch on the SSN YouTube channel after the livestream.

If you have any questions, please send them to ssn-info@deakin.edu.au
