Facebook is yet again dominating news headlines, this time for its #10YearChallenge, which encouraged users to post pictures of themselves from 10 years ago alongside pictures of themselves today. While ostensibly a fun and trivial exercise, the viral challenge raised questions about whether Facebook itself initiated the game to gather data on its users. This led Zak Doffman to claim in Forbes, “The world of Big Data has clearly changed in the last year”.
Indeed, it has. In March 2018, Facebook was embroiled in a much more significant scandal. Years before, the consulting firm Cambridge Analytica hired a researcher to harvest data from over 50 million Facebook users under the guise of academic research. Cambridge Analytica used the data to develop software for predicting voter behavior, selling it to political parties to create targeted campaign ads. In a Congressional hearing about the Facebook data breach, Mark Zuckerberg, the founder and CEO of Facebook, apologized for the invasion of privacy, claiming “we didn’t do enough to prevent these tools from being used for harm … That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy.”
Although academics and civil liberties groups have long raised concerns over the ethics of Big Data technologies and companies, these events have thrust the conversation into the mainstream, demonstrating the urgency of addressing issues of privacy, access, and corporate control of data. The recent controversies surrounding Facebook highlight a central challenge for academic researchers: while users may produce “content”, they often relinquish control over how that content is used. For individuals vulnerable to surveillance, harassment, exploitation, or other forms of abuse, the ways that researchers access and utilize their social media data may place these individuals at further risk of harm.
Using our own research on #WhyIStayed as an example, we have examined some of these challenges for Big Data researchers. Corporate control of online content, and in particular social media content, raises critical questions regarding knowledge production itself and who stands to benefit. For example, what gets researched and who does it represent? Who gets to conduct research and in whose interests? What knowledges can even be produced when corporate entities control access and content? What symbolic violence might we enact if, as researchers, we simply adhere to “regulatory norms”, such as those of institutional review boards or user agreements? Who is most vulnerable to that violence?
To address these considerations, we have drawn attention to the importance of feminist ethics for Big Data social media research. We have argued that power, context, and subjugated knowledges – key tenets of feminist holistic reflexivity – must each be central considerations in conducting Big Data social media research. While our study, which involves victims/survivors of domestic violence, highlights particular risks and vulnerabilities, we believe that the practices of feminist holistic reflexivity we have discussed can help other researchers navigate ethical issues in Big Data social media research.
Cheryl Cooky, PhD, is an associate professor in the American Studies program and Women’s, Gender, and Sexuality Studies program at Purdue University. She is the author of No Slam Dunk: Gender, Sport and the Unevenness of Social Change (Rutgers University Press, 2018).
Jasmine R. Linabary, PhD, is an assistant professor in the Department of Communication and Theatre at Emporia State University. Her research centers on issues of safety and inclusive participation in digital and physical spaces.
Danielle Corple is a PhD candidate in the Brian Lamb School of Communication at Purdue University. She studies issues of vulnerability in online and offline organizations.
Cheryl Cooky, Jasmine R. Linabary, Danielle J. Corple
First Published: October 30, 2018
From Big Data & Society