
Are Communications Scholars Facing a Data Analysis Divide?

Big data. Big questions. Christian Ledwell (MSc Comm & Development, '13) explores the future of communication research in light of more computational approaches to inquiry.

Will big data—data unprecedented in scale and scope—create what US computer science professor Lev Manovich calls a “data analysis divide” between researchers with computational knowledge and those without? Drs. Mike Savage and Roger Burrows argue in their 2007 article The Coming Crisis of Empirical Sociology that, in light of big data, the tried-and-true in-depth interview and sample survey are valuable but dated methodologies, and that social researchers must actively respond to new practices around the “collection, use, and deployment of social data.”

While one might be tempted to claim that questions around big data merely rehash the classic, and unhelpful, debate between quantitative and qualitative inquiry, acknowledging and appreciating interdisciplinary perspectives on new technologies is an important part of critically assessing the promise and perils of big data.

Drs. Eric Meyer and Ralph Schroeder, scholars at the Oxford Internet Institute (OII), argue that e-Research, the use of new technologies to support emerging and existing forms of research, is often the product of multiple communities of scholars engaged in knowledge sharing. In the recently published Oxford Handbook of Internet Studies, they describe e-Research as the product of “individuals or groups build[ing] their own tools for wider uptake or sharing and collaboration.”

The Economic and Social Research Council (ESRC), the UK’s biggest funder of social research, has been a key player in emphasizing the importance of such endeavors. It is preparing to host a conference that will lay the groundwork for the creation of an International Centre for Social Media Research with the goal of facilitating communication across disciplinary boundaries. Drawing on the OECD Global Science Forum’s report on Data and Research Infrastructure for the Social Sciences, this initiative aims to unite diverse scholars who will contribute their various expertise to the study of digital media.

Despite these initiatives, there have been predictions that big data will lead to a research environment where theory and the scientific method itself are no longer needed. Chris Anderson claims in his divisive 2008 Wired article The End of Theory: The Data Deluge Makes the Scientific Method Obsolete that in a world of petabytes (i.e. 1 million gigabytes) and supercomputers “correlation supersedes causation, and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all.”

Counter to such claims, there is good reason to believe that theory and more qualitative inquiry will continue to play a key role in social research despite the growing importance of big data. To say the least, Anderson’s claims are problematic. Just because big data may offer a novel means of investigating the social realm does not excuse the need to approach that information with a critical lens, particularly given the dangers around the depoliticization of social phenomena. As the OII’s Director of Research Dr. Mark Graham states in Wired’s follow-up piece Big Data and the Death of the Theorist by Ian Steadman, “when talking about 'big data' and the humanities, there will always be things that are left unsaid, things that haven't been measured or codified.”

Dr. Robin Mansell, Professor of Internet Studies at the LSE, argues that theory is integral to research that aims to address social problems and questions the trajectory of research in light of big data. “An interesting question is whether the availability of big data sets begins to shape the research questions that are given priority,” said Mansell, “squeezing out some research questions that need to be explained using qualitative methods.”

All researchers, both quantitative and qualitative, are tasked with rigorously interpreting and producing data. Echoing the aims of the ESRC initiatives, Mansell suggests that what is needed is a collaborative, interdisciplinary approach to social inquiry, facilitated by projects with varying emphases on the qualitative, the quantitative, and the computational. Ultimately, the implicit arrogance of techno-optimism about the end of theory on the one hand, and distrust of computer scientists, social physicists, and computationally inclined social researchers on the other, only serve to inhibit true interdisciplinarity and a rich understanding of socio-technical phenomena.

To learn more about the promises and perils of big data and the importance of a reasoned approach to its assessment, be sure to check out danah boyd and Kate Crawford’s Critical Questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon.

 
