FAQ 36: How do I report my quantitative data?

What's the issue?

As Abelson (1995: 2) put it, quantitative data analysis should "make an interesting claim; it should tell a story that an informed audience will care about and it should do so by intelligent interpretation of appropriate evidence". No matter how appropriate the research design, how thorough the interviews, how sound the statistical analysis, how representative the sample, how carefully crafted the questionnaire and its questions, or how stringent the quality control on data collection, in the end the real value of a research project depends on how well it communicates its results to those who can use them.

Common practice

  • A research report should give a thorough overview of how the research was conducted and what the results are.
     
  • Use graphics to display your results. A visual representation of data can reveal the meaning and implications of your study in a way that abstract numbers might conceal.
     
  • Remember the distinction between statistical significance and substantive significance: a result can be statistically significant without being substantively important.
     
  • Remember the distinction between statistical significance and effect size: with a large sample, even a trivially small effect can be statistically significant (see the sketch after this list).
     
  • Do not expect statistics to speak for themselves. It is not enough to fill endless pages with tables and graphs.
     
  • Keep it simple when possible. Complex statistics can lead to confusion.
     
  • Resist the temptation to present too much raw data; aim for a focused analysis. Even if a question was included in the questionnaire, it does not necessarily have to appear in the report.
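
To illustrate the distinction between statistical significance and effect size, the sketch below uses simulated data. It is a hypothetical example only; Python with NumPy and SciPy is an assumption of this illustration, not a tool the guide prescribes. With a very large sample, a negligible difference still produces a very small p-value, while the effect size shows how little the difference matters:

```python
# Hypothetical illustration: statistical significance versus effect size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 50_000                                            # very large sample
group_a = rng.normal(loc=0.00, scale=1.0, size=n)
group_b = rng.normal(loc=0.03, scale=1.0, size=n)     # tiny true difference

# Two-sample t-test: with n this large, p is likely far below 0.05
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Cohen's d: the standardised mean difference, an effect-size measure
pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
cohens_d = (group_b.mean() - group_a.mean()) / pooled_sd

print(f"p-value   = {p_value:.4g}")    # "significant" ...
print(f"Cohen's d = {cohens_d:.3f}")   # ... yet the effect is negligible (~0.03)
```

Reporting only the p-value here would suggest an important finding; reporting the effect size alongside it makes clear how small the difference actually is.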

Pitfalls to avoid

Many will undoubtedly have heard the phrase, attributed to Disraeli, that there are three kinds of lies: lies, damned lies and statistics. The implication is that statistics can be used to confuse, distract and even distort the truth. This is of course true up to a point. But it is worth remembering that it is not the statistics that lie; rather, it is researchers who, consciously or unconsciously, provide statistical information that is confusing, misleading or simply wrong.

Questions to consider

As a rule of thumb, any argument based on quantitative data has to contain information on five important dimensions (Abelson, 1995: 11-13):

  • Magnitude: how big is the difference and how strong is the correlation? (A brief sketch of reporting magnitude follows this list.)

  • Articulation: what precisely is it that we have found?

  • Generality: to what extent are the findings applicable to other people in other situations?

  • Interestingness: how relevant are the findings, and should anybody be interested?

  • Credibility: are the findings methodologically and theoretically sound?
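
One concrete way to address magnitude, and to give readers a sense of precision, is to report an effect-size estimate with a confidence interval rather than a bare p-value. The sketch below is a hypothetical example, again assuming Python with NumPy rather than any tool the guide prescribes, and uses the Fisher z-transformation to attach an approximate 95% confidence interval to a correlation:

```python
# Hypothetical illustration: reporting the magnitude of an association
# as a correlation coefficient with an approximate 95% confidence interval.
import numpy as np

rng = np.random.default_rng(0)
n = 400
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(size=n)        # moderate true association

r = np.corrcoef(x, y)[0, 1]             # Pearson correlation

# Fisher z-transformation gives an approximate CI for r
z = np.arctanh(r)
se = 1.0 / np.sqrt(n - 3)
lo, hi = np.tanh(z - 1.96 * se), np.tanh(z + 1.96 * se)

print(f"r = {r:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```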

Further resources

Abelson, R. P. (1995). Statistics as Principled Argument. Hillsdale, NJ: Lawrence Erlbaum Associates.

Byrne, D. (2002). Interpreting Quantitative Data. London: Sage.

Example of good practice

The Pew Internet & American Life Project has conducted a series of surveys of American teens on different aspects of their internet use. In each case, they provide a clear and succinct statement of the exact sampling frame used, so that the percentages reported can be interpreted accurately. For example, on the first page of their 2007 report on teens' use of social networking sites, and in addition to a detailed methodology appendix, they state:

"This Pew Internet & American Life Project report is based on the findings of a nationally representative telephone survey of American teens and a parent or guardian. All numerical data were gathered through telephone interviews conducted by Princeton Survey Research Associates between October 23, and November 19, 2006 among a sample of 935 teens ages 12-17 and a parent or guardian. For results based on the total sample, one can say with 95% confidence that the error attributable to sampling and other random effects is +/- 3%. For results based [on] teen internet users (n=886), the margin of sampling error is +/- 4%."

Through this statement, they seek to minimise the likelihood of common misunderstandings when survey findings are interpreted. Pew strives for further clarity by adding the following note beneath every table in the findings: "Source: Pew Internet & American Life Project Parents and Teens Survey, October-November 2006. Based on online teens who use the internet from home. Margin of error for the overall sample is ±4%." Although it can be difficult to ensure that such information also appears in a press release and, especially, in press coverage of the research, researchers should strive to ensure that their findings are reported accurately.
(Sonia Livingstone, UK)
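
As a rough check on figures like those quoted above, the margin of error for a simple random sample at 95% confidence can be approximated with the standard formula below. The worked figures assume the most conservative case, p = 0.5; survey organisations typically also adjust for weighting and design effects, which presumably explains why the reported margin for the subsample is a little larger than this approximation:

$$
\text{ME} \approx z_{0.975}\,\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{935}} \approx 0.032
$$

That is roughly ±3% for the full sample of 935 teens, while the same calculation for the 886 teen internet users gives approximately 0.033, which Pew reports more conservatively as ±4%.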
