Survey Methodology for the 2019 Digital News Report

This study has been commissioned by the Reuters Institute for the Study of Journalism to understand how news is being consumed in a range of countries. Research was conducted by YouGov using an online questionnaire at the end of January/beginning of February 2019.

  • Samples in each country were assembled using nationally representative quotas for age, gender, region, and education.1 The data were also weighted to targets based on census or industry-accepted data.
  • As this survey deals with news consumption, we filtered out anyone who said that they had not consumed any news in the past month, to ensure that irrelevant responses did not adversely affect data quality. Those screened out averaged around 3% of respondents.
  • We should note that online samples will tend to under-represent the consumption habits of people who are not online (typically older, less affluent, and with limited formal education). In this sense it is better to think of results as representative of online populations who use news at least once a month. In a country like Norway this is almost everyone (99%) but in South Africa this is just over 50%.
  • These differences mean we need to be cautious when comparing results between countries. We have marked countries with lower internet penetration or less representative online samples with an asterisk (*) in the table at the end of this section and have been careful in the report not to directly compare these countries on issues where we know that the sample difference would make results invalid (e.g. paying for news).
  • It is also important to note that online surveys rely on recall, which is often imperfect or subject to biases. We have tried to mitigate these risks through careful questionnaire design and testing. On the other hand, surveys can be a good way of capturing fragmented media consumption across platforms (e.g. social media, messaging, apps, and websites), and tracking activities and changes over time.
  • It is important to note that some of our survey-based results will not match industry data, which are often based on very different methodologies, such as web tracking. The accuracy of these approaches can be very high, but they are subject to their own limitations, meaning that such data can also be partial or incomplete. We often look at this data to sense-check our results or to help identify potential problems with our survey data before publication. On occasion we include industry data as supporting evidence, with appropriate attribution.
  • Each year we also commission some qualitative research to support and complement the survey. This year, we worked with Flamingo, an international market research company, to look in detail at the habits and behaviours of younger groups in the United States and United Kingdom. The methodology included tracking actual online behaviour of 20 participants for several weeks, depth interviews, and small group discussions with their friends. Insights and quotes from this research are used to support this year’s Digital News Report but will also form a separate report to be published in September.
  • Along with country-based figures, throughout the report, we also use aggregate figures based on responses from all respondents across all the countries covered. These figures are meant only to indicate overall tendencies and should be treated with caution.
  • Due to a scripting error we needed to repoll respondents for one question in Norway about the use of social networks for news. 1,387 of the original 2,000 sample responded to the recontact request and the results are included on the Norway country page.
  • Download a list of the questions from the online questionnaire.

More Detail on YouGov Research Methods

By David Eastbury, Director (International Research), YouGov

Internet surveys administered through panels have become a feature of the research landscape over the last decade and a half. YouGov pioneered the use of online research in the UK in 2000 and, particularly in its early adoption stages, was pretty much a lone voice in advocating the use of the internet for social research.

YouGov conducts its public opinion surveys online using a method called Active Sampling for the overwhelming majority of its commercial work, including all nationally and regionally representative research. The emphasis is always on the quality of the sample rather than the quantity of respondents.

When using Active Sampling, restrictions are put in place so that only the people contacted are able to participate. All respondents who complete YouGov surveys will have been selected by YouGov from its panel of registered users, and only those selected from this panel are allowed to take part in the survey.

YouGov’s Panel

Over the last fifteen years, YouGov has carefully recruited a panel of over one million British adults, and over six million people worldwide, to take part in its surveys. Panel members are recruited from a host of different sources, including standard advertising and strategic partnerships with a broad range of websites. These people cover a wide range of ages, genders, social grades, ethnicities, and housing tenures. The panels are large enough to enable YouGov to select both nationally representative samples that reflect the actual breakdown of the population on the key demographics (such as age, gender, and region) and targeted specific samples (such as legal service users or cat owners).
When a new panel member is recruited, a host of socio-demographic information is recorded. For nationally representative samples, YouGov draws a sub-sample of the panel that is representative of the population in terms of a number of demographic variables (generally age, gender, and region as a minimum), and invites this sub-sample to complete a survey. To reiterate: with Active Sampling, only this sub-sample has access to the questionnaire, via username and password, and respondents can only ever answer each survey once.
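
As a rough illustration of this kind of quota-based sub-sampling from a panel, the sketch below draws invitations until quotas are filled. The panel, quota figures, and demographic categories are all invented for the example; this is not YouGov’s actual implementation.

```python
import random
from collections import defaultdict

random.seed(0)

# Hypothetical panel records: (panelist_id, gender, region).
# A real panel stores far more socio-demographic detail.
panel = [(i, random.choice(["m", "f"]), random.choice(["north", "south"]))
         for i in range(10_000)]

# Invitation quotas proportional to assumed population shares,
# for a target sample of 1,000 respondents.
quotas = {("m", "north"): 240, ("m", "south"): 240,
          ("f", "north"): 260, ("f", "south"): 260}

invited, filled = [], defaultdict(int)
for pid, gender, region in random.sample(panel, len(panel)):
    cell = (gender, region)
    if filled[cell] < quotas.get(cell, 0):
        invited.append(pid)
        filled[cell] += 1

# Only the panelists in `invited` would receive the survey link,
# and each can respond to a given survey only once.
```

Because invitations stop as soon as each demographic cell is full, the achieved sample matches the quota plan by construction, before any weighting is applied.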

Quality Sampling

Obtaining good-quality samples is a challenge for all methodologies. Response rates for telephone polls, for example, have been declining in recent years – typically to below 10% – and are often much lower in inner-city areas. Extrapolating from the fewer than 10% of telephone respondents that pollsters can reach to the 90% that they cannot is clearly a challenge, leading to concerns over the quality of achieved samples, whether telephone or face-to-face. There are, of course, some areas where an online approach is unsuitable, and YouGov would always alert its clients to this. For example, it would be inappropriate to use an online survey to estimate the incidence of paying income tax online, or to identify service needs on a disadvantaged housing estate.
However, it would be unfair to say that online research is ‘biased’ in a way that offline research is not: there are simply different biases for which each approach has to account. Online research can be used for a wide variety of topics, and given that the Digital News Report survey asks about usage of and attitudes towards digital news, and screens out those who do not access it, online is an appropriate methodology – especially as the main purpose is to track activities and changes over time within the digital space, as well as to understand how offline and online media are used together.

Analysis of the Data

Once the survey is complete, the final data are statistically weighted to the national profile of all adults aged 18+ (including people without internet access) or, in the case of non-representative surveys, to whatever the target sample has been defined as. All reputable research agencies weight data as a fine-tuning measure, and almost all surveys involve weighting, whether they are conducted online, face-to-face, or by telephone. This ensures that the published results properly reflect the population they seek to measure. For example, men comprise 48% of the electorate and women 52%. The raw figures in a well-conducted survey will be close to this, but will not necessarily match these numbers exactly. Suppose the raw figures contain 50% men and 50% women. YouGov’s computer would slightly “downweight” the replies given by the men (so that the replies of 500 men count as if they were 480) and slightly “upweight” the replies given by the women (so that the replies of 500 women count as if they were 520).
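
The arithmetic behind this example is simply the target population share divided by the achieved sample share:

```python
# Post-stratification weights, using the figures from the example above:
# weight for a group = target population share / raw sample share.
target   = {"men": 0.48, "women": 0.52}   # electorate shares
achieved = {"men": 0.50, "women": 0.50}   # shares in the raw sample

weights = {group: target[group] / achieved[group] for group in target}

# Applied to 500 respondents of each gender:
effective = {group: 500 * weights[group] for group in target}
# men:   500 * 0.96 = 480
# women: 500 * 1.04 = 520
```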

In practice, the task is more complex than this, as other demographic variables have to be considered simultaneously alongside gender. This is a task for YouGov’s computer, which adjusts the raw data to take account of all these factors at once. The exact weighting variables vary by country, largely depending on the publicly available figures. In the UK, for example, YouGov weights by age, gender, social class, newspaper readership, and region, as well as, on occasion, level of education, vote at the previous election, vote at the EU referendum, and level of political interest. Targets for the weighted data are derived from a number of sources, including:

1. The census
2. Large-scale random probability surveys, such as the Labour Force Survey, the National Readership Survey, and the British Election Study
3. The results of recent elections
4. Official ONS population estimates

Active Sampling ensures that the right people are invited in the right proportions. In combination with YouGov’s statistical weighting, this ensures that the results are representative of the country as a whole: not just those with internet access, but everyone. While it is true that not everyone has access to the internet, independent academic research shows that its widespread uptake means the views of those with access are now mostly indistinguishable from the views of those without.

Online Approach – Additional Considerations

Interviewer bias and social distance

Another key advantage of online for this survey is the neutrality of the interview mode. Independent research has found that respondents modify their answers in the presence of an interviewer, including when the interviewer is on the other end of the phone. This lack of ‘social distance’ can mean that respondents feel compelled to give a ‘safe’ answer.

Online surveys increase social distance, so respondents are more likely to disclose important and sensitive information. In addition, the mode enables a respondent to give an answer free from embarrassment and, therefore, a truer reflection of their actual feelings.

The influence of questionnaire design on measurement error has received attention in a number of publications. Chang and Krosnick (2010) conducted an experiment in which respondents were randomly assigned either to complete a questionnaire on a computer or to be interviewed orally by an interviewer. They found that respondents assigned to the computer condition showed less non-differentiation and were less susceptible to response-order effects. In other words, the computer-administered surveys were more likely to elicit a truer response.

Pace of interview

Online research is more convenient for respondents; they can fill in the survey in their own time, at their own pace and can formulate more considered answers. The nature of the Digital News Report survey is fairly complex and requires a great deal of time and thought on the respondents’ behalf. Therefore an online approach is ideal for this study.

YouGov Accuracy

YouGov is a leading player in UK media polling and is one of the most quoted research agencies. The only way to demonstrate the accuracy of attitudinal research is to compare predictions with actual outcomes, which is why YouGov has consistently published pre-election polls, even in difficult-to-call contests such as local government elections.

YouGov has a strong history of accurately predicting actual outcomes across a wide range of different subjects, including national and regional elections, political party leadership contests and even the results of ITV talent show The X Factor.

YouGov is also a member of ESOMAR – full details of YouGov’s answers to the ESOMAR 28 questions can be viewed here.

  1. Education quotas were not applied (or not fully applied) in Brazil, Mexico, South Africa, Malaysia, Romania, Bulgaria, Croatia, Greece, and Turkey so these samples will have a higher proportion of highly educated people than the general population.