This study has been commissioned by the Reuters Institute for the Study of Journalism to understand how news is being consumed in a range of countries. Research was conducted by YouGov using an online questionnaire at the end of January/beginning of February 2015.
- The data were weighted to targets based on census or industry-accepted data, such as age, gender, region, newspaper readership, and social grade, to represent the total population of each country. The sample is reflective of the population that has access to the internet.
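The report does not describe the weighting procedure in detail. As an illustration, the sketch below applies rim weighting (raking), a common way of aligning a sample with known marginal targets such as age and gender; the variable names and target shares here are invented for the example, not taken from the survey.

```python
# Illustrative sketch of rim weighting (raking). Weights are adjusted
# iteratively, one demographic dimension at a time, until the weighted
# marginals of the sample match the population targets.

def rake(rows, targets, iterations=50):
    """rows: list of dicts keyed by demographic dimension.
    targets: {dimension: {category: population share}}.
    Returns one weight per row."""
    weights = [1.0] * len(rows)
    total = float(len(rows))
    for _ in range(iterations):
        for dim, shares in targets.items():
            # Current weighted total per category of this dimension
            current = {}
            for row, w in zip(rows, weights):
                current[row[dim]] = current.get(row[dim], 0.0) + w
            # Scale each row's weight so this dimension hits its target
            for i, row in enumerate(rows):
                desired = shares[row[dim]] * total
                weights[i] *= desired / current[row[dim]]
    return weights

# Toy sample that over-represents younger women
sample = [
    {"age": "18-34", "gender": "F"},
    {"age": "18-34", "gender": "F"},
    {"age": "35+",   "gender": "M"},
    {"age": "35+",   "gender": "F"},
]
targets = {
    "age":    {"18-34": 0.30, "35+": 0.70},
    "gender": {"F": 0.51, "M": 0.49},
}
w = rake(sample, targets)
# After raking, the weighted share of 18-34s is 30% and of women 51%.
```

In practice a weighting scheme like the one described above would use more dimensions (region, social grade, newspaper readership), but the iterative logic is the same.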
- As this survey deals with news consumption, we filtered out anyone who said that they had not consumed any news in the past month, in order to ensure that irrelevant responses didn’t adversely affect data quality. This category averaged around 5% but was as high as 11% in the US.
- A comprehensive online questionnaire was designed to capture all aspects of news consumption.
- Core questions were asked in France, Germany, Denmark, Finland, Spain, Italy, Japan, Brazil, Australia, Ireland, and the US, as well as in the UK, where a slightly longer questionnaire was used.
- Online focus groups were also held in the UK and US to complement survey data on sponsored and branded content.
| Country | Starting sample | Non-news users | Final sample size | Total population | Internet penetration |
|---|---|---|---|---|---|
Source: Internet World Stats (http://www.internetworldstats.com), internet population estimates, 2014.
This is an online survey – and as such the results will under-represent the consumption habits of people who are not online (typically older, less affluent, and with limited formal education). Where relevant, we have tried to make this clear within the text. The main purpose, however, is to track the activities and changes over time within the digital space – as well as gaining understanding about how offline media and online media are used together.
Along with country-based figures, throughout the report we also use aggregate figures based on responses from all respondents across all the countries covered. These figures are meant only to indicate overall tendencies and should be treated with caution.
YouGov’s online panel based research methods
David Eastbury, Associate Director, International Omnibus at YouGov, explains the methodology behind online panels and non-probability sampling techniques.
Internet surveys administered through panels have become a feature of the research landscape over the last decade. YouGov pioneered the use of online research in the UK and was for some time a lone voice in advocating the use of the internet for social research.
Online is not a suitable method for all research objectives. For example, an online survey would be inappropriate for estimating the incidence of paying income tax online, or for identifying service needs on a disadvantaged housing estate. However, online research can be used for a wide variety of topics. Given that the Digital News Survey asks about usage of and attitudes towards digital news, and screens out those who do not access it, online is an appropriate methodology.
There are a number of technical and survey-related issues, as well as advantages, that should be discussed. These include:
- How online panels such as YouGov's operate, and how they compare with more traditional surveys
- Response rates
- Interviewer bias
- Social distance
- Pace of interview
YouGov has a panel of over 400,000 adults in the UK who have signed up to undertake research. These people cover a wide range of ages, genders, social grades, ethnicities, and tenures. The panel is large enough to enable us to select both nationally representative samples, reflecting the actual breakdown of the population on the key demographics of age, gender, region, social grade, ethnicity, and newspaper readership, and specific samples such as legal service users. The panel sizes for some of our other markets are listed below (2014 figures).
By definition a person can only be a member of the panel if they have internet access. People without internet access have a zero probability of taking part. It is possible, therefore, that there could be a bias against those groups who do not have internet access.
Statistics from Ofcom show that 84 per cent of people in the UK have access to the internet with the vast majority having broadband access. It is important to compare the characteristics of people who have the internet with those who do not so that any possible coverage bias can be identified.
Research by Oxford University shows that 46 per cent of DE social grades are online compared with 88 per cent of AB social grades. In relation to age, 40 per cent of people aged 65-74 are online compared with 81 per cent of 18-24 year olds. For the 75 plus group only 20 per cent are online but this group is hard to reach regardless of research method. The pattern is similar with regard to employment status with 48 per cent of unemployed people being online compared with 81 per cent of employed people.
The key issue from the above discussion is the need for a panel that has a sufficient coverage of members drawn from lower prevalence groups.
The table below shows the number of YouGov panel members drawn from these low-prevalence groups; the panel contains a large number of people from each.
| Group | % of the population falling into each group | Number needed for nationally representative survey | Number on YouGov panel |
|---|---|---|---|
| Social Grade DE | 28% | 560 | 34,000 |
| Aged 75 plus | 9% | 180 | 4,400 |
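The "number needed" column is simply each group's population share multiplied by the overall sample size. The figures shown (560 and 180) imply a target sample of about 2,000, which is an inference from the table rather than something stated in the text:

```python
# Required subgroup counts for a nationally representative sample.
# SAMPLE_SIZE = 2000 is inferred from the table (560 / 0.28 = 180 / 0.09 = 2000).

SAMPLE_SIZE = 2000

groups = {"Social Grade DE": 0.28, "Aged 75 plus": 0.09}

needed = {name: round(share * SAMPLE_SIZE) for name, share in groups.items()}
print(needed)  # {'Social Grade DE': 560, 'Aged 75 plus': 180}
```

The point of the table is that the panel holds far more members of each group (34,000 and 4,400) than the few hundred a representative sample requires.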
Even if a panel has a sufficient number of people from low-incidence groups, a second key question remains: are people who join online panels different from those who do not, even when their demographic group is the same?
Our own research and that across a range of panels in different countries suggests that online panel members can have some differences from members of the general population. For example, relative to the general population, online panels can contain disproportionately more voters, more highly educated people, heavier Internet users, and be more involved in the community or political issues.
People who are willing to participate in surveys may have higher cognitive capabilities. If respondents join a panel or participate in a survey based on their cognitive capabilities or needs then it can lead to differences in results compared with samples selected independent of cognitive capabilities or needs. For example, in a self-administered survey, people are required to read and understand the questions and responses.
However, none of these factors seem to have affected the research conducted on the YouGov panel. This is due to the way in which we use purposive sampling to select an achieved sample that reflects the key demographics (age, gender, social class) of the population. We also control for newspaper readership which is closely correlated with educational achievement (higher educated people are more likely, on average, to read a broadsheet rather than a tabloid newspaper).
The accuracy of our research and the absence of panel bias have been independently verified by researchers from the University of Essex. As part of the British Election Study 2005/6, the researchers compared a sample of 4,000 respondents drawn from our panel with a similar-sized face-to-face sample drawn using random probability sampling. The study related to voting patterns and was used to model the outcome; the accuracy of both surveys was then evaluated against the actual election result (see Sanders et al. 2007, referenced below).
This study showed that the marginal distributions on key variables in models of voting behaviour differed only slightly between the two surveys. The authors concluded that YouGov’s internet sample appeared to be slightly less ‘left-leaning’ than the probability sample. The researchers identified that this was likely to be due to people being more honest in internet surveys because of the lack of the presence of an interviewer.
More importantly, it was concluded that the relative explanatory power of predictive models derived from the online survey and the face-to-face survey was exactly the same. In the authors' words, the online model, when compared with the face-to-face model, 'yielded impressive similarities'. These findings are important because they provide independent verification of the explanatory power of the YouGov approach to online research.
A key issue with any survey is the response rate as low response rates can lead to bias in the survey – this happens when the people who do not respond to a survey are materially different from those who do. The consequence of this is that the survey cannot be said to be representative of the population due to it being biased towards one section of the community.
Response rates to online surveys have overtaken those of telephone interviewing, especially among working adults (particularly ABC1s aged 25-55) who have less time to take part in research. YouGov surveys typically achieve 40 per cent response rates and often rates of over 60 per cent. This is much better than can be achieved by telephone using random digit dialling: response rates for telephone polls have been declining in recent years, to typically around 20 per cent, and are often much lower in inner-city areas. The need to extrapolate from the 20 per cent that telephone pollsters can reach to the 80 per cent they cannot is clearly a challenge, leading to concerns over the quality of achieved samples, whether telephone or face-to-face.
Another issue to consider is that of incentivisation. Our panel members receive an incentive for taking part in the survey. The amount varies depending upon length, but is commonly 50p per survey. It is only a small incentive but it is important in showing our appreciation for the time people have taken to fill out the survey. This appreciation in turn increases the response rate.
Interviewer bias and social distance
Another key advantage of online for this survey is the neutrality of the interview mode. Independent research has found that respondents modify their answers in the presence of an interviewer, including when the interviewer is on the other end of the phone. This lack of ‘social distance’ can mean that respondents feel compelled to give a ‘safe’ answer.
Online surveys increase social distance so respondents are more likely to disclose important and sensitive information. In addition it enables a respondent to give an answer free from embarrassment and, therefore, a truer reflection of their actual feelings.
The influence of questionnaire design on measurement error has received attention in a number of publications. Chang and Krosnick (2010) conducted an experiment, randomly assigning respondents to complete a questionnaire either on a computer or to be interviewed orally by an interviewer. They found that respondents assigned to the computer condition manifested less non-differentiation and were less susceptible to response order effects. In other words the computer surveys were more likely to extract a truer response.
YouGov dominates Britain’s media polling and is one of the most quoted research agencies in Britain. Its well-documented and published track record demonstrates the accuracy of its survey methods and quality of its client service work. YouGov’s unique methodology enables us to create representative samples through the Internet.
It is a methodology of demonstrable superiority in terms of accuracy, frankness, and depth of response, as well as speed and cost-effectiveness. The only way to demonstrate the accuracy of attitudinal research is to compare predictions with actual outcomes. This is why YouGov have consistently published pre-election polls, even in difficult-to-call contests such as local government elections. Our record of accuracy in opinion polling is unsurpassed in the UK. For example, in the recent European elections, we were the closest of the pollsters in terms of individual percentages and the only one to correctly predict the order of the top five parties. In addition to election polls, YouGov have also correctly predicted outcomes such as The X Factor and Pop Idol, as shown in the figure above.
YouGov's accuracy extends outside the UK as well, including the US Presidential Election (2008), the Election of the State Parliament of Hesse (2009), and the Danish National Election (2011). In 2013, YouGov's prediction of global quarterly sales for the Apple iPhone was accurate to within 0.05% of the actual sales results.
Even though the Finnish panel is relatively small, we and our customers believe it produces very accurate, valid, and reliable results. Below are a few recent examples from nationally representative surveys that show the reliability of the data:
- The share of daily smokers: YouGov survey 20%; Statistics Finland 21%
- Market share of insurance companies and banks: matches exactly the shares reported by the Federation of Finnish Financial Services
- The share of households with children: YouGov survey 39%; Statistics Finland 40%
Pace of interview
Online research is more convenient for respondents: they can fill in the survey in their own time, at their own pace, and can formulate more considered answers. Since the survey is quite complex and requires a great deal of time and thought on the respondent's part, an online approach is ideal for this study.
Academic Journals on Online Research
The following academic papers have looked at the validity of Online Research:
Sanders, David, Harold D. Clarke, Marianne C. Stewart, and Paul Whiteley. 2007. 'Does Mode Matter for Modelling Political Choice? Evidence from the 2005 British Election Study.' Political Analysis 15: 257-85. http://www.bes2009-10.org/papers/DoesModeMatter.pdf
Lindhjem, Henrik, and Ståle Navrud. 2008. 'Internet CV Surveys.' MPRA Paper #11471. http://mpra.ub.uni-muenchen.de/11471/1/MPRA_paper_11471.pdf
Braunsberger, Karin, Hans Wybenga, and Roger Gates. 2007. 'A Comparison of Reliability between Telephone and Web-Based Surveys.' Journal of Business Research 60: 758-764.
Twyman, Joe. 2008. 'Getting it Right: YouGov and Online Survey Research.' http://www.tandfonline.com/doi/abs/10.1080/17457280802305169#.UezuZ9LqkS4
Chang, LinChiat, and Jon A. Krosnick. 2010. 'Comparing Oral Interviewing with Self-Administered Computerized Questionnaires: An Experiment.' Public Opinion Quarterly. http://poq.oxfordjournals.org/content/early/2010/02/12/poq.nfp090.abstract
Academic studies using YouGov panels
We have provided a number of notable examples of online panel based research projects, many of which were ESRC funded. All of them use the YouGov panel with nationally representative samples as would be applied for this project.
The British Election Study (BES) has been conducted at every General Election since 1964. For the 2001, 2005, and 2010 General Elections, YouGov conducted the internet components of the study. In 2010 the internet component comprised a pre-election survey covering a host of political and social issues, with a nationally representative sample of 16,816 GB adults taking part. All respondents were then invited to a rolling campaign survey, and 14,622 took part in a daily survey during the campaign. In nearly ten years of working on the various incarnations of the BES, YouGov has gained unrivalled experience in using the internet to conduct fast, cost-effective, accurate, nationally representative surveys on a host of political and socio-political subjects. Once again in 2010, our standard methodology for achieving nationally representative results was employed to help ensure accuracy of data and maintain continuity with comparable results.
A project within the ESRC Public Services Programme: Exit and Voice as a Means of Enhancing Service Delivery. In partnership with Professor Keith Dowding at the London School of Economics, YouGov has established a longitudinal panel of 4,000 UK adults. Over a five-year period, regular online surveys examine citizens' satisfaction with the services they receive, the 'exit' options they consider - e.g. moving house, using private services, shifting public service providers - and the 'voice' options they adopt to try to improve the services they receive.
British Cooperative Campaign Analysis Project (B-CCAP) – Working with the University of Oxford (Department of Politics and International Relations) and others. A multi-wave panel study of approximately 5,000 – 10,000 respondents to understand how formal and informal campaigns reach citizens, how citizens consume and react to them and what role campaigns play in the presence of strong structural forces such as the economy and party identification. YouGov is conducting similar projects in the US (with Stanford) and Germany.
The Structure, Causes, and Consequences of Foreign Policy Attitudes: A Cross-National Analysis of Representative Democracies. This research project seeks to better understand the nature and consequences of the foreign policy attitudes of individuals from six advanced democracies (United Kingdom, Canada, Italy, France, Germany and the US).
The Welsh Referendum Study (WRS) was an independent, academic study of the March 2011 devolution referendum in Wales. The study was designed around a two-wave panel survey of a representative sample of the Welsh electorate. Both waves of the survey were conducted online.
The Scottish Election Study (2011) – The 2011 Scottish Election Study, funded by the Economic and Social Research Council is based in the Department of Government at the University of Strathclyde. The Scottish Election Study takes the form of a three-wave internet panel survey of around 2,000 participants, with data collection undertaken by YouGov.