A recent article by Wiley colleagues Bob Campbell (Senior Publisher), Alice Meadows (Director, Society Relations), and Keith Webster (VP and Director, Academic Relations), published open access in the July 2012 issue of Learned Publishing, reviews a number of surveys and other studies on access to research journals and/or data. It identifies common themes, addresses some seemingly contradictory findings, and suggests best practices for future surveys of this kind. This is especially important given that such findings are increasingly used to inform policy decisions that may have a major impact on some or all of the key stakeholders in scholarly communication.
The studies and surveys reviewed were:
1) "Online survey on scientific information in the digital age": a 2011 EC survey, open to all, which generated 1,159 responses from 42 countries, mainly from individuals.
2) "Public Consultation on the European Research Area (ERA) Framework: Preliminary Report": another 2011 EC survey open to all, which received 691 responses, mostly online and again mostly from individuals.
3) "Access to scholarly content: gaps and barriers": a 2011 CIBER report based on a survey of 20,000 individuals, 13.2% of whom responded; 42% of respondents were from universities or colleges, 28% from industry or commerce, and the rest from hospitals or research institutes.
4) A Go8 (Group of Eight) survey of Australian university researchers, conducted by Outsell, which asked respondents to rate the value for money of the information resources made available to them by their universities.
5) US Office of Science and Technology Policy Requests for Information on Public Access to Scholarly Journals and to Digital Data: these 2011 RFIs generated 377 and 118 responses respectively, mostly from universities and libraries, as well as societies.
Although each survey had a somewhat different focus, several common themes emerged, in particular around access to content and/or data and Open Access (OA); yet there are also a number of inconsistencies in the surveys' findings.
Access to journal content
Responses to questions on this issue ranged widely. In the EC survey on scientific information in the digital age, almost 84% disagreed or disagreed strongly with the statement that "there is no access problem to scientific publications in Europe", while the CIBER survey found that 93% of respondents in universities and colleges believe research papers are easy or fairly easy to access. Conversely, though, when CIBER asked which of a range of resources respondents would most like to see improved access to, a large majority identified journal articles as their first choice; as the CIBER study authors point out, "For many researchers, easy access to most of the journal literature is not good enough."
Barriers to access
It is clear from the EC survey on scientific information, the OSTP RFI on access to scholarly journals, and the CIBER survey that the high price of journal subscriptions and shrinking library budgets are widely seen as the most important barriers to access, although they were by no means the only barriers noted. However, when the Go8 survey asked Australian researchers, after informing them of the cost, how they would rate the value for money represented by the range, depth, and ease of access of the information resources made available to them by their university, an overwhelming 92% answered "Excellent", "Very Good", or "Good".
Support for Open Access
In the EC survey on scientific information, 90% of respondents supported the idea that "publications resulting from publicly funded research should, as a matter of principle, be available through open access means". Likewise, in the ERA survey, the section on OA spurred high interest among all: out of the total of 590 responses, 69% of respondents replied on average to the questions related to Open Access, and 62% considered Open Access one of the most important gaps to be filled to achieve ERA. In the OSTP RFI on access to scholarly journals, most academics and librarians also saw a strong need for government mandates and centralized repositories. Almost everyone who responded to the RFI felt the government had some basic right to the research information generated from public funds; the real dividing line was whether or not the published Version of Record had to be submitted to and curated by the government.
Access to data
Two of the surveys, the EC survey on scientific literature and data and the OSTP RFI on public access to data, specifically addressed the challenges of access to data. In the EC survey there was strong support (90%) for the principle that research data that is publicly available and results from public funding should be available for re-use, free of charge, on the internet. In the US, almost all comments recognized the inherent value of sharing and curating data sets and of making these more available, but a number of concerns were expressed about how to do this.
Why such differences?
Although there are common themes in the findings of these surveys, there are also some differences. For example, although the two EC surveys appear to indicate that there are significant problems with access to scholarly publications in Europe, the UK-based CIBER study and Australian-based Outsell survey tell a different story, with the vast majority of researchers and faculty seeing few or no problems with access. One explanation might be that, while most researchers know that almost all journal articles are now available in digital form, and therefore express high levels of satisfaction with the provision of scholarly literature, this same expectation can lead to strong complaints when digital access becomes difficult. Another contributing factor may be respondents' differing levels of understanding of, and exposure to, scholarly content: hence the high level of satisfaction among respondents to the CIBER and Go8 studies, both of which focused primarily on researchers, faculty, and authors, who arguably have the greatest interest in and need for access to scholarly publications.
Likewise, although there appears to be a large degree of consensus about the desirability of OA across several of the surveys, there are differences in the level of support, especially for Green OA, with support for OA strongest by far in the EC surveys.
How useful are the results?
These studies and surveys are something of a mixed bag. First, the groups represented in the surveys vary. In the case of the EC and OSTP studies there is no frame or listing of units from which a sample was selected; anyone was free to respond, including groups and organizations as well as individuals. Conversely, the Go8 and CIBER surveys have a clear focus on academics and researchers and a clear sampling frame of individuals being surveyed.
Second, the methodology of most of the surveys is open to criticism. For example, although the CIBER survey was carefully constructed, some will dismiss its findings on the grounds that the names were provided by a commercial publisher; no such criticism can be made of the Go8 survey, however, which produced almost the same result. Meanwhile, neither the EC surveys nor the OSTP RFIs made any attempt to create a sampling frame, so standard measures of quality such as response rates and precision cannot be evaluated. In addition, the wording of at least some of the items in the EC surveys appears to be leading.
Finally, the way in which the surveys allowed stakeholders to identify themselves may also be significant: for example, neither of the EC surveys differentiated learned societies as a key stakeholder group. As a result, the views of learned societies may be less well represented than they should be.
Given the increasing involvement of governments around the world in scholarly communications policy, and their increasing use of studies and surveys to support changes in legislation, surely it is of the utmost importance that these studies and surveys properly reflect the views of those most likely to be affected by any changes. The poor methodology (including loaded questions) of some of these surveys suggests a motivation to achieve a particular result rather than to gather sound evidence for informing policy. Our recommendation is that governments and others involved in public policy for scholarly communications should hold themselves to the same high standards for data collection and analysis as those used by the scholarly and scientific community itself.
The Learned Publishing article on which this summary was based can be found here.