Nearly a decade ago, headlines highlighted a disturbing trend in science: the number of articles retracted by journals had increased 10-fold over the previous 10 years. Fraud accounted for some 60% of those retractions. Although statistics were sketchy, retractions appeared to be relatively rare, involving only about two of every 10,000 papers. Sometimes the reason for a withdrawal was honest error, not deliberate fraud. And it wasn't clear whether suspect papers were becoming more common or journals were simply getting better at recognizing and reporting them. Still, the surge in retractions led many observers to call on publishers, editors, and other gatekeepers to make greater efforts to stamp out bad science.
Scientific journal articles are not the same as news stories in magazines or newspapers. Journals exist to publish scientific papers, and they are typically peer-reviewed: when a journal receives a paper from a scientist, it sends the paper to other scientists and asks whether it is worth publishing. Traditionally, no money changes hands between the scientists and the journal; instead, the journal charges readers. This model solidified in the days when journals were printed on paper and libraries bought subscriptions to the physical copies. Now that everything is digital, libraries pay subscription fees to provide online access to their patrons.
Pressures on the first two fronts are forcing journals to stay relevant in newer ways. A big source of such pressure is the availability of preprints: manuscripts made publicly available by their authors before they have been peer-reviewed. Preprint repositories like arXiv and bioRxiv have risen in prominence over the last few years, especially the former. They are run by groups of scientists, like volunteers pruning the garden of Wikipedia, who ensure that formatting and publishing requirements are met, remove questionable manuscripts, and generally, as they say, keep things going. Scientific journals typically justify their access costs by pointing to what they must spend on peer review and printing. Preprints sidestep those costs because they are free to access online and are not peer-reviewed the way 'published' papers are; readers who consult a preprint must bear that caveat in mind.
While scientific publishing is one of the most lucrative industries worldwide, the current scholarly publication process has several problems that affect the research community: high publication costs, copyrights held by publishers instead of authors, biased publication and peer review processes, a lack of rewards and recognition for reviewers, and a proliferation of low-quality journals. Proponents argue that blockchain could eliminate these market inefficiencies and improve the quality and effectiveness of scientific publishing. Through the utilisation of smart contracts, decentralised storage solutions, big data analytics, and cloud computing, blockchain technology could be harnessed to accelerate the peer review process and bring greater transparency and efficiency to the sector.
College costs are getting too high, and benefits are falling relative to costs. Higher education is a labour-intensive enterprise, so cost savings have to come from lowering labour expenses. While a huge hunk of that involves curtailing the burgeoning administrative bloat, a healthier balance of teaching and research responsibilities is probably in order. A majority of academic journal articles are rarely read, and many are never cited. Some academic journals have circulations of well under 1,000, with many copies sitting unread on library shelves.
Following the Finch Report in 2012, Universities UK established an Open Access Coordination Group to support the transition to open access (OA) for articles in scholarly journals. The Group commissioned an initial report published in 2015 to gather evidence on key features of that transition. This second report aims to build on those findings, and to examine trends over the period since the major funders of research in the UK established new policies to promote OA.
A white paper by INSPEC, 'Cookies, fake news and single search boxes: the role of A&I services in a changing research landscape', examines the growing importance of abstracting and indexing (A&I) databases in an open web landscape increasingly dominated by advertising and irrelevant results. Librarians and researchers share their thoughts on how they use search tools for academic research and highlight the differences between curated resources and general search engines. The contrast between these search results demonstrates why A&I services have an important role to play in contemporary research.
It is frequently claimed that open access (OA) has the potential to increase usage and citations. This report substantiates such claims for books in particular, through benchmarking the performance of Springer Nature books made OA through the immediate (gold) route against that of equivalent non-OA books. The report includes findings from both quantitative analysis of internal book data (chapter downloads, citations and online mentions) and external interviews conducted with authors and funders. This enables the comparison of actual performance with perceptions of performance for OA books.
This report is the outcome of research commissioned and funded by the four presses. It engages with usage data made available by JSTOR relating to OA books in order to help publishers understand how their OA content is being used; to inform strategic decision-making by the individual presses; and to shed light on how usage data can support the potential of open access books to reach wide audiences. The research also aims to inform JSTOR's development of the JSTOR OA Books platform and of its usage reporting, ensuring that the reporting reflects the needs of OA publishers. All four publishers have contributed to a discussion of the role and practicalities of the usage reporting services provided by JSTOR.
This report examines how peer review can be improved for future generations of academics and offers key recommendations to the academic community. It is based on the lively and progressive sessions at the SpotOn London conference held at the Wellcome Collection conference centre in November 2016. It includes reflections on the history of peer review and on current issues such as sustainability and ethics, while also looking ahead to advances such as preprint servers and AI applications. The contributions cover the perspectives of researchers, librarians, publishers and others.
What can sales data tell us about e-book adoption and digital reading habits? In this presentation Len Vlahos, Executive Director of the Book Industry Study Group (BISG), takes a close look at book industry statistics from the publisher's perspective, identifying trends related to global e-book adoption, and answering questions about where digital reading is going, to help publishers and libraries prepare for the future.
At the annual Project Muse Publishers Meeting held in Baltimore, Todd Carpenter, Executive Director of NISO (National Information Standards Organization), shared a presentation with attendees about his organization and the projects and initiatives it is currently working on. Following what NISO is up to is a useful (and interesting) way to monitor current and emerging trends and technology, as well as to see how existing standards are being adapted to the changing landscape.
The survey is a follow-up to Wiley's 2012 open access author survey and is the second such survey conducted by Wiley. The 2012 and 2013 surveys were consistent in showing authors' desire to publish in a high-quality, respected journal with a good Impact Factor, but the 2013 survey also shed light on differences between early career researchers (respondents aged 26 to 44 with less than 15 years of research experience) and more established colleagues in their opinions on quality and licenses. Differences were also seen across funding bodies and in the funding available for open access to different author groups.
Dr. Charles Kurzman, Professor of Sociology, University of North Carolina, Chapel Hill, presented "Shifts in Scholarly Communications Among World Regions" at the OCLC Research Briefing at UNC Chapel Hill on June 7, 2013. At this event, Dr. Kurzman presented his research on changing academic attention to world regions over the past 50 years, with "attention" measured by analyzing works published about each region of the world and collected in U.S. academic libraries, for each year of publication since 1958. The patterns that emerge from this research will help inform social scientists and educational policymakers about trends and possible gaps in scholarly attention to different regions of the world.
These 201 slides, from a pre-conference tutorial titled 'Introduction to Linked Open Data (LOD)', were presented on September 2, 2013 at Dublin Core 2013 (DC-2013) in Lisbon, Portugal. The instructor was Ivan Herman, Semantic Web Activity Lead at the World Wide Web Consortium (W3C). The goal of the tutorial was to introduce the audience to the basics of the technologies used for Linked Data, including RDF, RDFS, the main elements of SPARQL, SKOS, and OWL. Some general guidelines on publishing data as Linked Data were also provided, along with real-life usage examples of the various technologies.
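To give a flavour of the kind of material such a tutorial covers, here is a minimal sketch of the core Linked Data idea: data expressed as RDF triples (shown here in Turtle serialisation) and retrieved with a SPARQL query. The namespace and property names below are invented purely for illustration, not taken from the tutorial itself.

```sparql
# Hypothetical RDF data, Turtle serialisation (shown as comments):
#   @prefix ex: <http://example.org/> .
#   ex:dc2013 ex:location "Lisbon" ;
#             ex:year    2013 .

# SPARQL query over that data: find events held in Lisbon and their year
PREFIX ex: <http://example.org/>
SELECT ?event ?year
WHERE {
  ?event ex:location "Lisbon" ;
         ex:year     ?year .
}
```

Run against the data above, the query would match the single triple pattern pair and bind `?event` to `ex:dc2013` and `?year` to `2013`; the same pattern-matching machinery scales to querying large, interlinked public datasets, which is the point of the LOD approach.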