The books and publishing ecosystem has undergone a transformation owing to the advent of e-books as an effective substitute for physical formats, making literature more accessible and portable. E-books have found favour amongst the younger generation, which can be attributed to their versatility, and the corporate arena, alongside the educational system, is also taking to the new digital format. This exponential growth has transpired due to factors accompanying the digital boom: low production costs, nominal investment, and reduced storage costs, while reaching a larger reader base in an efficient manner. Publications are reinventing their methods since an increasing number of their readers have been opting to go online to consume books. This shift has motivated traditional publishing houses to make their foray into digital for visibility and profits.
Foundations have been throwing their weight around for a while in the fight for open access to research, a contentious issue in academia in which proponents seek to break down steep paywalls and stodgy practices of journals. A lot of the philanthropic participation has emerged in the form of traditional grantmaking toward projects like online platforms to exchange information, convenings on open science, and even an entire philanthropy-backed, open access journal. But another way that funders, especially the big ones, wield influence on the issue is by setting ground rules on openness when it comes to the work done on their dime. One of the biggest actors in this space is the Gates Foundation, and its flexing to get journals to open things up just got real.
Peer review, in which papers written by scientists and submitted to scholarly journals are reviewed by two or three other experts in the field before being published (or rejected), is the gold standard for evaluating science. However, its effectiveness in rapidly disseminating good science and weeding out bad has recently come under scrutiny. Gender, institutional affiliation and other human fallibilities introduce unwelcome biases into the peer-review process, raising uncomfortable questions about the very foundation of scholarly communication. In light of these concerns, where does scholarly communication - especially in the life and biomedical sciences - stand today? In the pre-internet era, journal-mediated peer review, however limited, was certainly better than nothing. Print space was (and is) at a premium, and journals had to find ways to be selective about what gets shown to the world. Peer review is a way to ensure that they publish only what they would like to publish and so establish their reputation as purveyors of gourmet science.
The publishing industry has been watching from the sidelines for the better part of a year, waiting to see what will happen with the IDPF's proposed merger with the W3C. The International Digital Publishing Forum called for a member vote on a proposed merger with the World Wide Web Consortium back at the 2016 BookExpo/IDPF event. That proposal was immediately met with opposition from a number of publishing entities, most notably Steve Potash of OverDrive. The IDPF countered that the vote was not a binding approval of the merger but more of an interest survey. Later, the IDPF announced that the vote was in favor of a merger and that it would therefore move forward with the plan. Again, OverDrive strenuously voiced its concerns, namely that the flagship of digital publishing - the ePub standard - would no longer be in the hands of the publishing industry but would instead fall under an organization that handles a wide range of web-related standards.
Platforms are fond of selling publishers on their reach, but they don't always deliver. In April, a batch of small publishers migrated to Medium in the hopes that the platform's network effect would increase their reach. But seven months after the move, comScore and Alexa data show that several of these publishers have seen their traffic decline. Of the 16 largest publishers on Medium that have existed for at least a year, nine (56 percent) have seen their Alexa rank plummet, four (25 percent) have seen their rank improve and three (19 percent) have seen their rank fluctuate with no discernible pattern. A source familiar with Medium publishers' traffic said that third-party providers do not typically account for app traffic, so Medium's reach is likely greater than third-party data indicates. comScore's data, however, does take app traffic into account. Either way, even with those caveats in mind, third-party data does not paint a pretty picture for the publishing platform.
Publishers wanting to develop long-term content strategies to increase the value of their scholarly book programs must consider chapter-level metadata, particularly abstracts, to stay competitive. The long-term benefits of investing in abstracts for the backlist and building production workflows into new releases are supported by publishers, aggregators, librarians, and researchers alike. As technology rapidly changes, and publishers face increased pressure to grow revenue, abstracts are a clear opportunity for publishers to meet these demands. This white paper examines the return on investment publishers in humanities and social science fields may gain by adding chapter-level abstracts and curated keywords to their metadata.
This report, commissioned by BSN 4 and BSN 7, is concerned with new approaches to the editing and publication of open access journals. The transition to open access has accelerated in recent years. Several countries have established a legal framework to secure the depositing of articles in open archives (in France, a provision of this type is included in the Digital Bill). In May 2016, the Council of the European Union called for open access to be made a "default option" in all Member States by 2020.
A study has developed scenarios for transitioning Switzerland's scientific publication system towards Open Access (OA). It recommends a model that proposes a pragmatic and flexible way of making publicly funded research freely available at no charge and with no delay. The study was initiated by the Swiss National Science Foundation (SNSF) in collaboration with the funding programme 'Scientific Information' (SUC P-2) run by swissuniversities. In 2015, the libraries at Switzerland's higher education institutions paid a total of 70 million Swiss francs in licences and subscriptions to publishing houses in order to make more than 2.5 million scientific articles available. Researchers spent a further 6 million Swiss francs on article processing charges so they could have their results published in open access mode in a scientific journal. These figures were generated by an initial analysis of financial flows in the Swiss higher education system.
Open Access to research is a public benefit which enhances transparency, scientific integrity and rigour, stimulates innovation, promotes public engagement, and improves efficiency in research. The UK is widely recognised as being the leading nation in the Open Access and Open Data movements. This is both underpinned by, and underpins, the UK's position as second only to the USA as a leading research power. This document presents the background, evidence base and details of advice from Professor Adam Tickell, Provost and Vice-Principal, University of Birmingham and Chair of the Universities UK Open Access Coordination Group, to the Minister for Universities and Science, Jo Johnson MP, following his letter of request dated 22 July 2015. This paper does not cover Open Access monographs, other than to note that the UUK OA Coordination Group will convene a working group to make progress and further recommendations.
The authors found that there is a spectrum of discussion in the information studies literature: at one end, accidental discovery of unknown information is seen as a fundamental method of scholarly information seeking (Cooksey, 2004); at the other end, chance information encounters are rejected as having any useful role to play in academic practice (Gup, 1998). The purpose of this paper is not to take a position on that debate but to share some of what SAGE has learned about the dynamics of unplanned discovery and how information professionals can encourage this type of unplanned discovery to drive better research outcomes.
What can sales data tell us about e-book adoption and digital reading habits? In this presentation Len Vlahos, Executive Director of the Book Industry Study Group (BISG), takes a close look at book industry statistics from the publisher's perspective, identifying trends related to global e-book adoption, and answering questions about where digital reading is going, to help publishers and libraries prepare for the future.
At the annual Project Muse Publishers Meeting held in Baltimore, Todd Carpenter, Executive Director of NISO (National Information Standards Organization), shared a presentation with attendees about his organization and the projects and initiatives it is currently working on. Following what NISO is up to is a useful (and interesting) way to monitor current and emerging trends and technology, as well as to see how existing standards are being adapted for the changing landscape.
The survey is a follow-up to Wiley's 2012 open access author survey and is the second such survey conducted by Wiley. Consistencies were seen between the 2012 and 2013 surveys in authors' desire to publish in a high-quality, respected journal with a good Impact Factor, but the survey also shed light on differences between early career researchers (respondents aged 26-44 with less than 15 years of research experience) and more established colleagues in their opinions on quality and licenses. Differences were also seen across funding bodies and in the funding available for open access to different author groups.
Dr. Charles Kurzman, Professor of Sociology, University of North Carolina, Chapel Hill, presented "Shifts in Scholarly Communications Among World Regions" at the OCLC Research Briefing at UNC Chapel Hill on June 7, 2013. At this event, Dr. Kurzman presented his research on changing academic attention to world regions over the past 50 years, with "attention" measured by analyzing works published about each region of the world and collected in U.S. academic libraries for each year of publication since 1958. The patterns that emerge from this research will help to inform social scientists and educational policymakers about trends and possible gaps in scholarly attention to different regions of the world.
These 201 slides, from a pre-conference tutorial titled 'Introduction to Linked Open Data (LOD)', were presented on September 2, 2013 at Dublin Core 2013 (DC-2013) in Lisbon, Portugal. The instructor was Ivan Herman, Semantic Web Activity Lead at the World Wide Web Consortium (W3C). The goal of the tutorial is to introduce the audience to the basics of the technologies used for Linked Data, including RDF, RDFS, the main elements of SPARQL, SKOS, and OWL. Some general guidelines on publishing data as Linked Data are also provided, along with real-life usage examples of the various technologies.
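At the heart of the technologies the tutorial covers is RDF's data model: statements expressed as subject-predicate-object triples, queried by pattern matching (which is what a SPARQL basic graph pattern does). The following is a minimal Python sketch of that idea; the URIs and resource names are illustrative examples, not taken from the slides.

```python
# A tiny "graph" of RDF-style triples: (subject, predicate, object).
# Subjects and predicates are URIs; objects are URIs or literal strings.
# All identifiers below are made up for illustration.
triples = {
    ("http://example.org/dc2013-tutorial",
     "http://purl.org/dc/terms/title",
     "Introduction to Linked Open Data"),
    ("http://example.org/dc2013-tutorial",
     "http://purl.org/dc/terms/creator",
     "http://example.org/ivan-herman"),
    ("http://example.org/ivan-herman",
     "http://xmlns.com/foaf/0.1/name",
     "Ivan Herman"),
}

def match(s=None, p=None, o=None):
    """Return triples matching the given pattern; None is a wildcard,
    playing the role of a variable in a SPARQL basic graph pattern."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Who created the tutorial resource? Follow the creator link to a name -
# traversing links between resources is the "linked" in Linked Data.
for _, _, creator in match(s="http://example.org/dc2013-tutorial",
                           p="http://purl.org/dc/terms/creator"):
    for _, _, name in match(s=creator,
                            p="http://xmlns.com/foaf/0.1/name"):
        print(name)  # prints "Ivan Herman"
```

In practice one would use an RDF library and a real SPARQL engine rather than hand-rolled matching, but the triple-and-pattern structure shown here is the model those tools implement.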