The digital preservation field is evolving rapidly. Focal areas are changing and best practices are still under debate. Preservation efforts must address not just preservation of the technologies of the past, but also those of the future. The rapidly increasing volume of data requiring preservation makes other digital preservation challenges inherently larger and more complex. The shorter lifespan of digital materials also makes the need for timely and effective preservation action more urgent. This article describes what the author sees as the current major challenges in digital preservation, and covers a range of technical, administrative, legal and logistical aspects.
Let’s face it. The biggest blockage we have to widespread Open Access is not researcher apathy, a lack of interoperable systems, or an unwillingness of publishers to engage (although these do each play some part) – it is the problem that the only thing that counts in academia is publication in a high impact journal.
This situation is causing multiple problems, from huge author lists on papers and researchers cherry-picking results and retrospectively applying hypotheses, to the reproducibility crisis and a surge in retractions.
This blog was intended to be an exploration of some solutions prefaced by a short overview of the issues. Rather depressingly, there was so much material the blog has had to be split up, with several parts describing the problem(s) before getting to the solutions.
Prepare yourself, this will be a bumpy ride. This first instalment looks...
The aim of this task force is to produce a number of competency profiles that will help to build capacity in libraries for supporting new roles in the area of scholarly communication and e-research. The profiles will enable library managers to identify skill gaps in their institution, form the basis of job descriptions, enable professionals to carry out self-assessments, and act as a foundation for the development of training programs for librarians and library professionals. In addition, the toolkit will provide an outline of new organizational models that are evolving in this dynamic environment.
· The number of article processing charges (APCs) paid doubled between 2013 and 2014. Growth remained strong in 2015, but slowed in part due to limited room for growth in institutions’ internal budgets
· The average APC has increased by 6% over the past two years, a rise well above the rate of inflation
· Publishers’ APCs are converging toward a more uniform price range, although they still vary widely. Journals with low APCs are raising their prices, perhaps to avoid being perceived as low quality, following expectations set by the Finch report
· APC expenditure is unevenly distributed between publishers, with the lion’s share of income concentrated among a handful of major publishers...
This report documents the Strategic Thinking and Design work that the Association of Research Libraries (ARL) engaged in from the fall of 2013 through the end of 2015. Fuelled by the deep desire of the ARL membership to rise to the challenges facing higher education in the 21st century, and with grants from the Institute of Museum and Library Services and the Andrew W. Mellon Foundation, the Association engaged in an unprecedented project to reimagine the future of the research library and then reshape ARL, its organization, to help bring that future into being.
This process was unprecedented in that, instead of trying to ameliorate, one by one, the challenges that research libraries face—challenges that are a product of the friction between the research libraries’ historical evolution and a rapidly changing context—or seek a silver-bullet technological solution to move the community forward, the process focused on what the research library would be if it were specifically designed for the context of the 21st century—for the digital and networked age...
Looking at the vast ocean that is modern-day computing, we can see that major developments come in waves. The arrival of mainframe computers in the 1960s generated the first wave (one computer for many people), followed in the late 1970s by personal computers in the second wave (one computer for one person). In 1988, Mark Weiser presciently observed that computers embedded into everyday objects, objects all around us, were forming the third wave—what he called ubiquitous computing (many computers for one person). A decade later, in 1999, Kevin Ashton put forth the ideas behind, and coined the term for, the fourth wave: the Internet of Things.
In this paradigm shift, Weiser's computer-embedded everyday objects—or "things"—are connected to the Internet and can communicate with users and with other devices. The guiding principle is connection, along with the conviction that if something can be connected, it will be connected. Indeed, in recent years, the wave appears to be rising to a crest...