Citation
Hanson, H. and Stewart-Marshall, Z. (2015), "New & Noteworthy", Library Hi Tech News, Vol. 32 No. 2. https://doi.org/10.1108/LHTN-02-2015-0012
Publisher: Emerald Group Publishing Limited
New & Noteworthy
Article Type: New & Noteworthy. From: Library Hi Tech News, Volume 32, Issue 2
Stanford University Press awarded $1.2 million for the publishing of interactive scholarly works
Scholars of digital humanities and computational social sciences will soon have an academic publisher offering a validated, peer-reviewed process for their interactive scholarly research projects. Stanford University Press, with grant funding from the Andrew W. Mellon Foundation, will accelerate the integration into its publishing portfolio of interactive scholarly works, usually presented as websites, and of new narratives enriched with digital objects and rich linking.
Stanford University Press will undertake a digital publishing process that mirrors the rigor and consideration of book publishing. “Adding interactive scholarly works to traditional publishing programs will lead to the next generation of university press publishing”, said Michael A. Keller, university librarian at Stanford and publisher of the Stanford University Press.
The grant will be implemented over three years and according to Keller, “will spark changes in the expertise and practices of specialists in all aspects of 21st century academic publishing, beginning with the practice of research by teams of scholars and leading through ultimately to all publishing roles”.
Advances in technology have provided scholars with new ways to visualize and analyze data. The impact of these tools in the academy continues to evolve as more digital projects take form. “There are several examples, from across the globe, which demonstrate how the capabilities of visualization tools have informed humanities scholarship and offered researchers new lenses for discovery”, said Keller.
Currently, individuals and research groups host their digital materials online through their own websites, or on various public platforms. “For the most part these hosting models do not share common benchmarks or standards and very few incorporate rigorous peer review processes”, said Dr Alan Harvey, Director of Stanford University Press.
“One goal for establishing a publishing methodology for interactive scholarly works is to provide a distribution channel that is held in the same high regard as the long-form monograph counterparts”, said Harvey. “It is our intent to give scholars an opportunity to accumulate a digital publishing pedigree that provides the same consideration for hiring and tenure as traditional book publishing offers.”
Stanford University Press will collaborate with the University of Richmond’s Digital Scholarship Lab on implementation of the grant. In addition to developing the system and framework for publishing born-digital scholarship, the grant will develop a cost basis for publishing digital objects and establish an example of publishing practices that other academic presses can emulate, adopt or adapt.
Stanford University Press will seek interactive scholarly works and help its authors prepare them for submission to the publishing program. Consistent with its present publishing program, the Press will serve as publisher of the interactive works, acquiring titles, operating independent peer and technical reviews and marketing each published title. To guard against lost content, Stanford Libraries will perform Web and data archiving for each project.
According to Harvey, Stanford University Press receives numerous digital project proposals, some breaking new pedagogical ground, but given the lack of publishing support for interactive works, the projects haven’t benefited from editorial support. “We purposefully have loaded the first year of the grant with outreach efforts, devoting ourselves to supporting and orienting scholars on the nature of fully edited, peer-reviewed, publishable born-digital projects”, said Harvey.
The Press will hire a Digital Projects Editor and a Digital Production Associate to support the projects associated with this grant. The editor will work closely with authors, in the way a traditional developmental editor guides book authors through the review process. “The workflow will be fluid to allow innovation to continue apace as new practices, new tools and environments for network-based communication, and new methods in humanities and social sciences develop”, Keller concludes.
“This project will accelerate the transition to multifaceted digital publications, encourage thoughtful experimentation and document and promulgate best practices”, said Edward L. Ayers, President of the University of Richmond. “The project also has the potential to become a prototype for the 21st-century academic publisher.”
Full press release: http://library.stanford.edu/news/2015/01/stanford-university-press-awarded-12-million-publishing-interactive-scholarly-works
Humanities Open Book program to republish out-of-print humanities books
A new joint grant program by the National Endowment for the Humanities (NEH) and the Andrew W. Mellon Foundation seeks to give a second life to outstanding out-of-print books in the humanities by turning them into freely accessible e-books.
Over the past 100 years, tens of thousands of academic books have been published in the humanities, including many remarkable works on history, literature, philosophy, art, music, law and the history and philosophy of science. But the majority of these books are currently out of print and largely out of reach for teachers, students and the public. The Humanities Open Book pilot grant program aims to “unlock” these books by republishing them as high-quality electronic books that anyone in the world can download and read on computers, tablets or mobile phones at no charge.
The NEH and the Andrew W. Mellon Foundation are the two largest funders of humanities research in the USA. Working together, NEH and Mellon will give grants to publishers to identify great humanities books, secure all appropriate rights and make them available for free, forever, under a Creative Commons (CC) license.
The new Humanities Open Book grant program is part of the NEH’s agency-wide initiative The Common Good: The Humanities in the Public Square, which seeks to demonstrate and enhance the role and significance of the humanities and humanities scholarship in public life.
“The large number of valuable scholarly books in the humanities that have fallen out of print in recent decades represents a huge untapped resource”, said NEH Chairman William Adams. “By placing these works into the hands of the public we hope that the Humanities Open Book program will widen access to the important ideas and information they contain and inspire readers, teachers and students to use these books in exciting new ways.”
“Scholars in the humanities are making increasing use of digital media to access evidence, produce new scholarship, and reach audiences that increasingly rely on such media for information to understand and interpret the world in which they live”, said Earl Lewis, President of the Andrew W. Mellon Foundation. “The Andrew W. Mellon Foundation is delighted to join NEH in helping university presses give new digital life to enduring works of scholarship that are presently unavailable to new generations of students, scholars, and general readers.”
The NEH and the Andrew W. Mellon Foundation will jointly provide $1 million to convert out-of-print books into EPUB e-books with a CC license, ensuring that the books are freely downloadable with searchable texts and in formats that are compatible with any e-reading device. Books proposed under the Humanities Open Book program must be of demonstrable intellectual significance and broad interest to current readers.
Application guidelines and a list of FAQs for the Humanities Open Book program are available online. The application deadline for the first cycle of Humanities Open Book grants is June 10, 2015.
Humanities Open Book program: http://www.neh.gov/grants/odh/humanities-open-book-program
Information Standards Quarterly themed issue on identity management
The latest issue of Information Standards Quarterly, with a theme of Identity Management, is now available in open access on the National Information Standards Organization (NISO) website. Identity management is critical to ensuring that licensed or owned electronic content is available to those who have rights to it. Libraries, publishers and content providers are challenged with finding and implementing appropriate standards for identity management across platforms and among varied institutions. Guest editor Andy Dale, CTO of Respect Network, has compiled articles that discuss Privacy by Design, OCLC’s vision for identity management, JSON standards and identity management for the Internet of Things.
Articles in the issue include:
Privacy by Design and the Online Library Environment by Dan Blum, Chief Security and Privacy Architect with Respect Network.
From the Library of Congress to the Library of Me by Don Hamparian, Senior Product Manager, Identity Management at OCLC.
The Intention Publishing Economy: When Patrons Take Charge by Doc Searls, Director of ProjectVRM at Harvard’s Berkman Center for Internet and Society.
A JSON-Based Identity Protocol Suite by Michael B. Jones, Identity Standards Architect at Microsoft.
The full issue and individual articles are available for download at: http://www.niso.org/publications/isq/2014/v26no3/
Digital Preservation Coalition publishes OAIS Introductory Guide (2nd edition) Technology Watch Report
The Digital Preservation Coalition (DPC) has announced the publication of the OAIS Introductory Guide (2nd edition) Technology Watch Report. Written by Brian Lavoie of OCLC Research, and published in association with Charles Beagrie Ltd., the latest report in the Technology Watch series looks back on the development, features and impact of the Open Archival Information System (OAIS) Reference Model, one of the core standards of digital preservation.
Lavoie, a Research Scientist at OCLC, observes that perhaps “the most important achievement of the OAIS in this history is that it has become almost universally accepted as the lingua franca of digital preservation”.
Emphasising its flexibility and conceptual nature, the report describes the OAIS, its core principles and functional elements as well as the information model which supports long-term preservation, access and understandability of data – highlighting the in-built level of abstraction which makes it such a widely applicable foundation resource for digital preservation.
Lavoie adds that “it is possible to identify a few limitations associated with the OAIS’s impact”, generally stemming from the very conceptual nature of the model, and goes on to recommend that the digital preservation community would certainly “benefit from a careful assessment of where more precise and authoritative definitions of OAIS concepts and relationships would accelerate progress in achieving robust, widely applicable, and interoperable digital preservation solutions”.
The Introduction to OAIS was the first of the DPC Technology Watch reports, and although it was first published a decade ago, it has remained popular. The second edition updates and expands this first report, providing an excellent introduction to the OAIS for those new to digital preservation and a resource for practitioners wishing to re-acquaint themselves with the basics of the model, supplemented by the wisdom of a decade of research, development and implementation.
Sarah Higgins of the Department of Information Studies at Aberystwyth University praises the report, calling it “a much needed and important update. It lays out both the content of the second edition of the OAIS Reference Model, and the results of over a decade of research and development that can trace its roots to OAIS. The tools and processes for practical implementation of digital preservation and measuring their success are expertly explained and evaluated. The report will be invaluable to both established and new entrants to the digital preservation profession who need to understand the basic concepts of an OAIS and the tools available to them. This clear and comprehensive report will be embedded as core reading for Aberystwyth University students studying Digital Curation or Digital Information Management at Master’s level”.
The not-for-profit DPC is an advocate and catalyst for digital preservation. The coalition ensures its members can continue to deliver resilient long-term access to digital content and services through knowledge exchange, capacity building, assurance, advocacy and partnership. Its primary objective is raising awareness of the importance of the preservation of digital material and the attendant strategic, cultural and technological issues. The DPC Technology Watch Reports support this objective through an advanced introduction to topics that have a major bearing on its vision to “make our digital memory accessible tomorrow”.
The OAIS Introductory Guide (2nd edition) is the latest in the Technology Watch Report series, which provides advanced introductions to ensuring that high-value and vulnerable digital resources can be managed beyond the limits of technological obsolescence.
Read Brian Lavoie’s Technology Watch Report OAIS Introductory Guide (2nd Edition): http://www.dpconline.org/component/docman/doc_download/1359-dpctw14-02
Designing Storage Architectures meeting presentations now available
On September 22-23, 2014, the Library of Congress hosted its annual invitational meeting on Designing Storage Architectures for Digital Collections in Washington, DC. This meeting, also known as the “Storage Meeting”, brings together digital preservation practitioners who manage digital collections in libraries and archives; IT professionals who manage infrastructure within their organizations; and data storage vendors who develop storage technologies. By pooling the expertise and experience of librarians and archivists, data-driven industry organizations and research organizations, the meeting provides an opportunity to discuss current and future requirements for digital collection management and preservation directly with the technical and engineering architects of storage industry companies.
Presentation materials from the 2014 meeting are now available online at: http://www.digitalpreservation.gov/meetings/storage14.html
Archivematica 1.3.0 released: new open source preservation solution
Artefactual has announced the release of Archivematica 1.3.0 with full DuraCloud integration. Archivematica is an open-source digital preservation system with a Web-based dashboard for ingesting digital holdings and generating Archival Information Packages (AIPs). DuraCloud is an open-source cloud-based archiving and preservation service platform that manages and preserves digital objects in secure, replicated storage. Archivematica 1.3.0 now features the ability to configure the storage option to deposit AIPs into DuraCloud archival cloud storage via the Web-based dashboard. Institutions and organizations may now choose to implement the Archivematica open-source preservation stack in-house, or to take advantage of the DuraCloud hosted service for long-term secure archival storage directly from their hosted Archivematica dashboard.
The key new feature in the Archivematica 1.3.0 release is the ability to manage all storage functionality in DuraCloud, specifically:
ability to store AIPs in DuraCloud;
ability to store Dissemination Information Packages (DIPs) in DuraCloud; and
ability to synchronize a local copy with a remote copy in DuraCloud.
“Working with Artefactual to integrate our two software platforms means that if you use Archivematica locally, you can now upload content into your DuraCloud account”, explained DuraSpace CEO Michele Kimpton, “or add the open source software to your in-house technology stack. Either way, it provides institutions with a complete end-to-end open source solution”.
Evelyn McLellan, President of Artefactual Systems, added: “Having both local and cloud-based storage options available to Archivematica users helps us to offer our communities additional ways to preserve and protect their digital holdings”.
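Conceptually, the new storage option amounts to pushing a packaged AIP to DuraCloud over HTTP. The sketch below illustrates the idea in Python, assuming DuraCloud’s DuraStore REST interface (a PUT to /durastore/{space}/{contentId}); the host, space name, file name and credentials are placeholders, and in practice the Archivematica dashboard performs the equivalent steps for you.

```python
# Illustrative sketch only: depositing a packaged AIP into DuraCloud storage.
# Assumes DuraCloud's DuraStore REST interface (PUT /durastore/{space}/{id});
# the host, space, credentials and file name are placeholder values.
import requests

HOST = "https://example.duracloud.org"  # hypothetical account host
SPACE = "preservation-aips"             # hypothetical DuraCloud space
AIP = "aip-3f2b1c.7z"                   # a packaged AIP produced by ingest

with open(AIP, "rb") as payload:
    response = requests.put(
        f"{HOST}/durastore/{SPACE}/{AIP}",
        data=payload,
        auth=("depositor", "secret"),   # placeholder credentials
        headers={"Content-Type": "application/octet-stream"},
    )
response.raise_for_status()             # a non-2xx status means the deposit failed
print("AIP stored in DuraCloud:", response.status_code)
```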
Archivematica 1.3.0 is production-ready and now available for download here: http://www.archivematica.org/wiki/Installation
Organizations interested in learning more about the hosted Archivematica and/or the DuraCloud service can complete the inquiry form at: http://duracloud.org/archivematica
Artefactual Systems: http://www.artefactual.com/
DuraSpace: http://duraspace.org/
DuraCloud: http://www.duracloud.org
Archivematica: http://www.archivematica.org
Digital Preservation Coalition launches new Strategic Plan for 2015-2018
The Digital Preservation Coalition (DPC) launched its new Strategic Plan for 2015-2018 at the 4C (Collaboration to Clarify the Costs of Curation)/DPC “Investing in Opportunity” Conference at the Wellcome Trust in November 2014.
“By realising its goals, the intention of the Strategic Plan is to sustain the Coalition’s vision to make our digital memory accessible tomorrow”, explains William Kilbride, Executive Director of the DPC. This shared vision lies at the core of the DPC’s Strategic Plan, embodied in a new mission statement and supported by four strategic objectives.
DPC’s Mission: “We enable our members to deliver resilient long-term access to digital content and services, helping them to derive enduring value from digital collections and raising awareness of the attendant strategic, cultural and technological challenges they face. We achieve our aims through advocacy, workforce development, capacity-building and partnership”.
Responding to the changing environment of digital preservation, the DPC will pursue four strategic objectives, to provide:
1. Competent and responsive workforces ready to address the challenges of digital preservation.
2. A political and institutional climate responsive to the need for digital preservation.
3. Better tools, smarter processes and enhanced capacity in digital preservation.
4. Closer and more productive collaboration within and beyond the Coalition.
The Strategic Plan lays out the direction for the DPC over the next three years, but also recognises that it is the very Coalition itself which enables the mission to be followed, and the vision achieved.
Download the abridged version of the new Strategic Plan 2015-2018: http://www.dpconline.org/component/docman/doc_download/1347-dpc-stratgic-plan-2015-2018
UNESCO PERSIST project seeks input, best practices on selection policies and digital strategies
Collecting and preserving born-digital heritage is a major challenge. Formulating sustainable collection policies and economic solutions to safeguard the digital output of the public and private sector requires close collaboration between governments, industries, memory institutions and other stakeholders, including creators and consumers. To move one step closer to this aim, the UNESCO PERSIST project, a cooperation between UNESCO (United Nations Educational, Scientific and Cultural Organization), the International Council on Archives (ICA), the International Federation of Library Associations (IFLA), LIBER (Ligue des Bibliothèques Européennes de Recherche), the National Library of The Netherlands and the Digital Heritage Netherlands Foundation (DEN), is now setting out to collect and share worldwide best practices and guidelines on selection policies and digital strategies.
The UNESCO PERSIST project (Platform to Enhance the Sustainability of the Information Society Transglobally) aims to enhance the sustainability of the information society by assisting institutions to preserve and to provide access to digital heritage. To achieve this, PERSIST seeks to secure important mechanisms of good governance and the right of access to knowledge and information.
PERSIST has identified governments, heritage institutions and the IT industry as the three main stakeholders in the project. PERSIST will establish a platform to support dialogue and cooperation among the three main stakeholders, and to create practical solutions in the area of sustainable digital preservation. Additionally, academia will be an important partner in PERSIST, as researchers have a supporting role in assessing the quality of the dialogue and cooperation.
The PERSIST partners have asked Wilbert Helmus to carry out a literature survey on worldwide trends and developments in the selection of digital heritage collections (including libraries, archives and museums). Helmus is a Dutch expert in cultural heritage management who has carried out projects for the Archives Portal Europe, Picturae and various other partners in the archival and governmental sector. The survey will focus on international literature, publications, policies, strategies and guidelines on collecting and selecting born-digital heritage collections. The documents will be analysed in order to compose an outline of the most important publications. The outcomes of this survey and analysis will provide the background for the UNESCO Guidelines on selecting digital heritage.
In support of this important project, the PERSIST partners are asking institutions involved in the preservation of digital heritage to send Wilbert Helmus (wilbert@helmus-advies.nl) relevant best practices, publications, guidelines, papers, etc. on the selection of digital heritage for long-term preservation. PERSIST is interested in seeing how archives, libraries and museums select digital collections and information, such as digital objects, archival collections, digital documents, websites and games, for long-term preservation.
PERSIST is aware that most of this information is not accessible via the Internet or public portals. For this reason, the result of the survey depends largely on voluntary participation.
The Netherlands National Commission for UNESCO is responsible for the general coordination of PERSIST.
UNESCO PERSIST: http://www.unesco.nl/artikel/share-your-digital-heritage-strategy-unesco-persist-project
“Dodging the Memory Hole” forum on born-digital news content: presentation videos available
A forum entitled “Dodging the Memory Hole: Saving Born-digital News Content” was held at Reynolds Journalism Institute, University of Missouri in Columbia, Missouri, on November 10-11, 2014.
In today’s digital newsrooms, a software/hardware crash can wipe out decades of text, photos, videos and applications in a fraction of a second. Digital archives can easily become obsolete due to evolving formats and digital systems used by modern media, not to mention media failure, bit-rot and link-rot. One recent survey found that most American media enterprises fail to adequately process their born-digital news content for long-term survival. This potential disappearance of news, birth announcements, obituaries and feature stories represents an impending loss of cultural heritage and identity for communities and the nation at large: a kind of Orwellian “memory hole” of our own unintentional making.
Videos of presentations from this event are now available, including the opening keynote address by Clifford Lynch, Director of the Coalition for Networked Information.
Watch at: http://www.rjionline.org/jdna/dodging-memory-hole-videos
Digitised Manuscripts to Europeana (DM2E) project newsletter published
The Digitised Manuscripts to Europeana (DM2E) project is focused on building the tools and communities to enable humanities researchers to work with manuscripts in the Linked Open Web. In DM2E’s October 2014 newsletter, you can find a roundup of the activities of the project since the summer, as well as a look ahead to the upcoming final project months, including:
final round of DM2E events;
first reports of Open Humanities Awards round two winners;
update on DM2E data and data model;
JudaicaLink: Pioneering initiative to link reference works on Jewish culture and history;
Digital Humanities Advisory Board call;
Open Data in Cultural Heritage workshop at Open Knowledge Festival;
Pundit community session at DARIAH (Digital Research Infrastructure for the Arts and Humanities) Virtual Competency Centre (VCC) meeting; and
recent DM2E publications and presentations.
DM2E newsletter: http://dm2e.eu/files/No.7-DM2E-Newsletter-October%202014.pdf
Digital Preservation Network (DPN) launches member content pilot
The Digital Preservation Network (DPN) is a federation of more than 50 academic institutions collaboratively developing the means to preserve the complete scholarly record for future generations. DPN has launched a Member Content Pilot program as a step toward establishing an operational, long-term preservation system shared across the academy. The pilot is testing real-world interactions between DPN members and the DPN “nodes” that ingest members’ data and package it for preservation storage. Three DPN nodes (Chronopolis/DuraCloud, the Texas Preservation Node and the Stanford Digital Repository) will function as First Nodes, and all five DPN nodes (the three named above along with APTrust and HathiTrust) will provide replication services for the pilot data.
The higher education community has created many digital repositories to provide long-term preservation and access. DPN replicates multiple dark copies of these collections in diverse nodes to protect against the risk of catastrophic loss due to technological, organizational or natural disasters.
Participants in the DPN Member Content Pilot include Chronopolis, the University of California San Diego, Dartmouth College, the DuraSpace organization, Stanford University, the Texas Preservation Node and Yale University.
Steven Morales, DPN Chief Business Officer, is pleased with pilot project progress. “The DPN Technical Working group, comprised of the five Replicating Nodes for DPN, has done a phenomenal job linking together their existing repositories”, he said. “It feels great to be at a point where we can begin testing the network with real content.”
The pilot provides:
A functioning preservation network capable of accepting and replicating Member Pilot content.
Opportunity for all participants to play out a realistic content deposit scenario and to discuss and capture the requirements and questions raised.
A preliminary report to the DPN membership regarding results.
In 2012, DPN was launched with the support of founding member institutions. By 2013, replicating nodes had been brought together to begin building the network, software and messaging system. 2014 has been a testing year. This summer, three rounds of successful internal testing were completed. In the current phase, real member content is being tested as DPN members have joined together as “first nodes”. Content has been identified and prepared for packaging into DPN “bags”.
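DPN “bags” build on the BagIt packaging convention widely used in digital preservation. As a rough illustration only (the metadata below is invented for the example and is not the DPN bag profile), the Library of Congress’s bagit library for Python turns a directory of content into a checksummed, validatable bag:

```python
# Rough illustration of packaging a directory into a BagIt-style "bag"
# with the Library of Congress bagit library (pip install bagit).
# The metadata field is illustrative; DPN bags carry additional
# network-specific metadata not shown here.
import bagit

# Restructure an existing directory in place: payload files move under
# data/, and checksum manifests plus bag-info.txt are generated.
bag = bagit.make_bag(
    "collection_to_deposit",
    {"Source-Organization": "Example University Library"},
    checksums=["sha256"],
)

# Before replication, confirm the payload still matches its manifests.
if bag.is_valid():
    print("Bag is complete and valid")
```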
Through the end of 2014 and the beginning of 2015, multiple rounds of testing will be ongoing. A soft launch of the production system will run from the summer of 2015 through the end of 2016, with all member institutions participating.
The DPN will ensure that the complete scholarly record is preserved for future generations. It will be the long-term preservation solution, shared collectively across the academy, that protects local and consortial preservation efforts against all types of catastrophic failure. The supporting ecosystem enables higher education to own, maintain and control the scholarly record over time. While commercial entities may partner with DPN to contribute to this effort at different points in time, depending on priorities and business models, final control must reside with the academy.
Digital Preservation Network: http://www.dpn.org/
NISO publishes recommended practice on metadata indicators for accessibility and licensing of e-content
The National Information Standards Organization (NISO) has published a new Recommended Practice on Access License and Indicators (NISO RP-22-2015) that defines metadata to be used to indicate free-to-read content and a link to license terms for the use/re-use of that content. Developed by the NISO Working Group on Access License and Indicators (formerly Open Access Metadata and Indicators), the recommended practice proposes the adoption of two core pieces of metadata and associated tags: <free_to_read> and <license_ref>. The first tag would indicate that the work is freely accessible during the specified timeframe (if applicable). The second tag would contain a reference to a URI that carries the license terms specifying how a work may be used.
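To make the proposal concrete, the sketch below generates the two indicators with Python’s standard library. The namespace URI, attribute names and host element reflect common deployments of the recommended practice (for example in JATS) rather than the normative text, so treat them as assumptions to be checked against NISO RP-22-2015 itself.

```python
# Minimal sketch of emitting the two ALI indicators as XML metadata.
# The namespace URI and start_date attribute reflect common usage of
# NISO RP-22-2015 (e.g. in JATS); treat them as illustrative, not normative.
import xml.etree.ElementTree as ET

ALI_NS = "http://www.niso.org/schemas/ali/1.0/"  # assumed ALI namespace
ET.register_namespace("ali", ALI_NS)

meta = ET.Element("article-meta")  # hypothetical host element

# <free_to_read>: the article is freely accessible from this date onward.
free = ET.SubElement(meta, f"{{{ALI_NS}}}free_to_read")
free.set("start_date", "2015-08-01")  # e.g. after a six-month embargo

# <license_ref>: URI of the license governing re-use of the article.
lic = ET.SubElement(meta, f"{{{ALI_NS}}}license_ref")
lic.set("start_date", "2015-08-01")
lic.text = "https://creativecommons.org/licenses/by/4.0/"

print(ET.tostring(meta, encoding="unicode"))
```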
“Publishers provide articles that are ‘free to read’ under a wide range of re-use terms and licenses”, explains Cameron Neylon, Advocacy Director, PLOS (Public Library of Science), and Co-chair of the NISO Access License and Indicators Working Group. “Currently, publishers of hybrid journals have no simple mechanism for signaling the ‘free to read’ status of specific articles or the re-use rights of downstream users. Funders find the lack of information and cooperation between stakeholders creates difficulty in determining whether a specific published article is compliant with their policies. Authors have difficulty confirming whether they are compliant with a given funder policy. Readers face the burden of figuring out what they can and cannot do with specific articles. Aggregators and platform or knowledgebase providers have no consistent mechanism for machine-processing metadata and identifying the accessibility or rights status. Adoption of <free_to_read> and <license_ref> metadata designations will allow both humans and machines to assess the status of content.”
“The combination of the two metadata tags can be particularly useful in indicating the subtle nuances of different Open Access content”, states Greg Tananbaum, Consultant at SPARC and Co-chair of the NISO Access License and Indicators Working Group. “The indicators include a date component so that content with access and re-use rights that change over time can be adequately understood. This supports the existing embargo practices in use by some publishers. By including URIs to applicable licenses in the metadata, more detailed explanations of rights can be made available.”
“The recommended metadata tags can easily be incorporated into existing metadata distribution channels, encoded in XML, and added to existing schemas and workflows”, said Ed Pentz, Executive Director, CrossRef, and Co-chair of the NISO Access License and Indicators Working Group. “Publishers and platform providers can also use the <free_to_read> tag to automate the display of appropriate status icons to users and for signaling or determining compliance with most funder and institutional policies.”
“Adoption of these two metadata indicators can have a significant positive impact on all the participants in the scholarly communications chain”, stated Todd Carpenter, NISO Executive Director. “This NISO Recommended Practice also complements a number of other related efforts, including the CrossRef FundRef service; the HowOpenIsIt? guide developed by PLOS, the Scholarly Publishing and Academic Resources Coalition (SPARC) and the Open Access Scholarly Publishers Association (OASPA); EDItEUR’s ONIX-PL specification for communicating licensing terms; and the Linked Content Coalition initiative.”
Access License and Indicators (NISO RP-22-2015) is available for free download from the ALI Working Group Web page on the NISO website at: http://www.niso.org/workrooms/ali/
Introducing Usus: a community website on usage
Usus (Latin for usage) is a new, independent, community-run website for all those interested in the usage of online content. It is designed to support a productive conversation among librarians, publishers, aggregators and repository managers, so that the community can obtain the best possible usage reports for its electronic resources.
The Usus website provides:
a forum for community discussion on issues relating to the COUNTER usage reports, as well as wider usage issues;
a place for COUNTER (and SUSHI) to solicit feedback on future plans and ideas;
a collection point for suggestions for new COUNTER usage reports and metrics;
a source of hints and tips on solving known problems;
a list of vendors/reports with problems that are affecting the credibility and/or usefulness of the COUNTER reports; and
links to other usage resources.
Usus is supported by COUNTER, the international organization that sets and maintains standards for measuring usage of online content, but is editorially independent of COUNTER.
Usus is overseen by a Supervisory Board whose members represent a broad spectrum of librarians, publishers and scholars, and who will ensure that the website remains editorially independent and serves the needs of the community. The members of the Supervisory Board are Anne Osterman, VIVA (the Virtual Library of Virginia), USA (Chair); Simon Bevan, Cranfield University, UK; Melissa Blaney, ACS Publications, USA; Anna Creech, University of Richmond, USA; Lorraine Estelle, JISC, UK; Oliver Pesch, EBSCO, USA; Kari Schmidt, Montgomery College, USA; and Mark Tullos, ProQuest, USA.
For more information, contact Usus at: Usus.stats@gmail.com
Usus home page: http://www.usus.org.uk/
Common Ground: new white paper on linked data models from Library of Congress and OCLC
Jointly released by OCLC and the Library of Congress (LC), this white paper compares and contrasts the compatible linked data initiatives at both institutions. It is an executive summary of a more detailed technical analysis that will be released later in 2015.
The white paper summarizes the recent activity of the Bibliographic Framework Initiative at LC, which proposes a data model for future data interchange in the linked data environment, taking into account interactions with search engines and current developments in bibliographic description. It also provides an overview of OCLC’s efforts to refine the technical infrastructure and data architecture for at-scale publication of linked data for library resources on the broader Web. In addition, it investigates the promise of Schema.org as a common ground between the language of the information-seeking public and that of professional stewards of bibliographic description.
Key highlights:
Work on LC’s BIBFRAME vocabulary has advanced nearly to the point of testing its use for original cataloging, which LC will begin later this year.
OCLC has published linked data on WorldCat.org using both the Schema.org vocabulary and extensions to that vocabulary defined at BiblioGraph.net (a simplified sketch of such markup follows these highlights).
LC and OCLC continue to work collaboratively to identify the different use cases of these efforts and how they complement each other in a rich bibliographic universe.
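As a rough illustration of what such Schema.org-style description looks like, the sketch below builds a minimal JSON-LD record for a book in Python; the properties and values are simplified assumptions, and WorldCat’s actual markup is richer and also draws on BiblioGraph.net extension terms.

```python
# Illustrative Schema.org description of a book, serialized as JSON-LD.
# Property choice is a simplified assumption; OCLC's WorldCat markup is
# richer and also uses BiblioGraph.net extension terms.
import json

book = {
    "@context": "http://schema.org",
    "@type": "Book",
    "name": "Understanding the Collective Collection",
    "author": {"@type": "Organization", "name": "OCLC Research"},
    "datePublished": "2013",
    "bookFormat": "http://schema.org/EBook",
}

print(json.dumps(book, indent=2))
```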
This report will be of interest to anyone wanting to know more about these complementary linked data efforts and how they compare.
Download Common Ground: http://www.oclc.org/research/publications/2015/oclcresearch-loc-linked-data-2015.html
Linked Data for Libraries (LD4L) project progress report video available
“The Linked Data for Libraries (LD4L) Project: A Progress Report”, a project briefing presented by Dean Krafft (Cornell) and Tom Cramer (Stanford) at CNI’s fall 2014 membership meeting, is now available online.
This presentation reports on the first year of The Andrew W. Mellon Foundation-funded Linked Data for Libraries (LD4L) project, a partnership of Cornell University Library, Stanford University Libraries and the Harvard Library Innovation Lab. The goal of the project is to use Linked Open Data to leverage the intellectual value that librarians and other domain experts and scholars add to information resources when they describe, annotate, organize, select and use those resources, together with the social value evident from patterns of usage.
Watch the project briefing video:
YouTube: http://youtu.be/QYd_OlenZ5U
Vimeo: http://vimeo.com/115088888
Linked Data for Libraries project: http://www.ld4l.org/
Software as a Service and Cloud Based Applications: report from CNI
A report of the Coalition for Networked Information (CNI) Executive Roundtable on Software as a Service and Cloud Based Applications, held at the CNI Spring 2014 Membership Meeting, is now available.
There has been a major shift in how some of the largest companies in the software industry are offering their products to the higher education community. Some companies are encouraging institutions to license their software in the cloud and others are providing no choice. The pressure to access software from a provider’s cloud service, rather than from a locally hosted service, is becoming intense. In addition to standard desktop applications software, many universities have moved enterprise-level systems to cloud-hosted services. On the surface, this change may appear to be merely one of efficiency, but the move to cloud services has the potential to disrupt the ways in which institutions are able to manage their information and the information of their community. There are important procedural and policy questions that must be aired in addition to the financial and control discussions that usually take place when moving operations to the cloud. The CNI Executive Roundtable participants discussed their experiences with cloud services, the policies and procedures they have put into place and the opportunities and risks that they perceive in this environment.
Held at CNI’s membership meetings, CNI Executive Roundtables bring together a group of campus partners, usually senior library and information technology leaders, to discuss a key digital information topic and its strategic implications. The events build on the theme of collaboration that is at the foundation of the Coalition; they serve as a forum for frank, unattributed intra- and inter-institutional dialogue on digital information issues and their organizational and strategic implications. In addition, CNI uses roundtable discussions to inform its ongoing program planning process.
Read the report at: http://www.cni.org/go/software-service-cloud-applications/
Other reports by and related to CNI are available at: http://www.cni.org/go/cni-reports/
OCLC acquires Sustainable Collection Services
OCLC has acquired Sustainable Collection Services (SCS), the industry leader in helping libraries manage their print collections.
Libraries everywhere are changing. Library collections are moving from print to digital, and spaces once used to house books are now dedicated to collaboration and research. Librarians need to decide what materials to keep, what can be shared among groups of libraries and what can be recycled.
OCLC maintains WorldCat, the largest aggregation of library data in the world, as well as the world’s largest library resource sharing network. SCS is the leader in analyzing print collection data to help libraries manage and share their materials. SCS services leverage WorldCat data and analytics to show individual libraries and library consortia which titles should be kept locally, which can be discarded and which are the best candidates for shared collections.
“OCLC and SCS have worked as strategic partners to help libraries manage print materials since 2011”, said Skip Prichard, OCLC President and CEO. “By bringing together the innovative services of SCS, the power of WorldCat and the thought leadership of OCLC Research, we can move quickly to build services to address this critical need for libraries.”
“Our partnership with OCLC has been vital to SCS since our first day of operation”, said Rick Lugg, Executive Director, SCS. “We rely on WorldCat to provide libraries the holdings data that is critical to intelligent collection management decisions. As part of OCLC, we will expand and extend our analytics capabilities, develop new products, and serve more libraries than we could ever reach on our own.”
OCLC Research has been at the center of the evolution of library collections. Recent studies and reports on the subject include Right-scaling Stewardship (2014) and Understanding the Collective Collection (2013).
“Interest in shared print management among OCLC member libraries reflects a growing awareness that long-term preservation of the published record can be organized as a collective effort”, said Constance Malpas, OCLC Research Scientist. “Working together, OCLC and SCS can significantly accelerate our efforts in collection management and shared print projects.”
Sustainable Collection Services (SCS): http://www.sustainablecollections.com/