Batch Loading and Editing of Vendor MARC records

Anastasia (Nastia) Guimaraes
University of Notre Dame

Presented at the ALA Midwinter meeting in Philadelphia, PA, on January 25, 2014

Hesburgh Libraries at the University of Notre Dame

• Over 8,000 undergraduates; 3,500 graduate students

• $10M+ annual materials budget
• Aleph, SFX, Metalib, Primo & Primo Central, CORAL ERM
• Went through organizational redesign in September 2012

Ebooks landscape at Notre Dame

• Started buying ebook collections in 2007
• Current annual budget: approx. $1 million
• Systematically processing and loading vendor MARC records in the catalog since 2010
• 2.5 FTE dedicated to batch loading activities

• Number of MARC records loaded in catalog to date: 720K records

Batch loading workflow steps

• MARC records negotiated as part of electronic resource ordering/licensing workflow

• Records downloaded from vendor sites, ftp’d, received as email attachment, or ordered from OCLC WorldCat Collection Sets

• Automatic pre-processing using Access Level Record guideline and MarcModifier program

• Record sets analyzed by catalogers and instructions for changes created

Batch loading workflow steps, pt.2

• Changes implemented to record files using MarcEdit
• Record sets loaded and reviewed on a test server of the library catalog
• Record sets loaded in the production catalog
• Post-load clean-up performed if needed
• CORAL ERM entry updated to reflect completion of all batch loading workflow steps

Pre-processing steps: Using Access Level Record as a guideline

Pre-processing steps, pt. 2: Automatic pre-processing of vendor record files with MarcModifier

• Locally developed program that we utilize as an automated record validation process

• Written in Java; has configuration file written in XML consisting of conditions and rules

• Customizable -- configuration file can be duplicated and modified

• Makes set changes to all records in the file
• Provides reports about records for staff doing analysis (see the sketch below)
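
A minimal pymarc sketch of this kind of configuration-driven pass, assuming a hypothetical rule list and placeholder file names; it is not MarcModifier itself, whose Java code and XML rule format are not reproduced here:

```python
from pymarc import MARCReader, MARCWriter

# Hypothetical rules standing in for MarcModifier's XML conditions and rules:
# each rule is (description, condition, action); action is None for report-only rules.
RULES = [
    ("Drop vendor-specific 938 fields",
     lambda rec: bool(rec.get_fields("938")),
     lambda rec: [rec.remove_field(f) for f in rec.get_fields("938")]),
    ("Record has no 856 (online access) field",
     lambda rec: not rec.get_fields("856"),
     None),
]

def preprocess(in_path, out_path):
    """Apply set changes to every record and collect a report for analysts."""
    report = []
    with open(in_path, "rb") as infile, open(out_path, "wb") as outfile:
        writer = MARCWriter(outfile)
        for position, record in enumerate(MARCReader(infile), start=1):
            if record is None:
                continue  # skip records pymarc could not parse
            for description, condition, action in RULES:
                if condition(record):
                    if action:
                        action(record)  # set change applied to the record
                    else:
                        report.append((position, description))  # report-only rule
            writer.write(record)
        writer.close()
    return report

if __name__ == "__main__":
    for position, issue in preprocess("vendor_in.mrc", "vendor_out.mrc"):
        print(f"Record {position}: {issue}")
```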

Records analysis
• Catalogers analyze each set and create change instructions
• Examples of issues identified by staff dedicated to analysis activities (see the sketch after this list):
  - Following RDA implementation, some Project Muse records lacked fields 260/264
  - SPIE distributed records containing unresolved DOIs
  - OUP’s Oxford Scholarship Online collection files with updates contained records for incorrect content
  - World Bank e-Library record file had hundreds of mixed-up URLs pointing to the wrong content
  - Identifying records that lack call numbers and subjects
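
Several of these checks lend themselves to a script. Below is a minimal sketch, assuming pymarc and a placeholder file name; it covers only the field-presence checks (260/264, call numbers, subject headings), since problems like unresolved DOIs or swapped URLs require checking links against vendor title lists rather than testing fields. The tag choices are illustrative, not the presenters' actual analysis code.

```python
from pymarc import MARCReader

def analyze(path):
    """Flag records that lack publication data, call numbers, or subjects."""
    findings = []
    with open(path, "rb") as fh:
        for position, record in enumerate(MARCReader(fh), start=1):
            if record is None:
                continue  # skip records pymarc could not parse
            issues = []
            # Publication data: neither 260 nor 264 present (the RDA-era gap above).
            if not record.get_fields("260", "264"):
                issues.append("no 260/264")
            # No LC-style call number.
            if not record.get_fields("050", "090"):
                issues.append("no call number")
            # No subject headings.
            if not record.get_fields("600", "610", "650", "651"):
                issues.append("no subject headings")
            if issues:
                findings.append((position, issues))
    return findings

if __name__ == "__main__":
    for position, issues in analyze("vendor_file.mrc"):  # placeholder file name
        print(f"Record {position}: {'; '.join(issues)}")
```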

MarcEdit
• Free software program used to facilitate batch processing of records
• Developed by Terry Reese of the Ohio State University
• Converts raw MARC files into human-readable, editable files (a rough analogue is sketched below)
• Once the file is edited, MarcEdit is used to convert it back to raw MARC for loading in the library catalog
• MarcEdit listserv hosted by George Mason University; to subscribe go to: http://metis3.gmu.edu/cgi-bin/wa?A0=MARCEDIT-L
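
MarcEdit does this conversion with its MarcBreaker and MarcMaker tools. As a rough, one-way illustration of the same raw-to-readable idea (not MarcEdit itself), the pymarc sketch below dumps a binary MARC file to a text listing; file names are placeholders, and converting an edited file back to raw MARC is left to MarcEdit as described above.

```python
from pymarc import MARCReader

def dump_readable(marc_path, text_path):
    """Write a human-readable listing of a raw MARC file for review."""
    with open(marc_path, "rb") as infile, \
         open(text_path, "w", encoding="utf-8") as outfile:
        for position, record in enumerate(MARCReader(infile), start=1):
            if record is None:
                continue  # skip records pymarc could not parse
            outfile.write(f"--- record {position} ---\n")
            outfile.write(str(record))  # pymarc renders one field per line
            outfile.write("\n")

if __name__ == "__main__":
    dump_readable("vendor_file.mrc", "vendor_file.txt")  # placeholder names
```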

Keeping track of batch loading workflow

• ERM system CORAL
• Developed at Hesburgh Libraries by Ben Heet and Robin Schaaf
• Open source program: http://erm.library.nd.edu/
• Consists of several modules (Resources, Licensing, Organizations, etc.) that are interlinked
• Seamlessly links acquisitions, licensing, and MARC records information
• Transparent way to share batch loading activities

CORAL

Title-level records in discovery layer systems

• Not all records are loaded manually by the Batch Loading group in our ILS

• Some records are loaded directly into Primo, but not in Aleph:
  - HathiTrust (1.6 million records for public domain titles)
  - CRL (will be reviewing as a possibility)
  - Project Gutenberg (maybe a possibility)

Final observations

• Vendor record quality continues to present issues for the cataloging community

• Replacing AACR2 with RDA introduced new record quality issues

• Display of data in discovery layer systems vs traditional library catalogs

• Opportunities for traditional catalogers to participate in batch loading activities

Questions?
Anastasia (Nastia) Guimaraes
Head, Batch Processing, Data Support, and Metadata Services
Hesburgh Libraries
University of Notre Dame
aguimara@nd.edu
