Duplicates and Processing Records

FSU

Issues with record loading are the most vexing in Shared Bib, and because of the time constraints on the Single Bib Pilot Project and the Shared Bib Implementation we have ended up doing most of our testing in the live database, which is obviously not optimal. I believe that the duplicate record problems which have caused the situations described in William Miller’s email will need to be solved by the creation of a robust, automated de-duplicating algorithm. FLVC will need to create this algorithm and run it against the database on a regular basis. FLVC should also work to create a weekly or biweekly single load of vendor records which can be de-duped, and have all appropriate auxiliary records added, before the records are added to the catalog.
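To illustrate one way such an automated de-duplicating pass could work, here is a minimal Python sketch. It is an assumption about the general approach, not a description of FLVC's actual tooling: it clusters candidate duplicates by the strongest identifier available, trying the OCLC number first, then the ISBN, then a normalized title and date. The record structure and field names (oclc, isbn, title, date) are hypothetical; in practice the keys would be drawn from the corresponding MARC fields.

```python
from collections import defaultdict

def normalize_isbn(isbn):
    """Strip hyphens and qualifiers so ISBNs compare reliably."""
    return "".join(ch for ch in isbn if ch.isalnum()).upper()

def dedupe_key(record):
    """Build a match key for a bib record, preferring the strongest
    identifier available: OCLC number, then ISBN, then title + date.
    `record` is assumed to be a dict such as:
      {"oclc": ..., "isbn": ..., "title": ..., "date": ...}
    """
    if record.get("oclc"):
        return ("oclc", record["oclc"].lower())
    if record.get("isbn"):
        return ("isbn", normalize_isbn(record["isbn"]))
    # Weakest fallback: normalized title plus publication date.
    title = " ".join(record.get("title", "").lower().split())
    return ("title", title, record.get("date", ""))

def find_duplicates(records):
    """Group records by match key; any group larger than one is a
    candidate duplicate cluster for review or automated merging."""
    groups = defaultdict(list)
    for rec in records:
        groups[dedupe_key(rec)].append(rec)
    return [cluster for cluster in groups.values() if len(cluster) > 1]

if __name__ == "__main__":
    sample = [
        {"oclc": "ocm0001", "isbn": "978-0-11-111111-1", "title": "Example", "date": "2010"},
        {"oclc": "ocm0001", "isbn": "9780111111111", "title": "Example", "date": "2010"},
        {"isbn": "978-0-22-222222-2", "title": "Another Book", "date": "2011"},
    ]
    for cluster in find_duplicates(sample):
        print("Possible duplicates:", cluster)
```

Matching on a controlled identifier like the OCLC number first keeps false merges rare; title-and-date matching is much weaker, so clusters found that way would likely need human review rather than automated merging.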

The problem mentioned by FAU, records failing to load because of duplicates already in the system, is caused by the current version of GenLoad, which needs to be fixed so that this does not remain a persistent issue.

As for our workflow, we have taken our cue from the State University of Florida: Guidelines and Procedures for the Shared Bibliographic Catalog, general rule 2.1, which states that SULs should make every effort to follow the guidelines given the limitations of their individual resources and staffing. Because we have a small Technical Services staff, we spend relatively little time searching for the perfect record or transferring orders, although we do some of both. Since we do relatively little ordering directly from Coutts, we do not have the kinds of problems caused by large batch loads. We have had some issues with catalogers spending too much time “fixing” records in the database (a practice the Guidelines discourage), but Annie and I are working to alleviate this problem.
