1. If we don’t bring provisional authority records into Aleph (there have to be several hundred thousand of those, at least), does that mean that all of those headings will be unauthorized and/or that LTI will recreate all of them? Is this an Aleph thing or an LTI thing? (Maybe Laura A. knows the answer.) This may be more an Aleph thing, but we need to bring in the provisionals or have a separate database for them: there is much valuable information (local notes, 690s) in the provisional records.
2. (This is related to #3 below.) As for one of the “number of factors” in Recommendation under Authority Express (AEX), we want to think very carefully about “how much the process of export and import can be automated,” because we have to remember that there are a lot of authority records changed and/or “suggested” (via “LTI sheets”) for which automation isn’t really an option:
a. what I call “Good Samaritan” work: e.g., when an author’s authority record changes (say, a death date is added), I have students fix all the records for that author (both “official” ones from OCLC and provisional ones from LTI). That work wouldn’t get done (maybe?) if LTI were to rely only on “official” records.
b. differentiating between authors with the same or similar names (and in many cases, LTI suggests a record that is not the one we want). This is not because they send the wrong record, but because the heading on the bib. record from OCLC did not have the correct version of the name or subject.
3. (This is related to #2 above.) In Current state of the library’s databases, bullet a., they say that “The library has routinely not loaded new and revised authority records provided by AUP.” Some of those were not “routinely” loaded because:
a. they were for the wrong author or title
b. we didn’t need them (i.e. no corresponding headings anywhere in the database, either in the authority or bibliographic file)
c. we follow the “rule of specificity” (and, for example, don’t replace “year- and city-specific” conference headings with “generic” conference headings)
- The relationship between 2 and 3 is that we have to be very careful about having LTI “automatically” load and/or update records, because there are many instances where loading their “suggested” records would not be correct. Not sure of the percentage, but I suspect it’s significant, and probably enough to wreak havoc (or maybe that’s a bit hyperbolic) with the authority database.
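To make the concern in #2 and #3 concrete, here is a minimal sketch (in Python) of the kind of pre-load filter we would need before auto-loading any LTI-suggested record. Everything here (the `Suggested` record shape, the `LocalDB` helper, and its matching rules) is hypothetical, not an actual Aleph or LTI interface; the point is only that each of the skip conditions requires local knowledge of our database.

```python
from dataclasses import dataclass

@dataclass
class Suggested:
    """A suggested authority record from an LTI sheet (hypothetical shape)."""
    heading: str
    is_generic_conference: bool = False

class LocalDB:
    """Toy stand-in for our local authority/bib heading index."""
    def __init__(self, headings):
        self.headings = headings  # heading strings already in the database
    def has_heading(self, h):
        # does any local heading contain, or fall under, this one?
        return any(h in x or x in h for x in self.headings)
    def has_specific_form(self, h):
        # a "specific" form carries a date/place qualifier, e.g. "(2007 : Paris...)"
        return any(x.startswith(h) and "(" in x[len(h):] for x in self.headings)
    def similar_headings(self, h):
        # crude name-clash check: same surname portion before the first comma
        base = h.split(",")[0]
        return sum(1 for x in self.headings if x.split(",")[0] == base)

def should_load(suggested, db):
    """Return (ok_to_auto_load, reason) for one suggested record."""
    # 3b: no corresponding heading anywhere in the database
    if not db.has_heading(suggested.heading):
        return (False, "no corresponding heading in the database")
    # 3c: rule of specificity for conference headings
    if suggested.is_generic_conference and db.has_specific_form(suggested.heading):
        return (False, "rule of specificity: keep the specific conference heading")
    # 2b: same-or-similar names can't be differentiated automatically
    if db.similar_headings(suggested.heading) > 1:
        return (False, "similar names: manual differentiation required")
    return (True, "safe to auto-load")
```

Even this toy version shows that a “fully automated” load would have to silently guess in exactly the cases (wrong-author matches, specificity, name clashes) where guessing does the damage.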
4. Regarding the “don’t catalog” time (starting on Friday afternoons at 3:00), Laura suggested doing this only once a month instead of weekly. That sounds like a good idea, but how about if we also change the time from Friday at 3:00 p.m. to something like Friday (or Saturday) at midnight instead. I guess that this is probably our call, not LTI’s. Yes, this time was established by the EUCLID team in terms of sending the records out and reloading them. Monthly would work well and save a little time and money; but it needs to be done at least monthly.
5. Regarding Authority Update Processing (AUP), what is the difference between “Level I” (in bullet a.) and the “comprehensive AUP service” (under Recommendation in this section)? Is it on the order of the way that we (GovDocs) have MARCIVE reload any record from OCLC if the GPO updates it? Again, automated updating isn’t going to work perfectly because, like it or not, we have and will continue to have hundreds of thousands of provisional records (that presumably wouldn’t get updated when “official” records do). I, too, would like a more detailed explanation of the “comprehensive AUP service”.
6. I suggest that, whatever it takes with LTI (and if they offer this as an option), we need to keep unauthorized headings “in the loop.” As far as I understand it (from Laura Akerman), once a record is sent to LTI, we can’t send it to them again, even if headings in that record are still unauthorized. I routinely find unauthorized headings in records that were cataloged years before, and without the option of resending those to LTI, I have to take care of those headings manually, including proposing local provisional authority records. Being able to let LTI work on those records (that were somehow “missed” earlier) would be helpful. So my question is: does the AUP service offer that option? This is complicated; as Bernardo has explained, there are various reasons records get missed: sometimes a bib. record (or several) in a weekly load doesn’t get sent for some reason; occasionally an authority record doesn’t get loaded into EUCLID, so LTI thinks they’ve already sent the heading and won’t send it again, even if we change the cataloging date; sometimes an authority record gets deleted by mistake; etc.
7. Related to #6 above, our current profile with them is (I believe) that once they send a “suggestion” on the LTI lists, they will not send it again, even if we haven’t loaded it. Let’s say, for example, that we lose some of the LTI sheets and those records just don’t get loaded. Is LTI somehow going to monitor our authority database and let us know that we haven’t loaded records they’ve suggested? I guess that maybe this has more to do with the $90,000 deal, and it might also be something that we’d have to do on our end. I believe Steve is right: in regard to #6 and #7, the only way to fix this would be to reauthorize all the bib. records.
That’s about all I can think of right now. I’ll send more as I think of them.
By the way, as for how authority records are working in Aleph, the jury is still out. There are problems, but they’ve been reported. We didn’t get a chance to look at authority records as much as we would have liked, because so many of them didn’t load properly. But we do know that the Aleph “Check Record” function (which shows “Doc Validation Error” messages and which is related to “Triggers” … sort of) doesn’t seem to be working correctly and/or is confusing (we haven’t figured out exactly how it works … it would help if we could load records). For example:
- “Parallel” linked fields (e.g. one 245 in Chinese characters and another, parallel 245 in Pinyin) are showing up as errors (“Required 245 field is either missing or duplicated”), even though linked 245s (one in characters, one in a transliterated “vernacular”) are allowed.
- Using the “Record check” button in Aleph pops up “Doc Validation Error” messages (which seem to be related to the form of the record, i.e. it’s looking for errors in tags, indicators, etc.). But to get to “authorizing” errors, you have to go to (Cataloging Module)\Edit Actions\Records Triggers. So we still haven’t found the equivalent of the “Validate headings” function in WorkFlows.
Here are some examples from the LTI Deletes Report that illustrate why the Deletes section of the semi-annual LSA report could not be automated: manual review is necessary to complete the process. Items that have related headings are also affected, since those need to be changed manually as well.
For each item on the Deletes report, we must manually look up what the new (replacement) form is, often doing more research if the new form isn’t readily available. Other sections of the report have the new form provided; it would be helpful if the deletes had this information as well (not sure if it is possible to have it added to our profile).
-Laura Trittin
-Delete by LC of Auth Rec prev. linked to name/title bib heading
old: 111 2 |aGPC (Conference)|d(2007 :|cParis, France)
001: nb2007012524
////// conference… didn’t touch… new form: …GPC (Conference)
-Delete by LC of Auth Rec prev. linked to LCSH bib heading
old: 150 |aPeddlers and peddling
001: sh 85099129
//////new forms: … Peddlers
… Peddling
-Delete by LC of Auth Rec prev. linked to LCSH bib heading
old: 150 |aFairy tales|vFilm and video adaptations
001: sh 92002219
//////new forms: … Fairy Tales|vFilm adaptations
… Fairy Tales|vTelevision adaptations
… some bib records are film and some are television; some could be both, so manual changes are needed
-Delete by LC of Auth Rec prev. linked to name/title bib heading
old: 100 1 |aTu, Jingyi
001: n 81021518
//////new forms: … Tu, Ching-i, |d 1935-
… Tu, Jingyi, |d 1941- …two different names & years
-Delete by LC of Auth Rec prev. linked to name/title bib heading
old: 100 1 |aParkinson, Brian
001: n 88609799
//////new forms: … Parkinson, Brian,|c PGCE
… Parkinson, Brian J. …investigation determined that we don’t need both records
-Delete by LC of Auth Rec prev. linked to name/title bib heading
old: 100 1 |aSakamoto, Takeshi
001: nr 90027042 …
///// this record was deleted from the OCLC authority files, and no replacement form was given. A browse-by-name search comes up with three possible choices. Manual review is needed to complete the process.
1 Sakamoto, Takeshi [100]
2 Sakamoto, Takeshi, ǂd 1899-1974 [100]
3 Sakamoto, Takeshi, ǂd 1925- [100]
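The examples above share a pattern that a first-pass triage could at least sort by: how many replacement forms the report supplies. A rough Python sketch, assuming a hypothetical per-entry structure (LTI’s actual file format is different): entries with zero or multiple new forms always need manual review, and even a single-replacement entry may still be rejected on review (as with the GPC conference heading, under the rule of specificity).

```python
# Hypothetical triage of Deletes Report entries. The per-entry structure
# here is an assumption for illustration, not LTI's actual file format.

def triage(entry):
    """entry: {'old': deleted heading, 'new': list of replacement forms}."""
    n = len(entry["new"])
    if n == 0:
        # e.g. Sakamoto, Takeshi: no replacement given; a browse search
        # may turn up several candidate records
        return "manual: no replacement form provided"
    if n == 1:
        # still subject to review, e.g. the rule of specificity kept the
        # year/city-specific GPC conference heading
        return "candidate for automatic replacement"
    # e.g. Peddlers and peddling, or the two Tu records: each bib must
    # be matched to the correct new heading by hand
    return "manual: %d replacement forms, match each bib by hand" % n

# Entries drawn from the report excerpts above:
report = [
    {"old": "GPC (Conference) (2007 : Paris, France)", "new": ["GPC (Conference)"]},
    {"old": "Peddlers and peddling", "new": ["Peddlers", "Peddling"]},
    {"old": "Sakamoto, Takeshi", "new": []},
]
results = [triage(e) for e in report]
```

Note that the “candidate” bucket is the best case; in every other bucket a person has to open records, and even candidates can fail review, which is the memo’s point about automation.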
A couple of examples, similar to what Laura sent, where I can’t envision how manual review wouldn’t be necessary:
The old “100” heading is now authorized in auth rec #no2010182442
Old: Clarke, J.S.
New: Clarke, J.S.|q(John Stuart)
N82026495
Step 1: check EUCLID to see what topic Clarke, J.S. writes about = 14 books on ground water
Step 2: look at N82026495 and see what that Clarke, J.S. writes about
Step 3: If the Clarke in N82026495 writes about ground water, then replace N82026495 with the new version, and there is no need to look at or bring in #no2010182442.
Step 4: If the Clarke in N82026495 doesn’t write about ground water, then look at #no2010182442 to see if that person writes about ground water; if yes, then overlay N82026495 with #no2010182442.
The above is the basic procedure. Sometimes more is involved: it may be an entirely different Clarke, J. S. that we want; or, for example, some of the 14 books are about ground water and the others are on another topic, so we need BOTH headings. Then we have to manually figure out which heading goes with each of the 14 bibs.
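The Steps 1-4 decision above can be sketched as a small function, assuming (hypothetically) that we can summarize the subject areas linked to each candidate authority record; the names and return strings are illustrative only, not an actual EUCLID query.

```python
# A sketch of the Steps 1-4 decision procedure for a deleted/revised
# name heading. All names and return strings are illustrative.

def pick_heading(our_topic, old_rec_topics, other_rec_topics):
    """our_topic: what our bibs are about (Step 1).
    old_rec_topics: subjects of the author in the revised old record (Step 2).
    other_rec_topics: subjects of the author in the other candidate record."""
    old_match = our_topic in old_rec_topics
    other_match = our_topic in other_rec_topics
    if old_match and other_match:
        # beyond Steps 3-4: both authors write on the topic, so each
        # bib must be examined by hand (the "BOTH headings" case)
        return "manual: both candidates match"
    if old_match:
        return "replace old record with its revised form"   # Step 3
    if other_match:
        return "overlay old record with the other record"   # Step 4
    return "manual: neither candidate matches (different author?)"
```

Even in this simplified form, two of the four outcomes end in manual work, which is why it is hard to see how this review could be automated away.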
Authority Update Processing (AUP):
Recommendation: The library should move to the current, comprehensive AUP service. Semi-annual processing could be retained, though, given the size of the database, quarterly runs should be considered.
Files of revised bibliographic records can then be imported to overlay the existing versions, thereby making needed changes with minimal staff intervention. The new and revised authority records should also be loaded; authority records no longer in LC should be deleted using the LCDEL file, if the ILS allows. I’d like more detail about these two sentences of this paragraph, and to discuss their pros and cons. Otherwise, deleting authority records will require individual searching and deletion based on LCCN. This is related to what Laura T. sent you; we don’t see how this would work. Even if the no-longer-valid headings were automatically deleted, someone would still need to look at each one and figure out why it was deleted and what the new form(s) should be.
I’d also like us to ask her about what level of service other libraries of our size use.