
2018 AskQC office hour member Q&A

Review all AskQC office hour member questions from 2018.

January 2018: 260 to 264 conversion

Topic-specific questions

Will there be a Connexion macro for updating 260s in bib records manually when upgrading them to RDA? (Is there one already?)

There is not a macro yet to update the 260 field when upgrading records to RDA. This is something that can be considered in the future.

If you change the 264 to two 264 fields and one has a copyright date, will you also update the 008 Date type to "t" and add the copyright date to 008 Date 2?

Records should already have this information coded in the 008.

Will distribution data in 260 fields be mapped to 264 _2 fields?

Yes.

Will this be only for $b eng in the 040, or will it be done for all languages in the 040?

At this time, we will begin with records cataloged with $b eng. It is possible that other languages of cataloging will go through the conversion process in the future.

Won't there be an issue with hybrid records that continue to be input?

No, since the records that are being targeted are all coded as RDA. Hybrid records will not be considered for this project.

Will 260 convert to 264 _1 even if material is manuscript (which might be unpublished)?

Yes, manuscript materials will be managed separately and coded accordingly.

Will there be any flagging of RDA records with 260s that cannot be updated automatically by the conversion project?

Yes. Records that cannot be changed automatically will be processed manually by WorldCat Metadata Quality staff.

When you get new bib records from uploading institutions that are not RDA, is there a macro to convert those 260s to 264s?

No. We will not be converting 260 to 264 when records are not coded as RDA.

How far back in years would you be converting 260 to 264?

As long as the record is coded as RDA and has a 260 field, it will be changed through this automated process.

For CJK records, will OCLC share some testing examples to review?

Yes. For Chinese, Japanese, and Korean (CJK) records we will be working closely with the OCLC CJK Users Group to review these records.

Have you seen a trend in the occurrence of 260 on new RDA records?

Yes. As RDA has become more widely used in WorldCat, there has also been an increase in the number of records that have a 260 field rather than a 264.

When using the workforms for new records, the 26x field doesn't fill in as 264. Can that be changed?

The 26x was done purposefully to allow libraries to follow whatever content cataloging standard they want: continue using 260 if they follow AACR2, or 264 if they follow RDA. At this time there are no plans to change the workforms, since OCLC Connexion is no longer being enhanced. You could use a constant data record as a workaround for the workform.

Are there any other 26X projects under consideration, e.g., recording of place of publication in the 008 based on recognized 26X data?

There is a possibility of this as a future project. There are some complexities, though: a city may have the same name in various countries, so changing the fixed field can become tricky.

For 264 fields, when is it appropriate to have the original copyright date in the second 264 field instead of the current copyright date? I have seen records with the current date, some with the current and original, and others with the current copyright date and a 500 field listing the original copyright.

You will have to look at the LC-PCC PS for this, but past practice would have you ignore the renewal copyright dates.

General questions

Merging/Reporting duplicates

Why is it that reporting duplicates takes so long? What is the ETA?

It does take some time to process due to the number of duplicate requests we receive. Currently our backlog goes back 4 to 5 months, depending on the format.

Speaking of duplicate records... any plans to delete multiple very minimal vendor records?

We currently run macros to merge duplicate vendor records.

We used to get an automated message when duplicates were reported. Can this be reinstated for catalogers' record keeping?

If you use the OCLC Connexion "Report Error" function, it will send you an email with your request, and you can use that to track the requests you have sent.

I have noticed a merging of OCLC records where the pub. and/or copyright dates do not match under the WorldCat record they are merged into. Can you explain this?

We would need the OCLC numbers of the records that were merged in this scenario to investigate whether an incorrect merge occurred. The DDR software that merges records follows a set of rules, and each incorrect merge needs to be looked at on a case-by-case basis. Please report incorrect merges to bibchange@oclc.org.

What's the current advice on editing or deleting FAST headings when making a change to the LCSH heading they are derived from?

A monthly process monitors additions or changes to LCSH and makes applicable changes to FAST headings. Because of this, catalogers do not need to edit FAST headings when they change LCSH. If a cataloger would like to change the FAST headings, this is okay, and the monthly process will look at those changes, updating or correcting the FAST headings as necessary. However, with cataloger-entered changes, no attempt will be made to synchronize the LCSH and FAST headings. This message was originally sent to the PCCLIST on 30 May 2017.

Any possibility of a special form for reporting duplicates, with boxes for the needed fields, that would make the process easier?

Yes, we have a web form similar to what you are asking. You can access it here: https://www.oclc.org/forms/record-quality.en.html

When using the report record function in OCLC, should we include both OCNs?

When using the Connexion "Report Error" function, the record you have on display is sent to us automatically, so all you need to include is the duplicate OCLC record number.

BFM

I've occasionally reported a need for batch correction and control of a heading (e.g., 50+ bib records). Is this a reasonable request? Are there conditions that make such requests unreasonable?

These are allowable requests as the WorldCat Metadata Quality staff uses macros to help update headings.

6xx fields

I have noticed in a lot of records for fiction works that there are genre headings in 650 fields. I thought they were supposed to go in the 655 field. These headings are LCSH being used as genre headings. Have I missed some communication that says it's OK to put LCSH used as genre headings in the 650 field?

We recommend you follow LC’s practice. LC has not terminated the genre use of LCSH in 650 fields. You can use both the still-correct LCSH genre term and an LCGFT 655.

I've noticed a lot of records for adult books with 650 _1 or 651 _1 subject headings. Do you know why this is happening?

This is due to the new data sync process, where subject headings from different thesauri are transferred to records if they don't already exist there. We are aware of this issue and are working on getting it resolved.

What if we had a text file of OCLC numbers and then sent that and asked, can genre heading X be added to all these records? Is that an OK bib change request?

You can send this type of request, but we will have to manually verify each record to make sure that the genre heading is appropriate to add.

What's the current advice on editing or deleting FAST headings when making a change to the LCSH heading they are derived from?

If changes have been made to the LCSH headings in a record, you do not need to do anything with FAST. The monthly FAST process will automatically update the headings based on whatever LCSH the record now has.

In addition to controlling LCGFT genre terms, does OCLC have any plan to control AAT for art genre terms which are quite widely used worldwide?

At this time, we do not have plans to implement the AAT vocabulary for controlling.

We noticed that if one subject heading doesn't match an authority record, then no FAST is generated even if the other headings are established. Is there a reason why FAST isn't added for the ones that match something in the authority file?

Stay tuned in this area as well: FAST will be changing some of these mechanisms in the future.

RDA

I've noticed hundreds of RDA-coded vendor records with 260 and no 33X. Will there be an attempt later to also add the 33X?

We are constantly adding 33x fields to records in WorldCat. These records will eventually be targeted for conversion.

Is there a deadline for no longer cataloging new records using AACR2 standards?

No. WorldCat will continue accepting AACR2 records, and we are not requiring libraries to switch to RDA.

Dates

If there are 2 dates in a resource, one a publication date and one a copyright date, AND they are the same, how are the fixed fields for Date Type and Dates to be coded?

You can code the date type as “s” for single date and just have the one date for publication or you can code the date type as “t” for publication and copyright date and include both dates even if they are the same (e.g. 2017, 2017).
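
To make the two options concrete, here is a minimal Python sketch of the fixed field coding just described; the function name is illustrative, and the layout follows MARC 21 008/06 (Date type), 008/07-10 (Date 1), and 008/11-14 (Date 2).

    # Sketch of the two codings described above (MARC 21 008/06-14).
    def code_dates(pub_date, copyright_date=None):
        """Return 008/06-14: Date type, Date 1, Date 2."""
        if copyright_date is None:
            return "s" + pub_date + "    "      # single date; Date 2 left blank
        return "t" + pub_date + copyright_date  # publication and copyright date

    print(code_dates("2017", None))    # s2017 followed by four blanks
    print(code_dates("2017", "2017"))  # t20172017, both dates even when identical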

Class numbers

What is the issue with duplication of 084 field copying the 082 field? I have seen this on some DLC records.

The 084 field may be transferring when records are merged automatically by the Duplicate Detection and Resolution (DDR) software. We have been correcting these records by removing the 084 field when we come across them. You can report them as well to bibchange@oclc.org.

Connexion Software

Do you have a timeline for the new metadata editor that will replace Connexion?

At this time, we do not know when Connexion is going away. The new metadata editor is Record Manager, and if you have a cataloging subscription you can begin using it at your convenience.

WorldCat Metadata Quality

What kind of coordination between Batchload and QC happens to try to cut back on errors coming through Batchload records?

The WorldCat Metadata Quality staff and Data Ingest specialists interact constantly. Whenever we receive requests for corrections to records that may have been loaded through data sync, we make sure to communicate these issues to the Data Ingest specialists.

Can QC search WorldCat for values that are not indexed? For example, can we ask you to do a query on the 250?

Yes, we do have tools that can search for fields that WorldCat does not index.

Identifiers

Many in the community have been eagerly awaiting integration of ISNIs in 024 fields in the authority file. Now that LC has announced there is not an active project to do so in the NACO file, would OCLC consider a project within the authority file it hosts?

OCLC has a copy of the Name Authority File (NAF), but the file belongs to the Library of Congress (LC). OCLC cannot integrate ISNIs into name authority records on its own and must reach a resolution with the Library of Congress and PCC members to make changes like these to name authority records.

Is there any internal discussion to integrate VIAF within the new metadata editor as another authority resource?

At this time, no, but we can bring this idea to the VIAF team at OCLC.

Will $0 be implemented on those fields that are "controlled"? Or will it be suppressed from bib records coming in via Batchload or ingest?

We are currently developing practices and policies that will be used with $0 and $1 so please stay tuned!

February 2018: Cataloging defensively with edition statements

Topic-specific questions

If a book lacks a first edition statement, but is otherwise identical, should you use the record?

Bibliographically speaking, a first edition statement is essentially ignored in the cataloging that we do every day. The lack of a first edition statement and the presence of one are equal as far as determining whether to input a new record. If a bibliographic record says first edition but your resource does not, you can edit the record locally to remove that statement if you have that option; however, a new record should not be created.

Why is a "Book club edition" statement in field 250 not a good justification for a new record even if that is the only difference?

Long-standing practice as outlined in When to Input a New Record says not to create a new record if "Book club edition" is the only difference. There will be cases where a book club edition has other differences (paging, size, etc.) that have an impact on the content of the item, so that page for page it is not the same. But in many cases, book club editions are just a cheaper binding, and page for page the content really is the same. That was the original decision behind putting that criterion into When to Input a New Record: discount book club editions.

I have a Spanish language book with several edition statements. The only difference between the statements is the month of publication. Are these "real" edition statements, or just printings?

This is what we have come to refer to, incorrectly or too narrowly, as the Romance language problem. In certain languages, such as Spanish, Italian, and German, what look to be edition statements are actually printing statements, often associated with the number of copies printed. These should generally be ignored and not even transcribed, although practices have differed over the years. They are not real edition statements, just printing statements, and should not be a factor in whether you input a new record.

Is it preferable to have two separate 250 fields, as in your example "3rd ed." and "Pew ed. with readings", rather than recording all the information in one field, for example "3rd ed., Pew ed., with readings"?

That will depend on the resource. There are some cases, particularly with scores, involving what used to be separate edition statements versus what were called, under AACR2, musical presentation statements. Where the edition statement and the statement that was in the 254 field are grammatically and intellectually separate, those would legitimately be separate 250 fields. It is a matter of judgment. Field 250 was made repeatable just a few years ago to accommodate multiple edition statements; previously we were obligated to record them in a single 250 field, separating one edition statement from another with a comma. That no longer must be the case.

I catalog a great many state government publications. It is not uncommon to have two or more versions of a document published in the same year. The physical descriptions are identical and there are no edition statements. The only difference between one and another is something textual, such as an updated directory of contacts or the updated text of a law. Over the years, I have been frustrated by having some of my document records merged because I described the textual differences only in a note field. Thank you for your suggestions for devising edition statements.

If you come across records that have been merged incorrectly, please report them to us. Depending on how long ago they were merged, they can often be recovered. In cases where we can recover an incorrect merge, we or you can often supply something that will help differentiate the records in the future so that DDR won't incorrectly merge them again. In government documents, it is very common to have multiple documents published in the same year. If you can identify a particular month or date in addition to the year and include that information in a supplied edition statement, that is one option to prevent the records from being merged. DDR tries to look for certain quoted information, either specific publication numbers or serial numbers, that may appear in quoted 500 notes.

If a 250 field is associated with a subfield $3, does that affect the field's value for distinguishing one version from another?

Subfield $3 in a 250 field is not taken into consideration in DDR.

How does OCLC handle a record submitted with one language in the 008 field and a different language in the 250 field?

If the error is reported to us, we will fix it. There is no automatic mechanism to alert us about something like that.

Are galley copies considered different editions?

Yes, galley copies are prepublication versions and therefore different editions. So, if you have a galley version, you can include that in a 250 field and DDR will not merge it with the published version or other similar things. For many of the books we pick up at ALA that are advance reader's copies, you can use "Advance reader's copy" in a supplied edition statement, or if that is what it says on the item, you can transcribe it in the edition statement.

General questions

Notes fields

Is the 502 Dissertation Note field solely used for unpublished dissertations? Can this field also be used for published dissertations, or should we use field 500 for published dissertations?

In the United States, field 502 should be used only for the actual unpublished dissertation. References to the fact that something is an adaptation of a dissertation, or along those lines, would be recorded in a 500 note. DDR does try to make the distinction between the published and unpublished versions of a dissertation, including looking for the presence of field 502 in addition to other elements.

The definition of field 501 says "A note indicating that more than one bibliographical work is contained in the physical item at the time of publication, release, issue, or execution. Use field 501 only when the physical item is not being described as a unit. The works that are contained in the item usually have distinctive titles and lack a collective title. Field 501 usually begins with the designation Issued with, etc.". Does it mean that the different parts of the physical item are published at the same time, or can they be published at different times? Can we still use field 501 even if the different parts of the item (bound in one volume) are published at different times?

The 501 field (With note) is generally not used in RDA. It should be used properly only when the physical item is published as that conglomeration of things, not for things that your institution has bound together after their individual publications.

BIBFRAME

I've been hearing that when BIBFRAME is eventually implemented that MARC will become obsolete. Is that true and if so, will any aspects of MARC still be valid in the future?

We cannot say anything definitively; however, there will be no single day on which we map all of our MARC data to BIBFRAME and everyone switches over to use BIBFRAME exclusively. There will be a long period where data is created in one format and needs to be mapped to the other. We will probably see a lot of back and forth, moving data around from one format to another. Even though MARC may be on its way out in the long run, it will probably be in use for a good number of years.

Series statements

Recently I have seen a lot of Library of Congress bibliographic records with series statements that have no authority records to match them. Indeed, sometimes I see different forms of the series in an 800 and an 830 field. Is there any policy that series must have a matching traced series statement to be added to an LC record, or does it depend on if it is a PCC record or not?

The Library of Congress made a decision a number of years ago, widely known as the "LC Series Decision," to no longer make series authority records as an individual library. PCC libraries outside of LC continue to make series authority records, but there is no requirement within PCC that a series authority record be made. The idea is that if you do have a traced series, meaning you are going to use it as an access point in an 8XX field, you do want to have a series authority record if you are a PCC library. LC just transcribes the series in a 490 field and does not add an access point in an 8XX field.

Dates

If all elements are identical, including ISBN, but dates are different (one year apart) should you use the record?

If it's what we have come to call a "trade publication" (a major publisher), you may want to use an existing record even if the date is a year off. That depends on the individual instance and your own judgment.

DDR

Does DDR also look at the Date type to differentiate a date such as 1999 03 vs. 1999 04 in the fixed field?

No. In theory, if that information is reflected elsewhere in a record such as the edition statement or possibly in a quoted note, DDR would pay attention to that.

If I enclose a date in quotes in a 500 field, would this prevent an improper merge?

We try to identify dates in quoted notes. There are lots of different ways to present a date (mmddyy, yyyymmdd, etc.) and which part of the date should come first, second, or third. We try to interpret and parse those different methods of quoting dates in 500 quoted notes. In many cases that will prevent an improper merge; however, if that information is in a 250 edition statement it is more easily interpreted and seen as a differentiating factor that will prevent an improper merge.
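
OCLC has not published DDR's parsing logic, but the ambiguity described above is easy to demonstrate. The sketch below is a hypothetical Python illustration showing how one quoted digit string can be read as two different dates depending on the assumed layout.

    import re
    from datetime import datetime

    # Hypothetical format list; not OCLC's actual DDR logic.
    FORMATS = ["%Y%m%d", "%m%d%y", "%d%m%y", "%m/%d/%y", "%d/%m/%y"]

    def possible_dates(quoted_note: str) -> set[str]:
        """Collect every reading of a quoted date under the formats above."""
        digits = re.sub(r"[^\d/]", "", quoted_note)
        readings = set()
        for fmt in FORMATS:
            try:
                readings.add(datetime.strptime(digits, fmt).date().isoformat())
            except ValueError:
                pass
        return readings

    print(possible_dates('"030499"'))  # ambiguous: 1999-03-04 or 1999-04-03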

Why are there so many bibliographic records in OCLC with encoding level M when many of the records are the same or very similar?

Encoding level M indicates that the record was batchloaded without a human comparing it to other records in the database; instead, our algorithms try to find a duplicate and merge to it. Our algorithms are not perfect and don't have the advantage that we humans have of being able to interpret the information. It is a balancing act: we try to bring together records that should be together, but we also do our best to keep apart records that should be kept apart. If you find records that are duplicates, report them to us. If you find records that have been merged incorrectly, report those to us as well. We learn something from every incorrect merge, and many incorrect merges allow us to further fine-tune our bibliographic matching algorithms.

RDA

We've noticed that RDA fields, like the 33X fields, have not been added to most of the older serial records. Are there plans to add those to serial records?

Yes. When we started the RDA hybridization of records to add fields like the 33X fields, and other things like the spelling out of abbreviations in field 300, we started with Books because they are the easiest ones to deal with and then went on to other formats. Serials still need to be done. One of the factors we've considered is how many of the CONSER records will be affected, because the changes that we make are then transmitted in the file that we send to LC. We will continue to make headway on Serials and all of the other records that are in the database that don't have 33X fields.

Reporting incorrect merges

Who would I contact about incorrect merges? How do I contact that person?

Incorrect merges can be sent to bibchange@oclc.org.

Error messages

Why am I given an error on a particular field when I try to add my holdings and I did not add that field to the record, for instance the 084 field?

There is a way to change your validation level check in settings. You may have it set to where it is doing a full validation. That is something that you can minimize.

77x references

Any plans to create a process to delete 77X references to cancelled records? When cleaning up electronic records, I am deleting a lot of these. A related question, is OCLC able to search and attach to existing records for electronic titles vs. making duplicates when they are generating HathiTrust records?

The process that creates HathiTrust records is out of date and needs work. We have put together requirements for what needs to be done with that, but we do not have a timeline to when that will take place. We have certainly thought about a process to delete 77X references, but we don't have a mechanism in place that can go through the database and detect that those fields exist and that they are pointing to records that no longer exist. It is something that we will keep in mind and try to get something in place if it is possible to do so.

Does having a 776 field in the print record help prevent the HathiTrust duplicates?

No, not right now.

006 field

I am now finding an 006 coded "a e 000 1" in some records for CD audiobook sound recordings. This field for additional material characteristics appears to be coded for "Books", "Adult", "Fiction"; however, these CDs do not appear to have additional characteristics, and there is no CD-ROM or PDF added material. Should this 006 field be retained in bibliographic records for sound recordings when the resource cataloged is just a spoken word sound recording?

It would seem that the 006 field does not belong on those types of records. If you are unsure as to whether or not to remove those fields, you can send those to bibchange@oclc.org for further investigation.

Type of record

In Bibliographic Formats and Standards, the Type of Record (Leader/06) code for mixed materials (code 'p') says that "for made-up collections in which one form of material predominates, use the appropriate code for that predominate material". A search in WorldCat for archival materials only looks for that 'p' code. What about archival material that is purely textual, or any other single format?

In WorldShare and WorldCat Discovery, the Material Type "mix" does retrieve only records coded as Type of Record (Leader/06) "p", according to Searching WorldCat Indexes.

"When performing a command-line search in Connexion or an expert search in FirstSearch, WorldShare, or WorldCat Discovery" the Material Type search "mt:arc" (for Archival Material) should retrieve all records coded as "a" (Archival) in Type of Control (Leader/08) (according to Searching WorldCat Indexes. Note the difference between Type of Record (Leader/06) and Type of Control (Leader/08).

Although this additional fact is no longer accurately reflected in Indexing, as far as I can tell, the Connexion indexing of "mt:mix" should actually retrieve Type of Record (Leader/06) values p, t, d, and f. A few test searches in Connexion suggest that this is still true.

Type of Record (Leader/06) value "p" should be used for "Collections of materials in two or more forms that are usually related by virtue of having been accumulated by or about a person or body. … This category includes archival and manuscript collections of mixed forms of materials such as text, photographs, and sound recordings." Manuscript and archival collections that are primarily textual should be coded with Type of Record (Leader/06) value "t"; primarily cartographic collections should be coded "f"; and primarily notated music should be coded "d". For all of these sorts of mixed/archival material collections, Type of Control (Leader/08, Ctrl) should be coded with value "a".
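
The coding rules above reduce to a small lookup. This Python sketch (names are illustrative) pairs the Leader/06 value for the predominant form with the constant Leader/08 value "a" for archival control.

    # Leader/06 by predominant form, per the paragraph above.
    LEADER_06 = {
        "mixed forms": "p",     # two or more forms, none predominating
        "textual": "t",
        "cartographic": "f",
        "notated music": "d",
    }

    def leader_codes(predominant_form: str) -> tuple[str, str]:
        """Return (Type of Record, Type of Control) for an archival collection."""
        return LEADER_06[predominant_form], "a"  # Leader/08 is always "a" here

    print(leader_codes("textual"))  # ('t', 'a')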

Future office hours sessions

Will there be additional sessions like this webinar in the future? If so, where can I sign up?

We are planning to hold these on the last Wednesday of each month through the end of June this year. We will then evaluate and decide whether to continue beyond that point. You don't need to sign up anywhere; just use the login information and log in at 1:00 PM Eastern Standard Time. It will be the same login information that was used for today.

Will the recording and notes be added to the OCLC website? What about the login information with WebEx link and phone number to call in?

The recording will be posted for this session. We will announce this on OCLC-CAT when the recording is posted. We may also post some notes as well and will announce on OCLC-CAT when those are posted and where they're posted. Today's presentation will be added to the Cataloging Defensively page on the OCLC website as well. We can post the login information at the same place that we post the recording. This information will also be announced on OCLC-CAT and included in the Message of the Day in the Connexion login a few times before each session.

Are you looking for suggested topics for future sessions?

Yes, please send your ideas in to askqc@oclc.org.

March 2018: Processing change requests

Topic-specific questions

Processing change requests

If the change we are requesting is to reconcile a descriptive field with fixed fields - for example if fixed fields indicate illustrations and maps but the 300 field lacks this information, or vice versa - do we have to provide proof?

No, you do not need to provide proof for that. Whether or not to record illustrations can be considered cataloger's judgment. If the Fixed Field is coded for illustrations, we can add it to the 300 as needed.

Why wouldn't libraries just update things like the 300 fields themselves?

Sometimes this doesn't fit into their workflow, and it may be easier for them to just report it. Or it could be that the record is a PCC record and they are not able to make changes to it, and therefore they need to report it.

If I have a number of merge requests for maps, is it better to submit a batch of them to you, or just submit them as I find them?

It is your preference based on what fits best in your workflow. You can send multiple requests periodically or individually as you come across them.

When reporting a duplicate in Record Manager, besides the record numbers it asks the user to provide a "description" (required field). What sort of information is wanted in this box? For us, we're not a PCC library.

You can use this area to describe the error if you want to elaborate on some details, or just write the word "duplicate" to let us know what you are requesting.

Adding to that... why isn't there an "undo" button in Connexion?

We wish we had an answer to that, we would love to have an undo button as well.

For older items, dealing with the print version records, can "proof" be supplied by scans from HathiTrust, Internet Archive, or Google Books?

Yes. A lot of times we use Amazon, Google Books, HathiTrust, etc. as proof when processing change requests.

In the past, I have submitted change requests to bibchange@oclc.org via regular e-mail vs. Report Error in Connexion. Does one take priority over the other as far as "place in line"?

No, requests are processed first in, first out.

My correction of records in OCLC is sometimes blocked by anomalous coding in 007 fields, usually the wrong codes for space/blank. Could OCLC provide a way to delete the problem 007 coding (not the whole 007)?

We would need to see an example of that. You can always save these to your online save file and then send an email or report a change request. We can then go into your online save file and see the record and issue you are reporting.

Processing authority requests

Yesterday I made a change to an authority record and realized after I'd replaced it that I shouldn't have changed the record at all. Is it possible to intercept a record and revert it to the original form before it goes off for distribution?

No. Once the change is made, a NACO lock is placed on the record until it completes the distribution process. We would encourage institutions in this situation to wait until the record has completed the distribution cycle and make any corrections themselves. Optionally they could report this to us, but we cannot stop the distribution or make any changes to the record until the process completes either. One of the features being built for authorities in Record Manager, which will be released in the fall, will be the ability to make changes to an edited authority record before it is locked for distribution to LC.

Local practice questions

I've been noticing a significant number of additions or changes to records that have to do with local practice, and this is a change from what I've seen before. Is OCLC aware of this and what are you doing to address this?

An example was requested. Due to the number of records in WorldCat, we may or may not have seen the issue before. When an issue is reported, we look at more than just that one record; we look for patterns or any other changes, especially if they all came from one institution. We will look at that institution's other records to see if we shouldn't be cleaning up more than just what was reported. So, if your workflow allows it, please report those.

Example of local practice: Book with 100/240 fields as well as 700 field with the same info.

Yes, we are aware of this, this was brought up via OCLC-CAT as well, so we are in the process of looking into that.

Continuing on with the "local practice" info in the WorldCat record, is it permitted for users to remove 856 fields that are based on a proxy linked URL?

We routinely try to revise proxy URLs to a usable URL structure, and we have a macro that can change these. If they are able, users may also make this change themselves, or report the records so we can see whether the change should be made to other such records.
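
As an illustration of what revising a proxy URL to a usable URL structure can involve, here is a hedged Python sketch that strips an EZproxy-style login prefix; the proxy hostname is hypothetical, and real cleanup macros handle more URL patterns than this.

    import re

    def strip_proxy(url: str) -> str:
        """Strip an EZproxy-style prefix such as
        https://proxy.example.edu/login?url=<target> (hypothetical host)."""
        m = re.search(r"[?&]url=(https?://.+)$", url)
        return m.group(1) if m else url

    print(strip_proxy("https://proxy.example.edu/login?url=https://doi.org/10.1000/x"))
    # -> https://doi.org/10.1000/x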

General questions

Bracketed 250 fields in video records appear to be creating duplicates with language information that belongs in the 246 field. For the Lady Bird motion picture, #1020547485 (language English; Spanish subtitles; English SDH), two duplicates have a 250 [English dialogue only version/Spanish subtitled version] and two duplicates have [English dialogue only version].

Please report them and we’ll take a look at them. We will merge as appropriate.

The Playaway bib records have two 007 fields – one for sound recording and one for electronic device; however, there is no consistency in the order these two fields are entered in the bib records. As a result, in our discovery layer their format shows as sound recordings or as electronic resources, depending on which 007 field appears first in a given record. Are there any ‘rules’ about the order these fields should be entered in the bib records and is standardizing of these bib records something that can be done at OCLC level?

Standardizing the order of 007 fields is probably not a good idea. Sometimes the 007 relates to the main part of the resource and other times it relates to accompanying material, and there is no way to programmatically determine which is which. There are certainly other built-in problems with the 007 as far as MARC 21 is concerned, and distinguishing between the main item and accompanying material is one of them; there is no way to distinguish 007s in that respect. So it probably wouldn't be possible, or necessarily a good idea, to standardize the order of 007 fields. This sounds like something you need to address in your discovery system: it should determine the material type of your resource by some means other than looking at the first 007.

OCLC used to provide training for basic and advanced cataloging for all formats. That's how I learned much of what I know about cataloging. The training I now see available seems to be focused on using WMS. Is OCLC no longer providing cataloging training that is not specific to WMS? Other training options I find tend to charge which limits access.

We do provide training for Connexion and Record Manager covering the interface functionality. We don’t provide training on how to use RDA or how to use MARC. If you would like to email askqc@oclc.org we can put you in touch with our training team.

There are scores of records with New York Times in the 100 field. If one record is reported, will staff fix all occurrences of such a mistake?

Yes, we do. As mentioned earlier, if you come across an error and send it to bibchange@oclc.org or authfile@oclc.org we will look to see if there are other records affected. Also, it’s helpful if you notice a pattern to include that information in your request.

Is there a script or could a script be created that crawls the authority file and updates the name in name/title authority records to match the 1xx form of name? I frequently notice that someone has changed the personal name but hasn't bothered to update the rest of the file.

This is something that might need to be reported to LC or a NACO funnel; NACO institutions are supposed to update related name/title records when changing the 1xx form of a name. There may be many complications within this question, because there will be situations where headings were never controlled to the authority record and don't match the form in the 1xx on the authority record, nor any of the 4xx references on it. They are just a little bit different, so they will not get globally controlled on their own and will sit out there in a different form than the established heading unless it's called to our attention. We would then do the follow-up manually, perhaps with the assistance of macros, to clean up the headings. In other cases, the authority record may have changed and headings that were controlled didn't get completely changed. We need to be alerted to that situation so that we can do the follow-up and make sure that everything is in step with the authority file.

Update: Since the WorldCat authority file is a copy of the Library of Congress authority file, we would not run a script to crawl through and make changes. For non-NACO institutions, if you do see a situation where a name authority record (NAR) was updated but the corresponding name/title NARs were not updated, you may report these to authfile@oclc.org and staff will investigate further. For NACO institutions, please contact your NACO funnel or NACO directly for all NACO related issues.

Are there any restrictions on editing bib records in WorldCat with Encoding Level M (batch loaded)? It seems like I am not authorized to enhance those sometimes. I've been told those are "machine-loaded" and that is why they can't be edited. Is that correct?

No, you should be able to upgrade encoding level M records. You may run into an issue if another library has locked that record for editing, but otherwise, you should be able to edit these records depending on your authorization level. If you had a Search authorization you would not be able to make the edits, but with Full-level authorization or higher, you should be able to edit these records. If you came across a situation where you were not able to edit the record, send your request to bibchange@oclc.org and we will make the edits for you. This would also be a good opportunity to put the record in your online save file so we could take a look at it and see if there are reasons why that record is not editable for you.

My boss reported that she could not control access fields which contain a $0. What do you recommend in those cases - delete the $0 and control, or leave uncontrolled?

Subfield $0 by itself should not have an impact on your ability to control a heading. What I suspect in this case is that the record is not cataloged in English and you're trying to control a name to the LC/NACO authority file, in which case you wouldn't be able to do that. Typically, subfield $0s exist in records that come from outside the US, so it's not uncommon to see a record cataloged in German, or created with language of cataloging Dutch, that has a subfield $0 on every access point; for a record created in English you don't normally see a subfield $0. Even if the subfield $0 is there, you should be able to control the heading, and if you are successful, the subfield $0 will disappear.

Another possibility is that you are trying to control MeSH or one of the other subject vocabularies, which you cannot control in Connexion. You can control them through Record Manager, where you would see the typical blue link that you see with LCSH; headings such as medical subject headings controlled in Record Manager will display a subfield $0. You may also see a subfield $0 for FAST headings; those aren't controlled in Record Manager or in Connexion, but the subfield $0 is added as part of our processing when we enter them into the record.

So, if she encounters the issue with not being able to control with a $0 and it IS an English language record, should she report it?

Yes, that sort of thing should be reported so we can investigate and see what’s going on. It may be that there is a disconnect between the text and the authority record that you’re attempting to control to, so it’s not finding it. It could be any number of issues. If it looks like it should fit into the category that should routinely control, by sending it to us as an example we can investigate and get back to you.

On records for Chinese materials, we have noticed that we frequently see redundant 600 14 paired fields. For example: OCLC no. 11114810. This is on the record:

600 14 [Chinese vernacular]

600 10 Romanization.

600 14 [Chinese vernacular]

600 14 Romanization.

If you come across something like this, report it and we’ll look into it to see if there’s a bigger issue at hand with other records. Also, if you were to notice a bigger issue or pattern, do let us know.

Are new OCLC 10-digit numbers replacing older numbers? The Great War video, #1023729399: record entered 20051201, replaced 20180307. That was a tape loaded record. Another record for this title has the same scenario.

What you are probably seeing is a record that came in through our data sync process; the entered date is probably the date from the local catalog of the institution that sent the record. Sometimes those entered dates don't make sense, but that's where they originate.

I'm extremely new at cataloging. I'm looking at a book entitled Ghosts of Greenglass House by Kate Milford, and in the fixed field after Ctry the letters mau appear. I was unable to determine why mau was typed for country. Also, what does the space stand for when typing an LCCN in the search dialog box?

mau is the code for Massachusetts. If the city listed in the 260 or 264 subfield $a is a place in Massachusetts, that is the reason the country is coded mau.

Chat comment: check out https://www.oclc.org/bibformats/en/f...ield/ctry.html and scroll down to Codes. This explains how to code this field for items published in the U.S.

Chat comment: The LCCN structure is controlled by LC https://www.loc.gov/marc/lccn_structure.html - OCLC's search system takes that into consideration, so the space is part of the LCCN structure.
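
As a rough illustration of why the space does not affect matching, here is a Python sketch of LCCN normalization based on the LC structure document linked above; it is simplified (real normalization also handles suffixes and revision dates).

    def normalize_lccn(lccn: str) -> str:
        """Simplified LCCN normalization: drop spaces, zero-pad the serial number."""
        lccn = lccn.replace(" ", "")
        if "-" in lccn:
            head, tail = lccn.split("-", 1)
            lccn = head + tail.zfill(6)  # serial number is six digits
        return lccn

    print(normalize_lccn("n 79021164"))  # n79021164
    print(normalize_lccn("85-2"))        # 85000002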

Are there any plans to be able to control Art & Architecture Thesaurus terms?

There aren’t any plans to do that right now, we are working on creating a more robust authorities’ infrastructure that will allow adding new authority files in the future much, much easier. Jody DeRitter, Director of Metadata Frameworks, she was hired last August to look into that and she’s also working on moving FAST into production as well as moving VIAF out of Research and into production. So, there are no current plans for AAT at the moment but stay tuned as all of the authorities’ infrastructure beefs up.

I have found many 007 fields for DVDs with byte 4 displaying a "g" (laserdisc) instead of a "v" (DVD). I doubt in some cases the "g" is correct, but don't want to assume anything. Our ILS will display the laserdisc icon in our OPAC if that byte is a 'g". It will display the DVD icon in our OPAC if it is a "v". I do not change the WorldCat record, but I do use the record for our local database and make the change to a "v" in the 007 field.

A little bit of history: the value v for DVDs was a later addition to the MARC format. After DVDs began being published, it took months, possibly even years, before code v was defined in MARC 21 and then validated. So there are probably records that are incorrectly coded in the 007. If you have evidence in the record that it's a DVD and not an earlier laserdisc technology, it's perfectly proper to change it yourself and replace the WorldCat record. Otherwise, you can report it to us and we'll take care of it.

I’ve never quite understood the rationale for locking records. This keeps other libraries from editing and replacing the record and therefore having to update in their local ILS which is more difficult in a lot of cases. Is this function really necessary?

Yes, we do allow libraries to lock records if they are upgrading the records, also if they are doing the NACO work necessary for upgrading the records. Locking a record may also be used for staff training.

#879468616 is an example of several records in the database with 505s that have summary information duplicating the 520s. Many of them came from changes contributed by New Zealand libraries in the middle of last year which I reported to bibchange at the time. For the specified record, OCLCQ appears twice in the 040 after NZHSD so I'm wondering...does QC look over the whole record when making changes?

Yes, but it depends. The first OCLCQ could have been added because the record was in a group of records a macro ran on, and the second OCLCQ could have been for removing the incorrect field, after which a batch or ingest process reintroduced the error into the record. There is a variety of reasons a library would send us updates to a record, so if we take an error out once, it's possible that it will show up again if the library resends us the record. But please do report those; we try to work with the specific institution to minimize errors that keep coming back, and reporting lets us know it's happening.

Any progress on where duplicate 264 _4 fields with a bad Unicode character of some type (like question mark in diamond), alongside one with the proper copyright symbol, are coming from, or a fix?

We are continually trying to clean those records up as they come in. That is a known issue that is being looked at, but we have no specific date for a fix, so we are trying to stay on top of it until we do get a fix in. We recognize that it is very frustrating to a cataloger; you are more than welcome to take them out as well, but it is a known issue we are working on.

I'm noticing a "bl" in the 010 field that sometimes precedes the LCCN, but other times the number is NOT an LCCN. What does the "bl" represent? British Library? And why is the "bl" present in front of an actual LCCN?

That is another thing we continually clean up; those are coming in through batch load. We don't know what the bl stands for, but we are aware of it and try to stay on top of it, cleaning those up as they come in.

Who can establish the heading Indigenous peoples -- France -- Colonies in OCLC for us to use? Who establishes headings needed to enable subject strings to control properly? Indigenous peoples -- Great Britain -- Colonies is a valid heading, but when I try to control Indigenous peoples -- France -- Colonies it flips to Indigenous peoples -- Colonies -- France.

This particular combination of place with Colonies is an issue within the heading control software for subjects. There is a portion of that process that looks at subdivisions, determines whether they can be geographically subdivided, and then moves the geographics to that spot. In the case of these headings with Colonies, where you can further subdivide by a continent like Africa or Asia, we have two geographics that are separated by a topical subdivision, and the software doesn't get them into the correct order. We've known about this for a while and have a ticket in place with our developers to get it fixed, but that has not happened yet. So we've generally looked the other way on these, looking forward to the time when, once it's adjusted, we'll be able to go back through and control them correctly. In the meantime, you ought to be able to add such a heading to a record, leave it uncontrolled, and make use of it that way.

But if you leave a heading like this uncontrolled won't OCLC automated processes come through and (incorrectly) control it later on?

Absolutely, which is why we don’t bother to correct them because they will be flipped back to the incorrect form. So, we’re still really waiting on the software to be fixed before we go through and globally fix these.

Today I learned that I can ask OCLC QC Staff to look at records in our institution-specific online save file. In what types of situations would that be appropriate?

That would be appropriate when we are asked by an institution to look at a particular record. We don't go into institution-specific online save files on a regular basis; we only do that when asked. It's a tool we have to help institutions, and we use it for that purpose. An institution may have made changes to a record and want us to look it over, have a question about it, or have a specific situation with a record; we can help them with that. We've also run into issues where someone has tried to replace the WorldCat record and is unable to do so, but is able to save it to their online save file; we can then go in and try to recreate the problem with the actual record and the edits they made.

April 2018: URLs in a shared cataloging environment

Topic-specific questions

There are 856 fields with URLs representing the table of contents that have a second indicator of 1 (856 41 $3 TOC $u [URL]), and there are 856 fields with URLs representing the cover art that have a second indicator of 0 (856 40 $3 Cover image $u [URL]). What is the preferred method for coding the second indicator in fields 856 and 956 for URLs representing tables of contents and URLs representing cover images? Most of the 856 fields with TOC in subfield $3 are not coded with second indicator 0 in electronic records.

It depends on what type of record the 856 field is on. If these URLs are on a record for an electronic resource, then both the 856 field representing the table of contents and the 856 field representing the cover image should have second indicator 0, because those URLs represent parts of the resource itself. Second indicator 0 should be used for URLs that represent the entire resource itself or a portion of it.
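
A minimal Python sketch of the rule just stated, with illustrative names: on a record for an electronic resource, any URL for the resource or a portion of it takes second indicator 0; the second indicator 1 branch mirrors the 856 41 TOC example in the question.

    def second_indicator(record_is_electronic: bool, url_is_resource_or_part: bool) -> str:
        """Pick the 856 second indicator per the rule described above."""
        if record_is_electronic and url_is_resource_or_part:
            return "0"  # the resource itself, or a portion of it (TOC, cover)
        return "1"      # a version of the resource, as in the 856 41 TOC example

    print(second_indicator(True, True))  # '0' for a TOC URL on an e-book record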

If we find institution specific URLs in the WorldCat record, should we delete them or report them?

It depends. If an institution-specific URL represents the same link as another URL in the record, then you may delete that institution-specific URL or report it to bibchange@oclc.org. For example, if the institution-specific URL was a Wiley URL but a non-institution-specific Wiley URL already existed in the WorldCat record, then you would delete the institution-specific URL. If an institution-specific URL is unique, then you would convert it to a non-institution-specific URL instead of deleting it. If you are unsure, please report the URL to bibchange@oclc.org and staff will decide whether to delete or convert it.

OCLC used to have a project called something like Econtent Synchronization, where it created bibliographic records for HathiTrust and Google Books titles that are not accessible to anyone due to copyright restrictions. Will those be retained, since the 856 URLs do not point to an accessible resource?

Yes, they will be retained. Some of the links to HathiTrust and Google Books records are freely accessible. Others are only searchable, or not accessible yet due to copyright restrictions, but that could change at a later date. If we deleted the URLs that are not accessible, we would need to go back and repopulate those records with the appropriate URLs after they become accessible, so HathiTrust and Google Books URLs that are searchable only or not accessible are okay to leave on the WorldCat records.

How can a provider be considered materials specified?

Materials specified is the general caption for what subfield $3 represents. It is used both for specifying the specific material and for differentiating between providers in a provider-neutral record. OCLC discussed whether subfield $3 covers the name of a provider many years ago, at the outset of provider-neutral cataloging. While it is a stretch of the definition, there was no other place to conveniently indicate which URL belonged to which provider. After making use of subfield $3 in that way, there are now millions of records with provider names in subfield $3, so usage has since dictated the change and shift in the definition.


What is the appropriate use of $y in 856?

The subfield $y is defined as "the text that is used for display in place of the URI in subfield $u." It turns the text in subfield $y into a clickable link. For example, the phrase "Click here" could be used in subfield $y so that the URI no longer displays in the library's catalog; instead, the phrase "Click here" is displayed as a clickable link. The URI can also be masked using subfield $3 and subfield $z, so subfield $y is not always necessary if the text from subfield $3 or subfield $z is being used by the library's catalog. Years ago, subfield $y was used with the phrase "Click here" fairly frequently, but later on it was not used as much. What works for one library doesn't necessarily work for another library that would prefer different phrasing. Because of this, subfield $y is not used all that often anymore.

Currently, subfield $y is used about 50 million times in WorldCat. The most common use is "View online". If you are interested in seeing how this field is used, click on this link: http://experimental.worldcat.org/marcusage/2018-01-85. Note that this list is a ".txt" file and it's pretty large, so it may take time to download.
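
A small Python sketch of the display behavior described, under the assumption of a catalog that prefers $y, then falls back to $3 or $z, then the raw URI; the data structure is illustrative.

    def link_text(subfields: dict[str, str]) -> str:
        """Choose display text for an 856: $y, else $3, else $z, else the URI."""
        for code in ("y", "3", "z"):
            if code in subfields:
                return subfields[code]
        return subfields["u"]

    field_856 = {"u": "https://example.org/ebook", "y": "Click here"}
    print(link_text(field_856))  # 'Click here' becomes the clickable link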

So, if the only URL we have is a local one, we don't add an 856. Will this still facet as an eBook or the like in WorldCat Local?

Currently, if a bibliographic record does not have an 856 field, the eBook icon will not appear in any of OCLC's interfaces: WorldCat.org, WorldCat Local, Discovery, or Record Manager. For the eBook icon to show, the record currently needs to have an 856 field. This is a known issue, and OCLC staff are currently working to resolve it. In the next few months, we hope for a resolution that will change how the eBook icon is generated, removing the requirement for field 856. OCLC will send out announcements when this change takes place.

Should the 856 field be deleted from the local record if the electronic resource is not available in the local library?

This would depend on the local practice of your institution. You may delete it from the local copy of the record, but it should remain on the WorldCat record.

Do we have any idea what percentage of PURLs are broken?

We do not know what percentage of PURLs are broken. We know that the way that OCLC PURLs were handled changed over time, but we don't have any way of knowing the percentage of broken PURLs. Please report all broken OCLC PURLs to bibchange@oclc.org.

General questions


Can you briefly explain the 776 field in a record? If someone copy catalogs, does this link go anywhere?

The 776 field is used to link between two different versions of a record. For example, it would be used to link an electronic version record with a print version record: the two records have the same title and the same publisher, and everything is identical except for the version (print versus online). The 776 field on the print version record points to the online version record, and the 776 field on the online version record points to the print version record. This links the records together in the library's online catalog.

Whether or not you choose to delete the 776 field from your local record is up to your institution. However, the 776 fields should remain in the WorldCat record.

Does OCLC have a policy about local printouts of websites? For example, patrons here print out websites and donate them. I tend to want to create an electronic resource record in WorldCat for the online resource and just add the printouts in the local ILS. However, when the online resource goes away, should I create a new record for the printout?

Yes. You may catalog the electronic resource in WorldCat and mention the printouts in local fields. It is also okay to create a record representing the printout itself, as opposed to the online resource; that record would not be considered a duplicate. Whether you attach your holdings to the record for the online resource and treat the printout as a copy, or create two records, is up to you locally, but it is possible to have the two records.

I recently made a Name Authority Record that affects a whole lot of bib records. This calls for bibliographic file maintenance (BFM). Whom do I notify about that?

You may email any needed BFM to Metadata Quality staff at bibchange@oclc.org. Please include the authority record number (ARN) or Library of Congress control number (LCCN) representing the added or updated authority record and Metadata Quality staff will add your request to their workflow for processing.

May 2018: Bibliographic record validation

Topic-specific questions

How do you turn on 006 mnemonics?

In the Connexion client, the only way to do that is through guided entry. If you have an 006 field in the record, right-click on it and the guided entry box will come up; it is also available from the drop-down menus at the top. You can then view and edit the existing field using the fixed-field mnemonics.

How is record validation different for batch loaded records?

It really is the same. It used to be that our Batchload processing had a separate set of validation rules, but over time we have come to use the very same set of rules so that we don't have to maintain multiple sets. The difference comes in how we deal with the errors that are spotted in the records. In Batchload processing, an error level is assigned to various records and causes them to go through different kinds of processing; records do not have to be absolutely perfect to be added to the database, but we do detect the very same set of errors.

Why do I see records that have 6xx fields with second indicator 7 and no subfield 2? I have to clear this before I can validate these records in the Client.

That is a case where the record has most likely come through the Batchload process, because it is impossible to create a record like that through online input. We have a validation relationship in place between 6xx fields with second indicator 7 and subfield $2, but relationship errors are considered lower-level in Batchload, so those records can find their way into the database.
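
For comparison, a correctly coded field pairs second indicator 7 with a source code in subfield $2; the heading below is only illustrative:

  650 #7 $a Cooking $2 fast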

Can you give us any tips on how to identify bad characters?

This is a really tough thing to look for and find. This may be different in Record Manager, but in the Connexion client you could input the vertical bar character at different positions in the field to narrow down where the invalid character is. There may also be macros out there that can help find that spot.

Why is it that a record can fail validation, but I can still attach our holdings?

This is where you have options you can set so that setting holdings does not require full validation. A lot of libraries prefer to be able to set their holdings without necessarily fixing everything on a record. If you do an explicit validation command, you will get back full validation with all of the errors listed, but fixing those is not necessarily required in order to set a holding.

When exporting a record from OCLC, if it shows a validation error, you fix the error, and you then export the record with no errors, does that correction stay in the record in OCLC, or will the record show the same error the next time someone exports it?

In a case like this you need to replace the record if you are able to do so. If the error is on a PCC record and you only have a full-level authorization, you can report it so that we can fix it.

I have noticed that some records have 336-338 $b not compliant with $a (and with the rest of the metadata), when $2 = rda.

The issue here is that the code in subfield $b is out of step with the term in subfield $a. At this point we validate the term in $a and we validate the code in $b, but we do not validate the two against each other.
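
For instance, both of the hypothetical fields below would currently pass validation, even though in the first one the code belongs to still images rather than text:

  336 ## $a text $b sti $2 rdacontent   (term and code out of step)
  336 ## $a text $b txt $2 rdacontent   (term and code in agreement)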

I also see records that show 650_1 headings underlined as controlled.

This is an error where we have transferred the controlling from an LC heading in the past. If you see headings like that, report them so that we can investigate.

Our library used to use a single record for print and online resources, on the monograph form. We are a PCC library. We now create individual records. When we are adding a 776 for the online version, can/should we remove the electronic elements from the print version?

In part, there was a decision within CONSER in the past that addressed the issues caused by the print record carrying so much information about the electronic resource; the record should represent the print version and merely note the existence of the electronic one. We were including elements from the electronic version in the record that could be confusing when processing the record later. So, a decision was made to remove certain things, like the 006 and 007 relating to the electronic version, at least for serials initially; that conversation then carried over into monographs, where there was a PCC decision to handle it exactly the same way. You will still find records in the database that represent the print version, with form coded blank in the fixed field, but that carry a 006 or 007 with electronic information; those fields may come out.

Unless I am mistaken, the validation tool does not let me know when I forget to include the subfield b or e in the 040 field. Are there plans to include this in the validation process?

For 040 subfield $b, which is the language of cataloging, we have discussed making that a mandatory element, so that you would be required to input subfield $b when creating a new record. Subfield $e is harder to require because there is no other element to link it to: if you created an AACR2 bibliographic record, you would not have a subfield $e in the 040, and Desc in the fixed field would be coded as just "a". But Desc coded "i" without 040 subfield $e is also a valid combination, so subfield $e may be something you will always just have to remember; subfield $b may be something we build a relationship for in the future.
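
For orientation, here is a sketch of the two situations, using the hypothetical institution symbol ZZZ:

  RDA record:    040 ## $a ZZZ $b eng $e rda $c ZZZ
  AACR2 record:  040 ## $a ZZZ $b eng $c ZZZ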

So even though the Client is not being updated anymore as a tool, you can update the validation rules it uses?

Yes, that’s correct, since it is the same validation rules that is used for various services. When the MARC update is applied, we update the validation database. It is automatically updated for Connexion and Record Manager.

I have a found a number of RDA records that have a 264 #1 $c 2017 when the resource only has a copyright date. It's my understanding that the date should be in brackets in this case. These are recently published books so unlikely that there is a different publication which includes the date as a publication date. Am I misunderstanding how the date should be input in the 264 #1 field? If not, should I assume it's an error and correct it or assume it's not an error and create a new record?

In most cases a copyright date can be used in RDA to infer a date of publication if there isn't an explicit date of publication. In a 264 #1, that inferred date of publication would be bracketed. A subsequent 264 #4 could be input with only the $c, identified as a copyright date. So, if you find records without the brackets, that is an error, and it should be fixed or reported to OCLC. Do not input a new record.
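
A minimal sketch, with a hypothetical publisher:

  264 #1 $a [Place of publication not identified] : $b Example Press, $c [2017]
  264 #4 $c ©2017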

Why can't we enter both an ISSN in tag 022 and ISBNs in tag 020 in one and the same record, e.g., a comprehensive record for a yearbook?

This question has come up before, when our validation was still based on AACR2. We took the issue to CONSER and discussed what allowing both would mean, because a serial record could potentially carry all the ISBNs assigned to all the individual volumes. So this is really a constraint of MARC at this point, and it is a consideration in the decision to continue the past practice, which is to omit the ISBNs for the individual parts.

Is there a reason 020 $z requires a valid check digit for the ISBN? It makes it difficult to record information for books that present a number as an ISBN when that number is not actually a valid ISBN.

020 subfield $z should not require a valid check digit. In fact, if a number intended for 020 subfield $a has an incorrect check digit, that is precisely a case where you would put it in $z, in addition to the cases where the number is not appropriate for the item being described in the record.
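
As a worked example of the check digit itself: for ISBN-13, the first twelve digits are weighted alternately by 1 and 3, and the check digit is (10 - (sum mod 10)) mod 10. For the commonly cited example number 978-0-306-40615-?, the weighted sum is 93, so the check digit is 7. If an item misprinted that number with a final 0, the invalid form would belong in $z:

  020 ## $a 9780306406157
  020 ## $z 9780306406150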

Since the new data ingest software was implemented, I have had to do way more cleanup in the 6xx fields than I did previously. Could the software be a little fussier about bringing in duplicated headings and headings that have been controlled in error?

I agree, it could be fussier about bringing in those kinds of headings. There are some issues being resolved about the number of headings that transfer and the state they are in when they do transfer. We are also attempting to clean up problems that we know about, e.g., certain combinations of headings. We will try to go after them and get them out of the way.

Omitting the ISBNs for individual parts means you may miss the record if you search by ISBN. This has happened to me, and I almost made a new record because the appropriate ISBNs were absent from the record. I hope this decision will be changed. Making a decision like this because a few records might have 500 ISBNs seems unhelpful.

The flip side would be the person who wants to catalog an individual volume in a series and who would then complain to us, "I searched this ISBN expecting to get the one monograph record, but I am always also retrieving the serial, and I don't want that." It is a really difficult position: including the ISBNs can be extremely useful in some cases, but not in others.

Will there be validation for other languages in $b of the 040?

We have validation in place for all languages of cataloging.

With some Arabic records, we see problems with the "dot below" diacritic combined with s, t, d, or z (precomposed characters). This causes the "transliterate" macro to fail, as well as the ability to control the field in 1XX or 7XX. They pass validation, though!

They pass validation because all of those characters are now valid with OCLC's implementation of Unicode. The precomposed characters that have the diacritic combined with the letter are somewhat problematic when using macros because the macro language is not Unicode compliant. There are issues with any macros that are written, including the transliterate macro. We are thinking about possible solutions but do not have any definite plans yet.

I sometimes come across English-language records which have non-English subject headings. Should those subject headings be deleted from the WorldCat (English) record, or should they be left in place?

The language of cataloging code in 040 subfield $b applies to the descriptive cataloging, not to the subject cataloging. So 6xx subject headings can be in any language as long as they are coded correctly.

Speaking about subject headings: it seems like OCLC is now running some kind of validation process that, when run, puts the subject headings in order by tag number: 600s, then 610s, 650s, and so on. When that is done, the most relevant headings are no longer necessarily among the first headings. Can this process be changed so that it does not reorder the headings?

Yes, we would like that process to change. It is the result of records being built in the data ingest process (batchloading); there was also an issue at one point in Record Manager where headings were sorted into tag order rather than keeping the most relevant heading first.

Does OCLC have guidance for print-on-demand publications? Specifically, printers who print HathiTrust materials and bind the pages and sell them as reprints. These items have no dates other than the original and also often do not mention the printer/publisher. Should I just put the data about the printer in the 037? Thanks.

Treat the item according to the print-on-demand and photocopy provider-neutral guidelines, which you can find on the PCC website. We will be including information about provider-neutral cataloging for print-on-demand publications in an update to Bibliographic Formats and Standards, but otherwise you can search the PCC website and find the guidelines. There will be one record for the reproduction. It would have a 533 field indicating that it is a reproduction in print, and it would not include publication details specific to the provider.

Sometimes validation finds an error that could simply be corrected by the system itself; for example, in the authority format, "name" may be coded "a" when the 1XX field is coded 110 (and therefore the code should be "n"). Couldn't the system simply fix the problem rather than giving a validation error message?

Validation was designed solely to report errors. In a case like this, when you have a mismatch between two elements, the question is "which one is really wrong?" It may be that the 110 is correct and "name" should be coded "n", or it could be the reverse: the heading coded 110 should really be a 100, and "name" is correctly coded "a".

Where are we with expanding the number of institutions allowed to do bib record merges? I know a 2nd cohort has at least been identified.

Yes, there is a second cohort that started last year, and they're going great guns. We are planning on starting a third group sometime this summer, or at least later this year. That group is still being formed, and we are very excited about moving forward with it.

It used to be that if one subject heading on a record didn't have a match in the LCNAF or LCSH, no FAST headings at all were added to the record. We're now noticing that headings that match LCNAF/LCSH get FAST headings, while those that don't do not. Was this a deliberate change? If so, we like it.

Yes, that was a change in how FAST headings are generated and applied to existing records. It is no longer a requirement that all the headings be convertible to FAST; we will convert the ones that we can.

The PCC guidelines for provider neutral records seem to refer only to e-resource items. Is there one for print?

Yes, there is. The thing to look for is Print-on-demand.

How do you now choose the topics for these sessions?

A couple of the topics we have done so far were suggested by our members, and we have also picked a few topics that we thought were important. A survey will be coming out soon, and we hope you will suggest lots of great topics. Feel free to let us know what you want in the survey, through AskQC@oclc.org, or by writing to any of us individually.

Is OCLC aware of the great number of pairs of UKMGB records, in which one has: 260 $a Place : $b Publisher, the other has 260 $a Publisher, $b Place?

Yes, we are aware. We have attempted to put together a macro to fix them, but it is a very tricky thing to do: in many cases the publisher/place combinations are not properly subfielded, so we have to rely on commas and other punctuation, and we don't get reliable results all the time. We are also aware that, because some of these were corrected while we were still batchloading them, we have 260 fields that did not compare and match correctly, leaving us with lots of duplicates as well. We are aware of this and working to clean up as much as we can.

Could there be validation on incongruence of 337-338 and 008/23 (Form of Item) = 'online'?

Since multiple 33x fields can be included for different aspects of an item, it might be difficult to include this kind of check in validation; it may be better for us to look for these things in the database. This incongruence occurs frequently for HathiTrust and Google Books records, because existing records are simply cloned and the 337 and 338 are not removed and replaced with their online counterparts.

June 2018: What is the expert community?

Topic-specific questions

What ability do libraries have of adding non-Latin scripts to existing WorldCat records?

Libraries are welcome to go back and add non-Latin scripts to any WorldCat records because of the expert community. They will be parallel fields and are a great addition to records. This is an especially good time to add non-Latin scripts since OCLC accepts all of Unicode, which allows more languages to be used in WorldCat.

Are all users with Full level authorization automatically in this expert community?

Yes.

Should changes to a bibliographic record's main entry always be regarded as arguing with another cataloger's judgment and therefore avoided, or are there cases where changing a main entry would be correct (e.g., to align a bibliographic record with an LCNAF authority for the work or expression)?

It would be a matter of cataloger's judgment, but if you are certain that a change needs to be made to correct the record, especially to bring the form of a name or title in an access point into line with the LCNAF, it would be right to do so. When you are changing the main entry because a cataloger mistakenly placed an editor in a 100 field, make sure to move that access point to a 700 field rather than deleting it completely.

Do you have data on the kinds of Expert Community changes that are being made and in what proportions to the total?

We do keep monthly statistics and have a record of those. These statistics are broken down by type of change, such as minimal-level upgrades or database enrichment. We are not able to get too granular about the exact change made to a record, and we currently do not have numbers we can share on this.

How does OCLC prefer handling errors for which no proof is available, e.g. a supplied publication date in 260 $c that precedes the author's birth date?

As part of the expert community you can make this change. If you have the item in hand and are sure the change is correct, please go ahead and make it. If the record is PCC, please send us whatever proof you have so we can see whether we can make the change. If we see something that is obviously wrong, like a publication date that precedes the author's birth date, we will do some research to see if we can infer the correct metadata for the record.

If a library enhances a bib record with a 490 0 to 490 1 plus 8XX, is that appropriate or is that just disagreeing with another cataloger's judgment?

We think it is appropriate, especially if you check it in the LC/NACO authority file and find an authorized series access point. If it is not in the authority file but your library wants to add an 8xx field, that would be cataloger’s judgment and it’s okay to do so as other libraries would find that helpful.

When upgrading an AACR2 record to RDA, should the 500 Compact disc note be retained (as an example of "do no harm") - RDA does not specifically call for such a note.

You can take out that note if you're upgrading the record to RDA. The compact disc note was discontinued in AACR2.

General questions

Do you have any update about the 520 fields that have been added to the wrong records? I sent a message to the Bibchange email address about a record over a week ago (to see if there was a bigger problem that the record might help identify), but the WorldCat record has yet to change. Can/should I update the record myself?

We are aware of the problem with data transfer and we are looking to have it resolved. We are also correcting these records, so if you do come across issues like these, please send them to bibchange@oclc.org. We would like to identify the source behind the incorrect data transfer and make sure we are aware of all the affected records so that we can correct them.

Sometimes I find records in which the 040 $b does not match the actual language of cataloging (e.g., 040 $b is spa but notes/description are in English). Would you prefer we change the 040 to make it match the record, report it to OCLC as an error and input a new record if necessary, or something else?

If the record is clearly in one language that is not reflected in the $b of the 040 field, please feel free to correct the language of cataloging code in $b. Make sure to consider all the descriptive fields, such as the 300 and 5xx fields; the subject headings being in a different language does not count in this situation. If you think changing the 040 $b would drastically change the record, do not hesitate to report it to us at bibchange@oclc.org and we can determine the best course of action.

Does it make a difference whether it appears the original inputting library miscoded the 040 $b versus another library has come along and possibly hybridized the record so now there's a mismatch or conflicting languages of cataloging?

This would be cataloger's judgment. If it is too difficult to decide on the appropriate action, it would be best to report the record to bibchange@oclc.org so that we can help make the correction.

You mentioned language of cataloging other than English. People in English-speaking regions might be advised that it would likely be preferable to derive a CatL:eng record rather than work with the other-language-of-cataloging record.

Correct. Even if a record is in a language of cataloging not used at your institution, you can still use the record by deriving it and making a new master bibliographic record that follows English language cataloging practices.

Regarding 856 2nd indicator 0 (zero), Bib formats says: "Resource. The electronic location in field 856 is for the same resource described by the record as a whole. In this case, the item represented by the bibliographic record is an electronic resource. If the data in field 856 relate to a constituent unit of the resource represented by the record, use subfield ǂ3 to specify the portion(s) to which the field applies. The display constant Electronic resource: may be provided." Does the sentence about "constituent unit" mean that indicator zero "0" should be used for tables of contents or chapters or other portions of the work represented by the record. I thought 2nd indicator zero "0" means that the link goes to an electronic copy of the entire eBook.

Ideally yes, a second indicator 0 in an 856 field indicates that the entire resource is available on the web. The second indicator 0 does not imply anything about the resource being freely available or behind a paywall; it just indicates that the resource is available at that link. For multivolume sets or serial records that provide separate links to each volume or issue, it is still appropriate to use second indicator 0, because each link obtains the whole volume or issue of that multipart or serial.
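
For a multivolume set, such fields might look like this (the URIs are hypothetical):

  856 40 $3 v.1 $u http://www.example.org/volume1
  856 40 $3 v.2 $u http://www.example.org/volume2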

I see Chinese libraries inputting records with 040 $b eng, but it is clear they are not using our cataloging guidelines or LC transliteration. Should we input a new record or try to work with that record?

We would suggest that you try to work with the record if the vendor or library intended it to follow English language cataloging conventions. You can upgrade such records to the guidelines you are using, since these institutions are not required to use RDA.

Duplicates, do you have a preference on how we choose which duplicate to use/enhance and which to consider a duplicate? First inputted, most holdings, best (i.e., needs the least work)?

We try to use the record that needs the least amount of work and is most complete. In most cases that is a PCC record, but you should not feel obligated to choose a record for us to retain. When we merge, we follow a hierarchy, as DDR does, that helps us determine which record should be retained. Users can simply send us the duplicates and we will make the choice. Generally, we also keep the record with the most holdings.

Would it be wrong to use 856 40 for a link to a table of contents, index, or chapter, when the MARC record is for the entire book?

If it’s just a link to a table of contents or index it would be inappropriate to use a second indicator 0. In those cases, you would use second indicator 1. For a chapter in a book that would be more difficult to determine, and it would be a cataloger’s judgment call to decide if indicator 0 is appropriate.

Can you talk about the difference between (1) 650 second indicator 4 and (2) 653? I see 650s that don't appear to follow any controlled list.

Second indicator 4 means a local topical heading that does not follow a controlled list. If such headings duplicate other subject access points, they are not useful, and it would be appropriate to remove the 650s with second indicator 4; if they are different, it may be helpful to keep them in the record. Field 653 holds uncontrolled index terms that are not related to any list and that can go beyond the topical: 650 is generally reserved for topical headings, while 653 can hold anything from subjects to names. A 650 with second indicator 4 is also structured like a subject heading, unlike 653 terms, which are just keywords; you cannot use subdivisions in a 653.
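
To illustrate the structural difference, with hypothetical terms:

  650 #4 $a Dogs $x Behavior
  653 ## $a dog behavior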

I see 650 #4 $a Electronic books in almost all records for electronic resources. Why is this used instead of 655 #0 $a Electronic books?

LCSH "Electronic books" doesn't have a scope note to help us determine if it can be used as a genre, so judgment applies, if it falls under the category of "disciplines in which LCGFT authority records have not yet been made." If not, then the local 655 would be appropriate (using _4). This response was derived from LC's Frequently Asked Questions about Library of Congress Genre/Form Terms for Library and Archival Materials (LCGFT).

If I'm correcting a name access point to an authorized form and it has a parallel vernacular 880 field, what if anything should I do with the parallel field?

Unless you have the script and language expertise for that vernacular field, it would be better to leave it alone.

Has 655 0 been redefined from LCSH to LCGFT?

No. LCGFT would be identified by second indicator 7 and $2 lcgft.

When would you use a 720 field?

Use of this field is not recommended; it is intended for machine conversion of bibliographic data, not for human catalogers, and its access points are not properly differentiated the way human-created ones are. If a cataloger sees a 720 field and can determine what it is trying to represent, go ahead and change it to the form in the LC/NACO authority file and move it to a 100 or 700 field.

We cannot link to our ebooks from our record (due to our IT setup). Does it matter in which field I put a note about accessing this resource?

If the record is only going to be in your local catalog, this note can go anywhere in the record, but please do not add this information to the master bibliographic record. You can put this information in the local holdings record, or in the 956, which is the locally defined equivalent of the 856.

If a bib record has bad tagging (e.g., pagination data in 260 and 260 $b data in 260 $a, lack of 245 $c, etc.), is it better to correct the tagging or report the bad record as a duplicate?

Don’t spend your time correcting it if you know this record is a duplicate. Go ahead and report it to bibchange@oclc.org. If the record is not a duplicate, then please go ahead and correct it.

What is the difference in 040 between OCLCQ and OCLCA?

You can learn more about the OCLC-specific symbols used in WorldCat by checking the latest updates to section 5.4 of Bibliographic Formats and Standards. OCLCQ stands for WorldCat Metadata Quality; it appears whenever changes to the record are made, whether automatically or manually, by Metadata Quality staff. OCLCA marks the automated process in which a controlled authorized access point in a bibliographic record is updated to match changes to an authority record (OCLCO is similar to OCLCA).

I thought I heard a comment in passing about Connexion ending but I can't find any information. Is there any information or did I hear incorrectly?

There is no end-of-life date for Connexion. Someday Connexion will end, and its successor is Record Manager; you can already begin using Record Manager now. When there is an end-of-life date for Connexion, we will make sure to give notice well in advance. Connexion is not being developed anymore, though: all functionality upgrades are being done in Record Manager.

I searched for a record by ISBN and the same record (OCN) appeared twice in the results list. Why is this happening?

This is an indexing issue. Please send the OCN that is being displayed twice in your results list to bibchange@oclc.org. We will re-index the record, which will clear up the issue.

Is there any Expert functionality now that is missing from Record Manager? e.g., controlling headings.

There are some functionalities still being added to Record Manager, one of them being controlling headings.

How about a vendor that uses Connexion for authority work?

Record Manager will be able to sustain the work that is done with authorities. Both vendors and libraries will be able to continue doing their work in the LC/NACO authority file through Record Manager once Connexion ceases to exist.

Can you use macros to clean up the OCLC records in Record Manager?

At this moment, no, but we are hopeful that a new mechanism will be built into Record Manager to help modify large numbers of records. There are some functions in Record Manager that mimic current Connexion macros, such as taking a print version record and converting it to an electronic version record. There are roughly five to seven such advanced functions available in Record Manager.

When the new authorities coding changes talked about at ALA get introduced, will they cause validation errors in Connexion?

Whenever there are changes in MARC coding, they will be incorporated into our systems, regardless of the interface changes being done on the front end. Jay manages the MARC update, which will be done in the next six months (we will be implementing the 2018 MARC update). We will publicize the changes and additions to MARC coding through our technical bulletin.

I can copy multiple fields from a record in Connexion Client into a Word document, for such purposes as editing them, transferring them to another master, etc. Can you copy in this way from a record in Record Manager?

This question would be better addressed to Customer Support at support@oclc.org. We try our best to answer Record Manager questions but the AskQC sessions are more focused on cataloging questions and Record Manager questions can be sent to Customer Support who will forward it to the appropriate department.

Has the update to BFAS meant that all Technical Bulletins previously issued are now obsolete?

No. Technical Bulletins remain valid, not obsolete, until a certain point in time. Each year, when a technical bulletin is issued, we try to update BFAS and incorporate those updates into the document.

September 2018: Meet Metadata Quality

Topic-specific questions

If a provider-neutral record is adapted in our local catalog with an 843 for a specific publisher and holdings, and the 843 is converted to an 853 as part of our data sync process, will that create a new non-provider-neutral record?

We suspect that it probably will, given the way we process information in the 533 in our data sync processing. We pay attention to publishers in the 533 so that, for example in the case of microforms with different publishers, we don't merge them together. So if one record has a 533 that includes what looks like a publisher and another record doesn't, we would probably end up adding a new record at this point. We also have macros that we run on occasion to go after records for certain online resources and make them provider neutral. So it's possible that a 533 that appears in the database might be removed as part of the process of making the record provider neutral, and the record could then be subject to being merged through our duplicate detection algorithm.

Do you have a preference as to how we report duplicate records (online form, report error, etc.)?

Whatever method works best for you and fits into your workflow is fine.

Is there any clean-up plan for local headings and headings with unknown or invalid sources?

There aren't any specific plans at this point to clean up local headings. We encounter local headings as part of all the other work that we do, and we check that they are correctly formulated. If we see a pattern of some local heading that is problematic in that respect, we will sometimes go after it and fix it. In some cases, if we encounter multiple forms, staff will take the additional step of establishing an authority record for that name. We know that a lot of local headings have been entering the database through our data sync processes, and when those duplicate non-local subject headings already in the record, we have a macro established that deletes them. We have not taken a systematic approach, but when we do encounter those, we delete the local headings. We also have in process a correction to the way data sync works, so that fewer of these local headings will transfer from incoming records to WorldCat records.

What is the status of FAST headings? Are there any public interfaces that use them?

We don't know of any specific public interfaces that are using them, but they are being used by various institutions. Nathan said that he would have to talk to Jody DeRidder, who is overseeing the FAST process, to find out specifically which they are. We are in the process of creating an editorial board for FAST so that we can update it and add terms without being solely dependent on the conversion of LCSH. We have some announcements in the works about those sorts of things that will go out in the next few weeks or sometime thereafter. We are looking at creating a sustainable future for the FAST headings.

We've had experiences where we've manually corrected broken diacritics in OCLC records, only to have them return. Can you give us an update about this situation? Is manually correcting these fields when we find them the correct thing to do?

We presume this refers to the situation where the character typically shows up as a black diamond with a question mark. Fixing them is certainly an appropriate thing to do. That often involves deleting the fields, because they are frequently duplicates of fields that are already there; the character is considered a non-Latin character within the OCLC database, which is partly why the fields have transferred in. We are working on the root cause of this problem before we take an approach to cleaning all of them up. We have, on occasion, gone back to remove large groupings of these, but find that in some cases they transfer back in again. So helping us by cleaning them up when you happen to see them is a good thing. Nathan added that the time frame for the fix is within a couple of months, and he hopes to report during the October or November office hours on where we are in the development work. The biggest thing is that we don't want them to continue coming in.

When records are merged together in error, what is the process of reporting this and how are these errors resolved?

A request can be submitted to the Bibchange inbox (bibchange@oclc.org) to let us know that an error in merging has occurred. As long as the records were not merged prior to 2012, they can be recovered. Once the records have been pulled apart, we can test them to see if subsequent changes to Duplicate Detection and Resolution (DDR) may have taken care of the problem that allowed them to merge in the first place. If DDR would still merge them for one reason or another, we can often work with the reporting institution to come up with a way to prevent DDR from merging them again. As we have mentioned in previous sessions, our de-duplication process is continually evolving. As we stumble upon an incorrect merge, we do go back and test it and change the algorithms that are merging it in hopes of preventing future cases like that.

Follow-up question: Are records recoverable since 2012 or 6 years before the present?

As of right now, it's since 2012. This date is fixed, so in 2020 we will still be able to go back to 2012. As we look at data retention and the size of this file, because there is a lot kept in the journal history file, this decision may be changed or the window shortened sometime in the future.

We are transitioning to WMS. For authorities that are not in NACO, or headings that differ from the headings in bib records, how will OCLC control them?

Right now, within WMS, there is controlling for multiple sets of authority records: NACO being the prominent one, and the one most used with English language of cataloging records. There is also controlling for LC subject headings, MeSH subject headings, Māori subject headings, Dutch names (mostly used on Dutch language of cataloging records), and German names (mostly used on German language of cataloging records). Later this year we are going to implement a French-language authority file from Canada, which will control names in French language of cataloging records. If something is not in NACO, it will not be controlled. If an authority record is needed for a particular heading where one does not exist, one of the things OCLC can do is create an authority record for that heading. Requests for new headings can be sent to authfile@oclc.org, and WorldCat Metadata Quality staff handle those requests.

If a library's records are provided to a vendor and the vendor distributes the records, why do the 040 subfield $a and subfield $c reflect the loading library? Both the vendor and the WorldShare library have ignored the MARC conventions for subfield $a and subfield $c.

This is an OCLC thing, not a vendor thing. If a vendor sends us records on your behalf and it's loaded under your symbol or a collection created for your library, our software here at OCLC (DataSync software) changes the 040 subfield $a and subfield $c to your OCLC symbol. This is a decision that was made when the DataSync system was being programmed.

Follow-up comment: There are records provided by a national library for MARCIVE. Now we are looking at our records, and rather than our records being matched and replaced, the WorldCat records are being matched and attached.

Response: These are reasons why we would want to pay attention to the subfield $a and subfield $c in the 040 field, so we will take that into account as we continue to look at that system.

Can you provide details on which tags can be edited or deleted and which organizations have the permissions? Specific example: 015 field.

There are edit restrictions that are built into the system for certain fields and in certain kinds of records that would prevent somebody from replacing a record after making certain kinds of changes. There is no edit restriction on field 015. Libraries can add them or delete them as they see fit. It seems pretty unlikely that an 015 field would be deleted if it's legitimate. And in this case, since we're talking about records that have come from Library and Archives Canada, I would imagine that your 015 fields are safe.

What are the points used in duplicate detection? The records I have seen have similar titles but different authors.

There are roughly two dozen comparison points for bibliographic records in DDR (Duplicate Detection and Resolution). That is a little misleading, in the sense that many of those comparison points draw from various parts of the bibliographic record and not simply one field. In many cases the information gets manipulated first, to determine whether things that are transcribed differently or look different are actually the same, or whether things that appear to be the same are really different. There are roughly 300 fields possible in a MARC bibliographic record, and more than 200 of them that we look at or otherwise consider. Most are the things you would expect, such as the title (field 245), places of publication, publishers, dates, and series. But there are also comparison points specific to particular kinds of bibliographic records: for instance, scale in maps records, publisher numbers in sound recordings, and various elements of instrumentation in scores. So there are lots and lots of comparison points. Jay has done a defensive cataloging presentation that explains which fields come into play, so that a record you create is not merged into another record. We can maybe do a session sometime in the spring about our merging process.

Is there any way there could be an option to not include other national bibliography authority headings in the browse headings function? This would be similar to the limit by the language of cataloging in the search function, but in the browse function instead.

This seems like a very worthwhile request. What we have in place, in terms of browsing headings in the bibliographic file, is that all languages of cataloging are included with all their headings integrated into the same index. So, sometimes you'll see variations in names that are legitimate names because one is the form that's used by the Germans, the other is the form that's used in English language cataloging and so there's a difference in qualifiers. Or, if it's personal names, one will have a date and the other one doesn't. It's a little bit confusing as you're looking at that display. I presume this is what you are asking about in this question, as opposed to searching related to an authority file where you normally just pick the file that you want to search in. In browsing through headings in the bibliographic file, they are all mixed together. This is something that we ought to keep in mind for Record Manager because it would be a desirable thing to have.

At Library and Archives Canada for CIP records, do we have to take the vendor records and enhance them, or can we create new records?

Since you will be using WMS, WorldCat is your database at Library and Archives Canada, and we would hope that you would enhance what is there rather than create a duplicate record. In many cases, there won't be a record already there when you are doing Canadian CIP.

Is anything being done about subject heading order being changed to field number order rather than the intentional order of the cataloger? Also, headings with second indicator zero are given priority even when that is not how the cataloger ordered them.

This is a known problem in data processing. It is on a list of things to resolve, along with other things, but we are not sure where it ranks. It is a problem in that a 600 automatically floats to the top above the 650 that a cataloger intentionally placed first.

When trying to add the subject heading Shīʻah $z Lebanon $x History $x Press coverage, controlling moved the Lebanon piece to the end which of course isn't the same thing. Do we need to make a subject heading for validation purposes, can OCLC do this, or is that only LC?

As an LC subject heading, it would be up to LC alone to establish a subject authority record that we could then control to. The repositioning of the geographic subdivision is determined by which of the subdivisions present can be subdivided geographically. This heading had History and Press coverage; presumably Press coverage is a subdivision that can be subdivided geographically, which is why Lebanon was moved to the end in this case.
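
In other words (tagging assumed to be LCSH), the controlling process turned the first form below into the second:

  650 #0 $a Shīʻah $z Lebanon $x History $x Press coverage
  650 #0 $a Shīʻah $x History $x Press coverage $z Lebanon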

Knowledge base question: When an OCN chosen as the override OCN in the KB is DDR'd (merged) in the WorldCat bibliographic database and one record is chosen over the other, what happens to our 'held by' in Discovery? It seems to disappear or not appear on the remaining OCN. If it requires an action on our part, is there a way to be alerted?

Right now when you merge two records and the OCN that is in the Knowledge Base is not the OCN that is retained in the bibliographic database, there's a brief disconnect period where we have to wait until the OCN gets updated in the Knowledge Base. That will happen automatically eventually but if you need it to happen much sooner, then the OCN needs to be updated in the Knowledge Base. We are working on making that much more streamlined because it doesn't do any good to not have the correct OCN in the Knowledge Base. At the moment there aren't any notifications as to when records are merged, but we should probably look into that more.

If two records were merged possibly not by an automated process but an actual person, can the records be recovered back to 2012 as well?

Yes, they can. It doesn't matter by what process they were merged; as long as it happened after 2012, we can have them recovered.

October 2018: Parallel record and language of cataloging

Topic-specific questions

What is GLIMIR clustering and what does it have to do with parallel records?

GLIMIR stands for Global LIbrary Manifestation IdentifieR. GLIMIR is an effort by OCLC to bring together holdings for the same manifestation that are distributed across multiple parallel language records. GLIMIR began as a project in 2009 and was fully implemented around 2012. When WorldCat was GLIMIR-ized, the same or similar records were clustered together to improve end-user searching. Since the purpose of the tool is end-user searching, it is not very useful in a cataloging context; it is much more useful in WorldCat Discovery or WorldCat.org, since it brings together all of the parallel records as well as print and microform records representing the same manifestation. For catalogers, make sure the GLIMIR box is not checked when you search in a cataloging interface.

Why is GLIMIR clustering turned on by default in Connexion if it is not really useful for that interface?

GLIMIR clustering can be disabled, and once it has been disabled, the system will remember that preference. If the preference is not being retained, make sure that only one instance of the Connexion client is open, open the search dialog box, and uncheck the "Display using GLIMIR clustering" option. Search WorldCat, then exit the Connexion client. The client saves the last preferences chosen before exiting, so when you open it back up, "Display using GLIMIR clustering" should no longer be selected.

There is a source in Japan (TRCLS) that codes 040 as eng but generally does not follow English language rules on most of their records, e.g. the author is in a 700 instead of 100.

TRCLS is a vendor institution in Japan whose intent is to catalog in English. While they intend to create English language records, they do not catalog the same way we would catalog the resource. These should remain English language of cataloging records, so please feel free to correct them as needed.

What is the national French library code, not Canadian French, but French French?

The OCLC symbol representing the Bibliothèque nationale de France is BDF. To identify the OCLC symbol of a particular library, go to the Directory of OCLC Members and search by the institution's name. As far as the language code to put in field 040 subfield $b, the code would be "fre" whether the language of cataloging is French from Canada or French from France.

I have seen a lot of 520 in foreign languages, quoted directly from the book jacket of a novel. These are done by English-language cataloging agencies. Shouldn't the quoted foreign-lg note be in a 500? Is 520 supposed to be in English if your cataloging lg is Eng?

If the 520 is a quoted note, then it is perfectly fine to keep it in the record no matter what the language of cataloging is. If your library has the language knowledge to add a translated summary note in the language of cataloging of that record, you may do so and replace the quoted summary in the other language. If you are cataloging in English then, yes, a non-quoted summary note in the 520 field is supposed to be in English as well. Allowances are made for libraries serving multiple communities: for example, a library that catalogs in English but also serves a Spanish-language community may keep the quoted Spanish summary statement along with the English summary note. If you see translated summary notes that do not match the language of cataloging of a record, you may fix them to represent the language of cataloging of that record.

When searching OCLC Connexion, how do we search English records? Which option do we pick, "Apply language of cataloging limiter" or language? There are two choices.

To limit records to a particular language of cataloging, use the "Apply language of cataloging limiter" option. You may also use ll: in a command-line search. For example, ll:eng would limit your command-line search to only English language of cataloging records.
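
For instance, a hypothetical command-line search combining the limiter with a title index term:

  ti:gardening and ll:eng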

Should we check to see whether fixing a hybrid record will result in a duplicate for that language of cataloging?

If you notice that correcting the hybrid record will result in a duplicate record for that language of cataloging, we encourage you to report the records as duplicates to bibchange@oclc.org. In many cases, DDR will come along a few days later and find and merge these records. For more information on DDR, see Defending Differences from Duplicate Detection and BFAS 5.1 OCLC Member Quality Assurance.

Is it correct to say that records in traditional Chinese and simplified Chinese are considered the same record and get merged?

The records may be merged but Metadata Quality staff would first consult with experts in the language before making a final determination on the matter.

When deriving a new parallel record, should the same 010 be given in the new record?

The same field 010 may be used in a parallel record since the identifier in that 010 field represents the same resource.

I've seen a number of records created by Dutch language institutions that have 040s with eng as language of cataloging but the 700 fields with (NL-LeOCL). Do you think the intent was English language cataloging or was it Dutch? Sometimes the 3xx fields may only be partially filled in.

It is common practice for some Dutch academic institutions to catalog certain records in English. If all of the fields except for the access points are cataloged in English, then most likely the intent of the inputting institution was to catalog in English, even though they used the Dutch authority file. Verify that field 300, the 33x fields, non-quoted notes, and other elements follow English language of cataloging; then you may correct the form of the access points to conform to English language of cataloging practice. It's also worth noting that a number of Dutch academic libraries are planning to join NACO and will start creating English language authority records, which should make a difference for records contributed by Dutch academic libraries in the future.

What is the status of the "PR" notes in 936 fields? Can you explain what happened to field 936?

When parallel records were introduced in 2003, OCLC also introduced "PR" notes in field 936. This note contained a list of OCLC numbers representing parallel language records for that same manifestation. Because this field was not always used as intended, OCLC stopped using the field altogether a number of years ago and deleted the notes from WorldCat.

Do we add the language of cataloging or the language of the item to the 3XX fields? Is it mandatory?

If your institution enters English language records and follows RDA guidelines, you do not need to add a language code after the code in subfield $2 of the 33x fields. For example, if you were cataloging an item in English under RDA guidelines, you would have English language terms in the 33x subfield $a, with rdacontent, rdamedia, or rdacarrier in subfield $2; there is no need to include a slash and "eng" (/eng) after the code. If you were cataloging in a language other than English, though, you would include the appropriate term for that language in subfield $a and then follow the source code in subfield $2 with a slash and the appropriate language code.
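
A sketch of the two situations, assuming the French RDA term "texte":

  English language of cataloging:  336 ## $a text $b txt $2 rdacontent
  French language of cataloging:   336 ## $a texte $b txt $2 rdacontent/fre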

A lot of records from foreign libraries are Batchloaded and I see the "language of resource" in 008 coded to be the same as language of cataloging when in fact it's a different language. Do you encourage OCLC members to correct such errors?

Yes, please correct errors that you find in these records. If you are unsure or see a pattern of errors, please send them to bibchange@oclc.org for WorldCat Metadata Quality staff to review and correct as needed. Metadata Quality staff correct these errors on a regular basis, but member libraries are encouraged to correct them as well.

I wonder if the OCLC team knows if it’s possible to link fields after they've been exported to an ILS?

No. You will need to contact your specific ILS provider and work with them.

You mentioned subject headings can be in any language. Does that include allowing non-English uncontrolled (653, 650_4) SHs without $2 in English-language records?

Yes.

We put 520s in Spanish for our Spanish-language patrons and use a second one for the English version. I have seen some folks say we should use only 1 520 with both languages in it. Which is preferred?

OCLC prefers that summary notes added to field 520 match the language of cataloging of the record. While it's understandable that a library would want to include summary notes in other languages, it's preferred that libraries treat these other-language summary notes as local notes and either add them to the record locally or add them to an LBD record.

A record in all simplified characters and one in all traditional characters should not be merged, because it is likely a different publisher.

We urge you to look carefully at all elements in the records, but if you do feel they are duplicates, please report them to bibchange@oclc.org. We will determine whether to merge them, consulting with our language experts if needed.

From what you said, you can submit duplicates for other languages of cataloging in $b of 040. Is it allowable to update those records if there are wrongly displayed diacritics?

Yes, please feel free to correct errors if you see them.

Can you derive a new record from a parallel record?

Yes, you may derive a new record from a parallel record. When you do, be careful not to create a hybrid record: verify the non-transcribed elements to make sure that the language of cataloging matches the language of cataloging of your institution.

Most CJK records I see have 520s only in the vernacular and a transliteration. So, all of these should only have been in English?

It depends on the language of cataloging of the record. Look at the whole record to determine what the intended language of cataloging was before deciding what to do. If the language of cataloging is determined to be Chinese and the summary notes in the 520 fields are in the vernacular with a transliteration, then do not change this record. If the language of cataloging is determined to be English and the summary notes in the 520 fields are in the vernacular with a transliteration, then you may correct the 520 field. Yes, OCLC does allow the practice of adding a quoted summary statement in CJK along with an English summary note in an English language record.

How long does it take after you report a duplicate record via the OCLC form on the web for the duplicates to be processed?

When you send duplicate requests using the OCLC form on the web, the request is sent directly to bibchange@oclc.org, where it's placed into the duplicate workflow. In general, staff trained in each format process duplicate requests on a first in, first out basis. Be aware that there is currently a backlog in duplicate requests, and some formats have a bigger backlog than others; requests sent through the online form will be processed when staff are able to get to them. For more information on Bibchange staff workflow, please see Processing change requests, the Virtual AskQC Office Hours presentation given on March 28, 2018.

 

There are a lot of pre-2003 records that are hybrid records (040 says eng but the notes are in French or Spanish, for example). In general, is it better to fix them or to derive a new one? What happens to all the holdings that were attached to the old record if you derive a new one? Will they move or stay?

When you derive a new record, you are creating a new record, so the holdings will not move. Holdings only move if records are merged together. It’s a judgment call whether to fix the record or just derive a new one.

You should determine the language of cataloging based on the intent of the cataloging agency that input the record, the language of the libraries that have attached holdings, and the language of the descriptive cataloging elements in the record. Deciding what language of cataloging should be used on a record needs to be determined on a case-by-case basis. It is worth looking at the hybrid record in WorldCat to see if you can resolve the problem, since incorrect coding in field 040 subfield $b may lead to an incorrect merge via DDR.

The holdings can also inform the best course of action. If a record was intended to be a French language of cataloging record but only 2 of its 102 holdings are French libraries, while the other 100 are English-language libraries, then it may be that it should remain English and be corrected to reflect English language of cataloging instead. If you can do this, feel free to modify the record as needed to correct the hybrid record; otherwise, you may report the record to bibchange@oclc.org.

 

Does evidence that names are controlled to a non-English authority file count in deciding what the intended language of cataloging was?

If the names in a record are controlled to a non-English authority file, then the language of cataloging ought not to be English. Records cataloged with English language of cataloging are controlled to the NACO authority file; records cataloged with Dutch language of cataloging are controlled to the Dutch language authority file, and the same goes for German, French, etc. Because of how OCLC controls authorized access points, it is unlikely that you will see an English-language record controlled to an authority file other than the NACO file. There may, however, be cases where a record coded as English language of cataloging has access points with subfield $0s that link to a non-English authority file. If you encounter this, carefully review the record to determine which access points to retain and edit the record accordingly.
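
As a purely hypothetical illustration (the name and identifier below are invented; IdRef is the French Sudoc authority file), an access point linking to a non-English authority file might look like:

100 1 Dupont, Jean, $d 1950-2010 $0 (IdRef)123456789

In a record correctly coded as English language of cataloging, you would instead expect the NACO form of the name.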

 

What if there are two authority files in the same language? e.g., French from Canada, and French from Europe?

This is a great question and something we have been wondering about. So far, we have the French Canadian authority file, which we will be using to control access points in French-language records. We do not currently have a French authority file from Europe. We are carefully considering what we will do in the future, but in the short term there is no conflicting French authority file to consider.

 

Now that credits are long gone, what is the incentive to use different encoding levels, such as K vs. I?

While there may no longer be a credit incentive as there was in the past, encoding level K and I communicate to other catalogers the record's level of completeness. Updating the ELvl fixed field when upgrading the record will assist others in identifying what your intent was when cataloging it. Please feel free to upgrade minimal-level records to full-level records if you come across them.

 

I am seeing a lot of 856 fields with "Table of contents" and " |z Available to ----affiliated users at |u". If a note indicates that the resource is only available to one institution, should it be deleted?

Yes. You may delete local URLs from the WorldCat record. For more information on URLs in field 856, see URLs in a shared cataloging environment, the Virtual AskQC Office Hours presentation given on April 15, 2018.
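
For example, an institution-specific link of the kind quoted in the question might look like this (the note and URL are hypothetical placeholders):

856 40 $z Available to XYZ-affiliated users $u https://ezproxy.example.edu/login?url=...

Because a link like this works only for one institution's users, it may be deleted from the WorldCat record.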

 

How long do vendor records with titles that actually say "untitled" stay in OCLC? There are some going back to 2017; many are level M.

Level M means that the record came in through Data Sync; it doesn't indicate the completeness of the record. We do work to clean these up as we see them. The current policy is to delete vendor-contributed records that have existed in WorldCat for about 4 years and do not have library holdings attached. If you can identify one of these records by ISBN, you are welcome to upgrade it to match the item you have in hand. You may also report these to bibchange@oclc.org.

 

What is DataSync?

This is the new OCLC batch load system; the name stands for Data Synchronization. Libraries send us files of MARC records, and those files are taken in through the Data Sync system so that the records are either matched to existing records or added as new records in WorldCat in a batch mode.

 

To the question of 856 fields: If we find a particular institution that is adding these fields, should we report it to OCLC so you will delete them from the database?

Yes, feel free to delete these or report them to bibchange@oclc.org if you see a particular institution's local URLs being added to WorldCat records. Staff are continually working to remove these from WorldCat records and anticipate that changes will be made behind the scenes this fiscal year so that fewer of these are added to WorldCat.

 

Merging records, do you encourage members to report multiple records for merging?

Yes, we do encourage members to report multiple records for merging. Please send all duplicate requests to bibchange@oclc.org.

 

Why do so many records, especially level M, have data that can't pass OCLC's own validation program? We do an awful lot of cleanup on fields that have to be fixed or just plain deleted if we can't fix the problem. Why doesn't your data ingest program validate records first?

The Data Sync process does make use of validation; however, it's handled differently than online input. When you catalog online and receive a validation error, you are able to fix the problem at that time. When records arrive in files through Data Sync, they are run through validation and separated into significant errors and minor errors. Significant errors, such as a bad tag or an incorrect record structure, cause records to be set aside and not loaded into WorldCat. Minor errors include relationship errors (if you have one of these, then you must have one of those); these records are loaded into WorldCat. If records with minor errors were not added, the copy would not be available for other libraries to use and the library's holdings would not be added either.

We do realize that validation errors are a problem, and we have change instructions up front where we try to fix some of the incoming errors. Metadata Quality staff also have macros and tools to clean up the records as they are loaded. If you are seeing a pattern of a problem, please report it to askqc@oclc.org. We may be able to fix not only the one record you are reporting but also make the same change across the thousands of records that have the same problem.

 

If we upgrade a WorldCat record to PCC, are we responsible for checking the validity of the 856 fields?

No, if you are upgrading a WorldCat record to PCC, you are not responsible for checking the validity of the 856 fields. That being said, we encourage you to look at them and delete any URLs that are obviously local to an institution. You may not be able to check all of the URLs because your institution may not have access to the providers.

 

I was told that enhancing an OCLC bib will not receive monetary credit whereas creating/deriving a new bib will receive monetary credit. Is that true?

No, that is not true. In the past there was a system of credits, but the credits have been discontinued for quite a few years now. So, you are not going to receive a monetary credit for creating or enhancing bibliographic records. For guidelines on when to input a new record or use a record already in WorldCat, see BFAS Chapter 4, When to input a new record.

 

Is there any update on fixing the corrupted copyright symbols in the 264 field or corrupted diacritics? I'm seeing a lot on recent DLC records.

This is similar to the issue where more 856 fields than we would like are transferring to the WorldCat record. There is an effort underway to address the corrupted copyright symbol problem so that we don't transfer in these corrupted characters as often as we have in the past. The problem with some of these corrupted diacritics is that they turn into a character that, while a valid Unicode character, is one that we do not want in the WorldCat record; they transfer in because they look to the system like valid non-Latin script. We are currently working to resolve this problem.

Once we are no longer transferring them to the extent that we currently do, we will start the clean-up process in WorldCat. Unfortunately, as we have cleaned up these records, we have seen the same errors reappear on a record from a different library's Data Sync load on the same day. Because of this, we are focusing our efforts on resolving the underlying source of the problem before cleaning up the WorldCat records currently affected by it.

 

I have seen records with multiple merges and after researching the merges have found (for example) DVD records that have 2 discs where the current record displays only 1 disc. Are these records checked for the number of discs before merging? Two of the 4 OCLC records merged had differing numbers of discs.

Yes, that is something we do check for in both DDR and manual merging. If you suspect that records have been incorrectly merged, please report them to bibchange@oclc.org and we will look into the records through OCLC's Journal History and if appropriate, we will recover the records.

 

If you're looking for common validation problems, one that we find frequently that stops validation is in 775 or 776 fields that have one extra space in a $w.

This is good to know. Staff have gone through and fixed this spacing error before but we appreciate knowing that these are coming back. We'll work on cleaning these up again.

 

Is there any way you can get certain institutions to update their records for online resources to reflect the publication data, rather than their digital imaging data, in the 260/264 and fixed fields? They will often have duplicate records, one coded as mixed materials and the other coded as an online monograph. These are not always Dublin Core vs. MARC.

Most likely these are digital gateway records, which map Dublin Core data to MARC 21. This often results in a mixed materials record type, so it would not be added as a book. Because they are not necessarily constructed according to the same cataloging rules that we would use for other materials, we do not merge these records at this point. If there is a specific institution, please contact askqc@oclc.org so staff can look into the problem and see how significant it is.

 

I am not sure if this is an appropriate question, but which language should be used in the fixed fields for bilingual DVDs?

The Lang fixed field contains the language of the resource itself. If the resource is bilingual then you would use both the Lang fixed field and field 041 to code the languages of the resource. For example, if the DVD of a German film included dialog in both German and French with subtitles in English and Spanish, the Lang fixed field would contain "ger" while the 041 field would contain all of the languages included in the appropriate subfields. For example:

Lang: ger

041 1 ger $a fre $j eng $j spa

 

Was that 20-minute presentation the topic in its entirety?

The presentation given today was an overview of the topic. Please refer to the guidelines for more information and further examples. Metadata Quality staff are currently working to clarify the guidelines for parallel language records, which are currently located in BFAS 3.10, Parallel Records for Language of Cataloging. Once refined, these guidelines will most likely be moved to BFAS chapter 2. If you have additional questions, please send them to askqc@oclc.org.

 

Can we report duplicate level M records by sending the ISBN to Bibchange or do you always need the OCLC numbers?

Yes, you may send ISBNs as duplicate requests to bibchange@oclc.org.

November 2018: How the OCLC MARC update process works

Topic-specific questions

 

Yesterday I noticed that a particular term source code (DOT) is obsolete and has been supplanted by ONET. Should the DOT code be removed? It makes a difference: Creative writers is in DOT but not in ONET.

If a code has been made obsolete by the Library of Congress, it will have a dash in front of it in the source code list. If it doesn't have a dash in front of it, it is still a valid code and can be used. Even if the thesaurus or reference document the code corresponds to has been superseded by a later edition or a more up-to-date list, you can continue to use the MARC code in the field as long as the code itself remains valid. The Occupation Term Source Codes list still has the code dot listed for the Dictionary of occupational titles.
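
For example, an occupation term from the Dictionary of occupational titles can still be coded as follows (the term is taken from the question above; the field shown is illustrative):

656 7 Creative writers $2 dot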

 

There have been numerous duplicate DLC records added recently. Some have been reported and quickly deleted, but there are still a lot that DDR seems not to pick up; e.g., 1061860203 and 1043516446 both have DLC in 040 $a. Why are these being added?

In general, DDR can’t catch everything, and we have designed it to be extremely careful and to err on the side of leaving a duplicate rather than incorrectly merging records that shouldn’t be merged. If you have record numbers that DDR has missed, do report them to bibchange@oclc.org. It may take us some time to get to these, but it does help us to get these reports as they can help us find other patterns of issues that we can address.

For the specific records reported, #1043516446 is a Library of Congress contributed record as indicated by the symbol DLC in the 040 subfield $c. #1061860203 is a member contributed record as indicated by the institution symbol TXN in the 040 subfield $c. This record was contributed on November 8, 2018, which was only 6 days ago. For records contributed through Connexion and Record Manager, there is a 10-day grace period before DDR evaluates them as potential duplicate records.

Field 040 subfield $a is used to record the original cataloging agency, not the transcribing agency which is recorded in subfield $c. For more information about field 040 and what the different subfields are used for, please see OCLC’s Bibliographic Formats and Standards, 040 Cataloging Source.
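
Schematically, the two records' 040 fields differ like this (the $b language code is assumed here for illustration):

040 DLC $b eng $c DLC (Library of Congress record: DLC is both the original cataloging agency, $a, and the transcribing agency, $c)
040 DLC $b eng $c TXN (member record: original cataloging credited to DLC in $a, but transcribed by TXN in $c)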

 

When we report possible mis-merged records how long does it take for someone to look at them?

If you put something in the subject line to alert us that it is a possible incorrect merge, that would help, as several hundred requests come into our inbox on a daily basis and we have to prioritize them. If you give us some sort of indication that gets our attention, then once we see the report it usually takes only a matter of a few hours to recover that merge.

 

Follow-up: is it better to email than to do an online error report?

We have no preference; they all come to the same place, the inbox that QC staff work from. But based on what was just said about drawing attention to the report, an email might be better because you can put something in the subject line.

 

"National Level Full and Minimal Requirements" is current only through 2010. Is that being updated somewhere other than http://www.loc.gov/marc/bibliographic/nlr/? How does OCLC determine requirements where they're not in official MARC?

That's a question you should ask the Library of Congress. We check that page on a regular basis to see if it's been updated, but the Library of Congress has not updated the national-level requirements for almost a decade now; it would be nice if they would. Since then, we have essentially determined the requirements ourselves. In our QC Policy meetings and in documentation meetings for BFAS, we have tried to determine the most logical requirements for both full-level and minimal-level records for every bibliographic element OCLC has implemented since 2010. Those requirements appear in the Input Standards sections of each bibliographic BFAS page.

 

As a Canadian library, am I allowed to add Canadian subject headings to records with our library symbol attached to them in Record Manager?

Yes, there is no reason not to.

 

How are the ramifications of the processes you run considered for your users and ours? For example: a process run not too long ago, perhaps because of MARC heading changes, changed the order of subject headings so they are no longer in the order of importance assigned by the cataloger. They now seem to be ordered by field number. This is unhelpful to our WorldCat Local users and probably to other catalogs as well.

This is a known issue that has been reported. It is happening through some ingest processing where subject headings are being reordered. We have alerted the staff, there is a problem report for it, and it is hopefully being worked on soon.

We are aware of it, and it's not an ideal situation. You are more than welcome, if it fits into your workflow, to reorder those headings appropriately. We are actively working to prevent this from happening because, as you said, it is unhelpful both to WorldCat Local users and to those who are downloading those MARC records.

 

To go back to the beginning of my question: what safeguards are in place to prevent problems like this subject heading order issue? We seem to have this kind of problem crop up every now and then, and it would be nice to prevent it from taking place at all.

We 100% agree with you. I don't have a good answer other than that we are looking at it. The main factor involved is whether the record is being brought in through ingest.

 

My question is what kinds of safeguards OCLC is putting in place, not just for the subject heading order but for the process as a whole.

That's part of our development work. We have reports out to our development team, and those have to fit in with all the other development processes and priorities that are going on. We've outlined the safeguards that need to go in and communicated them to our development team. Now it's just a matter of when that work can be scheduled.

We also have quite robust validation rules and processes that are continuously being updated, which is one of the reasons we like you to report any kinds of problems that you run into with records: if we can find a pattern, we can add it to our validation rules to prevent the problems from happening in the first place.

 

I guess I'm confused about the X47 fields. Is a 647 meant to replace a 6xx field, or are both added to the record? In the slide the 647s were FAST headings; I thought those were machine generated. Are we supposed to be adding them? If we derive from a record that has FAST headings, should we keep them, or remove them and let the system re-add them?

As of our implementation of the 647 field, you are allowed to add 647 fields to bibliographic records. Whether there is an LC authority record that corresponds to what would now be a 647 field is a different question, and I do not know if there are any yet. Yes, the FAST headings are machine generated, although there are institutions that create their own as well, either manually or through a macro of some kind. You may add 647 FAST headings if they are appropriate; they do not necessarily have to refer to the LC authority file. You aren't obligated to add them, but you may. The system regenerates FAST headings once a month, so if you are changing LC subject headings in a record, you can allow the system to regenerate the FAST headings; you don't have to remove them or do anything with them, and they will be taken care of within a few weeks in most cases.
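
For example, a named event recorded as a FAST heading in field 647 might look like this (modeled on the examples in the MARC 21 documentation for the field, and shown for illustration only):

647 7 Bull Run, 2nd Battle of $c (Va.), $d 1862 $2 fast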

 

Hi. I'm creating records for four books in a publisher series. Is there any reason that I can't class them as Pt. 1-Pt. 4, even though each has its own record under a title main entry? And, since this is my first time attending, is this an appropriate question for this forum? Thanks

Yes, absolutely a perfect question. Also, you don't have to wait for one of these sessions to come along, since our next one won't be until January; you can always email questions to askqc@oclc.org. But to answer your question…

You have the choice of cataloging that 4-book set as a set, in which case the set would have one classification number and you could do as you wish locally with parts 1 through 4. It's also legitimate to catalog each of those volumes separately if you wish, and of course that would allow you to also "classify" them as parts 1 through 4. That kind of issue would be a local determination.

 

In regard to the question being discussed right now: couldn't this be a whole-part relationship instead of a series?

Yes. If there are additional questions on that feel free to email askqc@oclc.org. A lot of times it is best for us to look at the actual records in question to give you a more definitive answer.

 

When Dewey numbers are discontinued or updated in WebDewey, how can we view documentation on this? For example, 793.932 is "good" in DDC23 but not in WebDewey, and 641.56362 was added but is not in the printed edition. How are catalogers notified when changes like this are made? Thanks.

At the top of the WebDewey page there is a series of orange buttons; one is labeled Updates, which may have the information you are looking for. Or, on the upper right-hand side of the page, there is a "contact" button where you could ask this question. There is also the Dewey Blog, where you may find additional helpful information.

 

If there is $e pn in the 040, could this code be used to help in the validation process? It wouldn't catch everything, but some?

This is a good question. Right now, we aren't doing anything with validation based on provider-neutral coding, but it is something we could look into.
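
For context, a provider-neutral record carries the description convention code pn in field 040 subfield $e; the institution symbol XXX below is a hypothetical placeholder:

040 XXX $b eng $e rda $e pn $c XXX

The suggestion is that validation could key off that $e pn to apply provider-neutral expectations to the record.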

 

This is a problem children's librarians face, where the series title is entered with a part title in the 245, or the title alone is in the 245 and the series is in the 490/8xx.

There are differences in [local] practices, and in whether you create a record for each part separately or one record for the entire thing.

 

Sorry, no, it's not one record for the entire thing; it's how the 245 is entered: as [series title]. $n [number], $p [part title], or just [title] with a 490/8xx [series title] ; $v [number]. I think this is what the first questioner was asking.

So we have a lot of variation in cataloging practice here that is not easily reconcilable, and I understand that it's difficult for different communities when they go in and find 3 different records for the same thing.
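
To illustrate the two patterns being discussed (the titles below are hypothetical placeholders):

245 0 0 Forest friends. $n Book 2, $p The lost owl

versus

245 0 0 The lost owl
490 1 Forest friends ; $v bk. 2
830 0 Forest friends ; $v bk. 2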

 

I understand that MARC changes are announced through reports and bulletins. Is the publication of these reports and bulletins announced anywhere (e.g. a listserv)? How can I be sure to get regular notification of these reports/updates?

Just last week the Library of Congress issued the new MARC21 Update number 27. If you subscribe to the Library of Congress MARC discussion list you will get all of the MARC update announcements, all of the technical notice announcements, and so on. You can subscribe to that via the Library of Congress MARC Standards page.

As far as OCLC is concerned, OCLC MARC changes are announced via the OCLC-CAT discussion list and lists for specific communities, such as the Music OCLC Users Group discussion list, the Online AV Catalogers discussion list, etc. There is now also a page on the OCLC website with the WorldCat Validation Release Notes and Known Issues. This page is fairly new, and there does not appear to be a way to subscribe to it.

 

Is OCLC still merging duplicate bib records for HathiTrust and GoogleBooks? Is it acceptable to add archive.org links to those records since those are accessible?

Yes.

 

Why aren't batch loaded records run through the validation process?

Actually, they are run through a validation process; it's just a different process that is not as strict as online validation, because stricter rules would prevent a lot of records from being processed through matching and added to WorldCat. There are safeguards in place where really serious validation errors prevent a record from being fully indexed; those records have to be manually corrected in order to be indexed and available in WorldCat. So we have safeguards that keep really corrupt records, and records that aren't structurally correct, from being added to WorldCat, but some levels of validation errors that we deem less serious do make it through the batch load process, so those records can be processed and then eventually corrected through other means.

 

I get records that won't validate when I did not make any changes, so I end up deleting fields that are hanging it up but that I know nothing about.

Your option would be to update the record so that it passes validation by deleting those fields, but we hope you aren't losing data and that those fields really are illegitimate.

By all means, if you have any errors you aren’t able to correct you can always send them to bibchange@oclc.org.

Note: In Connexion client, this is located in Options… under the Tools menu option, under the General tab. There is a selection for Validation Level Options.

 

Where should we report issues with FAST headings? I recently discovered that the FAST heading for the Jewish Holocaust is (still) in X11 instead of X47.

FAST headings can be reported to fast@oclc.org. Because the X47 field is relatively new, we have not necessarily converted all of the X11 fields that should now be X47 fields. There are no X47 event headings in the LC Authority File yet, so where meeting headings haven't been changed to event headings there, we also have to wait for that.
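
As a rough sketch of the recoding in question (the subfielding is approximate and shown for illustration only), the heading would move from a meeting field to an event field:

611 2 7 Holocaust, Jewish (1939-1945) $2 fast

becoming

647 7 Holocaust, Jewish $d (1939-1945) $2 fast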

 

I would like the 022 $l ISSN to be searched as valid along with $a. This way I wouldn't need to cheat and use $y.

If I remember correctly, indexing of field 022 subfield $l simply has not been implemented yet, but it is on the list of things to be changed within indexing. As mentioned in today's presentation, the indexing schedule usually runs much longer, sometimes long after the MARC update to which it corresponds.

 

I found a record for a 100-disc set that has 100 each of the 020, 024, and 028 fields, and it freezes my session: #885361932. Can that be fixed? We found it with an ISBN search.

We were able to bring it up, but it locked us up too. This would be something we would need to look into further, but it may be a situation where Record Manager handles it better than Connexion.