
Thoughts on Traditional Processing and MPLP…

Written by Rona Razon, ICFA Archives Specialist

For a long time, I have been meaning to write a post about my thoughts on the proper way to perform archival processing (i.e., research, assessment, arrangement, description, and rehousing) and the value of “More Product, Less Process” (MPLP). As most archivists know, MPLP is an alternative approach to processing, developed by Mark Greene of the American Heritage Center and Dennis Meissner of the Minnesota Historical Society, that advocates for greater efficiency through minimal-level processing. I know this topic is so 2005! But I think it is something that many archivists and processors are still dealing with and are often unsure about. In my case, over the years (and after a lot of re-thinking), I have come to see archival processing as a balance between traditional standards and the MPLP approach. It can’t be only one or the other.

It was back in 2010, when I started processing the Byzantine Institute and Dumbarton Oaks fieldwork collection, which contains more than 100 boxes spanning the 1920s to the 2000s, that I began to consider the advantages and disadvantages of traditional processing versus MPLP.

Byzantine Institute staff and Thomas Whittemore restoring the mosaics at Hagia Sophia, Istanbul, Turkey, 1936. One of the many treasures from the collection, The Byzantine Institute and Dumbarton Oaks Fieldwork Records and Papers, 1920s – 2000s.

Our former archival assistant, Laurian Douthett, and I agreed that each method offered certain benefits, so we blended the two approaches to formulate the following rules:

  • Conduct preliminary collection-level research (folder- and item-level research, only if necessary; i.e., conduct research on items that contain informational and evidential value)
  • Assess and describe the contents of each folder (unique items, only if necessary; i.e., assess and describe items that contain informational and evidential value)
  • Assess the items’ physical condition
  • Identify common terms, such as subjects, individual names, geographic areas, and institutions, at the collection level
  • Evaluate the collection’s existing order or arrangement at the box and folder level
  • Draft an arrangement outline if there is no existing order, or if the existing order is not appropriate for the collection
  • Finally, organize and rehouse the collection

As you can see, this blended approach requires that a significant number of steps be taken in processing a collection. However, in the long run, I believe that this additional work will result in more accessible collections and user-friendly access tools.

ICFA archival assistants (Laurian Douthett and Jessica Cebra) processing the oversize drawings in the Byzantine Institute and Dumbarton Oaks fieldwork collection.

ICFA archival assistants (Jessica Cebra and Laurian Douthett) processing the oversize drawings in the Byzantine Institute and Dumbarton Oaks fieldwork collection.

I always tell our assistants and interns that, as processors (and as future archivists), they need to know and understand the collection at both the collection and folder level, so that they can make informed decisions about its arrangement and process the collection appropriately. From an administrative perspective, I believe that the standards of traditional processing also make sense: the more time we spend working on a collection, the more comprehensively we can answer our standard questions, such as:

  • Who created the collection and why?
  • How does the collection relate to the department/institution?
  • When and where was the collection created?
  • How are the contents of the folders and boxes related to each other?
  • Is there an existing order? Has the collection previously been pre-processed or re-organized?
  • What kinds of media and formats does the collection contain? And how many?
  • What is the collection’s acquisition history? (In our experience in ICFA, we have often found that this information is hidden within the items.)

The answers to these questions guide us in properly handling arrangement, description, housing, storage, and copyright issues. For us, it is important to know the collection’s existing arrangement so we can determine whether that order makes sense for accessing the collection, or whether we need to apply a new order that better highlights the collection’s administrative and/or project history. Furthermore, since we are a small archive and many people don’t know what we have (and oftentimes, what we do), we need a deep knowledge and understanding of our collections so that we can effectively promote and describe our holdings to our colleagues and researchers.

So how are we meeting our users’ research needs if we are spending so much time on such detailed work? Let me first point out that we do make our collections available to researchers even if a collection has not been fully processed and even if its finding aid is not yet complete. I understand that the idea of MPLP is to get rid of the “never-ending” backlog (as seen below) and to make pre-processed and unprocessed collections available as soon as possible.

Backlog sample from the Robert Van Nice Collection

After all, that’s our job as archivists. Supply and demand! I get that. But what I don’t get is how you can make a collection available (i.e., open for use and discovery) if it’s not accessible (i.e., the collection and finding aid are not usable, findable, or searchable). Two different things, right?

In his webinar on MPLP, Meissner pointed out that archivists do their users a disservice when they withhold collections in order to process them perfectly. Yes, of course I agree. It does not make sense to concentrate on small details (i.e., item-level research, assessment, description, and re-housing), unless you want to process one collection until you are 80! Additionally, collections are not created equal: not all collections possess the same evidential, informational, and intrinsic value (Meissner’s webinar, Slide 9). Therefore, not all collections should be processed to the same level.

However, while researching the Byzantine Institute and Dumbarton Oaks collection, I recently had an experience that offers a tangible example of this problem. My online research yielded a number of finding aids related to the collection, most with collection- or box-level descriptions. However, the collection guides did not contain the level of detail necessary to answer my questions without my having to contact the archive. Sure, I was able to determine that a repository holds materials related to one of our protagonists, since the creator’s name is listed in the subject terms, but the finding aid did not state specifically which items relate to my subject. I had a similar experience when I visited a repository to access a related collection. While the finding aid did point me to the boxes related to my research needs (based merely on the inclusive dates), it did not guide me to the right folder. As a result, I had to look through all of the folders for the entire day. So, are we doing our users a disservice by providing finding aids and collections with only “pre-processed” or collection-level information? Sure, MPLP results in some level of description, but is it enough if it requires users to engage the archivist, who may not yet know the answers themselves, since their knowledge of the collection is similarly limited by the minimal time spent processing?

Joshua Ranger of AV Preserve raises similar concerns about audiovisual materials. Ranger writes in his blog post, “Is the Product of Less Process Sufficient for Audiovisual Collections?”:

To describe paper collections at the folder, box or other higher level and let researchers dig through them to discover items themselves is sensible for the most part. But does that strategy still work when the researcher comes across audiovisual materials that are inaccessible, unlabeled, in too poor of shape to play back, or otherwise facing issues that would prevent a researcher from determining anything about an object outside of apparent format or condition characteristics?

I think the same problem applies to paper and photograph collections as well, even though they are open for use and discovery. In most cases, researchers do not have the time or the luxury to discover the exact item they need. And in our case, some of the folders in our pre-processed and unprocessed collections are unlabeled or contain inaccurate titles. So what should we do in that case? Should we leave them as is for the sake of availability and efficiency, or should we go through all the steps I mentioned above to make the collection fully available and accessible?

Other questions that come to mind are:

  • Meissner kept stating in his webinar (Slide 11): “Digitize, digitize!” But if collections are processed only at the collection or even series level, how can we plan a digitization project without really knowing which items are unique or which items need to be digitized because of preservation issues? Do we just digitize for the heck of it?
  • How would Greene and Meissner deal with sensitive data, such as social security numbers? This was one of the questions that came up when I met with other archivists at the MARAC 2012 Conference in Richmond, VA.
  • Meissner and Greene also point out that archivists should not obsess over a collection’s appearance. Meaning: don’t make a big deal about re-labeling and re-housing every single thing. Sure, I understand that this can be a superficial concern, but aren’t we doing a disservice to our donors if we don’t properly re-label and re-house their collections? And does a more thorough re-housing effort on our part encourage our users to handle our collections with more care?

But of course, I do understand and see the benefits of the MPLP approach. While our processing steps may seem to lean more toward the standards of traditional archival processing, we try to apply Meissner and Greene’s approach to our workflow as much as we can. For instance, I regularly remind our interns that, when assessing the contents of each folder or box, they should focus on the “common denominator” rather than on individual items. Certainly, we don’t require them to assess and describe individual items; if we did, we might never produce any “product” at all! Furthermore, while it is nice to research and describe individual items for further access (again, only if an item contains informational and evidential value), I do not believe extra research on single items is necessary unless the material is unique and likely to be useful to future researchers. We also do not rehouse every single item in the collection, especially if the items are still in good condition.

Overall, I think we need to apply both methods as much as we can. It can’t be one or the other, as clearly expressed by Matthew White of Matthew White and Assoc. LLC in the comments on Ranger’s blog post:

[MPLP] and other ‘best practices’ that are finding consensus among archive professionals are best seen as tools in your management portfolio: a way to approach a given job. They are not necessarily the right or only tool for a certain job. […] I took MPLP at face value, as a good, smart way to help limit the work in the field, and as one of several management tools for our inventory.

In short, “[best practices] are just [guides and] tools in our portfolio, and [so] we need to know when to use the right tool and when to stop using it, or when to combine it with another tool” (Ranger, comments section).

To end this post, here’s the link to the original ground-breaking article by Greene and Meissner:

And the link to Meissner’s webinar about the “More Product, Less Process” (MPLP) approach:

And finally, the links to some articles that inspired me to write this overdue post:

3 thoughts on “Thoughts on Traditional Processing and MPLP…”

  1. Hi Rona, thanks for sharing your thoughts on MPLP. I don’t think your comments are overdue, as archives all over the world struggle to keep up with processing backlogs. We too have tried to streamline our processing, but find that less isn’t always more!
