“Unidentified”: Crowdsourced Metadata & “Citizen Archivists”

“Our collective creativity – and innate human propensity for community – will undoubtedly stimulate as-yet-unimagined ways to harness knowledge bytes into remarkable resources” – “Crowdsourcing Cultural Heritage: ‘Citizen Archivists’ for the Future,” Jan Zastrow

This past summer, the Library of Congress National Audio-Visual Conservation Center started sharing unidentified stills on its blog. After accumulating stacks of mysterious photos (mostly promotional shots from old movies) in their office, the Motion Picture, Broadcasting and Recorded Sound Division (MBRS) staff began soliciting the general public for information – asking readers to share what they know in the “comments” section of each post. Such informality, as well as the “game-like” nature of identifying these materials, is nothing new.

A significant amount of research has already been done to evaluate the effectiveness of crowdsourcing in the archive (e.g., tracing the origins and rationales, reviews of methods). Sites and platforms like CitizenScience.gov and Metadata Games utilize public participation, interest, and (in some cases) competitiveness to gather information, descriptions, and transcriptions for the collections of archives, libraries, and museums. These institutions can use the Internet and social media to “outsource” archival work to individuals – amateurs and experts alike. Does this process equalize or democratize our information-gathering, our research orientations? Does it help expedite processing, improve access, and perhaps even encourage some form of “archival civic engagement” – wherein citizens-cum-archivists are engaged with their communities’ histories?

On the flip side, might not some flummoxed volunteer (either from inside the archive itself or across the world, seated in front of their interface) mislabel or misidentify materials – obscuring them from interested researchers in perpetuity? This is a legitimate concern, but we might be reassured by the pseudo-checks-and-balances system that emerges from crowdsourcing – there is always someone else “checking up on” or confirming the quality of your work. Then again, a herd mindset might serve to perpetuate errors. Ultimately, professional archivists must take the lead on these projects.