For my crowdsourcing participation, I took part in a Zooniverse project called “Operation War Diary”. In this project, “citizen historians” are asked to review World War I documents related to the British Army. Volunteers first tag the kind of document they are viewing (from a prepared list of document types) and then tag details within it. These taggable details include: time mentions, dates, locations, people, unit activity, army life, unit strength, weather, grid references, and “other”. The stated goal of the project is to “create new ‘Citizen Historians’. Working together…[to] make previously inaccessible information available to academics, researchers and family historians worldwide, [while] leaving a lasting legacy for the centenary of the First World War”. The project is partnered with the Imperial War Museum’s “Lives of the First World War” project and affiliated with the National Archives (UK).
Participating in this project was relatively easy and straightforward. You are given a quick tutorial on the user interface and how to accomplish your task. The documents are clear and zoomable, and the tagging toolbar is very easy to use. The project clearly informs its volunteers that it is not looking for a word-for-word transcription but for well-tagged documents. This is a very reasonable request to make of volunteers, and setting reasonable expectations is essential to any crowdsourcing project. As a prospective historian, I enjoyed examining the documents and trying to decode them; the ease of the interface made the work much less daunting. However, there are several things I did not enjoy about the project. The project should set a target for how many documents it wants its volunteers to help tag, along with some way to acknowledge or reward its volunteers. A reward could be as simple as a positive message delivered after a volunteer completes a certain number of documents; even a small generic pop-up window affirming a participant’s work would incentivize people to keep doing good volunteer work. Another issue I foresee is documents being poorly or wrongly categorized. However, I assume that once the project is complete, individuals who access the documents will be able to correct them.
This project is crowdsourced because of the sheer number of documents involved. The cost of having these documents categorized by paid professionals would be astronomical. In addition, crowdsourcing is a way to draw amateur historians and students into the field. It is very fitting that this project is being conducted during the centenary of the war years (1914-1918). The contributions volunteers have made will be used to categorize these documents for further historical reference by scholars and the public. Crowdsourced document-analysis projects such as Operation War Diary not only save time and money but also ensure that archived documents are actually viewed by the public. Rather than being forgotten in an old storage room or collecting dust on a shelf, these documents are once again being read not just by professional researchers but by the public at large.