This post was co-written with Mara Blake.
Three members from the Big Ten Academic Alliance Geospatial Data Project attended Geo4LibCamp 2017. The organizers describe this annual event as “a hands-on meeting to bring together those building repository services for geospatial data. The main focus is to share best-practices, solve common problems, and address technical issues with integrating geospatial data into a repository and associated services.”
Geo4LibCamp is hosted by Stanford University in Palo Alto, California. Stanford is a fitting location for this event because it has been a leader in the development of geospatial repositories, or georepositories, for libraries, notably through its work with the University of California, Santa Barbara to develop the National Geospatial Digital Archive, as well as its more recent contributions to discovery platforms for archived GIS data, particularly GeoBlacklight. It is also home to the interactive Rumsey Map Center, which includes holdings from the world-renowned David Rumsey Map Collection.

The Participants
Plaza at Stanford University
The event included 48 registered participants from 30 institutions. Most attendees came from research universities, but a few represented private consulting firms specializing in open source geospatial tools. The universities represented are in varying stages of developing their own geospatial infrastructure, ranging from libraries that have already established a publicly searchable georepository to those still exploring the idea. For those farther along in the process, Geo4LibCamp offered a chance to exchange workflows and lessons learned. For those still weighing the idea, several unconference sessions made a persuasive case for how and why an institution should adopt one.
Preservation of public data
The impetus for libraries to preserve public data was a topic that emerged several times throughout the week. This is not surprising: the academic library community has been the subject of many 2017 news stories prompted by increased concern over at-risk government data. Despite the legal and cultural advances of the Open Data movement, public information remains ephemeral, and recent historical data is often not systematically archived. Many participants at Geo4LibCamp expressed a newly energized commitment to begin archiving public data at their institutions using collaboratively built tools and workflows.
Activists Rush to Save Government Science Data - If They Can Find It
Example of recent news article about data rescue
Open Source Software Stacks
The tools promoted at Geo4LibCamp are largely open source and based upon the Hydra Project. Throughout the week, participants had the chance to interact with several Hydra-focused geospatial developers, learn about their work improving existing repository applications, and hear about applications still in development. Through a series of presentations and hands-on workshops, the developers demonstrated an impressive framework for a georepository built upon GeoConcerns, Fedora, GeoServer, and GeoBlacklight. Although assembling and funding the technical infrastructure for a full-fledged georepository can be daunting, this framework appears to be well designed and has the potential to be widely adopted.
Simplified version of a georepository
The topic of usability studies for GeoBlacklight generated quite a bit of interest among the participants, as the Big Ten Academic Alliance Interface Steering Group presented findings from sixteen user tests conducted in November 2016. Unlike commercial software, open source projects often lack robust usability testing, but participants identified it as an important component of further geoportal development. In the unconference session on GeoBlacklight user experience and usability, GeoBlacklight developers and participants decided to report usability problems as issues in GitHub.
The challenge of creating complete and interoperable metadata
As the digital library world moves further towards the model of aggregating metadata records for platforms such as the Digital Public Library of America and Europeana, the need for machine-readable, interoperable metadata increases. In many ways, the XML-based standards of FGDC and ISO have prepared geospatial metadata well for aggregation. However, these standards are so complex that they can be difficult to normalize, a challenge that led to the development of the Dublin Core-based GeoBlacklight schema.
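One way to see what that flattening looks like is a minimal, invented record. The sketch below is only an illustration: the field names follow the flat, Dublin Core-based style of the GeoBlacklight schema, but the record itself (identifier, title, institution, extent) is hypothetical.

```python
# A minimal, invented record in the flat GeoBlacklight style.
# Where an FGDC or ISO 19139 document nests this information in
# deep XML hierarchies, GeoBlacklight reduces it to simple
# key-value fields that index directly into Solr.
record = {
    "dc_identifier_s": "example-hydrography-2016",    # hypothetical ID
    "dc_title_s": "Hydrography, Example County, 2016",
    "dc_rights_s": "Public",
    "dct_provenance_s": "Example University",         # holding institution
    "layer_geom_type_s": "Line",
    "solr_geom": "ENVELOPE(-84.81, -82.41, 46.62, 41.69)",
}

# Every value is a flat string, which is what makes records from
# different institutions easy to aggregate into one shared index.
assert all(isinstance(v, str) for v in record.values())
```

The simplicity is the point: normalizing a complex ISO record down to fields like these is lossy, but it is what makes cross-institution discovery tractable.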
Metadata was a frequently cited topic of interest among Geo4LibCamp participants. This resulted in unconference sessions on creating GeoBlacklight metadata, improving the compatibility of GeoBlacklight records between institutions, tools for harvesting existing metadata, and extending the capabilities of linked data with regard to geospatial elements such as bounding box and scale.
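As a concrete example of one of those geospatial elements, the sketch below converts a bounding box (west, east, north, south coordinates, as typically recorded in FGDC or ISO metadata) into the ENVELOPE string that GeoBlacklight stores in its solr_geom field for spatial search. The coordinate values are made up for illustration.

```python
def bbox_to_envelope(west, east, north, south):
    """Convert bounding-box coordinates from a harvested metadata
    record into the Solr spatial ENVELOPE syntax, which orders the
    values as (minX, maxX, maxY, minY), i.e. (W, E, N, S)."""
    return f"ENVELOPE({west}, {east}, {north}, {south})"

# A made-up extent roughly covering a Great Lakes-area dataset:
print(bbox_to_envelope(-84.81, -82.41, 46.62, 41.69))
# ENVELOPE(-84.81, -82.41, 46.62, 41.69)
```

A converter this small is the kind of harvesting glue several of the unconference sessions were concerned with: the source standards store the same four numbers, just in different places and orders.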
Integrating Scanned Maps with GIS Data in Repositories
Many libraries treat GIS data and scanned maps as distinct types of resources, described with different metadata standards and presented via different catalog platforms. However, scholars frequently use both formats in their research, especially for historical projects. This was evident during the week’s one research presentation, in which Christy Hyman described her research into the experience of enslaved African Americans escaping northward through the Great Dismal Swamp in Virginia and North Carolina. The research relied on both historical maps and GIS data to draw together a spatial narrative. Fittingly, Hyman’s talk took place in the Rumsey Map Center, where participants viewed scanned historical maps on large touch screens.
The week also featured presentations and unconference sessions devoted to methods for creating indexes for map series, along with several discussions of scanning best practices and models for representing maps in a georepository. Participants also identified a weakness in the current geospatial library community: the lack of standards for maintaining the relationships between paper maps and their many derived manifestations, such as scanned images, georeferenced files, and extracted feature classes.