Access 2014 Calgary Tuesday notes
As always, it’s a pleasure to be at Access, reconnecting with colleagues and learning about exciting new developments. This year’s version also has the distinction of being the first, and likely only, library conference I will ever attend that’s taking place within a stone’s throw of a 400m speedskating oval. The missed opportunity–my Viking klaps are at home, buried in the basement–will sting for a while, but I have only myself to blame, since I knew quite well before I got on the plane that Calgary has an oval.
As always with my notes, editorial comments are in italics to distinguish them from the speaker’s points.
Public Digital Humanities Center
Kim Martin, Western U
Was interrupted by a critical IT issue at work, so had to jump into this talk a bit late and my notes are correspondingly vague.
Showed how the DHMakerBus has been a way to work with a wide range of organizations and entities, which is a manifestation of her mantra “network by doing.” This stands in contrast to asking them “how can you help us” and replaces it instead with “how can we work together.” She ran through a number of sample events with some of these groups, noting how varied and successful they were.
This was a rare talk at an academic conference where children figured prominently in the narrative. It struck me as a welcome departure: if we’re talking about engaging people in the humanities via the digital humanities, we’ve missed the boat if we think we can achieve this with students who have already arrived at university. It needs to start much younger.
With regard to making DH public, she asked how we in libraries can make the artifacts of DH work permanent and accessible, using Minecraft worlds as one example.
Growing the PLN: Challenges and Opportunities
John Durno, UVic; Bronwen Sprout, UBC
The PLN (COPPUL’s Private LOCKSS Network) is running up against space issues. This is a good thing, since it represents a ramping up of preservation activities, but also a problem that needs to be solved. The PLN has most libraries from COPPUL involved (10 in all, a fair chunk of COPPUL’s full members), and no fees are charged, although members must pay their own LOCKSS Alliance fees and are obligated to set up a 2TB LOCKSS box.
Bronwen described the materials: locally hosted open access journals (especially OJS journals), digitized collections, websites and other online resources, and electronic theses and dissertations. Overall, there are 764 archival units, representing less than 1TB in total. Why so little? She promised to come back to that point.
She described the governance model. The PLN falls under the Digital Preservation Working Group, which, beyond overseeing the PLN, takes an interest in broader digital preservation practices and needs, e.g., Archive-It and Archivematica-as-a-service.
John covered some of the challenges they’ve faced and how they overcame them. He started with an overview of a LOCKSS network, noting how the nodes can self-repair damaged content when necessary. They have a total of 2TB of storage available, which is sufficient for current needs, but the network faces what he termed “ingestion constraints,” i.e., technical challenges with getting content into LOCKSS in the first place. OJS has a plug-in that simplifies this, but other systems and platforms do not.
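The self-repair John described works roughly like a majority poll over content hashes. Here is a toy sketch of the idea (illustrative only, not the actual LOCKSS polling protocol, and all node names and content are made up): each node votes with a digest of its copy of an archival unit, and any node whose copy disagrees with the consensus is repaired from an agreeing peer.

```python
import hashlib
from collections import Counter

def digest(content: bytes) -> str:
    """Hash a node's copy of an archival unit (AU)."""
    return hashlib.sha256(content).hexdigest()

def poll_and_repair(copies: dict) -> dict:
    """Toy majority poll: any node whose copy disagrees with the
    consensus digest is 'repaired' from a peer in agreement."""
    votes = Counter(digest(c) for c in copies.values())
    consensus, _ = votes.most_common(1)[0]
    good_copy = next(c for c in copies.values() if digest(c) == consensus)
    return {node: (c if digest(c) == consensus else good_copy)
            for node, c in copies.items()}

# Three hypothetical nodes hold an AU; one copy has silently rotted.
aus = {"ubc": b"issue-1.pdf bytes",
       "uvic": b"issue-1.pdf bytes",
       "ucalgary": b"issue-1.pdf BYTES"}  # bit rot on this node
repaired = poll_and_repair(aus)
assert len({digest(c) for c in repaired.values()}) == 1  # all agree again
```

In the actual network the polling is considerably more sophisticated and runs continuously in the background, but the net effect is the same: a damaged copy is detected and healed from peers without manual intervention.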
As he put it, there are now some game changers: Archive-It, LOCKSSdm plugin, DSpace plugin, LOCKSS-O-Matic, and Archivematica. These tools and solutions make it easier to export content to LOCKSS. He jokingly referred to the DSpace plugin as a quasi-urban legend, but U of Calgary recently released some code, and Simon Fraser has been working away at LOCKSS-O-Matic.
The problem they face now is essentially who gets the space. With only 2TB, people won’t put their content in, assuming that others have similar needs (he described this as the quintessential Canadian “no, after you” problem). To begin to address this, the DPWG did a survey of all members (not just PLN nodes) to see what content they have that would make sense in the PLN. They also asked the nodes if they could scale it up to 4TB within a month, or 10TB within a year. They ultimately held a capacity planning workshop in May 2014 to try to address these issues and questions at scale.
Their recommended actions were based on the notion of “go big or go home”: put the PLN under central management and pull together more technologies and services. They have proposed a half-time COPPUL position to manage this work, and he showed a daunting list of tasks that could fall to this position, including re-architecting the PLN, software development, platform-as-a-service beyond Archivematica, knowledge and expertise sharing, etc.
In answer to the question “where to from here,” they offered these directions:
- evaluate other DP communities
- needs assessment
- structured expansion plan
- establish governance, membership, sustainability models
- cost-recovery services?
Using library lab PCs to crunch academic research data
Jonathan Younker, Brock
They’re using idle lab machines to do HPC tasks. They have 120 nodes, with a total of 468 cores and 632GB RAM. Student use has priority, of course, but when the machines are idle, such as overnight, they ramp up the usage.
It was, as he put it, “a solution looking for a problem.” Machines sit idle for long periods and represent a significant investment. This will also fall under their digital scholarship lab initiative, which is in development, and is a way to get research data and methods into the library collection. Their goal: to become a “baby SharcNet.”
Mentioned a few early adopters, which include some Brock researchers processing EEG data. They just wrapped up the first dataset in September, and the researchers are pleased with the results. Now they have a relationship with these researchers about data management, which is a desired and intentional side benefit of making this capacity available.
They use the Microsoft HPC Pack for this since it fits their existing environment well. They have had some challenges, such as a steep learning curve, and you need what he calls “a Mike,” i.e., a person skilled at setting this up (Mike was their guy). It also involves proprietary software, which introduces its own set of issues. Ideally, they could automate the whole process and involve more researchers (they have seven EEG labs, for example). They also need to invest a bit in hardware and, as usual, wrap this in SLAs, procedures, etc.
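The cycle-scavenging idea behind this is simple enough to sketch in a few lines (hypothetical code, not the Microsoft HPC Pack API): a scheduler hands queued work only to machines currently marked idle, and skips any machine a student is using.

```python
from dataclasses import dataclass, field

@dataclass
class LabPC:
    name: str
    idle: bool = True                       # e.g. overnight, no one logged in
    jobs_done: list = field(default_factory=list)

def scavenge(pcs, queue):
    """Dispatch queued compute tasks only to idle lab machines;
    machines in student use are skipped entirely."""
    for pc in pcs:
        if not queue:
            break
        if not pc.idle:
            continue                        # student use has priority
        pc.jobs_done.append(queue.pop(0))
    return queue                            # work left for the next idle window

pcs = [LabPC("lab-01"), LabPC("lab-02", idle=False), LabPC("lab-03")]
remaining = scavenge(pcs, ["eeg-chunk-1", "eeg-chunk-2", "eeg-chunk-3"])
# lab-02 is in use, so one chunk waits for the next idle window
```

As Jonathan’s talk made clear, the interesting operational work is everything around this loop: detecting idleness reliably, giving machines back the moment a student needs them, and wrapping the service in SLAs and procedures.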
#HackUOBiblio – libraries, hacking, and open data
Catherine McGoveran, U Ottawa
U Ottawa and Ottawa PL have begun to release their data. This was related to an open data event, where they met with various new partners and explored ways to collaborate on open data ideas and initiatives. The U Ottawa library is working with the registrar’s office and their data, for example, as an outgrowth of this new direction. Showed examples of data visualizations that were created, which ranged from the geographically broad, such as flights across Ontario, to the small, such as skating rinks in Ottawa.
Hacking the city: Libraries and the open data movement
Alex Carruthers, Edmonton PL; Lydia Zvyagintseva, U Alberta
Started with a brief description of the open data movement, noting that it has benefited municipalities and businesses more than the general public. The open data events they described–HackYEG and Open Data Day–are intended to address that.
One goal is to promote data literacy, which can be described in a set of competencies and is a subset of broader digital literacy. As they put it, these events also leverage community knowledge. They closed by including a quote noting that fostering this kind of engagement and literacy is the work of libraries.
Useful Usability Panel
Krista Godfrey, Memorial U; Gillian Byrne, Ryerson; Jeff Carter, James MacKenzie, U New Brunswick
Krista began with thoughts on user feedback, its unreliability, and the biases of those who offer it. Pointed out that the language we use when doing usability testing is not necessarily the language of our users, so we need to test the test before deploying it. She asked how many of us have a garbage can labelled “user feedback” where we chuck the stuff we don’t want to hear.
Gillian took up a topic she called “feedback triage.” This involves taking feedback and properly addressing it, either by providing answers, i.e., referring feedback to those who can solve discrete issues, or by putting it into a category that informs and drives decisions. She also noted that it’s important to gather not just the feedback itself but also the “meta-feedback,” i.e., how people were speaking and the words they used.
James spoke specifically about the user research they did in connection with a responsive web design project at UNB, which sounded like an extensive process.
Jeff pointed out that meeting accessibility standards does not guarantee a good user experience. His talk was about making accessibility and usability work together.