
Access 2018 in Hamilton

October 11, 2018

As always, Access was a delight to attend. I am so grateful to my former colleagues and still friends at McMaster, who carried on with the planning after I put my hand up at last year’s wonderful event in Saskatoon and volunteered to bring the conference to Hamilton, only to depart subsequently for the University of Alberta. I had the impression this year that both the audience and the list of speakers showed the result of years of steady progress toward diversifying Access, both topically and in terms of gender. This impression was confirmed when I went back and looked at my notes from previous Access conferences, where the preponderance of men isn’t difficult to spot. That is not to suggest there isn’t more work to do, as our keynote speakers so eloquently and forcefully underscored for us.

Onward to the notes I took. As always, all errors are mine, all brilliance theirs.

Wednesday, October 10


Opening keynote
Sheila Laroque – Edmonton Public Library

I rarely take notes during keynotes, since these talks are geared more toward provoking thought and relating a perspective than imparting specific information or practical steps. So it was with Sheila’s talk. Much to think about, in particular (for me) her mention that the CBC suspended comments on articles related to Indigenous topics and hasn’t lifted that suspension. It signals a lack of civility and understanding in the discourse around Indigenous topics that we all recognize to be true, which doesn’t make it any less distressing.

User Experience at McGill
Ekatarina Grguric – McGill

Pointed out that she doesn’t seek REB approval for usability testing, noting also that REB approval doesn’t necessarily resolve or fix ethics issues anyway. Summary of her approach: UX is a decision-making tool.

Walked through a number of UX case studies related to specific issues they addressed at McGill. The first involved improving the interface of self-check machines. Used Jenn Downs’s “laptop hugging” technique. The testing did not delay the rollout.

Second example involved WorldCat Discovery and a redesigned search bar for their main page. Tested various configuration decisions using A/B tests. Also used paper prototyping for their testing and she spoke positively about this for doing quick, low-cost testing.

Third example was an ongoing usability testing series: “guerrilla” usability testing. It gets her out to various branches, which is important for broadening access to usability testing. These tests are run by one note-taker and one facilitator, although they’ve also trained students to run tests solo.

Libraries in the Age of Extended Reality: Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR)
Ken Fujiuchi, Joseph Riggie – Buffalo State College

What can libraries do in this age? Content curation and evaluation: 3D scanning and 360 capture and quality and authority control. Also cataloging, not least for the linked data environment. Also digital preservation, not just preserving objects but also interactions and experiences, including 3D reproductions. Ken referred to this as “preservation through data.”

Showed examples of what they are doing in this area. One was a 360 recording kit with a Ricoh Theta V camera and related accessories, including instructions as well as legal guidance. There is also a portal where people can upload their 360 captures (Omeka S – no demo yet, brand new!). They use an augmented reality shelf reading tool called Shelvar. He noted that his role as systems librarian includes a lot of work that is pretty dull, so this is a departure from the norm. Sadly, Shelvar can’t come to market because Amazon owns too many patents. The last example was a variation on a theme. Joe pointed out that since Shelvar and its optical recognition tags aren’t available, one could use LIDAR to scan shelves, using the descriptive data in records to identify books. It’s an idea, not a reality. He called it a “Roomba” for shelf reading.

Developing an Open Source Application for Managing EzProxy Configuration File
Juan Denzer – SUNY Oswego

Noted that the EZproxy config file tends to bloat and get very long and cluttered. Described the dissatisfaction that exists out there in libraries with this file. Why do this:

  • Config files bloat over time
  • Neglected for years
  • Staff changes
  • 1000s of lines of stanzas

All of this eats time (humans can’t parse thousands of lines of EZproxy stanzas), and we have better technology, so why not apply it? He worked with two students, who convinced him to use Java rather than .NET, as well as a NoSQL database such as MongoDB. The tool should be able to search, sort, view, and test stanzas. It should also be able to manage proxy settings. Other features: a commenting system and the ability to export the managed config file. Future features:

  • Importing stanzas from OCLC or lists
  • Version control
  • Backup file system
  • Google calendar API integration
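
The core of such a tool is unglamorous parsing. A minimal sketch of the idea in Python (not Juan’s Java implementation; the stanza layout follows EZproxy’s usual Title/URL/Domain form, and the example entries are made up):

```python
def parse_stanzas(text):
    """Split an EZproxy config into stanzas, one per Title directive.

    Global directives before the first Title are ignored here;
    blank lines and # comments are skipped.
    """
    stanzas = []
    current = None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        directive, _, value = line.partition(" ")
        if directive.lower() == "title":
            current = {"Title": value, "directives": []}
            stanzas.append(current)
        elif current is not None:
            current["directives"].append((directive, value))
    return stanzas

def find_stanzas(stanzas, needle):
    """Case-insensitive substring search over stanza titles."""
    return [s for s in stanzas if needle.lower() in s["Title"].lower()]

config = """\
Title Example Journals
URL https://journals.example.org
Domain example.org

Title Sample Database
URL https://db.sample.edu
HJ sample.edu
"""

stanzas = parse_stanzas(config)
matches = find_stanzas(stanzas, "sample")
```

Even this toy version shows why a structured store beats a flat file: once each stanza is a record, searching, sorting, and commenting become ordinary database operations.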

Data migration to Open Journal System (OJS) using R
Yoo Young Lee – U of Ottawa

Walked us through pulling data out of systems and cleaning it with a series of R commands. Also needed to create the OJS XML format, so she had to build her own R function to do this. Pointed to other ways one could use R: DH work, data wrangling, data analysis, machine learning, etc.
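
The export step can be illustrated in outline. A hedged Python sketch of the same mapping (not her R function; the element names below are simplified stand-ins and do not track the real OJS native-XML schema):

```python
import xml.etree.ElementTree as ET

def article_to_xml(record):
    """Serialize one cleaned article record to a simplified XML element.

    A real OJS import file needs the full native-XML schema (issues,
    sections, galleys, etc.); this only shows the shape of the mapping
    from a tabular record to markup.
    """
    article = ET.Element("article")
    ET.SubElement(article, "title").text = record["title"]
    authors = ET.SubElement(article, "authors")
    for name in record["authors"]:
        ET.SubElement(authors, "author").text = name
    ET.SubElement(article, "pages").text = record["pages"]
    return ET.tostring(article, encoding="unicode")

xml_str = article_to_xml(
    {"title": "A Study", "authors": ["Jane Doe", "Li Wei"], "pages": "1-10"}
)
```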

When asked how to get into this work, she mentioned that she learned R in her statistics courses, but that anyone who knows any scripting language such as Python should be able to get into it. She called R an intuitive language.

GAME ON!!!: Building Interactive Educational Fiction
No wait, title change: Failure! Games are hard?!
Ruby Warren – U of Manitoba

Gave a good introduction to her starts and false starts with this work, including swapping out the game engine midstream. Offered a rule: “don’t do everything in the wrong order.” In essence, don’t waterfall a project. She underestimated how long it would take to learn the tools and the design work.

Anyone thinking of building an educational game should double their time estimates. Then triple them. Make time for:

  • learning outcome design
  • engine familiarity
  • customization
  • scripting
  • everything being on fire

Last piece of advice: don’t forget about the users. She was a bit embarrassed that she did this, as a UX person, but she got sucked into the project and lost sight of testing during the development phase. With regard to users:

  • plan for apathy
  • plan for testing and trials
  • think about appeal – will the game be attractive?

See the work at her site.

Archive your ILS: How to keep a copy of your old ILS when you migrate to a new ILS
Calvin Mah – Simon Fraser U

SFU just migrated from III’s Millennium to Alma. Data such as order records do not migrate, nor do patron fines details (only the fine amount). As he put it, people don’t trust migrated data, so want to compare the new records with the old.

Calvin noted that Millennium lacked export abilities, e.g., with regard to library fine information, so he ended up screen-scraping all of it for the archive database.

What are the challenges? No one will use the archive until they have to, i.e., when the old system goes dark. Also, users want it to behave like the old ILS. They also wanted searching that could respect the second filing indicator (The New Yorker vs. New Yorker). Yeesh! Privacy was another concern given BC’s strict privacy regime. To respect this, they had to restrict access to patron data much as the ILS does.
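
The filing-indicator request boils down to skipping a recorded number of non-filing characters when sorting or matching. A small illustrative sketch (the titles and counts are made-up examples; the 4 for “The ” follows MARC 245 second-indicator practice):

```python
def filing_key(title, nonfiling=0):
    """Return the sort/search form of a title, skipping the number of
    non-filing characters given by the MARC 245 second indicator
    (e.g. 4 to skip "The ")."""
    return title[nonfiling:]

titles = [
    ("The New Yorker", 4),
    ("Atlantic Monthly", 0),
    ("A Room of One's Own", 2),
]
# Sort by the filing form, not the display form.
in_filing_order = sorted(titles, key=lambda t: filing_key(*t))
```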

The archive resides in Postgres and they index with Solr to make searches fast. Downsides: the data doesn’t update and goes stale, and staff might become reliant on it.

What did they learn: it was handy in the weeks after the old system went dark. Also, there’s a privacy concern around the patron data, so they plan to remove the patron data and load the fines data directly into Alma.

Contract Work Means Work for You Too: How to make your project sustainable after the contract is over
Bobbi Fox – Harvard U

Her talk was largely a plea to administrators to tread carefully when setting up contract work. Before one even gets started, it’s essential to consult with the IT group to find out what technologies they support, to ensure that the vendor is using technologies the local team can support post-project (simple example: Windows vs. Linux). Are the operations people willing to support the project? Are the resources available to handle upgrades and updates? Who provides user support? Who owns the project going forward? These are critical questions. Some may not apply if it’s an enhancement to an existing system.

Also ask yourself what your minimum viable product would be. It all takes longer and costs more than you might budget. If you want the result to be open source, ask ahead of time whether the vendor will support that goal and follow through with it. Will it be accessible? Not just in the WCAG sense, but also for people who lack the latest and greatest hardware and software. Use version control. Always. The system must be configurable. Tests must be run. Logging should be in place and should provide meaningful and actionable errors.

Documentation got its own slide. It should include or be:

  • installation / deployment
  • technical (why, not how)
  • user-oriented
  • useful information for those who provide end-user support
  • attached to each version, not just the end

Ensure the contractor will know and be willing to use the language you need. Work out the details around open source, meaning that there will be version updates so the application should be coded accordingly. Identify and ensure vendors understand any existing customization hooks.

“Contract management is not for the conflict averse.” If the contractor isn’t getting the job done, listen to those who can assess that and take action when necessary.

When the Digital Divides Us: Reconciling Emerging and Emerged Technologies in Libraries
Monica Rettig – Brock U; Gillian Byrne – Toronto PL; Krista Godfrey – Memorial U; Rebecca Laroque – North Bay PL

Panel. First question: staff are often on the front line of troubleshooting in a single service point model. How has this impacted staff? Monica: AskUs desk staff provide the first round of troubleshooting. These staff ask technical staff for top ten tips, a checklist, etc. But that’s not how troubleshooting works; it takes a shift in mindset. Krista: Students don’t understand when things don’t work, such as when printers fail because the students are crossing an invisible boundary between library and campus computers. Solution: the library still manages the machines, but uses the central image. Not perfect: some machines run Windows 7, some Windows 10. Suggested that more of us do a “work like a patron day,” as at Brock.

How to do both the new and the old (a bad gloss of the actual question)? Rebecca: move less quickly and consider what you are doing. We need not just to move forward; we need to lift users up to come along. Monica: care and thoughtfulness should guide our actions (used the washroom example). We need to consider how people use our buildings; similarly, we need to think about how they use our digital tools, including foundational ones such as printing systems.

The scope of IT work continues to expand. How do we cope with this? Krista: strategic priorities take precedence. They are the ‘shiny,’ versus the core. We see shiny and want to do it sometimes without considering whether our users are actually interested.

Core topics in the Web Content Accessibility Guidelines (WCAG)
Mark Weiler – Wilfrid Laurier U

WCAG dates back to the 1990s. Version 2.0 emerged in 2008 and is used as the basis for Web content accessibility requirements in many jurisdictions, including most of Canada. There have been criticisms, with other initiatives such as the Global Public Inclusive Infrastructure suggesting other ways forward.

WCAG is not just something to hand to a technical team; that reflects a “narrow understanding” of the intent of WCAG 2.0. It should also involve policy makers, teachers, administrators, etc.

His analysis in this talk stems from reading the primary source documents (four in all). What are they like? The tone is formal, the vocabulary precise. The idea is to be clear to an international audience, so this and the intricate degree of organization are necessary. WCAG isn’t technical in nature at all: technologies change quickly, so if the guidelines had addressed current technologies they would have become dated very quickly. It uses abstractions instead.

He gave a good summary of the guidelines and their levels that I cannot capture in notes (his slides will do the job far better).

Practical Linked Data Implementation: Trail of the Caribou
Heather Pretty – Memorial U

Why linked data, she asked. It gives us the ability to answer complex questions that span multiple sources. She gave a brief primer on linked data.

Mentioned the Muninn project and showed how to query it for any predicates and objects related to a subject (a specific name). She used Apache Jena to do remote SPARQL queries. A single query can go against your local triple store as well as against remote stores and then update your local store (I think I got that last piece right).
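
A federated query of the kind she described might look like the following SPARQL sketch (the endpoint URL, name, and predicate are assumptions for illustration; Muninn’s actual endpoint and vocabulary may differ):

```sparql
# Ask a remote endpoint for every predicate/object attached to a
# person found by name; results come back to the querying client.
PREFIX foaf: <http://xmlns.com/foaf/0.1/>

SELECT ?p ?o
WHERE {
  SERVICE <http://rdf.muninn-project.org/sparql> {
    ?person foaf:name "John Smith" .
    ?person ?p ?o .
  }
}
```

Jena’s ARQ engine executes the SERVICE clause against the remote store (SPARQL 1.1 federation), which is what lets a single query combine local triples with remote ones.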

Great talk. If you ever need someone to come teach or explain linked data in your organization, Heather would be fabulous at this if it’s something she wants to do. She noted in response to a question that this work resulted from learning that she did during a sabbatical.

Thursday, October 11

Sh!t Happens
Krista Godfrey – Memorial U

Walked through how they recovered from their major data blowout. One thing she noted was that their documentation was in their ticketing system, which went out with the data centre. Suggested having some print documentation on hand.

As she put it, their crisis enabled them to do in two months what they had planned to do in five years. They were able to restore things back to current state, with the exception of one service which had to be changed.

RA 21: What are you doing to protect your patrons?
Tim Ribaric – Brock U

Started with a somewhat tongue-in-cheek speech in praise of EZproxy, tracing its demise to its acquisition by an ‘Ohio-based’ company. Finished that with the question: why are we moving away from IP authentication? What was wrong with it?

RA21 is intended to move beyond IP authentication, but they aren’t planning to develop tools, just lead a conversation. Mocked their FAQ and its tone about IP authentication, which asserts that it is hard to maintain and confuses users. Showed the list of people on the RA21 steering committee and it is chock-a-block with vendor representatives. [n.b. – The co-chairs of the steering committee work for Elsevier and the American Chemical Society, so before anyone barks at me about my reflexive lack of trust, I suggest they consider the past actions of these and similar publishers who belong to the AAP with regard to shenanigans such as PRISM in 2007 and the Research Works Act in 2011, the former an anti-open access smear campaign and the latter an attempt to enact legislation to prohibit open access mandates. I do not extend trust to proven bad actors. The mere tone of the RA21 Website reminds one of the language used during the PRISM stunt: overly dramatized and based on starting assertions without the slightest gesture toward any research that would support the claims.]

His theories:

  • Resistance to IP authentication is bunk.
  • RA21 is a move by vendors to “eat our authentication lunch,” i.e., it’s in their interest, not ours.
  • If you care about user privacy, you must intercede. Our log files have in-depth information that we need to protect, not open to vendors.
  • We are screwed, but it’s not hopeless. “We are facing down Goliath.”

How to push back? Change the narrative around access control: it’s our job, it’s not that hard, and we can do it. We need to develop capacity in this area. Support newer schemes: SAML, Shibboleth, etc. Make it business as usual to work in this area. We need to spread the word: IT needs to work with Collections; the administration needs to understand patron rights and privacy. We need to concentrate on agnostic tools: LibX, Zotero, etc. Keep open the things that can be kept open, e.g., the ILS.

Programming Historian IRL
Adam Doan, Kimberley Martin – U of Guelph

Kim was running a basic digital research skills meet-up, which turned out to attract an all-female crowd. She wanted people to learn as a community, rather than setting it up as instruction or assignments. They met for about two hours every two weeks. They used Programming Historian as their ‘curriculum.’ Python was their choice to learn first. Aside from learning Python, they also covered Git and GitHub, TwitterBots, and Audacity (podcasting).

What worked? The self-direction by the participants did. It had low overhead for the organizers and engaged peers in supporting each other. They also had a buddy system for late-joiners. Programming Historian also worked well for them in some regards. Kim was excited to see women attend and stick with it. She doesn’t know why exactly this happened, but suggested that using an image of a woman on the poster advertising it may have played a role, as her involvement may also have done.

Not everything was perfect. There was irregular attendance and participants had different knowledge levels, so it was challenging to maintain momentum. As Adam put it, writing code every two weeks means that it’s hard to remember and build between sessions. Programming Historian also had some issues, such as a lack of cohesion, since the lessons are written by many people. There are also gaps in the content, with leaps from simple concepts to advanced ones without the middle steps or explanations.

Integrating Digital humanities into the web of scholarship with SHARE: An exploration of requirements
Joanne Paterson – Western U

Encouraging the community to support SHARE and to provide ideas about what SHARE could be doing and how the data could be utilized. Showed how it can be searched via a discovery tool, but also pointed out various APIs.

Offered some key conclusions to her discussion of the DH topic. One is that the article remains the primary vehicle for dissemination and evaluation. Also, peer review of DH projects is challenging.

Navigating through the OER Desert with OASIS
Bill Jones, Ben Rawlins – SUNY Geneseo

The goal of their project is to make it easier to find and use OER. They started in April 2018 and launched in September with 52 resources and ~155,000 records. Have added more in the meantime.

One thing they wanted to do was institute a degree of quality control, so they relied on other partners (SUNY OER Services) at the school to suggest quality resources.

How did they build it? They stressed that it wasn’t super high tech: “people can do this!” They use Python scripts to pull content from the targets and host OASIS at SUNY Geneseo.

Check out OASIS.

(Why Aren’t We) Solving common Library problems with common systems?
May Yan, MJ Suhonos – Ryerson U

May started out by talking about what an Electronic Resources Librarian does: access, knowledge base fidelity, etc. To do this, one needs to look at acquisition records, but these come in various forms. They need a system to manage these because they document their legal right to access resources, but they have limited resources and need low effort solutions.

They started by writing functional requirements. It must integrate with the ERM workflow, be easy to use, and allow access restriction. They looked around at systems and thought that AtoM might be a good fit. They had experience with it already for archives and special collections work. Their verdict was that it was workable but that it still required development to meet their requirements.

One feature of AtoM that fit well was that it understands and represents relationships given its original purpose. They took a system they knew and configured it to work for their use case.

May described an analysis they undertook to come up with principles to guide their strategy for technology used when supporting collections. One stipulated that when library technology does not exist, they look around for solutions from outside that world. MJ spoke about the types of functions they needed within software and noted that WordPress can actually do all of those functions. In their ERM example, WordPress is the middleware that ties a bunch of components together, including multiple services external to the university and libraries (indexing, file storage).

What’s next? Potential applications: institutional repository, archival collections, catalogue, intranet, faculty research projects, etc.

Developing a Digital Initiatives Centre at a University Research Library
Shannon Lucky, Craig Harkema – U of Saskatchewan

DI, under various names, goes back 20 years at U of S. Much of the early work was grant-based (into the early 2000s). They now need to expand their model to bring in DH, data visualization, etc.

They recognize the centrality of metadata work, but do not have a metadata librarian. They have expertise, but it comes via collaboration and learning as they go. As Craig put it, “we’re expected to know this stuff.”

They also spoke about DAMS (digital asset management systems). Many DH projects can benefit from the use of a DAMS, so they’re trying to make sure their infrastructure and tools are ready for this.

From there, they turned to curation. Shannon opened by characterizing this as a matter of prioritization. Where to apply one’s efforts is important. Need to know what the work will take and what the scope will be. It also means a shift in focus from internally on special collections and archives toward faculty work and projects.

They have learned that having space, any space, is key. Theirs is not beautiful, but they have it and are pulling things together in that space. The focus for their launch is making this space more attractive and functional. Its symbolic presence is worth respecting.

Open Badges for demonstrating Open Access compliance: A pilot project
Christie Hurrell – U of Calgary

Gave a brief explanation of open badges and then turned to the issue of using open badges to encourage deposit to the institutional repository. Some academic journals are now issuing badges to reward compliance with open practices, and there is some literature showing that these badges do, in fact, increase targeted behaviours (Kidwell et al., 2016).

Their approach was to survey Tri-Agency funded researchers to ask their opinion of this approach and to do some user testing around actually applying the badges in the IR. They used a Web survey and had a 22% response rate (n=48). They found that researchers are not willing to spend much time applying badges.

They did user testing with six people drawn from the survey respondents. Used a mockup of their DSpace UI, with an added field to choose an open badge. As with typical testing, they gave subjects a scenario and tasks to complete. The range of time to complete the task was fairly broad, with description of the submission taking the longest (many fields require manual entry). Found that users ignored the badge field because they were already overwhelmed by the other inputs. No participant noted it.

Doing the Work: Settler Libraries and Responsibilities in a Time of Occupation
Monique Woroniak – Winnipeg PL

Noted at the outset that we need to do our homework. There are–fortunately–so many Indigenous voices now (as compared to fairly recently) from whom we can learn. That’s the good news, as she put it. It’s easier to connect with resources and hear Indigenous voices. No excuses in that regard.

Not all of the work in this area needs to be done by them, i.e., Indigenous people. We should find things on our own and take care of our own education needs. “At least make the attempt,” as Monique put it.

The second thing we need to do is assess our capacity. Monique noted that this, like doing our homework, is really a no-brainer, but clearly we need to be observant of these steps and not rush in. What capacity do our staff have for building respectful relationships with Indigenous people? Will this be the first time? Are there experiences from their personal lives that will support this work or move it forward? The worst thing we can do in this work is over-promise and under-deliver.

More no-brainers, per Monique: listen. Keep up with stories, social media, themes, trends, issues. Listen for fissures in the community: Indigenous people do not speak with one voice.

This work is both head work and heart work. Engaging in this work should change people. She described what I would characterize as going through a process where Indigenous people are as close to her and integrated into her life as the people around whom she was raised. Narrowing distance and building normal, healthy interpersonal relationships sounds like a wonderful goal.

This work takes a long time; in her words, it’s long-haul work. As she noted, it took seven generations for the Canadian nation to reach the point where the TRC addressed the past, so it’s not going to be reversed and made right overnight. Front line workers should “safeguard” this notion, which I took as saying that administrators may take a shorter view because they want to show progress or meet an objective. I get that and agree that it’s something that could happen in many organizations, including mine. All that said about it taking a long time, one does have to start.

After walking us through these tips, she brought it to a point: we need to shift the centre of power. We are very far from this, but we need to set it and see it as a goal. We do need to speak up when we are not taking these steps forward. When we are skipping steps or missing opportunities, we need to say something.

2018 is “no excuses time.” We know the issues. They have been laid out before us.

