CNI Fall 2015 notes
As per usual, this week’s CNI meeting offered a surfeit of updates and reports on a number of interesting and emerging projects. After making the hard choices about what to skip and what to see, I tried to take detailed notes on the talks below.
I’ve been doing this for a number of years–taking extensive notes and publishing them via this blog–and have heard from many of you via various channels that the notes are useful. They certainly help me with remembering what I’ve heard, as well as when and where and from whom. Please share any suggestions or requests with me through whatever channel works for you or via the comments below.
Digital Dissertations in an Increasingly Welcoming Landscape
Amanda Visconti, Purdue; Matthew Kirschenbaum, Maryland
Visconti noted that she wants to move beyond a discussion of whether digital dissertations are ‘equal’ to written chapters to a broader discussion of the many ways one can achieve scholarly goals. Interested in work that takes public digital collections and makes them “truly” available to all, not just to specialized scholars. In other words, public humanities in a participatory vein.
Her dissertation asked what would happen if one created a text edition and invited everyone to read it and work with it. Created InfiniteUlysses.com, an open edition of Joyce’s Ulysses. This work involves building the edition, research blogging, and a whitepaper as debriefing (written in the last month before her defense). She set up her own eponymous Website to host all of the pieces of her dissertation. A zipped version was deposited in the institutional repository.
As she put it clearly on a slide, the major challenge was how to evaluate the dissertation. In simple terms, it needs a Joycean and an interface critic, among other expertise. As she noted, this takes some time and effort to organize, but it leads to a conversation between the parties. She noted that working with graduate students doing such work is time intensive, and faculty doing it need consideration both in terms of credit for this work and release from other duties to enable it.
Matt commented on her work and noted that she had taken the notion of a digital dissertation further down the path of moving away from traditional dissertations. She is not the first to include digital components in a humanities dissertation, but the scope of her work and the multiple platforms used made it a broader and more complex project.
Preparing for New Roles and Transformed Libraries: Models and Implementation
Kristen Burgess, Ted Baldwin, Leslie Schick – U of Cincinnati; Greg Raschke, North Carolina State U; Deanna Marcum, Ithaka
Transforming Roles for Transformed Libraries – Deanna Marcum
Started out by attempting to define what a “transformed library” might be. Noted an ALA definition she had found that discussed engaging in new behaviours and supporting communities, but found that it didn’t go far enough. Offered her own attributes:
- user focus
- services, not collections
- agile, experimental
- fluid teams solving specific problems
- measure outcomes and impact
- evidence-based decisions
Continued by outlining more specific services and roles a transformed library fills, including the notion that technical support becomes as important as content. Other aspects that relate to teachers and researchers are familiar: data management services, supporting online teaching, tool and technology assistance, data analytics, etc. There were more, but these are the ones that popped out at me.
Preparing Research Librarians for Transformed Libraries: Creating a Community of Practice – Greg Raschke
His talk focused on the Hunt Library as a major “evolutionary driver” in their organizational development. Moving from providing materials to being a collaborative partner presents a number of opportunities. Noted their guiding principles:
- engaging research enterprise at all levels
- integration
- foundation for expert consultancy
- pole into the future
- mosaic – informal, organic, formal, credentialed
He noted that it’s not helpful to be negative in tone about subject librarianship. A positive approach helps.
Noted the formal components of a community of practice, which include:
- organizational job expectations – it has to be part of the job, with all the rewards and assessment that that entails
- time and resources – put money and time into this, explicitly
- organizational expertise – hiring experts: copyright, coders, etc. – key is to engage them in teams and projects, not let them cordon themselves off
- peer-to-peer learning – he didn’t expect this to happen, but found that the librarians made it effective
- formal training opportunities – put out calls for participation and got overwhelming responses
- engagement for purpose and mastery – intrinsic motivation, in other words, works better than extrinsic efforts such as money
Preparing for New Roles – Kristen Burgess, Ted Baldwin, Leslie Schick
The new role they’ve created is an informationist. Why informationists? It reflects a U.S. trend in academic health sciences libraries. They also identified a number of new work areas they needed to address: data services, GIS, grant support, etc.
When first used, the term meant “information specialists in context.” UC created and filled a number of informationist roles: clinical, science, and research.
Showed a bit about how they have adapted some spaces in their various locations to accommodate this newer work. More open, flexible, technology-rich, etc. They also plan and host events, such as a GIS Day.
The Great VCU Bike Race Book: Visualizing the Mind of a University
Gardner Campbell, Virginia Commonwealth U
Started with the unusual notion that a data visualization of the ‘mind’ of a university would be far more informative than all of the reports and statistics we generate. How would one do this, he asked. The Great VCU Bike Race Book is an attempt to move toward such a thing.
In conjunction with the UCI World Championships, VCU ran a series of one-credit, $50 courses on topics inspired by the racing. It worked out, and led to some outcomes that they didn’t expect. One interesting thing to note is that it all happened very quickly. There had been a debate about whether to keep campus open during the race or just close down to avoid the traffic and other issues. In February 2015, they decided to keep campus open but cancel classes and call them “reading days.” The race was still being cast as a “bike apocalypse” in many ways. In March they put out a call for courses; in April they set up WordPress sites that were intended to be aggregated into a central site. They developed the courses from mid-August to mid-September. Gardner hired a project manager, a postdoc in behavioural genetics; noted that her efforts were key to keeping things on track. The work was coordinated and steered by VCU’s ALT Lab.
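He didn’t go into the plumbing of the aggregation, but the usual pattern for pulling distributed WordPress sites into a central hub is to poll each site’s RSS/Atom feed and merge the entries. A minimal sketch of that pattern, with hypothetical feed URLs (the real course-site addresses weren’t given in the talk):

```python
import time
import feedparser  # third-party: pip install feedparser

# Hypothetical course-site feeds; placeholders, not VCU's actual URLs.
COURSE_FEEDS = [
    "https://cyclingandfilm.example.edu/feed/",
    "https://anthropologyofthecrowd.example.edu/feed/",
]

def aggregate(feeds):
    """Merge entries from several RSS/Atom feeds into one newest-first stream."""
    entries = []
    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            published = entry.get("published_parsed")
            entries.append({
                "course": parsed.feed.get("title", url),
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                # Undated entries sort to the end.
                "timestamp": time.mktime(published) if published else 0.0,
            })
    entries.sort(key=lambda e: e["timestamp"], reverse=True)
    return entries

for e in aggregate(COURSE_FEEDS)[:20]:
    print(f"{e['course']}: {e['title']} ({e['link']})")
```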
They made some very attractive images–cycling jerseys–to advertise the courses. The names on the jerseys were “Cycling and Film” and “Anthropology of the Crowd” and so forth. Really eye-catching.
Had to teach students a bit about copyright, so they asked them to apply a Creative Commons license to their work. Most agreed, with only three declining. As he noted, this was a stellar learning opportunity. It also enabled deposit into VCU’s IR, Scholars Compass, which was another teaching opportunity (and told students that their output was scholarly and valuable).
Archivportal-D: The National Platform for Archival Information in Germany
Christina Wolf, Nadine Seidu – Landesarchiv Baden-Württemberg
This was this year’s “DFG” talk about a national portal to archival resources that the Landesarchiv Baden-Württemberg manages for the nation. There are 16 such state archives in Germany, one for each federal state.
They posed the question: why does one need such a German archives portal? The division of materials across 16 archives means that users would theoretically have to know where materials are held in order to know where to search, and given the federal structure of Germany, this kind of distribution is inevitable. The Archivportal-D, as one would expect, is meant to aggregate the collections and make them available via a single search interface. This project is not the first attempt at this; earlier efforts failed for a variety of reasons.
All archives in Germany may participate, and it includes content from the existing German Digital Library. There are about 11 million records in all. Content is not homogeneous. There is digitized material, but also finding aids, guides to holdings, and information about the archives themselves. They currently have 80 partners and references to ~600 archives.
The German Digital Library (Deutsche Digitale Bibliothek – DDB) collects broadly, not only from libraries, with the goal being to offer everyone unrestricted access to Germany’s cultural and scientific heritage. It currently stands at 18 million objects. As with many such projects in Germany, it received substantial federal and state funding (9.5 million Euros, for example, as startup funds from the federal government). They outlined how it operates, with the German National Library acting as the technical coordinator with service centres operating around the country, including their home archive.
How do the two work together (DDB and Archivportal-D)? The latter uses specific software components of the DDB but adds other elements that are specific to the presentation and description of archival materials. In other words, they didn’t have to completely reinvent the wheel, which saves resources. Archives can prepare their materials and push them to both platforms in one step, which is critical. The core argument for the archives platform is the need to reflect archival practice and requirements.
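The talk didn’t name the transfer mechanism, but OAI-PMH is the standard harvesting protocol for this kind of metadata aggregation in the library and archives world. A minimal sketch of what harvesting from a single archive might look like, assuming an OAI-PMH endpoint (the URL is a hypothetical placeholder):

```python
import itertools
import urllib.request
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def harvest(endpoint, metadata_prefix="oai_dc"):
    """Yield (identifier, title) pairs from an OAI-PMH endpoint, page by page."""
    url = f"{endpoint}?verb=ListRecords&metadataPrefix={metadata_prefix}"
    while url:
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        for record in root.iter(f"{OAI_NS}record"):
            header = record.find(f"{OAI_NS}header")
            identifier = header.findtext(f"{OAI_NS}identifier")
            title = record.findtext(f".//{DC_NS}title", default="")
            yield identifier, title
        # Follow the resumption token until the server signals the end.
        token = root.findtext(f".//{OAI_NS}resumptionToken")
        url = f"{endpoint}?verb=ListRecords&resumptionToken={token}" if token else None

# Hypothetical endpoint, for illustration only.
for ident, title in itertools.islice(harvest("https://archive.example.de/oai"), 10):
    print(ident, title)
```

The appeal of the one-step push they described is exactly this: a single export from the archive can feed both the DDB and Archivportal-D.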
The project started in 2012 and runs to spring 2016. DFG put in money, and there are six major partners: three archives, the German Archives School, the National Library, and FIZ Karlsruhe, an IT infrastructure organization. The original version went live in September 2014 at Europe’s major archives conference.
More work to do, of course. They still need to add more content, improve ingest structures and processes, optimize display for mobile devices, etc. Many goals remain, it seems.
The idea of Archivportal-D is not to replace its component archives, but to connect these individual archives with larger European initiatives (e.g.- Archives Portal Europe, Europeana). In other words, access to an archive’s collections can run from the archive’s own interface all the way up to an international aggregation such as Europeana.
The challenges they face are not insignificant. Participation for smaller archives is a problem, as one would expect. It’s also surfaced the need for persistent identifiers that work at scale, not just at the local level. Authority files are also a bit troublesome. The Gemeinsame Normdatei (GND) exists and is maintained by the National Library; they want to integrate it into the portal in the future to help push this forward.
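As a side note for anyone curious what GND integration might look like: GND records are openly resolvable, and the lobid-gnd service exposes them as JSON. The sketch below is purely my illustration, not anything the speakers described; the endpoint, parameters, and field names are assumptions based on my recollection of that API.

```python
import json
import urllib.parse
import urllib.request

def gnd_search(name, limit=5):
    """Search GND authority records by name via lobid-gnd (assumed API shape)."""
    query = urllib.parse.urlencode({"q": name, "size": limit, "format": "json"})
    with urllib.request.urlopen(f"https://lobid.org/gnd/search?{query}") as resp:
        data = json.load(resp)
    # "member", "gndIdentifier", and "preferredName" are my assumptions
    # about the response structure; check the lobid-gnd docs.
    return [(hit.get("gndIdentifier"), hit.get("preferredName"))
            for hit in data.get("member", [])]

for gnd_id, label in gnd_search("Landesarchiv Baden-Württemberg"):
    print(gnd_id, label)
```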
Emulation as a Preservation Strategy
David S.H. Rosenthal, Stanford
Started by noting that emulation and virtualization have a long history. Notwithstanding the demonstrated feasibility of emulation (thought by many to be impractical), format migration has been the preferred method for enabling access to legacy materials. Noted the work done at Freiburg, Carnegie Mellon, and elsewhere that is showing that emulation isn’t such a bad alternative.
Showed an example using Theresa Duncan’s feminist CD-ROM games from the mid-1990s. Started a Mac emulator to show this working in a modern Web browser. Mentioned Freiburg’s bwFLA, which we heard about at a previous CNI meeting (talk by Webster and Cochrane, notes here). It’s emulation as a service, which means no need for a user to install anything or use wonky ports, etc.
Showed another example where old Web browsers are emulated, pulling content from the Internet Archive and other Web archives and rendering it in period-appropriate browsers. As he noted, even in those browsers the content doesn’t always render uniformly, let alone in modern browsers, as we know. Also showed a 1997 version of TurboTax running on Windows 3.1; it runs in Chromium on Ubuntu (his platform). Gave the technical details, noting that it relies on the Olive Archive software developed at Carnegie Mellon. The details were hard to follow, but the gist was that not much network traffic is required and that, by virtue of caching, most of the work is done on the local machine. The downside is that it takes a specific environment, but it’s not one that is hard to establish.
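As best I could follow, the low network traffic comes from demand paging with a local cache: the emulator fetches disk-image blocks over the network only on first touch and serves repeats locally. A toy illustration of that pattern (not Olive’s actual code; fetch_block is a hypothetical stand-in for an HTTP range request):

```python
BLOCK_SIZE = 64 * 1024  # 64 KiB blocks, an arbitrary choice for the sketch

class CachedImage:
    """Toy demand-paged disk image: fetch each block over the network at
    most once, then serve all later reads from the local cache."""

    def __init__(self, fetch_block):
        self.fetch_block = fetch_block   # e.g. one HTTP Range request per block
        self.cache = {}                  # block index -> bytes

    def read(self, offset, length):
        out = bytearray()
        first = offset // BLOCK_SIZE
        last = (offset + length - 1) // BLOCK_SIZE
        for block in range(first, last + 1):
            if block not in self.cache:  # only the first touch hits the network
                self.cache[block] = self.fetch_block(block)
            out += self.cache[block]
        start = offset % BLOCK_SIZE
        return bytes(out[start:start + length])
```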
Tried to show an example of the world’s first spreadsheet program, VisiCalc, using the MESS emulator developed by gamers and available via the Internet Archive. The IA has over 36K software packages available for emulation, some of which work well, others less well, as he noted.
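If you want to poke at the IA’s emulated software yourself, its advancedsearch API can enumerate the items. A short sketch, where the collection name is my guess at the right facet:

```python
import json
import urllib.parse
import urllib.request

# "softwarelibrary" is my guess at the collection facet for these items;
# adjust if the IA organizes its emulated software differently.
params = urllib.parse.urlencode({
    "q": "collection:softwarelibrary",
    "fl[]": "identifier",
    "rows": 10,
    "output": "json",
})
with urllib.request.urlopen(f"https://archive.org/advancedsearch.php?{params}") as resp:
    data = json.load(resp)

print(data["response"]["numFound"], "items; first few:")
for doc in data["response"]["docs"]:
    # Each item plays in the browser at its details page.
    print("https://archive.org/details/" + doc["identifier"])
```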
There are concerns in this area, some of them around emulators. The developers of QEMU, for example, do not prioritize preservation. Other emulators, particularly those from enthusiasts, work well in many instances but are unreliable from a preservation perspective. Metadata is also an issue. Emulation requires technical, bibliographic, usability, and usage metadata. Tools exist for technical metadata, but there are no tools for bibliographic and usability metadata; it’s hand-created and crowdsourced (which works well, as he noted, for games).
Fidelity is also a concern one hears about around emulation. CPU and memory can be emulated exactly, but other devices may not be so fixable. As he noted, playing Spacewar in the 1960s involved loading paper tape, pressing buttons, and staring at a large, round CRT. Using modern hardware simply isn’t the same. Load and scaling are also concerns. Currently emulation isn’t a major infrastructure problem, but as it becomes popular, problems will arise and costs will increase. For example, Rhizome’s release of Duncan’s games created a usage spike and increased their AWS hosting costs.
He shifted the topic from emulating artefacts from the 20th century to current artefacts, which are more fluid (not fixed code, but Web programs) and far vaster in scale. This makes preserving current artefacts much more difficult, which neither emulation nor migration can entirely solve. We also have seen an evolution in end-user devices, not least the rise of smartphones. As he noted, sales of “traditional” desktops and laptops are in free fall, while tablets have plateaued. The shift in computing from the CPU to the GPU and the lack of Moore-scale speed gains in the former have changed the landscape.
Not surprisingly, there are also legal issues. Emulation is constrained by copyright and end-user license agreements. As he noted on his slide, “emulation almost certainly violates EULA.” When confronted with takedown notices, complying with them is the path of least pain. This conflicts with how memory institutions collect, where instability is anathema. National libraries, he noted, have collection permission, but these rights are not being exercised with regard to software. Even if they could not distribute online, they could offer access on-site. He asked whether lending could solve the access issues, as it has for e-books. The IA lends e-books and has faced little resistance, so perhaps it could work with software.
Experiences with High Resolution Display Walls in Academic Libraries
John Brosz, U of Calgary; Josh Boyer, North Carolina State U; Patrick Rashleigh, Brown U
NC State showed their visualization facility in the Hunt Library, as one would expect given the exposure it’s gotten in recent years. Their screens wrap around the room and are quite visually intense. Brown has a Digital Scholarship Lab with a 7′ x 16′ wall made up of 12 tiled displays. As Patrick noted, the most important asset in the room is the flexible furniture. Calgary also has a large wall, as well as a touch table.
Calgary noted that they quickly saw substantial usage from humanists, who were viewing manuscripts, images, etc. They had expected mainly visual analysis coming from the sciences. They ran a study, inviting people to come to the room and use it, so they could see how people took advantage of it. The subjects came from all disciplines across campus. From the sound of it, just having any kind of data or image displayed at that size and scale engages people. They see new things, whether visual details or numerical quirks. It also allows many people to look together at one computer screen, which solves an inherent issue with desktop machines. Calgary’s conclusions:
- size may not matter, but size plus resolution does
- aids discoveries and improves design processes
- helped observation
Going forward, they might want a larger space (it’s too small for classes). Also could use more input devices beyond keyboard and mouse.
Patrick noted that the displays are beautiful and make a strong impression. He remarked that, as at Calgary, their display is in a room, and whether displays sit in open spaces or in rooms largely determines how they are used and perceived. Pointed out the inherent flaw in projected technologies: shadows.
One can use these displays in various modalities. Simplest is treating it as one big display, i.e.- projecting a single image onto it. Another way involves showing multiple similar artefacts, e.g.- pages from a document. Going further, one can display multiple distinct artefacts, which he referred to as a ‘dashboard effect.’ One can show source material on the left, say, and a timeline on the right. Or a Google Hangout screen and a document. Last, but not least, they can show multiple artefacts from multiple sources. This can be as simple as having multiple computers (say, students’) connected and displayed.
He noted, as did John, that for interaction you need more than a mouse and keyboard. He said they have work to do in this area.
Josh showed their spaces briefly, then a bar graph of the user demographics. Engineering leads the pack (they are close to Hunt), while Humanities and Social Sciences come in second and are nowhere near Hunt. Use is mostly from faculty and graduate students. It occurred to me while he was speaking that I had missed, in the introduction, that their large 270-degree space is projected rather than built from display screens. That explains how it can be so large.