HASTAC 2015 Friday notes
HASTAC was a great event. Well run, with worthwhile sessions from a range of perspectives. As always, my editorial comments are in italics where I remembered to do this. Hopefully it’s obvious as well where I forgot to add it.
Panel: Tales from the Library Basement: Doing Digital Humanities as CLIR Fellows
Digital Humanities at UC Santa Cruz
Started by noting how well hidden some of this can be. Used the walk to her office, down stairwells and through locked doors, as a metaphor for that. Her role is to do outreach, but she literally has to go out since people cannot get to her.
Showed a useful definition of DH: “Using digital resources, methods, and tools to do good transformative humanities research” (Lorna Hughes, at http://whatisdigitalhumanities.com/). That site, incidentally, shows a different definition each time it loads.
Using Omeka to Forge Partnerships with Faculty: Reflections and a Case Study
Charlotte Nunes, Southwestern University
Challenging to persuade faculty at a “small, cozy” liberal arts college to embrace or take on DH projects and methods. Omeka exhibits have been an avenue where she has been able to gain some traction with faculty.
DH at Lafayette College
Emily McGinn, Lafayette College
Lafayette has a dedicated team of developers and other staff for this work, a “luxury” at a liberal arts college. She sees her role as being a translator between that group and faculty, connecting programming to research.
Made a joke about Omeka noting that she and the others don’t work for Omeka even though they all use it. Showed some examples from classes that demonstrated a range of possibilities with the platform.
Digital Liberal Arts
Alicia Peaker, Middlebury
Spoke about a few aspects of her work, the first of which concerned a no closed door policy. A closed door in a discipline means don’t come in, I’m doing research, but in a library at Middlebury, the architecture doesn’t support that: it’s open, airy, and full of glass.
Second issue: who am I? She’s considered a faculty member, but isn’t really, and works in a library, but isn’t a librarian. She lives outside of org charts.
Made a plea for formalizing what she called “mixed cultural work.” This is about the divide between an individual’s work and work for other people.
Panel: Scaling Up: Media Analysis and Transmedia Experience
Scaling Up: From Miniature Worlds to Vast Narratives
Based on her work with Cardamom of the Dead, an Oculus Rift project. She mentioned in her opening remarks that after about six minutes, people would be physically ill using the tool.
She made an interesting comment about how a construction delay led to a shift from studio work to computer-based augmented reality. Rather than being a waste of time, it led to new work and new ideas and turned out to be a productive detour.
She noted that the introduction of tools such as Flash led to increased granularity of work. Work got smaller, in scope and size. The tools were bad: big, clumsy, with poor HCI attributes. This led to “impoverished stories.” We needed better tools.
She reviewed some of her earlier AR projects built around pop-up books. She called these miniature and narrow, but they showed the possibilities. Began to use iPads, although she would prefer other devices (head-mounted displays, for example). Once Unity appeared, they stopped using their own software for production and switched to Unity. She resisted the switch, in part because they wanted to stay platform agnostic, but Unity is simply too powerful to ignore.
Said something interesting about these being expressive works, rather than works that for now will find a large audience. The specific example was that in “Everyone at This Party is Dead” she had embedded email fragments that should/could be read to enhance the story, but no one reads them.
Scaling Up (+Down)
Laid out her theoretical framework, which is based around grounded cognition as articulated by Barsalou, et al. Also spoke about other notions in rapid succession, including extrapolation, which expands on a given modality, conversion, which takes one sensory input and redirects it to another, and augmentation, which is the detection of phenomena beyond human ability.
Stressed that her work is based around a team: “almost nothing I do is single anymore.” Showed her team members by name and discussed their role and place.
One project she showed was Lib Viz, which is built around a gestural interface. Noted in passing that this was designed in the Unity game engine, so this was an all-Unity panel. The installation reacted to human motion: when one walked by, things would swirl; as one slowed down, the objects slowed down; if one stopped, it became possible to search and interact. Used haptics and gesture to “reinvigorate interest in the library.” It’s a way to reduce the abstraction that machine systems (such as Dewey, in her example) introduce. It surfaces materials that are typically locked away; the objects’ metadata doesn’t give users a sense of the object, as with the artist books she demonstrated.
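The speed-to-behavior mapping described above can be sketched in a few lines. This is a hypothetical illustration, not the actual Lib Viz implementation (which was built in Unity); the function name, thresholds, and state labels are all invented:

```python
# Hypothetical sketch of the motion-driven logic: tracked visitor speed
# drives how fast on-screen objects swirl, and a near-zero speed switches
# the installation into an interactive search mode. Thresholds are invented.

STOP_THRESHOLD = 0.1   # metres/second below which the visitor counts as stopped
MAX_SPEED = 2.0        # walking speed mapped to maximum swirl

def installation_state(visitor_speed_mps: float) -> dict:
    """Map a tracked visitor speed to an installation behaviour."""
    if visitor_speed_mps < STOP_THRESHOLD:
        # Visitor has stopped: objects settle and search/interaction unlocks.
        return {"mode": "interactive", "swirl_speed": 0.0}
    # Otherwise swirl speed scales with walking speed, clamped to a maximum.
    swirl = min(visitor_speed_mps / MAX_SPEED, 1.0)
    return {"mode": "ambient", "swirl_speed": round(swirl, 2)}

print(installation_state(1.4))   # brisk walk: fast swirl, ambient mode
print(installation_state(0.05))  # stopped: interactive mode
```

The interesting design point is the discrete mode switch layered on a continuous mapping: motion modulates the ambient display, while stillness is itself a gesture that unlocks interaction.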
While talking about her work, she noted the “materiality” of the work, which was a fancy/academic way of saying that it involves schlepping around tons of equipment and a lot of physical and detailed work. Refreshing to hear that highlighted. It also takes a lot of time. She noted that she spent an entire day just 3D-scanning five books.
Turned to VAT (Video Analysis Tableau), a project around video analysis. The idea behind it is that USC has a massive film archive but no good way to expose it to research; it takes too long to view video, and one needs to create a form that works for “sustained inquiry.” The computer that enabled this was the first, or one of the first, all-SSD supercomputers, which finally offers the power to do such massive video analysis. Uses the database/interface known as Clowder, which used to be called Medici.
Panel: “Something of Great Constancy” Preserving the Elements of Innovative DH Work
Nancy Maron, Ithaka S+R
She started by discussing research she and others did into the role and views of funders with regard to sustainability. Discovered that there is a bit of a void in terms of policies or procedures for ensuring this, rather more of a “they’ll do the right thing” culture. They identified a gap between what funders thought and what institutions were actually doing.
At the same time, there’s a lot of interest (as demonstrated by various measures, e.g.- subscribers of DHNow) on our campuses, so that points toward a need for more structure and common understanding between funders and DH scholars and workers.
Led to “Sustaining the Digital Humanities: Host Institution Support Beyond the Start-Up Phase,” a report from Ithaka S+R that she co-authored. They wanted to know what institutions were doing to support the work that comes after or results from the grant.
Gave her top ten things to consider when sustaining DH output:
- everyone’s a builder – most people consider themselves content creators
- find the content – where and what
- manage expectations – some content may not require preservation
- determine who “owns” the activity
- cover all stages of the lifecycle – various units handle different parts, how do they work together?
- communicate the process widely – people need to know where to go for what
- seek shared solutions where possible
- determine which model best fits – service model, production centre, etc.
- for creators: seek guidance from libraries and IT units and follow it
- get the conversation started, preferably at the top – dispel the vague notions of DH that live in administrators’ heads
Shared Shelf as Infrastructure for the Digital Humanities
Sketched the start of Artstor, whose mission was more or less to supplant the urge of various institutions to scan their own slide collections, which would have been largely redundant. Spoke of the humility that comes from working with primary source materials; we can never capture or have everything.
They assembled a fairly good group of US institutions to create Shared Shelf in Artstor. Am curious if this means that the content on Shared Shelf is available without an Artstor account. Seems that this is the case, per his subsequent comments, via Shared Shelf Commons. Goal: first you build/gather the assets, then you disseminate them in the various ways you can.
Feels the main missing piece of National Digital Infrastructure (IMLS project) is the ability to build and use shared thesauri and taxonomies. Further, interoperability and linked open data require standards. Key: the “endless sources of primary source materials” increase the need for shared and networked solutions.
Sustaining the Digital Humanities: The Library Perspective
Julie Bobay, Indiana
The holy grail: “align your [the researcher] goals with readily available and sustainable tools and support.” Like the holy grail, this is elusive. Beyond that, planning, creating structure, and considering how to gather and describe data are all key to do up front, before most of the work has happened, not after the fact.
She described parts of IU’s approach:
- When developing support services: “first of a kind” rather than “one of a kind” (borrowed from NYU). This requires anticipation and foresight. In their case, they now have a DH Toolkit that lists what they offer, in some ways quite specifically, e.g.- Fedora long-term preservation repo.
- Consult on open-source tools, e.g. – Omeka, WordPress, etc.
- Project Consultation Checklist, with specific elements: short description, audience, impact, content, and so on.
- Space – services require space. Theirs is the IU Scholars’ Commons. They focus on consultation services.
- Programming – events, lots of them. Envious of their staff and its ability to put these on.
Offered her Open Folklore as a case study of how they did things, noting up front that this was the “do as I say, not as I did” part of her talk. Joint project with the American Folklore Society. Original idea was to extend the notion of collections beyond what one library can hold, to collections in a global sense, with topic as the unifying element.
Panel: The Material Turn & The Digital Archive
Project Arclight: Critical Reflections on Search, Visualization, and Media History’s Big Data
Eric Hoyt, Wisconsin
- It’s time to go beyond “beyond search.”
- DH needs more interpretive methods to accompany technical methods and tools.
- Dynamic visualizations are tough to do well because users want contradictory things. But one has to try.
Learned these things from working on various projects. Showed a Photoshop mockup of what they set out to do at the beginning, which was to increase the reading reach of trade publications. Everyone cites a few major publications, but too many remain far out on the edge. So they built the Media History Digital Library using the Internet Archive scanning service. Much of it comes from LoC’s Packard campus (62% – the “cake” as he put it).
To surface this content, he built the search engine Lantern. Arclight extends it and tries to treat the corpus as a “giant Twitter stream,” if I caught that correctly. The method that drives this is Scaled Entity Search (SES). They have also applied an interpretive framework, a triangular notion that links entities, the corpus, and the digital component (OCR, OCR correction, copyright, etc.). Showed a very simple demo of an early version that creates line graphs based on concepts. Noted that one still has to have a ready interpretive framework since there is noise in the corpus (e.g.- credits pages in scanned texts). Will launch later this summer.
Intellectual Capital at Risk: Data Management Practices and Data Loss by Faculty Members at Five American Universities
Drew VandeCreek, Northern Illinois
Most people have experienced data loss, but still express confidence that they can preserve their materials for 25 years. Scope of the study was five small and medium-sized Illinois universities.
Told a common tale around the ephemerality of various media. Data can also become decontextualized, e.g.- a picture where we don’t know who the people are. These points are well known and there are solutions emerging. Other aspects of his research demonstrated that faculty do not utilize some of the resources their university offers, such as networked storage.
The Digging Condition of Digital Humanities: Historicizing the Material Turn through Sound
Jentery Sayers, University of Victoria
Asked some difficult questions:
- what’s the difference between storage and memory? (Chun)
- how are matter and meaning entangled? (Barad)
- how do we avoid reduction to “stuff”?
Defined and reviewed various aspects of materiality. Frankly, he was moving fast and spontaneously, so it was difficult enough to follow, let alone take notes. Would enjoy hearing the “long” version, or perhaps reading the book he’s working on around these topics. For anyone not immersed in the body of literature he was citing on the fly, it was a bit of a firehose.
His conclusions are clearer for me:
- media should not be reduced to concepts
- media history risks anthropomorphism
- media history is often determined by narrative
- from objects of inquiry to agents of inquiry
- critique need not be negative or secondary
- media studies may privilege surprise – what can automation or computation generate that leads us to new questions?
“Knowing how things work [visualizations was the example he was mentioning] helps us not stand in awe of them.” Given his work with physical computing, this resonates.