
DH 2019

These are my notes from DH 2019 in Utrecht. As usual I must warn readers that these notes are my distracted, idiosyncratic wander through parts of the conference. I often don't have the energy, attention, or battery power to take notes. For that matter, I often stop taking notes when I'm intrigued, confused, or trying to listen closely. Please email me if you feel I have misrepresented something.

Tuesday

DLS Tool Criticism. An Anatomy of Use Cases

Tuesday morning I attended a workshop of the Digital Literary Stylistics SIG. See the outline of the day. This was organized by J. Berenike Herrmann. She talked about the Anatomy of Tool-in-use Cases. What does it mean to anatomize a tool? What are we trying to do? Are we trying to do scientific replication, where reliability, objectivity, and validity are important?

Francesca Frontini talked about the tool inventory they are developing. I wonder if we can collaborate with TAPoR.

Clémence Jacquot: Textometry and stylistics: which tools and practices for literary interpretation?

Jacquot asked how exploratory tools modify and improve the interpretation of annotated poetic corpora. She is looking at Apollinaire's poetic corpus using TXM. Textometry is a recent term, but draws on a history in France of analytics including correspondence analysis. It is characterized by both quantitative and qualitative practices.

She showed a KWIC from TXM and how it enables both quantitative and qualitative analysis.
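
For readers who haven't used a concordancer, a keyword-in-context display is simple to sketch. This toy Python version (my own illustration, not TXM, which does far more) prints each hit of a word with its surrounding context; the sample line is from Apollinaire's "Le Pont Mirabeau":

    def kwic(text, keyword, width=25):
        """Print each occurrence of keyword with `width` characters of context."""
        low = text.lower()
        i = low.find(keyword.lower())
        while i != -1:
            left = text[max(0, i - width):i]
            right = text[i + len(keyword):i + len(keyword) + width]
            print(f"{left:>{width}} [{keyword}] {right}")
            i = low.find(keyword.lower(), i + 1)

    verse = "Sous le pont Mirabeau coule la Seine Et nos amours"
    kwic(verse, "pont")  # ->        Sous le [pont] Mirabeau coule la Sein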

She then talked about the creation of a corpus of Apollinaire. She has added metadata to the corpus. Apollinaire's poetry is in movement and can be reorganized. His work is often organized into thematic collections. She is asking how we can conceive of style as a dynamic factor.

She talked about doing lexicometric analysis of Apollinaire - comparing poetic language and grammatical language. She looked at the syntactic link markers. She talked about the metadata she added to enhance the analysis. She had to distinguish writing dates (the assumed year vs. the authorized year). Apollinaire often rewrote his texts, which makes dating difficult. She then showed how she can use TXM to graph the syntactic link markers using both assumed and authorized dates.

She then showed and discussed a nice distribution graph of the different types of markers.

Geoffrey Rockwell: Zombies as Tools: Replication in Computer Assisted Interpretation

I then gave a short paper about replication. I argued we need to develop our own practices of replication in the digital humanities. For us replication should be:

  • Historical - practices for recovering a history of tools, methods and techniques.
  • Hermeneutical - practices that are not necessarily about reproducing results, but of interpreting tools/methods as interpretations.
  • Applicable - practices that we can reapply to other collections.
  • Playful - practices that let one play with tools and learn through playing.

Steffen Pielstrom: LDA Topic Modelling for Semantic Corpus Analysis

Pielstrom started with some context. He is from biology and he planned to say something controversial about topic modelling.

He started by explaining topic modelling and showed an example from Topic Modelling Vogue. TM is a tool for the discovery of content. There are good implementations of the technique like Mallet, and Pielstrom has written Jupyter notebooks to do TM. DARIAH also has a Topics Explorer application.
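
For those who have not run a topic model, here is a minimal sketch of what an LDA run looks like, using scikit-learn as a stand-in for Mallet or the DARIAH tools mentioned above (the toy documents are mine):

    # Fit a tiny LDA model and print the top words of each topic.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = ["grain harvest field plough farm harvest",
            "ship sail harbour sea voyage sail",
            "farm field harvest plough grain field",
            "sea harbour ship voyage sail sea"]

    vec = CountVectorizer()
    X = vec.fit_transform(docs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(X)

    words = vec.get_feature_names_out()
    for t, weights in enumerate(lda.components_):
        top = [words[i] for i in weights.argsort()[::-1][:4]]
        print(f"topic {t}:", " ".join(top))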

He asked "what is a good topic?" This leads to asking what topic modelling is good for. One answer is semantic text classification. In this case one wants distinctive topics. That said, word frequencies can give classifications just as good as TM. Another answer is corpus exploration, where what one wants is interpretability. Another answer is information management.

I tend to think of TM as a way of generating hypotheses about themes in a corpus.

Pielstrom sees TM as not being strong at quantitative or scientific analysis. It is not as good for classification as other techniques. He sees it as a fancy visualization tool. I tend to agree, but believe that it is more an idea pump.

Topic modelling raises all sorts of interesting questions:

  • What is the genesis of the technique? How important is the story of how writing happens that is used to describe TM?
  • Can we use TM without understanding the statistics? What do we make of interpretations that are based on TM, but don't necessarily use it correctly?
  • Why is TM so attractive to colleagues?
  • How can we use TM as a tool for vector space reduction?
  • What is the role for black boxes in the humanities? Isn't most humanities scholarship a form of connoisseurship that is a black box? What about theory?

I think we could do a Kuhnian reading of the field. We have periods of stability and periods of transition. We are now in a moment of transition. We look through the tools, at the tools, through tools at tools.

Welcome

We were welcomed by the local organizers from the U of Utrecht and the programme chairs. This is the largest DH conference yet, with over 1000 people. The Dutch Minister of Science talked to us about collaboration and how we need to learn to recognize it academically.

Francis B. Nyamnjoh: African Inspiration for Understanding the Compositeness of Being Human through Digital Technologies

Nyamnjoh started by commenting on complexity and how that is a theme that resonates well with the nature of the continent. Complexity is tied to incompleteness. He talked about how humans, nature and even the Gods are incomplete. It is shared.

Incompleteness is featured in the works of Amos Tutuola. His books include The Palm-Wine Drinkard. The novel has magical objects that extend us like the digital humanities. Jujus are magical objects prepared by diviners - someone who is a hidden activator. You can use the juju to extend yourself and rise above ordinary being. The gadgets in our pockets are similar.

We should see continuities in change - to see the magic in the digital and the digital in the magic. We can think of the digital humanities in terms of the compositeness of being. It is not a failing to be incomplete. DH should learn to speak the language of conviviality. DH can extend us - not make us complete, but help us.

The full paper is available here.

He ends his paper with:

To what extent does the idea of incompleteness challenge us to rethink our current approaches to digital humanities as a field of study? Would more inter- and multi-disciplinary conversations informed by the reality of interconnections and interdependencies challenge us to contemplate conceptual and methodological conviviality? If yes, what form would they take? And would factoring in the interconnections within and between categories such as race, ethnicity, culture, geography, class, gender, sexuality and age affect the form and content of such conceptual and methodological conviviality? Beyond the conceptual and methodological implications of doing digital humanities research, what does it mean to actually understand and relate to ICTs in terms of incompleteness as a philosophy of personhood and agency?
As a field, Digital Humanities would be truly enriched by a disposition to accommodate improvisation and innovation. In support of this, let me draw on another Chinua Achebe proverb, from his novel, Arrow of God: ‘The world is like a mask dancing. If you want to see it well you do not stand in one place’. The idea of Digital Humanities as a dancing mask is suggestive of its potentials to become a truly inclusive field of inquiry that continues to enrich itself through its open-ended encounters and conversations with the creative diversity of subjectivities of a truly universal humanism.

In answer to a question he talked about a rupture with the present paradigm that is one of completeness. He wants to break with the logic of completeness where one doesn't give credit to others.

Wednesday

Tobias Julius Schweizer: An Interactive, Multi-layer Edition Of Jacob Bernoulli's Scientific Notebook Meditationes As Part Of Bernoulli-Euler Online

I couldn't initially get a seat so my notes on this are after the fact.

Schweizer talked about a neat project developing electronic editions of Bernoulli. They are using LaTeX with macros to produce different language versions, but I didn't catch the context.

The Bernoulli-Euler Online (BEOL) project is not only creating electronic editions, but also developing linked data that can be queried through an API and used by multiple projects/editions.

BEOL is built on the KNORA (Knowledge Organization, Representation, and Annotation) infrastructure. KNORA is meant to make sure humanities resources can be maintained over time.

David Hoover: The Invisible Translator Revisited

Hoover wanted to show that the author is important. He did this by taking different English translations of Chekhov and testing how they group. Do they group by text or by translator? They group by text. The translators are irrelevant. Text is a strong signal. He uses Stylo.

He then looked at different authors translated by the same person and the original authors still clustered. He now has a collection with 30 "known" texts and 47 unknown texts. There are duplicate translations in the unknown set, but not in the primary known set. This should allow him to test whether the author stands out despite translation, without relying on the text itself. He got a classification success rate of 94.5%. Gogol still classifies with Gogol despite translation.

Then he made it harder: the known and unknown sets could not have the same translator for the same original author. This way a Gogol translator would not be in both sets. Does the author signal still shine through? He still got an 88% success rate in correctly classifying the author in the unknown set.

Could it be that authors have characteristic themes?

Then he erased the real authors but had translators in both sets, and the translator came through. The translator was properly recognized 93.8% of the time. Thus the translator also has a signal.
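
Since Stylo came up: the core of this kind of experiment is a most-frequent-word distance measure such as Burrows' Delta (I am assuming Delta here; my notes don't record Hoover's exact settings). A toy version with invented frequencies:

    import numpy as np

    # Rows are known texts, columns are relative frequencies of the
    # most frequent words (invented numbers for illustration).
    known = np.array([[0.061, 0.042, 0.030],   # Gogol, translator A
                      [0.058, 0.045, 0.028],   # Gogol, translator B
                      [0.044, 0.060, 0.019]])  # Chekhov, translator C
    labels = ["Gogol", "Gogol", "Chekhov"]
    unknown = np.array([0.060, 0.043, 0.029])  # unattributed translation

    # z-score each word across the known set, then Delta = mean
    # absolute difference of z-scores; the nearest known text wins.
    mu, sd = known.mean(axis=0), known.std(axis=0)
    deltas = np.abs((known - mu) / sd - (unknown - mu) / sd).mean(axis=1)
    print(labels[int(deltas.argmin())])  # -> Gogol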

Then he talked about using Burrows' Zeta approach, which counts the consistency of word use. He showed a graph that showed the translators are distinct. Some of it has to do with older British word forms and newer American word forms. Contractions come in in the late 19th century. He showed the most characteristic words for two different translators. For example, one uses "until" and another "til".
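
In one common formulation, Zeta is just the share of one translator's text segments that contain a word minus the share of the other's, so it rewards words used consistently by one side. A toy version (the segments are invented and represented as sets of word types):

    def zeta(segs_a, segs_b):
        """Map each word to its Zeta score: share of A segments
        containing it minus share of B segments containing it."""
        vocab = set().union(*segs_a, *segs_b)
        return {w: sum(w in s for s in segs_a) / len(segs_a)
                   - sum(w in s for s in segs_b) / len(segs_b)
                for w in vocab}

    translator_a = [{"until", "cannot"}, {"until", "said"}, {"until", "whilst"}]
    translator_b = [{"til", "said"}, {"til", "cannot"}, {"said", "til"}]
    scores = zeta(translator_a, translator_b)
    for w, s in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{w:>8} {s:+.2f}")  # "until" tops the list, "til" bottoms it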

He is trying to work backwards from why translators can be distinguished to why authors can still override translators. Are translators still visible?

There was a great discussion after that I couldn't possibly capture.

Anne-Kathrin Schumann: A Finite-State Approach to Automatic Greek Hexameter Analysis

Schumann began by talking about Greek hexameter, which influenced later poetry. Philologists learned how to do scansion by hand long ago. Now they want to do it automatically. Hexameter has six "feet" with different types of feet, which leads to 32 variant types of hexameter. Their algorithm first identifies whether a text is hexameter and then which variant.

Sebastian Kruse created a corpus. Others have tried this problem using rules. She talked about other algorithms. She used a finite-state automaton to recognize hexameter and carry out the analysis. Deterministic finite-state automata are a well known approach in computer science and computational linguistics. She distinguished automata from transducers. Transducers map to output.
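
As a toy illustration of the finite-state idea (this is not her algorithm, which works on actual Greek text): once each syllable is reduced to a weight, the hexameter scheme can be recognized with a regular expression, itself a finite-state formalism. Feet 1-5 are each a dactyl (long-short-short) or a spondee (long-long), and the sixth foot is a long syllable plus a free final one:

    import re

    # '-' = long syllable, 'u' = short. (?:-uu|--){5} gives the
    # 2^5 = 32 variants mentioned above; -[-u]$ is the final foot.
    HEXAMETER = re.compile(r"(?:-uu|--){5}-[-u]$")

    def is_hexameter(weights):
        return bool(HEXAMETER.match(weights))

    print(is_hexameter("-uu-uu-uu-uu-uu--"))  # all dactyls: True
    print(is_hexameter("---uu---uu-uu-u"))    # mixed feet: True
    print(is_hexameter("-uu-uu--"))           # too short: False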

She then showed how her algorithm compares to a baseline. They seem to match the baseline using global knowledge.

The philologists wanted a transparent algorithm that they can review and train students on.

There are some specific problems with complex hexameter that call for a "black art". She pointed out that it is hard to reproduce a black art on a computer. Machine learning might work given a much larger training dataset.

Keli Du: A Survey On LDA Topic Modeling In Digital Humanities

Du started by introducing Topic Modelling. There is no common understanding of how to use it in the humanities. Some split the text, some don't.

Du collected 60 abstracts and papers on TM. There has been an increase in TM since 2011. He then talked about common pre-processing procedures. Removing stop words, lemmatization and POS tagging are some things that people do. Some people remove names, some don't. Document chunking is also done differently or not at all.

The number of topics generated also varies from 10 to a thousand. There are also differences in what people do with the topics afterwards. Some evaluate topics, some visualize them, and so on.

Du feels there is no common understanding of how to use TM which makes the stability of the research questionable. I can't help feeling that some of the differences have to do with very different purposes to which TM is being put. People are using it in very different ways. Is that really a problem? Is it a case of a hammer being used for anything whether it is a nail or not?

Jing Chen: Defining and Debating Digital Humanities in China: New or Old?

Chen started by asking if what we are doing in the digital humanities is the same in different countries. In China DH is approached differently. Even between China and Taiwan there are differences in how text is represented. In China they don't have some of the common resources we take for granted like Facebook. Instead they have WeChat.

From the 1970s to the 2000s Chinese scholars have been doing work one could call digital, but they don't use the term digital humanities. Does one need this term? In Taiwan there are large scale digitization projects funded by the nation. In China the state has also been digitizing. Scholars have not been involved in many of these projects.

Digital humanities initiatives emerged in the 1990s and 2000s with collaborations where scholars started creating research-oriented databases. Scholars from Taiwan and Hong Kong have begun to introduce the idea of DH. Nonetheless Chinese scholars question whether they need this new field. There are misconceptions of what DH is. Some feel it is just quantitative methods.

There have now been some first courses in DH at the history department at Nanjing. There are also conferences happening. She closed on how some journals are publishing special issues and books.

There is a transformation from institution-driven to research-driven DH. Gaps between disciplines are narrowing, but this is a problem. There are also problems internationalizing DH as few scholars from China can come to conferences.

Chen is from the Digital Humanities Initiative at the Institute of Advanced Study at Nanjing University.

She talked about the challenge of convincing humanists to use digital methods when they are happy with their existing practices.

She talked about how in China, everything is political so they can try to piggyback on hot developments like AI and so on.

Luis Meneses: A Framework to Quantify the Signs of Abandonment in Online Digital Humanities Projects

Meneses talked about a project on how digital projects are abandoned, which I heard about at Congress. You can see my notes on CSDH 2019.

He talked about different forms of abandonment. There are sites that are up, sites that are not maintained, sites with error pages, timeouts, and cases where others squat on your domain. He used the DH books of abstracts to extract URLs (over 8800) and then analyzed the HTTP response codes they return. There is degradation even after 9 months. Many sites are not up a year after the DH conference, which is depressing.
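
A minimal sketch of this kind of link survey (my own illustration; the real project handles timeouts, redirects, and squatted domains much more carefully, and the URLs here are placeholders):

    import requests
    from collections import Counter

    urls = ["https://example.org/project", "https://example.com/edition"]

    tally = Counter()
    for url in urls:
        try:
            r = requests.head(url, timeout=10, allow_redirects=True)
            tally[r.status_code] += 1   # 200 OK, 404 gone, 5xx broken...
        except requests.RequestException:
            tally["unreachable"] += 1   # DNS failure, timeout, etc.

    print(tally.most_common())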

The lifespan of projects is about 5 years. We need better preservation strategies.

There was a great question about whether we should value longevity. Is there anything wrong with ephemeral web sites? Could projects run for a bit and then deposit?

Thorsten Ries: Born-Digital Archives: A Digital Forensic Perspective on the Historicity of Born-digital Primary Records

Ries talked about James Baker's blog post on digital forensics that made the point that it is relevant to historians. Hacking and spoofing of born-digital materials have raised questions about historical archives. Were the right things archived? Born-digital materials have a physical and logical materiality. This idea of materiality is drawn from Blanchette. History is still grappling with how to understand born-digital objects as historical objects.

He showed examples of versions of documents looked at forensically. Hard drives contain deleted material that is possibly recoverable. Deleted material can be hidden in special partitions by versioning tools. Key loggers can keep data.

Hardware failures and crashes are interesting to forensics. Crashes leave all sorts of corrupted files around a drive. Head crashes of older hard drives call for different approaches than solid state drives and now cloud storage.

WARCs in the Internet Archive are not full web sites, but snapshots, and they don't have forensic depth. Helmond's Track the Trackers and Watch the Watchers is a blog essay that looks at how one can gather data about how people are tracked.

Hacking has become normal so we need to understand the worms and hacking tools. The reports on hacks need to be preserved.

This paper went by too quickly - it should have been given a longer slot as it was rich and important.

Agiatis Benardou: A World of Immersive Experiences: The Scottish Heritage Partnership

Benardou talked about a large scale survey of the uses of VR and AR in Scottish museums. They produced a video and a report. Some of the lessons from the study include:

  • People prefer mixed experiences with some physical handling of objects to all digital
  • Older people prefer more physical interactions

Pauline Junginger and Dennis Ostendorf: Close-Up Cloud: Gaining A Sense Of Overview From Many Details

Junginger and Ostendorf talked about a design project working with a museum in Hamburg to create a database of glass slide negatives. They decided to focus on the details rather than the images. People can follow details and tags. There is a neat overview.

Joanne van der Woude: Atlantic Journeys through a Dutch City's Past: Building a Mobile Platform for Urban Heritage

van der Woude talked about a mobile locative app that lets students study the Atlantic connections and history of Groningen. You can explore buildings in the city and their connections to the New World. They are launching the app in October. This is also leading to a larger Atlantic Stories project.

She demoed the application and it was neat how they made links from the local to pirates like Rock the pirate (Roche Braziliano).

They can track where users go with their app. How long do they spend on the portals? They are making the portals shallower and more interactive.

Nicole Basaraba: Creating Complex Digital Narratives for Participatory Cultural Heritage

Basaraba is interested in remixing narratives for cultural heritage sites. Typically the narratives are created by experts - think of audio guides. She talked about how narratives can be created that mix expert knowledge and bottom-up user experiences (sources like Trip Advisor, Ancestry ...) By mixing in user stories one can add new stories, include other perspectives, and include shared cultural perspectives. She wants to use cultural analytics to do this remix.

She built corpora about these locations from materials on the web and used those.

Claes Neuefeind and Philip Schildkamp: Sustaining the Musical Competitions Database: a TOSCA-based Approach to Application Preservation in the Digital Humanities

Neuefeind and Schildkamp talked about how to preserve digital humanities projects. They come from a data center. They talked about a recent DHQ paper about what other Data Centers are doing. Some approaches:

  • Archiving
  • Agreements to maintain
  • Move elsewhere

Neuefeind and Schildkamp are interested in service agreements. They want to model resources in TOSCA in XML so that system components can be preserved. All this is embedded in openTOSCA so that you have bundles that make up projects. These bundles can then be moved and installed.

They showed an example use case of the Musical Competitions Database. They model the MCD in OpenTOSCA's Winery tool. I think they are using Docker to create containers. He talked about how one can change nodes in the model to deal with changes over time.

Dirk Van Hulle and Joshua Schäuble: Modelling Text-Genetic Relationships

Van Hulle and Schäuble talked about representing all the documents that could contribute to the genesis of a text. Genetic criticism is putting the text into motion. The idea is to open it to the constellations of documents involved in its genesis.

He talked about Exogenesis (outside documents that influence the writing), Endogenesis (the writing process itself), and Epigenesis (changes that continue after publication).

He gave examples. First he showed microgenesis - Percy Shelley's edits of the Frankenstein manuscript. Second he showed an example of a link from manuscript to a source text.

He then showed macrogenesis - zooming out. He showed visualizing different versions of beginnings of a text. He showed following an idea in Joyce from notebook to manuscript to typescript to novel.

One can follow things forwards and backwards. From notebooks forward or from final text back.

Then they talked about the prototype that they are developing that has modules for different types of items. One can go up and down a hierarchy of files. It is one large tree. You then have paths through the hierarchy.

Then you have complex graphs that are not hierarchies. You make connections across the types of items in the hierarchy. A library item might connect to a page in a notebook and then to a manuscript. Do they need an ontology of different types of links?

He then showed the prototype tool where you can go into any document and open a graph viewer to show what it is linked to.

Margot Lise Mellet and Marcello Vitali-Rosati: Palatine Anthology. Complexity for a digital research project

Mellet talked about a collaborative edition of the Palatine Anthology. Collaboration makes things more complex. The PA is a collection of Greek epigrams. It is a philological maze. Various manuscripts were used. Many original sources are lost. The final PA is an editorial collection that is a dialogue between all the various documents.

She talked about an anthological bouquet (of flowers). How are the epigrams linked? There are intertextual links, topoi links, and associations. She talked about an anthology as a living network.

She talked about collective intelligence as discussed by Pierre Levy. The project is not about reaching a truth, but of illustrating the connections of a collectivity.

She talked about an API that outputs JSON. She showed the Antologia. She showed editing and then showed a reading path. She showed how you can add and then get all sorts of media. She talked about how these sorts of collaborative systems can shift the boundary between expert and publics.

The tool allows anyone to create an account and then edit anything they want. How does this stay open and not be vandalized?

Vitali-Rosati talked about how they have involved school kids who can translate Greek.

Holst Katsma: The Novel And The Quotation Mark

Katsma is looking at visual features that emerge within the novel as the novel emerges. Two features that he considers:

  • The quotation mark
  • Chapter headings

He focused on the quotation mark. He wanted to add a new feature to look at that is not words. He shifts focus from lexical to visual features. This allows one to look not only at the author, but also at others involved in the novel's emergence, like the editors and publishers.

He created corpora of different genres in the 18th century (including the novel) and visually moved through the PDFs scanning for visual features.

He then switched to the rise of the modern quotation mark in the novel. The modern quotation mark was adapted from an earlier use of the mark, which appeared only in the margins. He showed examples from outside the novel. The dominant practice changes in the novel with the mark moving into the text (from the margin).

He is documenting the methods for marking dialogue in novels and how they change. There are inclusive and exclusive versions. Sometimes indirect discourse is also quoted.

The novel could have used quotation marks in different ways. What does this tell us about the novel and its emergence? He thinks this shows that printers and authors were forming the novel. It could also show that dialogue was important to the novel and this was being negotiated. The visual index is linked to changes in content. The mark is tied to a slow change in the novel.

He talked about how the use of the modern quotation mark made possible other forms of prose. Once you have ways of marking direct speech then authors can do other things.

His method offers a way that is not author-centric. This treats those involved in printing as important in the emergence of the novel.

Thursday

XR in DH: Extended Reality in the Digital Humanities

I was part of a panel on extended reality.

Rachel Hendery started off the panel talking about projects that created virtual reality exhibits that allowed people to explore languages, trade in the Pacific and other immersive environments. They shared the VRs through galleries. They now have a system that can represent any data with a geospatial element, which sounds really cool.

They moved from controllers to a system that can track hands. You can collect things like language information. She showed an interesting VR showing travel distances/times. She sees this as a system for tracking relationships.

She closed on Barrawao - an outreach project showing a different way of thinking about the world, drawing on aboriginal protocol. When you are in the space only the language comes up from the land for you.

The choices you make are often different when you do something for a gallery than when you do VR for outreach.

Mona Kasra then talked about her interactive VR work. She works with artists to create live-like experiences. One is Dreams of Solitude. Liveness is a concept that a lot of disciplines engage with. Ideas of human-to-human interaction influence our conception of liveness. Digital liveness is a relationship between us and the other.

She is also collaborating with an indigenous community to document performances. One has to approach these projects respectfully. The idea is not to just "capture" performances; they are co-developing materials that the community will have ownership of. U of Virginia has a large collection of Australian aboriginal art that is also being woven in.

Amanda Licastro talked about Teaching Narrative and Literary Analysis with Virtual Reality. Her project is part of a cross-campus interdisciplinary project. The project started around the issue of why there is a drop in empathy among this generation. Millennials are supposed to be the "me" generation that is narcissistic. There is supposed to be a lack of close face-to-face community. This may mean that youth may not be able to empathize with the characters they meet in the literature they study. They feel desensitized. She is working with current novels and a play that she combines with scholarly works (Cyborg Manifesto) and movies/TV. They look at critiques of empathy like Against Empathy by Bloom.

Students had to read Frankenstein and then try to develop a formal proposal/pitch for a VR game. She showed some examples of the resulting pitches.

I thought the idea of a pitch was brilliant. Students get to imagine what they want to do without all the mess of development. They learn to write pitches. She is working with an outfit that will implement the best idea. (I think that is right.)

Lynn Ramey is developing a project around Medieval Textual Transmission. This is a tiny data project as there isn't a lot of evidence so they are trying to immerse students in ways stories were told. Her students in an interdisciplinary course developed a VR game in Unity. In another course students studied virtual worlds learning different applications of VR in medicine and coaching, for example.

She showed the Adobe Fuse Character Creator and talked about a project where students had to make themselves and then make an alternative self. This got them reflecting on representation in a virtual world. Another great idea.

See References and materials.

I (Geoffrey Rockwell) talked about our work on Augmented Reality locative games.

Victoria Szabo talked about evaluation guidelines for virtual and augmented reality. She is bringing together a funded institute with people from all sorts of different disciplines using different technologies. They have developed three key principles:

  • Integrity
  • Interaction
  • Impact

They are also learning from areas like user interface design, accessibility, biometric measurements of presence, comparative analysis and so on. They are drawing on a mix of approaches to evaluation from the humanities and sciences.

She talked about different approaches to scholarly evaluation which are important to administration - things like tenure. Evaluation makes a difference so we need to make sure that humanities values are woven in. We need to also worry about how to preserve materials like other scholarship.

Micki Kaufman did a great job organizing us and chairing us.

Quintus Van Galen: Durchdruck im Fokus: Visualising the Spatiality of Articles in Historical Newspapers

Van Galen talked about a project that visualizes articles, taking layout into account. Visualizations show where articles appear in a newspaper for a search query. He gave an example of searching for words about empire in a British paper and showed how the articles about empire often appear in the financial section. One can also see how the paper was redesigned. Empire also shows up in letters, showing a "banal everyday imperialism." He sees the tool not as providing answers but as surfacing anomalies that provoke other research directions.

Extendability is a big part of the design. They are experimenting with using their tool to visualize topics from topic modelling.

Tito Orlandi: Reflections on the Development of Digital Humanities

Tito Orlandi was the winner of the 2019 Busa Award.

He started with some chronology. He pointed us to Julianne Nyhan's oral history book for more details. He asked what it was he was giving a chronology of. It used to be humanities computing. In his day (the 60s, 70s and 80s) colleagues were not interested in computing. Now the situation is different. We have lots of organizations and lots of centers and lots of programmes. We should see this as a triumph, but a triumph of the ubiquity of computing. Does it really matter if the field has contributed or not?

An important question is whether HC-DH has actually had an impact. Has it changed a discipline? We used to theorize more. Now we mistake the ability to use computers for an understanding of computing and its applications. He wanted to focus us on what he considers the fundamentals of HC-DH.

We are working with two fields: computation and the humanities. Computation refers to the management of discrete data by a machine. All of computing is captured by the Turing Machine. The use of the computer as a writing tool or measurement device or 3D device is not HC-DH because it doesn't influence the discipline. The competence of the scholar should not be limited by current technologies. For Orlandi what matters is formalizing the problems of the humanities. Gardin has discussed the problem from the humanities perspective. The humanities scholar needs to understand what computing can do and how to represent phenomena in digital form so that we can then run automata on the representations.

Orlandi wants us in HC-DH to understand the abstract Turing model of the computer. Understanding the abstraction allows us to model how an automaton can be applied to problems (like those of the humanities). We need to understand what the computable problems in the humanities are. The Turing machine is actually a philosophical idea. It is something that belongs to the humanities.

He talked about modelling. The best definition is in the Principia Cybernetica. The final aim is to build models in the HC-DH. He would discourage new names for old approaches. He listed a number of areas which don't need new names.

He ends on how we should introduce our field into the university. In some systems it is very difficult to convince the authorities to recognize a new field like ours.

Friday

Tiago Sousa Garcia: New Approaches to Women’s Writing Virtual Research Environment

The NEWW Network comes together to investigate women's writing. Most research has focused on single authors, so the network has been working on how to study broader groups.

Laura Kirkley, one of the investigators, studies Mary Wollstonecraft and wants to study how translations have circulated feminist ideas. They did an initial prototype using Google Fusion Tables and developed a map of where translations were being published. They also looked at mentions. The problem with Fusion Tables is that they are being discontinued.

From there they developed something better, what they call a pilot. They need more data, historical maps, and chronology. They are not concerned with maps printed in the time, but maps that represent the period with its borders and other features. Political and linguistic borders shift. There is also a question of what counts as feminist in the period for the purpose of following feminist writing. Therefore they had to make a lot of assumptions.

  • They are making decisions at a city level of granularity
  • If they didn't have a city they chose a capital city
  • The maps they had to use were not consistent or accurate

So he showed a quick demo. You can play through the timeline, select on the timeline and see the map. You can see a synchronic view. What they want to do in the full project is show a progression of texts rather than things coming and going over time. They want to tackle how to deal with uncertainty.

We had a nice discussion about the visual rhetoric of using markers for

Frans Wiering: A Mobile Website To Support Teachers In Discussing Terrorism In The Classroom

Wiering started with the recent shooting on a tram near where we are. Following that there was a lockdown of the city that affected people, especially children. There was confusion and there were assumptions about things like whether the terrorist was an immigrant.

One can imagine what things might be like the next day when schools reopened. They circulated a newsletter to schools with information about how to deal with the issue in schools that often have very diverse classes.

The newsletter is TerInfo. The project is led by Beatrice de Graaf. She has two beliefs: 1) knowledge makes children resilient in times of terrorism, and 2) teachers often avoid discussing these things. The newsletter provides expert support to schools so teachers can organize discussion that is respectful.

The website emerged when de Graaf and Wiering met up and his students took on the project to develop a human-centered prototype. The prototype was an app. Apps can be difficult, so they developed a mobile website. They had to think about the educational process.

The site has two parts. One part deals with incidents and the other part provides background information. The background info might include material on the history of terrorism. All the materials are for the teachers, not for children. They had materials to help teachers translate things for children.

The class prototype led to a TerInfo project almost immediately. There are experts from international history, pedagogy, and information science. This was tested in a number of schools, meeting with teachers. Some of the feedback was that it motivated teachers to speak about terrorism without the trigger of an attack. It took some of the fear out for children. The testing also showed which parts were not used. This led to simplification. He showed wireframes and talked about the design that is being implemented in the next version.

The project has attracted nation-wide attention which raises questions about how to scale up as local community input is important.

What lessons are useful to us:

  • Digital artifacts can be a great catalyst
  • Process around the system is crucial - the teachers need to be in control. You have to give design back to the teachers
  • It is not about cool technology, but a match with the goals
  • Simpler is better
  • Good technology has a deep connection to the domain it is intended for. The values of the society should be reflected in the technology and content.

Angus A. A. Mol: Gaming Genres: Using Crowd-Sourced Tags to Explore Family Resemblances in Steam Games.

Mol talked about a project about historical games. He had to understand the way games are categorized. Wolf (2001) talks about classification in The Medium of the Video Game. Is Outlaw (1978) a Western? A game like Assassin's Creed Odyssey is an action/adventure game that is becoming a role-playing game. Genres are being discussed online and by reviewers.

Mol decided to apply Wittgenstein's idea of the family resemblance. Wittgenstein uses games as an example. When you ask what is similar there isn't any essence, but a network of features. That said, this doesn't help in a concrete fashion.

He decided to use Steam, which has a large userbase of hundreds of millions. He used SteamSpy - an API that one can access to get the data. He uses Steam Tags - a system of folk tagging that started in 2014 and was quickly turned into a curated list. There are 340 tags in the dataset and 21,040 games with multiple tags. Tag popularity is uneven. Some tags are used a lot more. The top 20% of tags get 80% of the votes.

With the tags one can create a two-mode network. There are two types of entities in the network: genre tags and games. The connections run between tags and games.

Of course he gets a hairball of a network. It just tells you that there is a heavily tagged subset of games. One approach is to collapse it to a one-mode network. You can weight connections. One can do network measurements like modularity community detection. Doing that for the historical games (the subset of games tagged as historical), he found three broad classes: Strategy, Action/Adventure, and a third genre, the Shooter games.
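
A sketch of that pipeline in Python with networkx (not Mol's actual code; the games and tags are invented, and I am using greedy modularity maximization as a stand-in for whatever community detection he ran):

    import networkx as nx
    from networkx.algorithms import bipartite, community

    games = {"GameA": ["Historical", "Strategy"],
             "GameB": ["Historical", "Shooter"],
             "GameC": ["Strategy", "RPG"],
             "GameD": ["Shooter", "Action"]}

    # Two-mode (bipartite) network: games on one side, tags on the other.
    B = nx.Graph()
    B.add_nodes_from(games, bipartite=0)
    tags = {t for ts in games.values() for t in ts}
    B.add_nodes_from(tags, bipartite=1)
    B.add_edges_from((g, t) for g, ts in games.items() for t in ts)

    # Collapse to a one-mode tag network, weighted by shared games.
    T = bipartite.weighted_projected_graph(B, tags)

    # Modularity-based community detection on the tag network.
    for c in community.greedy_modularity_communities(T, weight="weight"):
        print(sorted(c))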

What happens if you look at other things like the "best" games? He selected the top 1% of games and found that games that are not critically reviewed are popular - types like visual novels and puzzle games.

The take-away message was: crowd-sourced tagging is messy, but useful.

Jeremie Vandenbunder: Former aux Méthodes en Sciences Humaines et Sociales avec Bequali (Training in Humanities and Social Science Methods with Bequali)

Vandenbunder presented on beQuali, a qualitative social science survey bank. It is a project that brings together French datasets of qualitative social science surveys. The talk focused on using these for teaching and he showed tools in the site for exploring different components. Students (or teachers) can find guides, for example, if they want to teach preparing a guide.

There is an ethics process to get access to the data. Interestingly the interviews are transcribed in TEI.

I had to hand my laptop over to Skype for this, so these notes are after the fact.

Phillip Benjamin Ströbel: Improving OCR of Black Letter in Historical Newspapers: The Unreasonable Effectiveness of HTR Models on Low-Resolution Images

Ströbel talked about improving OCR. He talked about evaluation against a bag-of-words ground truth. They are comparing different OCR tools. They have seen a significant improvement in some cases. Their project is at http://www.impresso-project.ch
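
A sketch of what a bag-of-words evaluation can look like: compare the multiset of words in the OCR output against a ground-truth transcription and compute an F-score (my illustration with invented strings; their exact measure may differ):

    from collections import Counter

    def bow_f1(ocr_text, truth_text):
        """F-score over word counts, ignoring word order and layout."""
        ocr, truth = Counter(ocr_text.split()), Counter(truth_text.split())
        overlap = sum((ocr & truth).values())  # words found in both
        precision = overlap / sum(ocr.values())
        recall = overlap / sum(truth.values())
        return 2 * precision * recall / (precision + recall)

    print(bow_f1("die alte Zeitung von qestern",
                 "die alte Zeitung von gestern"))  # -> 0.8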

Jennifer Roberts-Smith: Performing Historical Place: Leveraging Theatre Historiography to Generate Presence in Virtual Reality Design for Restorative Justice

The project is called DOHR, which is doing oral histories for reconciliation. Roberts-Smith is working on a virtual space that renders oral histories.

She then talked to us about three people who lived in a segregated care institution that ran until the 1990s, where there was a lot of abuse. A restorative inquiry was launched later. DOHR is creating curriculum for students to witness the testimonies of people who were "home children" in the institution.

There is a tradition of historical place-making in the digital humanities. They were able to draw on these projects, but they wanted to create a rendering that was relational not historical. The place of the home is explicitly impressionistic and draws on the oral stories. It doesn't try to render photorealism to create presence. The DOHR tries instead to render the relational presence of the storytellers. In VR place is ontologically prior to the stories. In DOHR the idea of relational scenography is reconciliation.

Relational Scenography includes:

  • Roles
  • World
  • Interactivity
  • Making

Putting people first means that you listen to the storytellers. She talked about the roles of teachers as witnesses too in their classes. As makers, Roberts-Smith and colleagues are not auteurs so much as nodes in a network. We should perform relational place making that makes just places. We should focus on the larger political processes rather than on the technology.

Roberts-Smith runs the QLab. When she encounters technologies she always thinks of them as she would anything else in theatre.

My battery ran out at this point so I am filling these out after the fact.

Marc Tuters and Emillie de Keulenaar: The Intellectualisation of Online Hate Speech: Monitoring the Alt-Right Audience on Youtube

Tuters and de Keulenaar talked about monitoring hate speech on YouTube. Some of the points they made included:

  • They want to monitor the process and language of radicalization. This is an eventual goal.
  • They found in their analysis of YouTube comments that hate speech often takes a pseudo-intellectual form
  • Their research is grounded in media studies
  • They want to see how ideas circulate

They talked about how detection of hate speech on social media is so often list-based and can't adapt to new speech. YouTube has a lot of trouble with hate speech - recently YouTube deleted lots of videos, which ironically makes it harder to study hate speech. In their case they had a corpus prepared by someone else.

“Blood sports” is an area of hate where taboo speech is tackled in pseudo-intellectual fashion. They found the pseudo-intellectual form common for hate speech.

In their method, comments and transcripts from right and left were scraped using race-related words.

She showed a neat viz with a list of words mutating over time - phrases like “race realism”.

They looked at phrases with “Jew” - the Jews are othered - the word is put in echo brackets - and there is growth in the use of the brackets.
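
As a sketch of how one might count such a marker over time (invented data, not their method): echo brackets are easy to match with a regular expression, so growth can be tallied per year:

    import re
    from collections import Counter

    comments = [(2016, "look at (((them)))"), (2017, "an ordinary remark"),
                (2018, "(((media))) again"), (2018, "(((they))) did it")]

    echo = re.compile(r"\(\(\([^)]+\)\)\)")  # matches (((term)))
    per_year = Counter(year for year, text in comments if echo.search(text))
    print(sorted(per_year.items()))  # [(2016, 1), (2018, 2)]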

Their conclusions focused on how:

  • Much of what they found is not explicit hate speech - but scientific racism using pseudo-scientific language
  • This doesn’t happen in speech about Jews

We had a great discussion about hate speech and the role of the digital humanities.

She mentioned a cool OILab: https://oilab.eu/

You can see their abstract at: https://dev.clariah.nl/files/dh2019/boa/0768.html


Johanna Drucker: Digital Humanities — Complexities of Sustainability

Drucker talked about how complex systems are non-deterministic, so why bring them together with the word sustainable?

She talked about how we thought the digital humanities could save the humanities, but we can't do it. Certainly not alone.

We in DH are now doing so much. Now we need to think about sustainability. She went through ways we are not sustainable.

Our project level work is not sustainable - content can be but not functionality. She gave examples from her work:

For example, her History of the Book site had to be moved out of Drupal into the lowest technology, which is HTML. See https://hob.gseis.ucla.edu/

Again, when it came to her Introduction to Digital Humanities she put it up as a PDF as that is stable. See http://dh101.humanities.ucla.edu/wp-content/uploads/2014/09/IntroductionToDigitalHumanities_Textbook.pdf

She then talked about how tools and platforms are discipline-agnostic. What we as digital humanists should be doing is thinking about ethics and historicity - that work is sustainable where the tools are not.

She then asked "What is institutional sustainability?" How can we partner with libraries - there was a time we were disparaging of libraries - which is not fair. Now we realize sustainability has to do with administrative decisions, like those librarians are trained in.

Visibility - she talked about how we need registries for projects to be seen. Technology doesn't last. Intellectual paradigms die too - the Rossetti Archive disappears from view.

She talked about hypertext - exciting - Storyspace - now we have absorbed that and moved on. Some things get remediated - is that sustainability?

She switched to the global scale - we are all complicit in terrible practices - you can't get tech without rare earth metals. How much data are we generating - is that sustainable? What does that mean for building DH centers?

Then she talked about connecting sustainability and complexity. Sustainability is thought of instrumentally now. What if sustainability is a complex epistemological process? Then the problem is reformulated.

She used the OHCO debate at the U of Virginia DH conference as an example. She talked about how we are ceding the aesthetic to the technical - conceding to OHCO shouldn't happen. Just because the technology points one way doesn't mean we should follow. Cultural authority too often is ceded to the technical. In fundamental ways there is a non-equivalence of formal systems and hermeneutical ones.

Critical humanistic methods have very different goals than formal methods. What is it that should be sustained? We have formal systems and we have generative systems - we should not confuse the systems with what they represent. We should be careful about sustainability - it is not an end in itself. We have to bring ethics into the balance. We should sustain what is important, the ethical, the generative, not the formal. Sustainability recognizes the relationship between the formal and the generative.

That's the work we need to do. There is now a false front of an absence of resources as an excuse for cutting - this is a political issue. Ethics is central to all of it: diversity, accessibility, etc.

DH contributes the ideas of scale to the humanities.

My apologies for the disjointed nature of these last notes. I had to reconstitute them from notes on my phone.
