
DH 2017

This week I am at DH 2017 (Digital Humanities 2017) in Montreal. These are my conference notes about the conference and the pre-conference events.

Note: As always, these are being written live and will be full of unfinished sentences, confused observations and big gaps when my laptop battery runs out. Please email me if you have corrections.

3DH (Three-Dimensional Dynamic Data Visualization and Exploration For Digital Humanities Research)

Before DH we had a two-and-a-half-day retreat for the 3DH project. 3DH is developing new models for humanities visualization. Led by Chris Meister, the project is now in a design and prototyping phase, which will be followed by a development phase.

I spoke on interpretation and what it would mean to support interpretative visualization. I showed what Stéfan Sinclair and I are doing with Voyant and Spiral.

New Scholars Symposium

Rachel Hendry and I organized another edition of the New Scholars Symposium supported by the CHCI and centerNet.

Lightning Talks

Tuesday evening we had lightning talks by the participants who could make it. Here are my brief summaries.

Rachel Hendry

Rachel talked about a neat virtual reality project where she has visualized a corpus so it can be navigated and explored with VR technology. She showed a short video of viewing the model in a dome.

Jonathan Armoza

Jonathan is developing a cool topic modelling visualization system for studying Emily Dickinson.

Joseph Rafla

Joseph is trying to develop a machine that can detect events and their duration in screenplays. Events are changes in state. He wants to see what grabs and holds attention. He is looking at wide-release films and getting his data from a list from the Writers Guild of America.

Rasoul Aliakbari

Rasoul works in Comparative Literature. He looks at the print cultures of the Arabian Nights, documenting the actual meaning-making of the Arabian Nights in different cultures. In his postdoc work he wants to document the use of Persian and Arabic books and documents in Canada, gathering data about what is available across the country.

Molly Kalkstein

Molly talked about a project she is starting at the Center for Creative Photography (CCP) at the University of Arizona. She is going to create an online version of "The Archive", a journal that highlighted aspects of the collection, digitizing it and making it accessible. She talked about how the CCP is learning about digital projects; her project is meant to help the CCP understand how to do them.

Maryse Kiese

Maryse is looking at publicity and cyberspace. She talked about the theoretical resources she is drawing on. She is studying the intelligence communities and how they think about cyberspace and use it for surveillance. She talked about cyborgs and our extensions.

Geoffrey Rockwell

I presented on my next project called Epistemologica. Stéfan Sinclair and I are replicating interesting and important text analysis processing practices.

Sevan Beurki Beukian

Sevan is looking at gendered oral history, specifically Armenian oral history. Most tales about the Armenian genocide are by and about men. Now we realize that there is a gendered experience. She wants to see if we can rethink the history. She is looking at different archives where people have collected stories about the genocide.

Ananya Ghoshal

Ananya is interested in DH and research in the Anthropocene. What is the role of DH in an era of climate change and environmental degradation? She also wants to use DH to find ways to reorient people to the Anthropocene. Despite having a lot of interdisciplinary and public power, DH is not that ecologically informed. What can we do?

She wants to bring into the mainstream those stories that aren't being heard.

Damiano Benvegnù

Damiano works in critical animal studies and ecostudies. He wants to connect landscapes to issues, exploring the connection between landscapes and the Dialogues of Pope Gregory (?). These dialogues almost always have a non-human element. He is prototyping places where miracles occurred that are discussed in the Dialogues. These prototypes would allow humanists and scientists to look at the sites.

Unconference

On Tuesday we had an unconference for the New Scholars Symposium. The topics covered included:

  • AI and Machine Learning
  • Crowdsourcing
  • Building a Twitterbot
  • Training opportunities
  • Pedagogy
  • Digital Collections and Copyright
  • Diverse Voices

Opening of the Conference

The local organizers included: Michael Sinatra, Stéfan Sinclair, Cecily Raynor, and Dominic Forest

A representative from McGill spoke about how McGill is engaged in the digital humanities. The Dean of Arts and Science at the Université de Montréal is a philosopher and he gave a welcome talk. DH has been a fad, but now it is becoming genuinely interesting. The humanities have been engaged in understanding and constructing the human experience. The digital humanities are part of this understanding and construction. At the University of Montréal there is a diversity of researchers working on the digital humanities from literature departments to library science.

Diane Jakacki, the programme committee chair, spoke about the importance of inclusivity in our field. What could be done to include more people in more ways in our scholarly dialogue? The programme committee worked hard to involve more people in the review process. There are 450 accepted papers, posters, and workshops representing the work of 1000 contributing authors.

Karina van Dalen-Oskam, Chair of the ADHO Steering Committee, spoke about what ADHO is and the code of conduct for our conferences. There is no place for harassment or intimidation.

Karina then officially welcomed us and declared the conference open.

Marin Dacos: The Unexpected Reader

Marin has led the development of major infrastructure including OpenEdition. He started by talking about the Manifesto for the Digital Humanities. Part of the Manifesto is about open access, which is part of open science. A lot of digital humanities work is on the production of writing, not the reading. He talked about how fragile open access is. He is interested in the obstacles to open access, especially the obstacles from within the academy. He described a number of academic attacks on OA. He showed a great video from the 1950s of some git explaining why there should be an aristocracy of readers. Most people don't need to read, and paperbacks make it too easy for them. They could end up with Sartre in their pocket.

For Marin publication is an amplification of ideas. There is an attention income that has been expanded by online and open access. He showed maps of publication, access, and paywalls hit. It turns out that everyone is using sites like Sci-Hub (which has lots of pirated papers). We don't have a lot of research on access and reading. We don't have much data on who is doing what. One problem is robots: we have too many bots downloading things. Our server logs are meant for server management, not for studying readers.

COUNTER is providing better statistics. We have snitches like Google Analytics that provide much better information, but we need open-source analytics like Piwik. We have heatmaps for how people read screens.

He talked about improving relevance and a ratio of Piwik hits to logged hits. Marin gave some data about revues.org and the other sites. What do we know about open access in French? We know more about PornHub (which has a data service) than about open access use. We know more about Amazon Kindle data (i.e. that a lot of people don't finish). We have some research on attention. He showed some fascinating SciELO usage data.

Gallica is the French equivalent of the Digital Public Library of America. He talked about data from Gallica about different types of readers. People go into Gallica from Google and back. From other research we find that people are task-oriented: they read what they need.

Marin wants to develop models for access. One model is the long tail, but that doesn't work: the more open the catalog, the more people go for the hits. Another is the "sleeping beauty": a publication that goes unnoticed until some point when it becomes popular - an unexpected reading. The unexpected reader is about how open access has created a new readership. The problem is that we don't have the research to really confirm that there are unexpected readers. They now have some data from revues.org.

Umberto is a detector of unexpected readers that they have developed. It digests Piwik-generated data and then analyzes it. The data has a lot of variation; the unexpected reader is the one that is outside the model. He gave examples of unusual data that indicate unexpected readers, and their ideas about what caused the jump in readers. Examples included when the film based on a book is run on TV, or when there is a demonstration about something an article is about. See https://lab.hypotheses.org/1822 to play with Umberto.

In sum they found 761 extreme anomalies. Some were due to a media effect, a Google effect, or a community effect. He talked about comparing unexpected readers to expected ones to see what we can learn about them. They find more unexpected readers on Saturday, for example.
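As a rough illustration of what a detector over readership logs might do (this is my own sketch, not Umberto's actual method, which is surely more sophisticated), one can flag days whose hit counts lie far outside the distribution:

```python
from statistics import mean, stdev

def unexpected_days(daily_hits, threshold=3.0):
    """Flag indices of days whose hit count lies more than `threshold`
    standard deviations above the mean -- a crude stand-in for the kind
    of anomaly model Umberto applies to Piwik data."""
    mu = mean(daily_hits)
    sigma = stdev(daily_hits)
    if sigma == 0:
        return []  # perfectly flat traffic: nothing is anomalous
    return [i for i, h in enumerate(daily_hits) if (h - mu) / sigma > threshold]

# A quiet month with one spike (say, the day a related film aired on TV):
hits = [12, 9, 11, 10, 13, 8, 10] * 4 + [480]
```

Running `unexpected_days(hits)` flags only the spike day; a real system would also have to filter out the bots mentioned above before anomalies mean anything.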

To conclude he talked about the opportunity for digital humanities. We have the opportunity to think about how to attract the unexpected reader. We have to construct new forms of media - new forms of writing - that are more accessible to the unexpected readers. For example, he collaborated with a graphic designer to add cartoons to his slides.

He also mentioned that OpenEdition is now being scaled up to include other European countries.

He commented on how he was honoured to present a keynote in a language other than English.

Wednesday, August 9th

SP-04: Short Paper Session

Hongshan Li: Tackling Innovation Networks with Smart Data: A Case Study of the Liquid Crystal Institute at Kent State University

Hongshan and his colleagues have assembled an archive on the LCIKSU. They are turning the raw data into smart data. They are using network theory to analyze the data as the Institute was itself a network.

Some of their preliminary results show that the Institute did play a role in the development of liquid crystal displays. The history is that the Institute grew to 20 by 1970 and then 95 by 1995. They paid close attention to international collaboration. Hongshan showed the network of collaboration, including a "structural hole" model that is believed to support innovation.

Steven Jones: Reverse Engineering the First Humanities Computing Center

Steven presented a paper on a project I am involved in. We are using digital methods to conceptualize an important moment/center in the history of humanities computing. In particular we are looking at CAAL, the first humanities computing center. We want to reverse engineer and reconstruct this center as a whole - as a system.

We make use of a cluster of methods, including:

  • Digitized Archives
  • 3D Models
  • Emulations
  • Oral Histories

Can we model this research center and its operations? We hope to complicate key terms like "first", "humanities", and "computing."

What roles did the human operators take, and how did they fit into the larger semi-automated processes? How were these roles gendered? We hope to complicate the roles.

My component involves the work I've been doing with Stéfan Sinclair and Marco Passarotti. See my blog entry on the Instant History conference.

Alex Green and Lalita Kaplish: Wellcome Data Week

Wellcome Trust is a major foundation. Henry Wellcome was not just a philanthropist, but also a collector, and they are now digitizing the collection. Most of the reading of the collections is linear reading; the idea of Data Week was to model novel explorations. He then talked about two of the experiments, including a project on the Chemist and Druggist journal, which laid out every page of every issue in an enormous space so you can see certain patterns. The second project was based on the archive of Ticehurst House Hospital, a private hospital. The archive is mostly hand-written. Could they track a patient's history through all the hand-written material? They developed tools for paginating documents and identifying names. With that information they can visualize the cases.

They then stepped back and looked at which Data Week projects worked best. Was the hackathon too open? They found that they need more documentation. They found how important the metadata is. You also need to know the materials and to involve researchers.

An interesting question is what to do with research that isn't published, but might be useful to others. The project blog is still up: http://wellcomedataweek.wellcome.com

Luis Meneses: Shelf life: Identifying the Abandonment of Online Digital Humanities Projects

Luis talked about how we love digital projects, but they are fragile and often vanish. Digital projects often have a limited useful life. What is the shelf life of projects? We tend to think of projects like books that once finished are sent out and live forever.

To study these projects Luis gathered the DH books of abstracts and extracted the URLs (using Apache Tika). He then ran the URLs and gathered the HTTP response codes and headers. He had 5,845 unique URLs. He showed a graph of valid versus decayed URLs and looked at average time since last modification: the older the site, the longer it has been since modification.
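A minimal sketch of this kind of link-rot survey (my own reconstruction, not Luis's actual code, which used Apache Tika for extraction) might pull URLs out with a regex, issue HEAD requests, and bucket the response codes:

```python
import re
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

URL_PATTERN = re.compile(r'https?://[^\s)>"\']+')

def extract_urls(text):
    """Pull candidate URLs out of plain text (e.g. a book of abstracts)."""
    return sorted(set(URL_PATTERN.findall(text)))

def classify(status):
    """Bucket an HTTP response code the way a link-rot survey might."""
    if status is None:
        return "dead"          # no response at all
    if 200 <= status < 300:
        return "valid"
    if 300 <= status < 400:
        return "redirect"      # ambiguous: moved, or bought by a spammer?
    return "decayed"

def check(url, timeout=10):
    """Fetch headers only; return (status, Last-Modified header)."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status, resp.headers.get("Last-Modified")
    except HTTPError as e:
        return e.code, None
    except URLError:
        return None, None
```

The ambiguity of redirects that came up in discussion shows up here too: a 301 could be a healthy site move or the first step toward diet-pill ads.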

He showed how some sites have been bought by spammers - one web site now hosts diet pills ads.

What are the solutions?

We had a lot of discussion of the abandoned sites. What do redirects mean?

Panel: Studying Literary Characters and Character Networks

Mark Algee-Hewitt: Distributed Character: Quantitative Models of the English Stage: 1500-1920

We are used to social network graphs of individual plays, but can we work on a larger scale? He is working with two measures, eigenvector centrality and betweenness centrality, and has a typology of plays: plays where everything goes through one character, and plays where the betweenness-central character is different from the eigenvector-central characters. Conspiracy plays have different centers.

Mark used the Chadwyck-Healey British Drama corpus to try to scale. He talked about the automatic generation of network graphs and generating scores with which to measure the distribution of protagonists. Over time we see that plays are less concentrated in one character: more people talk to more other people. He also sees a difference between comedies, histories and tragedies. Over time plays are becoming more comedy-like.
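To make the two measures concrete, here is a hedged sketch (my own toy illustration, not Mark's pipeline) computing eigenvector centrality by power iteration and betweenness centrality with Brandes' algorithm on a tiny character network:

```python
from collections import deque

def eigenvector_centrality(adj, iters=100):
    """Power iteration over an undirected graph given as {node: set(neighbours)}."""
    score = {n: 1.0 for n in adj}
    for _ in range(iters):
        # the score[n] term shifts the matrix by the identity, which keeps
        # bipartite graphs (like a star) from oscillating
        new = {n: score[n] + sum(score[m] for m in adj[n]) for n in adj}
        norm = max(new.values())
        score = {n: v / norm for n, v in new.items()}
    return score

def betweenness_centrality(adj):
    """Brandes' algorithm for unweighted, undirected graphs."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        queue = deque([s])
        while queue:                      # BFS counting shortest paths
            v = queue.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                      # accumulate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: b / 2 for v, b in bc.items()}  # undirected: each pair counted twice

# Toy play: one protagonist (A) who mediates all exchanges between B, C and D.
play = {"A": {"B", "C", "D"}, "B": {"A"}, "C": {"A"}, "D": {"A"}}
```

In this star-shaped "play" A is central on both measures; the interesting cases in Mark's typology are the plays where the two measures pick out different characters.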

Andrew Piper: On the Unreasonable Complexity of detecting Social Interactions in Literature

Koustuv Sinha presented work led by Andrew Piper. They are struggling to figure out who a character is in prose, so they are trying to figure out what an interaction is. They are using human annotators to tag interactions and then comparing them. They are also trying classifiers and not having much luck. Some of the features they look for include the presence of dialogue, tentative words, co-presence, and so on.

The annotation experiment involved selecting passages from 200 best selling novels. Annotators had to mark the people and the interactions. The problem is that interactions are complex. "I fell into a depression after my dad left." (What is the interaction here?)

Andrew Piper: Emma: A feature space for studying character

Andrew talked about the difficulty of creating characters and recognizing them. Characters are referred to by everything from names to pronouns.

Andrew is building on BookNLP by David Bamman, which does a lot of work identifying characters. It assumes 4 positions of characters. With this information one can try to extract the character text (not the dialogue). Character text can then be mined to see what words are in a relationship with a character. With their tool they are trying to calculate 5 character features:

  • Distinctiveness - how similar to other characters
  • Positionality - where is the character? how often do they have agency?
  • Modification - what is happening to the characters?
  • Centrality
  • Modality - looking at perception by character, thinking, motion ... of the character

He talked about 4 types of characters and showed how authors skewed. Austen skewed introverted. Women authors often write more introverted characters. He talked about a major shift in interiority: the interior characters at the end of the 19th century were women, shifting to the SciFi characters at the end of the 20th who are anxious about technology.

Thursday, August 10th

SP-17: Short Paper Session

Overcoming Data Sparsity for Relation Detection in German Novels

Social networks reveal insight into the plot of a novel. To do this they first need to do character detection; they have a tagged corpus for training. Then they have to do coreference resolution to figure out who the characters are interacting with. Finally they have to do relation detection to identify the relation between characters.

Manual tagging of relations takes lots of work - how do you build a training set? Summaries of a novel could be used, on the assumption that a summary would contain rich examples of relations. But the prose of summaries is different, so they used a classifier to find sentences in novels for human tagging to supplement the summaries. They did a test and found that the summaries worked OK.

They found there were differences between human annotators.

Katarzyna Anna Kapitan: Network analysis of the manuscript context of Old Icelandic literature

There is a debate in Icelandic literature studies about genres. It is hard to manually look at all the manuscripts, so they gathered a database of existing digital editions and built a network between the texts in order to see if the traditional genre classification works.

She then talked about specific sagas that the network graph suggested were ambiguously classified. They found a saga that over time shifted what it was collected with, suggesting a shift in classification. They also found rewritten and invented sagas. In the end a mix of methods worked well.

Christoph Beierle: Using Methods of Computational Linguistics for Resolving the "Homeric Question"

The Homeric question is a fundamental philological problem of the genesis and history of early Greek texts. The question goes back to 1795. Now there are three competing theories:

  • Oral poetry theory - ie. that it was an oral tradition
  • Neoanalysis - poems written by a single great poet experienced in the oral tradition
  • Analysis - poems compiled from different periods

This project goes back to a project from the 1970s at the U of Regensburg. They look at single words and repeated word connections (iterata). They have a complete directory of all epic repetitions. They focus on singular iterata (one occurrence in the Iliad and one or more in others). The existence of these seems to challenge the oral theory and the neoanalysis theory.

Now they have more computing power and new methods, so they are looking for other properties of the texts for improving the theories. Given the assumption that the Iliad is the oldest, the idea is to compare the Iliad to the other texts. They can look for iterata rare in the Iliad and frequent elsewhere.
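The core counting step is easy to sketch. This is my own simplification to single tokens (the Regensburg project works over repeated word connections and a full directory of epic repetitions, not bare word counts):

```python
from collections import Counter

def singular_iterata(iliad_tokens, other_tokens):
    """Words occurring exactly once in the Iliad but one or more times
    in the other epics -- a token-level sketch of 'singular iterata'."""
    iliad = Counter(iliad_tokens)
    other = Counter(other_tokens)
    return sorted(w for w, c in iliad.items() if c == 1 and other[w] >= 1)
```

Everything hard about the actual project - lemmatizing Homeric Greek, matching multi-word formulae, dating the texts - happens before a count like this is meaningful.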

Sandra Murr: Digital Analysis Of The Literary Reception Of J.W. V. Goethe’s Die Leiden Des Jungen Werthers

Sandra looked at Goethe's Werther. The novel sparked a remarkable critical reception and imitations. They are looking at the social relations of the main characters in Werther, looking just at collections of character names and pronouns. There is a triangular relationship between the three main characters, and the triangle is a topic in the text. They also looked at other novels that have a triad of a protagonist, beloved and anti-protagonist. It is a persistent pattern in Werther adaptations.

LP-23: Long Paper Session

Kevin Page: Contextual Interpretation of Digital Music Notation

The motivation of the project was to see a) how we can incorporate knowledge from outside within the musical notation, and b) how we can enable the notation to be included in a wider application.

They had a chance to follow a full Ring cycle and annotate the decisions outside the score. They have digital sources and structured relationships. They are developing Music Encoding and Linked Data (MELD), a framework for annotation. They use MEI XML (Music Encoding Initiative) for the musical notation. It isn't an approach that lets you do whatever you want: there are identifiers for elements and a rendering tool that carries them through. They then use Web Annotation to connect things.

He showed an example of annotations and linking.

David De Roure, Pip Willcox: Numbers into Notes: Digital Prototyping as Close Reading of Ada Lovelace’s ‘Note A’

This project was a response to the preparations around the anniversary of Ada Lovelace. Willcox talked about Lovelace's background and her mathematical training. Then she talked about Babbage and his Difference Engine. Babbage would have met Lovelace at one of his soirees where he showed off his engine. While he didn't lecture or write much, his ideas and prototypes were shown and influential. He was building tools, or prototyping tools. Lovelace translated and annotated a lecture on the machine.

Another colleague, Emily Howard, composed a musical work based on numbers in honour of Lovelace. They also built a simulator of the engine and then wrote a musical application - a case of experimental humanities. See http://numbersintonotes.net

Then they imagined music engines. They reproduced Numbers into Notes on the Arduino. These were physical algorithms that could be handed out and gifted. There is a physicality to these machines, which can play MIDI instruments.

They were thinking through making - speculative humanities. By prototyping what someone else did they feel they are understanding the original.

Niels-Oliver Walkowski and Johannes Pause: Dead And Beautiful: The Analysis Of Colors By Means Of Contrasts In Neo-Zombie Movies

Digital humanities doesn't deal much with cinema, especially colour. One can use clustering to identify dominant colours, but the authors didn't find the existing tools worked so well. They were looking for a better colour model, and they use different approaches to contrast which they call views.

They then used their approach on three zombie movies. In the Hollywood blockbuster, form often follows function. They found certain contrasts associated with important events.

Friday, August 11

Panel: Scaling up the Arts and Humanities: The DARIAH Approach to Data and Services

Chad Gaffield introduced the panel on DARIAH. He talked about two questions DARIAH can help us with:

  • How can we make the whole greater than the sum of the parts?
  • The digital problematizes the way we separate research from teaching from service

Digital scholarship forces us to think through the connections. DARIAH is one model for integration. DARIAH is trying to come to grips with a couple of issues. One is where to stop. Do we want to do everything?

DARIAH is a cooperative undertaking by member countries and associated projects. Each national member contributes to the distributed infrastructure. How can the network build itself to be infrastructure? They have a growing grant - HaS (Humanities at Scale).

Mike Priddy from the Netherlands talked about HaS and how they assess the value of contributions. They have developed a method of coordinating cash and in-kind support from countries. They are developing a contribution tool and a marketplace. The contribution tool has metadata and self-assessments. The self-assessment is reviewed and a selection is made deciding whether a contribution was worthwhile; the worth can then be calculated as a national contribution.

What are the activities and services? An activity is part of a process, which is part of a service. They can be human- or machine-mediated. Services can be packaged, and have to be repeatable or available on request.

Assessment is difficult. They have a Reference Architecture based on a Reference Model for Social Science and Humanities. They take 3 viewpoints: Business, Information, Computational.

Developing tools is seen as part of the research, so they develop demonstrators. The problem with demonstrators is that people move around and the IT landscape is evolving. How do you move from demonstrators to stable and maintainable infrastructure? They are applying an Agile methodology, adding features as they are needed.

Aurelien Berra talked about "Teaching: where and how?" They are creating registries of courses. They have also created DARIAH Teach videos. They have also taken up successful local projects to scale them to be useful at a European level. Finally they are developing Master Classes.

There are two sister ERICs (European Research Infrastructure Consortia) for DH: CLARIN and DARIAH. CLARIN and DARIAH projects in various countries are joining up. It sounded like CLARIN is an ERIC, which is a permanently funded infrastructure. One also has to find ways to close down projects in infrastructure; PARTHENOS is a meta-level project that has a limited life span.

In discussion we talked about how to build in the capacity to survive over time with funding cuts. Marin Dacos talked about how duplication/backup of e-journals, data and code is important. We need succession planning. DARIAH, unlike CLARIN, is very light at the international level. Each country runs and funds its part. There isn't much central infrastructure (that isn't provided by a country).

LP-35: Long Paper Session

Modelling Interpretation in 3DH: New dimensions of visualization

Chris Meister, Johanna Drucker and I presented on the 3DH project. I talked about the Galaxy Viewer and Spiral notebooks.

I drew out some challenges to visualization based on what interpretation might be. Interpretation is (and thus visualization might):

  • by Someone: Show the position of visualization
  • about Something: Reveal the surrogates and data preparation
  • in a Tradition of Practices: Integrate with other (manual) practices
  • for a Community of Interpretation: Address community
  • in an Ethical Context: Provoke
  • Reflect back on itself: Understand ourselves through visualization

Uta Hinrichs and Stefania Forlini: In Defence of Sandcastles: Research Thinking Through Visualization In DH

Forlini started by talking about the tension between tools and prototypes (sandcastles). She had a nice quote from Bruno Latour about how tools displace and change what was first desired. The paper was a story of detours through visualizations. They think of visualizations as speculative. They were studying the Bob Gibson anthologies of speculative fiction. These hand-crafted anthologies were put together by a fan and are now in the archive at the U of Calgary. They were digitizing the collection and visualizing how to explore it in an iterative process.

They call what they have the speculative Wonderverse. With it you can explore different views on the texts, metadata, filters, and so on.

Hinrichs then talked about the insights that came from visualization. The Wonderverse is not a means to an end, but an end in a set of detours through visualization. They were exploring Gibson's symbols and their meaning. As they produced more and more metadata they were able to do more with visualization. Visualization enables a critical thinking process. The problem with the final visualization is that it can hide the detours it took to get there.

She talked about visualization as an aesthetic provocation. There is a tension in their Wonderverse between the visual provocations of the visualization and the visual design of the original anthologies.

Hinrichs then talked about the dynamic between visualization as a field and the humanities. One should not become the servant of the other.

She closed with a reflection on sandcastles. She thinks the process of generating visualizations should be discussed more and be foregrounded. Let's all build sandcastles.

http://stuffofsciencefiction.ca/

Pim van Bree and Geert Kessels: Iterative Data Modelling: from Teaching Practice to Research Method

van Bree and Kessels began by talking about the background to their work. They first focused on a visualization of correspondence and then began to make it more generic to support other projects. Their project went from the Chrono Spatial Research Platform to NODEGOAT, a web-based database application for the humanities. You can get an account and use it at http://nodegoat.net. They demoed the environment and how it lets us model space and time: the system lets one map people, events and so on in space and time. I was struck by how fast the system is when there are lots of nodes/links.

Then they talked about running workshops. In running the workshops they found that the problem was not using NODEGOAT, but conceptualizing the data model and what one wanted to ask of it. They have to teach the modelling. The model is always an interpretation. Data is a fuzzy idea in the humanities - we have ideas for visualizations that have nothing to do with data, or we think of data as a spreadsheet. But we should be stepping back and thinking about the conceptual structure of our data and the relations between data - something like a relational model. Humanists think in terms of texts, so they think of linear data. They talked about the trap of the database model. The speakers have found that what works is an iterative modelling process of developing their ideas.
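To make the contrast between a spreadsheet and a conceptual model concrete, here is a toy relational sketch (my own illustration in SQLite, not NODEGOAT's actual schema) where people, places and events are separate object types linked by relations rather than columns in one flat table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE place  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE event  (id INTEGER PRIMARY KEY, kind TEXT, year INTEGER,
                     person_id INTEGER REFERENCES person(id),
                     place_id  INTEGER REFERENCES place(id));
""")
con.execute("INSERT INTO person VALUES (1, 'Ada Lovelace')")
con.execute("INSERT INTO place  VALUES (1, 'London')")
con.execute("INSERT INTO event  VALUES (1, 'letter', 1843, 1, 1)")

# The relations, not the rows, carry the interpretation: an event
# connects a person to a place at a time.
rows = con.execute("""
  SELECT person.name, event.kind, event.year, place.name
  FROM event JOIN person ON person.id = event.person_id
             JOIN place  ON place.id  = event.place_id
""").fetchall()
```

Deciding what counts as a "person", an "event" or a "place" is exactly the interpretive modelling work the workshops had to teach.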

Closing Keynote: Elizabeth Guffey: The Upside-Down Politics of Access in the Digital Age

Guffey started with an example about a DIY wheelchair elevator. She then talked about a story by Vic Finkelstein about a village designed for wheelchair users only - a world where doorways and ceilings are lowered. The provocation was: what would happen to able-bodied people who might choose to live in the village? The non-disabled would knock their heads and need supports. Guffey teaches this as it helps the able-bodied understand what it would mean to overcome disabilities. We without impairments forget how much the environment is designed for us.

Why is digital access for disabled people not better understood and dealt with? Guffey approaches the issue with three questions:

  • What is access for disabled people in the digital realm?

We tend to think of access as more technology for overcoming disability. There is such a gee-whiz factor to all these gadgets. We tend to think of the digital as automatically being accessible, which is not true. Think of virtual reality. Digital access lurks behind other things. There are also all sorts of disabilities, from visual to cognitive. Making accommodations isn't so complicated - most people just aren't aware.

The environment can make us more or less disabled.

  • Why is this so little known?

Part of it is conceptual - we don't understand what access means. We need to think about it, to politically secure it, and to implement it. Access is not about transfer speeds. Guffey talked about how computing in the 70s had an access problem for humanists: there were all sorts of hurdles to using computers for digital humanities back then.

For Guffey "access" has a deeper meaning. It ties to the international wheelchair "Access" symbol. The symbol allows her to do so much in her life. The symbol has changed in New York to a new symbol. Seeing the new symbol, Guffey began to study its history, which led her to explore the history of what "access" was. Our profession of digital humanities was forming at the same time and in the same place as activists were lobbying for access as a civil right.

Part of the problem with digital access is that non-digital access is in plain sight while digital isn't. There are all sorts of images of ramp fails, but how do we see digital fails? We think the digital is compelling, but that doesn't mean it is accessible.

  • Can this change?

Accessible digital could and should be done. It doesn't take much, and there are online sites that make things easy. The Khan Academy has tota11y, an accessibility visualization toolkit. There is Teach Access, a site to encourage us to teach accessibility. There is a site called Engineering at Home that focuses on how basic home things can be adapted to be more useful. The site passes access tests and gets you thinking.

There are a number of digital ramp fails. One can have students compare sites.

Then Guffey returned to the symbol. It is educational, but it also helps us see the ramp fails. We need a symbol for sites that indicates accessibility.

Guffey then talked about legal aspects. The Trump administration has shelved accessibility regulations. The administration thinks enforcing accessibility is a damper on economic growth and that it can be handled by the market.

She ended with access as a concept to think about and to tackle, not just in digital work, but also to teach.

The End

We learned that Ottawa will host in 2020! Bravo to the team including Constance Crompton, Brian Greenspan, Kevin Kee and others. Utrecht hosts in 2019 and Mexico City in 2018.


Page last modified on August 11, 2017, at 04:46 PM - Powered by PmWiki
