
Society for Digital Humanities (SDH/SEMI)

These are my SDH/SEMI 2010 notes. They are being written on the fly so they will be full of typos and they will be incomplete. Whenever I get distracted, really interested, or my battery dies the notes will suffer.

Lynne Siemens: The E-paper Anniversary: Lessons from the First Year of INKE

Lynne Siemens works in management studies and has been studying large digital humanities projects like INKE (on which she is a research member). She talked about how a large humanities team like INKE manages itself.

The team made a deliberate decision to delay research planning until there was a firm administrative foundation in place. They developed an accountability agreement that everyone has to sign.

SSHRC will evaluate us (I'm on the grant) on whether we are collaborative, integrative, accountable and inclusive.

They have moved away from email management and use a rolling agenda on a WriteBoard.

One thing they are looking at is giving graduate student assistants collaboration and project management skills. How can this be done?

Sophia Hoosien: Exclusionary Practices: Representations of Computers in the 1950s

Sophia presented a paper on how computing was presented in the Globe and Mail in the early years (the incunabular years) of the 1950s. Sophia started with a discussion of her content analysis methodology, which ultimately didn't prove very useful compared to close reading.

Some of the themes of the early years:

  • Importance of Ferranti in early years
  • Orientation to the UK in early 1950s
  • Dominance of the U of Toronto in early years
  • Importance of academic computing in the first decade
  • Language used to describe computers like "brain"
  • Theme of automation and work (often women's work)
  • Exclusionary practices where computing jobs are advertised for men while job ads for women are for typing jobs

Victoria Smith: Before the Moments of Beginning

Victoria talked about the early years of humanities computing in Canada. She started with historiographical reflections. The idea is not to do the history of humanities computing as a history of great people, but to look at the descriptions of needs and desires from the time and look at how organizations are formed.

Victoria then talked about the digital archive of humanities computing we are setting up at the University of Alberta. We are working with ERA (Education and Research Archive) run by the U of A Library.

Victoria finally focused on the founding of humanities computing in Canada, specifically the CCH at U of T, the foundation of OCCH (Ontario Consortium for Computing in the Humanities), and COCH/COSH, which eventually became SDH/SEMI. OCCH was formed in 1986 and by 1987 had become COCH/COSH.

Victoria then called for help from others. We want your archival materials.

Gerri Sinclair: From Shakespeare to Software

Gerri started by introducing her avatars. Most of her talk was about herself, about which she is quite funny.

  • Lesson 1: We are increasingly inhabiting hybrid real and virtual worlds. There is a "Reality Continuum" that goes from physical to virtual. Many interesting projects are blending the two so that the physical is overlaid with virtual.
  • Lesson 2: Code drives new media. Learn to code.
  • Lesson 3: A graduate degree in the social sciences or humanities is important to new media.
  • Lesson 4: Digital media are increasingly important for everything. She predicts
  • Lesson 5: Serious games are neither fun nor effective. Educators and academics are often terrible game designers.
  • Lesson 6: Distinctions between research in the hard sciences and the social sciences/humanities are no longer relevant.
  • Lesson 7: All media are becoming digital, social, and mobile. Media is also moving to the cloud.

She talked about the Great Northern Way Campus and its two-year Master of Digital Design programme, where 65% of the curriculum is based on externally funded industry projects. The curriculum is an engine of commercialization, forming new companies.

She talked about the 4 Rs - that content be something that can be Reused, Revised, Remixed, and Redistributed.

She talked about the Cartoon Doll Emporium, a fashion-themed virtual world. Gerri gave this as an example of hybrid reality where people are living increasingly in virtual worlds.

She concluded that Canada doesn't have a national digital strategy. There is a digital economy consultation process under way. We need to respond, especially on the issue of copyright.

Understanding the 'Capacity' of the Digital Humanities

I was on the panel discussing the capacity of the digital humanities. Lynne Siemens talked about the survey that she put together and early results.

Ray Siemens talked about training and the amazing hunger for training. Some of the types of training available are:

  • Formal undergraduate or graduate programs like our MA in Humanities Computing at the University of Alberta
  • Courses in different programs that have digital training woven in
  • Training institutes like DHSI
  • Unconferences like THATCamp

Day 2

Alan Burk: Once and Future Social Sciences and Humanities Digital Research Centres

Alan is the recipient of the 2010 Award for Outstanding Achievement, Computing in the Arts and Humanities, and therefore gave our plenary talk, reflecting on why there aren't more centres supporting digital arts and humanities.

Alan talked about the founding of the Electronic Text Centre that he set up and directed from 1995 to 2007. They got help setting up from David Seaman of the University of Virginia. Based in the Library, they completed a number of projects including Canadian Poetry. The TAPoR project allowed them to partner with computer science, setting up what was at the time the largest Sun cluster in Canada and becoming a Sun centre of excellence. The Synergies project has supported digitization and the setting up of a publishing system that supports up to 20 journals.

The key to their success has been the recruitment and training of staff including many science students.

The ETC had an environment that didn't really fit in a library - it was relaxed with high expectations. Staff were treated as colleagues and given opportunities to do research and present papers at conferences.

Alas, they were not able to get a graduate programme off the ground. Nor was a proposal to set up an institute supported.

Alan then shifted to reflecting on the benefits of such centres to humanities research:

  • A centre can link a researcher to communities beyond the researcher's immediate discipline.
  • A centre can gather critical mass of expertise that allows a much higher level of support.
  • A centre can provide specialized training.
  • A centre can help researchers formulate and write the technology components of proposals which in turn increases success rate.
  • Centres can write their own grants to build infrastructure.
  • Centres can support teaching through information technology.

Then Burk talked about success factors:

  • Funding
  • Champions
  • Knowing when to drop a project
  • Building collaborations
  • Focused agenda and working on manageable number of areas
  • Organizational structure
  • Good staff
  • Relevance to institution

On the topic of agendas he talked about the two-edged sword of having too many agendas and trying to support too many projects. They also suffered from pulls from the main Library. Libraries are less interested in supporting one-off projects of the sort that academics pursue. For these reasons he argued that such centres shouldn't be run out of libraries, the way the ETC was. There is a tension between the desire of libraries for standardization and the desires of academics to customize their projects.

We had an interesting discussion about the tensions between standardized access and the individualized research of the humanities. We also talked about whether the time of centres is past.

INKE panel

Stéfan Sinclair started our INKE panel talking about the difficulties of extracting citations from PDFs automatically.
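
To give a flavour of the problem (this is an illustrative sketch, not INKE's actual pipeline): once text has been pulled out of a PDF, parenthetical author-year citations can be matched with a regular expression, though real PDFs add hyphenation, columns, and running headers that break naive approaches. The pattern and function names below are my own invention.

```python
import re

# Match parenthetical citations like "(Smith 1999)" or "(Liu and Levy, 2004a)".
# A toy pattern: real citation styles are far more varied than this.
CITATION = re.compile(
    r"\(([A-Z][A-Za-z'-]+(?: (?:and|&) [A-Z][A-Za-z'-]+)?),? (\d{4}[a-z]?)\)"
)

def extract_citations(text):
    """Return (author, year) pairs for parenthetical citations in plain text."""
    return CITATION.findall(text)

sample = "As argued earlier (Smith 1999), tools matter (Liu and Levy, 2004a)."
print(extract_citations(sample))
```

Even this easy case assumes clean text extraction, which is exactly what PDFs make hard.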

Then Daniel Sondheim compared citation designs from print to the web. You can see his examples at http://tiny.cc/citationdesign . The patterns of design include:

  • Canonical citations, where there is a standard way of citing a work
  • Juxtaposition where the text and its citation are connected by placement
  • Elsewhere notes where symbols connect the original text and a note (somewhere else like the foot of the page or end of the article)

Stan Ruecker showed designs for a paper drill tool.

Shannon Lucky talked about citation style and reader experience. She ran a study where students read versions of an article with footnotes or without (where the content of the footnotes was integrated into the narrative). The idea was to see if footnotes make a difference. There doesn't seem to be any indication that one citation style is easier to read than another.

Christian Vandendorpe talked about how the emergence of a novelistic form of reading discouraged the footnote. Readers began to expect a text that flowed smoothly like a novel so footnotes were put in at the end. Christian seemed to argue that there was a dialectic between reader's expectations and attempts to play with the flow. Footnotes come and go.

Speculative Timelines

We had three papers in the Canadian Historical Association on speculative timelines.

Megan Meredith-Lobay: Visualizing Alba

Megan talked about a set of documents that tell of the founding of Scotland. The narrative of the epics, whatever the archaeological evidence, tells a history that was important to Scotland's sense of its story. What "actually" happened doesn't change what is perceived to have happened.

I presented about representing the time of computing in Canada, specifically the history of humanities computing. I used McTaggart's "The Unreality of Time" to argue that there are incompatible types of events that we need to graph in timelines corresponding to McTaggart's A series and B series.

Bethany talked about the original Temporal Modeling project and led into her case study, the time of Swinburne's reception.

Day 3

Ray Siemens, Notes Towards the Social Edition

Ray talked about the evolution of editions:

  • Dynamic Text (late 1980s)
  • Hypertextual Edition (early 1990s)
  • Dynamic Edition (early 2000s)
  • Social Text ?

They are trying to model the social text. See http://is.gd/czyJ2 (PDF of paper). They are hearing requests for social reading tools.

Ray then compared the "traditional scholarly edition" to the "social edition." The TSE is static and top down. The SE is process-driven and fluid. There is loose editorial mediation.

Karen Taylor: Humanities and Social Sciences Scholar's Use of Digital Technology for Teaching and Research

Karen and colleagues ran a survey about how humanists are using technology. Over 400 people filled out the survey. Many were tenured or tenure-track profs, and the majority of the respondents were women. The percentages using technologies like blogs were high; for example, 28% said they had written software. I worry that the high numbers for digital literacy reflect the self-selection of the people who filled it out.

Participants seemed comfortable reading on screen, but prefer reading print. They prefer print for annotating, for comfort, and because it is easier to skim and to concentrate on. That said, it is very convenient to download articles.

As for information seeking, respondents make heavy use of internet searching, but also use citation chaining.

Of the tools used, statistical tools were used most.

Karen had some very interesting results from general questions about the value of technology. While most profs thought otherwise, retired profs and undergrads didn't think technology was useful in teaching. The undergrads were the outliers over and over. Could we be pushing technology too hard on our undergrads?

Stéfan Sinclair: Theorizing Analytics

Stéfan presented a paper we wrote on theorizing how analytics work rhetorically. He started by talking about a rhetoric of failure - complaints that tools don't live up to the promise of computing. Then he used the following question to frame the theorizing:

What is literary interpretation such that computing can make a difference?

Stéfan then talked about Thing Theory, quoting Baird on how scholars have overlooked instruments because they only see texts as bearing theory. Two types of things seem to be able to bear theory: instruments and models. Stéfan then interpreted a tool, the KWIC (keyword in context). Finally he drew on the theory of aesthetic apprehension from Joyce's A Portrait of the Artist as a Young Man to develop a parallel theory for the interpretative encounter with the computer:

  • Integrity and demarcation of the work one will treat as aesthetic - we encounter the work as a whole against a backdrop
  • Analysis of the whole into parts and studying how they fit together - we take the whole apart
  • Synthesis of the parts into an interpreted whole - we put parts back together (like a KWIC)
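
A KWIC display of the kind Stéfan interpreted can be sketched in a few lines; this is a toy illustration of the technique, not the tool he analyzed:

```python
def kwic(text, keyword, width=3):
    """Keyword in context: for each occurrence of keyword, return a tuple of
    (left context, keyword as it appears, right context), width words each side."""
    words = text.split()
    lines = []
    for i, w in enumerate(words):
        if w.lower() == keyword.lower():
            left = " ".join(words[max(0, i - width):i])
            right = " ".join(words[i + 1:i + 1 + width])
            lines.append((left, w, right))
    return lines

# Each hit is re-synthesized into a new whole: a column of aligned contexts.
for left, kw, right in kwic("the whole is taken apart and the parts are put back together", "the", 2):
    print(f"{left:>15} | {kw} | {right}")
```

The point of the theory shows even here: the text is demarcated, analyzed into parts, and synthesized back into a new interpreted whole.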

We had an interesting exchange about the idea of tools as theories.

Alan Liu and Pierre Levy: A Perspective for the Humanities in the Digital Age

Alan and Pierre held a dialogue across English and French. They were introduced as having divergent views.

Levy began by saying that he completely disagrees with Liu's Laws of Cool but feels it is important to have such cultural criticism. He sees himself as more of a McLuhan-like prophet. He then started with 3 ideas:

  • Evolution of the Mediasphere - one should see the explosion of digital media as a continuation of the evolution of the culture of language. He talked about the continuity of language innovations from the alphabet/writing, to the printing press, to mass media, and now to ubiquitous media. He predicts a grand epistemological revolution similar to the scientific revolution of the 17th century.
  • The human and social sciences are now in the state of alchemy as disciplines. They are fragmented, with limited methods, and their object of study is fragmented. There are signs of growing strength from collaboration, observation, and the availability of an open semiotic system. He sees Google Books as the precursor to some sort of common, computable, semi-infinite, open semantic coordinate system. We don't know how to exploit this computable evidence and need to develop the methods. We need the equivalent of a Newtonian revolution that unifies our subject matter with a coordinate system, in the sense that Newtonian physics provides a mathematics for the study of the physical world. He imagines a cognitive space that is computable and that unifies the human and social sciences.
  • Human Development. He finally talked about a collective intelligence that will transform everything from governance to the social to the arts. He sees an interdependence of networks, from networks of will (governance) to personal networks (communication). This collective intelligence will be a mirror to our cognition and knowledge such that we can change it and change ourselves, which will lead to human development.

Liu talked about Kurzweil's idea of the singularity, the point at which networked/computing intelligence surpasses the human. Liu nicely sketched out how this formula has been presented over and over. He says Levy is only half a "singularitarian": he (Levy) imagines the universal without totality. Everyone's graphs are exponential and point up to the vanishing point of singularity. It is a monotony of singularity that needs to be broken up. The reality is divergence, chaos, trend lines that are chaotic. Liu feels Levy's universal without totality is not enough to stand up to the singularitarianisms. We need an intervening social ground where individuals can resist and critique the totalizing.

Keith Lawson: Students' Use of Social Networking Sites for Academic Purposes

Keith ran a survey on attitudes to Facebook for learning. He found a form of the "creepy treehouse" effect, where students really don't want academics to use Facebook for education. They are comfortable collaborating using Facebook, but they don't want us to force its use. I think it is a matter of control: if they choose Facebook then they control it; when it is a creepy treehouse then it is controlled by us.

Lynne Siemens: Academic Capacity in Canada’s Digital Humanities Community: Opportunities and Challenges

Lynne talked about results of a survey she ran as part of the Academic Capacity project.

Peter Organisciak: What do we say about ourselves? An analysis of the Day of DH 2009 data

Peter presented an analysis of the Day of DH 2009 data. See http://tiny.cc/daydh09 for the project. He painted a picture of what a digital humanist does from the data. They get up, drink coffee, read email, go to meetings, prepare or deliver classes, read, write, code, edit data, and work late from home.
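
To give a flavour of how such a picture might be drawn from the data (a toy sketch under my own assumptions, not Peter's actual method), one could tally mentions of common activities across the posts:

```python
from collections import Counter
import re

# Hypothetical activity vocabulary; Peter's categories would differ.
ACTIVITIES = {"coffee", "email", "meeting", "teaching", "reading", "writing", "coding"}

def activity_counts(posts):
    """Count occurrences of activity words across a list of blog post strings."""
    counts = Counter()
    for post in posts:
        for word in re.findall(r"[a-z]+", post.lower()):
            if word in ACTIVITIES:
                counts[word] += 1
    return counts

posts = ["Got up, made coffee, answered email.",
         "More email, then a meeting and some coding."]
print(activity_counts(posts).most_common(3))
```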

Richard Cunningham: The Architecture of the Book for a Digital Age

Richard talked about the first year of the INKE project and in particular the textual studies group. Things don't seem to have gone well for that group as they had a hard time retaining their postdoc.

AGM

Last, but not least, we had the Annual General Meeting. Andrew Keenan of the University of Alberta won the Lancashire promise award!


Page last modified on June 07, 2010, at 09:57 AM - Powered by PmWiki
